Giving Up On Cybersecurity


Recent years have witnessed a dramatic increase in digital information and connected devices, but constant revelations about hacks make painfully clear that security has not kept pace.  Societies today network first, and ask questions later.

This Essay argues that while digitization and networking will continue to accelerate, cybersecurity concerns will also prompt some strategic retreats from digital dependence.  Individuals, businesses, and governments will “give up” on cybersecurity by either (1) adopting low-tech redundancies for high-tech capabilities or digital information, or (2) engaging in technological regression or arrest, foregoing capabilities that technology could provide because of concerns about cybersecurity risks.  After cataloguing scattered examples of low-tech redundancy and technological regression or arrest that have occurred to date, the Essay critically evaluates how laws and regulations have fostered situations where giving up on cybersecurity is necessary.  The Essay concludes by proposing ways that law can help to guide consideration of when to engage in low-tech redundancy or technological regression moving forward.



Recent years have witnessed a dramatic escalation in digital information and connected devices.  By some estimates, 90 percent of all data has been created in the last two years alone,1 and the number of connected devices doubled between 2010 and 2015 and will double again or perhaps even quadruple by 2020.2  The constant revelations about hacks of individuals, institutions, businesses, and governments, however, make painfully clear that security has not kept pace.3  Societies today network first, and ask questions later.

This Essay argues that while the next fifteen years will undoubtedly see the predicted dramatic expansions of digitization and networked technologies, they will also be marked by instances where cybersecurity concerns prompt some strategic retreats from digital dependence.  Individuals, businesses, and governments will “give up” on cybersecurity by either (1) adopting low-tech redundancies for high-tech capabilities or digital information, or (2) engaging in technological regression or arrest, choosing to forego capabilities that technology could provide because of security concerns.  Scattered examples of giving up on cybersecurity are occurring now, and they will and should become more frequent going forward.

Law often stands on the sidelines as technology charges ahead, intervening only after a significant delay, and that is certainly part of the story of the last fifteen years.  But sometimes law has pushed the adoption of technologies and digitization of information in circumstances where it now appears that giving up on cybersecurity may be a better option.  Law’s role in pushing toward digital dependency suggests that it may also have a role to play in pulling back and guiding consideration of when to adopt low-tech redundancy or technological regression.

Part I of this Essay first defines “low-tech redundancy” and “technological regression or arrest.”  Part I.A explains how these concepts respond to concerns about the confidentiality, integrity, and availability of information.  Part I.B catalogues examples of low-tech redundancy and technological regression or arrest that have occurred so far.  Part II then critically evaluates how laws and regulations have fostered situations where giving up on cybersecurity is necessary and proposes ways that law can help to guide consideration of when to engage in low-tech redundancy or technological regression moving forward.

I.  Two Ways to Give Up on Cybersecurity

The concept of “giving up” on cybersecurity captures two distinct phenomena spurred by cybersecurity concerns.  The first is “low-tech redundancy.”  Low-tech redundancy involves deliberate decisions to retain low-tech or no-tech versions of capabilities or nondigital versions of content.4  Think of this as knowing how to navigate without Google Maps’ turn-by-turn instructions or as maintaining paper backups.  Low-tech redundancy gives up on cybersecurity in the sense that it plans for the worst-case scenario.  It assumes that cybersecurity measures will fail and that digital files or technological capabilities will be rendered inaccessible, inoperable, or untrustworthy.  When that occurs, the low-tech alternatives ensure resilience.5  They function as a failsafe, allowing continued operations and perhaps restoration of high-tech capabilities.  Until cybersecurity fails, the high-tech and low-tech mechanisms proceed in parallel.6

The second phenomenon is “technological regression or arrest.”  Technological regression involves walking back from technological capabilities because of concern about the inability to properly secure the technology.  Technological arrest is similar, capturing the deliberate decision not to proceed with developing a technical capacity because of security concerns.  Technological arrest occurs when the security concerns are appreciated ex ante; technological regression occurs when the security implications are recognized only after the technology has been developed or deployed.  Technological regression and arrest give up on cybersecurity in the sense that they assess that cybersecurity will fail and that the implications of that failure are sufficiently dire that the best course of action is to forego a technological capability entirely.

A.  Giving Up as a Response to Security Concerns

Low-tech redundancy and technological regression and arrest respond to concerns about the confidentiality, integrity, and availability of data, which information security specialists often call the “CIA triad.”7

A confidentiality problem arises when individuals or entities gain access to data that the data’s owner does not intend them to have.8  Data breaches are confidentiality problems: criminals hack a business’s payment card system and obtain information, such as credit card numbers, that the business and individual card holders intend to keep confidential.  Intellectual property theft is another example of a confidentiality problem: hackers steal trade secrets, whose very intellectual property protection depends on their status as confidential information.

Availability problems occur when authorized users cannot access data or systems that are supposed to be available to them.9  For example, in 2012 and early 2013, distributed denial of service (DDoS) attacks rendered the websites of numerous U.S. financial institutions inaccessible by flooding them with traffic and thereby preventing legitimate customers from accessing their accounts.10  Data or technological capabilities could also be rendered unavailable due to physical damage to technical systems.  Imagine a physical attack that disables or destroys satellites used for the Global Positioning System (GPS).

Integrity problems may be even more pernicious than confidentiality and availability problems.  Data integrity problems involve unauthorized changes to data.11  Integrity problems are particularly troubling because they are difficult to detect, and once any integrity problem is discovered, it tends to cast doubt on the accuracy and reliability of all the other data on the system.  The paranoia and time-consuming efforts to verify information that an integrity attack induces may be more damaging than the attack itself.12  Although attacks that compromise the integrity of data have been rarer than the widespread confidentiality and availability problems,13 they have occurred.  One significant example occurred with the alleged U.S. and Israeli cyberattack against Iranian nuclear facilities.14  The operation, known as “Stuxnet,” infected Iran’s Natanz nuclear facility with malware and caused nuclear centrifuges to spin out of control, rendering them nonoperational.15  In addition to the physical damage, the code recorded the centrifuges’ normal operation, and then while the centrifuges spun out of control, it “sent signals to the Natanz control room indicating that everything downstairs was operating normally,” a feature that one U.S. official called “the most brilliant part of the code.”16  According to the New York Times, the Iranians became “so distrustful of their own instruments that they . . . assigned people to sit in the plant and radio back what they saw,” and they shut down entire “stands” of 164 centrifuges, looking for sabotage, when a few centrifuges failed.17  U.S. officials have recently begun sounding warnings about integrity attacks.18

Low-tech redundancy primarily responds to concerns about availability and integrity.  Governments, businesses, other organizations, and individuals may be forced to rely on redundant low-tech capabilities or paper backups when high-tech systems are unavailable due to a cyberattack or when cyber intrusions undermine confidence in the reliability of high-tech methods or digital data.  Technological regression and arrest also respond to integrity concerns, and they may be used to address confidentiality problems as well.  In circumstances where, for example, the accurate functioning of a particular device is crucial, fear that the device cannot be secured—that its data will not remain confidential and that hackers could manipulate the data—may prompt a decision that the device should not be networked or that it should not be used at all.

B.  Giving Up So Far

Scattered examples of both low-tech redundancy and technological regression and arrest exist now and will become increasingly common in the coming years.

1.  Low-Tech Redundancy

One striking example of low-tech redundancy has emerged from the U.S. Naval Academy.  After a nearly twenty-year hiatus, the Academy has resumed teaching midshipmen to navigate by the stars due to concern about vulnerabilities in the systems that the U.S. Navy currently uses for navigation.19  The old-school navigation training will soon be expanded to enlisted personnel as well.20  The advent of GPS drove the abandonment of celestial navigation training in the 1990s.21  As Lieutenant Commander Ryan Rogers explained, the Navy “went away from celestial navigation because computers are great . . . . The problem is . . . there’s no backup.”22  Knowledge about celestial navigation now serves as the backup.  While experts have raised significant concerns about the security vulnerabilities of GPS,23 “you can’t hack a sextant.”24

Another example of a shift to nondigital redundancy involves voting machines.  In the wake of the controversy about “hanging chads” in the 2000 presidential election, jurisdictions across the United States moved to modernize their voting equipment, including by adopting electronic voting machines, such as direct recording electronic (DRE) machines.25  Almost immediately, computer scientists raised concerns about security vulnerabilities in electronic voting machines that could be exploited to tamper with election results.26  Some jurisdictions responded to the security concerns by establishing low-tech redundancy: a paper record of each vote cast electronically.  In February 2003, Santa Clara County, which includes Silicon Valley, became the first U.S. county to purchase electronic voting machines that produce a voter-verified paper receipt.27  Later that same year, the California Secretary of State announced that beginning in July 2006, all electronic voting machines in California must produce a “voter verified paper audit trail.”28  Many other states have followed suit,29 adopting laws requiring a paper backup for ballots cast electronically.30  Some states have been slow to respond to security concerns: Only in 2015 did Virginia decertify WINVote touchscreen voting machines, which suffered from numerous severe security flaws and produced no paper backups.31  Other states still use vulnerable voting machines without paper backups.32  Concerns about hacking of voting machines have become increasingly urgent in light of the alleged Russian hacking of the Democratic National Committee and the release of stolen information in an apparent attempt to influence the presidential election.33

A more quotidian example of low-tech redundancy is printing hard copies of important records or treasured photos.  The last several years have seen a dramatic rise in ransomware—malicious software that encrypts a computer’s hard drive and renders the information on it inaccessible unless the victim pays the attackers (often in Bitcoin) to restore access.34  Ransomware strikes not just individuals, but increasingly businesses, including hospitals,35 which have paid to restore access to electronic systems.36  Even Vint Cerf—one of the “Fathers of the Internet” and currently Google’s “Chief Internet Evangelist”37—exhorted people to print important items.  In a 2015 speech, he warned, “If there are pictures that you really really care about then creating a physical instance is probably a good idea.  Print them out, literally.”38  The motivation for Cerf’s warning was not security so much as the march of technology and the possibility that future technology will lack the backwards compatibility necessary to read earlier file formats, effectively creating a digital “Dark Age” of inaccessible data.39  But the basic point is the same: paper copies of digital files provide low-tech redundancy that mitigates the risks of inaccessibility or compromised integrity.

2.  Technological Regression or Arrest

Examples of technological regression or arrest also run the gamut from issues of national security to corporate and consumer contexts.

In the wake of Edward Snowden’s revelations, Russia’s Federal Guard Service (FSO), which protects high-ranking Russian officials, reportedly ordered typewriters in an attempt to keep sensitive communications from being electronically surveilled.40  An FSO source explained to the Russian newspaper Izvestiya that “the practice of creating paper documents will expand.”41  The technological regression may not be limited to Russia.  A German member of parliament who heads a parliamentary inquiry into National Security Agency activities said in an interview that “he and his colleagues were seriously thinking of ditching email completely,” and when asked whether they considered typewriters, he replied, “As a matter of fact, we have—and not electronic models either.”42

Technological regression goes beyond communications technologies.  The rapid proliferation of networked devices that make up the “Internet of Things” is prompting cybersecurity concerns related to everything from medical devices43 to children’s toys44 to cars.45  One example of technological regression came to light in a 2013 60 Minutes interview with former Vice President Dick Cheney.  Cheney disclosed that “his doctor ordered the wireless functionality of his heart implant disabled due to fears it might be hacked in an assassination attempt.”46  Cheney’s revelation shows technological regression for one medical device, but regression on a broader scale might occur as a result of regulation or in the wake of an incident of patient harm from hacking of a medical device.

Similarly, consumers may drive demand for regression in some instances.  For example, German researchers in March 2016 released a study showing that twenty-four different models of cars from nineteen manufacturers are vulnerable to a “radio ‘amplification attack’ that silently extends the range of unwitting drivers’ wireless key fobs to open cars and even start their ignitions,” greatly facilitating car theft.47  Although consumers undoubtedly enjoy the convenience of keyless entry and ignition, cybersecurity concerns might push at least well-informed consumers to demand old school, physical car keys.48

While technological regression involves undoing a technological capability in response to security concerns, examples of technological arrest are characterized by a deliberate decision not to go high-tech—not to network a device, not to create a digital file—due to security concerns.49  For example, in the wake of the 2014 cyberattack on Sony Pictures, Hollywood studios are working to improve their cybersecurity.50  Some of the techniques involve using more sophisticated technology, like encryption, to secure digital copies of movie scripts, but other techniques involve technological arrest.  According to reports, “[t]he most-coveted scripts are still locked in briefcases and accompanied by bodyguards whose sole job is to ensure they don’t end up in the wrong hands.”51

The Apple-versus-FBI dispute over access to the iPhone of one of the San Bernardino shooters provides another technological arrest example.  In February 2016, a magistrate judge in the Central District of California ordered Apple to assist the FBI in accessing the iPhone by writing software that would, among other things, override a feature of the phone that caused it to auto-erase after ten incorrect attempts to guess its passcode.52  Apple raised many legal and policy objections to the court’s order,53 and one is essentially an argument for technological arrest.54  Specifically, Apple argued that the court’s order would require “Apple to design, create, test, and validate a new operating system that does not exist, and that Apple believes—with overwhelming support from the technology community and security experts—is too dangerous to create.”55  Apple cited the risks that the code would be leaked or stolen by hackers as a reason for its refusal to write the code in the first place.56

II.  Law’s Push and Pull

Numerous drivers have pushed the digital revolution and increased dependence on technology.  Businesses and governments adopt technology because it improves efficiency and gives them new capabilities.  Customers seeking convenience or just the coolest new device form a vast market for high-tech gadgets, mobile phones, and tech-dependent services.  Companies seeking to capture pieces of these highly lucrative markets rush products onto (often digital) shelves, fiercely competing with similarly situated firms.  Because of these interests, adoption of technologies often occurs before full consideration of their security implications.  As the examples of low-tech redundancy and technological regression and arrest show, demands for efficiency, convenience, and greater capabilities often lead to adoption first, careful consideration later.

Law and regulation are at least complicit in this situation.  Law often lags behind technology, only belatedly catching up to a technology’s implications and uses after the technology has been deployed.  But in some circumstances, laws and regulations are partially to blame for creating the situation in which dependence on technology outpaces efforts to secure it.  Government entities sometimes adopt technologies themselves without fully considering security problems.  Consider, for example, the electronic voting machines that jurisdictions across the United States approved and purchased without appreciating that the machines could be hacked, compromising the integrity of elections and voter confidence in the electoral process.  Another example is the federal government’s adoption of electronic processing of security clearance investigations, including electronic security clearance forms and digital fingerprints.57  This information was stored in a centralized database that China compromised in last year’s hack of the Office of Personnel Management.58  In the wake of the intrusion, the government reverted to hard copy security clearance applications, at least temporarily.59

Sometimes the government has also mandated or provided incentives for other entities to adopt technologies.  One example is digitization of medical records.  Passed as part of the American Recovery and Reinvestment Act of 2009,60 the Health Information Technology for Economic and Clinical Health (HITECH) Act61 “established incentive programs for eligible hospitals and professionals adopting and meaningfully using certified electronic health record . . . technology.”62  Regulations issued under the Act provide for incentive payments under Medicare and Medicaid to health care providers that meaningfully use electronic health records, but the regulations also provide for decreased Medicare payments to providers who fail to meet electronic health record standards.63  Although these laws were a well-intentioned attempt to improve efficiency and patient safety,64 they have also pushed hospitals toward dependence on digital records that are now subject to hacks and ransomware attacks.65

On the upside, the fact that laws and regulations are partly responsible for pushing toward digital dependency suggests that they may also be able to play a constructive role in pulling back from it in narrowly tailored and strategic ways.  Setting aside legal regulation to improve cybersecurity and the debates that accompany it, law and regulations could help to require or incentivize giving up on cybersecurity via low-tech redundancy or technological regression.

Examples of laws mandating low-tech redundancy are easy to imagine and in some cases already exist.  For example, the same jurisdictions that passed laws or ordinances requiring electronic voting could just as easily require all voting machines to produce a paper backup.66  Similarly, the laws and regulations that incentivize digitization of medical records could mandate or provide incentives for health care providers to produce periodic paper backups of at least some documents, for instance, patient allergy information.  Laws could similarly incentivize individuals to maintain low-tech capabilities.  Ownership of self-driving cars could be contingent on the owner obtaining a driver’s license for conventional cars.  None of these legal moves would abandon the advantages of digitization or new technology, but they would preserve low-tech redundancy that could be drawn upon if the high-tech options were destroyed, made inaccessible, or rendered untrustworthy.

Laws and regulations could similarly foster consideration of technological regression.  For industries that are already regulated, agencies could require a risk assessment for networked devices that would specifically evaluate whether the benefits of the networking outweigh the security risks.  Forcing explicit consideration of cybersecurity risk could push companies toward technological regression.  Examples in this category might be the Food and Drug Administration’s regulation of medical devices,67 and the National Highway Traffic Safety Administration’s regulation of motor vehicle safety.68

In addition to federal government agencies, state laws and regulations might provide other avenues for prompting companies to consider technological regression explicitly.  States and state attorneys general in particular have been active in protecting consumer privacy through mechanisms such as state data breach notification statutes.69  Many have consumer-protection-focused mandates as well.70  States might issue guidance or provide incentives or mandates for consumer product companies that sell networked devices to consider device security.  They might even require product manufacturers to preserve the ability to de-network consumer products and to disclose to consumers how to de-link devices from the Internet.  To be sure, such a requirement would pose practical challenges, including how to provide the information to consumers and whether de-linking instructions could be communicated in a way understandable enough for the average consumer to follow if he or she chose to do so.71

Another way law could incentivize consideration of low-tech redundancy or technological regression is by incorporating such consideration into the standard of care for what constitutes reasonable cybersecurity.  For the last decade,72 the Federal Trade Commission (FTC) has brought administrative actions against companies with weak cybersecurity protections for customer data.  The FTC actions are based on Section 5 of the Federal Trade Commission Act’s prohibition on “unfair or deceptive acts or practices in or affecting commerce.”73  The Commission’s authority to regulate inadequate cybersecurity pursuant to Section 5 received a significant boost in 2015 when the Third Circuit Court of Appeals rejected a challenge to the FTC’s authority by Wyndham Hotels, which was the subject of an FTC enforcement action after three data breaches compromised credit card information of more than 619,000 customers and resulted in “at least $10.6 million” in fraudulent charges.74  Most of the FTC’s more than fifty enforcement actions have resulted in settlements,75 and the settlements focus on companies’ failure to employ basic cybersecurity practices, such as requiring secure passwords and keeping software updated.76

In the future, the FTC’s understanding of what counts as “unfair” practices could evolve to include maintenance of low-tech redundancy and consideration of technological regression.  For example, in response to a wave of attacks that renders customer services unavailable—think inability to access personal health tracking data or travel reservations—the Commission could come to regard a company’s failure to maintain low-tech redundancy in the form of paper backups or other means for continued customer access to data as an unfair practice.  In other words, the failure to maintain continuity of operations during a cyberattack—including through low-tech redundancy—could be understood to be an unreasonable cybersecurity practice and one that is unfair to consumers.

The FTC might also address technological regression as an extension of existing concerns about unnecessary collection of consumers’ data.  The FTC already advises businesses to avoid collecting personal information they do not need and to retain consumers’ personal data only for as long as there is a legitimate business need for the data.77  The FTC has highlighted these “data minimization” best practices specifically in connection with the Internet of Things.78  The Commission gave an example of a wearable device that tracks a health condition and has the ability to monitor the wearer’s physical location.79  The Commission suggests that until the company needs the geolocation information for a future product feature that would allow users to find medical care near their location, the company should not collect the location data.80  The FTC’s example could be understood to push for consideration of technological regression: The wearable technology company would turn off a feature of the device absent a business need that would justify collecting data and thereby putting it at risk of compromise.81  It is worth emphasizing that under the FTC’s current approach, the existence of a business need for data appears sufficient to justify its collection.  A full embrace of consideration of technological regression, on the other hand, might change the analysis so that the mere existence of a business need for data is not necessarily sufficient; rather, the reasonableness of a company’s practices could turn on a balancing between the cybersecurity risks of a technology and the benefits it provides to consumers.


The website of “I Am The Cavalry”—a “grassroots organization” focused on the intersection of computer security and public safety82—notes that “[a]s the question around technology is less-and-less ‘can we do this’ we must more-and-more be asking ‘should we do this.’”83  As the examples of low-tech redundancy and technological regression suggest, sometimes that answer should be “no.”  Going forward, legal institutions from executives to legislatures to regulatory agencies should consider whether low-tech redundancy should be maintained alongside high-tech capabilities and digital data and whether in limited circumstances, the convenience, efficiency, and other benefits of a technology might not overcome the cybersecurity risks that it poses.

Discussions of technological progression don’t often end in discussions of technological regression.  But maybe they should.

[1]. See Matthew Wall, Big Data: Are You Ready for Blast-Off?, BBC (Mar. 4, 2014), []; What is Big Data?, IBM, [].

[2]. See, e.g., [small-caps]Dave Evans, CISCO, The Internet of Things: How the Next Evolution of the Internet Is Changing Everything[end-small-caps] 3 (2011) (predicting that the number of connected devices will rise from 12.5 billion in 2010, to 25 billion in 2015, to 50 billion in 2020); [small-caps]Intel, A Guide to the Internet of Things: How Billions of Online Objects Are Making the Web Wiser[end-small-caps], (predicting growth of connected objects from 2 billion in 2006, to 15 billion in 2015, to 200 billion in 2020); Philip N. Howard, Sketching Out the Internet of Things Trendline, [small-caps]Brookings Inst.[end-small-caps] (June 9, 2015), [] (aggregating various predictions about the growth of the Internet of Things).

[3].  [small-caps]Adam Segal, The Hacked World Order: How Nations Fight, Trade, Maneuver, and Manipulate in the Digital Age[end-small-caps] 49 (2016) (“The history of cyberspace and cyber conflict is short, but the pace of history is rapidly accelerating.  Whereas years or months once separated notable cyberattacks, now they come almost weekly, if not sometimes daily.”).

[4].  The strategy of low-tech redundancy is not necessarily limited to the cybersecurity context.  The New York Times recently reported that to address security concerns posed by drones, European officials are considering using trained eagles to intercept drones that appear to threaten, for example, airports or public gatherings—“a low-tech solution for a high-tech problem.”  Stephen Castle, Dutch Firm Trains Eagles to Take Down High-Tech Prey: Drones, [small-caps]N.Y. Times[end-small-caps] (May 28, 2016), [] (quoting Sjoerd Hoogendoorn, the co-founder of the eagle program company, Guard From Above).  The drone-hunting eagle program may be an example of low-tech redundancy: Dutch police detective chief superintendent Mark Wiebes explained to the Times that “subject to a final assessment,” eagles are “likely to be deployed soon in the Netherlands, along with other measures to counter drones,” such as “jamming drone signals.”  Id.

[5]. See, e.g., Presidential Policy Directive/PPD-21--Critical Infrastructure Security and Resilience, [small-caps]White House[end-small-caps] (Feb. 12, 2013), [] (“The term ‘resilience’ means the ability to prepare for and adapt to changing conditions and withstand and recover rapidly from disruptions.  Resilience includes the ability to withstand and recover from deliberate attacks, accidents, or naturally occurring threats or incidents.”).

[6].  “High-tech” and “low-tech” are, of course, relative terms.  As newer, more sophisticated technologies are developed, today’s high-tech will become the future’s low-tech.  Today’s cars are high tech as compared to the Model T, which was high-tech as compared to horse-drawn carriages, but today’s cars will be low-tech when assessed against the future’s driverless cars.

[7]. See, e.g., [small-caps]P.W. Singer & Allan Friedman, Cybersecurity and Cyberwar[end-small-caps] 35 (2014); Chad Perrin, The CIA Triad, [small-caps]TechRepublic[end-small-caps] (June 30, 2008, 8:13 AM), [].

[8]. See Singer & Friedman, supra note 7, at 35 (discussing confidentiality).

[9]. See id. (discussing availability).

[10]. See Nicole Perlroth & Quentin Hardy, Bank Hacking Was the Work of Iranians, Officials Say, [small-caps]N.Y. Times[end-small-caps] (Jan. 8, 2013), []; see also Indictment, United States v. Fathi, No. 16 Crim. 48 (S.D.N.Y. Mar. 24, 2016), (charging seven hackers with ties to the Iranian government with crimes related to the distributed denial of service (DDoS) attacks on U.S. financial institutions).

[11]. See Singer & Friedman, supra note 7, at 35 (discussing integrity).

[12]. Id. at 129 (arguing, in discussing attacks that compromise the integrity of military information, that “[o]nly a relatively small percentage of attacks would have to be successful in order to plant seeds of doubt in any information coming from a computer.  Users’ doubt would lead them to question and double-check everything from their orders to directions. . . . The impact could even go beyond the initial disruption.  It could erode the trust in the very networks needed by modern military units to work together effectively . . . .”).

[13]. James R. Clapper, Director of National Intelligence, Worldwide Cyber Threats: Hearing Before the H. Permanent Select Comm. on Intelligence, 114th Cong. 5 (2015), [] (“Most of the public discussion regarding cyber threats has focused on the confidentiality and availability of information; cyber espionage undermines confidentiality, whereas denial-of-service operations and data deletion attacks undermine availability.  In the future, however, we might also see more cyber operations that will change or manipulate electronic information in order to compromise its integrity (i.e., accuracy and reliability) instead of deleting it or disrupting access to it.  Decisionmaking by senior government officials (civilian and military), corporate executives, investors, or others will be impaired if they cannot trust the information they are receiving.”).

[14]. See David E. Sanger, Obama Order Sped up Wave of Cyberattacks Against Iran, [small-caps]N.Y. Times[end-small-caps] (June 1, 2012), [].

[15]. Id.

[16]. Id.

[17]. Id.  Another example of an integrity attack occurred in 2007 when Israel bombed a Syrian nuclear facility.  A cyber attack on Syrian air defense computer systems caused Syrian radar operators to see false images—ones that did not reveal that Israeli planes had entered Syrian airspace—and “the air defense network never fired a shot.”  [small-caps]Singer & Friedman[end-small-caps], supra note 7, at 127.

[18]. See, e.g., James R. Clapper, Director of National Intelligence, Worldwide Threat Assessment of the US Intelligence Community: Hearing Before the S. Armed Serv. Comm., 114th Cong. 2 (2016), [] (“Future cyber operations will almost certainly include an increased emphasis on changing or manipulating data to compromise its integrity (i.e., accuracy and reliability) to affect decisionmaking, reduce trust in systems, or cause adverse physical effects.”); Clapper, supra note 13, at 5 (warning about future attacks on data integrity); Katie Bo Williams, Officials Worried Hackers Will Change Your Data, Not Steal It, [small-caps]The Hill [end-small-caps](Sept. 27, 2015, 8:00 AM), [] (reporting on congressional testimony by National Security Agency Director Michael Rogers warning about future cyberattacks aimed at undermining the integrity of data).

[19]. Andrea Peterson, Cybersecurity Fears Are Making U.S. Sailors Learn to Navigate by the Stars Again, [small-caps]Wash. Post[end-small-caps] (Oct. 14, 2015), [].

[20]. Tim Prudente, Seeing Stars, Again: Naval Academy Reinstates Celestial Navigation, [small-caps]Capital Gazette [end-small-caps](Oct. 12, 2015), [].

[21]. Id.

[22]. Id.

[23]. See, e.g., Jose Pagliery, GPS Satellite Networks Are Easy Targets for Hackers, CNN (Aug. 4, 2015, 6:54 AM), [] (reporting on research that compromised a commercial GPS tracking network); Michael Peck, The Pentagon Is Worried About Hacked GPS, [small-caps]Nat’l Int.[end-small-caps] (Jan. 14, 2016), [] (explaining the U.S. military’s concerns about GPS jammers and physical attacks on GPS satellites and detailing the military’s efforts to develop backup systems).

[24]. Prudente, supra note 20.  For additional arguments about low-tech redundancy in the military context, see [small-caps]Jacquelyn Schneider, Ctr. for a New American Security, Digitally-Enabled Warfare: The Capability-Vulnerability Paradox[end-small-caps] 9 (2016), [] (arguing that the U.S. military should improve its resiliency by “acquiring technologies with both digital and manual capabilities” and increasing training for “back-up manual procedures”).

[25]. For an overview of the post-2000 shift in voting equipment through 2005, see Daniel P. Tokaji, The Paperless Chase: Electronic Voting and Democratic Values, 73 [small-caps]Fordham L. Rev. [end-small-caps]1711, 1717–41 (2005).

[26]. See id. at 1734–36.

[27]. See Receipts Sought for Votes Cast Electronically,[small-caps] N.Y. Times[end-small-caps] (Feb. 26, 2003), [].

[28]. News Release, Sec’y of State Kevin Shelley, Secretary of State Kevin Shelley Announces Directives to Ensure Voter Confidence in Electronic Systems (Nov. 21, 2003), [].  For an overview of the history of this shift in California, see [small-caps], Back to Paper: A Case Study[end-small-caps] 8–10 (2008), [].

[29]. Some states have not engaged in low-tech redundancy (that is, paper backups), but instead have engaged in technological regression, abandoning electronic voting altogether in favor of a return to paper ballots.  See, e.g., [small-caps],[end-small-caps] supra note 28, at 4 (discussing New Mexico’s return to paper ballots).

[30]. See Cory Bennett, States Ditch Electronic Voting Machines, [small-caps]Hill[end-small-caps] (Nov. 2, 2014), [] (reporting that “[m]ore than 60 percent of states” have passed laws requiring paper trails for electronic votes); see also The Verifier—Polling Place Equipment—Current, [small-caps]Verified Voting[end-small-caps], [] (showing various types of polling place equipment used by states).

[31]. See Kim Zetter, Virginia Finally Drops America’s ‘Worst Voting Machines’, [small-caps]Wired[end-small-caps] (Aug. 17, 2015, 7:00 AM), [] (cataloguing numerous security problems with the machines, including insecure encryption, default passwords, and software that had not been patched since 2005).

[32]. See Grant Gross, A Hackable Election? 5 Things To Know about E-Voting, [small-caps]Computerworld[end-small-caps] (July 22, 2016, 8:57 AM), [] (highlighting security concerns stemming from some states’ continued use of electronic voting machines without paper backups).

[33]. Bruce Schneier, By November, Russian Hackers Could Target Voting Machines, [small-caps]Wash. Post [end-small-caps](July 27, 2016), [] (“Longer term, we need to return to election systems that are secure from manipulation.  This means voting machines with voter-verified paper audit trails . . . . I know it’s slower and less convenient to stick to the old-fashioned way, but the security risks are simply too great.”).  For details on the evidence that Russia is responsible for the Democratic National Committee hack, see, for example, David E. Sanger & Eric Schmitt, Spy Agency Consensus Grows That Russia Hacked D.N.C., [small-caps]N.Y. Times[end-small-caps] (July 26, 2016), []; Patrick Tucker, How Putin Weaponized Wikileaks to Influence the Election of an American President, [small-caps]DefenseOne[end-small-caps] (July 24, 2016), [].

[34]. Ransomware on the Rise: FBI and Partners Working to Combat This Cyber Threat, FBI (Jan. 20, 2015), []; Kim Zetter, Hacker Lexicon: A Guide to Ransomware, the Scary Hack That’s on the Rise, [small-caps]Wired[end-small-caps] (Sept. 17, 2015, 4:08 PM), [].

[35]. John Woodrow Cox et al., Virus Infects MedStar Health System’s Computers, Forcing an Online Shutdown, [small-caps]Wash. Post [end-small-caps](Mar. 28, 2016), [] (noting that the infection of the Medstar computer system forced “hospital staff . . . to revert to seldom-used paper charts and records”).

[36]. See Sean Gallagher, Patients Diverted to Other Hospitals After Ransomware Locks Down Key Software, [small-caps]Ars Technica[end-small-caps] (Feb. 17, 2016), []; Richard Winton, Hollywood Hospital Pays $17,000 in Bitcoin to Hackers; FBI Investigating, [small-caps]L.A. Times[end-small-caps] (Feb. 18, 2016), [].

[37]. Vinton G. Cerf, [small-caps]Research at Google[end-small-caps], [].

[38]. Sarah Knapton, Print out Digital Photos or Risk Losing Them, Google Boss Warns, [small-caps]Telegraph[end-small-caps] (Feb. 13, 2015, 11:06 AM), [] (quoting Vinton Cerf).

[39]. Id.

[40]. Miriam Elder, Russian Guard Service Reverts to Typewriters After NSA Leaks, [small-caps]Guardian[end-small-caps] (July 11, 2013), [].

[41]. Id. (quoting a source inside the Federal Guard Service).

[42]. Philip Oltermann, Germany ‘May Revert to Typewriters’ to Counter Hi-Tech Espionage, [small-caps]Guardian[end-small-caps] (July 15, 2014, 1:04 PM), [] (quoting Patrick Sensburg).

[43]. See, e.g., News Release, U.S. Food & Drug Admin., FDA Outlines Cybersecurity Recommendations for Medical Device Manufacturers (Jan. 15, 2016), []; Cybersecurity Vulnerabilities of Hospira Symbiq Infusion System: FDA Safety Communication, [small-caps]U.S. Food & Drug Admin.[end-small-caps], (July 31, 2015), [] (recommending that hospitals cease using the Hospira Symbiq Infusion System because cybersecurity vulnerabilities allow the pump to be remotely accessed and thus allow unauthorized users to change the dosage the pump administers).

[44]. See, e.g., Whitney Meers, Hello Barbie, Goodbye Privacy? Hacker Raises Security Concerns, [small-caps]Huffington Post[end-small-caps] (Nov. 30, 2015, 4:45 PM), [].

[45]. See, e.g., Sean Gallagher, Highway to Hack: Why We’re Just at the Beginning of the Auto-Hacking Era, [small-caps]Ars Technica[end-small-caps] (Aug. 23, 2015, 8:00 AM), [].

[46]. Andrea Peterson, Yes, Terrorists Could Have Hacked Dick Cheney’s Heart, [small-caps]Wash. Post[end-small-caps] (Oct. 21, 2013), [].

[47]. Andy Greenberg, Radio Attack Lets Hackers Steal 24 Different Car Models, [small-caps]Wired [end-small-caps](Mar. 21, 2016, 10:33 AM), [].

[48]. As just one example, in response to an article about the radio amplification attacks, Shawn Henry, the president of cybersecurity firm CrowdStrike Services, tweeted, “My ignition key worked pretty well for the past 30 years.  Maybe we don’t need to incorporate tech into EVERYTHING?!”  Shawn Henry (@Shawn365Henry), [small-caps]Twitter[end-small-caps] (Mar. 23, 2016, 8:36 AM), [].

[49]. See, e.g., Darren Samuelsohn, GOP Shuns Electronic Ballots at Open Convention, [small-caps]Politico[end-small-caps] (May 1, 2016, 4:56 PM), [] (reporting that senior Republican party officials “rul[ed] out a change to convention bylaws that would allow for electronic voting on” presidential and vice presidential nominees due in part to concerns about hacking); Schneier, supra note 33 (arguing against Internet voting due to cybersecurity concerns).

[50]. Nicole Perlroth, Secrecy on the Set: Hollywood Embraces Digital Security, [small-caps]N.Y. Times [end-small-caps](Mar. 29, 2015), [].

[51]. Id.

[52]. Order Compelling Apple, Inc. to Assist Agents in Search, In re Search of an Apple iPhone Seized During the Execution of a Search Warrant on a Black Lexus IS300, California License Plate 35KGD203, No. ED 15-0451M (C.D. Cal. Feb. 16, 2016), [].

[53]. See Apple Inc.’s Motion to Vacate Order Compelling Apple Inc. to Assist Agents in Search, and Opposition to Government’s Motion to Compel Assistance at 14–35, In re the Search of an Apple iPhone Seized During the Execution of a Search Warrant on a Black Lexus IS300, California License Plate 35KGD203, No. CM 16-10 (SP) (C.D. Cal. Feb. 25, 2016) [hereinafter Apple Brief], [].

[54]. Id. at 2.

[55]. Apple Inc.’s Reply to Government’s Opposition to Apple Inc.’s Motion to Vacate Order Compelling Apple Inc. to Assist Agents in Search at 16, In re the Search of an Apple iPhone Seized During the Execution of a Search Warrant on a Black Lexus IS300, California License Plate 35KGD203, No. CM 16-10 (SP) (C.D. Cal. Mar. 15, 2016) [hereinafter Apple Reply Brief], []; see also Apple Brief, supra note 53, at 2 (arguing that the court’s order “compels Apple to create a new operating system—effectively a ‘back door’ to the iPhone—that Apple believes is too dangerous to build”).

[56]. Apple Reply Brief, supra note 55, at 19–20.

[57]. See Security Clearance Reform: Moving Forward on Modernization: Before the Subcomm. on Oversight of Gov’t Mgmt., the Fed. Workforce, and the District of Columbia, U.S. S. Comm. on Homeland Sec. & Governmental Affairs, 111th Cong. (Sept. 15, 2009) (statement of John Berry, Director, U.S. Office of Personnel Management), [] (describing federal government agencies’ adoption of electronic background investigation forms and digital fingerprint records as part of security clearance investigations).

[58]. See David E. Sanger, Hackers Took Fingerprints of 5.6 Million U.S. Workers, Government Says, [small-caps]N.Y. Times[end-small-caps] (Sept. 23, 2015), [] (reporting that the Office of Personnel Management hack, attributed to China, compromised personal information of 22 million people and fingerprints of 5.6 million U.S. government employees).

[59]. Billy Mitchell, OPM Reverts to Paper Forms During e-QIP Suspension,[small-caps] FedScoop[end-small-caps] (July 6, 2015, 5:20 PM), [].

[60]. American Recovery and Reinvestment Act of 2009, Pub. L. No. 111-5, 123 Stat. 115 (2009).

[61]. Health Information Technology for Economic and Clinical Health Act, Pub. L. No. 111-5, §§ 13001–13424, 123 Stat. 226 (2009).

[62]. Frank Pasquale, Grand Bargains for Big Data: The Emerging Law of Health Information, 72 [small-caps]Md. L. Rev. [end-small-caps]682, 708–09 (2013).  For an overview of the incentive programs, see Medicare and Medicaid EHR Incentive Program Basics, [small-caps]Ctrs. for Medicare & Medicaid Servs.[end-small-caps], [].

[63]. Medicare and Medicaid Programs; Electronic Health Record Incentive Programs—Stage 3 and Modifications to Meaningful Use in 2015 Through 2017, 80 Fed. Reg. 62,761, 62,765 (Oct. 16, 2015) (to be codified at 42 C.F.R. pts. 412 and 495).

[64]. See President Barack Obama & Vice President Joe Biden, Remarks by the President and Vice President at Signing of the American Recovery and Reinvestment Act, [small-caps]White House[end-small-caps] (Feb. 17, 2009), [] (explaining the American Recovery and Reinvestment Act as “an investment that will take the long overdue step of computerizing America’s medical records to reduce the duplication and waste that costs billions of health care dollars, and medical errors that cost thousands of lives each year”).

[65]. See supra note 35 and accompanying text.

[66]. See supra notes 28 and 30 and accompanying text.

[67]. See Overview of Device Regulation, [small-caps]U.S. Food & Drug Admin.[end-small-caps], [].

[68]. See Federal Motor Vehicle Safety Standards, 49 C.F.R. § 571 (2015).

[69]. See, e.g., Security Breach Notification Laws, [small-caps]Nat’l Conf. St. Legislatures[end-small-caps] (Jan. 4, 2016), [] (compiling state data breach notification laws).

[70]. See, e.g., Protecting Consumers, [small-caps]St. Cal. Dep’t Just.: Off. Att’y Gen.[end-small-caps], []; Consumer Protection, [small-caps]Att’y Gen. Tex. Ken Paxton[end-small-caps], []; Consumer Protection, [small-caps]Att’y Gen. Mark R. Herring[end-small-caps], [].

[71]. Some of these challenges occur with respect to privacy policies for connected devices, which do not even attempt to convey technical information about how to alter a device’s functions.  See Scott R. Peppet, Regulating the Internet of Things: First Steps Toward Managing Discrimination, Privacy, Security, and Consent, 93 [small-caps]Tex. L. Rev.[end-small-caps] 85, 140–43 (2014).

[72]. For an overview of the FTC’s history of data security enforcement actions, see [small-caps]Gina Stevens, Cong. Research Serv., The Federal Trade Commission’s Regulation of Data Security Under Its Unfair or Deceptive Acts or Practices (UDAP) Authority[end-small-caps] 7–8 (2014), [] (tracing the history of FTC data security enforcement actions beginning in 2006).

[73]. 15 U.S.C. § 45(a) (2012).

[74]. Fed. Trade Comm’n v. Wyndham Worldwide Corp., 799 F.3d 236, 242 (3d Cir. 2015).

[75]. [small-caps]Fed. Trade Comm’n, Start With Security: A Guide for Business: Lessons Learned From FTC Cases[end-small-caps] 1 (2015), [].

[76]. Id. at 4–5, 12.

[77]. Id. at 2.

[78]. [small-caps]FTC Staff Rep., Internet of Things: Privacy & Security in a Connected World[end-small-caps] 33–39 (2015), [].

[79]. Id. at 36.

[80]. Id.

[81]. [small-caps]Fed. Trade Comm’n[end-small-caps], supra note 75, at 2 (highlighting security risk to unnecessarily collected and retained data); [small-caps]FTC Staff Rep.[end-small-caps], supra note 78, at 34–35 (explaining that collection and retention of unnecessary data increases security risk because “[l]arger data stores present a more attractive target for data thieves”).

[82]. Executive Summary, [small-caps]I Am The Cavalry[end-small-caps], [].

[83]. Id.

About the Author

Assistant Professor, UCLA School of Law. For helpful comments, I am grateful to participants in the Program on Understanding Law, Science, and Evidence (PULSE) conference on “Imagining the Legal Landscape: Technology and the Law in 2030.” Thanks to Andrew Brown for excellent research assistance.