Emerging Digital Technology and the "Law of the Horse"


In the mid-1990s, Judge Frank Easterbrook warned that the topic of “property in cyberspace” was tantamount to a “law of the horse.” He cautioned that adopting a “new law” for cyberspace risked not only isolating technology from broader systems of law, but also undermining the hard work necessary to “put the law of the [tech] horse in the context of broader rules.”1 An emphasis on a discrete body of tech law for the burgeoning internet age, in other words, risked skipping right past the extant “unifying principles” that law already provided. In response, Lawrence Lessig countered that “cyberlaw” was no “law of the horse”; to the contrary, it could offer valuable and more generalizable lessons about “the limits on law as a regulator [of behavior] and about the techniques for escaping those limits.”2

Fast-forward some twenty-odd years to the nascent “online merges with offline,” or “OMO,” era.3 In a world where life is at once digital and physical, where we might walk down the street and say “hi” to our neighbor while simultaneously Snapping a pic and saying “hi” to our friend halfway around the world, do we have a firmer verdict on tech law’s status?

I. The Scope of Tech Law Questions

In the contemporary world, perhaps the question of whether tech law is wholesale different or discrete is a red herring. The practical reality is that code-based digital technology today is touching human lives, for good and for ill, in an ever-broader range of substantive domains. From risk assessment algorithms to digital data brokers to the Internet of Things and embedded devices to smart contracts and blockchain technologies, it is increasingly difficult to think of a sector or domain that is not affected by code.

These technical innovations affect myriad legal domains. Consider, for instance, the ways that algorithmic decisionmaking raises fundamental fairness and ethical questions and, particularly in criminal law, implicates constitutional values; how relying on “big data” for technical advances and commercial applications creates new security and privacy risks; how the Internet of Things may change our understandings of cognizable harm in tort law; or how blockchain-based “smart contracts” might alter our understandings of binding agreements. Tech law is everywhere.4 And if we want to move forward, together, then perhaps we should emphasize whether our overarching legal frameworks and policies support the progress we wish to see, given the directions that technology affords us. In other words, rather than worrying too much about whether we’re astride a horse or riding backseat in a self-driving car, let’s critically analyze the complex relationships among law, policy, and technology and distill lessons to forge the best paths.

The pieces in this series contend with the promise and perils of emerging digital technologies, or “disruptive technologies,” and assess the legal, policy, and human stakes. With a specific focus on “data-driven” artificial intelligence, or AI, and “code-driven” blockchain,5 the authors grapple with the manifold challenges of these disruptive technologies and their interactions with law and policy. These challenges not only span diverse substantive domains but also involve all three branches of government. Turning to the judiciary, Shaverdian and Park, respectively, consider how blockchain may call for changes in copyright doctrine6 and how algorithmic risk assessment is already affecting judicial practice in criminal cases.7 In legislative bodies, Ott evaluates how a state statute meant to provide public protections for digital consumer privacy may imperil companies’ ability to use blockchain and AI technologies.8 And in the executive branch, Fung contemplates whether blockchain might be a viable record-keeping solution for administrative bodies that directly provide public benefits.9

The authors’ analyses raise important questions about the relationship between, on the one hand, technological opportunities and risks and, on the other, the current impact and future potential impact of top-down statutes, administrative law, common law, or doctrinal interventions. Critically, these discussions are not arcane technical explorations; to the contrary, as these pieces reveal, choices about technology law and policy affect human beings and core human values. One important aspect of this evaluation is descriptive: How does the disruptive potential of an emerging digital technology fit within existing law and policy regimes—or not? And when does technical progress in one area interact with preexisting rules and regulations in unexpected ways? A second step is more analytical: When do these technologies so change the paradigm of current law or regulation that the underlying rules should change? And together, these questions raise a series of normative issues: How do we know if change is in order? Are calls for a different legal process or rule limited to a particular technical context, or do they also extend more generally to other areas of law and life? And how should we, as humans, respond to disruption—with its potential for progress and for harm—while preserving the systems that sustain our core values and ideals?

II. To Adapt or Not to Adapt?

By definition, “disruption” entails unsettlement. This unsettlement may be good or bad, but perturbation of an existing system comes with a choice: Adapt or maintain the status quo. When we consider an existing legal doctrine or practice, or contemplate a change in legislative or administrative law, when should disruption foster adaptation in the law—and to what extent—and when should we work to integrate the disruption within an existing structure or rule?

The authors in this series contend with this challenge, in part, by suggesting how it may be more constructive to approach this question with a narrower lens. There are at least two important sub-options within the “adaptation” category: We might update the concepts contained within an existing rule to better comport with a contemporary innovation, or we might change a law or doctrine wholesale. Shaverdian’s contribution on blockchain and copyright law’s “first sale doctrine” illustrates each of these options. His analysis pinpoints how “courts often conflate the idea of tangibility with rivalrousness” and underscores how digital assets may appear “intangible” in ways that “blind courts.”10 He then explains how “[b]lockchain offers, for the first time, the potential to create an intangible rivalrous asset.”11 Blockchain, then, may call for updating the legal concepts contained in a longstanding copyright doctrine through crisper delineation of the difference between “tangible” and “rival” goods and the elimination of “tangibility” as a proxy for “rivalrousness.” It may also call for wholesale updating of that copyright doctrine, as enshrined in law, to reflect this new technological possibility. What is at stake is not just the accuracy of a century-old doctrine in the modern era, but also the opportunity to remain true to copyright law’s underlying objectives of incentivizing creative productions, protecting the rights of the creator, and also permitting public enjoyment of authors’ works.

As we assess the potential need for change in the legal system, we must also consider the broader system within which a given intervention operates. If we adjust one element, what other values or existing laws might be enhanced or compromised? Ott’s analysis of the California Consumer Privacy Act of 2018 provides a practitioner’s perspective on commercial risks that may ensue when the state enacts new substantive law on consumer privacy. As he explores in detail, the statute enacts a much stricter set of consumer rights and procedural mandates, and compliance with these “requirements may create substantial obstacles to the wide-scale deployment of AI and blockchain technologies in the California and national markets.”12 Ott’s inventory of risks elucidates the challenge of defining progress—is it a matter of more substantive privacy protections for the public, or more space for companies to deploy emerging technologies and deliver their benefits to the public?

These pieces suggest how analysis of disruptive technologies as they operate in the world demands that we pay close attention to complex systemic interactions and tradeoffs. This is true not only at the highest level, when we consider whether we should wholesale update the law, but also when we make seemingly smaller choices about whether to adapt current law or policy to better accord with contemporary conditions, without changing the overall legal or policy infrastructure.13

III. The Human Stakes

The authors in this series also suggest a corollary lesson: We cannot answer questions about values in the abstract, but rather must do so with reference to the precise entities and individuals that a given choice is likely to affect. In this values analysis, moreover, we should bear in mind two key tradeoffs. First, as Ott’s work implies, when might the pursuit of one goal, such as consumer privacy, be at odds with another, such as commercial innovation? And second, when might robust protection of a fundamental right properly take precedence over the application of an available technological advance? Park and Fung each contribute to these debates, Park by interrogating the use of data-driven patterns in criminal justice sentencing and Fung by examining the potential use of code-driven blockchain in government agencies.

Though advanced technology may feel alien and mechanized to the lay observer, the digital technologies of AI and blockchain each rely on fundamentally human elements. As Ott and Park recognize, the presently dominant AI method, machine learning (“ML”), depends on statistical inferences drawn from extremely large datasets.14 The machine “learns” to identify patterns in data and then produces an algorithm that can be applied to new data to determine when similar patterns hold.15
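For readers unfamiliar with the mechanics, the train-then-apply pattern described above can be sketched in miniature. The following is an illustrative toy only, not any system discussed in these pieces: it "learns" a single numeric threshold from human-labeled examples and then applies that learned rule to new data. All names and data are hypothetical.

```python
# Illustrative sketch of the ML pattern described above: fit a rule to
# human-labeled training data, then apply that rule to new, unseen data.
# This toy "learner" finds one numeric threshold; real ML systems infer
# far richer patterns from far larger datasets.

def train_threshold_classifier(examples):
    """Learn a threshold separating label 0 from label 1.

    `examples` is a list of (score, label) pairs produced by humans; the
    "learned" rule is simply the midpoint between the two label groups.
    """
    zeros = [score for score, label in examples if label == 0]
    ones = [score for score, label in examples if label == 1]
    threshold = (max(zeros) + min(ones)) / 2
    # The returned function is the "algorithm" the machine has produced.
    return lambda new_score: 1 if new_score >= threshold else 0

# Hypothetical human-generated training data: (score, label) pairs.
training_data = [(1.0, 0), (2.0, 0), (3.0, 0), (7.0, 1), (8.0, 1), (9.0, 1)]
classify = train_threshold_classifier(training_data)

print(classify(2.5))  # resembles the label-0 group
print(classify(7.5))  # resembles the label-1 group
```

Note what the sketch makes concrete: the "algorithm" is wholly determined by the human-produced training data, which is precisely why skew in that data propagates into the outputs.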

Putting technical details to the side, the bottom line is the connection between human data inputs and ML outputs. As Ott explains, “AI technologies, in a very real sense, are composed of, and depend on, the data that they incorporate.”16 And as Park stresses, these “algorithms are trained on data produced by humans.”17 But as we know, human beings are far from perfect.18 Nor is our data perfect. It reflects “historic and structural biases” along fault lines such as race, gender, sexual preference, or other sensitive characteristics.19 And when it comes to using this data to build AI technologies, the “taint” of any entrenched social bias can infect algorithms with “runaway feedback loops” that reflect historical biases and “further skew data against marginalized groups.”20

In this way, technology’s reliance on data connects to central challenges about fairness, transparency, and accountability in the society we are creating. This debate, moreover, is anything but academic. Particularly in criminal law, algorithms are already being used to assist in judges’ decisions about life and liberty. Accordingly, as Park suggests, we should not deploy these tools without first considering their interaction with our overall judicial system.21 This is not to claim that human systems of justice are unbiased. Nonetheless, before we apply an algorithmic solution, we would do well to recognize that this choice will affect fundamental values such as constitutional due process and, indeed, the very idea of individualized justice—and it is imperative that our policy and legal structures reflect this reality.

In a very different domain—potential federal agency use of blockchain—Fung also reveals the relationship between fundamental due process rights, what a government actor owes to the people it serves, and the promise of a code-based technology. Blockchain technology is difficult to define with precision,22 and there are in fact many flavors of blockchain,23 but it generally refers to an “open, distributed ledger that can record transactions between two parties efficiently and in a verifiable and permanent way.”24 Though most commonly associated with cryptocurrencies in the popular consciousness, blockchain in fact represents a more general innovation that promises immutable, secure, and decentralized storage and transmission of information—whether the information in question is a digital asset exchanged for value, a series of commercial transactions tracked by a business,25 data about a creative work’s licensing permissions,26 or key information about persons applying for public benefits.27
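The “verifiable and permanent” quality of a ledger described above rests on a simple cryptographic idea: each record incorporates the hash of the record before it, so altering any earlier entry breaks every subsequent link. The following is an illustrative sketch only, with hypothetical names throughout; real blockchains add distribution and consensus mechanisms on top of this core structure.

```python
import hashlib
import json

# Illustrative sketch: a minimal hash-chained ledger. Each block binds its
# record to the hash of the previous block, so tampering with any earlier
# record invalidates the chain. Hypothetical names and data throughout.

def make_block(record, prev_hash):
    """Create a block whose hash covers both its record and its predecessor."""
    body = json.dumps({"record": record, "prev_hash": prev_hash}, sort_keys=True)
    return {"record": record, "prev_hash": prev_hash,
            "hash": hashlib.sha256(body.encode()).hexdigest()}

def chain_is_valid(chain):
    """Re-derive each hash; any tampering breaks the recomputed links."""
    for i, block in enumerate(chain):
        body = json.dumps({"record": block["record"],
                           "prev_hash": block["prev_hash"]}, sort_keys=True)
        if block["hash"] != hashlib.sha256(body.encode()).hexdigest():
            return False
        if i > 0 and block["prev_hash"] != chain[i - 1]["hash"]:
            return False
    return True

genesis = make_block("ledger opened", prev_hash="0" * 64)
chain = [genesis]
chain.append(make_block("transfer: A -> B", chain[-1]["hash"]))
chain.append(make_block("transfer: B -> C", chain[-1]["hash"]))

print(chain_is_valid(chain))                   # an untouched chain verifies
chain[1]["record"] = "transfer: A -> Mallory"  # attempted tampering
print(chain_is_valid(chain))                   # the broken link is detected
```

This also makes concrete the design tension Fung identifies: the very immutability the hash chain provides is what a “closed” blockchain must be engineered to override when a record legitimately needs correction.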

As Fung explores, the choice to implement a blockchain technology is not made in a vacuum. Rather, it demands a context-specific evaluation of the entity’s internal needs and external constraints. His analysis suggests how the government, which operates as a public, centralized record-keeper,28 may need to think especially carefully before implementing a technological intervention that would decentralize control of information. Though a public agency like the U.S. Citizenship and Immigration Services (USCIS) might look to blockchain technologies to permit uniform, efficient, and durable record-keeping, Fung contends that USCIS would need to use a closed blockchain to ensure that it could modify a record in the event that an applicant successfully appeals an adverse decision.29 This insight links an apparently technical choice—whether to adopt the blockchain innovation—to fundamental constitutional concerns. And in turn, these sorts of technical choices—such as whether to use an open or closed blockchain—interact with the way the technology operates and affect the reliability and authenticity of the blockchain-based solution itself.30 In other words, core values such as due process inform technical choices, which can subsequently alter the relative costs, benefits, and limitations of the technology in practice.




Disruptive technology, then, is more than a tired Silicon Valley buzzword. Digital innovations can be “disruptive” in the sense that they affect fundamental rights and public processes, for good and for ill. Given technology’s omnipresence in public and private life, it is only by balancing bedrock values and determining what principles or practices are non-negotiable that we can determine how to approach law and policy in the face of innovations. Our challenge is to construct systems of law and policy that are dynamic enough to account for core values, without moving so hastily that we erode the broader legal system or race right past the ways that our choices are affecting human lives.


1 Frank H. Easterbrook, Cyberspace and the Law of the Horse, 1996 U. Chi. Legal F. 207, 208.

2 Lawrence Lessig, Commentary, The Law of the Horse: What Cyberlaw Might Teach, 113 Harv. L. Rev. 501, 502 (1999).

3 See Kai-Fu Lee, Kai-Fu Lee on The Merging of Online and Offline Worlds, Medium (Nov. 29, 2017), https://medium.com/@kaifulee/kai-fu-lee-on-the-merging-of-online-and-offline-worlds-a590efd37d75.

4 Cf. James Grimmelmann, Internet Law: Cases and Problems 692 (8th ed. 2018) (“What if Internet law is no longer a ‘specialized area of law’ because all law is Internet law now?”).

5 Cf. Innovation of Legal Method, COHUBICOL, https://www.cohubicol.com/ (investigating “two types of computational law: 1. artificial legal intelligence or data-driven law (based on machine learning), and 2. cryptographic or code-driven law (based on blockchain technologies).”).

6 Phillip Shaverdian, Blockchain-based Digital Assets and the Case for Revisiting Copyright’s First Sale Doctrine, UCLA L. Rev. Disc.: Law Meets World (2019).

7 Andrew Lee Park, Injustice Ex Machina: Predictive Algorithms in Criminal Sentencing, UCLA L. Rev. Disc.: Law Meets World (2019).

8 Chris Ott, Destination Unknown: The Perilous Future of Blockchain and Artificial Intelligence Technologies Under the California Consumer Privacy Act of 2018, UCLA L. Rev. Disc.: Law Meets World (2019).

9 Alexander Fung, Blockchain Technology and the Government: Dealing With the Threat of Data Manipulation and Increasing Records Longevity, UCLA L. Rev. Disc.: Law Meets World (2019).

10 Shaverdian, supra note 6 (quoting Juliet M. Moringiello, False Categories in Commercial Law: The (Ir)relevance of (In)tangibility, 35 Fla. St. U. L. Rev. 119, 137 (2007)).

11 Shaverdian, supra note 6.

12 Ott, supra note 8 (internal footnote omitted).

13 Cf. Urs Gasser, Recoding Privacy Law: Reflections on the Future Relationship Among Law, Technology, and Privacy, 130 Harv. L. Rev. F. 61 (2017) (discussing three legal response patterns to address innovative technologies: subsumption within existing legal structures, internal legal innovation, and law reform).

14 See Ott, supra note 8; Park, supra note 7.

15 This description is simplified for clarity. For a more complete account, see generally David Lehr & Paul Ohm, Playing With the Data: What Legal Scholars Should Learn About Machine Learning, 51 U.C. Davis L. Rev. 653 (2017).

16 Ott, supra note 8.

17 Park, supra note 7 (citing Trey Guinn, Big Data Algorithms Can Discriminate, and It’s Not Clear What to Do About It, Conversation (Aug. 13, 2015), https://theconversation.com/big-data-algorithms-can-discriminate-and-its-not-clear-what-to-do-about-it-45849).

18 As the saying goes, “to err is human.” Alexander Pope, An Essay on Criticism (1711).

19 Park, supra note 7 (citing Latanya Sweeney, Discrimination in Online Ad Delivery, 11 ACM Queue 1 (2013); Mike Ananny, The Curious Connection Between Apps for Gay Men and Sex Offenders, Atlantic (Apr. 14, 2011), https://www.theatlantic.com/technology/archive/2011/04/the-curious-connection-between-apps-for-gay-men-and-sex-offenders/237340; Jennifer Langston, Who’s a CEO? Google Image Results Can Shift Gender Biases, U. Wash. News (Apr. 9, 2015), https://www.washington.edu/news/2015/04/09/whos-a-ceo-google-image-results-can-shift-gender-biases/).

20 Id. (quoting Danielle Ensign et al., Runaway Feedback Loops in Predictive Policing, 81 Proc. Machine Learning Res. 1 (2018)).

21 See id.

22 See Adrianne Jeffries, ‘Blockchain’ Is Meaningless, Verge (Mar. 7, 2018, 11:36 AM), https://www.theverge.com/2018/3/7/17091766/blockchain-bitcoin-ethereum-cryptocurrency-meaning.

23 For a taxonomy of the types of blockchains, see Fung, supra note 9.

24 Marco Iansiti & Karim R. Lakhani, The Truth About Blockchain, Harv. Bus. Rev. (Jan.–Feb. 2017), https://hbr.org/2017/01/the-truth-about-blockchain.

25 See id.

26 See Shaverdian, supra note 6 (“This [blockchain] technology provides ‘an opportunity to not only limit the number of copies as intended by the artist but also to create a unique non-fungible versions of the digital masterpiece.’” (quoting Elena Zavelev, How Blockchain Empowers the Digital Art Market, Forbes (Nov. 7, 2018), https://www.forbes.com/sites/elenazavelev/2018/11/07/how-blockchain-empowers-the-digital-art-market/)).

27 See Fung, supra note 9 (discussing possible government use of blockchain to “prevent[] digitally stored public records from being stolen and manipulated, and [to] increase[] the longevity of our records”).

28 See id.

29 Id.

30 See id.

About the Author

Alicia Solow-Niederman is a PULSE Fellow in Artificial Intelligence, Law, and Policy at UCLA School of Law. Her scholarship focuses on the ways in which emerging technologies interact with law as well as with political and social institutions and norms. She also teaches the seminar “Disruptive Technology” from which some of these pieces originated.
