Using recent research from data scientists and technologists, this Article argues that we are at a contradictory moment in history regarding the intersection of gender and technology, particularly as it affects lesbian, gay, bisexual, transgender, and queer (LGBTQ+) communities. At the very moment that the law embraces greater visibility regarding gender identities and fluidity, we see an even greater reliance on surveillance technologies that are flatly incapable of working beyond the binary of male and female classifications. These technological limitations become even more fraught today, when we face a greater degree of surveillance, gender-related and otherwise, than at any point in history. When a binary system of gender merges with the binary nature of code, the result fails to integrate LGBTQ+ communities, particularly nonbinary and transgender populations, erasing them from view.
Using insights from a wide range of studies on artificial intelligence (AI) technologies, including automated body scanners, facial recognition, and content filtering on social media, we argue in this Article that we need to grapple with the reality that the relationship between AI and gender is far more complicated than the law currently suggests. Technology companies, along with courts, colleges, and workplaces, must recognize that binary presumptions of male and female identity are outdated for many and often fail to capture the contemporary complexity of LGBTQ+ identity formation. The question for legal scholars and legislatures is how technology can and should respond to this complexity. In the final Parts, we discuss some of the legal implications of these surveillance technologies, looking at both law and the design of technology, and turn to some normative possibilities for developing greater equality and gender self-determination.