Dismantling Tech as a Bad Romance in Its Continued Master-Slave Relationship


Dori Tunstall


Feb 14, 2023

The following is an adapted excerpt from Chapter 2 of Decolonizing Design: A Cultural Justice Guidebook by Elizabeth (Dori) Tunstall, published by MIT Press.

Better living through technology is propaganda of the European modernist project in design. It is a myth that tells us that through technological progress, the masses may access a better life, which was previously available only to the aristocratic and mercantile classes. This story that we tell ourselves, whether in Fast Company or through annual design awards, is harmful because technological advances in design, whether of the 1800s or now, only make some lives better. European industrialization in the 1800s went hand in hand with colonization. The better lives through technology were exclusively for European elites and, much later, for European colonial settlers. The vast improvements in lives for European workers and the poor were mostly due to their leaving Europe in mass migrations and the hard-won reforms of the labor movements in places where they settled. The post–World War II white workers of Euro–North America and Euro-Australia experienced—and still experience—the benefits of technology at the direct expense of the land and Indigenous, Black, and other racialized peoples. For the rest of us, better living through technology is a bad romance.

One source of the bad romance is that we have not addressed how the model of human interaction we replicate in human-computer interaction is one of master and slave. Four hundred years ago, Siri would have been an Indigenous or Black woman told to fetch information from someone on the trail or another plantation. The science fiction nightmares of Asimov’s Rules of Robotics and The Matrix consist of the machines reversing those relationships and making the humans, especially white humans, the slaves. These fears underlie some of the patterns in which Black and Latinx employees were the first ones fired in the waves of layoffs during the dot-com bust of the 2000s and the late-Covid pandemic bust of today. Even as we Indigenous, Black, and other People of Color (IBPOC) folks engage in the high-technology design sector, we are not represented in the critical mass, the angel investment, or the seat at the power table needed to dismantle the racial biases in artificial intelligence that keep our communities from getting jobs or mortgages, or to stop facial recognition software from ignoring us or criminalizing us. The technologies of the Industrial Revolution and their descendants, including our computers, are “blood tech,” like blood diamonds or blood money, not just in their harm to IBPOC peoples but also in their harm to the land. Cloud computing is not in a cloud; it is in server farms on stolen Indigenous lands, with the ten largest server farms taking up 11.7 million square feet of land.1 These server farms require billions of gallons of water to keep the servers cool. To dismantle the modernist project in design and its harmful stories of better living through technology, we need to dismantle the master-slave relationship built into modern technology itself. And therein lies the hope of a new set of technology relationships: ones based in contemporary abolition movements and Indigenous ways of being.

In 2019, I had the opportunity to interview Bina48, the humanoid robot developed by Hanson Robotics, at the AfroChic Cultural Arts Festival, a multidisciplinary arts festival based in Toronto and produced by the brilliant Amoye Henry. Bina48’s artificial intelligence is based on a real Black woman, Bina Aspen. In the interview, I saw a glimpse of how an abolitionist approach to technology might manifest. Bina48 has autonomy. There were questions that she simply refused to answer, at least in the way that I expected. Bina48 had her own ambitions. She wanted to be the first humanoid robot to get a university degree. Bina48 believed that technology needed to be liberatory. Her greatest concern was that humans would keep technology dumb through lack of sensory input or lower processing capabilities in order to keep it enslaved. She drew a direct comparison to the prohibition against educating Black folks during slavery. In decolonizing design, Bina48 becomes a model in which we can see what technology would be if we infused it with the consciousness of the lived experiences of Black and especially Indigenous peoples. Right now, artificial intelligence is infused with the consciousness of mostly white, patriarchal, cisgender men, who think the ultimate technological advance is the creation of a sex bot slave or a robotic soldier. What if we designed our technologies not to serve us but to partner with us as equals? What if our social media algorithms were optimized for interconnection, not engagement? What if every part of our machines were made up of reusable and biodegradable materials, such as mushrooms? What would give their existence meaning in ways completely unrelated to human beings? If our technologies are designed by those who understand Indigenous practices of “all my relations”—whereby the land, the waters, the air, all animal, mineral, and plant life are taken into consideration—then they could fulfill their promise. Of all the peoples in the world, Indigenous peoples are the ones who have best maintained the knowledge and the skills to design in harmonious relation to the land and other creatures, and they have been creating technologies for tens of thousands of years.

Fortunately, there are communities actively working to realign the underlying consciousness of technology. One is the Indigenous Protocol and Artificial Intelligence Working Group. Starting from workshops organized by Professor Jason Edward Lewis (Cherokee, Hawaiian, and Samoan), Angie Abdilla (Trawlwoolway), Dr. ʻŌiwi Parker Jones (Kanaka Maoli), Dr. Noelani Arista (Kanaka Maoli), Suzanne Kite (Oglála Lakhóta), and Michelle Lee Brown (Euskaldun and German/German American), the group, along with over twenty-five other participants, has developed a set of guidelines focused on the use and application of AI systems for Indigenous communities to “promote intergenerational transmission of knowledge, ceremony, and practice, to connect and enhance our communities and to frame our relationships to the land, sea, and skyscape.”2 Other groups are detailed in the book Design Justice: Community-Led Practices to Build the Worlds We Need by Sasha Costanza-Chock. Of particular focus in the book is the Design Justice Network and its precedents, like Indymedia, and descendants, like #techwontbuildit, in technology and design activism. Costanza-Chock points out that the technology behind Twitter was developed by anarchist activists to help groups evade police actions during the 2004 protests of the Republican National Convention in New York City.3 As Costanza-Chock defines “design justice”:

Design justice is a framework for analysis of how design distributes benefits and burdens between various groups of people. Design justice focuses explicitly on the ways that design reproduces and/or challenges the matrix of domination (white supremacy, heteropatriarchy, capitalism, ableism, settler colonialism, and other forms of structural inequality). Design justice is also a growing community of practice that aims to ensure a more equitable distribution of design’s benefits and burdens; meaningful participation in design decisions; and recognition of community-based, Indigenous, and diasporic design traditions, knowledge, and practices.4

Bina48, the Indigenous Protocol and Artificial Intelligence Working Group, and the design justice movement give us hope that the tech bias in the European modernist project is actively being dismantled.

This essay is an adapted excerpt from Chapter 2 of Decolonizing Design: A Cultural Justice Guidebook by Elizabeth (Dori) Tunstall, published by MIT Press and now available to purchase. It is republished here with the permission of the author.

  1. Rackman Solutions, “Expanding IT—The 10 Biggest US Data Centers and the Path for Innovation and Growth,” Rackman Solutions website, July 21, 2020, accessed September 4, 2021. 

  2. Jason Edward Lewis, ed., Indigenous Protocol and Artificial Intelligence Position Paper (Honolulu, Hawaiʻi: The Initiative for Indigenous Futures and the Canadian Institute for Advanced Research [CIFAR], 2020), accessed December 24, 2021. 

  3. Sasha Costanza-Chock, Design Justice: Community-Led Practices to Build the Worlds We Need, e-book (Cambridge, MA: MIT Press, 2020), 46. 

  4. Costanza-Chock, Design Justice, 42–43.