Facial Recognition, with Safiya Noble and Tawana Petty

Even if you think you understand the way technology affects your life, this episode is an eye-opener.

The first in our five-part series on Global Digital Rights Challenges, this powerful conversation between Safiya Noble of C2I2 at UCLA and Tawana Petty of Data for Black Lives helps illuminate the ways technology and facial recognition are shaping our lives – for better and for worse.

Transcript

Tawana Petty: One of the running themes was that we feel watched, we don’t feel seen. We feel like our every movement is tracked and targeted and that we’re leaving this trail of data everywhere we go, not for our benefit but for our detriment. A lot of people think of data and technological systems as these systems that will only impact you if you have access, but we know that whether or not you have access to those systems, those systems have access to you.

Natalie Monsanto: This is the Promise Institute Podcast, and I’m Natalie Monsanto, the Institute’s Communications and Development Manager. I’m very excited to welcome you to the first of our five-part special series on Global Digital Rights Challenges. This episode discusses facial recognition and entrenching racial bias.

The series was produced in partnership with UCLA Law’s Institute for Technology Law and Policy and features striking conversations about the way the relationship between technology and human rights is playing out around the world. As digital considerations entwine with and blur more of the core functions in our lives, conversations like these are becoming all the more imperative. 

Stay tuned after the episode to hear about ways to support work like this, and to follow us on social. As we start off, the voice you’ll hear first is our Assistant Director, Jess Peake, introducing our speakers.

Jess Peake: I’m extremely excited to introduce our two speakers. Dr. Safiya Umoja Noble is an Associate Professor here at the University of California, Los Angeles, in the Department of Information Studies, where she serves as the Co-Founder and Co-Director of the UCLA Center for Critical Internet Inquiry, also known as C2I2, which is one of the co-sponsors today.

Safiya also holds appointments in African-American Studies and Gender Studies. She is a Research Associate at the Oxford Internet Institute at the University of Oxford and has been appointed as a commissioner on the Oxford Commission on AI and Global Governance. She’s a board member of the Cyber Civil Rights Initiative, serving those vulnerable to online harassment, and serves on the NYU Center for Critical Race and Digital Studies Advisory Board. She’s also the author of a best-selling book on racist and sexist algorithmic bias in commercial search engines, entitled Algorithms of Oppression: How Search Engines Reinforce Racism, published by NYU Press. Safiya will be speaking today with Tawana Petty. Tawana is a mother, social justice organizer, youth advocate, poet and author. Tawana is intricately involved in water rights advocacy, data and digital privacy rights education, and racial justice and equity work.

She’s the National Organizing Director at Data for Black Lives; the former Director of the Data Justice Program at the Detroit Community Technology Project; Co-Founder and former Co-Lead of Our Data Bodies; a convening member of the Detroit Digital Justice Coalition; a Digital Civil Society Lab Non-Resident Fellow at the Stanford Center on Philanthropy and Civil Society; and the Director of Petty Propolis, a black women-led incubator for artists primarily focused on cultivating visionary resistance through poetry, literacy workshops, anti-racism facilitation and social justice initiatives. Tawana was named one of 100 Brilliant Women in AI Ethics in 2021. So please join me in welcoming Tawana and Safiya for this really interesting conversation. Safiya, I’ll turn things over to you.

Safiya Noble: Okay. Thank you so much. I am so excited, Tawana, to get to talk to you today. I mean, this is maybe a highlight of 2021. Just the work and the space and the imagination that you have for black people in Detroit and all over the world, the space that you’re holding for all of us, is great. And so I just want to say thank you for being in this conversation with us, and I’m looking forward to working on issues of injustice as they face black people and vulnerable people with a variety of different kinds of technologies that are weaponized against our communities. How did you get to this space of these kinds of conversations? I just am curious about more of your origin story.

Tawana Petty: Yeah. Thank you for that, first of all, I’m honored to be in conversation with you as well. I know you probably don’t want me to do this, but I want everybody to get your book. It’s right here in front of me.

Safiya Noble: Don’t do this. 

Tawana Petty: I have to because there are so many nuggets in here that are really a roadmap for organizing on the ground. I’m not sure if you see this book that way, but I definitely see this book that way. 

I grew up in Detroit, I’ve been here my entire life. I’ll be 45 at the end of the month, and for my entire life, Detroit has suffered under the weight of a dominant negative narrative. It’s been pervasive throughout media images, music, newspaper articles, films, memes (once we got to the world of social media). Detroit has just had this dominant, negative, pervasive, violent narrative that has really painted us with a pretty terrible broad stroke.

There’s a psychological impact to growing up under that type of narrative. I grew up being taught that if I wanted to be anything or do anything with my life that I had to get out of here, that there was nothing here to help you be a more vibrant, contributory human being. I knew something was wrong with that.

You know, there is a reason why folks, at different points of my life, were flooding into here of different racial demographics while black residents here were being told we had to escape. So, you know, I started to interrogate that a bit as I got older and I’m a poet, so I was already using my poetry since the age of seven, essentially, to kind of write what I thought Detroit was, you know, which was contradicting what everything around me was telling me Detroit was.

Now, I say poetry is visionary resistance, but it started off as a poetic response and then it kind of moved into “how can I use my voice to counter narratives that make me feel dehumanized, make my neighbors feel dehumanized, make my educators feel dehumanized”, because I even had teachers in the classroom saying, like, you know, we don’t have what we need to properly educate you, and so we’re doing the best we can.

I started to use every talent that I feel like I was gifted to try to push back on those narratives and create an alternative reality. And later in life, I got connected with Allied Media Projects, who were looking at media for liberation and have been in Detroit for about 13 years now. I joined the Detroit Digital Justice Coalition, which I’ve been convening since 2016, and really started to think about technology as a way to add to that countering of the dominant narrative, whether it is being very transparent, as you talk about a lot in your book, about how we experience technological systems and data science and all these algorithms, etc. Then, you know, that turned into me joining Our Data Bodies as a Community Researcher, initially, and talking to my neighbors about how they were experiencing these systems. Then, that led to my becoming the Data Justice Director at the Detroit Community Technology Project, which was part of the collaboration with Our Data Bodies, so I moved from Community Researcher to Data Justice Director, and then ultimately over to Data for Black Lives, where I now serve as the National Organizing Director.

Safiya Noble: Oh my God, that’s such an incredible journey. When you talk about what it’s like to grow up under the negative narratives that get ascribed to black spaces, you know, I don’t even think that I had thought about the things that were inside of me as I was writing Algorithms of Oppression that are also linked explicitly to that. I grew up in Fresno, CA, which is midway between the Bay Area and LA. People in California think of it as a place where all the country people are. It is surrounded by sundown towns; many grand wizards of the KKK come out of the San Joaquin Valley, you know, to national leadership; and it’s a place where space is clearly defined along racialized terms. If you’re black and you grew up there, at least for me, when I grew up in the 70s and 80s, in some ways, those were really defined by “you might run into [Ku Klux] Klan, so you don’t go to that part of the town after sundown”. I mean, that was how I learned about this. I also had, at the same time, this narrative that was running in a place like Fresno about west Fresno, which is where all the black people are; it’s a historically black part of town. Those same kinds of narratives followed when I lived in Oakland and worked in San Francisco, with people, again, not black people, saying, oh, you don’t want to live in Oakland. This idea that the spaces that we occupy, where we come from, are the wrong spaces, the bad spaces, and that the geographic kind of boundaries of where we can move are set because those places belong to other people. That is in me as I hear you talk about Detroit. There’s such a long history, like other cities, where non-black people decided they would not bring their money back after our rebellions for civil rights and for justice and power in our communities. We see those neighborhoods and those cities and those communities abandoned.

And there’s something so powerful to me about the way you talk about reclaiming. Like I don’t have to go somewhere else to be great, right. I’m part of what makes Detroit great. You are, [there are] so many brilliant organizers coming out of Detroit. So, it’s these stories that shape us that are so important to how we frame the way we think about our work and the world.

I will just say that the fact that you’re an artist, I mean, I really want to double-click on that, because I feel like there’s so much about artists and humanists and social scientists who see things differently than the kinds of traditional people we’re used to working with in the tech space, who have a different conception of how they see our bodies.

So can we just get into Our Data Bodies and how that informed your thinking, especially bringing this kind of more humanistic lens to your work? 

Tawana Petty: Yeah, absolutely. I’m glad you mentioned LA and Fresno and growing up with that experience. Our Data Bodies covered three cities: Detroit, Charlotte and LA, and one of the running themes across those three cities was that we feel watched, we don’t feel seen. We feel like our every movement is tracked and targeted and that we’re leaving this trail of data everywhere we go, not for our benefit, but for our detriment. Our smallest mistakes are tracking us and following us with every movement that we try to make as human beings. A lot of people think of data and technological systems as these systems that will only impact you if you have access to a particular app, or if you have access to broadband internet. But we know that whether or not you have access to those systems doesn’t really have much to do with whether or not those systems have access to you.

Community members were consistently telling us that they felt like these systems were integrating or speaking with one another to make decisions about their lives, whether or not they had engaged with those institutions, governmental entities, or not. The data body is really like a make-up of all the ways that our being communicates with the world and how that communication makes decisions about how we experience our neighborhoods, how we experience institutions within our neighborhoods, how we experience the cities or states or townships or countries we live in and how decisions are made about those communities.

So, as an example, one that we lift up in the Digital Defense Playbook: say you get an EBT card, which is this card that you can get from the government to help you with access to food. If you’re purchasing particular items for your family and there is this trend of other residents purchasing similar items for their families, then there might be a decision made whether or not to invest in fresh food in your neighborhood or a grocery store in your neighborhood, but it’s not giving a full picture. Let’s say you only have a liquor store or a gas station that sells food within a walkable distance in your community, and you’re living in an impoverished neighborhood. That repeat information of “well, these people only buy hot potato chips and frozen meats or those types of meals, so they’re not going to buy fresh food” doesn’t account for the fact that there is no fresh food within walking distance of your neighborhood. The data body is contributing to making these really uninformed decisions about how we experience society.

Safiya Noble: Yeah 

Tawana Petty: In general, it replicates these harms a lot of times. The flip side of that is you’re living in a more wealthy neighborhood and you tend to be at a fresh food market every day, or all of your neighbors are getting fresh produce. Then they might make a decision to add yet another fresh food grocery store or institution in your neighborhood based on those patterns. But, that’s a wealthier neighborhood with access to those things. So the data body has a trail, a stream of information that it creates about you based on how you function. 

Safiya Noble: Yeah, I remember being an undergrad at Fresno State and doing this study on neighborhood grocery stores and the lack of them. It happened to coincide, when I was doing this study, with a visit by some Cuban students who were coming to visit the United States. I remember taking them around west Fresno, where there was nothing but liquor stores, and going in and looking at the expiration dates on food, which of course we know was expired, old food that isn’t even consumable.

Tawana Petty: Or re-labeled after expired. 

Safiya Noble: -all the things that we know. Then driving, truly like, 15 minutes across town to the more affluent white neighborhoods.

It was so shocking, the difference between the bodega versus the Whole Foods-on-steroids space, that the students were in complete disbelief. They, I mean, just couldn’t even conceptualize that this was happening within a 15-minute drive from the black neighborhood to the white neighborhood. I think about those kinds of legacies that predate what Chris Gilliard calls digital redlining; what we’re talking about is a continuation of those kinds of redlining practices that have been here since the inception of the country, and then get concretized, remade, with a new baseline, as if that new baseline of data is the truth.

Tawana Petty: Right. 

Safiya Noble: -that doesn’t account for these kinds of profound disparities that exist all over the country economically, and of course that are profoundly racialized. So, your work is so important here in these conversations. What we share in common, as we look at the people for whom these technologies are allegedly working.

Yet, we know that the people who experienced them experienced them in these profoundly discriminatory ways. Apparently, profoundly is my word of the day. Sometimes I have a word. 

Tawana Petty: It’s accurate. 

Safiya Noble: I hear myself saying it too many times. I want to tell you one other thing that I heard. I was at this conference called the Big Boulder Initiative, I don’t know if you’ve ever heard of it. It’s kind of like one of those conferences where the tech leadership in Silicon Valley, you know, like they go to the woods or they go to these desert conditions or to the mountains, and then they plan how they’re going to run society and they share all their wares and all their new tech with each other and they geek out on it. I somehow got into this place to talk about my book, and of course, I broke the meeting like I sometimes do, and the person who spoke right after me was a very senior official in the FBI. Then I knew I was in the wrong spot, but also he said, “as far as the US government is concerned, your data profile is who you are.”

That was one of the most chilling phrases I have ever heard anyone utter because I understood the consequences of what that meant. That all these ways in which we’ve been captured and put into surveillance systems is read like the truth in the eyes of government officials, other law enforcement, or others.

Let’s talk about what it means to be captured, misread, or misunderstood by these systems. It’s interesting, the word “recognition” too because that’s such a loaded word to assume that there is some type of human recognition happening when our faces are dot-mapped by some software. What’s your work been around facial recognition and the organizations that you’re working with, and what do you think are some of the most important things we should be thinking about right now?

Tawana Petty: Yeah. Ooh, that’s so loaded. I’m glad you escaped from that god-awful gathering. 

Safiya Noble: I had no business being in that meeting, I don’t even know. Let me tell you what, they were kind of giving these pitches about facial recognition. One of them, I remember so distinctly, was a facial recognition technology that you could point at a crowd of a hundred thousand people and that would give you hyper-accuracy about who those people are. Now, of course, this was happening years ago, before researchers were even, I mean, I gave this talk a long time ago. I don’t even know if the book was out yet. Of course, we know those technologies have been adopted by law enforcement and used at Black Lives Matter protests, at Occupy protests, at all kinds of lawful, First Amendment-protected protests and gatherings. These get framed as technologies of recognition, efficiency and opportunity by one crowd. At the same time, we know this kind of more detrimental effect. I feel like there’s a lot for us to unpack about how we came to know the harm, and I definitely want to know what you’re learning from community organizers about that, because I do think that community organizers and people like you have pushed the policymakers so hard to think about bans and moratoriums and other interventions.

Tawana Petty: Yeah. I want to start with the irony that it would be an FBI representative talking about our data profiles. You know, we know the Bureau of Investigation, under Hoover, which then became the FBI, and then COINTELPRO, the counterintelligence program, was surveilling social justice activists, you know, like Marcus Garvey, and then using some of those same agents to surveil the Black Panther Party. But one of the things that I’ve also learned is that these agents are not politicized on the history of the FBI, that we know more about the history of the FBI than the agents who come through the FBI. Similarly with law enforcement: law enforcement officers don’t know the history of law enforcement. We know more about the history of law enforcement than actual officers within those institutions, and so they say a lot of contradictory things in addition to the behavior that they impose upon our communities.

But it’s been tremendously difficult to get these surveillance technologies out of predominantly black communities. I believe New Orleans is the only predominantly black community which has been successful in getting some form of a ban instituted; the other 20, 25 to 27-plus communities that have succeeded have been predominantly white. Detroit has not succeeded in getting rid of this technology. We’ve succeeded in getting some policies put in place with some supposed repercussions for law enforcement officers who abuse the technology. I use abuse very loosely, because the use of the technology at all is abuse. We have an ordinance in place that is supposed to increase transparency, but we have not been successful in getting a ban. I will say one of the things that I’m deeply satisfied with is how heightened the political education has been around this topic. When I first discovered that Detroit was ramping up mass surveillance, it was actually at the time that we started our research with Our Data Bodies.
Detroit was ramping up what’s called Project Green Light, and so it started off as eight or nine gas stations that stay open late, that were going to have these flashing green lights that were monitored 24 hours a day by law enforcement at a real-time crime center. Thinking about a real-time crime center is like the Batcave, right? Where you have law enforcement officers in this cave, looking at all these screens and watching over the city so that they could protect us. So, it was going to be eight or nine gas stations that stayed open late. That perked my ears up, but it didn’t send me into a panic immediately; I was just starting this research to get to know how community members were experiencing data systems. But very quickly, those eight or nine gas stations moved up to over 100 locations, and then within a year from that, it was 257. Then within three years it was 500, so now we’re at over 2,000 Project Green Light surveillance cameras and three real-time crime centers. They’re no longer just connected to real-time crime centers; they’re now connected to police precincts as well as mobile devices that law enforcement officers are able to take home with them. And, they implemented face recognition.

So, we’re looking at, you know, an officer having the ability to kick his feet up on his couch and look at his mobile device and monitor the city using face recognition technology. That’s the reality we have in Detroit, in addition to cell phone tracking, surveillance, traffic cameras, drones, surveillance helicopters. Detroit is completely surrounded by mass surveillance and face recognition. We have tried everything from town halls to policy, moratoriums, ordinances, attempted bans. Most recently, we lost a campaign called Proposal P. Initially, it would have been a ban; it was of course watered down behind the scenes, but it had the provisions that actually do create safety in our communities, like truly affordable housing, which would consider the incomes in Detroit. Just as an example, we hear affordable housing being thrown around all over the United States, but in Detroit, affordable housing took into consideration those suburban communities we talked about earlier, which have a much higher income. So, it takes wealthier incomes into consideration when determining what’s affordable.

So, what our Proposal P, which was revamping the city’s charter, would have done was account for Detroit’s median income so that housing could be truly affordable. It would have prevented any water shut-offs, providing not only an assistance plan but affordable water. It would have provided reparations for the disenfranchisement that Detroiters have suffered for over a half-century, and so many other mechanisms would have been put in place to ensure the quality of life is increased so that quality-of-life crimes are reduced. Because our law enforcement officers, FBI agents, and other folks who are in these institutions, in these uniforms, are not politicized about the history of anti-blackness or the ways these systems came to be, they’re replicating these tremendous harms. I have found that anytime I’m able to sit one of them down and have a one-on-one conversation about this history, a lot of times minds are changed. I’m an abolitionist at heart; I’m working towards doing away with those types of institutions. I don’t want to be tracked, traced, targeted and incarcerated, right.

But, these humans inside these uniforms have to be politicized as much as we have to be politicized. I think that’s going to be the only way that we’re going to get some of those folks to transform the way they see our cities and our communities because the interactions last summer with these protests and uprisings are unacceptable.

Unacceptable. We saw heightened brutality. We saw militarized policing across our communities. We saw increased violence. Then, what came with that was heightened surveillance. Now, we’re looking at an infrastructure bill coming out of our government that’s going to invest millions upon millions of dollars in surveillance infrastructure.

It feels like a really challenging uphill battle at times. My hope is in the education of our community members and even folks that wear that uniform so that they understand that they’re contributing to a society, as you talk about in the book, that helps us to feed into our worst impulses. Because we’ve been convinced to fear these predominantly black, brown, indigenous communities. We think that the only type of solution that there is for safety, is heightened surveillance, mass incarceration, and these other systems that violate our human rights and civil liberties. 

Safiya Noble: Absolutely. Everything you’re saying is so right. I mean, I’m thinking about a recent story I read about women incarcerated in Los Angeles county. The majority of whom are there because of quality of life crimes, crimes from stealing diapers for your baby or food or petty crimes that are tied to survival, including surviving systems and people who are abusers and abusive and the response to the diminished quality of life.

What I think of too is just, like, the basic respect for the humanity of poor people and black people and women and children. It’s met with hundreds of millions of dollars of investment in surveillance technologies against us, rather than hundreds of millions of dollars of investment into education, affordable housing, jobs, healthcare, public media, all the kinds of public goods that would create the conditions for thriving instead of surviving.

I’m just going to say, I think black women, our work has always been in service of the most expansive models and conceptions of democracy and participation as one can get. I think of Indigenous peoples’ total worldviews and ways of living that are about mutual respect and mutual aid, that defy things like property control.

Those are ways of knowing that are so important at a time when the very planet is facing its own crisis. So I feel like this is so, so important. It’s not a political and intellectual exercise in the way that I feel like these conversations about facial recognition or technologies of surveillance and control often get reduced to, because it’s popular now to talk about these things. I feel like the communities that we’re in and the work that we’re doing is really about our own kids, our own siblings, our parents, ourselves. So I just wanted to acknowledge that and acknowledge you in that and acknowledge myself in that, because I also feel like we are often extremely undervalued and I appreciate UCLA giving us a space to have this conversation today.

I know how many spaces are foreclosed for our voices. With all that we know and what you know about what’s happening in Detroit, I don’t think it’s out of the realm of possibility to think that we should outlaw, in every way, surveillance technologies. I mean, they have become so normalized that people think like, “oh, it’s just how I get into my phone, I keep my iPhone more secure”, but I just wonder what are the interventions that you see that if policymakers or people could make a difference, what would you orient them and point them toward working on now?

Tawana Petty: Yeah, wow, you said so much in such a short window, right. You know, I’m sitting here, and I told you this before the call, I’m sitting here in a hotel because my city invested something like $50 million in surveillance infrastructure.

Yet, half the city now doesn’t have power. Every time it rains, the same neighborhoods flood over and over again. There’s this lack of priority in seeing us as fully human and tapping into that Indigenous practice that we’re familiar with. We know what makes us safe. We know that being able to turn to one another, to see one another, to have our basic needs met is safety, and yet, and still, over and over again for decades, they have continued to invest in the things that do not create safety, that prioritize the security mindset.

One of my late mentors, Grace Lee Boggs, would always talk about what time it is on the clock of the world. I wrote this article several years ago about what I could potentially find in common with this 99-year-old woman who was always asking, “what time is it on the clock of the world?”, and within that article, I wrote about the horseshoe crab. There’s a lot of irony in this, because now the horseshoe crab has been looked to for our vaccines, to keep us alive during COVID. But I wrote about what it would mean to stop surveilling, to stop monitoring this horseshoe crab that has figured out how to live millions and millions of years without human interaction and to just thrive and be, and how, when it is disrupted and moved into another area for our experimentation, it refuses to mate. The horseshoe crab will not mate if we uproot it from where it lives and put it under our microscope for our benefit. Humans are no different when we’re poked and prodded and tracked and traced and surveilled.

We have to figure out how to function under that type of social control. That’s how I feel in Detroit every day; there are green lights, the flashing green surveillance lights, everywhere. If you happen to live somewhere where one is outside of your bedroom at night, you’re not going to get any rest. There’s just the knowledge that at nearly every institution you walk by, except in the more affluent neighborhoods within Detroit, you are under constant targeted surveillance. I will back up and say, even in the more affluent neighborhoods, because a lot of people have Ring doorbells on their homes and want to protect their homes from the “undesirable” neighbors that they don’t really want in their neighborhood. I say that with air quotes, but these systems are being programmed to further disenfranchise us. Look at who the “suspicious characters” are that folks are reporting on a neighborhood app, as an example, where it’s supposed to be about community safety.

You see easily that the racial tones and behaviors are replicated. Detroit has been an experiment. I feel like we’ve been horseshoe crabs for so long, and that we’ve been poked and prodded at, where now there’s this newer language that says Detroit is coming back. We’re worthy of investment now because we’re on fresh water and folks want to see what they can get out of Detroit. We don’t want to be the horseshoe crab. We don’t want to be poked and prodded for the benefit of folks who are looking at us now as a potential investment; we want to be seen as fully human, and mass surveillance has never created safety, and it sure as heck doesn’t create an environment where you can feel like you’re seen and fully human.

I went all the way around the block, because this situation is so connected to how we feel in our dignity and humanity. It’s such a complicated situation, but it’s going to take all of us speaking out against this and getting clear on what it means to be safe. We know what it means to be safe, and it has nothing to do with hyper-policing, mass incarceration, hyper-surveillance, face recognition, and all those other technologies that law enforcement agencies are lobbying for.

Safiya Noble: Yeah, absolutely. 

Our communities are the market opportunity for these companies that sell these types of technologies. Our exploitation is one more market opportunity.

It’s so important that we’re making legible what we’re talking about when we’re talking about technology and data and systems and even surveillance. I mean, sometimes I think about what words mean and whether we should do away with certain words or expose these words for what they are.

Data is one of those kinds of words that really flattens what we’re talking about. If we said, “data is surveilling your every move and selling it to companies who can then make a profile about you and engage with you on terms that you will never be able to control,” maybe that’s a better, longer, and more accurate way of thinking about data.

I love what you’re saying because it makes really explicit that surveillance is tied directly to the erosion of quality of life and the erosion of our humanity. Indeed, no one benefits from that, not just the people who are hyper-surveilled. I want to invite questions. Estefania says, “I’d love to hear about the implications of surveillance tech on individual behaviors, such as self-surveillance.”

Of course, when policing becomes so normalized, community members can replicate these processes to the point that, of course, we know this is happening in such micro ways. Every year I ask my students, how many of you went to a protest this year about anything? It’s typically like some Gen X-er in the back of the classroom who’s coming back to school.

She’s like, yeah girl, of course. And I ask the students, you know, what gives, and they say, I’m afraid if I go to a protest, I won’t be able to get a job, or there’ll be a picture of it. I’m like, well, what about the fact that that’s one of your constitutional rights? Then they’re like, I still need a job.

So, you know, I think about this relationship between self-surveillance and self-policing that actually has us retreat from our civil rights, rather than push into an expansion of civil rights. That’s also a dimension of what I think so many of these projects do: they reframe our mental models about what freedom is, fundamentally, because the surveillance technology has kind of overdetermined the degree to which we believe we are free, and then, of course, we act accordingly. So I’m wondering, what do you think about this question? Or do you want to add anything?

Tawana Petty: All of the same things you just said. I know, particularly with the expansion of Project Green Light in Detroit, there are institutions that I love that I no longer visit. One of those is my favorite grocery store, which is right up the street; I go out of my way not to go to it because it’s a Green Light location now.

And so, just this idea that no matter what I do, I’m going to be on some officer’s mobile device or on a screen at some real-time crime center, being monitored while I go pick up peas from the supermarket, is a very disheartening feeling. It is a form of social control where you start to navigate your city differently, hoping you don’t run out of places you can go where you’re not surveilled. That, coupled with face recognition and BriefCam, which is another system that they use in Detroit, where you’re followed from one place to the next, and the idea that you could be misidentified, which Detroit has boasted two known cases of misidentification, that you could be picked up and arrested for something you didn’t do. It is deeply troubling, and it is a reality that I live every day and many people in Detroit live every day.

Safiya Noble: Absolutely. Anhel is asking if you’ll tell us the story of Ms. Robinson’s experience at a Detroit skating rink. Does this ring a bell for you, organizing against retailers?

Tawana Petty: Yeah, absolutely. So major shout-out to Fight for the Future for organizing a campaign to remove face recognition from retail stores. In addition to that, the campaign has been expanded to remove face recognition from public spaces in general. This was a 14-year-old at a roller rink in suburban Michigan, just outside of Detroit, who went to skate one night. Her parents dropped her off at the skating rink so that she could have fun with her friends. Under the guise of checking her temperature for COVID, they snapped an image of her face and then told her that she was another child who had been in an altercation several months earlier at the skating rink and that their system had identified her at 97%. So they put her out into the street at night to basically fend for herself until someone could pick her up.

That’s another example of how pervasive these systems are: you’re going to get your image taken while they’re supposedly checking your temperature, and they’re scanning your face. The institution, the skating rink, basically was just like, oh, if we made a mistake, you know, then we apologize, but no real apology. None of the parents who send their children to the skating rink had any idea that face recognition was a component of their visit, and so it’s so pervasive, it’s everywhere. And they’re thinking of suing this skating rink, and there are two lawsuits currently against the Detroit Police Department for the misidentifications of Robert Williams and Michael Oliver.

Safiya Noble: Yeah. You know, with every institution that adopts these different kinds of technologies and software, where does the accountability lie? Are the retailers, government institutions, schools, and universities accountable and responsible? Who in those institutions is responsible for the adoption of these technologies?

And of course, the software systems and their makers eschewing responsibility for their faulty products. We have the research, of course. We know from the Gender Shades studies, from the work of many Black women, in fact, Joy Buolamwini, Timnit Gebru, Deborah Raji, you know, that Black women’s faces are the least likely to be identified correctly in facial recognition systems, all the way through to the kind of work that you are doing and that your colleagues are leading in Detroit. We know that this is pervasive, and I think you’re right, there is no measure of accountability. This to me seems like an extremely important space and place. I think of this, of course, in terms of whether it’s the Federal Trade Commission cracking down on these kinds of faulty products that cause consumer harm, or whether it’s civil rights enforcement, with these things being seen as encroachments upon our civil rights. I mean, there are many different ways we can think about accountability.

In the few minutes that we have left, I want to create a space for thinking about other kinds of imaginaries, and this is where I want to just leave room for you to talk a bit about your incubator. I think we could talk, you and I could talk all day about the harms, and one of the things that I realize is that we do talk all day about the harms. Then I look around and I see that the men of Silicon Valley are able to have hundreds of millions of dollars and beautiful spaces and lots of time to imagine a world that they want to see, and the work that we get to do is to imagine how to stop it or intervene upon it or dismantle it.

We don’t have any money or resources or space to imagine the kinds of worlds we want, worlds that aren’t completely constrained and limited to pushing back on what flows down upon us. So, I want to ask you about the world that we could imagine, because I want to do what you’re doing, which is to make more space. So I just want to hear: how are you doing it, so we can do it too?

Tawana Petty: Thank you so much, and I’d like to just briefly say, go to d4bl.org and look at our Data Capitalism micro-site and our No More Data Weapons campaign if you want additional resources. But thank you so much for asking about the artist incubator.

I’ve been a poet since I was seven years old, and that was the first way that I was able to articulate a different reality about living and being in a city that was being so demonized. I’ve taken that opportunity as an artist to incubate other artists, and so through Petty Propolis, I do an annual art festival and artists’ retreat in historic Idlewild, Michigan, which is one of the historic Black neighborhoods that was considered kind of a resort, where W.E.B. Du Bois and other brilliant Black minds and artists would go when they couldn’t go to other places, to kind of kick back, relax, unwind, and incubate their thoughts. I’m following in that tradition. I invested in some property there, I bring 30 artists and innovators and herbalists from Detroit to have several days of unwinding and co-creativity, and then I do a festival, free to the community, over a two-day period. Of course, we haven’t done it in the last two years because of COVID, but we will be back next year if everything goes well. That is something that is deeply in my passion. We also do workshops and fellowships for young artists, to kind of bring them into this visionary-resistance mindset of how we can start to create that reality that we all deserve.

The reason why I started off with the focus on Detroit is very much for the reasons that I indicated earlier. My entire life, Detroit has been considered kind of like a sh*thole city, even before other places were being demonized by the government, by the last administration. I don’t know what folks’ political leanings are, but I will tell you that when I hear “Detroit is coming back,” I hear “Make America Great Again.”

I want people to understand why those terms are complicated. No matter what side you’re on, if you use a statement like “Detroit is coming back,” then you negate the humanity of the people who never left. I’m not tackling political understandings; I’m tackling a narrative that dehumanizes a community that has lived in and held up and held on to a city that was so disinvested in for a half-century. Some folks were forced to stay because, you know, they didn’t have the means to leave, and some folks just dug their heels in and said, this is my city and I’m going to see that it survives.

I’m one of those people who said, this is my city, I’m going to see that it survives, and I’m going to re-spirit the young people who come through here, because I don’t want them to feel like I felt growing up, being told that I could only be a vibrant human being if I left here. So that is what I use my artist incubator for: to increase literacy and the literary arts, and to create a space where our community members can be re-spirited and co-create together.

Safiya Noble: Well, you know, I’m re-spirited just being here with you, getting to really spend some time with you. I can’t wait till I get to spend time with you again. I feel really grateful for your work, and thank you to everybody who came and participated and gave witness and space to this conversation with us today.

Where should people find you, for those who want to stay connected with you?

Tawana Petty: Thank you. d4bl.org, that’s Data 4 Black Lives. We have social media, Instagram, Twitter; I’m not personally on Facebook, but there is a Facebook page as well. And for Petty Propolis, there is a Twitter page where you can keep up with some of the things that we hope to be doing over the next several months, so help us re-spirit and co-create together.

Thank you so much. I’m deeply honored to be in this discussion with you today. 

Safiya Noble: Thanks. Thanks, Tawana. 

Natalie Monsanto: A huge thanks to the Center for Critical Internet Inquiry for co-sponsoring this event, and to our spectacular guests. Follow Tawana Petty on Twitter, she’s @combsthepoet, while Safiya Noble is @SafiyaNoble. Please be sure to check out the Institute for Technology Law and Policy; they’re @UCLAtech on Twitter. They’re also the new kids on the block and happen to be our partners in this series and a lot of our work in this area, plus they’re really great people. Last but not least, follow us on social, we’re @promiseinstucla, and please, if this episode was valuable to you, support our work: visit law.ucla.edu/supportpromise to make a donation at any level and help future conversations like these come to be. Subscribe to the show so you can be sure to catch the rest of this series as it’s released. Until next time, take care.