When the PBS documentary Coded Bias started streaming on Netflix, many viewers were shocked to see the dangers of AI: facial recognition software often fails to recognize women, and it reads the faces of white men far more reliably than anyone else's. Where does that leave people of color?
The new documentary opens a huge can of worms. Not only does it reveal the gaping flaws in facial recognition software, it's also a showcase for some of the most powerful women in tech. Directed by Shalini Kantayya, the film features women leaders in the field like Joy Buolamwini, Safiya Umoja Noble and Meredith Broussard. Kantayya speaks to Forbes about protest, big tech and the future of AI.
This film shows us how AI impacts our lives right now. With facial recognition being used by police, what does this mean for protesters?
Shalini Kantayya: Our tech can infringe upon so many rights that our democracy is based on, especially our right to protest. The truth is, unless we have some laws that protect us, I feel that we don't live in a culture where we can opt out of these systems anymore. We shouldn’t have to check the terms and conditions of these tech platforms to make sure they don’t violate our civil rights, especially given that we’re increasingly required to use them to participate in society. If we want to hold on to our civil rights and our democracy, we really have to empower ourselves around these issues.
What do you think about the recent changes by big tech?
In June, we saw a change that we never thought possible, which is that IBM said it would get out of the facial recognition game. Microsoft said it would stop selling the technology to police, and Amazon, in a gesture of goodwill, said it would put a one-year pause on the sale of facial recognition technology to police. It was brought about in part by the integrity of the scientists in my film (Joy's work, Gender Shades, supported by Timnit Gebru and Deborah Raji, proved this stuff was racially biased), but also by the largest movement for civil rights and equality that we've seen in fifty years, on the streets of literally every city across the US.
The film has received great acclaim. What do you think people connect with the most?
I think people are making the connection between the inherent value of Black life and racially biased, invasive surveillance technologies that disproportionately impact those same communities. I owe those activists a debt of gratitude, because they have changed the way my film is received and shown that we are ready to have a national conversation about systemic racism. Racist, authoritarian surveillance tech has no place in policing without laws to protect citizens. This should not happen in a democracy.
What are the dangers of big tech’s immense growth in power throughout the course of the pandemic? How does Coded Bias shed light on these injustices?
Everything we love, everything we care about as citizens of a democracy is going to be totally transformed by artificial intelligence—in fact, is in the process of being transformed. AI systems are often the most invisible, automated, first-line gatekeepers to every opportunity we have, and they are rarely vetted for bias or even accuracy.
The way that big tech is entrenched in liberal communities and progressive politics goes largely unseen. Because these companies have a reputation for always pointing in the direction of progress, of a forward-thinking society, I feel we have a harder time challenging them. In this current pandemic, we're becoming even more reliant on these companies, and they've sort of created a system where it seems there's only one way to do technology.
What did it teach you?
Just as we need conscious checks on our own biases, facial recognition software needs them as well. I had no idea of the scope of invasive surveillance, the precision with which these algorithms can predict our behavior, or how vulnerable all of us can be to the predatory practices they enable. We need to look at how humanity gets lost when we prioritize efficiency at the cost of all else.
What drew you to filming Joy Buolamwini, and why is her work so groundbreaking?
I sort of stumbled upon the work of Joy Buolamwini. Other authors in the film include Cathy O'Neil, author of “Weapons of Math Destruction,” Safiya Umoja Noble, author of “Algorithms of Oppression,” and Meredith Broussard, who wrote “Artificial Unintelligence.” I basically fell down the rabbit hole of the dark underbelly of the technologies that we're interacting with every day.
What did you learn making this film?
What I learned in making the film, which stands on the foundation of probably three decades of scholarship, activism and research, mostly by women, people of color, and LGBTQ communities who have been speaking the truth about what's happening in Silicon Valley, is that these technologies have not been vetted for racial bias, for gender bias, or even for accuracy or fairness. And they exist in these black boxes that we can't examine as a society.
Was there a telling moment while shooting?
I had the experience of standing next to Joy, and the computer could see my face but could not see her face. Even in the film, I don't think it captures how I felt in that moment, because it really felt like, "When the Constitution was signed, Black people were three-fifths of a human being. And here we're sitting at a computer that's looking and doesn't see Joy as a human being, doesn't recognize her face as a face."
To me, that was a stark illustration of how racial bias can be replicated. I think you have to experience it viscerally, and that wasn't even a misidentification that leads to police or law enforcement stopping you, frisking you, or some other infringement on your civil rights. Just that visceral experience of not being seen has implications that we need to talk about more.