Among the storylines in the film is the use of facial recognition AI by law enforcement agencies. One particularly powerful scene shows a person having their face scanned by facial recognition technology just to buy a soda from a vending machine in China.
How bias is coded
Part of the discussion focused on how AI technology tends to favor the creators, who are often white men, and thus leaves marginalized groups behind.
“The same way that we see institutionalized bias in admissions to higher education institutes, job placement, immigration policies, and access to a host of things, we see the same bias played out in algorithms,” Brown said. “[The] documentary does a stellar job of highlighting how technology is biased” and arguing that the government might need to step in and regulate AI in some way. “We often forget that the person doing the coding has their own assumptions, stereotypes, and bias, which carries into code.”
When asked by Brown what inspired her to create the film, Kantayya said she read Weapons of Math Destruction: How Big Data Increases Inequality and Threatens Democracy by Cathy O’Neil, which sent her down a rabbit hole.
“I didn’t realize the extent to which we as human beings are already outsourcing our autonomy to machines,” she said. “We’re trusting them to make such important decisions … I realized that these same systems we’re trusting so implicitly have not been vetted for racial bias or gender bias.”
One of the main messages of the film, according to Kantayya, is that bias in AI can leave historically vulnerable communities open to further discrimination and brutality, as in the case of police surveillance. “Harm is what’s at stake,” she said. “These things infringe on our civil rights.”
“It’s such an important conversation for us to be having at all scales—locally, nationally, globally—about the level of impact that algorithms and AI can have on humans and society,” Smith said.
Noting a storyline in the film about residents of a Brooklyn apartment building teaching themselves about AI and fighting back against facial recognition software, Smith said she often thinks about that imbalance in the public’s understanding of computer science and algorithms.
Action items
Kantayya said that data scientists carry almost too much responsibility, having to judge what is ethical and fair and what might impact someone’s civil rights. What needs to happen, she said, is a radical inclusion of more people, including ethicists and policy makers. “Inclusion has to be a part of the picture and the process,” she said. “[Diverse] voices need to be in the rooms where these decisions are being made.”
When it comes to training the next generation of computer scientists, Kantayya suggested integrating more liberal arts education into a computer science education and making it more of an interdisciplinary study to “change the pipeline of how these subjects are being taught.”
As the computer scientist on the panel, Smith agreed, pointing out that she didn’t remember learning any of that material as an undergraduate student.
Another way to diversify the field, Kantayya added, is a massive investment and outreach campaign to bring more women and people of color into STEM. The more the field reaches low-income high school students and shows them how AI is relevant to their lives, “the more competitive these industries will be,” she said.
For WPI students who go on to work at big tech companies, she implored them to speak up and be voices of dissent. Listening to people who are radically different from us “makes technology more fair, ethical, and competitive,” she said.
Closing the discussion, Jean King, Peterson Family Dean of the School of Arts & Sciences, thanked Kantayya for combining arts and sciences into one conversation and for giving everyone a lens with which to look at these topics. “All professors and students have a role to play,” she said.
Responding to a question from King about the next steps in this journey, Kantayya reiterated the importance of national literacy around algorithms and AI. “Maybe your institution could be a part of spreading that curriculum,” she said.
Aside from increasing literacy, “I’ve seen how everyday people can make a difference,” Kantayya said. “I think if we each do one thing, everyday people can make a difference when we care enough to make a difference.”
—Melanie Thibeault