‘Search engines lie’ the theme of annual speech
Commitment to Community speaker exposes the impact of biased technology in address.
October 7, 2018
Dr. Safiya Noble, this year’s Commitment to Community (C2C) speaker, showed an audience of Hamline students, staff and faculty how search engines lie during her keynote address last Thursday night. C2C, an annual event started by Hamline students in 1996, is dedicated to starting conversations about equity, inclusion and social justice on campus.
Addressing a packed audience of primarily first-years at Hamline Church United Methodist, Noble, an assistant professor at the University of Southern California, shared how a 2010 keyword search for “black girls” sparked eight years of writing and research into how technology intersects with bias and social injustice.
The results of her search, she said, were overwhelmingly pornographic and sexual in nature.
“The question to me was why I didn’t have to add the word sex or the word porn,” Noble said.
Those identities, she said, were “synonymous with porn” according to the algorithms being used.
Those findings launched Noble’s study of how algorithms propagate stereotypes of racism and sexism, which led to the recent publication of her book, Algorithms of Oppression: How Search Engines Reinforce Racism, and provided the topic of her speech on Thursday night. The speech was also live-streamed to the Anderson Center, where more first-year classes gathered to watch.
A key finding of Noble’s research was that most users assume search engines and algorithms give a “credible and fair representation” and are “fact givers” and “arbiters of knowledge.”
One reason for this trust is that search engines deliver accurate results for everyday, mundane questions, like finding out when restaurants and stores open. However, the accuracy of algorithms is murkier when it comes to complex issues like race, sexual identity and bias, partly because Google is not a truly democratic provider of information.
Instead, content screening, ad buys and keyword optimization manipulate results, pushing information with a particular agenda to the top.
“There’s a host of values being demonstrated in what is screened and what is not,” Noble said. “If you take one thing away from my talk, it’s that Google is an advertising platform, not the library.”
In the Q&A session following her speech, Noble told students “you can’t do anything now, today” and instead encouraged them to use their votes and political mobilization to challenge algorithmic bias.
She pointed to past, highly publicized incidents of biased results, including dramatic differences in the results of searches like “three teenagers” when the word “black” or “white” was added. Other examples included searches for racial slurs against black people that returned results pointing to the White House during Barack Obama’s presidency.
Responses from companies like Google, she said, treated those incidents as a “glitch in the system that otherwise works perfectly.”
Noble asked students to question whether society should put so much trust in algorithms’ ability to screen or curate information in a fairly representative way. She said her goal is to provoke thinking of algorithms as more than “just math” or an objective tool for refining information.
“These are issues of our making and if anything, the technology we use will only increase those,” Noble said.
Responding to another student, who asked how future coders could help prevent the creation of biased algorithms, Noble emphasized the importance of a multidisciplinary education.
“You have to couple your computer studies with social issues,” Noble said. “You can’t design technology for society if you don’t know society.”