Harvard Researcher: Google-Generated Ads Show Racial Bias

UPDATED: February 20, 2013, at 10:35 a.m.

A Harvard researcher has found that names typically given to African Americans are more likely to be linked to a criminal record in Google-generated advertisements, both on the search engine itself and on the news site Reuters.com, to which Google supplies advertisements.

Latanya Sweeney, director of the Data Privacy Lab at Harvard, began her research after a colleague, government department fellow Adam Tanner, told her that his Google search of her name had generated an advertisement that read: “Latanya Sweeney: arrested.”

In disbelief, Sweeney began poking around online and found that advertisements from criminal records site InstantCheckmate.com incorrectly suggested that she had an arrest history. The two then plugged Tanner’s name into Google and, to the pair’s surprise, it generated a neutral InstantCheckmate ad that did not hint at a criminal record.

“Adam jumped to the conclusion that [the advertisements] were coming up on Black-sounding names,” said Sweeney. “I spent hours trying to show him that he was wrong and couldn’t.”

Thus began a research study, partially funded by Google, that involved extensive combing of the Internet and numerous databases. Sweeney’s research paper summarizing her findings is slated for publication in an academic journal.

First, Sweeney identified typically African-American names using a database of first names given disproportionately to babies of one racial identity over another. She then paired the first names with last names by identifying real professionals with academic qualifications—medical doctors, for example—and verified their racial identities with Google Image search results. Finally, using a sample of typically Caucasian and typically African-American names, she ran an analysis on the search results.

According to Sweeney, Google maintains that it cannot predict which advertisements—positive or negative—will be most popular, so advertisements are initially distributed at random. However, in searches of typically African-American names on Google and Reuters.com, Sweeney found that between 81 and 95 percent of the generated ads suggested an arrest.

“It is an interesting mirror of society,” said Sweeney, “that the Internet, which started out neutral, has begun to show racial bias.”

When asked for comment, Google spokesperson Aaron J. Stein wrote in an email that AdWords, Google’s profitable advertising product, does not engage in racial profiling.

“We also have a policy which states that we will not allow ads that advocate against an organization, person or group of people,” Stein wrote. “It is up to individual advertisers to decide which keywords they want to choose to trigger their ads.”

Reuters.com could not immediately be reached for comment on Sweeney’s study Wednesday evening.

Sweeney said there are two possible explanations for the seemingly biased results: Google’s ad-serving algorithm may be unintentionally skewed, or Google users may more frequently click on arrest-related advertisements when they appear alongside black names than white names.

Gary King, director of the Institute for Quantitative Social Science at Harvard, wrote in an email that isolating the cause of the skewed search results will determine the next steps for researchers.

“Laying out the patterns, as Latanya is doing, and then ascertaining their causes and effects, is very important,” King wrote.

—Staff writer Anneli L. Tostar can be reached at annelitostar@college.harvard.edu. Follow her on Twitter at @annelitostar.