
Google ads may be racially biased, professor says

A professor at Harvard University believes she’s uncovered evidence that race is sometimes used to determine which ads you see when you go online.

Based on thousands of searches, Latanya Sweeney, who runs the university’s Data Privacy Lab, concludes in a recent paper that there is “discrimination” based on race in the delivery of certain ads.

She found that Google searches using names that sounded black (Latanya) turned up strikingly different ads than when the search was done for a name that’s more typically considered white (Adam).

“Right now, we don’t know why it happens or whose fault it is,” she said.

This all started when Sweeney’s colleague, Adam Tanner, went online to search for a previous paper she had written. Alongside the search results was an ad from Instant Checkmate, a public records search company. It said, “Latanya Sweeney. Arrested?”

Sweeney had never been arrested and wondered what was going on. So they did a search using Tanner’s name, and this time the ad said, “Looking for Adam Tanner?”

After a few hours of searching with different names, Tanner said he had figured it out. “Black-sounding names” got the “arrested” version of the Instant Checkmate ad, while names that sounded white got a more neutral “looking for” version.

Sweeney, a computer scientist, was skeptical and set out to show that this was not happening. In September and October of last year, she conducted 2,184 searches (on Google and using the search field on Reuters.com) using racially associated names, such as DeShawn, Darnell, Jermaine, Geoffrey, Jill and Emma.

The searches took place using different browsers, at different times of day, on different days of the week, with different IP and machine addresses, in different parts of the country. Sweeney focused on the ads generated for Instant Checkmate. They appeared more than those of any other public records search company, and they were the only ads to include the word “arrest.”

The results, published on Harvard’s Data Privacy Lab website, found that “black-sounding names” were significantly more likely to produce ads that suggested an “arrest” might have taken place regardless of whether the company had an arrest record associated with the name.

  • With searches for apparently black names on Reuters.com, when Instant Checkmate ads appeared, “arrest” was in the text 60 percent of the time. When names that appeared to be white were searched, “arrest” came up in only 48 percent of the ads.
  • The names that resulted in the highest percentage of ads with “arrest” in the text were DeShawn (86 percent), Darnell (84 percent) and Jermaine (81 percent). Those that had the most ads without the word “arrest” were Jill (77 percent) and Emma (75 percent).
  • The disparity was smaller, but still significant, with searches on Google.com. The word “arrest” was used 92 percent of the time for black names and 80 percent of the time for white names.

“I was surprised,” Sweeney said. “I really did not expect the pattern to hold and then to do so with that much statistical certainty shocked me.”

Reuters told NBC News it does not generate the ad results that appear on its website. The company said they are “wholly determined” by Google’s AdSense program.

In a statement sent to NBC News, Google said:

"AdWords does not conduct any racial profiling. We also have a policy which states that we will not allow ads that advocate against an organization, person or group of people. It is up to individual advertisers to decide which keywords they want to choose to trigger their ads."

Instant Checkmate also insisted its advertising is not connected to race in any way:

“As a point of fact, Instant Checkmate would like to state unequivocally that it has never engaged in racial profiling in Google AdWords, and that we have absolutely no technology in place to even connect a name with a race. The very idea is contrary to our company's most deeply held principles and values.”

Why is this happening?
Sweeney suspects there are two main possibilities:

  • Instant Checkmate did something, such as providing ads suggestive of arrest disproportionately for black-identifying names.
  • Google’s algorithms simply reflect society’s values. That is, people are more likely to click on such an ad when the suggestion of an arrest is paired with a name that sounds African-American, and the ads that are clicked the most appear most often.

“Whatever the reason, I think it will be interesting to know what the answer is,” she said.

Sweeney and Tanner are conducting more research that should be out in the spring.

So what’s the harm in this?
We all know the Internet enables advertisers to deliver targeted ads based on your search history. It’s one thing to assume you’re a sports fan or just had a baby – and even that creeps out some people. But what if the ads served up were based on race in some way?

Just imagine that someone is searching your name – your boss, a potential employer, friend, someone you just started dating, a financial institution or potential landlord – and they see an ad that intimates you might have a criminal record?

More Info:

Read Professor Sweeney’s Paper: Discrimination in Online Ad Delivery

Samples of the ads are available online in a slideshow format.

Herb Weisbaum is The ConsumerMan. Follow him on Facebook and Twitter, or visit The ConsumerMan website.