Triple jumper Voula Papachristou apologizes following her Twitter comments mocking African immigrants while trying to qualify for the 2012 London Olympic Games.
It’s no surprise that people say racist things on Twitter, especially with so many high-profile cases involving celebrities and famous athletes. But just how many tweets a day contain something racially insensitive?
About 10,000, according to U.K.-based think tank Demos. It used a filter to analyze 126,975 English-language tweets and then estimated that one in every 15,000 tweets contained a “racist or ethnic slur.”
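The study's approach (classify a sample with a keyword filter, then extrapolate a daily rate) can be sketched roughly as follows. This is an illustrative sketch, not the Demos methodology, which the article does not detail; the term list, sample, and daily tweet volume below are placeholders, not real data.

```python
# Minimal sketch of keyword-filter classification plus rate extrapolation.
# SLUR_TERMS and the tweet volume are hypothetical placeholders.
SLUR_TERMS = {"term_a", "term_b"}  # placeholder terms, not a real slur list

def contains_flagged_term(tweet: str) -> bool:
    """Naive word-level match; a real filter would handle punctuation, variants."""
    words = tweet.lower().split()
    return any(term in words for term in SLUR_TERMS)

def estimate_daily_count(sample: list[str], tweets_per_day: int) -> tuple[float, int]:
    """Flag each sampled tweet, then scale the observed rate to a daily volume."""
    flagged = sum(contains_flagged_term(t) for t in sample)
    rate = flagged / len(sample)
    return rate, round(rate * tweets_per_day)

# Toy usage with a 3-tweet sample and a placeholder daily volume.
sample = ["hello world", "term_a something", "nice day"]
rate, per_day = estimate_daily_count(sample, tweets_per_day=150_000_000)
```

Note that the estimate inherits every weakness of the filter: as the article goes on to explain, a keyword match cannot distinguish derogatory use from in-group or non-derogatory use, which is why the raw count needs human review.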
Of those, anywhere between 47.5 percent and 70 percent were "non-derogatory" or used to "express in-group solidarity." (Excluding the term "white boy" alone cut the total number of racially insensitive tweets in half.)
That still leaves the tricky question of how many tweets most people would consider racist. A program can't parse the racist subtext of a tweet, and human analysts have different ideas of what counts as racism.
“Even though racist, religious and ethnic slurs tend to be used in a non-derogatory way on Twitter, this does not mean that hate speech is not being used on this platform,” the report’s authors said. “Language does not require the use of slurs in order to be hateful.”
Ultimately, the study found that one in 55,000 tweets (around 0.002 percent) was indicative of racial prejudice. That includes up to 10 percent of the tweets that used slurs "casually" (meaning they weren't explicitly racist, but would probably be considered offensive by some people) and the estimated 100 tweets a day that threatened violence.
First published February 13 2014, 3:02 PM
Keith Wagstaff is a contributing writer at NBC News. He covers technology, reporting on Internet security, mobile technology and more. He joined NBC News from The Week, where he was a staff writer covering politics. Prior to his work at The Week, he was a technology writer at TIME.
He lives in Brooklyn, N.Y.