When Naim Tyler tweeted a video on Nov. 8, 2016, that showed an alleged voting machine malfunction in favor of Hillary Clinton, he expected some attention. But he didn’t realize what was about to happen.
The video he posted showed him repeatedly pressing the button for Donald Trump, while the machine’s indicator light stayed on for Clinton. It turned out that the machine was working properly, and that Tyler wasn’t following the instructions for changing his vote. Nonetheless, the video aligned with right-wing conspiracy theories and went viral, aided by Russia’s then-unknown disinformation campaign.
“BREAKING: Machine Refuses to Allow Vote For Trump in Pennsylvania!! RT the hell out of it! #VoterFraud #voted #ElectionDay” tweeted @Ten_GOP, a Kremlin-backed Twitter account masquerading as Tennessee’s Republican party. It was retweeted more than 29,000 times and picked up by dozens of media outlets.
“I kind of had a feeling it was going to end up getting a couple thousand retweets and people were going to see it, but I didn’t know it was going to blow up like it did,” said Tyler, who goes by @lordaedonis on Twitter and has close ties to alt-right groups. “I wish I had said something on the video instead of just poking the screen like a dummy.”
Tyler’s story highlights how Russia’s disinformation campaign was able to seize on political content posted to social media by politically motivated Americans in order to spread distrust in the election.
Russian troll accounts frequently retweeted right-wing celebrities and media accounts to boost their messages. The Russian account @tpartynews, posing as an American Tea Party group, retweeted both political provocateurs like Ann Coulter and mainstream right-wing accounts like Fox News. Russian accounts have been found to push right-wing conspiracies published by fringe news websites, according to an analysis by the German Marshall Fund, which tracks Russia’s influence campaigns.
The spread of viral misinformation, sometimes aided by foreign governments, is a playbook that tech companies, journalists and government agencies are watching for when Americans head to the polls on Nov. 6 for the 2018 midterms. Last week, the U.S. charged a Russian woman with attempting to meddle in the upcoming election.
“If you can get indigenous content, turn that into a conspiracy, and filter that into the mainstream media, that’s a textbook case,” said Clint Watts, a former FBI agent and researcher who specializes in Russian interference. Watts is also an NBC News contributor. “As an information warfare missile, that was a direct hit.”
Tyler’s video racked up millions of views, and caught the attention of both far-right media and mainstream outlets. By noon, the far-right website Gateway Pundit had shared the video in an article that claimed “Machines won’t take Trump votes.” Another article from the conspiracy site Infowars received more than 50,000 Facebook engagements and was linked to by 33 websites with their own original stories on the video.
A Drudge Report post on Facebook that featured Tyler’s video was shared more than 100,000 times, Tyler said. By 11 p.m. ET, the tweet had gone global. A Saudi Arabian women’s magazine tweeted Tyler’s video to its 400,000 followers with the question, “Has there been a forgery?”
The tweet may even have reached Trump. In a telephone interview on Fox News on Election Day, Trump noted national reports of voter fraud, and while not naming the tweet specifically, seemed to describe its contents.
“It’s happening at various places today, it’s been reported,” he said. “The machines, you put down a Republican and it registers as a Democrat, and they’ve had a lot of complaints about that today.”
ProPublica, CNN and BuzzFeed all debunked the idea that Tyler’s video illustrated a rigged election. In a fact check that attracted little attention and a fraction of the original video’s social engagement, ProPublica reported that Tyler’s video showed the machine was working “exactly as it should,” and included a video explainer of the voting machine that clearly established Tyler had simply not followed the posted directions to change his selection.
But news organizations did not report at the time that Tyler, who did not allow reporters to use his real name on Election Day, had deep ties to alt-right groups. He had been tweeting his pro-Trump views for months, earning him minor celebrity status in the alt-right. Just 20, Tyler was an author of two self-published self-help books and a budding entrepreneur. He was also a striving internet provocateur in the online “hotep” community, a pro-black movement whose pro-Trump, ultraconservative views aligned it with the alt-right before the presidential election. (Hotep is an Egyptian word meaning “peace.”)
He appeared in a movie alongside notorious trolls Milo Yiannopoulos and Mike Cernovich and on the YouTube channel of Carl Benjamin, a British anti-feminist better known on the far right as Sargon of Akkad, one of the most influential vloggers in YouTube’s extremist community, according to the Data & Society Research Institute.
“Not to toot my own horn, but I was sort of Twitter famous back then,” Tyler said.
Aided by Russia
But none of Tyler’s tweets or appearances approached the viral heights of his Election Day video, due in part to the help of Russia’s multimillion-dollar disinformation campaign.
@Ten_GOP was run by employees of Russia’s government-funded Internet Research Agency, or “troll farm,” who worked in an office building in St. Petersburg with a mission to sow distrust in American democracy.
Before the account was shuttered by Twitter, @Ten_GOP accumulated 142,000 followers, including then-candidate Trump’s campaign manager Kellyanne Conway, his son Donald Trump Jr., digital director and 2020 campaign manager Brad Parscale, and indicted former national security adviser Michael Flynn, all of whom retweeted @Ten_GOP in the weeks before the election.
Two years later, Tyler says he didn’t plan to spread misinformation and never claimed the election was rigged. He also said he wasn’t aware of @Ten_GOP’s Russian roots at the time, and that he wasn’t part of an elaborate hoax to dupe voters about election machine integrity.
"It is kind of weird,” Tyler said of being used in Russia’s disinformation campaign. “I guess it’s not something that too many people can say has happened to them."
Still, he said he was proud of “the fact that I was able to achieve virality.”
Tyler’s tweet makes for an illustrative example of the kind of disinformation spread on social media before the election and into 2017, according to Dr. Vladimir Barash, the science director at network analytics firm Graphika, Inc., and co-author of a new study that analyzed how misinformation spread on Twitter during and after the 2016 election.
Tyler’s tweet was one in an ecosystem of stories on a variety of topics including race, religion, LGBT politics and gun rights that “exploit wedges in society and bend the truth in a way that generates attention to spread mistruth like a virus through social media networks,” and distort the public’s understanding of political candidates and the issues, Barash said.
As for the midterms, Barash said “there’s no magic spell you can wave over the media landscape and solve disinformation. But platforms have some responsibility to make decisions about what constitutes safe, informed public discourse.”
Most of the Twitter accounts that spread disinformation during the 2016 election are still operating today, according to Barash's study. On a typical day, these accounts push more than a million tweets.
But this time around, Barash said, the media and the public were less likely to be taken in by a tweet like Tyler’s.
“In 2016, this was so new that very few people were paying attention. Now there are a lot more eyes, so it’s harder for nefarious disinformation campaigns to hide the truth,” Barash said. “All that said, that doesn’t mean we should rest on our laurels. This is going to be a problem until we as a society take some real steps.”