Facebook’s former security chief Alex Stamos said in an interview with NBC News that the U.S. has not been aggressive enough in fixing its vulnerabilities to election attacks and is ill-prepared for the new tricks hackers will launch in the midterms, which are less than three weeks away.
Stamos likened the 2016 election to Pearl Harbor and 9/11 — not in loss of life but in terms of failure to foresee impending threats. But unlike in the aftermath of those events, the U.S. has not taken aggressive action to fix its vulnerabilities, particularly those exploited by the Russian hackers who broke into Democratic organizations and the Hillary Clinton campaign, Stamos said.
“There's been small improvements in campaign security,” Stamos said. “But we have not seen the kind of massive upgrade in campaign infrastructure that you would need to stand against a professional hacking agency like that.”
Stamos, who left Facebook in August after three years as chief security officer to teach at Stanford University, said he believes the company has taken the necessary steps to ensure its system can’t be manipulated by Russia again — but that adversaries will have new tricks and tools this time around.
“Facebook's done a good job of responding to what happened in 2016,” Stamos said. “The real question is what are our adversaries going to do in 2018?”
“The truth is within the security and safety space, you're not building a bridge where you're done,” Stamos said. “You're playing chess. But it's an infinite game of chess where your adversary you're playing against constantly switches out.”
With less than 20 days until the U.S. midterm elections, tech companies, the U.S. government and cybersecurity firms are on high alert for any attempts to spread disinformation, disenfranchise voters or hack into campaigns and voting systems.
Facebook has taken a variety of steps to improve its detection of and response to foreign campaigns. While most of that work happens in code, the company has also created a “war room” and committed this year to doubling its security team to more than 20,000 employees.
Inside the war room, Facebook employees from across the company are running drills and simulations of potential disinformation campaigns and attacks from both foreign and domestic adversaries.
Facebook has also been more aggressive in going after coordinated media campaigns that break its rules—even those from within the United States. The company recently removed 559 U.S. pages that it said “have consistently broken our rules against spam and coordinated inauthentic behavior.” On Monday, the company announced a ban on false information about voting, such as posts that claim people can vote through text messaging.
Samidh Chakrabarti, the product manager of Facebook’s civic engagement team, recently told NBC News that Facebook is “much more effective than we used to be” and that the entire company is “laser focused on getting it right” with regard to election integrity and security.
Stamos is not alone in voicing concern about U.S. cybersecurity. President Donald Trump’s decision in May to eliminate the role of cybersecurity czar rattled experts. Current and former U.S. officials told NBC News in July that the White House did not have a coherent strategy to guard against future election interference.
Stamos pointed to partisan fights over Russia’s election meddling as the reason the country has not responded as if it had suffered a major attack, adding that a lack of coherent government response left companies such as Facebook in a difficult position.
“Two years on from the Russian hacking and nothing of that scale has happened, and it's because people disagree whether it even happened or not,” Stamos said. “These are the kinds of things that the tech companies can't do by themselves.”
Stamos, who served as Yahoo’s chief information security officer before joining Facebook in 2015, warned that domestic threats to the United States could use the “same playbook that the Russians utilized in 2016 to influence the election from the inside,” which has already happened in other countries.
“If you look at Mexico and some other elections, it's pretty clear the future of disinformation in the United States is going to be domestic,” Stamos said.
To stop these types of campaigns from succeeding, Stamos called for a ban on microtargeting political ads to individual voters, which has allowed campaigns to show users different, highly specific messages on Facebook depending on their location, interests and political leanings.
“We're moving towards a world that is not good for U.S. democracy where the campaigns and the PACs are splitting the electorate into these tiny, tiny little slices,” Stamos said. “They're giving a completely different story to each of them. They look very, very different to each person. And I just don't think that's good for our democracy, even without Russians interfering.”
In his new role at Stanford’s Hoover Institution, a public policy think tank, Stamos said he hopes to bring companies and governments together to solve major security issues that he said he couldn’t address at Facebook.
“I left because, you know, I had an intense three years there,” Stamos said. “And there were some organizational changes that meant that I was not able to fix all things that I wanted to fix. And I was a little upset.”
Looking back, Stamos said decisions about some of Facebook’s biggest security challenges should have been made “when the products are designed.”
“There was a restructuring that left me without that kind of responsibility and it seemed like a reasonable time to go,” Stamos said. “I'm also really excited to be able to now work on these issues in a much more global scale.”