Memes and videos are set to become the dominant forms of election misinformation around the 2020 elections, according to experts who spoke with NBC News.
That means Instagram, by far the most popular photocentric app, will be a particularly crucial battleground for election manipulation efforts — and the first skirmishes have already happened.
Facebook announced Monday that it removed 50 Instagram accounts linked to a Russian-backed influence campaign — and just one Facebook account. Facebook, which owns Instagram, linked the campaign to the Internet Research Agency, the Kremlin-linked troll factory that executed a wide-ranging influence campaign on the 2016 election.
While social media companies have been able to crack down on text-based posts, disinformation analysts say that viral memes and videos are more difficult to trace, easier for foreign actors to create, and more likely to accomplish their intended effect than the standard Facebook post or tweet.
Facebook also announced new efforts Monday to combat image-based disinformation, saying the company will launch a pop-up that will appear when people try to share Instagram posts that have been rated false or partially false by a third-party fact-checker. The pop-up will alert users that they are about to share something untrue and give them the option to cancel the share or “share anyway.”
On the Russia-linked campaign, Facebook said the actors “also maintained accounts presenting themselves as local in some swing states, and posed as either conservatives or progressives.”
Facebook said the accounts “primarily reused content shared across internet services by others, including screenshots of social media posts by news organizations and public figures.”
The accounts “often posted on both sides of political issues including topics like U.S. elections, environmental issues, racial tensions, LGBTQ issues, political candidates, confederate ideas, conservatism and liberalism.”
In an interview with NBC’s Lester Holt, Facebook CEO Mark Zuckerberg said he’s seen that Russia, Iran and China “increasingly with more sophisticated tactics are trying to interfere in elections,” adding that his company has “a full plan” to combat 2020 disinformation “and a playbook of what tactics we expect each of these nation states are going to try to employ … and how we can stop them.”
There has been plenty of attention on dissecting Russia’s use of Facebook and Twitter to spread its message in 2016. Now, experts are zeroing in on platforms such as Instagram — where Russia drew a far higher rate of engagement in 2016 than it did elsewhere — Snapchat, and even TikTok, which on Monday removed two dozen accounts for posting propaganda from the Islamic State militant group, as places where foreign malfeasance could flourish this time around.
Ben Nimmo, head of investigations at the social media analytics firm Graphika, said image-based disinformation is highly effective and easy to produce.
"It's very quickly absorbed,” Nimmo said. “And it's very easy to share."
Beyond Instagram, newer platforms including TikTok can provide fertile and untouched ground for manipulation efforts. These platforms can also be harder to regulate.
"We can expect Russia to pivot to new social media platforms like TikTok and Snapchat, things that are video platforms where people can send messages without getting caught," Charity Wright, a former Army and National Security Agency cyberthreat analyst, said. "It's harder to catch messages that come through video."
Last month, a New York University study found that Instagram was set to play the largest role of any social platform in 2020 disinformation campaigns, though it has received far less attention than platforms such as Facebook, Twitter and YouTube.
Paul Barrett, an NYU professor who wrote that report, said that Instagram's position as a highly exploited platform for disinformation "is one of the untold stories, or inadequately told stories, out of this whole thing."
Academics and analysts have also stressed the importance that Instagram played in 2016 election manipulation despite the focus on Facebook and other platforms. The Senate Intelligence Committee has detailed at length Instagram’s role in that interference.
"All the attention focused on Facebook and Twitter," Barrett said. "Instagram deserved more attention."
"And I absolutely expect Instagram to be a magnet for disinformation in 2020,” Barrett added.
Jonathan Albright, director of the Digital Forensics Initiative at the Tow Center for Digital Journalism, said that Instagram allowed for Russia to target a younger and more urban audience than Facebook or Twitter, making it attractive to campaigns aimed at suppressing the more liberal vote.
"It's important because there's a lot more that can happen on Instagram because it's visual. And I think that that specific element of stories and visual impact is more problematic. It's more engaging. So, it's harder to find and identify certain things that might be problematic."
Both Albright and Wright said Russian disinformation, as well as disinformation originating from other nations, is likely to focus less on outrage in 2020 and more on suspicion — less on hot-button issues aimed at triggering readers and viewers, and more on content that will make them question whether anyone at all can be trusted.
"So, we live in this world where we don't know whom to trust," Wright said. "And there's scandals and there's corruption and talk about impeachment, and it's really just trying to tear us and our political processes apart so that nobody knows whom to trust."