When Jesse Morton watched as the U.S. Capitol was stormed, he recalled the ferocious faith he once had — not as a supporter of President Donald Trump, but as a jihadi recruiter on a mission from God.
Like some of Trump's most ardent backers, Morton was also deplatformed by social media companies like YouTube, where he was one of Al Qaeda's most prolific English-language recruiters, giving him a rare personal insight into the future they face.
"A lot of what we see unfolding in front of us now, with regard to the far right, I experienced directly, when the primary threat we were concerned with was the jihadists," he said last week from his home in Alexandria, Virginia.
Morton served nearly four years on terrorism-related charges in federal prison, where he assisted the FBI as an informant. He went on to become a research fellow at George Washington University's Program on Extremism and now works as an anti-extremism activist.
He was careful to note that Trump and his followers are not Islamist terrorists bent on violent jihad, but he said his experience shows that deplatforming radicals does work, if only to a point.
Just eliciting a reaction from huge tech companies and powerful law enforcement agencies can feel like intoxicating vindication, he said, and pulling charismatic troublemakers off social media certainly limits their reach. But for a smaller group of die-hards, removal bolsters the same feelings of isolation, outrage and in-group solidarity that led to radicalization in the first place, he said.
Although their follower numbers might go down, "what you see is, you see those feelings of camaraderie, those feelings of community, those feelings of meaning and significance in the movement, as if you're having an effect," he said. "And so you feel emboldened. You see, you feel powerful."
Those crosscurrents are borne out in several statistical studies that focused on Islamist deplatforming.
A 2015 report for the Brookings Institution think tank found that even when Islamist extremists managed to log back on to Twitter using different names — a strategy that technology companies have made increasingly difficult — they struggled to restore their previous follower counts.
"It appears the pace of account creation has lagged behind the pace of suspensions," co-author J.M. Berger wrote.
After suspensions began in earnest in September 2014, the primary Islamic State, or ISIS, hashtag dropped from about 40,000 tweets a day to fewer than 5,000 in about five months.
"When we first started doing this with jihadists, people liked to say it was like whack-a-mole, you know, where you just knock one down and another one pops up," Berger said in a video call last week. "The research that I've done and that subsequent people have done demonstrate that that's not the case."
Berger's analysis also backed up Morton's experience — that a heady cocktail of isolation and vindication risks accelerating the violent reaction of a small minority who put in the effort to move to more private platforms.
"You're only talking to people who echo the same views and obsessively talk about violence and anger and hate," Berger said. "Then there is a reasonable chance that being in that environment could radicalize you more."
Morton said such a toxic feedback loop could easily promote the notion that "there is no other recourse but violence as a result of us being unable to express our ideas."
They agreed that Big Tech's attempts to rein in right-wing extremists have seemed more reactive than preventive and that crafting a consistent set of rules around account suspensions would help companies undercut the feeling among censored groups that they are being singled out.
Morton and Berger also said there was a substantial difference between the deplatforming of jihadis and the deplatforming of right-wing extremists: their bases of support.
While jihadi recruiters in the United States do not have any substantial political backing, right-leaning voters are legion, encompassing multiple causes and ideologies, and they can count on many elected officials to defend them.
After the election, Trump made false allegations of widespread voter fraud and attacked cities with large shares of Black voters, who had come out in force for Joe Biden. His lawyers baselessly alleged a global conspiracy and filed dozens of suits to overturn the election results — a legal strategy that failed in court after court, with not a single incident of voter fraud proven in any of the lawsuits.
A Quinnipiac University poll published Jan. 11 found that 73 percent of Republicans said they "believe there was widespread voter fraud" during November's elections, false allegations aggressively promoted by Trump but repeatedly rejected by the courts.
That will make deplatforming right-wing extremists far more delicate and potentially less effective, said Faiza Patel, co-director of the Liberty and National Security Program at the Brennan Center for Justice at New York University Law School.
"When you were taking down, you know, accounts of Muslims, people didn't really care," she said. "When you're taking down the accounts of sort of prominent people, people are going to care, and the platforms are very aware of that dynamic."
CORRECTION: (Jan. 19, 2021, 4:30 a.m. ET) An earlier version of this article misstated Jesse Morton’s current position. He is an anti-extremist activist; he is no longer a research fellow at George Washington University's Program on Extremism.