In the wake of the Paris terror attacks, some politicians and intelligence officials began rallying for "back doors" into encrypted messaging apps.
On Wednesday, FBI Director James Comey told senators that the person who opened fire at an anti-Prophet Muhammad event in May sent more than 100 encrypted messages to someone overseas, and the FBI couldn't read any of them.
"I'm not questioning their motivations," Comey said of companies who encrypt their data. "The question we have to ask is, 'Should they change their business model?'"
In theory, a "back door" would let law enforcement gain insight into secret terrorist communications — if those terrorists were limited to a handful of regulated apps.
But that's not the case. There are more than 400 free and commercial encryption products currently on the market, security expert Bruce Schneier told NBC News. Many of them were created outside of the U.S., meaning that even if Washington created laws demanding Silicon Valley create "back doors," the foreign apps and services wouldn't have to comply.
"There are a ton of apps out there," Will Ackerly, a former NSA employee and co-founder of encrypted mail service Virtru, told NBC News.
"Bad guys don't play by the same rules," he said. "They will have other alternatives. I think that will always be true."
Home-brewed encryption software
Terrorists worried about government surveillance have another option: building their own apps, much like Al Qaeda did with Asrar al-Mujahedeen, a program it created in 2007.
That is just one of several encrypted messaging apps created by terrorist organizations over the last few years, according to a report from threat intelligence firm Recorded Future. ISIS also creates its own smartphone apps, like Amaq, an Android news app recently discovered by a group affiliated with the anti-terrorist hacker collective Ghost Security Group.
That all might seem like terrible news for intelligence agencies. Even if they could somehow gain access to every ready-made app out there, terrorists could simply start relying on home-brewed software. Not all encryption tools, however, are created equal.
"It's relatively easy to put together an application that uses encryption," Jonathan Katz, a computer science professor at the University of Maryland, told NBC News. "It's more difficult to build an application that's end-to-end secure."
Your average computer science graduate should be able to build encryption software that keeps away scammers, he said. Without experience and resources, however, it's unlikely they could build unbreakable, error-free software that could keep out nation states.
Katz pointed to the Heartbleed bug as an example of how even vetted encryption technology, in that case OpenSSL, can be vulnerable to implementation errors.
Quality encryption also relies on high-quality randomness, which, if lacking, can allow hackers to derive the encryption key without breaking the underlying system.
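To illustrate that point (an example added here, not drawn from the article's reporting): a general-purpose random number generator is deterministic once its seed is known, so keys derived from it can be reproduced by an attacker, while an operating system's cryptographically secure generator is designed to be unpredictable. A minimal sketch in Python:

```python
# Illustrative sketch: why key generation needs cryptographically
# secure randomness. (Hypothetical example, not from the article.)
import random
import secrets

# INSECURE: Python's `random` module is a deterministic generator.
# If an attacker guesses or recovers the seed, every "key" it
# produces is predictable.
random.seed(1234)  # a weak, guessable seed
weak_key = random.getrandbits(128).to_bytes(16, "big")

# Better: `secrets` draws from the operating system's secure
# random source, which is designed to be unpredictable.
strong_key = secrets.token_bytes(16)

# An attacker who knows the seed reproduces the weak key exactly,
# without breaking the underlying cipher at all.
random.seed(1234)
recovered = random.getrandbits(128).to_bytes(16, "big")
print(recovered == weak_key)  # True: the "key" was never secret
```

This is the failure mode described above: the encryption algorithm itself can be sound, yet the key is recoverable because the randomness feeding it was weak.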
It's not that the tools and knowledge needed to create secure encryption software aren't out there, according to Ari Juels, a computer science professor at the Jacobs Technion-Cornell Institute. They are well-known and readily available.
"While the building blocks are solid, assembling them in a secure way can be tricky," Juels told NBC News.
So isn't it a good thing to force terrorists to use substandard, home-brewed software? Possibly, but it's not a panacea for the FBI and other intelligence agencies, Juels said.
Identifying and cracking every new encryption program used by a handful of people could get expensive and time-consuming, with no guarantee that anything of value will be discovered.
"I would suspect any home-brewed encryption software would have vulnerabilities, but it's not clear it would be cost-effective to exploit those vulnerabilities," Juels told NBC News.
How do you keep track of terrorists?
Facebook, Google and other companies have urged government officials not to mandate "back doors" into encrypted software. That could create new vulnerabilities that might be exploited by hackers and foreign governments, they said, not to mention stir up privacy concerns.
"It's a question of balancing civil liberties and potential exposure to cyber-attack with the ability of law enforcement to track criminals and terrorists," Katz said.
Despite once working for the NSA, Ackerly is opposed to back doors, claiming that intelligence agencies can use metadata, as well as tapped phones and computers, to effectively go after terrorists.
"There are so many tools available to intelligence agencies," Ackerly said. "Weakening the ability of people to protect their data, I think that is barking up the wrong tree."
At this point, there are so many encrypted chat apps out there, getting access to all of them probably isn't feasible.
Even if intelligence agencies got "back doors" into the most secure apps and forced terrorists onto worse software, the net cost to society would be too great, said Schneier.
"You can't do that without forcing other people to use insecure tools, too," Schneier told NBC News. "And it's not a good idea to force everyone to be insecure. Encryption is too important of a security tool to screw up like that."