The FBI says it got data off the iPhone 5C used by one of the San Bernardino attackers, ending a potential legal showdown with one of the world’s biggest companies.
So now we can stop talking about all that encryption stuff, right? Not so much.
“This saga has really just asked more questions,” said Oren Falkowitz, CEO of cybersecurity firm Area 1 Security and a former NSA analyst. “None of them have really been answered.”
On Monday, after repeatedly insisting that the FBI had no way into the device without Apple’s help, and a week after its surprise announcement that a third party might have found a way in, the government backed out of the case, saying investigators had cracked the phone.
Which means the case in the U.S. District Court for the Central District of California is over. The FBI wanted to get into the phone, and now it has. But how?
Do they have to tell Apple how they did it? Could the method be used again by the FBI? Will it be used in other cases, even if they’re not related to terrorism? Could hackers use it? Was the FBI able to get anything useful off the phone in the end?
"This lawsuit may be over, but the Constitutional and privacy questions it raised are not," Rep. Darrell Issa, R-Calif., said in a statement on Monday. "Those worried about our privacy should stay wary — just because the government was able to get into this one phone does not mean that their quest for a secret key into our devices is over."
The case in California was never really an isolated incident, not when it came to locked iPhones and not when it came to larger questions about data security. It’s part of an ongoing, international discussion — and sometimes public PR slugfest — between companies and governments that’s been going on for years and won’t likely end anytime soon.
"I don't think (the San Bernardino case) settled the ultimate question of what legal authority the All Writs Act provides the government. The question of whether the government can use the All Writs Act to force tech companies to undermine their security measures is still a live one," said Esha Bhandari, a staff attorney for the American Civil Liberties Union, referring to the statute the Justice Department cited in California.
"We don't have any ultimate resolution on the legal question," she said.
It’s also unclear whether the FBI might be able to re-use the technique with other phones — and, if so, what the legal and technical implications would be.
Apple asked for a delay in a case in Brooklyn involving a drug dealer’s iPhone after the FBI said it might have found a solution in California. Manhattan District Attorney Cy Vance has said he has 175 iPhones that he would like to get unlocked.
"We don't know, of the hundreds of phones that might be in the hands of law enforcement, whether the same tool could be used on some of them or all of them," said the ACLU's Bhandari.
In a statement on Monday, Apple repeated its mantra in the case: We’ll follow the law, and we’ll continue to make our products safer.
“We will continue to help law enforcement with their investigations, as we have done all along,” Apple said, “and we will continue to increase the security of our products as the threats and attacks on our data become more frequent and more sophisticated.”
So, how’d they do it? Who helped the FBI get into the phone? That's first on the list of things we just don't know.
“The FBI cannot comment on the technical steps that were taken to obtain the contents of the county-issued iPhone, nor the identity of the third party that came forward as a result of the publicity generated by the court order,” FBI Assistant Director in Charge David Bowdich said in a statement on Monday night. He said the FBI was careful “to ensure that the contents of the phone would remain intact once technical methods were applied.”
There has been a lot of speculation about who might have come forward to help the FBI get into the phone used by attacker Syed Rizwan Farook. One thing security experts agree on, though, is that it's no shock that someone had a way in. Devices and software, like all things made by humans, have flaws.
When those flaws are unknown even to the company that made the product, they’re referred to as “zero day” vulnerabilities. There’s a whole market out there for researchers — some totally legitimate and with good intentions, others less so — who find zero days and other gaps. Sometimes they tell the company and collect a “bug bounty.” Other times, those gaps are found by intelligence agencies that hang on to them for their own purposes. Sometimes, they're used by criminals.
“It’s not surprising that one of the most popular platforms in the world would have hundreds of people looking for ways in that could be used for monetization purposes,” Falkowitz said.
FBI Director James Comey said in a letter to the Wall Street Journal on March 23 that the bureau had received a number of ideas for how to get into the San Bernardino phone. Now that it has found one that works, some question whether the bureau should hand the technique over to Apple.
“When you’re dealing with a product like the iPhone where so many millions of people could be impacted, and compound that with the fact that this has been extremely public, the government has an immediate need to work with Apple and patch whatever gap exists,” Falkowitz said.
The Obama administration in April 2014 laid out a set of considerations officials run through when deciding whether to publicly disclose a vulnerability. The government should weigh those same questions — which include whether the gap poses a "significant risk," and the likelihood of someone else finding it — when considering this apparent vulnerability, said Herb Lin, a senior research scholar at Stanford University's Hoover Institution.
"[The FBI] should go through the process established by the White House to determine whether vulnerabilities should be disclosed to the vendor," Lin said in an email.
"The government's role is to protect the public, and the tech companies' role is to provide the best security for our users," said Denelle Dixon-Thayer, chief legal and business officer for Mozilla.
Some hoped that the debate sparked by the California case might inspire new legislation to guide courts and investigators. Senator Mark Warner (D-Va.) and Rep. Mike McCaul (R-Texas) have proposed an “encryption commission” that would be made up of law enforcement, tech and business representatives.
Rep. Ted Lieu (D-Calif.) and a group of lawmakers back the ENCRYPT Act, which would prohibit individual states from enacting their own anti-encryption laws.
And a bill is reportedly brewing from Senators Dianne Feinstein and Richard Burr that would give federal judges more power in similar cases, according to Reuters.