FBI investigators have a method for getting data off iPhones. The federal law enforcement agency, however, seems in no rush to tell anyone how it works.
While the U.S. government has a “vulnerabilities equities process,” there are no strict rules on when law enforcement must disclose a security hole it discovers. That means that after obtaining a technique to get data off an iPhone in the San Bernardino massacre investigation, the bureau may not have to let Apple — or the public, for that matter — know how it’s getting into the company’s devices.
“The standard practice is one would disclose the vulnerability through some form of responsible disclosure,” said Zulfikar Ramzan, chief technology officer at security company RSA. “You want Apple to be able to learn from it, see what they can do to fix it.”
Most of the time, when a “white hat” security researcher discovers a vulnerability in a product, they first take the flaw to the vendor and give them a chance to patch it, Ramzan said. Sometimes, companies will pay researchers a “bug bounty” for coming to them first. Then, after the company has put a fix in place, the researchers will go public with their discovery. That helps companies protect consumers, Ramzan said.
“It’s not just a matter of getting it to the vendor because it’s a nice thing to do for the vendor,” said Alex Rice, cofounder of HackerOne, which connects companies and security researchers. “In almost all circumstances it’s the best thing to do for the public and consumers as well.”
Apple did not respond to a request for comment for this story.
The federal government has made some past statements that demonstrate a policy of flagging and patching security vulnerabilities when intelligence agencies become aware of them. The Office of the Director of National Intelligence has said that it is biased toward disclosure, and the White House has given a barebones outline of its position.
“When federal agencies discover a new vulnerability in commercial and open source software … it is in the national interest to responsibly disclose the vulnerability rather than to hold it for an investigative or intelligence purpose,” the Office of the Director of National Intelligence said in a statement in April 2014.
The ODNI statement denied a Bloomberg News report claiming the NSA had known about the widespread “Heartbleed” bug for two years but didn’t tell anyone.
After the Heartbleed bug was exposed two years ago, the White House said that it had a “disciplined, rigorous and high-level process” in place for assessing whether or not a flaw should be revealed.
But “there are no hard and fast rules,” Michael Daniel, the administration’s cybersecurity coordinator, wrote in the White House’s 2014 blog post.
The President’s Review Group on Intelligence and Communications Technologies, which was formed after Edward Snowden’s surveillance leaks, recommended in its 300-page report that a better system for reviewing vulnerabilities be put in place.
“In almost all instances, for widely used code, it is in the national interest to eliminate software vulnerabilities rather than to use them for U.S. intelligence collection,” the group said in the report.
But the FBI hasn't said how it decides if it will reveal a vulnerability or not, said Professor Susan Landau, a cybersecurity researcher at Worcester Polytechnic Institute.
"There is much more argument for national security to report less often to the vendor than for law enforcement, for exactly the issue that national security is looking outside the country and law enforcement is looking inside the country," Landau said.
"So how can they not tell us what the equities process is, and why they're not sharing it with Apple?"
Digital rights advocates have called for the FBI to disclose its method, citing security concerns for other Apple device users. The non-profit Electronic Frontier Foundation pointed to the President’s Review Group recommendations in a statement, saying that “any decision to withhold a security vulnerability for intelligence or law enforcement purposes leaves ordinary users at risk from malicious third parties who also may use the vulnerability.”
It’s not clear if the FBI or other law enforcement agencies might try to employ the method used in California on other iPhones already in custody. There’s plenty of reason to think they want to, however. Most people carry a smartphone that holds a trove of information about everyday activities. When a crime is committed, a phone can become a storehouse of potential evidence.
The office of Manhattan District Attorney Cy Vance told NBC News that it has 215 iPhones it wants to access, though none are the same iPhone 5C model used by Syed Farook in California. That doesn't mean the technique couldn't work elsewhere.
“It would be very, very unlikely that this is a technique that is uniquely applicable to this one iPhone,” Rice said.
The American Civil Liberties Union located 63 cases across the country where the government has tried to use the All Writs Act — the law cited in the California case — to compel Google or Apple to help get data off a locked device.