The story came out of the blue. In February, the U.S. Department of Justice requested and received a court order to unlock the iPhone 5c used as a work phone by a deceased terrorist in the San Bernardino, CA shootings. In short order, Apple filed a petition to overturn the order, claiming such a move would create a dangerous precedent. Although the FBI and other authorities asserted that it was all about a single handset, it soon became obvious that prosecutors were waiting for a court precedent to be set so they could go after Apple with the same scheme to unlock other iPhones seized in criminal cases.
In various public statements, interviews, and guest editorials, Apple made its position crystal clear. Creating what it called a “govOS” would open a backdoor that even criminals could exploit. It would thus compromise the security of the hundreds of millions of Apple customers who rely on iOS gear protected by data encryption.
The very public back and forth continued for weeks until March 21st, the day before a scheduled hearing to consider the issue, when the DOJ requested that the presiding judge vacate or postpone the order. It seemed that a “third party” had approached them, in the nick of time, with what appeared to be a workable method to retrieve the data from that iPhone. Some cynical observers suggested the FBI decided it might lose in court, and thus worked harder to find another solution.
So the hearing was postponed and the DOJ was given two weeks to present an update. In the meantime, published reports named an Israeli software developer, Cellebrite, as the company contracted by the FBI to use its “mobile forensics solutions” to unlock that iPhone. However, this was never confirmed.
So the DOJ filed a status report on Monday requesting that the order be vacated:
Applicant United States of America, by and through its counsel of record, the United States Attorney for the Central District of California, hereby files this status report called for by the Court’s order issued on March 21, 2016.
The government has now successfully accessed the data stored on Farook’s iPhone and therefore no longer requires the assistance from Apple Inc. mandated by Court’s Order Compelling Apple Inc. to Assist Agents in Search dated February 16, 2016.
Accordingly, the government hereby requests that the Order Compelling Apple Inc. to Assist Agents in Search dated February 16, 2016 be vacated.
What has not been revealed is how the passcode was bypassed or guessed to unlock the iPhone, or what, if anything, was found on it. While I do not expect the DOJ or the FBI to reveal the method that was ultimately successful, there has been some speculation.
One suggested possibility, sometimes called NAND mirroring, was to copy the unit’s flash memory to a PC before attempting to guess the four-digit passcode. Apple sets a limit of ten failed tries, after which the data is deleted. But if only nine unsuccessful attempts are made, the saved image can be reloaded into the phone’s memory, resetting the counter and allowing nine more attempts. This process would continue until a successful result.
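The logic of that mirror-and-restore loop can be sketched in a short simulation. Everything here is a hypothetical stand-in for illustration only; it models the retry strategy, not real hardware or any actual forensic tool.

```python
import itertools

class SimulatedPhone:
    """Hypothetical model of a device that wipes its data after ten failed tries."""
    WIPE_LIMIT = 10

    def __init__(self, passcode: str):
        self._passcode = passcode
        self.failed_attempts = 0
        self.wiped = False

    def snapshot(self) -> dict:
        # Stands in for mirroring the flash memory to a PC.
        return {"failed_attempts": self.failed_attempts}

    def restore(self, image: dict) -> None:
        # Stands in for reloading the saved image, resetting the counter.
        self.failed_attempts = image["failed_attempts"]
        self.wiped = False

    def try_passcode(self, guess: str) -> bool:
        if self.wiped:
            raise RuntimeError("data already erased")
        if guess == self._passcode:
            return True
        self.failed_attempts += 1
        if self.failed_attempts >= self.WIPE_LIMIT:
            self.wiped = True  # one failure too many: the data is gone
        return False

def brute_force(phone: SimulatedPhone) -> str:
    image = phone.snapshot()              # mirror the flash once, up front
    attempts_since_restore = 0
    for digits in itertools.product("0123456789", repeat=4):
        if attempts_since_restore == 9:   # stop one short of the wipe limit
            phone.restore(image)          # reload the image, resetting the count
            attempts_since_restore = 0
        if phone.try_passcode("".join(digits)):
            return "".join(digits)
        attempts_since_restore += 1
    raise ValueError("passcode not found")

print(brute_force(SimulatedPhone("7391")))  # → 7391
```

Because the image is restored after every ninth failure, the simulated counter never reaches ten, so all 10,000 four-digit combinations can be tried without triggering the wipe.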
Since it was a work phone, however, I’d be skeptical there was anything useful on it. The terrorists were smart enough to destroy their own gear. A work phone could, at any time, have been retrieved by the authorities in San Bernardino before the shooting, if only to apply software updates, and suspicions would have been raised if the terrorist had refused.
What this means, though, is that this method may now be available to any agency in the DOJ, and possibly to local and state police departments, to similarly recover data from locked iPhones. Now just consider a public trial in which evidence against one or more criminal defendants was obtained from an unlocked iPhone. The defense would surely request details on just how it was done, and if the request were granted, outsiders would learn the method. And no doubt it would be granted.
So Apple doesn’t quite get off scot-free. The company is no doubt already analyzing the schemes that police, and even criminals, might use to unlock iPhones, and is actively working on methods to enhance security. So maybe the next time someone tries this method on a new or updated iPhone, it won’t work.
That would mean Apple could be hauled into court yet again to deliver a solution, and this sorry episode would repeat itself.
Regardless, the critical issues raised aren’t going away. Ultimately it may mean that new laws need to be considered to balance the needs of law enforcement and individual privacy. President Obama warned against an “absolutist” approach, but that framing sidesteps the core issue: how the authorities can hold onto a method for unlocking iPhones, and other encrypted gear, without that method eventually falling into the hands of criminals.
For now, however, this legal skirmish is over. But if and when Apple tightens iOS security, there may be more legal demands to answer.