To briefly refresh your memories: for days earlier this year, the FBI loudly claimed that Apple needed to help unlock an iPhone 5c used by a terrorist. Apple asserted that complying with the court order would require it to build what it called a “govOS,” one that defeats brute force protections and allows unlimited login attempts.
On the day before a scheduled court hearing pitting the U.S. Department of Justice against Apple, the DOJ requested a postponement. The reason? A “third party” had come to them with a solution to pluck the data off that iPhone. A week later, they announced success. But as we move through April, the FBI remains coy about what, if anything, they might have retrieved. Well, at least to the public.
Now if they actually had discovered even one bit of actionable intelligence, they’d shout it to the skies. Not necessarily the information, mind you, but that they had achieved success. Instead, they are reportedly still looking over the material, which doesn’t convey optimism that anything useful was found. This seems to argue in favor of my ongoing skepticism.
After all, this was a work phone. The San Bernardino terrorists were smart enough to destroy their own tech gear, but the office iPhone was left intact. Since it was a work phone, one the employer could retrieve with a simple request, it would hardly make sense to put anything incriminating on it. After all, refusing to turn it over would have roused suspicions and might have helped the authorities thwart the act. I’m assuming a basic level of intelligence here, since Syed Farook, one of the deceased terrorists, was a well-educated health department employee. All right, a highly deranged well-educated person, but presumably one of reasonable intelligence.
So I don’t expect anything to come of this successful attempt. But it still leaves a big elephant in the room: the method by which the authorities were able to unlock that iPhone. Did the third party, which some believe to be Cellebrite, an Israeli mobile forensics company, exploit a security flaw in iOS? If so, Apple certainly should know, so they can shore up the system and protect customers.
But that’s only if there is a security flaw. Indeed, the FBI now admits that this method doesn’t work on any handset newer than the iPhone 5c.
So the core debate remains. Few would deny that police have the right to obtain evidence that might prevent or solve a crime. But that authority has constraints; in the U.S., it often requires a court order. I am not going to research how it’s done in every advanced country, but the process here is reasonable.
However, Apple’s decision to encrypt data on iOS gear has changed the equation. In the old days, a search warrant could be easily executed on someone’s home, office or personal property. But when it comes to encrypted data from a mobile handset, can a suspect be forced to unlock it by a court edict? What if the suspect is dead?
But if a mobile device is locked to protect customers, can the authorities demand the manufacturer build a backdoor, thus opening the platform to possible exploits? How does that benefit a customer’s right to privacy?
Now if the method used to unlock that iPhone succeeds elsewhere, the secret is bound to come out when and if it’s used to gather evidence against someone on trial. The defense would demand to know how it was done as part of their right of discovery. Once that happens, the method could be discussed in open court, and everyone would know what to do and how to do it.
Indeed, the DOJ may have already begun to share details of how the data was extracted with the Senate Intelligence Committee and other high-ranking Senators, according to a published report. Such briefings would be held as the committee develops legislation to deal with law enforcement requests to retrieve data from encrypted devices.
So at some point, Apple should be given the information they need to deal with a potential security problem. If there is a security problem, that is. Some reports suggested the iPhone was opened via a trick: copying the flash memory into a PC’s RAM, testing up to 10 passcodes, reloading the saved memory image to reset the failed-attempt counter, and repeating the process. In theory, the effort would eventually succeed. The DOJ said it took 26 minutes to accomplish the task, without, once again, explaining just how it was done.
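The mirroring trick those reports describe can be sketched in a few lines. This is purely illustrative: the device class and its methods below are hypothetical stand-ins I invented for the sketch, not any real forensic tool or the FBI’s actual technique. The idea is simply that restoring a saved copy of flash memory rolls back the failed-attempt counter, so the 10-try wipe limit never triggers.

```python
# Illustrative sketch of the reported flash-mirroring brute force.
# FakeDevice is a hypothetical simulation, not a real handset or API.

class FakeDevice:
    """Simulates a phone that wipes itself after 10 failed passcode tries,
    with the failure counter stored in (restorable) flash memory."""
    WIPE_LIMIT = 10

    def __init__(self, secret):
        self._secret = secret
        self._failed = 0
        self._wiped = False

    def dump_flash(self):
        # Save a pristine image of flash, including the attempt counter.
        return {"failed": self._failed}

    def restore_flash(self, image):
        # Reloading the image rolls the failure counter back.
        self._failed = image["failed"]

    def try_passcode(self, code):
        if self._wiped:
            raise RuntimeError("device wiped")
        if code == self._secret:
            return True
        self._failed += 1
        if self._failed >= self.WIPE_LIMIT:
            self._wiped = True
        return False


def brute_force_with_mirroring(device, passcodes, per_cycle=9):
    """Try each passcode, restoring the saved flash image every few
    attempts so the wipe threshold is never reached."""
    backup = device.dump_flash()
    tried = 0
    for code in passcodes:
        if tried == per_cycle:
            device.restore_flash(backup)  # reset the attempt counter
            tried = 0
        if device.try_passcode(code):
            return code  # unlocked
        tried += 1
    return None


dev = FakeDevice("7531")
codes = (f"{n:04d}" for n in range(10000))
print(brute_force_with_mirroring(dev, codes))  # → 7531
```

Against a four-digit passcode the loop grinds through at most 10,000 candidates; without the restore step, the simulated device would wipe after the tenth failure, which is exactly the protection the reported trick sidesteps.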
Now I suppose the DOJ can keep doing things this way, hoping that third-party company will get them out of a jam, for as long as it works. If it fails, we might be back to square one. To avoid repeated conflict, one possible solution would be for the DOJ and tech companies to get together and devise a workable method that fulfills the needs of law enforcement without compromising your security. But that may be an impossible dream.