So iPhone Security Isn’t Perfect

March 6th, 2018

As many of you recall, Apple found itself in the thick of it after a mass shooting in December of 2015 in San Bernardino, CA. The FBI asked them to create a back door for iOS, which would allow the authorities to break into an iPhone 5c used as a work phone by one of the assailants.

The request was the result of stupidity. A work device set up properly with Apple's configuration tools could have been unlocked easily by the employer. It should never have come down to locking the device by accident and hoping the proper passcode would be discovered before the unit was wiped for good after ten failed attempts.
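The arithmetic behind that attempt limit is worth a quick sketch. Assuming iOS's optional Erase Data setting, which wipes the device after ten consecutive failed passcode attempts, a blind guesser's odds are easy to estimate (the function name below is mine, purely for illustration):

```python
# Back-of-the-envelope look at why the passcode attempt limit matters.
# Assumes the optional "Erase Data" setting: ten consecutive failed
# attempts and the device wipes itself.

def guess_probability(digits: int, attempts: int) -> float:
    """Chance of blindly guessing a numeric passcode before lockout."""
    space = 10 ** digits      # e.g. 10,000 combinations for 4 digits
    return attempts / space

p4 = guess_probability(4, 10)   # 4-digit passcode, 10 tries allowed
p6 = guess_probability(6, 10)   # 6-digit passcode, 10 tries allowed
print(f"4-digit: {p4:.4%}")     # 0.1000%
print(f"6-digit: {p6:.6%}")     # 0.001000%
```

In other words, even a short numeric passcode is effectively safe against guessing so long as the attempt limit holds, which is exactly why forensic tools aim to bypass the limit rather than the passcode itself.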

On the surface, the FBI's request seemed sensible despite its logical flaws. Why, for example, would the terrorists place incriminating data on a work phone? Wouldn't it be stored on personal devices? Short of assuming they were very stupid, which may well be true, there was little reason to hope that a successful entry would yield any significant amount of information.

But the authorities had to check anyway, in the hope that the unit contained evidence critical to the investigation, perhaps revealing other plotters in the scheme. But Apple said it couldn't help, for a very sensible reason: the mere act of creating such a back door would compromise iOS for almost anyone clever enough to exploit it. And don't think that isn't possible. Some members of the U.S. Congress nonetheless repeated the fiction that such a tool could be restricted to a single device.

Well, the FBI couldn't get the courts to go along with the demand, but it didn't matter in the end. The bureau was able to pay a third-party hacker to do the deed, for perhaps as much as a million dollars, or maybe not. In the end, they reportedly didn't find any useful information on that iPhone.

Now before people run off and complain that Apple's security is seriously compromised and such devices can't be trusted, remember that, as a practical matter, nothing is perfect. There are tricks to defeat a fingerprint sensor, even Touch ID. What's more, Face ID isn't immune to hacking either. A 3D mask reportedly can do the deed, shades of Mission: Impossible!

For law enforcement officers, there are now two forensic companies that promise to break into iPhones and other gear. You'll notice, by the way, that we never hear of anyone having trouble breaking into devices running Google Android. It's always about the iPhone, which ought to at least reassure most of you that a successful intrusion is no easy process, so there's little reason to feel paranoid.

Remember that even the key fob from your family auto may be defeated by hackers with fairly cheap tools. It’s why we have auto insurance.

Now the original company reportedly engaged in unlocking iPhones and other gear was an Israeli firm, Cellebrite. It reportedly charges $1,500 per iPhone to bypass the Secure Enclave that houses your fingerprint or facial data. The technique can supposedly be used even on the iPhone 8 and the iPhone X, yes, the one with Face ID!

Clearly business is good, as Cellebrite recently reported record revenue and growth for 2017.

Yet another company is getting in on the act, one called Grayshift, a startup reportedly run by U.S. intelligence contractors and a former security engineer from Apple. If the latter is true, Apple must be busy reviewing the former employee's NDA to see if there's a legal angle to put a stop to this venture. The GrayKey tool reportedly costs $15,000 for an online version said to be limited to 300 unlocks. Its unlocking scheme is reportedly similar to Cellebrite's.

Now it's not that such efforts are universally applauded, even for law enforcement purposes. There are downsides, such as the ability of rogue governments around the world to compromise the privacy of their enemies, or the possibility of criminals getting hold of such gear and unlocking devices used by intelligence operatives and the well-heeled. Even if Cellebrite and Grayshift are diligent in making sure their tools don't fall into criminal hands, I suppose it's always possible for a unit to be stolen and misused.

No doubt such successes may be temporary. Assuming Apple can figure out what these firms are doing, it can devise ways to tighten security and block such schemes, along with other devices that may become available. In turn, the forensic firms will no doubt find new ways around those protections, and around and around we go.

You'll notice that Apple is not making a big deal of talking about such devices. Certainly Alphabet, Google's parent company, won't touch this one with a ten-foot pole, because Android has no security reputation to speak of.

At the same time, I wouldn't be surprised to learn of a startup that could add an additional layer of protection to a mobile device to defeat the efforts of Cellebrite and Grayshift to unlock iPhones.


One Response to “So iPhone Security Isn’t Perfect”

  1. dfs says:

    Why do a lot of us take the idea of police or government getting into our iPhones so seriously? Well, it would probably be silly to suggest that any sane person actually thinks of his iPhone as an extension of himself. It might be more reasonable to say that on some subconscious level he thinks of it as an extension of his house: just as a house is a place where he stores his physical stuff, so his iPhone (or computer or whatever) is a place where he stores his electronic stuff. Therefore it should be as difficult for police or government to gain entry to his iPhone as it is for them to gain entry to his house. And (at least in the absence of a court order) the maker of his iPhone shouldn’t be expected to — let alone be required to or be bullied into — cooperate with the authorities voluntarily, any more than should the maker of the owner’s burglar alarms or whatever other security gear he has installed.
