Sorry to keep things so short, folks. But I’m two Manhattans (really good movie*, you should see it) into my evening and I don’t have any strong opinions about anything (except the idiocy of wanting to make Daylight Saving Time permanent) so let’s get straight to the music. Thanks to Entropy for putting Request Line together last week – this is a fun one.
*Am actually talking about the movie Metropolitan.
So here’s a mini-rant I want to post about the whole Apple vs. FBI business.
Preface: I enthusiastically support Apple’s position here.
There’s been a lot of debate about how fulfilling the FBI’s (court-ordered) demands would “break encryption” and “create a backdoor” into iPhones. Apple has said as much themselves, and the tech press has enthusiastically run with it. But that’s horseshit.
Proper security design generally requires a strong password (you all know what this is by now) and employs key stretching – running the password through many hashing rounds – so that the computer’s processing time limits the speed at which “guesses” can be tried. There’s a balance here: send it through too many hashing rounds and the computer is tied up while authenticating, leading to user frustration (in the case of an iPhone, imagine having to wait thirty seconds for authentication every time you unlock your phone). Send it through too few, and it becomes feasible to brute-force a password – guesses can be made and tested at an extremely high rate, and the limiting factor becomes the strength of the password itself. A four-digit passcode is a TERRIBLE password – so relying on hashing alone (i.e. trillions upon trillions of hashing rounds) to resist brute-force attacks isn’t feasible. Instead Apple employs a software-based solution – a lockout process. After a certain number of incorrect guesses, the device must wait a set amount of time before another guess is processed.
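To make that tradeoff concrete, here’s a quick sketch using PBKDF2 (a standard key-stretching function – standing in for whatever scheme a phone actually uses, which I’m not claiming to know). It times a single stretched guess at different iteration counts and extrapolates the worst case for a four-digit passcode:

```python
import hashlib
import time

def time_one_guess(iterations: int) -> float:
    """Time one PBKDF2 derivation -- a stand-in for a device's real
    key-stretching scheme, purely for illustration."""
    start = time.perf_counter()
    hashlib.pbkdf2_hmac("sha256", b"1234", b"some-salt", iterations)
    return time.perf_counter() - start

for iterations in (1_000, 100_000, 1_000_000):
    per_guess = time_one_guess(iterations)
    # A 4-digit passcode has only 10,000 possibilities, so even heavy
    # stretching leaves the worst-case search time modest:
    print(f"{iterations:>9} rounds: {per_guess:.4f}s/guess, "
          f"exhausting all 4-digit PINs: {per_guess * 10_000:.1f}s")
```

Even at a million rounds – enough to make each unlock noticeably sluggish – ten thousand possible PINs just isn’t a big number, which is exactly why the lockout exists.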
What the FBI is demanding is that Apple make it possible (in some way, shape, or form) for them to make unlimited guesses at the passcode that’s protecting the device. The idea has been pushed very heavily that it would be prohibitively expensive or difficult for Apple to modify its operating system to make this possible – this is the REAL horseshit of the matter. Since the security is a software-based solution, all that needs to be changed is the software – specifically, one of two variables. The first variable is the number of guesses allowed before enhanced security measures kick in (ranging from something low-level like increased lockout time up to something high-level like erasing the device’s contents). The second variable is the duration of the increased lockout time. The first can be set to a high number. The second can be set to a low number. Either of these steps, or both, would make brute-forcing a four-digit passcode a relatively trivial matter (even if you’re still required to punch in the guesses through a hardware interface).
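The arithmetic behind “relatively trivial” is worth spelling out. Here’s a back-of-the-envelope calculation comparing a strict lockout policy against the relaxed one the FBI wants – the specific numbers (guess speed, lockout length) are my own assumptions, not Apple’s actual values:

```python
# Worst-case time to exhaust the 4-digit PIN space under a lockout policy.
# All constants below are illustrative assumptions.

PIN_SPACE = 10_000             # possible 4-digit passcodes
SECONDS_PER_MANUAL_GUESS = 2   # assumed time to punch in one guess by hand

def worst_case_hours(guesses_before_lockout: int,
                     lockout_seconds: float) -> float:
    """Hours to try every PIN, given the two variables discussed above."""
    lockouts = PIN_SPACE // max(guesses_before_lockout, 1)
    total = PIN_SPACE * SECONDS_PER_MANUAL_GUESS + lockouts * lockout_seconds
    return total / 3600

# A strict policy: lockout after every 5 wrong guesses, 1 hour each time.
print(f"strict:  {worst_case_hours(5, 3600):.0f} hours")
# The modified-software scenario: effectively no lockout at all.
print(f"relaxed: {worst_case_hours(PIN_SPACE, 0):.1f} hours")
```

With the lockout in place the search takes months; without it, you’re done in an afternoon – even punching guesses in by hand.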
So the question isn’t whether the software can be “written” to do this – all that takes is changing a couple of variables. The big question is whether it’s possible for Apple to push out software updates (i.e. modify the operating system to absorb these changes) without user confirmation. If that weren’t possible, Apple would have said so IMMEDIATELY and the issue would be moot – what the FBI wanted simply wouldn’t be possible. Apple’s silence makes it clear to me that they already have this capability – meaning that this supposed “backdoor” already exists. So really this boils down to a very familiar problem – security vs. convenience. A four-digit PIN is too weak to protect itself through hashing alone. But customers would hate having to enter a strong password every time they wanted to access their device. And solutions based on “something you have” (i.e. biometrics, key fobs, etc.) can’t stand up to a determined attack because the attacker can simply take that thing.
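Just how weak is a four-digit PIN compared to a real password? A rough entropy comparison makes the gap obvious (character-set sizes here are textbook assumptions, nothing device-specific):

```python
import math

def entropy_bits(alphabet_size: int, length: int) -> float:
    """Bits of entropy for a uniformly random string of the given
    length over the given alphabet."""
    return length * math.log2(alphabet_size)

pin = entropy_bits(10, 4)           # digits only, 4 characters
passphrase = entropy_bits(94, 12)   # printable ASCII, 12 characters

print(f"4-digit PIN:        {pin:.1f} bits")
print(f"12-char passphrase: {passphrase:.1f} bits")
```

Each extra bit doubles the search space, so the ~65-bit gap between the two isn’t a difference of degree – it’s the difference between a problem hashing can solve and one it can’t, which is why the PIN has to lean on the lockout.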
Does this mean I think Apple should accede to the FBI’s (and the court’s) demands? Absolutely not. From a business perspective, it would be catastrophic. Their devices would be perceived as insecure, and countries such as China (or the United States!) would demand to be able to access devices at will. But I want to put to bed the notion that what the FBI is asking would be technically infeasible (or even difficult) to do. It’s not, and it bugs me to see so many words wasted on the insistence that such is the case.