Science & technology | Cryptography

Taking a bite at the Apple

The FBI’s legal battle with the maker of iPhones is an escalation of a long-simmering conflict about encryption and security

“WE FEEL we must speak up in the face of what we see as an overreach by the US government.” With those words Tim Cook, head of Apple, the world’s biggest information-technology (IT) company, explained on February 16th why he felt his firm should refuse to comply with an FBI request to break into an iPhone used by Syed Farook, a dead terrorist. Farook and his wife Tashfeen Malik, who were sympathisers with Islamic State, shot and killed 14 people in California in December, before both were themselves killed by police. The FBI’s request, Mr Cook said, was “chilling”.

Ever since 2013, when Edward Snowden’s leaks pushed privacy and data security into the public eye, America’s IT firms have been locked in battle with their own government. The issue at stake is as old as mass communication: how much power should the authorities have to subvert the means citizens and companies use to keep their private business private?

For Mr Cook to choose the Farook case as the line he will not cross seems, on the face of things, baffling. The phone is government property (Farook was a public employee). The FBI wants help unlocking it because it may contain information on the motives or contacts of a dead terrorist. What could be more reasonable? But for Apple, and those security advocates who think the firm is right to defy the government, it is not reasonable at all. Far from being a one-off, they suspect the FBI’s case has been chosen carefully, in order to set a legal precedent that would let policemen and spies break into computers much more easily—and would do so in a way that undermines everyone’s security.

Cryptoporticus

The files on Farook’s phone, as on all iPhones, are encrypted. Unless the correct code is entered to unlock the phone, they are meaningless gibberish. By itself such a code provides little security. It is, by default, a mere four digits long, which makes it easy to memorise but means there are only 10,000 possible combinations. This makes it simple to try every combination until by chance the right one is hit, a process called “brute-forcing”.
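To make the arithmetic concrete, here is a toy sketch of brute-forcing in Python. The `try_passcode` function is a made-up stand-in for the phone's real passcode check, which is of course not exposed like this; the point is only that 10,000 candidates take trivially little work to enumerate.

```python
# A toy illustration of "brute-forcing": trying all 10,000 possible
# four-digit passcodes until one matches. Real iPhones never expose a
# check like this; the function below is a hypothetical stand-in.

def try_passcode(guess: str) -> bool:
    # Stand-in for the phone's passcode check.
    SECRET = "7291"  # the code we are pretending not to know
    return guess == SECRET

def brute_force() -> str:
    for n in range(10_000):        # 0000 through 9999
        guess = f"{n:04d}"         # zero-padded to four digits
        if try_passcode(guess):
            return guess
    raise RuntimeError("no four-digit code matched")

print(brute_force())  # finds "7291" after at most 10,000 attempts
```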

Other features, though, are designed to make brute-forcing harder. After six wrong guesses a user has to wait a minute before trying a seventh. That delay rises rapidly to an hour. On average, therefore, brute-forcing a four-digit iPhone passcode will take 5,000 hours—nearly seven months. Yet even that might be a surmountable obstacle, were it not for the fact that iPhones can also be set to wipe themselves clean after ten failed attempts to log in.
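That figure can be checked with some back-of-the-envelope arithmetic, sketched below. The one-hour steady-state delay is a simplifying assumption based on the ceiling described above, not Apple's exact schedule.

```python
# Back-of-the-envelope arithmetic behind the "5,000 hours" figure:
# once the escalating delays reach their one-hour ceiling, each guess
# costs roughly an hour, and on average the right code turns up halfway
# through the 10,000 possibilities. The schedule is a simplified
# assumption, not Apple's exact one.

DELAY_CEILING_HOURS = 1.0       # assumed steady-state delay per guess
EXPECTED_ATTEMPTS = 10_000 / 2  # on average, half the keyspace

hours = EXPECTED_ATTEMPTS * DELAY_CEILING_HOURS
print(f"{hours:.0f} hours ~ {hours / 24:.0f} days "
      f"~ {hours / (24 * 30):.1f} months")
# 5000 hours ~ 208 days ~ 6.9 months

# With the optional wipe-after-ten-failures setting, the attacker gets
# at most ten guesses in total: a 10/10,000 = 0.1% chance of success.
```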

Crucially, both restrictions—the time between attempts, and the wipe after ten failed tries—can, unlike the encryption itself, be circumvented. That is because they are enforced by the phone’s operating system, iOS, and operating systems can be changed. Apple does so regularly, issuing updates that add features and fix bugs. The FBI is, in essence, asking for just such an update, bespoke to the phone in question, that would remove the extra security features so that the bureau could brute-force the passcode quickly.

In theory the bureau could write such an update itself. But it could not use it without Apple’s help because, precisely to stop such attacks, iPhones will accept an update only if they can be convinced, via a special, cryptographically signed certificate, that it comes from Apple. Only Apple possesses the long, randomly generated number used as the key to that process.
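The principle can be illustrated with a minimal sketch in Python, using the third-party "cryptography" package. Apple's real scheme is more elaborate (certificate chains and per-device personalisation, among other things), and the Ed25519 algorithm here is an assumption chosen for brevity; the point is only that a device holding just the public key can tell genuine updates from forgeries.

```python
# A minimal sketch of why the FBI cannot install its own update: the
# phone verifies a digital signature before accepting new software.
# Ed25519 is an illustrative assumption, not Apple's actual scheme.
# Requires the third-party "cryptography" package.

from cryptography.hazmat.primitives.asymmetric import ed25519
from cryptography.exceptions import InvalidSignature

# The vendor's side: the private key never leaves the vendor.
vendor_private_key = ed25519.Ed25519PrivateKey.generate()
firmware = b"OS update image (stand-in bytes)"
signature = vendor_private_key.sign(firmware)

# The phone's side: it ships with only the public key baked in.
vendor_public_key = vendor_private_key.public_key()

def phone_accepts(update: bytes, sig: bytes) -> bool:
    try:
        vendor_public_key.verify(sig, update)
        return True
    except InvalidSignature:
        return False

print(phone_accepts(firmware, signature))             # True: genuine update
print(phone_accepts(b"third-party update", signature))  # False: wrong signer
```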

The FBI has insisted its request is a one-off, and that once the software has done its job Apple can delete it. But many security experts are sceptical: they do not believe that looking inside Farook’s phone is the bureau’s only motive. “They almost certainly won’t find anything of interest on the phone,” opines Nicholas Weaver, a computer-security researcher at the University of California, Berkeley. He points out that Farook and his wife took the trouble to destroy two other phones and a laptop, while leaving the iPhone—which belonged to Farook’s employer—intact. (On the other hand, a few weeks before his rampage Farook did disable the phone’s online backup feature, to whose data the FBI would otherwise have had access.)

Dr Weaver and people like him think the FBI has pushed the case specifically because it is hard, from a public-relations point of view, for Apple to be seen to be refusing to co-operate. They worry that if Apple agrees to build such a system once, it will find similar requests impossible to refuse in future—an argument that was bolstered when it emerged that the Justice Department was demanding Apple’s help in at least nine similar cases (in seven of those, the firm is resisting). Some fret that the FBI might even require the firm to start sending subverted code to specific suspects over the air, using the technology it employs to distribute legitimate updates.

Viewed narrowly, that might be no bad thing. The FBI has argued many times that encryption can thwart legitimate investigations, leaving vital clues undiscovered. But security researchers point out that what works for the good guys works for the bad guys, too. If a subverted operating system managed to escape into the “wild” even once, then the security of every iPhone would be put at risk. The trade-off, says Kenneth White, a director of the Open Crypto Audit Project, an American charity, is not security versus privacy, but security for everyone versus the police’s ability to investigate specific crimes. And the risk of a leak would rise with every extra person who had access to the nobbled code: defence lawyers demanding to see it; court-appointed experts given the job of checking it works as intended; and so on.

A second argument against collaboration points out that Apple has governments besides America’s that it must answer to. Deliberately compromising its security for the Americans, says Mr White, will encourage other countries to make similar, perhaps even broader, demands for access. Having conceded the point once, Apple will find it hard to resist in future. In countries less concerned with civil liberties and the rule of law, that could have serious consequences.

Key decisions

All these arguments are set to be rehearsed when Apple and the FBI meet in court, on March 22nd. But however the decision goes, it is unlikely to be the last word. Most observers expect appeals to carry on all the way to the Supreme Court. In the meantime IT firms, Apple included, are taking steps to lock themselves out of their own customers’ devices, deliberately making it harder to fulfil official requests for access.

New versions of the iPhone feature something called the Secure Enclave. This is a separate computer within the phone, whose job is to police access to the rest of the device. Cracking it would require an extra piece of customised software aimed at neutering the Enclave itself. That is doable, for Apple has retained the ability to alter the Enclave so that it can issue updates and fix bugs. But things need not stay this way. The firm has pondered removing its ability to modify the Enclave, which would frustrate official requests for access.

Even then, a determined policeman has options. It is possible, with expensive equipment and a good deal of skill, to recover cryptographic keys from hardware by poking around physically in the transistors and wiring of the chip itself. Decapping, as this process is known (the first step is to remove the chip’s protective plastic cap), is the stuff of intelligence agencies and a few dedicated laboratories, and carries a risk of destroying the chip for no gain. But if access were thought crucial, and there were no other options, it could be done.

Even so, ensuring that phones themselves can be unlocked will not solve all of the authorities’ problems. There are plenty of encrypted messaging apps available for smartphones, many written outside the United States and thus beyond the reach of its government. The most advanced feature a technique called forward secrecy, which uses disposable, one-time encryption keys to ensure that old messages stay scrambled even if eavesdroppers manage to get hold of the conversers’ permanent keys. (One such app, called Telegram, which was developed by Pavel Durov, a nomadic Russian, announced on February 24th that it had reached 100m users.)
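A minimal sketch of the idea, again in Python with the third-party "cryptography" package: each message uses a throwaway ("ephemeral") Diffie-Hellman key pair, discarded after use. The X25519 exchange and one-key-per-message framing are illustrative assumptions; real apps layer more machinery, such as key ratcheting, on top.

```python
# A minimal sketch of forward secrecy, assuming a fresh ephemeral
# Diffie-Hellman key pair per message. The ephemeral private keys are
# thrown away, so a later theft of the parties' permanent keys cannot
# unscramble old traffic. Requires the "cryptography" package.

from cryptography.hazmat.primitives.asymmetric import x25519
from cryptography.hazmat.primitives.kdf.hkdf import HKDF
from cryptography.hazmat.primitives import hashes

def one_message_key() -> bytes:
    # Each side generates a throwaway key pair for this message only.
    alice_eph = x25519.X25519PrivateKey.generate()
    bob_eph = x25519.X25519PrivateKey.generate()

    # Both sides can derive the same shared secret from the exchange...
    shared = alice_eph.exchange(bob_eph.public_key())

    # ...and stretch it into a one-time message key. The ephemeral
    # private keys are then discarded (here, by going out of scope).
    return HKDF(algorithm=hashes.SHA256(), length=32,
                salt=None, info=b"msg-key").derive(shared)

# Two messages get two unrelated keys; compromising one reveals nothing
# about the other, and neither can be recreated after the fact.
print(one_message_key().hex() != one_message_key().hex())  # True
```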

All this may sound like an arms race. It is. Silicon Valley has roots in the counterculture of the 1960s, and a potent streak of civil libertarianism. There is a sense there that Mr Snowden’s revelations proved the American government cannot be trusted not to abuse its powers of surveillance. And there are commercial factors, too. Apple has made privacy and security important selling points for its products.

Cybersecurity types, meanwhile, feel aggrieved that policemen and politicians do not seem to grasp what they view as a fundamental point: weakening security for the police’s benefit inevitably weakens it for everyone. “They keep asking for a ‘Manhattan Project’ to figure this out,” says Mr White. “But that’s like asking for a Manhattan Project to figure out how to divide by zero.” (Dividing by zero is, by definition, mathematically impossible.)

It will be left to the courts to decide the right approach in this particular case. But the fight between Apple and the FBI raises very big questions. To answer them will ultimately require the intervention of elected politicians.
