Apple’s fight for its right to sell phones with encryption that can’t be unlocked by cops, courts or anybody else got a lot more dramatic this week.
On Tuesday night, a federal judge ordered the company to disable the auto-erase feature on the iPhone 5c used by one of the perpetrators of the mass murder in San Bernardino, Calif., last December.
Apple CEO Tim Cook promptly responded with a “Message to Our Customers” note on Apple’s site in which he called the judge’s order “an unprecedented step which threatens the security of our customers” and said Apple would resist it.
Since then, the discussion has blown up all over the news. Confused? Here’s what you need to know about the situation.
What does the court want, exactly?
If you’ve read stories saying Judge Sheri Pym wants Apple to decrypt the phone, that’s not quite right. Her three-page order compels the Cupertino, Calif., company to provide “reasonable technical assistance” to disable the security feature in iOS 9 that erases the contents of an iPhone after 10 incorrect attempts to enter its unlock passcode.
If Apple can patch the operating system to disable that feature — something the order suggests would be done by writing a custom software update to be installed on the phone over a USB cable — then investigators could “brute-force” the passcode, one sequence of numbers at a time.
That appears to be the only way to see the contents of the phone used by Syed Rizwan Farook, who with his wife Tashfeen Malik murdered 14 people and injured 22 in the Dec. 2 terrorist attack.
This approach could work on this model, to judge from posts by iOS security experts Dan Guido and Robert Graham, because the iPhone 5c lacks the “Secure Enclave” feature that would reject the proposed patch. A third Mac infosec veteran, Securosis CEO Rich Mogull, concurred with that diagnosis in an e-mail to me.
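To put the brute-force step in perspective, here is a rough back-of-the-envelope sketch of the worst-case search time, assuming the roughly 80-millisecond-per-attempt delay that Apple’s iOS security documentation attributes to its hardware-entangled key derivation. The figures are illustrative assumptions, not numbers from the court filings.

```python
# Rough worst-case brute-force time estimate for a numeric iPhone passcode.
# Assumes ~80 ms per attempt (the hardware key-derivation delay described
# in Apple's iOS security documentation) and that the auto-erase feature
# and escalating lockout delays have been disabled, as the order requests.

SECONDS_PER_ATTEMPT = 0.08  # assumed hardware-enforced delay per guess


def worst_case_hours(digits: int) -> float:
    """Hours needed to try every numeric passcode of the given length."""
    attempts = 10 ** digits  # each digit has 10 possible values
    return attempts * SECONDS_PER_ATTEMPT / 3600


for digits in (4, 6):
    print(f"{digits}-digit passcode: up to {worst_case_hours(digits):.1f} hours")
# 4-digit passcode: up to 0.2 hours
# 6-digit passcode: up to 22.2 hours
```

Under those assumptions, exhausting a four-digit passcode takes well under an hour, which is why disabling the auto-erase feature is the piece investigators actually need.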
What does this mean to my iPhone?
Right now, nothing. The government is not demanding that Apple revise iOS altogether to remove the auto-erase feature or provide a permanent backup key for law enforcement (something politicians have requested repeatedly).
Further, your iPhone could not be subject to this kind of brute-force unlocking unless an investigator had already taken it from you. This proposed exploit requires physical access to the phone, not a remote download through the App Store or iOS’s system-update mechanism.
Staging a comparable attack on newer iPhone models “would be much more difficult” but “not impossible,” security investigator Jonathan Zdziarski posted in his own analysis of the situation.
If Apple loses, what could happen next?
That’s where privacy advocates get uneasy.
“This is the government demanding a company create a fundamentally defective product that doesn’t provide the platform protections that the company has guaranteed to the best of its ability via engineering,” e-mailed Joseph Lorenzo Hall, chief technologist for the Center for Democracy and Technology.
“What the court is essentially ordering Apple to do is custom-build malware to undermine its own product’s security features,” wrote Ross Schulman, senior policy counsel for New America’s Open Technology Institute. “If a court can legally compel Apple to do that, then it likely could also legally compel any other software provider to do the same, including compelling the secret installation of malware via automatic updates to your phone or laptop’s operating system or other software.”
Courts in other countries could very well require the same of Apple or any other tech firm. If, however, security agencies in China, say, or Israel, already have your phone and want to inspect its contents, you would be unwise to consider its data safe even if Apple wins this case.
Bear in mind that the U.S. government can, under court order, commission its own malware and attempt to sneak that onto the computers of suspects. The CDT’s Greg Nojeim called that “troubling” and in need of more controls, but said it was “probably ultimately less harmful than is requiring device manufacturers to engineer digitally defective products.”
What about other encrypted gadgets and apps?
Google’s latest version of Android — which I use on my own Nexus 5X — also includes full-device encryption, without a backup key for law enforcement and with an auto-erase feature that’s invoked after too many incorrect unlock attempts.
Google, however, does not seem to have faced a demand like Apple’s. And it has yet to declare its willingness to nail its colors to the mast in the way that Tim Cook just did. Many tech companies seem happy to let Apple take the lead here.
One exception: Jan Koum, founder of the WhatsApp encrypted-messaging tool, endorsed Cook’s stance in a Facebook post Wednesday. “We must not allow this dangerous precedent to be set,” he wrote. “Today our freedom and our liberty is at stake.”
Will Apple be able to stop this in court?
Pym’s order gives Apple five business days to assert that this proposed circumvention “would be unreasonably burdensome.”
The law at stake, the All Writs Act, is a measure that dates to the 18th century and allows courts to issue orders “necessary or appropriate” to a case when standard-issue search warrants and other judicial tools don’t fit.
Is asking Apple to write custom code that defeats its own security measures “necessary or appropriate,” or is it unreasonably burdensome? We pay judges to answer questions like that.
If Apple loses, we will see other companies subject to similar demands. If it wins, law enforcement agencies seeking customized security-circumvention software will have a much harder time advancing their argument.
Will we hear about this in the 2016 campaign?
Oh, yes. Republican candidate Donald Trump pounced on Apple in a Wednesday morning Fox News appearance: “Who do they think they are? No, we have to open it up.”
This may be the only thing I know for sure about this situation: Trump’s statement is not the last we will hear about it.