Apple could lock governments out of future iPhones

Apple's (AAPL) legal showdown with the federal government over the security of iPhones likely won't be settled for many months, or even years if the case goes all the way to the Supreme Court. But the world's biggest technology company could easily make upcoming versions of the iPhone even more secure, all but eliminating its ability to help law enforcers crack the encryption.

In the current case, the FBI wants Apple to create a special, less secure version of its iOS software and install it on the phone of deceased San Bernardino terrorist Syed Farook. With the weaker software installed, the FBI would have a much easier time guessing Farook's password, because the safeguards that normally slow down or penalize repeated wrong guesses would be gone. That's only possible because the iPhone's hardware allows certain kinds of software updates from Apple without requiring a password. Future phones could be designed to lock out any such changes, or to erase data if changes were made.
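
As a rough sketch of what those guess-limiting safeguards look like, the snippet below models a passcode gate with escalating delays and an erase-after-ten-failures option. The thresholds, names, and delay are illustrative stand-ins, not Apple's actual implementation; the weakened build the FBI requested would, in effect, remove checks like these.

```swift
import Foundation

// Hypothetical sketch of guess-limiting at the lock screen. Thresholds and the
// delay are illustrative, not Apple's real values.
struct PasscodeGate {
    private let correctPasscode: String
    private(set) var failedAttempts = 0
    private(set) var wiped = false

    init(passcode: String) { self.correctPasscode = passcode }

    mutating func attempt(_ guess: String) -> Bool {
        guard !wiped else { return false }
        if guess == correctPasscode {
            failedAttempts = 0
            return true
        }
        failedAttempts += 1
        if failedAttempts >= 10 {
            wiped = true                        // optional erase-after-10-failures
        } else if failedAttempts >= 5 {
            Thread.sleep(forTimeInterval: 1.0)  // stand-in for escalating delays
        }
        return false
    }
}

var gate = PasscodeGate(passcode: "4821")
for guess in ["0000", "1111", "4821"] {
    print(guess, gate.attempt(guess))
}
```

A brute-force tool pointed at a gate like this gets throttled after a handful of tries and risks wiping the phone entirely; remove the delays and the wipe, and guessing a short numeric passcode becomes feasible in short order.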

That would likely bring the clash back to Washington, D.C., where lawmakers and presidents have struggled for decades to find a compromise between security and privacy. Already lawmakers in two states, New York and California, are pursuing legislation that would require phone makers to include a backdoor for law enforcement agencies.

Apple is expected to appeal a magistrate judge's ruling in the Farook case that it create the new software to help the FBI crack the terrorist's iPhone. Apple CEO Tim Cook said the software would constitute a backdoor into all iPhones that would inevitably leak out and be misused. The case has split public opinion, with some tech firms such as Google (GOOGL) and Facebook (FB) coming out in support of Apple's position and some politicians and law enforcement officials criticizing the company's stand.

Regardless of what Apple does with future phones, the fight over the Farook case will also figure into the debate over broader policy changes, says Robert Cattanach, a partner at law firm Dorsey & Whitney who works on cybersecurity cases and previously served as special counsel to the Secretary of the Navy. "The real question is whether the government can leverage whatever precedent is created – including the possibility that the court does not force Apple to comply – as a basis to convince Congress to pass legislation that would require tech companies to provide them with a true backdoor to defeat encryption going forward," Cattanach says.

The evolution of iPhone security

Apple has made the iPhone increasingly secure since the device was introduced in 2007. The moves have helped keep the personal data of hundreds of millions of iPhone owners out of the wrong hands, but they have also made it more difficult for law enforcement agencies to extract data from the phones of criminals or terrorists.

Apple didn't make the moves solely out of a desire to prioritize security. Customers have been getting more and more concerned with every new hacking scandal, including a 2014 breach of Apple's iCloud service that led to the disclosure of hundreds of nude photos and videos of celebrities. Edward Snowden's revelations also included numerous efforts by the government to crack iPhone security.

The biggest change so far came in September 2014, when Apple released an updated version of its iOS software that encrypted all iPhone user data -- everything from photos to chat logs -- with virtually uncrackable encryption by default. Previously, law enforcement agencies had been able to extract data from a locked phone even without the password, because the data itself was stored in an unencrypted format. But after Apple made the change, accessing the data became much more complicated.
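
To make the "encrypted by default" point concrete, here is a minimal sketch, using Apple's CryptoKit framework, of data encrypted under a key derived from both the passcode and a device-unique secret. The key-derivation and cipher choices below (HKDF, AES-GCM) and the names are illustrative stand-ins, not a description of Apple's actual key hierarchy.

```swift
import CryptoKit
import Foundation

// Illustrative only: a file key derived from the passcode mixed with a secret that
// never leaves the device, so raw storage pulled off the phone stays unreadable.
let deviceSecret = SymmetricKey(size: .bits256)           // stands in for a hardware-bound key
let passcodeKey  = SymmetricKey(data: Data("4821".utf8))

let fileKey = HKDF<SHA256>.deriveKey(
    inputKeyMaterial: passcodeKey,
    salt: deviceSecret.withUnsafeBytes { Data($0) },
    info: Data("per-file-key".utf8),
    outputByteCount: 32
)

let plaintext = Data("photos, messages, chat logs".utf8)
let sealed = try! AES.GCM.seal(plaintext, using: fileKey)
print("ciphertext bytes:", sealed.ciphertext.count)

// A key derived from the wrong passcode simply fails to decrypt.
let wrongKey = HKDF<SHA256>.deriveKey(
    inputKeyMaterial: SymmetricKey(data: Data("0000".utf8)),
    salt: deviceSecret.withUnsafeBytes { Data($0) },
    info: Data("per-file-key".utf8),
    outputByteCount: 32
)
print("wrong passcode decrypts:", (try? AES.GCM.open(sealed, using: wrongKey)) != nil)
```

Because a device-bound secret participates in the derivation, copying the encrypted files to another machine for offline cracking doesn't help; the guesses have to go through the phone itself.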

Apple has also added a new hardware security feature, dubbed the secure enclave, introduced in 2013 as part of the rollout of its Touch ID fingerprint unlock feature. The enclave is a separate processing chip devoted solely to securing data. Everything the secure chip does is encrypted, in theory eliminating vulnerabilities that hackers might find in other parts of the phone or in the iOS software.
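
A loose conceptual model of that boundary, not Apple's actual interface: key material lives inside the sealed component, and the rest of the system can only ask it to perform operations, never read the key out. The type and method names below are hypothetical.

```swift
import CryptoKit
import Foundation

// Toy model of an enclave-style boundary: the master key is private to this object,
// and callers only get a data key back if the passcode check inside succeeds.
final class ToyEnclave {
    private let masterKey = SymmetricKey(size: .bits256)  // never exposed to callers
    private let passcodeHash: SHA256.Digest

    init(passcode: String) {
        passcodeHash = SHA256.hash(data: Data(passcode.utf8))
    }

    func wrapDataKey(_ key: SymmetricKey) -> AES.GCM.SealedBox {
        try! AES.GCM.seal(key.withUnsafeBytes { Data($0) }, using: masterKey)
    }

    func unwrapDataKey(_ wrapped: AES.GCM.SealedBox, passcode: String) -> SymmetricKey? {
        guard SHA256.hash(data: Data(passcode.utf8)) == passcodeHash else { return nil }
        guard let raw = try? AES.GCM.open(wrapped, using: masterKey) else { return nil }
        return SymmetricKey(data: raw)
    }
}

let enclave = ToyEnclave(passcode: "4821")
let wrapped = enclave.wrapDataKey(SymmetricKey(size: .bits256))
print("wrong passcode unlocks:", enclave.unwrapDataKey(wrapped, passcode: "0000") != nil)
print("right passcode unlocks:", enclave.unwrapDataKey(wrapped, passcode: "4821") != nil)
```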

For future iPhones, Apple could essentially lock itself out of the device even further. That became apparent after some confusion arose about the Farook case earlier this week.

Farook's iPhone is an older 5C model, although it's running more current software that includes the default encryption of all data. But it lacks the security chip of newer phones, and some security experts initially said that the software trick the FBI requested from Apple wouldn't be able to fool the secure enclave. Apple quickly disabused the experts of that notion, acknowledging that a similar workaround could apply to newer phones as well. The secure enclave has a reprogrammable component, its firmware, which apparently can be reset by Apple.

But, the experts noted, Apple could alter the secure enclave in future versions of the iPhone so that its firmware can't be changed without a password. Or the chip could be designed to erase all of its data if its software were tampered with.

"I bet Apple will move towards making the most sensitive parts of that stack updatable only in very specific conditions: wipe user data, or keep user data only if the phone is successfully unlocked first," Ben Adida, security expert and lead engineer at Clever who formerly worked at Square and Mozilla, wrote on his blog on Thursday. "The interesting question will be whether Apple will be legally allowed to engineer their phones this way."
