On Key Escrows and Backdoors
Last week, Apple announced that the company will no longer be able to decrypt your personal data, making it impossible to extract this data when requested by law enforcement.
On devices running iOS 8, your personal data such as photos, messages (including attachments), email, contacts, call history, iTunes content, notes, and reminders is placed under the protection of your passcode. Unlike our competitors, Apple cannot bypass your passcode and therefore cannot access this data. So it’s not technically feasible for us to respond to government warrants for the extraction of this data from devices in their possession running iOS 8. (Source: Apple)
I am really impressed with the privacy enhancements that Apple has put in place over the years, and specifically in this latest release. It's important to note, however, that a lot of a user's content (including pictures, videos, and third-party application data) can still be extracted from a locked iPhone without the owner's consent if an attacker gains access to a laptop on which iTunes sync is enabled, as forensics expert Jonathan Zdziarski notes.
The reactions were widely positive. The announcement comes only weeks after the “iCloud leak” and, over the past year, we have learned the extent of the overreaching mass surveillance programs thanks to Edward Snowden. But not everyone welcomed the decision. Orin Kerr wrote an article in the Washington Post, claiming that Apple is playing a “dangerous game” and that it shouldn’t obstruct “justice” in cases where law enforcement has a lawful search warrant.
So what is Orin Kerr advocating for?
Orin Kerr’s fantasy seems to be a computer system encrypted in such a way that someone with a warrant would be able to decrypt the device. Let’s see how this could be implemented technically.
Key escrowing and backdoors
For Apple to be able to hand over data to law enforcement, they would require a backdoor. Backdoors in encrypted systems are better known as key escrows. Key escrowing allows Apple to derive the decryption key of the device based on non-user entered parameters. In such a setting, Apple would be able to generate the decryption keys of any phone, not just the warranted devices.
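To make the distinction concrete, here is a minimal sketch (all function and parameter names are hypothetical, not Apple's actual scheme): a passcode-entangled key can only be re-derived with the user's passcode, while an escrowed key is derivable from device identifiers plus a vendor-held master key alone.

```python
import hashlib
import hmac

def passcode_entangled_key(passcode: str, device_uid: bytes) -> bytes:
    # The key depends on a secret only the user knows; without the
    # passcode, nobody (the vendor included) can re-derive it.
    return hashlib.pbkdf2_hmac("sha256", passcode.encode(), device_uid, 100_000)

def escrowed_key(vendor_master_key: bytes, device_serial: bytes) -> bytes:
    # The key depends only on non-user-entered parameters: anyone who
    # holds the vendor master key can re-derive the key for ANY device.
    return hmac.new(vendor_master_key, device_serial, hashlib.sha256).digest()

# The vendor (or anyone who steals the master key) can derive every
# device's key with no user input at all:
master = b"vendor-held master secret"
k1 = escrowed_key(master, b"serial-0001")
k2 = escrowed_key(master, b"serial-0002")
print(len(k1), k1 != k2)
```

The point of the sketch is the second function: the entire security of every device collapses to the secrecy of one master key, which is exactly the single point of failure described above.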
This has considerable privacy and security implications.
Here’s what a government-mandated backdoor looks like in the non-digital world. It makes us all safer, right?
Systems security vs Encryption
As we’ve seen many times in the past, web services get breached at some point. iCloud, Gmail, Facebook … have all had data breaches. What we learned is that no matter how good your engineers are, attackers will always win. But although attackers have the upper hand in breaching systems, defenders appear to have the upper hand when it comes to encryption. Breaking into a system is not enough: you still need to be able to read the information stored on the server, and if it’s encrypted, you can’t. We know that correctly implemented cryptography works. Encryption is therefore your last line of defence when it comes to protecting your information.
What does this mean for Apple’s iOS 8 device encryption? It means that whoever your attacker is, whether the Chinese government, the NSA, or simply a criminal group holding mobile data for ransom, they will eventually figure out how Apple generates those escrowed decryption keys and be able to decrypt any iOS device.
Recent Snowden revelations showed how the NSA hacked into Google’s datacenter to gather information that they couldn’t get through legal channels. I have no doubt that they, their partners and their adversaries would do anything to compromise Apple’s master key system (or already have).
The security tradeoff that Orin wants us to make here is definitely not worth it. Are we ready to trade the security of millions of iPhones for the ability to decrypt the few iPhones a year that come with a valid search warrant? I don’t think so.