The Core of Apple v. FBI

If you haven’t read Apple’s open letter to customers yet, you really should. As Apple points out,

Some would argue that building a backdoor for just one iPhone is a simple, clean-cut solution. But it ignores both the basics of digital security and the significance of what the government is demanding in this case.

In today’s digital world, the “key” to an encrypted system is a piece of information that unlocks the data, and it is only as secure as the protections around it. Once the information is known, or a way to bypass the code is revealed, the encryption can be defeated by anyone with that knowledge.

Rich Mogull expands on this concept in a convincing piece that gets to the core of the issue.

Everything, all of it, boils down to a single question.

Do we have a right to security?

Vendors like Apple have hit the point where some of the products they make, for us, are so secure that it is nearly impossible, if not impossible, to crack them. As a lifetime security professional, this is what my entire industry has been dreaming of since the dawn of computers. Secure commerce, secure communications, secure data storage. A foundation to finally start reducing all those data breaches, to stop China, Russia and others from wheedling their way into our critical infrastructure. To make phones so secure they almost aren’t worth stealing, since even the parts aren’t worth much.

To build the secure foundation for the digital age that we so lack, and so desperately need. So an entire hospital isn’t held hostage because one person clicked on the wrong link.

The FBI, DOJ, and others are debating if secure products and services should be legal. They hide this in language around warrants and lawful access, and scream about terrorists and child pornographers. What they don’t say, what they never admit, is that it is physically impossible to build in back doors for law enforcement without creating security vulnerabilities.

One of the main reasons I have been so enthusiastic about Apple over the years is that its hardware focuses on technical solutions that not only make the world a better place, but do so in ways that solve real-world problems.

Take TouchID, for instance. No other company has created a solution as secure, and because of that purity of design, it is all but impossible for bad guys to hack the software to get at the banking information that forms the security basis of Apple Pay. As a result, Apple Pay fraud is effectively impossible unless someone actually has your credit card information, and that really has nothing to do with Apple Pay at all; it is no different from stealing a person’s wallet, more theft than fraud.

TouchID is so brilliant precisely because the security design places every critical aspect of the implementation in the hardware itself. Fingerprint data is stored inside a walled-off area of the processor, a secure enclave provisioned with a device-specific key at manufacture and unavailable to any system or network besides the TouchID sensor. The data is not kept in software, where clever hackers could hunt for vulnerabilities. It resides in the hardware itself, and when the phone is locked, the keys protecting files and keychain items are discarded and remain inaccessible until the secure enclave verifies a fingerprint match from the TouchID sensor.
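To make that design concrete, here is a minimal sketch, in Swift, of how an app might lean on this hardware protection through the keychain. The service and account names are hypothetical, and the exact flag name varies by SDK era (.touchIDCurrentSet on 2016-era SDKs, .biometryCurrentSet today); the point is that the keys for such an item are managed by the secure enclave and simply do not exist in usable form while the device is locked.

```swift
import Foundation
import Security

// A minimal sketch: store a secret so it is only readable when the
// device is unlocked, a passcode is set, and the currently enrolled
// TouchID fingerprints are presented. If the enrolled fingerprints
// change, the item becomes unreadable.
func storeSecret(_ secret: Data) -> OSStatus {
    guard let access = SecAccessControlCreateWithFlags(
        kCFAllocatorDefault,
        kSecAttrAccessibleWhenPasscodeSetThisDeviceOnly,
        .biometryCurrentSet,   // .touchIDCurrentSet on 2016-era SDKs
        nil
    ) else { return errSecParam }

    let query: [String: Any] = [
        kSecClass as String: kSecClassGenericPassword,
        kSecAttrService as String: "com.example.demo",  // hypothetical
        kSecAttrAccount as String: "payment-token",     // hypothetical
        kSecAttrAccessControl as String: access,
        kSecValueData as String: secret
    ]
    return SecItemAdd(query as CFDictionary, nil)
}
```

An item stored this way cannot be decrypted while the phone is locked, no matter what the software above the enclave does.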

Let’s apply what the FBI is requesting in the San Bernardino case to Apple Pay, because it demonstrates the exact point Mogull is making. Fleshing out the technical nature of this example helps us understand the hypotheticals posed. If Apple were to design a backdoor into Apple Pay, it would destroy the integrity of Apple Pay’s security design altogether, exposing the software to everyone and giving anyone with the know-how a chance to reverse engineer the design and hunt for weaknesses. Software is merely instructions, and there is always a way to work around instructions. That simple fact has been the fundamental problem of digital security since computers were invented.

Said another way, if Apple were to carve out an exception for the FBI through software workarounds, we would forever be dealing with systems that are inherently insecure. The horrible atrocity in San Bernardino has already occurred. The government’s desire to access information surrounding the event (assuming it proves useful at all) will only create more danger for citizens, since all of our data would become subject to abuse by bad actors with ill intent, to say nothing of the potential for government overreach.
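It helps to remember what the requested workaround would actually remove. In the San Bernardino case, the FBI asked for firmware that disables the escalating retry delays and the optional erase-after-ten-attempts setting, leaving only the roughly 80 ms per-attempt key-derivation cost that Apple’s iOS Security guide describes as hardware-bound. A back-of-the-envelope sketch, under that assumption:

```swift
// Rough brute-force arithmetic, assuming the ~80 ms per-attempt
// key-derivation cost from Apple's iOS Security guide. These are the
// numbers the escalating delays and ten-try erase exist to defeat.
let secondsPerAttempt = 0.08
let fourDigitSpace = 10_000.0      // all possible 4-digit passcodes
let sixDigitSpace = 1_000_000.0    // all possible 6-digit passcodes

print("4-digit worst case: \(fourDigitSpace * secondsPerAttempt / 60) minutes")  // ≈ 13 minutes
print("6-digit worst case: \(sixDigitSpace * secondsPerAttempt / 3600) hours")   // ≈ 22 hours
```

Once the software protections are stripped away, the remaining hardware delay is no real obstacle, which is why a workaround built for one phone is, in practice, a tool for unlocking any phone it can be loaded onto.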

I’ll end with another paragraph from Mogull.

The FBI wants this case to be about a single phone used by a single dead terrorist in San Bernardino to distract us from asking the real question. It will not stop at this one case, that isn’t how the law works. They are also teaming with legislators to make encrypted, secure devices and services illegal. That isn’t conspiracy theory, it is the stated position of the director of the FBI. Eventually they want systems to access any device or form of communications, at scale. As they already have with our phone system. Keep in mind that there is no way to limit this to consumer technologies, and it will have to apply to business systems as well, undermining corporate security.