Here’s one.
A federal judge has ordered Apple Inc to provide software to the Justice Department to help it unlock a phone used by one of the suspects in the San Bernardino, CA, terror attack because investigators suspect the device may hold critical details of the plotting behind the mass murder.
The government’s justification is this:
Law-enforcement agencies say companies such as Apple make it harder to solve crimes including terrorist attacks, child abuse and murder by putting security measures on phones that make it difficult or impossible for investigators to open them and examine data inside.
That’s an entirely valid concern.
The problem, though, is that forcing a back door into the encryption of citizens’ communications utterly destroys their privacy and security. There’s nothing to prevent Government from abusing that back door to snoop on general principles, and then to snoop actively and maliciously in order to preserve the power of the men then in Government. The lawlessness of the present administration demonstrates that progression.
Of more immediate effect, though, is that a back door for Government is a back door for hackers, whether they be script kiddies, terrorist hackers, financial or identity-theft hackers, or any other sort.
The privacy and the security of our private identities, of our finances, of our health records, of any aspect of our lives we find useful to protect from prying eyes are critical to our ability to engage with our neighbors and our businesses and our government free from threats or attack.
The privacy of our communications, the security of our speech, must absolutely be preserved. There is no security at all unless our individual liberties, of which speech is one, are themselves held secure.
“Law-enforcement agencies” and this Federal judge know this full well. And they know full well the truth of Apple CEO Tim Cook’s statement in his letter posted to Apple’s Web site:
We can find no precedent for an American company being forced to expose its customers to a greater risk of attack.
Two things:
1. What the FBI is asking for is not a backdoor – it is for Apple to disable the lockout function after x failed passcode attempts (I believe the default is 10); a minimal sketch of that lockout logic appears after these two points. This could be done as a firmware/software push to the phone in an isolated environment (a Faraday cage), which all manufacturers have. The update would be sent only to that device, and poses an extremely low risk to other phones – using it against them would require capturing the carriers’/manufacturers’ update mechanism.
2. The Fourth Amendment protects against *unreasonable* search and seizure, not *all* search and seizure. The argument must address that, and in an age of great and increasing danger, must persuade that the risk involved is worth the possible price, not only to me, but to others.
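For concreteness, here is a minimal sketch of the kind of lockout logic I mean – illustrative only, not Apple’s actual code; the names and defaults are my own assumptions:

```swift
// Illustrative only: a toy model of a "wipe after N failed attempts" policy.
// Nothing here is Apple's code; the names and defaults are assumptions.

struct LockoutPolicy {
    var maxFailedAttempts = 10    // the "10 tries" default referred to above
    var wipeOnExhaustion = true   // the switch the FBI wants turned off
}

final class PasscodeGuard {
    private let policy: LockoutPolicy
    private let storedPasscode: String
    private var failedAttempts = 0
    private var keyMaterialDestroyed = false

    init(policy: LockoutPolicy, storedPasscode: String) {
        self.policy = policy
        self.storedPasscode = storedPasscode
    }

    /// Returns true only if the attempt matches and the key material still exists.
    func tryUnlock(with attempt: String) -> Bool {
        guard !keyMaterialDestroyed else { return false }
        if attempt == storedPasscode {
            failedAttempts = 0
            return true
        }
        failedAttempts += 1
        if policy.wipeOnExhaustion && failedAttempts >= policy.maxFailedAttempts {
            // On a real device this would erase the hardware-held encryption keys,
            // making the data unrecoverable no matter how many guesses follow.
            keyMaterialDestroyed = true
        }
        return false
    }
}
```

Turn wipeOnExhaustion off, and nothing in the software stops the guessing.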
I am not yet convinced.
1. What you’re describing is, if not a backdoor, then at best an unlocked window – the FBI is demanding exactly a means of jimmying that window: a way to bypass the entry mechanism protecting the secured/encrypted data. And of course, once created, such a jimmy will work on any such entry mechanism, not only on this particular phone.
“You’ve encrypted your data. I do not know your encryption key. Create for me a mechanism that will allow me to bypass your encryption key.”
The FBI’s claim that this is a one-use key, to be destroyed at Apple’s discretion after the government has said it is done with it for the…present…investigation, is a disingenuous one. Keep in mind the background to this demand: this FBI is demanding that all encryption algorithm developers give the FBI a decryption key – and we’re to trust government – the FBI – not to abuse it. Apple will have sole control over the thing? This “promise” of the FBI’s only demonstrates the FBI’s disingenuousness. Of course the FBI will be present, watching – and recording – the procedure: that’s the chain-of-evidence proof that will be necessary in court.
Nonsense. There’s always a Very Good Reason for extending this exception just a little bit. And then there always is a Very Important Cause for broadening the thing just this little bit. And then there always is a Very….
The thought that Apple would successfully destroy the window jimmy – after the government is through with it, mind you – is a risky one to rely on. The risk of human error leading to unintended exposure is too great, and such error too common. Deliberate human exposure for the exposer’s High Purpose is too great a risk – see Edward Snowden. Social-engineering success at gaining access to this sort of tool is legendary.
And this: for hackers and other criminals, or for terrorists, half the work of creating such a bypass lies simply in knowing that one is possible.
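To put rough numbers on the other half: once the attempt limit, the escalating delays, and the need to tap the passcode in by hand are removed, what remains is simple enumeration. A back-of-the-envelope sketch, in which the guessing rate is an assumed figure rather than a measured one:

```swift
// Back-of-the-envelope only; the attempt rate is an assumed figure.
let searchSpace = 10_000         // a 4-digit numeric passcode: 10^4 candidates
let attemptsPerSecond = 12.0     // assumed electronic guessing rate, delays removed
let worstCaseMinutes = Double(searchSpace) / attemptsPerSecond / 60.0
print("Every 4-digit passcode can be tried in roughly \(Int(worstCaseMinutes)) minutes.")
// Under these assumptions, well under a quarter of an hour; even a 6-digit passcode
// falls in about a day. The lockout, not the cipher, is the real barrier.
```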
2. This is plainly an unreasonable search. The government is free to search to its heart’s content; it has the warrant. The government must conduct the search itself, though; that’s what the 4th Amendment authorizes. Neither the warrant nor any other writ issuable under our Constitution authorizes the government to draft – to enserf – anyone else into creating the government’s tools or conducting the government’s search for it.
Too, the government wouldn’t be in this strait had a lower government’s IT expert – and so one who plainly knew better – not changed the phone’s iCloud access password, thereby preventing any automatic backup to the cloud – a cloud to which the government already had access and which it had already searched. That the terrorists hadn’t manually backed up their phone to the cloud for months doesn’t mean the FBI couldn’t have triggered the automatic backup, had that particular password been left intact. Oh wait – iCloud is Apple’s product, too: why has the FBI not asked for help regaining entry to the iPhone’s iCloud account? Hmm….
Finally, the risk tradeoff is this: on one side, government convenience in conducting a warranted search (and so, by definition, a “reasonable” one), when its own IT expertise – whether indigenous to the FBI, the good folks at the NSA, those of the CIA, or… – could create its own window jimmy to bypass the encryption; on the other, the security of all Americans from unwarranted government searches, searches that, with this jimmy, it will be able to conduct at whim. And likely, given this FBI leadership’s general demand for decryption keys, it will conduct them without the nicety of a warrant.
Since there is no security without liberty, demanding Apple’s participation in a government search is too large a trade.
In the end, no one gets to sacrifice my liberty because they feel uncomfortable.
Eric Hines