Monday, February 29, 2016

Apple v. FBI ...some remarks...

“This is not a case about one isolated iPhone,” says Apple in the introduction to Apple Inc.’s Motion to Vacate Order Compelling Apple Inc. to Assist Agents in Search, and Opposition to Government’s Motion to Compel Assistance. I’ve just read and reread it. You should do the same.

It reads more like a press release than a motion attempting to vacate a specific judicial order. This may be because we simply don’t have contemporary laws designed to deal with the issue. It is also interesting that the document seems to conflate legal arguments with policy arguments. This is easy to explain, as (1) most of the law is on the government’s side and (2) passions are high:

“If Tim Cook’s kid was kidnapped, you know the data from that iPhone would be at the FBI in five minutes flat!” — Popular rhetorical statement paraphrased from numerous social media sites
“In short, the government wants to compel Apple to create a crippled and insecure product. Once the process is created, it provides an avenue for criminals and foreign agents to access millions of iPhones. And once developed for our government, it is only a matter of time before foreign governments demand the same tool.” — From Apple, Inc.’s Motion to Vacate
Apple is surely right about one thing: “This is not a case about one isolated iPhone.” Every lawyer I’ve asked believes Apple v. FBI is headed for the Supreme Court. The problem is, I don’t want the Supreme Court (or any court) empowered to make policy – that’s a job for the Legislative branch. Regardless of what you think of Congress, this is their job and they’d better get this one right. What it means to be a digital citizen, and where the border between security and privacy lies, are two of the most important questions of our time.

Can This iPhone Be Hacked?

Let’s stop talking about whether or not Apple can or cannot hack the phone. It is within the capabilities of several organizations (including Apple) to retrieve the data from this specific iPhone 5c. As my good friend Col. John Fenzel (Ret.) told me, “The NSA most certainly has the expertise to decrypt Farook’s iPhone, and if they don’t, that represents not only a technical security gap, but a virtual chasm of expertise at Fort Meade that needs to be addressed now.” In other words, Apple can do it, the NSA can do it, and so can John McAfee (just ask him).
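
To see why the technical dispute is narrower than it sounds, recall what the government’s order actually asks for: software that disables the auto-erase protection and the escalating delays between passcode attempts, so that guesses can be submitted electronically. Here is a minimal back-of-the-envelope sketch in Python, using assumed figures (a numeric passcode and roughly 80 milliseconds of key-derivation work per guess); the exact numbers are illustrative, not Apple’s or the FBI’s.

# Rough worst-case brute-force time for a numeric passcode, assuming the
# retry limit and delays are gone and each guess costs ~80 ms of key
# derivation (an assumed, approximate figure).
SECONDS_PER_GUESS = 0.08

def worst_case_hours(digits):
    """Hours needed to try every possible passcode of the given length."""
    return (10 ** digits) * SECONDS_PER_GUESS / 3600

for digits in (4, 6, 10):
    print(f"{digits}-digit passcode: up to {worst_case_hours(digits):,.1f} hours")

# Illustrative output:
#   4-digit passcode: up to 0.2 hours (minutes, not years)
#   6-digit passcode: up to 22.2 hours
#   10-digit passcode: up to 222,222.2 hours (roughly 25 years)

The arithmetic is the point: a short passcode offers almost no protection once the software safeguards are stripped away, which is why the fight is over the safeguards, not over cryptography.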

The Balance Between Security and Privacy

Consider that the entirety of our digital lives is digitally described and stored somewhere: the 150-plus times each day that we check our smartphones, the information we keep in the cloud (remember, there is no cloud; it’s just someone else’s computer), our tax ID numbers, Social Security numbers, credit card numbers, contact information, financial data, medical history, entertainment preferences, our software, the content we own – everything we are. With all of that at stake, the right question to ask is: “What is the proper balance between security and privacy?”

The Value of Information

According to Cornell University Law School’s Legal Information Institute, “The Fourth Amendment originally enforced the notion that ‘each man’s home is his castle’, secure from unreasonable searches and seizures of property by the government. It protects against arbitrary arrests, and is the basis of the law regarding search warrants, stop-and-frisk, safety inspections, wiretaps, and other forms of surveillance, as well as being central to many other criminal law topics and to privacy law.” In practice, then, even if Apple is compelled to create a piece of software that compromises the security of iOS, the Fourth Amendment requires that it be used only under lawful authority. Those rights are sacred, and nothing in Apple’s motion suggests otherwise.

I asked my friend Jane Rhodes-Wolfe, FBI Section Chief of the Exploitation Threat Section/Counterterrorism Division, to tell me why the FBI is choosing this particular phone at this particular time to fight this particular battle. She could not comment on the pending litigation, but she did say, “The evidence obtained from communication (telephonic and electronic) records is key to law enforcement and intelligence matters. The evidence obtained may identify key members of an organized group or contacts of a lone actor. That information leads investigations in locating witnesses, co-conspirators, facilitators and influencers. Communication is key to understanding the crime or terrorist attack and bringing those responsible to justice or to thwart further nefarious acts.”

Eavesdropping on communication is good for catching bad guys. I get it. But why limit our thoughts to getting info about a terrorist off of an iPhone? What about all of the other data in Metamerica – the digital description of each and every one of us? What about the remarkably large digital infrastructure that is completely invisible to us but enables almost every facet of our modern, connected lives?

The Danger of Compelling Apple to Comply

There is another side to this story, and it foretells a dark, evil place where no one – not Tim Cook, not the FBI, not the Judge, not you, not I – knows enough to rush to judgment. What would the world look like if only nation-states and bad guys had access to your most personal information? Jane pushed back and asked, “How would that make the future different than today?” There’s a pretty profound difference. Today, only Apple, the NSA and a few virtuoso hackers have the ability to hack our phones. If Apple is forced to create a backdoor, the code will eventually get out into the wild, and no one knows what happens at that scale. Don’t pretend you can imagine it – you can’t!

The horror show won’t happen all at once. One scenario has nation-states and law enforcement chipping away at our digital liberties (however we ultimately define them) until the law of unintended consequences rules. In that scenario, average citizens will be 100 percent at risk 100 percent of the time, and “bad guys” (however we ultimately define them) will go so far underground that they cease to be findable. And don’t limit your imagination to human “bad guys”; the bad actors are likely to be machine learning algorithms.

Now, bring together every nightmare scenario you can imagine from every Orwellian dystopia you’ve ever read, and you will start to grasp the remarkable number of bad things that might happen in a digital world where there are no metaphorical door locks and not even bathrooms are private.

Your Future, You Decide

Which brings me to the most critical, time-sensitive thing about Apple v. FBI: your personal role in our collective future. This is not a hypothetical case; it is real and it is happening now. What you need to do is decide how you want Congress to act. What kind of law do you want to see passed? What does it mean to be a digital citizen of the United States of America?

If you’ve read this far, you are probably wondering how to frame the debate, with whom to have it, and what to do when you’re sure you have an answer. My suggestion is to speak to everyone you know. This is not a technical debate. Don’t worry about who can or cannot hack any particular device. Just keep discussing the real issue: “What is the proper balance between security and privacy?” It is truly one of the most important questions of our time, and we owe it to our posterity to have a civil, Socratic debate about it.
