
The FBI’s Court Order Against Apple May Violate The First Amendment

CREDIT: AP Photo/Carolyn Kaster

Apple made history last week after refusing to obey a court order to crack a locked iPhone that belonged to one of the San Bernardino shooters. The Justice Department doubled down, filing a motion on top of the FBI’s to compel Apple to crack open the phone. The tech giant has until February 26 to respond to the court orders, but has shown no sign of caving.

There are a host of constitutional issues at play in Apple’s case, and the Cupertino-based company has hired a free speech attorney to help take the matter to the U.S. Supreme Court. Why would Apple need a free speech attorney for a terrorism case? The reason lies in an obscure corner of the law where computer code has been ruled protected speech under the First Amendment.

As it stands, Congress hasn’t passed a law that would force a private tech company to create a tool on the government’s behalf. The closest approximation is the Communications Assistance for Law Enforcement Act (CALEA) of 1994, which required telecom companies to make their infrastructure law enforcement-ready (e.g., wiretaps and VoIP). But data was left out, and it was decided that information networks, namely the internet, would be treated differently.

To compel Apple’s cooperation, the FBI is relying on an 18th-century law, the All Writs Act (AWA), which allows courts to ask third parties to help execute a warrant. “You always know there’s a problem when the government is relying on a law from the 18th century because Congress has already taken up the issue of a private party helping as a [government] actor,” said Melanie Teplinsky, a cyberlaw professor at American University in Washington, D.C.

But the government asking Apple to create and validate a tool for law enforcement may go beyond the AWA’s traditional application. “The judge has an awful lot of discretion here,” said Michael Froomkin, an internet law professor at the University of Miami. “In the ordinary AWA [argument], the government asks to use something you’ve already got. This is unusual because they’re asking [Apple] to do quite a lot of work.”

The government could compensate Apple for that work, and has asked the company for an estimate. But even so, the case still deviates from the law’s traditional application, Froomkin said. Apple’s best shot could be proving that complying with the FBI’s request affects other phones, and one way to do that is with a First Amendment defense.

Is Code Speech?

In 1999, a federal court ruled that publishing encryption source code, much as developers share code in published articles or on open source platforms like GitHub, is a form of speech protected under the First Amendment. Mathematician Daniel Bernstein wanted to publish the source code for Snuffle, an encryption program he created, to share with the academic community. At the time, the government required authors to get a license before exporting encryption source code and blocked Bernstein from publishing his academic paper. He challenged the government in Bernstein v. U.S. Department of Justice and won. The court agreed that code, a language used and understood by programmers, should be given protections similar to those afforded spoken languages.

Another case, Universal Studios v. Corley, which involved an illegal DVD decryption program, complicated that ruling by holding that speech protections don’t extend to object code, the 1s and 0s a computer translates from source code in order to complete an action, said Vivek Krishnamurthy, a Harvard University cyberlaw professor. “Once you put [the code] into [a] computer to be processed, it gets little to no [First Amendment] protection because it’s just functional, like a hammer or gun,” he said.

Writing the code itself wouldn’t violate Apple’s First Amendment rights, but requiring Apple to sign and certify the firmware update could be compelled speech: forcing a person to divulge information or profess beliefs they don’t hold or don’t want to share.

“Signing it digitally, that’s an assertion — about a fact — that this is a legitimate software,” Froomkin said. The argument may not be a slam dunk, he said, but “it has the virtue of resonating with something real. Apple is being asked to assert something it doesn’t want to assert” — violating the security of potentially all of the company’s devices for a law enforcement investigation.
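To see what that assertion looks like in practice: a device will only install a firmware image that carries a valid digital signature, one that could only have been produced with the vendor’s private key. The sketch below illustrates the basic idea with a generic RSA signing scheme built on the third-party Python cryptography package; the key names and firmware blob are hypothetical stand-ins, not Apple’s actual signing process.

    # Illustrative sketch of firmware signing and verification using the
    # third-party Python "cryptography" package and a generic RSA scheme.
    # Key names and the firmware blob are hypothetical; this is not Apple's
    # actual signing infrastructure.
    from cryptography.exceptions import InvalidSignature
    from cryptography.hazmat.primitives import hashes
    from cryptography.hazmat.primitives.asymmetric import padding, rsa

    # The vendor holds the private key; every device ships with the public half.
    vendor_private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
    device_public_key = vendor_private_key.public_key()

    firmware_blob = b"hypothetical custom firmware image"

    pss = padding.PSS(mgf=padding.MGF1(hashes.SHA256()),
                      salt_length=padding.PSS.MAX_LENGTH)

    # "Signing it digitally, that's an assertion": only the private-key holder
    # can produce a signature the device will accept.
    signature = vendor_private_key.sign(firmware_blob, pss, hashes.SHA256())

    def device_accepts(blob: bytes, sig: bytes) -> bool:
        """Roughly what a device checks before installing an update."""
        try:
            device_public_key.verify(sig, blob, pss, hashes.SHA256())
            return True
        except InvalidSignature:
            return False

    print(device_accepts(firmware_blob, signature))      # True: vendor-signed
    print(device_accepts(b"tampered image", signature))  # False: rejected

The narrow point of the sketch is that without the vendor’s private key, a modified operating system image simply will not install, which is why the FBI’s order runs through Apple rather than around it.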

Compelled speech is usually “used in a criminal context, protecting the contents of your mind when you don’t want to disclose them,” Teplinsky said. Apple could theoretically make the case that signing a firmware update to circumvent the iPhone’s limit of 10 failed passcode attempts would force the company to express a political belief or divulge intellectual secrets it doesn’t want to share.

An Issue For Congress, Not The Courts

The case is politically and legally divisive: Silicon Valley is split, with Microsoft co-founder Bill Gates agreeing that the government’s request is reasonable and Google and Facebook backing Apple. Even former NSA Director Michael Hayden, who helped implement the agency’s controversial metadata program and still supports it, is in Apple’s corner.

Apple’s case demonstrates the dearth of regulation for cyber issues and leaves the First Amendment as a potential justification for refusing the FBI’s court order. But the end result could have broader implications regarding the government’s ability to use private companies and citizens as extensions of law enforcement.

On a technical level, Apple and cybersecurity experts contend that signing and disseminating official code to bypass the iPhone’s 10-guess limit, after which all data is wiped from the device, would undermine the security of not just one iPhone but of every iPhone.

“The government is saying we don’t want the phone, we will give you [Apple] the phone and you can break into it,” said Jerry Grant, a digital forensics investigator who runs his own consulting firm in Rochester, New York. “But my understanding is, this isn’t a single phone option. [The government] is asking Apple to remove the restriction that wipes the phone after 10 tries. That option is the same on every phone. Apple can’t just build that option to work with just one phone because it’s bypassing a core function of the operating system.”

Removing the 10-try limit would allow the FBI to guess the passcode in about 20 hours, or 10,000 tries, for a four-digit, all-numeric passcode. The government has the technology to build something like what it’s asking Apple for, but it needs proprietary information to do it, Grant said.
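As a back-of-the-envelope check on those numbers: a four-digit, all-numeric passcode has 10 to the 4th power, or 10,000, possible combinations, and 20 hours divided by 10,000 guesses works out to roughly seven seconds per attempt. The short calculation below makes the arithmetic explicit; the per-guess delay is inferred from the figures Grant cites, not a documented iPhone parameter.

    # Back-of-the-envelope brute-force estimate for a four-digit numeric passcode.
    # The per-attempt delay is inferred from the article's figures
    # (20 hours / 10,000 guesses is about 7.2 seconds per guess), not an Apple spec.
    DIGITS_PER_POSITION = 10      # 0-9
    PASSCODE_LENGTH = 4
    SECONDS_PER_GUESS = 7.2       # assumed average time per attempt

    combinations = DIGITS_PER_POSITION ** PASSCODE_LENGTH        # 10,000 passcodes
    worst_case_hours = combinations * SECONDS_PER_GUESS / 3600   # exhaustive search

    print(f"{combinations:,} combinations, about {worst_case_hours:.0f} hours worst case")
    # -> 10,000 combinations, about 20 hours worst case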

Even if the government wins after a series of appeals, Apple could tie the government’s hands by investing in stronger security measures that make the iPhone virtually invincible. The company is already reportedly working on updates for its next iPhones that would make complying with a similar government court order impossible. Moreover, Apple’s employees could independently refuse to fulfill the request, shifting the focus from the rights of a corporation to those of individuals.

“The debate is about how we as a society try to deal with cases when fundamental rights like privacy and freedom of expression measure up against public security and safety. And those questions aren’t best asked in the courts but in the legislature,” Teplinsky said. “The debate needs to happen and it needs to happen in public,” rather than in the courts.

A Deeper Problem

Apple’s resistance to the FBI’s request isn’t out of sheer defiance, but a symptom of growing concern around cybersecurity. The company initially worked with the FBI in the San Bernardino investigation, offering phone data that had been stored on the company’s iCloud servers. Apple also asked the FBI to submit its request under seal, but the agency went public with the ordeal, the New York Times reported.

“For a long time, these issues have been addressed behind closed doors,” Teplinsky said. But the 2013 Snowden revelations changed everything and ignited a global ripple effect. In the U.S. and overseas, “businesses started to really understand that if they didn’t provide some reassurances to customers, that it was going to become a business problem for them. Companies who were working with the intelligence agencies lost business,” and public trust.

The U.S. has been increasingly looking to make phone data more accessible for routine law enforcement investigations as well as counterterrorism efforts. Cyberthreats and the need for encryption have only recently entered public discourse, but tensions within the government and with the public on how to protect the nation’s information systems haven’t been resolved.

“We have a cybersecurity problem. There is information security, which is protecting your own information, and attacking others,” Teplinsky said. “And when it comes to protecting our own, we’re vulnerable. We not only have to worry about government surveillance, when it’s pursuant to court order, but cyber-espionage designed to steal our most valuable secrets.”

And Apple’s response is likely part of a broader movement for which universal encryption is a natural next step.