In the wake of Federal Magistrate Judge Sheri Pym’s February 16 order to compel Apple to create code to unlock the work-provided iPhone of San Bernardino killer Syed Farook, both parties (see the U.S. Justice Department’s motion to compel and Apple’s Motion to Vacate) responded with arguments that reveal how a technical forensics question cuts to some core issues concerning technology, security and fundamental rights.

An initial non-technologist’s note on the tech: on balance, I am persuaded by the view, argued by Apple and others, that creating the software sought in this case involves serious security risks and that, once the code is written, the government would not only enjoy the precedent for future use, but Apple would also be in a difficult position to resist the efforts of other governments. Nor is this merely about access to the passcode: on iOS devices, the ability to identify the passcode is also the ability to decrypt the data on the device. That is to say, my understanding is that this is a case about encryption, even if it doesn’t directly raise the ‘classic’ decryption questions related to public key or other forms of crypto. In short, whatever specific legal issues are at play in federal court, the implications of the case are far-reaching, in both law and policy.
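For readers who want the intuition behind that claim, here is a minimal, hypothetical sketch in Python. It is emphatically not Apple’s actual implementation; the names and parameters are illustrative assumptions. The point it illustrates is general: when the data-protection key is derived from the user’s passcode combined with a device-bound secret, removing the software limits on passcode guessing also yields the key that decrypts the stored data.

```python
# Hypothetical sketch, not Apple's design: a data-protection key derived from
# the passcode plus a device-bound secret means that recovering the passcode
# is, in effect, recovering the key.
import hashlib
import os

DEVICE_SECRET = os.urandom(32)  # stands in for a hardware-bound device key

def derive_data_key(passcode: str) -> bytes:
    """Derive a data-protection key from the passcode and the device secret.
    (A real design would use far more iterations and hardware entanglement.)"""
    return hashlib.pbkdf2_hmac("sha256", passcode.encode(), DEVICE_SECRET, 1_000)

# The key actually protecting the data, set when the user chose passcode "4821".
true_key = derive_data_key("4821")

# If guessing limits (delays, auto-erase) are removed, a 4-digit passcode falls
# to brute force almost immediately -- and with it, the key to the data.
for guess in (f"{n:04d}" for n in range(10_000)):
    if derive_data_key(guess) == true_key:
        print(f"passcode {guess} recovered -> data key recovered")
        break
```

This is why the dispute is about encryption and not just a lock screen: weakening the guessing limits is, in practice, weakening the protection of the data itself.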

U.S. courts will obviously decide the issue based on domestic American law. As my dean and colleague at UCI Law, Erwin Chemerinsky, notes in an op-ed,

The public battle is being portrayed as pitting protecting privacy against safeguarding national security. Although that is the underlying issue, the legal question here is much narrower: Can a court, without statutory authority, force Apple to create new software? I think the answer is no, and Apple should win this legal fight, assuming Apple is, in fact, being asked to create something new.

Because this is an issue for U.S. courts, the public debate belongs to the language of American statutory and Constitutional law. Naturally, then, the articulation of the issues lacks some legal resonance beyond American borders and doesn’t quite capture how the dispute’s resolution may have profound implications for the hundreds of millions of individuals worldwide who use secure digital technologies, including Apple devices.

One way to deal with this (let’s say) rhetorical gap is to translate the domestic legal debate into the legal framework that does resonate globally – international human rights law, in particular the International Covenant on Civil and Political Rights (ICCPR), which binds the United States and 168 other countries. How the United States handles this case will reverberate among other parties to the ICCPR, not to mention non-parties looking to legitimate how and when they compromise digital security for their citizens.

Article 17 of the ICCPR protects against “unlawful or arbitrary interference” with privacy, and it guarantees everyone “protection of the law against such interference or attacks.” An argument could be made, akin to the domestic argument to which Dean Chemerinsky alludes, that the order to compel Apple to write code is not based on applicable law, since the relevant catch-all act of Congress (the All Writs Act) could not have contemplated the kind of digital technologies pervasive in our lives. And the fact that, in this specific case, even the San Bernardino police chief says “there’s a reasonably good chance that there’s nothing of any value on the phone,” leads me to wonder whether the effort itself lacks the elements of reasonableness to justify it as a non-arbitrary interference with privacy. In this particular case, the narrow Article 17 issues may be complicated by the fact that San Bernardino County owned the phone and Farook is dead. However, as a broader question of the authority of the government to compel the writing of code to undermine digital security, Article 17 is a good place to start thinking about implications under the ICCPR’s right to privacy.

Those concerned instrumentally with privacy — that is to say, how privacy is a critical gateway for the enjoyment of other rights — may look elsewhere in the ICCPR. And here is where I think it’s particularly useful to think about how Judge Pym’s order and its broad implications may serve to undermine freedom of expression. Article 19 of the ICCPR provides:

1. Everyone shall have the right to hold opinions without interference.

2. Everyone shall have the right to freedom of expression; this right shall include freedom to seek, receive and impart information and ideas of all kinds, regardless of frontiers, either orally, in writing or in print, in the form of art, or through any other media of his choice.

3. The exercise of the rights provided for in paragraph 2 of this article carries with it special duties and responsibilities. It may therefore be subject to certain restrictions, but these shall only be such as are provided by law and are necessary: (a) for respect of the rights or reputations of others; (b) for the protection of national security or of public order (ordre public), or of public health or morals.

In my report on encryption, presented to the Human Rights Council last spring, I argued that secure communications are fundamental to the exercise of freedom of opinion and expression in the digital age, permitting the maintenance of opinions without interference and securing the right to seek, receive, and impart information and ideas. Encryption and anonymity, I argued, allow for zones of privacy that enable all sorts of expression. From the report:

11. . . . an open and secure Internet should be counted among the leading prerequisites for the enjoyment of the freedom of expression today. But it is constantly under threat, a space — not unlike the physical world — in which criminal enterprise, targeted repression and mass data collection also exist. It is thus critical that individuals find ways to secure themselves online, that Governments provide such safety in law and policy and that corporate actors design, develop and market secure-by-default products and services. None of these imperatives is new. Early in the digital age, Governments recognized the essential role played by encryption in securing the global economy, using or encouraging its use to secure Government-issued identity numbers, credit card and banking information, business proprietary documents and investigations into online crime itself.

12. Encryption and anonymity, separately or together, create a zone of privacy to protect opinion and belief. For instance, they enable private communications and can shield an opinion from outside scrutiny, particularly important in hostile political, social, religious and legal environments. Where States impose unlawful censorship through filtering and other technologies, the use of encryption and anonymity may empower individuals to circumvent barriers and access information and ideas without the intrusion of authorities. Journalists, researchers, lawyers and civil society rely on encryption and anonymity to shield themselves (and their sources, clients and partners) from surveillance and harassment. The ability to search the web, develop ideas and communicate securely may be the only way in which many can explore basic aspects of identity, such as one’s gender, religion, ethnicity, national origin or sexuality. Artists rely on encryption and anonymity to safeguard and protect their right to expression, especially in situations where it is not only the State creating limitations but also society that does not tolerate unconventional opinions or expression.

Lest we think of secure technologies as necessary only for the vanguard of activists and professionals in expressive fields, all we need to remember is the trite but true analogy to the postcard – sending messages online without encryption is equivalent to sending a postcard or unsealed letter, to which anyone with the means (and it’s not hard) can gain access. A world of open, insecure, transparent communications is also a world of chilled expression.
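To make the postcard analogy concrete, here is a toy sketch using the widely available third-party Python cryptography package. It is an illustration of the general principle, not a description of any particular messaging system: without encryption, anyone positioned along the network path reads exactly what the sender transmits; with encryption, the intermediary sees only an opaque token.

```python
# Toy illustration of the postcard analogy (requires the third-party
# "cryptography" package): an unencrypted message is readable by any
# intermediary; an encrypted one is not, absent the key.
from cryptography.fernet import Fernet

message = b"meet me at the clinic at 9"

# Unencrypted ("postcard"): the bytes on the wire are the message itself.
postcard = message
print("intermediary reads:", postcard.decode())

# Encrypted ("sealed letter"): the bytes on the wire reveal nothing useful
# without the key, which only sender and recipient hold.
key = Fernet.generate_key()
sealed = Fernet(key).encrypt(message)
print("intermediary reads:", sealed[:40].decode(), "...")  # opaque ciphertext
print("recipient reads:   ", Fernet(key).decrypt(sealed).decode())
```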

So, to the extent that Judge Pym’s order leads to vulnerabilities in secure communications, compromises in the ability of individuals worldwide to fight and evade the consequences of censorship, and precedent that could be deployed on other platforms moving forward, I believe that the implications for freedom of expression are potentially quite serious.

The freedom of expression, it’s often said, is not absolute, and Article 19(3) permits restrictions where provided by law and necessary and proportionate to protect, among other things, national security and public order. Law enforcement and national security officials raise legitimate concerns about the barriers posed by encrypted communications, whatever the device. Here’s in part what my report said about that last year:

13. The “dark” side of encryption and anonymity is a reflection of the fact that wrongdoing offline takes place online as well. Law enforcement and counter-terrorism officials express concern that terrorists and ordinary criminals use encryption and anonymity to hide their activities, making it difficult for Governments to prevent and conduct investigations into terrorism, the illegal drug trade, organized crime and child pornography, among other government objectives. Harassment and cyberbullying may rely on anonymity as a cowardly mask for discrimination, particularly against members of vulnerable groups. At the same time, however, law enforcement often uses the same tools to ensure their own operational security in undercover operations, while members of vulnerable groups may use the tools to ensure their privacy in the face of harassment. Moreover, Governments have at their disposal a broad set of alternative tools, such as wiretapping, geo-location and tracking, data-mining, traditional physical surveillance and many others, which strengthen contemporary law enforcement and counter-terrorism.

So the government enjoys a clearly legitimate interest when it comes to public order and national security, but the order here seems to me to be problematic as a matter of Article 19. First, just as with the legal standard in Article 17, I have some doubt whether the order is based on authority that we should consider “provided by law.” Assuming the All Writs Act is sufficient, however, is the restriction “necessary…for the protection of national security or of public order”? To be sure, the FBI and local law enforcement, such as New York’s district attorney, argue that access to secure communications is necessary. In the case of this iPhone used by this particular person, it is easy to imagine a situation in which access to the data on the phone is necessary. Law enforcement’s argument for access to the data is not frivolous, and if the technology were not such as to create vulnerabilities across all devices (as Apple and others argue), we would probably not be having this debate.

But that’s perhaps the critical problem. The concept of necessity under Article 19 includes within it the concept of proportionality (see para 35 of the report for my discussion with an array of sources; the Human Rights Committee made this clear in General Comment 34, starting at para 22; and governments by and large agree with this understanding of necessity under Article 19). Seeking access in this situation according to the proposed means would almost certainly implicate the security, and thus the freedom of expression, of unknown but likely vast numbers of people, those who rely on secure communications for the reasons identified above and explicated in more detail in the report. This is fundamentally a problem of technology, one where compromising security for one and only one time and purpose seems exceedingly difficult if not impossible. It is also not clear that the government has tried other means short of compelling Apple’s code-writing in this case, such as enlisting the technical expertise of the NSA to access this particular phone. Again from the report, “States must show, publicly and transparently, that other less intrusive means are unavailable or have failed and that only broadly intrusive measures, such as backdoors, would achieve the legitimate aim.” (para 43)

The FBI director said last week that this is one of the hardest issues he’s seen in government. I don’t doubt that. But I am not sure that the government has fully appreciated the trade-offs involved for the long-term security of communication. During the first term of the Obama administration, the Department of State, led by Secretary Clinton, pursued a strong policy of Internet freedom, one in which the global use of secure technologies was encouraged as a bulwark against repression, censorship, and harassment. It just so happens that such security has value not only for expression but also for the overall economy. Ultimately, however, that policy of securing the Internet depends upon strong protection in domestic law at home, and that’s a conversation that can benefit from incorporating human rights law into the discourse.
