Hi all,
My partner, Patrick Ball, wrote a great piece in Foreign Affairs linking the need for strong encryption to human rights in the US and around the world.
https://www.foreignaffairs.com/articles/2016-09-14/case-against-golden-key?c...
Worth a tweet?
Cindy
*********************************** Cindy Cohn Executive Director Electronic Frontier Foundation 815 Eddy Street San Francisco, CA 94109 (415) 436-9333 x108 Cindy@eff.org www.eff.org
Join EFF! https://supporters.eff.org/donate
Cindy Cohn:
Hi all,
My partner, Patrick Ball, wrote a great piece in Foreign Affairs linking the need for strong encryption to human rights in the US and around the world.
https://www.foreignaffairs.com/articles/2016-09-14/case-against-golden-key?c...
Worth a tweet?
Cindy
Definitely!
-Katie (Krauss)
On 14 Sep (08:16:52), Cindy Cohn wrote:
Hi all,
My partner, Patrick Ball, wrote a great piece in Foreign Affairs linking the need for strong encryption to human rights in the US and around the world.
https://www.foreignaffairs.com/articles/2016-09-14/case-against-golden-key?c...
So hrm... this article requires registration to read, which I don't want to do :). Is there someone with the full-text version who can post it?
Thanks! David
Worth a tweet?
Cindy
tor-project mailing list tor-project@lists.torproject.org https://lists.torproject.org/cgi-bin/mailman/listinfo/tor-project
Weird. I didn’t have to sign in. Here’s the text which I share for research purposes with this list.
Cindy
The Case Against a Golden Key
Published by the Council on Foreign Relations
Encryption Is a Life or Death Matter
By Patrick Ball
In April, not long after Apple refused to unlock the iPhone of one of the San Bernardino shooters, a variety of U.S. law enforcement representatives and officials, including the FBI, announced their support of an encryption bill that would require companies to “comply with court orders to protect Americans from criminals and terrorists.” In short, it would require a backdoor, or “golden key,” for the authorities to be encoded into all the software running on cell phones or personal computers. Although the FBI’s legislative initiative failed earlier this year, a new effort to expose Americans’ data is under way in Congress.
Developing such keys is unproductive and dangerous. Although the primary worry in the United States is that it could lead to mass surveillance, the lesser-known hazard is that it could jeopardize the safety of human rights activists, primarily those based abroad who rely on U.S. encryption tools to do their work.
There is no guarantee that a golden key will work, given the ease with which rogue hackers from all over the world can develop their own encryption tools. In February, a study by the Berkman Klein Center for Internet & Society at Harvard University surveyed some 865 encryption products from 55 countries, two-thirds of which were built outside the United States. Of these, roughly two-thirds are commercial and the others are open source, though some of the free products are only libraries that contain building blocks rather than whole encryption systems. Given the resources available to ill-intentioned hackers, it would therefore be impossible to stop them from building strong encryption applications of their own.
Instead, building encryption software with a golden key for government access would gravely compromise security for law-abiding citizens around the world, as it would encourage criminals and terrorists to build their own illegal software to frustrate the authorities and leave those without the technological skills—most of the world—more vulnerable to attack.
Numerous human rights activists working in developing countries have told me that encryption is one of the things most vital to their safety. In preparation for a debate with James Baker, the FBI general counsel, and John Inglis, former deputy director of the National Security Agency, at the National Academies of Sciences, Engineering, and Medicine, I surveyed about a dozen of my colleagues who provide training and technology to civil rights, civil liberties, and human rights projects around the world. They told me about a few of the groups using cryptography to protect their data: journalists’ associations in Egypt, Nicaragua, Rwanda, and Uganda; LGBT groups in Jordan, Morocco, Serbia, and central and southern Africa; democracy activists in Ethiopia, Kyrgyzstan, and Turkey; human rights activists in Cambodia, Tunisia, East Africa, and Latin America; and environmental activists in Ecuador and India.
I asked my colleagues whether these groups were concerned about general surveillance since Edward Snowden’s revelations about the breadth of U.S. spying. Their general answer was, yes, they’re more aware of the problem, and yes, it worries them, but their day-to-day issues are local. These groups worry most about attacks by their own military and police. As they work to promote and protect basic human rights, they face surveillance, beatings, imprisonment, torture, and execution by their own governments.
Many of the encryption tools that human rights groups use were funded by grants from the U.S. Department of State. The State Department’s goal with this funding is to further U.S. interests abroad by supporting vibrant, independent nongovernmental organizations that hold corrupt, brutal governments publicly accountable for their actions. They recognize that the protection of data—particularly data about abuses by government officials—is vital to that end.
One of the dozens of free and open source encryption tools designed specifically for human rights work is the Martus project. In 2001, I worked with a social entrepreneur and a team of software engineers to create it, and I directed the project until early 2013. Martus is a self-encrypting database application that stores information in a network of servers—that is to say, in the “cloud.” Hundreds of human rights groups have stored data in Martus. It is offered in over ten languages and has been downloaded in more than 100 countries.
I’ve worked on and off in Guatemala for over 20 years, and the country is a prime example of how encryption aids human rights work and why it is so important. During the 1990s, I worked with a nongovernmental project that interviewed more than 6,000 people and collected media and other information about the tens of thousands of people disappeared, tortured, and killed by the government during the 1980s.
We encrypted our database files every night with a system called PGP, which was also used for other human rights investigations in Guatemala, to protect them against theft by government-sponsored death squads. Our security concerns were not merely good practice with sensitive information. The Guatemalan National Police had been accused of committing thousands of forced disappearances during the 1980s. My colleagues even knew people who had been disappeared. We worried constantly about physical attacks on our office, as well as digital assaults on the sensitive data we were accumulating.
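The nightly routine described here, encrypting the database so a stolen disk yields only ciphertext and any tampering is detectable on decryption, can be sketched in a few lines. This is a hedged illustration, not the project's actual PGP setup: it uses the Python `cryptography` library's Fernet recipe as a modern stand-in for symmetric file encryption, and the function names are invented for this sketch.

```python
# Minimal sketch of a nightly "encrypt the database file" step, assuming
# a modern symmetric recipe (Fernet: AES-CBC plus HMAC) in place of the
# PGP tooling the article describes. Function names are hypothetical.
from cryptography.fernet import Fernet

def encrypt_file_bytes(key: bytes, plaintext: bytes) -> bytes:
    """Return authenticated ciphertext; safe to leave on disk overnight."""
    return Fernet(key).encrypt(plaintext)

def decrypt_file_bytes(key: bytes, ciphertext: bytes) -> bytes:
    """Raises InvalidToken if the ciphertext was modified or the key is wrong."""
    return Fernet(key).decrypt(ciphertext)

key = Fernet.generate_key()  # the secret kept away from the machine itself
record = b"interview #4721: witness statement ..."
blob = encrypt_file_bytes(key, record)  # all an intruder would find on disk
assert decrypt_file_bytes(key, blob) == record
```

Because the recipe is authenticated, decryption fails loudly if the stored file is altered, which matches the article's twin goals: concealment from attackers and proof that the evidence was not tampered with.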
In 2013, a Guatemalan court convicted the former president, General José Efraín Ríos Montt, of genocide against the indigenous Ixil people. During the trial, I presented statistical evidence consistent with the prosecutor’s claims that the army committed acts of genocide. I had worked with the nongovernmental project mentioned above and with the UN-run truth commission. Both of these 1990s research projects involved interviewing thousands of people about the violence of the 1980s, and this information was extremely sensitive because many of the perpetrators continued to hold high-ranking positions in the army and the government. Ríos Montt’s conviction was later overturned by the Guatemalan Constitutional Court on a technicality, but prosecutors have appealed the decision and are awaiting a new trial.
In 2007, when we began analyzing the lost and rediscovered archives of the Guatemalan National Police, we used Martus to store and encrypt the data. In 2013, prosecutors used documents and statistical analysis from the police archive to convict the director of the National Police, Héctor Bol de la Cruz, of responsibility for commanding the disappearance of student and labor leader Edgar Fernando García.
Documents in the police archive revealed that during the 1980s, the National Police exchanged information with the FBI about training and suspects, in large part through links with the U.S. State Department’s law enforcement liaisons. It’s clear what a “suspect” meant to the National Police—human rights activists, student and labor leaders, and dissident professors. We don’t yet know what information the FBI provided to assist the National Police.
In Guatemala, encryption was essential for protecting the data we collected to assure it was not tampered with and to conceal it from the officials we later hoped to try for crimes. It worked. After decades of painstaking research, independent groups persuaded Guatemalan prosecutors to try these cases, and we won.
A similar effort is under way in Iraq today. Several Iraqi groups are using the Martus secureApp Generator to create tools to document human rights violations committed by the Islamic State (otherwise known as ISIS), by other insurgent groups, by the Shiite militias aligned with the government, or by agents of the Iraqi government. The human rights groups use Martus to protect data about the victims and witnesses of violence.
In November 2014, I spent a week in northern Iraq working with Yezidi human rights groups after ISIS invaded their communities, killing or abducting thousands of people and causing thousands more to flee. A colleague and I shared Martus with them and helped the Yezidi groups adopt secure methods to document violence against their community. The Yezidi have one of the strongest cases of genocide that I have seen in recent years, but to bring justice to their people, they will have to prove it in a fair trial of the perpetrators. Yezidi groups are currently worried that as they collect data, ISIS will try to assassinate witnesses or human rights workers. Strong encryption protects their evidence against ISIS and against rogue or government hackers.
Many of the cases discussed here deal with nongovernmental groups in countries that are or were considered U.S. allies. My colleagues and I have collected data and presented expert statistical testimony in the trials for war crimes, crimes against humanity, and genocide of three former heads of state—Yugoslav leader Slobodan Milosevic, Ríos Montt, and Chadian President Hissène Habré. Ríos Montt and Habré were very close U.S. allies while they were in power. For example, former U.S. President Ronald Reagan once said of Ríos Montt, who murdered and disappeared about 100,000 civilians during his 18 months in office, that he was “totally dedicated to democracy.” Ríos Montt’s government worked with the FBI. When the FBI demands access to all the world’s data, it is prudent to remember the types of governments with whom the FBI will share this information.
THE HOME THREAT
Although the dangers appear less imminent in the United States, domestic civil rights, civil liberties, and human rights groups are similarly concerned about government surveillance of their electronic communications and stored data. For example, Human Rights Watch has sued the U.S. government about surveillance three times. The organization uses secure applications for cell phone calls (wherever possible), as well as end-to-end encrypted videoconferencing. Human Rights Watch’s internal e-mail platform employs two types of commercial encryption, and the organization is actively researching methods to archive encrypted information.
Journalists in the United States are also using strong cryptography, such as an innovative program called SecureDrop, which enables whistleblowers to share information with media organizations securely and anonymously. This is especially important for whistleblowers with information on malfeasance by officials in local, state, or federal government. As in foreign countries, protecting free media requires security against government snooping.
Unfortunately, this is not merely speculation. The need for security against U.S. government surveillance is well founded. The FBI has acknowledged its history of surveillance of Martin Luther King, Jr., Malcolm X, and Muhammad Ali, among many other African American and antiwar activists during the 1960s. Referring to these cases, Baker admitted during our debate at the National Academies of Sciences, Engineering, and Medicine that the FBI has made “mistakes.” There have been quite a few.
In the 1980s, the FBI conducted illegal investigations against the Committee in Solidarity with the People of El Salvador, which resulted in discipline for six FBI agents and the resignation of the agent responsible. The FBI’s past surveillance of nonviolent critics of U.S. government policy is now well documented in the historical record.
We should also not forget that surveillance disproportionately affects communities of color. As the journalist Trevor Aaronson wrote after investigating the FBI’s Miami office and discovering its network of thousands of confidential informants, “Anytime the government has a toy or tactic, it starts with a vulnerable population.”
A golden key to all of the world’s data would be quite a toy. Malkia Cyril, a Black Lives Matter activist and executive director of the Center for Media Justice, wrote recently that “encryption is necessary for black civil and human rights to prosper [because] it protects our democratic right to organize for change.” She points out that the police are already surveilling those involved in nonviolent movements demanding police reform. They monitor these activists’ social media activity and seize their phones if they are arrested during a protest.
Of course, we do not yet know the extent of the FBI’s intrusions, but what we’ve learned about the extrajudicial use of stingray devices for cell phone metadata tracking suggests that there’s a substantial problem. Cyril concludes, “For black communities and others pushed to the margins of political and economic power—democratic engagement and the exercise of our human and civil rights in a digital age demands the ability to encrypt our communications.”
THE REASONING
The FBI has been disingenuous about why it desires a golden key. At the simplest level, the FBI’s complaint about encryption is that criminals’ communication will “go dark.” What the FBI means is that in a few cases, suspects’ phones and data will be inaccessible to law enforcement because the data is strongly encrypted. But these arguments fail to convey the vast new powers law enforcement has already gained through access to each phone’s location tracking and metadata.
The FBI’s chief hacker, Jim Burrell, argued at the panel we both participated in that “lawfully” hacking devices is time-consuming, difficult, and expensive, although Burrell conceded that his department’s budget is in the “hundreds of millions” of dollars. Baker claimed that there are cases that the FBI simply cannot solve because a suspect has so securely encrypted the data. He did not provide specifics, but some similar claims by the FBI turned out to be exaggerations. For example, former FBI agent Ron Hosko was forced to withdraw such a claim in a 2014 Washington Post op-ed. The journalist Timothy B. Lee analyzed the exaggeration in Vox:
[Hosko] offer[ed] a concrete example of a case where smartphone encryption would have thwarted a law enforcement investigation and cost lives. "Had this technology been in place," Hosko wrote, "we wouldn’t have been able to quickly identify which phone lines to tap. That delay would have cost us our victim his life."
There's just one problem: Hosko was wrong. In the case he cited, the police had not used information gleaned from a seized smartphone. Instead, they used wiretaps and telephone calling records—methods that would have been unaffected by Apple's new encryption feature. The Washington Post was forced to issue a correction.
The FBI also famously backtracked in the recent Apple case: at first the bureau said it was impossible to crack the phone’s encryption, but after Apple refused to provide a golden key, it found a way to hack the phone anyway.
Although Baker insisted emphatically that the FBI supports “strong encryption . . . full stop,” he was quick to qualify. The FBI supports “strong encryption” only if it also provides “lawful access.” But providing access to encrypted material without the user’s passphrase (or other keys) is the exact definition of weak encryption.
Furthermore, the FBI solves very few cases relative to the huge number of positive uses of encryption we see in commerce, nongovernmental groups, and even government communications globally. Creating a golden key puts all of this at risk.
What I have learned over the past 25 years is that encryption saves the lives of people who are working to protect human rights and advance freedom around the world. It is clear that the FBI is willing to compromise the security of our national electronic infrastructure and to risk the lives of activists to advance its short-term institutional interests. The question for the rest of us, for the White House and for Congress, and for the American people, is this: are we willing to massively degrade security for everyone, and weaken journalists and independent groups, simply to add to the FBI’s already enormous powers?
The work of independent, nongovernmental groups moves us all forward toward a more just and respectful world. This is, by far, the best defense against terrorism, particularly against the terror wreaked by the police and militaries that commit the majority of the world’s violence against civilians. Today, putting people’s physical security first—whether it’s against repressive governments, cybercriminals, or even nongovernmental terrorists—requires strong digital security.
Greetings,
Responding inline:
There is no guarantee that a golden key will work, given the ease with which rogue hackers from all over the world can develop their own encryption tools. In February, a study by the Berkman Klein Center for Internet & Society at Harvard University surveyed some 865 encryption products from 55 countries, two-thirds of which were built outside the United States. Of these, roughly two-thirds are commercial and the others are open source, though some of the free products are only libraries that contain building blocks rather than whole encryption systems. Given the resources available to ill-intentioned hackers, it would therefore be impossible to stop them from building strong encryption applications of their own.
Might there be other valid definitions of a "golden key"? In my view, an operating system's software-update signing key is obviously a golden key, and it is guaranteed to work.
http://arstechnica.com/security/2016/02/most-software-already-has-a-golden-k...
Instead, building encryption software with a golden key for government access would gravely compromise security for law-abiding citizens around the world, as it would encourage criminals and terrorists to build their own illegal software to frustrate the authorities and leave those without the technological skills—most of the world—more vulnerable to attack.
Are you saying that cutting-edge security software necessarily gets developed by criminals and terrorists? In that argument lie slippery slopes. I can tell you that many computer security researchers have certainly been accused of criminal activity, but let's not perpetuate that stereotype.
Again, my point above: Debian's package signing keys, for example, weren't intended to be a "golden key," but it turns out they are. Yes, crypto is used, but we need cryptographic group signature schemes to protect against key compromise.
Journalists in the United States are also using strong cryptography, such as an innovative program called SecureDrop, which enables whistleblowers to share information with media organizations securely and anonymously. This is especially important for whistleblowers with information on malfeasance by officials in local, state, or federal government. As in foreign countries, protecting free media requires security against government snooping.
Actually, SecureDrop doesn't have any end-to-end crypto unless the source encrypts the document with the journalist's PGP key. I know that SecureDrop uses Tor onion services, which provide end-to-end transport crypto, but that's not the same as application-level end-to-end crypto. In the worst-case scenario, if the SecureDrop server were hacked, the attacker could read any documents that were submitted without PGP encryption.
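The distinction between transport crypto and application-level end-to-end crypto can be sketched concretely: in the end-to-end model, the source encrypts to the journalist's public key before anything touches the server, so a compromised server holds only ciphertext. This is a hedged illustration using RSA-OAEP from the Python `cryptography` library, not SecureDrop's actual mechanism (which relies on the journalist's PGP key); the function names are invented here.

```python
# Sketch of application-level end-to-end encryption: only ciphertext
# ever reaches the server. Uses RSA-OAEP purely as an illustration;
# real systems typically use a hybrid scheme (e.g. PGP) for large files.
from cryptography.hazmat.primitives.asymmetric import rsa, padding
from cryptography.hazmat.primitives import hashes

# Journalist's keypair; only the public half is ever published.
journalist_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)

oaep = padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                    algorithm=hashes.SHA256(), label=None)

def source_encrypt(document: bytes) -> bytes:
    """Runs on the source's machine; the server sees only the result."""
    return journalist_key.public_key().encrypt(document, oaep)

def journalist_decrypt(ciphertext: bytes) -> bytes:
    """Runs on the journalist's workstation, where the private key lives."""
    return journalist_key.decrypt(ciphertext, oaep)

leak = b"memo: officials knew in March"
uploaded = source_encrypt(leak)   # all a hacked server could ever read
assert uploaded != leak
assert journalist_decrypt(uploaded) == leak
```

Transport crypto (Tor onion services, TLS) protects the bytes in flight but terminates at the server; the sketch above is what it takes for a server compromise to reveal nothing.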
In Ka-Ping Yee's most excellent paper "User Interaction Design for Secure Systems" ( http://zesty.ca/pubs/icics-2002-uidss.pdf ), he describes various principles and properties that secure software systems should have. One of them is the Principle of the Path of Least Resistance, which can be summarized as "the natural way should be the secure way." In other words, the user is going to do the easiest thing, so if an extra action is needed for additional security, it will be neglected.
sincerely, david
On Sep 14, 2016, at 11:58 PM, dawuud dawuud@riseup.net wrote:
Greetings,
Hi David,
You know I didn’t write this, yes? It’s just a piece written by my partner, who is a pretty famous war crimes investigator. And you know that it already ran in Foreign Affairs magazine, yes? So while I wasn’t really seeking comment, a few thoughts below.
Responding inline:
There is no guarantee that a golden key will work, given the ease with which rogue hackers from all over the world can develop their own encryption tools. In February, a study by the Berkman Klein Center for Internet & Society at Harvard University surveyed some 865 encryption products from 55 countries, two-thirds of which were built outside the United States. Of these, roughly two-thirds are commercial and the others are open source, though some of the free products are only libraries that contain building blocks rather than whole encryption systems. Given the resources available to ill-intentioned hackers, it would therefore be impossible to stop them from building strong encryption applications of their own.
Might there be other valid definitions of a "golden key"?
Of course, but he starts the piece by defining what he means, by reference to the Feinstein/Burr bill:
" encryption bill that would require companies to “comply with court orders to protect Americans from criminals and terrorists.””
You might not like this definition but I don’t think the piece is confusing on that point.
In my view, an operating system's software-update signing key is obviously a golden key, and it is guaranteed to work.
http://arstechnica.com/security/2016/02/most-software-already-has-a-golden-k...
Also, I think this is talking to a different audience than Ars. Honestly, the term “golden key” has been used by the government, as well as by folks like us at EFF, since the 1990s to refer to a wide range of ways for the government to get access to the plaintext of something that’s encrypted without the cooperation of the encryptor, so I don’t think it’s misused here.
Instead, building encryption software with a golden key for government access would gravely compromise security for law-abiding citizens around the world, as it would encourage criminals and terrorists to build their own illegal software to frustrate the authorities and leave those without the technological skills—most of the world—more vulnerable to attack.
Are you saying that cutting edge security software necessarily gets developed by criminals and terrorists?
No, that’s not what it’s saying. It says that even if you try to block strong encryption from regular people, criminals and terrorists will still get it. And then you get the worst of both worlds: ordinary people are not protected, and surveilling criminals and terrorists is still hard for law enforcement.
In that argument lie slippery slopes. I can tell you certainly many computer security researchers have been accused of criminal activity but let's not perpetuate that stereotype.
I certainly do. I’ve now spent about 26 years and counting defending computer researchers, starting with Dan Bernstein in the Bernstein v. DOJ case. And of course Patrick has been offering encryption for about the same amount of time to keep people around the world safe.
Again my point above regarding for example Debian's package signing keys, they weren't intended to be a "golden key" but it turns out they are. Yes crypto is used but we need cryptographic group signature schemes to protect against key compromise.
I think you’re misreading what he’s saying. Sorry. He’s not attacking everything anyone has ever called a “golden key.” He’s attacking what the Feinstein/Burr bill requires.
Journalists in the United States are also using strong cryptography, such as an innovative program called SecureDrop, which enables whistleblowers to share information with media organizations securely and anonymously. This is especially important for whistleblowers with information on malfeasance by officials in local, state, or federal government. As in foreign countries, protecting free media requires security against government snooping.
Actually, SecureDrop doesn't have any end-to-end crypto unless the source encrypts the document with the journalist's PGP key. I know that SecureDrop uses Tor onion services, which provide end-to-end transport crypto, but that's not the same as application-level end-to-end crypto. In the worst-case scenario, if the SecureDrop server were hacked, the attacker could read any documents that were submitted without PGP encryption.
That’s why it says “enables” and not “always does it perfectly.”
In Ka-Ping Yee's most excellent paper "User Interaction Design for Secure Systems" ( http://zesty.ca/pubs/icics-2002-uidss.pdf ), he describes various principles and properties that secure software systems should have. One of them is the Principle of the Path of Least Resistance, which can be summarized as "the natural way should be the secure way." In other words, the user is going to do the easiest thing, so if an extra action is needed for additional security, it will be neglected.
We are in strong agreement, it seems. But we’re in a very hard fight with a government that wants to effectively ban strong encryption. That’s why people like me at EFF, and Patrick from his role in the international human rights community, are working hard to reach out beyond tech communities, like to the readers of Foreign Affairs, to try to explain this in language they can understand without having to read technical papers. Because if we stay in our technical silo, we’ll lose.
cheers,
Cindy
sincerely, david