Published on May 05, 2020

Until COVID-19 took Capitol Hill by storm, folks on both sides of the aisle were in the middle of debating legislation that could affect software companies’ use of end-to-end encryption.

As it currently stands, software companies can provide their users with end-to-end encryption (E2EE) without being held liable for the content that their users post or transmit. Section 230 of the 1996 Communications Decency Act provides software vendors with this protection.

However, the government contends that E2EE allows terrorists, drug cartels, and human traffickers to act with impunity, and the Justice Department would like access to software companies’ encrypted applications via a backdoor. On the other side of the argument, cybersecurity experts and privacy advocates argue that providing governments with a backdoor into encrypted apps ultimately makes users’ data less secure.

The encryption debate has been contentious for years

The debate kicked into high gear in the aftermath of the December 2015 San Bernardino shootings. During this distressing time, the FBI tried to access one of the shooters’ phones; however, Apple refused to provide backdoor access, arguing that the FBI’s request violated the Fourth Amendment, which protects against unreasonable search and seizure. Ultimately, the FBI resolved the matter outside of court and was able to access the phone via a third-party, commercial source.

With encrypted apps like Telegram and Discord allegedly functioning as a new form of dark web, and with Facebook set to roll out E2EE conversations across its platforms (Instagram, Facebook Messenger, WhatsApp), the encryption debate is in full swing.

Facebook’s end-to-end encryption initiatives have been consistently challenged

Back in October of 2019, U.S. Attorney General Barr and representatives from the U.K. and Australia penned an open letter to Facebook requesting that the social media giant not go forward with E2EE across its messaging platforms. With E2EE, the keys that unlock messages are stored on the individual devices at the endpoints of a conversation, rather than on the information service providers’ servers (e.g., those belonging to WhatsApp or Instagram). Co-signed by U.K. Home Secretary Priti Patel, Australian Minister for Home Affairs Peter Dutton, and Acting U.S. Secretary of Homeland Security Kevin McAleenan, Barr’s letter contends that Facebook’s E2EE initiatives will facilitate child exploitation, terrorism, and election tampering.
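
To make the endpoint-key idea concrete, here is a minimal, illustrative sketch in Python using the open-source cryptography library. This is not any vendor’s actual protocol (messengers like WhatsApp layer far more on top, such as the Signal protocol’s ratcheting keys); it simply shows that when key agreement happens on the devices themselves, the relaying server never holds a key capable of decrypting the message.

```python
# Minimal sketch of the endpoint-key idea: each device generates and keeps its
# own private key, and the shared message key is derived only on the devices.
# Requires the "cryptography" package (pip install cryptography).
import os

from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric.x25519 import X25519PrivateKey
from cryptography.hazmat.primitives.ciphers.aead import AESGCM
from cryptography.hazmat.primitives.kdf.hkdf import HKDF

# Each device generates its key pair locally; private keys never leave the device.
alice_private = X25519PrivateKey.generate()
bob_private = X25519PrivateKey.generate()

# Only the public keys travel through the provider's servers.
alice_public = alice_private.public_key()
bob_public = bob_private.public_key()

def derive_message_key(own_private, peer_public):
    """Both endpoints derive the same 256-bit key from a Diffie-Hellman exchange."""
    shared_secret = own_private.exchange(peer_public)
    return HKDF(algorithm=hashes.SHA256(), length=32, salt=None,
                info=b"e2ee-demo").derive(shared_secret)

# Alice encrypts on her device; the provider relays the ciphertext but cannot open it.
nonce = os.urandom(12)
ciphertext = AESGCM(derive_message_key(alice_private, bob_public)).encrypt(
    nonce, b"hello Bob", None)

# Bob decrypts on his device with a key derived from his own private key.
plaintext = AESGCM(derive_message_key(bob_private, alice_public)).decrypt(
    nonce, ciphertext, None)
assert plaintext == b"hello Bob"
```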

Civil liberties advocates, such as the ACLU and the EFF, disagree with the governments’ stance, arguing that E2EE protects the privacy of citizens and gives control to individuals rather than corporations. CIA officer turned whistleblower Edward Snowden writes, “The government is demanding backdoor access to the private communications of 1.5 billion people using WhatsApp. If Facebook agrees, it may be the largest overnight violation of privacy in history.”

Despite Barr’s open letter to Facebook, it remains to be seen whether encryption laws will be altered in the United States. Mark Zuckerberg responded to the letter, strongly condemning Barr’s proposal. In his response, Zuckerberg points out that U.S. and U.K. law enforcement agencies are already able to access data from tech companies when authorities have proper warrants or subpoenas.

Signed into law in March 2018, the Clarifying Lawful Overseas Use of Data Act (CLOUD Act) requires U.S.-based tech companies to provide law enforcement with access to data on servers, regardless of where the servers are located; however, this act doesn’t require tech companies to build backdoors for law enforcement. According to Snowden, Zuckerberg, and civil liberties advocates, backdoors are a bridge too far.

In all likelihood, such an important and contentious issue will ultimately reach the Supreme Court.

The U.S. Supreme Court has historically protected citizens’ privacy rights in the wake of technological advancements

Katz v. U.S. (1967) was a landmark Supreme Court decision that expanded the scope of the Fourth Amendment to account for new wiretapping technologies. The justices ruled that citizens have a “reasonable expectation of privacy” when it comes to searches and seizures.

Another important Supreme Court decision, Kyllo v. U.S. (2001), ruled that using thermal-imaging technology as part of a search was an invasion of privacy. Years later, after GPS technologies matured, the Supreme Court ruled in U.S. v. Jones (2012) that it was unlawful for police departments to use GPS tracking devices without a valid warrant.

It remains to be seen whether the Court will again rule in favor of users’ privacy in regard to encryption technologies.

The most recent development: the bipartisan EARN IT Act

Written by Lindsey Graham (R-SC) and Richard Blumenthal (D-CT), the bipartisan “Eliminating Abusive and Rampant Neglect of Interactive Technologies Act” (EARN IT Act) is currently on the table, and like most legislation with potential encryption ramifications, it is rather contentious.

Part of the EARN IT Act focuses on Section 230 of the 1996 Communications Decency Act, which provides immunity for any tech company that publishes its users’ content while meeting certain conditions. The EARN IT Act would expand those conditions to include guidelines that would likely compromise encryption efforts, e.g., backdoors to encrypted applications and data. Any company that fails to create such backdoors or fails to adopt other EARN IT guidelines would lose its Section 230 liability protections.

Ostensibly, the EARN IT Act aims to combat child predators, a goal we can all support. However, as written, the proposed law has drawn the ire of cybersecurity experts, who argue that it carries far-reaching ramifications for encryption.

Riana Pfefferkorn, associate director of surveillance and cybersecurity at Stanford’s Center for Internet and Society, describes the EARN IT Act as a “bait-and-switch” and an “underhanded attack.” Similarly, former FTC attorney Whitney Merrill called it “encryption backdoor legislation in disguise.” Some companies, such as Signal, an end-to-end encrypted messaging service, have vowed to leave the U.S. if such a law passes.

Without a doubt, providing governments with backdoor keys would be a problem

There are occasions when law enforcement entities gain legal access to criminals’ devices but lack the ability to read the devices’ encrypted messages; this is commonly known as “going dark.” However, even when this occurs, governments still have access to the devices’ metadata, and they’re often able to gain access to the content of messages via third-party services, as was the case in the San Bernardino incident.
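
To illustrate the distinction, here is a hypothetical sketch (the field names are made up for illustration, not any provider’s actual schema) of what an E2EE provider still handles for each relayed message: the metadata travels in the clear even though the body does not.

```python
# Hypothetical message envelope; only the body is end-to-end encrypted.
envelope = {
    "sender": "+1-555-0100",              # visible to the provider
    "recipient": "+1-555-0199",           # visible to the provider
    "timestamp": "2020-05-05T14:32:07Z",  # visible to the provider
    "ciphertext": b"\x8f\x1a\xc2...",     # unreadable without an endpoint key
}

# With a lawful request, investigators can learn who talked to whom and when,
# but not what was said.
metadata = {key: value for key, value in envelope.items() if key != "ciphertext"}
```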

If the government were to make end-to-end encryption effectively illegal, or mandate that U.S.-based tech companies provide law enforcement with backdoors, users could turn to foreign companies for encryption goods and services. After all, private, encrypted communication is a service that many users value.

Additionally, U.S.-based software companies with mandated backdoors could end up being banned in other countries, which would not be good for U.S.-based enterprises that do business internationally.

What’s in store for software companies?

Software companies that facilitate communication via encrypted messaging are currently on the government’s radar. There is a real need to address illegal activity on these platforms, and we don’t mean to suggest that legislation should not be passed. However, such legislation cannot come at the expense of consumer data privacy, and thus far, the proposed legislation we’ve seen casts far too wide a net.

Seeing as bills will continue to come down the pike, software enterprises would be wise to address illegal activity on their platforms more aggressively. If they fail to do so, a law could pass that jeopardizes not only software companies’ current protections under Section 230, but also the privacy of their users.

John Donegan

Enterprise Analyst, ManageEngine

John is an Enterprise Analyst at ManageEngine. He covers infosec, cybersecurity, and public policy, addressing technology-related issues and their impact on business. Over the past fifteen years, John has worked at tech start-ups, as well as B2B and B2C enterprises. He has presented his research at five international conferences, and he has publications from Indiana University Press, Intellect Books, and dozens of other outlets. John holds a B.A. from New York University, an M.B.A. from Pepperdine University, an M.A. from the University of Texas at Austin, and an M.A. from Boston University.
