
EARN IT Remains a Thinly Veiled Assault on Encryption

The Senate Judiciary Committee is set to mark up a bill that is a classic piece of Washington, D.C. misdirection. The Eliminating Abusive and Rampant Neglect of Interactive Technologies (EARN IT) Act, S. 3398, presents itself as a bill targeted at fighting the spread of child pornography (rebranded as “child sexual abuse material,” or “CSAM”), a goal that obviously no decent person would oppose. However, the actual impact of the bill reaches far beyond this purpose, simultaneously threatening both tech platforms’ liability protections and their ability to deploy strong encryption for their users, endangering innocent Americans’ security in the process.

A manager’s amendment that has been released in advance of the markup leaves our primary concerns with the bill firmly in place.

The EARN IT Act works by creating a new 19-member National Commission on Online Child Sexual Exploitation Prevention, chaired by the U.S. Attorney General and tasked with recommending “best practices” that tech providers and platforms “may choose” to implement to assist the government in stopping the spread of CSAM. However, companies that fail to implement any one of the commission’s “voluntary” recommendations stand to lose their legal protection against liability for CSAM posted or transmitted by a third party using their service, as provided by Section 230 of the Communications Decency Act (CDA).

To be clear, CDA Section 230 and other laws already obligate tech services to remove illegal content (including, specifically, CSAM) and report it to law enforcement, which platforms like Facebook do to the tune of tens of thousands of reports per year. What Section 230 does is specify that the person who actually posted the illegal material, not the platform, is the one liable in any civil or criminal case. EARN IT would allow platforms like Facebook, Twitter, or YouTube to be held liable instead.

What’s the problem with this if platforms already cooperate with the government in fighting CSAM, which every decent person agrees ought to be eradicated? Answer: Encryption. One of the considerations that EARN IT directs the commission’s “best practices” to address is as follows:

“[W]hether a type of product, business model, product design, or other factors related to the provision of an interactive computer service could make a product or service susceptible to the use and facilitation of online child sexual exploitation.”

Federal law enforcement agencies, led aggressively by longtime opponent of encryption Attorney General (AG) Bill Barr, have repeatedly made clear that they consider strong encryption the equivalent of tech companies helping “perpetrators hide from law enforcement.” Barr has gone on to suggest that “[g]iving broad immunity to platforms that purposefully blind themselves – and law enforcers – to illegal conduct on their services does not create incentives to make the online world safer for children.”

It’s not a great leap from there to see how the commission is likely to make weakening encryption or creating backdoor access to encrypted communications one of the “best practices” without which tech platforms will be at risk of being sued to death for illegal content they themselves can’t even see. That’s hardly a “voluntary” arrangement, and it stands to make millions of innocent internet users more vulnerable to government surveillance and hackers alike.

Encryption can obviously be used for nefarious purposes, just like a house with its blinds drawn and doors locked can conceal all manner of terrible things its occupants might do. But having encryption that companies and governments can’t access provides essential benefits as well. Political dissidents under oppressive regimes all over the world have to literally trust their lives to the security of their encrypted communications. Journalists use encryption to protect their sources from harm. Businesses use encryption to protect themselves against corporate and government espionage. And, of course, our own government employs encryption to protect itself from the prying eyes of hackers as well.

Barr himself has admitted that creating backdoor access to encrypted files and communications for law enforcement necessarily makes users less safe. This should be intuitive: any vulnerability that the government can exploit can be used by nefarious actors too. It’s the digital equivalent of being mandated to leave a spare key in a designated spot around your house in case the police need to get inside; if a burglar knows where to find it, that’s bad news for you. Noted cybersecurity expert Bruce Schneier summarizes it well: “As computers continue to permeate every aspect of our lives, society, and critical infrastructure, it is much more important to ensure that they are secure from everybody — even at the cost of law-enforcement access — than it is to allow access at the cost of security.”

Even if child predators or other bad actors use strong encryption to shield their communications, law enforcement has a multitude of investigative techniques and tools it can use to bring them to justice, just as it did in the days before technology gifted the government with the ability to scan electronic communications in real time. As it stands, federal law enforcement hasn’t devoted enough resources to addressing child exploitation online to pursue more than a small fraction of the leads it already gets. Instead, the Federal Bureau of Investigation (FBI) and AG Barr are effectively arguing that the solution to CSAM is that no one should be allowed to have a private, unrecorded conversation.

This attempt to use the tragedy of child exploitation to push Barr’s anti-Fourth Amendment, anti-privacy agenda is D.C. cynicism at its worst.