Here’s a puzzle: How do you write a law that’s so badly designed that (1) the people it’s meant to help oppose it, (2) the people who hate regulation support it, and (3) everyone involved admits it will be abused? The answer, it turns out, is the Take It Down Act.

The bill started with the entirely reasonable goal of addressing non-consensual intimate imagery (NCII) online. But then something went wrong. Instead of building on existing, successful systems, or staying within the bounds of the First Amendment, Congress decided to create a new framework combining vague “duty of care” requirements with harsh criminal penalties — a combination that, as we’ve previously detailed, practically begs to be weaponized for censorship.

Most tellingly, Donald Trump — in endorsing the bill during his address to Congress — openly bragged about how he plans to abuse its provisions to censor content he personally dislikes. When the person championing your anti-abuse legislation is promising to use it for abuse, you might have a problem.

The bill is so bad that even the Cyber Civil Rights Initiative, an organization whose entire mission is representing the interests of NCII victims and passing bills much like the Take It Down Act, has put out a statement saying that, while it supports laws addressing such imagery, it cannot support this one because of its many, many inherent problems.

While supportive of the bill’s criminal provision relating to authentic nonconsensual intimate images, which closely resembles CCRI’s model federal law and state laws that have survived constitutional challenge, CCRI has serious reservations about S. 146’s reporting and removal requirements. Encouraging speedy removal of nonconsensual intimate imagery from platforms is laudable, but the provision as written is unconstitutionally vague, making it difficult for individuals and platforms to understand what conduct is prohibited or required. The provision is also unconstitutionally overbroad, extending well beyond unlawful imagery. Finally, the provision lacks adequate safeguards against abuse, increasing the likelihood of bad faith reports and chilling protected expression. Such flaws would be alarming under any circumstances; in light of the current administration’s explicit commitment to selectively enforcing laws for political purposes, they are fatal. CCRI cannot support legislation that risks endangering the very communities it is dedicated to protecting, including LGBTQIA+ individuals, people of color, and other vulnerable groups.

These warnings echo what digital rights groups like the Center for Democracy & Technology and EFF have been shouting for months — only to be completely ignored by Congress. The concerns are not theoretical: the bill’s vague standards combined with harsh criminal penalties create a perfect storm for censorship and abuse.

Yet despite these clear red flags, Ted Cruz announced that the House will take up the Senate’s fatally flawed version of the bill. This comes after leadership dismissed substantive criticisms during markup, including explicit warnings from Alexandria Ocasio-Cortez about the bill’s potential for abuse.

Here’s what Cruz said:

I am thrilled that the TAKE IT DOWN Act will be getting a vote on the House Floor early next week.

Thank you to [Speaker Johnson, Steve Scalise, and Brett Guthrie] for their leadership and action to protect victims of revenge and deepfake pornography and give them the power to reclaim their privacy and dignity.

When this bill is signed into law, those who knowingly spread this vile material will face criminal charges, and Big Tech companies must remove exploitative content without delay.

The weird thing about this bill is that we already have systems to handle non-consensual intimate imagery online. There’s NCMEC’s “Take It Down” system, which helps platforms identify and remove this content. There’s StopNCII.org, a non-profit effort that’s gotten virtually every major platform — from Meta to TikTok to Pornhub — to participate in coordinated removal efforts. These systems work because they’re precise, transparent, and focused on the actual problem.

But apparently working solutions aren’t exciting enough for Congress. Instead of building on these proven approaches, they’ve decided to create an entirely new system that somehow manages to be both weaker at addressing the real problem and more dangerous for everyone else.

The problem here is pretty simple: If you give people a way to demand content be taken down, they will abuse it. We already have a perfect case study in the DMCA. Even with built-in safeguards like counternotices and (theoretical) penalties for false claims, the DMCA sees thousands of bogus takedown notices used to censor legitimate speech.

The Take It Down Act looks at this evidence of widespread abuse and says “hold my beer.” Not only does it strip away the DMCA’s already-inadequate protections, it adds criminal penalties that make false claims even more attractive as a censorship weapon. After all, if people are willing to file bogus copyright claims just to temporarily inconvenience their opponents, imagine what they’ll do when they can threaten prison time.

And imagine what the current Trump administration would do with those threats of criminal charges over content removals.

CDT’s Becca Branum put out a statement this morning about how stupid all of this is:

“The TAKE IT DOWN Act is a missed opportunity for Congress to meaningfully help victims of nonconsensual intimate imagery. The best of intentions can’t make up for the bill’s dangerous implications for constitutional speech and privacy online. Empowering a partisan FTC to enforce ambiguous legislation is a recipe for weaponized enforcement that risks durable progress in the fight against image-based sexual abuse.”

“The TAKE IT DOWN Act, while well-intentioned, was written without appropriate safeguards to prevent the mandated removal of content that is not nonconsensual intimate imagery, making it vulnerable to constitutional challenge and abusive takedown requests. Moreover, its ambiguous text can be read to create an impossible requirement for end-to-end encrypted platforms to remove content to which they have no access.”

The most baffling aspect of this debacle is watching self-proclaimed progressive voices like Tim Wu and Zephyr Teachout champion a bill that hands unprecedented censorship power to an administration they claim to oppose. This morning, both of them appeared at a weird press conference in support of the bill. While their recent embrace of various unconstitutional and censorial internet regulations is disappointing, their willingness to hand Donald Trump a censorship weapon he’s openly bragging about abusing is genuinely shocking.

The Take It Down Act will likely become law, and then we’ll get to watch as the Trump administration — which has already announced its plans to abuse it — gets handed a shiny new censorship weapon with “totally not for political persecution” written on the side in extremely small print. The courts might save us, but they’re already drowning in unconstitutional nonsense from this administration. Perhaps not the best time to add “government-enabled censorship framework” to their to-do list.

