TAKE IT DOWN Act: Removing Harmful Online Content

A person in a dark blue suit sits at a desk signing an official document. Only their hands and torso are visible, highlighting the formal setting.

If you’re the victim of illegal content posted online, you may have legal recourse under the TAKE IT DOWN Act.

In 2023, when Elliston Berry was just 14 years old, she became the victim of AI-generated fake nude images shared on social media platforms, including Snapchat. The difficulty her mother, Anna McAdams, had in getting the content removed partially inspired the Take It Down Act, which became law when President Trump signed it on May 19, 2025.

Are you rebounding from a PR crisis due to attacks on your reputation or character? If you were the victim of illegal content, hate speech, pornographic content or other damaging imagery, it’s important to repair your online presence immediately. Call us at 844-230-3803 for more information.

Take It Down Act

A person in a suit signs a formal document at a wooden desk in an ornate room, with several people seated in the background.

The Take It Down Act, short for the Tools to Address Known Exploitation by Immobilizing Technological Deepfakes on Websites and Networks Act, was signed into law on May 19, 2025. Democratic U.S. Senator Amy Klobuchar and Republican U.S. Senator Ted Cruz introduced the bill.

This bipartisan legislation combats digital exploitation by requiring the removal of non-consensual intimate images, whether authentic or generated with artificial intelligence.

Platforms that do not remove reported content within 48 hours of receiving notification could face serious legal consequences, and individuals who publish such content can face criminal penalties, including fines and prison time.

Congress.gov explains the bill in this way: “[It] generally prohibits the nonconsensual online publication of intimate visual depictions of individuals, both authentic and computer-generated, and requires certain online platforms to promptly remove such depictions upon receiving notice of their existence.”

The act’s specifics differ depending on whether the imagery depicts an adult or a minor. In both cases, whether the publisher intended to cause harm is a consideration, and in both cases the prohibition applies whether the imagery is authentic or computer-generated.

Platforms that don’t comply with the act are treated as violating the Federal Trade Commission Act, which means they can be subject to Federal Trade Commission enforcement.

We work with clients to help them build their online reputation and take control of their digital presence. We can help if you’ve been the victim of illegal content online. Give us a call at 844-230-3803 to learn more.

Take It Down Act: Removals

The Take It Down Act frequently refers to “covered platforms.” These are the types of online services that must comply with the law.

According to the wording of the Take It Down Act, a covered platform is:

  • “[A] website, online service, online application, or mobile application that serves the public.”
  • It “primarily provides a forum for user-generated content, including messages, videos, images, games, and audio files,” OR
  • “[For] which it is in the regular course of trade or business of the [platform] to publish, curate, host, or make available content of nonconsensual intimate visual depictions.”

Platforms have until May 19, 2026, one year from the date President Trump signed the Take It Down Act, to post their notice-and-removal procedures. The process must be publicly accessible and written in clear, understandable language, and the disclosure must explain how an individual can make a removal request and how the covered platform will comply.

The Take It Down Act also includes provisions that protect good-faith efforts to comply.

There will inevitably be instances in which a supposed victim reports content that was posted legally and consensually. A covered platform may remove that content in good faith, not realizing that it doesn’t actually fall under the category of non-consensual intimate imagery.

It can be difficult to determine whether such images were created and posted legally, and it’s possible that some website owners will err on the side of compliance.

If the original content creator tries to take legal action against the covered platform for removing the content, the Take It Down Act shields the platform from liability for that good-faith removal.

Take It Down Act Endorsements

Hands in red, white and blue reach up together.

After the signing, Senator Klobuchar said, “We must provide victims of online abuse with the legal protections they need when intimate images are shared without their consent, especially now that deepfakes are creating horrifying new opportunities for abuse.”

In his statement, Senator Cruz said, “Predators who weaponize new technology to post this exploitative filth will now rightfully face criminal consequences, and Big Tech will no longer be allowed to turn a blind eye to the spread of this vile material.”

First Lady Melania Trump is also in support of the Take It Down Act. In March 2025, when the First Lady hosted a roundtable to discuss online protection for minors, she said, “We must prioritize their well-being by equipping them with the support and tools necessary to navigate this hostile digital landscape. Every young person deserves a safe online space to express themself freely, without the looming threat of exploitation or harm.”

Taking advanced safety measures to shield yourself from illegal content is more important than ever. Get started with a free reputation analysis here.

Take It Down Act Critics

Illustration of a large hand silencing a man and woman shouting through megaphones, symbolizing censorship or suppression of free speech.

While there’s a substantial amount of support for the act, there’s also a growing number of critics who are worried about its impact on lawful speech.

According to the Electronic Frontier Foundation, major flaws with the act include:

  • The takedown provision is written broadly enough that it could apply to nearly any intimate imagery, including lawfully posted content.
  • The automated filters platforms are likely to rely on to flag content could lead to errors.
  • Forty-eight hours isn’t long enough to verify the accuracy of a report.

Similarly, the Cyber Civil Rights Initiative (CCRI) said in a statement that the takedown provision “is highly susceptible to misuse and will likely be counter-productive for victims.”

Take It Down Act and ORM

A woman with curly hair sits at a desk, holding her head in her hands in frustration. She faces a laptop, with an open notebook and a coffee mug nearby. Bookshelves and a large plant are in the background.

Do you feel that you’re the victim of harmful online content that aligns with what’s outlined in the act? Here’s what to do to protect your online reputation:

  • Do a thorough assessment to ensure that the content qualifies for removal.
  • Prepare a removal notice that specifically cites the violations of the act.
  • File the notice as soon as possible.
  • Closely monitor the website(s); the platform should remove the content within 48 hours.
  • If the content isn’t removed, follow up with the platform or consider seeking legal counsel.

It’s not a perfect system by any means. For example, there are legitimate concerns about how the act can impact lawful speech. However, the Take It Down Act strives to hold platforms accountable while protecting victims of harmful online behavior, especially young people.

It’s also important that your notice meets the act’s requirements:

  • Takedown requests must be in writing and include your signature.
  • You must provide enough information for the platform to locate the non-consensual intimate imagery so it can be removed.
  • The platform needs your contact information to get in touch with you.

Certain types of content, like child sexual abuse material and non-consensual pornography, should be held to a no-exceptions policy. Such content has no legitimate reason to exist on the internet.

Digital Age Reputation

Illustration of a person moving a dial from red to green.

Overall, the act signed by President Trump is intended to protect victims of online abuse when non-consensual intimate imagery, including revenge porn, is posted. By making the publication of such images a federal crime, the act empowers victims to hold perpetrators accountable in a real and measurable way.

Here’s a recap of the issue:

  • The Take It Down Act, a bipartisan bill passed in May 2025, requires covered online platforms to remove non-consensual intimate images within 48 hours.
  • Imagery covered under the act includes authentic images as well as deepfake image abuse.
  • Platforms must make reasonable efforts to also remove duplicate content.
  • The act has bipartisan support, with people from both sides of the political spectrum viewing it as critical legislation that will protect victims.

There are various types of illegal content you may come across online, and such content can impact child safety or threaten young people and adults. Examples include:

  • Child sexual abuse material
  • Consumer protection issues
  • Illegal hate speech
  • Incitement to commit terrorist acts
  • Intellectual property rights infringements

With efficient tools and processes in place to combat such content, victims are able to take legal action when online content threatens their safety or privacy.

Being the victim of illegal content on the internet doesn’t have to impact your future. With our parent company NetReputation, we provide services to protect your digital presence. Get started with a free online reputation analysis here.
