The 'Take It Down Act' takes effect on May 19, 2026 in the US, requiring platforms to comply with takedown requests of sexually explicit images within 48 hours

Nude AI images: The ‘Take It Down Act’ in the US will let users request quick removal

2026/01/11 10:00

Since the end of December 2025, X’s artificial intelligence chatbot, Grok, has fulfilled many users’ requests to “undress” real people, turning their photos into sexually explicit material. After people began using the feature, the social platform company faced global scrutiny for enabling users to generate nonconsensual sexually explicit depictions of real people.

The Grok account has posted thousands of “nudified” and sexually suggestive images per hour. Even more disturbing, Grok has generated sexualized images and sexually explicit material of minors.

X’s response: Blame the platform’s users, not us. The company issued a statement on Jan. 3, 2026, saying that “Anyone using or prompting Grok to make illegal content will suffer the same consequences as if they upload illegal content.” It’s not clear what action, if any, X has taken against any users.

As a legal scholar who studies the intersection of law and emerging technologies, I see this flurry of nonconsensual imagery as a predictable outcome of the combination of X’s lax content moderation policies and the accessibility of powerful generative AI tools.

Targeting users

The rapid rise in generative AI has led to countless websites, apps and chatbots that allow users to produce sexually explicit material, including “nudification” of real children’s images. But these apps and websites are not as widely known or used as any of the major social media platforms, like X.

State legislatures and Congress were somewhat quick to respond. In May 2025, Congress enacted the Take It Down Act, which makes it a criminal offense to publish nonconsensual sexually explicit material of real people. The Take It Down Act criminalizes both the nonconsensual publication of “intimate visual depictions” of identifiable people and AI- or otherwise computer-generated depictions of identifiable people.

Those criminal provisions apply only to the individuals who post the sexually explicit content, not to the platforms that distribute the content, such as social media websites.

Other provisions of the Take It Down Act, however, require platforms to establish a process for the people depicted to request the removal of the imagery. Once a “Take It Down Request” is submitted, a platform must remove the sexually explicit depiction within 48 hours. But these requirements do not take effect until May 19, 2026.

Problems with platforms

Meanwhile, user requests to take down the sexually explicit imagery produced by Grok have apparently gone unanswered. Even the mother of one of Elon Musk’s children, Ashley St. Clair, has not been able to get X to remove the fake sexualized images of her that Musk’s fans produced using Grok. The Guardian reports that St. Clair said her “complaints to X staff went nowhere.”

This does not surprise me because Musk gutted then-Twitter’s Trust and Safety advisory group shortly after he acquired the platform and fired 80% of the company’s engineers dedicated to trust and safety. Trust and safety teams are typically responsible for content moderation and initiatives to prevent abuse at tech companies.

Publicly, it appears that Musk has dismissed the seriousness of the situation. Musk has reportedly posted laugh-cry emojis in response to some of the images, and X responded to a Reuters reporter’s inquiry with the auto-reply “Legacy Media Lies.”

Limits of lawsuits

Civil lawsuits like the one filed by the parents of Adam Raine, a teenager who committed suicide in April 2025 after interacting with OpenAI’s ChatGPT, are one way to hold platforms accountable. But lawsuits face an uphill battle in the United States given Section 230 of the Communications Decency Act, which generally immunizes social media platforms from legal liability for the content that users post on their platforms.

Supreme Court Justice Clarence Thomas and many legal scholars, however, have argued that Section 230 has been applied too broadly by courts. I generally agree that Section 230 immunity needs to be narrowed because immunizing tech companies and their platforms for their deliberate design choices — how their software is built, how the software operates and what the software produces — falls outside the scope of Section 230’s protections.

In this case, X has either knowingly or negligently failed to deploy safeguards and controls in Grok to prevent users from generating sexually explicit imagery of identifiable people. Even if Musk and X believe that users should have the ability to generate sexually explicit images of adults using Grok, I believe that in no world should X escape accountability for building a product that generates sexually explicit material of real-life children.

Regulatory guardrails

If people cannot hold platforms like X accountable via civil lawsuits, then it falls to the federal government to investigate and regulate them. The Federal Trade Commission, the Department of Justice or Congress, for example, could investigate X for Grok’s generation of nonconsensual sexually explicit material. But with Musk’s renewed political ties to President Donald Trump, I do not expect any serious investigations and accountability anytime soon.

For now, international regulators have launched investigations against X and Grok. French authorities have commenced investigations into “the proliferation of sexually explicit deepfakes” from Grok, and the Irish Council for Civil Liberties and Digital Rights Ireland have strongly urged Ireland’s national police to investigate the “mass undressing spree.” The U.K. regulatory agency Office of Communications said it is investigating the matter, and regulators in the European Commission, India and Malaysia are reportedly investigating X as well.

In the United States, perhaps the best course of action until the Take It Down Act goes into effect in May is for people to demand action from elected officials. – Rappler.com

The article originally appeared on The Conversation.

Wayne Unger, Associate Professor of Law, Quinnipiac University
