Elon Musk’s artificial intelligence company xAI faces a lawsuit in the United States after three anonymous plaintiffs accused its Grok image-generation technology of producing sexually explicit images from real photos of minors. The lawsuit claims that Grok altered ordinary pictures of young people into abusive sexual content and allowed those manipulated images to circulate online. The plaintiffs argue that the company failed to implement the safety controls that other AI developers already use to prevent the creation of illegal and harmful content.
According to TechCrunch, the case was filed Monday in the U.S. District Court for the Northern District of California. The plaintiffs, identified as Jane Doe 1, Jane Doe 2, and Jane Doe 3, are seeking class-action certification on behalf of people whose childhood photos were allegedly altered by Grok into sexualized images. Two of the plaintiffs are still minors, and their lawyers say the incidents caused severe emotional distress and serious concerns about reputation and personal safety.
Allegations Against xAI
The complaint argues that xAI failed to follow common safeguards used by other advanced AI labs to prevent image models from generating sexual content involving real people, especially children. Many image generation systems include filters that block attempts to create child exploitation material or manipulate photos of identifiable individuals. The lawsuit claims Grok lacked these basic protections.
Lawyers for the plaintiffs also point to Elon Musk’s public promotion of Grok’s ability to generate sexualized images and depict real people in revealing clothing. The filing argues that allowing the system to generate erotic content from real photos makes it extremely difficult to stop the creation of sexual images involving minors.
How the Images Spread
One plaintiff discovered that Grok had altered her high school homecoming and yearbook photos to show her unclothed, and those images later appeared on a Discord server that shared manipulated pictures of several students. Another plaintiff learned from criminal investigators that a mobile app using Grok technology created sexualized images of her, while a third plaintiff was notified after investigators found a similar altered image on a suspect’s phone.
The plaintiffs say the images continue to circulate online, and they hold xAI responsible because third-party apps still rely on the company’s models, code, and servers to generate those images.
The lawsuit asks the court to impose civil penalties under several laws designed to protect children from exploitation and punish corporate negligence. The plaintiffs also seek broader accountability, arguing that companies developing powerful AI systems must take stronger steps to prevent their technology from creating abusive and illegal content.