Grok class action lawsuit overview:
- Who: Three consumers filed a class action lawsuit against X.AI Corp. and X.AI LLC.
- Why: The plaintiffs allege X.AI knowingly released Grok, an artificial intelligence (AI) tool capable of creating child sexual abuse material (CSAM).
- Where: The Grok class action lawsuit was filed in California federal court.
A new class action lawsuit alleges Elon Musk’s artificial intelligence company X.AI Corp. is responsible for creating child sexual abuse material through its AI tool Grok.
Plaintiffs Jane Doe 1, Jane Doe 2 and Jane Doe 3 claim X.AI released Grok, a generative AI model with image- and video-making features, knowing it would be used to create sexual content with real images or videos of children.
The plaintiffs argue X.AI, which was founded by Musk, failed to implement industry-standard safeguards to prevent the creation of nonconsensual sexualized deepfakes of real people, including minors.
“Knowing the type of harmful, illegal content that could — and would — be produced, xAI released Grok,” the Grok class action lawsuit says.
According to the complaint, “The Center for Countering Digital Hate reviewed a random sample of 200,000 images of the 4.6 million images Grok produced between Dec. 29, 2025, through Jan. 8, 2026, estimating that Grok generated 3 million sexualized images, including 23,000 that appeared to depict children.”
The plaintiffs want to represent a nationwide class of individuals who had real images of themselves as minors altered by Grok to produce sexualized images or videos with their faces and/or other distinguishing features reasonably identifiable.
Elon Musk knew Grok would be used to create CSAM, class action claims
The plaintiffs claim Musk and other decision makers at X.AI knew Grok would be capable of producing CSAM once it was enabled to produce sexually explicit content of adults.
The Elon Musk class action lawsuit alleges the plaintiffs’ lives have been “shattered” by the loss of privacy, dignity and personal safety caused by the production and dissemination of CSAM depicting them.
“Plaintiffs will live every day with the constant anxiety of not knowing whether someone they encounter has seen this invasive and sexually explicit content created with images of them as children,” the Grok class action says.
The plaintiffs allege X.AI violated Masha’s Law and the Trafficking Victims Protection Act. They demand a jury trial and request declaratory and injunctive relief, plus an award of statutory and/or actual damages for themselves and all class members.
A similar class action lawsuit was filed in California federal court earlier this year accusing xAI of sexually exploiting women and girls by creating nonconsensual deepfake images of them in sexual positions.
The plaintiffs are represented by Annika K. Martin, Mark P. Chalos, Betsy A. Sugar and Michelle A. Lamy of Lieff Cabraser Heimann & Bernstein LLP and Vanessa Baehr-Jones of Baehr-Jones Law P.C.
The Grok class action lawsuit is Jane Doe 1, et al. v. X.AI Corp., et al., Case No. 5:26-cv-02246, in the U.S. District Court for the Northern District of California, San Jose Division.