Musk's xAI sued over allegations Grok generated sexual images

Grok is a chatbot developed by xAI and hosted on Elon Musk's social media platform X. -EPA

Three Tennessee plaintiffs, including two minors, are suing Elon Musk's xAI, alleging it knowingly designed its Grok image generator to let people create sexually explicit content using real photos of others.

The lawsuit, filed in federal court in San Jose, California, is seeking class-action status for people in the United States who were "reasonably identifiable" in sexualised images or videos generated by Grok based on real images of themselves.

The artificial intelligence company did not immediately respond to a Reuters request for comment.

After an outcry over sexually explicit content generated by the chatbot, xAI said in January that it had blocked all users from editing images of "real people in revealing clothing" and from generating images of people in revealing clothing in "jurisdictions where it's illegal".

Governments and regulators around the world have since launched probes, imposed bans and demanded safeguards in a growing push to curb illegal and offensive material.

The lawsuit claims xAI failed to install safeguards to prevent its systems from generating sexual content involving minors. All three plaintiffs were minors at the time the images were generated.

Plaintiffs allege their real images were digitally altered into explicit content and then shared through online platforms, causing emotional distress and creating a public nuisance.

They are seeking unspecified damages, legal fees, and an injunction requiring xAI to halt the alleged practices.

"These are children whose school photographs and family pictures were turned into child sexual abuse material," plaintiffs' counsel Annika Martin of Lieff Cabraser Heimann & Bernstein said in a statement.

"Elon Musk and xAI deliberately designed ​Grok to produce sexually explicit content for financial gain, with no regard ​for the children ⁠and adults who would be harmed."