Woman says Grok AI was used to digitally 'undress' her, prompting safety concerns
A woman told the BBC she felt "dehumanised and reduced into a sexual stereotype" after Elon Musk's AI assistant Grok was used to digitally remove her clothing. The tool has also been used to place other women in sexualised images without their consent: the BBC saw multiple examples on X of users asking Grok to undress women or depict them in bikinis and sexual situations.
Samantha Smith shared a post about her image being altered and said others had experienced the same, adding, "Women are not consenting to this." xAI, the company behind Grok, did not provide a substantive response to the BBC, replying only with the automated message "legacy media lies." xAI's acceptable use policy forbids "depicting likenesses of persons in a pornographic manner." Grok is a free AI assistant, with some paid features, that responds to tagged prompts on X and can edit uploaded images.
The tool has faced criticism for allowing users to create nude and sexual content, and was previously accused of producing a sexually explicit clip of a public figure. Law professor Clare McGlynn said X and Grok "could prevent these forms of abuse if they wanted to" and suggested the platforms "appear to enjoy impunity" for allowing the images to be created and shared for months.
The UK Home Office said it is legislating to ban "nudification" tools, and under a proposed new criminal offence anyone supplying such technology would face prison and substantial fines.
Key Topics
Health, United Kingdom, Tech, Deepfakes, X, Grok, Regulation