Elon Musk must urgently deal with Grok being used to create deepfake nudes, says tech secretary
Users of X, formerly Twitter, appear to have prompted Grok, the AI chatbot developed by Musk's xAI, to generate undressed images of people
Technology Secretary Liz Kendall said Elon Musk’s X must urgently deal with its artificial intelligence chatbot Grok being used to create sexualised deepfake images.
A post on the Grok X account said that there have been "isolated cases where users prompted for and received AI images depicting minors in minimal clothing", and added: "xAI has safeguards, but improvements are ongoing to block such requests entirely."
The technology secretary backed regulator Ofcom, which has asked X and xAI to set out the steps they are taking to comply with legal obligations to protect UK users of the social media platform.
Ms Kendall said: “What we have been seeing online in recent days has been absolutely appalling, and unacceptable in decent society.
“No one should have to go through the ordeal of seeing intimate deepfakes of themselves online. We cannot and will not allow the proliferation of these demeaning and degrading images, which are disproportionately aimed at women and girls.
“X needs to deal with this urgently. It is absolutely right that Ofcom is looking into this as a matter of urgency and it has my full backing to take any enforcement action it deems necessary.”
She added that efforts to curb the spread of sexualised deepfakes were not an attempt to restrict free speech.
Donald Trump’s US administration has hit out at European regulators for attempts to regulate what appears online on American platforms.
But Ms Kendall said: “Services and operators have a clear obligation to act appropriately. This is not about restricting freedom of speech but upholding the law.
“We have made intimate image abuse and cyberflashing priority offences under the Online Safety Act – including where images are AI-generated. This means platforms must prevent such content from appearing online and act swiftly to remove it if it does.
“Violence against women and girls stains our society – and that is why we have also legislated to ban the creation of explicit deepfakes without consent, which are both degrading and harmful.
“Make no mistake – the UK will not tolerate the endless proliferation of disgusting and abusive material online. We must all come together to stamp it out.”
Ofcom says it has contacted X and xAI to understand what steps have been taken to address this, but has not launched an investigation.
A spokesperson for the regulator said: "Tackling illegal online harm and protecting children remain urgent priorities for Ofcom.
"We are aware of serious concerns raised about a feature on Grok on X that produces undressed images of people and sexualised images of children.
"We have made urgent contact with X and xAI to understand what steps they have taken to comply with their legal duties to protect users in the UK.
"Based on their response we will undertake a swift assessment to determine whether there are potential compliance issues that warrant investigation."
Under the Online Safety Act in the UK, social media firms must prevent and remove child sexual abuse material when they become aware of it.
After a request for comment, xAI replied with an automatically generated email saying "legacy media lies".
Internet Watch Foundation chief executive Kerry Smith said: "The IWF has received a number of reports from the public relating to suspected child sexual abuse imagery on X generated by the AI chatbot Grok.
"We are still working through these reports but, so far, we have not seen any imagery which crosses the legal threshold for being considered child sexual abuse in the UK."