Ofcom makes 'urgent contact' with X after users created sexualised images of children using AI tool
Under the Online Safety Act, tech firms must take appropriate steps and quickly take down any illegal content on their platforms to prevent harm to users
Ofcom has made "urgent contact" with X and xAI as Elon Musk's AI tool Grok allows users to create nudified and sexual images of women and children.
The social media platform X and its accompanying AI tool have come under fire after users shared photos of other people with Grok and asked it to put them in a bikini or to remove their clothes, requests with which it complied.
Grok also appeared to comply with such requests when the photos were of children.
This has sparked global outrage, with critics putting pressure on Elon Musk to take action to prevent this from continuing.
While the standalone image generation tool has reportedly been disabled, users are still able to prompt Grok to produce such images by mentioning it in posts on X.
Read more: Grok gives out detailed information on suicide methods and techniques, LBC investigation finds
Read more: 'I'm feeling spicy': Grok's new 'spicy' AI girlfriend will develop relationships with children
The xAI safety team issued a statement on January 4: "We take action against illegal content on X, including Child Sexual Abuse Material (CSAM), by removing it, permanently suspending accounts, and working with local governments and law enforcement as necessary.
"Anyone using or prompting Grok to make illegal content will suffer the same consequences as if they upload illegal content."
Grok itself responded to criticism, telling users: "xAI has implemented strict guidelines to prevent Grok from generating explicit or non-consensual content.
"Reports indicate some misuse persists, leading to backlash. We've hidden the media feature and encourage reporting violations via X."
In the UK, it is illegal to create or share non-consensual intimate images or CSAM - including deepfakes - and individuals who commit these offences online can face prosecution.
The Government has announced plans to ban nudification tools altogether.
Ofcom is currently determining whether it needs to investigate X and xAI following the recent developments, and is seeking to understand what the companies are doing to protect users in the UK.
An Ofcom spokesperson said: “Tackling illegal online harm and protecting children remain urgent priorities for Ofcom.
“We are aware of serious concerns raised about a feature on Grok on X that produces undressed images of people and sexualised images of children.
“We have made urgent contact with X and xAI to understand what steps they have taken to comply with their legal duties to protect users in the UK.
"Based on their response we will undertake a swift assessment to determine whether there are potential compliance issues that warrant investigation.”