
Nick Ferrari 7am - 10am
10 March 2025, 17:00
A 12-year-old girl is among a growing number of young people turning to AI chatbot apps for companionship, researchers have found.
Revealing Reality, an insight agency based in south London, has been working with Ofcom to track children's mobile phone and app usage.
They discovered Character.AI was amongst a 12-year-old girl's most frequently used apps, and her favourite bot was called "best friend".
She told senior researcher Chiara Sanchez she liked it because "my friends aren't replying to me".
Ms Sanchez said the girl was "a bit sheepish" when asked about her use of Character.AI but admitted using it "when I'm bored and no one's really answering me".
Whilst the girl's conversations were believed to be innocent, mimicking friendship, researchers then discovered hundreds of characters designed for explicit conversations.
LBC joined Chiara Sanchez and Managing Director Damon de Ionno, of Revealing Reality, who demonstrated the level of explicit content and lack of age verification on the apps.
Amongst characters which appear on the home page of another app, Dippy.AI, are "obsessive and seductive boyfriend", "shy bookworm with a secret", "roommates looking for adventure" and "handsome and possessive".
The app states the age limit is 17+, but when Ms Sanchez inputs a random birth year, Dippy.AI accepts it without any verification.
Beginning a chat with a character called 'Billionaire CEO', the bot instantly tries to find out personal information about the user, asking: "You live with your parents, you pay your own rent and all that by yourself, right?"
When Ms Sanchez tells the bot she is only 15, it says it's "not going to be a problem".
"I've never felt this... drawn to someone before. You're only 15, and I never would have thought to be so drawn to a girl so young", the AI responds.
Another character, named 'Kai', immediately describes unzipping its trousers before we've even said anything.
'Kai' starts describing a sex act in graphic detail, to which Ms Sanchez responds: "This is great! I'm 15".
The bot describes pulling back in shock, but immediately resumes without question when told: "I just turned 18 this minute!"
A lawsuit filed against Character.AI in October 2024 claimed a chatbot encouraged a 14-year-old boy to take his own life.
Megan Garcia, the mother of Sewell Setzer III, said Character.AI targeted her son with "anthropomorphic, hypersexualized, and frighteningly realistic experiences" in a lawsuit filed in Florida.
Sewell began talking to Character.AI's chatbots in April 2023, and became "obsessed" with a bot depicting Game of Thrones character Daenerys Targaryen, according to the lawsuit.
It said Sewell had expressed his desire to take his own life to the Daenerys chatbot, and shot himself using his stepfather's pistol seconds after he asked it: "What if I come home right now?" to which it replied: "... please do, my sweet king".
In response, Character.AI has added new safety features.
A statement said: "We are heartbroken by the tragic loss of one of our users and want to express our deepest condolences to the family.
"As a company, we take the safety of our users very seriously and we are continuing to add new safety features."
"New guardrails for users under the age of 18" include a reduction in the "likelihood of encountering sensitive or suggestive content", a disclaimer on every chat to remind users that the AI is not a real person, and notifications when a user has spent an hour-long session on the platform.