Deepfakes are eroding reality itself, and children are paying the price while governments lag behind
There’s an inconvenient truth that societies and governments across the world must acknowledge and accept as fact: Deepfake technology and digital deception are fundamentally changing our relationship with reality.
It poses a risk to businesses, governments, and society at large, but particularly to children and other vulnerable communities. While AI is, of course, an incredible tool for innovation, the democratisation of the technology in recent years, particularly following the advent of generative AI, has expanded the threat landscape for everyone.
The accessibility of deepfake technology is extremely high. While gaining access to sophisticated malware or phishing campaigns still involves certain barriers—such as navigating the deep or dark web—creating convincing deepfakes is relatively easy using widely available tools and publicly accessible images or videos of targets.
Think of how much we all share across various social media channels. This ease of access and simplicity of use significantly increase both the scale of the phenomenon and its potential impact on individuals.
The risks for young people today are numerous. A 2024 Ofcom report revealed that 50% of UK children aged 8-15 saw at least one deepfake in the six months prior to being questioned.
Most worryingly, one in seven respondents reported seeing sexual deepfakes; most of these featured women, but 17% featured a minor. These statistics are alarming, not least because we are increasingly seeing links between harmful deepfakes, fake social media profiles, and teen suicide.
At the core of the deepfake issue is a wider inability to determine what is real and what is not. As things stand, we will be forced to raise the next generation in a culture of uncertainty and doubt.
This is totally unsustainable, and it is essential that we move from passivity to proactivity in helping children and adults distinguish between human-generated and artificial content.
Legal frameworks and policies almost always follow technological developments rather than precede them. It is difficult to design effective regulation before specific use cases and risks fully emerge. For this reason, we are still at an early stage of the process.
Some countries—including the UK, Italy, Spain, Denmark, and France—have begun addressing deepfake regulation earlier than others, but we must closely observe how each environment evolves, so that we can learn from one another’s experiences and improve regulatory approaches collaboratively.
Furthermore, we must hold governments accountable to their commitments and keep up the pressure to evolve regulation in line with the fast-moving advances in AI abuse.
A crucial element for protecting children is education. We need to teach students about the evolving nature of reality by providing them with new foundational conceptual tools to understand what is human, human-modified, AI-generated, and fully synthetic. However, it may still be too early to implement this in a comprehensive way. We are still in the process of understanding what should be taught, how it should be taught, and how best to assess the long-term impact of these changes. To help set out the roadmap and address the many challenges inherent in AI abuse, collaboration across industries, the public sector, academia and public health will be crucial.
Of course, we can begin by teaching the fundamentals, but we are operating in an extremely fast-moving environment that demands agility and proactivity, because what serves as a foundation today may become obsolete tomorrow.
______________
Marco Ramilli is the Co-Founder and CEO of identifAI Labs, a company that specialises in delivering advanced AI technologies
LBC Opinion provides a platform for diverse opinions on current affairs and matters of public interest.
The views expressed are those of the authors and do not necessarily reflect the official LBC position.
To contact us email opinion@lbc.co.uk