How far-right extremists are using the world's largest gaming platforms to target children as young as nine
'Soft touch' moderation has made the world's largest gaming platforms a target for far-right actors, LBC has been warned.
Platforms like Minecraft and Roblox are being used by far-right actors across the globe to “normalise” and “gamify” extremist beliefs and radicalise young people, LBC has been warned.
From recreations of Nazi death camps, the October 7 massacre and school shootings to usernames and community groups that spread and share far-right propaganda, experts have warned these hugely popular platforms are being co-opted by extremists to lure young people into their warped ideologies.
Experts have told LBC “malign actors” from the far-right and Islamic extremist groups are specifically targeting these platforms due to “soft-touch” moderation in a bid to expose young people to violent and often radicalising content.
A 2025 study by the Global Network on Extremism and Technology (GNET) found children as young as nine are being exposed to Nazi and Jihadist content on the gaming platform Roblox.
Roblox, which is primarily used by children and young teenagers and has 380 million monthly active users, relies entirely on content created and shared by its players.
“Roblox is extremely popular, but it’s also not well moderated at all and has a history of not having good moderation tools,” Jean Slater, who researches violent extremist movements, with a focus on Roblox, told us.
“So predators who want to reach children know that they can use Roblox to do so.”
Jean was quick to clarify that this is usually not “predators” in the sense of criminals actively looking to physically abuse children, but rather those who prey on a child’s susceptibility to extremist ideas.
“This is more a political motivation than what we think of when we think of predators and children,” Jean continued.
“Now these political movements are often very detrimental to these young people and they end up being exposed to sometimes real predators in a traditional sense and…Roblox is a good place to do so because there is very free moderation.”
And despite attempts at moderation, Roblox is often “exploited by these malign actors”, with researchers finding “playable recreations of mass shootings, content glorifying Nazi Germany and right-wing extremist groups” on the platform.
“Even recruitment into right-wing extremism has taken place on Roblox,” GNET found.
Their study examined 350 separate posts and profiles on the platform, 175 of which were “coded as far-right and right-wing extremist material.”
GNET added: “Due to the high number of minors on the platform, it is possible that such content is both produced and consumed by underage users, which may have the potential to contribute to youth radicalisation processes in the online sphere.”
The research looked at accounts and content created between April and June 2025 as part of the RadiGaMe (Radicalisation on Gaming platforms and Messenger services) project.
Roblox actively blocks explicitly far-right content, but GNET found that users were easily able to find workarounds, such as misspelling names or using dogwhistles like “HH” and “88”, both of which refer to “Heil Hitler”.
Games uploaded onto the platform, like “Racism Tycoon” and “Lolocaust”, which combines the terms LOL and Holocaust, would have been accessible to children as young as 13 before they were removed by Roblox.
The investigation found at least 206 examples of antisemitic content, including Holocaust denial and dogwhistles like “1488” - a reference to a 14-word white supremacist slogan and Adolf Hitler.
A number of games, including one recreating Nazi extermination camps, were also discovered before being removed.
GNET added: “Roblox’s very young user base and the fact that all our data was collected in public parts of the platform suggest that children and teenagers may be exposed to identity-based hate and extremist ideas on the platform.
“In some cases, this could potentially contribute to radicalisation processes in young people.”
Minecraft, one of the most popular children’s video games on the planet, has also been a target of the far-right in recent years, LBC has found.
GNET warned the popular building game, owned by tech giant Microsoft, is being used as a “tool for ideological grooming and radicalisation.”
“By creating immersive, gamified spaces that embed hateful narratives into familiar gameplay mechanics, extremists are turning gaming culture into recruitment infrastructure,” they said.
A slew of examples have been discovered in recent years, including a Nazi death camp recreated in Minecraft and then shared online.
Minecraft’s reliance on community-driven content, and the difficulty Microsoft faces in moderating material created and hosted outside its own systems, make it an easy target for the far-right.
Unlike Roblox, this content rarely appears on the Minecraft platform itself, but is rather uploaded onto other social media platforms like YouTube, X and Discord before being shared widely.
One video discovered online and shared widely among far-right groups on Telegram showed Minecraft characters walking under the Nazi symbol of the Black Sun, with the caption “Defend Your Land.”
Another video contained the phrase “Save Minecraft: Exterminate the Invaders.”
Both Minecraft and Roblox have been used to recreate crimes committed in the name of white supremacy, including the Christchurch shooting in New Zealand - when a far-right gunman killed 51 people at two mosques in 2019.
Microsoft has long insisted it does everything in its power to prevent extremist material being shared through Minecraft.
A Microsoft spokesperson told us: “Microsoft has a longstanding commitment to child online safety. Among other harms, we prohibit violent extremist content on our services and we leverage a range of tools, including proactive detection technologies, to address content and conduct that violates our policies.
On Minecraft’s official servers, we leverage a variety of systems that empower players to enjoy a safe experience, including chat filtering, in-game reporting, parental controls, and more. If content violates our Community Standards, we have dedicated teams for review and moderation, who also work in coordination with Microsoft’s extensive security and threat analysis teams.
"On private servers that are set up independently by users outside of Minecraft’s systems, we investigate reported violations and apply enforcement mechanisms as needed.”
It isn’t only far-right groups taking advantage of Roblox and Minecraft’s young audience and creation features.
Using a TikTok account set up for children, we discovered videos that use both Roblox and Minecraft to recreate Islamist executions and battles, as well as entire propaganda videos.
These videos may not be made by the actual extremist groups themselves; rather, they could be made by people trying to be edgy online.
The accounts we found feature plenty of warnings stating that the user doesn't support any extremist groups and that it is all roleplaying.
When children and young people view or play content that features extremist language or symbols, that exposure makes the material seem "less catastrophic" to them, Ashton Kingdon, author of a government report into radicalisation through gaming and a criminology lecturer at the University of Southampton, told us.
They also may not fully understand the complexity of what they're doing.
Ashton added: "When people are building concentration camps in Minecraft and putting them on YouTube, these simulations, I don't think they genuinely think that they want to do that.
"I think that people are doing it because they can.
"I think the bigger problem is that technology cannot necessarily recognise it as propaganda or extremism because it's gamified. I think the wider issue is how we moderate this.
"So, there are conversations to be had around moderation and digital literacy, like how dangerous games are and why this is deeply problematic, what they're doing.
"If they're building this, it's about telling them that it's not funny.
"It is offensive, and it is extreme."
This is ultimately an issue of moderation, and Jean Slater tells us platforms like Roblox, and Valve's gaming platform Steam, are simply not doing enough to prevent the spread of far-right content.
“I think that they probably could do more considering how much money they have.
“They have a tremendous amount of resources.”
She added: “I think that if you are having a platform that is primarily for children to play, in order for it to be safe and healthy, moderation should be intrinsically embedded in every decision.
“Everything that's done needs to consider the health and safety of children.
“You cannot have a safe environment for children online or off if you don't consider their well-being with almost every single decision that's made.”
A Roblox spokesperson told LBC: “Roblox strictly prohibits content or behavior that promotes extremist organisations or individuals.
"Our dedicated safety team proactively monitors for these violations and takes swift action against accounts or experiences found to break our rules.
"As the GNET report notes, Roblox has made considerable efforts to curb extremism. We quickly investigated their findings and removed the users and experiences found to be in breach of our standards.
“To prevent extremist iconography from reaching our players, we use advanced AI to review all images, text, and assets before they are published. This is backed by advanced safeguards and filters designed to prevent harmful content and communications.
"Sharing images or video in Roblox chat features is not possible, and we have mandatory age checks that help ensure kids and teens are chatting with others of similar ages by default, while our filters are designed to block the sharing of personal information to prevent users from being moved to other platforms where moderation standards may be less stringent.
“While no system is perfect, our commitment to safety is constant.
"We continue to evolve our protections and work regularly with law enforcement and global experts to counter violent extremism online."