Anti-Jewish racism is out of control on the internet's 'anti-Semitism superhighways' - ministers must take action
20 December 2024, 07:11 | Updated: 20 December 2024, 08:44
When the last Government first announced an Online Safety Act, parents, teenagers and campaigners like me – from anti-racism organisations, the mental health sector and children’s charities – had high hopes about the protections it would bring from harmful content online.
Despite the criticism it attracted, we got an Act of Parliament – the Online Safety Act – that, notwithstanding its many flaws, provided the basis for action against online harm. Not perfect, not necessarily all that well understood, but far from the failure that some predicted.
One of the key issues was its focus. The Act covered a wide range of platforms, anything from Facebook to small online forums. It is perhaps little surprise, then, that one of the big debates as the Act went through Parliament was about whether small but really risky and harmful sites – some designed to be exactly that – should be regulated in the same way as the big, household names like Instagram, X/Twitter and so on.
By ‘really risky’ I mean sites that people use to share anorexia tips, spread racist and antisemitic conspiracies, and even trade advice on how to take their own lives. The overwhelming consensus, across the main political parties, was that of course these sites should be regulated at the same level as the big players.
That regulation was going to be relatively light-touch, but it would have made a real difference. Just like Facebook, these sites would have to offer users enhanced choice about what they saw, allowing them to avoid material they didn’t want to see.
They’d have to submit risk assessments to Ofcom – the online safety regulator. They would have to be much more transparent in reporting what risky behaviour is taking place on their platforms.
To be clear, this debate has nothing to do with censorship; it’s about tech companies and people who run small forums behaving like reasonable and responsible businesses.
The new legislation wasn’t initially going to allow for the regulation of these small, high-harm sites alongside the big ones, but a group of charities and campaigners came together to change the Act. Baroness Nicky Morgan tabled an amendment, and it was enthusiastically supported by Labour, then in opposition.
It is all the more surprising, then, what has followed. This week, the government has U-turned on a position that commanded cross-party consensus and would have cost it nothing to implement.
It has decided that these sites don’t need the higher level of regulation at all. The government is saying it will tackle this issue by coming down hard on illegal content. But a huge amount of the material we’re talking about is legal.
In the report we have released today, we demonstrate just why the Government is wrong.
We – the Antisemitism Policy Trust – have produced a comprehensive new analysis covering the months following the 7 October attacks in southern Israel, and what we have found is a disturbing increase in antisemitic content across various social media platforms.
Our study examined hundreds of thousands of posts from smaller platforms including TruthSocial, Minds and Odysee – online spaces you may never have heard of. Alt-media websites like these carry lower volumes of content than, say, X, but host more extreme antisemitic posts. Antisemitic accounts are posting around the clock, using ‘hashtag dumping’ to create fake trends and bypass safety algorithms.
Generative AI is also being used to create graphic antisemitic content. It starts on these small sites but it doesn’t stop there. Poorly moderated alternative media platforms like Rumble, Minds, and Odysee are sources of antisemitism that spill over into larger mainstream platforms.
On the mainstream platforms themselves, the problem is large and growing. Online antisemitism is on the rise, potentially driven by reduced content moderation standards. There has been an approximately 115-fold increase in antisemitism on X compared with October 2023. Notably, whereas we previously estimated two antisemitic tweets per Jew per year (500,000 tweets against a community of 250,000), our new report suggests there are now as many as four antisemitic tweets per Jew per day (760 times more).
Meanwhile, platform features are being abused to spread antisemitism, hate speech, and violence. ‘Thread hijacking’, for example – where individuals jump on a popular topic or news story, say about a Spider-Man film or Wicked – is being used by alt-influencers to draw large audiences from celebrity accounts and public figures into antisemitic communities. Once there, people see not only antisemitism but a mixture of other online conspiracy theories, rage bait, and pro-Russia, anti-Ukraine commentary.
Much of the antisemitic content uncovered is easily detectable by commercially available content safety systems, raising questions about why social media platforms do not utilise these capabilities.
As it stands, the regulatory environment is weak. As a result of the decisions taken this week, we’ll be in the bizarre position where a business networking site will likely end up being regulated far more stringently than a forum set up to help people trade tips on how to starve themselves. Sites that have demonstrably led to loss of life offline will be let off the hook, whilst only a select few large platforms will have additional responsibilities.
This decision, then, is not just morally wrong, it’s absurd. If the stakes weren’t so serious, it might be a laughing matter. Sadly, it is going to lead to a less healthy, more polarised society, and lives may well be lost as a result.
We’ve seen the role smaller platforms can play in spreading harm – just look at the Southport riots, where disinformation spread across Telegram and smaller sites. Or look at the shootings in Buffalo, New York, a couple of years ago, inspired by content on 4chan – another hate site.
In failing to make the right decision, the government has made it harder for groups working to tackle harm, it has ignored a major vote in Parliament, and it appears to have prioritised limiting its own workload over public safety. That is a dereliction of duty, and we will be working to ensure that the situation is reversed.
Danny Stone is the chief executive of the Antisemitism Policy Trust.
LBC Views provides a platform for diverse opinions on current affairs and matters of public interest.
The views expressed are those of the authors and do not necessarily reflect the official LBC position.
To contact us email views@lbc.co.uk