What is a social media algorithm and why are they controversial?

7 March 2025, 17:43 | Updated: 10 March 2025, 15:11


Online Safety Day is today, Monday 10th March from 7am to midnight on LBC, available across the UK on Global Player on your smart speaker, iOS or Android device; on DAB digital radio and TV, at LBC.co.uk and in London on 97.3 FM.

By Kit Heren

Algorithms may be among the most discussed and least understood aspects of life in the 21st century.


So much of our lives is lived online, and most of us are aware that what we do on social media is largely governed by unseen and unspecified rules.

In recent years, particularly since 2016, when Donald Trump first came to power and Britain voted to leave the European Union, some in the media and politics have raised concerns about how social media algorithms can influence politics.

And others have spoken out about how social media algorithms can radicalise people, especially young men, and push them towards harmful real-world behaviour.

Governments and campaigners have responded and are trying to push through a slew of reforms to regulate social media companies and protect users from harm.

Read more: Social media companies to be 'named and shamed' for not protecting women and girls online, Ofcom chief tells LBC

Read more: Ofcom must punish social media giants if they don't protect children, or calls to ban phones for under-16s will grow

Social media algorithms are controversial. Picture: Alamy

What is an algorithm?

An algorithm is simple, in principle. It's a set of instructions for performing a task, solving a problem, or making a decision.

In a computing sense, these instructions are written in code.
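To make that concrete, here is a rough, hypothetical sketch (not taken from any platform's code) of what a set of instructions looks like in practice, written in Python. It simply checks each number in a list in turn and returns the largest one.

# A toy algorithm: step-by-step instructions that find the largest
# number in a list by checking each item in turn.
def find_largest(numbers):
    largest = numbers[0]      # start with the first number
    for n in numbers[1:]:     # look at every other number in turn
        if n > largest:       # if it beats the current best...
            largest = n       # ...remember it instead
    return largest

print(find_largest([3, 41, 7, 19]))  # prints 41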


How do algorithms work in the context of social media?

On social media, algorithms act as a set of rules that determine what content each user is shown.

For social networks, user attention is key: they want to keep people on the site and attract new users.

That means they are incentivised to show users the content they believe each user wants to see.

To do this, they use information about the kinds of posts a user has already shown an interest in to recommend related posts and personalise their feed.

They might also infer, based on the activity of similar users, that someone interested in one kind of content is likely to be interested in another, related kind.
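To illustrate that 'people like you also liked' idea, here is a deliberately simplified, hypothetical Python sketch with made-up users and topics. Real recommender systems work on far more data and use statistical models rather than simple set overlaps.

# Hypothetical example only: suggest topics liked by users with
# overlapping tastes. Names and topics are invented for illustration.
liked = {
    "alice": {"cats", "baking"},
    "bob":   {"cats", "video games"},
    "carol": {"cats", "baking", "gardening"},
}

def suggest_topics(user):
    own = liked[user]
    suggestions = set()
    for other, topics in liked.items():
        if other != user and own & topics:   # shares at least one interest
            suggestions |= topics - own      # add their other interests
    return suggestions

print(suggest_topics("alice"))  # e.g. {'video games', 'gardening'}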

Algorithms on sites like Facebook, X/Twitter, Instagram, TikTok and YouTube use signals such as relevance, location, interactions, level of engagement, and demographics to work out what to serve each user.

The algorithms are powered by artificial intelligence in the form of machine learning.
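As a very rough sketch of how such signals might be combined (the signal names and weights below are invented for illustration), a feed-ranking step could look something like this in Python. In real systems the weighting is far more complex and is learned by machine-learning models from user behaviour, rather than written by hand.

# Simplified, hypothetical ranking: combine a post's signals into one
# score and show the highest-scoring posts first, as a feed would.
WEIGHTS = {"relevance": 0.5, "engagement": 0.3, "recency": 0.2}

def score(post):
    return sum(WEIGHTS[s] * post.get(s, 0.0) for s in WEIGHTS)

posts = [
    {"id": 1, "relevance": 0.9, "engagement": 0.4, "recency": 0.7},
    {"id": 2, "relevance": 0.3, "engagement": 0.9, "recency": 0.9},
]

for post in sorted(posts, key=score, reverse=True):
    print(post["id"], round(score(post), 2))  # post 1 scores 0.71, post 2 scores 0.6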

Elon Musk's X platform has been controversial. Picture: Getty

Without these algorithms, social media feeds would be a more random jumble of posts.

For example, people who are interested in cats might have to wade through many posts about video games - which would make them more likely to lose interest and leave the site altogether.

In that sense, social media algorithms are a tool for helping users find relevant content more quickly and keeping them satisfied.

But many contend that the algorithms suppress free speech, by promoting specific kinds of content rather than letting posts compete freely in a 'marketplace of ideas'.

Children can often be presented with harmful content online. Picture: Alamy

Why are algorithms controversial?

Social media algorithms - or 'the algorithm' - have been controversial for years.

They've become the centre of a row over the role of technology, artificial intelligence and social media in our lives.

Critics of social media companies say that the algorithms are addictive - and are especially harmful for children and teenagers, who are more easily influenced.

A 2024 study found that TikTok algorithms were pushing misogynistic content onto children. Researchers suggested that although their study was limited to one platform, the algorithms of other social media giants were likely to be similar.

They also said the harmful content children were seeing online was likely to influence their real-world behaviour.

Dr Kaitlyn Regehr of UCL, who helped run the study, said: "Harmful views and tropes are now becoming normalised among young people.

"Online consumption is impacting young people’s offline behaviours, as we see these ideologies moving off screens and into schoolyards."


Others have warned that social media algorithms promote specific ideas that can even influence democratic elections.

Britain's Electoral Commission warned in a study from earlier this year that "platform business models and content promotion systems can amplify harmful content, particularly abuse and intimidation directed at political candidates and misleading electoral information aimed at voters."

They added: "These algorithms can create echo chambers and increase political polarisation by promoting content that reinforces existing views."


What is being done?

The UK and many countries across the world are alive to the issues posed by social media algorithms and are trying to find ways to mitigate their negative impacts.

In the UK, social media companies will legally have to protect users from illegal content, and children from any harmful content - or face punishment from Britain's communications regulator Ofcom.

Ofcom has been given these powers under the Online Safety Act, which was voted into law in 2023.

The regulator's boss told LBC in February that it will also "name and shame" social media sites that fail to protect women and girls online.

Some think that the government could go even further - and ban social media for under-16s.

Chief Medical Officer Chris Whitty is looking at whether to raise the "digital age of consent" from 13 to 16 - which would force social media companies to change their terms and conditions.

Last year Australia voted to ban social media for under-16s, although the restrictions have not come into force yet.


Campaigners have accused the social media giants and the government of not taking children's safety seriously enough.

Joe Ryrie, co-founder of Smartphone Free Childhood, said: “Families, schools and even kids themselves are crying out for guardrails to stop vast global corporations from monetising children’s time and attention, yet the government is still dragging its feet.

“If they were serious about protecting childhood, they’d be moving faster and going further — because right now, Big Tech’s profits are still being prioritised over kids’ wellbeing."

But fining American social media companies may be politically difficult, as some in the government are thought to be concerned about how the Trump administration may react.

It also comes at a time when the Chancellor is keen to encourage investment in Britain's tech firms.