I am LBC’s tech expert and these online safety issues keep me up at night

Picture: Alamy
By Will Guyatt, technology correspondent

For the first time, Will Guyatt reveals his biggest concerns about our digital lives.

When I started covering the tech beat, the Internet was on one computer in the newsroom, locked behind a door. Almost 30 years on, I’ve covered the rise of the Internet and the incredible tech transformation that means over three-quarters of Brits now carry a smartphone in their pockets, 100,000 times more powerful than the computers that took us to the moon.

Today, I proudly cover a widening tech remit for LBC - the highs and lows, the good and the bad. For our Online Safety Day, I wanted to reveal my biggest concerns about our digital world. These are the top 10 things that keep me up at night.

Truth no longer matters online

We’ve seen the death of truth in 2025, with major social networks all stepping back from moderation and content checking. It started with Elon Musk’s purchase of X, was continued by Meta as Mark Zuckerberg said goodbye to Nick Clegg and seemingly donned a MAGA hat, and has extended to TikTok, which revealed it was laying off hundreds from its trust and safety team to replace them with AI systems that have not yet proved their worth.

It was already hard to know what was real online, and with AI deepfakes, a lack of accountability and the rise of ‘alternative facts’, misinformation is rife everywhere. Now, with Donald Trump suggesting US tech companies should be answerable only to the US government, they are pushing back against regulators worldwide - and that’s incredibly worrying.

Our tech laws don’t move quickly enough

It took over a decade for the Online Safety Act to slowly make its way through parliament and into UK law. As LBC’s recent investigation into the Steam online gaming marketplace revealed, it’s already not fit for the modern digital age, with numerous gaps where new innovations have emerged. We can’t wait another five years while ministers argue the toss over AI regulation - they must move quickly.

Our government doesn’t really get tech

Despite the best efforts of ministers, who typically move from one government post to another, there are no real technology experts in our government. As a result, government policy around tech often becomes very broad and populist in nature - built on a handful of emotional and tragic cases rather than hard, representative evidence. I don’t understand why the government hasn’t conducted a large-scale national survey on the impact of tech on our lives. We can’t respond to the big changes brought on by technology by relying on vibes.

People don’t know how to behave online

You would never have taken a video call without headphones on public transport a decade ago - so why on earth is it acceptable today? There’s a complete lack of social contract around society’s use of technology. Parents don’t know how many hours online are acceptable for themselves, let alone their children. If we adults don’t get it, how do we expect kids to do so?

Perhaps it’s time for the government to create some of those public information films we had back in the 1980s - the sort that told us not to swim on a full stomach or play in an electricity substation during the summer holidays. We would greatly benefit from some training. Perhaps I’ll write the book?

Parents must do more

Tech companies are at the centre of society’s many issues and encourage our rampant digital use - but as I’ve long said, Mark Zuckerberg doesn’t sit at the end of your street peddling smartphones with his apps to your child. You are responsible - and if you hand it over without parental controls or some agreed-upon monitoring policy, you are not doing your best. Maybe my LBC guide can help you?

AI is dangerous and we’re sleepwalking towards it

We’re in a world where we’re still grappling with the growth and influence of social media and misinformation, yet we’re moving on to the next tech revolution before we’ve fully worked out how to deal with the last one. We’ve already heard of tragic cases where people - young and old - have formed delusional relationships with chatbots, and we’re still encouraging everyone to get closer to them. AI is exciting and game-changing, but society needs guardrails and protections that the tech companies are currently not offering in their pursuit of the next big technical innovation.

How the hell do I protect my daughter online?

I’ve got a seven-year-old, and I’m frankly terrified about how I’ll protect her online. At the same time, I am determined to give her the help and support she needs to embrace the digital world. I’m definitely not ready to give her unaccompanied access to any kind of digital service yet - and might not be for several more years - but I can do it in a collaborative and supportive way, where she knows it’s OK to talk to an adult about the good and bad experiences she faces on her online journey.

Why don’t tech companies engage more?

Don’t get me wrong - I love chatting to Nick Ferrari about the latest social media story, or attempting to explain Facebook’s newsfeed algorithm to James O’Brien - but on many occasions I can’t help wondering why someone from one of the tech companies isn’t there doing it instead. The reality, having worked for them, is that many of the social networks reward their media teams for avoiding interviews - something I’d love to change. Perhaps we’d trust and understand them a little more if they spent a bit of quality time on air with Shelagh or Ben on LBC.

They make huge profits and don’t always give back

It’s a common refrain that tech companies should pay more tax in the UK - but there’s a bigger, and arguably more important, issue: tech companies make billions from the UK, yet give too little back. Incredible charities like the Safer Internet Centre and the Lucy Faithfull Foundation are doing excellent work to tackle online harms and educate children, but they typically do not receive the bulk of their funding from big tech businesses.

Online abuse is real and has consequences

Despite growing calls from certain corners to stop the police investigating “hurty words”, society needs to understand that online abuse can be incredibly damaging, and that one person’s “banter” can be extremely harmful to the person on the receiving end. I’ve worked with young people whose lives have been turned upside down by online bullying and harassment - and I’ve seen the real-world impact.

____________________

LBC Opinion provides a platform for diverse opinions on current affairs and matters of public interest.

The views expressed are those of the authors and do not necessarily reflect the official LBC position.

To contact us email opinion@lbc.co.uk