Santander creates deepfake videos to warn people about AI scam dangers

7 August 2024, 00:04

Santander UK has created deepfake videos (Dominic Lipinski/PA)

The videos, depicting the bank’s fraud lead as well as an online influencer, will be placed on social media to highlight the risks of fraud.

A major bank has created a series of deepfake videos to warn people how realistic they already are and highlight the threat from fraudsters.

Santander is placing the videos, depicting its fraud lead Chris Ainsley and influencer Timi Merriman-Johnson, also known as @mrmoneyjar, on social media to help raise awareness.

In one of the videos, Mr Ainsley appears to say: “For scammers, it is a powerful tool they can use to steal your money.”

He goes on to list the tell-tale signs of a potential deepfake, saying: “Look for blurring around the mouth. The person might blink less frequently than usual.

“If they are wearing glasses, the light reflections might not look right. The background may not feel natural. If something looks strange, trust your instincts.”

The video shows Mr Ainsley saying HM Revenue and Customs (HMRC) and banks are often impersonated.

He adds: “I’ve even been impersonated myself. This video is just the latest example. As you might have guessed, this isn’t me, this is a deepfake, created to warn you about deepfakes.”

He adds that people should consider whether a video was received from a reliable source and whether it is asking for money.

“If they are asking for money, that’s a big red flag,” he adds.

The video depicting Mr Merriman-Johnson, meanwhile, purports to offer an “incredible investment opportunity”.

The real Mr Merriman-Johnson then says: “That definitely wasn’t me. That was a deepfake video,” adding that it is “something we’re all likely to see more of in the future”.

Deepfakes can be videos, sounds or images of real people that have been digitally manipulated through artificial intelligence (AI) to convincingly misrepresent a person or organisation.

Santander warned that with generators and software widely available, fraudsters simply require authentic footage or audio of their intended victim – often found online or through social media – to create a deepfake.

An Opinium survey of 2,000 people for Santander in July found that just over half (53%) of people had either not heard of the term deepfake or misunderstood what it means, with just 17% of people confident they could easily identify a deepfake video.

The survey also indicated that many people have encountered a deepfake, often on social media. Over a third (36%) of people surveyed said they have knowingly watched a deepfake.

Six in 10 (59%) people said they are already more suspicious of what they see or hear because of deepfakes.

Mr Ainsley, head of fraud risk management at Santander, added: “Generative AI is developing at breakneck speed, and we know it’s ‘when’ rather than ‘if’ we start to see an influx of scams with deepfakes lurking behind them.

“We already know fraudsters flood social media with fake investment opportunities and bogus love interests, and unfortunately, it’s highly likely that deepfakes will begin to be used to create even more convincing scams of these types.

“More than ever, be on your guard and just because something might appear legitimate at first sight – doesn’t mean it is.”

Mr Merriman-Johnson said: “As I said in the video, if something sounds too good to be true, it probably is.”

He added: “If you are ever in doubt as to whether a company or individual is legitimate, you can always search for them on the Financial Conduct Authority Register.”

Here are Santander UK’s top tips to spot a deepfake:

1. Most deepfakes are still imperfect. Whether there is blurring around the mouth, less blinking than normal, or odd reflections – look out for the giveaways.

2. But at some point, deepfakes will become impossible to distinguish from real videos, so context is important. Ask yourself the same common-sense questions you do now. Is this too good to be true? If this is real, why is everyone not doing this? If this is legitimate, why are they asking me to lie to my family or my bank?

3. Know what types of scams deepfakes are likely to be used for. Deepfakes are likely to be used by criminals to scam people through investment scams and impersonation fraud, such as romance scams. If you know the tell-tale signs of these scams, you will know how to spot them – even if a deepfake has been used.

By Press Association
