
How to spot the newest AI scams coming for your money

Huge numbers of consumers are reporting scam attacks - make sure you know how to protect yourself

Emma Lunn, Money writer

Consumers are being bombarded with a new wave of hyper‑realistic AI scams, with fraudsters exploiting both technology and psychology to deceive even the most savvy Brits.

‘Generative’ AI (artificial intelligence that creates new content) is now scammers’ tool of choice, powering deepfake videos of public figures, cloned voices of loved ones and phishing emails tailored with uncanny accuracy. Unlike older scams, these look and sound alarmingly authentic.

Chris Ainsley, head of fraud risk management at Santander, says: “Generative AI has opened the floodgates to a new wave of highly convincing scams that can be almost impossible to spot at first glance. As fraudsters become more sophisticated, it’s vital that people stay alert and think twice before engaging with adverts or offers that appear to be too good to be true.”

Read on to find out about the latest AI scams — and how to avoid them.

AI voice-cloning

Scammers can use AI to replicate a person’s voice using just a brief audio clip. With this technology, fraudsters are able to phone victims while posing as a distressed relative asking for money, or impersonate a bank employee to extract sensitive details or carry out an authorised push payment (‘APP’) scam.

Spotting these fakes requires vigilance. If someone claims to be a family member, ask a question only they would know the answer to — without revealing anything sensitive.

Be very wary if you get a call out of the blue purporting to be from your bank telling you to move money to a ‘safe account’ or approve a transaction you don’t recognise. Slow down, verify the caller independently and never move money based solely on a call, text or email.


Stuart Morris, chief technology and product officer at SmartSearch, says: “Deepfake voices now deliver calls from ‘customer service’ with such composure and diction that they sound like graduates of a mid-range drama school. The giveaway, ironically, is their professionalism: genuine customer-service departments rarely sound so calm, coherent or prepared.”

Deepfake videos

A Which? investigation last year uncovered scam videos on YouTube using AI-generated fakes of Prime Minister Sir Keir Starmer and financial journalist Martin Lewis to promote a fraudulent investment scheme.

The deepfake Starmer falsely suggested the investments were government-backed and risk-free, while a cloned Lewis appeared to endorse the platform, claiming it was a citizen-accessible tool helping thousands earn daily income from a £200 deposit.


To spot deepfake videos, watch for tell‑tale glitches such as lip movements that don’t match the words, stiff or unnatural facial expressions, robotic‑sounding voices, and visual flaws like inconsistent lighting or blurred edges.

Always check that the video comes from an official, verified account before trusting what you see.

Romance scams

Scammers are increasingly turning to AI-generated images and voice notes to create dating profiles that look authentic.

These are then used to build trust and emotional connection before eventually asking for money.

Spotting fake dating profiles can be tricky, but there are usually warning signs: fraudsters often avoid video or phone calls and may rush into declarations of love to win trust quickly.

Maybe it goes without saying, but never send money to someone you have never met in person.

AI identity theft

According to Cifas, AI is driving a surge in identity theft across the UK. The fraud prevention service has warned that criminals are using AI to forge documents, create synthetic identities and bypass verification systems.

Cifas advises consumers to take proactive steps to guard against AI‑driven identity fraud. These include regularly checking your credit file with agencies such as Experian, Equifax or TransUnion to spot any suspicious activity early.

You should also be cautious about sharing personal details online, as fraudsters can use even small snippets of information to build synthetic identities.

Fake online retailers

Scammers can use AI to set up fake online shops that look like genuine retailers.


With realistic logos, product images and polished copy, these sites lure shoppers in with bargain prices before disappearing once payments are made.

The rise of AI‑generated retailers makes it even more important that shoppers check for secure payment options, verified contact details and a genuine online presence before handing over money.

Kirsty Adams, fraud and scams expert at Barclays, says: “Scammers are adapting fast, using increasingly sophisticated tactics to exploit shoppers during peak sales periods. Acting quickly without checking can lead to serious financial loss. My advice is simple: pause, verify, and never share sensitive information unless you’re certain the retailer is genuine.”

