
Fake humans are turning up to job interviews – and you might not even know, FBI warns

Not clear why scammers are creating detailed, false applicants, agency warns

Andrew Griffin
Wednesday 29 June 2022 10:32 BST

Fake humans are conducting job interviews – and could trick the people interviewing them, the FBI has warned.

Scammers are using deepfakes and other technology to create false applicants capable of sitting job interviews, the agency said. The fake people are built from stolen personal information, letting scammers pose as convincing applicants who attend interviews in their victims' names.

If successful, criminals could then use the position to access sensitive data held by those companies, the agency suggested. But it is not entirely clear why cyber criminals are mounting the attack.

The problem is on the rise, with a growing number of complaints from companies who have been targeted by the strange attack, the FBI said in a public service announcement.

The attacks have grown alongside the increase in remote and work-from-home positions since the pandemic.

In the attack, a person may appear on screen as normal, talking and moving like a real person. But that person is controlled by a scammer, who can use fake or AI-generated voices, as well as video, to create a convincing job applicant.

“Complaints report the use of voice spoofing, or potentially voice deepfakes, during online interviews of the potential applicants,” the agency said in its advisory.

“In these interviews, the actions and lip movement of the person seen interviewed on-camera do not completely coordinate with the audio of the person speaking. At times, actions such as coughing, sneezing, or other auditory actions are not aligned with what is presented visually.”

The agency did not give specific advice on combating the attack or spotting the fake applicants, but asked that anyone affected get in touch.

It comes amid an increasing fear about deepfakes, which have also been used to generate fake non-consensual sexual imagery and to make politicians appear to have given statements they never really said.
