Criminals using ‘deepfakes’ to apply for home working jobs


Criminals are using deepfake video technology and stolen personal data to impersonate real people and apply for remote working jobs in the tech industry, the FBI has warned.

The US law enforcement agency said it had received complaints about “voice spoofing” taking place during video interviews for remote workers, with the jobs being used to steal private information from corporate databases.

It added that in some instances, hiring managers had become suspicious when audible actions such as coughing and sneezing were not aligned with what was shown on the interviewee's video.

“Deepfakes include a video, an image, or recording convincingly altered and manipulated to misrepresent someone as doing or saying something that was not actually done or said,” said the FBI.

It is thought the criminals behind the impersonation attempts were using the details of genuine people, harvested from stolen data being resold on the dark web.

Criminals would require a photo of their target, such as a social media profile picture, to create a video likeness.

Detailed information about a target’s current employment is typically regarded as low-hanging fruit that is bought and sold in bulk on cybercrime forums.

So far deepfake technology has mostly been used for hoaxes featuring celebrities saying implausible things, though in recent days Russia has been accused of using it to impersonate prominent Ukrainians.

Franziska Giffey, the mayor of Berlin, reportedly joined a video call with someone pretending to be Vitali Klitschko, her counterpart in Kyiv.

“There were no signs that the video conference call wasn’t being held with a real person,” her office said in a statement reported by Der Spiegel.

It took 15 minutes for Ms Giffey to realise she was not speaking to Mr Klitschko, with officials recognising the deception after the impersonator began demanding that Ukrainian refugees be deported back to Ukraine for military service.

One notable early example of a deepfake was a video circulated in 2018 appearing to show Barack Obama calling Donald Trump “a total and complete dip----”. The fake footage, which appeared entirely convincing, was created by US comedian and filmmaker Jordan Peele to highlight the risks of the technology.

Taking its name from “deep learning”, an artificial intelligence research concept, deepfake video technology has been commercialised by mobile app developers, typically for “face swap” apps where ordinary people can see and hear themselves as celebrities.
