The FBI issued a public service announcement Tuesday warning companies of an increase in the use of deepfake videos in job interviews for tech positions that involve access to sensitive infrastructure or information. So far, the scam has targeted developer, database, and other software-related roles.
It’s also worth pointing out that all of these jobs are remote, meaning that if hired, a fraudulent applicant could continue working under someone else’s deepfaked identity.
To further add authenticity to their applications, these applicants also used stolen personally identifiable information (PII). Numerous victims have reported that their stolen data was used in pre-employment background checks and other screening.
These deepfakes aren’t flawless, though: the lip movements of the person on screen don’t always match the voice in the video. The bureau also points out that other auditory cues, such as coughing or sneezing, are often out of sync with what appears on screen.
The warning doesn’t clarify who is running this scam or what their agenda is. However, if attackers manage to get hired, they will not only gain access to customer PII, financial data, corporate IT databases and proprietary information, but also get a chance to deploy ransomware, exfiltrate data or worse.
This isn’t the first time the FBI has cautioned IT companies looking to hire. In May, the bureau issued a separate warning about North Korean individuals posing as foreign IT workers when applying for jobs in the US. At the time, however, it wasn’t clear whether those applicants were spies or simply looking to collect a larger paycheck.
Someone who writes/edits/shoots/hosts all things tech and when he’s not, streams himself racing virtual cars. You can reach out to Yadullah at [email protected], or follow him on Instagram or Twitter.