Security researchers have discovered that North Korean IT workers use deepfakes to create fake identities and appear for online job interviews. These IT workers secure high-paying remote jobs and send earnings back to the DPRK — a known tactic the country uses to fund its missile and other weapon programs.
The latest attempt was caught by security researchers at Palo Alto Networks Unit 42. The investigation was triggered after the Pragmatic Engineer newsletter reported that a Polish AI company nearly hired a candidate who didn’t exist, following two separate deepfake interviews.
Unit 42 reports that the feat is relatively simple to pull off. A single researcher with no image manipulation experience, limited knowledge of deepfakes, and a five-year-old computer was able to create a fake identity for a job interview in just 70 minutes, using nothing more than an AI search engine, an RTX 3070 GPU, and an internet connection.

Compared to previous tactics used by North Korean IT workers, this approach has several advantages. First, it allows a single operator to interview for the same position multiple times. Second, it helps operators better conceal their identities and avoid being added to wanted lists.
That said, for all its advantages, using deepfakes still has a few tells that can help interviewers spot fakes. Rapid head movements, objects obstructing the face, sudden changes in lighting, and delays between lip movement and speech are all indicators that you might be looking at an AI-generated face.
The technique is an “evolving challenge” for organisations worldwide. North Korean IT workers often target US-based companies, where salaries are highest. In January 2025, the US DoJ indicted five individuals for using fake work credentials to launder over $800,000 for North Korea. North Korean shell firms are also known to impersonate US companies offering IT services for the same purpose.