Onboarding Cyber Criminals

He was supposed to be their dream employee.
But he turned out to be an international nightmare.

The Accidental Hiring of a North Korean Cyber Criminal

Even cybersecurity companies can get tricked...

Asian IT Worker

In 2024, a company renowned for its security awareness training services needed a remote software engineer for its IT department. Everything went through the usual process: the job was posted, resumes were collected, interviews were conducted, background checks were performed, and references were verified. The ideal candidate was hired. A company workstation was then shipped to the new software engineer, and things got weird. Fast...

Shortly after the new hire received the company workstation (corporate VPN and all), the IT department began to notice a series of suspicious activities on that user's account. Cybersecurity operations personnel reached out to the user with more detailed inquiries. Strangely, every time the cybersecurity team reached out, the new hire was having technical difficulties, such as internet issues, or was simply unavailable to take calls. When he finally answered, he claimed he had been following steps in his router's guide to troubleshoot slow internet speeds, an explanation meant to account for some of the potential indicators of compromise. But it did not explain the session history files that had been strangely manipulated, the sensitive files downloaded without explanation, or the unknown files being transferred out. When the IT team then noticed unauthorized software being executed, the new hire became completely unresponsive to calls.
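One of the signals described above, execution of software that is not expected on a corporate machine, can be caught with a simple allowlist check against endpoint process logs. The sketch below is purely illustrative: the log format and allowlist are assumptions for this example, not the actual tooling the company used.

```python
# Illustrative sketch: flag process-start events whose binary is not on a
# corporate allowlist. The event shape and allowlist contents are assumed
# for this example, not taken from any real product.

ALLOWLIST = {"chrome.exe", "outlook.exe", "code.exe", "teams.exe"}

def flag_unauthorized(events):
    """Return names of executed binaries that are not on the allowlist.

    `events` is an iterable of dicts shaped like hypothetical endpoint
    process-start records, e.g. {"user": "jdoe", "process": "chrome.exe"}.
    """
    return [e["process"] for e in events if e["process"] not in ALLOWLIST]

events = [
    {"user": "new_hire", "process": "chrome.exe"},
    {"user": "new_hire", "process": "loader.exe"},  # unknown binary
]
print(flag_unauthorized(events))  # ['loader.exe']
```

Real endpoint detection products do far more (hashing, signing checks, behavioral analysis), but the core idea of comparing observed activity against an expected baseline is the same.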

Thankfully, the account never had access to customer data, and the IT department had already disabled it, allowing the company to shut down the malware installation attempt just in time...

What happened?

It turned out the company's new software engineer was a North Korean cyber criminal who used AI throughout the hiring and onboarding process to infiltrate the company.

The attacker started with a stock professional headshot and used AI to generate an original human image, creating a brand-new identity. That identity was then carefully built up to clear background checks and the other standard hiring checks. The same image was used for live deepfakes, allowing the attacker to pass four video interviews...

The attacker's real identity was an older white man.

The left image is the original stock photo.
The right image was AI-generated and sent to HR.

The cybersecurity company that experienced this is KnowBe4, and it has made significant changes since the incident in July 2024.

Read KnowBe4's own insights and recommendations on this issue.