Privacy Ninja

FBI: Stolen PII and Deepfakes Used to Apply for Remote Tech Jobs

The Federal Bureau of Investigation (FBI) warns of increasing complaints that cybercriminals are using Americans’ stolen Personally Identifiable Information (PII) and deepfakes to apply for remote work positions.

Deepfakes are digital content (images, video, or audio) sometimes generated using artificial intelligence (AI) or machine learning (ML) technologies, and they can be difficult to distinguish from authentic material.

Such synthetic content has previously been used to spread fake news and create revenge porn, and the lack of ethical limits on its use has long been a source of controversy and concern.

The public service announcement, published on the FBI’s Internet Crime Complaint Center (IC3) today, adds that the deepfakes used to apply for positions in online interviews include convincingly altered videos or images.

The targeted remote jobs include positions in the tech field that would give the malicious actors access to confidential company and customer information once hired.

“The remote work or work-from-home positions identified in these reports include information technology and computer programming, database, and software-related job functions,” the FBI said.

“Notably, some reported positions include access to customer PII, financial data, corporate IT databases and/or proprietary information.”

Video deepfakes are easier to detect

While some of the deepfake recordings used are convincing enough, others can be easily detected through sync mismatches between audio and video, mainly when the applicants' voices are spoofed.

“Complaints report the use of voice spoofing, or potentially voice deepfakes, during online interviews of the potential applicants,” the US federal law enforcement agency added.

“In these interviews, the actions and lip movement of the person seen interviewed on-camera do not completely coordinate with the audio of the person speaking. At times, actions such as coughing, sneezing, or other auditory actions are not aligned with what is presented visually.”

Some victims who reported to the FBI that their stolen PII was used to apply for a remote job also said that their pre-employment background check information had been used with other applicants' profiles.

The FBI asked victims (including companies who have received deepfakes during the interview process) to report this activity via the IC3 platform and to include information that would help identify the crooks behind the attempts (e.g., IP or email addresses, phone numbers, or names).

In March 2021, the FBI also warned in a Private Industry Notification (PIN) [PDF] that deepfakes (including high-quality generated or manipulated video, images, text, or audio) are getting more sophisticated by the day and will likely be leveraged broadly by foreign adversaries in “cyber and foreign influence operations.”

Europol also warned in April 2022 that deepfakes could soon become a tool that cybercrime organizations use on a regular basis in CEO fraud, to tamper with evidence, and to create non-consensual pornography.


