Sextortionists are making AI nudes from your social media images


The Federal Bureau of Investigation (FBI) is warning of a rising trend of malicious actors creating deepfake content to perform sextortion attacks.

Sextortion is a form of online blackmail in which malicious actors threaten to publicly leak explicit images and videos of their targets, stolen through hacking or acquired through coercion, typically demanding payment in exchange for withholding the material.

In many sextortion cases, the compromising content is not real; the threat actors merely pretend to have it in order to scare victims into paying the extortion demand.

The FBI warns that sextortionists are now scraping publicly available images of their targets, such as innocuous pictures and videos posted on social media platforms. These images are then fed into deepfake content creation tools that turn them into AI-generated sexually explicit content.

Although the resulting images or videos aren't genuine, they look convincingly real, so they still serve the threat actor's blackmail purpose: sending the material to the target's family, coworkers, and others could cause victims significant personal and reputational harm.

"As of April 2023, the FBI has observed an uptick in sextortion victims reporting the use of fake images or videos created from content posted on their social media sites or web postings, provided to the malicious actor upon request, or captured during video chats," reads the alert published on the FBI's IC3 portal.
