Malware-Infested AI Nude Image Generators are on the Prowl
FIN7, a notorious Russian hacking group, is using fake AI nude image generator websites to distribute malware, targeting individuals and potentially businesses. According to research from Silent Push, the campaign exploits the popularity of deepfake technology to lure users into downloading information-stealing malware.
The fake sites, branded as "AI Deepnude Generators," claim to transform uploaded photos into nude images using AI. In reality, they generate no deepfake content at all; instead, they trick users into downloading malware such as Lumma Stealer and RedLine Stealer, which harvest credentials, cookies, and cryptocurrency data from infected devices.
Often promoted with "free downloads" or "free trials," the fraudulent sites prompt users to either download software or upload a photo. Either path ultimately delivers a password-protected archive containing the malware, hosted on platforms like Dropbox.
While all seven known sites have been taken down, the threat remains active. “We believe it’s likely new sites will be launched that follow similar patterns,” Silent Push warned. FIN7’s SEO tactics ensure that these sites rank high in search results, maximizing exposure.
The campaign casts a wide net. Individuals looking for deepfake content are the primary targets, but businesses are at risk if employees download the malware onto work devices. The malware can lead to further attacks, including ransomware. FIN7 has also been using browser extension lures and spoofed installers of popular software like Zoom and Fortinet VPN to reach victims, demonstrating a diversified approach.
Threat actors used a similar tactic last year, when fake ChatGPT tools began appearing and spreading malware. Cybercriminals exploited the growing interest in AI with fraudulent tools and browser extensions that installed malware instead of providing the promised functionality.