Artificial intelligence (AI) services are enabling unscrupulous online blackmailers to create fake but highly realistic sexually explicit photographs and videos of innocent victims. The blackmailer typically emails the target to show them pornographic images that appear to depict them, threatening to send the pictures to the victim’s contacts – a practice known as “sextortion.” A variation is to claim to have compromising footage of the victim recorded via their smartphone camera.
Sextortion is a rapidly rising cyber threat
“Leveraging advancements in artificial intelligence (AI), criminals are creating scams that are not only more credible but alarmingly real,” says antivirus software giant Norton, adding, “This abuse of technology paints a grim picture, where the lines between reality and fabrication are becoming increasingly blurred, ushering in scams that are more convincing and harder to detect.”
In its latest Cyber Safety Pulse Report, Norton found that scams and other forms of human manipulation account for over 75 percent of digital threats. Norton reports a rise in three specific types of scam: e-shop scams, in which fake online stores are created to lure shoppers; sextortion scams; and fake tech support scams.
“These scams exploit human emotions”
“Sextortion scams are particularly insidious, involving threats to release private or compromising information unless a ransom is paid, usually in cryptocurrencies,” says Norton. “These scams often begin with phishing emails and exploit human emotions like fear and shame.”
“We’ve found scammers are leaning on old methods to lure victims, but they now have a more sophisticated arsenal at their disposal to make these schemes more realistic,” added Norton.
Used to blackmail key executives
Sextortion scams are on the rise. While many of these attacks are designed to extract money from private individuals, sextortion can also be used to compromise senior executives and key personnel, blackmailing them into helping hackers break into the corporate network.
AI’s influence on cybersecurity has been decidedly negative
Norton’s findings on sextortion corroborate earlier research from cybersecurity firm ESET, which reported in August a 178 percent increase in sextortion emails between the first half of 2022 and the same period in 2023, ranking sextortion among the top phishing threats.
Despite a widespread belief that AI will transform the workplace and even replace humanity itself, so far the technology’s primary use appears to be helping fraudsters and con artists dupe their victims.