Three examples of how cybercriminals exploit AI chats for malicious purposes


Among the trends of the moment are AI chatbots, which let you simulate more or less realistic conversations with real or imaginary characters, such as VIPs and celebrities, even deceased ones. The most popular of these artificial intelligences is undoubtedly ChatGPT, which has become hugely popular on the web.

All that glitters is not gold, however. Quite the contrary. Recent research by Check Point Research (CPR), the Threat Intelligence division of Check Point Software, reveals that these AIs are also particularly appreciated by cybercriminals, who use them to create infostealers and encryption tools and to facilitate fraud.

To demonstrate this, CPR has provided three examples of how AI chats are being used in online crime.

How cybercriminals exploit AI chats: three cases

Case 1: Creation of infostealer malware

On December 29, 2022, a thread titled “ChatGPT – Benefits of Malware” appeared on a popular underground hacking forum. The author of the thread revealed that he was experimenting with ChatGPT to recreate malware and techniques described in research publications and articles on common malware strains.

While this individual may well be a technically skilled attacker, the posts appeared designed to show less technically skilled cybercriminals how to use ChatGPT for malicious purposes.

Case 2: Creation of a multilevel encryption tool

On December 21, 2022, an attacker known as USDoD posted a Python script, which he stressed was the “first script I’ve ever created.” When another cybercriminal commented that the code style resembled OpenAI output, USDoD confirmed that OpenAI had given him a “nice hand to finish the script with a nice scope”.

This suggests that would-be cybercriminals with little or no development skill could exploit ChatGPT to develop malicious tools and become full-fledged cybercriminals with real technical capabilities.

The code in question can, of course, be used harmlessly. However, the script could easily be modified to fully encrypt someone else’s computer without any interaction from the victim: once the bugs and syntax issues are ironed out, the code could be turned into ransomware.
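To make the dual-use point concrete: the building block CPR describes is ordinary symmetric encryption, which is legitimate in backup and privacy tools. Below is a minimal, standard-library-only sketch of that benign side, a stream-cipher-style encrypt/decrypt round trip. It is illustrative only (a hash-based keystream, not production cryptography), and it is not the script from the forum post, whose code CPR did not publish in full.

```python
import hashlib
import os

def _keystream(key: bytes, nonce: bytes, length: int) -> bytes:
    """Derive a pseudo-random keystream by hashing key||nonce||counter.

    Illustrative only; real tools should use a vetted cipher (e.g. AES-GCM).
    """
    out = b""
    counter = 0
    while len(out) < length:
        block = hashlib.sha256(key + nonce + counter.to_bytes(8, "big")).digest()
        out += block
        counter += 1
    return out[:length]

def encrypt(key: bytes, plaintext: bytes) -> bytes:
    # A fresh random nonce per message keeps keystreams from repeating.
    nonce = os.urandom(16)
    stream = _keystream(key, nonce, len(plaintext))
    return nonce + bytes(p ^ s for p, s in zip(plaintext, stream))

def decrypt(key: bytes, blob: bytes) -> bytes:
    # The nonce travels in the clear at the front of the ciphertext.
    nonce, ciphertext = blob[:16], blob[16:]
    stream = _keystream(key, nonce, len(ciphertext))
    return bytes(c ^ s for c, s in zip(ciphertext, stream))
```

The same round trip applied to files instead of in-memory bytes is all it takes to go from a harmless utility to something abusive, which is exactly the low barrier the CPR researchers are warning about.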

Case 3: Facilitating fraud activities

In the third case, a cybercriminal demonstrated how to use ChatGPT to create a script for a dark web marketplace. Such a marketplace’s primary role in the underground economy is to provide a platform for the automated trading of illegal or stolen goods, such as stolen payment accounts or cards, malware, or even drugs and ammunition, with all payments made in cryptocurrency.