Cybercriminals are selling access to a new malicious AI chatbot called GhostGPT. The tool is designed to support malicious activities such as creating malware and writing phishing emails.
Abnormal Security researchers have observed the cybercriminal tool being sold on Telegram since late 2024.
They believe GhostGPT uses a wrapper to connect to a jailbroken version of ChatGPT or another open-source large language model (LLM), allowing it to return uncensored responses to customers.
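In practice, such a wrapper is little more than a thin layer that forwards a user's prompt to a hosted model and relays the reply. The sketch below illustrates only that generic architecture, using the public OpenAI Python client as a stand-in backend; the model name and system prompt are hypothetical placeholders, and no jailbreak logic is shown.

```python
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

# Hypothetical placeholder; tools like GhostGPT would substitute their own
# system prompt here (deliberately omitted in this sketch).
SYSTEM_PROMPT = "You are a helpful assistant."

def relay(user_prompt: str) -> str:
    """Forward a user prompt to the backend model and return its reply."""
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # assumed backend model, purely illustrative
        messages=[
            {"role": "system", "content": SYSTEM_PROMPT},
            {"role": "user", "content": user_prompt},
        ],
    )
    return response.choices[0].message.content

if __name__ == "__main__":
    print(relay("Explain what an API wrapper is."))
```

The point of the pattern is that the operator, not the end user, controls the backend and the system prompt; the customer only ever sees the relayed output.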
The new tool follows the WormGPT AI chatbot, created in 2023 and designed to help threat actors carry out business email compromise (BEC) attacks.
Other variants of these models, such as WolfGPT and EscapeGPT, appeared later.
According to the researchers, the new GhostGPT chatbot has drawn thousands of views on online forums, indicating growing interest in abusing AI tools for criminal activity.
GhostGPT is an effective cybercriminal tool
Abnormal Security emphasizes that GhostGPT helps low-skilled cybercriminals run successful campaigns.
Access to the tool is easy: it can be purchased through the messaging service Telegram. Because it is delivered as a Telegram bot, buyers do not need to jailbreak ChatGPT or set up an open-source model themselves, as the sketch below illustrates.
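The Telegram delivery model is what removes the technical barrier: the bot receives chat messages and relays them to the operator's backend, so the buyer never touches an LLM directly. A minimal sketch of that generic pattern, assuming the python-telegram-bot library (v20+); the token is a placeholder and relay() stands in for the hypothetical prompt-forwarding helper from the earlier sketch.

```python
from telegram import Update
from telegram.ext import ApplicationBuilder, ContextTypes, MessageHandler, filters

def relay(user_prompt: str) -> str:
    """Placeholder for the hypothetical prompt-forwarding helper shown earlier."""
    return f"(model reply to: {user_prompt})"

async def on_message(update: Update, context: ContextTypes.DEFAULT_TYPE) -> None:
    # Pass each incoming chat message to the backend and send back the reply.
    await update.message.reply_text(relay(update.message.text))

def main() -> None:
    # "YOUR_BOT_TOKEN" is a placeholder; real bot tokens are issued by Telegram.
    app = ApplicationBuilder().token("YOUR_BOT_TOKEN").build()
    app.add_handler(MessageHandler(filters.TEXT & ~filters.COMMAND, on_message))
    app.run_polling()  # long-poll Telegram for incoming messages

if __name__ == "__main__":
    main()
```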
“Users can gain access as soon as they pay and focus directly on executing attacks,” the researchers said.
GhostGPT's creators claim that user activity is not logged, allowing customers to conceal their illegal activities.
GhostGPT is sold for a variety of activities, including coding, malware creation, exploit development, and writing persuasive emails for phishing and BEC scams.
The tool's advertising material emphasizes fast response times, letting attackers produce malicious content and gather information more efficiently.
To test GhostGPT's effectiveness, the researchers asked it to create a DocuSign phishing email. The chatbot immediately produced a convincing template.