Russian cybercriminals have been discovered trying to circumvent the restrictions on ChatGPT and use the advanced AI-powered chatbot for their nefarious purposes.
Check Point Research (CPR) said it spotted multiple discussions on underground forums where hackers shared various methods, including using stolen payment cards to pay for upgraded user accounts on OpenAI, bypassing geofencing restrictions, and using a “Russian semi-legal online SMS service” to register for ChatGPT.
ChatGPT is a new artificial intelligence (AI) chatbot that made huge headlines due to its versatility and ease of use. Cybersecurity researchers have already seen hackers use the tool to generate believable phishing emails, as well as code for malicious, macro-laden Office files.
However, abusing the tool isn’t that easy, as OpenAI has put a number of restrictions in place. Russian hackers, due to the invasion of Ukraine, have even more roadblocks to overcome.
For Sergey Shykevich, Threat Intelligence Group Manager at Check Point Software Technologies, the roadblocks aren’t good enough:
“It is not extremely difficult to bypass OpenAI’s restricting measures for specific countries to access ChatGPT. Right now, we are seeing Russian hackers already discussing and checking how to get past the geofencing to use ChatGPT for their malicious purposes.
We believe these hackers are most likely trying to implement and test ChatGPT into their day-to-day criminal operations. Cybercriminals are growing more and more interested in ChatGPT, because the AI technology behind it can make a hacker more cost-efficient,” Shykevich said.
But hackers are not just looking to use ChatGPT – they’re also trying to cash in on the tool’s rising popularity to spread all kinds of malware and steal money. For example, Apple’s mobile app repository, the App Store, hosted an app pretending to be the chatbot, with a monthly subscription costing roughly $10. Other apps (some of which were also found on Google Play) charged as much as $15 for the “service”.