It is not just netizens who are excited about ChatGPT and other generative AI solutions; hackers, too, have their eyes on the latest innovation. As more people use these models to generate information and build IT solutions, hackers have turned their attention to the new opportunity.
The attention generative AI is getting from hackers is reflected on the dark web, the illicit marketplace where hackers buy and sell stolen IT data and sensitive personal information, which currently hosts some 3,000 posts on the subject. “Threat actors are exploring schemes, from creating nefarious alternatives of the chatbot to jailbreaking the original and beyond. Stolen ChatGPT accounts and services offering their automated creation en masse are also flooding dark web channels, reaching another 3,000 posts,” cybersecurity experts at Kaspersky said.
“All these 3,000 posts are found to be discussing the use of ChatGPT for illegal purposes or talking about tools that rely on AI technologies. Even though the chatter peaked in March, discussions persist,” Kaspersky Digital Footprint Intelligence said.
“The threat actors are actively exploring various schemes to implement ChatGPT and AI. Frequently discussed topics include the development of malware and other types of illicit use of language models such as processing of stolen user data, parsing files from infected devices, and beyond,” it said.
Sharing jailbreaks
“Threat actors tend to share jailbreaks — special sets of prompts that can unlock additional functionality — via various dark web channels, and devise ways to exploit legitimate tools for malicious purposes,” Alisa Kulishenko, Digital Footprint Analyst at Kaspersky, said.
Apart from chatbots and artificial intelligence, considerable attention is being given to projects like XXXGPT, FraudGPT, and others. These language models are marketed on the dark web as alternatives to ChatGPT, boasting additional functionality and the absence of original limitations.
ChatGPT accounts on sale
Another threat for users and companies is the market for accounts of the paid version of ChatGPT.
In 2023, another 3,000 posts (in addition to those mentioned earlier) advertising ChatGPT accounts for sale were identified across the dark web and shadow Telegram channels. These posts either distribute stolen accounts or promote auto-registration services that create accounts en masse on request. Notably, certain posts were published repeatedly across multiple dark web channels.
“While AI tools themselves are not inherently dangerous, cybercriminals are trying to come up with efficient ways of using language models, thereby fueling a trend of lowering the entry barrier into cybercrime and, in some cases, potentially increasing the number of cyberattacks,” said Kulishenko.