Business/Technology

ChatGPT may help threat actors to create malware

News Mania / Piyal Chatterjee / 16th October 2024

OpenAI, the company behind ChatGPT, has acknowledged that threat actors have used the generative AI-powered chatbot to create and debug malware. The company’s latest threat intelligence report states that since the beginning of the year it has “disrupted more than 20 operations and deceptive networks from around the world” that attempted to use its AI chatbot’s capabilities maliciously.

To illustrate how some threat actors make their operations more effective, OpenAI first points to “SweetSpecter,” a Chinese cyber-espionage group that, according to Cisco Talos analysts, targets Asian nations.

According to OpenAI, the group also targeted the company itself, sending phishing emails to OpenAI employees’ personal email addresses. The messages posed as support requests but carried malware-laced ZIP attachments; opening them would infect the machine with the SugarGh0st remote access trojan (RAT). OpenAI also found that SweetSpecter was using multiple ChatGPT accounts for vulnerability research and scripting.

The next group, “CyberAv3ngers,” is affiliated with Iran’s Islamic Revolutionary Guard Corps (IRGC) and is known for targeting industrial control systems in critical infrastructure. The group reportedly used ChatGPT to produce custom Python scripts and obfuscated code, and to look up default credentials for PLCs. According to OpenAI, CyberAv3ngers also used the chatbot to research how to exploit specific vulnerabilities and how to steal passwords on macOS machines.

Another Iranian group, “Storm-0817,” used the chatbot to debug malware, build an Instagram scraper, and even develop custom Android malware capable of harvesting browser history, files, contact lists, call logs, and screenshots, as well as reporting the victim’s location in real time.
