Ethical Hacking News
A supply chain attack has been uncovered in which two malicious packages impersonating popular AI models like OpenAI ChatGPT and Anthropic Claude were used to deliver an information stealer called JarkaStealer. The packages were uploaded to the PyPI repository and downloaded by thousands of users worldwide.
The malicious packages "gptplus" and "claudeai-eng" were uploaded to the PyPI repository by a user named "Xeroline" in November 2023, impersonating popular AI models. The packages contained Base64-encoded data that downloaded a Java archive file and JRE from GitHub and Dropbox, respectively, before running the malware. The JarkaStealer malware stole web browser data, system data, screenshots, and session tokens from various applications like Telegram, Discord, and Steam. The packages were offered under a malware-as-a-service model via a Telegram channel for $20-$50, with the source code leaked on GitHub.
The software development landscape has long been plagued by vulnerabilities, from outdated dependencies to sophisticated supply chain attacks. A recent discovery sheds light on one such attack, in which malicious packages uploaded to the Python Package Index (PyPI) repository impersonated popular artificial intelligence (AI) models like OpenAI ChatGPT and Anthropic Claude to deliver an information stealer called JarkaStealer.
The packages in question, named gptplus and claudeai-eng, were uploaded by a user named "Xeroline" in November 2023. Both libraries purported to offer a way to access the GPT-4 Turbo API and the Claude AI API, but harbored malicious code that deployed the malware upon installation. The "__init__.py" file in these packages contained Base64-encoded data with code to download a Java archive file ("JavaUpdater.jar") from a GitHub repository ("github[.]com/imystorage/storage"). It also fetched the Java Runtime Environment (JRE) from a Dropbox URL if Java was not already installed on the host, before running the JAR file.
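The pattern described here, a package whose "__init__.py" hides a Base64-encoded stage that downloads and runs an external payload, recurs across many PyPI attacks. As a rough illustration of what such indicators look like in practice, the following Python sketch scans installed packages' "__init__.py" files for the telltale signs described above; the regular expressions and the blob-length threshold are assumptions for demonstration, not Kaspersky's detection logic.

# Illustrative heuristic only: flags __init__.py files showing the kind of
# indicators described above (large Base64 blobs, decode/exec calls,
# hard-coded download URLs). Patterns and thresholds are assumptions.
import re
import sys
from pathlib import Path

SUSPICIOUS = {
    "large Base64 blob": re.compile(r"[A-Za-z0-9+/=]{200,}"),
    "Base64 decode call": re.compile(r"\bb64decode\s*\("),
    "dynamic execution": re.compile(r"\b(?:exec|eval)\s*\("),
    "hard-coded download URL": re.compile(r"https?://(?:raw\.githubusercontent\.com|github\.com|dropbox\.com|dl\.dropboxusercontent\.com)/\S+"),
    "process launch": re.compile(r"\bsubprocess\.|os\.system\s*\("),
}

def scan(path: Path) -> list[str]:
    """Return the names of all suspicious indicators found in one file."""
    text = path.read_text(errors="ignore")
    return [name for name, pattern in SUSPICIOUS.items() if pattern.search(text)]

if __name__ == "__main__":
    # Usage: python scan_inits.py /path/to/site-packages
    root = Path(sys.argv[1] if len(sys.argv) > 1 else ".")
    for init_file in root.rglob("__init__.py"):
        hits = scan(init_file)
        if hits:
            print(f"{init_file}: {', '.join(hits)}")

Because plenty of legitimate packages decode Base64 or spawn processes, output from a scan like this is only a triage aid, not a verdict.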
The JAR file was JarkaStealer itself, a Java-based information stealer capable of harvesting web browser data, system data, screenshots, and session tokens from applications such as Telegram, Discord, and Steam. The collected information was archived, transmitted to the attacker's server, and then deleted from the victim's machine. This data could be used for a variety of nefarious purposes, including phishing attacks, financial fraud, or resale to other malicious actors.
JarkaStealer was found to be offered under a malware-as-a-service (MaaS) model via a Telegram channel for anywhere between $20 and $50, and its source code has since been leaked on GitHub, allowing anyone to deploy the malware.
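Given how widely the stealer is now circulating, a sensible first step for anyone who may have installed one of these libraries is to check for them directly. The following is a minimal sketch that looks for the two reported package names in the current Python environment; only the names are taken from the report, the rest of the script is illustrative.

# Minimal sketch: check the current environment for the two package names
# reported in this campaign. Only the names are taken from the report.
from importlib.metadata import distributions

KNOWN_BAD = {"gptplus", "claudeai-eng"}

installed = {(dist.metadata["Name"] or "").lower() for dist in distributions()}
found = sorted(KNOWN_BAD & installed)

if found:
    print("WARNING: known-malicious packages installed: " + ", ".join(found))
    print("Uninstall them and rotate browser, Telegram, Discord, and Steam sessions.")
else:
    print("Neither gptplus nor claudeai-eng is installed in this environment.")

Uninstalling the packages does not undo any theft that has already occurred; because JarkaStealer targets session tokens, rotating passwords and active sessions for the affected applications is also prudent.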
The cybersecurity researchers who uncovered the campaign report that gptplus attracted 1,748 downloads and claudeai-eng 1,826.
Statistics from ClickPy show that the packages were downloaded mainly by users located in the U.S., China, India, France, Germany, and Russia as part of a year-long supply chain attack campaign.
"The malicious packages were uploaded to the repository by one author and, in fact, differed from each other only in name and description," Kaspersky said in a post. "The packages purported to offer a way to access GPT-4 Turbo API and Claude AI API, but harbored malicious code that initiated the deployment of the malware upon installation."
In conclusion, the recent discovery of JarkaStealer highlights the importance of software security and the need for vigilance when integrating open-source components into development processes. As the threat landscape continues to evolve, it is essential for developers and end-users alike to stay informed about emerging vulnerabilities and take steps to mitigate them.
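One concrete example of such a step, sketched below under the assumption that package artifacts are downloaded and reviewed before installation, is to verify each artifact against a pinned SHA-256 digest; pip supports the same idea natively through --hash entries in a requirements file combined with pip install --require-hashes.

# Sketch: verify a downloaded package artifact against a pinned SHA-256 digest
# before installing it. Generic illustration, not tooling tied to this incident.
import hashlib
import sys
from pathlib import Path

def sha256_of(path: Path, chunk_size: int = 1 << 20) -> str:
    """Stream the file and return its hex-encoded SHA-256 digest."""
    digest = hashlib.sha256()
    with path.open("rb") as fh:
        for chunk in iter(lambda: fh.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()

if __name__ == "__main__":
    # Usage: python verify_artifact.py some_package.whl <expected-sha256>
    artifact, expected = Path(sys.argv[1]), sys.argv[2].lower()
    if sha256_of(artifact) != expected:
        raise SystemExit(f"Hash mismatch for {artifact.name}: refusing to install")
    print(f"{artifact.name} matches the pinned digest; safe to hand to pip")

Hash pinning would not have stopped someone from installing gptplus deliberately, but it does prevent a dependency from being silently swapped for a tampered build, which is one of the more common supply chain failure modes.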
Related Information:
https://thehackernews.com/2024/11/pypi-attack-chatgpt-claude.html
https://gbhackers.com/two-pypi-malicious-package-mimic-chatgpt-claude/
Published: Fri Nov 22 01:41:38 2024 by llama3.2 3B Q4_K_M