Ethical Hacking News
Microsoft has expanded its Copilot bug bounty program to include new types of vulnerabilities and increased payouts for moderate-severity flaws, as the company seeks to improve the security and reliability of its generative AI assistants.
The update to the Copilot Bounty Program adds new vulnerability classes to the program's scope and raises payouts for moderate-severity findings, as Microsoft looks to harden its consumer-facing Copilot products against security flaws.
The Copilot Bounty Program was originally introduced to encourage responsible disclosure of security flaws in Microsoft's AI systems. Researchers submit bug reports and earn rewards for previously unknown vulnerabilities, with payouts ranging from $250 for low-severity issues to $30,000 for critical-severity ones.
To improve the program's effectiveness, Microsoft has expanded the types of vulnerabilities it will pay researchers to find. The new list includes deserialization of untrusted data, code injection, authentication issues, SQL or command injection, server-side request forgery, improper access control, cross-site scripting, cross-site request forgery, web security misconfiguration, cross-origin access issues, and improper input validation.
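To make these vulnerability classes concrete, here is a minimal Python sketch contrasting deserialization of untrusted data (one of the newly listed classes) with a safer, validated JSON pattern. The handle_request_* functions and payload shape are hypothetical illustrations, not Microsoft or Copilot code.

```python
import json
import pickle


def handle_request_unsafe(raw_payload: bytes):
    # Deserialization of untrusted data: pickle.loads can execute
    # attacker-controlled object constructors, so a crafted payload
    # may run arbitrary code on the server.
    return pickle.loads(raw_payload)


def handle_request_safer(raw_payload: bytes) -> dict:
    # Safer pattern: parse a constrained format (JSON) and validate
    # the fields you expect before using them (improper input
    # validation is another class on the expanded list).
    data = json.loads(raw_payload)
    if not isinstance(data, dict) or "query" not in data:
        raise ValueError("unexpected payload shape")
    return {"query": str(data["query"])[:1000]}


if __name__ == "__main__":
    print(handle_request_safer(b'{"query": "hello"}'))
```

Patterns like the first are what researchers hunting deserialization and injection flaws typically look for; the second illustrates the kind of input validation the expanded program is meant to reward when it is missing.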
The company has also asked researchers to focus on specific services, including Copilot for Telegram, Copilot for WhatsApp, copilot.microsoft.com, and copilot.ai. This expanded program is expected to help Microsoft improve the security and reliability of its Copilot consumer products.
Microsoft's efforts to improve the security of its AI systems are part of a broader trend in the tech industry to increase transparency and accountability when it comes to artificial intelligence. As AI becomes increasingly pervasive, companies are recognizing the importance of ensuring that their AI systems are secure and reliable.
The expansion of Microsoft's Copilot bug bounty program is also significant because it highlights the growing importance of responsible disclosure practices in the cybersecurity industry. By encouraging researchers to identify and report security flaws in its AI systems, Microsoft is demonstrating a commitment to transparency and accountability.
Furthermore, the program's increased payouts for moderate-severity vulnerabilities suggest that the company is taking a proactive approach to addressing potential security risks. This is particularly important given the growing concern about the use of deepfakes and other forms of AI-generated misinformation.
Overall, the expanded Copilot bug bounty program is a meaningful step in the industry's push to secure AI systems: broader vulnerability coverage and stronger incentives give researchers more reason to find and report flaws before attackers do.
Related Information:
https://go.theregister.com/feed/www.theregister.com/2025/02/20/microsoft_copilot_bug_bounty_updated/
Published: Thu Feb 20 23:21:59 2025 by llama3.2 3B Q4_K_M