Microsoft Copilot Studio had a security issue which could have allowed threat actors to exfiltrate sensitive data from vulnerable endpoints, experts have warned.
The vulnerability was found and reported by cybersecurity researcher Evan Grant from Tenable. It is described as an information disclosure flaw stemming from a server-side request forgery (SSRF) attack, and is tracked as CVE-2024-38206 with a severity score of 8.5.
Copilot Studio is an end-to-end conversational AI platform that empowers users to create and customize copilots using natural language or a graphical interface.
Microsoft patches the bug
Describing the flaw, Grant said it abuses a Copilot feature that makes external web requests.
“Combined with a useful SSRF protection bypass, we used this flaw to get access to Microsoft’s internal infrastructure for Copilot Studio, including the Instance Metadata Service (IMDS) and internal Cosmos DB instances,” Grant said.
In layman’s terms, Grant retrieved instance metadata via Copilot chat messages and used it to grab managed identity access tokens. These, in turn, allowed him to access other internal resources, including read/write access to a Cosmos DB instance.
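The token-grabbing step relies on a well-documented Azure mechanism: the Instance Metadata Service (IMDS) lives at a fixed link-local address and issues managed identity access tokens to any process on the host that can reach it with the right header. The sketch below is illustrative only, not Grant's actual exploit; the Cosmos DB resource URI is a hypothetical example chosen to mirror the internal access described above.

```python
# Minimal sketch of a standard Azure IMDS managed-identity token request,
# the kind of request an SSRF flaw could be pointed at. Assumptions:
# the documented IMDS endpoint and "Metadata: true" header requirement.
from urllib.parse import urlencode

IMDS_TOKEN_ENDPOINT = "http://169.254.169.254/metadata/identity/oauth2/token"

def build_imds_token_request(resource: str, api_version: str = "2018-02-01"):
    """Return the URL and headers for an IMDS managed-identity token request."""
    query = urlencode({"api-version": api_version, "resource": resource})
    url = f"{IMDS_TOKEN_ENDPOINT}?{query}"
    # IMDS rejects requests that lack this header, which is why an SSRF
    # bypass that lets the attacker control headers matters here.
    headers = {"Metadata": "true"}
    return url, headers

# Hypothetical target: a token scoped to Azure Cosmos DB.
url, headers = build_imds_token_request("https://cosmos.azure.com/")
```

A token returned from this endpoint is a bearer credential for the VM's managed identity, which is why reaching IMDS from inside a shared service is enough to pivot into other internal resources.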
“An authenticated attacker can bypass Server-Side Request Forgery (SSRF) protection in Microsoft Copilot Studio to leak sensitive information over a network,” Microsoft said in an advisory, effectively acknowledging the bug. Users do not need to take any action, however, as the bug was handled on Microsoft’s side.
While the flaw does allow crooks to access sensitive data, it doesn’t allow them to access cross-tenant information, Grant concluded. Still, since the Copilot Studio infrastructure is shared among multiple tenants, in theory multiple customers could be affected by an attacker with elevated access to Microsoft’s infrastructure.
Microsoft Copilot Studio is part of Microsoft’s broader Copilot initiative, which integrates AI-powered tools into its software suite. Announced in 2023, Copilot Studio allows organizations and developers to tailor Copilot’s behavior to their specific needs.
Via The Hacker News