Microsoft’s AI Team Accidentally Leaks Terabytes of Company Data

Uh Oh

“Oops” doesn’t even cover this one. Microsoft AI researchers accidentally leaked a staggering 38 terabytes of confidential company data on the developer site GitHub, a new report from cloud security company Wiz has revealed.

The scope of the data spill is extensive, to say the least. Per the report, the leaked files contained a full disk backup of two employees’ workstations, which included sensitive personal data along with company “secrets, private keys, passwords, and over 30,000 internal Microsoft Teams messages.” Worse yet, the leak could even have made Microsoft’s AI systems vulnerable to cyberattacks.

In short, it’s a huge mess, and somehow it all goes back to one misconfigured URL: a reminder that human error can have devastating consequences, particularly in the burgeoning world of AI tech.

“We found a public AI repo on GitHub, exposing over 38TB of private files – including personal computer backups of @Microsoft employees. How did it happen? A single misconfigured token in @Azure Storage is all it takes.” – Hillai Ben-Sasson (@hillai), September 18, 2023

Treasure Trove

According to Wiz, the mistake was made when Microsoft AI researchers were attempting to publish a “bucket of open-source training material” and “AI models for image recognition” to the developer platform. The researchers miswrote the files’ accompanying SAS (shared access signature) token, the signed portion of a storage URL that establishes file permissions. Basically, instead of granting GitHub users access to the downloadable AI material specifically, the butchered token allowed general access to the entire…
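Since the whole incident hinges on that SAS token, a brief sketch of the idea may help. A SAS token is essentially the account key's signature over a scope: a resource path, a permission string, and an expiry. Whoever holds the token gets exactly the scope that was signed, so signing "/" instead of one container is the difference between sharing a dataset and exposing an entire storage account. The snippet below is a simplified, stdlib-only illustration under assumed names; the token layout and string-to-sign are not Azure's real canonical format (the actual scheme lives in the azure-storage SDKs), and the account key and paths are made up.

```python
import base64
import hashlib
import hmac

# Hypothetical account key, for illustration only (not a real Azure key).
ACCOUNT_KEY = base64.b64encode(b"not-a-real-account-key").decode()


def make_sas_token(account_key: str, resource: str, permissions: str, expiry: str) -> str:
    """Sign a (resource, permissions, expiry) triple with the account key.

    Mimics the general shape of a SAS token, not Azure's exact format:
    the signature binds the token to precisely the scope that was signed.
    """
    string_to_sign = "\n".join([resource, permissions, expiry])
    key = base64.b64decode(account_key)
    signature = base64.b64encode(
        hmac.new(key, string_to_sign.encode("utf-8"), hashlib.sha256).digest()
    ).decode()
    return f"sr={resource}&sp={permissions}&se={expiry}&sig={signature}"


# Narrowly scoped: read-only ("r") access to one hypothetical container.
scoped = make_sas_token(ACCOUNT_KEY, "/training-data", "r", "2023-10-01T00:00:00Z")

# Over-permissive: read/add/create/write/delete/list on the account root,
# with a far-future expiry -- the kind of scope behind a 38TB exposure.
too_broad = make_sas_token(ACCOUNT_KEY, "/", "racwdl", "2051-10-01T00:00:00Z")
```

Because the permissions and resource path are inputs to the signature, a recipient cannot widen the scope without the account key; the danger is entirely in what the issuer chooses to sign.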
