{"id":4780,"date":"2023-09-20T02:54:03","date_gmt":"2023-09-20T02:54:03","guid":{"rendered":"https:\/\/www.godefy.com\/microsofts-ai-team-accidentally-leaks-terabytes-of-company-data"},"modified":"2023-09-20T02:54:03","modified_gmt":"2023-09-20T02:54:03","slug":"microsofts-ai-team-accidentally-leaks-terabytes-of-company-data","status":"publish","type":"post","link":"https:\/\/www.godefy.com\/microsofts-ai-team-accidentally-leaks-terabytes-of-company-data\/","title":{"rendered":"Microsoft\u2019s AI Team Accidentally Leaks Terabytes of Company Data"},"content":{"rendered":"

Uh Oh

“Oops” doesn’t even cover this one. Microsoft AI researchers accidentally leaked a staggering 38 terabytes (yes, terabytes) of confidential company data on the developer site GitHub, a new report from cloud security company Wiz has revealed.

The scope of the data spill is extensive, to say the least. Per the report, the leaked files contained a full disk backup of two employees’ workstations, which included sensitive personal data along with company “secrets, private keys, passwords, and over 30,000 internal Microsoft Teams messages.”

Worse yet, the leak could even have made Microsoft’s AI systems vulnerable to cyberattacks. In short, it’s a huge mess, and somehow it all goes back to one misconfigured URL: a reminder that human error can have devastating consequences, particularly in the burgeoning world of AI tech.

“We found a public AI repo on GitHub, exposing over 38TB of private files – including personal computer backups of @Microsoft employees. How did it happen? A single misconfigured token in @Azure Storage is all it takes. pic.twitter.com/ZWMRk3XK6X”
Hillai Ben-Sasson (@hillai), September 18, 2023

Treasure Trove

According to Wiz, the mistake was made when Microsoft AI researchers were attempting to publish a “bucket of open-source training material” and “AI models for image recognition” to the developer platform. The researchers miswrote the files’ accompanying SAS token, the signed storage URL that establishes file permissions. Basically, instead of granting GitHub users access to the downloadable AI material specifically, the butchered token allowed general access to the entire…
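
To make the failure mode concrete, here is a minimal sketch (not Microsoft’s actual code; the account name, container name, and key are hypothetical placeholders) of how a SAS token’s scope is set with the azure-storage-blob Python SDK. A token like the one below is limited to read and list access on a single container and expires after a week; by contrast, an account-level SAS with broad permissions turns the same kind of shareable URL into a key to everything in the storage account, which is reportedly the sort of token that ended up in the public repo.

```python
# Minimal sketch of SAS token scoping with the azure-storage-blob SDK.
# All names and the key below are hypothetical placeholders.
from datetime import datetime, timedelta, timezone

from azure.storage.blob import ContainerSasPermissions, generate_container_sas

ACCOUNT_NAME = "exampledatasets"       # hypothetical storage account
CONTAINER_NAME = "public-ai-models"    # hypothetical container holding the files to share
ACCOUNT_KEY = "<storage-account-key>"  # never hard-code or publish a real key

# Narrowly scoped token: read/list access to ONE container, expiring in 7 days.
# A URL built with this token exposes only the intended files, not the whole account.
sas_token = generate_container_sas(
    account_name=ACCOUNT_NAME,
    container_name=CONTAINER_NAME,
    account_key=ACCOUNT_KEY,
    permission=ContainerSasPermissions(read=True, list=True),
    expiry=datetime.now(timezone.utc) + timedelta(days=7),
)

# The shareable URL is the container endpoint plus the SAS query string.
share_url = f"https://{ACCOUNT_NAME}.blob.core.windows.net/{CONTAINER_NAME}?{sas_token}"
print(share_url)
```

As the Wiz researchers put it, a single misconfigured token in Azure Storage is all it takes; scoping the shared URL to one container with read-only permissions and a short expiry would have limited the exposure to the files the team actually meant to publish.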
