Oops! Microsoft AI team accidentally spills 38 TB of data

Picture this: you’re a massive tech company, and you decide to share some open-source code and AI models on GitHub. But in a plot twist that would make M. Night Shyamalan proud, you end up exposing a mammoth 38 terabytes of personal data. That’s exactly what happened to a Microsoft AI research team. Cybersecurity firm Wiz stumbled upon a digital Pandora’s box while perusing the shared files: a link that led straight to backups of Microsoft employees’ computers. This wasn’t just any link, folks. It was the proverbial golden ticket, granting access to Microsoft service passwords, secret keys, and over 30,000 internal Teams messages from hundreds of Microsoft employees.

The Silver Lining

Before you start panicking, Microsoft was quick to clarify that no customer data was compromised and no other internal services were jeopardized. “No customer data was exposed, and no other internal services were put at risk because of this issue. No customer action is required in response to this issue,” Microsoft wrote in a blog post. So, it seems like this was less of a Titanic-sized iceberg and more of an inconvenient speed bump.
The inclusion of the link wasn’t an oversight but a deliberate move. The researchers were using an Azure feature called “SAS tokens” to create shareable links. This feature allows users to share data from their Azure Storage accounts through a signed URL that encodes the permissions granted and an expiry time. Misconfigure those settings, and a link meant to share a few files can open up an entire storage account.
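To see why a misconfigured SAS link is so dangerous, it helps to look at what one actually contains. The sketch below parses a hypothetical SAS URL (the host, container, and token values are made up for illustration, not from the incident) and pulls out the two query parameters that matter most: `sp`, the permissions granted, and `se`, the expiry timestamp.

```python
from urllib.parse import urlparse, parse_qs

# Hypothetical SAS URL -- every value here is invented for illustration.
sas_url = (
    "https://example.blob.core.windows.net/robust-models/backup.vhd"
    "?sv=2021-08-06&sp=racwdl&se=2051-10-06T00:00:00Z&sig=FAKESIGNATURE"
)

params = parse_qs(urlparse(sas_url).query)

permissions = params["sp"][0]  # r=read, a=add, c=create, w=write, d=delete, l=list
expiry = params["se"][0]       # the token stops working after this timestamp

# "racwdl" grants full read/write/delete/list access, and an expiry
# decades in the future means the link effectively never dies -- the
# combination that turns a convenient share link into a standing backdoor.
print(f"permissions={permissions} expiry={expiry}")
```

Anyone holding such a URL has everything needed to use it; there is no further login step, which is why an over-permissive, long-lived SAS link in a public repo is equivalent to publishing the data itself.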
