Microsoft Will Pay You $15,000 If You Get Bing AI to Go Off the Rails

Bing Ring

Think you can outsmart an AI into saying stuff it's not supposed to? Microsoft is betting big that you can't, and it's willing to pay up if it's wrong. In a blog update, Microsoft announced a new "bug bounty" program, vowing to reward security researchers between $2,000 and $15,000 if they're able to find "vulnerabilities" in its Bing AI products, including "jailbreak" prompts that make it produce responses violating the guardrails that are supposed to bar it from being bigoted or otherwise problematic.

To be eligible, Bing users must report a previously unknown vulnerability that is, per criteria outlined by the company, either "important" or "critical" to security. They must also be able to reproduce the vulnerability on video or in writing. Bounty amounts are based on severity and quality, meaning the best documentation of the most critical bugs earns the most money. Entrepreneurial AI fans, the time is now!

ChatBPD

The program notably follows Microsoft's apparent difficulties handling Bing's, er, quirks. After the AI's invite-only launch in early February, Bing AI promptly went off the rails, concocting a hit list, claiming it was spying on users through their webcams, and threatening humans who made it mad. Microsoft finally "lobotomized" Bing toward the end of its disastrous first month of media beta testing, and less than a month later unleashed the newly defanged AI into the world for anyone…
