Major News Site Warns ChatGPT Is Inaccurate, Announces Plans to Use It Anyway

Insider has given its reporters the green light to use OpenAI’s ChatGPT for its reporting, as long as they don’t plagiarize or misconstrue any facts in the process — while acknowledging, strangely, that ChatGPT has been known to both plagiarize and fabricate.

“AI at Insider: We can use it to make us faster and better,” reads the subject line of an internal email to employees from Insider’s global editor-in-chief Nich Carlson, screenshots of which were shared to Twitter by Semafor media reporter Max Tani. “It can be our ‘bicycle of the mind.’”

Per his email, Carlson is clearly a big supporter of the tech, telling his employees that the bot can be used for tasks ranging from background research to generating SEO-friendly metadata and headlines to generating article outlines, defeating writer’s block, and more — his “bicycle of the mind” metaphor presumably arguing that ChatGPT is really just a helpful tool for getting from point A to point B faster.

“I’ve spent many hours working with ChatGPT, and I can already tell having access to it is going to make me a better global editor-in-chief for Insider,” Carlson wrote in the email. “My takeaway after a fair amount of experimentation with ChatGPT is that generative AI can make all of you better editors, reporters, and producers, too.”

And yet, despite the editor’s apparent enthusiasm, the green light to incorporate AI-generated text into day-to-day workflow was drenched with warnings and caveats about AI-generated pitfalls. In a lengthy note to staff this…
