News Site Partners with AI Service That Measures Reliability, Which Flags Its Articles as Unreliable

The Messenger, a news site that launched this year with a self-avowed mission to “champion balanced journalism in an era of bias, subjectivity, and misinformation” through “thorough, objective, non-partisan, and timely news coverage,” has partnered with an AI-powered tool called Seekr, which claims to be able to use machine learning to judge whether or not a given item of news is reliable.

As it turns out, though, Seekr’s algorithm says that the Messenger’s reporting isn’t all that trustworthy. Oops!

The partnership seems to have started off on a happy note, with both parties issuing glowing statements about their aligned values in a joint press release.

“This partnership is built on a shared ethos that fact-based journalism standards are foundational to reliable news, and that’s especially important now, as consumers are being inundated by torrents of information — much of it misleading, incomplete, or false,” said Rob Clark, Seekr’s president and chief technology officer.

The Messenger’s president Richard Beckman, meanwhile, added that “opinion, bias, and subjectivity are bleeding into news and have caused many readers to lose trust in the media,” arguing that “Seekr’s responsible AI technology will help hold our newsroom accountable to our core mission.”

But if you try out the AI tool for yourself, it doesn’t seem to hold the Messenger’s work in the highest regard. Seekr’s algorithm claims to quantify the reliability of certain articles by way of a one-to-100 “score,” which it then translates into a rating label, ranging from an unfortunate decree of “Very Low” to…