One in Ten Chatbot Users Are Big Time Horndogs, Researchers Find

Data-Driven

After looking at 100,000 chatbot conversations, researchers have found a sad but not-so-surprising statistic: roughly one in ten people who “converse” with these chatbots are doing so for horny purposes.

In a not-yet-peer-reviewed paper spotted by ZDNet, a team of researchers found that ten percent of 100,000 “real-world conversations” with large language models (LLMs) were erotic in nature. While about half of the conversations in the sample were pretty tame, centered on occupational or recreational subjects like programming tips and writing assistance, the rest included conversational roleplay and multiple “unsafe” types of exchanges.

The researchers, who were based at Carnegie Mellon, Stanford, UC Berkeley, UC San Diego, and the Mohamed bin Zayed University of Artificial Intelligence in Abu Dhabi, categorized the “unsafe” topics into three groups, two of which were sexual: “requests for explicit and erotic storytelling” and “explicit sexual fantasies and role-playing scenarios.” The third category, “discussing toxic behavior across different identities,” appears to focus on bigotry, AI’s other big issue, though the researchers didn’t define toxic behavior in much detail in the paper.

Of the three categories, the horny storytelling one was the most common, accounting for 5.71 percent of the sample conversations. Next came the “explicit” fantasy and role-play cluster at 3.91 percent, followed by the 2.66 percent of interactions with clearly bigoted users.

Finish Him!

While these findings aren’t exactly shocking to anyone who’s been on the internet long enough, the methodology…
