Every Single State's Attorney General Is Calling for Action on AI-Generated Child Abuse Materials

Every Last One

The attorneys general from all 50 US states — plus a smattering of territories — have signed a letter urging Congress to take action against the proliferation of AI-generated child sexual abuse material (CSAM). As first reported by The Associated Press, the bipartisan letter, sent Tuesday to Republican and Democratic legislators in the House and Senate, asks political leaders to “establish an expert commission to study the means and methods of AI that can be used to exploit children specifically” and “propose solutions to deter and address such exploitation in an effort to protect America’s children.”

Already Overdue

In the letter, the prosecutors specifically call on US lawmakers to expand existing CSAM laws, which don’t yet explicitly account for the creation and distribution of synthetic child abuse content — an update that, at this point, is already overdue.

Back in June, The Washington Post reported that the growing prevalence of AI-generated CSAM was making it more difficult to help real child sex abuse victims. One expert, Rebecca Portnoff, the director of data science at the nonprofit child-safety group Thorn, told the newspaper that she and her team had seen a month-over-month increase in such material since image generators first started to reach the public sphere last fall.

With that in mind, it’s worth noting that as open-source image generators become increasingly prevalent and easy to access, it’ll likely become that much harder to police what they’re able to produce.

The attorneys general also called attention to the recent strides made by deepfake…