{"id":4985,"date":"2023-10-07T18:20:13","date_gmt":"2023-10-07T18:20:13","guid":{"rendered":"https:\/\/www.godefy.com\/study-finds-that-police-crime-predicting-ai-fails-miserably-at-predicting-crimes"},"modified":"2023-10-07T18:20:13","modified_gmt":"2023-10-07T18:20:13","slug":"study-finds-that-police-crime-predicting-ai-fails-miserably-at-predicting-crimes","status":"publish","type":"post","link":"https:\/\/www.godefy.com\/study-finds-that-police-crime-predicting-ai-fails-miserably-at-predicting-crimes\/","title":{"rendered":"Study Finds That Police \u201cCrime Predicting\u201d AI Fails Miserably at Predicting Crimes"},"content":{"rendered":"

Broken Algo

Using an algorithm similar to those that predict earthquake aftershocks, police have for more than a decade tried to use "predictive policing" software to forecast where crimes will take place before they happen. As a joint investigation by The Markup and Wired shows, the approach appears to have failed miserably. Geolitica, predictive policing software used by police in Plainfield, New Jersey (the only one of 38 departments willing to provide the publications with data), was so bad at predicting crimes that its success rate was less than half a percent.

Formerly known as PredPol, the machine learning software purchased by authorities in Plainfield and other police departments promised to help cops fight crime before it happens, like Philip K. Dick's classic novella "Minority Report," except with computers instead of psychics. Unsurprisingly, these prediction models, which have been the subject of ample reporting over the years, are riddled with ethical concerns given the discriminatory and racist tendencies of both artificial intelligence and law enforcement. And, as this new investigation reveals, they're shockingly ineffective at predicting crime as well.

Hilariously Bad

To reach this conclusion, The Markup and Wired examined 23,631 of Geolitica's predictions made between February and December 2018 and found that fewer than 100 of them aligned with an actual crime, a success rate of less than half a percent. While the algorithm was slightly better at predicting some crimes than others, it correctly identified 0.6 percent…
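As a rough sanity check on the headline figure, here is a minimal arithmetic sketch. It assumes the article's "fewer than 100" correct predictions can be treated as an upper bound of 100 (the exact matched count is not given); the hit_rate helper and variable names are illustrative, not part of the investigation.

```python
# Rough sanity check on the hit rate reported by The Markup and Wired.
# Assumption: "fewer than 100" correct predictions is treated as an upper
# bound of 100 hits; the exact matched count is not stated in the article.

def hit_rate(correct_predictions: int, total_predictions: int) -> float:
    """Return the share of predictions that aligned with an actual crime."""
    return correct_predictions / total_predictions

total_predictions = 23_631     # Geolitica predictions examined, Feb-Dec 2018
correct_upper_bound = 100      # "fewer than 100" matched an actual crime

rate = hit_rate(correct_upper_bound, total_predictions)
print(f"Overall success rate: at most {rate:.2%}")  # ~0.42%, i.e. under 0.5%

# The article notes some crime categories fared slightly better,
# with a reported 0.6 percent for at least one category.
```

Even using 100 as a generous upper bound, the overall rate comes out to roughly 0.42 percent, consistent with the "less than half a percent" figure reported above.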
