Police departments adopt AI to write reports


Police departments in Oklahoma City; Fort Collins, Colorado; and Lafayette, Indiana, have begun using AI-powered software to write incident reports. The software, called Draft One, is developed by Axon, a company known for its body cameras and Taser products. Draft One is built on the same large language model that powers OpenAI’s ChatGPT.

It draws information from body camera audio recordings to generate comprehensive reports within seconds. Competing products from companies like Policereports.ai and Truleo offer similar capabilities. According to Axon, police officers spend 40% of their time writing reports.
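Axon has not published Draft One’s internals, but the description above suggests a familiar two-step pattern: transcribe the body-camera audio with a speech-to-text model, then prompt a large language model to turn the transcript into a draft report. The sketch below illustrates that pattern using OpenAI’s public Python SDK; the file name, model choices, and prompt are purely illustrative and are not Axon’s.

```python
# Illustrative sketch only: Axon has not published how Draft One works.
# This shows the general speech-to-text -> LLM pattern such tools follow,
# using OpenAI's public Python SDK. File name, models, and prompt are hypothetical.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

# Step 1: transcribe the body-camera audio with a speech-to-text model.
with open("bodycam_clip.wav", "rb") as audio_file:  # hypothetical recording
    transcript = client.audio.transcriptions.create(
        model="whisper-1",
        file=audio_file,
    )

# Step 2: ask a large language model to draft an incident report from the transcript.
completion = client.chat.completions.create(
    model="gpt-4o",
    messages=[
        {
            "role": "system",
            "content": (
                "Draft a factual police incident report based only on the "
                "transcript provided. Do not add details that are not present."
            ),
        },
        {"role": "user", "content": transcript.text},
    ],
)

draft_report = completion.choices[0].message.content
print(draft_report)  # an officer would review and edit this draft before filing
```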

The AI tools aim to eliminate that work, requiring officers only to read the generated reports and confirm their accuracy. In Oklahoma City, the tool is currently used for minor incidents where no arrests are made. However, departments in Fort Collins and Lafayette have reported using it for all types of incidents.

While the potential time savings are significant, concerns have been raised about the tool amplifying biases or getting facts wrong. Axon’s chief executive told the Associated Press that district attorneys have expressed worries about officers using the tool and later claiming they didn’t write the reports, possibly to avoid responsibility during legal investigations.

Captain Jason Bussert of the Oklahoma City Police Department demonstrated Draft One, showing how it can produce a report quickly and accurately using sounds, speech, and radio chatter from body cameras. Sgt. Matt Gilmore, who used the tool after a search operation with his K-9 dog, Gunner, was impressed by the results, stating, “It was a better report than I could have ever written, and it was 100% accurate. It flowed better.”

Rick Smith, Axon’s founder and CEO, emphasized the positive reaction from officers who have tried the tool, as it saves them from the tedious task of data entry.

However, legal experts, prosecutors, and police watchdogs have raised concerns about the implications of relying on AI for critical documentation. Smith acknowledged these concerns, stressing the importance of officers being responsible for the content of the reports, especially since they may have to testify in court about their accounts.

Community activist Aurelius Francisco expressed concerns about biases and prejudices that may be compounded by AI tools, calling these issues “deeply troubling.” As police departments continue to explore the benefits and challenges of AI-generated reports, the broader implications for the criminal justice system and civil rights remain to be seen.

Legal and civil rights experts have warned that police reports are the foundation of the entire justice system, and that errors or distortions in them can have serious consequences. Police reports influence plea bargains, sentencing, discovery processes, trial outcomes, and how society holds police accountable. Andrew Ferguson, a law professor, wrote in a law review article that the act of writing out a justification, swearing to its truth, and publicizing that record to legal professionals serves as a check on police power.

Experts have cautioned that introducing chatbots, which are known to hallucinate, mistake jokes for facts, or insert incorrect information at random, could legitimize wrongful arrests, reinforce police suspicions, mislead courts, or even cover up police abuse. Axon’s manager for AI products, Noah Spitzer-Williams, claimed that Draft One is less prone to hallucination than ChatGPT because the company has turned down the “creativity dial” on the tool. Despite Axon’s recommendation to limit early uses of Draft One to minor incidents and charges, Smith has touted the tool’s potential to revolutionize police work and help scale public safety operations.
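Axon has not said exactly what the “creativity dial” controls, but the closest public analogue in LLM APIs is the sampling temperature: lower values make the model favor high-probability wording and stay closer to its input, at the cost of variety. The sketch below, again using OpenAI’s public SDK with an illustrative model and prompt, shows how that parameter is typically set.

```python
# Hedged sketch: the "creativity dial" is not documented by Axon; sampling
# temperature is the nearest public equivalent in LLM APIs. Lower temperatures
# make the model pick high-probability tokens, yielding more predictable,
# less embellished text. Model name and prompt here are illustrative only.
from openai import OpenAI

client = OpenAI()

prompt = "Summarize this transcript as a factual incident report: ..."

for temperature in (1.0, 0.2, 0.0):
    completion = client.chat.completions.create(
        model="gpt-4o",
        temperature=temperature,  # the "creativity dial": 0 is most conservative
        messages=[{"role": "user", "content": prompt}],
    )
    print(f"--- temperature={temperature} ---")
    print(completion.choices[0].message.content)
```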

However, Matthew Guariglia, a senior policy analyst at the Electronic Frontier Foundation, has called for urgent scrutiny of the tool’s rapidly spreading use, stating, “We just don’t know how it works yet.”