Police departments begin using Axon’s AI tool
Police departments in the United States are starting to use AI software to help write police reports. Axon, a company known for making body cameras and other law enforcement technology, has created a generative AI tool called Draft One. The company says the software can reduce the time officers spend on routine paperwork by 30-45 minutes.

Draft One was launched in April. It aims to reduce the workload on police officers and improve their interaction with the community. Axon said in its release, “When officers can spend more time connecting with the community and taking care of themselves both physically and mentally, they can make better decisions that lead to more successful de-escalated outcomes.”

The tool works by transcribing audio from police body cameras. It then uses AI to create a draft narrative based strictly on the audio transcripts; this restriction is meant to prevent speculation or embellishment. Officers must review the drafts and sign off on their accuracy, and each report indicates whether AI was used to create it.

Axon’s AI easing police reporting

Noah Spitzer-Williams, Axon’s AI products manager, said Draft One uses the same underlying technology as ChatGPT, which was designed by OpenAI.

There have been concerns about generative AI producing misleading information. However, Spitzer-Williams said Axon has more control over the software than a general-purpose chatbot offers, which he said allows it to maintain factual accuracy in police reports. How broadly Draft One is used varies by department.

In Oklahoma City, the police department uses the software mainly for minor incident reports. In Lafayette, Indiana, officers are permitted to use it for any kind of case. However, faculty at Purdue University have raised concerns about the reliability of generative AI in high-stakes situations like police encounters.

Lindsay Weinberg, a Purdue clinical associate professor focusing on digital and technological ethics, said large language models are not designed to generate truth but rather plausible-sounding sentences based on prediction algorithms. She also said algorithmic tools often reproduce and amplify existing forms of racial injustice. “The use of tools that make it ‘easier’ to generate police reports in the context of a legal system that currently supports and sanctions the mass incarceration of marginalized populations should be deeply concerning to those who care about privacy, civil rights, and justice,” Weinberg said.

At the time of writing, Axon, Microsoft, and the Lafayette Police Department had not responded to requests for comment.