Law enforcement officers are learning that AI isn’t as trustworthy as they might have expected after a police report fabricated by AI claimed an officer turned into a frog.
If artificial intelligence keeps producing blunders like this, it might need a new name: artificial fantasy. How could anyone believe that an actual police officer transformed into a frog? Yet that claim appeared in a report created by the Draft One software used by the Heber City, Utah, police department. AI should save time and make things a little easier, as long as the information it generates is accurate, but once it botches information this badly, it’s time to rethink the use of AI, especially for things as serious as police reports.
A movie playing in the background created the frog AI police report
A Utah police department got more than they bargained for when their new AI report-writing software mistook a Disney movie playing in the background for actual events. The tool, called Draft One, pulled audio from “The Princess and the Frog” and created an official police report claiming an officer had magically transformed into an amphibian. The embarrassing mistake highlights bigger problems with AI-generated police reports, including concerns about hallucinations, racial bias, and officers becoming less careful with their documentation. While the software might save time on paperwork, critics worry it’s creating more issues than it solves, especially when you can’t tell which parts of a report were written by AI and which came from actual humans.
Police body camera AI reports could be useful in the future
The Draft One software is meant to use body camera footage from officers to automatically generate police reports, but there have been several issues, including the glaring report suggesting an officer transformed into a frog. AI hallucinations in law enforcement do not make things better or easier, because police officers must go through the reports and correct any mistakes. This technology could certainly be useful in the future, but it seems that many AI software companies have rushed their products to market, producing errors that lead to serious distrust in both the AI systems and law enforcement. Errors like these have no place in police reports.
AI hallucinations have no place in law enforcement
Draft One, created by Axon as police technology, has shown many issues even when creating a report for something as simple as a traffic stop. Part of the problem with automated police documentation is the number of corrections required from the human officers who were on the scene. False reporting of events, AI bias in policing, and other strange glitches mean officers must review the automated reports, correct them to reflect what actually happened, and ensure bias is removed before the reports are submitted.
Despite the errors and issues found with this software, some officers have found that AI-generated reports are saving them several hours per week that would have been dedicated to filling out reports.
Could AI-generated reports become more widely utilized?
Despite the errors that create strange scenarios, such as a frog AI police report, having a writing tool as an assistant can help some officers get the job of writing reports done much more efficiently. This is especially true for officers who aren’t as tech-savvy as others. As challenges and errors are reported, Axon can work to improve the software to ensure it becomes more accurate, but that could create a whole new set of challenges.
Should police rely on AI to generate reports?
The automation and ease of technology can make writing reports and filling out information much easier for police than before, but should AI actually generate the report? If all reports are generated by AI, it removes officers’ responsibility to describe the events that took place, which is what makes the reports necessary in the first place. While AI can be a useful tool for many forms and could suggest reports or portions of reports for officers, full reliance on a computer system to generate reports will create problems in law enforcement down the road.
Officers can certainly utilize the AI tools available to them to assist in report writing, but they must be mindful of the potential for false and fanciful reports, like the frog AI police report that was generated. Imagine if a report describing an officer transforming into a frog actually made it to court; that would be embarrassing for the arresting officer.

