Researchers surveyed over 3,000 Americans to understand when the public thinks police should call off high-speed chases, finding that people generally support ending pursuits when conditions become dangerous (high speeds, bad weather, heavy traffic) but are more willing to accept risks when violent criminals are being chased. These findings suggest that current police policies emphasizing risk-based decision-making align well with public expectations about when chases should continue versus when they should be stopped. Understanding public attitudes on police chases matters because these pursuits can injure or kill officers, suspects, and innocent bystanders.
Researchers conducted the first experimental test of whether artificial intelligence tools actually help police officers write reports faster, as companies selling these technologies claim. The study found that AI assistance did not significantly reduce the time officers need to complete their reports, contradicting the marketing claims of vendors whose software many departments have already adopted. This matters because police agencies are spending money on unproven AI tools, and the results suggest they should be more cautious before investing in report-writing technologies that may not deliver the promised time savings that could free up officers for other duties.
A new study examining civilian review boards—independent panels that investigate police misconduct—found these oversight bodies don't actually improve the public's trust in police as intended. While people with already negative views of police saw some benefit from civilian oversight, the boards generally failed to boost police legitimacy, and when they disagreed with police chiefs' decisions, public trust in both police and the oversight system actually decreased. These findings suggest that civilian review boards, despite broad public support for the concept, may not be the accountability solution many communities hope for.
Researchers examined the scientific quality of work produced by the Force Science Institute, a private company that provides expert testimony and training materials used in police use-of-force cases, and found that their research fails to meet basic scientific standards required for court evidence. Despite being presented as authoritative science, Force Science materials lack the reliability and rigor that courts, police departments, and training programs should demand when making critical decisions about police use of force. This matters because unreliable "science" is currently being used to influence legal cases, officer training, and department policies that determine when and how police use force against civilians.
Researchers surveyed over 2,400 people to understand how the public views police officers using profanity in different situations, finding that context matters significantly—while swearing at the public, especially in a mean-spirited way, was widely condemned, casual swearing among officers or about situations was generally seen as acceptable. The study also revealed that police leaders tend to judge officer profanity much more harshly than the public does. These findings suggest that police departments might benefit from more flexible language policies that focus on truly harmful speech rather than blanket rules that treat all swearing the same way.
Researchers tested whether using artificial intelligence to automatically review police body camera footage could improve officer behavior in two large US police departments. The AI monitoring led to measurable improvements in how professionally officers interacted with the public, though the specific changes varied depending on whether the department was already under court-ordered reform. This suggests that automated review of body camera footage could be a practical tool for police accountability, helping departments better supervise officers when they can't manually review the millions of hours of video recorded each year.
Researchers studied whether gunshot detection technology—acoustic sensors that automatically alert police to gunfire—actually improves police effectiveness in a mid-sized city, finding that it helped officers discover more shootings, seize more guns, and respond faster to incidents without victims. The technology appeared to reduce non-fatal shootings but had no effect on fatal shootings, suggesting it works best for lower-level gun violence that might otherwise go unreported to police. This matters because most previous studies focused on large cities, leaving smaller police departments uncertain whether this expensive technology would benefit their communities.
A survey of police leaders and researchers about implementing new policing practices revealed that even effective innovations often fail because of organizational barriers rather than a lack of evidence that they work. Key obstacles include officers' distrust of changes proposed by outsiders, the failure to tie new practices to performance reviews, and the reality that new approaches are typically more complex than existing methods. These findings help explain why many promising police reforms struggle to take hold in departments, regardless of their proven effectiveness.
Researchers tested whether police officers who actually used an AI tool to help write reports viewed the technology differently from officers who didn't, finding that both groups held similarly positive views regardless of hands-on experience. While officers who used the AI tool didn't report significantly better outcomes than those who didn't, supervisors noticed improvements in report quality and efficiency when officers used the technology. This suggests that police departments considering AI writing tools should focus on proper training and managing expectations, since officer attitudes may be shaped more by general perceptions of the technology than by direct experience with it.