Surveillance towers, real-time digital fingerprinting and autonomous patrol cars are just a few examples of the tech-driven revolution in law enforcement

Police ranks are depleted. Tenured officers have left in droves. Qualified recruits are hard to come by. Budgets are slashed. Criminals have gone high tech. Can artificial intelligence and other cutting-edge technologies fill the void?

Consider the following.

In Chihuahua, Mexico, a massive tower pulls in feeds from thousands of cameras, biometric sensors, license plate readers, drones and other devices deployed throughout the region.

In the United Kingdom, a new digital-fingerprint-matching system will allow law enforcement to identify suspects in real time from mere traces of their fingerprints.

Europol’s Innovation Lab is using artificial intelligence (AI) to process massive amounts of data to identify trends and patterns, as well as leveraging tools such as ChatGPT to act as investigative assistants.

Belgian police have developed a platform that allows investigators to cross-reference more than 50 separate internal databases and yield results in seconds. New Jersey has used a similar approach to dramatically curtail gun crime.

In Singapore, AI tools help investigators cull potentially obscene materials from seized electronic devices.

Last October, Dubai Police exhibited self-driving patrol cars, equipped with 360-degree cameras, license plate readers, an onboard drone and facial recognition technology, that will patrol residential neighborhoods.

Other departments and agencies leverage AI to audit bodycam footage, standardize report writing, analyze DNA, extract images from video, detect evidence in crime scene photos and much more.

And now, working with academic and organizational partners Staffordshire University, Swedish Defense College, Professional Development Institute of the University of Ottawa and the Global Consortium of Law Enforcement Training Executives, the Rutgers University Miller Center on Policing and Community Resilience is developing a center of excellence dedicated to the responsible, effective and innovative use of AI.

This article highlights some of the most promising current and potential deployments of AI in global law enforcement.

Education and policy

Before law enforcement deploys AI, it is critical to educate the public and create policy and procedures to ensure fair, equitable and constitutional use of the technology. The Rutgers Miller Center initiative will address that very issue.

“It’s crucial to develop and implement sound policies grounded in the laws of privacy and evidence,” says Jack Donohue, a Miller Center Senior Fellow and former NYPD chief. “Most important, it’s essential to keep a human in the loop who has the training and knowledge to set the guardrails for constitutional use of these emerging technologies.”

Similar to the Miller Center, the San Mateo County (California) Sheriff’s Office established a panel drawn from across the community to usher in the use of AI. Undersheriff Chris Hsiung emphasizes that it is critical to involve the public in the process for education, buy-in and accountability. San Mateo assembled a group of attorneys, civil rights advocates, ethicists, technologists and others to establish and communicate use policies and procedures.

Other efforts are also addressing the responsible use of AI. For one, the Law Commission of Ontario (Canada) Criminal AI Lifecycle Project seeks to explore how AI will affect the country’s criminal justice system, particularly its impact on access to justice, due process, human rights and civil liberties.

Current and potential uses of AI in policing

The following use cases, trials and tests represent a snapshot of current and potentially promising applications of AI around the world. They include bodycam footage analysis, report writing, data integration and investigations assistance.

Bodycam footage analysis

When body-worn cameras first came on the scene, the technology promised to hold police accountable, document encounters with the public, collect evidence and facilitate incident review. However, only a tiny fraction of the hundreds of millions of hours of bodycam footage captured by police is ever reviewed.

Law enforcement agencies in states such as California, New Jersey, New York and Michigan are using software that leverages machine learning and AI to review petabytes of bodycam data.

“You need an hour-plus human time to review every hour of footage,” explains Ian Adams, an assistant professor at the University of South Carolina, whose team is running two randomized controlled trials on technology that uses machine learning and AI to audit body-worn camera video.

The technology doesn’t actually process the footage, though, Adams says. Instead, it examines audio transcripts of filmed encounters to identify activity of interest, such as use of profanity, force, insults, or threats. The process includes a human in the loop, notes Adams. “The technology creates markers and asks supervisors questions, such as ‘Was this an arrest?’ The supervisor answers ‘yes,’ ‘no,’ or that the transcript is mistaken.”

This can promote public accountability. For example, a citizens group might complain that officers use excessive profanity. The technology can identify moments that indicate possible profanity and mark them for a supervisor to review. “The supervisor can then address questions such as ‘Is it profanity?’ ‘If so, is the use within policy limits?’ ‘Was the profanity part of a rap song in the background?’” says Adams.
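For readers curious about the mechanics, the sketch below illustrates the transcript-first approach Adams describes: the system never watches the video, it scans the audio transcript for phrases of interest, drops a marker and queues a question for a supervisor. This is a minimal illustration in Python; the word lists, categories and function names are assumptions, not the actual product.

```python
# Minimal sketch of transcript-based flagging: scan the audio transcript (not the
# video) for phrases of interest and queue each hit as a question for a human
# supervisor. All names and word lists here are illustrative assumptions.
import re
from dataclasses import dataclass

PROFANITY = re.compile(r"\b(damn|hell)\b", re.IGNORECASE)              # placeholder word list
FORCE_CUES = re.compile(r"\b(stop resisting|get on the ground)\b", re.IGNORECASE)

@dataclass
class Marker:
    timestamp: float           # seconds into the recording
    category: str              # e.g. "profanity", "possible use of force"
    excerpt: str               # transcript text that triggered the marker
    supervisor_question: str

def audit_transcript(segments):
    """segments: list of (timestamp, text) tuples from the bodycam audio transcript."""
    markers = []
    for ts, text in segments:
        if PROFANITY.search(text):
            markers.append(Marker(ts, "profanity", text,
                                  "Is this profanity, and if so is it within policy?"))
        if FORCE_CUES.search(text):
            markers.append(Marker(ts, "possible use of force", text,
                                  "Was this an arrest or use of force?"))
    return markers  # a supervisor reviews each marker and answers yes, no, or "transcript is mistaken"

# Example: flag an encounter in seconds rather than an hour of viewing.
example = [(12.4, "Get on the ground now"), (310.0, "What the hell is going on")]
for m in audit_transcript(example):
    print(f"{m.timestamp:>6.1f}s  {m.category}: {m.supervisor_question}")
```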

For now, AI isn’t advanced enough to audit the actual video, says Adams. “Today it’s in the too-hard bucket, but not impossible. A few years ago, it was impossible.”

Report writing

Another popular application for AI in law enforcement is assisting in writing accurate reports. Jonathan Parham, former Chief of Police of Linden, New Jersey, and now city manager for Rahway in the same state, has leaned on AI in both roles.

In his experience, AI is a godsend because the quality of police writing has worsened. As police departments have lost staff due to budget cuts and retirements, inexperienced officers with undeveloped writing skills are writing most criminal reports.

Common errors include inconsistent use of terminology, insufficient or missing detail such as license plate numbers, spelling errors, inadequate legal justification and conclusory statements.

“They don’t make it to the judge when they’re garbage,” Parham says. “You will lose cases, get plea deals and have cases thrown out. Good reports help prosecutors prosecute.”

AI probes for the types of issues Parham describes. “The AI agent asks questions, collates information and provides a comprehensive narrative,” Parham says. Linden police and Rahway officials tailor the AI for their specific purposes. For example, police administrators would likely instruct the AI to address issues such as privacy (referring to a victim as “Victim 1” instead of their name, for example), probable cause, and evidence.
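As a rough illustration of that tailoring, the sketch below shows how a department might encode the issues Parham raises (privacy redaction, missing vehicle details, legal justification, conclusory statements) into the instructions given to an AI report assistant. The checklist wording, prompt text and function names are hypothetical; no vendor's actual system is shown.

```python
# A minimal sketch of tailoring an AI report assistant: the agent checks a draft
# against a department checklist and asks clarifying questions before producing a
# final narrative. The checklist items and prompt are illustrative assumptions.
REPORT_CHECKLIST = [
    "Are all persons identified consistently (e.g., 'Victim 1' rather than a name)?",
    "Are vehicle details such as license plate numbers recorded where relevant?",
    "Is the legal justification (probable cause / reasonable suspicion) stated?",
    "Does every conclusion cite an observed fact rather than speculation?",
]

def build_review_prompt(draft_report: str) -> str:
    """Assemble the instructions a department might give its AI report assistant."""
    questions = "\n".join(f"- {q}" for q in REPORT_CHECKLIST)
    return (
        "Review the following draft police report. For each checklist item, "
        "either confirm it is satisfied or ask the officer a clarifying question. "
        "Do not invent facts.\n\n"
        f"Checklist:\n{questions}\n\nDraft report:\n{draft_report}"
    )
```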

Though Parham’s teams haven’t collected metrics comparing traditional reports with AI-assisted reports, he says the anecdotal results are excellent. For instance, prosecutors have noted that the reports are better written and establish stronger factual predicates for criminal charges to stick. The accuracy and sufficiency of the reports are also less likely to be successfully challenged by defendants. These results have enhanced the department’s credibility with the district attorney’s office.

They also improve justice and fairness outcomes, Parham says. AI focuses officers on the elements that are essential to an effective prosecution. Innocent suspects benefit because AI helps banish inference, ambiguity, inconsistency and speculation, forcing the state to move forward only on solid cases.

Data integration

Police need access to scores of data sets: fingerprints, DNA, license plate numbers, ballistics, firearms, firearms owners, victims, immigration records, tax filings, property records, sex offenders and so on. The problem is that these records usually sit in separate databases spread across dozens of law enforcement entities, government agencies and other official repositories.

Forward-thinking and tech-savvy departments are finding ways to pull these data sets together and use AI to make unexpected connections. The Antwerp Police Department in Belgium has been at the forefront. The agency has developed a digital platform with an integrated user interface called FOCUS that combines up to 50 different databases.

The project started a decade ago, explains ICT Deputy Director Stijn Haemhouts, when officers conducting investigations had to serially check dozens of different apps and databases. “They had to remember 30 usernames and passwords,” recalls Haemhouts. “Each database was siloed. There was no interoperability.”

Mimicking user interfaces such as those of Facebook and Google, Antwerp police created a platform designed to evolve, eliminating the need to migrate everything to a new platform in seven or eight years. The platform takes the databases as it finds them; it doesn’t recreate legacy data from the mishmash of commercial off-the-shelf and government databases and platforms operated and maintained by different entities. It simply places a layer over these systems, in essence creating the so-called “single pane of glass,” Haemhouts says.

Antwerp police have made criminal investigations vastly more efficient and effective via a platform that combines dozens of databases.

Now, a single query — say, full name — can immediately yield a wealth of related information, even specifying exactly what data came from which sources and where those sources reside.
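Conceptually, the layer Haemhouts describes behaves like a federated search: one query fans out to an adapter for each legacy system, and every hit comes back tagged with its source. The Python sketch below is a minimal illustration of that pattern; the adapter classes, record formats and data are invented for this example and are not part of FOCUS.

```python
# Minimal sketch of a "layer over existing systems": a single query fans out to
# adapters for each legacy database and results come back tagged with their
# source. Adapters and records below are invented for illustration only.
from typing import Protocol

class SourceAdapter(Protocol):
    name: str
    def search(self, query: str) -> list[dict]: ...

class FirearmsRegistry:
    name = "firearms_registry"
    def search(self, query: str) -> list[dict]:
        # In a real deployment this would call the registry's own API or database.
        return [{"match": query, "detail": "registered handgun owner"}]

class RestrainingOrders:
    name = "restraining_orders"
    def search(self, query: str) -> list[dict]:
        return [{"match": query, "detail": "active order on file"}]

def federated_search(query: str, adapters: list[SourceAdapter]) -> list[dict]:
    """Query every source in place and tag each hit with where it came from."""
    results = []
    for adapter in adapters:
        for record in adapter.search(query):
            results.append({"source": adapter.name, **record})
    return results

# A single query, say a full name, returns hits labeled by originating system.
for hit in federated_search("J. Doe", [FirearmsRegistry(), RestrainingOrders()]):
    print(hit["source"], "->", hit["detail"])
```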

Haemhouts describes a case in which police received a domestic violence call but had little information beyond the address. Entering the address into FOCUS yielded a wealth of critical information, including that the suspect was a former member of special ops, owned two pit bulls, had an active restraining order on him, and was a bulked-up likely steroid user with a history of weapons and drug offenses. Responding officers took appropriate precautions to arrest him without incident.

Antwerp police are also using AI to cross-reference the data to predict likely crime scenes, trends and tactics. For example, the analysis indicates locations where police should maintain a greater presence, whether through drive-bys or foot patrols. Patrols have been expanded from 8 to 12 hours so officers can be more visible in the community and establish better relationships with the public. And the patrols are more efficient because of FOCUS and AI analysis.

Jurisdictions around the world are catching on. In San Mateo County, California, for instance, the sheriff’s office recently completed a pilot program that uses AI to “connect the dots” among dozens of data sets, such as calls for service, outstanding warrants, video feeds, permits and licenses, traffic accidents and protection orders.

Undersheriff Chris Hsiung says the office’s AI agent is not predictive but is effective for both current and cold cases. For example, San Mateo has resolved cold cases by scanning the files into the system via optical character recognition. In one case, “AI took one and a half seconds to find a lead in a cold case that took a person three weeks to find,” says Hsiung.

The initiative crosses county lines. “Each chief gets to decide what to share with neighboring agencies,” Hsiung says. “We share our buckets of data with our neighbors’ buckets.”

Investigations

Among Europol’s several AI initiatives, one of the most promising is its application of retrieval-augmented generation (RAG). RAG is an AI approach that makes it possible to chat with documents. It uses large language models (LLMs) as one of its components to interact with text documents, in particular to answer questions, retrieve paragraphs of interest or summarize their contents. The objective of the Europol research is to use RAG on data sets from exceptionally large investigations that might cover several member states and millions of pieces of data. “We can put this data in a large language model and create hypotheses that we can test with this sort of investigators’ version of ChatGPT,” says Gregory Mounier, head of the Europol Innovation Lab. “This is a new frontier in policing.”
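In skeleton form, the RAG pattern Mounier describes has two steps: retrieve the passages of a case file most relevant to an investigator’s question, then hand those passages plus the question to an LLM. The sketch below is a deliberately simplified illustration; the keyword-overlap retrieval and the llm_complete() placeholder stand in for the vector search and vetted models a real deployment would use, and nothing here reflects Europol’s actual prototypes.

```python
# Simplified retrieval-augmented generation: retrieve relevant case-file passages,
# then ask an LLM to answer using only those passages. Retrieval here is naive
# keyword overlap, and llm_complete() is a stand-in, not a real API.
def retrieve(question: str, documents: list[str], k: int = 3) -> list[str]:
    """Rank case-file passages by keyword overlap with the question."""
    terms = set(question.lower().split())
    scored = sorted(documents, key=lambda d: -len(terms & set(d.lower().split())))
    return scored[:k]

def answer(question: str, documents: list[str]) -> str:
    context = "\n---\n".join(retrieve(question, documents))
    prompt = ("Answer using only the excerpts below and cite them.\n\n"
              f"Excerpts:\n{context}\n\nQuestion: {question}")
    return llm_complete(prompt)  # placeholder for whatever model the agency runs

def llm_complete(prompt: str) -> str:
    # Stand-in: in practice this would call a locally hosted or approved model.
    return "[model response would appear here]"
```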

Over the past nine months or so, the Europol Innovation Lab has developed and tested different RAG prototypes, Mounier says. He adds that the team continues to work on the prototypes and fine-tune the models, as well as ensure compliance with the European Union’s recently passed Artificial Intelligence Act. Europol hopes to introduce the new technology to several member states by the end of the year.

Real-Time Crime Centers

Real-time crime centers (RTCCs) collect feeds from surveillance cameras, license-plate readers, emergency call records, gunshot detectors, drones and other sensors to help police immediately identify, respond to and investigate incidents and trends. Though the first RTCCs emerged almost two decades ago, they are now popping up around the globe, including 150 in the United States alone, sometimes augmented with AI.

Police in the St. Cloud (Florida) RTCC use AI to sift through more than 7,000 video feeds and license-plate readers to search for types of vehicles, based on color and other features, and for individuals based on what they are wearing. The London Underground is testing AI in its RTCC to flag body language, behavior and physical movements that may indicate threats, searching for activity such as fare evasion, suicidal behavior and pickpocketing. Nice, France, is preparing for the upcoming summer Olympics by powering its RTCC with AI to capture, cull and analyze data generated from facial recognition, advanced video analytics and other technologies.

In each case, however, civil liberties and privacy advocates have pushed back against potential abuses, such as misidentification of suspects and objects. Jurisdictions have responded with legislation, regulation, stricter policies and guidelines, or temporary suspension of AI use. Some departments are so spooked by public perception of AI that they have banished it from their RTCCs.

This article documents only a few of the many law enforcement forays into AI. Experts agree that the technology and applications are still in their infancy. Europol’s Mounier foresees “something like an unmanned police future.” Generative AI will be much more powerful than it is today, he predicts, and will be combined with powerful, dexterous robots that could range in size from an insect to a one-ton behemoth. “If you inject an intelligent brain in a physically powerful robot, you can imagine an exciting but challenging future,” he says.

Are we ready?