Sponsored Content: How ALPR Encourages Objective Policing
In 2015, the Stanford Open Policing Project began analyzing traffic stops across the U.S., looking at the most comprehensive data set that had ever been collected. After controlling for factors like geography and gender, the study revealed evidence that officers generally stop Black drivers at higher rates than white drivers. Similarly, the project concluded that police require less suspicion to search Black and Hispanic drivers than white drivers.
These statistics highlight the fact that everyone, including police, witnesses, and ordinary citizens, is subject to inherent human bias. In the same vein, the Innocence Project reports that the leading cause of wrongful convictions is mistaken eyewitness identification.
Fortunately, today’s public safety technology gives law enforcement officers tools that can help counteract the natural phenomenon of human bias. Devices like automatic license plate reading (ALPR) cameras are not only effective for solving crimes quickly; they also help mitigate bias by focusing on wholly objective evidence, like license plate numbers and vehicle details.
Of course, not all technology is built and deployed the same. To ensure an ALPR program is ethically deployed to maximize objectivity, there are a few key details to keep in mind.
Are citizens’ privacy rights safeguarded?
Protecting citizens’ privacy is key to earning your community’s trust, and data security contributes heavily to citizens’ confidence in the tool. People rightfully want to understand who has access to the ALPR data, how long it is stored, and what limitations are in place to prevent potential overreach.
For example, Flock Safety ensures that access is granted only to trained law enforcement professionals, that footage is stored for only 30 days by default, and that data is never shared or sold to a third party.
Is transparency built into the product?
Encouraging transparency from the beginning of an ALPR program, or even before implementation, ensures that an agency is aligned with its community and creates a platform for open discussion.
Identify the stakeholders you need to engage to vet and implement the ALPR program, and determine what policies or practices must be in place to ensure everyone is on board.
Communicating consistently with your community throughout the installation process and using features like an ALPR Transparency Portal helps hold your agency accountable, reducing the risk that residents develop negative associations with the technology.
Does the software reinforce objectivity?
Because ALPR often relies on accompanying software, there is an added risk of bias if that software attempts to predict aspects of a crime, such as whether and where it will occur, how it will occur, and who might commit it.
So, instead of relying on potentially biased historical data to predict the future, avoid inequity by deploying an ALPR system that uses machine learning to better understand what is actually happening in the present moment. A simple list of questions you can ask vendors might include:
- How do you decide what to build from a machine learning standpoint?
- How do you evaluate how your models are trained?
- What inputs feed into your models, and where does the data come from?
For more information about how ALPR can protect privacy, encourage transparency, and promote trust while working to eliminate crime, visit www.flocksafety.com.
About the Author
Kevin Cox is a Flock Safety Solutions consultant based in Dallas, Texas. He joined Flock Safety after spending 23 years at the Grand Prairie Police Department in the Problem Solving and Criminal Intelligence units. Cox managed investigations related to human trafficking and vice, as well as crime prevention, community engagement, and partnerships. He also supervised the agency’s policing technology programs, including license plate recognition technology, facial recognition systems, drones, and video analytics.