Op-Ed: The Future of Policing and the End of Privacy
Imagine a future where police predict the next crime. By crunching crime data and applying predictive analytics, computers forecast the likelihood of a burglary, a car theft, or an assault at a particular location, time, and date. Based on the forecast, police target the area, waiting for the crime or, more optimistically, hoping that their presence will deter the criminal actors.
Imagine a future in which police target the people most “at risk” for violence using a rank-ordered algorithmic list of danger. From the most “at risk” to the least, citizens are given a threat score from 1 to 500. Every person arrested gets an individualized number and is judged accordingly. Officers use the results as a virtual most-wanted list to target potential suspects.
Imagine a future in which cameras record every pedestrian, sensors track every automobile, and planes fly overhead videotaping the citizens below. All of this visual surveillance is captured for posterity, giving police a virtual time machine to search for clues. If a crime occurs, one just has to rewind the tape and track the bad guys.
Imagine no more. The future is here, and these and many other technologies are beginning to change how law enforcement does its job.
In Los Angeles, California, predictive policing algorithms guide routine police patrols. In Chicago, Illinois, the Strategic Subjects List (“the heat list”) assigns at-risk citizens threat scores from 1 to 500. In New York City, the NYPD watches the streets with the Domain Awareness System, and in Baltimore, Maryland, Persistent Surveillance Systems planes have filmed entire portions of the city.
This isn’t Minority Report. This is Silicon Valley reshaping modern-day policing. This is the rise of big data policing, and when you add in cell phone tracking, the Internet of Things, social media monitoring, biometric collection (DNA, iris scans, facial recognition, etc.), and good old-fashioned criminal justice record-keeping, you have the developing markers of a frightening data-driven surveillance state.
So, what has been the reaction of the public to these encroaching digital surveillance tools? How have communities responded to being predicted, assessed, and targeted in big cities across America?
Oddly, there has been little discussion and less debate. And the real question is whether this lack of engagement is the result of ignorance or apathy.
On the ignorance side, policing technology tends to be secretive. We know of the examples in Los Angeles, Chicago, New York, and Baltimore in large measure because journalists broke the stories and exposed the technology. In fact, the Baltimore City Council first heard of the aerial surveillance project from Bloomberg Businessweek’s report on the overflight videos.
So perhaps secrecy encourages ignorance. But then again, there has never been a requirement for police to be transparent about the technologies they are employing.
Yet these new surveillance technologies are different in kind, because they threaten real privacy and liberty interests beyond ordinary policing. Decisions about whether we want cameras watching us or algorithms predicting our actions may call for greater civilian oversight. The costs to privacy are real, but so far there has been no real democratic engagement to pierce this veil of secrecy.
Maybe it is just apathy. People seem to have little concern about giving up the most intimate details to Google, Fitbit, or Facebook. In a world where we regularly trade our personal data for convenience or entertainment, do we really expect to keep that information hidden from law enforcement? Or maybe, like so many other things, it is apathy toward “the other.” Maybe we think the policing technology will not affect us personally, so we do not care about how it affects others, especially others in already over-policed, lower socio-economic communities.
But the thing about these new technologies is that they can affect us all. Cameras surveil everyone in a designated area. Algorithms rank everyone arrested. The data web is growing and capturing everyone in its net.
Here is a simple quiz to see whether you can pass the “engaged citizen” test: (1) Do you know which surveillance technologies are in use in your community right now? (2) Do you know whether those technologies are accurate, unbiased, or accountable? (3) And, finally, if you wanted to find out that information, do you know where you would go or whom you could ask? This is the most basic of tests, and if you are like most people, you have no idea about any of the answers.
Today, data-driven technologies are shaping the future of policing, and it is time for citizens to secure the future of privacy.
Andrew Guthrie Ferguson is Professor of Law at the UDC David A. Clarke School of Law and author of the book The Rise of Big Data Policing: Surveillance, Race, and the Future of Law Enforcement (NYU Press 2017).