
How data-driven policing threatens human freedom


Readers' comments

The Economist welcomes your views. Please stay on topic and be respectful of other readers. Review our comments policy.




The bigger issue is that data-driven policing relies on black boxes that are proprietary to the vendor and cannot be opened for inspection or review — a bit like the recipe for KFC. Cathy O'Neil, in her interesting book Weapons of Math Destruction, points out that the new age of the algorithm needs an auditor, and says...
"The models being used today are opaque, unregulated, and uncontestable, even when they’re wrong. Most troubling, they reinforce discrimination: If a poor student can’t get a loan because a lending model deems him too risky (by virtue of his zip code), he’s then cut off from the kind of education that could pull him out of poverty, and a vicious spiral ensues. Models are propping up the lucky and punishing the downtrodden, creating a “toxic cocktail for democracy.” Welcome to the dark side of Big Data."


First: "Terrorism!"
then big-data analysis
then far too many false positives
thus: "Accuracy!"
then more data gathering, more invasion-of-privacy information
then far too many false positives and unprevented terrorist attacks, still
next step?
thought police (your facial expression showed signs of disgust when the president was on TV? Please come in for questioning)
I think it's systemic. I think it's impossible to identify terrorists through big-data analysis in a cost-efficient way. It's taking huge risks without it yielding the desired results. If the desired result isn't fascism, of course.
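The commenter's point about false positives is the classic base-rate problem: when the group you are screening for is extremely rare, even a very accurate model flags mostly innocent people. A minimal sketch, using hypothetical numbers (the population size, threat count, and accuracy figures below are illustrative assumptions, not data from the article):

```python
# Base-rate sketch with hypothetical numbers: a rare target group
# means even an accurate screening model is swamped by false positives.
population = 100_000_000      # assumed size of the screened population
threats = 1_000               # assumed number of actual threats
sensitivity = 0.99            # model correctly flags 99% of real threats
false_positive_rate = 0.01    # model wrongly flags 1% of innocents

true_positives = threats * sensitivity
false_positives = (population - threats) * false_positive_rate
precision = true_positives / (true_positives + false_positives)

print(f"People flagged: {true_positives + false_positives:,.0f}")
print(f"Chance a flagged person is a real threat: {precision:.2%}")
# Roughly a million people flagged, and only about 0.1% of them
# are actual threats -- the rest are false positives.
```

Under these assumptions, investigating every flag means questioning around a thousand innocent people for each real threat found, which is the cost-efficiency objection the commenter raises.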

B. Hotchkiss

". . . ready to offer those young men and women the opportunity to change their environment."
Why "young" -- unless Mr. Ferguson is guilty of the same kind of data-driven prejudice that he is so concerned about?


We need to control and supervise how the police use new technologies. Most importantly, we should remember that the Western concept of law is centred on the individual: no one should be unjustly harmed. If a technology leads to the prosecution of 100 guilty people with one unjustly prosecuted innocent among them, that is not acceptable under the Greek law on which our own law is based.
Police officers who are keen to adopt a new technology should note that most of the 'inconveniences' or 'fears' surrounding new technology are also real glitches and failures of the system.
Most interesting, however, is whether criminals themselves will start using the technology and turn the tables on the police.