ACOUSTIC sensors trained to recognise the sound of gunfire and send alerts to officers’ mobile phones telling them when and where the shots were fired. Glasses that recognise faces and record everything. Drones equipped with high-definition video cameras. GPS trackers and automatic number-plate readers (ANPRs), allowing for constant surveillance of entire swathes of a city. CCTV systems with embedded facial recognition that lets authorities track people in real time.

All of these new technological possibilities are upending a wide range of activities and the customs associated with them. Law enforcement is no different. But if citizens do not like how their doctor or hairdresser, or a social-media site, uses their data or tracks their purchases, they can go somewhere else. The state, by contrast, wields a monopoly on punishment through law enforcement. Police can arrest, and even kill, their fellow citizens. Judges have the power to imprison people. That makes transparency and public consent in the justice system essential.

There is no reason for the police to eschew the best available technology just because it can be used invasively. If criminals store information on their phones, police should be able to see it. If data can help police prevent crime, they should use them. But this needs to be done without impinging on people’s civil liberties. Police and politicians cannot let the allure of new technology lead them to overlook how it will affect the people they serve. And citizens must hold them to account.

Such vigilance must extend to the sellers of these systems as well as their users. Some regimes have embraced emerging technologies the better to control and surveil people: China, for instance, has blanketed its restive regions of Xinjiang and Tibet with facial-recognition cameras, iris scanners and other such kit. In January the European Parliament, following popular concern, imposed export controls on surveillance technology that regimes can use to spy on citizens.

In liberal countries, big-data policing is not about police chiefs sitting around strategising, says Andrew Ferguson, author of a book on the subject. “It’s tech companies selling them cool stuff, charging police departments for storage and data…[and] telling them, ‘We can help you solve more crimes with our cool tech’.” The companies give technology away free to help police solve their problems, he says.

Mr Ferguson suggests five questions that departments should answer before buying new technology. Can you identify the risks that the technology addresses? Can you ensure accurate data inputs? How will the technology affect community relations and policing practice? Can it be tested to ensure transparency and accountability? And will police use the technology in a manner that respects the autonomy of the people it will affect?

Some places have begun to create institutions to answer those sorts of questions. Just like many tech firms, the cities of Seattle and Oakland have chief privacy officers, charged with vetting and managing the privacy implications of their cities’ policies. Oakland’s grew out of its privacy commission, a nine-member advisory body to the city council formally established in 2016, after citizens resisted the city’s plan to introduce a domain-awareness system similar to the one Microsoft and the NYPD built in New York.

“We just started showing up and educating the council on the risks of this equipment,” says Brian Hofer, a member of the commission. The Oakland PD and the commission meet once a month to discuss surveillance and the data of Oakland residents. They write tech-use policies together, and the department submits public annual reports on how often and for what purpose its surveillance tech was used. On May 1st Oakland’s city council proposed a bill requiring that any new police technology be approved by the city council and privacy commission.

One might imagine that background—successfully stopping a planned surveillance programme in one of America’s most liberal cities—would augur an oppositional relationship between the privacy commission and the police department. But the opposite is true, say both Mr Hofer and Tim Birch, who heads the Oakland PD’s research and planning division.

Working with the commission “encourages us to think about what technology is really needed,” and to ask whether the benefits are worth the costs, says Mr Birch. Or as Mr Hofer puts it, “The police are aware that they have to behave differently because someone is watching.” He notes that the commission has never recommended the city council bar police from obtaining new technology that they want. “Technology itself isn’t good or bad, as long as they tighten up their [usage] policies.”

Several other municipalities in California have passed surveillance-transparency requirements similar to Oakland’s. Last February a California state senator introduced legislation requiring municipalities to create and publicise policies for the use of surveillance technology, and restricting the sale or transfer of information gathered through surveillance.

Accidents will happen

Concerns over data-sharing have led cities in California to rethink contracts with Vigilant, an ANPR firm that recently signed up Immigration and Customs Enforcement (ICE), America’s federal immigration police, as a client. Civil-liberties groups worry that ICE could tap into local law-enforcement ANPR data stored on Vigilant’s servers to target undocumented immigrants. Vigilant insists that would be impossible unless a local law-enforcement agency explicitly allowed it, which California’s would not. But, according to Mr Birch of Oakland PD, the ICE contract “terrifies people”. The prospect that the government could find a back door into Vigilant’s massive database, or that a rogue officer who disagrees with California’s liberal policies could share information from the database with federal police, was enough to make co-operation politically impossible for California’s liberal cities.

New Orleans recently ended its relationship with Palantir, a company that built predictive-policing software for the city entirely outside public view. (Its founder, Alex Karp, is a non-executive director on the board of The Economist Group.) Palantir donated the product to the city, but civil-rights activists feared the firm was using New Orleans as a testing ground. Had the city acquired the services through the usual procurement process, it might not have caused a fuss. But a secretive deal for a predictive-policing program run with proprietary algorithms proved too much.

Local politicians upholding their communities’ values is cause to cheer, particularly when it happens in the usually grey area of law-enforcement surveillance. This does not mean that the sort of strict oversight favoured by liberal, multi-ethnic northern California will fly everywhere. “It has to be local,” says Mr Birch. “That’s the only way these privacy commissions can work. They have to reflect local standards.”

There are also benefits in sharing results of number-crunching with other arms of government and civil society. A map of crime is also a map of need. “What you’re modelling is a host of factors, and you’re only giving it to one publicly available resource, which is the punitive resource,” says Mr Goff of John Jay College. “Why would you not also give this to social-service providers?”

Similarly, Andrew Papachristos, a sociologist whose research helped the Chicago PD create its strategic subject list, urged the police to share data, and wrote, “The real promise of using data analytics to identify those at risk of gunshot victimisation lies not with policing, but within a broader public-health approach.” The young men at risk of being shot may also need job training and counselling. Trained mediators could calm conflicts before they flare into violence.

Any number of interventions might benefit them and the community better than contact with the police. As Mr Ferguson writes, “Police may be the primary actors in the system to solve crime, but they do not have to be the primary actors in the system to reduce risk.” And if police can measure their success at driving down crime rates, surely cities can measure providers’ success at offering social services.

But they have to want to do it, and this, too, is a question of citizen involvement—not of oversight, but of political will. “Law and order” candidates win elections more often than “efficiently targeted social-services” candidates. New technology helps justice systems collect and organise data more efficiently. They can use it to punish. Or they can use it for the unglamorous, less politically rewarding work of dealing with the causes of crime.

Ultimately, citizens in open societies must decide for themselves what they are willing to tolerate. Technological change is inevitable, but it need not go unquestioned. Perhaps people want their neighbours to drive around in cars topped with facial-recognition cameras that report everything to police. If they do not, they need to speak up—forcefully, and now.
