In the UK today, you can be walking down the street, shopping at the mall, or attending a public event, only to have your face secretly scanned, analysed, and checked against a police database. This Orwellian scenario is not a distant dystopia, but a very real and present threat to our civil liberties. Police forces across the country are rapidly adopting facial recognition technology (FRT): both live facial recognition (LFR), which scans faces in real time, and retrospective facial recognition (RFR), which analyses faces from stored footage.

The government and police claim this invasive surveillance is necessary to fight crime. Policing Minister Chris Philp has urged the police to double their use of FRT by May 2024. But the evidence of its effectiveness is flimsy at best. In 2023, across 13 deployments scanning an estimated 247,764 faces, the Metropolitan Police’s FRT system made a mere 18 matches, leading to just 12 arrests. That is an arrest rate of 0.005%. Hardly a resounding success. Even after years of FRT use, the International Criminal Police Organisation, with nearly 200 member countries, has identified only around 1,500 individuals of interest.

Meanwhile, the risks to our fundamental rights are profound and multifaceted. FRT represents a seismic shift in policing, enabling the police to indiscriminately track and identify citizens without any individualised suspicion. This mass surveillance chills free expression, association, and assembly—cornerstones of a democratic society. As the sociologist David Lyon warns, such pervasive monitoring “challenges the very basis of democratic norms”.

But the threats of FRT go beyond abstract notions of privacy. The technology has been consistently shown to be biased and error-prone, particularly for people of colour, women, and trans and non-binary individuals. A 2018 MIT study found that commercial FRT had an error rate of up to 34% for dark-skinned women, compared to a maximum of 0.8% for light-skinned men. For trans and non-binary people, misidentification rates can reach a staggering 100%. In a country still grappling with the scars of discriminatory ‘sus laws’, FRT risks automating and amplifying racial profiling and gender-based discrimination.

Even more concerning is the lack of transparency and accountability around the technology’s use. Police are making unilateral decisions about when to deploy FRT and who to include on their facial recognition databases and watchlists, with little public scrutiny or legislative oversight. This means any photo or video accessible to the police—including your social media posts—can be weaponised as surveillance. All police forces across the UK have the ability to run searches for faces against the Police National Database, which holds more than 16 million photos and includes millions of images that should have been deleted years ago. In 2022, according to data from freedom of information requests, there were 85,158 face searches—up 330% from the previous year. Under the General Data Protection Regulation, we are asked for consent when a website wants to collect our personal data or track our activity. If our habits are protected from Amazon’s prying eyes, why are our faces nonconsensually up for grabs? And when mistakes inevitably happen, citizens have limited recourse. In one chilling case, a 14-year-old Black schoolboy was fingerprinted by police after being misidentified by FRT. How many more false matches are slipping through the cracks, with life-altering consequences?

Proponents argue these issues can be fixed with improved accuracy and stronger regulation. The Home Office claims there is a “comprehensive legal framework” governing FRT. But this assurance rings hollow when existing oversight bodies are toothless and police admit they have conducted FRT operations in secret. No degree of technical tinkering can fix the fundamental power imbalance of a watchful, unaccountable state.

Others suggest limiting FRT to serious crimes would strike an appropriate balance. But this line is perilously easy to blur. We have seen how anti-terror legislation justified for exceptional circumstances can quickly creep into routine policing. Powers granted are seldom voluntarily relinquished.

Most seductively, the government claims we can have both public safety and individual privacy. But this is a false choice predicated on the untested promise of perfect technology. And it sidesteps the deeper question: even if FRT works flawlessly, is a society of total surveillance one we wish to build? Do we want to live in a world where innocent citizens are constantly tracked, their faces silently analysed without their knowledge or consent?

I firmly believe the answer must be no. A free society is fundamentally incompatible with ubiquitous biometric monitoring. We must have the right to anonymity in public spaces, to go about our lives without the chilling knowledge that a faceless algorithm is constantly watching, ready to flag the slightest anomaly to the authorities.

The path forward is clear. We must push Parliament to join the growing international consensus against this dangerous technology. The European Parliament has called for a ban on police use of FRT in public places. In the US, cities such as San Francisco have banned police use of the technology, recognising that it is incompatible with civil liberties. It is time for the UK to choose which side of history it wishes to stand on.

To those who paint opposition to FRT as being ‘soft on crime’, I argue that public safety and individual liberty are not a zero-sum game. We can have secure communities without resorting to indiscriminate surveillance. Proven measures like community policing, crime prevention through environmental design, educational and economic opportunity, and mental health support can make our streets safer while respecting fundamental rights. Ultimately, we can have both liberty and security, and succumbing to the false promises of FRT risks achieving neither.

However, this ban will not materialise without substantial public pressure. It is on all of us to make our voices heard, to flood our MPs’ inboxes, to support advocacy groups leading the charge, to contribute to consultations, to rally our communities. The police may dream of a panoptic future, but we the people still hold the power to reject it.

Facial recognition technology epitomises the dangerous allure of tech-solutionism in public safety. It offers an enticing mirage of a crime-free world, at the cost of a free society. Its proponents speak of pinpoint accuracy, while sidestepping the demonstrable risks of bias, misuse, and overreach. They praise its potential while ignoring the pernicious power dynamics it enables.

In the end, the debate over police use of FRT is about more than just policing. It is a battle for the soul of our democracy. Will the UK uphold the right to privacy and anonymity, or will it sleepwalk into a surveillance state? Will we be a society of free citizens, or will we be reduced to walking ID cards, scanned and logged every time we step outside?

I know which future I want. A United Kingdom where we can walk the streets without fear of a digital despot cataloguing our every move. A nation that fiercely defends the right to go about our lives unencumbered by the prying eyes of the state. A country that rejects the false promise of total surveillance and instead builds safety through community, trust, and respect for civil liberties.

The choice is ours. In 2024, let us come together and definitively ban from our public spaces this invasive, error-prone technology that is ultimately incompatible with a free society. Let us choose to build a society of trust over suspicion, of liberty over surveillance, of the presumption of innocence over the presumption of guilt.

Together, we can ensure that in the UK, our faces remain our own.