Tuesday, 7 Jan 2025

Facial recognition cameras to turn Londoners into ‘walking ID cards’

Cressida Dick defends the use of facial recognition in CCTV

Facial recognition software which the Metropolitan Police is poised to roll out across the capital represents “a discriminatory and oppressive surveillance tool”, furious human rights campaigners have claimed. Katy Watts, a lawyer at Liberty, called for the tech to be banned, claiming it had “no place on the streets of a rights-respecting democracy”, while Madeleine Stowe of Big Brother Watch said it had “no place in British policing”.

Both were commenting after the Met announced it would be using its Live Facial Recognition (LFR) system as a “significant step towards precise community-based crime fighting” on the day the National Physical Laboratory published pioneering research into its algorithm.

Lindsey Chiswick, the Met’s Director of Intelligence, immediately hailed LFR as a “precise community crime fighting tool”, with the False Positive Identification Rate (FPIR) – the chance of a false match – being just one in 6,000.

However, Ms Watts was unconvinced, telling Express.co.uk: “We should all be able to live our lives without the threat of being watched, tracked and monitored by the police.

“Facial recognition technology is a discriminatory and oppressive surveillance tool that completely undermines this basic right.”

Referring to the new study, she continued: “This report tells us nothing new – we know that this technology violates our rights and threatens our liberties, and we are deeply concerned to see the Met Police ramp up its use of live facial recognition.”

The expansion of “mass surveillance tools” had no place on the streets of a rights-respecting democracy, Ms Watts argued.

She added: “Facial recognition doesn’t make people safer, it entrenches patterns of discrimination and sows division.

“History shows us that surveillance technology will always be disproportionately used on communities of colour and, at a time when racism in UK policing has rightly been highlighted, it is unjustifiable to use a technology that will make this even worse.

“This Government is intent on wrecking privacy rights and monitoring us as well as ripping up existing protections.

“It’s impossible to regulate for the dangers created by a technology that is oppressive by design. The safest, and only, thing to do with facial recognition is ban it.”

Ms Stowe said: “Live facial recognition is suspicion-less mass surveillance that turns us into walking ID cards, subjecting innocent people to biometric police identity checks.

“This Orwellian technology may be used in China and Russia but has no place in British policing.”

Ms Stowe said the research confirmed LFR had “significant race and sex biases”, although it found police could alter the settings to “mitigate them”.

She added: “Given repeated findings of institutional racism and sexism within the police, forces should not be using such discriminatory technology at all.

“Police forces have also refused to disclose the ethnicity breakdowns of watchlists, but at facial recognition deployments, we have repeatedly witnessed black men, even black children, being wrongly matched and stopped.”


The prospect of one in 6,000 people being wrongly flagged was “nothing to boast about”, Ms Stowe claimed, especially in big cities where tens of thousands of people were scanned daily.

She continued: “If rolled out across the UK, this could mean tens of thousands of us will be wrongly flagged as criminals and forced to prove our innocence.

“Live facial recognition is not referenced in a single UK law, has never been debated in parliament, and is one of the most privacy-intrusive tools ever used in British policing. Parliament should urgently stop police from using this dangerously authoritarian surveillance tech.”
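The arithmetic behind the campaigners' projection can be sketched briefly; the daily scan volume below is an assumption for illustration, since the article says only that tens of thousands of people are scanned daily in big cities.

```python
# Back-of-the-envelope arithmetic for the false-flag claim. The 1-in-6,000
# rate comes from the report cited in the article; the 60,000 daily scan
# figure is an assumed example, not a number from the article.
FALSE_MATCH_RATE = 1 / 6000  # report's stated false-positive rate per person scanned

def expected_false_flags(people_scanned: int) -> float:
    """Expected number of wrongly flagged people for a given scan count."""
    return people_scanned * FALSE_MATCH_RATE

# At an assumed 60,000 scans a day in one large city:
daily = expected_false_flags(60_000)  # 10 false matches per day
yearly = daily * 365                  # roughly 3,650 per year, per city
```

On these assumptions, a single busy deployment area would generate several false matches every day, which is the scale of error the campaigners describe.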

The research, entitled Facial Recognition Technology in Law Enforcement, sought to test the accuracy, in operational conditions, of the algorithm used by the Met across different demographics, or sections of the population – in other words, race, gender and age.

LFR compares a live camera video feed of faces against a predetermined watchlist to find a possible match that generates an alert.

Using a face-match threshold of 0.6 – the comparison score above which two images are considered to match – the chance of a false match is just one in 6,000 people who pass the camera, the report says.
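The thresholding step described above can be sketched as follows. This is an illustrative outline only, not the Met's actual system: the cosine-similarity measure, the embedding vectors and the watchlist names are all assumptions made for the example, though the 0.6 threshold is the figure given in the report.

```python
# Illustrative sketch of threshold-based live facial recognition: each face
# passing the camera is scored against every watchlist entry, and scores
# above the threshold raise an alert for an officer to review manually.
# Embeddings, names and the similarity measure here are hypothetical.
import math

MATCH_THRESHOLD = 0.6  # score above which two faces are treated as a possible match

def cosine_similarity(a, b):
    """Cosine similarity between two face-embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def check_against_watchlist(live_embedding, watchlist):
    """Return (name, score) alerts for every watchlist entry the live face
    exceeds the threshold against; an empty list means no alert is raised."""
    alerts = []
    for name, ref_embedding in watchlist.items():
        score = cosine_similarity(live_embedding, ref_embedding)
        if score > MATCH_THRESHOLD:
            alerts.append((name, score))
    return alerts
```

In this sketch a raised alert is only a candidate match: as the Met notes later in the article, every match is then reviewed by an officer before any stop takes place.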

Ms Chiswick said: “Live Facial Recognition technology is a precise community crime fighting tool. Led by intelligence, we place our effort where it is likely to have the greatest effect.

“It enables us to be more focused in our approach to tackle crime, including robbery and violence against women and girls.”

Hailing the independent research as a “significant step for policing”, she added: “We commissioned the work so we could get a better understanding of our facial recognition technology, and this scientific analysis has given us a greater insight into its performance for future deployments.

“All matches are manually reviewed by an officer. If the officer thinks it is a match, a conversation will follow to check.”

Ms Chiswick continued: “We understand the concerns raised by some groups and individuals about emerging technology and the potential for bias. We have listened to these voices.

“This research means we better understand the performance of our algorithm. We understand how we can operate to ensure the performance across race and gender is equal.”
