London Police facial recognition expansion casts wide net

London’s Metropolitan Police have a new community crime-fighting strategy that expands the use of facial recognition and other technologies to catch the city’s “most harmful offenders,” and apparently thieves on e-bikes.
Live facial recognition use will be expanded across all London boroughs. Pilots of operator-initiated facial recognition and cameras fixed to “street furniture” will continue. Retrospective facial recognition capabilities will “continue to grow,” but so will public engagement to build trust. First-responder drones will be deployed across the city to rapidly reach incidents.
The Met says “officers will expand the use of technology and data to target London’s most harmful offenders” under the plan, which aligns with priorities at the national level.
But a rash of thefts involving people riding e-bikes and e-scooters has London residents “particularly concerned,” according to the announcement. The Independent reports these concerns are largely related to phone thefts.
Two accounts of operations to seize e-bikes, along with a quote on the Met’s crackdown on e-bikes and e-scooters, follow.
Facial recognition “has many uses and it will pick up people that speed, so it will pick up people on e-bikes and in all sorts of situations,” Metropolitan Police Commissioner Mark Rowley told The Independent.
Further details are offered in the “New Met for London Phase 2 2025-2028” plan, which extends the strategy of its 2023 predecessor. A new Law Enforcement Data Service (LEDS) will be launched around the beginning of 2027, in part to give Met Police instant identity verification capabilities through facial recognition and other biometrics.
High scale, low confidence
The Home Office announced a plan last week to expand police use of facial recognition across the UK, as well as to consider making images from the national passport and driver’s license databases available to police.
There are already more than 19 million custody photos in the police national database (PND). The Home Office licensed facial recognition software from Cognitec in 2020 for searches against the PND, but has never updated it.
Now the Guardian and Liberty Investigates have revealed that a National Physical Laboratory (NPL) assessment commissioned by the Home Office alerted police officials to potential bias problems in the system in September 2024.
The demographic differentials of facial recognition algorithms (including Cognitec’s) as assessed by NIST were publicly available even before the Home Office licensed the software, so any suggestion that officials were previously unaware of the issue only raises more questions.
In response to the NPL’s findings, the National Police Chiefs’ Council (NPCC) ordered the confidence threshold for matches to be raised so that false matches would be filtered out, reducing bias, but police forces complained that the change led to too few leads. NPCC documents show potential matches fell from 56 percent of searches to 14 percent.
The NPCC said police found the change meant “a once effective tactic returned results of limited benefit.”
“Evidence is mounting as to why it is crucial we have robust safeguards in place before this powerful and intrusive technology is expanded any further,” says Liberty Policy and Campaigns Officer Charlie Whelton. “For too long, police forces have set the terms, and we are now seeing the real-life consequences.”
Edmonton police first to test facial recognition body cams from Axon

Police in Edmonton, Alberta are launching a proof of concept to test facial recognition-enabled body-worn video (BWV) cameras.
A release from the Edmonton Police Service (EPS) says the limited trial will assess the feasibility and functionality of technology provided by Axon Enterprise, the U.S. company that started out making Taser electroshock weapons but has since pivoted to body cams and other law enforcement tech. EPS will be the first police service in the world to test Axon’s facial recognition BWV cameras. (A trio of Ontario forces uses tech from Idemia.)
The pilot kicks off today, and will see up to 50 police officers equipped with facial recognition-enabled BWV cameras for the month of December. EPS says it aims to “test the technology’s ability to work with our database to make officers aware of individuals with safety flags and cautions from previous interactions.”
Like other facial recognition systems deployed for law enforcement, the system compares footage from the field with a database of police mugshots. And like other police forces, Edmonton’s believes it will help officers prevent crime.
“As we focus on continuous improvement around enhancing officer situational awareness and public and officer safety, we are pleased to be the first police service in the world to test Axon’s facial recognition technology through the use of Body-Worn Video cameras,” says Acting Superintendent Kurt Martin with the EPS’ Information and Analytics Division. “We are hopeful that upon successful testing, it can be yet another tool in our toolbox to assist us in our efforts to keep our communities and officers safe. This technology will not replace the human component of investigative work.”
Axon’s facial recognition cameras run automatically, without intervention from officers. They won’t provide alerts to officers while on duty, but will log footage for review by specialists to determine whether the biometric hardware works as intended.
In effect, this is a technical trial run of the equipment and pipeline, rather than an evaluation of how it would be used by officers in practice.
If the EPS likes what it sees, it will proceed with more tests in 2026.
The Edmonton Police Service has submitted a Privacy Impact Assessment to Alberta’s Information and Privacy Commissioner to ensure the Axon facial recognition proof of concept is fair and complies with privacy law.
The Scottsdale, Arizona-based company’s stock has nosedived by nearly 30 percent over the last month.
The fight over Ring’s new facial recognition feature

The growing pushback against the use of facial recognition in consumer surveillance devices has sharpened considerably with Amazon Ring’s plan to introduce a new feature called Familiar Faces.
The tool, which is expected to roll out this winter, would allow Ring camera owners to tag and identify specific people who come into view of their cameras.
While Amazon says the feature is intended to help residents quickly recognize friends, family members, and frequent visitors, privacy advocates argue it represents a significant expansion of biometric surveillance into neighborhoods, sidewalks, and front doors.
They warn that the feature risks violating state biometric privacy laws and could expose Amazon to lawsuits like those that forced Facebook and Google to abandon or pay out large settlements for comparable systems.
The Electronic Frontier Foundation (EFF) has raised concerns that the feature’s implementation will require Amazon to scan the faces of every person who appears in front of a Ring camera, not only those who are tagged as “familiar.”
This would sweep in visitors, neighbors, postal carriers, utility workers, canvassers, children selling fundraising items, and people who are simply walking by.
EFF noted that in many states, including those with biometric privacy laws, companies must obtain informed, affirmative consent before collecting or processing biometric identifiers like faceprints.
The company has already said that Familiar Faces will not be available in Illinois or Texas, the two states with the strongest biometric privacy laws and the two jurisdictions where courts have already ruled that large technology companies can be held liable for scanning the faces of individuals without their explicit permission.
Amazon also confirmed that it will not launch the feature in Portland, Oregon, which has enacted restrictions on private sector facial recognition deployments. Amazon maintains that the feature will be off by default when it becomes available, and that it may store untagged facial data for up to six months.
Amazon has told reporters that it is not currently using this biometric information to train algorithms, though it did not commit to avoiding such use in the future.
Amazon Ring’s rollout comes at a moment when scrutiny of the company’s surveillance partnerships is intensifying. Ring spent years cultivating close relationships with local police departments, offering portals through which officers could request footage directly from residents.
By 2022, more than 2,000 law enforcement agencies across the country had partnered with Ring. After sustained criticism, Ring said last year that police would no longer be able to request footage from residents through the app.
However, the company continues to comply with warrants and emergency requests and has in the past provided footage to law enforcement without user consent under emergency disclosure exceptions.
Privacy researchers and civil liberties groups caution that adding a facial recognition layer to an already widespread neighborhood surveillance network could enable the rapid identification of large numbers of people and facilitate location tracking without judicial oversight.
Senator Edward Markey of Massachusetts has taken the lead in pressing Amazon to halt the feature. In a letter sent this fall, Markey argued that the combination of networked home surveillance cameras, facial recognition, and police collaboration could enable pervasive monitoring that has historically been carried out only by state intelligence agencies.
Markey previously led a group of Senate Democrats urging Amazon to avoid facial recognition integration in consumer products and to provide transparency about how Ring data is shared with government agencies.
He cited ongoing concerns about algorithmic error rates, especially for darker-skinned people and women, and referenced the documented history of misidentification in law enforcement face recognition systems that has led to wrongful arrests.
Privacy advocates warn that the infrastructure required for Familiar Faces could allow Amazon to provide similar search functions through law-enforcement requests, even if the company does not presently offer such capabilities.
Amazon has acknowledged that it cannot currently generate lists of all Ring cameras where a person has appeared but did not rule out the possibility of developing such functionality.
Privacy researchers argue that the capacity to search visually across a network for a dog could be readily adapted to search for a person. They note that there is little technical distinction between the two applications aside from policy restrictions that could be changed in future updates.
Amazon insists that it has no plans to develop a system that would allow police to perform bulk facial searches. But civil liberties organizations caution that once facial recognition datasets exist, the legal and practical pressures to use them for broader surveillance can escalate quickly.
The risks extend beyond misuse. Storing biometric identifiers carries heightened consequences in the event of a data breach. Unlike passwords or credit card numbers, biometric traits cannot be reset. If a database containing faceprints is compromised, the harm is permanent.
Amazon says that biometric data collected by Ring devices is stored on Amazon servers under strong encryption and security policies. However, Ring has faced criticism over access controls in the past.
Privacy groups and some state lawmakers view the rollout of Familiar Faces as a test of whether state regulators are prepared to enforce the biometric protections on the books. If state agencies decline to act, the practical effect of the laws could be minimal.
The absence of federal privacy legislation has left regulation to a patchwork of state laws and city ordinances. While a bipartisan group of lawmakers has introduced bills in recent sessions to limit facial recognition in consumer, commercial, or law-enforcement contexts, none have yet passed.
Whether Amazon chooses to alter or postpone the rollout of Familiar Faces may depend on the level of public and regulatory scrutiny as launch approaches. The company has assured the public that the feature is optional, can be turned off, and will comply with local law.