UK police begin live facial recognition trials at railway stations

The UK police have kicked off a six-month pilot using live facial recognition (LFR) surveillance to monitor train stations.

The British Transport Police (BTP) began testing on Wednesday at London Bridge railway station, with plans to extend coverage to other key transportation hubs in London.

The introduction of LFR at UK railway stations was first announced in November last year, in the wake of a mass stabbing attack on a London-bound train that left eleven people injured. The system aims to target crime hotspots where data has shown “high harm” offenders are likely to pass through.

“The initiative follows a significant amount of research and planning, and forms part of BTP’s commitment to using innovative technology to make the railways a hostile place for individuals wanted for serious criminal offences, helping us keep the public safe,” says Transport Police Chief Superintendent Chris Casey.

The Transport Police relies on NEC’s NeoFace M40 facial recognition algorithm, which was evaluated by the National Physical Laboratory (NPL). The force has pledged to publish a full assessment of the operation once the pilot is complete.

Similar to other police LFR deployments, the system will rely on a watchlist of offenders and automatically delete images of people who are not matched, a flow sketched below. The project also involves Network Rail, the Department for Transport and the Rail Delivery Group.
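In outline, the match-and-delete flow compares each captured face against the watchlist, flags a strong enough match for a human officer, and discards everything else. The following is a minimal sketch of that logic, assuming a generic embedding-similarity matcher with an illustrative threshold; the names and scoring method are assumptions, not NEC’s NeoFace M40 interface or BTP’s deployed pipeline.

```python
# Minimal sketch of a watchlist match-and-delete flow. Everything here
# (embedding comparison, threshold value, function names) is illustrative;
# it is not NEC's NeoFace M40 API or BTP's actual system.
from dataclasses import dataclass


@dataclass
class Detection:
    camera_id: str
    embedding: list[float]  # face embedding produced by the recognition model


def cosine_similarity(a: list[float], b: list[float]) -> float:
    dot = sum(x * y for x, y in zip(a, b))
    norm = (sum(x * x for x in a) ** 0.5) * (sum(y * y for y in b) ** 0.5)
    return dot / norm if norm else 0.0


def process(detection: Detection,
            watchlist: dict[str, list[float]],
            threshold: float = 0.6) -> str | None:
    """Return a watchlist ID on a match; non-matches are discarded."""
    best_id, best_score = None, 0.0
    for person_id, reference in watchlist.items():
        score = cosine_similarity(detection.embedding, reference)
        if score > best_score:
            best_id, best_score = person_id, score
    if best_score >= threshold:
        return best_id  # flagged for review by a human officer
    return None         # no match: the captured image is deleted immediately
```

In real deployments, the threshold trades false alerts against missed matches, which is why independent evaluation of the underlying algorithm, such as the NPL assessment cited above, matters.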

London Assembly member calls for LFR ban

The expansion of LFR into railway stations comes as the UK government prepares for increased rollouts of the technology as part of its newly announced policing reforms. According to the blueprint, the Home Office will fund 40 new LFR vans to be deployed in town centers across England and Wales.

The plan, however, is sparking resistance.

On Wednesday, a member of the London Assembly for the Green Party, Zoë Garbett, called on the city’s Metropolitan Police to halt its facial recognition deployments, citing concerns over bias and a lack of primary legislation for police use of the technology.

The comments were delivered during a 10-week consultation on a legal framework for police use of the surveillance technology, which the Home Office kicked off in December, according to Computer Weekly.

“It makes no sense for the home secretary to announce the expansion of live facial recognition at the same time as running a government consultation on the use of this technology,” says Garbett. “This expansion is especially concerning given that there is still no specific law authorising the use of this technology.”

In a report submitted to the London Assembly, Garbett argues that the Met Police has been plagued by a lack of transparency, including over the costs of deploying the technology.

“This rapid increase in deployment has come with no evaluation of its effectiveness or consideration of the cost of using LFR compared to other possible policing and non-policing methods,” the report notes.

Garbett also notes that the London police have been increasing the size of the watchlist, turning LFR from “precise policing” into something more akin to a “fishing trawler.” The Green Party member also argues that LFR is used disproportionately in areas that have more people of black, Asian or mixed ethnicities than the London average.

The report notes that its findings were informed by the work of advocacy organizations Big Brother Watch and Liberty.

Big Brother Watch is currently mounting the largest legal challenge yet to the Met Police’s use of facial recognition. The case, brought by black anti-knife crime campaigner Shaun Thompson and Big Brother Watch director Silkie Carlo, was heard at the High Court in London in January.

Thompson was detained by police after the Met Police’s facial recognition system produced a false match.

UK digital ID providers fear govt plans conflict with data protection act aims

The mechanism within the UK’s Data (Use and Access) Act that allows businesses certified under the government’s Digital Identity and Attributes Trust Framework (DIATF) to collect data from public authorities has become the latest grounds for dispute between parties in the country’s spasming identity sector. Most of the data protection rules remaining under the Act, also known as the DUAA, meanwhile, came into force Thursday.

For digital identity and biometrics providers, clarity is also still outstanding on their role in the country’s digital identity system, and on whether that role is commercially viable.

Earlier in the week, the Department for Science, Innovation and Technology (DSIT) and the Government Digital Service gave an update on the status of the Information Gateway mandated by Section 45 of the DUAA in a joint webinar on the UK’s digital ID landscape. Digital Verification Service (DVS) providers will be able to request and gather information from public authorities via the Information Gateway.
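No technical specification for the gateway has been published, so the following is purely a hypothetical sketch of what a provider’s disclosure request might look like; the endpoint, authentication scheme and field names are all invented for illustration.

```python
# Hypothetical sketch only: the Information Gateway's real interface has not
# been published. The URL, "DIATF-Cert" scheme and payload fields are invented.
import requests

GATEWAY_URL = "https://gateway.example.gov.uk/v1/disclosures"  # placeholder


def request_disclosure(provider_cert: str, consent_token: str,
                       attribute: str) -> dict:
    """Ask a public authority to disclose one attribute about a consenting subject."""
    response = requests.post(
        GATEWAY_URL,
        json={
            "attribute": attribute,    # e.g. a right-to-work or licence check
            "consent": consent_token,  # evidence the subject authorized the query
        },
        headers={"Authorization": f"DIATF-Cert {provider_cert}"},
        timeout=10,
    )
    # Authorities retain the option to refuse a request, so a refusal is a
    # normal outcome for the provider to handle, not an error.
    if response.status_code == 403:
        return {"disclosed": False, "reason": "authority declined the request"}
    response.raise_for_status()
    return response.json()
```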

DSIT says a “Code of Practice” for these information disclosures is coming, and is expected to be approved by Parliament this summer, according to a LinkedIn post by legal and digital identity consultant Richard Oliphant. A lively discussion has followed in replies to the post.

“This is a necessary precursor to establishing the Information Gateway and it will boost the use of DVS in the UK private sector,” Oliphant says.

However, he also identifies two major problems ahead.

One is that DSIT has said there are no plans to allow DVS providers to host government-issued verifiable credentials (VCs), like a UK mobile driving license (mDL). These will be stored in the GOV.UK Wallet, giving it an unfair advantage over DIATF firms, which will only be able to share derived credentials that do not bear the digital signatures contained in the VC, and therefore have minimal value.
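The signature point is the crux: a relying party can check a government-signed VC against the issuer’s public key, but a derived credential that merely restates the same facts carries no issuer signature to check. A minimal sketch, using illustrative Ed25519 keys and made-up payloads rather than any real issuer’s scheme:

```python
# Illustrative only: why a derived credential without the issuer's signature
# loses evidential value. Keys and payloads are invented for the example.
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

issuer_key = Ed25519PrivateKey.generate()  # stands in for a government signing key
issuer_pub = issuer_key.public_key()

vc_payload = b'{"type":"mDL","entitlements":["B"]}'
signature = issuer_key.sign(vc_payload)    # the signed VC held in GOV.UK Wallet

# A relying party can verify the original VC against the issuer's key:
issuer_pub.verify(signature, vc_payload)   # passes silently

# A provider restating the same facts produces bytes the issuer never signed,
# so verification fails and the relying party has only the provider's word:
derived_claim = b'{"type":"derived-mDL","entitlements":["B"]}'
try:
    issuer_pub.verify(signature, derived_claim)
except InvalidSignature:
    print("derived claim cannot be traced back to the issuer")
```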

Authorities will also have the option to deny DVS provider requests, DSIT says. But if public authorities can deny such data requests, Oliphant argues, rights granted by UK data privacy law, such as data portability, could be undermined.

Both issues could pit DVS providers against the government’s plan, Oliphant says, and conflict with the DUAA’s statutory aims.

In the meantime, all of the DUAA has now commenced, except a complaints procedure requirement that takes force midway through 2026 and some pending ICO governance provisions.

The Information Commissioner’s Office (ICO) has published updates to its guidance for businesses, particularly for “children’s higher protection matters” mandated under the DUAA.