Washington state confronts expanding surveillance system as Flock draws fire

In Washington state, a series of court rulings, public records disclosures, investigative reports, and municipal policy decisions have converged to reveal the scale of the Flock Safety camera network and the complex ways its data has been shared across agencies, inside and outside the state.
The developments have heightened concerns about surveillance, civil immigration enforcement, reproductive freedom, and the limits of local control over new policing technologies.
The clearest shift came from a Skagit County Superior Court ruling last week in which Judge Elizabeth Yost Neidzwski ordered police departments to release images captured by Flock Safety cameras under Washington’s Public Records Act.
The ruling held that the vehicle images, collected automatically as cars pass roadside camera fixtures, constitute public records regardless of whether they were ever used in specific investigations.
The case originated from a records request submitted by a private citizen who sought access to a half-hour of data from Flock cameras operated by police departments in Sedro-Woolley and Stanwood.
The cities sued to block the request, arguing that the images were exempt, and warning that releasing them might infringe individual privacy or undermine investigative tools.
The court rejected those claims and pointed to the scale of the surveillance, which captures the movements of thousands of drivers not suspected of wrongdoing.
The judge’s ruling dealt directly with the nature of modern automated surveillance.
Unlike red-light cameras or speed sensors, which activate only when a violation is detected, Flock cameras continuously record every car that passes, producing streams of timestamped images that can show not only license plates but also vehicle features and occupants.
The breadth of that record was central to the court’s conclusion that the data is public. The ruling immediately raised questions for dozens of other police agencies across Washington that now operate similar systems.
Sedro-Woolley and Stanwood had already deactivated their cameras while the dispute was pending, but the ruling raises the stakes of any decision to reactivate them.
Police departments around the state are now consulting with legal counsel to determine whether continuing to use Flock cameras means they must accept that their footage is subject to broad public disclosure.
Sedro-Woolley had installed its first Flock cameras earlier this year and emphasized their value in finding stolen vehicles, missing persons, and suspected offenders. City officials highlighted early successes, including a robbery arrest and the recovery of an Alzheimer’s patient who had gone missing.
The system was also presented as a cost-efficient enhancement to city policing, as Flock cameras require a fraction of the annual cost of hiring additional staff. But those benefits have been overshadowed by the legal and ethical questions surrounding data access and control.
For now, the physical camera hardware remains in place in Sedro-Woolley, but the system has been disabled and is not capturing images.
The concerns extend far beyond the legal question of whether Flock images are public records. In October, the University of Washington Center for Human Rights released research showing that local Flock systems across the state had been accessed by Border Patrol and other federal agencies involved in immigration enforcement in ways that may violate state law.
Washington’s Keep Washington Working Act, passed in 2019, restricts law enforcement agencies from using state or local resources to support civil immigration enforcement. However, Flock system records obtained by researchers show that Border Patrol ran thousands of searches on data from at least 31 Washington jurisdictions.
Some of those searches occurred in cities that had not knowingly granted access. Others appear to have resulted from how Flock’s network sharing features were configured.
The University of Washington researchers described three kinds of access. The first is direct sharing, where one police agency grants access to another through a one-to-one network connection.
The second is indirect or back door access, where agencies gain entry because another organization in the network has enabled broad sharing rules.
The third is “side door” access, where local police run searches on behalf of outside agencies, including federal immigration authorities. In some cities, audits showed instances where local officers conducted searches tagged with terms such as “ICE” or “immigration” despite state restrictions.
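The difference between direct and back-door access can be made concrete with a toy graph model of sharing grants. The agency names and data structure below are illustrative assumptions, not Flock's actual system or API:

```python
# Hypothetical sketch of a camera-network sharing graph. A one-hop path
# is direct sharing; a longer path is the "back door": access that the
# data owner never granted directly, created by an intermediary's
# broad-sharing rule.
from collections import deque

# Explicit sharing grants: each agency lists who it shares data with.
shares = {
    "CityA PD": ["CityB PD"],           # direct one-to-one grant
    "CityB PD": ["State Fusion Ctr"],   # re-shared onward
    "State Fusion Ctr": ["Federal Agency X"],
}

def access_path(owner, requester):
    """Breadth-first search from the data owner toward a requester,
    returning the chain of grants that connects them (or None)."""
    queue = deque([(owner, [owner])])
    seen = {owner}
    while queue:
        node, path = queue.popleft()
        if node == requester:
            return path
        for nxt in shares.get(node, []):
            if nxt not in seen:
                seen.add(nxt)
                queue.append((nxt, path + [nxt]))
    return None

# CityA never granted Federal Agency X anything, yet a path exists:
print(access_path("CityA PD", "Federal Agency X"))
```

In an audit framed this way, a one-hop path corresponds to direct sharing and any longer path to back-door access; side-door access would leave no trace in the sharing graph at all, because the search is run locally by an insider on an outsider's behalf.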
Auburn, which operates a network of Flock cameras, released a statement on October 20 acknowledging that federal immigration officers had gained access to its system through the Flock network’s national lookup function.
The department said it was not aware access had been granted and immediately disabled the feature once notified. Auburn now conducts monthly reviews of all external access to its Flock network and has pledged to permanently revoke access for any agency found to be using the system for immigration enforcement.
The city emphasized that it remains committed to lawful policing while respecting privacy rights and state law.
Other jurisdictions have taken similar steps. Renton reported discovering that federal immigration agents could search its Flock data through another state’s law enforcement agency. The department suspended all external sharing until the source of access could be determined.
In Lakewood and several other cities, police officials said they were unaware of how access had been enabled and moved to restrict or review sharing settings.
The rapid expansion of Flock’s business model has complicated local oversight. The company now operates tens of thousands of cameras nationwide and encourages agencies to share data regionally or across state lines.
Flock’s pitch to police departments has emphasized cost efficiency, ease of installation, and investigative gains.
But as the University of Washington Human Rights report notes, the network’s scale makes it difficult for agencies to determine who has access to their data.
The company allows local administrators to configure sharing, but many police departments either do not fully understand the implications of those settings or rely on default configurations that enable broader access than intended.
Flock’s network also incorporates private entities, including neighborhood associations, shopping malls, and big-box retail properties. Many of those private customers share their camera data with police.
In some cases, police agencies have access to private camera networks without disclosing those relationships publicly. The layered nature of these networks makes it possible for a search initiated anywhere in the country to draw on data from Washington without clear visibility into how that connection was formed.
The debate in Washington is unfolding at a time when reproductive rights and gender-affirming healthcare protections are being tested nationwide. Privacy advocates warn that travel pattern data derived from license plate surveillance could be used to target people seeking care across state lines.
Despite the public pressure and legal scrutiny, many police departments continue to defend the technology. Chiefs in Sedro-Woolley, Mount Vernon, and other cities describe Flock cameras as valuable public safety tools that do not use facial recognition and capture only vehicles, not biometric identifiers.
They frame the cameras as extensions of routine policing work, arguing that the data is only accessed during investigations and automatically deleted after 30 days. But those assurances assume users understand and control the system as configured.
The University of Washington report shows several agencies either never ran network audits or did not know how to access them, leaving oversight gaps.
The statewide conversation is now shifting toward legislative action. Other states, including Virginia and Illinois, have enacted limits on data retention and cross-jurisdiction sharing. Washington currently has no equivalent statutory framework governing local use of license plate surveillance systems.
Governor Bob Ferguson recently issued an executive order emphasizing the importance of protecting private data held by state agencies and reaffirming Washington’s commitments to immigrant rights. But the order does not directly apply to municipal police departments.
State legislators are signaling interest in addressing the issue in the upcoming session.
The fight over Ring’s new facial recognition feature

The growing pushback against the use of facial recognition in consumer surveillance devices has sharpened considerably with Amazon Ring’s plan to introduce a new feature called Familiar Faces.
The tool, which is expected to roll out this winter, would allow Ring camera owners to tag and identify specific people who come into view of their cameras.
While Amazon says the feature is intended to help residents quickly recognize friends, family members, and frequent visitors, privacy advocates argue it represents a significant expansion of biometric surveillance into neighborhoods, sidewalks, and front doors.
They warn that the feature risks violating state biometric privacy laws and could expose Amazon to lawsuits like those that forced Facebook and Google to abandon or pay out large settlements for comparable systems.
The Electronic Frontier Foundation (EFF) has raised concerns that the feature’s implementation will require Amazon to scan the faces of every person who appears in front of a Ring camera, not only those who are tagged as “familiar.”
This would sweep in visitors, neighbors, postal carriers, utility workers, canvassers, children selling fundraising items, and people who are simply walking by.
EFF noted that in many states, including those with biometric privacy laws, companies must obtain informed, affirmative consent before collecting or processing biometric identifiers like faceprints.
The company has already said that Familiar Faces will not be available in Illinois or Texas, the two states with the strongest biometric privacy laws and the two jurisdictions where courts have already ruled that large technology companies can be held liable for scanning the faces of individuals without their explicit permission.
Amazon also confirmed that it will not launch the feature in Portland, Oregon, which has enacted restrictions on private sector facial recognition deployments. Amazon maintains that the feature will be off by default when it becomes available, and that it may store untagged facial data for up to six months.
Amazon has told reporters that it is not currently using this biometric information to train algorithms, though it did not commit to avoiding such use in the future.
Amazon Ring’s rollout comes at a moment when scrutiny of the company’s surveillance partnerships is intensifying. Ring spent years cultivating close relationships with local police departments, offering portals through which officers could request footage directly from residents.
By 2022, more than 2,000 law enforcement agencies across the country had partnered with Ring. After sustained criticism, Ring said last year that police would no longer be able to request footage from residents through the app.
However, the company continues to comply with warrants and emergency requests and has in the past provided footage to law enforcement without user consent under emergency disclosure exceptions.
Privacy researchers and civil liberties groups caution that adding a facial recognition layer to an already widespread neighborhood surveillance network could enable the rapid identification of large numbers of people and facilitate location tracking without judicial oversight.
Senator Edward Markey of Massachusetts has taken the lead in pressing Amazon to halt the feature. In a letter sent this fall, Markey argued that the combination of networked home surveillance cameras, facial recognition, and police collaboration could enable pervasive monitoring that has historically been carried out only by state intelligence agencies.
Markey previously led a group of Senate Democrats urging Amazon to avoid facial recognition integration in consumer products and to provide transparency about how Ring data is shared with government agencies.
He cited ongoing concerns about algorithmic error rates, especially for darker-skinned people and women, and referenced the documented history of misidentification in law enforcement face recognition systems that has led to wrongful arrests.
Privacy advocates warn that the infrastructure required for Familiar Faces could allow Amazon to provide similar search functions through law-enforcement requests, even if the company does not presently offer such capabilities.
Amazon has acknowledged that it cannot currently generate lists of all Ring cameras where a person has appeared but did not rule out the possibility of developing such functionality.
Privacy researchers argue that the capacity to search visually across a network for a dog could be readily adapted to search for a person. They note that there is little technical distinction between the two applications aside from policy restrictions that could be changed in future updates.
Amazon insists that it has no plans to develop a system that would allow police to perform bulk facial searches. But civil liberties organizations caution that once facial recognition datasets exist, the legal and practical pressures to use them for broader surveillance can escalate quickly.
The risks extend beyond misuse. Storing biometric identifiers carries heightened consequences in the event of a data breach. Unlike passwords or credit card numbers, biometric traits cannot be reset. If a database containing faceprints is compromised, the harm is permanent.
Amazon says that biometric data collected by Ring devices is stored on Amazon servers under strong encryption and security policies. However, Ring has faced criticism over access controls in the past.
Privacy groups and some state lawmakers view the rollout of Familiar Faces as a test of whether state regulators are prepared to enforce the biometric protections on the books. If state agencies decline to act, the practical effect of the laws could be minimal.
The absence of federal privacy legislation has left regulation to a patchwork of state laws and city ordinances. While a bipartisan group of lawmakers has introduced bills in recent sessions to limit facial recognition in consumer, commercial, or law-enforcement contexts, none have yet passed.
Whether Amazon chooses to alter or postpone the rollout of Familiar Faces may depend on the level of public and regulatory scrutiny as launch approaches. The company has assured the public that the feature is optional, can be turned off, and will comply with local law.
Voice phishing scams get a touch of AI
Source: FT.com. AI-powered voice phishing (vishing) scams have arrived, elevating the sophistication of these frauds now that AI systems have, over the past year, mastered holding a real-life conversation. The Financial Times writer, an AI researcher at Bramble Intelligence, wrote that AI can clone human voices and impersonate victims to create highly manipulative scams – also […]
The post Voice phishing scams in touch of AI first appeared on Identity Week.
UN’s first cybercrime convention sparks concern over data requests

The UN’s Convention against Cybercrime (UNCC), which strengthens international cooperation in combating online crime, was officially signed by 72 countries last month. The legal framework, however, has raised concerns about legally sensitive data requests among tech companies, including those in the identity verification industry.
The UN document is the first global convention focused on preventing cybercrime, including offences ranging from online fraud and financial crimes to drug trafficking and child sexual abuse. It is the first international treaty to recognize the non-consensual dissemination of intimate images as an offence.
More importantly, the convention introduces international standards for the sharing and use of electronic evidence for all “serious offences.”
“Cybercrime is changing the face of organized crime as we know it, and the new UN Cybercrime Convention provides Member States with a vital tool to fight back together,” Ghada Waly, executive director of the UN Office on Drugs and Crime (UNODC), said at the treaty’s signing ceremony in Vietnam on October 25th.
Technology companies, academics and civil society organizations, however, say that the convention is flawed.
Following the signing ceremony in October, identity verification provider Jumio highlighted that the treaty could bring new challenges around compliance for organizations.
“Over-prescriptive compliance requirements may thrust organizations into an unfamiliar position of responsibility for sensitive personal data,” says Joe Kaufmann, Jumio’s global head of privacy and data protection officer.
Other stakeholders are warning of even more dire consequences.
Last year, industry group Cybersecurity Tech Accord, which includes tech companies such as NEC, Microsoft, Meta, Oracle and more, warned that the Convention could result in individuals’ private information being shared with governments worldwide, without “legal challenges to problematic requests and without any transparency or accountability mechanisms.”
The text also fails to protect cybersecurity researchers and penetration testers, says the organization.
A coalition of rights groups, including Access Now, Electronic Frontier Foundation and Human Rights Watch, also highlights that the convention could be used by rogue states to go after government critics, whistleblowers and journalists by designating their activity as a “serious offense.”
“It obligates states to establish broad electronic surveillance powers to investigate and cooperate on a wide range of crimes, including those that don’t involve information and communication systems,” the coalition wrote in a statement earlier this year. “It does so without adequate human rights safeguards.”
During the signing ceremony, UNODC’s Waly highlighted that the treaty was shaped over 420 hours of formal negotiations spread out over five years.
Proposed by Russia in 2017 and approved unanimously by the UN General Assembly in 2024, the framework included input from more than 150 UN member states. Among the 72 parties that signed the treaty in Hanoi in October were the European Union, the UK, China, Russia and Brazil.
Other countries, such as India and the U.S., have abstained, with the U.S. still reviewing the document, according to Recorded Future.
The future of the UN Convention against Cybercrime will depend on the number of countries that decide to adopt it. The treaty still must be ratified by each state according to its own procedures. It will enter into force 90 days after being ratified by the 40th signatory.
Sophisticated malware found posing as Indonesia’s digital ID app

Cybersecurity researchers have discovered a malware app designed to steal financial data, which disguises itself as Indonesia’s national digital identity platform, Identitas Kependudukan Digital (IKD).
The malware app, named Android/BankBot-YNRK, was found circulating online outside of the official Google Play app store, posing as an APK file of the digital ID platform. Once a user installs it, the app begins exploiting Android permissions to gain access to sensitive data, targeting banking and cryptocurrency apps.
According to an investigation from cybersecurity firm Cyfirma, the Trojan operates stealthily by leveraging its permissions to observe what appears on screens, simulate button presses and automatically complete forms as if acting on the user’s behalf. It also transmits device details, location data and a list of installed applications back to the attackers.
“Overall, Android/BankBot-YNRK exhibits a comprehensive feature set aimed at maintaining long-term access, stealing financial data and executing fraudulent transactions on compromised Android devices,” says Cyfirma.
The harmful application takes advantage of Android’s overlay capability to present counterfeit login pages over genuine banking and wallet applications. Once users input their login information, it gets sent straight to cybercriminals.
To cover their tracks, the attackers can send real-time instructions to the smartphone, such as commands to evade antivirus tools or erase data. The Trojan also suppresses notifications and sounds to avoid alerting the user.
Currently, it is unclear how many users installed the illegitimate app.
Identitas Kependudukan Digital (IKD), or Digital Population Identity, was developed by the Directorate General of Population and Civil Registration (Dukcapil) and launched in 2023. As of December 2024, 18 million people had signed up for IKD, and the Indonesian government has been working to boost the number of digital ID users.