The use of facial recognition technology by police is the subject of the upcoming public NIST meeting


The National Artificial Intelligence Advisory Committee of the U.S. National Institute of Standards and Technology (NIST) will hold a virtual public meeting to receive a briefing from its Law Enforcement Subcommittee (NAIAC LE) on the pros and cons of using AI in law enforcement, specifically related to facial recognition technology.

According to NIST, the NAIAC LE briefing will address AI as it relates to the needs of law enforcement and other agencies to identify individuals for a variety of reasons. NIST said “video and photographic evidence from surveillance footage, bystanders, social media, and other sources can provide crucial evidence about potential suspects, victims, witnesses, or community members in distress,” and that facial recognition technologies (FRTs) “can enable law enforcement officials to identify these individuals more frequently, more quickly, and more accurately.”

But, according to NIST, “therein lies both the potential and the risk of facial recognition technology. Unrestricted use of facial recognition systems could bring more people into the unwilling or unwitting focus of law enforcement investigations, and the system could be abused in ways that violate constitutional rights or community norms.”

In preparation for the September 4 public briefing and meeting, NAIAC invites the public to comment on the Law Enforcement Subcommittee’s July 2024 discussion draft on a framework for the responsible use of facial recognition technology in law enforcement. NIST said the discussion draft should serve as a basis for future recommendations from NAIAC LE to the full NAIAC committee regarding the limited and responsible use of FRTs.

The draft NIST document states: “Although some communities and civil rights organizations oppose the use of facial recognition technology by police in every respect,” public opinion overall is mixed: 46 percent believe that “large-scale use of facial recognition technology by police would be a good idea,” while 27 percent think it would be a “bad idea.”

NIST has put forward a framework that “provides the structure for legal requirements and best practices to guide the responsible use of FRTs.” These are the four fundamental insights that NIST says form the background for the framework:

  • When used appropriately, FRTs can improve the quality of law enforcement efforts, both in criminal investigations and in officers’ role as community advocates;
  • The unhindered use of FRT poses a serious threat to civil rights and civil liberties, including but not limited to concerns about accuracy and bias, risks to free expression, and invasions of privacy;
  • Current legislation does not adequately guide or restrict law enforcement use of FRT to ensure that agencies realize the technology’s benefits while protecting against its risks; and
  • If police agencies wish to continue using FRT, or to use it for the first time, they should adopt carefully considered safeguards.

NIST said the framework detailed in the draft document provides preliminary recommendations, as well as alternative recommendations on points where its members disagreed.

“Due to the unprecedented nature of FRT and the fact that reasonable and knowledgeable people have differing opinions about how to plan with uncertainty and how to manage conflicts between competing values, we have not and could not reach consensus on every single important issue related to FRTs,” NIST explained. “For this reason, we identified which issues have caused significant divisions among our members so that NAIAC can have an informed discussion about the competing interests.”

NIST has identified a number of potential FRT applications, which it says are primarily for “surveillance purposes” and “for which we do not yet have a framework or set of preliminary recommendations.”

The discussion draft states: “LEAs should not purchase FRT software from a vendor, use the results of a vendor’s FRT software, or create their own FRT systems unless the vendor or manufacturer:

  • Can demonstrate high accuracy over the demographic groups present in real-world use;
  • Discloses sufficient information about its FRT systems to enable an independent, professional assessment of the performance of its FRT systems for intended law enforcement use cases;
  • Provides instructions and documentation on image quality and other relevant technical specifications required to maintain low error rates for systems sold to law enforcement agencies across all demographic groups;
  • Provides LEA users with ongoing training, technical support and software updates required to ensure their FRT systems can maintain high accuracy across demographic groups in real-world operational contexts;
  • Develops its FRT technology to facilitate verification of who used the technology and for what purpose; and
  • Can demonstrate compliance with data security best practices.”
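In biometric evaluations, “high accuracy across demographic groups” is typically checked by disaggregating error rates, such as the false match rate (FMR) and false non-match rate (FNMR), per group rather than reporting a single aggregate figure. The following is a minimal illustrative sketch of that disaggregation; the function name, data format, and sample trials are hypothetical and the data is synthetic, not drawn from any real evaluation.

```python
from collections import defaultdict

def per_group_error_rates(trials):
    """Compute false match rate (FMR) and false non-match rate (FNMR)
    per demographic group from labeled comparison trials.

    Each trial is a dict: {"group": str, "same_person": bool, "matched": bool}.
    FMR  = fraction of impostor comparisons the system wrongly matched.
    FNMR = fraction of genuine comparisons the system wrongly rejected.
    """
    counts = defaultdict(lambda: {"impostor": 0, "fm": 0, "genuine": 0, "fnm": 0})
    for t in trials:
        c = counts[t["group"]]
        if t["same_person"]:
            c["genuine"] += 1
            if not t["matched"]:
                c["fnm"] += 1  # genuine pair rejected: false non-match
        else:
            c["impostor"] += 1
            if t["matched"]:
                c["fm"] += 1  # impostor pair accepted: false match
    return {
        g: {
            "FMR": c["fm"] / c["impostor"] if c["impostor"] else None,
            "FNMR": c["fnm"] / c["genuine"] if c["genuine"] else None,
        }
        for g, c in counts.items()
    }

# Synthetic trials for illustration only:
trials = [
    {"group": "A", "same_person": True, "matched": True},
    {"group": "A", "same_person": False, "matched": False},
    {"group": "B", "same_person": True, "matched": False},
    {"group": "B", "same_person": False, "matched": True},
]
rates = per_group_error_rates(trials)
```

A vendor demonstrating the accuracy requirement above would publish tables of this kind, with large, representative trial counts per group, rather than a single overall accuracy number.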

The NIST draft discussion framework further states that law enforcement agencies should “maintain and publish a comprehensive policy on acceptable use of FRT” that should, at a minimum, specify:

  • Permitted or prohibited FRT uses;
  • Protocols and procedures to ensure consistent and lawful use;
  • Authorized users of FRT;
  • Rules for data collection and retention; and
  • Restrictions on data access, analysis or sharing.

NIST further stated that the use of FRTs “for criminal investigations includes the typical case in which a law enforcement agency applies FRT to identify a suspect from an image taken at the crime scene,” whereas, “in contrast,” the use is not part of a criminal investigation when FRT is used to identify an incapacitated person or to restrict access to a high-security building or area.

However, “there are no clear boundaries that could neatly separate criminal from non-criminal use,” NIST explained, pointing out that “the most difficult examples involve the use of FRT to identify victims or witnesses to a crime who may be unwilling to participate in a criminal investigation or prosecution and who could become defendants in other criminal investigations.”

In this situation, NIST recommends that law enforcement agencies “evaluate the use of FRT across the spectrum from law enforcement to non-law enforcement operations using three broad categories: suspects, victims or witnesses, and non-law enforcement operations.”

This open meeting will be held via web conference on Wednesday, September 4, 2024, from 2:00 p.m. to 5:00 p.m. Eastern Time.

Article topics

biometric identification | criminal ID | facial recognition | NIST | police | US government | United States
