
The FBI’s facial recognition software has been in operation since December 2011, but today members of Congress and the Government Accountability Office (GAO) asserted that it still lacks sufficient privacy and accuracy assessments. The system began with a pilot program and became fully operational in September 2015, according to the GAO.

Those concerns were amplified in a GAO report released today, which says the FBI has yet to comply with five of six GAO recommendations, including assessing the accuracy of the databases of driver's licenses, mugshots, and other photos currently supplied by about 20 states. The GAO first called for fixes and improvements to the FBI's Next Generation Identification Interstate Photo System (NGI-IPS) in May 2016. A group of U.S. senators also sent a letter to the FBI last year calling for regular audits.

The one recommendation the FBI has fulfilled — audits to oversee the use of its facial recognition service by state and local law enforcement — was completed last week.

“The information that the FBI is using — that information needs to be accurate, especially if they’re using it for the criminal investigations,” GAO Homeland Security and Justice division director Dr. Gretta Goodwin told the committee. “This technology is not going away, and it’s only going to grow.”

Through a combination of driver's license, passport, and mugshot image repositories, the FBI currently has access to 641 million photos of U.S. citizens, the majority of whom have never been arrested or accused of a crime. This led Congressman Jody Hice (R-GA) to go so far as to call it a “pre-crime database” like the one in the Steven Spielberg movie Minority Report.

“They still haven’t fixed the five things they were supposed to do when they first started, but we’re supposed to believe ‘Don’t worry, everything’s just fine,'” said Rep. Jim Jordan (R-OH).

Committee chair Rep. Elijah Cummings (D-MD) said the FBI and GAO will be told to return in about two months for another hearing to ensure compliance, and that without Congressional oversight, the problem could get worse in the years ahead. “I’m worried this is going to go on and on, and in the meantime, I’m sure we’ll be able to come up with some bipartisan solutions, but American citizens I think are being placed in jeopardy as a result of a system that is not ready for prime time,” Cummings said.

In a House Oversight and Reform Committee hearing held last month with AI ethics experts and researchers, a bipartisan consensus emerged among elected officials on the need for a national moratorium on facial recognition use by law enforcement.

That consensus emerged amid testimony by experts whose research identified a lack of regulation; the alleged misuse of facial recognition by police; and misidentification and poorer accuracy when identifying women, people under 30, and people of color, compared with white men. Facial recognition systems have also had trouble recognizing gender non-conforming and transgender individuals.

FBI facial recognition searches have not received negative feedback, but the agency has not tracked how many searches led to arrests or convictions, Kimberly Del Greco, deputy assistant director of the FBI’s Criminal Justice Information Services division, told Congress. The FBI was joined in today’s hearing by Austin Gould, assistant administrator of requirements and capabilities analysis for the Transportation Security Administration (TSA). The TSA is currently testing facial recognition on international travelers in Terminal F of Atlanta International Airport in Georgia.

Between the start of fiscal year 2017 and April 2019, the FBI’s NGI-IPS was used to carry out 152,500 facial recognition searches by the FBI and by state and local law enforcement, Del Greco said. After a photo is scanned, law enforcement is given a list of 2 to 50 possible suspects.

The FBI is not currently using automated facial recognition, which can be used to track individuals in real time in video footage, Del Greco said. However, police in Detroit and Chicago — cities in states that share photos with the FBI — are currently testing real-time tracking, according to analysis by the Georgetown Law Center on Privacy and Technology.

Gould also spoke today about the TSA’s facial recognition pilot program that’s testing the technology for airport flight check-ins and bag drops.

Multiple members of Congress criticized the program’s requirement that travelers opt out rather than opt in, arguing that many people may be scanned by the TSA without their knowledge. A December 2018 Washington Post article found that about 2% of passengers in the pilot program opt out.

“I gave no one my permission to take my picture when dropping off my bag, and I’m an American citizen,” Rep. Mark Meadows (R-NC) said, remarking on his recent travel through Atlanta International Airport. “I would recommend that you stop until you find out your statutory authority,” he said to Gould.

Consent to use a person’s photo is a key element of legislation like the Commercial Facial Recognition Privacy Act proposed earlier this year, as well as others being considered by lawmakers.

The FBI’s system currently achieves 86% accuracy overall, Del Greco said. But that figure reflects only how often the correct person appears somewhere in the list of 50 potential suspects the system returns, not how often the top match is correct.
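Del Greco’s caveat describes what biometrics evaluations commonly call rank-k identification accuracy: a search counts as a hit if the true identity appears anywhere in the top k returned candidates, here k = 50. A minimal illustrative sketch of that metric (the function and data below are hypothetical, not the FBI’s actual evaluation code):

```python
# Illustrative sketch of rank-k identification accuracy: a search is a
# "hit" if the true identity appears anywhere in the top k candidates.

def rank_k_accuracy(searches, k=50):
    """searches: list of (true_id, ranked_candidate_ids) pairs."""
    hits = sum(1 for true_id, candidates in searches
               if true_id in candidates[:k])
    return hits / len(searches)

# Hypothetical example: 3 of 4 probe photos have the right person
# somewhere in the top 50 candidates.
example = [
    ("A", ["A"] + [f"x{i}" for i in range(49)]),  # hit at rank 1
    ("B", [f"x{i}" for i in range(30)] + ["B"]),  # hit at rank 31
    ("C", [f"x{i}" for i in range(50)]),          # miss
    ("D", [f"x{i}" for i in range(10)] + ["D"]),  # hit at rank 11
]
print(rank_k_accuracy(example))  # 0.75
```

Note that the same searches score much lower at stricter ranks: with k=10 only one of the four hypothetical probes above counts as a hit, which is why a rank-50 figure alone says little about top-match accuracy.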

Findings from a test by the Department of Commerce’s National Institute of Standards and Technology (NIST) are currently being applied to improve the FBI facial recognition system’s accuracy, Del Greco said. The first statistical analysis of performance in NIST’s facial recognition vendor test broken down by race, gender, and other demographic groups is due out this fall.

Recommendations for federal AI standards, mandated by a Trump executive order, are due out this summer. The deadline to submit public comments on those standards to NIST is June 10.
