
At Facial Recognition Databases Hearing, Congress Attacks FBI

Democrats and Republicans alike assailed the FBI on Wednesday for its use of facial recognition software to identify potential suspects.
IMAGE: Facial recognition template
An androgynous base image and an experimental analysis of the same face. National Science Foundation

Democrats and Republicans alike hammered the FBI on Wednesday for its use of facial recognition software to identify potential suspects, saying the technology fosters racial bias, leads to arrests of innocent people and trashes Americans' privacy.

More than 400 million pictures of Americans' faces are archived in local, state and federal law enforcement facial recognition networks, the federal Government Accountability Office reported last year. Those pictures include the faces of about half of all U.S. adults, experts estimate.

"I have zero confidence in the FBI and the [Justice Department], frankly, to keep this in check," Rep. Stephen Lynch, D-Massachusetts, said at a hearing of the House Committee on Oversight and Government Regulation.

"This is really Nazi Germany here, what we're talking about," Lynch said. "And I see little difference in the way people are being tracked under this, just getting one wide net and getting information on all American citizens."

At the very least, he said, warrants for face searches should be required "if we're going to build these databases."

Rep. John Duncan, R-Tennessee, said: "I think we're reaching a very sad point, a very dangerous point, when we're doing away with the reasonable expectation of privacy about anything."

The committee's ranking Democrat, Elijah Cummings of Maryland, noted research indicating that facial recognition systems are less accurate in distinguishing identities among people with dark skin, women and younger people.

"If you're black, you’re more likely to be subjected to this technology,” said Cummings, who is African-American. "And the technology is more likely to be wrong. That's a hell of a combination, especially when you're talking about subjecting someone to the criminal justice system."

Kimberly Del Greco, deputy assistant director of the FBI's Criminal Justice Information Services Division, stressed under questioning that "the only information the FBI has and has collected in our database are criminal mugshot photos."

The FBI's own collection doesn't include databases compiled by state and local law enforcement agencies — culled from police mugshots, driver's licenses, passports, visas, security video and other sources.

The FBI has reciprocal agreements with 18 states giving it access to such local databases, and the agency has made clear that it wants access to the rest. It has also sought an exemption from federal privacy laws that give Americans the right to check the accuracy of information the government has compiled about them.

"Like many technologies, used in the wrong hands or without appropriate parameters, it is ripe for abuse," the committee's chairman, Jason Chaffetz, R-Utah. .

"It would be one thing if facial recognition technology were perfect or near perfect, but it clearly is not," Chaffetz said. "Facial recognition technology does make mistakes."

Internal FBI documents obtained in a Freedom of Information Act lawsuit by the nonprofit Electronic Privacy Information Center indicate that the FBI's own database, called the Next Generation Identification Interstate Photo System, or NGI-IPS, had an acceptable margin of error of 20 percent — that is, a 1-in-5 chance of "recognizing" the wrong person.

And research published in the October 2015 issue of the scientific journal PLOS ONE by researchers at the universities of Sydney and New South Wales in Australia found that the humans who interpret such data build in an extra error margin approaching 30 percent.

Alvaro Bedoya, executive director of the Center on Privacy and Technology at Georgetown University Law Center, told lawmakers at the hearing Wednesday: "We need to take a step back and ask: If this technology had been in place for the Boston Tea Party or the civil rights protests, what would have happened?"