Does Police Face Recognition Technology Racially Profile Blacks?

An unregulated expansion of facial recognition technology, used freely by law enforcement, has led to racial profiling, according to a new report.

The faces of half of all American adults, some 117 million people, are stored in at least one of several law enforcement facial-recognition databases, according to a study by the Center for Privacy & Technology at Georgetown University’s law school. And because African-Americans are disproportionately represented in facial databases, which contain mug shots and other police photos, blacks are more likely than others to have their images in the system, the report found.

Image: A New York Police Department security camera is seen in Times Square
A New York Police Department security camera is seen in Times Square on May 6, 2010, in New York City. Mario Tama / Getty Images

"Instead of correcting dysfunctional and disparate law enforcement practices, technologies like facial recognition can supercharge bias and exacerbate the most profound flaws in our justice system," said Sakira Cook, a lawyer with The Leadership Conference on Civil and Human Rights, which along with 50 civil rights organizations is asking the Justice Department’s civil rights division to investigate the impact of using such technology on minority communities.

Neither the DOJ nor the FBI immediately responded to requests for comment.

The technology works by collecting facial images, via driver’s license photos, state IDs, police bookings, and real-time scans from cameras in public locations, and plugging them into a searchable network. Face matches, or even potential matches to possible suspects, are then used as an investigative lead by many law enforcement agencies, including the FBI, said Alvaro Bedoya, coauthor of the report and executive director of the Center for Privacy & Technology at Georgetown University’s law school.
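The matching step described above typically works by reducing each face image to a numeric vector (an "embedding") and declaring a match when two vectors are close enough. The sketch below is a simplified illustration, not any vendor's actual system: the embeddings and the threshold are made up for demonstration, while real systems use learned embeddings with thousands of dimensions and carefully tuned thresholds.

```python
import math

def cosine_similarity(a, b):
    # Similarity of two face embeddings: near 1.0 means "same direction"
    # (likely the same face), near 0.0 means unrelated.
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Hypothetical embeddings: a probe image from a street camera,
# compared against two photos already in a database.
probe = [0.9, 0.1, 0.4]
mugshot_a = [0.88, 0.12, 0.41]  # same person, slightly different pose
mugshot_b = [0.1, 0.9, 0.2]     # different person

THRESHOLD = 0.95  # hypothetical; real systems tune this against error rates

for name, db_face in [("mugshot_a", mugshot_a), ("mugshot_b", mugshot_b)]:
    score = cosine_similarity(probe, db_face)
    print(name, round(score, 3), "match" if score >= THRESHOLD else "no match")
```

The key design point, and the source of the errors discussed later in the article, is that the threshold is a trade-off: set it too loose and innocent look-alikes surface as "matches"; set it too tight and real matches are missed.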

“Face recognition technology lets police recognize you from far away and in secret,” he said. “Before, they had to stop and ask you questions. That’s not the case anymore.”

Image: A security camera in Philadelphia
A security camera in Philadelphia on Jan. 8, 2016. Mark Makela / Getty Images, file

Even law-abiding citizens should be concerned, he said. Facial recognition systems can put anyone whose image is in a database under suspicion: an individual can be investigated for a crime he didn’t commit simply because his face resembles a suspect’s.

And while the technology doesn’t see race, it does see numbers, and those numbers skew disproportionately toward African-Americans.

“African Americans are disproportionately likely to come into contact with—and be arrested by—law enforcement,” the report said. For example, in 2014, “African Americans represented 5.4 percent of Minnesota’s population but 24.5 percent of those arrested.” In Michigan, African-Americans are arrested at a rate “136 percent higher than their state population share.”
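The scale of that disparity can be read directly from the report's figures: a group that is 5.4 percent of a state's population but 24.5 percent of its arrests is overrepresented in arrest-fed databases by a factor of roughly 4.5. A quick check of the arithmetic, using only the numbers quoted above:

```python
# Minnesota figures quoted in the report (2014).
population_share = 0.054  # share of state population
arrest_share = 0.245      # share of arrests

# How many times more often the group appears among arrests
# than its population share alone would predict.
overrepresentation = arrest_share / population_share
print(round(overrepresentation, 1))  # → 4.5

# Michigan: an arrest rate "136 percent higher" than population share
# means roughly 2.36 times the expected rate.
michigan_factor = 1 + 1.36
print(michigan_factor)  # → 2.36
```

Because mug-shot databases are built from arrests, these multipliers carry over directly: the same disparity in arrests becomes a disparity in who can be searched.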

Another potentially dangerous problem is that the technology makes mistakes — especially when it comes to African-Americans.

According to a 2012 study coauthored by the FBI, face recognition technology “may be least accurate for African Americans, women, and young people,” with accuracy rates as much as 10 percent lower.
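Why a modest accuracy gap matters at database scale: even a tiny per-comparison error rate, multiplied across millions of enrolled faces, yields many wrong candidates per search, and a 10 percent worse rate yields proportionally more. The rates below are hypothetical, chosen only to illustrate the multiplication; the 117 million figure is from the Georgetown report.

```python
database_size = 117_000_000  # faces enrolled, per the Georgetown report

# Hypothetical per-comparison false-match rates (illustrative only):
# one wrong match per million comparisons, versus a rate 10 percent worse.
base_false_match_rate = 1e-6
worse_rate = base_false_match_rate * 1.10

# Expected number of wrong candidates surfaced by a single search.
expected_base = database_size * base_false_match_rate
expected_worse = database_size * worse_rate

print(round(expected_base, 1))   # wrong candidates per search at the base rate
print(round(expected_worse, 1))  # wrong candidates at the 10 percent worse rate
```

Under these toy numbers, a single search would surface 117 false candidates at the base rate and about 129 at the worse rate, so a demographic group subject to the higher error rate absorbs correspondingly more mistaken investigative leads.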

The technology was used during the Freddie Gray protests in Baltimore to identify and arrest protesters who had outstanding warrants, said Neema Singh-Guliani from the American Civil Liberties Union at a press conference.

Image: Judge Declares Mistrial In First Freddie Gray Trial
Police stand guard as protesters march through the streets hours after a mistrial was declared in the trial of Baltimore police Officer William G. Porter on Dec. 16, 2015. Mark Wilson / Getty Images

“The technology will affect how individuals interact in public spaces, which is a constitutional right,” she said. “They may be afraid to join a protest or speak out for fear of being caught by cameras.”

The technology is completely unregulated, she said. “Police are free to identify and track anyone even without any evidence that they’ve done anything wrong.”

In several jurisdictions, face searches don’t require even a baseline level of reasonable suspicion as they would with fingerprints or DNA testing. The report found that only 25 percent of agencies using the technology had any legal standard attached to the search.

Many police agencies, including those in Arizona, Florida, Ohio, and Virginia, can scan photos for a wide variety of reasons, according to the report.

The report’s authors also stress the importance of voters being able to choose how their driver’s license photos are used, Bedoya said.

“This technology is not under control,” he said. “117 million American faces are used every day without warrants, without oversight, without audit, without accuracy, and without transparency.”