Earlier this month, a broad coalition of civil rights, civil liberties, privacy and immigrant-rights groups met with representatives from the FBI and Justice Department to demand more transparency around the use of an increasingly popular law enforcement tool: facial recognition technology.
The meeting came in response to a recent report from the Georgetown Law Center on Privacy & Technology that found that law enforcement agencies across the country are adding this technology to their arsenal of investigatory tools. The report found that the practice affects over 117 million people, yet agencies across the board have failed to put safeguards in place to protect our privacy.
Worse yet, while the technology potentially threatens the rights of everyone in America, the report uncovered damning racial biases within the systems.
Facial recognition technology works by running a photo through an algorithm that attempts to extract distinct characteristics — like cheekbone or eye position — and compare them against a connected database. It then returns either the most similar-looking faces or all photos above a certain similarity threshold.
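As a rough sketch of that matching step (all names, vectors and thresholds below are hypothetical, and real systems use far richer feature representations than this):

```python
import math

# Hypothetical sketch of the matching step described above: each enrolled
# photo has been reduced to a numeric feature vector, and a probe photo
# is compared against every entry in the database.

def similarity(a, b):
    """Cosine similarity between two feature vectors (1.0 = identical direction)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

def search(probe, database, threshold=None, top_k=3):
    """Return entries above `threshold` if one is given, else the `top_k` closest."""
    scored = sorted(
        ((similarity(probe, vec), name) for name, vec in database.items()),
        reverse=True,
    )
    if threshold is not None:
        return [(score, name) for score, name in scored if score >= threshold]
    return scored[:top_k]

# Invented enrollment database and probe photo.
database = {
    "subject_a": [0.9, 0.1, 0.4],
    "subject_b": [0.2, 0.8, 0.5],
    "subject_c": [0.5, 0.5, 0.5],
}
probe = [0.87, 0.12, 0.41]

print(search(probe, database, top_k=2))        # the two closest faces, however weak
print(search(probe, database, threshold=0.99))  # only very strong matches
```

Note the difference between the two modes: the threshold mode can come back empty, while the top-k mode always returns the closest faces it can find, which is the behavior the Georgetown report flags as a risk to innocent people.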
The system is a network of various state and federal databases that are generally built from driver's licenses and mug shots. These databases sweep up millions of innocent Americans without their knowledge and put them into what Georgetown calls the “perpetual lineup.” It’s the first time in our nation’s history that the FBI has maintained a biometric database made up primarily of innocent people.
Law enforcement might run a search on a person during a routine stop. Officers might run a photo taken from a surveillance camera at the scene of an alleged crime. Particularly worrisome is the fact that some agencies have expressed interest in running searches in real time on public spaces.
This kind of persistent surveillance has serious implications for our ability to simply move through life with anonymity. Even more concerning, it may impact people’s ability to exercise their First Amendment rights without being identified and targeted by the government.
While jurisdictional controls exist, in theory, to prevent each agency from acting outside its territory, there’s extensive sharing among the many systems. For example, more than 5,300 officials from 242 different federal, state and local agencies can access Pinellas County, Florida’s database, which has been in place since 2001.
The FBI hosts one of the nation’s largest facial recognition databases, built mostly from mug shots submitted by local, state and federal law enforcement agencies. In total, the database contains nearly 25 million photos.
Racial Bias
While law enforcement has argued that the new technology is colorblind, the Georgetown report points to several studies that have found racial bias in these systems. The most prominent of these — co-authored by FBI expert Richard W. Vorder Bruegge — found that several of the leading algorithms were 5–10 percent less accurate when used to identify Black people.
One of the algorithms failed to identify white people accurately 11 percent of the time, but that failure rate jumped to 19 percent when the subject was Black. While a false negative could result in police missing an important lead, a false positive could implicate innocent people. As the Georgetown report points out, many systems are programmed to deliver what the algorithm thinks are the closest matches for a particular face even if there’s no degree of certainty about the identity, potentially leading to investigations of innocent people.
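To put those failure rates in concrete terms, here is a back-of-the-envelope calculation. The 11 percent and 19 percent figures are from the study cited above; the yearly search volume is an invented round number used only for illustration:

```python
# Scaling the per-search failure rates quoted above. The 0.11 and 0.19
# rates come from the cited study; the search volume is hypothetical.
searches_per_year = 100_000

expected_failures = {
    "white subjects": searches_per_year * 0.11,
    "Black subjects": searches_per_year * 0.19,
}

for group, failures in expected_failures.items():
    print(f"{group}: ~{failures:,.0f} misidentifications "
          f"per {searches_per_year:,} searches")
```

At that volume, the eight-point gap between the two rates translates into thousands of additional misidentifications of Black subjects every year.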
The report suggests that one source for such dangerous and deeply ingrained bias is that the technology best recognizes the kinds of people who create it. It points to a study from the National Institute of Standards and Technology that tested for facial recognition accuracy on East Asian and white people. Researchers found that algorithms developed in East Asia performed better on East Asians while those created in Western Europe performed better on white people.
Drawing on their own knowledge base and the nuances they themselves can see, the people who write the code tend to do better at recognizing people in their own racial and ethnic demographic groups.
But the tech industry is notoriously deficient when it comes to hiring people of color. A recent study found that while Hispanic people represented 8 percent of those graduating with computer-science degrees and Black people represented 6 percent of that group, they account for only 3 percent and 1 percent of the workforce at major tech firms, respectively.
The racial justice implications for the growing use of facial recognition technology are exacerbated by other alarming issues. For example, the databases often use arrest records, where Black people are overrepresented due to existing biases in policing. Consequently, the technology is the least accurate on the people that it’s most often trying to identify.
This discrepancy is troubling on its own. But when it reinforces bias that’s already built into problematic police practices, the consequences can be deadly.
Despite all this, the Georgetown report found that two of the major companies producing this technology didn’t even test for racial bias before taking their products public.
First Amendment Implications
Pervasive surveillance of public spaces compromises our right to anonymous speech. Officers may take photos or film video, or stationary surveillance cameras may constantly and quietly record public spaces. In either case, the ability to identify people attending political protests from a distance and long after the fact could have a chilling effect on free speech.
While direct interaction between protesters and law enforcement can be chilling, or even dangerous, at least in those cases people have heightened awareness of the police presence and potential surveillance. When the filming is done more surreptitiously and then matched up with facial recognition tools in real time or afterwards, the effects could be startling.
That reality is closer than we think. Sen. Al Franken recently called out the FBI for photos found in the agency’s internal documents that depict people at Clinton and Sanders rallies being tracked with facial recognition technology. Last August, the FBI itself released 18 hours of spy-plane footage it had taken at Black Lives Matter protests in Baltimore in spring 2015.
While the right to anonymous speech is vital to everyone in a democracy, the stakes for those protesting racial injustice are particularly high. One case that illustrates this fact is NAACP v. Alabama: During the height of the civil rights movement, the state tried to force the group’s state chapter to disclose the names and addresses of all of its members.
Recognizing the danger this could put those members in, the chapter refused. In 1958, the Supreme Court ruled unanimously in favor of the NAACP, finding that the state order “would suppress legal association among the group’s members — in fact, earlier disclosures of member identities had led to loss of employment, physical coercion, and other hostile treatment.”
The Supreme Court recognized then that the rights to association and privacy are intrinsically linked, particularly for those engaged in protest and dissent.

Some states and law enforcement agencies have made efforts to rein in the use of many other new police technologies that could contribute to pervasive surveillance (like body-worn cameras and drones).
But according to the Georgetown report, not one state has passed a comprehensive law regulating law enforcement’s use of facial recognition technology — despite the clear harms to both civil rights and civil liberties.
At the meeting, the coalition, which included Free Press, asked the FBI to shed additional light on its use of this powerful technology. The dangers it poses need to be addressed sooner rather than later.