Privacy and civil liberties advocates are celebrating the California Senate’s recent bipartisan passage of AB 1215, which would place a three-year moratorium on the use of facial recognition surveillance in police body-worn cameras. The bill now awaits Gov. Gavin Newsom’s signature.
The legislation, which faced stiff opposition from the state’s police unions, comes after municipal bans on the controversial technology passed earlier this year in Oakland and San Francisco amid concerns over how the sensitive data collected by facial recognition software could be shared with federal agencies, particularly Immigration and Customs Enforcement. A similar municipal ban recently advanced out of the public safety committee in Berkeley and now awaits a full vote by the city council.
Federal agencies, and increasingly local law enforcement, are using facial recognition systems to identify or verify a person from a digital image or a video frame. Because the California bill is limited to police body cameras, it would not prevent local law enforcement from using the technology in other ways, such as the San Mateo County Sheriff’s Office’s mugshot database, which the office shares with other law enforcement agencies through the Northern California Regional Intelligence Center.
But even as the technology becomes more pervasive, with the Trump administration’s Transportation Security Administration moving ahead with its plan to use facial recognition technology to process and board passengers at the nation’s airports, pushback against the technology is gaining steam at the local, state and federal levels.
Massachusetts lawmakers are likewise considering a moratorium bill that would place an indefinite hold on the technology until the legislature passes a law guiding its use. Meanwhile, two bipartisan bills that would place a five-year moratorium on law enforcement use of facial recognition are moving ahead in Michigan, and New York State is also considering a one-year moratorium on use of the technology in schools.
Still, the technology continues to be widely adopted, with the Detroit Board of Police Commissioners voting on September 19 in favor of a policy to govern the police department’s use of the technology. The vote to approve facial recognition in Detroit “was foreseeable,” says Tawana Petty, who directs the Data Justice Program at the Detroit Community Technology Project.
She says public pressure helped to scale back a provision that would have allowed Detroit police to use the system in real time if there was a “credible terrorist threat,” while also helping to add stronger accountability mechanisms for officers. Ultimately though, she would like to see the technology banned from law enforcement use.
The Detroit Police Department had been quietly using the technology for the last two years to make arrests after the city council approved a $1 million purchase of the software in July 2017. But meaningful public debate over the technology was spurred only after Georgetown University revealed its use in a report published in May that shocked the majority-Black city. “They skated this through,” Petty says. “There was no transparency or community input.”
Still, she and other community organizers remain hopeful that the city council will approve a model American Civil Liberties Union (ACLU) ordinance on September 30 that would force transparency in the police department’s acquisition and use of other secretive surveillance technologies. At least 11 cities and one county have passed the ordinance, and 19 cities and two states have introduced it.
“Even though the [Detroit Board of Police Commissioners] vote deflated us, we still have a lot of work to do,” Petty says. Her organization has submitted a Freedom of Information Act request seeking all records related to the police department’s use of the technology over the past two years, and is working with a national coalition of more than 30 organizations to push for a federal ban.
Meanwhile, Kade Crockford, the director of the Technology for Liberty Program at the ACLU of Massachusetts, who has been working to support and advance the moratorium bill in her state, told Truthout that a coalition of at least 44 Massachusetts groups has lined up behind the bill.
Even though there is strong bipartisan agreement among lawmakers that “something ought to be done,” Crockford says, differences remain over whether a moratorium is the right approach. “There is this sense that these technologies should not be used unless they’re subject to very, very strict oversight and regulation independent of law enforcement agencies.”
As in California, Massachusetts is also seeing movement at the municipal level following the passage of a municipal ban in Somerville in June. Crockford says the ACLU is also working to support similar ordinances now under consideration in Cambridge and Brookline, as well as calling for a ban on the technology’s use in public schools. A coalition of groups sent a letter last week to the state’s Department of Elementary and Secondary Education Commissioner Jeffrey C. Riley, calling on him to ban facial recognition in state schools.
The ACLU of Massachusetts released a poll in June conducted by Beacon Research that found that 76 percent of Massachusetts voters don’t think the government should be able to monitor people using facial recognition. A whopping 91 percent of Massachusetts voters believe the state needs to regulate the government’s use of the technology.
Bipartisan support is also strong at the federal level, as Rep. Elijah Cummings (D-Maryland) and Rep. Jim Jordan (R-Ohio) are planning a bipartisan bill on facial recognition this fall that could also entail a similar moratorium on the federal government’s acquisition of new facial recognition technology, according to the lawmakers’ offices. Their plans come after two congressional oversight hearings on the issue and as at least four other pieces of federal legislation that would place limits on the technology have already been introduced.
Even in a sharply divided Congress, both Democrats and Republicans agree about the dangers posed by facial recognition technology and many support a federal moratorium or ban. In particular, Democratic lawmakers have pointed to the technology’s widely known tendency to misidentify people of color, women and children, and misgender transgender and nonbinary people.
“There’s been a lot of focus on the fact that the technology is not ready for prime time yet in the sense that it is often inaccurate, and I think that’s important, but I don’t want us to lose sight of what happens when the technology does work significantly better,” Crockford told Truthout. “This technology is dangerous when it works and when it doesn’t.”
As lawmakers work to introduce bipartisan legislation on Capitol Hill, a coalition of more than 30 advocacy groups, including United We Dream, Greenpeace, RootsAction and Color of Change, is ramping up pressure. The coalition launched a campaign this month pressuring local, state and federal lawmakers to completely ban law enforcement use of the technology.
“While the use of facial recognition technology is continuing to spread, resistance to it is also spreading,” said Evan Greer, director of the digital rights advocacy group Fight for the Future. “We see this as a moment to draw a line in the sand and just make it clear that this is not like other forms of surveillance technology. It’s uniquely dangerous, and it’s something that needs to be banned before it’s too late.”
Fight for the Future also recently launched a separate campaign calling on Ticketmaster and other companies to commit to not using facial recognition surveillance at music festivals and concerts. The effort has drawn support from a number of musicians, including Tom Morello and Amanda Palmer, among others. The campaign recently secured commitments from major festivals, such as Bonnaroo and Austin City Limits, as well as several other events like Paradiso Festival, Lucidity Festival, Summer Meltdown and Punk Rock Bowling.
Meanwhile, 2020 presidential candidates are taking positions on the use of the controversial technology. Sen. Bernie Sanders became the first 2020 Democratic presidential candidate to call for a total ban on law enforcement use of the technology as part of his plan to overhaul the U.S. criminal legal system, with a campaign spokesperson saying, “Police use of facial recognition software is the latest example of Orwellian technology that violates our privacy and civil liberties under the guise of public safety, and it must stop.”
Sen. Kamala Harris also recently released her own criminal legal system reform plan, saying she would work with stakeholders “to institute regulations and protections” to ensure that facial recognition technology “does not further racial disparities or other biases.” Harris’s position reflects that of other candidates, including Sen. Elizabeth Warren, former Housing Secretary Julián Castro and Sen. Cory Booker, who have called for the creation of a task force, a set of guidelines, or a survey of law enforcement use of the technology to inform oversight efforts.
Advocates like Greer say that such calls for vague “regulations and protections” would fail to address the harms inherent in facial recognition surveillance and only serve to repeat talking points that have been pushed by law enforcement agencies and big tech companies.
“There’s nothing more empty than saying, ‘Yeah, we’ll take a look at this,’ when it’s actively being used on low-income communities and communities of color across the United States right now and actively causing harm right now,” Greer told Truthout. “This is not something to ‘take a look at’; it’s something that’s spreading incredibly quickly, and we need our elected officials to do something about it and take a meaningful stand.”
While Truthout didn’t receive comment from the campaign offices of former Vice President Joe Biden; South Bend, Indiana, Mayor Pete Buttigieg; Sen. Amy Klobuchar; and former tech executive Andrew Yang, a campaign spokesperson for Rep. Beto O’Rourke told Truthout that the congressman supports a federal ban on law enforcement use, saying O’Rourke “opposes the use of facial recognition software in both pursuing immigration offenses and policing. Beto believes the use of this software is an abuse of individual privacy, and has been shown to consistently make more errors for women and people of color than white men.”
For advocates like Greer and Crockford, any proposal that falls short of a full ban on law enforcement use of the technology is problematic due to the unprecedented threat the technology poses to basic civil liberties.
“You can change your Social Security number. You can get rid of your phone. You can change your password. You can’t change your face. [Facial recognition] enables real-time tracking of essentially an entire population. It enables surveillance at an automated scale that fundamentally changes the nature of that surveillance’s impact on society,” Greer says. “It’s an automated form of surveillance that breeds conformity and really backs up authoritarianism not only in the United States but around the world.”
Crockford in Massachusetts agrees. “We need to talk about the larger question of whether we think it’s ever appropriate for the government to create an architecture of surveillance that is so pervasive and all-seeing that it enables that kind of mass tracking of every person’s public movements, habits and associations, and we think the answer is clearly ‘no,’ that that cannot coexist with a free society,” she said.