Facial Recognition Bans Handcuff Law Enforcement

Last week, San Francisco made history by becoming the first major city to ban law enforcement from using facial recognition technology, a type of computer system that matches faces either by searching a database for similar images (one-to-many) or by confirming whether two images show the same person (one-to-one). By a vote of 8 to 1, the Board of Supervisors approved an ordinance that prohibits the police and other agencies from using any form of facial recognition. It is understandable that people want to set limits on police surveillance and address racial bias in policing, but a blanket ban on facial recognition technology does nothing to address those underlying issues and only makes it more difficult for police to investigate crimes and improve public safety.
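For readers who want a concrete picture of that distinction, the sketch below (in Python, using made-up numeric face "signatures" and an arbitrary matching threshold) shows how the two modes differ. It illustrates the general approach only, not any vendor's actual system.

```python
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Similarity between two face signatures (embeddings), ranging from -1 to 1."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def verify_one_to_one(probe: np.ndarray, reference: np.ndarray,
                      threshold: float = 0.8) -> bool:
    """One-to-one: do these two images appear to show the same person?"""
    return cosine_similarity(probe, reference) >= threshold

def search_one_to_many(probe: np.ndarray, gallery: dict,
                       threshold: float = 0.8) -> list:
    """One-to-many: which database entries, if any, resemble this face?"""
    scores = [(name, cosine_similarity(probe, emb)) for name, emb in gallery.items()]
    return sorted([s for s in scores if s[1] >= threshold],
                  key=lambda s: s[1], reverse=True)
```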

There are many potential ways police can use facial recognition technology. As an investigative tool, facial recognition can help police identify witnesses, suspects, and other persons of interest in a crime. Without facial recognition, police must do these searches manually, such as combing through mug shot photos or asking the public to help identify someone — a process that is slow, inaccurate, and expensive. Indeed, facial recognition will make it much more feasible to investigate low-dollar crimes, such as car break-ins and packages stolen from doorsteps, which often go unaddressed. These digital searches are of course only part of the investigative process, and police still must have probable cause to make arrests.

Police are also using facial recognition to help stop crimes. For example, some police departments, such as in Fort Worth, Texas, are using facial recognition technology to search online for missing children who may have become victims of sex trafficking. Facial recognition can also help police improve security at schools, stadiums, and other public places. If police receive a credible threat of a person planning a shooting or bombing at one of these venues, they can use real-time facial recognition to quickly monitor the crowd and receive an alert if this person arrives.

Finally, police can use facial recognition to improve security internally, such as for access control to their buildings or as an added layer of security when authenticating to their computer systems.

In the future, there may also be opportunities to use real-time facial recognition to help respond to Amber Alerts for missing children and Silver Alerts for people with dementia. Or police may use facial recognition in the field to let officers know who is approaching them, a tool that could help improve community relations. Real-time facial recognition systems work by identifying faces in an image, creating a unique signature for each face, and then sending an alert if one of those faces matches someone the police are looking for.
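At bottom, this real-time case is the one-to-many search run continuously against a short watch list. The rough sketch below assumes an upstream face detector has already converted each face in a camera frame into a numeric signature; the watch list, threshold, and function names are hypothetical.

```python
import numpy as np

ALERT_THRESHOLD = 0.8  # hypothetical cutoff; real deployments tune this value

def check_frame(frame_embeddings, watch_list):
    """Return names from the watch list matched by any face seen in one camera frame.

    frame_embeddings: list of numeric signatures for faces detected in the frame
    watch_list: dict mapping a person's name to that person's reference signature
    """
    alerts = []
    for face in frame_embeddings:
        face = face / np.linalg.norm(face)
        for name, reference in watch_list.items():
            reference = reference / np.linalg.norm(reference)
            if float(np.dot(face, reference)) >= ALERT_THRESHOLD:
                alerts.append(name)
    return alerts
```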

Opponents have three main objections. First, there is the concern of Chinese-style mass surveillance. But the United States is not China, and Americans have rights and freedoms that the courts continue to uphold. Similar claims were made about other technologies, like license plate readers, without these fears being realized. Regardless, it may still be helpful for Congress to clarify some of these rights, such as affirmatively establishing a warrant requirement to track the movements of individuals, to address this concern.

The second major objection is about specific uses of facial recognition, such as to monitor political protests. But the real problem is not about technology — it is about police conduct. If police have a lawful reason to monitor political protests, then it should not matter whether they are using facial recognition, cameras, drones, or undercover officers. Likewise, if they are intimidating, harassing, or otherwise violating people’s rights, that behavior should be banned, rather than the tools, which are only incidental to the conduct.

The third main concern is that the technology has an inherent racial bias. Much of this belief is based on a flawed report from the ACLU last year alleging that in its test Amazon's facial recognition software falsely matched the faces of 28 members of Congress with arrest photos, with a disproportionate number of those false matches being people of color. Only later did the ACLU admit that it had used a lower confidence threshold than Amazon recommends for law enforcement; had it followed this guidance, the error rate would have dropped from 5 percent to 0 percent.
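The threshold dispute matters because a system only reports a match when a similarity score clears a chosen cutoff, and raising that cutoff discards the weaker, more error-prone matches. A toy illustration with made-up numbers:

```python
# Hypothetical similarity scores for candidate matches returned by a search;
# none of these numbers come from the ACLU test or from Amazon.
candidate_scores = [0.99, 0.93, 0.86, 0.82]

loose_threshold = 0.80   # a permissive default-style setting
strict_threshold = 0.99  # a stricter setting of the kind recommended for policing

print([s for s in candidate_scores if s >= loose_threshold])   # all four reported
print([s for s in candidate_scores if s >= strict_threshold])  # only the strongest reported
```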

Still, some facial recognition systems may perform better than others. To ensure police departments do not waste tax dollars on ineffective systems or ones with significant racial bias, the federal government should expand its evaluation of commercial facial recognition systems. Importantly, the National Institute of Standards and Technology should include testing of cloud-based systems, and the National Institute of Justice should offer guidance and best practices to state and local law enforcement.

It is unfortunate, but not surprising, that San Francisco took such radical action against facial recognition — the privacy panic cycle has been a constant presence over the past century with similar peaks of hysteria about portable cameras in the 1900s, miniature microphones in the 1960s, and RFID tags in the early 2000s. But history shows that bans are the wrong approach and only delay the benefits of emerging technologies.

Moreover, the public largely opposes such bans. In a poll last December, fewer than one in five Americans said they would support strictly limiting facial recognition technology if it came at the expense of public safety. So instead of bans, policymakers should focus on a balanced approach that would allow law enforcement to use facial recognition technology to make communities safer while also providing additional oversight, testing, and guidance to protect the rights of all Americans.

Daniel Castro (@castrotech) is Vice President of the Information Technology and Innovation Foundation (ITIF) and Director of the Center for Data Innovation.
