“Facial Recognition” Body-cams for Police Prove Unreliable in California Test


As the Surveillance State really began to ramp up in the wake of 9/11, cameras began to appear everywhere. The next phase — a few years later — brought “facial-recognition” software to identify passers-by by digitally comparing their features to a database of criminals and terrorists. Besides the obvious privacy issues related to their use, perhaps the best argument against “facial-recognition” cameras is that they plain don’t work.

Enter the use of “facial-recognition” software as part of body-cams worn by police officers. The argument for them is that police will know if someone they encounter has outstanding warrants or a violent criminal history. So far, so good (except, of course, for the mass-surveillance issue). Again, the underlying problem is that the “facial-recognition” software is bogus.

In fact, in a recent test conducted by the American Civil Liberties Union (ACLU) in California, 26 state legislators were incorrectly matched by the software to mugshots of criminals. That works out to roughly 20 percent, or about one in five of the lawmakers screened. While that may be almost comical, all of the funny would wear right off if one of those legislators were mistaken by police for a dangerous criminal because the mismatch happened in the wild. A false-positive rate that high makes it apparent that these cameras present a danger to both police and the public.
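For readers who want to check the math, here is a minimal sketch of the arithmetic. It assumes, as reported elsewhere, that the ACLU screened photos of all 120 members of the California Legislature; that denominator does not appear in this article.

```python
# A minimal sketch of the arithmetic behind the "about 20 percent" figure.
# Assumption (not stated in this article): the ACLU screened photos of all
# 120 members of the California Legislature against a mugshot database.
legislators_screened = 120   # assumed total screened
false_matches = 26           # legislators wrongly matched to mugshots

false_positive_rate = false_matches / legislators_screened
print(f"False positive rate: {false_positive_rate:.1%}")  # 21.7%, roughly one in five
```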


Now, California Assemblyman Phil Ting (D-San Francisco), who was one of the 26 legislators mistakenly identified as criminals, has authored a bill to ban the use of "facial recognition" cameras by police statewide. They are already banned in Ting's neck of the woods: San Francisco banned them in May, and other cities and municipalities around California are looking to follow suit.

The ACLU test in California is consistent with a similar test last year in which 28 federal legislators were erroneously matched with criminal mugshots. To be clear, those 28 mismatches were to mugshots of other people, not of the federal legislators themselves, though any confusion on that point would be understandable.

Given that both tests show false positives involving many of the very legislators who will have to vote on laws regulating the use of these cameras, one would expect such legislation to have an easier path to becoming law. Not only is Ting's California bill (Assembly Bill 1215) moving forward, but there has been discussion of how to handle this on the federal level. As the Los Angeles Times reports:

Ting’s proposal, Assembly Bill 1215, could soon be on the governor’s desk if it passes the Senate. Sponsored by the ACLU, the civil rights organization hopes its recent test will grab attention and persuade legislators to put the technology on hold.

There is little current federal regulation of facial recognition technology. Recently, members on both sides of the aisle in Congress held oversight hearings and there has been a strong push by privacy advocates for federal action. But concrete measures have yet to materialize.

That has left states and local jurisdictions to grapple with the complex technology on their own. New Hampshire and Oregon already prohibit facial recognition technology on body-worn cameras, and San Francisco, Oakland and Somerville, Mass., also recently enacted bans for all city departments as well as police.

The software used in the ACLU tests was Rekognition, developed by Amazon. The company has previously denied that its software is unreliable, but did not comment on the most recent test in California. Here is a tip for the folks over at Amazon: A 20 percent rate of false positives is the very definition of “unreliable.”

As stated above, the real problem with such a rate of false positives is that police who encounter someone and believe him to be a dangerous criminal because of a false match will, owing to their training, react differently than they would without that bad intelligence. Another possible problem is the as-yet-unknown rate of false negatives: cases where the software fails to flag someone who really does have a violent history. Police who learn to depend on the software to warn them of potential danger may be lulled into a false sense of safety and suddenly find themselves in very real danger.
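To make the distinction concrete, here is a small sketch using entirely hypothetical counts; none of these numbers come from the ACLU test. A false positive flags an innocent person as a match, while a false negative misses someone who actually is in the database.

```python
# Hedged illustration (hypothetical numbers) of why both error types matter.
# A "false positive" flags an innocent person as a match; a "false negative"
# fails to flag someone who really is in the database.

def error_rates(tp: int, fp: int, tn: int, fn: int) -> tuple[float, float]:
    """Return (false_positive_rate, false_negative_rate) from a confusion matrix."""
    fpr = fp / (fp + tn)  # share of innocent people wrongly flagged
    fnr = fn / (fn + tp)  # share of wanted people the software misses
    return fpr, fnr

# Hypothetical counts for 1,000 scans, for illustration only:
fpr, fnr = error_rates(tp=40, fp=190, tn=760, fn=10)
print(f"False positives: {fpr:.0%} of innocent people flagged")  # ~20%
print(f"False negatives: {fnr:.0%} of actual matches missed")    # ~20%
```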

And then, of course (as also mentioned above), there is the issue of privacy. Body cameras worn by police officers have so far been a good thing, serving as an objective witness in encounters where an officer has to use force, including lethal force. There have been multiple accounts of police officers being cleared of fraudulent charges thanks to video captured by their body-cams. There have been other instances where the cameras showed police officers behaving badly.

That is the purpose of these cameras. As Ting said, “Body cameras were really deployed to build trust between law enforcement and communities.” Turning those tools of trust into tools of surveillance will undermine whatever good they have done. “Instead of trust, what you are getting is 24/7 surveillance,” Ting added.


 Photo: AP Images