The British government has opened a public consultation on revisions to the Surveillance Camera Code of Practice. The code, issued under the Protection of Freedoms Act, provides guidance on the appropriate use of CCTV by local authorities and the police. This is the first revision to the code since its introduction in June 2013.
AnyVision has responded to the Biometrics and Surveillance Camera Commissioner, Professor Fraser Sampson, in an open letter entitled “Facial Recognition Apps Should Be Provided to the Police with an Empty Database.”
Given AnyVision’s expertise in ethical facial recognition and its commercial experience identifying persons of interest, including shoplifters, felons, and security threats, the company wanted to lend its perspective to the discussion and share several best practices for applying facial recognition ethically in law enforcement settings.
AnyVision’s CEO Avi Golan wrote: “The ethical use of facial recognition is a thorny one and requires a nuanced discussion. Part of that discussion must explain how the underlying facial recognition system works, but, just as important, the discussion must also involve how the technology is being used by police departments and what checks and balances are built into their processes. We welcome an honest and objective dialogue involving all stakeholders to draft fair and balanced regulation.”
In recent years, face-based and object recognition systems have been adopted broadly before methods of due diligence were fully thought through. The company agrees that the use of facial recognition or other biometric-based recognition systems needs to be clearly justified, proportionate to the intended purpose, and appropriately validated.
First, it is important to highlight the unique characteristics and risk factors specific to police use of facial recognition technology. The most common use case for video surveillance is when police and other law enforcement agencies obtain a picture of a suspect from a crime scene and want to answer the question: “Who is the person in the picture?” Answering it often requires an extensive database, one that could potentially include every person on planet Earth.
This is very different from commercial use cases of facial recognition (e.g., in supermarkets, casinos, or stadiums), which ask a fundamentally different question: “Is the person in the video a known security threat?” Answering that question does not require a comprehensive database of all people, only a defined list of specific individuals who represent security threats.
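To make the distinction concrete, the minimal Python sketch below shows what the commercial question looks like in practice: a probe face embedding is compared only against a small, curated watchlist rather than searched against a comprehensive identity database. The helper names, the embedding representation, and the 0.6 threshold are illustrative assumptions, not AnyVision’s actual implementation.

```python
import numpy as np
from typing import Dict, Optional

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    # Similarity between two embedding vectors, in [-1, 1].
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def check_against_watchlist(probe_embedding: np.ndarray,
                            watchlist: Dict[str, np.ndarray],
                            threshold: float = 0.6) -> Optional[str]:
    """Return the watchlist identity that best matches the probe face,
    or None if nobody on the small, curated watchlist is similar enough."""
    best_id, best_score = None, threshold
    for identity, enrolled in watchlist.items():
        score = cosine_similarity(probe_embedding, enrolled)
        if score > best_score:
            best_id, best_score = identity, score
    # Answers "is this person a known security threat?" rather than
    # "who, out of everyone on Earth, is this person?"
    return best_id
```

An open-ended “who is this person?” search would instead require comparing the probe against a gallery covering, in principle, everyone, which is what makes the police use case categorically riskier.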
In the company’s view, the path to fair and ethical use of facial recognition by police agencies is through adherence to three principles:
- Empty Database: We recommend that police agencies build their watchlists from the ground up, based on known felons, persons of interest, and missing persons. Some facial recognition solution providers have scraped billions of photos and identities from social networks, usually without consent. This approach has justifiably angered privacy groups and data protection agencies around the globe and damaged public trust in the accuracy and reliability of facial recognition systems. We believe that lists of suspects should be limited and justified. In this way, unjustified invasions of citizens’ privacy can be prevented, false arrests can be reduced, and public confidence in the technology can be increased.
- Safeguarding Data & Privacy: Many privacy advocates are justifiably concerned about how video surveillance systems capture and store data on innocent bystanders. At AnyVision, we don’t capture photographic images of people. The watchlists that comprise the reference data for our facial recognition algorithms are created and uploaded by our commercial customers; that is, they are built from scratch and specific to the security needs of each organization. The data we do capture is stored as mathematical vectors that function like a one-way cryptographic encoding, preventing identity hacking even if the data is stolen.
AnyVision goes a step further in safeguarding the privacy of non-watchlist individuals. We offer our customers the ability to activate “GDPR mode,” which blurs the faces of everyone not explicitly listed on an organization’s watchlist. When this feature is activated, only individuals identified on the watchlist are visible; all other people in the camera’s field of view are blurred. Privacy Mode goes even further and discards all detections of non-enrolled individuals, which means that police agencies cannot capture any metadata from non-watchlist detections and further protects the identities of bystanders. These advanced privacy features are designed to help organizations collect only the data on individuals that is strictly necessary for the purposes of video surveillance (i.e., data minimization).
- Operational Due Diligence: Police acknowledge that facial recognition technology has been instrumental in helping crack some tough cases, but in the last year there have also been claims of wrongful arrests. In many of these cases, the wrongful arrests were the result of a poor investigative process rather than shortcomings of the facial recognition software. Facial recognition is more than just the technology; it also requires specific rules that govern how potential face-based matches are processed. These rules must operate within established boundaries that protect an individual’s privacy and comply with applicable law.
Facial recognition software is designed to identify a handful of likely suspects based on potential matches against a reference database. A potential match, however, does not absolve the police department from performing a proper investigation. It is critical that police use the technology responsibly and determine whether any of the potential matches warrant further investigation, following appropriate due-diligence procedures and established protocols. When police take shortcuts and wrongfully arrest innocent people based on a supposed match without the necessary due diligence, it reflects poorly on the underlying facial recognition technology. Human review and investigation are essential when applying this powerful technology; a simplified sketch of such a watchlist-and-review workflow follows below.
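Taken together, the three principles suggest a processing flow along the lines of the following sketch. It is an illustrative assumption, not AnyVision’s implementation: `detect_faces`, `embed_face`, and `blur_region` are hypothetical helpers, the watchlist holds only embedding vectors rather than photographs, unmatched detections are blurred and discarded, and anything that does match is returned only as a shortlist for a human investigator to review.

```python
from dataclasses import dataclass
from typing import Dict, List
import numpy as np

@dataclass
class Candidate:
    identity: str
    score: float
    needs_human_review: bool = True  # a match is an investigative lead, not a conclusion

def review_frame(frame: np.ndarray,
                 watchlist: Dict[str, np.ndarray],  # identity -> embedding vector (no photos stored)
                 threshold: float = 0.6,
                 top_k: int = 3) -> List[Candidate]:
    candidates: List[Candidate] = []
    for box in detect_faces(frame):            # hypothetical face detector
        probe = embed_face(frame, box)         # hypothetical embedding model (unit-length vector)
        scored = sorted(((identity, float(np.dot(probe, ref)))  # cosine similarity for unit vectors
                         for identity, ref in watchlist.items()),
                        key=lambda pair: pair[1], reverse=True)
        hits = [(i, s) for i, s in scored[:top_k] if s >= threshold]
        if hits:
            # Shortlist of possible matches; a human investigator decides what happens next.
            candidates.extend(Candidate(i, s) for i, s in hits)
        else:
            # Privacy mode: blur the bystander and keep no detection metadata at all.
            blur_region(frame, box)            # hypothetical blurring helper
    return candidates
```

The key design point is that the system never outputs an identity as a verdict; at most it outputs a small set of scored candidates that still require human due diligence before any enforcement action.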
“AnyVision is willing to share its industry insights and best practices from our vast research experience with leading global players, including name-brand retailers, global hospitality, financial services and law enforcement agencies,” said AnyVision’s CEO, Avi Golan. “If the regulations set forth by the Surveillance Camera Code of Practice are committed to the principles outlined above, then law enforcement agencies can strike the right balance between solving crime and protecting the privacy of innocent citizens.”
To learn more about AnyVision’s Visual AI solutions, visit www.anyvision.co.