BFEG publishes ethical principles to guide police facial recognition trials

The independent Biometrics and Forensics Ethics Group has published ethical principles to guide police facial recognition trials

The Biometrics and Forensics Ethics Group (BFEG), a non-departmental public body, has published a report outlining a framework of ethical principles to be taken into account when developing policy on the use of Live Facial Recognition (LFR) technology. The report's central argument is the need to distinguish between different kinds of errors and biases, which it attributes to both humans and the technology.

The Facial Recognition Working Group of the BFEG was responding to a request for guidance from policy sponsor Alex Macdonald, who heads the Home Office's identity policy unit.

The public ethics body had been asked to scrutinise potential ethical issues arising from the Home Office's use of large and complex data sets, providing independent oversight to strengthen public confidence in how the department uses data.

The report, written by Professor Nina Hallowell (chair) of Oxford University, Professor Louise Amoore of Durham University, Professor Simon Caney of Warwick University and Dr Peter Waggett of IBM, concentrates on the use of LFR in different categories of place: places where the general public gather; places where people are relatively static, such as concert venues, sports stadiums and public rallies; and places with clearly defined entry and exit points or where people are 'channelled' past the cameras.

Nine principles and questions

The framework sets out the following ethical considerations to be addressed before guidelines are drawn up:

  1. Public interest
  2. Effectiveness
  3. Avoidance of bias and algorithmic injustice
  4. Impartiality and deployment
  5. Necessity
  6. Proportionality
  7. Impartiality, accountability, oversight and the construction of watch-lists
  8. Public trust
  9. Cost-effectiveness

The report concluded that a number of questions remain about the accuracy of LFR technology, its potential for biased outputs and biased decision-making, and the ambiguity surrounding the nature of current deployments. These questions are set out under the same nine headings as the ethical principles.

The report highlights the need to differentiate errors and biases inherent in the design and training of the technology from those introduced when a human operator decides how to act on the system's output, while attributing errors and biases to both humans and the technology.

Major technology companies have taken mixed positions on the issue. Amazon has refused to back down and is actively marketing the technology to police. Google, one of the pioneers of the technology, has withdrawn from government sales of this kind of AI surveillance. Microsoft has called for tighter regulation until safeguards are in place, yet has also described halting the technology altogether as 'cruel'.

The UK Information Commissioner recently launched an investigation into the effectiveness and legality of facial recognition trials.

Last year Professor Andrew Charlesworth of the University of Bristol pointed out that the law is lagging well behind developments in surveillance technologies such as Automated Facial Recognition (AFR), which had already been deployed in some areas of the UK and was the subject of two court cases. His white paper makes a series of recommendations for a more constructive approach.
