Automated Facial Recognition Technology (AFR): Big Brother in action?

Artificial intelligence continues to be in the news. One area that has hit the headlines recently is automated facial recognition technology (AFR), which is now well established.

In simple terms, AFR assesses whether two facial images depict the same person.

Live CCTV footage can be used to capture digital images of the faces of members of the public. Software then processes each image to extract unique biometric information, such as measurements of facial features, and this data can be compared in real time against biometric data derived from images held in a database or watchlist.

The software scores each comparison, with a higher number indicating a greater likelihood that the CCTV facial image matches one in the database. Such technology clearly lends itself to use in public places and at public events, and it is in fact already in use.
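To make the matching and scoring step concrete, here is a minimal sketch in Python. It is purely illustrative and is not the software used by any police force or vendor: it assumes faces have already been converted into numerical feature vectors ("embeddings") by a face-recognition model, and it stands in random vectors for those embeddings. The watchlist names and the 0.8 threshold are likewise invented for the example.

```
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    # Score how alike two face embeddings are; 1.0 means identical direction.
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def match_against_watchlist(probe: np.ndarray, watchlist: dict, threshold: float = 0.8):
    # Return (name, score) for the best match at or above the threshold, else None.
    best_name, best_score = None, -1.0
    for name, reference in watchlist.items():
        score = cosine_similarity(probe, reference)
        if score > best_score:
            best_name, best_score = name, score
    return (best_name, best_score) if best_score >= threshold else None

# Illustrative stand-ins: real embeddings would come from a face-recognition model.
rng = np.random.default_rng(0)
watchlist = {"suspect_a": rng.normal(size=128), "suspect_b": rng.normal(size=128)}
probe = watchlist["suspect_a"] + rng.normal(scale=0.1, size=128)  # noisy capture of suspect_a
print(match_against_watchlist(probe, watchlist))  # e.g. ('suspect_a', 0.99...)
```

In any real deployment the threshold is a key design choice: set too low, innocent passers-by are flagged; set too high, genuine matches are missed. This is one reason why human review of every machine-suggested match, as South Wales Police provided, matters.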

For example, between 2017 and 2019 South Wales Police (SWP) used the technology at a number of events, including the 2017 UEFA Champions League Final, rugby matches and an Elvis Presley Festival. Such technology will scan many faces: 21,500 at one rugby international, for example. A person's image is personal data, so capturing images in this way is processing personal data, as is any comparison against a watchlist of crime suspects or other persons of interest. Clearly such use could be seen as Big Brother in action. Equally, it can be justified as a way of preventing crime, detecting criminals and helping to ensure public safety.

The use of AFR by South Wales Police was recently challenged in the courts, but the outcome was in favour of the police force. SWP had taken care in its use of AFR and had considered the privacy implications at the outset: the technology was used only for specific and limited purposes, and the CCTV information was deleted unless there was a match, which a human being (not a machine) then assessed. The court found that the force had taken care to comply with the Data Protection Act 2018.

But the use of AFR continues to generate controversy. Its deployment by a property developer in the busy King's Cross area of London drew particular criticism: it was unclear why the developer had been using AFR and what its legal basis for processing the information was. The Information Commissioner's Office (ICO) decided to investigate, and the Commissioner commented: "Scanning people's faces as they lawfully go about their daily lives, in order to identify them, is a potential threat to privacy that should concern us all. That is especially the case if it is done without people's knowledge or understanding."

Facial recognition technology is a priority area for the ICO, which has stated that, when necessary, it will not hesitate to use its investigative and enforcement powers to protect people's legal rights.

In its investigation of the use of AFR at King's Cross, the ICO will require detailed information from the relevant organisations about how the technology is used. It will also inspect the system and its operation on site to assess whether it complies with data protection law.

As the ICO has highlighted, any organisation wishing to use facial recognition technology must comply with the law, and must do so in a fair, transparent and accountable way. It must document how and why it believes its use of the technology is legal, proportionate and justified.

While the ICO supports keeping people safe, it stresses that new technologies and new uses of sensitive personal data must always be balanced against people’s legal rights.

In the USA, such technology has also been used in job interviews to assess how candidates respond to questions. There is evidence that such techniques can be biased, for example discriminating against female applicants because previous hires at the company in question were mainly men.

Regardless, we can expect to see AFR used more often. It is not currently illegal to use it in the UK, but its use must be proportionate, transparent and in line with appropriate regulatory codes as well as the GDPR and the Data Protection Act 2018.

This means those using it must give its use careful thought, carry out a data protection impact assessment, avoid the risk of bias and have appropriate policy documents in place to justify its use.

Consulting the ICO first is also a good idea. Given that the ICO is actively prioritising this area, businesses can expect further guidance shortly. But even if such use is lawful, do businesses want the adverse publicity of being perceived as using "Big Brother" tactics?


Simon Stokes

Simon Stokes is a Partner with law firm Blake Morgan. He leads the firm's technology practice in London and specialises in information technology law.

http://www.blakemorgan.co.uk