
Biometrics: Ethnic Groups

Question for Department for Digital, Culture, Media and Sport

UIN HL5494, tabled on 9 June 2020

To ask Her Majesty's Government what steps they are taking to regulate the use of facial recognition technology to ensure that it is not discriminatory towards people from BAME communities.

Answered on

23 June 2020

Uses of facial recognition technology in the UK, both private and public, are regulated by the GDPR and the Data Protection Act 2018, which set standards for protecting personal data. Organisations have an obligation to ensure that any personal data they hold is accurate and processed in a manner that is lawful, fair and transparent.

Facial images, which constitute 'special category' data for the purposes of the legislation, are subject to heightened safeguards and can only be processed if specific conditions in the legislation are met. Processing must be necessary, proportionate and justified. The legislation is enforced by the Information Commissioner's Office, which has shown a willingness to take action against commercial organisations that are acting unlawfully.

To ensure the safe use of facial recognition technology (FRT) across all sectors, the government tasked the Centre for Data Ethics and Innovation (CDEI) with producing a Snapshot briefing paper examining the uses and potential implications of facial recognition technology's deployment in the UK. The paper was published on 28 May and we are considering its findings. The CDEI is currently working on a review into bias in algorithmic decision-making and will continue to examine the impacts of FRT and algorithms on society and provide recommendations on how to minimise bias.