Audio Recordings: Disinformation

Question for Department for Science, Innovation and Technology

UIN HL258, tabled on 13 November 2023

To ask His Majesty's Government what steps they are taking to combat the creation and spread of digitally generated fake audios.

Answered on

27 November 2023

The Government recognises the fast-moving development of AI systems, including those used to generate fake audio, and the potential of such tools to facilitate criminal offences such as fraud.

The Online Safety Act received Royal Assent on 26 October 2023. It has been designed to keep pace with emerging technologies, and to provide Ofcom with broad horizon-scanning and robust information-gathering powers so that it can review and regulate technologies effectively.

The Act’s illegal content duties require providers to proactively mitigate the risk that their services are used for illegal activity or to share illegal content, and to design their services to mitigate the risk of this occurring. Services must also take steps to prevent content that constitutes a priority offence from appearing on their service — this includes a number of fraud and financial crime offences.

This applies to fake-audio content, whether that content is created by a human or AI-generated. On services it regulates, the Act will regulate AI-generated content in much the same way it does content created by humans.

Further, ahead of the Act's implementation, the Government plans to deliver a voluntary Online Fraud Charter. This charter will demonstrate the ambition of signatories to work with the Government to tackle online fraud.