Social Media: Regulation

Question for Department for Digital, Culture, Media and Sport

UIN 2284, tabled on 17 May 2021

To ask the Secretary of State for Digital, Culture, Media and Sport, what assessment he has made of the potential merits of requiring social media companies to report on the algorithms they use to monitor online hate speech on their platforms and any biases found within those algorithms.

Answered on

21 May 2021

Hate speech is completely unacceptable in an open and tolerant society. Our new laws will mean social media companies must keep promises to their users about their standards and stamp out this sort of abuse. Companies will need to take steps to mitigate the risks of harm associated with their algorithms. This will apply in the case of illegal content and, in particular, companies will need to ensure that systems for targeting content to children, such as the use of algorithms, protect them from harmful material.

Ofcom will have a range of powers at its disposal to help it assess whether companies are fulfilling their duties. The largest and highest-risk companies will also be required to produce transparency reports, which will include information about the steps they are taking to protect users. These reports may include information about the processes and tools in place to address illegal and harmful content and activity, including, where appropriate, tools to identify, flag, block or remove such content.