At the European Union (EU) level, cross-sector initiatives regulate the rights of marginalised groups and establish human rights due diligence (HRDD) responsibilities for online platforms to expeditiously identify, prevent, mitigate, remedy and remove online hate speech. These initiatives include the Digital Services Act, the Audiovisual Media Services Directive, the proposed Directive on Corporate Sustainability Due Diligence, the proposed Artificial Intelligence Act and the Code of conduct on countering illegal hate speech online. Nevertheless, the HRDD framework applicable to online hate speech has focused mostly on platforms’ responsibilities in the course of their operations; guidance on HRDD requirements for the regulation of hate speech in platforms’ Terms of Service (ToS) is missing. This paper employs the conceptualisation of criminal hate speech set out in Paragraph 11 of the Council of Europe Committee of Ministers’ Recommendation CM/Rec(2022)16 to develop specific HRDD responsibilities. We argue that online platforms should, as part of emerging preventive HRDD responsibilities within Europe, respect the rights of historically oppressed communities by aligning their ToS with the conceptualisation of criminal hate speech in European human rights standards.
