Content moderation involves reviewing user-generated content (UGC) on the internet in order to assess its appropriateness against platform regulations (e.g. ToS) and legal frameworks (e.g. the TCO Regulation, the DSA).¹ Content moderation is the means by which platform regulations and legal requirements are enforced, ensuring that the requisite action is taken against prohibited and/or problematic content. HSPs are required to be transparent about the specific measures they take to identify and remove terrorist content, including proactive and reactive content moderation processes and any automated tools used.

¹ Roberts, S.T. (2017). Content Moderation. In L. Schintler & C. McNeely (eds.): Encyclopedia of Big Data. Springer, Cham. https://doi.org/10.1007/978-3-319-32001-4_44-1
Content moderation is an essential but also complex and sensitive process. Maintaining the appropriate technical and subject-matter resources can be a significant financial burden for smaller companies. In addition, terrorist content can be produced in any language, and ensuring that moderators are able to assess content in all of these languages is extremely resource-intensive.
The fundamental rights enshrined in the Universal Declaration of Human Rights, in particular the right to freedom of expression, must always be weighed against the harm caused by the content in question. At the same time, HSPs have a social responsibility to mitigate harmful content on their platforms.
Comprehensible and transparent content moderation helps protect users and stakeholders from the effects of harmful content and builds trust. From the perspective of HSPs, content moderation can support limitations of liability, ensure compliance with applicable laws, and guard against reputational damage caused by misuse of the platform.
Content moderation does not necessarily mean content removal. While removal can be mandatory due to legal requirements – as is the case with removal orders under the TCO Regulation – there are other ways to deal with harmful content. Depending on how problematic and serious the content is and the damage it may cause, alternative options can be considered. Examples and suggestions of alternative moderation strategies can be found in section 3 of this chapter.
Moderation styles vary from platform to platform depending on functionality, the services offered, platform values and risk tolerance. It is important to be as transparent as possible about content moderation. This means taking various factors into account before, during and after moderation takes place, so that users know which moderation actions their content might incur.