Lesson 3: Establishing Effective Moderation Mechanisms for Terrorist Content Online

Summary: Contents & Main Points of this Chapter

  • In content moderation, user content is reviewed to assess whether it complies with legal requirements (e.g. the TCO Regulation, the DSA) and platform-specific rules (e.g. Terms of Service), or violates them.
  • If a rule has been violated, the content is ‘moderated’. In other words, measures are taken to limit its reach.
  • When an HSP receives a removal order from a competent authority under the TCO Regulation, moderation always entails removal of the terrorist content in question, although both HSPs and users can contest the order if they disagree.
  • Various requirements must be observed before, during, or shortly after the HSP takes moderation measures, as well as in any later review. These include notifying users whose content has been moderated and giving them the option to appeal the decision.
  • Platforms that proactively act against other potentially harmful forms of content, such as hate speech or insults, independently of the TCO Regulation, may also consider alternative moderation approaches.