3. Practical tips & advice: What helps to assess whether the content is illegal?

Regarding the TCO Regulation, HSPs do not have to assess the legality of content: it is the responsibility of the competent authorities to do so before issuing a removal order. However, HSPs that have been exposed to terrorist content are required to take measures to proactively identify such content on their platforms. It is often challenging to assess whether content really crosses the legal line and should therefore be prohibited. In addition, the psychological strain of viewing problematic content should not be neglected: employees who moderate content should have regular opportunities to reflect on their work and, if needed, receive psychological support.

To assess the illegality of content, you will find some clues and practical tips below. Before that, let’s take a look at the relevant parts about content assessment in the TCO Regulation:

Paragraph 11 states, among other things:

“When assessing whether material constitutes terrorist content within the meaning of this Regulation, competent authorities and hosting service providers should take into account factors such as the nature and wording of statements, the context in which the statements were made and their potential to lead to harmful consequences in respect of the security and safety of persons.” (TCO Regulation, 11)

In addition, fundamental rights must always be weighed up:

“When determining whether the material provided by a content provider constitutes ‘terrorist content’ as defined in this Regulation, account should be taken, in particular, of the right to freedom of expression and information, including the freedom and pluralism of the media, and the freedom of the arts and sciences. Especially in cases where the content provider holds editorial responsibility, any decision as to the removal of the disseminated material should take into account the journalistic standards established by press or media regulation in accordance with Union law, including the Charter.” (TCO Regulation, 12)

If HSPs receive a removal order, the content has already been classified as terrorist by the competent authority. Nevertheless, in two scenarios it is useful for HSPs themselves to be able to classify content as terrorist or non-terrorist:

  1. If an HSP has not yet received an official removal order but wants to act proactively and/or enforce compliance with its own ToS in respect of suspicious content.
  2. If an HSP receives an official removal order but has doubts about the competent authority’s assessment of the content as terrorist and wants to review the order (as a first step towards a legal remedy).

Below you will find various forms of practical assistance to classify content as terrorist or non-terrorist.

National and international designation lists that name terrorist organisations provide a good framework and reference point for content classification. For example, adopting designation lists as the basis for banning or blocking content, and referencing them in the ToS (chapter 1), gives a platform’s sanctions the backing of the law. Paragraph 11 of the TCO Regulation explicitly suggests that the European Union list may be used when assessing content.
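The lookup against a designation list can be sketched as follows. This is a minimal illustration, not an implementation of any official procedure: the list entries and helper names are invented placeholders, and a real deployment would load the EU list (and any applicable national lists) from an authoritative, regularly updated source.

```python
import re
import unicodedata

# Illustrative placeholder entries only; real entries come from the
# official designation lists referenced in the ToS.
DESIGNATION_LIST = {"example terror group", "sample militant front"}

def normalise(text: str) -> str:
    """Lower-case, strip accents and collapse whitespace for robust matching."""
    text = unicodedata.normalize("NFKD", text)
    text = "".join(c for c in text if not unicodedata.combining(c))
    return re.sub(r"\s+", " ", text.lower()).strip()

def designated_mentions(content: str) -> list[str]:
    """Return designation-list entries that appear verbatim in the content."""
    norm = normalise(content)
    return sorted(entry for entry in DESIGNATION_LIST if entry in norm)
```

A match does not make content illegal by itself (a news report may name a listed organisation), but it is a strong signal that a human reviewer should look at the item against the TCO definition.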

People who manually review content should not only have the definition of terrorist content readily available but should also be conversant with, and keep up to date on, keywords and phrases typically used by terrorist organisations. The more familiar a reviewer is with a terrorist phenomenon (e.g. right-wing, left-wing, Islamist), the easier it is to recognise obfuscation tactics such as so-called ‘dog whistling’, i.e. the use of seemingly harmless words that carry ideological meanings within the scene. A database of logos and symbols, i.e. visual elements that indicate a terrorist background, is equally indispensable for assessing content accurately and at pace.
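A phrase book maintained by phenomenon experts can feed a simple screening aid that tells reviewers why an item was flagged. The sketch below assumes invented placeholder phrases and categories; it is illustrative only, and any real term list would be curated and regularly reviewed by specialists.

```python
# Placeholder phrase book: phrase -> category. Entries are invented
# examples, not real extremist terminology.
PHRASE_BOOK = {
    "sample slogan": "explicit",               # openly extremist phrase
    "harmless-sounding term": "dog_whistle",   # coded in-scene meaning
}

def screen(text: str) -> dict[str, list[str]]:
    """Group matched phrases by category so reviewers see the reason for a flag."""
    hits: dict[str, list[str]] = {}
    lowered = text.lower()
    for phrase, category in PHRASE_BOOK.items():
        if phrase in lowered:
            hits.setdefault(category, []).append(phrase)
    return hits
```

Separating ‘explicit’ hits from ‘dog whistle’ hits matters in practice: the latter usually need a reviewer familiar with the relevant scene before any action is taken.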

Moderation should consider the context in which content is likely to have been published. Relevant contextual factors include but are not limited to (a) political conditions, (b) current events, including recent developments in the news, and (c) cultural circumstances that may shape views on certain issues. This is often difficult to judge – especially when users are anonymous, and the content they post contains few or no textual clues to their identities – but in some cases, when external factors are referenced in content, context considerations turn out to be helpful.

Additionally, when dealing with content that is not terrorist but harmful in another way (e.g. different forms of hate speech), the following points may be considered.

Terrorist, extremist and other harmful content, such as hate speech or incitement to violence – especially when it incites the commission of terrorist offences – can cause significant harm. When assessing content, it can therefore be useful to consider the extent to which the content may contribute to such harm. The greater the potential damage, the faster and more deliberately action must be taken. Adverse psychological effects on those affected by the harmful content should be considered a form of damage.
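The principle that greater potential damage calls for faster action can be mirrored in how a review queue is ordered. The following sketch is a hedged illustration: the category names, scoring weights and `reach` field are assumptions made for this example, not part of the Regulation or any particular platform’s system.

```python
from dataclasses import dataclass

# Assumed severity weights for illustration only.
SEVERITY = {"incitement_to_terrorism": 3, "hate_speech": 2, "other_harm": 1}

@dataclass
class Report:
    item_id: str
    category: str
    reach: int  # e.g. number of views so far (assumed proxy for exposure)

    def harm_score(self) -> int:
        # Greater potential damage -> higher priority in the queue.
        return SEVERITY.get(self.category, 0) * max(self.reach, 1)

def triage(queue: list[Report]) -> list[Report]:
    """Return the queue with the most potentially harmful items first."""
    return sorted(queue, key=lambda r: r.harm_score(), reverse=True)
```

However a platform weights its categories, the point is the same as in the text above: ordering review work by potential harm gets the most damaging items in front of moderators first.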

The TCO Regulation calls for a sensitive balancing of the fundamental rights enshrined in the Charter (e.g. freedom of expression and information) when considering content removals. If removal has not been ordered by a competent authority, there are other ways in which harmful content can be handled – examples of which can be found in chapter 3.