Module 4: Creating Human Rights Safeguarding Processes

Key Recommendations

The recommendations of this toolkit emphasise the foundational tenets of strong safeguards, proportionality of response, and transparency in policy and processes.

The key recommendations set out below are intended to be extensive. They offer an ideal roadmap for a platform to fully build its human rights commitment into its counterterrorism and violent extremism practices, and are listed sequentially to indicate which steps to take and in which order. Tech Against Terrorism acknowledges that some of these recommendations may be resource-intensive and therefore difficult to achieve for small and medium platforms. For this reason, we identify in bold type the recommendations to prioritise if resources are limited. We also offer a range of human rights and online counterterrorism support services for tech companies; for more information, please reach out at [email protected].

  1. Include an explicit commitment to human rights and freedom of expression in your Content Standards, including a direct reference to the UNGPs, and how this commitment informs your CTVE efforts.
  2. Acknowledge how your CTVE and content moderation practices may impact human rights and consider all rights set out in the Universal Declaration of Human Rights equally.
  3. Provide an explainer of the internal policy used to assess Terrorist/Violent Extremist (TVE) content on your service, including your tiered TVE listing if applicable. If you use such a listing, explain the criteria for inclusion, with specific references to national and international terrorist designation lists and to expert resources from counter violent extremism organisations.
  4. Detail your platform’s CTVE commitment and approach, with reference to specific measures intended to safeguard human rights.
  5. Explain clearly both what constitutes prohibited content and how this is moderated.
  6. Refer to international law and international human rights standards, such as the UDHR, ICCPR, and the Rabat Plan of Action, when setting content limitations which may impact freedom of expression.
  7. Include an exemption to your CTVE prohibition to allow for content that is journalistic or reporting on a terrorist organisation.
  8. Ensure sufficient language resources are dedicated to enforcing your CTVE policy without infringing on freedom of expression.
  9. Ensure that responses to violations of content standards are proportionate, including by considering different enforcement actions according to the severity of the violation.
  10. Ensure that moderation activities, in particular proactive moderation, respect the fundamental right to privacy online – in particular if offering private and end-to-end encrypted services.
  11. Provide clear guidelines on how to submit an appeal, should a user wish to contest a moderation decision. User appeals are a critical human rights safeguard, offering an avenue for redress should users’ rights be infringed.
  12. Consider the human rights implications when purchasing or developing CTVE and content moderation tools. Ensure regular human review of tools to limit the impact on freedom of expression.
  13. Publish regular, detailed, and clear transparency reports, including both the platform’s own efforts to moderate content and referrals made to law enforcement.
  14. Provide detailed metrics in transparency reports on the user appeals received and resolved.
  15. Develop and publish the procedures by which you preserve content, to ensure that content that may constitute evidence of war crimes and human rights violations is duly archived.
  16. Have an accountability and oversight framework in place, and work with third party organisations, including counterterrorism experts, digital rights advocates, and local civil society groups, to review your CTVE policy and moderation guidelines on a regular basis.

To mitigate the human rights risks associated with counterterrorism and content moderation practices, Tech Against Terrorism recommends incorporating the principles set out above into your counterterrorism practices. The spotlight below examines two of these safeguards in more depth: user appeals and redress, and the archiving of removed content.

Spotlight on: Appeals, Redress & Archiving

User appeal

Article 8 of the Universal Declaration of Human Rights (UDHR) states that “everyone has the right to an effective remedy by the competent national tribunals for acts violating the fundamental rights granted him by the constitution or by law,” with the United Nations Guiding Principles on Business and Human Rights stating that companies should “provide for or cooperate in their remediation through legitimate processes” when associated with adverse human rights impacts.

User appeal processes permitting moderation decisions to be contested by users are a vital accountability practice and human rights safeguard. They are also an essential component of the due process that must be followed when restricting a right, for example when restricting the right to freedom of expression by moderating content.

Tech Against Terrorism recommends tech companies:

  • Include an appeal process, clearly signposted in the Content Standards (including Community Guidelines and transparency reports), so that users can contest decisions they believe are not justified.
  • List in the Content Standards which moderation decisions, if any, cannot be appealed, and whether this is due to the type of moderation enforcement (e.g., a warning for a first offence, restricted ability to use the service, permanent suspension of the account) or to the nature of the content in violation.
  • Ensure users are notified when their content/account is actioned, with the notifications explaining the Content Standards which have been breached, or the legal basis for removing content, as well as whether an appeal is available.
  • Allow users to provide supplementary information in support or defence of a complaint if necessary.
  • Triage appeals according to potential human rights impact, with appeals against content removed for TVE violations given priority (an illustrative sketch of such triage follows this list).
  • Provide contextualised statistics on user appeals in transparency reports.
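As a purely illustrative aid, the sketch below shows one way the notification and triage recommendations above could be modelled. All class names, fields, and priority values are our own assumptions for this toolkit, not any particular platform’s implementation.

```python
# Illustrative sketch only: a minimal model of moderation notices and appeal
# triage. All names, fields, and priority values are hypothetical assumptions.
from dataclasses import dataclass
from datetime import datetime
from enum import IntEnum


class AppealPriority(IntEnum):
    """Lower value = reviewed sooner."""
    TVE_REMOVAL = 1         # content removed under the CTVE policy
    ACCOUNT_SUSPENSION = 2  # account-level enforcement actions
    OTHER_VIOLATION = 3     # all other Content Standards violations


@dataclass
class ModerationNotice:
    """What a user should be told when their content or account is actioned."""
    content_id: str
    standard_breached: str          # the specific Content Standard cited
    legal_basis: str | None = None  # set when removal follows a legal requirement
    appeal_available: bool = True


@dataclass
class Appeal:
    notice: ModerationNotice
    submitted_at: datetime
    user_statement: str = ""        # supplementary information from the user
    priority: AppealPriority = AppealPriority.OTHER_VIOLATION


def triage(appeals: list[Appeal]) -> list[Appeal]:
    """Order appeals so that higher human rights impact (e.g. TVE removals)
    is reviewed first, breaking ties by submission time (oldest first)."""
    return sorted(appeals, key=lambda a: (a.priority, a.submitted_at))
```

In practice, platforms will define their own priority tiers and review workflows; the point of the sketch is simply that appeals can be queued by potential human rights impact rather than in strict arrival order.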
Meta’s Oversight Board

Meta, then Facebook, announced in 2018 that it would set up an independent body to decide on complex moderation issues raised by user-generated content on both Facebook and Instagram. The Oversight Board was announced a year later, in September 2019, and its first members were appointed in 2020. The Board began accepting cases in October 2020.

The goal of the Board is to “protect free expression by making principled, independent decisions about important pieces of content and by issuing policy advisory opinions on Facebook’s content policies.” The Board is set up as a final avenue of appeal for users who wish to contest the removal of their content. In selecting and handling cases, the Board focuses on cases that have a significant impact on online freedom of expression and public discourse, that have substantial real-world impact, or that “raise questions about current Facebook policies”.

The bulk of the Oversight Board’s decisions refer to relevant human rights standards such as the UN Guiding Principles on Business and Human Rights. Drawing on the UNGPs, the following international human rights standards were considered in the Oversight Board’s cases relating to dangerous individuals and organisations: the right to freedom of expression, the right to non-discrimination, the right to life, and the right to security of person.

To read more about the Board and its decisions, see here: https://www.oversightboard.com/

Archiving – Preserving Content for Human Rights Considerations

To support appeal processes, tech companies should ensure that they have the necessary archiving systems in place so that removed content can be reviewed and reinstated if an appeal is upheld. Archiving is therefore important for appeal processes, but also for administrative or judicial review where content is removed as part of legal and regulatory requirements.

An example of this is the EU Terrorist Content Online Regulation (TCO), which requires tech companies to preserve removed content for six months (see the illustrative sketch after the list below) for:

  • Administrative or judicial review, or complaint handling,
  • The prevention, detection, investigation, and prosecution of terrorist offences.
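As a purely illustrative aid, the sketch below shows how a preservation record with a six-month retention period might be represented. The field names, the 183-day figure, and the storage mechanism are assumptions made for illustration; the TCO expresses the requirement simply as six months from removal.

```python
# Illustrative sketch only: a minimal preservation record for removed content,
# assuming a six-month retention period in line with the EU TCO Regulation.
# Field names and the storage mechanism are hypothetical assumptions.
from dataclasses import dataclass
from datetime import datetime, timedelta, timezone

RETENTION_PERIOD = timedelta(days=183)  # roughly six months, per the TCO requirement


@dataclass
class PreservationRecord:
    content_id: str
    removed_at: datetime
    removal_reason: str         # e.g. a CTVE policy violation or a legal/regulatory basis
    archive_uri: str            # where the removed content is securely stored
    potential_evidence: bool = False  # flag possible war crimes / human rights evidence

    def retention_expires(self) -> datetime:
        """Earliest date at which the archived copy may be deleted, unless it is
        flagged as potential evidence or subject to an ongoing review or appeal."""
        return self.removed_at + RETENTION_PERIOD


# Example usage with hypothetical values:
record = PreservationRecord(
    content_id="example-123",
    removed_at=datetime.now(timezone.utc),
    removal_reason="CTVE policy violation",
    archive_uri="archive://quarantine/example-123",
)
print(record.retention_expires())
```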

Archived content is also critical to preserving potential evidence of war crimes and human rights violations. This is particularly relevant when removing terrorist content, as such content can be used to document abuses and hold perpetrators to account. An example of the important work of archiving war crimes evidence is that of Mnemonic, whose mission is to “help human rights defenders effectively use digital documentation of human rights violations and international crimes to support advocacy, justice and accountability.”

Tech Against Terrorism recommends tech companies:

  • Publicly state their position on preserving content and relevant procedures in their Content Standards, with the explanation that removed content may be needed at a later date to comply with administrative or judicial reviews, or to support the investigation of terrorist offences or human rights violations and war crimes.
  • Build up capacity to understand local context and to minimise the possibility of incorrect deletion. This is particularly relevant in conflict zones.