Content Moderation and Data Protection

It’s important that platforms remain safe and appropriate for their users. To achieve this, organisations usually need to moderate the user-generated content they host, ensuring that what is published on their platform is suitable and harmless. Otherwise, a platform can rapidly devolve into chaos, with explicit content, harassment, and all manner of undesirable material running amok.

Content moderation comes with responsibilities

Content moderation also carries data protection considerations that many organisations may not have realised. Moderating user-generated content means processing personal information in order to determine the nature of that content, so organisations doing so must ensure they are carrying this out appropriately and lawfully.

The Information Commissioner’s Office (ICO) has recently published guidance on this topic. In this article, we’ll go over that guidance and the action your organisation may need to take.

What does the ICO say about this?

Content moderation is a type of processing that is likely to result in a high risk to people’s rights and freedoms. To help organisations manage that risk, the ICO has identified a number of steps they need to take to comply with data protection legislation.

Keep in mind that this list is not exhaustive; the full guidance is available on the ICO’s website.

The steps set out in this guidance are:

  • A data protection impact assessment (DPIA) must be carried out prior to the processing.
  • You must identify a lawful basis before you start using personal information in your content moderation system. 
  • You must ensure that personal data is only processed in ways that people would reasonably expect, and that the processing does not have unjustified adverse effects on them.
  • You must ensure that your content moderation systems perform accurately and produce unbiased, consistent outputs.
  • You must ensure that any technologies you use to process personal information are sufficiently statistically accurate and avoid discrimination (a minimal sketch of such checks follows this list).
  • You must tell people, in a way that is accessible and easy to understand:
    • How you use their information
    • What decisions you make using their information
    • How they can exercise their data protection rights
  • If your content moderation involves solely automated decision-making based on personal information that has a legal or similarly significant effect on the people involved, then you must explain: 
    • that you use automated decision-making processes;
    • what information you use; 
    • why you use it; and 
    • what the effects on them might be.
  • You must be clear from the outset:
    • Why you are using personal information for content moderation; and
    • What you intend to do with that information.
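
To make the accuracy and bias steps more concrete, here is a minimal sketch of the kind of checks an organisation might run on its moderation system’s outputs. The data, group labels, and metrics below are invented for illustration; the ICO’s guidance does not prescribe any particular test.

```python
# A minimal sketch of the accuracy and fairness checks implied by the
# steps above. All data, group labels, and thresholds are invented.

def false_positive_rate(labels, predictions):
    """Share of non-violating items that were wrongly flagged."""
    negatives = [p for l, p in zip(labels, predictions) if l == 0]
    return sum(negatives) / len(negatives) if negatives else 0.0

# labels: 1 = content actually violates policy, 0 = it does not
# predictions: 1 = system flagged it, 0 = it did not
labels      = [1, 0, 0, 1, 0, 0, 1, 0]
predictions = [1, 0, 1, 1, 0, 1, 1, 0]
groups      = ["a", "a", "a", "a", "b", "b", "b", "b"]

accuracy = sum(l == p for l, p in zip(labels, predictions)) / len(labels)
print(f"overall accuracy: {accuracy:.2f}")

# Compare false positive rates across groups: a large gap suggests the
# system flags one group's lawful content more often than another's,
# which is the kind of discriminatory outcome the guidance warns against.
for g in sorted(set(groups)):
    ls = [l for l, grp in zip(labels, groups) if grp == g]
    ps = [p for p, grp in zip(predictions, groups) if grp == g]
    print(f"group {g}: false positive rate {false_positive_rate(ls, ps):.2f}")
```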

Automated decision-making in content moderation

Many content moderation systems involve automated decisions, sometimes solely, using algorithms to flag and take down unsuitable or inappropriate content.

Google, for example, uses “a combination of Google’s AI and human evaluation” to detect and remove ads that violate its policies or could cause harm to its users and the Google Ads ecosystem. While a number of organisations combine automated moderation and human involvement like this, that isn’t the case for all. It’s important to ascertain which category your organisation’s content moderation system falls under: is there an element of human intervention, or are decisions left entirely to the algorithm?
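
The distinction matters because it determines whether the rules on solely automated decision-making apply. The sketch below shows, in outline, how a hybrid pipeline might route content between automated action and human review; the classifier scores, thresholds, and decision structure are hypothetical assumptions, not a description of any real system.

```python
from dataclasses import dataclass

# Illustrative thresholds -- a real system would tune these per policy area.
AUTO_REMOVE_THRESHOLD = 0.95
HUMAN_REVIEW_THRESHOLD = 0.60

@dataclass
class ModerationDecision:
    action: str      # "remove", "human_review", or "allow"
    automated: bool  # True when no human is involved in the final decision
    score: float     # classifier confidence that the content violates policy

def route_content(violation_score: float) -> ModerationDecision:
    """Route content based on a (hypothetical) classifier's violation score.

    Decisions taken above AUTO_REMOVE_THRESHOLD are solely automated --
    the category that Article 22 constrains when the effect on the person
    is legal or similarly significant.
    """
    if violation_score >= AUTO_REMOVE_THRESHOLD:
        return ModerationDecision("remove", automated=True, score=violation_score)
    if violation_score >= HUMAN_REVIEW_THRESHOLD:
        # Sending the item to a human reviewer means the final decision is
        # not solely automated, provided the review is meaningful.
        return ModerationDecision("human_review", automated=False, score=violation_score)
    return ModerationDecision("allow", automated=True, score=violation_score)

if __name__ == "__main__":
    for score in (0.98, 0.75, 0.20):
        print(route_content(score))
```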

How the GDPR applies to content moderation systems

If your content moderation does use automated decision-making, then Article 22 of the GDPR applies.

It sets out that you may only make solely automated decisions that have legal or similarly significant effects if they are:

  • authorised by domestic law;
  • necessary for a contract; or
  • based on a person’s explicit consent.

It is up to you to determine whether one of these exceptions applies to your organisation, and you should document and justify which part of the legislation authorises your use of solely automated decision-making.
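
As a purely illustrative sketch of that documentation duty, a solely automated decision could be recorded together with the Article 22 condition relied upon. The field names and values below are assumptions for the example, not a format required by the ICO or the GDPR.

```python
import json
from datetime import datetime, timezone

# Hypothetical audit record -- the field names are illustrative, not
# prescribed by the guidance. The point is to document which Article 22
# condition authorises the solely automated decision, and why.
decision_record = {
    "decision_id": "mod-2024-000123",
    "timestamp": datetime.now(timezone.utc).isoformat(),
    "solely_automated": True,
    "effect": "account suspension",  # a legal or similarly significant effect
    "article_22_condition": "necessary for a contract",
    "justification": (
        "Automated enforcement of the terms of service is necessary to "
        "operate the platform at scale; a manual-only process could not "
        "deliver the contracted service."
    ),
}

print(json.dumps(decision_record, indent=2))
```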

What to consider

The advice above is by no means exhaustive; it serves only as a brief overview of the ICO’s guidance. We highly recommend reading the guidance in full if it may be relevant to your organisation’s business activities.

Using a content moderation system that processes personal information imposes a number of obligations on your organisation. It’s crucial that you understand what these are so that you can moderate content lawfully and responsibly.

How can DPAS help?

At DPAS, we have many services that can support you as an organisation when it comes to making big decisions surrounding data protection compliance.

If you’d like to talk to us more about how we can help, either give us a call on 0203 3013384 or send us an email at info@dataprivacyadvisory.com – or fill in a contact form and we’ll get in touch with you.
