
Content moderation on social media: Whose responsibility is it really, platforms or users?


Content moderation on social media is generating a global debate about who bears responsibility for what is posted: the platform hosting the content or the users themselves. Social media are services for sharing information, managed by third parties and made available to the general public. The debate is a difficult one because of the controversies surrounding existing regulation, and because it touches on fundamental rights such as freedom of expression and privacy on internet platforms such as social networks.

Evolution and challenges

Content moderation was initially driven by users' rejection of advertising, spam or promotions that did not interest them or that they simply found unpleasant. The need for moderation therefore arose, in the first place, from the desire to keep platform users satisfied.

The current debate stems from the challenges posed by disinformation spread through these platforms, as well as hate speech and other kinds of harmful content promoted on networks such as Facebook, Instagram or Twitter.

Moderation plays a key role on social media because of the sheer volume of information, opinions and audiovisual content exchanged. It is essential to ensure compliance with existing laws that protect fundamental rights such as privacy, non-discrimination and personal integrity, while also applying policies consistent with how the platforms are used.

The role of platforms

The different regulations that exist on this matter assign different responsibilities to platforms. On the one hand, under Section 230 of the US Communications Decency Act, platforms are not liable for content posted by third parties, and the provision encourages moderation without requiring the removal of specific content hosted on the platform. The same section states that 'No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider'.

In contrast to the Communications Decency Act, the UK's Online Safety Act makes it clear that platforms have a duty to be diligent and proactive in analysing and mitigating any risk posed by content deemed illegal. Under this approach, responsibility for moderation lies solely with the platforms.

Along the same lines as the British law, the EU's Digital Services Act imposes an obligation to assess risks and adopt risk mitigation measures. It also insists on transparency and clarity in platforms' terms and conditions of use, and on the mechanisms made available to users to challenge decisions taken by the platforms.

It follows from the above that platforms must act in a way that guarantees human rights, fair processes and clear policies. All of these regulatory texts aim to achieve responsible moderation of content on social networks, alongside the protection of freedoms in the digital sphere.

User management

Users of internet platforms are indeed responsible in their role as disseminators or sharers of the information hosted there. They may be held liable where, by sharing unlawful content such as insults or defamatory comments, they fail to act prudently or with the diligence required.

Recent judgments of the European Court of Human Rights, as well as of the Supreme Court in Spain, have found that mismanagement of defamatory comments on platforms may entail civil liability for those who authored them. This liability arises when users fail to meet the obligations that come with interacting on social media, for example by not taking responsibility for what they post or by promoting disinformation.

Users should also be as transparent as possible about their identity, so the ability to identify them is crucial. Whether identifying information is disclosed in the public interest or to protect the safety of other members is a separate matter. In some cases, users rely on anonymity as a means of ensuring that they can express themselves freely.

Content moderation therefore involves both parties: platforms and users. Platforms must comply with policies that protect fundamental rights, but users must also act diligently and share information responsibly. Current regulations in different countries take different approaches to the responsibility of platforms: some seek a balance between immunity and liability, while others determine that it is the users who should be responsible in the sphere of digital platforms.
