Censorship on social media

On 2 September 2020, Belgian Member of the European Parliament (MEP) Tom Vandendriessche of the Identity and Democracy Group posed a written parliamentary question to the European Commission:

“The fact that a succession of posts has been removed from my political page by Facebook raises a number of issues about the arbitrary exercise of power by major social media platforms. Over the last few years, Facebook – among others – has developed into a global player. Facebook’s monopoly is such that users choose the platform not so much because of its superior user interface, but because it is used globally and across the board. More than 7.5 million Belgians have a Facebook account. Politicians therefore use Facebook to reach out to their voters. Accordingly, I have the following questions:

1. Given the widespread use of the platform, to what extent is Facebook still in the private domain?

2. Should the content of postings by elected representatives for so large an audience be determined by a private entity or by national courts?

3. Can the Commission force Facebook to exhibit greater transparency as regards terms of use and violations thereof, and by whom precisely is the decision-taking process coordinated?”

On 19 January 2021, Internal Market Commissioner Thierry Breton responded on behalf of the European Commission, stating: “The Commission is attentive to the challenges emerging in the environment of online platforms and in 2018 proposed a set of guidelines on the actions expected from online platforms for tackling online disinformation, as well as a recommendation on the dissemination of illegal content uploaded by their users, while ensuring a high level of protection of freedom of expression online.

This includes a series of due diligence measures and, ultimately, timely information on content removal and ensuring that effective redress is available to users online.

In addition to illegal content, online platforms’ terms of service can indicate specific types of content considered as undesirable or objectionable and set out policies on the removal or disabling of access to any content that they store. For video-sharing platforms, the revised Audiovisual Media Services Directive foresees due diligence obligations to protect users from certain types of harmful audiovisual content while safeguarding freedom of expression.

On 15 December 2020, the Commission proposed a new Digital Services Act, which also provides rules on how companies can enforce their terms of service in a transparent and non-discriminatory manner, with respect for freedom of expression and information.

Further, the proposal includes a clear set of due diligence obligations for online platforms, including notice-and-action procedures for illegal content, redress, transparency and accountability measures as regards content moderation policies, and cooperation obligations.”

Source: https://www.europarl.europa.eu/doceo/document/E-9-2020-004760_EN.html

Photo Credit: https://pixabay.com/fr/photos/facebook-mobile-smartphone-3021068/