2 March 2021

The DSA: Part #06 – Effective, uniform reporting procedure for illegal content

This article is the sixth instalment of a series on the Digital Services Act (DSA), with which the European Union seeks to set out new rules for the internet.


The way in which YouTube, Instagram, Twitter, TikTok and other large platforms currently filter and moderate content restricts our freedom of expression. Hate-filled content remains online too often, whilst legitimate posts, videos, accounts and advertising are too frequently removed or blocked. Those affected are hard-pressed to defend themselves. This creates collateral damage online, which we can prevent by imposing strong rules for everyone through the new Digital Services Act.


An EU-wide, harmonised reporting channel (Notice and Action) would offer users, civil society, journalists and fact-checkers an effective means of combating illegal content. The aim must be the swift reporting and prosecution of illegal content, and the deterrence and prosecution of perpetrators, rather than the mere deletion of their comments. A future joint mechanism should comprise a standard form that supports users simply and effectively and can be found quickly on the platforms.
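
To illustrate, here is a minimal sketch of the data such a standard form might collect, loosely following the notice elements listed in Article 14(2) of the Commission's proposal (reasons, exact URL, notifier contact details, good-faith statement). The type and field names are my own illustrative assumptions, not anything prescribed by the draft law:

```typescript
// Illustrative sketch of a harmonised Notice and Action form.
// Field names are hypothetical; the elements loosely follow the
// notice requirements of Article 14(2) of the Commission's proposal.
interface IllegalContentNotice {
  reasons: string;             // why the notifier considers the content illegal
  contentUrl: string;          // exact electronic location (URL) of the content
  notifierName?: string;       // optional here, anticipating anonymous reporting
  notifierEmail?: string;      // Article 14(2)(c) currently makes contact details mandatory
  goodFaithConfirmed: boolean; // statement that the notice is accurate and complete
}

// A platform could accept every notice through one uniform entry point.
function submitNotice(notice: IllegalContentNotice): void {
  if (!notice.reasons || !notice.contentUrl) {
    throw new Error("A notice must state reasons and the exact URL.");
  }
  console.log(`Notice received for ${notice.contentUrl}`);
}
```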


I am absolutely convinced that participatory processes can help us as political decision-makers to draft better laws and to incorporate the latest scientific findings into our legislation. As part of the “My Content, My Rights” campaign, the Greens/EFA launched a consultation process on an EU-wide uniform reporting channel and gathered valuable input from civil society, industry, academia and internationally acknowledged experts. Our proposed law can be found here.


It is pleasing that the European Commission has incorporated several of these proposals into Articles 14 and 15 of the Digital Services Act. Overall, however, the Commission's approach is not consistent enough.


My internet of the future:


  • Online platforms will be obliged to establish an easily accessible reporting channel (Notice and Action) and to describe that mechanism clearly and comprehensibly in their terms and conditions. The channel must also be clearly visible on the platform, ideally directly alongside the content being reported.
  • In our democratic societies, only judges may decide what does or does not conform with applicable law. Private companies must no longer be allowed to assume the role of online police and courts.
  • Users should in future be given the opportunity to report infringements anonymously. This would protect victims from backlash and targeted hate campaigns. That would help women and minorities in particular, and I shall continue to fight for them. The European Commission's bill does not yet provide for this option: Article 14(2)(c) instead requires notifiers to give their name and email address.
  • Experience with the German Network Enforcement Act has shown that large US platforms largely disregard reports of infringements of a Member State's laws, because they give preference to their own T&Cs or community standards. Review under the legal reporting mechanism should therefore take priority over any check for breaches of T&Cs.
  • The new procedure provides the option of lodging a complaint and defending oneself before the platform intervenes, unless the content is clearly illegal and its continued visibility would harm the public (child abuse material, incitement to racial violence, etc.).
  • The current approach to deleted illegal content seems to be ‘out of sight, out of mind’, which allows perpetrators simply to continue elsewhere on the internet. To prevent this in future, platforms must report serious crimes to the national authorities so that they are properly investigated and carry appropriate consequences (see the sketch after this list).
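
Taken together, these points suggest a processing order along the following lines. This is a rough sketch under my own reading; every name and the exact control flow are illustrative assumptions, not text from the DSA:

```typescript
// Hypothetical decision flow combining the points above.
// All names and the branching are illustrative, not quoted from the DSA.
interface Notice {
  contentUrl: string;
  manifestlyIllegal: boolean; // e.g. child abuse material, incitement to racial violence
  severeCrime: boolean;       // triggers a report to national authorities
}

function processNotice(notice: Notice): string {
  // Legal review takes priority over any check against the platform's own T&Cs.
  if (notice.manifestlyIllegal) {
    // Clearly illegal content is removed without waiting for a counter-statement.
    if (notice.severeCrime) {
      return "content removed; case reported to national authorities";
    }
    return "content removed";
  }
  // In all other cases the uploader may lodge a complaint and defend the
  // content before the platform intervenes; legality is ultimately for a judge.
  return "uploader notified; counter-statement invited before any action";
}

// Example: an ordinary, non-manifest case leads to a counter-statement first.
console.log(processNotice({
  contentUrl: "https://example.com/post/1",
  manifestlyIllegal: false,
  severeCrime: false,
}));
```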

