18 January 2022

How will the DSA change the world?

The Digital Services Act will become the new digital constitution for Europe. It initiates Europe-wide changes, not only in reporting procedures for illegal content and user rights but also with its systemic measures such as risk assessments, independent audits, and data access, which will allow us to analyse hate speech, disinformation, and radicalisation on platforms in an evidence-based manner. After all, large digital platforms now play a key role in our democracy and must therefore be compatible with democracy. Europe has achieved a fine balance between ensuring law enforcement and tackling systemic risks without arbitrarily interfering in legal content and thereby endangering freedom of expression.

The most important achievements that we fought for as the Greens are:

Consistent law enforcement regarding illegal content

Online services must immediately take measures against illegal content when ordered to by national judicial or government authorities and provide information about users upon request. At the same time, we were able to strengthen user rights, as affected persons can now also apply to authorities in their home country for a preliminary injunction.

Users are granted more rights

With the DSA, we have created the first binding legal framework to enshrine the rights of users and the handling of illegal content. Uniform reporting procedures for illegal content and other content in breach of T&Cs simplify such handling. Citizens now have recourse to internal complaint and external arbitration procedures to defend themselves against the arbitrary deletion of content and blocking of accounts.

Protection from manipulation

‘Dark patterns’ are design tricks that push users into decisions they would not otherwise make. For instance, the option for consenting to cookies is positioned in a prominent, large, and directly clickable way in cookie banners, whilst the option for rejecting cookies is hidden in the main text and can only be reached via several clicks. Or users are hassled repeatedly even though they have already objected to the use of cookies. That is now history. In the future, buttons will have to be designed fairly so that users have an actual choice, and settings configured once must be respected, including via automatic browser signals. With this, we will turn the internet into a place of trust again.

No more advertising at the cost of children and teenagers

The DSA sets new standards in privacy protection because we can better restrict the collection of our personal data. In the future, platforms will no longer be allowed to spy on children and teenagers in order to show them tailored advertising and content. In practical terms, that means that Instagram, for example, will no longer be able to target content which glorifies anorexia at young girls. This is a first key success in tackling the large platforms’ business model, namely surveillance advertising. I shall continue to fight to protect all people from tracking and profiling.

Risk assessments and independent audits

In the future, very large platforms will have to carry out risk assessments where they estimate the risks posed by their technology, their business model, and their algorithmic systems. The risks to be assessed include the dissemination of illegal content and content which breaches T&Cs, as well as risks to human dignity, data protection, freedom of expression, media diversity, non-discrimination principles, gender equality, child protection, and consumer protection.

Independent audits

For the first time, independent organisations will audit, at least once a year, whether platforms comply with the obligations arising from the DSA and the codes of practice. The auditors will be granted access to the data they require.

Access to data for researchers and NGOs

The new Act shall give researchers and non-governmental organisations (NGOs) the opportunity to look under the hood of the large platforms such as Google, Facebook, and YouTube for the first time. We Greens fought for access for NGOs. In the future, they will gain many new insights, such as how purely profit-driven algorithms disseminate disinformation. By analysing these dissemination mechanisms, we can develop measures against radicalisation in public discourse. We will then no longer rely on whistle-blowers; instead, we can gain the knowledge needed to define better rules to protect our democracy ourselves.

Effective measures against abuse on porn platforms

Upon the initiative of the Greens, Parliament shall introduce a completely new article regarding image-based sexual abuse on porn platforms. For years, intimate images have been illegally published without the consent of the depicted women, causing severe damage to the affected people and restricting women’s freedom. Content must now be professionally moderated by qualified staff, and victims have the right to contact platforms anonymously to request that their images be removed. Furthermore, uploaders on porn platforms will have to verify themselves with a phone number and email address. We are hereby taking effective steps against revenge acts by ex-partners who publish nude images (sometimes complete with the depicted person’s name and address), against covert recordings, e.g. at festivals, and against images which are used specifically to blackmail journalists and politicians.

More protection for editorial content

Media companies and public broadcasters in Europe rightly complain that their content is subject to at times arbitrary measures on online platforms because it breaches the T&Cs or simply falls foul of upload filters. To ensure media diversity and freedom on the internet, I fought for provisions in the DSA which restrict the arbitrary removal of legal content by online platforms, for instance with the option of counter-notification before content is deleted. Plenary shall vote this week on an amendment tabled by us in this regard. We do not, however, support a general exception which wholly exempts media content from fact-checking and assessment for compliance with T&Cs. Such a regulation would open the floodgates for disinformation, because the recommendation mechanisms currently used by the large platforms disseminate disinformation much more readily than facts.
