The DSA deal: A gamechanger for our lives, society and the internet
How will the DSA change the world?
In the fifth round of negotiations between the Parliament, the European Commission and the Council of the 27 Member States, the Digital Services Act (DSA) was finalised. After 17 hours of negotiations, there is a political agreement on the DSA. This law will kick-start the Big Tech revolution. It will curb surveillance advertising and manipulative practices by online platforms, it will curb hate, hate speech and disinformation, it will strengthen users’ rights and it will hold online platforms accountable like never before. The DSA will change our lives, society and the internet – for example, by giving us clear rules to defend ourselves against polarisation and the attention-grabbing business model built on spreading fake news, violent videos and hateful comments.
These are the agreements in detail:
Clear rules for illegal content and strong user rights: In future, orders by judicial and administrative authorities of the Member States against illegal content (Article 8) and requests for information (Article 9) must be implemented without undue delay by all online services. This creates legal certainty on the net.
All online services must have a contact point or legal representative in the EU. This also applies to services like Telegram. No one can operate on the market in Europe without complying with European law.
Stronger rights for users: Terms and conditions and community standards that set rules for content moderation must in future be applied in an objective and non-arbitrary way. Unequal treatment of the same content posted by different users, as is currently the case on Facebook, will thus become illegal (Article 12). Online services are also obliged to duly respect the fundamental rights of their users when applying and enforcing their terms and conditions.
Content moderation must be reported annually in machine-readable transparency reports (Article 13). Very large platforms must also disclose how many staff work on content moderation and how those staff are trained and supported (Article 33). This is a strong Greens/EFA success.
Users will have harmonised notification procedures across Europe to quickly and easily report potentially illegal content online (Article 14). But users will also finally get more rights. Mandatory information for users and complaints procedures with the platforms (Articles 15 and 17) as well as external dispute resolution procedures (Article 18) will help users enforce their rights and put an end to the arbitrariness in dealing with illegal content as well as wrongful take-downs and account blocks. As Greens/EFA, we have also campaigned for this.
Restricting advertisement tracking: Online advertising is one of the financial foundations of the internet. But today’s business model, which operates according to the principle of surveillance capitalism, means that large platforms create comprehensive data profiles on individuals that enable manipulation and control of entire populations. We pushed through a ban on profiling for advertising purposes based on sensitive data (e.g. political and sexual orientation, trade union membership, religion – Art. 24) and on the use of minors’ data for advertising purposes. This is an important first step in reducing data profiling. It also serves to curb disinformation, which is typically first disseminated in groups that are likely to believe and spread it.
Protection against manipulation: So-called “dark patterns” unfairly pressure internet users into making decisions – for example, by making the consent option in cookie banners large, prominent and directly clickable while hiding the option to reject cookies in the body text, reachable only after several clicks; or by pestering users over and over again even though they have already refused their consent. The DSA includes a ban on misleading design patterns (dark patterns, Art. 23a) that deceive or push users into consenting or buying. Unfortunately, the text was weakened in the final round of negotiations so that practices already covered by existing consumer protection and data protection legislation are excluded from this ban. However, the EU Commission can publish guidance documents explaining which specific practices are covered by the ban. With stronger regulation, we could have protected users from manipulation by cookie banners.
Online Marketplaces: Consumer associations keep uncovering unsafe and illegal activities on the internet, especially when it comes to the sale of dangerous products on online marketplaces. A whole new chapter of the DSA was created to introduce an obligation for marketplaces to identify all traders while preserving the anonymity of private users. Online marketplaces must make “reasonable efforts” to randomly check relevant databases for illegal products and generally “make reasonable efforts” to verify the identity and thus ensure traceability of traders.
Obligations for very large platforms (VLOPs): Platforms with over 45 million users in Europe have an impact on democracy because of their role in forming public discourse. Article 26 obliges very large platforms to carry out annual assessments of the risks that their design, including their algorithmic systems, and the functioning and use of their services pose to fundamental rights, human dignity, data protection, diversity of expression and media, non-discrimination, protection of minors and consumer protection. Any negative impact of platform services on public discourse and elections, on gender-based violence and also on the mental and physical well-being of users must be analysed by the platforms. Article 27 obliges platforms to address the identified risks.
Access to platform data for researchers and civil society: Very large online platforms play an increasingly important role in society because they influence opinions and public discourse with their enormous power. Access for researchers and civil society is therefore crucial to hold platforms accountable, to allow independent scrutiny and to understand how these platforms operate (Article 31).
Stronger enforcement powers: Regulators have strong enforcement measures at their disposal (Article 41 ff.). They can impose fines of up to 6 % of a platform’s global turnover. In the case of persistent infringements, periodic fines of up to 5 % can be imposed. They can also order interim measures.
The EU Commission will have central oversight of the rules specifically applicable to very large platforms to prevent an enforcement “traffic jam” in individual Member States. The Council and the EU Parliament have also agreed that very large platforms should share the financial burden of their own supervision by means of fees – an idea initially put forward by our Greens/EFA Group in the plenary vote in January 2022.
Defeat in the fight against abuse on porn platforms: For years, intimate images have been unlawfully published against the will of the women portrayed, with severe harm to those affected and a restriction of women’s freedom. Unfortunately, the DSA remains blind on this point. We have failed to enshrine effective means of protection against gender-based violence on the internet. The final text of the law will contain no separate article protecting against revenge actions by ex-partners who publish nude pictures (partly with the names and addresses of the persons concerned), or against secretly taken pictures or videos, e.g. at festivals. Now, only very large platforms have to examine to what extent they can take measures to better protect persons from gender-based violence. This is too little and disappointing; I will continue to fight for effective protection.
Accessibility: The EU Commission will support platforms and civil society in developing codes of conduct to ensure accessibility on the internet.