My assessment of the Council’s position on the DSA
The EU ministers will formally adopt the Council’s position on the Digital Services Act and the Digital Markets Act today (25 November 2021). This is my assessment:
We urgently need strict rules concerning the transparency of large platforms in Europe. Their unbridled greed for personal data coupled with their opaque recommendation systems is damaging to our economy and our democracy. Sadly, the Council is dominated by ditherers and laggards.
The Facebook whistleblower Frances Haugen’s revelations provided clear proof of the negative effects of online platforms’ recommendation systems. We know that their algorithms amplify divisive content, yet the Council remains silent. It has failed to stand up for users and protect them from disproportionate profiling.
Surveillance advertising is a glaring blind spot in this decision. Surrendering on this issue is naive and dangerous, especially given the broad public debate we have had about the mechanism that drives the dissemination of hatred and disinformation.
Most of what we know about how the platforms work is thanks to NGOs, journalists, whistleblowers and independent researchers. We have a duty to facilitate their work. The Council is doing the exact opposite. Like the Commission, it considers only researchers at academic institutions, many of which receive third-party funding from large platforms, and creates new obstacles for them, such as additional requirements in application processes. That will not help us gain genuine insight into how large platforms influence our democracy, or subject them to public inspection and supervision.
The passage prohibiting platforms from using “dark patterns”, which rely on tricks to coerce users into making purchases, is a small ray of hope. Unfortunately, this requirement applies only to online marketplaces and recommendation systems, not to the infuriating cookie banners that make it far harder to refuse data sharing than to accept it. Discussions in Parliament have made more progress in this regard.
I also view as positive the extended transparency proposals on how platforms manage, block or delete content, as well as the obligation to disclose how many qualified human staff are entrusted with content moderation and how filters are used.
In summary: the Council has made cosmetic improvements to the Commission’s proposal. The big step that would give us deep insight into democracy-relevant recommendation systems, stop unbridled data collection, and protect our citizens from hate speech, disinformation and manipulation has not, however, been taken.