IMCO adopts new transparency and reporting requirements
Today the European Parliament’s IMCO Committee took a first step towards shaping the future digital world. The final report will be voted on in the plenary session in the second half of October. It contains recommendations that the EU Commission should take into account, as the Parliament’s opinion, when drafting the upcoming legislation.
Let us therefore take a closer look at what was decided today. In this post you will find an overview of the main negotiating successes in the report “Digital Services Act: Improving the functioning of the Single Market” in the Committee on the Internal Market and Consumer Protection (IMCO), for which I am responsible on behalf of the Greens/EFA Group. Among many other issues, the report deals with transparency of algorithms, rules for digital advertising technologies (adtech), notification procedures and interoperability.
We support the direction of the report:
- The liability exemptions for user-generated content remain in place as long as the platforms have no knowledge of illegal content. This is a prerequisite for a free internet.
- The regulation covers illegal content only. “Problematic” or “harmful” but legal content does not fall within its scope. We do not want to give platforms even more power to delete legal content non-transparently and without oversight. Instead, we want to shed light on the business model of market-dominant companies.
These are the Greens/EFA successes:
- Transparency obligations for recommendation algorithms: We want more visibility and transparency obligations for the mechanisms that decide which content appears in users’ feeds. A three-level system would make sense: access to the code for supervisory authorities, access for researchers, and, at a third level, generally understandable explanations for the public. We need this transparency obligation because social networks have become an important player in our democracy. We had hoped for even clearer words, but the IMCO report at least calls on the European Commission to investigate the lack of transparency of recommendation algorithms.
- Rules for the adtech market: Google controls an overwhelming share of the advertising market. This dominance is crushing European companies, especially European media companies, which depend on advertising revenue. The Committee has decided to shed more light on the entanglements in advertising technology and to introduce restrictions. I will advocate for tougher rules, including a ban on behavioural advertising and micro-targeting.
- Notice-and-action: This chapter bears an almost entirely green signature. We have set out the future procedures for reporting and dealing with potentially illegal content. An important achievement is that greater differentiation should be made between notifiers, types of content and the seriousness of the infringement. The aim is also that any report of a potential infringement should in future include the name of the notifier, the URL, the date and time of the alleged infringement, and a personal statement that the information has been provided in good faith.
- Consumer rights and product safety: Platforms need to know the suppliers operating on them so that they can be held liable for illegal offers such as toys containing chemicals banned in the EU. Information must also be provided on the environmental impact of orders, for example, the CO2 emissions caused by shipping.
- Interoperability: We set as our goal that different services must be able to communicate with each other. This means, for instance, that we should be able to communicate with contacts on Facebook even if we use a different messaging service. Through real interoperability, we will enable a competitive market for the most innovative services and give users genuine choice.
Unfortunately, the report does not include our demand for “Social Media Councils”, which could be set up as diverse bodies made up of representatives of the affected groups, citizens and experts. Such a body has been missing so far and is, in my opinion, necessary to initiate a public debate about algorithms and platforms’ content governance decisions, to build a broad consensus, and to put pressure on platforms and politicians. Also not included is the idea of unbundling hosting and content moderation services. We were the first to introduce this idea into the European Parliament’s debate and have put it on the agenda. We will continue to fight for it.