21 January 2021

The DSA: Part #05 – Social Media Councils: Power to the people!

This article is the fifth instalment of a series on the Digital Services Act (DSA), with which the European Union seeks to set out new rules for the internet.


The storming of the Capitol in Washington on 6 January abruptly turned the spotlight onto some of the most urgent questions about social cohesion online. One of these questions is: who actually decides which people may publish their opinions on social networks? Can we leave this decision to companies, as in this instance, where Twitter, Facebook, Instagram, YouTube, Snapchat and Twitch blocked Donald Trump’s accounts following the incident? The simple answer is: of course they can make it, as soon as content or user activity breaches their terms and conditions. However, such decisions become problematic when social networks grow more powerful than nation states and their private infrastructure is perceived as public space.


The platforms’ intervention has set a new precedent: companies excluded a democratically elected head of government from the global public debate online. On the one hand, it was long overdue, as Trump had breached the social networks’ conditions of use at least a dozen times during his tenure. On the other hand, it was questionable and unacceptable in a democracy. So who should decide, and how, what can and cannot be published and expressed? The answer is complex. Fundamentally, companies ought not to have that much power, nor should governments, as they do in totalitarian states. In democratic states, power belongs to the people, and this debate must be led by society.


We therefore seek to strengthen citizens’ rights with the Digital Services Act. I propose the establishment of ‘Social Media Councils’, made up of members of civil society, experts on freedom of expression, democracy and technology, and representatives of groups particularly affected by hate speech, to publicly debate exactly these key questions about the future of online communication. Such councils can trigger debates, identify good and bad platform practice, and issue recommendations for action to policymakers. It is important, however, that they do not rule on the (il)legality of individual posts; in a constitutional state, only courts can do that. But they can initiate public debate: How much disinformation can a society tolerate? How much hate speech do we have to put up with to protect freedom of expression? And is freedom of expression not already at risk if considerable numbers of people, especially young people, women and people of colour, no longer dare to express their opinions openly online? Neither companies nor governments can be entrusted with these difficult considerations. That is a task for civil society.


Social Media Councils are the opposite of what platforms have offered so far, namely internal ethics councils. How independent can an internal ethics council at Facebook be if its members are appointed by the executive board? Such bodies would cement the privatisation of justice and ethics. Facebook was one of the first companies to establish such a council, its “Oversight Board”. It was strongly criticised, however, and dismissed as purely “cosmetic” because it fails to tackle the root of the problem: that Facebook’s algorithms promote the wide dissemination of divisive and problematic content.


When platforms block channels of their own accord, they are interfering with our freedom of expression, which is a precious asset. A public debate is needed here as well: how much freedom of expression can we, as individuals and as a democracy, tolerate on social networks? Take the example of disinformation: it does significant damage to our democracy, yet only in the rarest cases is it illegal and can be prohibited by a court. The militant Trump supporters who were absolutely convinced that Trump had won the election have shown us how dangerous it is.


My internet of the future:

  • Various committees and expert panels on the one hand, and platform councils with randomly selected members (similar to the citizens’ assemblies in Ireland) on the other. Public debate alone could impel the platforms to adjust their moderation practices; any necessary legal fine-tuning could then be done on the basis of this evidence-based, broadly supported consensus.
  • Independent platform councils that are socially representative and diverse, in particular gender-balanced, provide an open, transparent, accountable and participatory forum for analysing content moderation principles.
  • These platform councils issue guidelines, statements, policy recommendations and expert knowledge on content moderation practices, both for political decision-makers and for social networks.

