European democracy and elections: The case of political advertising
Online political advertising is a relatively new phenomenon that has not yet been sufficiently regulated. This regulatory gap threatens electoral integrity through voter manipulation, opaque campaigning and disinformation campaigns, as shown by the Cambridge Analytica scandal, in which 87 million Facebook profiles were obtained and misused to target voters during the 2016 US presidential election. Recent Facebook whistleblower testimonies have also highlighted how big tech platforms’ business models have played a crucial role in amplifying these issues. Microtargeting and ad delivery algorithms rely on the collection and aggregation of people’s personal data to build profiles that predict their behaviour; these profiles are then used to reach voters with tailored messages that exploit their sensitivities, fragmenting the political public debate.
The goal of EU-wide rules on political ads is to tackle these new threats, to maintain the integrity of elections and free democratic processes, to improve transparency and to limit harmful practices.
My proposals for new rules in 5 points
- First of all, it is necessary to prohibit the targeting of advertising based on people’s personal data and online behaviour, without any exception. In line with the ERGA and EDPS opinions on the proposed Regulation, there should be a prohibition of targeted advertising based on pervasive tracking techniques, since the collection and processing of personal data and people’s online behaviour can be abused for psychological profiling, as the Facebook-Cambridge Analytica scandal highlighted. Beyond the risk for democracy and society as a whole, recent studies and polls have shown that people do not wish to receive personalised ads. Studies show that micro-targeted ads allow for selective information exposure, resulting in voters’ biased perception of political actors and their agendas. Furthermore, microtargeting enables manipulation by matching messages to specific voters’ vulnerabilities; it can fuel the fragmentation of political debate, create echo chambers and exacerbate polarisation. The recent case of the French presidential candidate Eric Zemmour is a good example of how the amplification of targeted political ads can work: his campaign team was able to collect different types of data to target French Jewish voters with Islamophobic messages via SMS. It is very likely that the advertisers did not even use sensitive data to reach French Jewish voters; it was sufficient to use inferred and contextual data (browser history, clicks on webpages, likes, shares, etc.). A ban on amplification would also largely solve the issue of price discrimination. Finally, because such highly tailored messages are not visible to and viewed by all users, these advertising practices enable opaque campaigns to which political competitors cannot respond and make the detection of disinformation difficult.
- We should prohibit problematic algorithmic ad delivery techniques used to display political ads. During ad delivery optimisation, the platform picks a specific subset of users from within the target group chosen by the advertiser. This subgroup is optimised to align with the platform’s business interests: the ad is delivered to those users who, when presented with it, will stay longer on the platform. This algorithmic selection has inherent and undesirable by-effects, including the creation of filter bubbles, the fostering of polarisation, the fragmentation of the public space of deliberation and, importantly, price differentiation. The adverse effects on society were shown by a group of researchers who measured in a recent study how Facebook delivers ads to separate groups, depending on an ad’s content (for example, the political viewpoint featured) and targeting criteria. They found that these delivery techniques selectively deliver ads within the target groups in ways that lead to demographic skews along race, political alignment and gender lines, often without the knowledge of the sponsor (a simplified sketch of how such engagement-driven delivery can produce skews follows after this list of proposals). Similarly, algorithmic amplification (the automated recommendations displayed on online platforms) is problematic, and the new rules should introduce limitations here as well. Price differentiation makes advertisements more expensive when Facebook predicts that users do not align politically with the ad shown to them; this makes it more expensive to reach diverse audiences and exacerbates problems such as filter bubbles.
- The transparency rules should provide for more data granularity on each individual ad, as also demanded by ERGA. Access to complete, real-time, machine-readable and detailed information on online political campaigns by watchdogs such as journalists and NGOs is essential to spot violations of electoral law in time and to dissuade sponsors of political ads from engaging in bad practices.
- We need one single ad repository: current industry practices vary widely, making it difficult for researchers and civil-society watchdogs to analyse political campaigns as well as potential disinformation and foreign interference. The rules proposed by the Commission are unfortunately easy to circumvent, as only the very large online platforms are obliged to set up a database. Therefore, all political ads, regardless of the website or service on which they are published, should be available through an online ad repository (see, for example, the idea for such a database developed by WhoTargetsMe). Ideally, an independent EU institution or body, such as ERGA, should be responsible for maintaining this repository, which would function as the universal ad repository for all political ads in the European Union. This would require the EU body to establish APIs for the automatic transmission of the information contained in transparency notices (a sketch of what such a machine-readable transparency notice could look like is included further below). As the European Partnership for Democracy notes, the industry is in favour of this solution, as its operating costs are close to zero and the initial one-off costs are low.
- Finally, it is necessary to improve cross-border cooperation for the enforcement of the rules, to harmonise sanctions across the EU and to provide for a minimum level of sanctions. It is crucial to have an efficient enforcement mechanism that gives the Regulation teeth. The patchwork of authorities potentially in charge needs to be clarified and the cooperation mechanisms further strengthened.
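To illustrate the delivery mechanism described in the second point above, here is a deliberately simplified sketch (in Python, with invented names and numbers, not taken from any platform’s actual systems) of how engagement-optimised ad delivery can narrow down the audience an advertiser selected and skew it towards one demographic group:

```python
from dataclasses import dataclass

@dataclass
class User:
    user_id: str
    demographic: str             # e.g. an inferred political leaning; purely illustrative
    predicted_engagement: float  # the platform's estimate of how long the ad keeps the user on the service

def deliver_ad(targeted_audience: list[User], budget_slots: int) -> list[User]:
    """Hypothetical engagement-optimised delivery: from the advertiser's chosen
    audience, the platform serves the ad only to the users it predicts will
    engage most, which can skew delivery towards particular demographics."""
    ranked = sorted(targeted_audience, key=lambda u: u.predicted_engagement, reverse=True)
    return ranked[:budget_slots]

# Example: the advertiser targets a politically mixed audience, but the
# platform's engagement model concentrates delivery on a single group.
audience = [
    User("a", "leaning_x", 0.9),
    User("b", "leaning_x", 0.8),
    User("c", "leaning_y", 0.3),
    User("d", "leaning_y", 0.2),
]
shown_to = deliver_ad(audience, budget_slots=2)
print([u.demographic for u in shown_to])  # ['leaning_x', 'leaning_x'] – a demographic skew
```

Even though the advertiser targeted a politically mixed audience, the engagement-optimised selection concentrates delivery on one group: this is the kind of skew the study cited above measured, and it happens without the sponsor’s knowledge.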
What is in the draft EU Regulation?
The Commission proposed a Regulation on political advertising (RPA) on 25 November 2021. It acts as lex specialis to the Digital Services Act, focusing specifically on online political advertising, with the aim of overcoming the failures of the self-regulatory Code of Practice on Disinformation. It is built on two main pillars: transparency and limitations on tracking.
Transparency is the first pillar of this Regulation, as the proposal points out that specific measures are needed to reduce the scope of problematic targeting tactics. The objective is that people can easily recognise when they are being shown paid political content. Under this framework, paid political advertising should be labelled as such and provide basic information: the identity of the sponsor, the dissemination period, the amount spent, links to an election, the source of the funds used and other information relevant to the fairness of the dissemination of political ads. The text suggests that these rules only apply to political advertising services and should not apply to the sharing of information through electronic communication services (e.g. WhatsApp, Telegram or Signal).
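To make this more tangible, the following is a purely illustrative sketch of what a machine-readable transparency notice carrying the information listed above could look like. The field names are my own invention and do not come from the proposal, but a common format of this kind is what would allow the single EU-wide ad repository argued for above to receive notices automatically via an API:

```python
import json
from dataclasses import dataclass, asdict

@dataclass
class TransparencyNotice:
    """Hypothetical machine-readable transparency notice; field names are
    illustrative and not taken from the proposed Regulation."""
    ad_id: str
    sponsor_identity: str
    dissemination_start: str   # ISO 8601 date
    dissemination_end: str
    amount_spent_eur: float
    source_of_funds: str
    linked_election: str
    targeting_criteria: list[str]

notice = TransparencyNotice(
    ad_id="example-001",
    sponsor_identity="Example Party",
    dissemination_start="2024-05-01",
    dissemination_end="2024-05-31",
    amount_spent_eur=12500.0,
    source_of_funds="party budget",
    linked_election="European Parliament elections 2024",
    targeting_criteria=["country: FR", "age: 18+"],
)

# A universal ad repository could accept such records over a simple API,
# e.g. as JSON payloads submitted by every publisher of political ads.
print(json.dumps(asdict(notice), indent=2))
```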
The second pillar of the proposal is the introduction of limitations on tracking, via a prohibition on processing sensitive personal data. At first glance, this ensures that sponsors are no longer allowed to use microtargeting techniques. However, they would still be able to do so under two conditions: a) the data subject gives “explicit consent”, or b) the processing is carried out “in the course of legitimate activities by a foundation, association or any other not-for-profit body with a political, philosophical or religious or trade union aim and on condition that the processing relates solely to the members or to former members of the body or to persons who have regular contact with it”, in line with Article 9(2)(a) and 9(2)(d) of the General Data Protection Regulation (GDPR).
What are the good elements of the draft law?
Online political advertising is a space that needs to be regulated by hard law, as it has consequences for the health of the democratic system as a whole and can no longer be left to self-regulatory instruments.
An EU instrument to harmonise transparency and tracking rules in all Member States is welcome in an environment of increasing cross-border services.
The added value of the draft Regulation is the creation of (real-time) transparency for all political ads, with the opportunity to set up a more comprehensive and reliable EU-wide database for online ads as a single research resource for watchdogs and anti-disinformation organisations.
It is important to note that the period to which the Regulation applies is not limited to election periods. It thereby acknowledges that other political matters, such as referendums, and voting behaviour more generally can also be influenced through political advertisements.
The Regulation also takes a broad approach to political advertising by including a definition of “political ads” based not only on the actors but also on the issues. It aims at covering the whole value chain of political advertisement, which is especially important for complex ad-delivery infrastructures such as real-time bidding.
Finally, the Regulation contains a flexible approach to enable regulators to keep up with technological developments. The two Annexes would allow the Commission to add to and change the requirements for the transparency notice and the explanation of why an ad was shown to a user.
What needs to be improved?
First, the proposal does not distinguish between the two very different techniques of “targeting” (tailoring a message to match the interests or behaviour of a person or group of persons) and “amplification” (increasing the spread and reach of a message). The limitations on tracking and amplification techniques are far too weak, as they replicate exemptions from the GDPR and thereby introduce loopholes, leading to a situation in which personal data could still be abused for manipulative practices in the future.
The Commission’s current proposal to restrict targeting in Article 12 would, however, still allow microtargeting on the basis of two exceptions: consent, and regular interactions between the advertiser and the person or group of persons targeted. This is problematic for two reasons:
- The general idea behind the consent exemption, namely to enhance user autonomy by allowing people to choose whether their sensitive data can be used, is laudable. In practice, however, it has not worked: consent frameworks tend to favour the gatekeepers and do not truly provide users with choices. Moreover, a European regulator has recently ruled that the current consent framework, as developed by the advertising industry, is contrary to EU law.
- The second exemption would allow micro-targeting by associations, foundations or non-profit organisations that have regular contact with their members or former members. This could be a big loophole, as it could favour extremist communications, depending on how “regular interactions” is interpreted. It is unclear whether this exemption applies only to interactions via email addresses collected in compliance with data protection law or whether it could also cover all interactions on social networks. Take the French presidential elections as an example: researchers showed that candidates defending extreme positions had the most interactions online. Two right-wing extremist candidates together had close to 2 million interactions in only two weeks, while Emmanuel Macron, the French President running for re-election, had only 251,000 engagements on social media.
Additionally, implementation and design choices are left up to a new code of practice to be drawn up by industry. The proposal is an attempt to write a code of practice into hard law, precisely because the Code of Practice on Disinformation did not work as intended; leaving crucial design and implementation choices up to a code of practice again therefore bears considerable dangers. Specifically, with respect to the design of the transparency notice and the accessibility and user-friendliness of the flagging mechanism, the effectiveness of the applied law is at stake. It also means that there is no parliamentary control, de facto handing control over to two companies (Google and Facebook) that already form a duopoly in the online advertising space.
The ad repository as proposed by the Commission merely refers to Article 30 DSA, thereby applying the obligation only to VLOPs. This could incentivise sponsors, and malign actors in particular, to move most of their advertising to media publishers’ platforms and small and medium-sized online platforms in order to circumvent the proposed Regulation. What is also missing from the Commission proposal is the requirement that transparency information be complete, available in real time and provided in a machine-readable format.
The self-declaration mechanism foreseen in the Regulation, which obliges sponsors to indicate that an ad is political, is insufficient, as it leaves the door open for malicious actors. It is an enormous loophole: there is little doubt which choice malicious foreign actors would make between voluntarily submitting themselves to a set of strict obligations for political ads and simply not declaring that their ad is political.
There is no obligation for human review and checks by the platforms or services on which ads are published, as there is, for instance, in the DSA, which contains an Article on the traceability of traders. Combined with leaving the control of ads that are not declared as political up to users who flag an ad when they perceive it as misdeclared, the Regulation is clearly insufficient for the task it aims to achieve.
There are also open questions as to what happens once an ad is notified and how fast publishers will react. What happens if malign actors abuse the notification mechanism and have legitimate political ads taken down in the last days before an election? At the very least, VLOPs should be obliged to have human moderators check whether ads are correct and complete. Another option would be to oblige publishers to carry out random checks and to request corrections of transparency information before an ad can be published.
Timeline in the EU Parliament
The IMCO Committee (on the internal market and consumer protection) is the lead Committee, where I am shadow rapporteur for the Greens/EFA group in the EU Parliament.
| Step | Date |
| --- | --- |
| Draft report to translation | 15 June 2022 |
| Consideration of draft report | 11/12 July 2022 |
| Public hearing | 11/12 July 2022 |
| Deadline for amendments | 14 September 2022 at noon |
| Consideration of amendments | 26-27 October 2022 |
| Consideration of compromise AMs | 28-29 November 2022 |
| Vote | 8 December 2022 |
| Plenary | tbc |