We go on social media and check our Instagram, Facebook, or Pinterest accounts in order to connect with others. But we pay a high price for using these services. Online platforms track our every move, build profiles of us, and offer these data sets to advertisers. Their business model is based on our personal data.
In most cases, the ads that appear in our feed relate to our recent searches, the topics we discussed, and the posts we liked. These behavioral data sets help advertisers define us as a target audience. During election periods, for example, we see many political ads placed by political parties, paid influencers, or local activists. Political advertisers can use profiles to segment out groups of people susceptible to being convinced by a given message and send those people highly personalized appeals to support a particular candidate or policy proposal.
Targeting techniques benefit political actors by allowing them to reach disengaged citizens and those who ignore traditional mass media, which may increase political participation and knowledge about specific issues. But targeting can also be used to mislead, manipulate, discriminate against, or demobilize voters. Political parties can use targeting techniques to engage in duplicitous campaigning, promising different things to different audiences, and to feed citizens only information and arguments that reinforce their existing beliefs. Instead of enriching political debate, this creates echo chambers and increases polarization.

The ability of political parties to deliver political messages to the public is protected by the right to freedom of expression. Users, likewise, have a right to share their own opinions and to access the opinions of other users and politicians. At the same time, online political targeting practices may undermine citizens' fundamental rights, including the protection of personal data, privacy, and the right to fair elections, affecting the lives of millions of people.
Targeted advertising can be used to discriminate against and exclude certain groups from receiving information, which can increase marginalization and social exclusion. For example, advertisements about employment, housing, or elections can be hidden from certain people based on age, gender, or location, or on more sensitive data, like ethnicity, political or sexual orientation, or browsing behavior. This was demonstrated in a study by investigative journalists who published housing advertisements and, using Facebook's targeting tools, excluded certain groups, such as Black Americans, Jews, mothers of high school kids, or people interested in wheelchair ramps. Lawmakers at both EU and national level must take the necessary steps to protect people from having their personal data misused and from being targeted based on sensitive characteristics, such as sexual orientation or health conditions.
In our policy paper, we focus on political advertisements only, and we therefore limit our suggestions to that area.
Liberties advocates for limiting targeting methods to a minimum. We also want lawmakers to introduce safeguards that protect users' fundamental rights.
Introduce strict transparency obligations

Online platforms should be subject to strict transparency obligations. Ad archives should be publicly available, easy to navigate, and designed to facilitate research and analysis. We advocate for a mechanism under which online platforms must answer users' requests about their targeting methods, the data processed, and the rights set out in the GDPR.

These are the first steps that would allow independent researchers, national electoral commissions, other relevant public authorities, and regulatory bodies to monitor political advertising and better understand its impact on democracy and fundamental rights.
Enforce the GDPR

The European Commission and national Data Protection Authorities (DPAs) must properly enforce the GDPR. The GDPR has the potential to safeguard EU residents' rights and prevent the misuse of their personal data for targeting purposes. It can eliminate the dark patterns that online platforms use to trick users into sharing their data, such as "I agree" buttons that users click simply to get rid of annoying pop-ups or banners. The data subject's consent is needed before personal data may be processed for targeted advertising. Even though the GDPR provides solid ground for valid consent requirements, the lack of enforcement weakens it as a reference point for new pieces of legislation, such as the Digital Services Act (DSA) and the upcoming proposal on targeted political advertising. Proper enforcement of the GDPR and further rules would correct the current power imbalance between online platforms and users.
Strengthen data protection rules through DSA and ePrivacy Regulation
The Commission and national DPAs should develop guidance clarifying how the GDPR applies to political advertising. It is obvious by now that more detailed data protection rules are needed to establish a robust and universal application of privacy-friendly advertising methods. The draft ePrivacy Regulation and the draft Digital Services Act offer European legislators the possibility to fine-tune GDPR rules in this field. In addition, the Commission should urge Member States to provide DPAs with the funds necessary for the tasks they are expected to undertake, and should explore ways of supporting DPAs directly, for example by providing them with expertise and services.
Conduct Data Protection Impact Assessments and Human Rights Impact Assessments
In fulfilling their transparency obligations, political parties, interest groups, and platforms should be required to conduct and publish Data Protection Impact Assessments and Human Rights Impact Assessments relating to online political campaigns hosted on the relevant platforms. National DPAs, Digital Services Coordinators (DSCs), and electoral bodies should have the authority to order binding remedial action, including issuing fines to online platforms, political parties, and interest groups, and referring the DPAs' and DSCs' findings to national electoral commissions. Joint liability of platforms and political parties could force both to follow the rules.
Give users control through explicit opt-in consent

There is a severe power imbalance between online platforms and users. Users should have more control over their news feeds and their personal data online. They should be able to decide whether or not they want to receive targeted political advertisements. For this to happen, and in accordance with EU data protection rules, online platforms should obtain users' explicit consent via an opt-in. To limit pop-up fatigue, rules should limit how often online platforms can ask users to opt in.
Limit targeting methods to the minimum
Regulators should limit the targeting methods that online platforms make available to political advertisers. Targeted political advertisements based on observed data (e.g. what sort of content users like and share) and inferred data (assumptions that algorithms make about users' preferences based on their online activity) should be fully prohibited. The only form of personalized targeting allowed should be based on broad demographic data that users provide themselves and that is proven necessary to promote greater democratic engagement, such as voluntarily shared broad location, age, and language preferences, collected through opt-in mechanisms. Here, too, targeting is only legitimate if the data subject consents to the use of these data sets. This limitation on the choice of targeting criteria would reduce the possibility that political actors tailor different promises to different homogenous groups of people and manipulate the electorate. Instead, we believe that non-surveillance methods such as contextual advertising offer the best way forward.
Strong enforcement of new rules
Regulating targeted political advertising is essential to healthy democratic debate and fair elections across the EU. As we have seen with the GDPR, the key is how rules are enforced. We have learned the lesson that self-regulation and voluntarily applied transparency rules are not enough; regulatory oversight is a must. Data Protection Authorities, Digital Services Coordinators, electoral authorities, and independent auditors are critical to creating meaningful oversight mechanisms. We also need a European-level, cross-sector authority for proper oversight. One solution is a European Digital Services Coordinators Board, similar to the European Data Protection Board.