
The European Commission Should Fight Online Child Sexual Abuse Without Mass Surveillance: Policy Brief

The Commission’s proposal to fight online child sexual abuse is well-intentioned but badly executed. It would threaten encryption and harm our fundamental rights, including children's. Even without authoritarianism on the rise, this would be dangerous.

by Liberties.EU

Child sexual abuse (CSA) is a horrific crime that can destroy lives and families. Sadly, the internet makes spreading, storing and accessing illegal material far easier. Fighting this dark and shameful side of society is undeniably urgent and necessary. That's why, on 11 May 2022, the European Commission, under Commissioner for Home Affairs Ylva Johansson, proposed new EU legislation which, among other measures, would require platform providers to report, remove and/or block grooming approaches and illegal content.

Sounds promising? Think again. We argue that while the proposal is well-intentioned, it is fundamentally misguided. Cutting off this head of the hydra would let many more grow.

A new policy brief by Liberties states that enforcing this proposal would clash with the following fundamental rights of the European Union:

  • Respect for private and family life
  • Protection of personal data
  • Freedom of expression and information
  • Freedom of assembly and of association

Why encryption is key to your fundamental rights

One of the fundamental problems we identified in the proposal is that it would render end-to-end encryption practically meaningless.

How? Let's get technical for a moment to explain what encryption is and why the rights listed above depend on it.

Firstly, what is end-to-end encryption? If you use an end-to-end encrypted service, only you and the recipient can see the content of your communication, be it funny memes or documents leaking a corruption scandal. This is thanks to a cryptographic key, known only to the devices at the two ends, which turns "I will be late, please put the pizza in the oven" into something like: G6Sx8%HE)N3aBF&NFWsE"eFWE=fWEp.
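The idea can be sketched in a few lines of code. This is a deliberately minimal illustration using a one-time pad (XOR with a random key); real messengers use far more sophisticated protocols, such as Signal's Double Ratchet, but the core property is the same: without the key, the ciphertext is meaningless.

```python
# Minimal sketch of symmetric encryption: XOR with a random one-time pad.
# The key is known only to the two ends of the conversation.
import secrets

def encrypt(message: bytes, key: bytes) -> bytes:
    # XOR each byte of the message with the corresponding key byte.
    return bytes(m ^ k for m, k in zip(message, key))

def decrypt(ciphertext: bytes, key: bytes) -> bytes:
    # XOR is its own inverse, so decryption is the same operation.
    return bytes(c ^ k for c, k in zip(ciphertext, key))

message = b"I will be late, please put the pizza in the oven"
key = secrets.token_bytes(len(message))  # shared only between the two ends

ciphertext = encrypt(message, key)
print(ciphertext.hex())          # unreadable gibberish without the key
print(decrypt(ciphertext, key))  # the recipient recovers the original message
```

Anyone intercepting the message in transit, whether a hacker or a government, sees only the gibberish; recovering the text requires the key held by the endpoints.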

Client-side scanning, the technology that would likely be introduced should the proposal be adopted in this (or a similar) form, means that the content of your communication is scanned on your device before it is encrypted and sent. While this is not as bad as abolishing encryption altogether, it opens the door for hackers and governments to snoop on you.
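A toy sketch, not any real system, shows why this matters: the device compares outgoing content against a blocklist of hashes before encryption ever happens. Deployed systems use perceptual hashes and machine-learning classifiers rather than exact hashes, but the privacy concern is the same: the scan runs on your plaintext, and a third party decides what goes on the blocklist.

```python
# Toy illustration of hash-based client-side scanning. A hypothetical
# authority distributes a blocklist of hashes to the device; every
# outgoing message is checked against it BEFORE encryption.
import hashlib

# Hypothetical blocklist pushed to the device by an authority.
BLOCKLIST = {hashlib.sha256(b"known illegal file").hexdigest()}

def scan_before_send(content: bytes) -> bool:
    """Return True if the content may be sent (no blocklist match)."""
    return hashlib.sha256(content).hexdigest() not in BLOCKLIST

print(scan_before_send(b"holiday photo from the beach"))  # True
print(scan_before_send(b"known illegal file"))            # False
```

Nothing in the mechanism limits the blocklist to abuse material: whoever controls it can silently add any content they want flagged, which is precisely the abuse risk critics point to.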

This would be problematic even in fully functioning democracies. Unfortunately, as this Liberties report shows, authoritarianism is on the rise and the rule of law is declining in the EU.

In times like these, when a number of governments cannot be fully trusted, vulnerable communities, whistleblowers and victims of abuse (sexual or otherwise) need secure channels of communication. How can journalists, lawyers and human rights organisations do their jobs if they must fear governments being able to filter out, or even criminalise, unwanted thoughts? This proposal harms everyone, including children.

The past has shown us that even democratic countries don't shy away from spying on their citizens and allies. Beginning in 2013, Edward Snowden revealed that the NSA was secretly collecting the communications of millions of people around the world, including officials such as Germany's then-Chancellor Angela Merkel. From our point of view, it is better to avoid making snooping easy for anyone, including governments.

Civil society calls for other strategies

Given the risks involved, it shouldn't be surprising that when the European Commission came forward with its proposal, civil society pushed back. 114 organisations, including the Civil Liberties Union for Europe, signed an open letter calling for solutions that respect fundamental rights. Even some child protection organisations oppose the proposal, such as the Germany-based Deutscher Kinderverein and Kinderschutzbund. They believe that better strategies exist to combat CSA.

One argument widely shared by the digital and human rights community is that law enforcement is already overwhelmed by the number of detected cases. This load would only grow, because scanning all available material will inevitably generate many false positives, such as holiday pictures sent from the beach. It would then fall to national authorities to decide on a case-by-case basis whether content is illegal. Furthermore, the measure would be a missed opportunity, as encrypted communication plays only a small part in the distribution of this life-destroying material. Lastly, while the proposal would have grave consequences for our fundamental rights, criminals would probably find other ways to access child sexual abuse material, such as creating their own platforms.

In our own policy brief, we argued that the proposal undermines a number of fundamental rights enshrined in the Charter of Fundamental Rights of the European Union. We also argued that the legislative proposal is worded in such a way that it leaves too much room for interpretation. Given that fundamental rights are at stake in a Europe where authoritarianism is on the rise, this must be avoided.

Policy recommendations

From Liberties' perspective, while EU governments undeniably have an obligation to protect children from abuse, the proposal put forward by the Commission is not the right approach. We believe the current draft is fundamentally misguided and that a brand new proposal should be drawn up. The new proposal should keep in mind the following policy recommendations:

  1. Measures opening up the possibility of indiscriminate online mass surveillance are to be avoided
  2. Measures undermining the safety and confidentiality of end-to-end encryption are to be avoided
  3. The core concepts of the legislation should be better defined so that the risk of potential abuse by politically captured national authorities is significantly reduced

Check out Liberties’ policy paper on this topic for a more in-depth analysis of the legal specifics the Commission’s proposal touches on.
