
Faking Action: EU to Fine Social Media Firms Over Extremist Posts

Social media companies must remove extremist content within one hour or face hefty fines, according to a new proposal unveiled by the head of the European Commission. But is the motivation for the regulation rooted in politics rather than security?

by György Folk

Voluntary action is no longer enough when it comes to the removal of terrorist content, the president of the European Commission, Jean-Claude Juncker, has declared. During his annual State of the Union speech, Juncker announced a draft regulation that would force social media companies like Facebook, Twitter and Google to remove extremist content within an hour.

But while the measure may sound like an important step in tackling extremist propaganda, a number of details make this latest terrorism-related regulation – the third in the last 12 months – particularly concerning. The way the proposal was drafted – without transparency, an impact assessment or a public consultation – is just as alarming as what’s in it.

Election tool instead of real legislative proposal

The proposal’s arrival has been rumoured since the summer, when the Financial Times leaked the first related document. Its release is conveniently timed, coming just ahead of the Austrian EU presidency’s planned Salzburg summit on 18 September, where the key topics on the agenda are internal security and migration.

The Juncker Commission aims to pair border control, security and judicial cooperation proposals with a lasting and more robust migration policy. The proposal for the removal of extremist content is one small element of this larger puzzle.

Bye-bye voluntary approach

With the proposal, the Commission has decided to abandon the voluntary approach under which internet companies removed terror-related videos, posts and audio tracks from their platforms. It first introduced a voluntary mechanism in January this year, followed by tougher guidelines that cut the grace period for content removal to one hour.

The EC’s draft proposal includes the following points:

  • Content flagged by undefined police and law enforcement bodies must be removed/disabled within an hour.
  • Competent national authorities may decide to refer potential terrorist-related breaches of a company's terms of service to that company, which would then decide whether to act against the questionable content.
  • The proposal allows undefined proactive measures that may result in an authority imposing a general monitoring obligation.

The last point is particularly worrisome from a free speech perspective, because it is the first proposal that would allow member states to make an explicit derogation from Article 15 of the e-Commerce Directive, which prevents governments from requiring internet firms to actively monitor what is uploaded and published online.

Data also suggest that member states’ interest in anti-terror laws has dropped: only just over half of the member states had implemented the EU's passenger name record directive, touted by the EU as a key measure to fight terrorism, by the deadline. That directive was passed in April 2016 in the wake of terror attacks in France and Belgium.

The EU distributed 70 million euros to the member states to support the setting up of the information exchange system. Supporters of the system claim it is needed to identify suspicious behavioural patterns. But critics argue that the anti-terror law undermines fundamental rights and does little to help police track down terrorist suspects. In France, for example, only 13 people have been intercepted on the basis of the information exchange system.

One size does not fit all

Given the lessons of past European terror laws, Liberties is of the opinion that the EU should refrain from applying a one-size-fits-all programme for automated content filtering and removal, especially without proper preparation and harmonisation with EU free speech law and the Charter of Fundamental Rights.

This solution fits into the filtering trend the Commission has been pursuing lately in other controversial areas, such as online hate speech, online child protection, audiovisual media services and, most recently, copyright protection. Furthermore, Liberties urges the European Commission to avoid passing the task of law enforcement to internet giants. These companies lack the resources and knowledge to properly define extremist content, and the forthcoming European elections should not be used as an excuse to force them to try.
