The new Copyright Directive was adopted in April 2019. The Directive requires the European Commission to hold stakeholder dialogues, prompting Liberties to pen an open letter urging the participation of human rights organizations.
The letter called on the Commission to invite human rights and digital rights organizations and the knowledge community to appear alongside platforms such as Google, Facebook, and Twitter, and rightsholders such as big record companies, television companies, and collective management organizations. The Commission offered 15 seats to organizations representing the users. Liberties was given one seat.
The first stakeholder dialogue took place in Brussels on 15 October and will be followed by a series of further meetings, all of which are streamed online.
Here is Liberties' address at the dialogue, delivered by senior advocacy officer Eva Simon, about the minimum safeguards for user-uploaded content and how to avoid upload filters:
Liberties' aim is to explain how the guidelines the Commission will issue on the application of Article 17 can sustain the current level of fundamental rights protection.
First, I will speak about balancing fundamental rights and the problems we face due to automated decision-making. Then I will speak briefly about the way copyright is misused in political disputes between opposing interest groups. Finally, I will give a list of safeguards needed to respect fundamental rights.
So, first, balancing fundamental rights. Platforms have a right to conduct a business. Rightsholders have a right to property. And the public has rights to freedom of expression and information, and to data protection. These rights have to be balanced against each other, and that can only be done on a case-by-case basis.
In most cases, when these rights are balanced in the everyday operation of content-sharing services, the courts are not involved. We rely instead on automated decision-making: bots, filters, Content ID, and other content management systems. Sometimes there is a possibility of human intervention, but sometimes not. This approach seems easy and effective. And it usually is.
However, automated decision-making can easily produce false positives and is sometimes unable to identify lawful uses of copyrighted material. These mistakes can only be corrected if the parties can dispute the decision on equal footing. The core of the problem is that there is an imbalance between the parties: content-sharing service providers and rightsholders are in a dominant position over the user. The existing regime does not give users an effective opportunity to fight for their rights. This is why we need safeguards.
Copyright law can have a censoring effect, especially when opposing interest groups target non-infringing content. It can also be used to silence political opposition. The recent Hungarian "Ibiza video", a leaked sex video of a mayor, was removed on copyright grounds; copyright was used as an easy avenue to get rid of the content. The "Straight Pride UK" group silenced a critical blogger by alleging copyright violation. And there are many more examples.
If proper safeguards are introduced, it is possible to protect the interests of rightsholders and platforms while also protecting the rights of users. Here are six safeguards you can introduce to find the right balance.
- Empower individuals. Give individuals a right to challenge decisions taken by automated content management systems and require that such challenges be decided by a human.
- Ensure access to an effective remedy. Users must be given proper reasons for any decisions, and must have access to an independent judiciary for review.
- Ensure transparency for users. Content-sharing service providers should be obliged to inform users how decisions are taken over the removal of content, which user data is collected and how it is used, when content is removed, and the extent to which user activity is monitored.
- Rebalance the incentive for platforms to remove content. Blocking or removing content that does not infringe copyright violates freedom of expression. Accordingly, the guidelines should introduce a rebalancing incentive: rightsholders and content-sharing service providers should be held liable for removing or blocking lawful user-generated content.
- Use alternatives for controversial content. There are alternatives to taking down controversial content from the outset. Content can stay up while the algorithm ranks it differently, or a warning notice could be attached until the dispute is resolved.
- Adopt a common approach across Europe. Finally, it is very important to urge Member States to wait for the guidelines to be published before transposing Article 17, and to follow the Commission's initiative by holding similar stakeholder dialogues at the national level.
We have the opportunity to make things better. Article 17 is an opportunity to ensure that no disproportionate limits are placed on fundamental rights and that no filters are added where there is no need for them.