The European Parliament will soon vote on two pieces of legislation: the Directive on Copyright in the Digital Single Market and the Audiovisual Media Services Directive. These Directives would change the internet for everyone, permanently.
Protecting children and fighting copyright infringement are legitimate reasons to limit freedom of speech. But it is important to strike a fair balance between the rights at stake, namely copyright and the protection of children, and free speech. Any limitation on free speech must be proportionate. The way we usually strike that balance is to leave it to judges (and sometimes the authorities) to issue guidelines and resolve problematic legal cases one by one.
Now the EU is turning in a new direction: one where free speech is not properly respected, where limitations are disproportionate, and where decisions are placed in the hands of privately owned companies.
Experts come out against proposal
The new draft Directive on copyright would introduce new obligations for all information society service providers that store and share user-generated content. The new requirement would oblige internet service providers to take measures against copyright infringement in cooperation with rightsholders. The draft Directive identifies content recognition technologies as a possible means of fulfilling this obligation. It is difficult to understand this requirement as anything other than an obligation to monitor and filter users' activity. This new rule would apply to all information society service providers: video sharing platforms (YouTube), blogging platforms (Twitter, Tumblr), social media platforms (Facebook), even document sharing platforms (Dropbox) and marketplaces (eBay, Etsy).
Several academics (here, here, here and here) and human rights and digital rights organisations from all over Europe argue that the proposal contradicts existing rules of the EU and creates legal uncertainty. It violates fundamental rights, such as freedom of expression, freedom of information and the right to privacy, and contradicts previous decisions of the Court of Justice of the European Union.
Human rights organisations often argue that filtering violates freedom of expression and freedom to access information, while constant monitoring of users’ activity violates the right to privacy.
Interference with human rights
Filtering creates an environment in which people cannot access certain information. It is a form of automatic pre-emptive censorship carried out in the name of protecting certain values. Those values can include society's moral obligation to protect children, which justifies preventing minors from accessing so-called harmful content; they can also be ideological, as in China, where filtering enforces the official ideology. The common feature in all these cases is that private companies or states set up algorithms to protect vaguely defined values.
But copyright is something different. There are objective criteria that serve to protect rightsholders' creative works. The level of protection, however, differs according to the wishes of the creator: some creators share their works voluntarily, while others grant permission only for payment. And using copyrighted works for parody can be legal or illegal depending on the regulations of each European country.
What we see here is a clear interference with fundamental rights. On one hand, there is freedom of expression and the right to access information; on the other hand, there is the protection of creators and their works. The creative industry wouldn't thrive without proper copyright protection, which is important for the economy and for the development of information technology as well. However, the requirement that a fair balance be struck between the rights at stake, namely copyright and fundamental rights, is not met in this case. That is because copyright is already adequately protected in ways other than filtering, ways that are far less intrusive.

The Directive on Electronic Commerce is a good example. It limits liability to those copyright infringements that service providers have actual knowledge of, or to cases where they obtain such knowledge but fail to remove or disable access to the content. Under the draft Copyright Directive, by contrast, internet service providers would be liable for anything their filters let through, even without actual knowledge. The new rules would thus create legal uncertainty for internet service providers as to which piece of legislation they should follow.
Taking a broader view, we can easily see that the proposed filtering solution to copyright infringement fits into a wider trend in the EU. Besides the draft Copyright Directive, the proposed new version of the Audiovisual Media Services Directive also requires filtering: it would oblige privately owned video sharing platforms to apply filtering mechanisms in the name of protecting minors.
While the protection of minors is a desirable goal, there are two basic problems with this approach. First, it is difficult to define what counts as ‘harmful content’. For example, it can be hard to draw the line between softcore pornography and important sex education material, and the same content can be harmful for an average 10-year-old but very useful for a 10-year-old who is a victim of sexual harassment. Second, video sharing platforms have neither the knowledge nor the human resources to classify content properly.
Mandatory filtering would require privately owned companies to solve difficult fundamental rights problems. The Copyright Directive would require companies to distinguish between protected free speech and copyright infringement, while the Audiovisual Media Services Directive would require them to distinguish between free speech and harmful content.
Passing the buck to businesses
With this solution, the Commission attempts to resolve the problem of copyright infringement and harmful and hateful online content by shifting responsibility to internet companies, such as search engines, video sharing platforms and social networks. These companies, however, lack the resources and knowledge to solve problems related to fundamental rights.
And it is not only about their expertise in this area – it also goes against the nature of business. For these companies, the protection of fundamental rights is not of primary importance. When it comes to choosing between business interests and protecting freedom of speech, businesses have a strong incentive to opt for the former, namely to remove content if there is any risk at all that they may be legally liable.
Putting businesses in control of content puts free speech and freedom of information at serious risk because it makes it more difficult for individuals to exercise and enforce their right to free speech. There won’t even be the possibility for public debate around certain content because it will never come to light. These companies will control all information available. Having businesses make decisions about content is not only a heavy burden on the commercial sector, but it is also highly non-transparent, which is an entirely inappropriate way for democracies to regulate such an important issue as freedom of expression. The lack of transparency is a problem because it means no accountability – individuals do not know who to challenge when their content gets blocked or taken down, denying them due process.
So why is the EU changing the rules? First and foremost, the EU is happy to shift responsibility to big companies that have the financial resources to solve problems, develop software and pay fines if necessary. Second, the EU is trying to create a more balanced creative industry. On one hand, internet companies have certain interests; on the other, rightsholders have their own, and somewhere in between the focus on users is lost. We believe that the EU should focus on users as well, protecting people's free speech, freedom of information and privacy.