The European Parliament's Committee on the Internal Market and Consumer Protection (IMCO) voted yesterday on the Digital Services Act (DSA), legislation that protects freedom of speech and the freedom to access information online. We welcome the final outcome, and we believe the vote brings us a step closer to a human rights-friendly digital ecosystem in the EU, with a possible positive impact on other parts of the world.
The result of the vote is also a triumph of digital rights organisations over Big Tech companies. While platforms spent millions lobbying on the DSA and the DMA behind closed doors, civil rights organisations worked transparently to convince Members of the European Parliament to safeguard people's rights over Big Tech's interests. And IMCO members listened.
“While the vote is a step closer to a human rights-friendly digital ecosystem, we are seriously concerned about the enforcement of the DSA: letting the Member States decide about unwanted online content and cross-border removal without a court order are serious mistakes that would allow authoritarian regimes, like Orbán's, to control online free speech”, said Eva Simon, senior advocacy officer at Liberties, highlighting that mistakes from the terrorist online content regulation should not be repeated.
With the vote, the IMCO Committee preserved the prohibition on general monitoring and rejected mandatory upload filters. This is a great success, and it seems that the debate over copyright upload filters had an impact: that mistake will not be repeated.
This means that a horizontal piece of legislation will safeguard freedom of speech, but the question remains what this means in practice and how the conduct of Big Tech companies will be overseen. IMCO pushed back against incentivised content removals and privatised decisions over users' content, so it will be essential to monitor the law's enforcement.
IMCO also voted down the media exemption clause, which would have had severe consequences for the media market and would have forced national media authorities and Big Tech companies to decide what is reliable media and what is not.
Risk assessment and transparency
Mandatory transparency requirements and risk assessments, safeguards for audits, and the involvement of civil society are important steps towards a reliable and transparent online ecosystem. In addition, the mandatory risk assessment for content moderation and content curation will help us avoid being locked in bubbles created for us by Big Tech.
Liberties supports the option for users to opt out of targeted advertising, even though we advocated for a stronger, more privacy-friendly solution that would change the business model of targeting and spreading disinformation online.
The next step is the final vote in the European Parliament in mid-January. We will continue our work to ensure that there are privacy and free speech safeguards in the DSA during the trilogue negotiations between the Commission, the Council and the European Parliament.
Our long-term goal is to ensure that no member state can misuse the rules in the DSA to suppress the voices of dissent.