Sacrificing freedom of expression and collaboration online to enforce copyright in Europe?


Photo by Kain Kalju, CC BY 2.0.

Respect for copyright laws is a fundamental part of Wikipedia’s culture, and embedded in the free online encyclopedia’s five central pillars. Contributors diligently monitor new edits for compliance with copyright and collaboratively resolve disputes over the permissible use of a work or its status. When rightsholders find that their work is used without permission on a Wikimedia project, we encourage them to talk to the community of editors to address the issue. If this does not lead to a resolution, rightsholders can notify the Wikimedia Foundation, as the content host, of the alleged copyright infringement under the Digital Millennium Copyright Act (the US analogue to the EU’s E-Commerce Directive). The fact that we received only twelve such notices (just four of which were valid) in the second half of 2016 is a testament to the diligence of Wikipedia editors and the accuracy of human-based detection of copyright infringement.
Yet, because European lawmakers see copyright infringement as a problem on other platforms, they are currently debating a proposal for a new EU copyright directive that, if applied to Wikipedia, would put the site’s well-functioning system in peril. Article 13 of the proposal would require “information society services” that store large amounts of content uploaded by users (as the Wikimedia projects do) to take measures to prevent uploads of content that infringes copyright. A radical new “compromise amendment” would apply this requirement to all services, not just those hosting “large amounts” of content. The Commission’s proposal suggests that hosts implement upload filters, or, as it calls them, “effective content recognition technologies”. Some large, for-profit platforms already have such filtering technologies in place, but we strongly oppose any law that would make them mandatory. Filtering technologies have many flaws, and a requirement to implement them would be detrimental to the efficient and effective global online collaboration that has been Wikipedia’s foundation for the past 16 years.
First, filters are often too broad in their application because they cannot account for the context in which a work is used. Automated content detection generally has no knowledge of licenses or other agreements that may be in place between users, platforms, and rightsholders. Such filtering systems also fail to make good case-by-case decisions that take into consideration the copyright laws of various countries, which may actually allow the use of a work online. As a result, many culturally or otherwise valuable works are caught as “false positives” by detection systems and consequently taken off the platforms. In fact, automated takedowns are such a prevalent phenomenon that researchers have seen the need to document them in order to provide transparency around these processes, which affect freedom of expression online and perhaps even the rule of law. Moreover, such filtering systems have also been shown to create additional opportunities for attacks on users’ privacy.
Second, mandatory filtering technology that scans all uploads to a platform can be used for all kinds of purposes, not just copyright enforcement. Automatic content filters can also monitor expression and target illicit or unwanted speech, for instance under the guise of anti-terrorism policies. In other words, they can be repurposed for extensive surveillance of online communications. While intended to address copyright infringement, Art. 13 of the proposed copyright directive would actually lay the groundwork for mass surveillance that threatens the privacy and free speech of all internet users, including Wikipedians who research and write about potentially controversial topics. All Europeans should be as concerned about these threats as the EU Wikimedia communities and we at the Wikimedia Foundation are.
Third, the broad and vague language of Art. 13 and the compromise amendment would undermine collaborative projects that rely on the ability of individuals around the world to discuss controversial issues and develop content together. Free knowledge that is inclusive, democratic, and verifiable can only flourish when the people sharing knowledge can engage with each other on platforms that have reasonable and transparent takedown practices. People’s ability to express themselves online shouldn’t depend on their skill at navigating opaque and capricious filtering algorithms. Automatic content filtering based on rightsholders’ interpretation of the law would—without a doubt—run counter to these principles of human collaboration that have made the Wikimedia projects so effective and successful.
Finally, automatic content detection systems are very expensive: YouTube spent USD 60 million to develop its Content ID system. Requiring all platforms to implement such filters would put young startups that cannot afford to build or buy them at a tremendous disadvantage. This would hurt, not foster, the digital single market in the European Union, as it would hand a significant competitive advantage to platforms that have already implemented such filters or are able to pay for them. The result would be diminished innovation and diversity in the European internal market and less choice for European internet users.
As currently written, Art. 13 would harm freedom of expression online by inducing large-scale implementation of content detection systems. For many Europeans, Wikipedia is an important source of free knowledge. It is built by volunteers who need to be able to discuss edits with other contributors without opaque interference from automatic filters. Therefore, we urge the European Parliament and the Council to avert this threat to free expression and access to knowledge by striking Art. 13 from the proposed directive. If the provision stays, the directive will be a setback to true modernization of European copyright policy.
Jan Gerlach, Public Policy Manager
Wikimedia Foundation



3 Comments

I think it’s time for a second blackout. It’s almost as if SOPA, PIPA, and ACTA had a child. I’m not European, but I follow these laws because I heavily back OpenMedia.

Totalitarian regimes filter information first. Free regimes trust the citizens and repair infringements.

When trying to change something, it should only ever be changed for the better, for inclusion, and for equality. The greater the divide, the bigger the risk of a world run by ignorant, undereducated, inept human beings, all because you created it that way. EMPOWERMENT is key to a better way of life, a better world, an advanced world.