- What's the problem with the Copyright Reform Directive Proposal?
- Where does the text of the proposed directive demand upload filters?
- What are the downsides of upload filters?
- Are there alternatives to upload filters?
- Who is required to use upload filters?
- What about exceptions from article 13 for start-ups and small platforms?
- What about exceptions from article 13 for non-commercial platforms?
- Doesn't the restriction of platforms like YouTube by the proposed directive open opportunities for European start-ups?
- Doesn't article 13 of the proposed directive strengthen the legitimate rights of artists?
- What are the consequences of article 11?
- What about article 11 and the Berne convention?
- What about the exceptions from article 11?
- Doesn't article 11 of the proposed directive strengthen the legitimate rights of press publishers?
- Are there other problems with the proposed directive?
- How should MEPs vote?
The text of article 13 of the proposed directive makes Internet platforms liable for users' uploads. It does not mention upload filters, but, as explained by many, including independent academics, if platforms are liable, they will have to filter to avoid liability. See question 2 for details.
This undermines the freedom of speech and data privacy, and it places European platforms in a bad position against established competitors like YouTube. See question 3 for details.
Article 11 of the proposed directive grants additional rights to press publishers for the online use of their press publications, which curtail the right of citation. On the Internet, whose structure depends far more on citations (“hyperlinks”) than printed media do, this is particularly problematic. See question 10 for details.
It has been claimed, but disputed, that press publishers will benefit from this article. See question 13 for details.
Article 13(4) reads: [1, p. 66]
If no authorisation is granted, online content sharing service providers shall be liable for unauthorised acts of communication to the public of copyright protected works and other subject matter, unless the service providers demonstrate that they have:
(a) made best efforts to obtain an authorisation, and
(b) made, in accordance with high industry standards of professional diligence, best efforts to ensure the unavailability of specific works and other subject matter for which the rightholders have provided the service providers with the relevant and necessary information, and in any event
(c) acted expeditiously, upon receiving a sufficiently substantiated notice by the rightholders, to remove from their websites or to disable access to the notified works and subject matters, and made best efforts to prevent their future uploads in accordance with paragraph (b).
This means: As soon as a user uploads some content to a service provider, the service provider becomes liable. To act lawfully, they are required to implement all of (a) and (b) and (c), where the requirement (b) can only be met by using upload filters.
Regarding (a), it is impossible for providers to acquire licenses for all content that might be uploaded to their platforms. (See question 4 for details.) The obligation to make a “best effort” creates legal uncertainty.
Regarding (c), this amounts to notice-and-take-down, combined with a filter to ensure that the content stays down.
Regarding (b), the sheer quantity of uploads makes it impossible for humans to check each upload individually. This task requires advanced software systems running on giant server farms which only the biggest companies in the world can afford – Google (who owns YouTube), Amazon, Facebook, Apple, and maybe Microsoft.
Upload filters cannot reliably determine whether a specific upload is infringing or not. Additionally, they require massive computing power and huge databases, which are out of reach for small and medium-sized enterprises. YouTube may be able to make a best effort; smaller companies will fall short of what YouTube can do.
Upload filters force information providers to install a system whose purpose is to block specific information from being published – in other words, a system for censoring the Internet. It can be easily abused.
Even without wilful abuse, upload filters will censor the Internet. Even the most advanced filtering software running on the world's biggest computing centres cannot reliably distinguish between a legal citation or parody and an unauthorised copy. Since the provider bears the risk, all they can do is reject everything when in doubt. This still won't filter out all unauthorised copies, but at least the provider can then claim to have made “best efforts”.
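The problem can be illustrated with a toy sketch. The code below is hypothetical and not any platform's actual system (real filters such as Content ID use far more elaborate audio and video fingerprinting), but the structural weakness is the same: protected works are indexed as hashes of overlapping word chunks, and any upload sharing a chunk is blocked. A review that lawfully quotes a protected line matches the index exactly as a pirated copy would.

```python
# Hypothetical fingerprint-based upload filter (illustration only).
# Protected works are indexed as hashes of overlapping word n-grams;
# an upload is blocked if it shares any fingerprint with the index.
import hashlib


def fingerprints(text, n=8):
    """Hashes of all overlapping n-word chunks of the text."""
    words = text.lower().split()
    return {
        hashlib.sha256(" ".join(words[i:i + n]).encode()).hexdigest()
        for i in range(max(len(words) - n + 1, 1))
    }


def build_index(protected_works, n=8):
    """Collect the fingerprints of every work the rightholders registered."""
    index = set()
    for work in protected_works:
        index |= fingerprints(work, n)
    return index


def is_blocked(upload, index, n=8):
    """Block on any overlap -- the filter cannot tell quotation from copying."""
    return bool(fingerprints(upload, n) & index)


protected = [
    "to be or not to be that is the question whether tis nobler in the mind"
]
index = build_index(protected)

# A review that legally quotes the protected line:
review = ("In her essay the critic quotes the line "
          "to be or not to be that is the question whether tis nobler "
          "and then analyses it at length")

# The quotation shares an 8-word chunk with the indexed work,
# so the lawful review is blocked just like an infringing copy.
```

Tightening the match threshold only trades false positives for false negatives; no threshold gives the filter access to the context (criticism, parody, citation) that makes a use lawful.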
In fact, one upload filter already exists: YouTube has operated its Content ID system since 2013. It voluntarily developed this system to reduce the number of copyright-infringement lawsuits it faces. Content ID has failed dramatically, taking down legitimate content and harming creators.
With article 13 of the proposed directive in force, all European platforms would be obliged to repeat the mistake that YouTube made.
Another problem arises from the practical difficulties of implementing upload filters. Given that they need huge databases and massive computing power, only the biggest companies in the world – Google (who owns YouTube), Amazon, Facebook, Apple, and maybe Microsoft – can implement them. Everyone else has to send their data to one of those companies, and pay for the service, to get them analysed. This makes small and medium-sized enterprises dependent on those big companies, and it has disastrous implications for data privacy.
Thus, a directive intended to gain more control over the activities of YouTube would have the effect that YouTube gains control over a large fraction of its competitors.
Some proponents claim that under article 13, taking down unauthorised content upon notice would render it unnecessary to install upload filters. However, article 13 requires all of (a) and (b) and (c), where (c) is the reaction to take-down notices, and (b) is the use of upload filters. Providers are required to act upon notice in addition to using upload filters, not instead of them.
A similar misunderstanding is that providers could acquire licenses for all content that might be uploaded to their platforms. This is impossible because any single copyrightable work in the world can potentially be uploaded by a user. What a user chooses to upload is outside the control of the provider. This leaves the provider no other option than using upload filters.
Article 2(5) of the proposed directive defines the affected services:
‘online content sharing service provider’ means a provider of an information society service whose main or one of the main purposes is to store and give the public access to a large amount of copyright protected works or other protected subject-matter uploaded by its users which it organises and promotes for profit-making purposes.
This targets established platforms such as YouTube. However, it can be interpreted in a much wider sense. In fact, the way the Internet works does not allow for a clear distinction between the activities of YouTube and other activities which appear unrelated at first glance.
For example, think of a European company that offers a public platform where its customers can discuss its products. Each contribution to the discussion constitutes an upload of text, maybe in combination with other media such as an image. Does this qualify as a content sharing service?
User contributions in a discussion forum can qualify as “copyright protected works”. A company which provides a discussion platform for its users does organise and promote them. Since the company provides this service to popularise its products, this can also be seen as “for profit-making purposes”. This leaves room to argue that the company is an “online content sharing service provider” under article 2(5) of the proposed directive.
In that case, article 13(4), point (b) of the proposed directive would require the company to scan every single contribution for copyright-infringing content before making it public. (The current practice of taking down illegal content upon notice would no longer be legally sufficient.) If this is done by humans, it creates delays which render the platform useless for discussion. So the company is required to use upload filters whenever rightholders have provided it with the relevant and necessary information.
Article 13(4aa) of the proposed directive states that
[...] new online content sharing service providers whose services have been available to the public in the Union for less than three years and which have an annual turnover below EUR 10 million [...] are limited to the compliance with the point (a) of paragraph 4 and to acting expeditiously, upon receiving a sufficiently substantiated notice, to remove the notified works and subject matters from its website or to disable access to them.
Where the average number of monthly unique visitors of these service providers exceeds 5 million, calculated on the basis of the last calendar year, they shall also demonstrate that they have made best efforts to prevent further uploads of the notified works and other subject matter for which the rightholders have provided relevant and necessary information.
According to the Communia analysis, this means that all providers which either have an annual turnover of EUR 10 million or more, or have existed for more than three years, are obliged to implement upload filters.
In effect, there is no lasting exception for small enterprises: after three years at the latest, all platforms have the same obligations as YouTube.
Recital (37a) of the proposed directive tries to limit the applicability of article 13: [1, p. 35]
[...] The definition does not include services which have another main purpose than enabling users to upload and share a large amount of copyright protected content with the purpose of obtaining profit from this activity. These include, for instance, electronic communication services [...], as well as providers of business to-business [sic] cloud services [...], or online marketplaces whose main activity is online retail and not giving access to copyright protected content. Providers of services such as open source software development and sharing platforms, not for profit scientific or educational repositories as well as not-for-profit online encyclopedias are also excluded from this definition.
Although this recital grants an exception specifically for Wikipedia, the Wikimedia Foundation, which operates Wikipedia, does not support the EU Copyright Directive in its current form: “As content outside of Wikipedia shrinks, so will the depth, accuracy, and quality of Wikipedia's content. We rely on the outside world to build our collaborative encyclopedia, and what affects the Internet ecosystem as a whole affects Wikipedia, regardless of direct legal carve-outs.”
A similar confusion holds for free software and open source development platforms, which can be – and sometimes are – used for many kinds of related services. The boundary between a pure “open source platform” and a “content sharing service provider” is fluid, which would be a source of legal uncertainty.
Well-established platforms such as Wikipedia and GitHub might be safe due to their size and their prominence. Small platforms such as public development platforms run by small companies or associations which cannot afford a lawsuit will be substantially harmed by this legal uncertainty.
This method of introducing carve-outs has been criticised as “bad law making” by a reputable professor of intellectual property.
The proposed directive affects European start-ups older than three years in exactly the same way as it affects YouTube, but YouTube can afford to install upload filters, while start-ups cannot. So the directive forces start-ups to use YouTube's upload filters and pay for that service. This gives YouTube control over its competitors, and it has disastrous implications for data privacy. (See question 3 for details.)
Representatives of copyright collectives have publicly claimed that the proposed directive will force platforms like YouTube to obtain licenses for their copyrighted works, and that they will transfer a share of the license fees to their artists.
At the same time, an artist who is a board member of a major copyright collective stressed that it does not serve the artists if the platforms are simply blocking their content. (“Blockieren ist nicht in unserem Sinne.” – “Blocking is not in our interest.” [7, min. 12:47])
However, since it is impossible for platforms to license all potentially copyrightable material in the world, they have no choice but to block content using upload filters.
As a result, it is not clear whether artists can benefit from licensing, but it is clear that they will be hurt by blocking.
(For further analysis of how to help artists in the digital world, see, for instance, .)
The additional rights granted to press publishers by article 11 of the proposed directive would make the use of more than “individual words or very short extracts of a press publication” subject to licensing.
On the Internet it is common practice to reference (“hyperlink”) an article by citing its headline, often together with a small snippet of the referenced material. Such links are essential for the usefulness of the Internet.
Article 11 of the proposed directive is likely to reduce the number of hyperlinks significantly, thus weakening the backbone of the Internet and making it more susceptible to abuse.
For example, if reputable news sources charged a fee for linking to them together with headlines and snippets, they would become more difficult to find. This would discourage the spread of reliable news, giving unreliable sources (propaganda, “fake news”) an advantage over respectable journalism.
The same mechanism would favour big, established platforms over small start-ups which cannot afford the fees, harming innovative news businesses in Europe.
Article 11 of the proposed directive is against the spirit of article 2(8) of the Berne Convention, which specifically excludes daily news and press information from the realm of copyright: 
The protection of this Convention shall not apply to news of the day or to miscellaneous facts having the character of mere items of press information.
Article 10(1) of the Berne Convention specifically allows for the use case which article 11 of the proposed directive intends to forbid:
[...] including quotations from newspaper articles and periodicals in the form of press summaries.
The text of article 11(1) of the proposed directive provides exceptions for non-commercial use by individuals and for acts of hyperlinking. [1, p. 63]
What does “act of hyperlinking” stand for? Taken literally, it means that it is still allowed to refer to some media by technical means, but no longer allowed to describe where the hyperlink leads. Since this contradicts the purpose of hyperlinking, the exception is too vague to help anyone.
The exception for “non-commercial use by individuals” is open to interpretation. Is someone with a huge following on social media, who posts adverts to that audience, a “private and non-commercial” entity? 
Article 11 is the all-European version of the German Leistungsschutzrecht für Presseverleger (Ancillary Copyright for Press Publishers), which has been in effect since 2013. According to studies, the effects of this law on press publishers were negative, especially for start-ups and small businesses.
Similar amendments of Spanish copyright law have been found to have done “substantial damage to the Spanish news industry”. [15, p. 85]
Yes, there are. For a detailed analysis of the pros and cons of all individual articles, see .
In addition to the problems with articles 11 and 13 discussed above, the proposed directive puts burdens on European researchers in the field of text and data mining, and it fails to establish reasonable exceptions from copyright for educational purposes and for works that are permanently located in public spaces.
In summary, the proposed directive fails to deliver on the primary goal of the reform as promised in article 1(1) to ensure a “well-functioning marketplace for the exploitation of works and other subject-matter”. [1, p. 49]
We believe the European Parliament should at least remove articles 11 and 13. Unless both of the articles are removed from the directive, the Parliament should vote to reject the entire directive.