Support La Quadrature du Net!

Automated platform filtering: La Quadrature sends its arguments to MEPs

Paris, 7 March 2017 — The draft of the new European copyright directive was presented in September 2016. Work is now under way in the European Parliament, and mobilisations by concerned people and organisations are multiplying. Attention is focused on the two articles that La Quadrature du Net pointed out in September: Article 11, on ancillary copyright for press publishers, and Article 13, on the use of "effective content recognition technologies" by content platforms.
La Quadrature du Net today publishes its positions on Article 13, informed by discussions and workshops with creators, legal experts and, more broadly, everyday users of digital culture. These positions have also been sent to Members of the European Parliament to feed into the work of the Committees. The preliminary work carried out by the European Parliament Committees shows that, contrary to what one might think, nothing is locked in and many questions remain open in the copyright dossier. Articles 11 and 13 are the subject of various discussions, and some MEPs' proposals show that they are paying attention to the evolution of uses.

Article 13 concerns the use of protected content by Internet services which store and give access to a large number of works and other protected material uploaded by their users. It provides that these services must take measures to ensure compliance with their agreements with rights holders, in particular through the "use of effective content recognition technologies".

La Quadrature du Net, after several workshops and meetings, has chosen to develop its arguments against Article 13 along three lines: the first strictly legal and rights-oriented; the second dedicated to showing how Article 13 will be counter-productive for creation and creators; and the third highlighting both its failure to resolve the "value gap" between platforms and creators and its incompatibility with the current legal regime for content hosts.

Automated removal of illicit content: a serious attack on legal principles

Inverting the burden of proof

To begin with, this article inverts the burden of proof: rather than requiring the rights holder to prove illicit use of their work, it requires the user who put the work online to prove, after it has been automatically removed, that the content does not violate anyone's rights. This mechanism can gravely damage freedom of expression and creation.

The automatic nature of the sanction discourages any appeal and eliminates the right to a fair hearing, which underpins the principles of the rule of law.

Nothing in the directive requires platforms to consider potential claims or to put in place appeal procedures (other than a vague obligation of "adequacy and proportionality" and the mention of a complaint mechanism with no accompanying guarantees).

Broken equity

Additionally, this measure forcefully breaks legal equity: while rights holders need not take legal action to have content removed, publishers whose content has been wrongly removed are themselves required to go to court to assert their rights after the fact.

Another breach of equity will certainly appear between rights holders rich enough to fingerprint all or most of their catalogue so that the robots can detect its reuse, and those who cannot afford to do so: if automated removal of illicit content becomes the norm, only those wealthy enough to underwrite its costs will be able to have their rights respected.
Will platforms which haven't received fingerprints from rights holders nonetheless be required to deploy detection tools? Will the absence of such tools imply the de facto illegality of these platforms?

If the situation remains unclear, there will be serious risks to competition, in the sense that rights holders may find themselves in a position to decide which platforms they deem legitimate, and therefore which ones may or may not continue to exist.

Overseeing the tools to detect illicit content

The matter of overseeing the robots is equally crucial: who will oversee the robots and verify their workings? Who will be able to certify that the robots have the analytic finesse to distinguish between a work's illicit use and its parody? Who will be able to validate that there will be no abuse, no excess, no abusive interpretation of copyright?

In light of how this type of robot works on video platforms (YouTube, for example), it is already well established that they make many mistakes.
Among these mistakes, we have already seen rights holders who fingerprint works re-appropriate others' works, depriving those creators of the right to publish their own work freely.


In light of these many disquieting points, we recommend rejecting the automation of the detection of protected works on content platforms, on pain of making the legal environment for publishing on the Internet considerably more difficult and vastly expanding the damage to fundamental rights.

Article 13 of the Copyright Directive: a threat to creation

Censorship unable to identify legitimate exceptions to copyright

Automatic censorship tools are by nature incapable of determining whether a work's re-use is a mere unchanged copy or a satire, a criticism or a remix (among other legitimate legal possibilities for re-using an extract of a protected work). This type of measure nullifies and seriously endangers all creative culture based on the use of existing works to nourish new creation.

Yet transformative culture is deeply rooted in the new uses and services that this article touches on. To damage it indiscriminately, as Article 13 would, amounts to endangering a very important part of current audio and video creation.

This kind of creation, which transforms or uses pieces of other works, is a part of the global cultural ecosystem that cannot be suppressed without consequences. For example, many video creators play an important role in popularizing science and disseminating knowledge. Hundreds of thousands of people view these publications, participating in a vibrant cultural and educational scene, especially young audiences, who inform themselves and build their knowledge through YouTube and other platforms rather than by traditional means.

Predictable conflicts among rights holders, a negation of the amateur creator

Furthermore, this provision [Article 13] could have harmful repercussions for works shared under free licences or that have entered the public domain. The experience of YouTube's robotic detection of protected works has led to many conflicts between rights holders, with significant litigation at stake, and as a side effect has also changed the conditions of creation, since creators can no longer be guaranteed control over how their works are distributed. It would be practically impossible for creators to manage their own promotion through timed releases of content: how would robotic detection tell the difference between an "illegal" release and a promotional one?

As for the very principle of these tools, they flagrantly neglect the status of amateur creators, who can only be acknowledged and protected when registered with a rights management company responsible for supplying sharing platforms with the fingerprints of the works to "protect". This contradicts the principle of copyright, which protects each creator independently of their professional or amateur status.

The draft directive offers no guarantee at all to ensure the greatest possible reduction of censorship errors, since it imposes no duty of care as to results or methods. Nor does it take into account the territoriality of law or national differences in enforcing copyright, thus putting creators and users in a situation of permanent legal uncertainty.

This provision, supposedly protecting creators, is actually a way of restricting the capacity for creation and distribution that brings no advantage to creators themselves. Furthermore, it risks creating an outlaw culture which will migrate to private or hidden platforms, since the targeted practices will not disappear (they are already in massive use) but only vanish from the visible face of the Internet, discouraging new generations of creators. To see how this works out, we need only look at the results of Hadopi in France: when the law was passed in 2008, it was supposed to solve the problem of illegal sharing, but by 2015 it covered only 9% of music downloads.

To address new cultural practices, it would be better to include in the Directive the proposals made by the IMCO and CULT rapporteurs:

  • to create an extended "quotation exception" for audiovisual works (CULT)
  • and an exception for transformative works (IMCO).

This would be significant progress in adapting copyright to new kinds of uses.

Article 13 conflicts with the status of content hosts and doesn't solve the "value gap"

By demanding that platforms deploy tools to detect illicit content automatically, this article severely damages basic legal principles. Beyond that, it poses many problems of compatibility with the e-Commerce Directive of 2000, which governs most of the allocation of responsibility among Internet actors, and endangers a long-standing equilibrium without ever solving the problem of the value gap.

The e-Commerce Directive of 2000 imposes no requirement of preemptive surveillance of content on providers of online content-sharing services. A general requirement to install tools to detect illegal content cannot be reconciled with this complete absence of a priori liability for content hosts, which was originally enacted to permit the development of new services. For the last 15 years, it is this balance that has legally protected content hosts. Any future corrections to this regime should not be made solely through a directive on copyright, without prior global consultation.

A method which fails to resolve the value gap

The value gap is not resolved by removing content, because removal produces no remuneration for the creator. Worse, creators are deprived even of the visibility their works gain from being present on the Internet, legally or not. Any possibility of compensation disappears along with the removed content, and policing illicit content can play no role in redistribution. The measure therefore cannot meet its stated objective.

The Internet has become a very significant advertising resource. We would like to support economic models other than advertising revenue, but advertising can at least serve as a basis for addressing the problem of value transfer more effectively than removing content. One can imagine more comprehensive fiscal measures: European fiscal harmonisation, measures targeting extra-European companies, a levy on the advertising revenue or general revenue of the platforms, and so on.

The question of the income differential between platforms and creators can be settled only by confronting the problem of apportionment, with rights holders genuinely accepting new ways of sharing.

Economic disparities among platforms

The general obligation to deploy tools for the automatic detection of illegal content will generate a strong disparity among platforms: both the development and the purchase of this kind of solution are extremely expensive. The few companies currently able to develop reliable detection tools are themselves actors in the digital content market; they will gain the upper hand and make smaller actors highly dependent on them for buying or renting the use of their tools.

The probable growth of litigation due to the inevitable errors of these tools will also bring additional costs. Only the existing large platforms, many of them non-European, will be able to maintain an acceptable quality of service and remain in good standing with the automatic detection of illegal content, while smaller entities and newcomers will have to bear a much higher, even totally prohibitive, cost.

Paradoxically, this measure will probably favor the GAFA monopoly and kill the emergence of European actors by disproportionately increasing the cost of market entry and the unpredictable financial risks of creating a content-sharing service. This is really a question of choosing which economic model we want to promote within the European Union.

La Quadrature du Net urges the MEPs of the Committees involved in the work on the Copyright Directive to pay attention to the various problems presented in this analysis, and to simply delete Article 13. Proposals are emerging from the Committees to intelligently adapt copyright to the digital era. It would be more useful to support and improve these proposals so that this copyright reform can be ambitious and engage creators and users in a new dynamic beneficial to everybody.