Censorship and Freedom of Expression

The French Law for Trust in the Digital Economy (Loi pour la Confiance en l’Économie Numérique, or LCEN), adopted in 2004 and frequently amended since then, regulates censorship and the removal of Internet content in France.

Download our proposals in PDF

Moratorium on blocking measures



Administrative censorship of websites is an unacceptable encroachment not only on freedom of expression but also on the very principle of the separation of powers. La Quadrature du Net strongly opposes it, and has therefore brought a legal action before the French Council of State (Conseil d’État).

However, even when they are ordered by judicial authorities, such blocking measures appear both ineffective and disproportionate:

  • Disproportion: in 2011, following a bipartisan information report on net neutrality, the Socialist Group in the French National Assembly (Assemblée Nationale) proposed (FR) to write a moratorium on, and an assessment of, website blocking measures into the law. A few weeks earlier, a UN report (FR) had similarly highlighted that such blocking measures are often adopted by States in violation of their obligations under international law. Given the issues of overblocking (FR) and the questionable efficiency of such measures, website blocking appears contrary to the principles of proportionality and necessity under European law (FR) as well as under the French Constitution, especially since alternative measures exist, such as removal of content at the source, even if their efficacy remains hampered by the lack of diplomatic effort to facilitate the judicial and police cooperation needed to enforce international law in the borderless space of the Internet.
  • Lack of legal basis: beyond the lack of proportionality, the growing use of blocking orders by French courts raises another problem: the lack of a legal basis for such censorship measures. These blocking measures are in fact ordered on the basis of remarkably vague legal provisions, in particular the wording according to which the French judicial authority can take ‘every appropriate measure’ to prevent or stop a damage (LCEN, Art. 6-I-8).

    Yet, European law requires that such measures be provided for by French law “expressly, and in clear, precise and predictable terms”, as stated by the Advocate General of the Court of Justice of the European Union in the Scarlet Extended case.

    By the same token, in a concurring opinion appended to the Yildirim v. Turkey ruling of December 18, 2012, ECtHR judge Pinto de Albuquerque put forward a list of criteria that national laws should include in order to regulate website blocking measures. Among other things, he indicated that the law should specify which categories of persons and institutions may have their content blocked and which interests could justify such measures, and should define the different categories of blocking orders a judge may issue as well as their technical requirements. He also proposed that the law should guarantee the right to a fair trial, and therefore the possibility for the person or institution aggrieved by the blocking to be heard before the judge issues the blocking order. Finally, judge Pinto de Albuquerque specified that “neither the general provisions and clauses governing civil and criminal responsibility nor the e-commerce Directive constitute a valid basis for ordering Internet blocking”.

    On all of these points, French law as well as European Union law remain particularly incomplete. In these circumstances, and in the absence of a moratorium on blocking measures, the government and the parliament must provide a precise framework setting out the conditions and procedures applicable to the judicial blocking of content on the Internet. They must do so by reviewing and completing the succinct 2004 formula of the LCEN, stating that the judge can take ‘every appropriate measure’ (Art. 6-I-8), which has since been replicated in many other laws.

Prohibition of notice-and-staydown

Numerous politicians and official reports call for bypassing the spirit of the law (Article 15 of the eCommerce Directive and Article 6-I-7 of the French LCEN) and of the case-law (FR) of the Cour de cassation (France’s highest judicial court) by advocating notice-and-staydown.

  • In addition to the fact that these measures seem to go substantively against what European and French lawmakers envisioned, La Quadrature du Net reminds the reader that such technical measures, which aim at preventing any given content from ever reappearing on the Internet, are akin to a form of preemptive automatic censorship.
  • Not only do such measures lack a proper legal basis, but more fundamentally they are unable to lead to a sound assessment of whether a given use of the Internet constitutes an offence or not. For instance, when it comes to preventing the re-posting of online content over copyright infringement, those in charge of the technical implementation will tend to calibrate it so as to ensure maximum legal certainty for themselves, without any consideration for lawful uses such as parody, public-interest information or the right of quotation (the sketch after this list illustrates why the matching step leaves no room for such an assessment). Delegating to private actors, and to the technical tools they have developed, the task of declaring content illegal, and thus of deciding whether it shall or shall not be blocked, is a deleterious tendency for the rule of law, especially in the context of the rise of algorithm-based decision-making.
  • Ideally, the law should be clarified and strengthened in order to end the regulatory and jurisprudential wavering on this issue.
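
To make the argument concrete, here is a deliberately naive sketch, written by us for illustration only, of how a notice-and-staydown filter typically works: uploads are reduced to fingerprints and rejected whenever they match a blocklist of previously notified works. Every name in it is hypothetical; no existing filtering system is being described. The point is that the only information available at matching time is the content itself, so lawful re-uses such as parody or quotation are blocked just the same.

```python
# Illustrative sketch only: a minimal fingerprint-based "staydown" filter.
# Every name here is hypothetical; no real system is described.

import hashlib

# Fingerprints of works already notified as infringing.
blocked_fingerprints = set()


def fingerprint(data: bytes) -> str:
    """Reduce a file to an identifier (real systems use perceptual hashes)."""
    return hashlib.sha256(data).hexdigest()


def register_notified_work(data: bytes) -> None:
    """Add a notified work to the blocklist so that it 'never reappears'."""
    blocked_fingerprints.add(fingerprint(data))


def allow_upload(data: bytes) -> bool:
    """Accept or reject an upload on the basis of its fingerprint alone.

    A parody or a quotation reusing the same material yields the same
    fingerprint: it is blocked without any assessment of lawfulness.
    """
    return fingerprint(data) not in blocked_fingerprints
```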

Implementing a system of notice-and-notice

The so-called “limited” liability regime for technical intermediaries, including hosting providers, is accompanied in the French LCEN by several obligations which, in practice, result in a privatisation of the regulation of public expression in the digital space.

The French notion of “blatantly illicit” content has become obsolete: Article 6-I-7 and the notice-and-takedown regime instituted by this provision of the LCEN, which transposes the European eCommerce Directive, lead to a situation that the French Constitutional Council had warned against. The Council explained, in the commentary to its decision on the LCEN, that hosting providers should not be forced to assess the lawfulness of online content, since “the characterization of an unlawful message can be highly arduous, even for a lawyer”.

The Council’s interpretation of the expression “blatantly illicit” was meant to prevent abuses and to limit the extra-judicial procedure set up by notice-and-takedown to the most serious offences, in which the reported content is indeed “blatantly illicit”. But case-law has extended the use of this notion to new categories of content (e.g. copyright infringement, libel), thereby stripping it of its protective value for freedom of communication.

Such an expansion has a twofold impact: it intensifies both the legal uncertainty weighing on hosting providers and their tendency to censor online content for fear of being held liable by a court. Judges even tend to hold hosting providers liable for not having withdrawn content that is merely “likely” to be illegal (see TGI Paris, April 15, 2008, Jean-Yves Lafesse c/ Dailymotion (FR)) or, likewise, to distinguish the notion of “blatantly illicit” from that of “certainly illicit” (see TGI de Brest, June 11, 2013, Josette B. c/ Catherine L. et Overblog).

Restore the jurisdiction of the judicial authorities: the notice-and-takedown procedure should be replaced by a notice-and-notice procedure:

The hosting provider should remain nothing more than a relay between the person complaining about allegedly illegal content and the person who published it online.

From the moment the hosting provider receives the notification, it should give the publisher a reasonable period of time to decide whether or not to remove the content. In case of a counter-notification from the publisher (if the publisher deems the content legal), the hosting provider should inform the third party who sent the withdrawal request that the publisher refuses to remove the content, and suggest that the case be brought before a court.

As for notifications concerning content that falls within categories of serious offences justifying preventive measures (to be specified in the law), such as child pornography, the hosting provider should suspend access to the content upon reception of the notification and until the dispute is settled, either amicably or through court proceedings.
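
The procedure sketched in the preceding paragraphs can be summarised as a simple decision flow. The code below is only our reading of the proposal: the class and method names, the offence list and the seven-day delay are placeholders of our own, not provisions of any existing or proposed law.

```python
# Sketch of the proposed notice-and-notice flow; all names are illustrative.

from dataclasses import dataclass
from datetime import timedelta

# Offence categories that would justify a preventive suspension of access
# (the actual list would have to be specified in the law).
SERIOUS_OFFENCES = {"child_pornography"}
REASONABLE_DELAY = timedelta(days=7)  # placeholder value, to be set by law


@dataclass
class Notification:
    content_id: str
    complainant: str
    publisher: str
    alleged_offence: str


class HostingProvider:
    """Relays complaints between the parties; never rules on lawfulness."""

    def handle(self, notif: Notification) -> None:
        if notif.alleged_offence in SERIOUS_OFFENCES:
            # Preventive suspension until the dispute is settled,
            # amicably or before a court.
            self.suspend_access(notif.content_id)

        # Forward the complaint; the publisher gets a reasonable period
        # of time to remove the content or to object.
        answer = self.forward_to_publisher(notif, deadline=REASONABLE_DELAY)

        if answer == "counter_notification":
            # The publisher deems the content lawful: inform the complainant
            # and suggest bringing the case before a court.
            self.notify_complainant(notif, removed=False, suggest_court=True)
        else:
            self.notify_complainant(notif, removed=True, suggest_court=False)

    # The methods below stand in for real mail or API plumbing.
    def suspend_access(self, content_id: str) -> None:
        print(f"access to {content_id} suspended pending resolution")

    def forward_to_publisher(self, notif: Notification, deadline) -> str:
        print(f"complaint forwarded to {notif.publisher}, deadline {deadline}")
        return "counter_notification"  # stubbed answer for the example

    def notify_complainant(self, notif: Notification, removed: bool,
                           suggest_court: bool) -> None:
        print(f"{notif.complainant} informed (removed={removed}, "
              f"court suggested={suggest_court})")


if __name__ == "__main__":
    HostingProvider().handle(Notification(
        "video-42", "complainant@example.org",
        "publisher@example.org", "copyright"))
```

Whatever the branch, the decision on lawfulness stays with a judge: the host merely relays the notifications, suspends access preventively for the most serious categories, and points the parties towards the courtroom.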

Finally, it is important to consider the creation of a platform to transparently document extra-judicial content removal measures, particularly if the legal regime remains one of notice-and-takedown. For now, apart from the “transparency reports” published by some major platforms, citizens, policy makers, researchers and journalists have no reliable and transparent information on the extent and nature of the removals made following notifications by third parties or by the administrative authority. The US platform Chilling Effects might be an example to learn from.

Forbid censorship by private technical intermediaries

In its September 2014 study on the digital environment and fundamental rights, the French Council of State took the position that “the possibility for a platform to remove certain legal content cannot be questioned: it is entailed by its contractual freedom and its freedom of enterprise”. Nevertheless, the policies implemented in recent years by platforms such as Google or Facebook have revealed the risk of legal content being censored by actors which otherwise claim the status of hosting providers and hence that of neutral intermediaries. Transparency and the possibility of bringing a case against these actors are not, in themselves, sufficient to provide satisfactory protection.

  • Echoing a proposal (FR) put forward by Laurent Chemla in 1999, the NumNow activist group recently proposed that general provisions (FR) for the repression of attacks on freedom of expression be included in the French Penal Code, in order to prevent the terms of service (ToS) of technical intermediaries, which benefit from a liability exemption, from undermining their users’ freedom of expression.
  • With these proposals, La Quadrature du Net calls for prohibiting any censorship by private technical intermediaries, under the threat of dissuasive or punitive damages calculated according to their annual turnover. Such a provision should be limited to technical intermediaries (exercising no editorial control) that provide a means of public expression and can be described as “universal”, in that they do not target their service at a restricted community of interest (a community of interest being defined (FR) by the French Cour de cassation as a “group of people sharing the same aspirations and objectives”).

    Thus, a universal social network like Facebook would inevitably fall within the scope of such a provision, unlike a social network whose by-laws or ToS specify that it is set up for a particular community of interest (such as one devoted to a company or to a religious community).

Exclusive jurisdiction of State agencies to collect illegal content notifications

In view of the extension of hosting providers’ monitoring obligations under several laws building on Article 6-I-7 of the French LCEN, it is important to streamline the legal regime while avoiding private censorship. In parallel with the establishment of a notice-and-notice regime, La Quadrature du Net recommends centralising the notification of illegal content in the hands of State services. Hosting providers should have only one obligation: providing their users with a feature (i.e. a piece of software provided over public infrastructure) that directly transfers citizens’ reports to the public authorities. The French platform internet-signalement.gouv.fr, set up by the French OCLCTIC (Central office for the fight against crime related to information and communication technologies), was designed precisely for this purpose but remains largely under-used and under-equipped. Worse, hosting providers have never been required to use it (apart from their obligation, under the notice-and-notice procedure, to forward notifications to the authors or publishers of the content, if necessary temporarily suspending access to the content when the alleged offence so justifies).
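
The single obligation described above would amount to a very small piece of software on the hosting provider’s side: collect the user’s report and pass it on, unchanged, to the public reporting platform. The sketch below assumes a purely hypothetical HTTP endpoint and field names; as far as we know, internet-signalement.gouv.fr exposes no public API, so nothing here describes the real platform.

```python
# Illustrative sketch of a host-side reporting feature that forwards a
# citizen's report straight to the public authority. The endpoint URL and
# the JSON field names are hypothetical placeholders, not a real API.

import json
import urllib.request

REPORT_ENDPOINT = "https://reports.example.gouv.fr/api/reports"  # placeholder


def forward_report(content_url: str, category: str, description: str) -> int:
    """Send the report to the public platform and return the HTTP status.

    The hosting provider adds nothing and decides nothing: assessing the
    report is entirely in the hands of the public authority.
    """
    payload = json.dumps({
        "content_url": content_url,
        "category": category,
        "description": description,
    }).encode("utf-8")

    request = urllib.request.Request(
        REPORT_ENDPOINT,
        data=payload,
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(request) as response:
        return response.status
```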

Legislative authorisation for organisations defending civil liberties in the digital world to take cases to court

As the recent Pirate Bay case showed (FR) in France, numerous restrictions of freedom on the Internet are ordered without any of the people involved having been given the opportunity to defend themselves in court. The law, and especially the LCEN, allows requests to be directed at technical intermediaries: for instance, an Internet Service Provider (ISP) can be sued in order to obtain the blocking of a site. Yet, ISPs mainly object that such measures are too cumbersome for technical or economic reasons, pointing at their cost and ineffectiveness. In such conditions, judgements scarcely take any account of fundamental rights.

  • Generally speaking, as stressed above, procedures resulting in rulings that deprive people of their rights must include the possibility for the persons involved to be heard, for the sake of the right to a fair trial.
  • In addition, especially in cases where the persons affected by a measure restricting their freedom cannot be represented in the proceedings (for instance because they prefer to remain anonymous), the possibility for civil liberties organisations to bring a case on their behalf would ensure that their fundamental rights are defended. For now, however, the relevant organisations, including those specialised in protecting fundamental rights in the digital world, lack the legal, material and human resources to intervene in cases of strategic interest in terms of jurisprudence. Furthermore, in the absence of an express authorisation, the recognition of their interest (FR) in bringing a case before the courts is not yet secured, especially before criminal courts. (Such an interest, which justifies an organisation’s participation in a legal action, can be established in its by-laws.)

    In this context, La Quadrature du Net recommends the adoption of a legislative authorisation allowing such organisations to bring legal actions in defence of fundamental rights on the Internet, as long as this purpose is provided for in their by-laws. This authorisation should apply before civil, criminal and administrative courts and should allow them to act as civil parties and to be awarded damages, in particular in order to fund the continuation of their actions.
