Regulating the Internet through decentralization

The copyright directive and the recent debates on “fake news” have served as a prelude to the broader debate on Web regulation that will take place next year. Today, La Quadrature du Net presents its concrete proposals.

The French government wants the large social networks to stop encouraging the spread of “hateful or extremist speech”. So be it.

The report ordered by the Prime Minister, aiming to “reinforce the fight against racism and anti-Semitism on the Internet” and published last Thursday, explains it well. It denounces “a vicious correlation between hate speech and advertising impact: individuals making offensive comments are those who generate the most revenue, as each of their comments can spawn fifty or a hundred more. Seen from this point of view, it is in the social networks’ best interest to host as many of them as possible.”

In general, the report laments the “rule according to which offensive remarks ‘buzz’ more than agreeable ones, fueling these platforms’ economic model more reliably”. We made this same analysis in May last year, when we were preparing our collective complaints against the Big Five, to explain why we should take on Google or Facebook.

To compensate for this “rule” that would make hatred and conflict profitable, the government wants to reinforce the obligations imposed on the giant platforms that profit from both, through increased transparency and a duty of care. Why not? This could be done in ways that are more or less relevant, and we will come back to this later. But this solution will never be enough on its own to counter the abuses allowed by the “profitability of conflict”. And it would be foolish to think, as the aforementioned report does, that this issue would be lessened by assigning a judge to each and every libel or insult posted on the Internet. There are far too many of them.
No. If we seriously want to deal with this issue, we need to challenge the attention economy as a whole. And for that, healthy alternatives relying on a different model from that of the Big Five must be able to emerge.

Current laws favorable to the giants

For fifteen years, and still as of today, laws have slowed down the development of such alternatives. They impose considerable obligations on all “hosts” (those who store content provided by the public and make it available on the Internet). If a host is notified of “clearly illicit” content, they must censor it “promptly”, or they become personally liable for it [1].

In practice, we at La Quadrature du Net have considered offering a video hosting service (by letting anyone upload videos to our PeerTube-based streaming service). This would be a concrete way to take part in building an alternative to YouTube, while gaining nothing from antagonistic speech or mass surveillance. But we had to give it up. We do not have nearly enough legal experts to assess which videos would be “clearly illicit”. We do not have the means to pay fines should complaints be filed. YouTube remains on top.

Lightening the load on hosts’ shoulders by distinguishing them from the giant platforms

If the government wants to effectively fight the spread of “hateful and extreme content”, it has to change the law to encourage the development of alternatives to the attention economy. Here is our proposal.

First, hosts should no longer be subject to the same obligations as the giant platforms, which actively regulate information in their own economic interest.
Secondly, neutral hosts, who gain nothing from featuring one piece of content rather than another, should no longer be responsible for assessing whether content is “clearly illicit” and must be censored. Only a judge should be able to require them to censor content.

The virtuous circle of decentralized regulation

Allowing a multitude of small hosts to develop opens the hope of efficient self-regulation, placed in the hands of the population as a whole.
Within the limits of the law, each host applies its own moderation rules, strict or lax, and each person chooses the discussion space best suited to their needs and desires. This freedom of choice is reinforced by the development of “decentralized social network” standards, notably the ActivityPub standard, published in January 2018 by the World Wide Web Consortium (W3C, which maintains Web standards) and already put into practice by Mastodon (an alternative to Twitter) and PeerTube. These standards will allow any number of hosts to communicate with one another, each according to its own rules. They will also allow each individual to move freely from one host to another, from one set of rules to another (something the giant platforms are currently trying to prevent by all means).
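
To make this concrete, here is a minimal, purely illustrative sketch in Python of what such a federation exchange looks like under ActivityPub: a post is just a small JSON document that one host delivers to the servers of the author’s followers, wherever those followers are hosted. The host names and accounts below (example.social, another.host, alice, bob) are fictional, chosen only for the example.

    # Minimal sketch of an ActivityPub "Create" activity (ActivityStreams 2.0
    # vocabulary, as standardized by the W3C). All names here are fictional.
    import json

    note = {
        "@context": "https://www.w3.org/ns/activitystreams",
        "type": "Note",
        "attributedTo": "https://example.social/users/alice",
        "to": ["https://example.social/users/alice/followers"],
        "content": "Hello from a small, self-hosted community!",
    }

    activity = {
        "@context": "https://www.w3.org/ns/activitystreams",
        "type": "Create",  # "someone published something"
        "actor": "https://example.social/users/alice",
        "object": note,
    }

    # In a real federation, example.social would POST this JSON to the inbox of
    # each follower's server (for instance https://another.host/users/bob/inbox),
    # and each receiving server would apply its own moderation rules before
    # showing the message to its users.
    print(json.dumps(activity, indent=2))

The point, for the argument above, is that nothing in this exchange depends on a single central platform: any host that speaks the same standard can receive, relay or refuse these messages according to its own rules, and a person unhappy with those rules can move their account to another host without losing access to the rest of the network.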

Each individual will choose whether or not to expose themselves to this or that type of conflict, and each host will moderate its community on a human scale. This structure gives hope for a significant reduction in unwanted interpersonal conflict on the Internet. Courts will then no longer have to hand down as many decisions as there are conflicts on the giant platforms, and will be able to focus on the most serious offences.

If the government wants to regulate the Web better, it must do so seriously. Simply making the giants’ obligations heavier is too superficial a measure. To act in depth and for the long term, its approach must also be constructive and encourage the development of virtuous models.

References
[1] This rule is set by article 6 of the 2004 loi pour la confiance dans l’économie numérique (LCEN), transposing article 14 of the 2000 Electronic Commerce Directive.