The president of the United States encouraged the assault on the Capitol through social media. Facebook decided to suspend his account at least until the end of his term. Twitter shut down the president's account for 12 hours and then, after an employee petition, did so permanently. Twitch, Snapchat, and YouTube have also limited Donald Trump's public statements in various ways. What are the consequences of these decisions? One can answer on principle or pragmatically.
The first path is followed by those who talk about rights. Those who frame the issue in terms of freedom of expression recommend caution whenever a decision is made that infringes on this right. The argument matters. Diminishing freedom of expression harms everyone: those who rejoice when their opponent is silenced at one stage of history may find themselves suffering the same treatment at a later stage. But no one argues that freedom of expression is an absolute right. The rules must be balanced so that other rights, from privacy to security and public order, are also safeguarded. Jurists and philosophers know how to handle these delicate questions. Do the private citizens who own social networks know as well?
It is instructive to see how the owner of Facebook can impose his choices by asserting his principles, exactly as the owners of private television stations and newspapers used to do. But whatever a particular platform decides, the social phenomena it hosts will adapt, not disappear. Twitter's progressive interventionism has gone hand in hand with the emerging success of Parler, a similar service founded by John Matze and Rebekah Mercer that presents itself as a champion of freedom of expression and has therefore been adopted by anti-Semitic, conspiratorial, and Republican groups, even attracting advertising from companies in tune with these positions. Incidentally: Apple has excluded Parler from the App Store; Google and Amazon made similar decisions shortly after.
All of this has consequences. In the past, one could argue that these platforms were nothing more than software made available to users, and that users were responsible for how they used it. Today this idea is less credible. The way platforms are designed and maintained affects user behavior, and vice versa.
Meanwhile, the next debate is already taking shape. The platforms used to organize attacks on public order are not the ones used for open debate: actions are planned on platforms that guarantee the privacy of communications, like WhatsApp and Telegram. What responsibility do these platforms have if their owners know nothing about what their users are doing? Does the difference lie in the tools made available? If they enable strong virality of information, in complete secrecy, do they put themselves at the service of potential acts of violence?
And all of this is only part of the problem; the most difficult part remains. The responsibilities of social platforms are diluted in the larger system of media accountability. The interplay of cross-references between media and public figures generates powerful cross-media narratives. If anything, what is specific to the platforms is their algorithms, which present only the information that confirms users' biases.
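The bias-confirming mechanism described above can be made concrete with a minimal, purely illustrative sketch. The function name, the stance scores, and the ranking rule are all assumptions for the example, not any platform's actual algorithm: a feed simply ordered by closeness to a user's inferred position will, by construction, surface agreement first.

```python
# Hypothetical sketch of a bias-confirming feed ranker (illustrative only;
# not any real platform's algorithm). Each item carries a stance score in
# [-1.0, 1.0]; the feed favors items closest to the user's own stance.

def rank_feed(items, user_stance):
    """Order items so those nearest the user's prior stance come first.

    items: list of (headline, stance) pairs, stance in [-1.0, 1.0]
    user_stance: the user's inferred position in [-1.0, 1.0]
    """
    # Smaller distance from the user's stance => higher in the feed.
    return sorted(items, key=lambda item: abs(item[1] - user_stance))

feed = rank_feed(
    [("story A", -0.8), ("story B", 0.1), ("story C", 0.7)],
    user_stance=0.6,
)
# The feed leads with the story most aligned with the user's stance
# and buries the one furthest from it.
```

Nothing in this toy ranker is malicious; optimizing for predicted agreement is enough to produce the self-confirming feed the article describes.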
But there is one general rule: infodiversity makes the media ecosystem healthier. Each element of the system should favor, or at least tolerate, encounters with diversity rather than the obsessive confirmation of opinions. A society that learns to vaccinate itself against self-referential positions lives better.
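The infodiversity principle can be sketched the same way, again with hypothetical names and scoring: instead of ranking by agreement, a feed can be re-ordered greedily so that each new item's stance is as far as possible from the stances already shown.

```python
# Hypothetical sketch of a diversity-promoting re-ranker (illustrative only).
# Greedy max-min selection: each step adds the item whose nearest
# already-shown stance is the farthest away.

def diversify(items):
    """Re-order items so consecutive picks differ in stance.

    items: list of (headline, stance) pairs, stance in [-1.0, 1.0]
    """
    remaining = list(items)
    feed = [remaining.pop(0)]  # seed the feed with the first item
    while remaining:
        # Pick the item maximizing the minimum distance to shown stances.
        best = max(
            remaining,
            key=lambda it: min(abs(it[1] - shown[1]) for shown in feed),
        )
        remaining.remove(best)
        feed.append(best)
    return feed
```

Given three items clustered at one pole plus one outlier, this rule alternates poles rather than stacking the cluster, which is the "encounter with diversity" the rule above calls for.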