Socio-Digital Networks as a Research Object

From the outset, digital social networks, and Twitter (now X) in particular, have been an object of research, especially in computational social science: a Google Scholar search on the topic returns more than 7 million results.

Deployed by the Silicon Valley giants in the mid-2000s, without any real questioning of their impact, digital social networks have turned our societies upside down, both from the point of view of individuals (mental health, new ways of getting information, impact on privacy, etc.) and from that of collectives (forms of citizen mobilization, economy, social dynamics, manipulation of opinion, foreign interference, etc.).

Around 60% of the world’s population now uses social networks to communicate and form opinions. Studying their impact on society and on opinion formation has therefore become essential. This is the focus of the 2024 European roadmap “Digital Media and Humanity Well-Being”, written by 40 scientists from 14 countries and coordinated by the ISC-PIF (David Chavalarias). It is complemented by interactive maps from the European DIGEING project.

These digital infrastructures are both a source of social innovation and a hazard for our societies, and for democracy in particular, and both aspects are the subject of a great deal of research.

The notion of systemic risk

Very large digital infrastructures (VLOPs, Very Large Online Platforms, >45M users in the EU), and social networks in particular, have a direct impact on the circulation of information and disinformation at the national and even global level, and consequently on the functioning of democracies. The annulment of Romania’s 2024 presidential election is a case in point. Numerous academic studies therefore focus on the qualitative and quantitative assessment of these impacts and their origins.

One of the main vulnerabilities of VLOPs is that they rely on a very limited number of algorithms that regulate the circulation of information on a global scale. This is the case, for example, with the news feeds of social networks: the main entry point for their users, and the mechanism that selects which information those users can access.

It is unprecedented in the history of humanity that the modification of an artefact, a few lines of code on a centralized server, can almost instantaneously modulate the perception of reality of hundreds of millions of individuals (over 2 billion in the case of Facebook).

If, for any reason, a VLOP’s News Feed algorithm were to fail (for example, by amplifying misinformation, toxic messages, certain political parties or a particular person), all its users would be affected instantly and in the same way, with potentially serious consequences.

This is likely what happened on the night of February 13, 2023, when Elon Musk reportedly asked his engineers to specifically boost his own tweets. A configuration error, without further consequences, then replaced the ‘For You’ feed of every X user on Valentine’s Day with a long list of Elon Musk’s messages. Subsequent adjustments kept priority for the messages of X’s owner, while leaving room for other users.

To regulate the new powers of Big Tech, European law recently introduced the notion of systemic risk: risks that Big Tech companies could generate by virtue of their size, and which only they can anticipate and prevent, since their algorithms and internal workings are largely opaque and covered by trade secrecy.

The notion of systemic risk enshrined in Art. 34 of the Digital Services Act (the European law) thus concerns all VLOPs, which must prevent any risk “arising from the design or operation of their service and related systems, including algorithmic systems, or from the use made of their services”. These systemic risks include (quote):

  • (a) the dissemination of illegal content through their services;

  • (b) any actual or foreseeable negative effects for the exercise of fundamental rights, in particular the fundamental rights to human dignity enshrined in Article 1 of the Charter, to respect for private and family life enshrined in Article 7 of the Charter, to the protection of personal data enshrined in Article 8 of the Charter, to freedom of expression and information, including the freedom and pluralism of the media, enshrined in Article 11 of the Charter, to non-discrimination enshrined in Article 21 of the Charter, to respect for the rights of the child enshrined in Article 24 of the Charter and to a high-level of consumer protection enshrined in Article 38 of the Charter;

  • (c) any actual or foreseeable negative effects on civic discourse and electoral processes, and public security;

  • (d) any actual or foreseeable negative effects in relation to gender-based violence, the protection of public health and minors and serious negative consequences to the person’s physical and mental well-being.

One way of mitigating the systemic risks associated with news feed recommendation algorithms is to implement algorithmic pluralism, i.e. giving users the ability to choose their feed algorithm from a range of options.

The advantage of algorithmic pluralism is twofold:

  1. users can change their filter on the world if they identify that the one they are using is faulty;
  2. even if users don’t identify such a risk, a faulty algorithm will only affect a fraction of users, unlike the algorithms imposed by default on platforms without algorithmic pluralism.
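The two points above can be sketched in code. The following is a minimal, hypothetical illustration of algorithmic pluralism (none of these class or function names come from any real platform API): a feed algorithm is just an ordering function over posts, and each user selects one from a registry instead of being assigned a single platform-wide default.

```python
from dataclasses import dataclass
from typing import Callable, List

# Hypothetical post model, for illustration only.
@dataclass
class Post:
    author: str
    text: str
    timestamp: float   # seconds since epoch
    engagement: float  # e.g. likes + reposts, pre-computed

# A "feed algorithm" is simply a function that orders a list of posts.
FeedAlgorithm = Callable[[List[Post]], List[Post]]

def chronological(posts: List[Post]) -> List[Post]:
    """Newest first: no editorial choice by the platform."""
    return sorted(posts, key=lambda p: p.timestamp, reverse=True)

def engagement_first(posts: List[Post]) -> List[Post]:
    """Engagement-maximizing ranking, the typical VLOP default."""
    return sorted(posts, key=lambda p: p.engagement, reverse=True)

# Algorithmic pluralism: users pick their ranking function by name.
REGISTRY: dict = {
    "chronological": chronological,
    "engagement": engagement_first,
}

def build_feed(posts: List[Post], choice: str) -> List[Post]:
    return REGISTRY[choice](posts)
```

Under such a design, a faulty ranking function only affects the fraction of users who selected it, which is precisely the risk-dilution argument of point 2.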

Potential systemic risks related to X

Several VLOPs potentially carry systemic risks, yet do not comply with the European regulations requiring them to prevent such risks. For example, Bouchaud et al. (2023) measured that X amplifies so-called toxic content (harassment, personal attacks, obscenity, etc.) in users’ ‘For You’ feeds. This amplification of toxicity rose from +32% to +49% in 2023, following Elon Musk’s takeover of X and his repeated interventions in its algorithm and moderation policy.

Other work (Chavalarias et al. 2024) has also shown, through modeling calibrated on X data, that this amplification of toxicity is directly linked to the maximization of engagement in recommendation algorithms. Another result is that this type of algorithm concentrates social capital in the hands of the most toxic users and increases the polarization of user communities. This amplification of digital homophily (“entre-soi”) has also been measured on X in other work (Bouchaud 2024).
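The mechanism can be illustrated with a toy simulation. This is not the calibrated model of Chavalarias et al., only a sketch built on one assumption: toxic posts attract more engagement on average. Under that assumption, an engagement-maximizing ranking over-represents toxic posts at the top of the feed, far beyond their share of the corpus.

```python
import random

random.seed(0)  # deterministic toy data

# Toy corpus: toxic posts are a 20% minority but, by assumption,
# receive a 1.8x engagement boost.
posts = []
for _ in range(1000):
    toxic = random.random() < 0.20
    base = random.gauss(10, 2)
    posts.append({"toxic": toxic,
                  "engagement": base * (1.8 if toxic else 1.0)})

def toxic_share_of_top(feed, k=100):
    """Fraction of toxic posts among the k highest-engagement posts."""
    top = sorted(feed, key=lambda p: p["engagement"], reverse=True)[:k]
    return sum(p["toxic"] for p in top) / k

corpus_share = sum(p["toxic"] for p in posts) / len(posts)
print(f"toxic share in corpus:   {corpus_share:.0%}")
print(f"toxic share in top-100:  {toxic_share_of_top(posts):.0%}")
```

The top of the engagement-ranked feed is dominated by the toxic minority, which is the qualitative effect the cited studies measure on X.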

Further suspicions of malfunction in X’s algorithms led the European Commission, at the end of 2023, to open an investigation against X under the DSA, and in early 2025 to turn its attention to X’s News Feed algorithm.

Effective data portability as a means of mitigating systemic risks

Legacy social networks are built on capturing their users’ data, as described by Shoshana Zuboff in The Age of Surveillance Capitalism. One of their strengths is that users cannot leave the network without losing everything they have built up there: content and audience. Beyond a certain size, these networks therefore become hegemonic: everyone has to be on them because everyone else is. This capture of data and audience amounts to capture under duress: your digital identity is attached to the platform, and you cannot leave it without losing all your followers (unlike telephony, where you can change operators without losing your phone number).

As a result, the non-interoperability of the major legacy social networks hinders innovation and exposes their users, as a collective, to a systemic risk should the network fail.

To solve this problem, the new European DMA (Digital Markets Act, Regulation (EU) 2022/1925) introduced the notion of a right to effective data portability (Recital 59), which both encourages innovation and mitigates the systemic risk of audience capture:

59. Gatekeepers benefit from access to vast amounts of data that they collect while providing the core platform services, as well as other digital services. To ensure that gatekeepers do not undermine the contestability of core platform services, or the innovation potential of the dynamic digital sector, by restricting switching or multi-homing, end users, as well as third parties authorised by an end user, should be granted effective and immediate access to the data they provided or that was generated through their activity on the relevant core platform services of the gatekeeper. The data should be received in a format that can be immediately and effectively accessed and used by the end user or the relevant third party authorised by the end user to which the data is ported. Gatekeepers should also ensure, by means of appropriate and high quality technical measures, such as application programming interfaces, that end users or third parties authorised by end users can freely port the data continuously and in real time. This should apply also to any other data at different levels of aggregation necessary to effectively enable such portability. For the avoidance of doubt, the obligation on the gatekeeper to ensure effective portability of data under this Regulation complements the right to data portability under the Regulation (EU) 2016/679. Facilitating switching or multi-homing should lead, in turn, to an increased choice for end users and acts as an incentive for gatekeepers and business users to innovate.

This recital is complemented by Article 6(9):

9. The gatekeeper shall provide end users and third parties authorised by an end user, at their request and free of charge, with effective portability of data provided by the end user or generated through the activity of the end user in the context of the use of the relevant core platform service, including by providing, free of charge, tools to facilitate the effective exercise of such data portability, and including by the provision of continuous and real-time access to such data.

The problem is that, at present, most VLOPs carry systemic risks without implementing either effective data portability or algorithmic pluralism. Worse still, some, such as X, do not even respect Article 40 of the DSA, which grants third parties, including researchers, access to platform data in order to assess systemic risks:

Upon a reasoned request from the Digital Services Coordinator of establishment, providers of very large online platforms or of very large online search engines shall, within a reasonable period, as specified in the request, provide access to data to vetted researchers who meet the requirements in paragraph 8 of this Article, for the sole purpose of conducting research that contributes to the detection, identification and understanding of systemic risks in the Union, as set out pursuant to Article 34(1), and to the assessment of the adequacy, efficiency and impacts of the risk mitigation measures pursuant to Article 35.

X is therefore a good example of how centralized networks can lead to systemic risks.

By contrast, Mastodon and BlueSky were conceived as digital ecosystems designed to guarantee three fundamental principles of user freedom: data portability, algorithmic pluralism and decentralization, with a few minor differences between them.

Unlike Twitter and other centralized social platforms (Facebook, LinkedIn, Instagram, etc.), Mastodon and BlueSky are based on open protocols (ActivityPub and the AT Protocol, respectively). In essence, everyone agrees on what it means to post a resource online and to relay an online resource; on that shared basis, any number of actors can then build social services within these digital ecosystems.

Since everyone uses the same protocol, everyone can exchange with and follow everyone else, even if they have chosen different services. Everyone owns their data and their digital identity, just as in modern telephony.
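Concretely, Mastodon-compatible servers resolve a handle such as @alice@mastodon.social through WebFinger (RFC 7033), then fetch the ActivityPub actor document it points to; that is what lets users on different servers follow each other. A minimal offline sketch, using a hand-written sample response (the handle and URLs are placeholders, not real accounts):

```python
import json
from urllib.parse import quote

def webfinger_url(handle: str) -> str:
    """Build the WebFinger lookup URL for a handle like 'alice@mastodon.social'."""
    user, domain = handle.lstrip("@").split("@")
    return (f"https://{domain}/.well-known/webfinger"
            f"?resource={quote(f'acct:{user}@{domain}')}")

def actor_url(jrd: dict) -> str:
    """Extract the ActivityPub actor URL from a WebFinger JRD document."""
    for link in jrd["links"]:
        if link.get("rel") == "self" and link.get("type") == "application/activity+json":
            return link["href"]
    raise KeyError("no ActivityPub actor link in JRD")

# Abridged sample JRD, shaped like a typical Mastodon server response.
sample = json.loads("""{
  "subject": "acct:alice@mastodon.social",
  "links": [
    {"rel": "http://webfinger.net/rel/profile-page",
     "type": "text/html", "href": "https://mastodon.social/@alice"},
    {"rel": "self", "type": "application/activity+json",
     "href": "https://mastodon.social/users/alice"}
  ]
}""")

print(webfinger_url("@alice@mastodon.social"))
print(actor_url(sample))
```

Because this resolution step is specified by open standards rather than by any one company, any server implementing it can interoperate with the rest of the network.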

Having social networks based on open protocols has another considerable advantage: decentralization. Unlike X, LinkedIn or Facebook, no one can buy the protocols on which Mastodon or BlueSky are based, just as no one can buy the HTTP protocol that connects the Web or the conventions that enable phones to communicate.

If a billionaire buys a Mastodon server, its users can, if they wish, migrate in a few clicks to another Mastodon server that suits them better, taking their data and audience with them. BlueSky lags behind on decentralization, but initiatives such as Free Our Feeds are emerging to develop this aspect.
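Under the hood, a Mastodon migration rests on two ActivityPub building blocks: the new actor lists the old one among its `alsoKnownAs` aliases, and the old account broadcasts a `Move` activity that followers’ servers verify before automatically re-following the new address. A simplified sketch (the URLs are placeholders; the field names come from the ActivityStreams vocabulary):

```python
# Placeholder actor URLs for an account moving between servers.
OLD = "https://old.example/users/alice"
NEW = "https://new.example/users/alice"

# Step 1: the NEW actor must declare the old account as an alias,
# otherwise receiving servers reject the move.
new_actor_fragment = {
    "id": NEW,
    "alsoKnownAs": [OLD],
}

# Step 2: the OLD account broadcasts a Move activity to its followers.
move_activity = {
    "@context": "https://www.w3.org/ns/activitystreams",
    "type": "Move",
    "actor": OLD,    # who initiates the move
    "object": OLD,   # the account being moved
    "target": NEW,   # its new home
}

def accept_move(activity: dict, target_actor: dict) -> bool:
    """A follower's server checks the alias before re-following the target."""
    return (activity["type"] == "Move"
            and activity["object"] in target_actor.get("alsoKnownAs", []))

print(accept_move(move_activity, new_actor_fragment))
```

The bidirectional check (the old account points to the new one, and the new one acknowledges the old one) is what prevents a hostile server from hijacking someone else’s followers with a forged Move.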

From this point of view, it is undeniable that networks more virtuous than X exist, but their deployment on the social network ‘market’ is hampered by the lack of portability of X’s social graph and data. If you leave, you lose your threads and your followers: X holds you captive through your audience.
