The Paris prosecutor’s office has launched an investigation into the functioning of social media platform X, formerly known as Twitter.
The inquiry follows a report submitted on 12 January by French MP Eric Bothorel, a member of the centre-right Ensemble pour la République (EPR) group, alleging that the platform’s algorithms are biased. The case was reported by Franceinfo on 7 February.
Concerns Over Algorithmic Manipulation
The investigation is being handled by magistrates and specialised cybercrime investigators, who are conducting initial technical assessments. According to the Paris prosecutor’s office, the algorithms used by X may have “distorted the functioning of an automated data processing system”, an offence under the French penal code.
Bothorel raised concerns about the platform’s algorithmic bias in a post on X, stating: “I have seen several posts referring to ‘those responsible for the distorted functioning of X’s recommendation algorithm.’ I referred the matter to the cybercrime division (J3) of the Paris prosecutor’s office by letter on 12 January.”
The exact nature of the alleged bias remains unclear, but previous reports have suggested that X’s algorithm may influence content visibility in a way that favours certain narratives while limiting the reach of others. This echoes broader debates over how social media platforms curate content and whether their algorithms promote or suppress specific viewpoints.
Elon Musk’s Influence Over X’s Algorithm
The investigation follows growing scrutiny of X’s algorithmic operations, particularly since Elon Musk acquired the platform in October 2022. Under Musk’s leadership, X has changed its recommendation system, and critics argue these changes have amplified certain types of content over others. Reports have suggested that Musk has exerted direct influence over the platform’s ranking system, allegedly favouring certain posts, including his own.
A previous Franceinfo report highlighted Musk’s influence on X’s algorithm, suggesting that his interventions have shaped the way content is prioritised. This has raised concerns among regulators and policymakers about the potential manipulation of public discourse.
Legal and Regulatory Implications
The Paris prosecutor’s investigation could have significant legal and regulatory implications for X, particularly under French and European digital governance frameworks. The European Union’s Digital Services Act (DSA), which became fully applicable across the EU in February 2024, imposes stricter transparency and accountability requirements on large online platforms, including X. Under the DSA, companies must provide detailed explanations of their algorithmic processes and ensure that their systems do not facilitate disinformation or discriminatory practices.
If the investigation finds that X’s algorithms have indeed distorted the functioning of an automated data processing system, the platform could face legal consequences under French law. France has taken an increasingly proactive stance on digital regulation, with authorities scrutinising major tech firms over issues ranging from data privacy to content moderation.
A Broader Trend of Social Media Scrutiny
The inquiry into X is part of a broader wave of regulatory actions against major social media platforms in Europe. Meta, Google, and TikTok have also faced investigations over their algorithmic practices and data processing methods. European regulators are increasingly concerned about the role of artificial intelligence and machine learning in shaping public discourse, particularly in relation to elections and political debates.
The outcome of the Paris prosecutor’s investigation could set an important precedent for how digital platforms operate in France and the wider EU. If evidence of algorithmic manipulation is found, X could face penalties, and the case may prompt further regulatory action against other tech companies.
For now, the investigation remains in its early stages, with cybercrime experts conducting technical assessments. The findings will determine whether further legal action is warranted.