X Faces Scrutiny Over Risks to Electoral Integrity and Content Moderation

Social media giant X admits in its EU-mandated risk assessment that its platform poses a high risk to electoral integrity, raising concerns over disinformation and its content moderation policies.

X said it had moved from a “binary, absolutist take down/leave up moderation framework … to a more reasonable, proportionate and effective moderation process.” | Nicolas Tucat/AFP via Getty Images

Social media platform X, owned by Elon Musk, is under fire after a legally mandated risk assessment revealed that the platform poses a “high risk” to democratic processes, even as it touts its commitment to freedom of expression. The assessment, conducted under the EU’s Digital Services Act (DSA), was published on Wednesday and highlights significant challenges for the platform, particularly ahead of critical elections across the European Union.


X’s Approach to Content Moderation

In the risk analysis, dated September 2023, X claimed its content moderation policies and tools, including features such as Community Notes and verified checkmarks, have reduced risks in most areas to a “low to medium level.” However, the platform acknowledged ongoing vulnerabilities, particularly in combating terrorist content and election-related disinformation, as the tactics of malicious actors evolve rapidly with advances such as generative AI.

The company touted its move from a “binary” approach to content moderation—where posts were either removed or left up—to a system emphasizing demotion of problematic content. X argued this approach fosters “open conversation,” a hallmark of Musk’s vision since his acquisition of the platform in 2022.

Critics, however, have noted that this shift coincided with a rise in hate speech and misinformation, fueling concerns over its impact on user safety and electoral integrity.


Reaction from Experts and Regulators

Ellen Judson, an investigator at Global Witness, called it “extraordinary” that X openly admitted to the high risk its platform poses to democratic processes. She emphasized the importance of these findings for the European Commission’s ongoing investigation into X’s compliance with the DSA.

The EU has already launched a probe into X for its alleged failure to control toxic content and disinformation. In July, X was formally charged with violating several provisions of the DSA, which requires very large online platforms (VLOPs) to mitigate systemic risks linked to their services.

Adding to the criticism, European lawmakers accused Musk of using the platform to amplify his own political views, including his vocal support for Donald Trump during the U.S. presidential race.


Independent Audit Flags Further Issues

An independent audit by the consultancy FTI Consulting, published alongside the risk assessment, cast doubt on the effectiveness of X’s moderation efforts. The auditors deemed the company’s processes “not rigorous enough,” suggesting potential noncompliance with the DSA’s strict requirements.


A Platform Under the Microscope

With critical elections looming in the EU, X’s approach to content policing remains under intense scrutiny. Regulators are likely to ramp up enforcement measures as the platform navigates growing concerns about its role in spreading disinformation and harmful content.
