Musk, the EC Warning, and ‘Inception’-Levels of Disinformation
Threat of fines looms as X faces questions about whether its policies empower digital lies that further incite emotional responses to a grim conflict.
At a Glance
- Letter from European Commissioner Breton calls on Elon Musk to respond to questions about disinformation on X.
- Fake or mislabeled images circulating on social media have been inaccurately attributed to the Israel-Hamas War.
- Musk and X may face fines under the European Commission's Digital Services Act.
On Tuesday, European Commissioner for the Internal Market Thierry Breton posted a letter aimed at Elon Musk, warning of potential fines regarding content platformed by X (formerly known as Twitter) -- content Breton described as “illegal” and “disinformation.”
Said content was cited after the terrorist attack launched over the weekend by Hamas on Israel. Violent images and content purporting to depict the attack spread quickly across the internet, but the veracity of some of that content is in question given the infusion of false or altered imagery from dubious sources.
Europe has previously levied hefty fines against Big Tech for data privacy violations and ad targeting; now disinformation on digital platforms is coming under regulatory scrutiny and enforcement.
The Israel-Hamas War already carries a heavy human toll. The spread of misleading information, meant to provoke responses, could incite ill-informed actions that exacerbate the conflict.
There have been a variety of changes to X under Musk’s ownership that raised questions about the way information is presented on the platform, whose content gets pushed in front of the public, and what measures, if any, are taken to call out disinformation. Wired recently ran a story that declared X was “drowning” in disinformation after the outbreak of the Israel-Hamas War.
According to the story, the disinformation problem got so bad that finding reliable sources and content through the X platform became something of a snipe hunt. Even images from video games were passed around the platform as footage from the current real-world conflict. Further, sources who might actually be present on the frontlines of the war with real footage can get overshadowed on the platform because of the push the algorithm gives to paying users, regardless of their authenticity.
Musk himself seemed to stumble on this front when, in a now-deleted tweet, he cited two sources he thought were reliable for information about the conflict, only for others to point out that those sources had posted derogatory, anti-Semitic messages and disseminated disinformation.
This is far from the first time Musk and X have come under the EC’s scrutiny. In late September, European Commission Vice President Vera Jourova called the platform the biggest source of fake news, while also cautioning other platforms such as TikTok to take action against disinformation.
Of late, Musk’s leadership brought changes to the X platform that include letting users pay for verification checkmarks. More recently, X introduced a gatekeeping feature that lets paying users restrict responses to their posts to only other paying users.
In his letter, Breton expressed concern about Musk’s recent decisions in the management of X: “Your latest changes in public interest policies that occurred overnight left many European users uncertain.”
Breton reiterated in the letter that the Digital Services Act (DSA) includes obligations on content moderation, and he gave Musk 24 hours to comply.
The letter enumerated some of the requirements under the DSA, including transparency about what content is permitted on the platform and the consistent, diligent enforcement of those policies. Furthermore, when notice is given that there is content deemed illegal by the European Union, it must be removed in a “timely, diligent, and objective” manner when warranted.
Breton’s letter also called for “effective mitigation measures” to fight the risks disinformation poses to public security and civic discourse. The letter cited the spread of fake and altered images on Musk’s platform, including the reuse of old photos from prior, unrelated military conflicts and the use of images from video games -- presented as content captured from the current turmoil.
The letter closed with a warning of unspecified potential fines X could face if an investigation were to find noncompliance with the DSA.
The intersection of disinformation and politics has been of growing concern, but unimpeded digital manipulation in the midst of a military conflict can have direct, human consequences.