The Tufts Daily
Tuesday, December 3, 2024

Is X a threat to American democracy?

Loosened speech rules on Elon Musk-owned platform foster misinformation and far-right echo chambers.

Graphic: “Twitter/X and Democracy” by Rachel Wong

When first-year Thomas Park downloaded X, the social media app formerly known as Twitter, in 2019, he mostly used it to follow art accounts and look at funny tweets. Five years later, in 2024, the content on his “For You” page is largely unchanged, with one notable exception. Now, interspersed with art accounts and memes are far-right extremist advertisements, AI-manipulated images and videos known as deepfakes and an influx of hateful and discriminatory rhetoric.

Park’s experience on the site mirrors that of X’s roughly 550 million monthly users and can largely be attributed to X’s acquisition by multi-billionaire businessman and investor Elon Musk in October 2022. In the two years since Musk’s purchase of X, the site has seen unprecedented levels of misinformation and disinformation spreading among its user base, a trend that has only been exacerbated in recent months by the 2024 presidential election. The combination of growing artificial intelligence capabilities and a social media platform that has, in nearly every sense, abandoned its regulations of what can or cannot be shared to the site has raised the question: Is X a threat to American democracy?

When Musk first acquired X, he posted a statement to the social media juggernaut captioned “Dear Twitter Advertisers,” in which he stated that his vision for the platform was to have “a common digital town square … There is currently a great danger that social media will splinter into far right wing and far left wing echo chambers that generate more hate and divide our society.” He added that “Twitter obviously cannot become a free-for-all hellscape, where anything can be said with no consequences!”

In the weeks following, Musk disbanded X’s Trust and Safety Council, loosening rules surrounding hate speech and disinformation, and introduced Twitter Blue, a subscription service that provides users with verification for a monthly fee. The result, according to Park, has been an entirely transformed user experience. “You don’t really see advertisements for normal things on Twitter [anymore]. It’s mostly just people who want to buy into Twitter, like alt-right, far-right companies,” Park said.

Park also believes that Twitter Blue allows users to amplify their platform and credibility with no qualifications other than paying the monthly fee. “When, of course, you allow [verification] to be bought by anyone, it’s anyone’s game … because you literally just opened up more people to believe that everything that these checkmark people [are saying] is true when they were conditioned to that before Elon Musk took over,” he said.

As the U.S. is in the midst of one of its most polarized elections in recent political history, the lack of regulation over what is posted on X has fueled concerns about its impact on the election. In July, Musk endorsed former President Donald Trump, going so far as to appear at a Trump rally and to host a live-streamed audio conversation with the Republican nominee on X.

Musk has also shared AI-generated images and videos propagating election disinformation on X, including a deepfake video of Vice President Kamala Harris that brands her as a ‘diversity hire’ and uses a manipulated version of her voice. The post is seemingly in violation of X’s own policy that requires media that is “significantly and deceptively altered, manipulated or fabricated” to be either labeled or removed.

The deepfake posted to Musk’s account is far from being an isolated incident; according to an analysis by The New York Times, almost a third of the 171 posts Musk made to X over a five-day period in September 2024 were false or misleading. However, Musk continues to be the most prominent figure on the site. An August 2024 analysis by the Center for Countering Digital Hate stated that Musk’s false or misleading claims about U.S. elections on X have been viewed 1.2 billion times.

Park believes that AI-generated content will be a main factor setting the 2024 election apart from previous presidential elections. “What’s changed from [the] 2016 to 2020 [elections] is just AI,” Park said. “If you ask it to stereotype people, it will stereotype people. It doesn’t really have a filter yet. That’s definitely dangerous for a very polarized election.”

However, Associate Professor of Political Science Michael Beckley is skeptical that the increase in political misinformation sets this year’s presidential election apart, citing comparable patterns that have occurred throughout history. “We’ve seen similar things when radio first came out. [People wondered], was this going to allow strong men to rally people behind their cause? We saw the same thing with TV,” he said. “So [the misinformation] is jarring, but I don’t see it as a unique factor. It is rather a pretty chronic factor in a democratic system.”

Kelly Greenhill, an associate professor of Political Science, identifies the normalization of false information spread by notable figures as a key reason behind increased disinformation in the media. “Social media is one of many channels used to communicate information, be that information high or low quality,” Greenhill wrote in an email to the Daily. “That it has become normalized and far more common than it used to be for some political figures to unabashedly and chronically spread false and misleading information means that we also see more of this garbage information on social media.”

The influx of misinformation on X and other media platforms comes as increasingly more Americans, especially younger generations, receive their news from social media. According to a 2022 study by Deloitte, 51% of teens aged 14–19 mainly receive their daily news from “social media feed[s] or messaging services,” rather than traditional media forms such as television or news sites. Park falls within this category. “I basically get most of my info from social media, either Instagram or Twitter, especially recently with the election,” he said. 

The case of X has raised concerns over the power that monolithic media companies wield over the beliefs of the American public, especially during events as monumental as presidential elections. While social media has been used as a tool in past elections, it is unprecedented for the platforms themselves, and especially their owners, to so blatantly attempt to sway the opinions of their users. Park notes that he has personally witnessed a shift in the type of speech he encounters on the site. “[X is] a town square, but the loudest people, the people given the stage and the microphones, are far-right white supremacists, racist people and people who want to discriminate,” he said.

However, X is not the only media source experiencing a surge in misinformation. Greenhill believes that the issue of political misinformation extends far beyond social media. “The root of the problem is not social media per se. It is the normalization of chronic lying and shading of truth, something that is relatively new and deeply, deeply problematic,” she wrote. “Among other problems, hearing things multiple times, even things individuals knew or thought were lies when they first encountered them, makes them ‘feel more true’ to our brains.”

Greenhill suggests that the best way to avoid X’s abundance of misinformation may simply be to leave. “People don’t have to use X. They can leave. They can delete their accounts. They can also leave social media. They may not want to, and they may not choose to, but they can,” she wrote. Some users are choosing to do just that; social media sites that have advertised themselves as alternatives to X, such as Bluesky and Mastodon, have gained popularity in recent years.

But Park continues to prefer X to other apps, despite the substantial changes made to the platform. “[X] is a town square and it is a town square where the racists and the sexists all have the podium at once … but you can still have side conversations, being like, ‘Wow, that guy’s a jerk,’” he said. “You have that sense of community because there’s more people on it.”