The digital era’s new toxic relationship

By Avril Lynch

Five hours. Two hours, 48 minutes. Thirteen hours, seven minutes.

However long or short our screen time for different social media applications may be, many of us are users of social media. These are platforms through which we can share ideas, express ourselves, spread awareness about certain topics, connect with other people and more. 

But what if those social media apps did not exist anymore? What if they just froze?

This is what happened on Oct. 4. Instagram, Facebook and WhatsApp shut down abruptly on a regular Monday afternoon, forcing many of us to get back to our English essays, math assignments and psychology papers.

The shutdown lasted for nearly six hours, giving us more than enough time to finish our homework. However, Instagram was not back up even after some of us were done being productive, hoping to reward ourselves with some mindless scrolling. Then, maybe a few of us asked the question: Why did the shutdown happen in the first place?

Facebook, along with its platforms WhatsApp and Instagram, shut down less than 24 hours after a "60 Minutes" interview with Facebook whistleblower Frances Haugen aired. This interview raised questions regarding the amount of trust we put in these platforms as forms of entertainment and communication.

Haugen discussed what she witnessed as a product manager on the civic integrity team during her time at Facebook, making public dozens of internal reports she had saved that consistently showed Facebook prioritizing "profit over security."

As one report states, there was "evidence that hate speech, divisive political speech, and misinformation on Facebook and the family of apps are affecting societies around the world." The idea that violent and hateful content has become a part of many of our daily routines has made many users uneasy and distrustful.

Haugen also expressed her concerns over Facebook's influence on the violence perpetuated in the real world, saying, "The version of Facebook that exists today is tearing our societies apart and causing ethnic violence." According to Haugen, this is because it has become easier to provoke anger by pushing polarizing content into more people's feeds, driving up user engagement. Some examples of discriminatory posts on Facebook reveal the double standards applied to hateful and harmful posts and the inconsistent decisions to either remove them or let them remain up.

This ultimately means that we, as social media users, are allowing ourselves to be exposed to more violent content every day as Facebook continues to make more money. This exemplifies Facebook’s true values: profit and growth.

As Facebook regulates what content we are exposed to the most, they can also regulate what we are exposed to the least. Facebook activated safety systems to reduce misinformation during the 2020 presidential election, which were later deactivated following the results of the election, exposing users to harmful posts and misinformation again. To Haugen, this is a “betrayal to democracy.” 

On an internal Facebook message board, one Facebook employee stated that “colleagues … cannot conscience working for a company that does not do more to mitigate the negative effects of the platform.” Haugen directed much of her criticism toward Mark Zuckerberg, co-founder and chief executive officer of Facebook, who has been criticized for the executive decisions made in regulating algorithms and content. While she acknowledged that this was not necessarily Zuckerberg’s intention at the onset of the platform’s creation, his subsequent choices have allowed for the continued distribution of hateful, polarizing content. 

According to John Tye, the founder of a Washington legal group called Whistleblower Aid, Facebook has also been misrepresenting to advertisers and investors the number of people who have been viewing the ads they paid for. At this point, the lack of transparency from Facebook about what the platform sets out to promote has transformed our connection with it into a toxic relationship. Many of us use Facebook and its services every day. How do we reconcile this with all of its lies, its continued dissemination of hateful content and the fact that it misinforms its users? What more is needed to declare it toxic? Doesn't it already do enough?

Especially in this digital age, when it is rare to find someone who does not use social media, it is important to remember that the content these platforms' algorithms serve us may not necessarily reflect the truth. We cannot let ourselves be too influenced by their effects.

Even though it may be a coincidence that these two occurrences, the social media outage and the whistleblower interview, happened within 24 hours of each other, they present us with a common, important message: We cannot easily trust the internet, and it would serve all of us well to become less reliant on social media platforms.


COPYRIGHT 2021 THE TUFTS DAILY. ALL RIGHTS RESERVED.