Social Media and Political Polarization: An Experiential and Analytical Account of One of American Society’s Most Toxic Relationships
This article was originally written in October of 2024.
X (formerly Twitter) is a political melting pot. I have encountered deep-state, Hollywood-eats-babies conservatives and many a Marxist disciple, each promoting extremist rhetoric from the furthest ends of the spectrum. Most notably, Elon Musk's official X account appears ubiquitously throughout my feed. I don't follow Elon Musk, yet I can't seem to avoid him. Elon's tweets, at least those on my feed, are almost always some form of political messaging, favoring Trump while bashing Kamala Harris. Elon is entitled to his opinion just as much as I am. However, I do not own X, nor do I have the ability to push my political views on the hundreds of millions of people using my app. There is something fundamentally, morally different between the people tweeting conspiracies to a dozen followers and the platform-wide megaphone that social media has given to its most powerfully polarizing figures.
It is no secret that political polarization is on the rise. Any worldly citizen could make such an observation. What I find so fascinating, however, is the degree to which social media platforms and their algorithms exacerbate this growing phenomenon, particularly among young people. Consider TikTok, the short-form video platform used by people across the world. TikTok's official age limit is thirteen, but we all know that any savvy pre-teen can easily circumvent this restriction by lying about their date of birth on the sign-up prompt, scrolling it back a couple of years.
TikTok is a fairly simple app. There is a standard For You Page (FYP) for first-time users, serving up generally popular videos, which quickly adapts to the individual's tastes and preferences to show more of what they engage with. This is great for users looking to keep up with current trends and try new dances, but it becomes increasingly poisonous when more existential ideas are involved. I have witnessed firsthand how quickly my FYP can go from DJing clips to deep-state conspiracies. If it's happening to me, it's happening to everyone. I pride myself on my social media literacy and like to think I am capable of recognizing when I'm being exposed to propaganda and, therefore, of not taking what I am viewing as truth. However, I'm twenty years old and have spent many years learning to navigate social media. The same cannot be said for a ten-year-old kid logging onto TikTok for the first time. Children tend to be trusting until they have a reason not to be. When a clip from Alex Jones's podcast comes up on their FYP telling them Sandy Hook was a lie, they very well may believe him. Parents aren't always available to peer over their children's shoulders and fact-check their For You Page. This leaves these children vulnerable to indoctrination and, given the nature of the TikTok algorithm, likely to be exposed to further conspiracies and propaganda in the same way.
The nature of this effect has come to the forefront of political discourse in recent years, notably during Mark Zuckerberg's congressional hearings, in which he adamantly denied Facebook's role in exacerbating political polarization. However, experts, and I, beg to differ: their claim rests on years of accumulated knowledge and research, mine on experience. On the research side, the Brookings Institution recently released a study identifying a "relationship between tech platforms and the kind of extreme polarization that can lead to the erosion of democratic values and partisan violence" (Brookings 2024). The study suggests that social media platforms employ algorithms which increasingly isolate individuals within networks of like-minded users, because people interact most with views similar to their own, driving engagement and, for the platforms, profit. Brookings concluded that this kind of tailored content is driving polarization, "in part because of the contagious power of content that elicits sectarian fear or indignation." In layman's terms, the Brookings report frames social media as an echo chamber, in which users naturally engage with what interests them and, in doing so, are exposed both to like-minded individuals and to increasingly extremist rhetoric.
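The feedback loop the Brookings report describes can be made concrete with a toy simulation. To be clear, this is my own illustrative sketch, not any platform's actual code: the hypothetical `recommend` function below weights posts by their closeness to a user's past engagement, and the simulated user in turn engages mostly with posts near their current average view. The result is that the engaged-with content ends up clustered in a narrow ideological band rather than spanning the full catalog.

```python
import random
import statistics

random.seed(42)  # make the toy run reproducible

def recommend(history, catalog, explore_rate=0.1):
    """Pick the next post: mostly items similar to what the user engaged
    with before, plus a small dose of random exploration.
    (A toy model of an engagement-optimizing feed, not real platform code.)"""
    if not history or random.random() < explore_rate:
        return random.choice(catalog)
    leaning = sum(history) / len(history)  # user's revealed leaning, -1..1
    # engagement weighting: posts closer to the user's leaning are more likely
    weights = [1.0 / (0.05 + abs(post - leaning)) for post in catalog]
    return random.choices(catalog, weights=weights)[0]

# catalog of posts with ideological positions from -1 (far left) to +1 (far right)
catalog = [i / 10 for i in range(-10, 11)]

history = []  # posts the user actually engaged with
for _ in range(200):
    post = recommend(history, catalog)
    # the user engages most readily with content near their current average view
    if not history or abs(post - sum(history) / len(history)) < 0.4:
        history.append(post)

print(f"engaged with {len(history)} posts; "
      f"engaged-post spread {statistics.pstdev(history):.2f} "
      f"vs full-catalog spread {statistics.pstdev(catalog):.2f}")
```

The spread (standard deviation) of the posts the user engaged with comes out well below the spread of the catalog: the feed and the user's behavior reinforce each other until only a narrow slice of viewpoints remains in rotation, which is exactly the echo-chamber dynamic Brookings describes.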
As an avid social media user, I have witnessed this process firsthand. My X feed is regularly infiltrated by radical conservatives like Marjorie Taylor Greene, Candace Owens, and Nick Fuentes. I do not follow any of these users, but I do engage with their profiles from time to time (keep your friends close and your enemies closer). After just one click on any of their profiles, I find their tweets and engagements on my timeline for weeks on end. Sure, this is largely self-inflicted, but it demonstrates the efficacy of X's algorithm, affirming the Brookings hypothesis.
So what's next? There is obviously a problem here, and it's one that transcends social media, impacting American civil society at large. However, social media platforms are not fueling polarization deliberately, or at least so they claim. Furthermore, it would be a breach of privacy for the government to dictate how users engage with social media, bringing us back to the crux of the issue: the algorithm. The Brookings report suggests that the government ought to "mandate more disclosure about the inner workings of social media platforms, so outside researchers can analyze the data." However, this would likely have to occur at the federal level, and given the staunchly divided nature of Congress and the return of President-elect Donald Trump, I find it highly unlikely such a mandate will come to pass. Social media algorithms will therefore likely remain unchecked, and only time will tell the consequences.
Bibliography