Massachusetts Daily Collegian

A free and responsible press serving the UMass community since 1890

A free and responsible press serving the UMass community since 1890

Massachusetts Daily Collegian

A free and responsible press serving the UMass community since 1890

Massachusetts Daily Collegian

Audit your social media algorithm – it might be forcing you into an echo chamber

Are you only being told the things you want to hear?
Oleg Laptev via Unsplash

At the end of a long day, when the lights turn off and you’re all alone, curled up in bed with your screen aglow, you might open Instagram or TikTok for a late-night scroll. What’s the harm in a little browsing before bed?

We often come to a scrolling session with frayed nerves and overstimulated minds, whether from work or from the social content we just consumed. The content algorithm knows this all too well. It caters to your true purpose for being there: to comfort yourself, perhaps even to self-medicate. In your vulnerable state, there’s no better time to tell you exactly what you want to hear.

Did the melodrama of this description spark a little self-consciousness about your social media scrolling? I hope so. It’s a self-awareness that occurs far too infrequently. To be clear, this is not an accusation or a denouncement; sometimes we need a jolt to become more mindful about our relationship with scrolling. Algorithmic manipulation is a common experience, so if this description made you feel uneasy about your relationship with social media, that is OK. Let’s affirm one thing: you are not alone.

I have had these same experiences. Over a year ago, in the fall of 2022, I published an opinion column titled “The case for vanishing from social media without a trace.” In it, I discussed how I had abandoned social media before college, reflecting on the profound relief that accompanied the shock of sudden digital silence.

The underlying conclusion of the article was that social media had no net positives to offer. Its costs outweighed the benefits. Its design was exploitative. Its intentions were all wrong.

I still believe this. If I can help it, I will never have a personal social media account again. But this past year, there was a wrinkle in my anti-social media endeavors: I came back.

After all, I am a marketing major. It was bound to happen at some point: as the Creative Director of the Massachusetts Daily Collegian, my job includes social media monitoring and strategy. I returned to Instagram not as “Kelly McMahan” but through the official Daily Collegian account.

The experience of maintaining a professional account is exceptionally different from holding a personal one. This return to Instagram was an intriguing experiment: less all-consuming the second time around, but served the same algorithmic fodder.

Over three months, I arrived at a striking insight: whether the account is personal or corporate, so long as there’s a human behind the screen, the content algorithm will discover their ideologies and desires with the aggressive accuracy of a heat-seeking missile. This behavior is inherently manipulative, and its chief consequence is a heavy contribution to our personal echo chambers.

A social media echo chamber occurs when “one experiences a biased, tailored media experience that eliminates opposing viewpoints and differing voices,” as described by Paige Cabianca, Peyton Hammond and Maritza Gutierrez of the University of Texas at Austin’s Moody College of Communication.

“Due to social media algorithms that ensure we only see media that fits our preferences, we have found ourselves in a comfortable, self-confirming feed,” they add.

Even though I was running a newspaper account, the Instagram Reels algorithm somehow knew I was a woman. It knew I was young, but not a teenager. It knew what makeup I liked, even though I had never browsed for it on the app. It correctly predicted my political, ethical and social beliefs. Setting aside the rare paid political post, over 90 percent of the content I viewed was self-confirming.

Ask yourself: how often do you disagree with what you’re shown on your explore page? I can guarantee that it won’t be frequent.

I began to audit the algorithm, counting the number of times an ideology was repeated within a single scrolling session. At times, every third reel seemed connected by the same notion.

In his piece in Wired on how social media platforms profit from human confirmation biases, Christopher Seneca explains, “Social media companies therefore rely on adaptive algorithms to assess our interests and flood us with information that will keep us scrolling.”

“The algorithms ignore the recency and frequency of what our friends are posting and instead focus on what we ‘like,’ ‘retweet’ and ‘share,’” he wrote, “to keep feeding content that is similar to what we’ve indicated makes us comfortable.”
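To make that dynamic concrete, here is a toy sketch in Python of the engagement-first ranking Seneca describes. Every post, topic and number below is invented for illustration; real recommender systems are vastly more complex, but the core bias is the same: candidates are scored by affinity to your past engagement, and recency never enters the calculation.

    # Toy model of an engagement-driven feed ranker. All posts and
    # topic tags are invented for illustration.
    liked_topics = {"politics": 5, "makeup": 3, "fitness": 1}  # from past likes and shares

    candidates = [
        {"id": 1, "topic": "politics", "hours_old": 30},
        {"id": 2, "topic": "gardening", "hours_old": 1},
        {"id": 3, "topic": "makeup", "hours_old": 12},
    ]

    def engagement_score(post):
        # Score by affinity to past engagement; note that recency
        # ("hours_old") never enters the calculation.
        return liked_topics.get(post["topic"], 0)

    for post in sorted(candidates, key=engagement_score, reverse=True):
        print(post["id"], post["topic"])

The fresh gardening post sinks to the bottom of the feed while the familiar topics float to the top, and every scroll reinforces the pattern.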

In late March, Meta, the parent company of Facebook, Instagram and Threads, announced that it would stop recommending political content from accounts that users don’t already follow. This includes limiting content that Instagram described as “likely to mention governments, elections or social topics that affect a group of people and/or society at large.”

The argument is that users can further control or limit content they don’t have the “appetite” for. But in the absence of this content, can you take a wild guess at what’s going to take its place? Again, we are only being told exactly what we want to hear.

I strongly urge everyone to audit their algorithms. Be serious and scientific about it. Get out a pencil and paper and truly inspect what you’re shown. If you’ve got a soft spot for the scientific method, try sorting the content you see into categories and keeping a running tally of each one’s frequency, as in the sketch below. If that’s too much work, even a keen self-awareness of content patterns can be a game-changer.
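For anyone who would rather tally digitally than on paper, here is a minimal sketch of that audit in Python. The categories in the session log are hypothetical placeholders; substitute whatever themes actually recur in your feed.

    # A back-of-the-envelope algorithm audit: log one entry per reel
    # or post as you scroll, then tally how often each category appears.
    from collections import Counter

    session_log = [
        "politics", "makeup", "politics", "fitness",
        "politics", "news", "makeup", "politics",
    ]

    tally = Counter(session_log)
    total = len(session_log)

    for category, count in tally.most_common():
        print(f"{category}: {count} ({count / total:.0%} of this session)")

If one category dominates session after session, that’s your echo chamber announcing itself.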

Echo chambers lose their power when we become aware of them. They must be dismantled and resisted. They are dangerous for their contributions to political polarization, misinformation and social division. When we only see evidence of our beliefs, why would we ever think to doubt ourselves? When we never doubt ourselves, how could we ever make any progress at all?

I believe that the echo chamber issue goes beyond social division. It’s part of a deeper cultural epidemic in which we’re seeing an absence of humility.

Not everything you believe is necessarily wrong, but we need to start accepting that other views can also be right. Accepting this is an opportunity to learn something new, and it’s magnificent evidence of the human mind’s flexibility and compassion.

In the age of the algorithm, we must relinquish the conviction that we ought to have some sort of ownership of truth. This starts with putting down your phone, which is not designed to help you with this journey.

Face the world. Listen to something that you don’t want to hear. Maybe you’ll agree, or maybe you won’t — but you’ll be all the better for it.

Kelly McMahan can be reached at [email protected].
