Massachusetts Daily Collegian

A free and responsible press serving the UMass community since 1890

Facebook is making us more narrow-minded

Social media algorithms leave us in an echo chamber of our own biases

As social media has become such an influential force in how we live and communicate, the ability to connect with anyone in the world at any given time seems to create a worldwide open forum. People who previously had no outlet for their opinions can now reach a global audience. Over time, however, sites such as Facebook have personalized the user experience with algorithms that study our engagement with posts, such as what we ‘like’ and ‘share,’ in order to learn more about our beliefs. Through this selective exposure, Facebook sharpens its knowledge of what we are most likely to interact with and, as a result, walls us off from anything we might not agree with.
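To make the mechanism concrete, here is a minimal sketch of engagement-based ranking. It is not Facebook’s actual system; the names (Post, update_affinity, rank_feed) and the scoring are invented for illustration. The point is only that a few ‘likes’ are enough to re-sort a feed around a single topic.

```python
# Illustrative sketch only: a toy engagement-based ranker, not Facebook's
# real system. All names and numbers here are hypothetical.

from dataclasses import dataclass

@dataclass
class Post:
    topic: str
    text: str

def update_affinity(affinity: dict[str, float], post: Post, liked: bool) -> None:
    """Nudge the user's inferred interest in a topic upward when they engage."""
    if liked:
        affinity[post.topic] = affinity.get(post.topic, 0.0) + 1.0

def rank_feed(posts: list[Post], affinity: dict[str, float]) -> list[Post]:
    """Surface the highest-affinity topics first; unfamiliar topics sink."""
    return sorted(posts, key=lambda p: affinity.get(p.topic, 0.0), reverse=True)

# One 'like' on a single topic and the feed re-sorts around it.
affinity: dict[str, float] = {}
posts = [Post("gun rights", "op-ed A"), Post("immigration", "op-ed B"),
         Post("gun rights", "op-ed C")]
update_affinity(affinity, posts[0], liked=True)
print([p.text for p in rank_feed(posts, affinity)])
# -> ['op-ed A', 'op-ed C', 'op-ed B']
```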

On the surface, this may seem like a good thing. According to a 2015 study by the Reuters Institute, 74 percent of Americans use online sources and social media platforms, like Facebook, as news outlets. One could argue that easy access to articles we find interesting helps keep us informed. However, if the algorithm simply shows us more of what we are already guaranteed to react positively to, we are left in an ‘echo chamber’ in which our own opinions are reverberated back to us. If this technology can pick up on a user’s political leanings, whether on single issues or along party lines, the algorithm could begin showing them ‘news’ that is increasingly radicalized without the user even realizing it.

An example of this radicalized presentation was revealed by Facebook after the 2016 election. Russian agents used Facebook’s algorithm to identify users who were likely to engage with political propaganda, then strategically showed ads to specific demographics to further polarize voters on wedge issues such as gun rights, police brutality and immigration. While some of these ads may not have directly mentioned political candidates, they were designed with voter persuasion in mind, which perhaps contributed to how divisive the election was. Because the advertisements were targeted at extremely specific demographics based on their previous activity, they could persuade those users without more moderate voters ever seeing the ads and having the chance to challenge the authenticity of the message. This is unsettling: that a foreign power so easily had a hand in influencing our election undermines our democracy as a whole by spreading misinformation and propaganda, and it leaves even moderate voters questioning whether the ads and articles selected for them are biased.

The algorithm’s intelligence is depriving us of a healthy dose of opposition. Users’ feeds become filled with posts the algorithm predicts they will react positively to, leaving no room for competing opinions. In an age when fake news spreads and hate groups feel emboldened to broadcast their ideas, fact-checking and cross-referencing are essential to verifying information. If Facebook shows a user a set of articles all reinforcing the same idea, they may accept it as truth because they see nothing saying otherwise. Radicalization can occur much more quickly when groups know the exact demographic likely to respond well to their ideas, compared to slow persuasion through personal contact. If this ‘echo chamber’ becomes robust enough to filter out any sign of opposing viewpoints, we may grow to not even realize that people hold opinions different from ours, and may retreat further from the truth.
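The feedback loop described above can be sketched in a few lines. This toy simulation is purely illustrative; the starting probabilities, the 0.1 adjustment step and the display threshold are invented, not drawn from any real ranking system. It only shows how repeatedly boosting agreeable posts and demoting disagreeable ones drives opposing viewpoints out of a feed within a handful of rounds.

```python
# Hedged illustration of the echo-chamber feedback loop; every number
# here is arbitrary and chosen only to make the dynamic visible.

import random

def simulate_feed(rounds: int = 6, threshold: float = 0.2) -> None:
    # Pool of 100 posts: +1 agrees with the user's views, -1 disagrees.
    pool = [+1] * 50 + [-1] * 50
    show_prob = {+1: 0.5, -1: 0.5}  # initial odds each stance is surfaced
    for r in range(rounds):
        shown = [s for s in pool if random.random() < show_prob[s]]
        opposing = sum(1 for s in shown if s == -1)
        print(f"round {r}: {opposing} opposing posts shown")
        # The user engages with agreeing posts, so the ranker boosts that
        # stance and demotes the other (the 0.1 step is arbitrary).
        show_prob[+1] = min(1.0, show_prob[+1] + 0.1)
        show_prob[-1] = max(0.0, show_prob[-1] - 0.1)
        if show_prob[-1] < threshold:
            print("opposing viewpoints have effectively vanished from the feed")
            break

simulate_feed()
```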

Recognizing that this is a problem is the first step toward defeating it. Buying into an idea simply because Facebook shows you that someone else thinks the same way cannot be our only means of forming opinions. We must not only actively seek out news from credible sources, but also challenge our own confirmation biases. Until we search out a broader array of ideas, we may all be making ourselves more narrow-minded than we realize.

Lauren Sointu is a Collegian columnist and can be reached at [email protected].
