Facebook and other major social media platforms have become essential parts of our everyday lives, while somehow escaping much regulation. The Facebook Files—a series of Facebook’s internal documents recently obtained by the Wall Street Journal—show in detail how harmful this lack of regulation has been for individuals’ mental and civic health.
Jeff Horwitz, one of the reporters working on the Facebook Files, summed up their findings: “Time and again, the documents show, in the United States and overseas, Facebook’s own researchers have identified the platform’s ill effects, in areas including teen mental health, political discourse and human trafficking.” He continued, “Time and again, despite congressional hearings, its own pledges and numerous media exposés, the company didn’t fix them.”
These findings confirm what many have been saying for years: social media platforms cannot remain unregulated when their impact on society is so vast.
Regulation appears to be the last thing Facebook wants. The company has introduced an independent Oversight Board and launched a campaign to promote vaccine awareness, both efforts to show that its platform can ultimately be a social good. The Facebook Files reveal, however, that Facebook has at the very least fallen short of its aspirations.
A good example is Facebook’s effort to improve social interactions on its platform. In January 2018, Facebook CEO Mark Zuckerberg announced a plan to shift Facebook’s goals from “helping people find relevant content to helping them interact more with friends and family.” Zuckerberg acknowledged that these changes could reduce user engagement, but noted that “the time [users] do spend on Facebook will be more valuable. And if we do the right thing, I believe that will be good for our community and our business over the long term too.” But while the change was ostensibly well-intentioned, what Facebook did in practice made things worse.
Facebook attempted to increase user interaction with a new post-ranking algorithm that prioritized “meaningful social interactions.” By the summer of 2018, however, it became clear that the new algorithm had unintended effects. Users reported that the quality of their newsfeeds had declined, while news organizations like BuzzFeed and ABC News saw online traffic drop by more than 10 percent. Facebook researchers also noted that a group of Polish politicians told them that “the proportion of [their party’s] posts [shifted] from 50/50 positive/negative to 80% negative, explicitly as a function of the change to the algorithm.” These consequences stemmed from the algorithm’s emphasis on engagement.
While the algorithm successfully slowed the decline in comments and improved Facebook’s “daily active people” metric, it inadvertently boosted the ranking of posts that provoked outrage and debate. Since such posts tend to draw more engagement, outrage, anger and controversy became the most effective way for publishers to spread content on Facebook.
One of the more glaring revelations from the Facebook Files concerns Instagram, which Facebook owns. According to the documents, Facebook had internal research detailing Instagram’s harmful effects on teens’ mental health, especially that of girls. The research showed that Instagram worsened body image issues for one in three teenage girls. Other research on teens in the U.S. and the United Kingdom found that “40% of Instagram users who reported feeling ‘unattractive’ said the feeling began on the app.” Furthermore, psychology professor Jean Twenge notes that these mental health effects can include “clinical-level depression that requires treatment. We’re talking about self-harm that lands people in the ER.” Rather than sharing this research with academics and lawmakers, Facebook has publicly downplayed its platforms’ harmful effects on teens.
The documents make it clear that Facebook has made real efforts to improve user experiences but has largely failed to produce positive change. Some of these failures stem from the fact that Facebook, as a private corporation, is focused primarily on profit. Tristan Harris, a former Google employee featured in the Netflix documentary “The Social Dilemma,” has argued that Facebook employees are incentivized against making the necessary reforms to the platform because doing so would inevitably reduce user engagement—Facebook’s most important metric for success.
Still, it would be inaccurate to say that Facebook is motivated solely by profit. Facebook was founded on the idealistic belief that the world would be a better place if more people were connected to each other, and many within the company genuinely believe that Facebook is ultimately a force for good. The issue the Facebook Files highlight is not that Facebook is an inherently harmful platform, but that it often lacks the means or incentives to reform itself.
This is why the U.S. needs to regulate social media companies like Facebook to ensure that steps are being taken to improve these platforms. The lack of transparency from Facebook has cast doubt on its ability to improve on its own. Rather than depending on the prudence of social media executives, the U.S. needs to take a more active role in mitigating the harmful effects of these platforms.
Benjamin Schnurr can be reached at [email protected].