Matthew Kaufman
Correspondent
A new investigation by The Wall Street Journal has found that numerous internal reviews at Facebook have revealed critical flaws in several of the company’s products, but the company has failed to make these findings public.
Dubbed “The Facebook Files,” the investigation reveals that Facebook is aware that content moderation is not applied evenly, that Instagram hurts teens’ mental health and that its News Feed algorithm incentivizes divisive content.
The first part of the report details how Facebook provides special treatment for celebrities, influencers, politicians and many other well-known people. This is done through a program called XCheck.
The past few years have seen social media companies struggle with the debate over removing posts from influential users. Facebook’s solution was to create a way to temporarily (or sometimes permanently) exempt certain users from automated content moderation. As The WSJ notes, most Facebook employees had the ability to add someone to the XCheck program; accounts that met the criteria included those that were “newsworthy,” “influential or popular,” or “PR risky.” This widespread ability to add users led the program to include 5.8 million users as of 2020.
The WSJ reported that, in the past, Mark Zuckerberg had publicly said that Facebook’s content rules applied to everyone equally, regardless of their social status.
Despite the company’s public statements, an internal reviewer at Facebook warned, “we are not actually doing what we say we do publicly. Unlike the rest of our community, these people can violate our standards without any consequences.”
In 2019, Brazilian soccer player Neymar da Silva Santos Jr. posted nude photos of a woman after she accused him of rape. Because his account was protected by XCheck, The WSJ reported, moderators were prevented from taking the posts down for more than a day, subjecting the woman to what an internal reviewer described as abuse.
Though Facebook’s official policy is that users who post nude photos of others without their consent will be banned from the platform, Neymar’s account remains active.
Leaving up problematic posts is not the only concern with Facebook’s moderation; another issue is mistakenly taking down content that does not violate any rules. As the AP reported, the account of Alabama’s governor was recently suspended for an hour due to what Facebook said was a mistake “unrelated to any posted content,” though she claimed the company was censoring her criticism of President Biden.
A second article reveals that internal reviews have repeatedly found that Instagram, which is owned by Facebook, is harmful to teens’ mental health, particularly that of teen girls.
According to a March 2020 presentation obtained by The WSJ, “thirty-two percent of teen girls said that when they felt bad about their bodies, Instagram made them feel worse.” Another slide from 2019 bluntly admits, “we make body image issues worse for one in three teen girls.”
“The tendency to share only the best moments, a pressure to look perfect and an addictive product can send teens spiraling toward eating disorders, an unhealthy sense of their own bodies and depression,” The WSJ reported from the internal documents. “The Explore page, which serves users photos and videos curated by an algorithm, can send users deep into content that can be harmful.”
Again, Facebook’s public stance on the issue did not match its internal findings. The WSJ reported that Zuckerberg told Congress in March 2021, “the research that we’ve seen is that using social apps to connect with other people can have positive mental health benefits.” Around the same time, Facebook announced that it was developing a version of Instagram specifically for children.
Following The WSJ’s revelations, Sen. Ed Markey, D-Mass., Rep. Kathy Castor, D-Fla., and Rep. Lori Trahan, D-Mass., sent a letter to Zuckerberg recommending that the company cancel its plans for the kids’ version of Instagram. The letter also included a series of questions regarding Facebook’s knowledge of the mental health issue.
In a blog post, Karina Newton, Instagram’s head of public policy, said that The WSJ’s reporting drew on limited research and ignored other sources showing that the platform has both positive and negative effects on mental health. Regarding the lack of transparency surrounding the reports, Newton suggested that Instagram would be more forthcoming in the future.
“We’ll continue to look for opportunities to work with more partners to publish independent studies in this area, and we’re working through how we can allow external researchers more access to our data in a way that respects people’s privacy.”
The third part of the series describes how a 2018 tweak to Facebook’s News Feed algorithm, which was designed to increase engagement with posts, ended up pressuring content creators to publish more divisive content.
Research within Facebook had reportedly found that simply reading or watching content left users feeling worse after their browsing sessions. Zuckerberg believed that the new algorithm, which would promote posts that made people want to get involved, would boost “meaningful social interaction” (MSI).
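The WSJ did not publish the ranking formula itself, but the general mechanic can be illustrated with a brief, hypothetical Python sketch: candidate posts are scored by a weighted sum of predicted interactions, and posts expected to provoke comments and reshares float to the top of the feed. The signal names and weight values below are invented for illustration and are not Facebook’s actual parameters.

# Hypothetical illustration of engagement-weighted feed ranking.
# The weights and signal names are invented for this sketch; they are
# not Facebook's actual MSI formula, which has not been made public.
from dataclasses import dataclass

@dataclass
class Post:
    post_id: str
    predicted_likes: float      # estimated chance a viewer likes the post
    predicted_comments: float   # estimated chance a viewer comments
    predicted_reshares: float   # estimated chance a viewer reshares

# Hypothetical weights: active interactions (comments, reshares) count
# for more than passive ones (likes), so posts that spark replies and
# arguments rise to the top of the feed.
WEIGHTS = {"likes": 1.0, "comments": 15.0, "reshares": 30.0}

def msi_score(post: Post) -> float:
    """Return a single engagement score used to order the feed."""
    return (WEIGHTS["likes"] * post.predicted_likes
            + WEIGHTS["comments"] * post.predicted_comments
            + WEIGHTS["reshares"] * post.predicted_reshares)

def rank_feed(posts: list[Post]) -> list[Post]:
    """Sort candidate posts so the highest-scoring ones appear first."""
    return sorted(posts, key=msi_score, reverse=True)

if __name__ == "__main__":
    candidates = [
        Post("calm_update", predicted_likes=0.60, predicted_comments=0.02, predicted_reshares=0.01),
        Post("angry_take", predicted_likes=0.30, predicted_comments=0.20, predicted_reshares=0.10),
    ]
    for p in rank_feed(candidates):
        print(p.post_id, round(msi_score(p), 2))

Under any weighting of this kind, a post that provokes argument outscores a post that is merely read and liked, which is the dynamic publishers and political parties described to The WSJ.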
While Facebook’s tweak achieved its goal of making people more engaged and more vocal with their opinions, the change led to a multitude of unforeseen consequences.
The WSJ notes how BuzzFeed, after seeing a 13% reduction in visits to its site from the change, felt the need to alter its content to be more controversial so it would perform well with the new algorithm. In an email to Zuckerberg, Jonah Peretti, the chief executive of BuzzFeed, wrote that the algorithm made his staff feel pressured “to make bad content or underperform.”
The new algorithm also had large political effects. Several European political parties wrote to Facebook with concerns that they felt compelled to “tap into anger,” or else their posts would not be seen. An internal analysis in 2019 found that “one party’s social media management team estimates that they have shifted the proportion of their posts from 50/50 positive/negative to 80% negative, explicitly as a function of the change to the algorithm.”
In a statement provided to The Guardian, a Facebook spokesperson said: “Is a ranking change the source of the world’s divisions? No. Research shows certain partisan divisions in our society have been growing for many decades, long before platforms like Facebook even existed.”
These reports are just the beginning of The WSJ’s findings, and more installments are expected to be published. They are likely to stir up further anger toward Facebook among the public and within the government; as The WSJ notes, “at least some of the documents have been turned over to the Securities and Exchange Commission and to Congress by a person seeking federal whistleblower protection.”