The Wall Street Journal has published a lengthy series, called “The Facebook Files,” which sheds a damning light on Facebook practices that allow sex traffickers and drug cartels to operate on the social media platform. The Journal also asserts that the company treats celebrities and politicians differently from average users and knows that its Instagram app is psychologically damaging for many teen girls.

After reviewing internal company documents, The Journal reported:

Time and again, the documents show, in the U.S. and overseas, Facebook’s own researchers have identified the platform’s ill effects, in areas including teen mental health, political discourse and human trafficking. Time and again, despite congressional hearings, its own pledges and numerous media exposés, the company didn’t fix them.

The documents included “research reports, online employee discussions and drafts of presentations to senior management, including Mr. Zuckerberg.” The Journal accused Facebook of knowing about serious issues – and providing “misleading or partial answers” to politicians and regulators, “masking how much it knows.”

Facebook Vice President of Global Affairs Nick Clegg responded to the allegations, saying:

At the heart of this series is an allegation that is just plain false: that Facebook conducts research and then systematically and willfully ignores it if the findings are inconvenient for the company. This impugns the motives and hard work of thousands of researchers, policy experts and engineers at Facebook who strive to improve the quality of our products, and to understand their wider (positive and negative) impact. It’s a claim which could only be made by cherry-picking selective quotes from individual pieces of leaked material in a way that presents complex and nuanced issues as if there is only ever one right answer. 

The first report showed that ordinary Facebook users are held to a different standard than politicians, celebrities and other elites. Through a program known as “cross check” or “XCheck,” the tech giant favors these high-profile users.

Facebook’s own confidential, internal review stated, “We are not actually doing what we say we do publicly … Unlike the rest of our community, these people can violate our standards without any consequences.” Meanwhile, the big tech firm said its rules applied to all users.

The Journal reported that some of the 5.8 million people who were part of XCheck in 2020 were “rendered immune from enforcement actions – while others are allowed to post rule-violating material pending Facebook employee reviews that often never come.” The special privileges allowed elites who were “whitelisted” to post items that “contain harassment or incitement to violence, violations that would typically lead to sanctions for regular users.”

The second report in The Facebook Files showed that the company knows how dangerous Instagram, a photo- and video-sharing app, can be for teen girls. Facebook bought the smaller company for about $1 billion in 2012.

Again, internal company reports showed that when girls compare themselves to other girls and young women on Instagram, it damages their mental health:

Thirty-two percent of teen girls said that when they felt bad about their bodies, Instagram made them feel worse, the researchers said in a March 2020 slide presentation posted to Facebook’s internal message board, reviewed by The Wall Street Journal. Comparisons on Instagram can change how young women view and describe themselves.

Teens spend more time on Instagram than on Facebook:

More than 40% of Instagram’s users are 22 years old and younger, and about 22 million teens log onto Instagram in the U.S. each day, compared with five million teens logging onto Facebook, where young users have been shrinking for a decade, the materials show.

The Journal reported:

The tendency to share only the best moments, a pressure to look perfect and an addictive product can send teens spiraling toward eating disorders, an unhealthy sense of their own bodies and depression, March 2020 internal research states. It warns that the Explore page, which serves users photos and videos curated by an algorithm, can send users deep into content that can be harmful.

According to Facebook research, 40% of those who use the app feel unattractive, while about one-fourth feel they are not good enough. Many teens who want to log off Instagram lack the self-control to do so, feeling addicted to staying connected with other users.

A third report from The Journal showed that CEO Mark Zuckerberg wanted to change Facebook’s algorithm to increase “meaningful social interactions” and “make its platform a healthier place,” but the adjustment actually had the opposite effect – increasing conflict and anger on the platform.

Success on the social media site is driven by “high levels of comments and reactions,” The Journal reported, which led to publishers and political groups “reorienting their posts toward outrage and sensationalism.”

The paper said Facebook knew about this, but added, “Mr. Zuckerberg resisted some of the proposed fixes, the documents show, because he was worried they might hurt the company’s other objective—making users engage more with Facebook.”

The fourth report from The Facebook Files shows that as the company expands into more countries around the world – fewer than 10% of its users are in the U.S. or Canada – it is less able to monitor content. The newspaper alleges that this has led the company to allow content from drug cartels and human traffickers to remain on the platform, or to be slow in removing it:

The documents reviewed by The Journal are reports from employees who are studying the use of Facebook around the world, including human exploitation and other abuses of the platform. They write about their embarrassment and frustration, citing decisions that allow users to post videos of murders, incitements to violence, government threats against pro-democracy campaigners and advertisements for human trafficking.

For example, a former police officer working as a Facebook investigator revealed in early 2021 that a Mexican drug cartel, the Jalisco New Generation Cartel, was using the social media platform “to recruit, train and pay hit men.”

The Facebook Files stated:

The cartel, which law-enforcement officials say is the biggest criminal drug threat to the U.S., didn’t hide its activity. It had multiple Facebook pages with photos of gold-plated guns and bloody crime scenes, the documents show. …

Facebook didn’t fully remove the cartel from its sites. The documents say it took down content tied to the cartel and disrupted the network.

The company’s internal investigation team also documented “a bustling human-trafficking trade in the Middle East” on Facebook pages. The Journal reported:

The company took down some offending pages, but took only limited action to try to shut down the activity until Apple Inc. threatened to remove Facebook’s products from the App Store unless it cracked down on the practice. The threat was in response to a BBC story on maids for sale.

In his online response, Clegg said:

What would be really worrisome is if Facebook didn’t do this sort of research in the first place. The reason we do it is to hold up a mirror to ourselves and ask the difficult questions about how people interact at scale with social media. These are often complex problems where there are no easy answers — notwithstanding the wish to reduce them to an attention-grabbing newspaper headline.

Related articles and resources:

Canopy – New Parental Control App Protects Children from Nudity and Pornography

Facebook Can be Sued for Knowingly Benefiting from Human Trafficking, Texas Supreme Court Rules

Facebook Censors ‘The Daily Citizen’ and Tucker Carlson Over Reporting on CDC Mask Data

Facebook Changes Algorithm Post Election to Suppress Conservative News Voices

Plugged In Parent’s Guide to Today’s Technology

Sexting Isn’t Going Away. Here are Five Ways to Talk to Your Kids about It.