
Meta founder and CEO Mark Zuckerberg faced renewed criticism this week after The New York Times unearthed information suggesting he repeatedly steered the social media company away from adopting protections for children.

Background

Meta is embroiled in more than a dozen child exploitation lawsuits.

Launched by more than 45 states and the District of Columbia, the cases allege Meta used habit-forming technology and harmful content to addict teens to Instagram and Facebook — all while telling parents the platforms were “safe.”

The cases reference internal documents and communications from Meta going back to 2016.

The Issue

The Times’ analysis of lawsuits brought against Meta by Tennessee and New Mexico — including more than 1,000 leaked documents — found:

Mr. Zuckerberg and other Meta leaders repeatedly promoted the safety of the company’s platforms, playing down risks to young people, even as they rejected employee pleas to bolster youth guardrails and hire additional staff.

The Times’ findings support a recent story from The Wall Street Journal alleging Meta lied about the effectiveness of its latest age-related content restrictions.

Not only did Meta know that Instagram fed underage users sexual content, the Journal claims; the company also conducted tests finding Instagram recommended explicit and violent content to underage users more frequently than to those aged 30 and above.

It was reportedly not until late last year that safety staff “discussed whether it would be sufficient to reduce the frequency of minors seeing prohibited content to the level adults see.”

The Canaries

The Times highlights five telling interactions between Zuckerberg and Meta employees, fellow executives and a prominent advertiser.

Plans to Shore Up, Shut Down

Kevin Systrom, then Instagram’s chief executive, emailed the Silicon Valley scion in 2017 asking for more staff to “work on mitigating [Instagram and Facebook’s] harm to users.”

Zuckerberg equivocated, shelving the request while Facebook faced “more extreme issues,” though he promised to create a plan to hire more staff.

Systrom tried to impress the urgency of the problem upon Zuckerberg, citing examples of “imminent harm,” including a boy who shot himself on Instagram Live.

It’s unclear if Zuckerberg ever responded to Systrom’s email or hired the requested staff. Documents do show, however, that this email exchange occurred while Zuckerberg was absorbed in competing with Snapchat, a rival social media app that briefly surpassed Instagram’s popularity in 2016.

At the time, the Times claims, Zuckerberg “directed executives to focus on getting teenagers to spend more time on the company’s platforms.” An email from November 2016 confirmed as much: “The overall company goal is total teen time spent.”

Systrom left Meta in 2018.

David Ginsberg, a Meta executive, reiterated Systrom’s request in 2019, asking Zuckerberg for 24 engineers, researchers and staffers to lessen “problematic use [of] and addiction [to Instagram and Facebook among] teens.”

Though Ginsberg claimed Instagram had a “deficit” of staff to address the serious issue, Zuckerberg never responded. Instead, both Susan Li, a financial executive at Meta, and Adam Mosseri, the head of Instagram, denied Ginsberg funding.

Li is now Meta’s Chief Financial Officer.

In 2021, Nick Clegg recommended Zuckerberg stave off lawsuits by shoring up Meta’s “understaffed and fragmented” efforts to protect teens’ mental health.

“We need to do more, and we are being held back by a lack of investment on the product side,” Clegg’s email reads, “which means that we’re not able to make changes and innovations at the pace required to be responsive to policymaker concerns.”

Clegg asked for 25 employees and 20 engineers to research and mitigate any harm Instagram and Facebook cause teens, but Zuckerberg failed to reply. After three months of silence, Clegg reduced his request to 32 employees, none of them engineers.

“This investment is important to ensure we have the product roadmaps necessary to stand behind our external narrative of well-being on our apps,” he emphasized.

Li replied to Clegg indicating the project likely wouldn’t be greenlit. It’s unclear whether Zuckerberg ever addressed Clegg’s request — or even responded to his email.

Zuckerberg did, however, respond to a Times article from September 2021 mentioning a video of the billionaire riding an “electric surfboard.” He had actually been riding a manual hydrofoil, according to the correction he posted on Facebook.

Clegg is now Meta’s head of global affairs.

Beauty Is Pain

Meta came under fire in 2019 after introducing a “Fix Me” camera filter that superimposed dotted lines, apparently mimicking a plastic surgeon’s pre-operation markings, on users’ faces. The company temporarily banned the filter while it debated whether to delete the product altogether.

Ahead of a meeting with Zuckerberg, one group of employees interviewed 18 mental health experts, all of whom feared camera filters referencing plastic surgery could harm young people.

Zuckerberg canceled the meeting.

Instead, he emailed the company to say he wanted to lift the ban, “especially [because] there’s no data I’ve seen that suggests doing so is helpful or not doing so is harmful.”

Margaret Gould Stewart, then vice president for product design and responsible innovation, subsequently emailed Zuckerberg to express concern about the pressure such a filter could put on teen girls.

“I was hoping we could maintain a moderately protective stance here given the risk to minors,” she wrote. “I just hope that years from now we will look back and feel good about the decision we made here.”

It’s unclear if Zuckerberg ever responded. Stewart no longer works for Meta, which claims it no longer allows filters “that directly promote cosmetic surgery, changes in skin color or extreme weight loss.”

One of social media’s best-documented effects on girls, particularly those aged 11 to 12, is harm to body image.

X-Rated

Instagram’s problematic content drew the attention of an advertiser last year.

Bernard Kim is CEO of Match Group, the parent company of online dating apps including Tinder and OkCupid. Kim contacted Meta after ads for his companies ran alongside what the Times called “‘highly disturbing’ violent and sexualized content, some of it involving children.”

Though Meta subsequently removed some of the objectionable content and assured Kim it was a fluke, the businessman emailed Zuckerberg himself, writing in part:

Meta is placing ads adjacent to offensive, obscene — and potentially illegal — content, including sexualization of minors and gender-based violence.

Zuckerberg allegedly failed to respond.

DID YOU KNOW?

Social media companies make money when advertisers buy advertisements on their platforms. The more people see an advertisement on social media, the more companies will pay to place one. That’s why companies like Meta rely so heavily on addictive algorithms — they want as many eyes on their site as possible, for as long as possible, so they can make the most money.

What It Means

Meta clearly doesn’t move without Zuckerberg’s say-so, which means he is responsible for the company’s stubborn refusal to meaningfully protect children on Instagram and Facebook.

Rumors of dissent throughout Meta’s ranks have swirled for years, fanned by testimony from whistleblowers like Frances Haugen. Even Mosseri has expressed concerns about the kind of content Instagram shows minors, according to the Journal.

If employees and executives alike are concerned about the effects of their products on children, I’ve frequently wondered, why has Meta so uniformly failed to protect children on its platforms?

The answer, I now know, is Mark Zuckerberg.

While it’s no secret that he, like most businessmen, prioritizes Meta’s financial success, the Times’ reporting shows he has been instrumental in making teens Instagram’s target audience and in killing investments in child safety protections.

Why It Matters

Parents know everything they need to know about social media:

  • They know that social media, including Instagram and Facebook, harms children mentally, emotionally, physically and socially.
  • They know that age-appropriate content restrictions aren’t foolproof, and that social media companies frequently lie about their effectiveness.
  • They know that companies have a financial incentive to entice teens to use their products.

Now, they know that the man behind the industry’s largest platforms, arguably its most influential player, has deprioritized children’s safety at Meta for years.

The puzzle isn’t hard to piece together: kids don’t belong on social media. They will never be safe, because it is a product designed to keep their eyes glued to the screen at all costs.

That’s why the Daily Citizen strongly recommends you consider prohibiting or restricting your children’s social media use.

Additional Articles and Resources

Instagram Content Restrictions Don’t Work, Tests Show

Surgeon General Recommends Warning on Social Media Platforms

Four Ways to Protect Your Kids from Bad Tech, from Social Psychologist Jonathan Haidt

Social Psychologist Finds Smartphones and Social Media Harm Kids in These Four Ways

Teen Boys Falling Prey to Financial Sextortion — Here’s What Parents Can Do

Meta Takes Steps to Prevent Kids from Sexting

Horrifying Instagram Investigation Indicts Modern Parenting

‘The Dirty Dozen List’ — Corporations Enable and Profit from Sexual Exploitation

‘Big Tech’ Device Designs Dangerous for Kids, Research Finds

Survey Finds Teens Use Social Media More Than Four Hours Per Day — Here’s What Parents Can Do

The Harmful Effects of a Screen-Filled Culture on Kids

Social Media Age Restriction — Which States Have Them and Why They’re So Hard to Pass

REPORT Act Becomes Law

Plugged In Parent’s Guide to Today’s Technology