National Center on Sexual Exploitation Targets Law Allowing Tech Companies to Profit from Online Sex Abuse

The National Center on Sexual Exploitation (NCOSE) has launched a campaign to reform an outdated law that effectively allows tech companies to aid, abet and profit from online sex abuse.

Section 230 of the Communications Decency Act

Congress passed the Communications Decency Act (CDA) in 1996 to protect children from online exploitation. It included Section 230, a provision shielding “interactive computer services” from liability for content their users posted online.

Section 230 was a direct response to a 1995 New York Supreme Court case that found Prodigy Services, an online service that hosted discussion forums, liable for a user’s defamatory content because the company had tried to moderate objectionable posts.

The ruling made tech companies reluctant to censor even the most offensive content. Section 230, lawmakers reasoned, gave moderators freedom to remove indecent posts.

What Went Wrong

The CDA would have made it illegal to “knowingly send obscene or indecent messages … as determined by contemporary community standards, to someone under 18.”

In 1997, the Supreme Court found these protections violated the First Amendment. The ruling stripped the CDA of its provisions protecting children but left Section 230 intact.

Over the next two and a half decades, tech companies grew to unprecedented size and power. Courts, meanwhile, interpreted Section 230 to absolve websites, browsers and social media sites of all responsibility for the content and activity on their platforms.

The difference between lawmakers’ intentions for Section 230 and its contemporary use is stark. NCOSE writes:

[Section 230] of CDA was originally designed to help protect children online in the early days of the internet.
However, this law — which is over 25 years old — has instead become a shield for Big Tech corporations, allowing them to evade accountability even as their platforms are blatantly used to facilitate egregious acts like sex trafficking, child sexual abuse and image-based sexual abuse.
Twelve Survivors

Every year, NCOSE releases its “Dirty Dozen” list of “12 mainstream entities facilitating, enabling and even profiting from sexual abuse.” Dirty Dozen alumni include Twitter (now X), Facebook, Google, Instagram, Amazon, HBO, Netflix, Reddit and even the U.S. Department of Defense.

This year, instead of identifying twelve exploitative companies, NCOSE is “highlighting 12 survivors who were denied justice in the courts because of Section 230.”

“These survivors were silenced,” NCOSE writes, “and the companies that enabled their abuse were given immunity.”

X Refuses to Remove Sexually Explicit Photos of a Minor

One of the 12 stories NCOSE highlights is that of a 13-year-old John Doe. A sextortionist posing as a 16-year-old girl convinced John and his friend to engage in a romantic exchange of explicit images. The predator attempted to blackmail John and, when John blocked them, spread the pictures and videos across X. The images eventually reached parents and peers at John’s school.

John and his mom contacted X to get the photos taken down. They sent pictures of John’s ID to prove he was a minor. X responded:

We’ve reviewed the content, and didn’t find a violation of our policies, so no action will be taken at this time.

NCOSE is one of three groups representing John and his friend in court against X. They argue the social media juggernaut “knowingly possessed child sexual abuse material and knowingly benefitted from sex trafficking.”

A federal district court initially dismissed the case under Section 230.

The Ninth Circuit Court of Appeals heard the case in February. NCOSE recounts X’s argument:

X’s attorney himself acknowledged that the platform did not remove the [child sexual abuse] content, even after reviewing [John’s] report.
He continued to maintain that X should be immune anyway, stating that holding tech companies accountable for illegal third-party content being uploaded to the site is “antithetical” to what Congress intended with Section 230.

It continues:

The mere fact that Twitter can admit to partaking in a despicable crime and maintain that they should still be protected shows that Section 230 urgently needs to be reformed.
Why Nothing Has Changed

Lawmakers from both sides of the political aisle support reforming — or repealing — Section 230. Democratic and Republican senators blasted blanket immunity for Big Tech in a Senate Judiciary Committee hearing on the issue in February.

Congress has gathered near-unanimous support for at least six child internet safety bills in recent years. Only two — the REPORT Act and the Take It Down Act — have passed.

Even the Department of Justice (DOJ) has recommended revising Section 230, writing in a 2020 report,

The expansive statutory interpretation [of Section 230], combined with technological developments, has reduced the incentives of online platforms to address illicit activity on their services and, at the same time, left them free to moderate lawful content without transparency and accountability.

The DOJ specifically noted that platforms had used Section 230 to get immunity “even when they knew their services were being used for criminal activity” and to “evade laws and regulations applicable to brick-and-mortar competitors.”

If changing Section 230 is such a popular issue, why has nothing changed?

According to Senator Dick Durbin: money.

When Congress tried to curtail tech companies’ immunity or institute child internet safety protections, “Big Tech opened up a $61.5 million dollar lobbying war chest to make sure [those] bills never became law,” Senator Durbin told the committee in February.

Next Steps

NCOSE asks citizens to encourage Congress to remove Section 230’s immunity protections for tech companies. The growing reckoning over social media’s effect on children, together with this year’s legislative and congressional activity around Section 230, could indicate fertile ground for change.

The Daily Citizen will continue reporting on Section 230 and efforts to hold tech companies accountable.

Additional Articles and Resources

First Lady Melania Trump Celebrates House’s Passage of Take It Down Act

Proposed SCREEN Act Could Protect Kids from Porn

Kids Online Safety Act — What It Is and Why It’s a Big Deal

TikTok Dangerous for Minors — Leaked Docs Show Company Refuses to Protect Kids

Instagram’s Sextortion Safety Measures — Too Little, Too Late?

Zuckerberg Implicated in Meta’s Failures to Protect Children

Teen Boys Fall Prey to Financial Sextortion — Here’s What Parents Can Do

Instagram Content Restrictions Don’t Work, Tests Show

Horrifying Instagram Investigation Indicts Modern Parenting

REPORT Act Becomes Law