
Sep 19 2025

Pornhub’s Lies: What the federal complaint against Pornhub’s parent company tells us about the industry.

Pornhub’s parent company, Aylo, will pay $15 million to settle a damning consumer protection complaint out of court.

The joint filing, which the Federal Trade Commission and state of Utah released on September 3, alleged the pornography company lied to consumers about how it identified and removed child sexual abuse material (CSAM) and non-consensual material (NCM) from its websites.

The years-long charade allowed Aylo to make money off abusive content.

Aylo’s settlement is not an admission of guilt. But as the Daily Citizen previously reported, much of the evidence cited in the complaint is already public knowledge. Taken together, these facts show porn companies have little desire or incentive to rid their platforms of CSAM and NCM.

The FTC and Utah accused Aylo of telling six lies about its platforms between 2018 and 2022.

The first and most egregious was that its sites did not contain CSAM or NCM.

Aylo began making this claim in 2018, vociferously assuring consumers all content on its websites was indisputably legal.

This is a demonstrable lie. Between June and December 2020, Aylo identified and removed more than 8,800 CSAM videos from its free pornography sites, most of which the company had been making money on for years.

On Pornhub alone, Aylo found 1,300 videos and 4,000 images other companies and regulatory agencies had already “fingerprinted” as CSAM. In December 2020, Aylo removed 10.5 million Pornhub videos uploaded by unverified users, tens of thousands of which included CSAM and NCM.

Aylo benefited from ignoring CSAM and NCM on its platforms because new, popular content increases subscription and advertising revenue. Evidence included in the FTC’s suit suggests Aylo prioritized this profit motive over identifying and removing abusive content.

On several occasions, Aylo representatives instructed “Content Partners” — pornography studios the company licenses content from — to create content with labels evoking NCM and CSAM, like “young girl,” “school girl” and “teen strip.”

Aylo arranged this content into playlists with titles and tags suggesting NCM and CSAM, like “less than 18” and “the best collection of young boys.”

In March 2020, prior to those audits, Aylo found several of the videos it had licensed contained actual CSAM and NCM. Instead of taking these videos down, the FTC alleged, “[Aylo] merely edited their titles to remove any suggestion that they contained [illegal content].”

Aylo told five other lies to prop up its primary claim that its websites contained no illegal content.

Aylo personnel review “flagged” content “as soon as they are made aware of it.”

Aylo’s websites allow users to “flag” content for NCM, CSAM or other objectionable elements. In 2018, the company assured users it removed “flagged” content “as soon as it became aware of [the complaint].”

But between 2015 and May 2020, the FTC alleged, Aylo wouldn’t even review a video until it had been flagged by 16 unique users. Even then, a review wasn’t guaranteed. By the time Aylo mandated content reviews, Pornhub alone had amassed a backlog of over 700,000 videos that had been flagged more than 15 times each.

Thousands of these videos contained CSAM and tens of thousands contained NCM. Aylo had been making money off many of them for over a decade.

Aylo bans anyone who uploads CSAM to its websites.

Aylo claims it has “zero tolerance” for users who upload CSAM. But the FTC’s complaint lists several examples in which the company deleted specific photos or videos but failed to ban the uploader.

Further, Aylo didn’t ban offenders across all its websites until June 2021. When the policy changed, the company discovered at least 20 Pornhub users who had already been banned from other Aylo sites for uploading CSAM.

Even then, Aylo made it easy for offenders to re-access its platforms. Until October 2022, the company’s policy merely prevented users from making an account with the same username and email address.

Aylo prevents CSAM that has been deleted from being re-uploaded to its websites.

Aylo assures users it digitally “fingerprints” every piece of CSAM it identifies and deletes from its sites. This digital identifier ostensibly alerts the company when banned content is re-uploaded.

But the FTC claims Aylo’s fingerprint detection software didn’t work between 2017 and August 2021. During that time, the complaint reads, “hundreds of videos that had been previously identified as CSAM were re-uploaded to [Aylo’s] websites.”

Internal documents show the company knew about this problem at least as far back as November 2020, but it continued to publicly boast about its safety capabilities.

Aylo verifies the identities of all individuals in its licensed content.

U.S. law requires companies like Aylo to verify the names and ages of every individual who appears in sexually explicit content on their subscription-based sites.

The FTC alleged Aylo failed to require this paperwork until December 2020. In January 2021, it discovered 700 of its “Content Partners” — nearly one in five — couldn’t provide identities for at least some performers.

Thousands of videos were subsequently removed from Aylo’s paid websites.

Aylo employees review every piece of content before it is posted to its sites.

When Aylo began making this claim in 2019, millions of videos across its websites had allegedly never been reviewed by the content moderation team.

Internal documents suggest moderators spent mere seconds looking at each video. In a performance report from January 2019 cited by the FTC’s complaint, Aylo’s head of moderation claimed he reviewed about 1,200 videos every eight hours. Eight hours is 28,800 seconds, so that works out to an average of about 24 seconds per video.

When Aylo actually started reviewing every piece of content, moderators couldn’t handle the load. Per the complaint, the company’s head of moderation told leadership “his team was in ‘panic mode’ because it was severely understaffed and, as a result, moderators were making mistakes due to the high volume of videos they were required to review.”

Why It Matters

The evidence suggests Aylo has no apparent incentive or desire to stop making money off abuse, barring corporate or public pressure. Even then, its solutions frequently contain loopholes and backdoors designed to undermine safety measures and maximize profits.

A company with this track record is unlikely to spontaneously discover its moral compass, let alone one that produces and profits off pornography. That’s why it’s so important for federal regulators like the FTC to hold it accountable for deceptive and unfair business practices.

Aylo and companies like it will only stop facilitating CSAM and NCM when it becomes more expensive to platform such content than to take it down.

Additional Articles and Resources

Help with Pornography

Pornhub’s Parent Company Settles With Feds Over CSAM Complaint

Porn Companies Condition Viewers to Desire Illegal and Abusive Content

President Donald Trump, First Lady Sign ‘Take It Down’ Act

Louisiana Sues Roblox for Exposing Children to Predators, Explicit Content

UPDATED: Pornography Age Verification Laws — What They Are and Which States Have Them

Pornhub Quits Texas Over Age Verification Law

Written by Emily Washburn · Categorized: Culture · Tagged: csam, pornography

Sep 18 2025

Pornhub’s Parent Company Settles With Feds Over CSAM Complaint

This article is part one of a two-part review of the FTC’s judgment against Pornhub’s parent company for profiting off child abuse.

Pornhub’s parent company, Aylo, will pay $15 million to settle a federal complaint alleging it facilitated the online spread of child sexual abuse material (CSAM) and non-consensual material (NCM) for more than a decade.

The Federal Trade Commission (FTC) and state of Utah filed a joint complaint against the pornography company on September 3 for engaging in deceptive and unfair business practices. The filing suggests Aylo spent years lying about the prevention, detection and removal of CSAM and NCM from its websites.

The FTC claims that, between 2018 and 2022, Aylo spread six major falsehoods:

  • That its sites do not contain CSAM and NCM.
  • That it reviews every piece of content before consumers can view it.
  • That it reviews every piece of content users report as inappropriate “as soon as the company is made aware of [the complaint].”
  • That it bans users that upload CSAM and NCM.
  • That it prevents CSAM and NCM that has been deleted from being uploaded again.
  • That it knows the name and age of every individual in every piece of content on its subscription-based sites.

The Daily Citizen will explore each of these alleged lies in more detail in Part 2.

“Aylo’s business model was simple: more content, more money,” FTC Chairman Andrew Ferguson wrote in a statement explaining the complaint.

“Aylo profited from the distribution of CSAM and NCM for many years by deceiving consumers ‘both in the press and on their websites’ to convince them that its business practices were all above board.”

The years-long con allowed Aylo to hide abusive content in plain sight — and continue to profit from it.

Aylo settled with the FTC and Utah on September 8, agreeing to fork over $15 million in fines. While the settlement is not an admission of guilt, it is an expensive bid to avoid contesting the complaint under oath.

The porn giant cannot truthfully contest many of the facts included in the filing. Several of its lies emerged in full view of the public. In December 2020, for instance, the New York Times’ Nicholas Kristof wrote an exposé claiming Pornhub was rife with CSAM, NCM and other objectionable content.

Ten days later, Aylo removed 10.5 million videos from Pornhub for “failing to meet the company’s guidelines.”

Aylo’s settlement agreement requires more than a mere fine. The company must also:

  • Suspend all licensed photos and videos containing individuals Aylo cannot identify.
  • Suspend all “Content Partners” — porn producers Aylo sources content from — that cannot identify individuals featured in their content.
  • Create a “CSAM and NCM Prevention Program” facilitating the prompt reporting and removal of CSAM and NCM.
  • Remove all content featuring a person who no longer wishes to appear on Aylo’s platforms.
  • Ban all users that post comments or direct messages “encouraging or soliciting CSAM or NCM or encouraging or engaging in child abuse or non-consensual sexual activities.”

Perhaps most importantly, Aylo must “clearly and conspicuously” post a notice on all its websites disclosing the FTC’s allegations. Ferguson writes:

Coming from an entity that the complaint alleges refused, for years, to own up to its egregious wrongdoing, this notice is a major victory for countless victims of sexual abuse.

The Daily Citizen heartily agrees.

Aylo’s settlement with the FTC and Utah is one of several critical steps legislators and regulators have taken this year to fight back against extreme digital pornography, including passing age-verification legislation and laws like the Take It Down Act.

This important progress is driven by our growing understanding of the ways digital pornography shapes our brains and sexual behaviors and what motivates companies like Aylo and Roblox to profit off CSAM, pedophilia and other abusive content.

To that end, the Daily Citizen dives further into Aylo’s perfidy in part two.

Additional Articles and Resources

Help with Pornography

Porn Companies Condition Viewers to Desire Illegal and Abusive Content

President Donald Trump, First Lady Sign ‘Take It Down’ Act

Louisiana Sues Roblox for Exposing Children to Predators, Explicit Content

UPDATED: Pornography Age Verification Laws — What They Are and Which States Have Them

Pornhub Quits Texas Over Age Verification Law

Written by Emily Washburn · Categorized: Culture · Tagged: csam, pornography
