Pornhub’s Lies: What the federal complaint against Pornhub’s parent company tells us about the industry.
Pornhub’s parent company, Aylo, will pay $15 million to settle a damning consumer protection complaint out of court.
The joint filing, which the Federal Trade Commission and state of Utah released on September 3, alleged the pornography company lied to consumers about how it identified and removed child sexual abuse material (CSAM) and non-consensual material (NCM) from its websites.
The years-long charade allowed Aylo to make money off abusive content.
Aylo’s settlement is not an admission of guilt but, as the Daily Citizen previously reported, much of the evidence cited in the complaint is already public knowledge. Taken together, these facts show porn companies have little desire or incentive to rid their platforms of CSAM and NCM.
The FTC and Utah accused Aylo of telling six egregious lies about its platforms between 2018 and 2022.
Beginning in 2018, Aylo vociferously assured consumers that all content on its websites was indisputably legal.
This is a demonstrable lie. Between June and December 2020, Aylo identified and removed more than 8,800 CSAM videos from its free pornography sites, most of which the company had been making money on for years.
On Pornhub alone, Aylo found 1,300 videos and 4,000 images other companies and regulatory agencies had already “fingerprinted” as CSAM. In December 2020, Aylo removed 10.5 million Pornhub videos uploaded by unverified users, tens of thousands of which included CSAM and NCM.
Aylo benefited from ignoring CSAM and NCM on its platforms because new, popular content increases subscription and advertising revenue. Evidence included in the FTC’s suit suggests Aylo prioritized this profit motive over identifying and removing abusive content.
On several occasions, Aylo representatives instructed “Content Partners” — pornography studios the company licenses content from — to create content with labels evoking NCM and CSAM, like “young girl,” “school girl” and “teen strip.”
Aylo arranged this content into playlists with titles and tags suggesting NCM and CSAM, like “less than 18” and “the best collection of young boys.”
In March 2020, before these reviews, Aylo found several of the videos it had licensed contained actual CSAM and NCM. Instead of taking these videos down, the FTC alleged, “[Aylo] merely edited their titles to remove any suggestion that they contained [illegal content].”
Aylo told five other lies to prop up its primary claim that its websites contained no illegal content.
Aylo’s websites allow users to “flag” content for NCM, CSAM or other objectionable elements. In 2018, the company assured users it removed “flagged” content “as soon as it became aware of [the complaint].”
But between 2015 and May 2020, the FTC alleged Aylo wouldn’t even review a video until it had been flagged by 16 unique users. Even then, a review wasn’t guaranteed. By the time Aylo mandated content reviews, Pornhub alone had amassed a backlog of over 700,000 videos that had been flagged more than 15 times each.
Thousands of these videos contained CSAM and tens of thousands contained NCM. Aylo had been making money off many of them for over a decade.
Aylo claims it has “zero tolerance” for users who upload CSAM. But the FTC’s complaint lists several examples in which the company deleted specific photos or videos but failed to ban the user entirely.
Further, Aylo didn’t ban offenders across all its websites until June 2021. When the policy changed, the company discovered at least 20 active Pornhub users had already been banned from its other sites for uploading CSAM.
Even then, Aylo made it easy for offenders to re-access its platforms. Until October 2022, the company’s policy merely prevented users from making an account with the same username and email address.
Aylo assures users it digitally “fingerprints” every piece of CSAM it identifies and deletes from its sites. This digital identifier ostensibly alerts the company when banned content is reuploaded.
But the FTC claims Aylo’s fingerprint detection software didn’t work between 2017 and August 2021. During that time, the complaint reads, “hundreds of videos that had been previously identified as CSAM were re-uploaded to [Aylo’s] websites.”
Internal documents show the company knew about this problem at least as far back as November 2020, but it continued to publicly boast about its safety capabilities.
U.S. law requires companies like Aylo to verify the names and ages of every individual who appears in sexually explicit content on their subscription-based sites.
The FTC alleged Aylo failed to require this paperwork until December 2020. In January 2021, it discovered 700 of its “Content Partners” — nearly one in five — couldn’t provide identification for at least some performers.
Thousands of videos were subsequently removed from Aylo’s paid websites.
When Aylo began claiming in 2019 that it reviewed every piece of uploaded content, millions of videos across its websites had allegedly never been reviewed by the content moderation team.
Internal documents suggest moderators spent mere seconds looking at each video. In a performance report from January 2019 cited by the FTC’s complaint, Aylo’s head of moderation claimed he reviewed about 1,200 videos every eight hours. That’s an average of about 24 seconds each.
When Aylo actually started reviewing every piece of content, moderators couldn’t handle the load. Per the complaint, the company’s head of moderation told leadership “his team was in ‘panic mode’ because it was severely understaffed and, as a result, moderators were making mistakes due to the high volume of videos they were required to review.”
The evidence suggests Aylo has no incentive or desire to stop making money off abuse absent legal or public pressure. Even then, its solutions frequently contain loopholes and backdoors that undermine safety measures and maximize profits.
A company with this track record is statistically unlikely to spontaneously discover its moral compass, let alone a company that produces and profits off pornography. That’s why it’s so important for federal regulators like the FTC to hold it accountable for deceptive and unfair business practices.
Aylo and companies like it will only stop facilitating CSAM and NCM when it becomes more expensive to platform it than to take it down.
Additional Articles and Resources
Pornhub’s Parent Company Settles With Feds Over CSAM Complaint
Porn Companies Condition Viewers to Desire Illegal and Abusive Content
President Donald Trump, First Lady Sign ‘Take It Down’ Act
Louisiana Sues Roblox for Exposing Children to Predators, Explicit Content
UPDATED: Pornography Age Verification Laws — What They Are and Which States Have Them
Pornhub Quits Texas Over Age Verification Law
ABOUT THE AUTHOR

Emily Washburn is a staff reporter for the Daily Citizen at Focus on the Family and regularly writes stories about politics and noteworthy people. She previously served as a staff reporter for Forbes Magazine, as an editorial assistant and contributor for Discourse Magazine, and as Editor-in-Chief of the newspaper at Westmont College, where she studied communications and political science. Emily has never visited a beach she hasn’t swum at, and is happiest reading a book somewhere tropical.