Porn Companies Condition Viewers to Desire Illegal and Abusive Content

Porn companies aren’t just using content recommendation algorithms to drive users toward more extreme content, a deluge of new data shows — they’re conditioning users to desire child sexual abuse material (CSAM), image-based sexual abuse (IBSA) and other illegal content.

Experts have long acknowledged the toxic link between digital pornography and online exploitation. Companies like Pornhub not only provide platforms for predators to upload and monetize abusive content, but also have a financial incentive to promote it.

Until recently, porn companies could claim violent, abusive and illegal videos had “slipped through the cracks.” But information released in the last two months suggests porn sites do far more than turn a blind eye — they create computer algorithms that drive traffic to CSAM, IBSA and IBSA-themed content.

Content Recommendation

In a new study on the relationship between porn and IBSA, the National Center on Sexual Exploitation (NCOSE) writes:

It is critical to understand that modern distributors of pornography are information technology experts. [These] platforms often feature algorithmic “recommendations” that profile popular content or elevate specific content based on users’ profiles and/or viewing history.

Content recommendation algorithms are built on “tags” — digital identifiers that help the system organize content into categories and lists. So, by banning problematic tags like “unwilling” or “force,” companies like Pornhub could theoretically generate recommendations that steer people away from abusive videos.
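To make that mechanism concrete, here is a minimal, hypothetical sketch of how a tag blocklist could steer a simple recommender away from flagged content. The catalog, tag names and scoring below are illustrative assumptions for the sketch, not any actual platform's code.

```python
# Minimal, hypothetical sketch of a tag-based recommender with a tag blocklist.
# The catalog, tag names and scoring are illustrative assumptions,
# not any real platform's implementation.

BANNED_TAGS = {"unwilling", "force"}  # tags the platform chooses to ban

catalog = [
    {"id": 1, "tags": {"amateur", "couple"}},
    {"id": 2, "tags": {"force", "amateur"}},   # carries a banned tag
    {"id": 3, "tags": {"couple", "romantic"}},
]

def recommend(user_tag_history, items):
    """Rank items by tag overlap with the user's viewing history,
    excluding anything that carries a banned tag."""
    allowed = [v for v in items if not (v["tags"] & BANNED_TAGS)]
    return sorted(
        allowed,
        key=lambda v: len(v["tags"] & user_tag_history),
        reverse=True,
    )

# Video 2 matches the user's history but is never recommended,
# because the blocklist removes it before ranking.
print(recommend({"amateur"}, catalog))
```

In this simplified setup, banning a tag removes every video carrying it from the candidate pool before ranking, which is why a blocklist can, in theory, steer recommendations away from abusive material.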

But porn companies never eliminate all inappropriate tags. They make the most money by catering to users’ preferences, regardless of whether the content is legal.

Pornhub’s profit motive is etched into thousands of internal communications and memos a federal court accidentally published earlier this month. In one exchange reviewed by investigative journalist Nicholas Kristof, Pornhub executives debated banning some concerning tags.

They elected to jettison “kiddy” and “infant.” But “brutal,” “childhood,” “force,” “snuffs,” “unwilling,” “minor,” and “wasted”? Those were all green-lit.

Kristof writes for The New York Times:

Pornhub executives clearly had some concern about illegal content, such as sex videos involving people who were 17 or younger … But my impression is that Pornhub managers felt conflicted, because they closely tracked the popularity of topics and saw that videos of naked teenagers were a huge draw.

Most of the leaked documents pre-date Kristof’s 2020 Times exposé on Pornhub’s predation and negligence, which effectively forced the company to delete 80% of its content. Since then, the porn distributor claims it has adopted sweeping reforms.

Kristof is skeptical. He opines, “[The] website still seems to me to wink at pedophiles and sadists … There are countless references to videos with the words ‘it hurts’ or ‘painful,’ or about ‘schoolgirls’ or ‘school.’”

Notably, outside English-speaking countries, Pornhub users can still search for videos tagged “young” or “adolescents.”

Pornhub isn’t the only porn company making CSAM and IBSA searchable. NCOSE’s study demonstrates that the world’s most popular porn sites all use content recommendation algorithms capable of finding and recommending heinous content.

That means, at the very least, these companies are enabling existing offenders to access abusive content. NCOSE writes:

When pornography users consume IBSA-themed material on pornography websites, the sites are designed to further fuel that desire based on the user’s past viewing history.

Shaping Sexuality

Porn isn’t merely enabling people’s sexual appetites — it’s shaping them.

In April, The Guardian published an investigation exploring the link between digital pornography and pedophilia. The piece included quotes from two men who had been arrested and charged with viewing CSAM. Now in recovery, both claim they had no sexual interest in children before becoming addicted to porn.

“I didn’t start out wanting to see kids. I was addicted to porn and I went down a road of being totally desensitized as I got further and further from what was normal,” one told the outlet.

“The police never found a single search for images of children: it was all from clicking through links — what the algorithms were offering me,” the other claimed.

These men’s experiences do not negate the seriousness or harm of their choice to view or search for CSAM. They do, however, contextualize data suggesting porn companies are effectively inspiring pedophilic and deviant sexual desires in people who would never otherwise have experienced them.

NCOSE cites a 2016 study of online sexual activity in men. Of the 434 respondents surveyed, 204 (47%) reported they were “involved in a practice or searched for pornography which previously was not interesting or even disgusting to them.”

In 2021, Protect Children, a Finnish child advocacy group, surveyed more than 8,000 anonymous child sex abuse offenders on the dark web. Of the more than 5,000 who answered the question, 70% (3,521) claimed they first viewed CSAM as minors.

Nearly 40% (1,962) said they were younger than 13 years old.

In a more recent survey from Protect Children, 65% (1,800) of more than 2,700 anonymous offenders reported they’d “habitually viewed adult pornography” prior to searching for CSAM. Of those, 67% reportedly watched “every day” or “most days.”

Pornography’s influence on sexual predilections also manifests in offender demographics. Over the last decade and a half, police in the UK have been overwhelmed by a “huge number of lower-level offenders,” veteran profiler Michael Sheath told The Guardian.

According to Detective Chief Inspector Tony Garner, who leads a team targeting child abuse in Worcester, offenders are also skewing younger.

“We are seeing people who are turning 18 and have had 10 years’ exposure to hardcore porn,” Garner says, building on Sheath’s point. “My officers are finding young people watching the most abhorrent material, including child abuse.”

For Sheath, there’s no question the phenomenon is linked to porn:

Before the smartphone, most people’s first experience of sex was with a living person — and that included resistance, pushback, romance.

Now young people are growing up with unfettered access to porn, and porn norms are not about consent. It’s shaping their erotic templates.

Digital porn is not merely a commercial product or a harmless activity. It’s an algorithmic technology that at once profits from online sexual abuse and conditions viewers to seek it out.

That’s an indescribably consequential problem. Parents, legislators and everyone concerned with children’s health and safety need to address it immediately.

Linked below are resources to help you get started.

If you or a loved one are struggling with an addiction to pornography, Focus on the Family has resources to help.

Additional Articles and Resources

Counseling Consultation & Referrals

Help For Pornography Addiction

Addicted to Pornography

Porn Companies Sued for Violating Kansas Age Verification Law

Proposed SCREEN Act Could Protect Kids from Porn

UPDATED: Pornography Age Verification Laws — What They Are and Which States Have Them

A Mother’s Sensibility at the Supreme Court Regarding Pornography

Pornhub Quits Texas Over Age Verification Law

Kids Online Safety Act — What It Is and Why It’s a Big Deal

Your Marriage Can Win the Battle Against Pornography