Daily Citizen


Jan 15 2026

Free Course Helps Parents and Schools Protect Kids from Explicit AI Deepfakes

Elliston Berry, a teenage victim of explicit AI deepfakes, has helped develop a free program to teach parents and schools how to protect their kids from AI-driven sexual abuse.

A classmate used AI to make explicit, nude images of Berry when she was just 14 years old. The adults at her high school were at a loss over how to protect her.

“One of the situations that we ran into [in my case] was lack of awareness and lack of education,” Berry, now 16, told CNN this week. “[The leaders of my school] were more confused than we were, so they weren’t able to offer any comfort [or] any protection to us.”

“That’s why this curriculum is so important,” she emphasized.

Adaptive Security built the free resource in partnership with Berry and Pathos Consulting Group, a company educating children about AI deepfakes and staying safe from online abuse.

“We partnered with Adaptive to build a series of courses together because we believe now is a critical time to protect our youth against these new AI threats,” Evan Harris, the founder of Pathos and a leading expert in protecting children from AI-driven sexual abuse, explained in a video launching the curricula.

The courses explain:

  • What deepfakes are and why they can be harmful.
  • When messing around with AI becomes AI-driven sexual abuse.
  • How to broach discussions about online sexual exploitation with students and parents.

Schools can generate a personalized version of the curriculum by filling out the form on Adaptive Security’s website.

Adaptive’s free lessons also explain the rights of victims under the Take It Down Act, which President Donald Trump signed into law in May 2025. The law penalizes generating explicit, deepfake images of minors with up to three years in prison. It also requires social media companies to scrub nonconsensual intimate images from their platforms within 48 hours of a victim’s request.

Berry, who helped First Lady Melania Trump whip up support for the Take It Down Act, waited nine months for her deepfakes to be taken off the internet.

The Take It Down Act gives victims of AI deepfakes an opportunity to seek justice. Berry, Harris and Adaptive Security CEO Brian Long hope their program will discourage the generation of explicit deepfakes altogether.

“It’s not just for the potential victims, but also for the potential perpetrators of these types of crimes,” Long told CNN, emphasizing:

They need to understand that this isn’t a prank … It’s against the law and it’s really, really harmful and dangerous to people.

Parents and educators should not dismiss AI-driven sexual abuse as a rare occurrence.

The National Center for Missing and Exploited Children received more than 440,000 reports of AI-generated child sexual abuse material in the first half of 2025 — more than six times as many as in all of 2024.

In March 2025, one in every eight of the 1,200 13- to 20-year-olds surveyed by the child safety nonprofit Thorn reported knowing a victim of explicit, AI-generated deepfakes.

That number is likely higher now — if only because X (formerly Twitter) integrated an AI editing feature in November allowing users to generate explicit images of real people in the comment section.

xAI, the company behind X’s built-in AI chatbot, limited the feature last week after the platform flooded with illegal, AI-generated images.

The Daily Citizen thanks Elliston Berry and other victims of AI-driven sexual abuse for using their experiences to help parents keep their kids safe online.

Additional Articles and Resources

X’s ‘Grok’ Generates Pornographic Images of Real People on Demand

President Donald Trump, First Lady Sign ‘Take It Down’ Act

First Lady Melania Trump Celebrates Committee Passage of Bill Targeting Revenge Porn, Sextortion and Explicit Deepfakes

First Lady Supports Bill Targeting Deepfakes, Sextortion and Revenge Porn

Written by Emily Washburn · Categorized: Culture, Family · Tagged: AI, deepfakes, Take It Down Act

Apr 11 2025

First Lady Melania Trump Celebrates Committee Passage of Bill Targeting Revenge Porn, Sextortion and Explicit Deepfakes

First Lady Melania Trump celebrated with lawmakers this week after the House Energy & Commerce Committee passed the Take It Down Act (H.R. 633) in a near-unanimous, 49-to-1 vote.

“This marks a significant step in our bipartisan effort to safeguard our children from online threats,” Mrs. Trump wrote Monday. “I urge Congress to swiftly pass this important legislation.”

The Take It Down Act would make it illegal to share, or threaten to share, nude images and videos without consent. Violators would face up to three years in jail. The bill also requires websites and social media companies to remove explicit images within 48 hours of a victim’s request.

“I was only fourteen years old when one of my classmates created deepfake, AI nudes of me and distributed them on social media,” Elliston Berry, now fifteen, wrote in her own statement. “I was shocked, violated and felt unsafe going to school.”

She continued:

Today is an important milestone towards [the Take It Down Act] becoming law, so that no other girl has to go through what I went through without legal protections in place.

The first lady invited Berry to attend President Trump’s joint address to Congress in March. The president praised both women for their commitment to the bill, promising to sign it into law once it passed the House of Representatives.

The Senate passed the Take It Down Act in a rare unanimous vote on February 13.

Bipartisan support for the legislation reflects lawmakers’ desire to protect children from some of the most common forms of internet exploitation.

Berry was a victim of explicit AI deepfakes — images and videos edited to make it appear as though a person is performing a sexual act. Though the photos may be fake, the damage they do to victims is very real.

The Take It Down Act punishes those who distribute explicit “digital forgeries” with up to two years in prison for images featuring adults, and up to three years in prison for images featuring children.

The bill applies the same penalties to “revenge porn,” a colloquial term for explicit images shared to harm someone mentally, financially or reputationally. It’s commonly associated with bad breakups between partners who had once shared nude images consensually.

The Take It Down Act also prohibits people from threatening to share someone’s nude images, which occurs in sextortion schemes.

Online sextortionists create fake social media accounts to solicit nude images from unsuspecting victims — frequently teenage boys. Once scammers get their hands on explicit photos, they blackmail victims for money in exchange for keeping the images hidden.

In February, sixteen-year-old Elijah Heacock took his own life after getting trapped in a sextortion scam. His mother, Shannon Cronister-Heacock, praised the Energy & Commerce Committee for passing the Act on Monday.

“In February, our family mourned the loss of our loving son and brother, Elijah Heacock, after he fell victim to an extortion scheme on the internet,” Cronister-Heacock wrote, continuing,

We are grateful for the support of Chairman Guthrie and the House Committee on Energy and Commerce for passing the Take It Down Act today to ensure that no parent, sibling or loved one experiences a similar tragedy in the future.

The bill would penalize threatening to share nude images of minors like Elijah with up to 30 months in prison, and threatening to share explicit pictures of adults with up to 18 months.

The House of Representatives must approve the Take It Down Act before the president can sign it into law. House Speaker Mike Johnson has previously voiced support for the bill:

“As the dark side of tech advances, these unspeakable evils of [explicit deepfakes, revenge porn and sextortion] become part of culture, and the laws have to keep up,” Johnson told a roundtable of supportive lawmakers last month.

“We are anxious to put it on the floor in the House and get it to President Trump’s desk for a signature,” he concluded.

The Daily Citizen will continue publishing updates as this important bill makes its way toward the White House.

Additional Articles and Resources

Teen Boys Falling Prey to Financial Sextortion — Here’s What Parents Can Do

Meta Takes Steps to Prevent Kids From Sexting

Instagram’s Sextortion Safety Measures — Too Little, Too Late?

Zuckerberg Implicated in Meta’s Failures to Protect Children

Instagram Content Restrictions Don’t Work, Tests Show

‘The Dirty Dozen List’ — Corporations Enable and Profit from Sexual Exploitation

Taylor Swift Deepfakes Should Inspire Outrage — But X Isn’t to Blame

Written by Emily Washburn · Categorized: Culture, Government Updates · Tagged: deepfakes, revenge porn, sextortion, Take It Down Act

Privacy Policy and Terms of Use | © 2026 Focus on the Family. All rights reserved.
