Daily Citizen


Mar 05 2025

First Lady Supports Bill Targeting Deepfakes, Sextortion and Revenge Porn

The House of Representatives is preparing to pass a bill targeting revenge porn, online sextortion and pornographic deepfakes, multiple sources report, following exhortations from America’s first family.

The bipartisan Take It Down Act (H.R. 633) makes it illegal to share, or threaten to share, nude images and videos without consent. It passed the Senate on February 13 in a rare unanimous vote.

First lady Melania Trump joined Speaker of the House Mike Johnson at a congressional roundtable Monday to support the bill.

“I am here with you today with a common goal — to protect our youth from online harm,” Mrs. Trump began, continuing:

In today’s AI-driven world, the threat of privacy breaches is alarmingly high. As organizations harness the power of our data, the risk of unauthorized access and misuse of a person’s information escalates.
We must prioritize robust security measures and uphold strict ethical standards to protect individual privacy.

Johnson echoed the first lady, acknowledging “laws need to keep up” with the “unspeakable evils” spawned by the “dark side of tech.”

“We are anxious to put it on the floor in the House, to get it to President Trump’s desk for his signature, because we’ve got to do what we can to stop [nonconsensual sharing of explicit images],” he said.

The president highlighted Melania’s support for the bill in last night’s wide-ranging joint address to Congress, calling it “so important.”

“Once it passes the House, I look forward to signing it into law,” he said, thanking Senate Majority Leader John Thune for shepherding it through the Senate.

The House Committee on Energy and Commerce must approve the Take It Down Act before the House can vote on it. Chairman Brett Guthrie said Monday a committee hearing on the bill will occur “very, very soon.”

How It Works

The Take It Down Act addresses three of the most common ways bad actors weaponize nude images online.

Revenge Porn

The first, and perhaps most familiar, way people exploit nude photos is “revenge porn” — when explicit images are shared to harm someone mentally, financially or reputationally. It is most closely associated with aggrieved ex-boyfriends leaking once-private sexual images of former girlfriends.

The Act makes it illegal to publish sexual images that were:

  • Created or shared with a reasonable expectation of privacy, like those sent to a romantic partner.
  • Shared to cause harm.

Violators would face up to two years in prison for sharing images of an adult, and up to three years for sharing a minor’s.

Deepfakes

The same penalties apply to what the Act calls “digital forgeries” — images and videos edited to make it appear as though a person is performing a sexual act. They are more commonly known as deepfakes.

Elliston Berry, 15, who attended the joint address to Congress with the first lady, is one of many victims of pornographic deepfakes. When she was just 14 years old, Berry discovered that a peer had edited one of her Instagram posts to make it look like she was posing nude.

Berry told Monday’s congressional roundtable how the violation affected her life:

Fear, shock, and disgust were just some of the many emotions I felt.
I felt responsible and began to blame myself. I was ashamed to tell my parents, despite doing nothing wrong.
As I attended school, I was scared of the reactions of [people] or [that] someone could recreate those photos.

The Take It Down Act imposes the same penalties on people who share deepfakes as on those who share real photos, consistent with the real damage doctored images do to victims like Berry.

Online Sextortion

Threatening to leak explicit photos can be just as harmful as actually exposing them. Since 2021, at least 20 teenage boys have reportedly committed suicide after falling victim to sextortion.

Online sextortionists create fake social media accounts to lure users into a romantic exchange of nude images. Once they get their hands on someone’s explicit photos, the scammers demand money in exchange for keeping the images quiet.

In 2022, Gavin Guffey, a 17-year-old from South Carolina, ended his life less than two hours after being contacted by a sextortionist. His dad, state Representative Brandon Guffey, is one of the bill’s biggest supporters. He described his experience in an article for The Hill.

As a father, I believe it is my job to protect our kids. Since Gavin took his life, I have been focused on continuing to use my voice to advocate, help victims, fight child online sexual abuse and focus on teen mental health. Our children’s safety is an issue that transcends party lines.

Under the Take It Down Act, sextortionists would face up to 18 months in prison for targeting an adult, and up to 30 months for targeting minors like Gavin.

A Pathway to Take It Down

H.R. 633 would also require websites and social media companies to remove explicit images within 48 hours of a victim’s request.

Representative Maria Salazar (FL), who sponsored the bill in the House, said of this provision:

The act, finally, is sending a very big message to Big Tech that they have to bring down these images within 48 hours. No more time than that. No more excuses. If not, Big Tech will be just as guilty as the aggressor.

Why It Matters

The internet is not a safe place for kids. At the very least, the Take It Down Act demonstrates Congress’ desire to help parents protect their kids from exploitation.

That’s something to celebrate.

Additional Articles and Resources

Teen Boys Falling Prey to Financial Sextortion — Here’s What Parents Can Do

Meta Takes Steps to Prevent Kids From Sexting

Instagram’s Sextortion Safety Measures — Too Little, Too Late?

Zuckerberg Implicated in Meta’s Failures to Protect Children

Instagram Content Restrictions Don’t Work, Tests Show

‘The Dirty Dozen List’ — Corporations Enable and Profit from Sexual Exploitation

Taylor Swift Deepfakes Should Inspire Outrage — But X Isn’t to Blame

Written by Emily Washburn · Categorized: Culture · Tagged: pornography, social media, Trump

Feb 20 2025

Utah Family Vlogging Bill Passes Committee After Franke Family Testimony

Utah legislators advanced a bill protecting child performers and social media stars this week after members of the Franke family testified to the harms of family vlogging (video blogging).

House Bill 322 requires parents of child performers, including children featured in family vlogs, to deposit 15% of their child’s income in a trust. The legislation also establishes a pathway for adults to remove social media content they starred in as kids.

The House Business and Labor Committee unanimously approved HB 322 after hearing from members of the Franke family.

Kevin and Ruby Franke posted daily videos of themselves and their six children to YouTube for years. At its height, their channel, “8 Passengers,” boasted more than two million subscribers.

The Frankes’ carefully curated family image shattered in August 2023, when police arrested Ruby and her business partner, Jodi Hildebrandt, for mistreating Ruby’s youngest children. Both women are serving prison sentences for aggravated child abuse.

Since Ruby’s arrest, Kevin and the Franke children have become prominent critics of family vlogging.

“Had [this bill] been in place when my family was doing YouTube, my mom would not have been able to withdraw all my savings from doing YouTube,” Kevin told the Business and Labor Committee, reading a statement on behalf of his daughter Julie. “This bill will prevent other kids from having to go through the pain of realizing that the compensation for years’ worth of time and effort is suddenly gone.”

Representative Doug Owens, the bill’s sponsor, emphasized that HB 322’s financial protections apply only to family vlogging channels making more than $150,000 a year, like “8 Passengers.”

Dave Davis, a lobbyist for family vloggers, said his clients won’t oppose the legislation.

“They can make it work if it’s the will of the body to move forward in this direction,” he told a local news station.

While the Frankes’ eldest daughter, Shari, supports all restrictions on family vlogging, she warns that money doesn’t compensate for growing up in a more invasive version of reality TV. She was 11 when “8 Passengers” started.

“If I could go back and do it all again, I’d rather have an empty bank account now, and not have my childhood plastered all over the internet,” she told the Business and Labor Committee in October. “No amount of money I’ve received has made what I experienced worth it.”

Often, she recalled, money was used to entice her and her siblings to film increasingly embarrassing and vulnerable videos:

Payment was usually a bribe. For example, we’d be awarded $100 or a shopping trip if we filmed a particularly embarrassing moment or an exciting event in our lives.

Kevin, for his part, wishes he could turn back the clock.

“Vlogging my family — putting my children into public social media — was wrong, and I regret it every day,” he told the committee. “Children cannot give informed consent to be filmed on social media, period.”

Focus on the Family’s Plugged In helps families navigate our technological age. While not all family vlogging channels are exploitive, it cautions parents against using children for content and revenue.

Growing up is tough enough. Growing up in the public eye is exponentially tougher. While lots of wonderful families live out their lives on YouTube, it’s inherently dangerous to commoditize those lives for public consumption.
Kids should be kids first — not entertainers, and certainly not employees.

The Frankes’ youngest daughter, Eve, articulated a remarkably similar thought to the Business and Labor Committee on Tuesday.

“I’m not saying YouTube is a bad thing,” she wrote. “Sometimes it brings us together. But kids deserve to be loved, not used by the ones that are supposed to love them the most.”

HB 322 will come before the full Utah House of Representatives for a vote in the coming months. If it passes, the bill will go to the state Senate for approval.

Additional Articles and Resources

Plugged In

Plugged In Parent’s Guide to Today’s Technology equips parents to navigate the ever-shifting tech realm.

Horrifying Instagram Investigation Indicts Modern Parenting

TikTok Dangerous for Minors — Leaked Docs Show Company Refuses to Protect Kids

Teen Boys Falling Prey to Financial Sextortion — Here’s What Parents Can Do

Instagram’s Sextortion Safety Measures — Too Little, Too Late?

Kid’s Online Safety Act — What It Is and Why It’s a Big Deal

Instagram Content Restrictions Don’t Work, Tests Show

Zuckerberg Implicated in Meta’s Failures to Protect Children

Surgeon General Recommends Warning on Social Media Platforms

‘The Dirty Dozen List’ — Corporations Enable and Profit from Sexual Exploitation

Four Ways to Protect Your Kids from Bad Tech, From Social Psychologist Jonathan Haidt

Written by Emily Washburn · Categorized: Culture · Tagged: social media, vlogging

Jan 21 2025

Trump Revives TikTok

President Donald Trump signed an executive order yesterday staving off the enforcement of the TikTok ban-or-sell law, giving the social media app 75 more days to find an American buyer.

TikTok was scheduled to shut down in the U.S. on January 19, one day prior to Mr. Trump’s inauguration, after failing to comply with the Protecting Americans from Foreign Adversary Controlled Applications Act (the Act).

The bipartisan act, which then-President Biden signed into law last April, required TikTok to divest from its Chinese parent to continue operating in the U.S.

Americans couldn’t access TikTok online or on app stores for about 14 hours Sunday morning before then-President-elect Trump took to Truth Social, promising to delay the Act’s enforcement and shield companies from financial liability.

The Act allows fines of up to $5,000 for every user who accesses TikTok after January 19. With roughly 170 million American users, that adds up to potential liability in the hundreds of billions of dollars.

Mr. Trump’s promise was enough for web hosts like Oracle and Akamai to restore online access to TikTok before his inauguration. But TikTok remains absent from the Apple and Google app stores, even after President Trump issued his official order, apparently in deference to lingering legal questions.

Legal experts disagree over whether a President can effectively change or negate a law passed by Congress with an executive order.

Importantly, Mr. Trump’s order doesn’t actually extend the law’s deadline. It directs the Attorney General not to enforce the law for another 75 days. While most seem to agree the President can control which laws the Department of Justice enforces, some argue the order sets a bad precedent. Zachary Price, a professor at the University of California College of the Law, San Francisco, told The New York Times:

I don’t think it’s consistent with faithful execution of the law to direct the attorney general to not enforce [the law] for a determinate period.

Though many have expressed doubts about the executive order, it’s unclear who would go so far as to challenge it in court. The edict has broad, bipartisan appeal. Legislators spent the last week working furiously to get out of the deadline they had created, reluctant to kick 170 million Americans off an admittedly addictive app.

Though few have criticized Mr. Trump’s order directly, those concerned with national security continued discouraging app stores from platforming TikTok. Arkansas Senator Tom Cotton wrote on X:

Any company that hosts, distributes, services or otherwise facilitates communist-controlled TikTok could face hundreds of billions of dollars of ruinous liability under the law, not just from the DOJ, but also under securities law, shareholder lawsuits and state AGs. Think about it.

Mr. Trump’s commitment to finding TikTok an American buyer might mollify security hawks like Cotton enough to keep them from formally challenging the order. The trouble, then, is convincing China to let TikTok go.

Evidence suggests the Chinese government has significant sway over TikTok’s parent company, ByteDance. Some, like South Carolina Senator Lindsey Graham, have even suggested China’s President, Xi Jinping, holds stock in ByteDance. Either way, the company will be unable to sell TikTok without the People’s Republic of China’s say-so.

Yesterday, for the first time, the Chinese government seemed to soften its stance on selling TikTok to an American company.

“For such actions as corporate operations and acquisitions, we always believe they should be decided independently by companies based on market principles,” Chinese Foreign Ministry spokeswoman Mao Ning told reporters in Beijing on Monday.

Your kids might be able to maintain access to TikTok — but should they? Read the articles below to find out why you should think twice about TikTok and other social media platforms.

Additional Articles and Resources

Supreme Court Upholds TikTok Ban

TikTok Ban Three Days Away

TikTok Scrambles Amid Looming Ban

Plugged In Parent’s Guide to Today’s Technology equips parents to navigate the ever-shifting tech realm.

TikTok Dangerous for Minors — Leaked Docs Show Company Refuses to Protect Kids

Teen Boys Falling Prey to Financial Sextortion — Here’s What Parents Can Do

Instagram’s Sextortion Safety Measures — Too Little, Too Late?

Kid’s Online Safety Act — What It Is and Why It’s a Big Deal

Instagram Content Restrictions Don’t Work, Tests Show

Zuckerberg Implicated in Meta’s Failures to Protect Children

Surgeon General Recommends Warning on Social Media Platforms

‘The Dirty Dozen List’ — Corporations Enable and Profit from Sexual Exploitation

Four Ways to Protect Your Kids from Bad Tech, From Social Psychologist Jonathan Haidt

Parent-Run Groups Help Stop Childhood Smartphone Use

Survey Finds Teens Use Social Media More Than Four Hours Per Day — Here’s What Parents Can Do

Written by Emily Washburn · Categorized: Culture · Tagged: social media, TikTok

Jan 17 2025

Supreme Court Upholds TikTok Ban

The Supreme Court declined today to delay the enforcement of the TikTok ban-or-sell law, all but guaranteeing the social media app will cease operating in the U.S. on Sunday.

The company had requested the Supreme Court issue a temporary injunction against the Protecting Americans from Foreign Adversary Controlled Applications Act (the Act), which requires TikTok to divest from its Chinese parent company, ByteDance, in order to continue operating in the United States.

TikTok spent most of last year trying to get the law thrown out, with no success. In December, the D.C. Circuit Court of Appeals became the highest court to uphold the law as constitutional. The Supreme Court’s unanimous decision affirms the D.C. Circuit’s ruling — and echoes many of the circuit judges’ arguments.

The Act could prevent 170 million American TikTok users from expressing themselves on the platform. For it to be constitutional, the federal government had to convince the Supreme Court that the law 1) serves a legitimate government interest and 2) burdens no more speech than necessary to achieve it.

In an unsigned majority opinion, the justices found the government’s concern about TikTok’s extensive data collection, and China’s interest in that data, legitimate. The opinion reads, in part:

Even if China has not yet leveraged its relationship with ByteDance Ltd. to access U.S. TikTok users’ data, petitioners offer no basis for concluding that the Government’s determination that China might do so is not at least a “reasonable inference based on substantial evidence.”

The justices further concluded the Act was narrowly tailored because it doesn’t require TikTok to leave the U.S. — only cut ties with its Chinese ownership.

Justices Sotomayor and Gorsuch wrote separate concurrences suggesting the Act burdens free speech more directly than the majority acknowledged. Still, Justice Gorsuch concluded:

I am persuaded that the law before us seeks to serve a compelling interest: preventing a foreign country, designated by Congress and the President as an adversary of our Nation, from harvesting vast troves of personal information about tens of millions of Americans.

Gorsuch specifically references TikTok’s ability to access non-users’ data:

According to the Federal Bureau of Investigation, TikTok can access “any data” stored in a consenting user’s “contact list” — including names, photos, and other personal information about unconsenting third parties.

The ruling, while not unexpected, puts legislators in a tough position. Many, including those who supported the law, don’t want to assume blame for kicking 170 million American TikTok users off their favorite app.

President-elect Trump, in particular, appears motivated to extend the Act’s deadline so TikTok has more time to find an American buyer. He will be inaugurated just one day after TikTok shuts down.

According to Alan Rozenshtein, a former national security adviser to the Justice Department, Mr. Trump could achieve this goal by encouraging Congress to repeal the law or by instructing his nominee for attorney general, Pam Bondi, not to enforce it.

Absent a guarantee that the law will not be enforced, it’s unlikely app stores will continue to host TikTok — even for one day. The Act promises to fine noncompliant app stores up to $5,000 for every user who accesses TikTok after the shutdown deadline.

Both President Biden and Mr. Trump have also considered delaying TikTok’s shutdown date with an executive order, multiple outlets report. An anonymous White House official told Politico this strategy is likely illegal:

Our interpretation of the law that Congress passed is that absent a credible plan from the company on how they will divest, the President does not have the statutory authority to trigger the 90-day extension.

The official continued:

The company has not only not advanced such a plan, they have signaled they have no intention of selling it to an American owner.

The next couple of days promise to be interesting. One thing that won’t change? TikTok is bad for your kids. Check out the articles below to find out why.

Additional Articles and Resources

TikTok Ban Three Days Away

TikTok Scrambles Amid Looming Ban

Plugged In Parent’s Guide to Today’s Technology equips parents to navigate the ever-shifting tech realm.

TikTok Dangerous for Minors — Leaked Docs Show Company Refuses to Protect Kids

Teen Boys Falling Prey to Financial Sextortion — Here’s What Parents Can Do

Instagram’s Sextortion Safety Measures — Too Little, Too Late?

Kid’s Online Safety Act — What It Is and Why It’s a Big Deal

Instagram Content Restrictions Don’t Work, Tests Show

Zuckerberg Implicated in Meta’s Failures to Protect Children

Surgeon General Recommends Warning on Social Media Platforms

‘The Dirty Dozen List’ — Corporations Enable and Profit from Sexual Exploitation

Four Ways to Protect Your Kids from Bad Tech, From Social Psychologist Jonathan Haidt

Parent-Run Groups Help Stop Childhood Smartphone Use

Survey Finds Teens Use Social Media More Than Four Hours Per Day — Here’s What Parents Can Do

Written by Emily Washburn · Categorized: Government Updates · Tagged: social media, TikTok

Jan 16 2025

TikTok Ban Three Days Away

Time is ticking for TikTok.

The social media app has just three days to comply with the Protecting Americans from Foreign Adversary Controlled Applications Act, a law forcing it to divest from its Chinese parent company, ByteDance.

If ByteDance refuses to sell, and the Supreme Court declines to temporarily block the law, TikTok will shut down in the U.S. on Sunday.

The Act, which President Biden signed into law in April 2024, addresses bipartisan concerns that TikTok threatens U.S. security. Chinese law requires all Chinese-owned companies, including ByteDance, to make their data available to the government. That means the PRC has access to the data TikTok collects on all 170 million American users, including their names, ages, emails, phone numbers, contact lists, in-app messages, IP addresses, keystroke and in-app usage patterns, browsing and search history, location data and biometric identifiers.

China uses this data to expand its massive U.S. intelligence-gathering operation. Casey Blackburn, the Assistant Director of U.S. National Intelligence, told the D.C. Circuit Court last month:

[China] is the most active, persistent cyber espionage threat to the U.S. government, private-sector and critical networks.

These threats, Blackburn continued, include “extensive, years-long efforts to accumulate structured datasets, in particular on U.S. persons, to support its intelligence and counterintelligence operations.”

TikTok spent 2024 in court trying to get the Act declared unconstitutional, but testimony like Blackburn’s proved too convincing to overcome. In December, the D.C. Circuit Court of Appeals upheld the Act as constitutional, writing:

The First Amendment exists to protect free speech in the United States. Here the Government acted solely to protect that freedom from a foreign adversary nation and to limit that adversary’s ability to gather data on people in the United States.

TikTok has asked the Supreme Court to temporarily block the law’s enforcement, buying time to continue its appeal or seek relief from the incoming administration. President-elect Donald Trump will be inaugurated one day after the January 19 shutdown date.

The Court is expected to weigh in by Sunday.

TikTok poses more than just a national security threat: thirteen states and the District of Columbia are suing the company for the unethical treatment of children. But ahead of the looming deadline, many former TikTok critics have thrown their weight behind finding TikTok an American buyer — including the President-elect.

Politico reports TikTok CEO Shou Zi Chew will attend Mr. Trump’s inauguration on January 20. On the other side of the aisle, President Biden is considering using an executive order to extend the Act’s deadline, Forbes reports.

Caitlin Legacki, a former member of the Biden Commerce Department, broke down the situation for Politico:

Somebody is going to find a way to strike a deal, and they will be regarded as a hero of the TikTok generation…I think it speaks to a failure of both parties to actually explain to voters why this was necessary, and as a result, we’re going to probably roll back what was the correct policy.

It’s difficult to predict what will happen this Sunday, but one thing won’t change — TikTok isn’t good for children. Check out the articles below to learn how you can protect your kids from social media.

Additional Articles and Resources

TikTok Scrambles Amid Looming Ban

Plugged In Parent’s Guide to Today’s Technology equips parents to navigate the ever-shifting tech realm.

TikTok Dangerous for Minors — Leaked Docs Show Company Refuses to Protect Kids

Teen Boys Falling Prey to Financial Sextortion — Here’s What Parents Can Do

Instagram’s Sextortion Safety Measures — Too Little, Too Late?

Kid’s Online Safety Act — What It Is and Why It’s a Big Deal

Instagram Content Restrictions Don’t Work, Tests Show

Zuckerberg Implicated in Meta’s Failures to Protect Children

Surgeon General Recommends Warning on Social Media Platforms

‘The Dirty Dozen List’ — Corporations Enable and Profit from Sexual Exploitation

Four Ways to Protect Your Kids from Bad Tech, From Social Psychologist Jonathan Haidt

Parent-Run Groups Help Stop Childhood Smartphone Use

Survey Finds Teens Use Social Media More Than Four Hours Per Day — Here’s What Parents Can Do

Written by Emily Washburn · Categorized: Culture · Tagged: social media, TikTok


© 2025 Focus on the Family. All rights reserved.