Daily Citizen

social media

Feb 26 2026

Deceptive Colorado Bill Hides Kids’ Social Media Activity from Parents

Colorado’s deceptive “Protections for Youth on Social Media” bill would allow minors to hide their online activity from their parents.

HB26-1148, which Representatives Yara Zokaie and Jenny Willford introduced in the Colorado House of Representatives earlier this month, would require social media and online gaming platforms to adopt high default privacy and data protection settings for minors’ accounts.

Plenty of other child protection bills, including the federal Kids Online Safety Act (KOSA), would require social media companies to institute more protections for minors on their platforms.

Unlike other bills, however, Colorado’s “Protections for Youth on Social Media” bill would conceal kids’ data from parents and strangers alike.

HB26-1148 would require covered companies to obtain a minor user’s permission for any adult — including their parents — to view their profile, account activity, friends or location.

To protect minors’ “data privacy,” HB26-1148 instructs:

[Covered businesses] shall not permit an individual, including a parent or guardian of a covered minor, to track the location of the covered minor without providing a conspicuous signal to the covered minor when the covered minor is being monitored or tracked (emphasis added).

To add insult to injury, the bill does not require social media or online gaming platforms to meaningfully protect children from obscene or inappropriate content. Covered companies do not have to verify the ages of users seeking to access mature content.

Instead, HB26-1148 specifies it should not be “interpreted or construed to … prevent or preclude a minor from deliberately or independently searching for or specifically requesting any media.”

In other words, HB26-1148 strips parents of any ability to monitor their kids’ social media activity while empowering minors to view any content and interact with any user they want.

From the bill’s official summary, which says only that HB26-1148 would require covered companies to provide minors with “the highest level of privacy,” readers would never know it eviscerates parents’ ability to protect their kids online.

Parents familiar with bills like KOSA, which would require social media companies to adopt comprehensive parental controls, might reasonably assume Colorado’s bill exempts parents from privacy protections meant to shield children’s accounts from strangers and internet predators.

Unfortunately, they would be wrong.

Colorado’s “Protections for Youth on Social Media” bill may not directly affect all families, but legislators’ deceptive description of it should remind all parents to read the fine print.

Families simply cannot afford to take legislators’ word that a proposed bill is pro-family or pro-child. That’s why the Daily Citizen offers timely analysis and resources to help busy families stay abreast of important policy issues.

For more information about legislative happenings in your state — and how you can get involved — connect with your Focus on the Family-allied state policy group.

Additional Articles and Resources

How to Get In Touch With Your State Policy Group

Counseling Consultation & Referrals

Introducing Our Parents’ Guide to Technology 2026

Parenting Tips for Guiding Your Kids in the Digital Age

Child Safety Advocates Push Congress to Pass the Kids Online Safety Act

‘The Tech Exit’ Helps Families Ditch Addictive Tech — For Good

Parent-Run Groups Help Stop Childhood Smartphone Use

The Harmful Effects of Screen-Filled Culture on Kids

Social Psychologist Finds Smartphones and Social Media Harm Kids in These Four Ways

Four Ways to Protect Your Kids from Bad Tech, From Social Psychologist Jonathan Haidt

Australia Bans Kids Under 16 Years Old From Social Media

National Center on Sexual Exploitation Targets Law Allowing Tech Companies to Profit from Online Sex Abuse

Written by Emily Washburn · Categorized: Culture · Tagged: social media

Feb 20 2026

Meta’s Mark Zuckerberg Denies Instagram is Addictive in Social Media Trial Testimony

Meta’s Mark Zuckerberg denied social media is addictive Wednesday in his testimony at America’s first social media addiction trial.

The highly anticipated case will determine whether social media companies like Meta can be held legally liable for creating addictive products.

The complaint contends Instagram, a social media app owned by Meta, contributed to the social media addiction of a young woman named Kayley (initials KGM). Kayley’s alleged addiction exposed her to child predators and caused her to develop depression, anxiety and body dysmorphia.

The question before the jury is whether Instagram and YouTube, the defendants, are addictive.  

Kayley and her lawyers argue the platforms themselves cause addiction with features like infinite scroll, content recommendation algorithms and beauty filters.

Meta and Zuckerberg say the content Kayley saw on Instagram, in addition to her preexisting trauma and mental health conditions, caused her distress — not Instagram’s design.

Importantly, Section 230 of the Communications Decency Act protects online content hosts, like Instagram, from being held liable for content users post to their sites.

Zuckerberg bobbed and weaved with characteristic agility during his testimony Wednesday, staunchly rejecting any suggestion Instagram could be addictive.

“I’m not sure what to say to that,” Zuckerberg told Mark Lanier, Kayley’s lawyer, after Lanier asked whether addictive products increase usage. “I don’t think that applies here.”

Zuckerberg focused on convincing the jury Meta does not benefit from addicting its users.

“There’s a misconception that the more attention the company captures, and the more time people spend on its apps, the better it is for Meta’s bottom line, regardless of the harms they may encounter,” NPR reported, paraphrasing the Meta founder.

He continued:

If people feel like they’re not having a good experience, why would they keep using the product?

Here, Zuckerberg assumes users can leave Instagram at will. But the crux of the case against Meta is that Instagram is addictive, making it difficult or impossible for dependent users to quit without help.

Kayley’s mom fears social media “has changed the way [her daughter’s] brain works.”

“She has no long-term memory,” she wrote in a filing reviewed by the Los Angeles Times.

“She can’t live without a phone. She is willing to go to battle if you were even to touch her phone.”

Meta makes money from monopolizing users’ time — which means the company has a financial incentive to make its products addictive.

Internal documents referenced by Lanier show Instagram and Zuckerberg responded to this incentive on several occasions.

In 2016, when Instagram was competing with Snapchat for teen users, Zuckerberg reportedly “directed executives to focus on getting teenagers to spend more time on the company’s platforms,” according to The New York Times.

Another document from 2018 reads: “If we wanna win big with teens, we must bring them in as tweens.” Instagram’s user policy supposedly excludes children under 13 years old.

Zuckerberg argued this incentive structure no longer exists. Now, Instagram focuses on making the platform useful to users.

“If something is valuable, people will use it more because it’s useful to them,” he explained.

But therein lies the problem. Though Instagram may no longer explicitly focus on increasing usage, Zuckerberg’s testimony indicates the amount of time users spend on Instagram remains the platform’s ultimate metric of success.  

In other words, Meta still has an incentive to make its products addictive — whether Zuckerberg admits it or not.

Meta’s incentive structure speaks to the company’s motivations. If Zuckerberg can help convince the jury Meta does not benefit from addicting its users, then the social media company can claim it has no reason to create harmful products.

The argument is a subtle one, particularly compared to the ever-growing mountain of evidence showing social media harms children.

Meta’s history of releasing faulty or insubstantial child safety protections further undermines Zuckerberg’s claim that Meta does not pursue increased usage at any cost.

When Instagram launched new tools addressing sextortion in October 2024, the National Center on Sexual Exploitation (NCOSE) warned:

Mark Zuckerberg and Meta leadership have a very long track record of explicitly making the choice not to rectify harms it knows it is both causing and perpetuating on children … or only doing so when forced by public pressure, bad press or Congressional hearings.

Senator Marsha Blackburn (TN), a co-author of the bipartisan Kids Online Safety Act, communicated a similar sentiment in her remarks on Zuckerberg’s testimony.

“To no one’s surprise, Mark Zuckerberg followed his usual playbook of denial and deceit while sitting just a few steps away from parents who have tragically lost their children as a consequence of the way his platforms are designed to harm young users,” Blackburn said, continuing:

These companies are using the same playbook as Big Tobacco did decades ago by trying to keep kids hooked on products that hurt them.

Kayley’s case is the first of nine bellwethers, or test cases, for an estimated 1,600 similar civil cases filed against social media companies in California state court. Trial proceedings for the first of a large group of federal social media addiction cases will begin this summer.

Additional Articles and Resources

Social Media Addiction Suits go to Trial — Here’s What You Need to Know

America Controls TikTok Now—But Is It Really Safe?

X’s ‘Grok’ Generates Pornographic Images of Real People on Demand

Australia Bans Kids Under 16 Years Old From Social Media

National Center on Sexual Exploitation Targets Law Allowing Tech Companies to Profit from Online Sex Abuse

TikTok Dangerous for Minors — Leaked Docs Show Company Refuses to Protect Kids

Instagram’s Sextortion Safety Measures — Too Little, Too Late?

Key Takeaways From Zuckerberg’s Tell-All

Zuckerberg Implicated in Meta’s Failures to Protect Children

Instagram Content Restrictions Don’t Work, Tests Show

Surgeon General Recommends Warning on Social Media Platforms

Horrifying Instagram Investigation Indicts Modern Parenting

Survey Finds Teens Use Social Media More Than Four Hours Per Day — Here’s What Parents Can Do

Child Safety Advocates Push Congress to Pass the Kids Online Safety Act

Many Parents Still Fail to Monitor Their Kids’ Online Activity, Survey Shows

Proposed ‘App Store Accountability’ Act Would Force Apps and App Stores to Uphold Basic Child Safety Protections

‘The Tech Exit’ Helps Families Ditch Addictive Tech — For Good

President Donald Trump, First Lady Sign ‘Take It Down’ Act

First Lady Supports Bill Targeting Deepfakes, Sextortion and Revenge Porn

Written by Emily Washburn · Categorized: Culture · Tagged: social media

Feb 06 2026

Social Media Addiction Suits go to Trial — Here’s What You Need to Know

Social media companies face a legal reckoning this year as juries begin hearing the social media addiction lawsuits.

The long-awaited wave of cases will determine whether companies like Meta, YouTube, TikTok and their peers can face legal consequences for creating defective products.

The social media addiction lawsuits refer to thousands of civil cases alleging social media companies like Meta, YouTube and TikTok designed and released addictive products without adequate warnings, causing personal and financial injury.

The cases parallel the product liability lawsuits against tobacco companies, which eventually paid $206 billion for damage caused by addictive cigarettes.

The strongest cases are bundled into two groups — cases filed in federal court and cases filed in California state court. The federal cases primarily represent school districts and states which claim they foot the bill for young people’s social media addiction. The first of these cases will go to trial this summer.

The cases filed in California represent an estimated 1,600 individuals, families and school districts which claim social media addictions caused them personal injury.

The first of these cases — a 2023 suit filed on behalf of a young woman named KGM and several other plaintiffs — began trial proceedings on January 27.

KGM allegedly developed depression, anxiety and body-image issues after years of using YouTube, Instagram, TikTok (formerly Musical.ly) and Snapchat. She also became a victim of sextortion on Instagram, where a predator shared explicit photos of her as a minor.

The New York Post, which reviewed KGM’s complaint, describes her alleged experience with Meta following her victimization:

When KGM’s family reported her alleged sextortion to Meta … the company did nothing, [KGM’s] complaint claims, and instead allowed the person to continue committing harm via “explicit images of a minor child.”
According to the court filing, it took multiple family members and friends “spamming” Instagram’s moderation system in a coordinated, two-week effort before Meta handled it.

Predators met their victims on Instagram in nearly half (45%) of all the sextortion reports filed with the National Center for Missing and Exploited Children between August 2020 and August 2023.

Of the scammers who threatened to share explicit photos of minors, 60% threatened to do so on Instagram.

When Instagram launched new tools to address sextortion in October 2024 — long after KGM’s experience — the National Center on Sexual Exploitation (NCOSE) warned:

Mark Zuckerberg and Meta leadership have a very long track record of explicitly making the choice not to rectify harms it knows it is both causing and perpetuating on children … or only doing so when forced by public pressure, bad press or Congressional hearings.

KGM’s mom allegedly used third-party parental control apps to keep her daughter from using social media. It didn’t work. Per the complaint, Meta, YouTube, TikTok and Snap “design their products in a manner that enables children to evade parental consent.”

Social media “has changed the way [my daughter’s] brain works,” KGM’s mom wrote in a filing reviewed by the Los Angeles Times, continuing:

She has no long-term memory. She can’t live without a phone. She is willing to go to battle if you were even to touch her phone.

Snap, the company behind Snapchat, and TikTok settled with KGM in January. Meta and YouTube are expected to proceed with the trial.

KGM’s case is the first of nine California bellwether cases, test cases that determine whether a novel legal theory will hold up in court.

The mystery is whether KGM’s argument can circumvent Section 230 of the Communications Decency Act, which prevents online platforms from being sued for users’ posts.

Social media companies have thus far escaped product liability accusations using Section 230. In a November motion to keep KGM’s case from going before a jury, Meta blamed the harm KGM experienced on content posted to Instagram, for which it argued it could not be held responsible.

Judge Carolyn B. Kuhl, who oversees the state cases, ruled against the social media juggernaut, finding KGM had presented evidence indicating Instagram itself — not content posted to Instagram — caused her distress.

“Meta certainly may argue to the jury that KGM’s injuries were caused by content she viewed,” Judge Kuhl wrote. “But Plaintiff has presented evidence that features of Instagram that draw the user into compulsive viewing of content were a substantial factor in causing her harm.”

“The cause of KGM’s harms is a disputed factual question that must be resolved by the jury,” she concluded.

If KGM or any other bellwether case wins, Section 230 may no longer offer unconditional protection for social media companies which behave badly.

The social media addiction cases could send social media companies an expensive message about prioritizing child wellbeing. But parents shouldn’t rely on companies’ altruism (or grudging compliance with court orders) to protect their kids.

Social media is not a safe place for children. Parents should seriously consider keeping their children off it.

To read more of the Daily Citizen’s reporting on the effects of social media on children, read the articles linked below.

Additional Articles and Resources

America Controls TikTok Now—But Is It Really Safe?

X’s ‘Grok’ Generates Pornographic Images of Real People on Demand

Australia Bans Kids Under 16 Years Old From Social Media

National Center on Sexual Exploitation Targets Law Allowing Tech Companies to Profit from Online Sex Abuse

TikTok Dangerous for Minors — Leaked Docs Show Company Refuses to Protect Kids

Instagram’s Sextortion Safety Measures — Too Little, Too Late?

Key Takeaways From Zuckerberg’s Tell-All

Zuckerberg Implicated in Meta’s Failures to Protect Children

Instagram Content Restrictions Don’t Work, Tests Show

Surgeon General Recommends Warning on Social Media Platforms

Horrifying Instagram Investigation Indicts Modern Parenting

Survey Finds Teens Use Social Media More Than Four Hours Per Day — Here’s What Parents Can Do

Child Safety Advocates Push Congress to Pass the Kids Online Safety Act

Many Parents Still Fail to Monitor Their Kids’ Online Activity, Survey Shows

Proposed ‘App Store Accountability’ Act Would Force Apps and App Stores to Uphold Basic Child Safety Protections

‘The Tech Exit’ Helps Families Ditch Addictive Tech — For Good

President Donald Trump, First Lady Sign ‘Take It Down’ Act

First Lady Supports Bill Targeting Deepfakes, Sextortion and Revenge Porn

Written by Emily Washburn · Categorized: Culture · Tagged: social media

Jan 27 2026

America Controls TikTok Now — But Is It Really Safe?

TikTok’s Chinese parent company, ByteDance, relinquished control of American TikTok to the U.S. late last week.

The transaction staves off the looming TikTok ban without sacrificing national security. The platform itself, however, remains as dangerous for America’s 170 million users as ever.

TikTok USDS Joint Venture, an American-controlled company, took over the American divisions of TikTok, Lemon8, CapCut and several other ByteDance apps on Thursday, January 22.

American investors own 50% of the new company. ByteDance retains 19.9% ownership and the remaining 30% belongs to ByteDance investors.

American control of the joint venture solves three national security threats caused by Chinese ownership of ByteDance.

First and foremost, American users’ data will be hosted on U.S.-based servers and protected by U.S. cybersecurity companies.

TikTok always claimed to keep the extensive data it collected on Americans secure. But Chinese law requires companies like ByteDance to make their data available to the government.

A congressional investigatory committee determined Chinese officials had mined Americans’ TikTok data on multiple occasions. The mined data included:

  • Names
  • Ages
  • Emails
  • Phone numbers
  • Contact lists
  • In-app messages and usage patterns
  • IP addresses
  • Keystroke patterns
  • Browsing and search history
  • Location data
  • Biometric information like face- and voiceprints

Chinese access to ByteDance’s data extended to TikTok’s powerful content recommendation algorithm. TikTok USDS will reset the algorithm and retrain it on American content alone, ensuring China can no longer manipulate what U.S. citizens see on TikTok.

ByteDance will no longer perform content moderation under TikTok USDS, further preventing China from censoring or influencing the success of Americans’ posts.

ByteDance’s divestiture of American TikTok satisfies the Protecting Americans from Foreign Adversary Controlled Applications Act — the ban-or-sell law which required the company to sell majority ownership of TikTok for the app to remain available in America.

President Donald Trump signed an executive order delaying enforcement of the ban in January 2025, giving TikTok time to negotiate with American buyers. When ByteDance and U.S. investors established a framework for the deal in September, the president gave the parties another 120 days to sign on the dotted line.

The launch of TikTok USDS Joint Venture on January 22 came just one day before the deadline.

American control of TikTok might well protect citizens from global security threats. It does not, however, protect users from TikTok itself.

Thirteen states and the District of Columbia sued TikTok in October 2024 for illegally collecting and monetizing American children’s data.

The suits estimated that as much as 35% of TikTok’s American ad revenue under ByteDance came from children and teens. Importantly, ByteDance will retain control of TikTok’s e-commerce, marketing and advertising under TikTok USDS.

The 2024 lawsuits exposed documents showing TikTok not only knew its app was addictive, but that compulsive use in teens caused “a slew of negative mental health effects like loss of analytical skills, memory formation, contextual thinking, conversational depth, empathy, increased anxiety … [and interference] with essential responsibilities like sufficient sleep, work/school responsibilities and connecting with loved ones.”

An estimated 95% of American smartphone users under 17 years old use TikTok, according to one of the platform’s own reports.  

Still, TikTok workers did not consider it their responsibility to limit minors’ use of the platform — even when creating a tool allowing parents to set TikTok time limits for their kids.

“Our goal is not to reduce the time spent [on TikTok],” a project manager for the tool wrote in an employee group chat.

“[The goal is] to contribute to daily active users and retention [of other users],” another chimed in.

TikTok evaluated the success of the time limit tool based on one metric alone: “Whether it improved public trust in the TikTok platform via media coverage.”

TikTok doesn’t just fail to protect kids — it targets them. When the Apple App Store challenged TikTok’s 12-and-up age rating in 2022, arguing its “frequent or intense mature or suggestive content” warranted a 17-and-up rating, TikTok refused to change it.

The Daily Citizen supports policies which protect American families from foreign threats. But make no mistake — TikTok remains a dangerous place, particularly for young users.

Parents should think long and hard before allowing their kids to take part.

Additional Articles and Resources

Plugged In Parent’s Guide to Today’s Technology 

TikTok Dangerous for Minors — Leaked Docs Show Company Refuses to Protect Kids

Proposed ‘App Store Accountability’ Act Would Force Apps and App Stores to Uphold Basic Child Safety Protections

TikTok Scrambles Amid Looming Ban

Trump Revives TikTok

Supreme Court Upholds TikTok Ban

National Center on Sexual Exploitation Targets Law Allowing Tech Companies to Profit from Online Sex Abuse

Tim Tebow to Parents: ‘Please Be the Protectors You’re Called to Be’

Written by Emily Washburn · Categorized: Culture · Tagged: social media, TikTok

Jan 16 2026

UPDATED: Pornography Age Verification Laws — What They Are and Which States Have Them

Half of all states — Louisiana, Arkansas, Virginia, Utah, Montana, Texas, North Carolina, Indiana, Idaho, Florida, Kentucky, Nebraska, Georgia, Alabama, Kansas, Oklahoma, Mississippi, South Carolina, Tennessee, South Dakota, Wyoming, North Dakota, Missouri, Arizona and Ohio — require pornography companies to verify the ages of their online consumers.

Ten more states hope to pass age verification legislation in 2026.

Described by Politico as “perhaps the most bipartisan laws in the country,” age verification laws help parents protect their kids by making it harder for minors to access adult content online.

Most age verification bills:

  • Require companies that publish a “substantial” amount of adult content — usually one-third or more of their total output — to check the age of every person accessing their website.
  • Create a way for parents to sue pornography companies if their kids access content they shouldn’t.

The Supreme Court found age verification requirements like these constitutional in June 2025, silencing critics who argue they infringe on free speech and privacy rights.

While most age verification laws contain the same basic components, few are identical.

Some states add age-verification requirements for social media companies. Minnesota’s House File 1875 would require social media companies to exclude children younger than 14 from their platforms.

Michigan’s Senate Bill 284 would require manufacturers like Apple to verify device users’ ages and communicate that information to other apps and websites.

Wyoming’s HB 43, now law, requires all online websites which publish or host adult content — no matter how little — to verify consumers’ ages.

States also employ different strategies to pass age verification bills.

Ohio rolled its age verification law into the bill establishing the state’s 2026-2027 budget. Missouri legislators introduced five bills this month to build on the state’s existing age verification regulations.

Hawaii separated its legislation into two bills — one establishing age verification requirements and another creating penalties for violators — so representatives could approve the requirements even if they disagreed with proposed penalties.

While not perfect, age verification laws greatly restrict the amount of porn young people can access. After Louisiana became the first state to pass such legislation in 2022, traffic to Pornhub.com from that state dropped by 80%, one spokesperson told the Institute for Family Studies.

Scroll down to see the status of age verification bills in different states. To find out more about age verification and parents’ rights legislation in your state, contact your local Focus on the Family-allied Family Policy Council.

States in dark blue have passed age verification laws. States in light blue have active age verification bills. Missouri has both passed and pending age verification legislation.
Age Verification Laws

Louisiana
HB 142 became law on June 15, 2022.
Date effective: January 1, 2023

Arkansas
SB 66 became law on April 11, 2023.
Date effective: July 31, 2023

Virginia
SB 1515 became law on May 12, 2023.
Date effective: July 1, 2023

Utah
SB 0287 became law on May 4, 2023.
Date effective: May 3, 2023

Montana
SB 544 became law on May 19, 2023.
Date effective: January 1, 2024

Texas
HB 1181 became law on June 12, 2023.
Date effective: September 19, 2023

North Carolina
HB 8 became law on September 29, 2023.
Date effective: January 1, 2024

Indiana
SB 17 became law on March 13, 2024.
Date effective: August 16, 2024

Idaho
HB 498 became law on March 21, 2024.
Date effective: July 1, 2024

Florida
HB 3 became law on March 25, 2024.
Date effective: January 1, 2025

Kentucky
HB 278 became law on April 5, 2024.
Date effective: July 15, 2024

Nebraska
Online Age Verification Liability Act became law on April 16, 2024.
Date effective: July 18, 2024

Georgia
SB 351 became law on April 23, 2024.
Date effective: July 1, 2025

Alabama
HB 164 became law on April 24, 2024.
Date effective: October 1, 2024

Kansas
SB 394 became law without the Governor’s signature on April 25, 2024.
Date effective: July 1, 2024

Oklahoma
SB 1959 became law on April 26, 2024.
Date effective: November 1, 2024

Mississippi
HB 1126 became law without the Governor’s signature on April 30, 2024.
Date effective: July 1, 2024

South Carolina
HB 3424 became law on May 29, 2024.
Date effective: January 1, 2025

Tennessee
HB 1642/SB 1792 became law on June 3, 2024.
Date effective: January 13, 2025

South Dakota
HB 1053 became law on February 27, 2025.
Date effective: July 1, 2025

Wyoming
HB 43 became law on March 13, 2025.
Date effective: July 1, 2025

North Dakota
HB 1561 became law on April 11, 2025.
Date effective: August 1, 2025

Missouri
Rule 15 CSR 60-17.010 published on May 7, 2025.
Date effective: November 30, 2025

Arizona
HB 2112 became law on May 13, 2025.
Date effective: September 26, 2025

Ohio
HB 96 became law on June 30, 2025.
Date effective: September 30, 2025

Age Verification Bills

Hawaii
HB 1212: carried over to the 2026 session on December 8, 2025.
HB 1198: carried over to the 2026 session on December 8, 2025.

Iowa
HF 864 (formerly HF 62): placed on subcommittee calendar for the Senate Committee on Technology on January 13.
SF 443 (formerly SF 236): referred to Senate Committee on Technology on June 16, 2025.

Michigan
SB 901: referred to Senate General Laws Committee on January 8.
SB 284 (HB 4429): referred to the Senate Committee on Finance, Insurance and Consumer Protection on May 6, 2025.
HB 4429 (SB 284): referred to House Committee on Regulatory Reform on September 18, 2025.

Minnesota
HF 1875: referred to House Committee on Commerce, Finance and Policy on March 5, 2025.
SF 2105 (HF 1434): referred to Senate Committee on Commerce and Consumer Protection on March 3, 2025.
HF 1434 (SF 2105): referred to House Committee on Commerce, Finance and Policy on February 24, 2025.

Missouri
HB 1878: referred to House Committee on General Laws on January 8.
HB 1839: referred to House Committee on Children and Families on January 15.
SB 901: referred to Senate General Laws Committee on January 8.
SB 1346: read in the senate on January 7.
SB 1412: read in the senate on January 7.


New Hampshire
SB 648: heard by Senate Judiciary Committee on January 8.

New Jersey
S 1826: referred to Senate Judiciary Committee on January 13.

New York
S 3591 (A 03946): referred to Senate Committee on Internet and Technology on January 7.
A 03946 (S 3591): referred to Assembly Consumer Affairs and Protection Committee on January 7.

Pennsylvania
HB 1513: referred to House Communications and Technology Committee on May 29, 2025.
SB 603: referred to Senate Judiciary Committee on April 9, 2025.

Washington
HB 2112: heard in the House Committee on Consumer Protection and Business on January 16.

Wisconsin
AB 105: second amendment proposed in the senate on January 7.

Written by Emily Washburn · Categorized: Culture, How to Get Involved · Tagged: age verification, parental rights, social media


Privacy Policy and Terms of Use | © 2026 Focus on the Family. All rights reserved.