
Nov 26 2025

Child Safety Advocates Push Congress to Pass the Kids Online Safety Act

Congress must pass the bipartisan Kids Online Safety Act (KOSA), child safety advocates say, so parents can better protect their children from sexual exploitation, addiction and myriad other online harms.

The bill, which Senators Marsha Blackburn (TN) and Richard Blumenthal (CT) reintroduced in May, would hold social media companies legally responsible for harming minors. Platforms governed by the bill would fulfill their legal obligations by instituting child safeguards, creating parental controls and increasing transparency.

A similar version of KOSA passed the Senate last year in a near-unanimous, 91-3 vote. It stalled in the House amid First Amendment concerns.

“[KOSA] will compel covered social media companies to center online safety and wellbeing rather than profit alone,” a group of more than 400 organizations representing parents, children, researchers, advocates and healthcare professionals wrote in an October letter encouraging legislators to pass the bill.

Though Senate Majority Leader John Thune (SD) and Minority Leader Chuck Schumer (NY) both endorse the bill, the Senate has not voted on KOSA this year.

The Act

KOSA would apply to any interactive website that primarily allows users to post and share content, including social media platforms, video posting sites like YouTube and some interactive video games.

It would require covered platforms to place automatic safeguards on minors’ accounts, like:

  • Limiting who can communicate with minors or view their profiles.
  • Prohibiting other companies from viewing or collecting minors’ data.
  • Limiting addictive features like infinite scrolling, auto-play, algorithmic content recommendations and rewards for spending time on the platform.
  • Restricting location sharing and notifying minors when location-tracking turns on.

It would also force covered platforms to offer parents tools to:

  • Manage their child’s privacy and account settings.
  • Restrict their child’s ability to make purchases or engage in financial transactions.
  • View and limit how much time their child spends on a platform.

KOSA further addresses Big Tech’s lack of transparency. Covered platforms would have to:

  • Warn parents and minors about a platform’s potential dangers.
  • Clearly disclose marketing and advertising content.
  • Explain how they create personal content recommendation algorithms — and how users can opt out.

Companies with more than 10 million users a month, on average, would additionally undergo annual, third-party audits investigating whether their platforms harm children. Parents could read auditors’ findings in mandatory safety reports.

State attorneys general and the Federal Trade Commission (FTC) could sue covered platforms for failing to uphold their legal responsibilities under KOSA. The FTC could investigate KOSA violations as “unfair or deceptive business practices.”

First Amendment Concerns

Senators Blackburn and Blumenthal adjusted this year’s version of KOSA to alleviate concerns about government censorship, which contributed to the bill’s failure last year.

Senator Mike Lee (UT), one of just three senators who voted against KOSA in 2024, explained on X:

The legislation empowers the FTC to censor any content it deems to cause “harm,” “anxiety,” or “depression,” in a way that could (and most likely would) be used to censor the expression of political, religious and other viewpoints disfavored by the FTC.

The House Committee on Energy and Commerce tried to alleviate concerns like Lee’s in September 2024 by limiting KOSA’s application to companies making more than $2.5 billion in annual revenue or hosting at least 150 million monthly users.

Though the committee eventually passed the revisions, many legislators argued the changes gutted KOSA. The bill never received a vote on the House floor.

This year’s version of the bill specifically prohibits the FTC or state attorneys general from using KOSA suits to illegally censor content. A press release announcing KOSA’s reintroduction reads, in part:

The bill text … further makes clear that KOSA would not censor, limit or remove any content from the internet, and it does not give the FTC or state Attorneys General the power to bring lawsuits over content or speech.

Supporters

Several influential advocates for children’s digital safety support KOSA, including many who regularly appear in the Daily Citizen.

“The Kids Online Safety Act is a powerful tool in parents’ defense of their children,” Tim Goeglein, Vice President of External and Government Relations for Focus on the Family, told the Daily Citizen.

Clare Morell, a fellow at the Ethics and Public Policy Center and author of The Tech Exit, writes:

Parents have been left on their own to try to fend off a massive tech-induced crisis in American childhood from online platforms that are engineered to be maximally addictive. KOSA offers a needed solution by making social media platforms responsible for preventing and mitigating certain objective harms to minors, like sexual exploitation.

Morell’s The Tech Exit offers parents a blueprint for breaking their children free from addictive technologies.

Jonathan Haidt, social psychologist and author of The Anxious Generation, argues KOSA “would begin to address the [indisputable harm occurring to children at an industrial scale].”

Haidt’s The Anxious Generation raises alarm bells about the effects of ubiquitous internet access on children’s physical, mental and social wellbeing.

Both houses of Congress must pass KOSA by the end of December. If they do not, parents will have to wait yet another year for the bill’s critical protections.

The Daily Citizen will continue covering this important story.

Additional Articles and Resources

Counseling Consultation & Referrals

Parenting Tips for Guiding Your Kids in the Digital Age

‘The Tech Exit’ Helps Families Ditch Addictive Tech — For Good

Louisiana Sues Roblox for Exposing Children to Predators, Explicit Content

Social Psychologist Finds Smartphones and Social Media Harm Kids in These Four Ways

Four Ways to Protect Your Kids from Bad Tech, From Social Psychologist Jonathan Haidt

Parent-Run Groups Help Stop Childhood Smartphone Use

The Harmful Effects of Screen-Filled Culture on Kids

‘Big Tech’ Device Designs Dangerous for Kids, Research Finds

National Center on Sexual Exploitation Targets Law Allowing Tech Companies to Profit from Online Sex Abuse

Danger in Their Pockets

Teen Boys Fall Prey to Financial Sextortion — Here’s What Parents Can Do

Proposed SCREEN Act Could Protect Kids from Porn

Proposed ‘App Store Accountability’ Act Would Force Apps and App Stores to Uphold Basic Child Safety Protections

Written by Emily Washburn · Categorized: Culture · Tagged: social media, technology

Nov 17 2025

Many Parents Still Fail to Monitor Their Kids’ Online Activity, Survey Shows

Most Americans support restricting kids’ access to social media and pornography, this year’s American Family Survey shows. But many parents remain hesitant to monitor their children’s online activity.

Brigham Young University’s annual American Family Survey asks a representative group of 3,000 American adults about economic, cultural and social concerns affecting the family. This year, participants identified “social media, video games [and] other electronic resources” as one of the top five issues families must confront.

Respondents expressed the most concern about porn and social media’s effects on young people. American adults overwhelmingly support government regulations limiting minors’ access to these products.

Per the survey:

  • More than 75% of participants support requiring pornography websites to verify the ages of their consumers.
  • Nearly 80% support requiring social media companies to obtain parents’ consent before allowing a minor to create a social media account.
  • Three in four support holding social media companies legally liable for harm caused by content marketed to minors.

Parents with children under 18 years old living at home also support making technology restrictions part of parenting norms. More than 60% of respondents in this demographic wish other families would implement rules about technology, and half said it would make setting and enforcing their own restrictions easier.  

But the survey also shows many parents don’t limit their children’s access to technology at all — let alone discuss strategies with other parents.

Surveyors asked participants with children under 18 years old in the home whether they implement any of five common technological boundaries: limiting their children’s screen time, restricting the kinds of content they consume, requiring them to keep their online accounts private, restricting who they contact and limiting who they exchange private messages with.

One in five respondents (20%) implement none of these restrictions. Two in five respondents (40%) don’t limit their kids’ screen time. Another 40% don’t police the content their children consume.

Though most participants in this demographic claimed other parents’ rules about technology would help them create and enforce their own rules, only 17% said another parent had influenced them to change a screen time restriction.

One third of respondents said they never talk about managing kids and technology with another parent. Only 13% claim to discuss it frequently.

Ubiquitous technology and internet access make parenting harder. Enforcing technological boundaries can be confusing, thankless and overwhelming — particularly when tech companies frequently undermine parental controls with few consequences.

But these obstacles do not change parents’ duty to protect their children from harmful content and technologies.

Parents, you do not have to allow your children access to smartphones or the internet. If you choose to do so, you must be prepared to:

  • Police your child’s online activity.
  • Educate yourself about parental controls and implement them to the best of your ability.
  • Warn your child about online predation and other pitfalls.
  • Model healthy relationships with technology.

Joining forces with other parents to limit children’s access to social media and smartphones can help families create and maintain healthy boundaries with technology. Take it upon yourself to initiate these partnerships. Odds are, you will not be rebuffed.

For more tips and tricks, check out Plugged In’s Parent’s Guide to Today’s Technology. For more information about technology restrictions — or ditching smartphones altogether — read the articles below.

Additional Articles and Resources

Counseling Consultation & Referrals

More than Twenty States Limit Smartphone Use in Schools

Parent-Run Groups Help Stop Childhood Smartphone Use

‘The Tech Exit’ Helps Families Ditch Addictive Tech — For Good

Four Ways to Protect Your Kids from Bad Tech, from Social Psychologist Jonathan Haidt

Social Psychologist Finds Smartphones and Social Media Harm Kids in These Four Ways

‘Big Tech’ Device Designs Dangerous for Kids, Research Finds

Survey Finds Teens Use Social Media More Than Four Hours Per Day — Here’s What Parents Can Do

Video: Seven-Year-Old’s Confidence Soars After Ordering Chick-Fil-A By Himself

5 Most Important Things OpenAI Lawsuits Reveal About ChatGPT-4o

Louisiana Sues Roblox for Exposing Children to Predators, Explicit Content

Proposed ‘App Store Accountability’ Act Would Force Apps and App Stores to Uphold Basic Child Safety Protections

Teen Boys Fall Prey to Financial Sextortion — Here’s What Parents Can Do

Proposed SCREEN Act Could Protect Kids from Porn

Written by Emily Washburn · Categorized: Culture, Family · Tagged: parenting, social media, technology

Jun 27 2025

New York Legislature Passes Bill Requiring Social Media Warning Labels

It’s no secret websites like Instagram, TikTok, Snapchat and Facebook have a chokehold on America’s youth. That’s why New York lawmakers have passed a bill requiring social media platforms to display mental health warning labels to in-state users.

Senate Bill S4505 cites a June 2024 statement from U.S. Surgeon General Vivek Murthy, calling for warning labels on “addictive” platforms:

Requiring social media apps with certain particularly noxious design features to display warning labels to all users at the point of user access … is a reasonable and necessary step to take for consumer health and safety.

The legislation references Surgeon General Murthy’s characterization of the current youth mental health crisis as a “public health emergency,” and cites various evidence to support this claim:

Research shows that social media exposure overstimulates reward centers, creating pathways comparable to those of an individual experiencing substance abuse or gambling addictions.
Leaked company documents reveal that social media companies knew that compulsive use of their products was also associated with “loss of analytical skills, memory formation, contextual thinking, conversational depth, (and) empathy.”
Among female adolescent users, the association between poor mental health and social media use is now stronger than the association between poor mental health and binge drinking, obesity, or hard drug use.

Additionally, the bill included several statistics supporting the correlation between social media and poor mental health:

  • As of 2023, 12- to 15-year-olds spend an average of 4.8 hours per day on social media platforms.
  • Today, almost half of adolescents report social media makes them feel worse about their bodies.
  • Teens with the highest levels of social media use are twice as likely to rate their mental health as either “poor” or “very poor.”
  • From 2008 to 2015, hospital visits due to suicidal ideation or attempts among adolescent social media users nearly doubled.
  • From 2011 to 2018, self-poisonings among 10- to 12-year-old girls quadrupled.
  • From 2011 to 2018, suicide rates among 10- to 14-year-old girls doubled, and hospital admissions for self-harm tripled.

State Assemblywoman Nily Rozic, a sponsor for the New York bill, commented:

By requiring clear warning labels, we’re giving families the tools to understand the risks and pushing tech companies to take responsibility for the impact of their design choices.
It’s time we prioritize mental health over engagement metrics.

Additionally, Jim Steyer, Common Sense Media Founder and CEO, stated:

We owe it to families to provide clear, evidence-based information about the consequences of excessive use.
When we learned alcohol could cause birth defects, we added warning labels for pregnant women. When nicotine was linked to cancer, we labeled every cigarette pack.
It’s time we took the same approach to social media – the latest addictive product that has kids hooked.

Last week, the New York Assembly and Senate passed Senate Bill S4505. It now awaits the signature of New York’s governor, Kathy Hochul, who previously signed bills prohibiting social media platforms from exposing teens to “addictive” algorithmic content without parental consent.

If Hochul signs it, New York will join other states, including Minnesota, Texas, Colorado and California, that have issued or are attempting to issue warnings on social media apps.

Social media platforms have opposed the New York bill, arguing that requiring warning labels on their websites would violate their free speech rights.

Specifically, NetChoice’s Amy Bos, director of state and federal affairs, recently stated:

The proposed legislation constitutes an unprecedented expansion of government power that would compel private companies to espouse the state’s preferred messaging, a clear violation of the First Amendment’s protection against compelled speech.

However, warning labels on products proven to be harmful do not violate free speech. Rather, they serve to promote truth and transparency by notifying adolescents of the risks associated with social media use.

In a world where internet reliance is ever-increasing, it is crucial for the next generation to be fully informed about the information that constructs and influences their daily lives.

Related Articles and Resources:

‘The Tech Exit’ Helps Families Ditch Addictive Tech – For Good.

New York Prepares to Restrict School Smartphone Use

Social Psychologist Finds Smartphones and Social Media Harm Kids in These Four Ways

Four Ways to Protect Your Kids from Bad Tech, From Social Psychologist Jonathan Haidt

‘Big Tech’ Device Designs Dangerous for Kids, Research Finds

Surgeon General Recommends Warning on Social Media Platforms

Written by Meredith Godwin · Categorized: Culture · Tagged: social media

Jun 18 2025

‘The Tech Exit’ Helps Families Ditch Addictive Tech — For Good.

Social media, screens and smartphones, oh my — parents everywhere are struggling to keep their kids safe in an overwhelming technological age.

Clare Morell, a tech policy expert and fellow at the Ethics and Public Policy Center, throws frazzled families a lifeline in her new book, The Tech Exit: A Practical Guide to Freeing Kids and Teens from Smartphones.

In The Tech Exit, Morell encourages parents to challenge the idea that addictive technologies are an unavoidable part of modern childhood.

She and hundreds of other “Tech Exit” families are successfully raising their children without smartphones, social media, tablets or video games. The book lays out detailed, step-by-step instructions for families to join their number.

The Tech Exit’s proposal might seem drastic — especially for families with older children already addicted to screens. Morell uses her own research and interviews with “Tech Exit” families to show that leaving tech behind is not only possible but logical.

She starts by debunking four myths about screen-based technology.

Myth 1: Screen-based technology is an inevitable part of childhood.

Morell helps create policies protecting children from dangerous technology. She gave birth to her own children just as data began showing the catastrophic effects of smartphone and social media use on child development and mental health.

The new mom didn’t want her kids to suffer the same effects — but the advice she found in parenting books didn’t seem equal to the problems at hand.

“I saw a major disconnect,” she writes in The Tech Exit. “I’d move from horrifying accounts of kids falling into sextortion schemes to advice like ‘set time limits’ [and] ‘turn on privacy settings on their accounts.’”

These aren’t bad strategies, Morell explains, but they also assume that children need access to screen-based technology. That’s not true. Her own family is proof that a “Tech Exit” is sustainable and beneficial.

Myth 2: Screen-based tech can be used in moderation.

We like to think of screens the way we think of sugar: something that can be enjoyed in moderation.

But screens aren’t like sugar. “For the developing brains of children and teens,” Morell writes, “they are more like fentanyl.”

As the Daily Citizen has previously reported, social networking apps, websites and devices — anything with a messaging or networking feature — trigger the release of dopamine, the brain’s reward chemical.

Crucially, dopamine trains the brain to “want” something but never produces feelings of satiety. Once kids get a taste of tech, they’ll always want more.

When parents bring screen-based tech into the house, they put themselves squarely between “a drug-dispensing machine and an underdeveloped brain,” as one of Morell’s interviewees puts it, and invite daily battles over its use.

“It’s an untenable, exhausting situation,” Morell writes.

Myth 3: The harms of screen-based tech can be fixed with screen-time limits.

Tech companies frequently imply parents can protect kids from screen-based technology by stopping them from spending too much time on their devices. That’s why, in part, screen-time limits are “the most prominent form of parental control [over kids’ devices],” according to Morell.

But addictive technology can negatively affect kids regardless of how much time they spend using it.

The dopamine released in just a couple of minutes of screen time can cause kids to desire tech for hours after it’s been put away. Over time, these intense chemical highs will make other, everyday pleasures seem boring.

The negative social effects of technology burden all kids and teens, regardless of their own screen use. Morell writes:

The teen mental health crisis today is due not only to negative effects of digital technologies for individuals but also to the group social dynamic that smartphones and social media have created.

Smartphones, for example, change the way kids and teens create and maintain friendships. Every kid must play by these new social rules — even if they don’t use screen-based technology.

Myth 4: Parents can protect their children from danger using parental controls.

Device and app manufacturers have financial incentives to show children inappropriate content. Thus, parental controls are unintuitive, filled with bugs and intentionally easy to manipulate.

But that’s not how they’re sold to parents. Tech companies keep young customers by convincing parents they can sufficiently protect their kids from predators, scams and inappropriate content online.

It’s almost always an exercise in frustration.

Given these intractable problems, Morell uses a startling metaphor to illustrate parental controls’ effectiveness in the digital world:

We don’t take our children to bars and strip clubs and blindfold them or have them wear earplugs. That would be absurd. We just don’t let them go to those places.

Morell’s cost-benefit analysis suggests the benefits of raising children in largely tech-free households far outweigh the costs. The Tech Exit endeavors to create a clear, sustainable path for families to do just that.

Her approach centers on FEAST, an acronym for five common principles all “Tech Exit” families she interviewed follow:

  • Find Other Families: They connect with other “Tech Exit” families.
  • Explain, Educate, Exemplify: They get their kids on board by explaining why they are getting rid of screens, educating them on the dangers of the digital world and exemplifying good digital habits.
  • Adopt Alternatives: They look for creative alternatives to smartphones and other technologies.
  • Set Up Digital Accountability and Family Screen Rules: They create rules and boundaries governing technology in the home.
  • Trade Screens for Real-Life Responsibilities: They replace time spent on screens with independent play and responsibilities.

Morell offers a treasure trove of practical, honest advice and resources to help families adopt these principles in their own lives — even when it seems impossible.

Curious about becoming a “Tech Exit” family? You can find The Tech Exit: A Practical Guide to Freeing Kids and Teens from Smartphones here.

Additional Articles and Resources

Video: Seven-Year-Old’s Confidence Soars After Ordering Chick-Fil-A By Himself

Social Psychologist Finds Smartphones and Social Media Harm Kids in These Four Ways

Four Ways to Protect Your Kids from Bad Tech, From Social Psychologist Jonathan Haidt

Parent-Run Groups Help Stop Childhood Smartphone Use

The Harmful Effects of Screen-Filled Culture on Kids

‘Big Tech’ Device Designs Dangerous for Kids, Research Finds

Pornography Age Verification Laws: What They Are and Which States Have Them

Written by Emily Washburn · Categorized: Family · Tagged: smartphone, social media, technology

May 14 2025

UPDATED: Pornography Age Verification Laws — What They Are and Which States Have Them

Arizona became the 24th state requiring pornography companies to check online consumers’ ages on May 13, joining Louisiana, Arkansas, Virginia, Utah, Montana, Texas, North Carolina, Indiana, Idaho, Florida, Kentucky, Nebraska, Georgia, Alabama, Kansas, Oklahoma, Mississippi, South Carolina, Tennessee, South Dakota, Wyoming, North Dakota and Missouri.

Twelve more states hope to pass age verification legislation before the year is up. Congress is also considering the SCREEN Act, which would institute national age verification requirements.

Described by Politico as “perhaps the most bipartisan laws in the country,” age verification laws empower parents to protect their kids by making it harder for minors to access harmful content like pornography.

Most age verification bills follow a template laid out by the Institute for Family Studies (IFS) and the Ethics and Public Policy Center (EPPC) in 2022:

  • They require companies that publish a “substantial” amount of adult content — usually one-third or more of their total production — to check the age of every person accessing their website.
  • They create a way for parents to sue pornography companies if their kids access content they shouldn’t.

Some states add social media age restrictions to their bills — another one of IFS and EPPC’s recommended policies. House File 1875 in Minnesota prevents children younger than 14 years old from creating social media accounts and requires parents to consent before 14- and 15-year-olds can sign up.

Other states, like Hawaii, have separated their legislation into two bills — one establishing an age verification requirement and another creating penalties for violators. This strategy allows representatives to approve age verification laws even if they disagree with proposed penalties.

Still other states incorporate child media protections into their bills. Ohio’s House Bill 84 would crack down on AI deepfakes by forbidding the use of “another person’s likeness to create sexual images of the other person.” Maryland’s House Bill 1212 would require manufacturers to add age verification features to all smart devices activated in the state.

Wyoming’s HB 43, now law, jettisons the “33% adult content” threshold embraced by other states. The bill requires all online companies that publish or host adult content — no matter how little — to verify consumers’ ages.

While not perfect, age verification laws greatly restrict the amount of porn young people can consume. After Louisiana became the first state to pass such legislation in 2022, traffic to Pornhub.com from that state dropped by 80%, one spokesperson told IFS.

Scroll down to see the status of age verification bills in different states. To find out more about age verification and parents’ rights legislation in your state, contact your local Focus on the Family-allied Family Policy Council.

[Map: states that have passed age verification laws and states with active age verification bills]

Age Verification Laws

Louisiana
HB 142 became law on June 15, 2022.

Arkansas
SB 66 became law on April 11, 2023.

Virginia
SB 1515 became law on May 12, 2023.

Utah
SB 0287 became law on May 4, 2023.

Montana
SB 544 became law on May 19, 2023.

Texas
HB 1181 became law on June 12, 2023.

North Carolina
HB 8 became law on September 29, 2023.

Indiana
SB 17 became law on March 13, 2024.

Idaho
HB 498 became law on March 21, 2024.

Florida
HB 3 became law on March 25, 2024.

Kentucky
HB 278 became law on April 5, 2024.

Nebraska
Online Age Verification Liability Act became law on April 16, 2024.

Georgia
SB 351 became law on April 23, 2024.

Alabama
HB 164 became law on April 24, 2024.

Kansas
SB 394 became law without the Governor’s signature on April 25, 2024.

Oklahoma
SB 1959 became law on April 26, 2024.

Mississippi
HB 1126 became law without the Governor’s signature on April 30, 2024.

South Carolina
HB 3424 became law on May 29, 2024.

Tennessee
HB 1642/SB 1792 became law on June 3, 2024.

South Dakota
HB 1053 became law on February 27, 2025.

Wyoming
HB 43 became law on March 13, 2025.

North Dakota
HB 1561 became law on April 11, 2025.

Missouri
Rule 15 CSR 60-17.010 published on May 7, 2025.

Arizona
HB 2112 became law on May 13, 2025.

Age Verification Bills

Hawaii
HB 1212: referred to House Consumer Protection and Commerce Committee on January 27.
HB 1198: referred to House Consumer Protection and Commerce Committee on January 27.

Illinois
SB 2082: introduced in the Senate on February 6.
HB 1103: referred to the Senate Rules Committee on January 9.

Iowa
HF 864 (formerly HF 62): passed House Judiciary Committee on March 4; scheduled to be read in the House on March 7.
SF 443 (formerly SF 236): passed Senate Committee on Technology on February 26.

Maryland
HB 1212: referred to House Committee on Economic Matters on February 6.
HB 394: first hearing in the House Judiciary Committee on February 5.

Minnesota
HF 1875: referred to House Committee on Commerce, Finance and Policy on March 5.
SF 2105 (HF 1434): referred to Senate Committee on Commerce and Consumer Protection on March 3.
HF 1434 (SF 2105): referred to House Committee on Commerce, Finance and Policy on February 24.

New York
S 3591: referred to Senate Committee on Internet and Technology on January 28.

Nevada
AB 294: referred to Assembly Committee on Commerce and Labor on February 25.

New Mexico
HB 44: referred to House Commerce and Economic Development Committee on January 22.

Ohio
HB 84: referred to House Technology and Innovation Committee on February 12.

West Virginia
HB 2689 (SB 293): referred to House Judiciary Committee for the third time on March 11.
SB 293 (HB 2689): referred to Senate Judiciary Committee for second time on February 25.

Oregon
HB 2032: referred to House Judiciary Committee on January 17.

Wisconsin
AB 105: first public hearing held in Assembly Committee on State Affairs on March 12.

Written by Emily Washburn · Categorized: Culture, How to Get Involved · Tagged: age verification, parental rights, social media
