Daily Citizen


Mar 06 2026

New Mexico Accuses Meta of Egregious Harm to Children in Court Case

Trial proceedings in New Mexico’s court case against Meta will conclude later this month, with a judge to decide whether the beleaguered social media company violated state law by exposing minors to explicit content, social media addiction and sexual exploitation.

The state’s 228-page complaint, which New Mexico Attorney General Raul Torrez filed in 2023, alleges Meta’s platforms target minors with addictive features and “knowingly expose them to the twin dangers of sexual exploitation and mental health harm.”

If First Judicial District Court Judge Bryan Biedscheid rules in New Mexico’s favor, Meta could face hundreds of millions of dollars in fines for violating state law protecting consumers from unfair and deceptive business practices.

The trial is expected to conclude on March 27.

New Mexico’s case is the first stand-alone state suit against Meta. It includes evidence from a months-long undercover operation in which New Mexico officers posed as children on Facebook and Instagram.

The investigation indicated Meta’s platforms:

  • Show underage users sexually explicit content without prompting.
  • Allow adult predators to contact children and sexually exploit them.
  • Facilitate the spread and exchange of child pornography.

In some cases, the state claims, Facebook recommended children join groups “devoted to facilitating commercial sex.”

In another case, investigators say Meta “allowed a fictitious mother [to] offer her 13-year-old daughter for sale to sex traffickers and to create a professional page to allow her daughter to share revenue from advertising.”

“Our investigation into Meta’s social media platforms demonstrates that they are not safe spaces for children but rather prime locations for predators to trade child pornography and solicit minors for sex,” Attorney General Torrez summarized in a press release.

Internal documents from Meta and testimony from former employees bolster New Mexico’s claims. Shortly before the trial began on February 9, the state published several troubling communications from a former Meta employee who worked in child safety.

In a June 2020 email reviewed by the New York Post, the employee revealed sexual predators on Meta platforms target “[approximately] 500k victims per day in English markets only.”

“We expect the true situation is worse,” she confided.

In another email, the employee considered the impact of giving such a large user base access to children.

“I just think nowhere in the history of humanity could you have a secret conversation with 1,000 people,” she wrote. “I’m actually scared of the ramifications here.”

Two-time Meta employee Arturo Béjar testified against his former employer on February 12. Béjar first left Meta in 2015. He returned in 2019 to strengthen safety on Meta’s platforms after someone sent his own daughter explicit photos online.

As far as Béjar could tell, Meta wasn’t interested in prioritizing safety.

“So many examples of people with good ideas for good things that would reduce harm within, as it got reviewed and went through the pipeline, would get pushed down,” Béjar testified, according to KOAT.

In 2021, Béjar surveyed more than 237,000 Instagram users between 13 and 15 years old to determine what kinds of harm they faced on social media. One in three reported witnessing cyberbullying. One in 10 said they themselves had experienced bullying online. One in five reported seeing explicit images.

Though Béjar said he shared his findings with Meta CEO Mark Zuckerberg and other top executives, he claimed the company continued to prioritize profit.

“I think they [the executives] really care about making people think that they care, but I think in practice they don’t care,” Béjar mused, per KOAT.

“Caring is the moment you become aware of something, you engage with it, you understand it, you work on it, you do things that make it better.”

Meta, for its part, maintains New Mexico’s undercover investigation was “ethically compromised.” In opening arguments, the company claimed the state “cherry-picked” evidence which doesn’t accurately reflect its safety protocols.

Judge Biedscheid denied Meta’s request to dismiss New Mexico’s case in May 2024. He subsequently denied its pretrial motion to exclude evidence from the state’s undercover investigation.

Meta’s biggest problem is that New Mexico’s allegations echo those from thousands of other lawsuits against the social media company. More than 1,600 civil cases accusing Meta and other social media platforms of harming children — known as the social media addiction lawsuits — are awaiting trial in California state court.

Trial for the first of these cases began in Los Angeles in January.

Dozens more federal cases, including many brought against Meta by state governments, will make their way into courtrooms starting this summer.

Instagram, in particular, has long been linked to sextortion and inappropriate content. Predators met their victims on Instagram in nearly half (45%) of all the sextortion reports filed with the National Center for Missing and Exploited Children between August 2020 and August 2023.

Of the scammers who threatened to share explicit photos of minors, 60% threatened to do so on Instagram.

In January 2024, Meta announced all teen accounts would begin automatically filtering out inappropriate content. Journalists from The Wall Street Journal soon discovered the feature didn’t work.

“Instagram regularly recommends sexual videos to accounts for teenagers that appear interested in racy content, and does so within minutes of when they first log in,” the outlet wrote, continuing:

Within a half-hour of its creation, a new 13-year-old test account that watched only Instagram-recommended videos featuring women began being served video after video about anal sex.

Regardless of whether New Mexico triumphs against Meta in court, evidence from the state’s case clearly illustrates why minors should not be allowed on Instagram and Facebook unsupervised.

The Daily Citizen will continue covering America’s legal reckoning with social media and the harm it causes children.

Additional Articles and Resources

Social Media Addiction Suits go to Trial — Here’s What You Need to Know

Meta’s Mark Zuckerberg Denies Instagram is Addictive in Social Media Trial Testimony

Australia Bans Kids Under 16 Years Old From Social Media

National Center on Sexual Exploitation Targets Law Allowing Tech Companies to Profit from Online Sex Abuse

Instagram’s Sextortion Safety Measures — Too Little, Too Late?

Key Takeaways From Zuckerberg’s Tell-All

Zuckerberg Implicated in Meta’s Failures to Protect Children

Instagram Content Restrictions Don’t Work, Tests Show

Surgeon General Recommends Warning on Social Media Platforms

Horrifying Instagram Investigation Indicts Modern Parenting

Written by Emily Washburn · Categorized: Culture · Tagged: social media

Feb 26 2026

Deceptive Colorado Bill Hides Kids’ Social Media Activity from Parents

Colorado’s deceptive “Protections for Youth on Social Media” bill would allow minors to hide their online activity from their parents.

HB26-1148, which Representatives Yara Zokaie and Jenny Willford introduced in the Colorado House of Representatives earlier this month, would require social media and online gaming platforms to adopt high default privacy and data protection settings for minors’ accounts.

Plenty of other child protection bills, including the federal Kids Online Safety Act (KOSA), would require social media companies to institute more protections for minors on their platforms.

Unlike other bills, however, Colorado’s “Protections for Youth on Social Media” bill would conceal kids’ data from parents and strangers alike.

HB26-1148 would require covered companies to obtain a minor user’s permission for any adult — including their parents — to view their profile, account activity, friends or location.

To protect minors’ “data privacy,” HB26-1148 instructs:

[Covered businesses] shall not permit an individual, including a parent or guardian of a covered minor, to track the location of the covered minor without providing a conspicuous signal to the covered minor when the covered minor is being monitored or tracked (emphasis added).

To add insult to injury, the bill does not require social media or online gaming platforms to meaningfully protect children from obscene or inappropriate content. Covered companies do not have to verify the ages of users seeking to access mature content.

Instead, HB26-1148 specifies it should not be “interpreted or construed to … prevent or preclude a minor from deliberately or independently searching for or specifically requesting any media.”

In other words, HB26-1148 strips parents of any ability to monitor their kids’ social media activity while empowering minors to view any content and interact with any user they want.

Readers of the bill’s official summary, which says only that HB26-1148 would require covered companies to provide minors with “the highest level of privacy,” would never know it eviscerates parents’ ability to protect their kids online.

Parents familiar with bills like KOSA, which would require social media companies to adopt comprehensive parental controls, might reasonably assume Colorado’s bill would exempt parents from privacy protections meant to shield children’s accounts from strangers and internet predators.

Unfortunately, they would be wrong.

Colorado’s “Protections for Youth on Social Media” bill may not directly affect all families, but legislators’ deceptive description of it should remind all parents to read the fine print.

Families simply cannot afford to take legislators’ word that a proposed bill is pro-family or pro-child. That’s why the Daily Citizen offers timely analysis and resources to help busy families stay abreast of important policy issues.

For more information about legislative happenings in your state — and how you can get involved — connect with your Focus on the Family-allied state policy group.

Additional Articles and Resources

How to Get In Touch With Your State Policy Group

Counseling Consultation & Referrals

Introducing Our Parents’ Guide to Technology 2026

Parenting Tips for Guiding Your Kids in the Digital Age

Child Safety Advocates Push Congress to Pass the Kids Online Safety Act

‘The Tech Exit’ Helps Families Ditch Addictive Tech — For Good

Parent-Run Groups Help Stop Childhood Smartphone Use

The Harmful Effects of Screen-Filled Culture on Kids

Social Psychologist Finds Smartphones and Social Media Harm Kids in These Four Ways

Four Ways to Protect Your Kids from Bad Tech, From Social Psychologist Jonathan Haidt

Australia Bans Kids Under 16 Years Old From Social Media

National Center on Sexual Exploitation Targets Law Allowing Tech Companies to Profit from Online Sex Abuse

Written by Emily Washburn · Categorized: Culture · Tagged: social media

Feb 20 2026

Meta’s Mark Zuckerberg Denies Instagram is Addictive in Social Media Trial Testimony

Meta’s Mark Zuckerberg denied social media is addictive Wednesday in his testimony at America’s first social media addiction trial.

The highly anticipated case will determine whether social media companies like Meta can be held legally liable for creating addictive products.

The complaint contends Instagram, a social media app owned by Meta, contributed to the social media addiction of a young woman named Kayley (initials KGM). Kayley’s alleged addiction exposed her to child predators and caused her to develop depression, anxiety and body dysmorphia.

The question before the jury is whether Instagram and YouTube, the defendants, are addictive.  

Kayley and her lawyers argue the platforms themselves cause addiction with features like infinite scroll, content recommendation algorithms and beauty filters.

Meta and Zuckerberg say the content Kayley saw on Instagram, in addition to her preexisting trauma and mental health conditions, caused her distress — not Instagram’s design.

Importantly, Section 230 of the Communications Decency Act protects online content hosts, like Instagram, from being held liable for content users post to their sites.

Zuckerberg bobbed and weaved with characteristic agility during his testimony Wednesday, staunchly refusing any suggestion Instagram could be addictive.

“I’m not sure what to say to that,” Zuckerberg told Mark Lanier, Kayley’s lawyer, after Lanier asked whether addictive products increase usage. “I don’t think that applies here.”

Zuckerberg focused on convincing the jury Meta does not benefit from addicting its users.

“There’s a misconception that the more attention the company captures, and the more time people spend on its apps, the better it is for Meta’s bottom line, regardless of the harms they may encounter,” NPR paraphrased the Meta founder.

He continued:

If people feel like they’re not having a good experience, why would they keep using the product?

Here, Zuckerberg assumes users can leave Instagram at will. But the crux of the case against Meta is that Instagram is addictive, making it difficult or impossible for dependent users to quit without help.

Kayley’s mom fears social media “has changed the way [her daughter’s] brain works.”

“She has no long-term memory,” she wrote in a filing reviewed by the Los Angeles Times.

“She can’t live without a phone. She is willing to go to battle if you were even to touch her phone.”

Meta makes money from monopolizing users’ time — which means the company has a financial incentive to make its products addictive.

Internal documents referenced by Lanier show Instagram and Zuckerberg responded to this incentive on several occasions.

In 2016, when Instagram was competing with Snapchat for teen users, Zuckerberg reportedly “directed executives to focus on getting teenagers to spend more time on the company’s platforms,” according to The New York Times.

Another document from 2018 reads: “If we wanna win big with teens, we must bring them in as tweens.” Instagram’s user policy supposedly excludes children under 13 years old.

Zuckerberg argued this incentive structure no longer exists. Now, Instagram focuses on making the platform useful to users.

“If something is valuable, people will use it more because it’s useful to them,” he explained.

But therein lies the problem. Though Instagram may no longer explicitly focus on increasing usage, Zuckerberg’s testimony indicates the amount of time users spend on Instagram remains the platform’s ultimate metric of success.  

In other words, Meta still has an incentive to make its products addictive — whether Zuckerberg admits it or not.

Meta’s incentive structure speaks to the company’s motivations. If Zuckerberg can help convince the jury Meta does not benefit from addicting its users, then the social media company can claim it has no reason to create harmful products.

The argument is a subtle one, particularly compared to the ever-growing mountain of evidence showing social media harms children.

Meta’s history of releasing faulty or insubstantial child safety protections further undermines Zuckerberg’s claim that Meta does not pursue increased usage at any cost.

When Instagram launched new tools addressing sextortion in October 2024, the National Center on Sexual Exploitation (NCOSE) warned:

Mark Zuckerberg and Meta leadership have a very long track record of explicitly making the choice not to rectify harms it knows it is both causing and perpetuating on children … or only doing so when forced by public pressure, bad press or Congressional hearings.

Senator Marsha Blackburn (TN), the author of the bipartisan Kids Online Safety Act, communicated a similar sentiment in her remarks on Zuckerberg’s testimony.

“To no one’s surprise, Mark Zuckerberg followed his usual playbook of denial and deceit while sitting just a few steps away from parents who have tragically lost their children as a consequence of the way his platforms are designed to harm young users,” Blackburn said, continuing:

These companies are using the same playbook as Big Tobacco did decades ago by trying to keep kids hooked on products that hurt them.

Kayley’s case is the first of nine bellwethers, or test cases, for an estimated 1,600 similar civil cases filed against social media companies in California state court. Trial proceedings for the first of a large group of federal social media addiction cases will begin this summer.

Additional Articles and Resources

Social Media Addiction Suits go to Trial — Here’s What You Need to Know

America Controls TikTok Now—But Is It Really Safe?

X’s ‘Grok’ Generates Pornographic Images of Real People on Demand

Australia Bans Kids Under 16 Years Old From Social Media

National Center on Sexual Exploitation Targets Law Allowing Tech Companies to Profit from Online Sex Abuse

TikTok Dangerous for Minors — Leaked Docs Show Company Refuses to Protect Kids

Instagram’s Sextortion Safety Measures — Too Little, Too Late?

Key Takeaways From Zuckerberg’s Tell-All

Zuckerberg Implicated in Meta’s Failures to Protect Children

Instagram Content Restrictions Don’t Work, Tests Show

Surgeon General Recommends Warning on Social Media Platforms

Horrifying Instagram Investigation Indicts Modern Parenting

Survey Finds Teens Use Social Media More Than Four Hours Per Day — Here’s What Parents Can Do

Child Safety Advocates Push Congress to Pass the Kids Online Safety Act

Many Parents Still Fail to Monitor Their Kids’ Online Activity, Survey Shows

Proposed ‘App Store Accountability’ Act Would Force Apps and App Stores to Uphold Basic Child Safety Protections

‘The Tech Exit’ Helps Families Ditch Addictive Tech — For Good

President Donald Trump, First Lady Sign ‘Take It Down’ Act

First Lady Supports Bill Targeting Deepfakes, Sextortion and Revenge Porn

Written by Emily Washburn · Categorized: Culture · Tagged: social media

Feb 06 2026

Social Media Addiction Suits go to Trial — Here’s What You Need to Know

Social media companies face a legal reckoning this year as juries begin hearing the social media addiction lawsuits.

The long-awaited wave of cases will determine whether companies like Meta, YouTube, TikTok and their peers can face legal consequences for creating defective products.

The social media addiction lawsuits refer to thousands of civil cases alleging social media companies like Meta, YouTube and TikTok designed and released addictive products without adequate warnings, causing personal and financial injury.

The cases parallel the product liability lawsuits against tobacco companies, which eventually paid $206 billion for damage caused by addictive cigarettes.

The strongest cases are bundled into two groups — cases filed in federal court and cases filed in California state court. The federal cases primarily represent school districts and states which claim they foot the bill for young people’s social media addiction. The first of these cases will go to trial this summer.

The cases filed in California represent an estimated 1,600 individuals, families and school districts which claim social media addictions caused them personal injury.

The first of these cases — a 2023 suit filed on behalf of a young woman named KGM and several other plaintiffs — began trial proceedings on January 27.

KGM allegedly developed depression, anxiety and body-image issues after years of using YouTube, Instagram, TikTok (formerly Musical.ly) and Snapchat. She also became a victim of sextortion on Instagram, where a predator shared explicit photos of her as a minor.

The New York Post, which reviewed KGM’s complaint, describes her alleged experience with Meta following her victimization:

When KGM’s family reported her alleged sextortion to Meta … the company did nothing, [KGM’s] complaint claims, and instead allowed the person to continue committing harm via “explicit images of a minor child.”
According to the court filing, it took multiple family members and friends “spamming” Instagram’s moderation system in a coordinated, two-week effort before Meta handled it.

Predators met their victims on Instagram in nearly half (45%) of all the sextortion reports filed with the National Center for Missing and Exploited Children between August 2020 and August 2023.

Of the scammers who threatened to share explicit photos of minors, 60% threatened to do so on Instagram.

When Instagram launched new tools to address sextortion in October 2024 — long after KGM’s experience — the National Center on Sexual Exploitation (NCOSE) warned:

Mark Zuckerberg and Meta leadership have a very long track record of explicitly making the choice not to rectify harms it knows it is both causing and perpetuating on children … or only doing so when forced by public pressure, bad press or Congressional hearings.

KGM’s mom allegedly used third-party parental control apps to keep her daughter from using social media. It didn’t work. Per the complaint, Meta, YouTube, TikTok and Snap “design their products in a manner that enables children to evade parental consent.”

Social media “has changed the way [my daughter’s] brain works,” KGM’s mom wrote in a filing reviewed by the Los Angeles Times, continuing:

She has no long-term memory. She can’t live without a phone. She is willing to go to battle if you were even to touch her phone.

Snap, the company behind Snapchat, and TikTok settled with KGM in January. Meta and YouTube are expected to proceed with the trial.

KGM’s case is the first of nine California bellwether cases — test cases used to determine whether a novel legal theory will hold up in court.

The mystery is whether KGM’s argument can circumvent Section 230 of the Communications Decency Act, which prevents online platforms from being sued for users’ posts.

Social media companies have thus far escaped product liability accusations using Section 230. In a November motion to keep KGM’s case from going before a jury, Meta blamed the harm KGM experienced on content posted to Instagram, for which it could not be held responsible.

Judge Carolyn B. Kuhl, who oversees the state cases, ruled against the social media juggernaut, finding KGM had presented evidence indicating Instagram itself — not content posted to Instagram — caused her distress.

“Meta certainly may argue to the jury that KGM’s injuries were caused by content she viewed,” Judge Kuhl wrote. “But Plaintiff has presented evidence that features of Instagram that draw the user into compulsive viewing of content were a substantial factor in causing her harm.”

“The cause of KGM’s harms is a disputed factual question that must be resolved by the jury,” she concluded.

If KGM or any other bellwether plaintiff wins, Section 230 may no longer offer unconditional protection for social media companies that behave badly.

The social media addiction cases could send social media companies an expensive message about prioritizing child wellbeing. But parents shouldn’t rely on companies’ altruism (or grudging compliance with court orders) to protect their kids.

Social media is not a safe place for children. Parents should seriously consider keeping their children off it.

To read more of the Daily Citizen’s reporting on the effects of social media on children, read the articles linked below.

Additional Articles and Resources

America Controls TikTok Now—But Is It Really Safe?

X’s ‘Grok’ Generates Pornographic Images of Real People on Demand

Australia Bans Kids Under 16 Years Old From Social Media

National Center on Sexual Exploitation Targets Law Allowing Tech Companies to Profit from Online Sex Abuse

TikTok Dangerous for Minors — Leaked Docs Show Company Refuses to Protect Kids

Instagram’s Sextortion Safety Measures — Too Little, Too Late?

Key Takeaways From Zuckerberg’s Tell-All

Zuckerberg Implicated in Meta’s Failures to Protect Children

Instagram Content Restrictions Don’t Work, Tests Show

Surgeon General Recommends Warning on Social Media Platforms

Horrifying Instagram Investigation Indicts Modern Parenting

Survey Finds Teens Use Social Media More Than Four Hours Per Day — Here’s What Parents Can Do

Child Safety Advocates Push Congress to Pass the Kids Online Safety Act

Many Parents Still Fail to Monitor Their Kids’ Online Activity, Survey Shows

Proposed ‘App Store Accountability’ Act Would Force Apps and App Stores to Uphold Basic Child Safety Protections

‘The Tech Exit’ Helps Families Ditch Addictive Tech — For Good

President Donald Trump, First Lady Sign ‘Take It Down’ Act

First Lady Supports Bill Targeting Deepfakes, Sextortion and Revenge Porn

Written by Emily Washburn · Categorized: Culture · Tagged: social media

Jan 27 2026

America Controls TikTok Now — But Is It Really Safe?

TikTok’s Chinese parent company, ByteDance, relinquished control of American TikTok to the U.S. late last week.

The transaction staves off the looming TikTok ban without sacrificing national security. The platform itself, however, remains as dangerous for America’s 170 million users as ever.

TikTok USDS Joint Venture, an American-controlled company, took over the American divisions of TikTok, Lemon8, CapCut and several other ByteDance apps on Thursday, January 22.

American investors own 50% of the new company. ByteDance retains 19.9% ownership and the remaining 30% belongs to ByteDance investors.

American control of the joint venture solves three national security threats caused by Chinese ownership of ByteDance.

First and foremost, American users’ data will be hosted on U.S.-based servers and protected by U.S. cybersecurity companies.

TikTok always claimed to keep the extensive data it collected on Americans secure. But Chinese law requires companies like ByteDance to make their data available to the government.

A congressional investigatory committee determined Chinese officials had, on multiple occasions, mined Americans’ TikTok data, including:

  • Names
  • Ages
  • Emails
  • Phone numbers
  • Contact lists
  • In-app messages and usage patterns
  • IP addresses
  • Keystroke patterns
  • Browsing and search history
  • Location data
  • Biometric information like face- and voiceprints

Chinese access to ByteDance’s data extended to TikTok’s powerful content recommendation algorithm. TikTok USDS will reset the algorithm and retrain it on American content alone, ensuring China can no longer manipulate what U.S. citizens see on TikTok.

ByteDance will no longer perform content moderation under TikTok USDS, further preventing China from censoring or influencing the success of Americans’ posts.

ByteDance’s divestiture of American TikTok satisfies the Protecting Americans from Foreign Adversary Controlled Applications Act — the ban-or-sell law which required the company to sell majority ownership of TikTok for the app to remain available in America.

President Donald Trump signed an executive order delaying enforcement of the ban in January 2025, giving TikTok time to negotiate with American buyers. When ByteDance and U.S. investors established a framework for the deal in September, the president gave the parties another 120 days to sign on the dotted line.

The launch of TikTok USDS Joint Venture on January 22 came just one day before the deadline.

American control of TikTok might well protect citizens from global security threats. It does not, however, protect users from TikTok itself.

Thirteen states and the District of Columbia sued TikTok in October 2024 for illegally collecting and monetizing American children’s data.

The suits estimated as much as 35% of TikTok’s American ad revenue under ByteDance came from children and teens. Importantly, ByteDance will retain control of TikTok’s e-commerce, marketing and advertising under TikTok USDS.

The 2024 lawsuits exposed documents showing TikTok not only knew its app was addictive, but that compulsive use in teens caused “a slew of negative mental health effects like loss of analytical skills, memory formation, contextual thinking, conversational depth, empathy, increased anxiety … [and interference] with essential responsibilities like sufficient sleep, work/school responsibilities and connecting with loved ones.”

An estimated 95% of American smartphone users under 17 years old use TikTok, according to one of the platform’s own reports.  

Still, TikTok workers did not consider it their responsibility to limit minors’ use of the platform — even when creating a tool allowing parents to set TikTok time limits for their kids.

“Our goal is not to reduce the time spent [on TikTok],” a project manager for the tool wrote in an employee group chat.

“[The goal is] to contribute to daily active users and retention [of other users],” another chimed in.

TikTok evaluated the success of the time limit tool based on one metric alone: “Whether it improved public trust in the TikTok platform via media coverage.”

TikTok doesn’t just fail to protect kids — it targets them. When the Apple App Store challenged TikTok’s 12-and-up age rating in 2022, arguing its “frequent or intense mature or suggestive content” warranted a 17-and-up rating, TikTok refused to change it.

The Daily Citizen supports policies which protect American families from foreign threats. But make no mistake — TikTok remains a dangerous place, particularly for young users.

Parents should think long and hard before allowing their kids to take part.

Additional Articles and Resources

Plugged In Parent’s Guide to Today’s Technology 

TikTok Dangerous for Minors — Leaked Docs Show Company Refuses to Protect Kids

Proposed ‘App Store Accountability’ Act Would Force Apps and App Stores to Uphold Basic Child Safety Protections

TikTok Scrambles Amid Looming Ban

Trump Revives TikTok

Supreme Court Upholds TikTok Ban

National Center on Sexual Exploitation Targets Law Allowing Tech Companies to Profit from Online Sex Abuse

Tim Tebow to Parents: ‘Please Be the Protectors You’re Called to Be’

Written by Emily Washburn · Categorized: Culture · Tagged: social media, TikTok


Privacy Policy and Terms of Use | © 2026 Focus on the Family. All rights reserved.
