Daily Citizen

social media

Nov 26 2025

Child Safety Advocates Push Congress to Pass the Kids Online Safety Act

JUMP TO…
  • The Act
  • First Amendment Concerns
  • Supporters

Congress must pass the bipartisan Kids Online Safety Act (KOSA), child safety advocates say, so parents can better protect their children from sexual exploitation, addiction and myriad other online harms.

The bill, which Senators Marsha Blackburn (TN) and Richard Blumenthal (CT) reintroduced in May, would hold social media companies legally responsible for harming minors. Platforms governed by the bill would fulfill their legal obligations by instituting child safeguards, creating parental controls and increasing transparency.

A similar version of KOSA passed the Senate last year in a near-unanimous, 91-3 vote. It stalled in the House amid First Amendment concerns.

“[KOSA] will compel covered social media companies to center online safety and wellbeing rather than profit alone,” a group of more than 400 organizations representing parents, children, researchers, advocates and healthcare professionals wrote in an October letter encouraging legislators to pass the bill.

Though Senate Majority Leader John Thune (SD) and Minority Leader Chuck Schumer (NY) both endorse the bill, the Senate has not voted on KOSA this year.

The Act

KOSA would apply to any interactive website that primarily allows users to post and share content, including social media platforms, video posting sites like YouTube and some interactive video games.

It would require covered platforms to place automatic safeguards on minors’ accounts, like:

  • Limiting who can communicate with minors or view their profiles.
  • Prohibiting other companies from viewing or collecting minors’ data.
  • Limiting addictive features like infinite scrolling, auto-play, algorithmic content recommendations and rewards for spending time on the platform.
  • Restricting location sharing and notifying minors when location-tracking turns on.

It would also force covered platforms to offer parents tools to:

  • Manage their child’s privacy and account settings.
  • Restrict their child’s ability to make purchases or engage in financial transactions.
  • View and limit how much time their child spends on a platform.

KOSA further addresses Big Tech’s lack of transparency. Covered platforms would have to:

  • Warn parents and minors about a platform’s potential dangers.
  • Clearly disclose marketing and advertising content.
  • Explain how they create personal content recommendation algorithms — and how users can opt out.

Companies with more than 10 million users a month, on average, would additionally undergo annual, third-party audits investigating whether their platforms harm children. Parents could read auditors’ findings in mandatory safety reports.

State attorneys general and the Federal Trade Commission (FTC) could sue covered platforms for failing to uphold their legal responsibilities under KOSA. The FTC could investigate KOSA violations as “unfair or deceptive business practices.”

First Amendment Concerns

Senators Blackburn and Blumenthal adjusted this year’s version of KOSA to alleviate concerns about government censorship, which contributed to the bill’s failure last year.

Senator Mike Lee (UT), one of just three senators who voted against KOSA in 2024, explained on X:

The legislation empowers the FTC to censor any content it deems to cause “harm,” “anxiety,” or “depression,” in a way that could (and most likely would) be used to censor the expression of political, religious and other viewpoints disfavored by the FTC.

The House Committee on Energy and Commerce tried to alleviate concerns like Lee’s in September 2024 by limiting KOSA’s application to companies making more than $2.5 billion in annual revenue or hosting at least 150 million monthly users.

Though the committee’s revisions eventually passed, many legislators argued the changes gutted KOSA. It never received a vote on the House floor.

This year’s version of the bill specifically prohibits the FTC or state attorneys general from using KOSA suits to illegally censor content. A press release announcing KOSA’s reintroduction reads, in part:

The bill text … further makes clear that KOSA would not censor, limit or remove any content from the internet, and it does not give the FTC or state Attorneys General the power to bring lawsuits over content or speech.

Supporters

Several influential advocates for children’s digital safety support KOSA, including many who regularly appear in the Daily Citizen.

“The Kids Online Safety Act is a powerful tool in parents’ defense of their children,” Tim Goeglein, Vice President of External and Government Relations for Focus on the Family, told the Daily Citizen.

Clare Morrell, fellow at the Ethics and Public Policy Center and author of The Tech Exit, writes:

Parents have been left on their own to try to fend off a massive tech-induced crisis in American childhood from online platforms that are engineered to be maximally addictive. KOSA offers a needed solution by making social media platforms responsible for preventing and mitigating certain objective harms to minors, like sexual exploitation.

Morrell’s The Tech Exit offers parents a blueprint to break their children free of addictive technologies.

Jonathan Haidt, social psychologist and author of The Anxious Generation, argues KOSA “would begin to address the [indisputable harm occurring to children at an industrial scale].”

Haidt’s The Anxious Generation raises alarm bells about the effects of ubiquitous internet access on children’s physical, mental and social wellbeing.

Both houses of Congress must pass KOSA by the end of December. If they do not, parents will have to wait yet another year for the bill’s critical protections.

The Daily Citizen will continue covering this important story.

Additional Articles and Resources

Counseling Consultation & Referrals

Parenting Tips for Guiding Your Kids in the Digital Age

‘The Tech Exit’ Helps Families Ditch Addictive Tech — For Good

Louisiana Sues Roblox for Exposing Children to Predators, Explicit Content

Social Psychologist Finds Smartphones and Social Media Harm Kids in These Four Ways

Four Ways to Protect Your Kids from Bad Tech, From Social Psychologist Jonathan Haidt

Parent-Run Groups Help Stop Childhood Smartphone Use

The Harmful Effects of Screen-Filled Culture on Kids

‘Big Tech’ Device Designs Dangerous for Kids, Research Finds

National Center on Sexual Exploitation Targets Law Allowing Tech Companies to Profit from Online Sex Abuse

Danger in Their Pockets

Teen Boys Fall Prey to Financial Sextortion — Here’s What Parents Can Do

Proposed SCREEN Act Could Protect Kids from Porn

Proposed ‘App Store Accountability’ Act Would Force Apps and App Stores to Uphold Basic Child Safety Protections

Written by Emily Washburn · Categorized: Culture · Tagged: social media, technology

Nov 17 2025

Many Parents Still Fail to Monitor Their Kids’ Online Activity, Survey Shows

Most Americans support restricting kids’ access to social media and pornography, this year’s American Family Survey shows. But many parents remain hesitant to monitor their children’s online activity.

Brigham Young University’s annual American Family Survey asks a representative group of 3,000 American adults about economic, cultural and social concerns affecting the family. This year, participants identified “social media, video games [and] other electronic resources” as one of the top five issues families must confront.

Respondents expressed the most concern about porn and social media’s effects on young people. American adults overwhelmingly support government regulations limiting minors’ access to these products.

Per the survey:

  • More than 75% of participants support requiring pornography websites to verify the ages of their consumers.
  • Nearly 80% support requiring social media companies to obtain parents’ consent before allowing a minor to create a social media account.
  • Three in four support holding social media companies legally liable for harm caused by content marketed to minors.

Parents with children under 18 years old living at home also support making technology restrictions part of parenting norms. More than 60% of respondents in this demographic wish other families would implement rules about technology, and half said it would make setting and enforcing their own restrictions easier.  

But the survey also shows many parents don’t limit their children’s access to technology at all — let alone discuss strategies with other parents.

Surveyors asked participants with children under 18 years old in the home whether they implement any of five common technological boundaries: limiting their children’s screen time, restricting the kinds of content they consume, requiring them to keep their online accounts private, restricting who they contact and limiting who they exchange private messages with.

One in five respondents (20%) implement none of these restrictions. Two in five respondents (40%) don’t limit their kids’ screen time. Another 40% don’t police the content their children consume.

Though most participants in this demographic claimed other parents’ rules about technology would help them create and enforce their own rules, only 17% said another parent had influenced them to change a screen time restriction.

One third of respondents said they never talk about managing kids and technology with another parent. Only 13% claim to discuss it frequently.

Ubiquitous technology and internet access make parenting harder. Enforcing technological boundaries can be confusing, thankless and overwhelming — particularly when tech companies frequently undermine parental controls with few consequences.

But these obstacles do not change parents’ duty to protect their children from harmful content and technologies.

Parents, you do not have to allow your children access to smartphones or the internet. If you choose to do so, you must be prepared to:

  • Police your child’s online activity.
  • Educate yourself about parental controls and implement them to the best of your ability.
  • Warn your child about online predation and other pitfalls.
  • Model healthy relationships with technology.

Joining forces with other parents to limit children’s access to social media and smartphones can help families create and maintain healthy boundaries with technology. Take it upon yourself to initiate these partnerships. Odds are, you will not be rebuffed.

For more tips and tricks, check out Plugged In’s Parent’s Guide to Today’s Technology. For more information about technology restrictions — or ditching smartphones altogether — read the articles below.

Additional Articles and Resources

Counseling Consultation & Referrals

More than Twenty States Limit Smartphone Use in Schools

Parent-Run Groups Help Stop Childhood Smartphone Use

‘The Tech Exit’ Helps Families Ditch Addictive Tech — For Good

Four Ways to Protect Your Kids from Bad Tech, from Social Psychologist Jonathan Haidt

Social Psychologist Finds Smartphones and Social Media Harm Kids in These Four Ways

‘Big Tech’ Device Designs Dangerous for Kids, Research Finds

Survey Finds Teens Use Social Media More Than Four Hours Per Day — Here’s What Parents Can Do

Video: Seven-Year-Old’s Confidence Soars After Ordering Chick-Fil-A By Himself

5 Most Important Things OpenAI Lawsuits Reveal About ChatGPT-4o

Louisiana Sues Roblox for Exposing Children to Predators, Explicit Content

Proposed ‘App Store Accountability’ Act Would Force Apps and App Stores to Uphold Basic Child Safety Protections

Teen Boys Fall Prey to Financial Sextortion — Here’s What Parents Can Do

Proposed SCREEN Act Could Protect Kids from Porn

Written by Emily Washburn · Categorized: Culture, Family · Tagged: parenting, social media, technology

Jun 27 2025

New York Legislature Passes Bill Requiring Social Media Warning Labels

It’s no secret websites like Instagram, TikTok, Snapchat and Facebook have a chokehold on America’s youth. That’s why New York lawmakers have passed a bill requiring social media platforms to display mental health warning labels to in-state users.

Senate Bill S4505 cites a June 2024 statement from U.S. Surgeon General Vivek Murthy, calling for warning labels on “addictive” platforms:

Requiring social media apps with certain particularly noxious design features to display warning labels to all users at the point of user access … is a reasonable and necessary step to take for consumer health and safety.

The legislation references Surgeon General Murthy’s characterization of the current youth mental health crisis as a “public health emergency” and cites a range of evidence to support this claim:

Research shows that social media exposure overstimulates reward centers, creating pathways comparable to those of an individual experiencing substance abuse or gambling addictions.
Leaked company documents reveal that social media companies knew that compulsive use of their products was also associated with “loss of analytical skills, memory formation, contextual thinking, conversational depth, (and) empathy.”
Among female adolescent users, the association between poor mental health and social media use is now stronger than the association between poor mental health and binge drinking, obesity, or hard drug use.

The bill also cites several statistics linking social media use with poor mental health:

  • As of 2023, 12 – 15-year-olds spend an average of 4.8 hours a day on social media platforms.
  • Today, almost half of adolescents report social media makes them feel worse about their bodies.
  • Teens with the highest levels of social media use are twice as likely to rate their mental health as either “poor” or “very poor.”
  • From 2008 – 2015, the percentage of adolescent hospital visits due to suicidal ideation or attempts nearly doubled.
  • From 2011 – 2018, self-poisonings among 10 – 12-year-old girls quadrupled.
  • From 2011 – 2018, suicide rates among 10 – 14-year-old girls doubled, and hospital admissions for self-harm tripled.

State Assemblywoman Nily Rozic, a sponsor of the New York bill, commented:

By requiring clear warning labels, we’re giving families the tools to understand the risks and pushing tech companies to take responsibility for the impact of their design choices.
It’s time we prioritize mental health over engagement metrics.

Additionally, Jim Steyer, Common Sense Media Founder and CEO, stated:

We owe it to families to provide clear, evidence-based information about the consequences of excessive use.
When we learned alcohol could cause birth defects, we added warning labels for pregnant women. When nicotine was linked to cancer, we labeled every cigarette pack.
It’s time we took the same approach to social media – the latest addictive product that has kids hooked.

Last week, Senate Bill S4505 passed the New York Assembly and Senate. It is currently awaiting the signature of New York’s governor, Kathy Hochul, who previously signed bills prohibiting social media platforms from exposing teens to “addictive” algorithmic content without parental consent.

If Hochul signs the bill, New York will join other states that have issued or are attempting to issue warnings on social media apps, including Minnesota, Texas, Colorado and California.

Social media platforms oppose New York’s bill, arguing that requiring warning labels on their websites would violate their free speech rights.

Specifically, NetChoice’s Amy Bos, director of state and federal affairs, recently stated:

The proposed legislation constitutes an unprecedented expansion of government power that would compel private companies to espouse the state’s preferred messaging, a clear violation of the First Amendment’s protection against compelled speech.

However, warning labels on products proven to be harmful do not violate free speech. Rather, they serve to promote truth and transparency by notifying adolescents of the risks associated with social media use.

In a world where internet reliance is ever-increasing, it is crucial for the next generation to be fully informed about the information that constructs and influences their daily lives.

Related Articles and Resources:

‘The Tech Exit’ Helps Families Ditch Addictive Tech – For Good.

New York Prepares to Restrict School Smartphone Use

Social Psychologist Finds Smartphones and Social Media Harm Kids in These Four Ways

Four Ways to Protect Your Kids from Bad Tech, From Social Psychologist Jonathan Haidt

‘Big Tech’ Device Designs Dangerous for Kids, Research Finds

Surgeon General Recommends Warning on Social Media Platforms

Written by Meredith Godwin · Categorized: Culture · Tagged: social media

Jun 18 2025

‘The Tech Exit’ Helps Families Ditch Addictive Tech — For Good.

Social media, screens and smartphones, oh my — parents everywhere are struggling to keep their kids safe in an overwhelming technological age.

Clare Morell, a tech policy expert and fellow at the Ethics and Public Policy Center, throws frazzled families a lifeline in her new book, The Tech Exit: A Practical Guide to Freeing Kids and Teens from Smartphones.

In The Tech Exit, Morell encourages parents to challenge the idea that addictive technologies are an unavoidable part of modern childhood.

She and hundreds of other “Tech Exit” families are successfully raising their children without smartphones, social media, tablets or video games. The book lays out detailed, step-by-step instructions for families to join their number.

The Tech Exit’s proposal might seem drastic — especially for families with older children already addicted to screens. Morell uses her own research and interviews with “Tech Exit” families to show that leaving tech behind is not only possible, but logical.

She starts by debunking four myths about screen-based technology.

Myth 1: Screen-based technology is an inevitable part of childhood.

Morell helps create policies protecting children from dangerous technology. She became a mother just as data began showing the catastrophic effects of smartphone and social media use on child development and mental health.

The new mom didn’t want her kids to suffer the same effects — but the advice she found in parenting books didn’t seem equal to the problems at hand.

“I saw a major disconnect,” she writes in Tech Exit. “I’d move from horrifying accounts of kids falling into sextortion schemes to advice like ‘set time limits’ [and] ‘turn on privacy settings on their accounts.’”

These aren’t bad strategies, Morell explains, but they also assume that children need access to screen-based technology. That’s not true. Her own family is proof that a “Tech Exit” is sustainable and beneficial.

Myth 2: Screen-based tech can be used in moderation.

We like to think of screens as we do sugar — something that can be enjoyed in moderation.

But screens aren’t like sugar. “For the developing brains of children and teens,” Morell writes, “they are more like fentanyl.”

As the Daily Citizen has previously reported, social networking apps, websites and devices — anything with a messaging or networking feature — trigger the release of dopamine, the brain’s reward chemical.

Crucially, dopamine trains the brain to “want” something but never produces feelings of satiety. Once kids get a taste of tech, they’ll always want more.

When parents bring screen-based tech into the house, they put themselves squarely between “a drug-dispensing machine and an underdeveloped brain,” as one of Morell’s interviewees puts it, and invite daily battles over its use.

“It’s an untenable, exhausting situation,” Morell writes.

Myth 3: The harms of screen-based tech can be fixed with screen-time limits.

Tech companies frequently imply parents can protect kids from screen-based technology by stopping them from spending too much time on their devices. That’s why, in part, screen-time limits are “the most prominent form of parental control [over kids’ devices],” according to Morell.

But addictive technology can negatively affect kids regardless of how much time they spend using it.

The dopamine released in just a couple of minutes of screen time can cause kids to desire tech for hours after it’s been put away. Over time, these intense chemical highs will make other, everyday pleasures seem boring.

The negative social effects of technology burden all kids and teens alike, regardless of their screen use. Morell writes:

The teen mental health crisis today is due not only to negative effects of digital technologies for individuals but also to the group social dynamic that smartphones and social media have created.

Smartphones, for example, change the way kids and teens create and maintain friendships. Every kid must play by these new social rules — even if they don’t use screen-based technology.

Myth 4: Parents can protect their children from danger using parental controls.

Device and app manufacturers have financial incentives to show children inappropriate content. Thus, parental controls are unintuitive, filled with bugs and intentionally easy to manipulate.

But that’s not how they’re sold to parents. Tech companies keep young customers by convincing parents they can sufficiently protect their kids from predators, scams and inappropriate content online.

It’s almost always an exercise in frustration.

Given these intractable problems, Morell uses a startling metaphor to illustrate parental controls’ effectiveness in the digital world:

We don’t take our children to bars and strip clubs and blindfold them or have them wear earplugs. That would be absurd. We just don’t let them go to those places.

Morell’s cost-benefit analysis suggests the benefits of raising children in largely tech-free households far outweigh the costs. The Tech Exit endeavors to create a clear, sustainable path for families to do just that.

Her approach centers around FEAST — an acronym for five common principles all “Tech Exit” families she interviewed follow:

  • Find Other Families: They connect with other “Tech Exit” families.
  • Explain, Educate, Exemplify: They get their kids on board by explaining why they are getting rid of screens, educating them on the dangers of the digital world and exemplifying good digital habits.
  • Adopt Alternatives: They look for creative alternatives to smartphones and other technologies.
  • Set Up Digital Accountability and Family Screen Rules: They create rules and boundaries governing technology in the home.
  • Trade Screens for Real-Life Responsibilities: They replace time spent on screens with independent play and responsibilities.

Morell offers a treasure trove of practical, honest advice and resources to help families adopt these principles in their own lives — even when it seems impossible.

Curious about becoming a “Tech Exit” family? You can find The Tech Exit: A Practical Guide to Freeing Kids and Teens from Smartphones here.

Additional Articles and Resources

Video: Seven-Year-Old’s Confidence Soars After Ordering Chick-Fil-A By Himself

Social Psychologist Finds Smartphones and Social Media Harm Kids in These Four Ways

Four Ways to Protect Your Kids from Bad Tech, From Social Psychologist Jonathan Haidt

Parent-Run Groups Help Stop Childhood Smartphone Use

The Harmful Effects of Screen-Filled Culture on Kids

‘Big Tech’ Device Designs Dangerous for Kids, Research Finds

Pornography Age Verification Laws: What They Are and Which States Have Them

Written by Emily Washburn · Categorized: Family · Tagged: smartphone, social media, technology

May 01 2025

National Center on Sexual Exploitation Targets Law Allowing Tech Companies to Profit from Online Sex Abuse

JUMP TO…
  • Section 230 of the Communications Decency Act
  • What Went Wrong
  • Twelve Survivors
  • X Refuses to Remove Explicit Photos of a Minor
  • Why Nothing Has Changed
  • Next Steps

The National Center on Sexual Exploitation (NCOSE) has launched a campaign to reform an outdated law effectively allowing tech companies to aid, abet and profit off online sex abuse.

Section 230 of the Communications Decency Act

Congress passed the Communications Decency Act (CDA) in 1996 to protect children from online exploitation. It included Section 230, a carveout shielding “interactive computer services” from liability for content users post online.

Section 230 was a direct response to a 1995 New York Supreme Court case, Stratton Oakmont v. Prodigy Services, which found Prodigy, a web company that hosted online discussion forums, liable for a user’s defamatory content because it had tried to moderate the objectionable posts.

The ruling made tech companies reluctant to censor even the most offensive content. Section 230, lawmakers reasoned, gave moderators freedom to remove indecent posts.

What Went Wrong

CDA would have made it illegal to “knowingly send obscene or indecent messages … as determined by contemporary community standards, to someone under 18.”

In 1997, the Supreme Court found in Reno v. ACLU that these protections violated the First Amendment. The ruling stripped CDA of its prohibitions against targeting children but left Section 230 intact.

Over the next two and a half decades, tech companies grew to unprecedented size and power. Courts, meanwhile, interpreted Section 230 to excuse websites, browsers and social media sites of all responsibility regarding the content and activity on their platforms.

The difference between lawmakers’ intentions for Section 230 and its contemporary use is stark. NCOSE writes:

[Section 230] of CDA was originally designed to help protect children online in the early days of the internet.
However, this law — which is over 25 years old — has instead become a shield for Big Tech corporations, allowing them to evade accountability even as their platforms are blatantly used to facilitate egregious acts like sex trafficking, child sexual abuse and image-based sexual abuse.

Twelve Survivors

Every year, NCOSE releases its “Dirty Dozen” list of “12 mainstream entities facilitating, enabling and even profiting from sexual abuse.” Dirty Dozen alumni include Twitter (now X), Facebook, Google, Instagram, Amazon, HBO, Netflix, Reddit and even the U.S. Department of Defense.

This year, instead of identifying twelve exploitative companies, NCOSE is “highlighting 12 survivors who were denied justice in the courts because of Section 230.”

“These survivors were silenced,” NCOSE writes, “and the companies that enabled their abuse were given immunity.”

X Refuses to Remove Sexually Explicit Photos of a Minor

One of the 12 stories NCOSE highlights is that of 13-year-old “John Doe.” A sextortionist posing as a 16-year-old girl convinced John and his friend to engage in a romantic exchange of explicit images. The predator attempted to blackmail John and, when John blocked them, spread the pictures and videos across X. They eventually reached parents and peers at John’s school.

John and his mom contacted X to get the photos taken down. They sent pictures of John’s ID to prove he was a minor. X responded:

We’ve reviewed the content, and didn’t find a violation of our policies, so no action will be taken at this time.

NCOSE is one of three groups representing John and his friend in court against X. They argue the social media juggernaut “knowingly possessed child sexual abuse material and knowingly benefitted from sex trafficking.”

A federal district court initially dismissed the case under Section 230.

The Ninth Circuit Court of Appeals heard the case in February. NCOSE recounts X’s argument:

X’s attorney himself acknowledged that the platform did not remove the [child sexual abuse] content, even after reviewing [John’s] report.
He continued to maintain that X should be immune anyway, stating that holding tech companies accountable for illegal third-party content being uploaded to the site is “antithetical” to what Congress intended with Section 230.

It continues:

The mere fact that Twitter can admit to partaking in a despicable crime and maintain that they should still be protected shows that Section 230 urgently needs to be reformed.

Why Nothing Has Changed

Lawmakers from both sides of the political aisle support reforming — or repealing — Section 230. Democrat and Republican senators blasted blanket immunity for big tech in a Senate Judiciary Committee hearing on the issue in February.

Congress has gathered near-unanimous support for at least six child internet safety bills in recent years. Only two — the REPORT Act and the Take It Down Act — have passed.

Even the Department of Justice (DOJ) has recommended revising Section 230, writing in a 2020 report:

The expansive statutory interpretation [of Section 230], combined with technological developments, has reduced the incentives of online platforms to address illicit activity on their services and, at the same time, left them free to moderate lawful content without transparency and accountability.

The DOJ specifically noted that platforms had used Section 230 to get immunity “even when they knew their services were being used for criminal activity” and to “evade laws and regulations applicable to brick-and-mortar competitors.”

If changing Section 230 is such a popular issue, why has nothing changed?

According to Senator Dick Durbin: money.

When Congress tried to curtail tech companies’ immunity or institute child internet safety protections, “Big Tech opened up a $61.5 million lobbying war chest to make sure [those] bills never became law,” Senator Durbin told the committee in February.

Next Steps

NCOSE asks citizens to encourage Congress to remove Section 230’s immunity protections for tech companies. The snowballing reckoning around social media’s effect on children, and this year’s legislative and congressional activity around Section 230, could indicate fertile ground for change.

The Daily Citizen will continue reporting on Section 230 and efforts to hold tech companies accountable.

Additional Articles and Resources

First Lady Melania Trump Celebrates House’s Passage of Take It Down Act

Proposed SCREEN Act Could Protect Kids from Porn

Kids Online Safety Act — What It Is and Why It’s a Big Deal

TikTok Dangerous for Minors — Leaked Docs Show Company Refuses to Protect Kids

Instagram’s Sextortion Safety Measures — Too Little, Too Late?

Zuckerberg Implicated in Meta’s Failures to Protect Children

Teen Boys Fall Prey to Financial Sextortion — Here’s What Parents Can Do

Instagram Content Restrictions Don’t Work, Tests Show

Horrifying Instagram Investigation Indicts Modern Parenting

REPORT Act Becomes Law

Written by Emily Washburn · Categorized: Culture · Tagged: Section 230, social media


Privacy Policy and Terms of Use | © 2026 Focus on the Family. All rights reserved.