Daily Citizen


Jan 27 2026

America Controls TikTok Now — But Is It Really Safe?

TikTok’s Chinese parent company, ByteDance, relinquished control of American TikTok to the U.S. late last week.

The transaction staves off the looming TikTok ban without sacrificing national security. The platform itself, however, remains as dangerous for America’s 170 million users as ever.

TikTok USDS Joint Venture, an American-controlled company, took over the American divisions of TikTok, Lemon8, CapCut and several other ByteDance apps on Thursday, January 22.

American investors own 50% of the new company. ByteDance retains 19.9% ownership and the remaining 30% belongs to ByteDance investors.

American control of the joint venture solves three national security threats caused by Chinese ownership of ByteDance.

First and foremost, American users’ data will be hosted on U.S.-based servers and protected by U.S. cybersecurity companies.

TikTok always claimed to keep the extensive data it collected on Americans secure. But Chinese law requires companies like ByteDance to make their data available to the government.

A congressional investigatory committee determined Chinese officials had accessed Americans’ TikTok data on multiple occasions. The data TikTok collects includes:

  • Names
  • Ages
  • Emails
  • Phone numbers
  • Contact lists
  • In-app messages and usage patterns
  • IP addresses
  • Keystroke patterns
  • Browsing and search history
  • Location data
  • Biometric information like face- and voiceprints

Chinese access to ByteDance’s data extended to TikTok’s powerful content recommendation algorithm. TikTok USDS will reset the algorithm and retrain it on American content alone, ensuring China can no longer manipulate what U.S. citizens see on TikTok.

ByteDance will no longer perform content moderation under TikTok USDS, further preventing China from censoring or influencing the success of Americans’ posts.

ByteDance’s divestiture of American TikTok satisfies the Protecting Americans from Foreign Adversary Controlled Applications Act — the ban-or-sell law which required the company to sell majority ownership of TikTok for the app to remain available in America.

President Donald Trump signed an executive order delaying enforcement of the ban in January 2025, giving TikTok time to negotiate with American buyers. When ByteDance and U.S. investors established a framework for the deal in September, the president gave the parties another 120 days to sign on the dotted line.

The launch of TikTok USDS Joint Venture on January 22 came just one day before the deadline.

American control of TikTok might well protect citizens from global security threats. It does not, however, protect users from TikTok itself.

Thirteen states and the District of Columbia sued TikTok in October 2024 for illegally collecting and monetizing American children’s data.

The suits estimated that as much as 35% of TikTok’s American ad revenue under ByteDance came from children and teens. Importantly, ByteDance will retain control of TikTok’s e-commerce, marketing and advertising under TikTok USDS.

The 2024 lawsuits exposed documents showing TikTok not only knew its app was addictive, but that compulsive use in teens caused “a slew of negative mental health effects like loss of analytical skills, memory formation, contextual thinking, conversational depth, empathy, increased anxiety … [and interference] with essential responsibilities like sufficient sleep, work/school responsibilities and connecting with loved ones.”

An estimated 95% of American smartphone users under 17 years old use TikTok, according to one of the platform’s own reports.  

Still, TikTok workers did not consider it their responsibility to limit minors’ use of the platform — even when creating a tool allowing parents to set TikTok time limits for their kids.

“Our goal is not to reduce the time spent [on TikTok],” a project manager for the tool wrote in an employee group chat.

“[The goal is] to contribute to daily active users and retention [of other users],” another chimed in.

TikTok evaluated the success of the time limit tool based on one metric alone: “Whether it improved public trust in the TikTok platform via media coverage.”

TikTok doesn’t just fail to protect kids — it targets them. When the Apple App Store challenged TikTok’s 12-and-up age rating in 2022, arguing its “frequent or intense mature or suggestive content” warranted a 17-and-up rating, TikTok refused to change it.

The Daily Citizen supports policies which protect American families from foreign threats. But make no mistake — TikTok remains a dangerous place, particularly for young users.

Parents should think long and hard before allowing their kids to take part.

Additional Articles and Resources

Plugged In Parent’s Guide to Today’s Technology 

TikTok Dangerous for Minors — Leaked Docs Show Company Refuses to Protect Kids

Proposed ‘App Store Accountability’ Act Would Force Apps and App Stores to Uphold Basic Child Safety Protections

TikTok Scrambles Amid Looming Ban

Trump Revives TikTok

Supreme Court Upholds TikTok Ban

National Center on Sexual Exploitation Targets Law Allowing Tech Companies to Profit from Online Sex Abuse

Tim Tebow to Parents: ‘Please Be the Protectors You’re Called to Be’

Written by Emily Washburn · Categorized: Culture · Tagged: social media, TikTok

Jan 16 2026

UPDATED: Pornography Age Verification Laws — What They Are and Which States Have Them

Half of all states — Louisiana, Arkansas, Virginia, Utah, Montana, Texas, North Carolina, Indiana, Idaho, Florida, Kentucky, Nebraska, Georgia, Alabama, Kansas, Oklahoma, Mississippi, South Carolina, Tennessee, South Dakota, Wyoming, North Dakota, Missouri, Arizona and Ohio — require pornography companies to verify the ages of their online consumers.

Ten more states hope to pass age verification legislation in 2026.

Described by Politico as “perhaps the most bipartisan laws in the country,” age verification laws help parents protect their kids by making it harder for minors to access adult content online.

Most age verification bills:

  • Require companies who publish a “substantial” amount of adult content — usually 1/3 or more of their total production — to check the age of every person accessing their website.
  • Create a way for parents to sue pornography companies if their kids access content they shouldn’t.

The Supreme Court found age verification requirements like these constitutional in June 2025, silencing critics who argued the laws infringe on free speech and privacy rights.

While most age verification laws contain the same basic components, few are identical.

Some states add age-verification requirements for social media companies. Minnesota’s House File 1875, for example, would require social media companies to exclude children younger than 14 from their platforms.

Michigan’s Senate Bill 284 would require manufacturers like Apple to verify device users’ ages and communicate that information to other apps and websites.

Wyoming’s HB 43, now law, requires all online websites which publish or host adult content — no matter how little — to verify consumers’ ages.

States also employ different strategies to pass age verification bills.

Ohio rolled its age verification law into the bill establishing the state’s 2026-2027 budget. Missouri legislators introduced five bills this month to build on the state’s existing age verification regulations.

Hawaii separated its legislation into two bills — one establishing age verification requirements and another creating penalties for violators — so representatives could approve the requirements even if they disagreed with proposed penalties.

While not perfect, age verification laws greatly restrict the amount of porn young people can access. After Louisiana became the first state to pass such legislation in 2022, traffic to Pornhub.com from that state dropped by 80%, one spokesperson told the Institute for Family Studies.

Scroll down to see the status of age verification bills in different states. To find out more about age verification and parents’ rights legislation in your state, contact your local Focus on the Family-allied Family Policy Council.

States in dark blue have passed age verification laws. States in light blue have active age verification bills. Missouri has both passed and pending age verification legislation.

Age Verification Laws

Louisiana
HB 142 became law on June 15, 2022.
Date effective: January 1, 2023

Arkansas
SB 66 became law on April 11, 2023.
Date effective: July 31, 2023

Virginia
SB 1515 became law on May 12, 2023.
Date effective: July 1, 2023

Utah
SB 0287 became law on May 4, 2023.
Date effective: May 3, 2023

Montana
SB 544 became law on May 19, 2023.
Date effective: January 1, 2024

Texas
HB 1181 became law on June 12, 2023.
Date effective: September 19, 2023

North Carolina
HB 8 became law on September 29, 2023.
Date effective: January 1, 2024

Indiana
SB 17 became law on March 13, 2024.
Date effective: August 16, 2024

Idaho
HB 498 became law on March 21, 2024.
Date effective: July 1, 2024

Florida
HB 3 became law on March 25, 2024.
Date effective: January 1, 2025

Kentucky
HB 278 became law on April 5, 2024.
Date effective: July 15, 2024

Nebraska
Online Age Verification Liability Act became law on April 16, 2024.
Date effective: July 18, 2024

Georgia
SB 351 became law on April 23, 2024.
Date effective: July 1, 2025

Alabama
HB 164 became law on April 24, 2024.
Date effective: October 1, 2024

Kansas
SB 394 became law without the Governor’s signature on April 25, 2024.
Date effective: July 1, 2024

Oklahoma
SB 1959 became law on April 26, 2024.
Date effective: November 1, 2024

Mississippi
HB 1126 became law without the Governor’s signature on April 30, 2024.
Date effective: July 1, 2024

South Carolina
HB 3424 became law on May 29, 2024.
Date effective: January 1, 2025

Tennessee
HB 1642/SB 1792 became law on June 3, 2024.
Date effective: January 13, 2025

South Dakota
HB 1053 became law on February 27, 2025.
Date effective: July 1, 2025

Wyoming
HB 43 became law on March 13, 2025.
Date effective: July 1, 2025

North Dakota
HB 1561 became law on April 11, 2025.
Date effective: August 1, 2025

Missouri
Rule 15 CSR 60-17.010 published on May 7, 2025.
Date effective: November 30, 2025

Arizona
HB 2112 became law on May 13, 2025.
Date effective: September 26, 2025

Ohio
HB 96 became law on June 30, 2025.
Date effective: September 30, 2025

Age Verification Bills

Hawaii
HB 1212: carried over to the 2026 session on December 8, 2025.
HB 1198: carried over to the 2026 session on December 8, 2025.

Iowa
HF 864 (formerly HF 62): placed on subcommittee calendar for the Senate Committee on Technology on January 13.
SF 443 (formerly SF 236): referred to Senate Committee on Technology on June 16, 2025.

Michigan
SB 901: referred to Senate General Laws Committee on January 8.
SB 284 (HB 4429): referred to the Senate Committee on Finance, Insurance and Consumer Protection on May 6, 2025.
HB 4429 (SB 284): referred to House Committee on Regulatory Reform on September 18, 2025.

Minnesota
HF 1875: referred to House Committee on Commerce, Finance and Policy on March 5, 2025.
SF 2105 (HF 1434): referred to Senate Committee on Commerce and Consumer Protection on March 3, 2025.
HF 1434 (SF 2105): referred to House Committee on Commerce, Finance and Policy on February 24, 2025.

Missouri
HB 1878: referred to House Committee on General Laws on January 8.
HB 1839: referred to House Committee on Children and Families on January 15.
SB 901: referred to Senate General Laws Committee on January 8.
SB 1346: read in the Senate on January 7.
SB 1412: read in the Senate on January 7.


New Hampshire
SB 648: heard by Senate Judiciary Committee on January 8.

New Jersey
S 1826: referred to Senate Judiciary Committee on January 13.

New York
S 3591 (A 03946): referred to Senate Committee on Internet and Technology on January 7.
A 03946 (S 3591): referred to Assembly Consumer Affairs and Protection Committee on January 7.

Pennsylvania
HB 1513: referred to House Communications and Technology Committee on May 29, 2025.
SB 603: referred to Senate Judiciary Committee on April 9, 2025.

Washington
HB 2112: heard in the House Committee on Consumer Protection and Business on January 16.

Wisconsin
AB 105: second amendment proposed in the senate on January 7.

Written by Emily Washburn · Categorized: Culture, How to Get Involved · Tagged: age verification, parental rights, social media

Jan 15 2026

Our Christian Worldview is About More Than Our Apologetics

Cultural flashpoints seem to be occurring with increasing frequency, undoubtedly facilitated and exacerbated by the explosion of social media. Situations and issues are quickly magnified and exploited — and more and more people seem to have opinions about all kinds of things that once upon a time would have been well off their radar.

Arguably, even social media’s many downsides have their upsides. Chief among them is the way the medium reveals an individual’s worldview, for good or bad.

A remark often attributed to Mark Twain holds, “It is better to keep your mouth shut and let people think you are a fool than to open it and remove all doubt.” This philosophy is ignored by too many people out in cyberspace, many of whom weigh in on issues well beyond their expertise. They epitomize King Solomon’s stinging observation:

“A fool’s lips walk into a fight, and his mouth invites a beating” (Proverbs 18:6).

One man who is the antithesis of this is Dr. Del Tackett, a former senior vice president here at Focus on the Family and the creator and host of “The Truth Project” — Focus on the Family’s award-winning Christian worldview curriculum. Many of us here at Focus had the privilege of sitting under Del’s teaching before the lessons were put on film. 

At the heart of Dr. Tackett’s classes in this area is the conviction that Christians should be looking at every aspect of the world through the truth of Scripture.

“A worldview is a set of truth claims that purports to paint a picture of reality,” states Del. Biblical truth claims are unchanging and absolute while cultural claims more often vacillate with the changing times. In essence, our worldview is the lens through which we see reality.

One aspect of our worldview that seems to be overlooked involves how we tend to find what we’re looking for in this world. In Matthew’s gospel, Jesus declares, “Ask, and it will be given to you; seek, and you will find; knock, and it will be opened to you” (7:7). 

If we’re seeking the negative and the hypocritical, it’s a given that in a fallen world, we’ll find it — and even among Christians. Conversely, if we’re keeping our eyes open for the pure and the true, we’ll find that, too. 

It’s always been a curious thing when Christians seem to relish pointing out what they perceive to be pharisaical behavior among other believers. “Iron sharpens iron” (Proverbs 27:17), but it is a destructive practice to gleefully run down another Christian. Correction, if necessary, should be handled carefully and soberly.

Yet another concerning dimension of a certain worldview approach is the tendency to assume the worst of people with whom we disagree. Christian couples often have the Apostle Paul’s words to believers in Corinth read at their wedding: “Love bears all things, believes all things, hopes all things, endures all things” (1 Cor. 13:7). But the sentiment is not limited to romantic love.

Do we have a generosity of spirit when evaluating others? Simply because we don’t like someone’s politics or policies, must we nitpick and find fault? Henry Wadsworth Longfellow astutely observed, “We judge ourselves by what we feel capable of doing, while others judge us by what we have already done.” 

Scripture provides lots of evidence and encouragement for optimism (Jeremiah 29:11, Romans 15:13) and passages that will resonate with a more pessimistic posture (Psalms, Lamentations). Yet, ideally, a balanced worldview appreciates that both are realities to grapple with and work through on a daily basis. 

Dr. Tackett correctly suggested, “The battle we are in today is not primarily political or social — it is a battle of worldviews.” That struggle is not just existential but also very personal and requires us to examine our own hearts as we navigate the complexities of culture.

Written by Paul Batura · Categorized: Culture · Tagged: social media

Jan 08 2026

X’s ‘Grok’ Generates Pornographic Images of Real People on Demand

A damaging new editing feature allows people on X (formerly Twitter) to generate sexually explicit images and videos of real people using the platform’s built-in AI chatbot, Grok.

“Grok Imagine,” which the bot’s parent company, xAI, rolled out in late November, enables Grok to manipulate photos and videos. Users can request that Grok alter photos and videos posted to X in the post’s comment section.

xAI owner Elon Musk promoted “Grok Imagine” on Christmas Eve. The platform was subsequently flooded with fake images of real people stripped naked or performing simulated sex acts. On at least two occasions, Grok produced sexual photos of children.

Samantha Smith was one of the first women victimized by “Grok Imagine.” The devoted Catholic described her experience in a piece for the Catholic Herald:

My clothes were digitally removed. My face was plastered into sexual situations I had no control over and no desire to be involved in. I remember looking at it and feeling exposed in a way that was difficult to explain to anyone who had not experienced it.

“It did not matter that the image was fake,” Smith emphasized. “The sense of violation was real.”

The disastrous fallout of “Grok Imagine” is a predictable consequence of Grok’s design.

xAI spent much of last year training Grok to perform some sexual functions by feeding it explicit internet content. The company introduced female Grok avatars capable of undressing, trained Grok to hold sexually explicit conversations with users, and even allowed the bot to generate some pornographic images.

Grok is one of the only mainstream AI chatbots designed to perform sexual functions. Most competitors refuse all sexual requests, because it’s far easier to train a chatbot to reject them wholesale than to teach it which requests are illegal.

When xAI started feeding Grok pornographic internet content, it inevitably exposed the bot to illegal content like child sexual abuse material (CSAM).

By September 2025, Grok had already generated sexual images of children.

“This was an entirely predictable and avoidable atrocity,” Dani Pinter, Chief Legal Officer and Director of the Law Center at the National Center on Sexual Exploitation, wrote in a press release.

“Had X rigorously culled [CSAM and other abusive content] from its training models and then banned users requesting illegal content, this would not have happened.”

The “Grok Imagine” debacle exposes America’s lack of AI regulation.

Sharing explicit AI deepfakes is illegal under the Take It Down Act, which penalizes sharing explicit, AI-generated images of adults with up to two years in prison. Those who share explicit images of children face up to three years in jail.

The mass implementation of “Grok Imagine” on X dramatically — and rapidly — increased violations of the Take It Down Act, making it impossible for the FBI to identify and prosecute every perpetrator.

Further, no legislation or court precedent holds AI parent companies legally liable for building defective chatbots. Companies like xAI have no incentive to conduct robust safety testing or implement consumer protection protocols.  

“X’s actions are just another example of why we need safeguards for AI products,” Pinter argues. “Big Tech cannot be trusted to curb serious child exploitation issues it knows about within its own products.”

Grok’s latest shenanigans illustrate why children and teens should not use AI chatbots — particularly without adult supervision. “Grok Imagine” also makes X more unsafe for children, who could easily stumble on one of the thousands of deepfakes plaguing the platform.

Widespread pornographic deepfakes could soon infect other social media platforms. The National Center for Missing and Exploited Children (NCMEC) fielded 67,000 reports of AI-generated CSAM in 2024 — more than 14 times as many as in 2023.  

NCMEC received more than 440,000 reports of AI-generated CSAM in the first half of 2025 alone.

Parents should seriously consider the exploding prevalence of AI-generated pornography before allowing their child to use any social media platform.

Parents should carefully consider sharing their own photos online. In the age of AI, it only takes one bad actor to turn a sweet family photo into something sinister and damaging.

Additional Articles and Resources

Counseling Consultation & Referrals

Parenting Tips for Guiding Your Kids in the Digital Age

You Don’t Need ChatGPT to Raise a Child. You Need a Mom and Dad.

AI Company Releases Sexually Explicit Chatbot on App Rated Appropriate for 12 Year Olds

AI Chatbots Make It Easy for Users to Form Unhealthy Attachments

Seven New Lawsuits Against ChatGPT Parent Company Highlights Disturbing Trends

ChatGPT Parent Company Allegedly Dismantled Safety Protocols Before Teen’s Death

AI Company Rushed Safety Testing, Contributed to Teen’s Death, Parents Allege

ChatGPT ‘Coached’ 16-Yr-Old Boy to Commit Suicide, Parents Allege

AI “Bad Science” Videos Promote Conspiracy Theories for Kids – And More

Does Social Media AI Know Your Teens Better Than You Do?

AI is the Thief of Potential — A College Student’s Perspective

Written by Emily Washburn · Categorized: Culture · Tagged: AI, social media

Dec 16 2025

Senate Introduces 3 Bills Targeting Sextortion

JUMP TO…
  • Stop Sextortion Act
  • SAFE Act
  • ECCHO Act
  • What Parents Can Do

Lawmakers introduced three bills targeting child exploitation last week following a Senate hearing on sextortion.

Sextortion encompasses online blackmail schemes in which offenders manipulate people into sharing explicit images of themselves, then threaten to release the photos unless victims comply with their demands.

Perpetrators of sextortion may demand money or sexual gratification. Both forms of the crime disproportionately harm kids and teens.

Authorities believe financially-motivated sextortion, which primarily affects young men, has caused as many as 40 teen boys to take their own lives. One of the most recent victims, athlete and honor-roll student Bryce Tate, committed suicide on November 6, just three hours after a sextortionist made contact. He was 15 years old.

The National Center for Missing and Exploited Children (NCMEC) fielded more than 2,000 reports of sexually-motivated sextortion by an online terrorist group called “764” in the first nine months of 2025 — double the total number of reports received in 2024.

“This trend has led to the most egregious exploitation that NCMEC has ever seen,” executive director Lauren Coffren told the Senate Judiciary Committee at last week’s hearing on “Protecting Our Children Online Against the Evolving Offender.”

Sextortion scams increased in frequency and severity this year, Coffren and other experts testified at the hearing. But law enforcement officials rarely charge these heinous offenders appropriately.

“Right now, when we charge crimes like sextortion or [764’s crimes], across the country, we all charge them differently,” Jessica Lieber Smolar, a former federal prosecutor, explained.

“There’s no consistency that allows us to properly address the specific harm that these actors are committing.”

Senators Chuck Grassley (IA) and Dick Durbin (IL), chairman and ranking member of the Judiciary Committee, respectively, introduced three bills empowering prosecutors to fight sextortion following the hearing.

Stop Sextortion Act

The Stop Sextortion Act would amend existing laws against possessing and distributing child sexual abuse material (CSAM) to criminalize sextortion, or “threatening to distribute [CSAM] with the intent to intimidate, coerce, extort or cause substantial emotional distress to any person.”

If passed, the act would make sextortion of a minor punishable by up to ten years in prison — double the current penalty.

The “Take It Down Act,” which President Donald Trump signed into law earlier this year, made sextortion of a minor punishable by up to two and a half years in prison. Unlike the Stop Sextortion Act, however, the Take It Down Act criminalized sextortion by amending the Communications Act of 1934. It did not change laws or sentencing guidelines directly related to CSAM.

SAFE Act

The Sentencing Accountability for Exploitation (SAFE) Act would modernize the sentencing guidelines for CSAM offenses.

“The current CSAM sentencing guideline doesn’t consider modern aggravating factors, allowing some of the most nefarious child abusers to skate by with lesser sentences,” a press release about the bill explains.

The updated guideline would allow judges to impose harsher penalties based on modern indicators of a particularly dangerous offender, including:

  • Whether they belonged to an online group dedicated to CSAM.
  • Whether they concealed their identity online.
  • Whether they engaged with CSAM on multiple online platforms.
  • The length of time they engaged with CSAM.
  • The number of children they victimized.

ECCHO Act

The Ending Coercion of Children and Harm Online (ECCHO) Act would make it a crime to “coerce minors into physically harming themselves, others or animals.” It targets the type of sexually-motivated sextortion, or sadistic online exploitation, committed by offenders like 764.

The FBI describes 764 as a group of “nihilistic violent extremists” which “works to destroy civilized society through the corruption and exploitation of vulnerable populations, which often include minors.”

The group does not engage in financially-motivated sextortion. Instead, members blackmail minors into hurting themselves and others. 764 collects and circulates photos and videos of the abuse as trophies.

“The imagery, the videos, the chats that we are seeing and reading [from 764] are the most graphic that I have ever seen in my 20-year-history,” NCMEC’s Coffren told the Senate Judiciary Committee.

The ECCHO Act would make crimes like 764’s punishable by up to 30 years in prison. If a perpetrator caused a minor to kill themselves or someone else, they would face up to life in prison.

As of now, Senators Durbin and Grassley emphasize, “there is no law that explicitly prohibits the coercion of children to hurt themselves or others.”

What Parents Can Do

The Daily Citizen supports laws disincentivizing and punishing child predation. But the proposed bills are a long way from the finish line — and parents must protect their children from sextortion right now.

The most effective way to stymie online predators is to keep your child offline. Parents can delay their child’s introduction to social media by inviting other families to agree to the same boundary. Partnering with others helps parents stay strong and ensures no child feels left out.

Parents with children on social media should make their kids’ accounts private, which means strangers won’t see what they post. Parents should also take advantage of parental controls blocking messages from strangers, if available.

All parents should teach their kids basic internet safety: don’t communicate with someone you don’t know, don’t share information about your identity or location, and never take or share nude images of yourself — ever.

Sextortion poses a threat to every minor with a smartphone or unregulated access to the internet. The Daily Citizen urges parents to take immediate steps to protect their children from this devastating phenomenon.

To learn more about parental controls on social media, visit Focus on the Family’s Plugged In.

Additional Articles and Resources

Plugged In Parent’s Guide to Today’s Technology

President Donald Trump, First Lady Sign ‘Take It Down’ Act

First Lady Supports Bill Targeting Deepfakes, Sextortion and Revenge Porn

Teen Boys Falling Prey to Financial Sextortion — Here’s What Parents Can Do

Meta Takes Steps to Prevent Kids From Sexting

Instagram’s Sextortion Safety Measures — Too Little, Too Late?

‘The Tech Exit’ Helps Families Ditch Addictive Tech — For Good

Written by Emily Washburn · Categorized: Culture · Tagged: sextortion, social media


Privacy Policy and Terms of Use | © 2026 Focus on the Family. All rights reserved.
