Daily Citizen

Dec 13 2024

TikTok Scrambles Amid Looming Ban

TikTok asked the U.S. Court of Appeals for the D.C. Circuit this week to postpone the enforcement of a law that could ban the social media platform from America.

The Protecting Americans from Foreign Adversary Controlled Applications Act (the Act) gives TikTok until January 19 to cut ties with its Chinese parent company, ByteDance. The law, which President Biden signed in April, addresses years-long, bipartisan concerns that the Chinese government uses TikTok to gather data on American citizens.

TikTok spent months trying to get the law thrown out as a free speech violation. The D.C. Circuit effectively dashed those hopes last week after unanimously finding the Act constitutional.

“The First Amendment exists to protect free speech in the United States,” Senior Circuit Judge Douglas Ginsburg wrote in the Court’s opinion, continuing:

Here the Government acted solely to protect that freedom from a foreign adversary nation and to limit that adversary’s ability to gather data on people in the United States.

Under strict scrutiny, laws that infringe on free speech rights must 1) serve a compelling government interest and 2) be narrowly tailored to serve that interest. Though the Court acknowledged a ban would infringe on the free expression of some 170 million users, it concluded the Act met this two-pronged test.

In his concurring opinion, Judge Sri Srinivasan wrote:

To give effect to [the competing interests of free speech and national security], Congress chose divestment as a means of paring away the PRC’s [People’s Republic of China] control — and thus containing the security threat — while maintaining the app and its algorithm for American users.
Congress judged it necessary to assume the risk [of banning TikTok] given the grave national-security threats it perceived.

As Srinivasan notes, the Court’s opinion reflects substantial evidence that TikTok funnels important user data to one of America’s greatest foreign adversaries.

Assistant Director of National Intelligence Casey Blackburn calls China “the most active, persistent cyber espionage threat to the U.S. government, private-sector and critical networks.” Part of these threats include China’s “extensive, years-long efforts to accumulate structured datasets, in particular on U.S. persons, to support its intelligence and counterintelligence operations.”

Chinese law requires all Chinese-owned companies — including ByteDance — to make their data available to the government. That means China has access to the names, ages, emails, phone numbers and, often, contact lists U.S. users divulge to TikTok when they sign up.

The PRC can also access TikTok’s users’ in-app messages, in-app usage patterns, IP addresses, keystroke patterns, browsing and search history, location data and biometric identifiers like face- and voiceprints — all metrics TikTok tracks, according to its “privacy policy.”

TikTok claims it retains American data on local servers the Chinese government can’t access. Public reporting and evidence presented by the Department of Justice suggest otherwise.

Despite TikTok’s assurances, audio recordings of ByteDance meetings indicate the company “retained considerable control and influence” over U.S. users’ data.

The Congressional investigative committee that recommended passing the Act found:

  • “Public reporting suggested that TikTok had stored sensitive information about U.S. persons (including ‘Social Security numbers and tax identifications’) on servers in China”;
  • “TikTok’s ‘China-based employees’ had ‘repeatedly accessed non-public data about U.S. TikTok users’”;
  • “ByteDance employees had ‘accessed TikTok user data and IP addresses to monitor the physical locations of specific U.S. citizens.’”

This is not the first time TikTok has been accused of mishandling user data. In October, thirteen states and the District of Columbia filed lawsuits accusing TikTok, in part, of illegally collecting children’s data.

“TikTok actively collects and monetizes data on users under 13 years old, in violation of Children’s Online Privacy Protection Act (COPPA), and does so without parental consent,” New York Attorney General Letitia James wrote in a press release detailing the allegations.

Evidence uncovered in these cases suggests a significant portion of TikTok’s American users are minors. An internal study from TikTok found as many as 95% of American smartphone users under 17 years old use the app. An estimated 35% of TikTok’s American ad revenue comes from children and teens.

TikTok hopes the D.C. Circuit will grant it enough time to appeal to the Supreme Court — or the beneficence of the incoming Trump administration — before January 19.

Though President Trump openly acknowledged the risk TikTok posed to national security as early as 2019, some allies predict he will help facilitate a deal between TikTok and a new American owner. The Chinese government says it will oppose such a deal by preventing ByteDance from relinquishing TikTok and forbidding the sale of its money-making algorithm.

The Department of Justice officially opposed TikTok’s petition for a preliminary injunction on Wednesday. As of December 13, the Court has not yet responded.

Parents might not be able to control whether TikTok will be banned in America, but you can take steps to keep your kids safe online. To learn more about social media and its effects on children, check out the articles below.

Additional Articles and Resources

“Plugged In Parent’s Guide to Today’s Technology” equips parents to navigate the ever-shifting tech realm.

TikTok Dangerous for Minors — Leaked Docs Show Company Refuses to Protect Kids

Teen Boys Falling Prey to Financial Sextortion — Here’s What Parents Can Do

Instagram’s Sextortion Safety Measures — Too Little, Too Late?

Kid’s Online Safety Act — What It Is and Why It’s a Big Deal

Instagram Content Restrictions Don’t Work, Tests Show

Zuckerberg Implicated in Meta’s Failures to Protect Children

Surgeon General Recommends Warning on Social Media Platforms

‘The Dirty Dozen List’ — Corporations Enable and Profit from Sexual Exploitation

Four Ways to Protect Your Kids from Bad Tech, From Social Psychologist Jonathan Haidt

Parent-Run Groups Help Stop Childhood Smartphone Use

Survey Finds Teens Use Social Media More Than Four Hours Per Day — Here’s What Parents Can Do

Written by Emily Washburn · Categorized: Culture · Tagged: social media, TikTok

Dec 11 2024

Lorenz Feels ‘Joy’ at UnitedHealthcare CEO’s Execution

This is the second in a two-part series examining America’s reaction to the murder of UnitedHealthcare CEO Brian Thompson. Part 1 introduces the NYPD’s suspect and his alleged motives. Part 2 explores Americans’ celebration of the violence.

Six days ago, a masked gunman shot UnitedHealthcare CEO Brian Thompson from behind, leaving him dead on a busy Manhattan street. Police believe the alleged killer, 26-year-old Luigi Mangione, viewed the murder as recompense for the corruption and abuse perpetrated by health insurance companies.

The killer could be dismissed as a lone terrorist if not for the public’s supportive reaction. A disturbing number of people have praised the murderer for his sense of humor, his looks, and the person he chose to target.

Taylor Lorenz, an ex-Washington Post journalist and known controversy magnet, has become the unofficial spokesperson for the shooter supporters. Immediately after the murder, she penned a piece titled, “Why ‘we’ want insurance executives dead.”

She elaborates:

No, that does not mean people should murder them. But if you’ve watched a loved one suffer and die from insurance denial, it’s normal to wish the people responsible would suffer the same fate.

I’ll admit, in my angriest moments, I have wished harm on my enemies. I have wished for aggressive drivers to get into fender benders. I have wished to kick a mean roommate or two. I dislike and repent of these thoughts, but acknowledge they occur.

But never have I ever wished for one of my enemies to die, watched them be executed in real time, and then celebrated their demise on the internet. If that’s normal, consign me to the loony bin.

And make no mistake, Lorenz and her compatriots are partying it up. Below is a short transcript of her interview with media personality Piers Morgan. She begins by describing her reaction to the murder:

Lorenz: I do believe in the sanctity of life, and I think that’s why I felt, along with so many other Americans, joy, you know, unfortunately, because…
Morgan: Joy? Seriously? Joy at a man’s execution?
Lorenz: I mean, maybe not joy, but certainly not empathy. Because, again, this is a man responsible for…
Morgan: How can this make you joyful? This guy is a husband, he’s a father, he’s been gunned down in the middle of Manhattan. How does that make you joyful?
Lorenz: So are tens of thousands of innocent Americans who died because greedy health insurance executives like this one push policies of denying care to the most vulnerable people. And I am one of the many millions of Americans who watch people that I care about suffer and in some cases die because of lack of healthcare.

Then Morgan asks the obvious question:

So should they all be killed, then? These healthcare executives? Would that make you even more joyful?

Lorenz looks taken aback before beginning to laugh. The interview continues:

Lorenz: Uh, no, that would not.
Morgan: Well, why not? Why are you laughing? A man’s been murdered in the street. I don’t find it funny at all.
Lorenz: I don’t find it funny that tens of thousands of Americans die every year because they are denied lifesaving healthcare from people like this CEO. Now, I want to fix this system. You’re right, we shouldn’t be going around shooting each other with vigilante justice, no. I think it is a good thing this murder has led to the media elites and politicians paying attention to this issue for the first time.

She concludes:

You mentioned that you couldn’t understand why someone would feel this reaction when you watch a CEO die. It’s because you have not dealt, it sounds like, with the American healthcare system in the way that millions of other Americans have.

Here, Lorenz assumes that everyone who experiences failures of the healthcare system feels joy when insurance company CEOs die. In articulating this assumption, Lorenz not only presumes to speak on behalf of “millions of Americans,” but suggests they are comfortable with — and even support — the vigilante justice she half-heartedly denounces.

But that’s the problem with giving citizens the authority to take life as they see fit — it’s always “just” until you’re on the business end of the gun.

Lorenz and other X users frequently cite the “millions of people” Thompson wronged as justification for his execution. One user commented:

No matter the shooter’s political views the act itself was not left, nor right. It was populist. Everyone, no matter your background, has a horror story with insurance. Whether it’s health, home or auto, they all make their money the same way. By f******* you. It’s quite possibly the most politically neutral assassination of all time.

But the number of people who (allegedly) want someone dead doesn’t justify extra-judicial executions. That’s just vigilante justice cosigned by a mob — and mobs are notoriously bad at rendering justice.

America’s Founding Fathers sought to inoculate America against such capricious judgments. The Fifth Amendment reads, in part:

No person shall be held to answer for a capital, or otherwise infamous crime, unless on a presentment or indictment of a Grand Jury … nor shall any person be … deprived of life, liberty or property without due process of law.

In his first inaugural address, President Thomas Jefferson expanded on “this sacred principle:”

Though the will of the majority is in all cases to prevail, that will to be rightful must be reasonable; that the minority possess their equal rights, which equal law must protect, and to violate would be oppression.

Brian Thompson’s murder is a sobering example of what happens when the Fifth Amendment fails. It’s a miscarriage of justice that no one should celebrate — if not for moral reasons, then for sheer self-preservation.

Celebrating vigilante justice makes it more likely to occur. No one is safe from a jury, judge and executioner with their own definition of justice. Please consider praying for Thompson’s family and for justice to be done in his case.

Additional Articles and Resources

Luigi Mangione: Alleged Killer Apprehended with All-Too-Familiar Manifesto

A Year’s Slide into Antisemitism, Examined

Manhood is on Trial in the Daniel Penny Case

Indoctrination Station: New York State Education Department Pushes Critical Theory on Students

Written by Emily Washburn · Categorized: Culture · Tagged: Mangione, social media

Nov 01 2024

TikTok Dangerous for Minors — Leaked Docs Show Company Refuses to Protect Kids

Leaked documents from inside TikTok suggest the social media giant intentionally endangers kids to benefit its bottom line.

A group of 13 states and the District of Columbia filed individual suits against the Chinese-owned company earlier this month, alleging:

TikTok’s underlying business model focuses on maximizing young users’ time on the platform so the company can boost revenue from selling targeted ads.

The suits claim TikTok’s addictive features, like autoplay and 24-hour push notifications, as well as marketing strategies, like promoting beauty filters, harm children’s mental and physical health. Evidence uncovered in the states’ two-year investigation into the platform suggests the company knew about these problems and allowed them to continue.

TikTok’s internal documents and communications were supposed to remain hidden from the public. However, faulty redactions in South Carolina’s and Kentucky’s cases prematurely revealed some clandestine details.

It doesn’t look good for TikTok.

Users 12-and-Up

On the Apple App Store, TikTok claims its content is suitable for children ages 12 and up. But Apple challenged that age rating in 2022, according to evidence in South Carolina’s case against TikTok.  

The Washington Post reports:

A team at Apple reviewing TikTok’s rating found that the app features ‘frequent or intense mature or suggestive content’ and pressed the platform to raise its recommended age to 17 and over.

But TikTok refused to give up its kid-friendly rating. Instead, it claimed it took “aggressive strategies” to filter and remove the kinds of content Apple flagged. South Carolina’s suit says it didn’t work. TikTok still shows children inappropriate and vulgar content, the case alleges, but the company doesn’t want to cop to it.

Big Business

New York Attorney General Letitia James, who helped lead the charge against TikTok, says TikTok has financial incentives to keep its age rating low. A press release explaining the case claims approximately 35% of TikTok’s American ad revenue comes from children and teens.

Leaked information from Kentucky’s case, reviewed and published by NPR, supports James’ assertion.

TikTok knows many of its most dedicated users are minors. One internal study of its users found 95% of American smartphone users under 17 years old use the app. Another study of TikTok’s engagement statistics notes, “As expected, across most engagement metrics, the younger the user, the better the performance.”

The company knows it must keep young users engaged. In one employee chat regarding a TikTok tool meant to decrease the time minors spend on the app, a project manager admitted, “Our goal is not to reduce the time spent [on TikTok].” Another employee added, “[The goal is] to contribute to daily active users and retention [of other users].”

Flimsy Features

The aforementioned tool allows parents to impose an hour-long TikTok time limit — a feature that would impact TikTok’s goal if it worked. But it doesn’t. TikTok estimates it only reduces usage time by an average of 1.5 minutes.

Another internal document suggests the screen-time limit wasn’t built to work well. TikTok only evaluated the feature’s success by how it “improved public trust in the TikTok platform via media coverage.”

TikTok apparently instructs its content moderators to do a similarly shoddy job. A document referring to “younger users/U13” tells moderators to leave underage users’ accounts alone unless the account explicitly identifies the user as under 13 years old.

Addiction, Inc.

TikTok doesn’t just turn a blind eye to minors on its platform — it recruits them.

The platform discovered users must watch 260 videos to form a TikTok habit. Kentucky’s lawsuit elaborates:

While this may seem substantial, TikTok videos can be as short as 8 seconds and are played for viewers in rapid-fire succession, automatically. Thus, in under 35 minutes, an average user is likely to become addicted to the platform.
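The lawsuit’s arithmetic is easy to verify. As a rough check, using the two figures cited above — an 8-second minimum video length and the 260-video habit-formation threshold:

```python
# Rough check of the Kentucky lawsuit's math: 260 videos watched
# back-to-back at the minimum length of 8 seconds each.
videos_to_form_habit = 260
min_video_seconds = 8

total_seconds = videos_to_form_habit * min_video_seconds  # 2,080 seconds
total_minutes = total_seconds / 60

print(f"{total_minutes:.1f} minutes")  # about 34.7 minutes -- "under 35"
```

In practice most videos run longer than 8 seconds, so this is the fastest plausible path to the 260-video mark, not a typical session.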

TikTok knows its most engaging features cause young people to compulsively open its app. It also knows what problems these compulsions cause. The company’s internal research concludes:

Compulsive usage correlates with a slew of negative mental health effects like loss of analytical skills, memory formation, contextual thinking, conversational depth, empathy and increased anxiety. … [It] also interferes with essential responsibilities like sufficient sleep, work/school responsibilities and connecting with loved ones.

Trapped in PainHub

One of the platform’s most addictive features is its algorithm, which learns and feeds users the kinds of videos they like. But ingesting too much of the same content can quickly skew the way users view the world.

A good example of this comes from one of TikTok’s employees, who participated in an internal study of “filter bubbles” — the kind of homogenous content filtering that occurs when social media algorithms determine what posts users engage with.

This employee wrote:

After following several ‘painhub’ and ‘sadnotes’ accounts, it took me 20 minutes to drop into a ‘negative’ filter bubble. The intensive density of negative content make me lower down mood (sic) and increase my sadness feelings (sic) though I am in a high spirit in my recent life.

The employee is referencing TikTok accounts featuring exclusively sad stories and comments from people in pain. Ostensibly designed to support those going through a hard time, an excess of this kind of content makes the world seem like a perpetually dark place.

To avoid filter bubbles, TikTok claims to offer a “Refresh” feature that resets the algorithm. James’ press release says this feature “does not work as TikTok claims.”

Online Strip Club

So TikTok knows minors generate big profits, creates addictive features to keep them on the app, turns a blind eye to underage users and allows them to binge on inappropriate content. Yikes.

Unfortunately for everyone involved, it gets worse.

One incident documented in Kentucky’s case involves TikTok Live, a feature that allows users to broadcast live videos of themselves. In 2022, TikTok discovered “a significant” number of adults had started paying minors to strip on live video.

You read that right. In one month alone, adults sent more than 1 million ‘gifts’ — real money converted into digital currency — to kids for ‘transactional’ behavior.

D.C. Attorney General Brian Schwalb calls this an “unlicensed payment system” that incentivizes minors to prostitute themselves. He told the Post:

TikTok has designed its money transmission business to lure children in, using childlike cartoons and emojis to make it look like the children are playing games, when in fact they are being exploited financially.

In contrast, a TikTok official commented to coworkers:

One of our key discoveries during this project that has turned into a major challenge with Live business is that the content that gets the highest engagement may not be the content we want on our platform.

Why It Matters

The more we learn about social media, the more obvious it seems that children shouldn’t get within ten feet of it. To learn more about what you can do to keep your child screen-free, take a look at the articles linked below.

At least commit to enforcing strong boundaries around technology. Some ideas include requiring your children to scroll social media in public areas, regularly checking their social media feeds for inappropriate content and educating them about sextortion and other predatory online behaviors.

Remember that companies like TikTok and Meta will not reliably protect your child from exploitation and inappropriate content. It’s up to parents to keep kids safe online.

Additional Articles and Resources

Four Ways to Protect Your Kids from Bad Tech, From Social Psychologist Jonathan Haidt

Parent-Run Groups Help Stop Childhood Smartphone Use

Survey Finds Teens Use Social Media More Than Four Hours Per Day — Here’s What Parents Can Do

Teen Boys Falling Prey to Financial Sextortion — Here’s What Parents Can Do

Instagram’s Sextortion Safety Measures — Too Little, Too Late?

Kid’s Online Safety Act — What It Is and Why It’s a Big Deal

Instagram Content Restrictions Don’t Work, Tests Show

Zuckerberg Implicated in Meta’s Failures to Protect Children

Surgeon General Recommends Warning on Social Media Platforms

‘The Dirty Dozen List’ — Corporations Enable and Profit from Sexual Exploitation

Written by Emily Washburn · Categorized: Culture · Tagged: parenting, social media, TikTok
