The facts in the Texas case are horrendous.
In 2012, a male sexual predator used Facebook to lure a 15-year-old girl to a meeting where she was repeatedly raped, beaten, and then trafficked for sex. After she was rescued, she sued Facebook under the pseudonym “Jane Doe” for failing to take reasonable steps to keep sexual predators off its platform.
Some of her claims were dismissed by Texas courts, which ruled that Section 230 of the Communications Decency Act shields Facebook from liability for posts from third parties. She appealed that partial dismissal to the U.S. Supreme Court, but the justices refused to take her case.
However, Justice Clarence Thomas wrote separately to say that the high court may – and ought to – take up a similar case in the future to address the scope of social media immunity included in Section 230, unless Congress acts first.
Here’s the issue: Should Facebook and other social media companies be legally responsible for harm caused to their users by other users if the companies could reasonably take steps to prevent that harm?
Justice Thomas seems to think so.
“It is hard to see why the protection §230(c)(1) grants publishers against being held strictly liable for third parties’ content should protect Facebook from liability for its own ‘acts and omissions,’” Thomas wrote.
The justice’s statement no doubt sent shockwaves through Facebook’s corporate offices, as well as those of other social media companies, which view Section 230 as a complete defense to any and all harms resulting from posts on their platforms.
“I … concur in the Court’s denial of certiorari,” Thomas continued. “We should, however, address the proper scope of immunity under §230 in an appropriate case.”
That’s an invitation that will not go unnoticed.
So why didn’t the justices take Jane Doe’s case against Facebook?
According to Justice Thomas, the high court declined to accept the case because it was not yet ready for review: the lawsuit is still underway in the Texas state court system and cannot be appealed until it reaches a final judgment. “The litigation is not ‘final’,” Thomas wrote.
Section 230, passed in 1996, is designed to promote the development of the internet and interactive computer services. One of the ways the law accomplishes that is to not hold social media companies liable for the content published by others.
“No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider,” Section 230 (c)(1) reads.
Justice Thomas, however, questions the blanket immunity granted by the lower courts to social media companies under Section 230. He thinks the text of the statute doesn’t justify that.
“As I have explained, the arguments in favor of broad immunity under §230 rest largely on ‘policy and purpose,’ not on the statute’s plain text. [citation omitted]. Here, the Texas Supreme Court recognized that ‘[t]he United States Supreme Court—or better yet, Congress—may soon resolve the burgeoning debate about whether the federal courts have thus far correctly interpreted section 230.’ [citation omitted].
“Assuming Congress does not step in to clarify §230’s scope, we should do so in an appropriate case.”
Perhaps Jane Doe’s lawsuit will return to the Supreme Court after it reaches a final resolution in the Texas courts. Thomas homed in on her allegations concerning Facebook’s actions or lack thereof.
“Here, the Texas Supreme Court afforded publisher immunity even though Facebook allegedly ‘knows its system facilitates human traffickers in identifying and cultivating victims,’ but has nonetheless ‘failed to take any reasonable steps to mitigate the use of Facebook by human traffickers’ because doing so would cost the company users—and the advertising revenue those users generate,” Thomas wrote.
The bottom line for Thomas, and perhaps for other justices on the court, is that social media companies ought not be able to hide behind Section 230 and ignore potential harms they know about and could reasonably prevent.
There is some good news to report in the battle against sex trafficking on social media. Congress stepped up in 2018 to pass Public Law 115-164, the Stop Enabling Sex Traffickers Act, which goes a long way toward stopping the use of social media to further sex trafficking. But that law does not cover the claims Jane Doe brought against Facebook that were dismissed by the Texas courts.
Facebook has shown itself to be remarkably adept at censoring material on its platform that doesn’t meet its woke standards for speech or subject matter. But if it refuses to address real problems like sex trafficking, then it is past time for Congress or the Supreme Court to step in and require it to do so.