In an important case that tests the limits of the federal shield law for social media platforms, known as Section 230 of the Communications Decency Act, the Texas Supreme Court last week ruled that civil lawsuits against Facebook from victims of sex trafficking could proceed in the state courts.

Three young women, who were 14 and 15 years old when they were first contacted on social media, brought lawsuits in the Texas courts against Facebook under a state statute prohibiting sex trafficking, which allows victims to sue anyone “who intentionally or knowingly benefits from participating in a venture that traffics another person.” They also accused the social media company of negligence in allowing the sex trafficking to occur.

The girls were all lured into prostitution by older men who used Facebook and Instagram to “friend” and then communicate with them, grooming them to be sold for sex.

Facebook attempted to have the lawsuits dismissed in the lower courts but was unsuccessful. It then asked the Texas Supreme Court to throw out the lawsuits. The justices agreed with Facebook that the plaintiffs’ negligence claims (and also a “products liability” claim) were barred by Section 230, but held that the state statutory claims could proceed.

“We do not understand section 230 to ‘create a lawless no-man’s-land on the Internet’ in which states are powerless to impose liability on websites that knowingly or intentionally participate in the evil of online human trafficking,” Justice Jimmy Blacklock wrote for the court.

“Holding internet platforms accountable for the words or actions of their users is one thing, and the federal precedent uniformly dictates that section 230 does not allow it. Holding internet platforms accountable for their own misdeeds is quite another thing. This is particularly the case for human trafficking. Congress recently amended section 230 to indicate that civil liability may be imposed on websites that violate state and federal human-trafficking laws,” the justice stated, referring to Congress’ 2018 amendment to the federal law.

The Texas decision does not mean that Facebook is liable – yet. But it does allow the lawsuits to proceed and permit the young women to present the facts of their cases – unless Facebook appeals this latest ruling to the U.S. Supreme Court before that can happen.

So how did Facebook allegedly “knowingly or intentionally” participate in the girls’ trafficking?

“According to Plaintiffs, Facebook violated this statute through such ‘acts and omissions’ as ‘knowingly facilitating the sex trafficking of [Plaintiffs]’ and ‘creat[ing] a breeding ground for sex traffickers to stalk and entrap survivors,’” the opinion stated.

But what, exactly, did Facebook do?

The plaintiffs do allege at least one thing that could be seen as “knowingly” facilitating human trafficking, i.e., that Facebook “uses the detailed information it collects and buys on its users to direct users to persons they likely want to meet” and, “[i]n doing so, . . . facilitates human trafficking by identifying potential targets, like [Plaintiffs], and connecting traffickers with those individuals.”

We’ll have to wait and see how successful the plaintiffs are at connecting those dots for a judge or jury.

Section 230 was designed to promote free speech on the internet. It works by shielding interactive computer services from liability for content placed on their sites by third parties. The provision has come under congressional scrutiny in recent years, both from lawmakers alleging a bias against conservatives and from others who want more restrictions on the speech that social media companies host.

In the case of these three young women, the alleged failure by Facebook to protect them from sex traffickers is the type of conduct for which Section 230 shields social media companies from liability, according to the Texas decision. But a “failure” to do something is not the same thing, legally speaking, as an affirmative bad act.

Why is that important?

Requiring social media companies to anticipate and protect against every conceivable bad act or harmful statement by a third party would keep many companies from even offering a platform for free speech. Thus, Section 230’s protections keep internet platform providers in business.

But “intentional” bad actions, on the other hand, ought to be punished, since they are entirely avoidable.

Human trafficking on the internet is a huge problem. Some platforms that actively host pornographic material, such as Pornhub, have come under increasing scrutiny and have been sued by individuals harmed by alleged trafficking and child pornography connected to those sites.

Whether Facebook falls into the category of “knowingly or intentionally” benefiting from trafficking remains to be proven. But this Texas decision is significant in the battle against human trafficking because it clears the way for victims to fight back, using the legal tools provided by state legislatures.

Photo from REUTERS