As Twitter Sex Trafficking Case Proceeds, Platforms Face an Impossible Dilemma
A federal judge says an anti-porn group's suit against Twitter can move forward.
A federal lawsuit accusing Twitter of sex trafficking can move forward, says U.S. Magistrate Judge Joseph C. Spero, in a decision that could portend a dangerous expansion of how courts define "sex trafficking."
The case is one of the first to invoke the controversial 2018 Allow States and Victims to Fight Online Sex Trafficking Act (FOSTA), which made it a federal crime to host digital content that facilitates prostitution. The legislation also tweaked Section 230 of the Communications Decency Act, which protects digital entities like Twitter from being held liable for most of what their users post.
The case—filed in the U.S. District Court for the Northern District of California—was brought by two teenagers and the National Center on Sexual Exploitation (NCOSE), a conservative activist group, formerly known as Morality in Media, that has also campaigned against Netflix, Amazon, and Cosmopolitan magazine. Spero's August 19 decision hints at what similar sex trafficking claims against social media companies may look like in a world where Section 230 isn't an obstacle.
It's a worrying vision.
"We're starting to see…what the actual impact of [FOSTA] is going to be, and are courts going to interpret it kind of in a more broad or a more narrow manner," says Caitlin Vogus, deputy director of the Free Expression Project at Center for Democracy & Technology. "Here is an example of a court interpreting it more broadly, and that raises a lot of concerns for the impact that it might have on platforms when they're making decisions about how to respond to all speech on their platforms going forward."
Exploitation Creep
Traditionally, the crime of sex trafficking must involve "commercial sex acts"—a.k.a. prostitution—and there must be minors involved or an element of force, threatened force, fraud, or coercion. In short, someone must pay someone else (or give them something of value) in a quid pro quo for a sex act that is either coerced or involves a minor.
In the case against Twitter, the plaintiffs suggest that soliciting a sex video from someone under age 18 amounts to sex trafficking. Unwittingly providing a platform for a third party to post or link to that video makes one part of a sex trafficking enterprise, they argue. Thus, Twitter is allegedly guilty of participating in a sex trafficking venture by temporarily and unknowingly hosting links to a pornographic video featuring two teenagers.
Several years ago, the teens—who were 13 or 14 years old at the time—recorded themselves engaging in sexual activity and used Snapchat to share these videos with a third party initially believed to be a peer. The recipient turned out to be an adult, who allegedly blackmailed one of the teens (John Doe #1) into providing additional sexual content. Doe #1 blocked the Snapchat predator, and "communications ceased."
The perpetrator could have been held individually responsible, since blackmail and soliciting obscenity from minors are both crimes. Instead, NCOSE is going after a bigger, richer, and much more high-profile target—albeit one much less culpable for criminal activity.
At some point, a compilation of the videos was posted elsewhere online. In 2019, Doe #1 discovered that two Twitter accounts had shared links to it. Doe—and then his mom, separately—reported these to Twitter and were told Twitter would look into it. Twitter also advised that they report it to the National Center for Missing and Exploited Children and to law enforcement, their complaint says. Later, Twitter responded to one of Doe's messages saying "no action will be taken at this time. … If the content is hosted on a third-party website, you'll need to contact that website's support team to report it." Meanwhile, Doe's mom had reached out to a Department of Homeland Security agent, who reached out to Twitter.
Nine days after Doe #1 first made contact, the tweets were deleted and the accounts that shared them suspended.
About a year later, Doe sued Twitter, accusing it of direct sex trafficking, benefiting from participation in a sex trafficking venture, receipt and distribution of child pornography, negligence, and violation of California's product liability law.
What happened was clearly wrong, but it's hard to see how it qualifies as sex trafficking. Yes, minors were involved, but no one paid them for sex, nor did they (or some third-party trafficker) get something of value for sending or posting the videos. But it's crucial to NCOSE's case that what happened be labeled as illegal commercial sexual activity and not some other criminal offense—otherwise, FOSTA wouldn't apply. And if FOSTA doesn't apply, then Section 230 does.
Section 230 says that for certain liability purposes—civil lawsuits, state (but not federal) criminal charges—computer service providers shouldn't be treated as the speaker or publisher of user-generated content. If I defame you on Facebook, it's me, not Facebook, who's legally culpable. If I meet a Match.com date who assaults me, the assailant is guilty, not Match. And so on. But FOSTA exempted many claims involving illegal commercial sex from this rubric. Now, if someone deemed guilty of sex trafficking is found to have used a platform to meet or market a victim, that platform isn't automatically shielded from various legal liabilities.
"Due largely to FOSTA, civil sex trafficking claims can now be brought against online platforms that had no direct involvement with the sex trafficking venture or the victims," says First Amendment attorney Lawrence Walters, head of the Walters Law Group.
Twitter's Defense
Twitter argues that FOSTA's exception to Section 230 protection was meant to apply only to "openly malicious actors" who "were deliberately and knowingly assisting and profiting from reprehensible crimes." Congress didn't mean "for online platforms like Twitter that proactively act against such activity to be sued for their inadvertent failure to remove content," it suggests in a motion to dismiss.
The Does' lawsuit "lacks any facts showing that Twitter affirmatively participated in any kind of venture with the Perpetrators, let alone a sex trafficking venture," the company argues. The suit doesn't put forth "any facts establishing that Twitter knew that Plaintiffs were victims of sex trafficking or that the Videos were evidence of this crime," nor does it "allege any connection between the Perpetrators and Twitter or that Twitter received any benefits because of the Videos."
What's more, "Twitter did remove the Videos and suspend the accounts that had posted them." It's just that "given the sheer volume of Tweets posted every day on Twitter's platform (hundreds of millions of Tweets posted by over 190 million daily users), it is simply not possible for Twitter—or the individuals who enforce its Rules and policies—to find and remove all offending content immediately or accurately in all cases."
Evidence of things like affirmative participation and malicious intent would make sense as requirements for sex trafficking liability. But contrary to Twitter's claims, that's not what FOSTA's creators and supporters were going for.
A pre-FOSTA law—the 2015 SAVE Act—had already made knowingly advertising a sex trafficking victim a crime. It went unused. What many people pushing FOSTA as a follow-up wanted was a law that could target platforms for incidentally hosting content that facilitated certain harms or sex work more broadly.
The number-one argument for FOSTA was that it would bring down Backpage.com—a platform which did not affirmatively and maliciously participate in sex trafficking and, in fact, took numerous steps to prevent advertisements by minors. The Department of Justice commended the company's CEO for these efforts, and federal prosecutors stressed in internal memos that the website was doing more than any other adult ad platform to combat child sex trafficking while still permitting posts by adult sex workers. (This is also Twitter's policy.)
FOSTA was passed to punish sites like Backpage—and, thus, sites like Twitter. While lawmakers liked to talk a good game about it only targeting malicious actors, FOSTA was always constructed to allow its use against entities that tried to prevent illegal content but simply didn't do it perfectly (which basically means any platform that features substantial amounts of user-posted content). And the goal of NCOSE and many FOSTA supporters was always to make anything less than the immediate takedown of any sexualized content a risky proposition.
Twitter's defense seems on more solid ground when it points out that federal laws against sex trafficking simply don't seem to apply here. The law (18 U.S.C. § 1591) requires a guilty party to take action toward a person (or to benefit from participation in a venture which did so) while knowing or—except in the case of advertising—acting "in reckless disregard of the fact" that force, threats of force, fraud, or coercion "will be used to cause the person to engage in a commercial sex act, or that the person has not attained the age of 18 years and will be caused to engage in a commercial sex act."
No one in this case provided, obtained, or maintained a person, merely videos. Arguing that a video counts as a person is a "novel" and unsupported interpretation, Twitter argues.
Furthermore, no one engaged in prostitution or anything that might be deemed commercial sex, nor would these videos cause them to do so in the future. Twitter couldn't have facilitated a commercial sex act, because there was none.
The only allegation of financial benefit is the suit's claim that Twitter attracted ad revenue and users by featuring minors having sex—which would require believing that mainstream advertisers really want their material alongside child porn, and that Twitter caters to this. But even if you accept that stretch of logic, there was no underlying sex trafficking from which Twitter could benefit.
How Much Web Platforms Must Know
Alas, the court was not entirely persuaded by such arguments. While agreeing with Twitter that the company did not directly engage in sex trafficking, Judge Spero thinks the plaintiffs have plausibly alleged both that sex trafficking took place and that Twitter benefited from it.
Refreshingly, the judge rejects the direct sex trafficking claim, finding persuasive Twitter's argument that you can't be guilty of sex trafficking a video. "The more difficult issue is…whether Twitter is immune from liability under Section 230 on [a] claim" of beneficiary liability, he writes. This requires determining what level of knowledge of wrongdoing—called mens rea—Twitter must have to be liable.
"There was a lot of debate over the mens rea requirement" in FOSTA, says Jennifer Huddleston, director of technology and innovation policy at the think tank American Action Forum. "One of the concerns that was expressed was that [FOSTA could] be used to reach companies that might've been trying to prevent this and had something fall through."
In the 2020 case Doe v. Kik Interactive, Inc., brought against the messaging app Kik, the U.S. District Court for the Southern District of Florida held that for digital platforms, "a finding of actual knowledge and overt participation in a venture of sexual trafficking is required to defeat [Section 230] immunity."
But Judge Spero disagrees, concluding instead that civil claims are "not subject to the more stringent requirements that apply to criminal violations" and, thus, no actual knowledge is required of Twitter and platforms like it.
"Ultimately, this issue may need to be settled by the U.S. Supreme Court," says Walters. "But for now the scope of potential liability depends on whether a court will follow the Twitter decision or the Kik decision."
As to whether "Twitter participated in a 'venture'" and whether "Twitter received a benefit from the sex trafficking venture and that the benefit motivated its conduct," the judge concludes that the Does and NCOSE have alleged enough facts to suggest the answer to these questions could be yes.
Spero makes some questionable determinations to get there. For instance, he suggests that a large number of retweets of the tweet sharing the video somehow conferred a benefit on Twitter, when by all indications Twitter wants to be rid of illegal content.
Spero says it's hard to square Twitter's claim that it had no knowledge of sex trafficking with "allegations that they alerted Twitter that the Videos were created under threat." But teens being bullied into sexting doesn't automatically equal criminal sex trafficking.
Perhaps most amazingly, the judge lends credence to the suit's claim that the word "twinks"—used in one tweet sharing Doe's video—refers to children, and therefore should have alerted Twitter that the content featured minors. Twink is commonly used in the gay community to refer to young adult men of a certain build and look. (This is a common problem in studies, hearings, and cases about sex trafficking. With Backpage, for instance, lawmakers time and again insisted that words commonly used in ads by adult sex workers were actually code words for minors, then condemned Backpage based on their own faulty understanding of a community's lingo.)
Ultimately, the judge rejected most of the lawsuit's claims against Twitter but will let one claim move forward.
Moving Forward
Spero's decision doesn't necessarily mean he agrees that Twitter is guilty of benefiting from sex trafficking, just that the allegations are plausible enough that the claim can't be dismissed outright. It's now up to NCOSE lawyers to prove that Twitter purposely "permits large amounts of human trafficking and commercial sexual exploitation material on its platform" and that it attracts users and ad revenue because of it.
But the fact that the case is going forward at all demonstrates how FOSTA has tipped the scales.
"Twitter must now bear the burdens of discovery, pretrial preparation, and trial since FOSTA eliminated the immunity that online service providers previously relied upon to quickly dismiss these claims," says Walters.
"If a type of speech does not get Section 230 protection, it does not automatically mean that the company will be found liable," says Huddleston. But "Section 230 is incredibly valuable because the farther along in litigation a case goes, even if you're ultimately vindicated in the courts, the more costly the litigation gets. And particularly for startups and midsize companies, that cost and the incentives involved can have an impact on decisions to carry user-generated content."
By Spero's logic, "pretty much any site that's being used by sex traffickers is going to be at risk," notes Vogus. "The way the court interpreted what it means to be in a common venture with sex traffickers" means people bringing a lawsuit "just have to show…that there is sex trafficking happening on the platform."
And sex trafficking has been stretched to encompass all manner of things involving sexual activity, including consensual sexual activity between adults and—as in this ruling—videos minors take of themselves.
Platforms looking at this decision might reasonably decide to take down any content with sexual overtones that gets reported.
"If somebody reports content to you that is actually something like a sex education video, or adult consensual sexual content, that you might say it's too risky for me to leave this up on the chance that it could be actually content that violates the statute," says Vogus. "They'll just err on the side of taking all the content down. And so a lot of content that is not illegal, that is perfectly legitimate, that might be even in the public interest to remain online, will get swept down instead because there'll be a fear of civil liability."
For sex workers, such results could be especially devastating. "Twitter is truly the last resource that most sex workers have. It's how we stay connected to our fan base and without it, it would push us further into the outskirts of the internet," sex worker Envy Us told Vice recently. "Which is where organizations like NCOSE want us."
Platforms could also go the other way, considering that what may damn Twitter is its failure to take down the content once it was reported.
"Another direction that maybe a platform looking at this could go would be to say, OK, to reduce my risk of liability, what I'm actually going to do is make it really difficult for people to report content to me," says Vogus. "Because that way we can kind of try to purposely blind ourselves to these things and then no court would be able to say that we had knowledge of it."
Bad actions did occur here. But we already had laws against those acts. This new approach means that any platform where bad actors post—no matter how briefly and furtively—could be liable for heinous crimes they know nothing about.
"There have been a lot of concerns about whether or not [FOSTA is] actually targeting the bad actors," says Huddleston. "Even if you really want to go after the underlying crime and after the bad actors, we've seen a lot of cases brought against other parties."
Critics of the judge's decision and the NCOSE lawsuit are not arguing that Twitter should host links to teens having sex, or that such videos should be freely available online. The question is whether it makes sense to treat platforms that inadvertently and temporarily facilitate access to such content as sex traffickers. Is that proportionate to their culpability? Will it make platforms more or less likely to respond appropriately to such content in the future? Will it make finding and punishing the perpetrators of such content easier or more difficult? What unintended consequences might such a system produce?