Case 3:21-cv-00485-JCS Document 69 Filed 08/19/21 Page 1 of 56
UNITED STATES DISTRICT COURT
NORTHERN DISTRICT OF CALIFORNIA

JOHN DOE, et al.,
    Plaintiffs,
v.
TWITTER, INC.,
    Defendant.

Case No. 21-cv-00485-JCS

ORDER GRANTING IN PART AND DENYING IN PART MOTION TO DISMISS FIRST AMENDED COMPLAINT

Re: Dkt. No. 48

I. INTRODUCTION
Plaintiffs John Doe #1 and John Doe #2 allege that when they were thirteen years old they were solicited and recruited for sex trafficking and manipulated into providing to a third-party sex trafficker pornographic videos (“the Videos”) of themselves through the social media platform Snapchat. A few years later, when Plaintiffs were still in high school, links to the Videos were posted on Twitter. Plaintiffs allege that when they learned of the posts, they informed law enforcement and urgently requested that Twitter remove them, but Twitter initially refused to do so, allowing the posts to remain on Twitter, where they accrued more than 167,000 views and 2,223 retweets. According to Plaintiffs, it was not until the mother of one of the boys contacted an agent of the Department of Homeland Security, who initiated contact with Twitter and requested the removal of the material, that Twitter finally took down the posts, nine days later.

Plaintiffs assert state and federal claims against Twitter based on its alleged involvement in and/or enabling of sex trafficking and the distribution of the child pornography containing their images. Twitter, however, contends that even after Congress’s enactment of the Fight Online Sex Trafficking Act and Stop Enabling Sex Traffickers Act in 2018, the conduct alleged by Plaintiffs is shielded from liability under Section 230 of the Communications Decency Act (“CDA”). Thus, Twitter brings a Motion to Dismiss First Amended Complaint (“Motion”) seeking dismissal of all of Plaintiffs’ claims on the basis that it is immune from liability under the CDA. In the Motion, Twitter also contends that Plaintiffs fail to state viable claims as to many of their causes of action. A hearing on the Motion was held on August 6, 2021. For the reasons stated below, the Motion is GRANTED in part and DENIED in part.[1]
[1] The parties have consented to the jurisdiction of a United States magistrate judge pursuant to 28 U.S.C. § 636(c).

II. BACKGROUND

A. First Amended Complaint

Plaintiffs’ First Amended Complaint (“FAC”), which is the operative complaint, contains detailed allegations describing: 1) Twitter’s platform, business model, and content moderation policies and practices (FAC ¶¶ 23-51); 2) the ways Twitter allegedly permits and even aids in the distribution of child pornography on its platform and profits from doing so (FAC ¶¶ 52-84); 3) how pornographic content featuring John Doe #1 and John Doe #2 was created and eventually ended up on Twitter’s platform (FAC ¶¶ 85-100); and 4) Twitter’s response to requests that the pornographic photos and videos containing Plaintiffs’ images be removed from Twitter (FAC ¶¶ 101-132).

Based on these allegations, Plaintiffs assert the following claims:

1) violation of the Trafficking Victims Protection Reauthorization Act (“TVPRA”), 18 U.S.C. §§ 1591(a)(1) and 1595(a), based on the allegation that “Twitter knew, or was in reckless disregard of the fact, that through monetization and providing, obtaining, and maintaining [child sexual abuse material (“CSAM”)] on its platform, Twitter and Twitter users received something of value for the video depicting sex acts of John Doe #1 and John Doe #2 as minors.” FAC ¶¶ 133-143 (Claim One);

2) violation of the TVPRA, 18 U.S.C. §§ 1591(a)(2) and 1595(a), based on the allegation that Twitter “knowingly benefited, or should have known that it was benefiting, from assisting, supporting, or facilitating a violation of 1591(a)(1).” FAC ¶¶ 144-155 (Claim Two);
3) violation of the duty to report child sexual abuse material under 18 U.S.C. §§ 2258A and 2258B. FAC ¶¶ 156-163 (Claim Three);

4) civil remedies for personal injuries related to sex trafficking and receipt and distribution of child pornography under 18 U.S.C. §§ 1591, 2252A, and 2255, based on the allegations that Twitter was “notified of the CSAM material depicting John Doe #1 and John Doe #2 as minors on its platform and still knowingly received, maintained, and distributed this child pornography after such notice[,]” causing Plaintiffs to suffer “serious harm and personal injury, including, without limitation, physical, psychological, financial, and reputational harm.” FAC ¶¶ 164-176 (Claim Four);

5) California products liability based on the allegedly defective design of the Twitter platform, which is “designed so that search terms and hashtags utilized for trading CSAM return suggestions for other search terms and hashtags related to CSAM” and through use of “algorithm(s), API, and other proprietary technology” allows “child predators and sex traffickers to distribute CSAM on a massive scale” while also making it difficult for users to report CSAM and not allowing for immediate blocking of CSAM material once reported pending review. FAC ¶¶ 177-190 (Claim Five);

6) negligence based on allegations that Twitter had a duty to protect Plaintiffs, had actual knowledge that CSAM containing their images was being disseminated on its platform, and failed to promptly remove it once notified. FAC ¶¶ 191-197 (Claim Six);

7) gross negligence based on the same theory as Plaintiffs’ negligence claim. FAC ¶¶ 198-203 (Claim Seven);

8) negligence per se based on the allegation that Twitter’s conduct violated numerous laws, including 18 U.S.C. §§ 1591 and 1595 (benefiting from a sex trafficking venture), 18 U.S.C. § 2258A (failing to report known child sexual abuse material), 18 U.S.C. § 2252A (knowingly distributing child pornography), Cal. Civ. Code § 1708.85 (intentionally distributing non-consensually shared pornography), and Cal. Penal Code § 311.1 (possessing child pornography). FAC ¶¶ 204-206 (Claim Eight);

9) negligent infliction of emotional distress. FAC ¶¶ 207-212 (Claim Nine);
10) distribution of private sexually explicit materials, in violation of Cal. Civ. Code § 1708.85, based on the allegation that “[b]y refusing to remove or block the photographic images and video depicting him after Plaintiff John Doe #1 notified Twitter that both he and John Doe #2 were minors, Twitter intentionally distributed on its online platform photographic images and video of the Plaintiffs.” FAC ¶¶ 213-218 (Claim Ten);

11) intrusion into private affairs, based on the allegation that “Twitter intentionally intruded into Plaintiffs’ reasonable expectation of privacy by continuing to distribute the photographic images and video depicting them after John Doe #1 notified Twitter that Plaintiffs were minors and the material had been posted on its platform without their consent.” FAC ¶¶ 219-223 (Claim Eleven);

12) invasion of privacy under the California Constitution, Article 1, Section 1. FAC ¶¶ 224-228 (Claim Twelve); and

13) violation of California Business and Professions Code § 17200 (“UCL”) based on allegations that “Twitter utilized and exploited Plaintiffs for its own benefit and profit” and “Plaintiffs, to their detriment, reasonably relied upon Twitter’s willful and deceitful conduct and assurances that it effectively moderates and otherwise controls third-party user content on its platforms.” FAC ¶¶ 229-234 (Claim Thirteen).

Plaintiffs seek compensatory and punitive damages, injunctive relief, restitution, disgorgement of profits and unjust enrichment, and attorneys’ fees and costs.

B. Statutory Background

1. The CDA

The CDA was enacted as part of the Telecommunications Act of 1996. It contains a “Good Samaritan” provision that immunizes interactive computer service (“ICS”) providers from liability for restricting access to certain types of materials or giving users the technical means to restrict access to such materials, providing as follows:

    (c) Protection for “Good Samaritan” blocking and screening of offensive material

    (1) Treatment of publisher or speaker
    No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.

    (2) Civil liability

    No provider or user of an interactive computer service shall be held liable on account of—

    (A) any action voluntarily taken in good faith to restrict access to or availability of material that the provider or user considers to be obscene, lewd, lascivious, filthy, excessively violent, harassing, or otherwise objectionable, whether or not such material is constitutionally protected; or

    (B) any action taken to enable or make available to information content providers or others the technical means to restrict access to material described in paragraph (1).

47 U.S.C. § 230(c).

“This grant of immunity dates back to the early days of the internet when concerns first arose about children being able to access online pornography.” Enigma Software Grp. USA, LLC v. Malwarebytes, Inc., 946 F.3d 1040, 1046 (9th Cir. 2019), cert. denied, 141 S. Ct. 13 (2020). At that time, “[p]arents could not program their computers to block online pornography, and this was at least partially due to a combination of trial court decisions in New York that had deterred the creation of online-filtration efforts.” Id. Under the New York cases, “if a provider remained passive and uninvolved in filtering third-party material from its network, the provider could not be held liable for any offensive content it carried from third parties.” Id. (citing Cubby, Inc. v. CompuServe, Inc., 776 F. Supp. 135, 139–43 (S.D.N.Y. 1991)). On the other hand, “once a service provider undertook to filter offensive content from its network, it assumed responsibility for any offensive content it failed to filter, even if it lacked knowledge of the content.” Id. (citing Stratton Oakmont, Inc. v. Prodigy Services Co., 1995 WL 323710, *5 (N.Y. Sup. Ct. May 24, 1995)). “The Stratton Oakmont decision, along with the increasing public concern about pornography on the internet, served as catalysts” for the enactment of the CDA. Id.

The Ninth Circuit has interpreted CDA § 230 broadly: so long as an interactive computer service provider is not also an “information content provider,” that is, someone who is “responsible, in whole or in part, for the creation or development of” the offending content, it is immune from liability arising from content created by third parties. Fair Hous. Council of San
Fernando Valley v. Roommates.Com, LLC, 521 F.3d 1157, 1162 (9th Cir. 2008) (citing 47 U.S.C. §§ 230(c), (f)). Thus, a defendant is entitled to immunity under the CDA if: 1) it is a “provider or user of an interactive computer service,” 2) the information for which the plaintiff seeks to hold the defendant liable was “information provided by another information content provider,” and 3) the complaint seeks to hold the defendant liable as the “publisher or speaker” of that information. Klayman v. Zuckerberg, 753 F.3d 1354, 1357 (D.C. Cir. 2014) (citing 47 U.S.C. § 230(c)(1)); see also Pennie v. Twitter, Inc., 281 F. Supp. 3d 874, 890 (N.D. Cal. 2017) (finding that under Section 230 Twitter was immune from claims based on the theory that third-party content Twitter allowed to be posted on its platform led to plaintiff’s injury because the claim sought to hold Twitter liable as a publisher).[2]

[2] Here, it is undisputed that Twitter is an interactive computer service provider and that the Videos were provided by another information content provider.

As expressly stated in Section 230, the policies underlying the enactment of that section are:

    (1) to promote the continued development of the Internet and other interactive computer services and other interactive media;

    (2) to preserve the vibrant and competitive free market that presently exists for the Internet and other interactive computer services, unfettered by Federal or State regulation;

    (3) to encourage the development of technologies which maximize user control over what information is received by individuals, families, and schools who use the Internet and other interactive computer services;

    (4) to remove disincentives for the development and utilization of blocking and filtering technologies that empower parents to restrict their children's access to objectionable or inappropriate online material; and

    (5) to ensure vigorous enforcement of Federal criminal laws to deter and punish trafficking in obscenity, stalking, and harassment by means of computer.

47 U.S.C. § 230(b).

Section 230 expressly states that it has “[n]o effect on criminal law[,]” providing that “[n]othing in this section shall be construed to impair the enforcement of section 223 or 231 of this title, chapter 71 (relating to obscenity) or 110 (relating to sexual exploitation of children) of Title
18, or any other Federal criminal statute.” 47 U.S.C. § 230(e)(1). It expressly preempts all state laws that are inconsistent with Section 230’s grant of immunity. 47 U.S.C. § 230(e)(3) (“No cause of action may be brought and no liability may be imposed under any State or local law that is inconsistent with this section.”).

2. The TVPRA

In 2000, Congress enacted the TVPRA, which criminalized sex trafficking. When it enacted the TVPRA, “Congress declared that the purposes of the [TVPRA] are to ‘combat trafficking in persons, a contemporary manifestation of slavery whose victims are predominantly women and children, to ensure just and effective punishment of traffickers, and to protect their victims.’” Ditullio v. Boehm, 662 F.3d 1091, 1094 (9th Cir. 2011) (quoting Pub. L. No. 106-386, § 102, 114 Stat. 1488 (2000) (codified as amended at 18 U.S.C. § 1589 et seq.)). In 2003, the law was expanded to provide a private right of civil action for victims of sex trafficking, codified at 18 U.S.C. § 1595. Id. “The version of § 1595 enacted in 2003 limited the civil remedy to victims of three specific trafficking acts (including sex trafficking of minors), and did not expressly permit recovery against individuals who benefit from participation in a trafficking venture.” Id. n.1 (citing Pub. L. 108-193, § 4(a)(4)(A), 117 Stat. 2878 (2003)). “In the [TVPRA’s] 2008 reauthorization, Congress deleted those limitations.” Id. (citing Pub. L. 110-457, Title II, § 221(2), 122 Stat. 5067 (2008)).

In its current form, the TVPRA makes it a crime to engage in direct sex trafficking or to benefit financially from sex trafficking, providing as follows:

    (a) Whoever knowingly—

    (1) in or affecting interstate or foreign commerce, or within the special maritime and territorial jurisdiction of the United States, recruits, entices, harbors, transports, provides, obtains, advertises, maintains, patronizes, or solicits by any means a person; or

    (2) benefits, financially or by receiving anything of value, from participation in a venture which has engaged in an act described in violation of paragraph (1),

    knowing, or, except where the act constituting the violation of paragraph (1) is advertising, in reckless disregard of the fact, that means of force, threats of force, fraud, coercion described in subsection (e)(2), or any combination of such means will be used to
    cause the person to engage in a commercial sex act, or that the person has not attained the age of 18 years and will be caused to engage in a commercial sex act, shall be punished as provided in subsection (b).

18 U.S.C. § 1591(a). Section 1591(e) further provides that “[in] this section . . . [t]he term ‘participation in a venture’ means knowingly assisting, supporting, or facilitating a violation of subsection (a)(1).” 18 U.S.C. § 1591(e)(4).

The civil liability provision in its current form provides that “[a]n individual who is a victim of a violation of this chapter may bring a civil action against the perpetrator (or whoever knowingly benefits, financially or by receiving anything of value from participation in a venture which that person knew or should have known has engaged in an act in violation of this chapter) in an appropriate district court of the United States and may recover damages and reasonable attorneys fees.” 18 U.S.C. § 1595(a).

3. FOSTA

In 2018, the CDA was amended by the Allow States and Victims to Fight Online Sex Trafficking Act of 2017, Pub. L. No. 115-164, 132 Stat. 1253 (2018) (“FOSTA”), which inserted a new provision in CDA § 230 specifically addressing the application of that section in the context of sex trafficking law. See 47 U.S.C. § 230(e)(5). “‘Congress passed [FOSTA] to narrow Section 230’s scope and provide prosecutors with new tools to combat the sex trafficking of both minors and adults.’” J. B. v. G6 Hosp., LLC, No. 19-CV-07848-HSG, 2020 WL 4901196, at *4 (N.D. Cal. Aug. 20, 2020) (quoting Woodhull Freedom Found. v. United States, 948 F.3d 363, 368 (D.C. Cir. 2020)); see also 164 Cong. Rec. S1849-08, S1849 (reflecting that FOSTA was enacted in response to an increase in sex trafficking resulting from “the presence of [sex trafficking] organizations online that are using the ruthless efficiency of the internet to sell women and children”). The legislative history reflects that one of the websites of particular concern to Congress was a sex trafficking website called “Backpage,” which knowingly trafficked in young women and children. See 164 Cong. Rec. S1849-08, S1854 (“Why is this law so important? If I am looking at this through a prosecutor’s lens, now all of the prosecutors in the country can go after anyone who knowingly facilitates sex trafficking online. I am not saying when it is by accident, and I am not saying when it has slipped through and they
don’t know it; I am talking about to knowingly facilitate, which is what [B]ackpage was doing.”) (Senator McCaskill).

FOSTA’s amendment of the CDA consisted of adding Section 230(e)(5):

    (5) No effect on sex trafficking law

    Nothing in this section (other than subsection (c)(2)(A)) shall be construed to impair or limit—

    (A) any claim in a civil action brought under section 1595 of Title 18, if the conduct underlying the claim constitutes a violation of section 1591 of that title;

    (B) any charge in a criminal prosecution brought under State law if the conduct underlying the charge would constitute a violation of section 1591 of Title 18; or

    (C) any charge in a criminal prosecution brought under State law if the conduct underlying the charge would constitute a violation of section 2421A of Title 18, and promotion or facilitation of prostitution is illegal in the jurisdiction where the defendant's promotion or facilitation of prostitution was targeted.

47 U.S.C. § 230(e)(5). “‘FOSTA narrowed the scope of immunity for interactive computer service providers, by providing that Section 230 has “[n]o effect on sex trafficking law,” and shall not “be construed to impair or limit” civil claims brought under TVPRA Section 1595 or criminal charges brought under state law if the underlying conduct would constitute a violation of TVPRA Sections 1591 or 2421A.’” J. B. v. G6 Hosp., LLC, 2020 WL 4901196, at *4 (quoting Woodhull Freedom Found. v. United States, 948 F.3d at 368) (quoting 132 Stat. at 1254).

C. Contentions of the Parties

1. Motion

Twitter argues in the Motion that it is immune under CDA § 230 as to all of Plaintiffs’ claims. Motion at 2. According to Twitter, the amendment of Section 230 under FOSTA, permitting sex trafficking victims to pursue civil claims under 18 U.S.C. § 1595 against an interactive computer service provider where the provider violates 18 U.S.C. § 1591, created only a narrow exception to the immunity afforded under Section 230 that was “carefully targeted to remove civil immunity for the few criminal websites that, unlike Twitter here, were deliberately and knowingly assisting and profiting from reprehensible crimes.” Id. Twitter contends,
“FOSTA’s language, its legislative history, and the pre-existing case law on Section 1591 all point to the same conclusion: civil claims can only proceed against sex traffickers and those who knowingly benefit from their affirmative participation in a sex trafficking venture.” Id.

Twitter argues that here, Plaintiffs have failed to allege facts showing that the exception to immunity created under FOSTA applies because the FAC: 1) “lacks any facts showing that Twitter affirmatively participated in any kind of venture with the Perpetrators, let alone a sex trafficking venture”; 2) “does not allege, as required to establish a violation of Section 1591, any facts establishing that Twitter knew that Plaintiffs were victims of sex trafficking or that the Videos were evidence of this crime”; and 3) does not “allege any connection between the Perpetrators and Twitter or that Twitter received any benefits because of the Videos.” Id. Twitter further asserts that CDA § 230 protects it from liability because “Twitter did remove the Videos and suspend the accounts that had posted them” and it cannot be held liable under “any applicable law” simply because it did not take the videos down immediately. Id.

Twitter represents that it “vigorously combats [child sexual exploitation material (“CSE”)] through a combination of methods, including review of user reports and the use of proprietary technology to proactively identify and remove such material” but that “given the sheer volume of Tweets posted every day on Twitter’s platform (hundreds of millions of Tweets posted by over 190 million daily users), it is simply not possible for Twitter – or the individuals who enforce its Rules and policies – to find and remove all offending content immediately or accurately in all cases.” Id. at 1. Twitter points to its zero-tolerance policy for child sexual exploitation materials, which is set forth in its Rules – to which users must agree when they create a Twitter account. Id. at 6. According to Twitter, it also “utilizes multiple tools, including reports by the public . . . , moderators who review reports of abuse and CSE content, innovative technology and algorithms that proactively identify abusive content, and online education and information sharing to combat online abuse.” Id. (citing FAC ¶¶ 42-43, 55-57; Wong Decl., Exs. 1 (news article entitled “Twitter says it’s getting better at detecting abusive tweets without your help”), 2 (a blog post by Twitter entitled “A healthier Twitter: Progress and more to do”)). According to Twitter, in enacting FOSTA, Congress did not intend “for online platforms like Twitter that proactively act against
such activity to be sued for their inadvertent failure to remove content.” Id. at 2.

The purpose of CDA § 230, according to Twitter, was “to ensure that interactive computer service (‘ICS’) providers would never have to choose ‘between taking responsibility for all messages and deleting no messages at all,’ which presents such providers a ‘grim’ and illusory choice.” Id. at 3 (quoting Fair Hous. Council of San Fernando Valley v. Roommates.Com, LLC, 521 F.3d 1157, 1162-63 (9th Cir. 2008)). To achieve that purpose, it asserts, “§ 230 creates broad immunity for claims against ICS providers based on content created by users: ‘No provider . . . of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.’” Id. (quoting 47 U.S.C. § 230(c)(1)). Twitter contends this provision “bars all causes of action that seek to hold ICS providers like Twitter liable for not removing content created by a third-party.” Id. (citing Igbonwa v. Facebook, Inc., 2018 WL 4907632, at *5-7 (N.D. Cal. Oct. 9, 2018), aff’d, 786 F. App’x 104 (9th Cir. 2019)).

Twitter asserts that FOSTA created only a narrow exception to the immunity afforded under CDA § 230, permitting a victim of sex trafficking to bring a civil action under § 1595, but only “‘if the conduct underlying the claim constitutes a violation of [S]ection 1591,’ the criminal statute prohibiting sex trafficking.” Id. at 4 (citing 47 U.S.C. § 230(e)(5)(A); Doe v. Kik Interactive, 482 F. Supp. 3d 1242, 1251 (S.D. Fla. 2020) (“FOSTA permits civil liability for websites only if the conduct underlying the claim constitutes a violation of section 1591.”)). This limitation is important, it contends, because “Section 1591 has more stringent mens rea and required elements to meet than Section 1595.” Id. Thus, Twitter argues, the FOSTA exception only applies to “openly malicious actors” and does not otherwise change the scope of immunity under CDA § 230. Id. at 5 (citing Kik, 482 F. Supp. 3d at 1249-51); see also id. at 3. According to Twitter, this is apparent from the legislative history. Id. (citing 164 Cong. Rec., at S1860-62 (statement of Senator Durbin (“[FOSTA] is a narrowly crafted bill that would ensure that Section 230 . . . does not provide legal immunity to websites like Backpage that knowingly facilitate sex trafficking”)); id. (statement of Senator Schumer (“Key to my support is my understanding that this legislation would not allow nuisance lawsuits against technology companies.”)).

Here, Twitter asserts, it meets all the requirements for establishing immunity under CDA §
230, namely, that (1) it is an ICS provider; (2) Plaintiffs’ claims treat Twitter as the publisher or speaker of the content in question; and (3) someone other than Twitter provided or created the content at issue. Id. at 8-11. Twitter argues further that the FAC does not allege facts that would establish that any exemption to CDA § 230 applies, including the FOSTA exception that allows for the imposition of liability where the ICS itself violates Section 1591, either as a “primary violator” or a “secondary participant” that “‘knowingly . . . benefits, financially or by receiving anything of value, from participation in a venture’ with a primary violator.” Id. at 11-12 (quoting 18 U.S.C. § 1591(a)).

With respect to Plaintiffs’ claim that Twitter was a primary participant in sex trafficking under Section 1591(a)(1), Twitter contends Plaintiffs’ allegations fall short because “[t]o plead a primary violation, a plaintiff must allege that the defendant ‘provide[d], obtain[ed], [and] maintain[ed] . . . a person’ knowing that he or she ‘will be . . . cause[d]’ to engage in a commercial sex act.” Id. at 12 (quoting 18 U.S.C. § 1591(a)(1)) (emphasis added by Twitter). Twitter contends “Plaintiffs allege only that ‘Twitter knowingly provided, obtained, and maintained the Videos,’ not Plaintiffs” and therefore they fail to allege a primary violation. Id. (quoting FAC ¶ 141) (emphasis added by Twitter). Twitter also argues that as to Plaintiffs’ claims under both Section 1591(a)(1) and Section 1591(a)(2), those claims fall short for the additional reason that Section 1591 “requires a defendant to know that the victim ‘will in the future [be] cause[d] . . . to engage in prostitution.’” Id. at 12 n.10 (citing United States v. Todd, 627 F.3d 329, 334 (9th Cir. 2010)). According to Twitter, “Plaintiffs cannot plead that Twitter had such knowledge as the FAC alleges that Plaintiffs had cut off contact with the Perpetrators before the Videos were posted on Twitter’s platform.” Id. (citing FAC ¶¶ 94-96).

Twitter contends Plaintiffs also fail to allege that it was a secondary participant under Section 1591(a)(2). Id. at 12-19. According to Twitter, to establish that it is a secondary participant, Plaintiffs must “plead that Twitter ‘knowingly . . . benefit[ed] . . . from participation in a venture which has engaged in [sex trafficking] in violation of [Section 1591(a)(1)].’” Id. (quoting 18 U.S.C. § 1591(a)(2)). It further asserts that Section 1591 was amended by FOSTA to define “[p]articipation in a venture” as “knowingly assisting, supporting, or facilitating” a primary
violation. Id. at 12 (quoting 18 U.S.C. § 1591(e)(4)) (emphasis added by Twitter). Twitter argues that neither of the grounds upon which Plaintiffs rely to establish that Twitter was a secondary participant – “(i) Twitter’s initial failure to find a violation of its policies after reviewing the Video, or (ii) Twitter’s nine-day delay in removing the Videos” – establishes that it was a secondary participant, for three reasons. Id.

“First, Plaintiffs do not allege the existence of any type of venture between Twitter and any party that has a common purpose, much less facts suggesting ‘that [Twitter] actually participat
