MAYER BROWN LLP
ANDREW J. PINCUS (pro hac vice)
apincus@mayerbrown.com
1999 K Street, NW
Washington, DC 20006-1101
Telephone: (202) 263-3000
Facsimile: (202) 263-3300

MAYER BROWN LLP
DOUGLAS A. SMITH (SBN 290598)
dougsmith@mayerbrown.com
350 South Grand Avenue, 25th Floor
Los Angeles, California 90071-1503
Telephone: (213) 229-9500
Facsimile: (213) 625-0248

Attorneys for Amicus Curiae Internet Association

UNITED STATES DISTRICT COURT
NORTHERN DISTRICT OF CALIFORNIA
SAN FRANCISCO DIVISION

TWITTER, INC.,

          Plaintiff,

     vs.

KEN PAXTON, in his official capacity as Attorney General of Texas,

          Defendant.

Case No. 3:21-cv-01644-MMC

INTERNET ASSOCIATION’S AMICUS CURIAE BRIEF IN SUPPORT OF PLAINTIFF TWITTER’S MOTION FOR A PRELIMINARY INJUNCTION

Hon. Maxine M. Chesney

TABLE OF CONTENTS

TABLE OF AUTHORITIES
INTEREST OF THE AMICUS CURIAE
SUMMARY OF ARGUMENT
ARGUMENT
I.   Content Moderation Is Essential To The Functioning Of Social Media Platforms—And Those Moderation Decisions Are Protected By The First Amendment.
     A.   Social Media Companies Utilize Content Moderation To Provide Attractive And Responsible User Experiences By Eliminating Inappropriate And Objectionable Material From Their Platforms.
     B.   Content Moderation Standards And Decisions Are Protected By The First Amendment.
CONCLUSION

TABLE OF AUTHORITIES

Cases

Brown v. Ent. Merchs. Ass’n, 564 U.S. 786 (2011)

Bullfrog Films, Inc. v. Wick, 847 F.2d 502 (9th Cir. 1988)

Goldblum v. Nat’l Broad. Corp., 584 F.2d 904 (9th Cir. 1978)

Hurley v. Irish-Am. Gay, Lesbian & Bisexual Grp. of Bos., 515 U.S. 557 (1995)

Joseph Burstyn, Inc. v. Wilson, 343 U.S. 495 (1952)

Knight First Amendment Inst. at Columbia Univ. v. Trump, 928 F.3d 226 (2d Cir. 2019)

Med. Lab’y Mgmt. Consultants v. Am. Broad. Companies, Inc., 306 F.3d 806 (9th Cir. 2002)

Miami Herald Pub. Co. v. Tornillo, 418 U.S. 241 (1974)

Nieves v. Bartlett, 139 S. Ct. 1715 (2019)

Reno v. ACLU, 521 U.S. 844 (1997)

Turner Broadcasting System, Inc. v. FCC, 512 U.S. 622 (1994)

Other Authorities

Account and Community Restrictions - Do Not Post Sexual or Suggestive Content Involving Minors, REDDIT, INC., https://www.reddithelp.com/hc/en-us/articles/360043075352

Alison Grace Johansen, Deepfakes: What they are and why they’re threatening, NORTONLIFELOCK, https://us.norton.com/internetsecurity-emerging-threats-what-are-deepfakes.html

Community Guidelines - Developing Policies, YOUTUBE, https://www.youtube.com/howyoutubeworks/policies/community-guidelines/#developing-policies

Community Guidelines, SNAP, INC., https://www.snap.com/en-US/communityguidelines

Community Standards – Account Integrity & Authentic Identity, FACEBOOK, https://www.facebook.com/communitystandards/misrepresentation

Community Standards – Bullying & Harassment, FACEBOOK, https://www.facebook.com/communitystandards/bullying

Community Standards - Authenticity, AIRBNB, INC., https://www.airbnb.com/trust/standards

Community Standards, AIRBNB, https://www.airbnb.com/trust/standards

Daisy Soderberg-Rivkin, Five myths about online content moderation, from a former content moderator, R STREET INSTITUTE (Oct. 30, 2019), https://www.rstreet.org/2019/10/30/five-myths-about-online-content-moderation-from-a-former-content-moderator

Daily Kos: Rules of the Road, KOS MEDIA, LLC, https://www.dailykos.com/rules-of-the-road

Facebook by the Numbers, OMNICORE, https://www.omnicoreagency.com/facebook-statistics

General Guidelines and Policies - Abusive Behavior, TWITTER, INC., https://help.twitter.com/en/rules-and-policies/abusive-behavior

Guidelines for Traveler Reviews, TRIPADVISOR, LLC, https://www.tripadvisorsupport.com/hc/en-us/articles/200614797-Our-guidelines-for-traveler-reviews

Jason A. Gallo & Clare Y. Cho, CONG. RSCH. SERV. R46662, SOCIAL MEDIA: MISINFORMATION AND CONTENT MODERATION ISSUES FOR CONGRESS 1 (2021), https://crsreports.congress.gov/product/pdf/R/R46662

Laura Hanu, James Thewlis & Sasha Haco, How AI Is Learning To Identify Toxic Online Content, SCIENTIFIC AMERICAN (2021)

LinkedIn Professional Community Policies, LINKEDIN, https://www.linkedin.com/help/linkedin/answer/34593/linkedin-professional-community-policies

Mind-Blowing LinkedIn Statistics and Facts, KINSTA, INC.: KINSTA BLOG (Mar. 18, 2021), https://kinsta.com/blog/linkedin-statistics

Pew Research Center, Social Media Fact Sheet, https://www.pewresearch.org/internet/fact-sheet/social-media/#social-media-use-over-time (last visited Mar. 24, 2021)

Platform Manipulation and Spam Policy, General Guidelines and Policies, TWITTER, INC., https://help.twitter.com/en/rules-and-policies/platform-manipulation

Professional Community Policies, LINKEDIN, https://www.linkedin.com/legal/professional-community-policies

PWC, The quest for truth: content moderation in action, https://www.pwc.com/us/en/industries/tmt/library/content-moderation-quest-for-truth-and-trust.html (last visited Mar. 24, 2021)

Reddit Content Policy, REDDIT, https://www.redditinc.com/policies/contentpolicy-1

Trust & Safety - Community Standards, AIRBNB, INC., https://www.airbnb.com/trust/standards

YouTube For Press, YOUTUBE: OFFICIAL BLOG, https://blog.youtube.com/press/

INTEREST OF THE AMICUS CURIAE1

Internet Association (“IA”) represents the interests of the Nation’s leading internet companies and their customers. Its members include companies whose products and services enable people throughout the country and the world to express themselves, both privately and publicly.2

IA seeks to protect internet freedom and free speech, promote innovation and economic growth, and empower customers and users. Amicus’s members serve as platforms for communications and services for billions of users, and the success of these online businesses—and the vitality of online media generally—depends on their ability to adopt and enforce robust community standards governing the content of their websites, which are tailored to the purposes of the website and the needs of its users. Amicus and its members have a substantial interest in whether the First Amendment permits government officials to use their authority to investigate, second-guess, or penalize decisions by online services regarding the removal of specific content on their platforms or suspension of specific users from their websites, particularly if that government action is in retaliation for such decisions made by an online services provider.

SUMMARY OF ARGUMENT

Seventy-two percent of American adults use some type of social media.3 “For many users, social media is part of their daily routine,” with more than 40% of the users of leading platforms visiting the sites daily and more than seventy percent visiting weekly.4 With such ubiquitous use, many Americans turn to social media to receive their news; engage in commentary; maintain relationships with family, friends, coworkers, and members of their communities—whether neighbors, fellow members of a sports league, or other members of their church; and inform their everyday decisions. From determining which products or services to purchase to choosing which local events to attend on a weekend, social media is integrated into every facet of life.

1 No counsel for a party authored this brief in whole or in part. No party, no party’s counsel, and no person other than Amicus or its counsel made a monetary contribution intended to fund the preparation or submission of this brief. Plaintiff consents to the filing of this brief and Defendant does not oppose the filing of this brief.
2 A complete list of Internet Association members is available at http://internetassociation.org/our-members/.
3 Pew Research Center, Social Media Fact Sheet, https://www.pewresearch.org/internet/fact-sheet/social-media/#social-media-use-over-time (last visited Mar. 24, 2021).
4 Id.

To ensure a quality user experience, internet companies exercise editorial discretion in the form of content moderation. This includes setting and enforcing rules against inappropriate, objectionable, and inaccurate material on their platforms, and—sometimes—temporarily or permanently suspending users who persist in violating the platforms’ community standards. The rules adopted by internet companies vary from company to company and represent a judgment about what types of content are appropriate for the nature of a specific service, its target audience, and concerns about the impact of online content on offline conduct.

As with any decision that requires the exercise of judgment, there is room for healthy debate over those decisions. But there is no room for debate that such decisions are protected by the First Amendment—as the Supreme Court has held in a variety of contexts. See, e.g., Hurley v. Irish-Am. Gay, Lesbian & Bisexual Grp. of Bos., 515 U.S. 557, 570 (1995) (“[t]he selection of contingents to make a parade”); Miami Herald Pub. Co. v. Tornillo, 418 U.S. 241, 251 (1974) (newspaper). The Texas AG’s investigation violates this fundamental Constitutional safeguard.

ARGUMENT

I. Content Moderation Is Essential To The Functioning Of Social Media Platforms—And Those Moderation Decisions Are Protected By The First Amendment.

A. Social Media Companies Utilize Content Moderation To Provide Attractive And Responsible User Experiences By Eliminating Inappropriate And Objectionable Material From Their Platforms.

Amicus’s members provide online platforms through which users can share news and opinions, advertise goods, rate and review service businesses and vendors, search for housing, and interact with individuals around the globe. To offer these and myriad other services, these providers depend heavily on their right to regulate the content on their platforms, including by filtering, screening or otherwise preventing third-party users from posting material that violates the provider’s content rules, which are sometimes referred to as “community standards.”

Content moderation “provides a healthy and safe environment where users can upload their own products, posts or comments and comfortably engage with others. It’s a tool to improve user experience, ensure that platforms adhere to local and global laws, and helps users trust that they can interact through a platform or use a service without fear of being deceived.”5

These community standards can vary enormously depending on a website’s functions and goals, and may include, for example, prohibiting hate speech, requiring sellers to provide accurate information about their products, or penalizing users for artificially amplifying the significance of their posts (e.g., by using fake accounts to increase the number of times a post is “liked”). Without the ability to prevent unwanted or offensive content, the services that Amicus’s members provide could become unsafe, unreliable, and therefore unattractive to users.

Thus, online providers have issued community standards prohibiting various categories of objectionable material from their websites, including:

• “Hate speech,” such as content attacking someone based on race, ethnicity, national origin, religious affiliation, sexual orientation, caste, sex, gender, gender identity, and serious disease or disability.6

• “Bullying,” “harassment,” or “threats,” such as conveying an intent to release “personally identifiable information” about another.7

• “Violent,” “graphic,” or “sexual” content.8

• Impersonating others, such as by creating a misleading profile assuming “to be or speak for another person or entity.”9

5 Daisy Soderberg-Rivkin, Five myths about online content moderation, from a former content moderator, R STREET INSTITUTE (Oct. 30, 2019), https://www.rstreet.org/2019/10/30/five-myths-about-online-content-moderation-from-a-former-content-moderator/.
6 Community Guidelines, SNAP, INC., https://www.snap.com/en-US/communityguidelines.
7 Community Standards – Bullying & Harassment, FACEBOOK, https://www.facebook.com/communitystandards/bullying; General Guidelines and Policies - Abusive Behavior, TWITTER, INC., https://help.twitter.com/en/rules-and-policies/abusive-behavior; Trust & Safety - Community Standards, AIRBNB, INC., https://www.airbnb.com/trust/standards.
8 Professional Community Policies, LINKEDIN, https://www.linkedin.com/legal/professional-community-policies.

In addition to these and other broad categories, providers have also adopted community standards prohibiting more targeted types of content, including material that sexually exploits or endangers children,10 supports terrorism,11 misrepresents commercial listings,12 or artificially amplifies the apparent significance of content (e.g., by manipulating the number of “votes,” “likes,” or “follows” a post receives).13 These community standards are easily accessible through providers’ websites and, in general, users must accept them as a pre-condition to posting content or otherwise accessing the platform.

Typically, platform operators are also clear about why they have adopted their respective community standards, which are often tailored to achieve the specific goals of their platforms and “developed in partnership with a wide range of external industry and policy experts” as well as based on direct feedback from users.14 For example, LinkedIn does not “allow content that attacks, denigrates, intimidates, dehumanizes, incites or threatens hatred, violence, prejudicial or discriminatory action against individuals or groups because of their actual or perceived race, ethnicity, national origin, caste, gender, gender identity, sexual orientation, religious affiliation, or disability status.”15 This and other rules in its “Professional community policies” are designed to make the LinkedIn “community a place where everyone is respectful, compassionate, and honest.”16

9 Community Standards – Account Integrity & Authentic Identity, FACEBOOK, https://www.facebook.com/communitystandards/misrepresentation.
10 Account and Community Restrictions - Do Not Post Sexual or Suggestive Content Involving Minors, REDDIT, INC., https://www.reddithelp.com/hc/en-us/articles/360043075352.
11 LinkedIn Professional Community Policies, LINKEDIN, https://www.linkedin.com/help/linkedin/answer/34593/linkedin-professional-community-policies?lang=en.
12 Community Standards, AIRBNB, https://www.airbnb.com/trust/standards.
13 Reddit Content Policy, REDDIT, https://www.redditinc.com/policies/contentpolicy-1.
14 Community Guidelines - Developing Policies, YOUTUBE, https://www.youtube.com/howyoutubeworks/policies/community-guidelines/#developing-policies.

The community standards adopted by online providers reflect the diversity of the internet itself, with each online provider adopting standards specifically tailored to the diverse needs of its particular community of users. Social network platforms may implement protections to prevent harassment of younger users given that such content can “have more of an emotional impact on minors.”17 Retail and rental platforms may prohibit users from posting inaccurate product information given the importance of buyers knowing what they are purchasing.18 Platforms that compile user reviews may prohibit users from posting anonymous or irrelevant reviews, or may prevent users from reviewing their own, friends’ or relatives’ businesses.19 And platforms frequented by influential figures may prohibit users from misleadingly impersonating such figures, which could deceive other users or disrupt financial markets.20 Indeed, a platform could have as its purpose only the dissemination of particular views on public policy or political questions.21

15 Professional Community Policies, LINKEDIN, https://www.linkedin.com/legal/professional-community-policies.
16 Id.
17 Community Standards - Bullying and Harassment, FACEBOOK, https://www.facebook.com/communitystandards/bullying.
18 Community Standards - Authenticity, AIRBNB, INC., https://www.airbnb.com/trust/standards.
19 Guidelines for Traveler Reviews, TRIPADVISOR, LLC, https://www.tripadvisorsupport.com/hc/en-us/articles/200614797-Our-guidelines-for-traveler-reviews.
20 Platform Manipulation and Spam Policy, General Guidelines and Policies, TWITTER, INC., https://help.twitter.com/en/rules-and-policies/platform-manipulation.
21 See, e.g., Daily Kos: Rules of the Road, KOS MEDIA, LLC, https://www.dailykos.com/rules-of-the-road (“This is a site for Democrats. That’s the fundamental premise underlying all expectations about posting, commenting, and interacting with other site users”).

Further, Amicus’s members employ a multitude of general-purpose technologies to support their content moderation efforts, such as providing users with “report abuse” buttons and other mechanisms to flag problematic content or contact the companies with complaints. Members also devote significant staff and resources to monitoring, analyzing and enforcing compliance with their respective community guidelines. In addition, providers have invested significant resources in developing sophisticated software and algorithms to detect and remove harmful content. In many instances, they have shared these technologies to help others eradicate that harmful content as well. Social media platforms rely on “thousands of human reviewers”22 to enhance their ability to provide quick responses to evolving problems.

Flexibility has played a critical role in enabling platforms to experiment and thereby refine their approaches to content moderation over time. Moderating content is not easy given the almost unfathomable volumes of content online and the need to make sometimes-nuanced distinctions. For example:

• YouTube has over 2 billion users and over a billion hours of video viewed on its platform every day.

• Facebook has more than 2.6 billion users, who send 3 million messages every 20 seconds.

• LinkedIn has nearly 740 million members worldwide, with users who frequently engage with the platform, which has more than 1 billion interactions every month.23

Given the sheer number of decisions that need to be made, there are necessarily divergent views among the public and outside interested parties about content moderation decisions.

22 Laura Hanu, James Thewlis & Sasha Haco, How AI Is Learning To Identify Toxic Online Content, SCIENTIFIC AMERICAN (2021), https://www.scientificamerican.com/article/can-ai-identify-toxic-online-content/.
23 YouTube For Press, YOUTUBE: OFFICIAL BLOG, https://blog.youtube.com/press/; Facebook by the Numbers, OMNICORE, https://www.omnicoreagency.com/facebook-statistics/; Mind-Blowing LinkedIn Statistics and Facts, KINSTA, INC.: KINSTA BLOG (Mar. 18, 2021), https://kinsta.com/blog/linkedin-statistics/.

Providers learn from, adapt, and update their approaches over time and have remained flexible in deciding what content is objectionable and how to prevent it from being posted. For example, many technology companies have been incorporating algorithms into their content moderation, such as Google’s Jigsaw, which aims to detect “toxic comments online.”24 And some social media companies have relatively recently banned “deepfake” videos, which use artificial intelligence to alter videos to mislead someone into thinking that the video’s subject said or did something that in fact they did not.25

Content moderation decisions can be controversial. “[P]latform companies . . . have to make Solomon-like decisions about the veracity of this information. If they publish content that is clearly untrue, consumer backlash would likely be swift, threatening revenue and reputation. On the other hand, if platforms refuse to publish certain content, they may be accused of censorship, bias or having a political agenda.”26 A recent Congressional Research Service report frames the dilemma posed by content moderation, stating that “[s]ome Members of Congress are concerned about social media dissemination of misinformation . . . and are exploring how social media platform operators can stop or slow that dissemination via content moderation,” but “[o]ther Members’ interest in content moderation relates to concerns that platform operators are moderating content that should not be restricted.”27 That highlights the dilemma that online services confront every day.

24 See supra note 22.
25 See Alison Grace Johansen, Deepfakes: What they are and why they’re threatening, NORTONLIFELOCK, https://us.norton.com/internetsecurity-emerging-threats-what-are-deepfakes.html.
26 PWC, The quest for truth: content moderation in action, https://www.pwc.com/us/en/industries/tmt/library/content-moderation-quest-for-truth-and-trust.html (last visited Mar. 24, 2021).
27 Jason A. Gallo & Clare Y. Cho, CONG. RSCH. SERV. R46662, SOCIAL MEDIA: MISINFORMATION AND CONTENT MODERATION ISSUES FOR CONGRESS 1 (2021), https://crsreports.congress.gov/product/pdf/R/R46662.

B. Content Moderation Standards And Decisions Are Protected By The First Amendment.

“[W]hatever the challenges of applying the Constitution to ever-advancing technology, ‘the basic principles of freedom of speech and the press, like the First Amendment’s command, do not vary’ when a new and different medium for communication appears.” Brown v. Ent. Merchs. Ass’n, 564 U.S. 786, 790 (2011) (quoting Joseph Burstyn, Inc. v. Wilson, 343 U.S. 495, 503 (1952)). The Supreme Court has held that its decisions “provide no basis for qualifying the level of First Amendment scrutiny that should be applied to this medium [the Internet].” Reno v. ACLU, 521 U.S. 844, 870 (1997). Thus, “social media is entitled to the same First Amendment protections as other forms of media.” Knight First Amendment Inst. at Columbia Univ. v. Trump, 928 F.3d 226, 237 (2d Cir. 2019).

For decades, the Supreme Court and the Ninth Circuit have recognized that the First Amendment protects a speaker’s decisions about which speech or speakers to include when creating a compilation of others’ speech. See Hurley, 515 U.S. at 570 (parade); Turner Broadcasting System, Inc. v. FCC, 512 U.S. 622 (1994) (cable provider’s selection of channels); Med. Lab’y Mgmt. Consultants v. Am. Broad. Companies, Inc., 306 F.3d 806, 825 (9th Cir. 2002) (television program); Goldblum v. Nat’l Broad. Corp., 584 F.2d 904, 907 (9th Cir. 1978) (television program). Indeed, “[t]he danger inherent in [such] government . . . oversight, even in the interest of ‘balance,’ is well established.” Bullfrog Films, Inc. v. Wick, 847 F.2d 502, 510 (9th Cir. 1988).

When internet companies select the content for their platforms and the providers of that content, they are exercising “editorial control and judgment” over the “material” on their platforms, including the “treatment of public issues and public officials.” Miami Herald Pub. Co. v. Tornillo, 418 U.S. 241, 258 (1974). While those editorial judgments may be perceived as “fair or unfair” depending on the prevailing political winds and “risk that occasionally debate on vital matters will not be comprehensive and that all viewpoints may not be expressed,” “governmental regulation of this crucial process [cannot] be exercised consistent with First Amendment guarantees of a free press.” Id. at 258, 260 (White, J., concurring).

An investigation into the editorial judgments of internet companies therefore triggers First