`
`
`
`Joseph R. Saveri (SBN 130064)
`Steven N. Williams (SBN 175489)
`Anupama K. Reddy (SBN 324873)
`JOSEPH SAVERI LAW FIRM, LLP
`601 California Street, Suite 1000
`San Francisco, CA 94108
`Telephone: (415) 500-6800
`Facsimile: (415) 395-9940
`Email: jsaveri@saverilawfirm.com
swilliams@saverilawfirm.com
`areddy@saverilawfirm.com
`
`Attorneys for Plaintiff and the Proposed Class
`
`
`
`UNITED STATES DISTRICT COURT
`CENTRAL DISTRICT OF CALIFORNIA
`
`
`CANDIE FRAZIER, individually and
`on behalf of all others similarly situated,
`
`
Plaintiff,
`
`
`v.
`
`BYTEDANCE INC. and TIKTOK
`INC.
`
`
`Defendants.
`
`
`
`Civil Action No. 2:21-cv-9913
`
`COMPLAINT AND DEMAND FOR
`JURY TRIAL
`
` CLASS ACTION
`
`
`
`1
`2
`3
`4
`5
`6
`7
`8
`9
`10
`11
`12
`13
`14
`15
`16
`17
`18
`19
`20
`21
`22
`23
`24
`25
`26
`27
`28
`
`
`
`
`
`
`Case 2:21-cv-09913 Document 1 Filed 12/23/21 Page 2 of 36 Page ID #:2
`
`
`
`Plaintiff Candie Frazier, on behalf of herself and all others similarly situated,
`brings this Class Action Complaint against Defendants ByteDance Inc. and TikTok Inc.
(“Defendants”) for negligence, negligent exercise of retained control, and violations of
the California Unfair Competition Law (“UCL”), Cal. Bus. & Prof. Code § 17200 et seq.,
demanding a trial by jury on all claims for which a jury is authorized.
`Plaintiff Frazier makes the following allegations based on personal knowledge as to the
`facts pertaining to herself and upon information and belief, including the investigation of
`counsel, as to all other matters.
`
`INTRODUCTION
1. Plaintiff Candie Frazier is a content moderator who seeks to protect herself
and all others similarly situated from the dangers of psychological trauma resulting from
exposure to graphic and objectionable content on ByteDance Inc.’s (“ByteDance”) TikTok
application (“app”) and ByteDance’s failure to provide a safe workplace for the
`thousands of contractors who are entrusted to provide the safest possible environment
`for TikTok users.
2. Every day, TikTok users upload millions of videos to its platform. Millions
`of these uploads include graphic and objectionable content such as child sexual abuse,
`rape, torture, bestiality, beheadings, suicide, and murder. To maintain a sanitized
`platform, maximize its already vast profits, and cultivate its public image, TikTok relies
`on people like Plaintiff Frazier—known as “Content Moderators”—to view those videos
`and remove any that violate the corporation’s terms of use.
3. Plaintiff works for the firm Telus International (“Telus”), which provides
Content Moderators for TikTok, a popular app owned by ByteDance. ByteDance is an
important client of Telus International. TikTok is a social media application that allows
users to create and share short videos that can be edited with background music and
other special effects.
`4. While working at the direction of ByteDance and TikTok, Content
`Moderators—including Plaintiff Frazier—witness thousands of acts of extreme and
`1
`Civil Action No. 2:21-cv-9913
`
`COMPLAINT AND DEMAND FOR JURY TRIAL
`
`1
`2
`3
`4
`5
`6
`7
`8
`9
`10
`11
`12
`13
`14
`15
`16
`17
`18
`19
`20
`21
`22
`23
`24
`25
`26
`27
`28
`
`
`
`
`
`
`Case 2:21-cv-09913 Document 1 Filed 12/23/21 Page 3 of 36 Page ID #:3
`
`
`
`graphic violence, including sexual assault, genocide, rape, and mutilation. Plaintiff
`Frazier views videos of the genocide in Myanmar, mass shootings, children being raped,
`and animals being mutilated. Content Moderators like Plaintiff Frazier spend twelve
`hours a day reviewing and moderating such videos to prevent disturbing content from
reaching TikTok’s users.
5. Content Moderators also face repeated exposure to conspiracy theories
`(including suggestions that the COVID-19 pandemic is a fraud), distortions of historical
`facts (like denials that the Holocaust occurred), fringe beliefs, and political
`disinformation (like false information about participating in the census, lies about a
`political candidate’s citizenship status or eligibility for public office, and manipulated or
doctored videos of elected officials). This type of content has destabilized society and
`often features objectionable content.
6. As a result of constant and unmitigated exposure to highly toxic and
`extremely disturbing images at the workplace, Ms. Frazier has developed and suffers
`from significant psychological trauma including anxiety, depression, and posttraumatic
`stress disorder (“PTSD”).
7. ByteDance and TikTok are aware of the negative psychological effects that
`viewing graphic and objectionable content has on Content Moderators. Despite this
`knowledge, they have not implemented safety standards known throughout the industry
`to protect their Content Moderators from harm.
`8. �ese safety standards could have reduced the risk and mitigated the harm
`suffered by Content Moderators working on behalf of ByteDance and TikTok.
9. ByteDance and TikTok failed to implement workplace safety standards.
Instead, they require their Content Moderators to work under conditions they know
`cause and exacerbate psychological trauma.
`10. By requiring their Content Moderators to review graphic and objectionable
`content, ByteDance and TikTok require Content Moderators to engage in abnormally
`dangerous activities. And by failing to implement the workplace safety standards they
`2
`Civil Action No. 2:21-cv-9913
`
`COMPLAINT AND DEMAND FOR JURY TRIAL
`
`1
`2
`3
`4
`5
`6
`7
`8
`9
`10
`11
`12
`13
`14
`15
`16
`17
`18
`19
`20
`21
`22
`23
`24
`25
`26
`27
`28
`
`
`
`
`
`
`Case 2:21-cv-09913 Document 1 Filed 12/23/21 Page 4 of 36 Page ID #:4
`
`
`
helped develop, ByteDance and TikTok violate California law. By imposing non-
`disclosure agreements, ByteDance and TikTok exacerbate the harm they cause to
`Content Moderators.
`11. Without this Court’s intervention, ByteDance and TikTok will continue to
`injure Content Moderators and breach the duties they owe to Content Moderators who
`review content on their platform.
`12. On behalf of herself and all others similarly situated, Plaintiff Frazier brings
this action (1) to compensate Content Moderators who were exposed to graphic and
`objectionable content on ByteDance’s TikTok platform; (2) to ensure that ByteDance
`and TikTok provide Content Moderators with tools, systems, and mandatory ongoing
`mental health support to mitigate the harm reviewing graphic and objectionable content
`can cause; and (3) to provide mental health screening and treatment to the thousands of
`current and former Content Moderators affected by ByteDance’s and TikTok’s unlawful
`practices.
`
`JURISDICTION AND VENUE
`13. �is Court has subject matter jurisdiction over this action pursuant to 28
`U.S.C. § 1332(d) and 1367 because: (i) this is a class action in which the matter in
`controversy exceeds the sum of $5,000,000, exclusive of interest and costs; (ii) there are
`100 or more class members; and (iii) some members of the class, including Plaintiff
`Frazier, are citizens of states different from some Defendants, and also because two
`Defendants are citizens or subjects of a foreign state.
`14. �is Court has personal jurisdiction over Defendants because: (i) they
`transact business in the United States, including in this District; (ii) they have substantial
`aggregate contacts with the United States, including in this District; (iii) they engaged
`and are engaging in conduct that has and had a direct, substantial, reasonably
`foreseeable, and intended effect of causing injury to persons throughout the United
States, including in this District, and purposely availed themselves of the laws of the
United States. TikTok is headquartered in Los Angeles County and regularly
conducts substantial business there, including at its office in Culver City, California.
`15. Venue is proper in this judicial District pursuant to 28 U.S.C. § 1391(b), (c)
and (d), because a substantial part of the events giving rise to Plaintiff’s claims occurred
`in this District, a substantial portion of the affected interstate trade and commerce was
`carried out in this District, and one or more of the Defendants reside in this District or
`are licensed to do business in this District. TikTok and ByteDance transacted business,
`maintained substantial contacts, or committed tortious acts in this District, causing injury
`to persons residing in, located in, or doing business throughout the United States,
including in this District. TikTok is headquartered in Los Angeles County and
conducts substantial business activities there. Plaintiff Frazier and the proposed class
have been, and continue to be, injured as a result of TikTok’s and ByteDance’s illegal
`conduct in the County of Los Angeles.
`PARTIES
16. Plaintiff Frazier is a resident of Las Vegas, Nevada who works as a Content
Moderator, reviewing content for ByteDance and TikTok. At all relevant times, Plaintiff
has been employed by Telus International.
`17. Defendant ByteDance Inc. is, and at all relevant times was, a Delaware
`corporation with its principal place of business in Mountain View, California.
18. Defendant TikTok Inc. (“TikTok”) is, and at all relevant times was, a California
corporation with its principal place of business at 5800 Bristol Pkwy, Culver City, Los
`Angeles County, California. Defendant TikTok also maintains offices in Palo Alto,
`California and Mountain View, California. TikTok is owned by ByteDance. Defendants
`ByteDance Inc. and TikTok Inc. are referred to collectively as the “ByteDance
`Defendants.”
19. In doing the things alleged herein, each of the ByteDance Defendants was
`aware of and was aiding and abetting the actions of the other.
`
`4
`
`Civil Action No. 2:21-cv-9913
`COMPLAINT AND DEMAND FOR JURY TRIAL
`
`1
`2
`3
`4
`5
`6
`7
`8
`9
`10
`11
`12
`13
`14
`15
`16
`17
`18
`19
`20
`21
`22
`23
`24
`25
`26
`27
`28
`
`
`
`
`
`
`Case 2:21-cv-09913 Document 1 Filed 12/23/21 Page 6 of 36 Page ID #:6
`
`
`
20. Non-party Telus International (“Telus”) is a dual-listed public company
`trading on the New York Stock Exchange and Toronto Stock Exchange. Non-party
`Telus International Holding (U.S.A.) Corp. is incorporated under the laws of Delaware
with its headquarters located at 2251 South Decatur Boulevard, Las Vegas, Nevada
89102. Telus International has several locations in the United States, including in Las
Vegas, Nevada; Folsom, California; and North Charleston, South Carolina. Telus’s
Folsom branch is located at 255 Parkshore Drive, Folsom, California 95630. Non-parties
Telus International and Telus International Holding (U.S.A.) Corp. are referred to
collectively as “Telus.”
`
`FACTUAL ALLEGATIONS
`A. Content moderators watch and remove the most depraved images on the
`internet to protect ByteDance’s profits.
`21. Content moderation is the job of removing online material that is
`objectionable, offensive, or otherwise violates the terms of use for social networking
`sites like TikTok.
22. In fiscal year 2020, ByteDance made approximately $34.3 billion in
`advertising revenue. In 2019, that number was $17 billion, and in 2018 that number was
`$7.4 billion. ByteDance accomplished this in part due to the popularity of its TikTok
`app. TikTok is a booming social media platform that allows posting of videos.
`23. TikTok is attractive to companies and individuals that want to buy
`advertisements because of its immense user base. TikTok has over 130 million active
`users. These users value TikTok for its plethora of content and ability to share
`information. Further, TikTok is immensely popular with younger demographics, a key
`group for advertising.
`24. According to a November 5, 2019 article in The Washington Post, “[t]he
`short-video app has become a global phenomenon and has taken young American
`audiences by storm, blending silly jokes, stunts and personal stories into a tech
`powerhouse downloaded more than 1.3 billion times worldwide.”
`5
`Civil Action No. 2:21-cv-9913
`
`COMPLAINT AND DEMAND FOR JURY TRIAL
`
`1
`2
`3
`4
`5
`6
`7
`8
`9
`10
`11
`12
`13
`14
`15
`16
`17
`18
`19
`20
`21
`22
`23
`24
`25
`26
`27
`28
`
`
`
`
`
`
`Case 2:21-cv-09913 Document 1 Filed 12/23/21 Page 7 of 36 Page ID #:7
`
`
`
`25. To generate this content, ByteDance relies on users to upload videos to its
platform. TikTok users spend an average of almost an hour a day on the app, with younger
`individuals spending even more time on the app.
26. The amount of content on TikTok is massive: more than a billion videos are
viewed on its platform each day, and the app has millions of active users.
27. Instead of scrutinizing content before it is uploaded, ByteDance and TikTok
rely on users to report inappropriate content. ByteDance and TikTok receive millions of
user reports of potentially objectionable content on their platforms. Human moderators
`review the reported content – sometimes thousands of videos and images every shift –
`and remove those that violate ByteDance’s and TikTok’s terms of use.
`28. Upon receiving a report from a user about inappropriate content,
`ByteDance and TikTok send that video to their Content Moderators. These videos
`include animal cruelty, torture, suicides, child abuse, murder, beheadings, and other
graphic content. The videos are each sent to two Content Moderators, who review the
`videos and determine if the video should remain on the platform, be removed from the
`platform, or have its audio muted.
`29. ByteDance and TikTok require Content Moderators to review hundreds of
`thousands if not millions of potentially rule-breaking posts per week via ByteDance’s
and TikTok’s review software. Due to the sheer volume of content, Content Moderators
are permitted no more than 25 seconds per video and view three to ten videos
simultaneously.
`30. Content moderators are constantly monitored by ByteDance and TikTok
`through their software. ByteDance’s TCS software allows ByteDance and TikTok to
`watch and monitor Content Moderators. This functionality is used to supervise
`employees to ensure they remain on the platform at all times during work hours and
strictly adhere to break times.
`31. ByteDance and TikTok recognize the dangers of exposing their users to
`images and videos of graphic violence. In December 2020, TikTok updated its
`6
`Civil Action No. 2:21-cv-9913
`
`COMPLAINT AND DEMAND FOR JURY TRIAL
`
`1
`2
`3
`4
`5
`6
`7
`8
`9
`10
`11
`12
`13
`14
`15
`16
`17
`18
`19
`20
`21
`22
`23
`24
`25
`26
`27
`28
`
`
`
`
`
`
`Case 2:21-cv-09913 Document 1 Filed 12/23/21 Page 8 of 36 Page ID #:8
`
`
`
community guidelines, available at https://newsroom.tiktok.com/en-us/refreshing-our-
policies-to-support-community-well-being, to foster well-being on its platform and to
address distressing content like suicide and self-harm.
32. In September 2020, Theo Bertram, TikTok’s Director of Government
`Relations and Public Policy, told British politicians that TikTok has over 10,000 content
`moderators worldwide.
`B. Repeated exposure to graphic imagery can cause devastating psychological
`trauma, including PTSD, anxiety, and depression.
33. It is well known that exposure to images of graphic violence can cause
`debilitating injuries, including PTSD, anxiety, and depression.
34. In a study conducted by the National Crime Squad in the United Kingdom,
`seventy-six percent of law enforcement officers surveyed reported feeling emotional
`distress in response to exposure to child abuse on the internet. The same study, which
`was co-sponsored by the United Kingdom’s Association of Chief Police Officers,
`recommended that law enforcement agencies implement employee support programs to
`help officers manage the traumatic effects of exposure to child pornography.
35. In a study of 600 employees of the Department of Justice’s Internet Crimes
`Against Children task force, the U.S. Marshals Service found that a quarter of the
`cybercrime investigators surveyed displayed symptoms of psychological trauma,
`including secondary traumatic stress.
`36. Another study of cybercrime investigators from 2010 found that “greater
`exposure to disturbing media was related to higher levels of . . . secondary traumatic
`stress” and that “substantial percentages” of investigators exposed to disturbing media
`“reported poor psychological well-being.”
`37. The Eyewitness Media Hub has also studied the effects of viewing videos
`of graphic violence, including suicide bombing, and found that “40 percent of survey
`respondents said that viewing distressing eyewitness media has had a negative impact on
`their personal lives.”
`7
`
`Civil Action No. 2:21-cv-9913
`COMPLAINT AND DEMAND FOR JURY TRIAL
`
`1
`2
`3
`4
`5
`6
`7
`8
`9
`10
`11
`12
`13
`14
`15
`16
`17
`18
`19
`20
`21
`22
`23
`24
`25
`26
`27
`28
`
`
`
`
`
`
`Case 2:21-cv-09913 Document 1 Filed 12/23/21 Page 9 of 36 Page ID #:9
`
`
`
`38. Whereas viewing or hearing about another person’s traumatic event used to
`be considered “secondary traumatic stress,” the current Diagnostic and Statistical
`Manual of Mental Disorders (American Psychiatric Association, 5th ed. 2013) (“DSM-
`5”) recognizes that secondary or indirect exposure to trauma, such as repeated or
`extreme exposure to aversive details of trauma through work-related media, meets the
`first diagnostic criterion for PTSD.
`39. While there is no way to eliminate the risk created by exposure to graphic
`and objectionable content, especially demanding job requirements or a lack of social
`support reduce resilience in the face of trauma exposure and increase the risk of
`developing debilitating psychological symptoms.
`40. Depending on many factors, individuals who have experienced
`psychological trauma may develop a range of subtle to significant physical and
`psychological symptoms, including extreme fatigue, dissociation, difficulty sleeping,
`excessive weight gain, anxiety, nausea, and other digestive issues.
`41. Trauma exposure and PTSD are also associated with increased risk of
`chronic health problems including cardiovascular conditions, pain syndromes, diabetes,
`and dementia.
`42. There is growing evidence that early identification and treatment of PTSD
`is important from a physical health perspective, as a number of meta-analyses have
`shown increased risk of cardiovascular, metabolic, and musculoskeletal disorders
`among patients with long-term PTSD.
`43. Psychological trauma and PTSD are also often associated with the onset or
`worsening of substance use disorders. Epidemiologic studies indicate that one-third to
`one-half of individuals with PTSD also have a substance use disorder. Compared to
`individuals without PTSD, those with PTSD have been shown to be more than twice as
`likely to meet the diagnostic criteria for alcohol abuse or dependence; individuals with
`PTSD are also three to four times more likely to meet the diagnostic criteria for drug
`abuse or dependence.
`8
`
`Civil Action No. 2:21-cv-9913
`COMPLAINT AND DEMAND FOR JURY TRIAL
`
`1
`2
`3
`4
`5
`6
`7
`8
`9
`10
`11
`12
`13
`14
`15
`16
`17
`18
`19
`20
`21
`22
`23
`24
`25
`26
`27
`28
`
`
`
`
`
`
`Case 2:21-cv-09913 Document 1 Filed 12/23/21 Page 10 of 36 Page ID #:10
`
`
`
`44. PTSD symptoms may manifest soon after the traumatic experiences, or they
`may manifest later, sometimes months or years after trauma exposure. The Americans
with Disabilities Act recognizes that certain diseases can manifest as disabilities, and the
ADA website describes PTSD as a “hidden disability”:
`https://www.ada.gov/servicemembers_adainfo.html.
`45. An individual’s risk of developing PTSD or associated symptoms may be
`reduced through prevention measures, categorized as primary, secondary, and tertiary
`interventions. Primary interventions are designed to increase resilience and lower the
`risk of future PTSD among the general population. Secondary interventions are designed
`to lower the risk of PTSD among individuals who have been exposed to trauma, even if
`they are not yet showing symptoms of traumatic stress. Finally, tertiary interventions are
`designed to prevent the worsening of symptoms and improve functioning in individuals
`who are already displaying symptoms of traumatic stress or who have been diagnosed
`with PTSD.
46. Individuals who develop PTSD or other mental health conditions following
`traumatic exposure require not only preventative measures but also treatment. Unlike
`prevention, treatment measures are aimed at symptom resolution and recovery from the
`disorder.
`47. Preliminary screening is necessary to determine the types of prevention or
`treatment measures most appropriate for an individual.
C. ByteDance and TikTok control the means and manner in which content
moderation occurs.
`48. ByteDance employs Telus to supply it with content moderators who work
directly for Telus but under the direction and control of ByteDance. Plaintiff is one of the
`content moderators hired by Telus to perform content moderation work for the benefit of
`ByteDance.
`
`9
`
`Civil Action No. 2:21-cv-9913
`COMPLAINT AND DEMAND FOR JURY TRIAL
`
`1
`2
`3
`4
`5
`6
`7
`8
`9
`10
`11
`12
`13
`14
`15
`16
`17
`18
`19
`20
`21
`22
`23
`24
`25
`26
`27
`28
`
`
`
`
`
`
`Case 2:21-cv-09913 Document 1 Filed 12/23/21 Page 11 of 36 Page ID #:11
`
`
`
49. ByteDance and TikTok provide Telus and Content Moderators with
ByteDance’s proprietary review software (“TCS”) to moderate and tag videos, as well as
the tags Content Moderators are required to apply to each video they view and moderate.
50. ByteDance and TikTok use the TCS software not only as a platform for
`video review, but also to monitor the performance of Telus employees daily. ByteDance
`and TikTok use this software to monitor whether quotas are being met and to track the
`time Content Moderators spend away from the TCS application.
`
`10
`
`Civil Action No. 2:21-cv-9913
`COMPLAINT AND DEMAND FOR JURY TRIAL
`
`1
`2
`3
`4
`5
`6
`7
`8
`9
`10
`11
`12
`13
`14
`15
`16
`17
`18
`19
`20
`21
`22
`23
`24
`25
`26
`27
`28
`
`
`
`
`
`
`Case 2:21-cv-09913 Document 1 Filed 12/23/21 Page 12 of 36 Page ID #:12
`
`
`
`
`
51. ByteDance and TikTok cause Telus to withhold payment from Content
Moderators if they are off the TCS application for longer than their allotted breaks (two
fifteen-minute breaks and one hour-long lunch break in a twelve-hour workday),
directly determining employee compensation.
`
52. ByteDance and TikTok further send “adherence letters,” or adhesion
contracts, to Content Moderators daily. Adhesion contracts are standard-form or
boilerplate contracts that are typically offered on a take-it-or-leave-it basis. Content
Moderators are also subject to weekly ByteDance and TikTok tests, which provide
videos used to calibrate their software. ByteDance and TikTok directly provide Content
Moderators with the tags to be used on each video.
`11
`
`Civil Action No. 2:21-cv-9913
`COMPLAINT AND DEMAND FOR JURY TRIAL
`
`1
`2
`3
`4
`5
`6
`7
`8
`9
`10
`11
`12
`13
`14
`15
`16
`17
`18
`19
`20
`21
`22
`23
`24
`25
`26
`27
`28
`
`
`
`
`
`
`Case 2:21-cv-09913 Document 1 Filed 12/23/21 Page 13 of 36 Page ID #:13
`
`
`D. ByteDance and TikTok do not meet industry standards for mitigating harm
`to Content Moderators.
53. ByteDance and TikTok are aware of the damage that disturbing imagery
can have on Content Moderators. Through its TikTok app, ByteDance is a member
of the Technology Coalition, created “to develop technology solutions to disrupt the
ability to use the Internet to
`exploit children or distribute child pornography.”
`54. Other members of the Technology Coalition include Facebook, YouTube,
`Snap Inc., and Google, all firms with similar content moderation challenges.
55. In January 2015, the Technology Coalition published an “Employee
`Resilience Guidebook for Handling Child Sex Abuse Images” (the “Guidebook”).
`56. According to the Guidebook, the technology industry “must support those
`employees who are the front line of this battle.”
`57. The Guidebook recommends that internet companies implement a robust,
`formal “resilience” program to support Content Moderators’ well-being and mitigate the
`effects of exposure to trauma-inducing imagery.
`58. With respect to hiring Content Moderators, the Guidebook recommends:
`a. In an informational interview, “[u]se industry terms like ‘child sexual
`abuse imagery’ and ‘online child sexual exploitation’ to describe subject
`matter”;
`b. In an informational interview, “[e]ncourage candidate to go to websites
`[like the National Center for Missing and Exploited Children] to learn
`about the problem”;
`c. In follow-up interviews, “[d]iscuss candidate’s previous
`experience/knowledge with this type of content”;
`d. In follow-up interviews, “[d]iscuss candidate’s current level of comfort
after learning more about the subject”;
`e. In follow-up interviews, “[a]llow candidate to talk with employees who
`handle content about their experience, coping methods, etc.”; and
`f. In follow-up interviews, “[b]e sure to discuss any voluntary and/or
`mandatory counseling programs that will be provided if candidate is
`hired.”
`59. With respect to safety on the job, the Guidebook recommends:
`a. Limiting the amount of time an employee is exposed to child sexual
`abuse imagery;
`b. Teaching moderators how to assess their own reaction to the images;
`c. Performing a controlled content exposure during the first week of
`employment with a seasoned team member and providing follow up
`counseling sessions to the new employee;
`d. Providing mandatory group and individual counseling sessions
`administered by a professional with specialized training in trauma
`intervention; and
e. Permitting moderators to “opt-out” from viewing child sexual abuse
imagery.
`60. The Technology Coalition also recommends the following practices for
`minimizing exposure to graphic content:
`a. Limiting time spent viewing disturbing media to “no more than four
`consecutive hours”;
`b. “Encouraging switching to other projects, which will allow professionals
`to get relief from viewing images and come back recharged and
`refreshed”;
c. Using “industry-shared hashes to more easily detect and report [content]
and in turn, limit employee exposure to these images. Hash technology
allows for identification of exactly the same image previously seen and
identified as objectionable” (a minimal illustration of this hash-matching
concept appears in the sketch following this list);
`13
`
`Civil Action No. 2:21-cv-9913
`COMPLAINT AND DEMAND FOR JURY TRIAL
`
`1
`2
`3
`4
`5
`6
`7
`8
`9
`10
`11
`12
`13
`14
`15
`16
`17
`18
`19
`20
`21
`22
`23
`24
`25
`26
`27
`28
`
`
`
`
`
`
`Case 2:21-cv-09913 Document 1 Filed 12/23/21 Page 15 of 36 Page ID #:15
`
`
`
d. Prohibiting Content Moderators from viewing child pornography during
the hour before they leave work; and
`e. Permitting Content Moderators to take time off as a response to trauma.
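For illustration only, the hash-matching technique referenced in item (c) above can be sketched in a few lines of Python. This is a minimal sketch assuming an exact-match blocklist keyed on SHA-256 digests; the complaint does not describe any particular implementation, and the names and digest value below are hypothetical placeholders.

    import hashlib

    # Hypothetical blocklist: digests of images previously reviewed and
    # identified as objectionable (placeholder value for illustration).
    KNOWN_OBJECTIONABLE = {
        "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
    }

    def image_digest(image_bytes: bytes) -> str:
        # A cryptographic hash identifies exactly the same image bytes as
        # ones previously seen, as the Guidebook language above describes.
        return hashlib.sha256(image_bytes).hexdigest()

    def is_known_objectionable(image_bytes: bytes) -> bool:
        # A matching digest lets the image be removed automatically,
        # limiting a human moderator's exposure to it.
        return image_digest(image_bytes) in KNOWN_OBJECTIONABLE

Exact-match hashing catches only byte-identical copies; industry systems also use perceptual hashes (for example, PhotoDNA) to match re-encoded or slightly altered images, which is consistent with the Guidebook’s reference to “industry-shared” hashes.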
`61. According to the Technology Coalition, if a company contracts with a third-
`party vendor to perform duties that may bring vendor employees in contact with graphic
`content, the company should clearly outline procedures to limit unnecessary exposure
`and should perform an initial audit of the independent contractor’s wellness procedures
`for its employees.
`62. The National Center for Missing and Exploited Children (“NCMEC”) also
`promulgates guidelines for protecting Content Moderators from psychological trauma.
`For instance, NCMEC recommends changing the color or resolution of the image,
`superimposing a grid over the image, changing the direction of the image, blurring
`portions of the image, reducing the size of the image, and muting audio.
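As a concrete illustration of these NCMEC recommendations, several of the listed transformations can be applied with the Pillow imaging library in Python. The tooling choice is an assumption made for illustration; NCMEC’s guidelines do not prescribe any specific software.

    from PIL import Image, ImageFilter

    def soften_for_review(path: str) -> Image.Image:
        # Apply NCMEC-style mitigations before a moderator views an image.
        img = Image.open(path)
        img = img.convert("L")  # change color: render in grayscale
        img = img.filter(ImageFilter.GaussianBlur(radius=4))  # blur the image
        w, h = img.size
        # Reduce the image to a quarter of its original dimensions.
        return img.resize((max(1, w // 4), max(1, h // 4)))

Muting audio and superimposing a grid are analogous one-line operations in common video and graphics toolkits.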
`63. Based on these industry standards, some internet companies take steps to
`minimize harm to Content Moderators. For instance, at Microsoft, “[t]he photos are
`blurred, rendered in black and white, and shown only in thumbnail sizes. Audio is
`removed from video.” Filtering technology is used to distort images, and Content
`Moderators are provided with mandatory psychological counseling.
`64. At the UK’s Internet Watch Foundation, each applicant for a content
`moderator position is assessed for suitability by a psychologist, who asks about their
`support network, childhood experiences, and triggers. Applicants are then interviewed
`about their work skills before proceeding to a final interview where they are exposed to
`child sexual abuse imagery. Candidates sit with two current Content Moderators and
`review a sequence of images getting progressively worse, working towards the worst
`kinds of sexual violence against children. This stage is designed to see how candidates
`cope and let them decide whether they wish to continue with the application process.
Once they accept the job, Content Moderators have an enhanced background check
`before they start their six months’ training, which involves understanding criminal law,
`learning about the dark web, and, crucially, building relevant trauma resilience.
`E. ByteDance and TikTok fail to implement meaningful protections for their
`Content Moderators.
65. ByteDance and TikTok fail to implement workplace safety measures that
meet the industry standards that other companies and non-profits have adopted.
ByteDance and TikTok have further failed to implement the standards suggested by the
Technology Coalition, despite ByteDance’s membership in it.
66. ByteDance and TikTok do not ensure that Content Moderators are asked
about their previous experience with graphic content, nor do they ensure that candidates
are told that this content can have a significant negative mental health impact on content
moderators. Content Moderators are not permitted to preview the graphic content, nor are
they advised to seek out outside information about it during the hiring process.
`67. Before Content Moderators are exposed to any graphic content or receive
`any training, they are required to sign an employment contract and Non-Disclosure
`Agreement (“NDA”). Only after these documents are signed does the training begin.
`68. ByteDance and TikTok do not require that Content Moderators be trained
`about how to address their own reaction to the images.
`69. ByteDance and TikTok fail to provide safeguards known to mitigate the
`negative effects of reviewing graphic content.
`70. Content Moderators are required to review hundreds of graphic and
`disturbing videos each week. To determine whether a video should be removed,
`ByteDance and TikTok create and continually revise tags for content that Content
`Moderators must use to determine whether flagged content violates ByteDance’s and
`TikTok’s policies. Lately, TikTok has increased the number of “tags” Content
`Moderators must use while moderating videos from 20 tags to 100 tags. Content
Moderators are now expected not just to review the content of the video, but also to review
`15
`
`Civil Action No. 2:21-cv-9913
`COMPLAINT AND DEMAND FOR JURY TRIAL
`
`1
`2
`3
`4
`5
`6
`7
`8
`9
`10
`11
`12
`13
`14
`15
`16
`17
`18
`19
`20
`21
`22
`23
`24
`25
`26
`27
`28
`
`
`
`
`
`
`Case 2:21-cv-09913 Document 1 Filed 12/23/21 Page 17 of 36 Page ID #:17
`
`
`
video backgrounds and other aspects of the video to make sure they conform to
TikTok’s rules.
71. Despite these complex Community Guidelines, ByteDance and TikTok
impose strict quantity and accuracy quotas on Content Moderators. Content Moderators
are required to review videos for no longer than 25 seconds. Within the 25 permitted
seconds of reviewing each video, Content Moderators are expected to achieve an
accuracy rate of 80%. Content Moderators review three to ten videos at the same time.
Despite this, they are continuously surveilled and pushed by the TCS software to review
videos faster and faster.
72. To determine whether Content Moderators meet these metrics, ByteDance
and TikTok audit Content Moderator’s