`Filing # 104496395 E-Filed 03/06/2020 03:50:49 PM
`
`IN THE CIRCUIT COURT OF THE THIRTEENTH JUDICIAL CIRCUIT
`IN AND FOR HILLSBOROUGH COUNTY, FLORIDA
`CIRCUIT CIVIL DIVISION
`
DEBRYNNA GARRETT, ALEXANDER C. ROBERTS,
TIMOTHY DIXON, JR., KONICA RITCHIE,
JESSICA YOUNG, LAMOND RICHARDSON,
ANGELA CANSINO, JOHNNY OLDEN,
KATRINA EVANS, DANIEL WALKER,
TODD ALEXANDER, ELTON GOULD,
LAMEKA DOTSON, NICHOLAS COLLINS,
REMEAL EUBANKS, TANIA PAUL,
GABRIELLE MURRELL, COURTNEY NELSON,
individually and on behalf of all others similarly situated,

Plaintiffs,

v.                                                  Case No.: 20-CA-001146

FACEBOOK, INC., and COGNIZANT
TECHNOLOGY SOLUTIONS U.S. CORPORATION,

Defendants.
_______________________________________________/
`
`AMENDED CLASS ACTION COMPLAINT AND DEMAND FOR JURY TRIAL
`
`
`
Plaintiffs DEBRYNNA GARRETT, ALEXANDER C. ROBERTS, TIMOTHY DIXON, JR., KONICA RITCHIE, JESSICA YOUNG, LAMOND RICHARDSON, ANGELA CANSINO, JOHNNY OLDEN, KATRINA EVANS, DANIEL WALKER, TODD ALEXANDER, ELTON GOULD, LAMEKA DOTSON, NICHOLAS COLLINS, REMEAL EUBANKS, TANIA PAUL, GABRIELLE MURRELL and COURTNEY NELSON (“Plaintiffs”) hereby sue the Defendants, FACEBOOK, INC. (“Facebook”), and COGNIZANT TECHNOLOGY SOLUTIONS U.S. CORPORATION (“Cognizant”) (collectively, “Defendants”) to protect themselves and all others similarly situated from the dangers of psychological trauma resulting from Defendants’ failure to provide a safe workplace for the thousands of “content moderators” who are entrusted to provide the safest environment possible for Facebook users.
`
`
`
`· LECHNER LAW ·
`
`
`
`Case 8:20-cv-00585-MSS-CPT Document 1-1 Filed 03/12/20 Page 2 of 33 PageID 11
`
`BACKGROUND
`
`
`
`1.
`
`Every day, Facebook users post millions of videos, images, and livestreamed
`
`broadcasts of child sexual abuse, rape, torture, bestiality, beheadings, suicide, racist violence and
`
`murder. To maintain a sanitized platform, maximize its already vast profits, and cultivate its public
`
`image, Facebook relies on people like Plaintiffs – known as “content moderators” – to view those
`
`posts and remove any that violate the corporation’s terms of use.
`
`
`
`2.
`
`From their cubicles during the overnight shift in Cognizant’s Tampa and Phoenix
`
`offices, Plaintiffs witnessed thousands of acts of extreme and graphic violence. As another
`
`Facebook content moderator recently told the Guardian, “You’d go into work …, turn on your
`
`computer and watch someone have their head cut off. Every day, every minute, that’s what you
`
`see. Heads being cut off.”
`
`
`
`3.
`
`As a result of constant and unmitigated exposure to highly toxic and extremely
`
`disturbing images through Facebook’s content review systems, Plaintiffs and other class members
`
`developed and suffer from significant psychological trauma and/or post-traumatic stress disorder
`
`(“PTSD”).
`
`
`
`4.
`
`In an effort to cultivate its image, Facebook helped draft workplace safety standards
`
`to protect content moderators like Plaintiffs and the proposed class from workplace trauma and
`
`associated adverse consequences, which include pre-hiring psychological screening; providing
`
`moderators with robust and mandatory counseling and mental health support; altering the
`
`resolution, audio, size, and color of trauma-inducing images; and training moderators to recognize
`
`the physical and psychological symptoms of PTSD.
`
`
`
`5.
`
`Other recommended safety standards include: having content moderators work in
`
`pairs or teams rather than alone; improving working conditions by not focusing solely on
`
`
`
`2
`
`
`
`Case 8:20-cv-00585-MSS-CPT Document 1-1 Filed 03/12/20 Page 3 of 33 PageID 12
`
`efficiency and productivity; and providing additional breaks or “wellness time” during periods of
`
`extraordinary stress. In addition, Cognizant employees requested to change their queues. For
`
`example, several content moderators asked the company to change which queues they were
`
`assigned, whereby Workforce Management could periodically place a moderator in less graphic
`
`queues, such as regulated goods. Defendants failed to implement any of these safety standards.
`
`
`
`6.
`
`But Defendants ignore the very workplace safety standards they helped create.
`
`Instead, the multibillion-dollar corporations affirmatively require their content moderators to work
`
`under conditions known to cause and exacerbate psychological trauma.
`
`
`
`7.
`
`Facebook contracts with companies like Cognizant to serve as its agent responsible
`
`for finding, hiring and employing the moderators, and then laying them off when the contract
`
`expires, thereby attempting to absolve Defendants of accountability for the mental health of
`
`(offering no psychological support to) their workers after they are laid off. In fact, Cognizant has
`
`shut down its Tampa and Phoenix offices in February 2020, laying off its workforce and leaving
`
`content moderators, including Plaintiffs, with no means of obtaining requisite ongoing medical
`
`monitoring, screening, diagnosis, or adequate treatment after suffering psychological trauma
`
`during their employment.
`
`
`
`8.
`
`By requiring their content moderators to work in dangerous conditions that cause
`
`debilitating physical and psychological harm and then laying them off when the contract expires
`
`in order to absolve themselves of accountability for their mental health issues, Defendants violate
`
`Florida and Arizona law.
`
`
`
9. Without this Court’s intervention, Defendants will continue to breach the duties they owe to the content moderators who review content on Facebook’s platforms.

10. Content moderators are essentially the first responders of the internet, performing a critical function on a platform with billions of users. Moderators are often the first to see emergency situations, which they report to Facebook for referral to local authorities. Plaintiffs were specifically referred to as “first responders,” and Facebook compiles statistics about how moderators assist law enforcement. Plaintiffs and the other content moderators deserve, at a minimum, the same protections as other first responders, including workers’ compensation and health coverage for the PTSD caused by their working conditions.
`
`
`
11. On behalf of themselves and all others similarly situated, Plaintiffs bring this action (1) to ensure that Defendants cease to engage in these unlawful and unsafe workplace practices and instead provide content moderators with safe tools, systems, and mandatory ongoing mental health support, (2) to establish a medical monitoring fund for testing and providing mental health treatment to the thousands of current and former content moderators affected by Defendants’ unlawful practices, and (3) to provide monetary compensation to the thousands of current and former content moderators for the lost wages and medical and mental health expenses incurred as a result of the Defendants’ unlawful practices.
`
`JURISDICTION AND VENUE
`
`
`
12. This is an action for damages in excess of $30,000.00, exclusive of interest and costs, and for equitable relief.
`
`
`
13. Venue is proper in this Court because the unlawful conduct giving rise to the claims herein occurred within this judicial district, and at least one Defendant is located in this judicial district.
`
`
`
14. This Court has personal jurisdiction over Cognizant because the corporation operates, conducts, engages in, and carries on a business venture in this state and has an office in this judicial circuit, at 7725 Woodland Center Blvd., Tampa, FL 33614, and regularly conducts substantial business there, committed a tortious act within Florida, and engages in substantial and not isolated activity within Florida.
`
`
`
`15.
`
`This Court has personal jurisdiction over Facebook because the corporation
`
`operates, conducts, engages in, and carries on a business venture in this state and has an office in
`
`this state, located at 701 Brickell Ave., Miami, Florida 33131, and regularly conducts substantial
`
`business there and throughout the state, committed a tortious act within Florida, and engages in
`
`substantial and not isolated activity within Florida.
`
`PARTIES
`
`
`
`16.
`
`Plaintiffs Garrett, Roberts, Dixon, Ritchie, Young, Richardson, Cansino, Olden,
`
`Evans, Walker, Alexander, Dotson, Collins, Eubanks, Paul, Murrell and Nelson are residents of
`
`Hillsborough County, Florida. Plaintiff Gould is a resident of Pasco County, Florida. Plaintiff
`
`Roberts is a citizen of Arizona.
`
`
`
17. Defendant Facebook provides “products that enable people to connect and share with friends and family through mobile devices, personal computers, and other surfaces” or “to share their opinions, ideas, photos and videos, and other activities with audiences ranging from their closest friends to the public at large.” Facebook is a publicly traded corporation incorporated under the laws of Delaware, with its headquarters located at 1601 Willow Road, Menlo Park, California, 94025.
`
`
`
18. Defendant Cognizant is a professional services vendor that employed approximately 800 workers at its Facebook content moderation site in Tampa. Cognizant is a publicly traded corporation incorporated under the laws of Delaware, with its headquarters located at 211 Quality Circle, College Station, TX 77845.
`
`
`
`5
`
`
`
`Case 8:20-cv-00585-MSS-CPT Document 1-1 Filed 03/12/20 Page 6 of 33 PageID 15
`
`FACTUAL ALLEGATIONS
`
A. Content moderators watch and remove some of the most depraved images on the internet to protect users of Facebook’s products from trauma-inducing content.
`
`
`
19. Content moderation is the practice of removing online material that violates the terms of use for social networking sites or applications like Facebook.com and Instagram.
`
`
`
`20.
`
`Instead of scrutinizing content before it is published to its users, Facebook primarily
`
`relies on users to report inappropriate content. Facebook receives more than one million user
`
`reports of potentially objectionable content on its social media sites and applications every day.
`
`Human moderators review the reported content – sometimes thousands of videos and images every
`
`shift – and remove those that violate Facebook’s terms of use.
`
`
`
21. After content is flagged, Facebook’s algorithms direct it to a content moderator, who then reviews it using a platform developed by Facebook.
`
`
`
`22.
`
`Facebook asks content moderators to review more than 10 million potentially rule-
`
`breaking posts per week via its review platforms. Facebook seeks to ensure all user-reported
`
`content is reviewed within 24 hours of a report and with an overall error rate of less than one
`
`percent.
`
`
`
23. Facebook has developed and continually revises hundreds of rules that content moderators use to determine whether flagged content – i.e., posts, comments, messages, images, videos, advertisements, etc. – violates Facebook’s policies.
`
`
`
24. Facebook has also developed expectations for the amount of time a content moderator should need to review different types of flagged content.
`
`
`
25. According to Monika Bickert, head of global policy management at Facebook, Facebook conducts weekly audits of every content moderator’s work to ensure that its content rules are being followed consistently.
`
`
`
`6
`
`
`
`Case 8:20-cv-00585-MSS-CPT Document 1-1 Filed 03/12/20 Page 7 of 33 PageID 16
`
`
`
`26.
`
`In August 2015, Facebook rolled out Facebook Live, a feature that allows users to
`
`broadcast live video streams on their Facebook pages. Mark Zuckerberg, Facebook’s chief
`
`executive officer, considers Facebook Live to be instrumental to the corporation’s growth. Mr.
`
`Zuckerberg has been a prolific user of the feature, periodically “going live” on his own Facebook
`
`page to answer questions from users.
`
`
`
27. But Facebook Live also provides a platform for users to livestream murder, beheadings, torture, and even their own suicides, including the following:

● In late April 2017, a father killed his 11-month-old daughter and livestreamed it before hanging himself. Six days later, Naika Venant, a 14-year-old who lived in a foster home, tied a scarf to a shower’s glass doorframe and hanged herself. She streamed the whole suicide in real time on Facebook Live. Then in early May, a Georgia teenager took pills and placed a bag over her head in a suicide attempt. She livestreamed the attempt on Facebook and survived only because viewers watching the event unfold called police, allowing them to arrive before she died.

● On March 15, 2019, a horrific mass shooting at two mosques in Christchurch, New Zealand, which killed 50 people, was livestreamed on Facebook as the shooter pulled up to a mosque, grabbed guns out of his vehicle and stormed inside, opening fire on worshipers. By the time Facebook removed the 17-minute video, it had been viewed roughly 4,000 times, the company said.
`
`
`
28. Facebook understands the dangers associated with a person watching this kind of imagery.
`
`
`
`29.
`
`In the context of protecting users from this kind of content, Mr. Zuckerberg
`
`announced on May 3, 2017:
`
`
`
`7
`
`
`
`Case 8:20-cv-00585-MSS-CPT Document 1-1 Filed 03/12/20 Page 8 of 33 PageID 17
`
`“Over the last few weeks, we’ve seen people hurting themselves and others on
`Facebook—either live or in video posted later. Over the next year, we’ll be adding
`3,000 people to our community operations team around the world – on top of the
`4,500 we have today – to review the millions of reports we get every week, and
`improve the process for doing it quickly.
`
`These reviewers will also help us get better at removing things we don’t allow on
`Facebook like hate speech and child exploitation. And we’ll keep working with
`local community groups and law enforcement who are in the best position to help
`someone if they need it – either because they’re about to harm themselves, or
`because they’re in danger from someone else.”
`
30. According to Sheryl Sandberg, Facebook’s chief operating officer, “Keeping people safe is our top priority. We won’t stop until we get it right.”
`
`
`
31. Today, approximately 15,000 content moderators around the world review content via Facebook’s review platforms.
`
`
`
32. Most of these 15,000 content moderators, like Plaintiffs and the proposed class here, are employed by third-party vendors of Facebook and are not Facebook employees.
`
`
`
33. For many reasons, including short-term contracts and the trauma associated with the work, most content moderators – like Plaintiffs – remain in the position for short periods of time. When the contractors’ contracts expire, the content moderators are laid off and abandoned by Defendants, with no access to adequate or ongoing mental health services or psychological support.
`
B. Repeated exposure to graphic imagery can cause devastating psychological trauma, including PTSD.
`
`
`
`34.
`
`It is well known that exposure to images of graphic violence can cause debilitating
`
`injuries, including PTSD.
`
`
`
`35.
`
`In a study conducted by the National Crime Squad in the United Kingdom, 76
`
`percent of law enforcement officers surveyed reported feeling emotional distress in response to
`
`exposure to child abuse on the internet. The same study, which was co-sponsored by the United
`
`
`
`8
`
`
`
`Case 8:20-cv-00585-MSS-CPT Document 1-1 Filed 03/12/20 Page 9 of 33 PageID 18
`
`Kingdom’s Association of Chief Police Officers, recommended that law enforcement agencies
`
`implement employee support programs to help officers manage the traumatic effects of exposure
`
`to child pornography.
`
`
`
`36.
`
`In a study of 600 employees of the Department of Justice’s Internet Crimes Against
`
`Children task force, the U.S. Marshals Service found that a quarter of the cybercrime investigators
`
`surveyed displayed symptoms related to psychological trauma, including from secondary
`
`traumatic stress.
`
`
`
37. Another study of cybercrime investigators from 2010 found that “greater exposure to disturbing media was related to higher levels of . . . secondary traumatic stress” and that “substantial percentages” of investigators exposed to disturbing media “reported poor psychological well-being.”
`
`
`
38. The Eyewitness Media Hub has also studied the effects of viewing videos of graphic violence, including suicide bombings, and found that “40 percent of survey respondents said that viewing distressing eyewitness media has had a negative impact on their personal lives.”
`
`
`
39. Whereas viewing or hearing about another person’s traumatic event used to be considered “secondary traumatic stress,” the current Diagnostic and Statistical Manual of Mental Disorders (American Psychiatric Association, 5th ed. 2013) (“DSM-5”) recognizes that secondary or indirect exposure to trauma, such as repeated or extreme exposure to aversive details of trauma through work-related media, meets the first diagnostic criterion for PTSD.
`
`
`
`40.
`
`It is well established that stressful work conditions, such as especially demanding
`
`job requirements or a lack of social support, reduce resilience in the face of trauma exposure and
`
`increase the risk of developing debilitating psychological symptoms.
`
41. Depending on many factors, individuals who have experienced psychological trauma may develop a range of subtle to significant physical and psychological symptoms, including extreme fatigue, disassociation, difficulty sleeping, excessive weight gain, anxiety, nausea, and other digestive issues.
`
`
`
42. Trauma exposure and PTSD are also associated with increased risk of chronic health problems including cardiovascular problems, strokes, pain syndromes, diabetes, epilepsy, and dementia.
`
`
`
43. There is growing evidence that early identification and treatment of PTSD is important from a physical health perspective, as a number of meta-analyses have shown increased risk of cardiovascular, metabolic, and musculoskeletal disorders among patients with long-term PTSD.
`
`
`
44. Psychological trauma and/or PTSD are also often associated with the onset or worsening of substance use disorders. Epidemiologic studies indicate that one-third to one-half of individuals with PTSD also have a substance use disorder. Compared to individuals without PTSD, those with PTSD have been shown to be more than twice as likely to meet the diagnostic criteria for alcohol abuse or dependence; individuals with PTSD are also three to four times more likely to meet the diagnostic criteria for drug abuse or dependence.
`
`
`
45. PTSD symptoms may manifest soon after the traumatic experiences, or they may manifest later in life, sometimes months or years after trauma exposure.
`
`
`
46. An individual’s risk of developing PTSD or associated symptoms may be reduced through prevention measures, which include primary, secondary, or tertiary interventions. Primary interventions are designed to increase resilience and lower the risk of future PTSD among the general population. Secondary interventions are designed to lower the risk of PTSD among individuals who have been exposed to trauma, even if they are not yet showing symptoms of traumatic stress. Finally, tertiary interventions are designed to prevent the worsening of symptoms and improve functioning in individuals who are already displaying symptoms of traumatic stress, or have been diagnosed with PTSD.
`
`
`
`47.
`
`Individuals who develop PTSD or other mental health conditions following
`
`traumatic exposure require not only preventative measures but also treatment. Unlike prevention,
`
`treatment measures are aimed at symptom resolution and recovery from the condition.
`
`
`
48. Preliminary screening is necessary to determine which types of prevention or treatment measures are most appropriate for an individual.
`
C. Facebook helped craft industry standards for minimizing harm to content moderators but failed to implement the very standards it helped create.
`
`49.
`
`In 2006, Facebook helped create the Technology Coalition, a collaboration of
`
`internet companies aiming “to develop technology solutions to disrupt the ability to use the Internet
`
`to exploit children or distribute child pornography.”
`
`
`
50. Facebook was a member of the Technology Coalition at all times relevant to the allegations herein.
`
`
`
`51.
`
`In January 2015, the Technology Coalition published an “Employee Resilience
`
`Guidebook for Handling Child Sex Abuse Images” (the “Guidebook”).
`
`
`
52. According to the Guidebook, the technology industry “must support those employees who are the front line of this battle.”
`
`
`
53. The Guidebook recommends that internet companies implement a robust, formal “resilience” program to support content moderators’ well-being and mitigate the effects of exposure to trauma-inducing imagery.
`
54. With respect to hiring content moderators, the Guidebook recommends:

a. In an informational interview, “[u]se industry terms like ‘child sexual abuse imagery’ and ‘online child sexual exploitation’ to describe subject matter.”

b. In an informational interview, “[e]ncourage candidate to go to websites [like the National Center for Missing and Exploited Children] to learn about the problem.”

c. In follow-up interviews, “[d]iscuss candidate’s previous experience/knowledge with this type of content.”

d. In follow-up interviews, “[d]iscuss candidate’s current level of comfort after learning more about the subject.”

e. In follow-up interviews, “[a]llow candidate to talk with employees who handle content about their experience, coping methods, etc.”

f. In follow-up interviews, “[b]e sure to discuss any voluntary and/or mandatory counseling programs that will be provided if candidate is hired.”
`
55. With respect to safety on the job, the Guidebook recommends:

a. Limiting the amount of time an employee is exposed to child sexual abuse imagery;

b. Teaching moderators how to assess their own reaction to the images;

c. Performing a controlled content exposure during the first week of employment with a seasoned team member and providing follow-up counseling sessions to the new employee;

d. Providing mandatory group and individual counseling sessions administered by a professional with specialized training in trauma intervention; and

e. Permitting moderators to “opt-out” from viewing child sexual abuse imagery.
`
56. The Technology Coalition also recommends the following practices for minimizing exposure to graphic content:

a. Limiting time spent viewing disturbing media to “no more than four consecutive hours;”

b. “Encouraging switching to other projects, which will allow professionals to get relief from viewing images and come back recharged and refreshed;”

c. Using “industry-shared hashes to more easily detect and report [content] and in turn, limit employee exposure to these images. Hash technology allows for identification of exactly the same image previously seen and identified as objectionable;”

d. Prohibiting moderators from viewing child pornography one hour before the individuals leave work; and

e. Permitting moderators to take time off as a response to trauma.
`
57. According to the Technology Coalition, if a company contracts with a third-party vendor to perform duties that may bring vendor employees in contact with graphic content, the company should clearly outline procedures to limit unnecessary exposure and should perform an initial audit of the independent contractor’s wellness procedures for its employees.
`
`
`
58. The National Center for Missing and Exploited Children (“NCMEC”) also promulgates guidelines for protecting content moderators from psychological trauma. For instance, NCMEC recommends changing the color or resolution of the image, superimposing a grid over the image, changing the direction of the image, blurring portions of the image, reducing the size of the image, and muting audio.
`
`
`
`13
`
`
`
`Case 8:20-cv-00585-MSS-CPT Document 1-1 Filed 03/12/20 Page 14 of 33 PageID 23
`
`
`
59. Based on these industry standards, some internet companies take steps to minimize harm to content moderators. Defendants, however, have taken none of the mitigating measures set forth above. Cognizant did not even conduct psychological evaluations of new employees, including Plaintiffs, to determine whether they were a good fit for the job. Although counselors were on staff, they did not provide any real counseling services. In fact, management was told to monitor how much time employees spent with counselors in order to discourage use of counseling services.
`
`
`
60. Content moderators review thousands of traumatic images each day through Facebook’s review platforms without the benefit of these known safeguards and with little training on how to handle the resulting distress.
`
`
`
`61.
`
`In addition, Facebook sets overarching standards relating to the timeframe for and
`
`accuracy of review.
`
`
`
`62.
`
`Plaintiffs and other content moderators at the Tampa and Phoenix Cognizant
`
`facilities faced relentless pressure from their bosses to better enforce Facebook’s community
`
`standards, which receive near-daily updates that leave its contractor workforce in a perpetual state
`
`of uncertainty. The Tampa site, which has been accurately referred to in the press as a
`
`“sweatshop,” has routinely failed to meet the 98 percent “accuracy” target set by Facebook. With
`
`a score hovering around 92, it has been Facebook’s worst-performing site in North America.
`
`
`
`63.
`
`In mid-2019, a “Wellness Summit” was held, where Facebook displayed the
`
`productivity statistics for its content moderator contractors. Cognizant was on the bottom of the
`
`list. After this Summit, Cognizant issued directives to begin writing up employees for lack of
`
`production and taking too much “wellness” time.
`
`
`
64. Following the Wellness Summit, employees, including Plaintiffs, were pushed hard. Employees were expected to review 300 pieces of content per day, and they were written up for lack of production and for taking too much “wellness” time (e.g., time to “decompress” during a shift). These write-ups were based on a directive from Cognizant, which originally came from Facebook.
`
`
`
`65.
`
`Facebook understands that its standards impose intense pressure and stress on
`
`content moderators, and that such stress contributes to and exacerbates content moderators’ risk of
`
`developing psychological trauma.
`
`
`
`66. As one moderator described the job:
`
`“[The moderator] in the queue (production line) receives the tickets (reports)
`randomly. Texts, pictures, videos keep on flowing. There is no possibility to know
`beforehand what will pop up on the screen. The content is very diverse. No time is
`left for a mental transition. It is entirely impossible to prepare oneself
`psychologically. One never knows what s/he will run into. It takes sometimes a few
`seconds to understand what a post is about. The agent is in a continual situation of
`stress. The speed reduces the complex analytical process to a succession of
`automatisms. The moderator reacts. An endless repetition. It becomes difficult to
`disconnect at the end of the eight-hour shift.”
`
67. Facebook also demands that its content moderation vendors, including Cognizant, require their employees to sign sweeping Non-Disclosure Agreements (“NDAs”). Facebook further requires its vendors to provide Facebook-developed training to all content moderators to instruct the moderators not to speak about the content or workplace conditions to anyone outside of their review team. By prohibiting content moderators from discussing their work or seeking outside social support, Facebook impedes the development of resiliency and increases the risk that moderators will develop psychological trauma.
`
`
`
`68.
`
`The results of an in-depth investigation into the dangerous working conditions at
`
`Cognizant’s Tampa facility and some of the psychological harm suffered by the content
`
`moderators employed there was published in The Verge in June 2019. For further background
`
`information, please visit: https://www.theverge.com/2019/6/19/18681845/facebook-moderator-
`
`
`
`15
`
`
`
`Case 8:20-cv-00585-MSS-CPT Document 1-1 Filed 03/12/20 Page 16 of 33 PageID 25
`
`interviews-video-trauma-ptsd-cognizant-tampa and https://www.cnn.com/videos/business/2019/
`
`06/23/former-facebook-moderator-blows-the-whistle.cnn.
`
D. Plaintiff Garrett’s individual allegations.
`
`
`
`69.
`
`From approximately July 2018 until the present, Plaintiff Garrett worked as a
`
`Process Executive (a/k/a Facebook Content Moderator) at Cognizant’s offices at 7725 Woodland
`
`Center Blvd., Tampa, FL 33614.
`
`
`
`70. During this period, Plaintiff Garrett was employed by Cognizant.
`
`
`
71. At all times relevant to this complaint, Cognizant was an independent contractor of Facebook.
`
`
`
`
`
`72. Cognizant directly oversaw all human resources matters concerning Ms. Garrett.
`
73. Ms. Garrett has never been employed by Facebook in any capacity and never received any wages or employee benefits package (e.g., wellness benefits, paid time off, parental financial assistance) from Facebook.
`
`
`
74. During her employment as a content moderator, Ms. Garrett was exposed to thousands of images, videos, and livestreamed broadcasts of graphic violence, including beheadings, mutilations, terrorist killings and torture, rapes and murders.
`
`
`
75. On the night of March 9, 2018, Mr. Roberts’s coworker, Keith Utley, had a heart attack and died at his desk during Mr. Roberts’s shift. Management came in and instructed everyone that they could not speak about it with anyone. Cognizant never disclosed what Mr. Utley was watching when he died, nor did it take any apparent mitigating steps to ensure this would not happen to other content moderators.
`
`
`
76. Ms. Garrett and other content moderators in Tampa and Phoenix receive two 15-minute breaks and a 30-minute lunch each day, along with nine minutes per day of “wellness” time that they purportedly can use when they feel overwhelmed by the emotional toll of the job.
`
`
`
77. On March 15, 2019, two consecutive terrorist shooting attacks occurred at mosques in Christchurch, New Zealand, killing 51 people and injuring 49. The gunman, a white supremacist, live-streamed the first 17 minutes of the attack on Facebook Live. Cognizant employees, including Ms. Garrett, were forced to watch these horrific images over and over again due to the video’s viral nature. Defendants Facebook and Cognizant did not provide support after this event and failed to have trauma counselors on site.
`
`
`
78. As a result of the extreme working conditions and unrelenting pressure, in Fall 2018, Ms. Garrett was diagnosed with PTSD. As a result of these impairments, Ms. Garrett went out on a leave of absence in or about September 2019.
`
E. Plaintiff Roberts’s individual allegations.
`
`
`
`79. From approximately November 2017 until January 19, 2020, Plaintiff Robe