THE ROSEN LAW FIRM, P.A.
Phillip Kim, Esq. (PK 9384)
Laurence M. Rosen, Esq. (LR 5733)
275 Madison Ave., 40th Floor
New York, New York 10016
Telephone: (212) 686-1060
Fax: (212) 202-3827
Email: pkim@rosenlegal.com
lrosen@rosenlegal.com

Counsel for Plaintiff

UNITED STATES DISTRICT COURT
EASTERN DISTRICT OF NEW YORK

WEE ANN NGIAN, Individually and on behalf of all others similarly situated,

Plaintiff,

v.

FACEBOOK, INC., MARK ZUCKERBERG, AND DAVID M. WEHNER,

Defendants.

Case No.

CLASS ACTION COMPLAINT FOR VIOLATION OF THE FEDERAL SECURITIES LAWS

JURY TRIAL DEMANDED

CLASS ACTION

Plaintiff Wee Ann Ngian (“Plaintiff”), individually and on behalf of all other persons similarly situated, by Plaintiff’s undersigned attorneys, for Plaintiff’s complaint against Defendants (defined below), alleges the following based upon personal knowledge as to Plaintiff and Plaintiff’s own acts, and upon information and belief as to all other matters, based upon, inter alia, the investigation conducted by and through Plaintiff’s attorneys, which included, among other things, a review of Defendants’ public documents and announcements made by Defendants, United States Securities and Exchange Commission (“SEC”) filings, wire and press releases published by and regarding Facebook, Inc. (“Facebook” or the “Company”), analysts’ reports and advisories about the Company, and information readily obtainable on the Internet. Plaintiff believes that substantial evidentiary support will exist for the allegations set forth herein after a reasonable opportunity for discovery.

Case 1:21-cv-05976-MKB-RER Document 1 Filed 10/27/21 Page 2 of 28 PageID #: 2

NATURE OF THE ACTION

1. This is a federal securities class action on behalf of all persons and entities who purchased the publicly traded securities of Facebook between November 3, 2016 and October 4, 2021, both dates inclusive (the “Class Period”). Plaintiff seeks to recover compensable damages caused by Defendants’ violations of the federal securities laws under the Securities Exchange Act of 1934 (the “Exchange Act”).

JURISDICTION AND VENUE

2. The claims asserted herein arise under and pursuant to §§10(b) and 20(a) of the Exchange Act (15 U.S.C. §78j(b) and §78t(a)) and Rule 10b-5 promulgated thereunder by the SEC (17 C.F.R. §240.10b-5).

3. This Court has jurisdiction over the subject matter of this action under 28 U.S.C. §1331 and §27 of the Exchange Act.

4. Venue is proper in this judicial district pursuant to §27 of the Exchange Act (15 U.S.C. §78aa) and 28 U.S.C. §1391(b) as the alleged misstatements entered and subsequent damages took place within this judicial district.

5. In connection with the acts, conduct and other wrongs alleged in this Complaint, Defendants, directly or indirectly, used the means and instrumentalities of interstate commerce, including but not limited to, the United States mail, interstate telephone communications and the facilities of the national securities exchange.

PARTIES

6. Plaintiff, as set forth in the accompanying certification, incorporated by reference herein, purchased Facebook’s securities during the Class Period and was economically damaged thereby.

7. Defendant Facebook is the world’s largest online social network, with 2.5 billion monthly active users. The Company is incorporated in Delaware and its principal executive offices are located at 1601 Willow Road, Menlo Park, CA 94025. Facebook securities are traded on NASDAQ under the ticker symbol “FB.”

8. Defendant Mark Zuckerberg (“Zuckerberg”) has been the Chief Executive Officer (“CEO”) of Facebook throughout the Class Period.

9. Defendant David M. Wehner (“Wehner”) has been the Chief Financial Officer (“CFO”) of Facebook throughout the Class Period.

10. Defendants Zuckerberg and Wehner are sometimes referred to herein as the “Individual Defendants.”

11. Each of the Individual Defendants:

(a) directly participated in the management of the Company;

(b) was directly involved in the day-to-day operations of the Company at the highest levels;

(c) was privy to confidential proprietary information concerning the Company and its business and operations;

(d) was directly or indirectly involved in drafting, producing, reviewing and/or disseminating the false and misleading statements and information alleged herein;

(e) was directly or indirectly involved in the oversight or implementation of the Company’s internal controls;

(f) was aware of or recklessly disregarded the fact that the false and misleading statements were being issued concerning the Company; and/or

(g) approved or ratified these statements in violation of the federal securities laws.

12. The Company is liable for the acts of the Individual Defendants and its employees under the doctrine of respondeat superior and common law principles of agency because all of the wrongful acts complained of herein were carried out within the scope of their employment.

13. The scienter of the Individual Defendants and other employees and agents of the Company is similarly imputed to the Company under respondeat superior and agency principles.

14. The Company and the Individual Defendants are referred to herein, collectively, as the “Defendants.”

SUBSTANTIVE ALLEGATIONS

Materially False and Misleading Statements

15. On November 3, 2016, Facebook filed with the SEC a Form 10-Q quarterly report for the quarter ended September 30, 2016 (“3Q 2016 10-Q”). The 3Q 2016 10-Q was signed by Defendant Wehner. Attached to the 3Q 2016 10-Q were certifications pursuant to the Sarbanes-Oxley Act of 2002 (“SOX”) signed by Defendants Zuckerberg and Wehner attesting to the accuracy of financial reporting, the disclosure of any material changes to the Company’s internal control over financial reporting and the disclosure of all fraud.

16. The 3Q 2016 10-Q stated that, between September 30, 2013 and September 30, 2016, Facebook’s monthly active users (“MAUs”) grew from 199 million to 229 million in the United States and Canada. The 3Q 2016 10-Q also stated that, during the same time frame, Facebook’s MAUs in Europe grew from 276 million to 342 million.

17. The 3Q 2016 10-Q stated, in relevant part, the following about individuals with multiple accounts:

“We believe the percentage of accounts that are duplicate or false is meaningfully lower in developed markets such as the United States or United Kingdom and higher in developing markets such as India and Turkey.”

(Emphasis added.)

18. On February 2, 2017, Facebook filed with the SEC a Form 10-K annual report for the fiscal year ended December 31, 2016 (“2016 10-K”). The 2016 10-K was signed by Defendants Zuckerberg and Wehner. Attached to the 2016 10-K were SOX certifications signed by Defendants Zuckerberg and Wehner attesting to the accuracy of financial reporting, the disclosure of any material changes to the Company’s internal control over financial reporting and the disclosure of all fraud.

19. The 2016 10-K stated that, between December 31, 2013 and December 31, 2016, MAUs in the United States and Canada grew from 201 million to 231 million. The 2016 10-K also stated that, in Europe during the same time frame, Facebook’s MAUs grew from 282 million to 349 million.

20. The 2016 10-K represented, in pertinent part, the following about individuals with multiple accounts:

“We believe the percentage of accounts that are duplicate or false is meaningfully lower in developed markets such as the United States or United Kingdom and higher in developing markets such as India and Turkey.”

(Emphasis added.)

21. On February 1, 2018, Facebook filed with the SEC a Form 10-K annual report for the fiscal year ended December 31, 2017 (“2017 10-K”). The 2017 10-K was signed by Defendants Zuckerberg and Wehner. Attached to the 2017 10-K were SOX certifications signed by Defendants Zuckerberg and Wehner attesting to the accuracy of financial reporting, the disclosure of any material changes to the Company’s internal control over financial reporting and the disclosure of all fraud.

22. The 2017 10-K represented that, between December 31, 2016 and December 31, 2017, MAUs in the United States and Canada grew from 231 million to 239 million. The 2017 10-K also represented that, during the same time frame, Facebook’s MAUs in Europe grew from 349 million to 370 million.

23. The 2017 10-K stated, in pertinent part, the following about individuals with multiple accounts:

“We believe the percentage of duplicate accounts is meaningfully higher in developing markets such as India, Indonesia, and the Philippines, as compared to more developed markets.”

(Emphasis added.)

24. On July 16, 2018, Facebook published on its website a statement titled, “Working to Keep Facebook Safe.” In pertinent part, the statement said:

“It has been suggested that turning a blind eye to bad content is in our commercial interests. This is not true. Creating a safe environment where people from all over the world can share and connect is core to Facebook’s long-term success.

* * *

How We Create and Enforce Our Policies

More than 1.4 billion people use Facebook every day from all around the world. They post in dozens of different languages: everything from photos and status updates to live videos. Deciding what stays up and what comes down involves hard judgment calls on complex issues — from bullying and hate speech to terrorism and war crimes. It’s why we developed our Community Standards with input from outside experts — including academics, NGOs and lawyers from around the world. We hosted three Facebook Forums in Europe in May, where we were able to hear from human rights and free speech advocates, as well as counter-terrorism and child safety experts.

* * *

These Community Standards have been publicly available for many years, and this year, for the first time, we published the more detailed internal guidelines used by our review teams to enforce them.

* * *

Reviewing reports quickly and accurately is essential to keeping people safe on Facebook. This is why we’re doubling the number of people working on our safety and security teams this year to 20,000. This includes over 7,500 content reviewers. We’re also investing heavily in new technology to help deal with problematic content on Facebook more effectively. For example, we now use technology to assist in sending reports to reviewers with the right expertise, to cut out duplicate reports, and to help detect and remove terrorist propaganda and child sexual abuse images before they’ve even been reported.”

(Emphasis added.)

25. On July 17, 2018, Facebook updated its “Working to Keep Facebook Safe” statement. The updated statement said, in relevant part:

“Cross Check

We want to make clear that we remove content from Facebook, no matter who posts it, when it violates our standards. There are no special protections for any group — whether on the right or the left. ‘Cross Check’ — the system described in Dispatches — simply means that some content from certain Pages or Profiles is given a second layer of review to make sure we’ve applied our policies correctly.

This typically applies to high profile, regularly visited Pages or pieces of content on Facebook so that they are not mistakenly removed or left up. Many media organizations’ Pages — from Channel 4 to The BBC and The Verge — are cross checked. We may also Cross Check reports on content posted by celebrities, governments, or Pages where we have made mistakes in the past. For example, we have Cross Checked an American civil rights activist’s account to avoid mistakenly deleting instances of him raising awareness of hate speech he was encountering.

To be clear, Cross Checking something on Facebook does not protect the profile, Page or content from being removed. It is simply done to make sure our decision is correct.

* * *

Minors

We do not allow people under 13 to have a Facebook account. If someone is is [sic] reported to us as being under 13, the reviewer will look at the content on their profile (text and photos) to try to ascertain their age. If they believe the person is under 13, the account will be put on a hold and the person will not be able to use Facebook until they provide proof of their age. Since the program, we have been working to update the guidance for reviewers to put a hold on any account they encounter if they have a strong indication it is underage, even if the report was for something else.”

(Emphasis added.)

26. On January 31, 2019, Facebook filed with the SEC a Form 10-K annual report for the fiscal year ended December 31, 2018 (“2018 10-K”). The 2018 10-K was signed by Defendants Zuckerberg and Wehner. Attached to the 2018 10-K were SOX certifications signed by Defendants Zuckerberg and Wehner attesting to the accuracy of financial reporting, the disclosure of any material changes to the Company’s internal control over financial reporting and the disclosure of all fraud.

27. The 2018 10-K stated that, between December 31, 2017 and December 31, 2018, MAUs grew from 239 million to 242 million in the United States and Canada. The 2018 10-K also stated that, during the same time frame, Facebook’s MAUs grew from 370 million to 381 million in Europe.

28. The 2018 10-K stated, in pertinent part, the following about multiple accounts:

“We believe the percentage of duplicate accounts is meaningfully higher in developing markets such as the Philippines and Vietnam, as compared to more developed markets.”

(Emphasis added.)

29. On January 29, 2020, Facebook filed with the SEC a Form 10-K annual report for the fiscal year ended December 31, 2019 (“2019 10-K”). The 2019 10-K was signed by Defendants Zuckerberg and Wehner. Attached to the 2019 10-K were SOX certifications signed by Defendants Zuckerberg and Wehner attesting to the accuracy of financial reporting, the disclosure of any material changes to the Company’s internal control over financial reporting and the disclosure of all fraud.

30. The 2019 10-K represented that, between December 31, 2018 and December 31, 2019, MAUs grew from 242 million to 248 million in the United States and Canada. The 2019 10-K also represented that, during the same time frame, Facebook’s MAUs grew from 381 million to 394 million in Europe.

31. The 2019 10-K stated, in relevant part, the following about multiple accounts:

“We believe the percentage of duplicate accounts is meaningfully higher in developing markets such as the Philippines and Vietnam, as compared to more developed markets.”

(Emphasis added.)

32. On January 27, 2021, Facebook filed with the SEC a Form 10-K annual report for the fiscal year ended December 31, 2020 (“2020 10-K”). The 2020 10-K was signed by Defendants Zuckerberg and Wehner. Attached to the 2020 10-K were SOX certifications signed by Defendants Zuckerberg and Wehner attesting to the accuracy of financial reporting, the disclosure of any material changes to the Company’s internal control over financial reporting and the disclosure of all fraud.

33. The 2020 10-K stated that, between December 31, 2019 and December 31, 2020, MAUs in the United States and Canada grew from 248 million to 258 million. The 2020 10-K also represented that, during the same time frame, Facebook’s MAUs in Europe grew from 394 million to 419 million.

34. The 2020 10-K stated, in pertinent part, the following about multiple accounts:

“We believe the percentage of duplicate accounts is meaningfully higher in developing markets such as the Philippines and Vietnam, as compared to more developed markets.”

(Emphasis added.)

35. The statements referenced in ¶¶15-34 above were materially false and/or misleading because they misrepresented and failed to disclose the following adverse facts pertaining to the Company’s business, operational and financial results, which were known to Defendants or recklessly disregarded by them. Specifically, Defendants made false and/or misleading statements and/or failed to disclose that: (1) Facebook misrepresented its user growth; (2) Facebook knew, or should have known, that duplicate accounts represented a greater portion of its growth than stated, and it should have provided more detailed disclosures as to the implication of duplicate accounts for Facebook’s user base and growth; (3) Facebook did not provide a fair platform for speech, and regularly protected high-profile users via its Cross Check/XCheck system; (4) despite being aware that drug cartels, human traffickers, and violent organizations used its platforms, the Company failed to respond meaningfully to them; (5) Facebook has been working to attract preteens to its platform and services; and (6) as a result, Defendants’ public statements were materially false and misleading at all relevant times.

The Truth Emerges

36. On September 13, 2021, during trading hours, The Wall Street Journal (“WSJ”) published an article titled “Facebook Says Its Rules Apply to All. Company Documents Reveal a Secret Elite That’s Exempt.” It would be the first of nine articles published by the WSJ based on documents provided by a then-unknown whistleblower (the “Whistleblower”). The article stated, in relevant part:

Mark Zuckerberg has publicly said Facebook Inc. allows its more than three billion users to speak on equal footing with the elites of politics, culture and journalism, and that its standards of behavior apply to everyone, no matter their status or fame.

In private, the company has built a system that has exempted high-profile users from some or all of its rules, according to company documents reviewed by The Wall Street Journal.

The program, known as “cross check” or “XCheck,” was initially intended as a quality-control measure for actions taken against high-profile accounts, including celebrities, politicians and journalists. Today, it shields millions of VIP users from the company’s normal enforcement process, the documents show. Some users are “whitelisted”—rendered immune from enforcement actions—while others are allowed to post rule-violating material pending Facebook employee reviews that often never come.

* * *

A 2019 internal review of Facebook’s whitelisting practices, marked attorney-client privileged, found favoritism to those users to be both widespread and “not publicly defensible.”

“We are not actually doing what we say we do publicly,” said the confidential review. It called the company’s actions “a breach of trust” and added: “Unlike the rest of our community, these people can violate our standards without any consequences.”

* * *

For ordinary users, Facebook dispenses a kind of rough justice in assessing whether posts meet the company’s rules against bullying, sexual content, hate speech and incitement to violence. Sometimes the company’s automated systems summarily delete or bury content suspected of rule violations without a human review. At other times, material flagged by those systems or by users is assessed by content moderators employed by outside companies.

* * *

Users designated for XCheck review, however, are treated more deferentially. Facebook designed the system to minimize what its employees have described in the documents as “PR fires”—negative media attention that comes from botched enforcement actions taken against VIPs.

If Facebook’s systems conclude that one of those accounts might have broken its rules, they don’t remove the content—at least not right away, the documents indicate. They route the complaint into a separate system, staffed by better-trained, full-time employees, for additional layers of review.

(Emphasis added.)

37. On this news, Facebook shares dropped by $5.17 to close at $376.51 on September 13, 2021.

38. On September 28, 2021, during market hours, the WSJ published an article titled, “Facebook’s Effort to Attract Preteens Goes Beyond Instagram Kids, Documents Show.” The article said, in pertinent part:

Internal Facebook documents reviewed by The Wall Street Journal show the company formed a team to study preteens, set a three-year goal to create more products for them and commissioned strategy papers about the long-term business opportunities presented by these potential users. In one presentation, it contemplated whether there might be a way to engage children during play dates.

“Why do we care about tweens?” said one document from 2020. “They are a valuable but untapped audience.”

* * *

On Monday, Adam Mosseri, head of Instagram, said the company would pause the development of a version of the app for children, often referred to as Instagram Kids. He said the company wanted time to talk to parents, experts and lawmakers before proceeding. He also contended that underage users would simply lie about their age to access Instagram if a version for children under the age of 13 wasn’t available.

* * *

Over the past five years, Facebook has made what it called “big bets” on designing products that would appeal to preteens across its services, according to a document from earlier this year.

In more than a dozen studies over that period, the documents show, Facebook has tried to understand which products might resonate with children and “tweens” (ages 10 through 12), how these young people view competitors’ apps and what concerns their parents.

“With the ubiquity of tablets and phones, kids are getting on the internet as young as six years old. We can’t ignore this and we have a responsibility to figure it out,” said a 2018 document labeled confidential. “Imagine a Facebook experience designed for youth.”

Earlier this year, a senior researcher at Facebook presented to colleagues a new approach to how the company should think about designing products for children. It provided a blueprint for how to introduce the company’s products to younger children. Rather than offer just two types of products—those for users 13 and older, and a messenger app for kids—Facebook should tailor its features to six age brackets, said a slide titled “where we’ve been, and where we’re going.”

* * *

In a study about household dynamics, a Facebook user-experience researcher found that although teens often inspired their younger relatives to join Instagram, those same teens also often counseled the tweens not to share too frequently, and not to post things they would later regret.

“I don’t know how to get a perfect picture like my sister says you need to post,” a tween told the researcher.

“We need to understand if this influence over preteen sharing holds at scale,” the researcher wrote in a document posted to Facebook’s internal message board early this year. “If it is common that teens are discouraging preteens from sharing, there are obvious implications for creation and the ecosystem both in the near and longer-term as preteens are the next generation coming onto the platform.” The presentation cited concern among teenagers about oversharing as a “myth” about Instagram.

(Emphasis added.)

39. On this news, Facebook share prices dropped $7.32 to close at $340.65 on September 28, 2021.

40. On October 3, 2021, CBS News aired a television segment on 60 Minutes interviewing the Whistleblower, revealed to be Frances Haugen, on her findings during her time at Facebook. On that same day, CBS published an article containing highlights from the interview, stating in relevant part:

“The thing I saw at Facebook over and over again was there were conflicts of interest between what was good for the public and what was good for Facebook,” Haugen said. “And Facebook, over and over again, chose to optimize for its own interests, like making more money.”

* * *

Haugen told 60 Minutes that weeks after the 2020 election, Facebook dissolved a department called “Civic Integrity” which worked on risks to elections including misinformation.

“Like, they basically said, ‘Oh good, we made it through the election. There wasn’t riots. We can get rid of Civic Integrity now,’” Haugen said. “Fast forward a couple months, we got the insurrection. And when they got rid of Civic Integrity, it was the moment where I was like, ‘I don't trust that they're willing to actually invest what needs to be invested to keep Facebook from being dangerous.’”

* * *

Haugen said Facebook's algorithm optimizes for content that generates engagement. That's led to publishers, “realizing that if they produce more content that is angry and divisive and polarizing, they’ll get more views,” in her words.

“Facebook has realized that if they change the algorithm to be safer, people will spend less time on the site, they’ll click on less ads, they’ll make less money,” Haugen added.

(Emphasis added.)

41. On October 4, 2021, CBS News published an article titled, “Whistleblower’s SEC Complaint: Facebook Knew Platform Was Used to ‘Promote Human Trafficking and Domestic Servitude,’” containing the whistleblower complaints against Facebook filed with the SEC. The CBS News article shared eight complaints, which contained the following allegations:

a. Facebook knew its platforms perpetuated misinformation, but did little to stop it. In relevant part, this complaint alleged:

Facebook misled investors and the public about its role perpetuating misinformation and violent extremism relating to the 2020 election and January 6th insurrection.

* * *

Facebook made misstatements and omissions regarding its facilitation of political misinformation, including in testimony before Congress.

* * *

Facebook only actions less than 1% of Violence and Inciting to Violence (V&I) content on Facebook – Facebook’s strategy of focusing on Content over other solutions lets this content effectively run free[.]

* * *

Facebook has demonstrated via experiments using brand new test accounts how rapidly Facebook’s algorithms can veer people interested in Conservative topics into radical or polarizing ideas and groups/pages, some demonstrating traits of Coordinated Inauthentic Behavior (CIB) akin to what was seen by the Macedonians in 2016[.]

* * *

Pages that repeat offend for misinformation are permitted to continue to spread misinformation[.]

* * *

Facebook has “whitelisted” political users who violate its terms, leading to the spread of misinformation and violence on and off the platform.

(Emphasis added.)

b. Facebook did little to combat human traffickers using its platform. In pertinent part, this complaint said:

Facebook misled investors and the public about its promotion of human trafficking / slaver / servitude.

* * *

Internal company documents show that Facebook and Instagram were, and are, being used to promote human trafficking and domestic servitude. An internal Facebook record created no later than April 2019 states: “We have observed increasing number of reported content that indicates that the platform is being used to coordinate and promote domestic servitude . . . real world harm caused by domestic servitude as well as risk to the business due to potential PR fires . . .”

* * *

Notably, there was widespread media coverage of an “undercover investigation by BBC News Arabic” in or around October 2019, which found that “domestic workers are being illegally bought and sold online in a booming black market . . . on Facebook-owned Instagram, where posts have been promoted via algorithm-boosted hashtags, and sales negotiated via private messages.”

* * *

However, even after this news coverage, Facebook’s regular SEC filings continually omitted specific references to trafficking, domestic servitude, human slavery, and the Apple App Store escalation.

In fact, Facebook’s failure to solve human trafficking and servitude on its platforms threatened its distribution on the Apple App Store. Moreover, as the enclosed Facebook records show, Facebook’s statements about human trafficking were false. For example, Facebook has confirmed: [. . .] [W]e received communication from Apple where the company threatened to pull FB & IG from its App Store due to them identifying content promoting ‘domestic servitude’. . . [. . .] However, due to the underreporting of this behaviour and absence of proactive detection, newly created and existing content not captured in the IG sweep meant that domestic servitude content remained on the platform. [. . .] Was this issue known to Facebook before BBC enquiry and Apple escalation? Yes. [. . .] [O]ur platform enables all three stages of the human exploitation lifecycle (recruitment, facilitation, exploitation)[.]”

(Emphasis added.)

c. Confirming the earlier WSJ article, Facebook’s XCheck program gave preferential treatment to certain users. In relevant part, this complaint said:

Facebook misled investors and the public about equal enforcement of its terms given that high-profile users are “whitelisted” under its “XCheck” program.

* * *

[O]ver the years, many XChecked people & entities have been exempted from enforcement. That means, for a select few members of our community, we are not enforcing our policies and standards. Unlike the rest of our community, these people can violate our standards without any consequences[.]

* * *

We are exempting certain people and businesses from our policies and standards [. . .] This undermines our fairness and legitimacy efforts; creates legal and compliance risks for the company . . . Based on an initial company-wide audit, this problem is pervasive across the country[.]

(Emphasis added.)

d. Facebook misled investors and the public about the extent to which Facebook was used to foment ethnic violence and global division. In relevant part, this complaint revealed:

Facebook misled investors and the public about bringing “the world closer together” where it relegates international users and promotes global division and ethnic violence.

* * *

[. . .] Facebook’s shareholders proposed having a human/civil rights expert on the board, stating:

“In September 2020, a Facebook employee reported Facebook ignored global political manipulation from foreign governments seeking to ‘abuse our platform on vast scales to mislead their own citizenry.’ [. . .]

Children’s rights organization Plan International found online attacks against girls globally are most prevalent on Facebook.

*