`
`UNITED STATES PATENT AND TRADEMARK OFFICE
`
`__________________________________________
`
`BEFORE THE PATENT TRIAL AND APPEAL BOARD
`__________________________________________
`
`
`
`Wombat Security Technologies, Inc.,
`Petitioner,
`
`v.
`
`PhishMe, Inc.,
`Patent Owner.
`
`____________________________
`
`U.S. PATENT NO. 9,674,221
`PGR2017-00050
`____________________________
`
`
`
`PETITION FOR POST-GRANT REVIEW OF U.S. PATENT 9,674,221
`
`
`
`
`
`Mail Stop Patent Board
`Patent Trial and Appeal Board
`U.S. Patent and Trademark Office
`P.O. Box 1450
`Alexandria, VA 22313-1450
`
`
`
`
`
`
TABLE OF CONTENTS

I. INTRODUCTION ........................................................................................... 1

II. REQUIREMENTS AND MANDATORY NOTICES .................................... 3
`A. Standing ................................................................................................... 3
`B. Real Party-in-Interest ............................................................................... 3
`C. Related Matters ........................................................................................ 3
`1. Lawsuits ........................................................................................... 3
`2. Post-grant Petitions .......................................................................... 4
`3. Related Applications ........................................................................ 5
`D. Lead and Backup Counsel and Service Information ............................... 6
`E. Payment of Fees ....................................................................................... 6
`
`III. OVERVIEW OF THE ‘221 PATENT ............................................................ 6
`A. Simulated Phishing Methods Described in the Specification .................. 6
`B. Person Having Ordinary Skill in the Art ................................................. 9
`C. Eligibility for PGR ................................................................................... 9
`1. First Post-AIA Limitation .............................................................. 10
`2. Second Post-AIA Limitation .......................................................... 13
`3. Related Applications ...................................................................... 16
`4. Conclusion and AIA Applicability ................................................ 17
`D. Priority Date ........................................................................................... 17
`
IV. OVERVIEW OF THE CHALLENGE AND THE RELIEF
REQUESTED ................................................................................................ 18

V. CLAIM CONSTRUCTION .......................................................................... 20

VI. DETAILED EXPLANATION FOR THE GROUNDS FOR
UNPATENTABILITY .................................................................................. 23
`
`
`
`
`
`
`A. Obviousness Grounds 1, 2 and 3 ........................................................... 23
`1. Summary of Relied Upon Prior Art ............................................... 24
`2.
`Independent Claims ........................................................................ 35
`3. Dependent Claims .......................................................................... 68
`B. Grounds 4, 5 and 6: Failure to Satisfy the Written Description
`Requirement of § 112(a) ........................................................................ 74
`1. Ground 4: Claims 11-16 and 18 Should Be Canceled Because
`Specification Does Not Disclose a Remote Computing Device
`That Provides Graphically Displayed Confirmatory Feedback ..... 75
`2. Ground 5: Claims 11-16 and 18 Should be Canceled Because
`the Specification Does Not Disclose a Remote Computing
`Device For Sending the Identified Email for Analysis or
`Detection After it is Determined Not to be a Known Simulated
`Phishing Attack .............................................................................. 77
`3. Ground 6: Claims 1-6 and 8 Should be Canceled Because the
`Specification Does Not Disclose Each of the Multiple Methods
`Recited in the Claims ..................................................................... 79
`C. Ground 7: Claims 11-16, 18 are Indefinite ............................................ 81
`D. Ground 8: The Challenged Claims are Ineligible under § 101 .............. 85
`
`VII. CONCLUSION .............................................................................................. 91
`Claims Appendix of Challenged Claims …………………………………………93
`
`
`
`
`
`
`
`
`
`
`
`TABLE OF AUTHORITIES
`Cases
`
`Alice Corp. v. CLS Bank Int’l, 134 S. Ct. 2347 (2014) .................................... 85, 86
`Allergan, Inc. v. Sandoz Inc., 796 F.3d 1293 (Fed. Cir. 2015) ................................ 74
`Ariad Pharm., Inc. v. Eli Lilly and Co., 598 F.3d 1336 (Fed.
`Cir. 2010) ................................................................................................... 9, 11, 74
`Aristocrat Techs. Austl. Pty Ltd. v. Int'l Game Tech., 521 F.3d 1328 (Fed.
`Cir. 2008) .............................................................................................................. 82
`Atmel Corp. v. Info. Storage Devices, Inc., 198 F.3d 1374 (Fed. Cir. 1999) .......... 85
Crestron Elec., Inc. v. Intuitive Building Controls, Inc., IPR2015-01460,
`Paper 14 (PTAB January 14, 2016) ...................................................................... 31
`Cuozzo Speed Techs. LLC v. Lee, 136 S. Ct. 2131 (2016) ...................................... 20
`Digitech Image Techs., LLC v. Elecs. For Imaging, Inc., 758 F.3d 1344
`(Fed. Cir. 2014) ..................................................................................................... 87
`EON Corp. v. AT&T Mobility LLC, 785 F.3d 616 (Fed. Cir. 2015) ................. 82, 83
`Ex Parte Lakkala, Appeal 2011-001526, 2013 WL 1341108 (PTAB July 7,
`2015) ..................................................................................................................... 23
`FairWarning IP, LLC v. Iatric Sys., Inc., 839 F.3d 1089 (Fed. Cir. 2016) ............. 89
`Helsinn Healthcare S.A. v. Teva Pharm. USA, Inc., 855 F.3d 1356 (Fed.
`Cir. 2017) .............................................................................................................. 25
`I/P Engine, Inc. v. AOL Inc., 576 Fed. Appx. 982 (Fed. Cir. 2014) ........................ 86
`In re Distefano, 808 F.3d 845 (Fed. Cir. 2015) ....................................................... 50
`In re Gosteli, 872 F.2d 1008 (Fed. Cir. 1989) ........................................................... 9
`In re Gulack, 703 F.2d 1381 (Fed. Cir. 1983) ......................................................... 50
`In re Hall, 781 F.2d 897 (Fed. Cir. 1986) ................................................................ 34
`In re TLI Commc’ns LLC Patent Litig., 823 F.3d 607 (Fed. Cir. 2016) ........... 87, 91
`In re Translogic Tech. Inc., 504 F.3d 1249 (Fed. Cir. 2007) .................................. 20
`
`
`
`
`
`
`Intellectual Ventures I LLC v. Symantec Corp., 838 F.3d 1307 (Fed. Cir.
`2016) ........................................................................................................ 86, 87, 91
`Internet Patents Corp. v. Active Network, Inc., 790 F.3d 1343 (Fed. Cir.
`2015) ..................................................................................................................... 91
`IpLearn, LLC v. K12 Inc., 76 F.Supp.3d 525 (D. Del. 2014) .................................. 88
`KSR Int’l Co. v. Teleflex Inc., 127 S. Ct. 1727 (2007) ..................................... 52, 64
`Lockwood v. American Airlines, 107 F.3d 1565 (Fed. Cir. 1997) ........................... 74
`Minton v. National Ass’n of Securities Dealers, Inc., 336 F.3d 1373 (Fed.
`Cir. 2003) .............................................................................................................. 25
`Multimedia Plus, Inc. v. Playerlync, LLC, 198 F.Supp.3d 264 (S.D.N.Y.
`2016), aff’d. 2017 WL 3498637 (Fed. Cir. Aug. 16, 2017) ................................. 88
Mylan Pharm. Inc. v. Yeda Res. & Dev. Co., PGR2016-00010, Paper 9 at 10
`(PTAB Aug. 15, 2016) ............................................................................................ 9
`Nike, Inc. v. Adidas AG, 812 F.3d 1326 (Fed. Cir. 2016) ........................................ 64
`Noah Sys., Inc. v. Intuit Inc., 675 F.3d 1302 (Fed. Cir. 2012) ................................. 82
`Ormco Corp. v. Align Tech., Inc., 498 F.3d 1307 (Fed. Cir. 2007) .................. 35, 61
`PhishMe, Inc. v. Wombat Security Technologies, Inc., No. 1:17-cv-00769-
`LPS-CJB.................................................................................................................. 4
`PowerOasis, Inc. v. T-Mobile USA, Inc., 522 F.3d 1299 (Fed. Cir. 2008) ............. 10
`Robert Bosch, LLC v. Snap-On Inc., 769 F.3d 1094 (Fed. Cir. 2014) ............. 22, 81
`Sogue Holdings (Bermuda) Ltd. v. Keyscan, Inc., 2010 WL 2292316 (N.D.
`Cal. June 7, 2010) ................................................................................................. 22
Synopsys, Inc. v. Mentor Graphics Corp., 839 F.3d 1138 (Fed. Cir. 2016) ............ 88
`Turbocare Div. of Demag Delaval Turbomachinery Corp. v. General Electric
`Co., 264 F.3d 1111 (Fed. Cir. 2001) ..................................................................... 74
`Williamson v. Citrix Online, LLC, 792 F.3d 1339 (Fed. Cir. 2015) ................ passim
`Regulations
`
`37 C.F.R. § 42.200(b) .............................................................................................. 20
`
`
`
`
`
`
`
`
`Statutes
`
`35 U.S.C. § 101 .................................................................................................passim
`35 U.S.C. § 102 ................................................................................................passim
`35 U.S.C. § 103 .................................................................................................passim
`35 U.S.C. § 112 ................................................................................................passim
`
`
`
`
`
`
`
`
TABLE OF EXHIBITS

Exhibit No.   Description

1001   U.S. Patent 9,674,221

1002   Complaint for Patent Infringement, PhishMe Inc. v. Wombat Security Technologies, Inc., June 16, 2017

1003   Complaint for Patent Infringement, PhishMe Inc. v. Wombat Security Technologies, Inc., June 1, 2016

1004   First Amended Complaint for Patent Infringement, PhishMe Inc. v. Wombat Security Technologies, Inc., July 19, 2016

1005   Second Amended Complaint for Patent Infringement, PhishMe Inc. v. Wombat Security Technologies, Inc., September 6, 2016

1006   Consolidation Order, PhishMe Inc. v. Wombat Security Technologies, Inc., Case No. 16-403-LPS-CJB and 17-769-LPS-CJB, June 28, 2017

1007   Decision Denying Institution of Post-Grant Review, PGR2017-00009, Patent No. 9,398,038, Paper 7, June 8, 2017

1008   Petitioner’s Request for Rehearing, PGR2017-00009, Patent No. 9,398,038, Paper 8, June 20, 2017

1009   Decision Denying Request for Rehearing, PGR2017-00009, Patent No. 9,398,038, Paper 9, July 20, 2017

1010   Declaration of Aviel Rubin, Ph.D.

1011   Application Serial No. 13/765,538, filed February 8, 2013

1012   Application Serial No. 13/785,252, filed March 5, 2013

1013   Redline comparison between Application Serial No. 13/785,252 and Application Serial No. 13/765,538

1014   Cisco IronPort Email Security Plug-in 7.1 Administrator Guide, Cisco Systems, Inc., December 6, 2010
`
`
`
`
`
`
1015   Keno Albrecht, “Mastering Spam: A Multifaceted Approach with the Spamato Spam Filter System,” Swiss Federal Institute of Technology Zurich, 2006

1016   Fahmida Y. Rashid, “PhishGuru,” PC Mag, www.pcmag.com/article2/0,2817,2404750,00.asp, May 25, 2012

1017   Declaration of Kurt Wescoe

1018   Declaration of Ralph Massaro

1019   “Leading Computer Science University Takes Multi-Pronged Approach to Combat Phishing; Deploys Wombat Security’s Highly Effective Suite of Training and Filtering Products,” March 10, 2011

1020   “A Multi-Pronged Approach To Combat Phishing,” Wombat Security Technology, March 2011

1021   File History of U.S. Patent 9,674,221 (Serial No. 15/418,709) from PAIR (without foreign references and non-patent literature)

1022   P. Kumaraguru et al., “Lessons From a Real World Evaluation of Anti-Phishing Training,” eCrime Researchers Summit, 15-16 October 2008

1023   P. Kumaraguru, “PhishGuru: A System for Educating Users about Semantic Attacks,” Ph.D. Thesis, Carnegie Mellon University, April 14, 2009

1024   Declaration of Alan Himler

1025   Declaration of Elizabeth Whittington

1026   Ex parte Schulhauser, Appeal 2013-007847 (PTAB April 28, 2016)

1027   Redline comparison of claim 11 of U.S. Patent 9,674,221 to claim 1 of U.S. Patent 9,674,221
`
`
`
`
`
`
1028   Declaration of Steve Hicks

1029   U.S. Pub. No. 2012/0124671 A1 to Fritzson et al.

1030   Application Serial No. 13/763,486, filed February 8, 2013

1031   Application Serial No. 13/763,515, filed February 8, 2013
`
`
`
`
`
`
`
`
`
I. INTRODUCTION
`
Wombat Security Technologies, Inc. (“Wombat”) requests post-grant review (PGR) of claims 1-6, 8, 11-16, and 18 (“Challenged Claims”) of U.S. Patent 9,674,221 (“the ’221 Patent,” Ex. 1001). The ’221 Patent is assigned to PhishMe, Inc. (“PhishMe”). Wombat petitions for cancellation of the Challenged Claims under 35 U.S.C. §§ 101, 103 and 112.1
`
The ’221 Patent relates to simulated phishing campaigns to educate email recipients about the dangers of phishing attacks. A phishing attack is “a message, commonly in the form of an e-mail,” from an attacker “directing the [recipient] to perform an action, such as opening an e-mail attachment or following … an embedded link” to a fraudulent, phishing webpage. Ex. 1001, col. 1:33-38. If the recipient opens the attachment or follows the link of an actual phishing email, harmful results can occur, such as installation of malicious software on the recipient’s computer. Id., col. 1:41-53.
`
To “make individuals more knowledgeable about phishing attacks,” the ‘221 Patent proposes an “education process” by which “individuals are subjected to simulated phishing attacks, which are designed to resemble actual phishing attacks.” Id., col. 1:62-66. For those email recipients who fall victim to the simulated attack, training is provided. Id., col. 2:4-5. For those who identify a known simulated phishing email as a possible phishing attack, feedback is provided. Id., col. 2:7-12.

1 The AIA versions of §§ 102, 103 and 112 apply to the ‘221 Patent. All references herein to these statutes are to their AIA versions.
`
Simulated phishing campaigns existed many years prior to the priority date for the ‘221 Patent. Researchers at Carnegie Mellon University (CMU) published about them by 2008. Exs. 1022-1023. They called their system “PhishGuru,” and Wombat commercialized it. Ex. 1023 at 66. Wombat’s Anti-Phishing System, of which PhishGuru was a part, qualifies as prior art and discloses most of the limitations of the Challenged Claims. It does not disclose an email client plug-in through which the email recipient can report a received email as a phishing attack, but such plug-ins were commonplace in the prior art. The Challenged Claims of the ‘221 Patent, therefore, are obvious. The method claims are especially obvious because, due to their numerous conditional limitations, they recite multiple methods (three, in fact, as shown below), and only one of them needs to be obvious. See Ex Parte Schulhauser, Appeal 2013-007847 (PTAB April 28, 2016) (Exhibit 1026) at 6-10.
`
Additionally, the Challenged Claims are invalid under § 112(a) for failing to satisfy the written description requirement, and the challenged system claims are indefinite under § 112(b) because they include means-plus-function elements without reciting sufficient corresponding structure in the specification. Finally, the Challenged Claims are directed to an abstract education process and are ineligible under § 101.

This petition is supported by an expert declaration from Prof. Aviel Rubin of Johns Hopkins University. Ex. 1010.
`
II. REQUIREMENTS AND MANDATORY NOTICES

A. Standing

Wombat certifies that (a) before the date on which this petition is being filed, neither Wombat nor any real party-in-interest filed a civil action challenging the validity of a claim of the ’221 Patent; and (b) neither Wombat nor any real party-in-interest or privy of Wombat is estopped from challenging the claims on the grounds described herein.
`
B. Real Party-in-Interest

The real party-in-interest is Wombat Security Technologies, Inc. Wombat Security Technologies UK Ltd. and Wombat Security Technologies Asia Pte. Ltd., both subsidiaries of Wombat, can also be considered real parties-in-interest.
`
C. Related Matters

1. Lawsuits

PhishMe sued Wombat in the United States District Court for the District of Delaware on June 16, 2017, styled PhishMe, Inc. v. Wombat Security Technologies, Inc., No. 1:17-cv-00769-LPS-CJB (“the Second Lawsuit”), for infringement of the ‘221 Patent and a related patent, Patent 9,591,017 (“the ‘017 Patent”). Ex. 1002.

On May 31, 2016, PhishMe sued Wombat in the same court, Case No. 1:16-cv-00403-LPS (“the First Lawsuit”), asserting Patent 9,356,948 (“the ‘948 Patent”), a predecessor of the ‘017 and ‘221 Patents. Ex. 1003. On July 19, 2016, PhishMe amended its complaint in the First Lawsuit to add another predecessor patent, Patent 9,398,038 (“the ‘038 Patent”). Ex. 1004. On September 6, 2016, PhishMe dropped the ‘948 Patent from the First Lawsuit. Ex. 1005.

On June 28, 2017, the district court consolidated the First and Second Lawsuits. Ex. 1006. As of the date of this petition, the court has not ruled on claim construction or validity regarding any of the patents.
`
`2.
`
`Post-grant Petitions
`
`Wombat requested post-grant review and inter partes review of the ‘038
`
`Patent. The Board denied Wombat’s request for PGR, Ex. 1007, and its request for
`
`rehearing. Ex. 1008-1009. Wombat filed an IPR petition for the ‘038 Patent on
`
`July 18, 2017. The case numbers for the PGR and IPR petitions for the ‘038 Patent
`
`are PGR2017-00009 and IPR2017-01813 respectively. Wombat also petitioned on
`
`August 23, 2017 for PGR of the ‘017 Patent (PGR 2017-00047).
`
`
`
`4
`
`
`
`3.
`
`Related Applications
`
`The ‘221 Patent claims priority to several predecessor patents and a pending
`
`application claims priority to it as shown in the chart below.
`
`
`
`5
`
`
`
`
`
D. Lead and Backup Counsel and Service Information

Lead Counsel: Mark G. Knedeisen, Reg. No. 42,747, mark.knedeisen@klgates.com, T: 412-355-6342

Backup Counsel: Patrick J. McElhinny, Reg. No. 46,320, patrick.mcelhinny@klgates.com, T: 412-355-6334; Ragae Ghabrial, Reg. No. 59,104, ragae.ghabrial@klgates.com, T: 412-355-8690

All listed counsel are with K&L Gates, LLP, 210 Sixth Avenue, Pittsburgh, PA 15222. A power of attorney designating the above-identified counsel is being filed with this petition. Wombat consents to electronic service by email.
`
E. Payment of Fees

Wombat authorizes the Office to charge the required fees for PGR of fourteen (14) claims, and any additionally required fees, to Deposit Account No. 02-1818.
`
III. OVERVIEW OF THE ‘221 PATENT

A. Simulated Phishing Methods Described in the Specification

The ’221 Patent describes an approach for educating individuals about phishing attack risks by sending simulated, non-malicious phishing emails to the individuals and tracking the individuals’ responses. In Figure 1 of the ’221 Patent (below), a “network device 14” sends simulated phishing emails that “resemble real phishing attacks” to computing devices 16-20 of the intended recipients (referred to in the ’221 Patent as “users” and “individuals”). Ex. 1001, col. 3:55-60. The simulated phishing emails include information that can be used later to identify them as simulated phishing emails, such as a “sender identifier,” a “recipient identifier,” a subject or time of transmission of the message, or “message headers.” Id., col. 7:52-55. A database 24 stores data (e.g., a log) about the sent simulated phishing emails. Id., col. 3:67-68; Figs. 1-2.
`
The individuals’ computers 16, 18, 20 can have an email client “plug-in” so that when an individual receives a suspected phishing email, the individual can activate a “graphical user interface element” of the plug-in to report the received email as a potential phishing attack. Id., col. 7:24-29.
`
When the individual reports a received email, either the network device 14 or the email client plug-in determines whether the identified email is a known simulated phishing email. Id., col. 4:46-49; col. 4:66-col. 5:7; col. 7:49-67. If the identified email is determined to be a known simulated phishing email, the system provides feedback to the individual confirming that the email was a simulated phishing email. Id., col. 2:7-12; col. 4:26-33; col. 8:1-3. The ’221 Patent does not identify the device or component that provides the feedback, stating only in passive voice that the confirmatory feedback “may be provided … in the form of an email message, or an out-of-band message, such as an SMS message or other message.” Id., col. 4:29-33; col. 8:1-3.

The network device 14 records in a database 26 the individuals’ responses to the simulated phishing emails, i.e., whether they reported the email, ignored it, or fell for it. Id., col. 3:67-col. 4:3; col. 5:49-63; col. 8:1-5; Figs. 1, 3. The network device 14 uses the response data to calculate trustworthiness scores for the individuals that are indicative of the individuals’ abilities to identify potential phishing emails. Id., col. 5:64-col. 7:12. If the identified email is determined not to be a known simulated phish, “a computer security expert” or “computer software configured to detect phishing attacks” can analyze the email to determine if it is a real phishing attack. Id., col. 2:28-44.

The ‘221 Patent also incorporates two “Related Applications”―Serial Nos. 13/763,486 and 13/763,515 (Exs. 1030-1031). Ex. 1001, col. 1:17-20.
`
B. Person Having Ordinary Skill in the Art

A “person having ordinary skill in the art” (“PHOSITA”) in the field to which the ‘221 Patent pertains would have had a bachelor’s degree or the equivalent in computer science (or a related academic field) and at least two to three years of additional experience in the computer security field, or equivalent work experience, and would have been familiar with phishing email attacks and spam and phishing email filters. Ex. 1010, ¶ 44.

“Any person skilled in the art” for the written description and enablement standards of § 112(a) has the same skill level as a PHOSITA for the obviousness assessment. In re Gosteli, 872 F.2d 1008, 1012 (Fed. Cir. 1989); Ariad Pharm., Inc. v. Eli Lilly and Co., 598 F.3d 1336, 1351 (Fed. Cir. 2010) (en banc).
`
C. Eligibility for PGR

A patent is eligible for PGR if it is more likely than not that at least one of its claims has an effective filing date on or after March 16, 2013. Mylan Pharm. Inc. v. Yeda Res. & Dev. Co., PGR2016-00010, Paper 9 at 10 (PTAB Aug. 15, 2016). The chart above shows that the lineage of the ‘221 Patent includes two applications filed prior to March 16, 2013―Serial No. 13/765,538 (“the ‘538 Application,” Ex. 1011) and Serial No. 13/785,252 (“the ‘252 Application,” Ex. 1012). Their specifications are virtually identical. Ex. 1013 (redline comparison).

The ‘221 Patent is eligible for PGR because it has at least one claim that is not entitled to the filing dates of either the ‘538 or ‘252 Applications. Claim 11 of the ‘221 Patent in particular includes subject matter that is not disclosed in either of these pre-March 16, 2013 applications. PowerOasis, Inc. v. T-Mobile USA, Inc., 522 F.3d 1299, 1306 (Fed. Cir. 2008) (subject matter disclosed for the first time in a continuation-in-part does not receive the benefit of the parent’s filing date). Presumably for that reason, the ‘221 Patent was examined under the first-to-file AIA provisions, Ex. 1021 at 72, and PhishMe did not object.

For the sake of expediency, this petition focuses on whether claim 11 is entitled to a pre-March 16, 2013 filing date. There are at least two limitations in claim 11 that are not supported by either the ‘252 or ‘538 Applications.
`
`1.
`
`First Post-AIA Limitation
`
`The ‘252 and ‘538 Applications do not support the following limitation
`
`recited in claim 11: “the remote computing device configured for:… when the
`
`identified email is determined to be a known simulated phishing attack …,
`
`providing a graphically displayed feedback to the individual confirming that the
`
`identified email was a simulated phishing attack ….” The ‘252 and ‘538
`
`
`
`10
`
`
`
`Applications fail to disclose the subject matter of this limitation because: (1) the
`
`‘252 and ‘538 Applications do not disclose any sort of “graphically displayed”
`
`confirmatory feedback to the individual when the individual correctly identifies a
`
`simulated phishing email; and (2) they do not disclose a component that provides
`
`confirmatory feedback, much less the “remote computing device.”
`
`Neither the ‘252 nor ‘538 Applications disclose the graphically displayed
`
`confirmatory feedback limitation in haec verba. Ex. 1007 at 11 (“the exact words
`
`used in the feedback limitation may not appear in the ‘252 Application …”). The
`
`‘252 and ‘538 Applications make general references to “education” and “training”
`
`in two paragraphs,2 but that minimal disclosure insufficiently shows that the
`
`inventors possessed the graphically displayed confirmatory feedback limitation.
`
`See Ariad, 598 F.3d at 1351 (“the hallmark of written description is disclosure”).
`
`• Paragraph [0005] states: “In an education process, individuals are
`
`subjected to simulated phishing attacks, which are designed to
`
`resemble actual phishing attacks. In response to a simulated attack, an
`
`individual typically either falls victim to it, ignores the attack,
`
`consciously chooses to not react or additionally reports the attack
`
`too…. For those that fall victim to an attack, training is provided to
`
`2 The content and numbering of the two paragraphs in the ‘252 and ‘538
`
`Applications are the same. Ex. 1013.
`
`
`
`11
`
`
`
`decrease the likelihood that they will be deceived by a future
`
`simulated and/or real phishing attack.” (emphasis added); and
`
`• Paragraph [0018] states: “simulated phishing attacks are designed to
`
`resemble real phishing attacks in order to train the users of computing
`
`devices 16, 18 and 20 to better recognize and thwart a real phishing
`
`attack.” (emphasis added).
`
`Neither of these paragraphs demonstrates to a PHOSITA that the inventors
`
`possessed the claimed limitation of a remote computing device providing the
`
`“graphically displayed” confirmatory feedback. Both paragraphs are insufficient
`
`because the training could be provided in response to the individual falling for (or
`
`being deceived by) a simulated attack, instead of correctly spotting one. Ex. 1010,
`
`¶¶ 55-59. Indeed, ¶ [0005] refers exclusively to training provided to those who fall
`
`victim to the simulated attack. Ex. 1007 at 12 (¶ 5 does not disclose the claimed
`
`confirmatory feedback). Moreover, neither paragraph mentions that the feedback
`
`is provided by a remote computing device, as required by claim 11. Because of
`
`this omission, a PHOSITA would conclude that the inventors did not possess that
`
`the remote computing device would provide the confirmatory feedback at the time
`
`that the ‘252 and ‘538 Applications were filed. Ex. 1010, ¶ 59.
`
` The ‘252 and ‘538 Applications refer to an email client plug-in at the
`
`remote computing device (Exs. 1012 and 1013 at ¶¶ [0022], [0029]), but never
`
`
`
`12
`
`
`
`mention that the feedback could be provided by the plug-in. And it is not inherent
`
`or implicit that the training described in ¶¶ [0005] and [0018] is provided by the
`
`remote computing device since the feedback could be provided on another
`
`computer device. Ex. 1010, ¶ 59.
`
`The insufficient disclosure of the ‘252 and ‘538 Applications with regard to
`
`this limitation extends to the ‘221 Patent as described in Ground 4 below, which is
`
`incorporated by reference.
`
`2.
`
`Second Post-AIA Limitation
`
`The ‘252 and ‘538 Application also do not support the limitation recited in
`
`claim 11 as: “…the remote computing device configured for:…when the identified
`
`email is determined not to be a known simulated phishing attack …, sending the
`
`identified email for analysis or detection of whether or not the identified email is a
`
`phishing attack ….”
`
`This limitation requires the remote computing device to send the identified
`
`email for analysis or detection after it is determined not to be a known simulated
`
`phishing attack, but the ‘252 and ‘538 Applications do not disclose this limitation.
`
`Instead, the two applications disclose that, after a plug-in at the remote computing
`
`device determines that the identified email is not a known simulated phishing
`
`email, the remote computing device “could query network device 14 to determine
`
`the trustworthiness level of the individual who flagged the message….” Exs.
`
`
`
`13
`
`
`
`1011-1012 at ¶ [0022]. This teaching does not disclose sending the identified
`
`email for analysis or detection, and a PHOSITA would not understand from it that
`
`the inventors possessed that the remote computing device would send the email for
`
`analysis or detection in response to determining that the identified email is not a
`
`known simulated phishing attack. Ex. 1010, ¶ 62.
`
`In response to receiving the trustworthiness level of the individual, the
`
`remote computing device still does not send the email for analysis or detection
`
`according to the ‘252 and ‘538 Applications. Instead, “computing device 18 could
`
`alert network device 14, a network security appliance…, and/or a security event
`
`responder … that a potential malicious message was able to thwart security
`
`measures and that additional security measures should be taken to ensure that such
`
`messages (e.g., messages from same sender as flagged message) are blocked in the
`
`future.” Exs. 1011-1012 at ¶ [0022]. Thus, the applications disclose blocking the
`
`sender of the non-known-simulated phishing email, which does not require sending
`
`the email anywhere. Instead, just the email’s header, including the sender
`
`information, could be sent to the network security appliance so that the appliance
`
`could update its filters to block the sender. Ex. 1010, ¶ 63. Consequently, a
`
`PHOSITA would not immediately discern, visualize or recognize that the inventors
`
`possessed a system where the remote computing device sends the user-reported,
`
`non-known simulated phishing email for analysis or detection. Ex. 1010, ¶ 64.
`
`
`
`14
`
`
`
`The ‘252 and ‘538 Applications disclose a plug-in at the remote computing
`
`device sending the email to the network device for analysis or detection, but only
`
`in response to the user reporting a received email via the plug-in, Exs. 1011- 1012
`
`at ¶ [0029], not in response to the plug-in determining that the user-reported email
`
`is a non-known simulated phishing email. After the plug-in sends the email to the
`
`network device, the network device determines whether the email is a known
`
`simulated phishing email specification. Id. at ¶ [0030]. Thus, this disclosure
`
`insufficiently demonstrates that the inventors possessed systems and methods
`
`where a remote computing device sends the user-reported email for analysis or
`
`detection after the plug-in determines the email is a non-known simulated phishing
`
`email. Ex. 1010, ¶ 61.
`
`The fact that the ‘252 and ‘538 Applications disclose that the determination
`
`of whether a user-reported email is a known simulated phishing email can be done
`
`at either the network device or the remote computing device (Exs. 1011-1012 at ¶
`
`[