Trials@uspto.gov
Tel: 571-272-7822

Paper 12
Entered: May 10, 2018

UNITED STATES PATENT AND TRADEMARK OFFICE
_______________

BEFORE THE PATENT TRIAL AND APPEAL BOARD
_______________

INAUTH, INC.,
Petitioner,
v.
MSIGNIA, INC.,
Patent Owner.
____________

Case IPR2018-00150
Patent 9,559,852 B2
____________

Before TREVOR M. JEFFERSON, JAMES B. ARPIN, and GREGG I. ANDERSON, Administrative Patent Judges.

ARPIN, Administrative Patent Judge.

DECISION
Denying Institution of Inter Partes Review
35 U.S.C. § 314(a)

I. INTRODUCTION

A. Background

InAuth, Inc. (“Petitioner”) filed a Petition requesting an inter partes review of claims 1–25 of U.S. Patent No. 9,559,852 B2 (Ex. 1001, “the ’852 patent”). Paper 2 (“Pet.”). mSIGNIA, Inc. (“Patent Owner”) filed a Preliminary Response. Paper 9 (“Prelim. Resp.”). Having considered the Petition, the Preliminary Response, and the evidence of record, and applying the standard set forth in 35 U.S.C. § 314(a), which requires that Petitioner demonstrate a reasonable likelihood that it would prevail with respect to at least one challenged claim, we deny institution of inter partes review of claims 1–25 of the ’852 patent.

B. Related Matters

The parties indicate that the ’852 patent is the subject of a civil action identified as mSIGNIA, Inc. v. InAuth, Inc., 8:17-cv-01289 (C.D. Cal.), filed July 26, 2017. Pet. 71 (citing Ex. 1027); Paper 5, 1. Petitioner states that InAuth, Inc. is a wholly owned subsidiary of American Express Travel Related Services Company, Inc., the corporate parent of which is American Express Company. Pet. 70. Thus, Petitioner states that InAuth, Inc., the American Express Company, and American Express Travel Related Services Company, Inc. are real parties-in-interest. Id. at 70–71. Patent Owner states that mSIGNIA is the real party-in-interest. Paper 5, 1.

C. The ’852 Patent

The ’852 patent is entitled “Cryptographic Security Functions Based on Anticipated Changes in Dynamic Minutiae” and is directed to “methods and systems for dynamic key cryptography us[ing] a wide range of minutiae as key material including computer hardware, firmware, software, user secrets, and user biometrics rather than stor[ing] a random number as a cryptographic key on the computer.” Ex. 1001, 3:7–11. The ’852 patent claims priority to U.S. Provisional Patent Application No. 61/462,474, filed February 3, 2011. Id. at [60]. Although Petitioner does not concede that the ’852 patent is entitled to that priority date, the applied references predate that date, so, for purposes of this Decision, we accept the provisional application’s filing date as the earliest effective filing date of the ’852 patent. See Pet. 6 n.2.

The ’852 patent recognizes that, in known authentication methods using “computer fingerprints,” “[a] typical computer identifier is computed and remains static; to ensure reliability the computer fingerprint typically uses computer minutiae (e.g., serial numbers) that normally do not change. Thus, current computer fingerprints typically use a relatively small set of static minutia which may be prone to spoofing.” Ex. 1001, 2:51–56 (emphases added); see Prelim. Resp. 1. Known methods, however, “allegedly did not provide for the use of minutia that is subject to change because routine changes to the minutia, e.g., an upgrade to a component, would alter the fingerprint and cause false identification of a device as ‘different’ (a ‘false negative’).” Pet. 1–2 (citing Ex. 1001, 2:56–3:2). The Specification of the ’852 patent explains that the disclosed systems and methods permit use of minutia that is subject to change, such as location or hardware, firmware, or software versions, in the authentication process. Pet. 2; Prelim. Resp. 2. In particular, these systems and methods use information regarding “anticipated changes” to the minutia to “deliver[] a tolerant, yet secure authentication with fewer false negatives.” Pet. 2 (quoting Ex. 1001, 5:40–44).

Figures 2A and 2B of the ’852 patent are reproduced below.

Figures 2A and 2B depict a system diagram illustrating a challenge, response and validation process performed by the system of Figure 1. Ex. 1001, 4:35–38.

More specifically, Figures 2A and 2B depict an example for providing and using dynamic key cryptography to ensure valid service user 20 is using authenticated computer 18 in system 200. Id. at 10:24–27. System 200 collects and catalogs minutiae values of computer 18 and service user 20 that may identify computer 18 and service user 20, such that computer minutia 64 and secrets and biometric minutia 26 may be used by dynamic key crypto provider 10 to form dynamic keys unique to each and every distinct computer 18 and service user 20. Id. at 10:27–34. Consequently, each distinct computer 18 may use unique computer minutia 64 and secrets and biometric minutia 26 in system 200 that correspond to that distinct computer 18 and service user 20, respectively, and “each uniquely identified computer 18 corresponds to one and only one distinct computer 18 and each uniquely identified service user 20 may correspond to one and only one distinct service user 20.” Id. at 10:34–41. “The unique identification of a computer 18 may be processed by system [200], for example, by a service provider 14 or by the dynamic key crypto provider 10, and there [may] be no meaningful single identifier or identity key itself stored on the computer 18.” Id. at 10:42–46.

System 200 depicts a system for identifying and authenticating a specific computer 18 and service user 20 via challenge, response, and validation sequences performed by dynamic key crypto provider 10. Id. at 10:46–50. Each distinct computer 18 and service user 20 is recognized by specific computer minutia 64, specific secrets and biometric minutia 26, or combinations thereof found on computer 18 or collected by computer 18 from service user 20 as cataloged by dynamic key crypto provider 10. See id. at 10:50–58.

As shown in Figure 2A, at step 2001, “computer minutia 64 can represent a set of 390 distinct minutiae values that may be chosen for collecting and cataloging from the computer 18.” Id. at 11:22–24. For example, hardware minutia (Hx) may represent 40 categories or types of the minutia, firmware minutia (Fx) may represent 70 categories or types of the minutia, software minutia (Sx) may represent 280 categories or types of the minutia, service user 20 secrets (?x) may represent 2 specific secrets, and service user 20 biometric minutia (Bx) may represent 5 categories or types of the minutia, from which it may be possible to accurately and uniquely identify specific computer 18 and associated service user 20 for computer 18. Id. at 11:21–28, 12:37–49. “Hardware minutia values typically cannot change without changing a physical component of the computer 18. Firmware minutia can be updated but usually their update is controlled by someone other than the service user 20. Software minutia changes dynamically via various individual instantiations of service user 20 and includes elements that may require predictable, constant change in normal situations (i.e., frequently called contact phone numbers).” Id. at 11:38–46. Software minutiae values, however, may reflect customizations performed by service user 20. Thus, “software minutiae values can accurately identify computer 18 devices that are otherwise extremely similar in hardware and firmware.” Id. at 11:49–51.

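To make the scale of this catalog concrete, the following is a minimal Python sketch of the groupings described at step 2001. Only the category counts (40 hardware, 70 firmware, and 280 software values for computer minutia 64, plus 2 secrets and 5 biometric minutiae for service user 20) come from the passage above; the selection helper and index scheme are purely illustrative assumptions.

```python
# Illustrative sketch only; the category counts come from the '852 patent's
# example at step 2001, while the selection helper is a hypothetical stand-in.
import random

# Computer minutia 64: 40 + 70 + 280 = 390 distinct values.
COMPUTER_CATALOG = {"hardware (Hx)": 40, "firmware (Fx)": 70, "software (Sx)": 280}
# Service user 20 minutia 26: secrets and biometrics.
USER_CATALOG = {"secrets": 2, "biometrics (Bx)": 5}


def select_minutiae(per_category=1, seed=None):
    """Pick a few minutiae identifiers per category for a challenge (e.g., Hx-Fy-Sz)."""
    rng = random.Random(seed)
    catalog = {**COMPUTER_CATALOG, **USER_CATALOG}
    return {category: rng.sample(range(count), per_category)
            for category, count in catalog.items()}


print(sum(COMPUTER_CATALOG.values()))  # 390 distinct computer minutiae values
print(select_minutiae(seed=2001))      # one index drawn from each category
```

Because software minutiae dominate such a catalog, even devices that are nearly identical in hardware and firmware can still be told apart, which is the point the quoted passage makes.
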
Referring to Figure 2B, at step 2020, formulate challenge 116 process computes a cryptographic key based on a combination of minutia (e.g., Hx-Fy-Sz for the illustrated example). Id. at 15:43–46. At step 2030, dynamic key crypto provider 10 computes all responses that are acceptable from computer 18. Id. at 16:8–10.

    At step 2040, the particular computer 18 being challenged may receive the challenge and unpack the challenge to determine which minutia it should collect and use the values of to form its response to the challenge. Having unpacked the challenge using information and algorithms stored in the dynamic key crypto library 56, the response process 112 can use the computer 18 to fetch the values of the selected computer minutia 64 or collect the values of selected service and biometrics minutia 26 and build a key that may be identical to the key computed by the dynamic key crypto provider 10 at step 2020.

    . . .

    As illustrated at step 2050, the validate response from computer 120 process can therefore be determined by simply comparing the actual response received from the computer 18 to the allowable responses that are pre-processed by the dynamic key crypto provider 10 to determine if there is a match. Decrypting or decoding of a response is not necessary so the validation can occur very quickly.

Id. at 16:42–52, 17:1–7.

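The flow of steps 2020 through 2050 can be pictured with the short Python sketch below. It is a sketch only, not the patent's implementation: the helper names are hypothetical, and SHA-256/HMAC-SHA-256 stand in for the unspecified key-derivation and response functions. The essential point it captures is that the provider pre-computes every allowable response from the stored minutiae values and their anticipated changes, so validation reduces to a simple comparison with no decryption.

```python
# Minimal sketch of the challenge/response/validation flow described for
# Figures 2A and 2B (steps 2020-2050). All names are hypothetical; SHA-256 and
# HMAC-SHA-256 stand in for the patent's unspecified derivation and response functions.
import hashlib
import hmac
import itertools
import os


def derive_key(minutiae_values):
    """Derive a dynamic key from an ordered tuple of minutiae values (steps 2020/2040)."""
    material = "|".join(minutiae_values).encode()
    return hashlib.sha256(material).digest()


def compute_response(key, nonce):
    """Form a response to a challenge nonce using the derived key."""
    return hmac.new(key, nonce, hashlib.sha256).hexdigest()


def allowable_responses(cataloged, anticipated, selected, nonce):
    """Step 2030: pre-compute every response the provider would accept.

    `cataloged` maps a minutia name to its last stored value; `anticipated`
    maps a minutia name to the values it is expected to change to.
    """
    candidates = []
    for name in selected:
        candidates.append([cataloged[name]] + anticipated.get(name, []))
    return {
        compute_response(derive_key(combo), nonce)
        for combo in itertools.product(*candidates)
    }


# --- example run -----------------------------------------------------------
cataloged = {"H1": "serial-123", "F2": "fw-1.0", "S7": "app-build-41"}
anticipated = {"F2": ["fw-1.1"], "S7": ["app-build-42"]}   # anticipated changes
selected = ["H1", "F2", "S7"]                              # e.g., Hx-Fy-Sz
nonce = os.urandom(16)                                     # challenge sent to the computer

accepted = allowable_responses(cataloged, anticipated, selected, nonce)

# Step 2040: the computer collects its current minutiae values and answers.
device_values = ("serial-123", "fw-1.1", "app-build-41")   # firmware was upgraded as anticipated
response = compute_response(derive_key(device_values), nonce)

# Step 2050: validation is a set-membership test; nothing is decrypted.
print("authentic" if response in accepted else "rejected")
```
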
D. Illustrative Claim

Claims 1, 24, and 25 are independent, and each is directed to an identity recognition system. Id. at 34:18–47 (claim 1), 36:6–38 (claim 24),1 36:39–37:7 (claim 25). Each of claims 2–23 depends directly or indirectly from claim 1. Id. at 34:48–36:5. Claim 1 is illustrative and is reproduced below, with emphasis added.

1. A identity recognition system comprising:

    a non-transitory memory storing information associated with one or more identities, wherein the information stored for an identity includes (a) data values associated with that identity; and (b) information regarding anticipated changes to one or more of the stored data values associated with that identity;

    one or more hardware processors in communication with the memory and configured to execute instructions to cause the identity recognition system to recognize that the presentation of identity information by a computer is authentic, by performing operations comprising:

    generating a challenge to the computer, wherein the challenge prompts the computer to provide a response based on one or more data values from the computer that correspond to one or more of the stored data values associated with the identity;

    receiving, from the computer, the response to the challenge;

    determining whether the response is allowable, wherein such determining comprises using the stored information regarding anticipated changes to the stored data values associated with the identity to determine whether a data value used to form the response is based on an acceptable change to a corresponding stored data value; and

    recognizing that the presentation of identity information by the computer is authentic, according to whether the computer has provided an allowable response to the challenge.

Id. at 34:18–47 (disputed limitations emphasized).

1 Claim 24 is corrected by a Certificate of Correction.

E. Applied References and Declaration

Petitioner relies on the following references and declaration in support of its asserted grounds for unpatentability.

Exhibit No.	Declaration or Reference
1003	Declaration of Patrick Traynor, Ph.D.
1004	U.S. Patent No. 8,316,421 B2 to Etchegoyen et al., filed October 13, 2010; published April 21, 2011; and issued November 20, 2012 (“Etchegoyen”)2
1005	U.S. Patent Publication No. 2006/0282660 A1 to Varghese et al., filed April 28, 2006, and published December 14, 2006 (“Varghese”)
1006	U.S. Patent No. 8,312,157 B2 to Jakobsson et al., filed July 16, 2009; published January 20, 2011; issued November 13, 2012 (“Jakobsson”)

Pet. iv, 9, 11.

F. Asserted Grounds of Unpatentability

Petitioner argues that claims 1–25 of the ’852 patent are unpatentable based on the following grounds:

Reference(s)	Basis	Claim(s)
Etchegoyen	35 U.S.C. § 102	1–5, 7, 14–21, 24, and 25
Etchegoyen	35 U.S.C. § 103	1–5, 7, 14–21, 24, and 25
Etchegoyen and Jakobsson	35 U.S.C. § 103	6 and 8–12
Etchegoyen and Varghese	35 U.S.C. § 103	13, 22, and 23
Varghese	35 U.S.C. § 102	1–23 and 25
Varghese	35 U.S.C. § 103	24

Pet. 5.

2 Based on its claim of priority to U.S. Provisional Patent Application No. 61/252,960, filed October 19, 2009, Dr. Traynor argues that Etchegoyen is entitled to a priority date of October 19, 2009. Ex. 1003 ¶¶ 61–64.

II. ANALYSIS

A. Claim Construction

In an inter partes review, “[a] claim in an unexpired patent that will not expire before a final written decision is issued shall be given its broadest reasonable construction in light of the specification of the patent in which it appears.” 37 C.F.R. § 42.100(b). In determining the broadest reasonable construction, we presume that claim terms carry their ordinary and customary meaning. See In re Translogic Tech., Inc., 504 F.3d 1249, 1257 (Fed. Cir. 2007). This presumption may be rebutted when a patentee, acting as a lexicographer, sets forth an alternate definition of a term in the specification with reasonable clarity, deliberateness, and precision. In re Paulsen, 30 F.3d 1475, 1480 (Fed. Cir. 1994).

1. “generating a challenge” (Claims 1, 24, and 25)

Petitioner and Patent Owner agree that only the term “generating a challenge” recited in each independent claim requires express construction. Pet. 13; Prelim. Resp. 4–5; see Paper 11. Moreover, Petitioner and Patent Owner agree that “generating a challenge” should be construed as “generating a request for information.” Pet. 13; Prelim. Resp. 4. Nevertheless, Patent Owner further clarifies that the “challenge” is a challenge to the computer that is to be authenticated. Prelim. Resp. 4. We believe that this clarification is apparent from the context in which the construed term is used in the independent claims. See Pet. 13 (“As used in the specification, the challenge is a request to the computer to provide information based on selected minutia.”). On this record and for purposes of this Decision, we adopt the parties’ agreed upon construction, in view of Patent Owner’s clarification that the challenge is to the computer seeking authentication, as the broadest reasonable interpretation of this term.

2. Other Claim Terms

Only terms which are in controversy need to be construed, and then only to the extent necessary to resolve the controversy. See, e.g., Nidec Motor Corp. v. Zhongshan Broad Ocean Motor Co., 868 F.3d 1013, 1017 (Fed. Cir. 2017) (“[W]e need only construe terms ‘that are in controversy, and only to the extent necessary to resolve the controversy.’” (quoting Vivid Techs., Inc. v. Am. Sci. & Eng’g, Inc., 200 F.3d 795, 803 (Fed. Cir. 1999))). For purposes of this Decision, no other claim terms require express construction.

B. Asserted Grounds

1. Overview

Petitioner argues that claims 1–5, 7, 14–21, 24, and 25 of the ’852 patent are anticipated by or, in the alternative, rendered obvious over the teachings of Etchegoyen and relies upon the Declaration of Dr. Traynor (Ex. 1003) to support its arguments. Pet. 5, 14–38; see supra Section I.F. Further, Petitioner argues that each of claims 6, 8–13, 22, and 23 is rendered obvious over the teachings of Etchegoyen in combination with those of Jakobsson or Varghese, and again relies upon the Declaration of Dr. Traynor (Ex. 1003) to support its arguments. Pet. 5, 38–47; see supra Section I.F. In addition, Petitioner argues that claims 1–25 of the ’852 patent are anticipated by or, in the alternative, rendered obvious over the teachings of Varghese and relies upon the Declaration of Dr. Traynor (Ex. 1003) to support its arguments. Pet. 5, 47–70; see supra Section I.F.

As noted above, each of claims 2–23 depends directly or indirectly from independent claim 1, and Patent Owner focuses its opposition to the asserted grounds for unpatentability primarily on the application of the teachings of Etchegoyen or Varghese to claims 1, 24, and 25. Prelim. Resp. 11–20, 25–28.

After reviewing Petitioner’s and Patent Owner’s arguments and evidence, for the reasons set forth below, we are not persuaded that Petitioner has demonstrated a reasonable likelihood of prevailing in showing that any of claims 1–25 of the ’852 patent is unpatentable with respect to the grounds based, in whole or in part, on Etchegoyen or Varghese. Consequently, we deny institution of inter partes review of claims 1–25 of the ’852 patent as to the asserted grounds.

2. Legal Principles

“A claim is anticipated only if each and every element as set forth in the claim is found, either expressly or inherently described, in a single prior art reference.” Verdegaal Bros. v. Union Oil Co., 814 F.2d 628, 631 (Fed. Cir. 1987). The elements must be arranged as required by the claim, but this is not an ipsissimis verbis test. See In re Bond, 910 F.2d 831, 832 (Fed. Cir. 1990). Moreover, “it is proper to take into account not only specific teachings of the reference but also the inferences which one skilled in the art would reasonably be expected to draw therefrom.” Application of Preda, 401 F.2d 825, 826 (CCPA 1968). Nevertheless,

    unless a reference discloses within the four corners of the document not only all of the limitations claimed but also all of the limitations arranged or combined in the same way as recited in the claim, it cannot be said to prove prior invention of the thing claimed, and thus, cannot anticipate under 35 U.S.C. § 102.

Net MoneyIN, Inc. v. VeriSign, Inc., 545 F.3d 1359, 1371 (Fed. Cir. 2008) (emphasis added); accord In re Arkley, 455 F.2d 586 (CCPA 1972).

A claim is unpatentable under 35 U.S.C. § 103(a) if the differences between the claimed subject matter and the prior art are such that the subject matter, as a whole, would have been obvious at the time the invention was made to a person having ordinary skill in the art to which said subject matter pertains. KSR Int’l Co. v. Teleflex Inc., 550 U.S. 398, 406 (2007). Obviousness is a question of law, which is resolved on the basis of underlying factual determinations including: (1) the scope and content of the prior art; (2) any differences between the claimed subject matter and the prior art; (3) the level of ordinary skill in the art;3 and (4) when in evidence, objective evidence of nonobviousness.4 Graham v. John Deere Co., 383 U.S. 1, 17–18 (1966).

3 Petitioner proposes an assessment of the level of ordinary skill in the art. Pet. 12; see Ex. 1003 ¶¶ 22, 23. Petitioner’s declarant, Dr. Traynor, exceeds this assessed level. Ex. 1003 ¶¶ 9–20. At this time, Patent Owner does not propose an alternative assessment. For purposes of this Decision, and to the extent necessary, we adopt Petitioner’s assessment.

4 Patent Owner does not present arguments or evidence of such secondary considerations in the Preliminary Response. See Pet. 70.

3. Asserted Anticipation by and/or Obviousness Over Etchegoyen

Petitioner argues that claims 1–5, 7, 14–21, 24, and 25 are unpatentable under 35 U.S.C. §§ 102(e) or 102(a) as anticipated by Etchegoyen or under 35 U.S.C. § 103(a) as rendered obvious over Etchegoyen. Pet. 5, 14–38. Relying on the testimony of its declarant, Dr. Traynor, Petitioner explains how Etchegoyen allegedly discloses each and every element of the challenged claims. Id. (citing Ex. 1003). We begin our analysis with an overview of Etchegoyen.

a. Etchegoyen (Ex. 1004)

Etchegoyen is entitled “System and Method for Device Authentication with Built-in Tolerance” and describes systems performing the following computer-implementable steps:

    (a) receiving and storing a first digital fingerprint of the device during a first boot of an authenticating software on the device, the first digital fingerprint based on a first set of device components, (b) receiving a second digital fingerprint from the device at a subsequent time, (c) comparing the second digital fingerprint with a plurality of stored digital fingerprints of known devices, (d) in response to the comparison indicating a mismatch between the second digital fingerprint and the plurality of stored digital fingerprints, generating a request code comprising instructions for the device to generate a third digital fingerprint using the first set of device components, (e) sending the request code to the remote device, (f) receiving the third digital fingerprint from the remote device in response to the request code, and (g) authenticating the device based on a comparison of the first and third digital fingerprints.

Ex. 1004, [57] (emphases added). Specifically, Etchegoyen describes that:

    The first boot fingerprint may be generated using the overall environmental information collected by the authentication module. Alternatively, the first boot fingerprint may be generated using specific components of the device as predetermined by the authentication client. The specific components may include components from a typical-upgrade components list or a non-typical-upgrade components list. The typical-upgrade components list may include components such as: graphic card, random access memory, sound card, network adaptor, hard drive, CD/DVD drive, Ethernet controller, or other routinely upgraded components. The non-typical-upgrade components list may include components such as: motherboard, USB host controller, central microprocessor, PCI Bus, System CMOS Clock, etc.

Id. at 4:63–5:9 (emphasis added).

Etchegoyen’s Figure 4 is reproduced below.

Figure 4 is a process flow chart illustrating an embodiment of computer-implementable steps of a method for device authentication with built-in tolerance. Id. at 4:16–18; see Ex. 1003 ¶¶ 74–77.

Figure 4 depicts an exemplary process flow on the authenticating server side of method 400 for authenticating a device. Ex. 1004, 12:16–57. At step 410, a digital fingerprint is received by the authentication server. The digital fingerprint may have a plurality of digital fingerprint portions, and each portion represents a digital fingerprint of a component of the device. Id. at 12:21–23. For example, the digital fingerprint may include a plurality of mini-fingerprints, each associated with a different component of the device, which may be combined to form the digital fingerprint. Id. at 12:23–27.

At step 420, the received digital fingerprint is authenticated by comparing it against stored digital fingerprints. Id. at 12:28–29. In particular, each portion of the received digital fingerprint may be compared with portions of a stored digital fingerprint. Id. at 12:29–31. If a portion of the received digital fingerprint matches any portion(s) of a stored digital fingerprint, a match of that portion is recognized. Id. at 12:31–34. If, however, the authenticating server fails to find a match for a portion of the received digital fingerprint, as indicated in step 430, that portion may be flagged as failed. Id. at 12:34–36. Alternatively, received digital fingerprint portions with matched portions in a stored digital fingerprint may be flagged as passed. Id. at 12:36–38.

    In one embodiment, the device may be authenticated solely based on the ratio of passed and failed digital fingerprint portions. For example, if 80% of the portions are flagged as passed, then the device may be validly authenticated. However, additional safeguards may be added to the authentication process by breaking down the components responsible for the errors.

Id. at 12:39–45 (emphasis added); see id. at 13:22–38 (describing step 450). In step 440, each portion of the digital fingerprint may be categorized by its associated component. Ex. 1004, 12:45–46. By decoding the fingerprint portion, information about the component, such as the type of component used to generate the digital fingerprint portion, may be determined; and when the component information is analyzed, the component may be categorized as a typical-upgrade component or a non-typical-upgrade component. Id. at 12:49–55. “Thus, each fingerprint portion may be compared against stored fingerprints, flagged as failed or passed, and categorized.” Id. at 12:55–57 (emphasis added).

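Etchegoyen’s tolerant matching (Figure 4, steps 420–450), as summarized above, can be sketched in a few lines of Python. The component names, the 80% pass ratio, and the rule that a mismatch in a non-typical-upgrade component is not tolerated are illustrative assumptions drawn only from the passages quoted in this Decision, not a reconstruction of the reference’s full disclosure.

```python
# Minimal sketch of the tolerant matching summarized above (Fig. 4, steps 420-450).
# Component names, the 80% ratio, and the non-typical-upgrade safeguard are
# illustrative assumptions, not taken verbatim from the reference.

TYPICAL_UPGRADE = {"graphics_card", "ram", "sound_card", "network_adaptor",
                   "hard_drive", "cd_dvd_drive", "ethernet_controller"}
NON_TYPICAL_UPGRADE = {"motherboard", "usb_host_controller", "cpu",
                       "pci_bus", "system_cmos_clock"}


def authenticate(received, stored, pass_ratio=0.80):
    """Compare each received fingerprint portion with the stored portion for the
    same component, flag it passed/failed, categorize failures, and authenticate."""
    passed, failed = [], []
    for component, portion in received.items():            # step 420: compare portions
        if stored.get(component) == portion:
            passed.append(component)                        # flagged as passed
        else:
            failed.append(component)                        # step 430: flagged as failed

    # Step 440: categorize the components responsible for the errors.
    failed_non_typical = [c for c in failed if c in NON_TYPICAL_UPGRADE]

    # Step 450: authenticate based on the pass/fail ratio, with the added safeguard
    # (assumed here) that a mismatch in a non-typical-upgrade component is not tolerated.
    ratio = len(passed) / max(len(received), 1)
    return ratio >= pass_ratio and not failed_non_typical


stored = {"motherboard": "mb-A1", "cpu": "cpu-9", "ram": "ram-16G",
          "graphics_card": "gfx-2", "hard_drive": "hd-7"}
received = dict(stored, ram="ram-32G")                      # a routinely upgraded component changed
print(authenticate(received, stored))                       # True: 4/5 portions pass; the failure is typical-upgrade
```
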
b. Analysis With Respect to Independent Claims 1, 24, and 25

Petitioner argues that Etchegoyen discloses each and every element recited in independent claim 1. Pet. 14–23. Petitioner applies substantially the same mapping of Etchegoyen to independent claims 24 (id. at 23–26) and 25 (id. at 26–28). In the alternative, Petitioner argues that:

    With respect to Claims 1, 24, and 25, to the extent Patent Owner contends that Etchegoyen does not anticipate because it does not disclose a single embodiment that uses both the request/response protocol (i.e., challenge/response protocol) to obtain a fingerprint and the typical-upgrade ratio to analyze the fingerprint, it would have been obvious to a [person of ordinary skill in the art] to modify the embodiments disclosed to include both aspects.

Id. at 37 (citing Ex. 1003 ¶ 151). Patent Owner contends, however, that Etchegoyen fails to disclose the limitations of claim 1 relating to “anticipated changes” including “storing information associated with one or more identities, wherein the information stored for an identity includes (a) data values associated with that identity; and (b) information regarding anticipated changes to one or more of the stored data values associated with that identity” and “determining whether the response is allowable, wherein such determining comprises using the stored information regarding anticipated changes to the stored data values associated with the identity to determine whether a data value used to form the response is based on an acceptable change to a corresponding stored data value.” Prelim. Resp. 7–11. For the reasons discussed below, we are not persuaded that Petitioner has shown a reasonable likelihood of prevailing in its assertion that Etchegoyen discloses each and every element or teaches or suggests all of the limitations recited in independent claims 1, 24, and 25.

i. Petitioner’s Mapping of Etchegoyen onto Claim 1

Petitioner provides a detailed mapping of the disclosure of Etchegoyen onto the elements of independent claim 1. Pet. 14–23; see Ex. 1003 ¶¶ 78–95. Specifically, Petitioner argues that Etchegoyen discloses a system for authenticating the identity of a device. Ex. 1004, 1:30–43. Similar to the system recited in claim 1, Etchegoyen’s system recognizes that it is “desirable to provide an authentication method with built in flexibility or tolerance to allow for some upgrades or changes to the device.” Id. (emphasis added). Petitioner argues that Etchegoyen discloses “a non-transitory memory storing information associated with one or more identities” of the device to be authenticated. Pet. 18–19; see Ex. 1001, 34:19–21. Specifically, challenged claim 1 recites two types of stored information, namely, “(a) data values associated with that identity; and (b) information regarding anticipated changes to one or more of the stored data values associated with that identity.” Ex. 1001, 34:21–24. Petitioner argues that Etchegoyen’s “first boot fingerprint” corresponds to the “data values” recited in claim 1. Pet. 15 (citing Ex. 1004, 4:56–62), 19. Further, Petitioner argues that Etchegoyen’s “predetermined non-typical-upgrade component/typical-upgrade component ratio” and Etchegoyen’s lists of typical-upgrade components and non-typical-upgrade components disclose the “information regarding anticipated changes to one or more of the stored data values.” Id. at 19–20 (citing Ex. 1004, 3:12–17, 4:65–5:2, 12:45–57, 13:22–38).

Claim 1 further recites “one or more hardware processors in communication with the memory and configured to execute instructions to cause the identity recognition system to recognize that the presentation of identity information by a computer is authentic, by performing operations.” Ex. 1001, 34:25–29. Petitioner argues that, referring to its Figure 1, Etchegoyen’s Processing Module 160 in communication with Storage Module 155 discloses this additional hardware. Pet. 16, 20 (citing Ex. 1004, 3:2–37, 9:49–52).

Claim 1 recites four method steps performed by the “one or more hardware processors.” Ex. 1001, 34:30–47. Petitioner argues that Etchegoyen discloses each of these steps. Specifically, claim 1 recites the step of “generating a challenge to the computer, wherein the challenge prompts the computer to provide a response based on one or more data values from the computer that correspond to one or more of the stored data values associated with the identity.” Id. at 34:30–34. In view of the agreed construction of the term “generating a challenge” (see supra Section II.A.1.), Petitioner argues that Etchegoyen discloses “generat[ing] a request code,” which corresponds to generating the recited challenge. Pet. 20 (citing Ex. 1004, 5:35–39). Petitioner argues that Etchegoyen’s request code prompts the challenged computer to “send a response that includes a fingerprint or mini-fingerprint, such as a fingerprint based on the same components used for the first boot fingerprint.” Id. at 20 (citing Ex. 1004, 11:13–22); see Ex. 1004, 5:47–51.

Claim 1 further recites the step of “receiving, from the computer, the response to the challenge.” Ex. 1001, 34:35–36. Petitioner argues that Etchegoyen discloses that the recited “response to the challenge,” in the form of a “response code” that is generated in response to the request code, is transmitted to the authenticating server. Pet. 21 (citing Ex. 1004, 5:44–63).

Claim 1 further recites “determining whether the response is allowable, wherein such determining comprises using the stored information regarding anticipated changes to the stored data values associated with the identity to determine whether a data value used to form the response is based on an acceptable change to a corresponding stored data value.” Ex. 1001, 34:37–43. Referring to steps 420–450 of Etchegoyen’s Figure 4, discussed above, Petitioner argues that:

    At step 420, the system “compar[es] each portion of the received digital fingerprint with portions of a stored digital fingerprint.” [Ex. 1004], 12:28-29. At step 430, the system “flag[s] each digital fingerprint portion creating an error during authentication.” Id., 3:10-12. At step 440, the system “categorize[s] associated component[s] of each fingerprint portion that produced the error.” Id., 3:12-14. As discussed above, this step entails categorizing the flagged portions as corresponding to “typical-upgrade components” or “non-typical-upgrade components.” And lastly at step 450, the system “authenticate[s] device based on categorization of flagged components.” Id., 15:42-43.

    The determining step of Etchegoyen “us[es] the stored information regarding anticipated changes to the stored data values associated with the identity” both because it uses the predetermined typical-upgrade ratio and because it uses information regarding whether a failed mini-fingerprint corresponds to a “typical” or “non-typical” upgrade component.

    The determining step of Etchegoyen also includes “determin[ing] whether a data value used to form the response is based on an acceptable change to a corresponding stored data value.” Etchegoyen discloses that one way of determining whether the response is based on an acceptable

