`UNITED STATES PATENT AND TRADEMARK OFFICE
`
`
`
`
`
`
`BEFORE THE PATENT TRIAL AND APPEAL BOARD
`
`
`
`
`
`
`NETAPP, INC., LENOVO (UNITED STATES) INC., and EMC CORP.,
`Petitioners
`
`v.
`
`INTELLECTUAL VENTURES II LLC
`Patent Owner
`____________________
`
`IPR2017-00467
`Patent 6,968,459
`____________________
`
`
`PATENT OWNER INTELLECTUAL VENTURES II LLC’S
`PRELIMINARY RESPONSE TO PETITION
`
`
`
`
`
`
`
`
`
`Mail Stop PATENT BOARD
`Patent Trial and Appeal Board
`U.S. Patent & Trademark Office
`P.O. Box 1450
`Alexandria, VA 22313-1450
`
`
`
`
`
`
`
`
`
`
`

`

`
`
TABLE OF CONTENTS

I.    Introduction. .................................................................................................... 1

II.   The ’459 patent solved the important problem of preventing authorized users
      from copying sensitive data to unsecured removable storage devices. ........... 2

III.  Claim construction. ......................................................................................... 5

IV.   The Board should exercise its discretion under 35 U.S.C. § 325(d) to deny
      the Petition because it presents a substantially similar argument that the
      Board denied in a previous IPR petition against the ’388 patent. ................... 6

V.    Ground 1: The Board should deny institution because Petitioners have not
      established a reasonable likelihood that the combination of Blakley and
      Bramhill renders claims 15, 18, 24, and 25 obvious. ...................................... 8

      A.   Overview of Blakley. ............................................................................ 8

      B.   Overview of Bramhill. ........................................................................ 12

      C.   Petitioners have not established a reasonable likelihood that the
           combination of Blakley and Bramhill renders independent claim 15
           obvious. .............................................................................................. 13

           1.   Petitioners have not established a reasonable likelihood of
                showing that the combination discloses the “sensing” element
                of claim 15. ............................................................................... 13

           2.   Petitioners have not established a reasonable likelihood of
                showing that the combination discloses “providing restricted-
                access to the storage device” as recited in claim 15. ................ 23

      D.   Petitioners have not established a reasonable likelihood that the
           combination of Blakley and Bramhill renders claims 18, 24, and 25
           obvious. .............................................................................................. 27

VI.   Ground 2: The Board should deny institution because Petitioners have not
      established a reasonable likelihood that the combination of Uchida and
      Bramhill renders claims 15, 18, 24, and 25 obvious. .................................... 28

      A.   Petitioners have not established a reasonable likelihood that the
           combination of Uchida and Bramhill renders independent claim 15
           obvious. .............................................................................................. 28

      B.   Petitioners have not established a reasonable likelihood that the
           combination of Uchida and Bramhill renders claims 18, 24, and 25
           obvious. .............................................................................................. 32

VII.  The Board should deny at least one of the grounds due to redundancy. ....... 35

VIII. Conclusion. .................................................................................................... 36
`
`
`
`
`Table of Authorities
`
`
`
`Cases
`
Conopco dba Unilever v. Procter & Gamble Co.,
IPR2014-00628, Paper 21 (PTAB Oct. 20, 2014) ...................................................... 6

Conopco, Inc. v. Procter & Gamble Co.,
IPR2013-00505, Paper 9 (PTAB Feb. 12, 2014) ..................................................... 35

Corning Inc. v. DSM IP Assets B.V.,
IPR2013-00048, Paper 94 (PTAB May 9, 2014) .................................................... 22

D-Link Systems, Inc. v. Chrimar Systems, Inc.,
IPR2016-01426, Paper 13 (PTAB Jan. 12, 2017) ................................................... 21

In re Gordon,
733 F.2d 900 (Fed. Cir. 1984) .................................................................................. 16

In re Nuvasive,
842 F.3d 1376 (Fed. Cir. 2016) ................................................................................ 21

KSR Int’l Co. v. Teleflex Inc.,
550 U.S. 398 (2007) ........................................................................................... 19, 20

Los Angeles Biomedical Research Inst. at Harbor-UCLA Med. Ctr. v.
Eli Lilly & Co.,
__ F.3d __, 2017 WL 765812 (Fed. Cir. Feb. 28, 2017) ......................................... 20

Phillips v. AWH Corp.,
415 F.3d 1303 (Fed. Cir. 2005) .................................................................................. 5

Unified Patents v. Intellectual Ventures II LLC,
IPR2016-01404, Paper 9 (PTAB Jan. 11, 2017) ........................................................ 6

Vivid Techs. v. Amer. Science,
200 F.3d 795 (Fed. Cir. 2000) .................................................................................... 5
`
`
`
`
`Statute
`
35 U.S.C. § 325(d) ................................................................................................ 6, 8
`
`
`
`
`
`
`
`
`
`
`
`
`
`
`
I. Introduction.
`The Board should not institute trial on either of the grounds asserted by
`
`Petitioners because both grounds contain fundamental deficiencies that Petitioners
`
`cannot fix with a trial. The ’459 patent addresses an important technological issue:
`
`preventing an authorized user from copying secure information to an unsecured
`
`removable device. The claims of the ’459 patent reflect this innovation, requiring
`
`that the computer prevents writing to devices when security information is not
`
`sensed. Petitioners’ cited references do not address this issue and significantly
`
`differ from the challenged claims.
`
`This Preliminary Response lays out the facts that show Petitioners have not
`
`met their burden. In Section II, Patent Owner provides a roadmap of the ’459
`
`patent, showing how the patent solved the problem of preventing authorized users
`
`from copying sensitive data to unsecured removable storage devices. Section III
`
sets forth Patent Owner’s position as to why no explicit claim construction is necessary
to resolve this dispute, and Sections V–VI each address Petitioners’ failure to meet
`
`their burden for the two asserted grounds. For example, Ground 1 fails because
`
`Petitioners do not show that the combination of Blakley and Bramhill discloses the
`
`“sensing” elements recited in independent claims 15 and 18. Ground 2 fails
`
`because Petitioners do not show that the combination of Uchida and Bramhill
`
`
`
`
`
`discloses the “providing restricted-access” and “sensing” elements of independent
`
`claims 15 and 18.
`
`This Preliminary Response will show that Petitioners have not met their
`
`burden to establish a reasonable likelihood of prevailing against any challenged
`
`claim. The Board should therefore deny all of Petitioners’ proposed grounds.
`
`II. The ’459 patent solved the important problem of preventing authorized
`users from copying sensitive data to unsecured removable storage
`devices.
`
`In 1999, engineers at Imation Corporation recognized that “[o]ne of the
`
`greatest challenges” in creating a secure computing environment is “preventing the
`
`authorized user from using sensitive data in an unauthorized manner.” (Ex. 1001,
`
`’459 patent, 1:13–23.) For example, prior to the ’459 patent, after a user
`
`successfully entered a password, the user was able to access and handle
`
`information without technology-imposed limitations. The lack of access control
`
`meant that an authorized user could “simply copy[] the sensitive data to a
`
`removable storage device such as floppy diskette.” (’459 patent, 1:23–26.)
`
`To address this critical security flaw, the inventors of the ’459 patent
`
`developed a computing environment using secure storage devices where, for
`
`example, “a computer automatically operates in a secure ‘full-access’ data storage
`
`mode when the computer detects the presence of a secure removable storage
`
`device.” (’459 patent, 1:36–40.) Conversely, “[i]f the computer senses a non-
`
`
`
`
`
`secure removable storage device then the computer automatically operates in a
`
`‘restricted-access’ mode.” (’459 patent, 1:40–43.) Figure 1, reproduced below,
`
`illustrates an embodiment of such a computing environment.
`
`
`
`In secure full-access mode, the system “uses a cryptographic key to encrypt
`
`and decrypt the data stream between the computer and the removable storage
`
`device.” (’459 patent, 1:44–47.) The key can be generated by a combination of
`
`various types of information such as “(1) device-specific information derived of
`
`the removable storage device, (2) manufacturing information that has been etched
`
`
`
`
`
`onto the storage device, (3) drive-specific information…, and (4) user-specific
`
`information such as a password.” (’459 patent, 1:47–55.) Figure 2, reproduced
`
below, illustrates an exemplary method in which the four types of information
identified in the figure (204, 206, 208, 210) are required for a storage device to be
deemed secure and have “full access.”
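For illustration only, and not as part of the record, the key-generation concept described above can be sketched in simplified form as follows. The function name, the placeholder inputs, and the use of SHA-256 as the combining function are assumptions added for clarity; the ’459 patent does not prescribe any particular combining algorithm.

```python
# Illustrative sketch only -- not the '459 patent's actual algorithm. It shows,
# in simplified form, how a cryptographic key could be derived from the four
# types of information the specification identifies: device-specific,
# manufacturing, drive-specific, and user-specific data.
import hashlib

def derive_key(device_info: bytes, manufacturing_info: bytes,
               drive_info: bytes, user_password: bytes) -> bytes:
    """Combine the four inputs and hash them into a fixed-length key.

    All parameter names are hypothetical; the patent does not prescribe a
    particular combining function or hash.
    """
    combined = b"|".join([device_info, manufacturing_info, drive_info, user_password])
    return hashlib.sha256(combined).digest()

# Example usage with placeholder values.
key = derive_key(b"serial:ZIP-250-0042", b"etched:LOT-1999-07",
                 b"drive:IMATION-A1", b"password:hunter2")
print(key.hex())
```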
`
`
`
`The patent contemplates two situations: where the storage device is secure
`
`and where the storage device is unsecured. When the storage device is secure, the
`
`computing environment uses the “secure storage device as a secure ‘access card’”
`
`to gain access to sensitive data of the organization. (’459 patent, 1:56–58.) For
`
`
`
`
`
`example, the computer allows the user to access sensitive information from other
`
`sources after plugging in a secure storage device. (See ’459 patent, 1:58–62.)
`
`Sensitive information written to the secure storage device can be encrypted. (See
`
`’459 patent, 1:44–47.) Conversely, if the storage device is unsecured, the computer
`
`operates such that the user cannot write to the device, preventing the removal of
`
`sensitive information from the computer when using an unsecured device. (See
`
`’459 patent, 1:63–66.)
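For illustration only, the full-access and restricted-access behavior described above can be sketched as follows. The function names, the dictionary-based "device," and the toy XOR cipher are hypothetical conveniences, not the patent's implementation.

```python
# Illustrative sketch only -- a simplified model of the access-mode logic the
# '459 patent describes: full access when a secure device is sensed, and a
# restricted mode that blocks writes when the device is not secure.
# All function and variable names are hypothetical.

def has_security_information(device: dict) -> bool:
    """Stand-in for sensing whether the device carries security information."""
    return "security_information" in device

def encrypt(data: bytes, key: bytes) -> bytes:
    """Toy placeholder cipher (XOR with a repeating key) for illustration."""
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

def write_to_device(device: dict, filename: str, data: bytes) -> None:
    if has_security_information(device):
        # Secure "full-access" mode: data written to the device is encrypted.
        device[filename] = encrypt(data, device["security_information"])
    else:
        # "Restricted-access" mode: writing sensitive data is refused.
        raise PermissionError("Unsecured device: writing is not permitted.")

secure_device = {"security_information": b"device-and-user-derived-key"}
unsecured_device = {}

write_to_device(secure_device, "report.txt", b"sensitive data")  # succeeds, encrypted
try:
    write_to_device(unsecured_device, "report.txt", b"sensitive data")
except PermissionError as e:
    print(e)  # write refused in restricted-access mode
```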
`
`III. Claim construction.
`Petitioners propose constructions for four terms— “device-specific security
`
`information,” “[device/user]-specific information,” “security information,” and
`
`“detecting a storage device within a storage drive.” (See Petition, pp. 9–16.)
`
`The Board should not adopt constructions for any of these terms for two
`
`reasons. First, construction is not “necessary to resolve the controversy” in this
`
`proceeding. Vivid Techs. v. Amer. Science, 200 F.3d 795, 803 (Fed. Cir. 2000).
`
`Second, the meaning of each claim term is clear and therefore no further
`
`construction is required. Phillips v. AWH Corp., 415 F.3d 1303, 1312 (Fed. Cir.
`
`2005).
`
`
`
`
`
`IV. The Board should exercise its discretion under 35 U.S.C. § 325(d) to
`deny the Petition because it presents a substantially similar argument
`that the Board denied in a previous IPR petition against the ’388 patent.
`
`Congress gave the Board discretion to “reject the petition or request because,
`
`the same or substantially the same prior art or arguments previously were
`
`presented to the Office.” 35 U.S.C. § 325(d). “[S]ubstantial similarity of argument,
`
`standing alone, is sufficient for a denial under § 325(d).” Conopco dba Unilever v.
`
Procter & Gamble Co., IPR2014-00628, Decision Denying Institution at p. 10
`
`(Paper 21, Oct. 20, 2014).
`
`The Board should exercise this discretion to deny institution of the present
`
`proceeding. The Board has already considered and rejected a challenge to the
`
`claims based on an argument that comparison of two values constitutes “sensing
`
`whether the storage device has security information.” Here, Petitioners repackage
`
`that argument already rejected by the Board, relying on substantially similar
`
disclosures of their asserted art for the same claim elements.
`
`In January 2017, the Board denied grounds against claims 15 and 18 of the
`
’388 patent. Unified Patents Inc. v. Intellectual Ventures II LLC, IPR2016-01404,
`
`Institution Decision at pp. 17–20 (Paper 9, Jan. 11, 2017). In its decision, the
`
`Board rejected the petitioner’s assertion that comparison of a security code with
`
`another security code constitutes “sensing whether the storage device has security
`
`information generated from a combination of device-specific information
`
`
`
`
`
`associated with the storage device and user-specific information associated with a
`
user” as recited in claim 18. Id. at 18. Specifically, the Board found that such a comparison
`
`“does not address sensing whether [the storage device] contains [the alleged
`
`security information].” Id. at 19. The Board therefore found that the petitioner did
`
not meet its burden of establishing the obviousness of claim 18 of the ’388 patent.
`
`Here, Petitioners make the same argument based on Blakley’s bit strings and
`
`Uchida’s passwords in lieu of the security code of the first petition. In both
`
`Grounds, Petitioners argue that the alleged security information is stored on the
`
`storage device, and that comparing the stored information to an entered value
`
`satisfies the “sensing” elements of independent claims 15 and 18. For Ground 1, in
`
`the only sentence addressing the art’s disclosure of the claim 15 “sensing” element,
`
Petitioners argue that “Blakley teaches that a… bit string is generated and stored
`
`on the storage device during installation, and later checked against a
`
`pseudorandom bit string generated when a user logs on.” (Petition, p. 28
`
`(emphasis added).) Petitioners rely on the same disclosure for claim 18. (Petition,
`
`p. 37.) Like the argument the Board rejected in the first petition, Petitioners only
`
`cite to a comparison for alleged disclosure of the “sensing” elements, and the
`
`Board should exercise its discretion to deny Ground 1 for that reason.
`
`For Ground 2, Petitioners cite to Uchida’s alleged disclosure of “permitting
`
`full access to a storage medium if a password for the medium is consistent with the
`
`
`
`
`
`stored password(s)” for the “sensing” elements. (Petition, p. 52 (citing Uchida,
`
`3:20–30).) Petitioners further argue that the “passwords are stored… and must
`
`coincide with a separate user entered password” (Petition, p. 54) and that “access is
`
`restricted depending on whether the user-inputted password coincides with one of
`
the predetermined passwords.” (Petition, p. 62.) Thus, Petitioners’ argument is
`
`substantially similar to the argument the Board denied in the first petition in that it
`
`equates a comparison of a stored password to a user-entered password with the
`
`“sensing” elements recited in claims 15 and 18.
`
The Board has already considered and rejected arguments substantially similar
to those Petitioners present in the instant Petition. Accordingly, the
`
`Board should exercise its discretion to deny institution under § 325(d).
`
`V. Ground 1: The Board should deny institution because Petitioners have
`not established a reasonable likelihood that the combination of Blakley
`and Bramhill renders claims 15, 18, 24, and 25 obvious.
`A. Overview of Blakley.
`Blakley discloses a method “to protect the confidentiality of information
`
`stored on a storage device of a computer, even if the computer is stolen or
`
`otherwise accessed without the owner’s consent or knowledge.” (Blakley, 1:44–
`
`47.) Blakley accomplishes this goal using software that operates in two stages: an
`
`installation stage (see Blakley, 5:41 to 6:12) and an access stage (see Blakley,
`
`6:13–47.) The figure below illustrates the installation process described in Blakley,
`
`
`
`
`
`with annotations to corresponding disclosures of Blakley. (See Blakley, 5:41 to
`
`6:12.)
`
`
`
`Figure A – Block Diagram of Blakley
`
`During the first step of the installation process, a user enters a password.
`
`Blakley uses this user-entered password for two separate functions: 1) user
`
`verification during subsequent logon attempts, and 2) generation of the
`
`pseudorandom bit string for data encryption. (See Blakley, 5:42–44.) In both
`
`functions, the password is masked to create a secret key: “Information dependent
`
`on the user password is then combined with (non-secret) information (e.g., a
`
`mask…) to determine a secret key, 𝑎, for the user.” (Blakley, 5:44–48.) For the
`user verification function, the secret key 𝑎 is processed through a one-way function
`
`
`
`
`
`to produce a result that can be stored “on the mass storage device to allow the key
`
`processing unit to distinguish correct and incorrect passwords.” (Blakley, 5:66 to
`
`6:2.) Because a one-way function is applied, the secret key cannot be derived from
`
`the stored value. Blakley stores the result of the one-way function in storage. (See
`
`Blakley, 5:66 to 6:2.) Importantly, security is maintained because neither the
`
`password nor the secret key is stored on the disk. (See Blakley, 2:25–32 and 6:2–
`
`3.)
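For illustration only, the verification mechanism described above can be sketched as follows. SHA-256 stands in for Blakley's unspecified one-way function, and all names and values are hypothetical.

```python
# Illustrative sketch only -- not Blakley's actual implementation. It models
# the idea that only a one-way function of the secret key is stored, so the
# stored value can verify a password without revealing either the password
# or the secret key itself.
import hashlib

MASK = b"non-secret-mask"  # hypothetical stand-in for Blakley's public mask

def secret_key_from_password(password: str) -> bytes:
    """Combine password-dependent information with the non-secret mask."""
    digest = hashlib.sha256(password.encode()).digest()
    return bytes(p ^ m for p, m in zip(digest, MASK.ljust(32, b"\0")))

def one_way(value: bytes) -> bytes:
    """Stand-in for the one-way function applied to the secret key."""
    return hashlib.sha256(value).digest()

# Installation: only the one-way function of the secret key is stored.
stored_check_value = one_way(secret_key_from_password("correct horse"))

# Logon: recompute and compare; the stored value distinguishes correct from
# incorrect passwords without exposing the key.
def password_is_correct(candidate: str) -> bool:
    return one_way(secret_key_from_password(candidate)) == stored_check_value

print(password_is_correct("correct horse"))   # True
print(password_is_correct("wrong password"))  # False
```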
`
`Blakley’s installation process also performs encryption of data stored on the
`
`drive. During the installation process, data is read from each sector of interest,
`
`encrypted, and written back to the same sector. (See Blakley, 6:4–11.) The cipher
`
`used by Blakley for encryption (also referred to as a pseudorandom bit string) is a
`
`function of a secret key and an index specifying the data’s position on the storage
`
`device. Because of the computational overhead associated with any cryptographic
`
operation, Blakley performs pre-processing on the secret key. Specifically, for the
data encryption function, the secret key is pre-processed to form a cipher 𝑓𝑎. (See
Blakley, 5:61–63; see also 2:33–35 and 8:8–10.) Pre-processing the secret key in
this manner, before it is used in an encryption operation, allows the pseudorandom
bit string cipher to be efficiently computed in real-time for each sector index 𝑖
every time a sector is accessed. (Blakley, 2:35–42, 6:29–32.) The computed cipher
`
`is used to encrypt the information stored on the disk in plaintext that “the user
`
`
`
`
`
`wishes to have [ ] kept private.” (Blakley, 6:4–6.) Specifically, “[w]hen the string 𝑥
`at position 𝑖 of the disk is read, the value 𝑓𝑎(𝑖) is computed by the computing
`system.” (Blakley, 6:6–8.) The data is encrypted when “[a] value 𝑦 is computed by
`XORing or otherwise combining 𝑥 and 𝑓𝑎(𝑖),” and the encrypted data “𝑦 replaces
`the previous value 𝑥 for the contents of the sector.” (Blakley, 6:8–12.)
`
`After installation, during the access process, a user logs onto the machine by
`
`inputting the password. (See Blakley, 6:13–16.) Similar to installation, the
`
password is verified by creating a secret key, passing the secret key through a one-
way function, and checking the result “against information stored in the computing
device” (i.e., the one-way function of 𝑎 stored during installation). (Blakley, 6:21–
25.) The system also creates the cipher 𝑓𝑎 in a similar manner as the installation
process. (See Blakley, 6:16–21.) At this point, whenever the user wants to read
encrypted information from the 𝑖-th sector of the disk, “the software computes
𝑓𝑎(𝑖)” and “retrieves the contents of the 𝑖-th sector of the disk.” (Blakley, 6:29–
`33.) The retrieved encrypted value “is XORed with 𝑓𝑎(𝑖) to determine the
`‘plaintext’ 𝑥.” (Blakley, 6:34–36.) When the user wants to write information to the
`𝑖-th sector, Blakley again calculates 𝑓𝑎(𝑖) and XORs the unencrypted information
`with 𝑓𝑎(𝑖) to form the encrypted version 𝑦. The resulting encrypted data is then
`stored on the 𝑖-th sector of the disk. (See Blakley, 6:39–47.)
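For illustration only, the read and write mechanics described above can be sketched as follows. The pseudorandom function is a stand-in (Blakley does not mandate a particular construction), and all names are hypothetical; the point of the sketch is that 𝑓𝑎(𝑖) is recomputed at each access rather than stored on the disk.

```python
# Illustrative sketch only -- a simplified model of the sector cipher described
# above: f_a(i) is recomputed for each sector access and XORed with the sector
# contents; the bit string itself is never written to disk.
import hashlib

def f_a(secret_key: bytes, sector_index: int, length: int) -> bytes:
    """Hypothetical stand-in for Blakley's pseudorandom bit string f_a(i)."""
    stream = b""
    counter = 0
    while len(stream) < length:
        stream += hashlib.sha256(secret_key + sector_index.to_bytes(8, "big")
                                 + counter.to_bytes(8, "big")).digest()
        counter += 1
    return stream[:length]

def xor_bytes(a: bytes, b: bytes) -> bytes:
    return bytes(x ^ y for x, y in zip(a, b))

secret_key = b"derived-from-user-password"
disk = {}  # hypothetical disk: sector index -> stored (encrypted) bytes

def write_sector(i: int, plaintext: bytes) -> None:
    # y = x XOR f_a(i); only y is stored on the disk.
    disk[i] = xor_bytes(plaintext, f_a(secret_key, i, len(plaintext)))

def read_sector(i: int) -> bytes:
    # f_a(i) is recomputed at access time, not read from storage.
    y = disk[i]
    return xor_bytes(y, f_a(secret_key, i, len(y)))

write_sector(7, b"sensitive plaintext")
print(read_sector(7))  # b'sensitive plaintext'
```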
`
`
`
`
`
`B. Overview of Bramhill.
`Bramhill appears to be a technical article titled “Challenges for Copyright in
`
`a Digital Age.” As the title suggests, the article is concerned with copyright
`
`protection, “propos[ing] an initial model of a software-based system that provides
`
`copyright protection of multimedia information when delivered by Internet-based
`
`services.” (Bramhill, p. 63, Abstract.)
`
`Bramhill argues for a “strong one-to-one binding between the software of a
`
`user, and the user.” (Bramhill, p. 67, § 3.1 ¶ 2.) For the purposes of this
`
`proceeding, Petitioners focus primarily on the binding described in Section 3.2,
`
`which is titled “Binding software to a machine.” (Bramhill, p. 67, § 3.2.) This
`
`section discusses “a number of characteristics of a computer [that] can be
`
`measured to achieve a similar level of probability of its identity,” termed
`
`“cybermetrics” by Bramhill. (Bramhill, p. 67, § 3.2 ¶ 1.) Bramhill lists nine
`
`examples of cybermetrics including size of memory, the presence of a CD drive,
`
`“operational characteristics,” manufacturer of physical components, logical
`
`directory and file structures, or files specifically created to identify the machine.
`
`(See Bramhill, p. 67, § 3.2.)
`
`Bramhill discloses that a fraudster “would have to recreate both the logical
`
`and physical characteristics of” a machine belonging to another user in order “to
`
`make use of a software decryption processing belonging to” that user. (Bramhill,
`
`
`
`
`
`p. 67.) However, Bramhill provides no details as to how its “cybermetrics”—a
`
`term Bramhill only uses once—could be used to thwart copyright infringement.
`
C. Petitioners have not established a reasonable likelihood that the combination of Blakley and Bramhill renders independent claim 15 obvious.

1. Petitioners have not established a reasonable likelihood of showing that the combination discloses the “sensing” element of claim 15.
`Claim 15 recites “sensing whether a storage device has device-specific
`
`security information stored thereon.” Petitioners address this claim element by
`
`parsing it into two parts, “device-specific security information” and “sensing.”
`
Petitioners’ piecemeal approach is fatal to their arguments. As Patent Owner
`
`demonstrates below, Petitioners fail to establish that the combination of Blakley
`
and Bramhill teaches or suggests this claim element.
`
`Petitioners present three different arguments regarding whether the
`
`combination teaches or suggests “device-specific security information” in isolation.
`
`(See Petition, pp. 26–29.) First, Petitioners argue Blakley’s “pseudorandom bit
`
`string” that is used to encrypt data stored on a storage device is “device-specific
`
`security information.” (See Petition, pp. 26–27.) Second, Petitioners argue that
`
`Blakley’s value identification could be considered “device-specific security
`
`information” should the Board adopt a broad claim construction for the term. (See
`
`Petition, pp. 27–28.) Third, Petitioners turn to Bramhill, arguing that Bramhill’s
`
`
`
`
`
`“cybermetrics” when combined with Blakley teach or suggest “device-specific
`
`security information.” (See Petition, pp. 28–29.)
`
`But when Petitioners address the “sensing” part of the claim element,
`
`Petitioners only rely on Blakley’s “pseudorandom bit string.” (See Petition, p. 28.)
`
`Accordingly, Petitioners’ second and third arguments have not established that the
`
combination of Blakley and Bramhill teaches or suggests this claim element.
`
`However, for completeness, Patent Owner demonstrates that none of Petitioners’
`
`three arguments has a reasonable likelihood of success. The Board should therefore
`
`deny institution of Ground 1 for claim 15.
`
`a) Petitioners fail to show that Blakley discloses “sensing
`whether a storage device has” the pseudorandom bit string
`“stored thereon.”
`Petitioners’ first argument fails because Petitioners do not and cannot show
`
`that Blakley teaches or suggests “sensing whether a storage device has” the
`
`pseudorandom bit string “stored thereon.” Petitioners’ argument ignores a key
`
`portion of the claim element—that the sensing action determines whether the
`
`device-specific security information is stored on the storage device. Petitioners’
`
`omission is not surprising because Blakley never stores the pseudorandom bit
`
`string on the storage device. Petitioners attempt to obscure this fatal flaw in
`
`Blakley by mischaracterizing the teachings of Blakley. But when Petitioners’
`
`subterfuge is stripped away, the explicit disclosures of Blakley, as well as the
`
`
`
`
`
`nature of the pseudorandom bit string itself, make clear that Petitioners’ argument
`
`fails.
`
`In Blakley, “the function that produces the pseudorandom bit string for each
`
`sector index, and any other information (e.g., the secret key) useful in encrypting
`
`and decrypting disk accesses, is preferably stored in volatile memory when the
`
`machine is in use” and is “erased” when “the authorized user logs off, powers off,
`
`locks the computer, or when a predetermined timeout occurs.” (Blakley, 6:48–57
`
`(emphasis added).) The pseudorandom bit strings are undoubtedly “other
`
`information… useful in encrypting and decrypting disk accesses” (Blakley, 6:48–
`
`53) because they are “used to encrypt and decrypt data accesses” (Blakley,
`
`Abstract). The pseudorandom bit strings are erased and therefore not stored to the
`
`storage device.
`
`Blakley also discloses that the pseudorandom bit strings are generated each
`
`time the system accesses a protected portion of the disk. Describing the installation
`
`process, Blakley states that “[w]hen the string 𝑥 at position 𝑖 of the disk is read, the
`value 𝑓𝑎(𝑖) is computed by the computing system.” (Blakley, 6:6–8.) Afterward,
`when “the operating system will attempt to read the 𝑖-th sector from the disk… the
`software computes 𝑓𝑎(𝑖)” again, in order to decrypt the information on the disk.
`
`(Blakley, 6:26–32; see 6:32–36.) If it were stored on the disk, the system could
`
`simply access that stored value instead of re-computing it. Blakley’s disclosure can
`
`
`
`
`
`lead to only one conclusion—that the pseudorandom bit strings are erased (not
`
`stored) and must be regenerated.
`
`Furthermore, a person of ordinary skill in the art (“POSITA”) would
`
`recognize that Blakley’s pseudorandom bit string is not stored on the storage
`
`device for several reasons. First, storing the key used to encrypt/decrypt data with
`
`the data would undermine the entire purpose of Blakley, rendering it inoperable for
`
`its intended purpose. In re Gordon, 733 F.2d 900, 902 (Fed. Cir. 1984). If Blakley
`
`stored the cipher (key) used to encrypt and decrypt the stored data with the
`
`encrypted data, an unauthorized user could access the cipher and decrypt the data,
`
`defeating Blakley’s purpose of “protect[ing] the confidentiality of information
`
`stored on a storage device of a computer.” (Blakley, 1:44–47.) Second, storing the
`
`pseudorandom bit strings would take an enormous amount of storage space
`
`because each bit string is the same length as the data that it encrypts. (See Blakley,
`
`3:49–56 (“[A] pseudorandom bit string [] will have a length equal to the area of the
`
`storage device in which the data will be stored. If the storage device is a hard disk
`
`drive, the area is a ‘sector.’”).) Thus, under Petitioners’ theory, half of the storage
`
`device would have to be devoted to storing pseudorandom bit strings
`
`corresponding to each sector of encrypted information. In addition to being
`
`counterintuitive, the argument contradicts Blakley, which discusses “encrypting
`
`
`
`
`
`the entire disk,” and makes no mention of being limited to only encrypting half of
`
`it. (Blakley, 7:6–10.)
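A back-of-the-envelope illustration of this point, using hypothetical disk parameters, is set out below.

```python
# Illustrative arithmetic only: if every 512-byte sector required a stored
# pseudorandom bit string of equal length, the strings would consume as much
# space as the data itself, i.e., half of the total capacity.
sector_size = 512            # bytes per sector (hypothetical)
data_sectors = 1_000_000     # sectors of encrypted user data (hypothetical)

data_bytes = data_sectors * sector_size
bitstring_bytes = data_sectors * sector_size  # one equal-length string per sector

total = data_bytes + bitstring_bytes
print(bitstring_bytes / total)  # 0.5 -> half the device holds bit strings
```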
`
`Petitioners fail to discuss any of these issues, addressing the storage
`
requirement in a carefully worded, three-sentence paragraph that mischaracterizes
`
`Blakley. Petitioners state that “Blakley teaches that a device-specific…
`
`pseudorandom bit string is generated and stored on the storage device during
`
`installation, and later checked against a pseudorandom bit string generated when a
`
`user logs on or otherwise initiates a request to access secure information.”
`
`(Petition, p. 28.) But this is not what Blakley discloses. The value that Petitioners
`
`describe as being “stored on the storage device during installation” (Petition, p. 28)
`
`is not the “pseudorandom bit string” in Blakley. Rather, Blakley refers to this value
`
`as a “one-way function of the secret key.” (Blakley, 5:66 to 6:2.) And the
`
`distinction between the two values is critical. The figure below highlights the
`
`differences between these two values.
`
`
`
`
`
`
`
`Figure B – Annotated Block Diagram of Blakley
`
`Blakley’s pseudorandom bit string used as its cryptographic cipher is not
`
`“stored on the storage device during installation” and is not “later checked” against
`
`a value generated when a user logs on. (Petition, p. 28.) The portion of Blakley
`
`cited by Petitioners explicitly states that the result of the “one-way function of the
`
`secret key” is stored and later checked during password verification. (See Blakley,
`
`5:66–6:2 and 6:21–25 (“A one-way function of the secret key 𝑎 is installed on the
`
`mass storage device to allow the key processing unit to distinguish correct and
`
`incorrect passwords.”).) As its name implies, the result of the one-way function of
`
`the secret key is generated from the secret key. Petitioners do not argue that the
`
`result of the one-way function is the “device-specific security information.” Nor
`
`
`
`
`
`can they. Petitioners contend that the dependence on the sector index 𝑖 is what
`
`made the pseudorandom bit string used as the cryptographic cipher “device-
`
`specific.” (See Petition, pp. 26–27.) The result of the one-way function of the secret
`
`key does not utilize or depend in any way on the sector index.
`
The Board should reject Petitioners’ sleight of hand. Petitioners identified
`
`Blakley’s pseudorandom bit string used as its cryptographic cipher as the recited
`
`“device-specific security information” of claim 15. Therefore, to meet their burden
`
of showing that Blakley teaches or suggests the entire claim element, Petitioners
`
`must point to some disclosure of sensing whether the pseudorandom bit string used
`
as a cryptographic cipher is stored on the storage device. Petitioners did not
and cannot do so because Blakley does not store its cryptographic cipher to maintain
`
`security of its data. The fact that Blakley stores an entirely different value derived
`
from the secret key is not sufficient or even relevant, regardless of
`
`Petitioners’ attempt to improperly place the same label on the values. KSR Int’l Co.
`
`v. Teleflex Inc., 550 U.S. 398, 418 (2007).
`
`b) Petitioners fail to establish that a POSITA would have
`combined Blakley and Bramhill to meet the “sensing”
`element of claim 15.
`
`Petitioners argue that “Bramhill also teaches using ‘device-specific security
`
`information’” and that “a POSA would have understood that the disclosures of
`
`Blakley in view of Bramhill disclose, teach, or suggest this limitation.” Petitioners’
`
`
`
`
`
`argument fails for two reasons. First, Petitioners provide no articulated reasoning
`
or rational underpinning supporting the combination, and never even explain
`
`what the combination is. See KSR, 550 U.S. at 418 (“there must be some
`
`articulated reasoning with some rational underpinning to support the legal
`
`conclusion of obviousness”); Los
