`
`
`
`
`UNITED STATES PATENT AND TRADEMARK OFFICE
`____________
`
`BEFORE THE PATENT TRIAL AND APPEAL BOARD
`____________
`
`MICROSOFT CORPORATION and
`MICROSOFT MOBILE INC.,
`Petitioners,
`
`v.
`
`KONINKLIJKE PHILIPS N.V.,
`Patent Owner.
`____________
`
`Case IPR2018-00023
`Patent 6,690,387 B2
`____________
`
`Record of Oral Hearing
`December 20, 2018
`____________
`
`
`
`
`Before KEVIN F. TURNER, DAVID C. MCKONE, and
`MICHELLE N. WORMMEESTER, Administrative Patent Judges.
`
`
`
`
`
`
`
`APPEARANCES:
`
`ON BEHALF OF THE PETITIONER:
`
`
`CHRISTINA J. MCCULLOUGH, ESQUIRE
`Perkins Coie
`1201 Third Avenue
`Suite 4900
`Seattle, Washington 98101-3099
`
`
`
`ON BEHALF OF THE PATENT OWNER:
`
`
`JUSTIN J. OLIVER, ESQUIRE
`STEPHEN K. YAM, ESQUIRE
`Venable LLP
`600 Massachusetts Avenue, N.W.
`Washington, DC 20001
`
`
`
`
`The above-entitled matter came on for hearing on Thursday,
`
`December 20, 2018, commencing at 12:59 p.m., at the U.S. Patent and
`Trademark Office, 600 Dulany Street, Alexandria, Virginia.
`
`
`
`
`
`2
`
`
`
`Case IPR2018-00023
`Patent 6,690,387 B2
`
`
`P R O C E E D I N G S
`- - - - -
`
` MR. DILL: All rise.
`
`JUDGE WORMMEESTER: Please be seated. Good afternoon,
`everyone. We have our final hearing today in Case IPR2018-00023,
`Microsoft v. Philips, which concerns U.S. Patent No. 6,690,387. I'm Judge
`Wormmeester, and Judges Turner and McKone are appearing remotely.
`
`Let's get the parties' appearances, please. Who do we have for
`Petitioner?
`
`MS. MCCULLOUGH: Good afternoon, Your Honors; Christina
`McCullough of Perkins Coie for Petitioners, Microsoft Corporation and
`Microsoft Mobile Inc.
`
`JUDGE WORMMEESTER: And who's here for Patent Owner?
`
`MR. OLIVER: Good afternoon, Your Honor, Justin Oliver of
`Venable on behalf of Philips, the Patent Owner. With me at counsel table is
`Stephen Yam, also of Venable.
`
JUDGE WORMMEESTER: Thank you; welcome. We set forth the
procedure for today's hearing in our trial order; but just to remind everyone
of the way this will work: each party will have 60 minutes to present
arguments. Petitioner has the burden and will go first and may reserve time
`for rebuttal. Patent Owner will then have the opportunity to present its
`response. Please remember that Judges Turner and McKone will be unable
`to hear you unless you speak into the microphone; and when referring to any
`demonstrative, please state the slide number so that they can follow along.
`
`Also, this is a reminder that the demonstratives that you submitted are
not part of the record. The record of the hearing will be the transcript. We
`will give you a warning when you're into your rebuttal or reaching the end of
`your argument time. Are there any questions before we proceed?
`
`MR. OLIVER: One question, Your Honor. With respect to the
motion to exclude, will the Patent Owner have a chance to reserve rebuttal
`time to the extent that --
`
`JUDGE MCKONE: I'm not going to be able to hear you unless you
`speak at the microphone at the podium. Thank you.
`
`MR. OLIVER: Apologies, Your Honor. With respect to the motion
to exclude, to the extent that it is addressed on the Petitioner's rebuttal time,
`will the Patent Owner be able to reserve time for rebuttal of that issue should
`it be raised?
`
`JUDGE WORMMEESTER: Yes; that's fine with us.
`
`MR. OLIVER: Thank you.
`
`JUDGE WORMMEESTER: Okay; Counsel, will you be reserving
`any time?
`
`MS. MCCULLOUGH: Yes, Your Honor; I'd like to reserve 15
`minutes of my time for rebuttal.
`
`JUDGE WORMMEESTER: 15 minutes; okay. And you may begin
`when you are ready.
`
`MS. MCCULLOUGH: Your Honor, if I may approach; I have some
`courtesy copies of our demonstratives for the Board.
`
`JUDGE WORMMEESTER: Sure.
`
`MS. MCCULLOUGH: Thank you, Your Honors; Christina
`McCullough for Petitioners Microsoft Corporation and Microsoft Mobile
`Inc. I'll start at slide 2 of our demonstratives. This petition involves Patent
No. 6,690,387; and this patent describes a touchscreen system and method
`that scrolls display data at the speed and in the direction of a user's touch.
`The core aspects of this method are straightforward, and they're illustrated in
`figure 1 of the '387 Patent, which is shown on slide 2.
`
`The method starts by sensing the direction and speed of a touch; it
`also senses the duration of a touch. The display data is scrolled along with
`the finger's touch, and if the touch lifts from the screen, the scrolling can
`slow down at some rate, as shown in step 106 of figure 1. The scrolling can
`also stop in response to certain conditions like sensing a finger's touch, as
`shown in step 108.
`
`Moving to slide 3 -- this petition involves challenges to both the
`method and the system claims; and I'll start by addressing the method claims
`today.
`
`Claim 9 is the only independent method claim in this patent; and this
`claim tracks the steps of the figure 1 method we just saw. Claim 9 recites a
`method of controlling scroll-like display of data on a screen that involves
`sensing the duration of a touch; sensing the speed and direction of the touch;
`initiating scrolling in that direction and at the sensed speed; slowing the
`speed at a predetermined rate; and terminating scrolling upon sensing a few
`conditions, including a substantially stationary touch or an end-of-scroll
`signal.
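
Purely as an editor's illustration of the recited steps (every name, unit, and the constant deceleration model below are assumptions for readability, not claim language or anything from the record), the claimed method might be sketched as:

```python
PREDETERMINED_RATE = 1.0  # assumed constant deceleration applied per step


def scroll_step(direction, speed):
    # Stand-in for the display update; a real system would move the
    # displayed data in the given direction at the given speed.
    pass


def scroll_controller(speed, direction, is_stationary_touch, end_of_scroll):
    """Sketch of claim 9's steps: scroll at the sensed speed and direction,
    slow at a predetermined rate, and terminate scrolling upon sensing a
    substantially stationary touch or an end-of-scroll signal."""
    steps = 0
    while speed > 0:
        # Terminating step: either condition stops the scrolling.
        if is_stationary_touch() or end_of_scroll():
            return ("terminated", steps)
        scroll_step(direction, speed)
        speed -= PREDETERMINED_RATE  # slowing step
        steps += 1
    return ("stopped", steps)
```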
`
`The Board has construed this final limitation -- the stopping scrolling
`limitation for the method claims as requiring sensing only one of these two
`conditions; and that's consistent with how the district court has also
`interpreted this claim in the pending litigation between the parties.
`
`Moving to slide 4 -- slide 4 lists the grounds that are at issue in this
petition; and these grounds are based, primarily, on the Anwar patent --
`figure 1 of which is shown on slide 4. Anwar discloses a system for
`displaying and manipulating documents; and as shown in figure 1, Anwar's
`system stores a digital representation of a document that's displayed on a
`video display -- that's element 26 in figure 1.
`
`Moving to slide 5 -- in Anwar's background section, Anwar explains
`that there's a strong effort to build mobile and handheld computing devices
`that easily allow users to view documents -- email, video presentations, other
`forms of content. Anwar says that there were existing mobile computing
`systems like Palm Pilots and mobile phones that could display content for a
`user, even complex content, but these systems were limited. They were
`limited in their capacity to allow a user to manipulate the display of that
`content. These systems didn’t allow paging through different pages of a
`document; these systems did not allow selecting portions of a document.
`Anwar discloses that in column 1.
`
`And so what Anwar recognized was that there was a need to provide
`user interface tools that allowed a user to more easily manipulate and view
`content on a mobile device; and that's what Anwar's system does. It
`discloses a variety of different user interface tools that allow a user to
`manipulate content. It discloses a magnifying glass tool, for example, that
`allows zooming in on content; it discloses a ruler that allows a user to see
`scale between two documents; it discloses a handwriting tool that allows a
`user to handwrite characters that the system can sense.
`
`And moving to slide 6 -- it discloses the tool that's at issue in this
`petition -- a scrolling tool -- that allows a user to scroll through different
`pages of a document. This tool is described most fully in the first half of
column 14 and in figures 13A and 13B. Now, Anwar says that this tool
`scrolls pages of a document by applying a velocity characteristic to the
`document that's displayed. Specifically, when a user clicks on a document
`and then drags, a velocity detector process determines the velocity of the
`user's touch and moves the document along in line with that velocity. In
`figures 13A and 13B, for example, Anwar discloses that a user can click and
`drag on a document to the left, and the document will move along with the
`user's drag.
`
`Now, for documents with multiple pages, Anwar discloses that the
`system can scroll the pages across the screen at a rate that's determined by
`the velocity of the user's drag motion. And Anwar discloses that by using
`this velocity determination process, the user interface is able to present a
`more natural way of moving documents through a viewing space.
`
`Moving to slide 7 -- the petition describes the way in which Anwar's
`scrolling tool maps to claim 9; and we've color-coded this on slide 7 for
`illustration. Anwar's scrolling tool allows a user to click on a document,
`drag it at a certain speed and in a particular direction. The user can then
`release that document, and the document can continue to move at the same
`rate as the user's drag and in the established direction. We've identified that
`underlined in blue and green, on the left-hand side of this slide; and that
`corresponds to limitations 9B and 9C, which are sensing the speed and
direction of a touch and initiating scrolling in the sensed direction and at the
`sensed speed.
`
`Anwar says that the page scroll velocity can also decrease by a
`constant page inertia, and that it can stop moving when the user clicks on the
document. And this corresponds to limitations 9D and 9E -- slowing the
`speed of the scrolling motion at a predetermined rate, and terminating the
`scrolling when a stationary finger touch is sensed.
`
`So, this leaves only limitation 9A -- the sensing the duration of the
`touch limitation as potentially missing from Anwar.
`
`And moving to slide 8 -- the Narutaka reference discloses sensing the
`duration of a finger touch. Narutaka discloses a method for scrolling data on
`a touchscreen that results in a scrolling process that is simpler, faster, and
`more intuitive for a user. That's disclosed in the abstract of Narutaka; it's
also disclosed in paragraphs 8 and 25 of that reference. Narutaka's
simpler, faster, and more intuitive method is based on using a
predetermined timing -- a particular touch-duration threshold that Narutaka's
system uses to determine whether a user has actually entered a scroll
command.
`
`Moving to slide 9 -- Narutaka discloses that its system checks to see if
`a user has been continuously touching the touch panel for a predetermined
`fixed amount of time. If not -- if the touch has not lasted that predetermined
`amount of time -- Narutaka's system determines if the operator has input
`some instruction other than a scroll; but if it has lasted that predetermined
`amount of time or longer, Narutaka's system determines that the operator has
`given a scroll instruction.
`
`So, Narutaka, specifically, discloses limitation 9A -- sensing the
duration of the touch -- and it teaches, specifically, in a scrolling context
`that the duration of a touch can be used to differentiate between touches
`intended to scroll and touches intended for another command.
`
`Moving to slide 10 -- although Anwar discloses various functionality,
these various user interface tools, like the scrolling tool and magnifier tools,
`Anwar doesn't describe how its system differentiates between the touch
`commands for those different tools. Specifically, for the scrolling tool,
`Anwar doesn't explain how its system detects that a click and drag on a
`document should be interpreted as a scroll, rather than as a select or some
`other type of command. And a person of skill in the art looking at Anwar's
`system would have seen the benefit of being able to do that. In fact, they
`would have seen the need of being able to do that if Anwar's going to be
`providing all of the different functionality that it describes in a single
`system; but Anwar doesn't disclose the mechanics of how its system makes
`that determination.
`
`And staying, actually, on slide 10 -- one thing that both parties agree
`on -- and this is reflected in testimony on slide 10 -- is that when the '387
`Patent was filed, it was already well known to use touch duration to
`differentiate between different types of touch inputs. Dr. Porter -- Philips'
`expert -- agreed that using touch duration to differentiate between touch
`commands, this was already known at the time the '387 Patent was filed --
`that's in Exhibit 1027, at page 170.
`
JUDGE WORMMEESTER: Counsel, Patent Owner seems to
argue that Anwar distinguishes between different commands using the
`number of clicks or some other features. What's your response to that?
`
`MS. MCCULLOUGH: So, my response is that, that's easily
`countered by actually looking at the portions of Anwar that Philips cites to;
`and if we move a couple of slides ahead to slide 13, we've addressed these
`portions. So, as Your Honor notes, Philips' primary argument -- they don't
`disagree that together Anwar and Narutaka disclose all the different
limitations of claim 9. They disagree with the motivation to combine those
`references and they also disagree with how that combined system would
`have worked; and their main argument is, as Your Honor notes, there would
`have been no reason to go to Narutaka for the teaching of using touch
`duration to differentiate between the scroll and other inputs because Anwar
`discloses all of these different types of functionality -- all of these different
`tools that can be operated with clicks, or with clicks and drags.
`
`But if you look at the various different portions of Anwar that Philips
`cites to, none of these give a way that the system can actually differentiate
`between a scrolling input and something else. So, let's walk through those
`various points of Anwar that Philips points to.
`
The first points to column 9, lines 46 to 49 of Anwar -- and, actually,
in this entire section of the expert report that they point to, this is the only
portion of Anwar that's cited. So, let's look at that section -- and that's
`shown on slide 14 of our demonstratives.
`
`This section says nothing at all about scrolling. This section is talking
`about a graphical tool that can be moved over the screen by dragging it with
`a cursor. This section talks about a known command, but this known
`command is the command to move that magnifying glass tool. This section
`doesn't describe how the system differentiates between this tool and a
`scrolling tool. It doesn't describe how the system differentiates between
`known commands, generally. So, there's no disclosure -- either here or
`really anywhere in Anwar -- of how the system can take a click and drag on
`a document and understand that should be interpreted as a scroll, as opposed
`to a select or some other type of input.
`
`If you go to slide 15 -- Philips points to disclosure in Anwar about
using clicks, as Your Honor referenced a few minutes ago. But, again, none
`of these portions of Anwar talk about differentiating between scrolling
`commands and other commands. Two of these are actually referring to the
`disclosure of the scrolling tool itself; and, of course, that doesn't describe
`how the system is able to differentiate between a scrolling touch and other
`touches.
`
`The other two cites have nothing at all to do with scrolling. Philips
`points to column 11, lines 64 to 66. This talks about a magnifying tool.
`There's no disclosure in this passage of scrolling or how the system
`differentiates between a touch that selects a magnifying tool and moves it,
and a touch that selects a document and moves it at the speed of the touch.
`
`Philips also points to column 14, lines 55 to 58. This, again, is
`describing a page zoom detector, a separate tool from the scrolling tool; and
`there's no disclosure here of how the system can differentiate a click and
`move -- a click and drag -- that activates and moves this page detector tool
`from one that clicks and drags on a document to scroll.
`
`Moving to slide 16 -- Philips also points to figure 12, which illustrates
`various click and drag command strokes that correspond to document
`manipulation commands. Now, as an example, Anwar discloses that figure
`12F -- shown in the lower-right of slide 16 -- that this illustrates a click and
`drag to the left that causes the system to switch to the next document.
`Figure 12G, similarly, is a click and drag to the right that causes the system
`to switch to a previous document.
`
`Now, Philips argues that because Anwar discloses all of these
`different tools -- all these different gestures -- Anwar's system must,
`necessarily, possess the capability to distinguish between all these gestures;
and this really illustrates the point, exactly. Anwar does disclose a bunch of
`different tools. Figure 12 does disclose multiple different gestures; and
a person of skill in the art would see the need to be able to differentiate
`between all these different gestures. But the click and drag of Anwar's
`scrolling tool -- which, as shown here, at the lower-left of slide 16 -- Anwar
`refers to as a command stroke, just like all the command strokes of figure 12
`-- that click and drag is indistinguishable from some of these gestures that
`are shown in figure 12.
`
`And moving to slide 17 -- if you compare the gesture in 13A and 13B,
`that click and drag to the left that scrolls a document, that is the same exact
`gesture that's shown in figure 12F; but that click and drag to the left goes to
`an entirely new document. Now, Dr. Porter conceded this during his
`deposition. He conceded that what's shown in figures 13A and 13B, that's
`the exact same gesture that's described in figure 12F; and Dr. Porter said that
`if Anwar's system is going to have some way to differentiate between those
`two gestures, there's going to need to be some additional information that the
`system uses.
`
`So, a person of skill in the art would have seen the benefit in being
`able to do this; to differentiate between a touch that scrolls a document and a
`touch that does something else, like flips to a separate document entirely; but
`Anwar doesn't give us that. Anwar doesn't say, exactly, how its system
`makes that determination; it focuses on the tools -- how these different tools
`operate; and Narutaka does disclose this.
`
`It discloses that, specifically, in the scrolling context, touch duration --
`a feature that all of the experts and the named inventors agree was well
`known in this case -- that was something that could be used to differentiate
between touches intended to scroll and touches intended for something else.

JUDGE TURNER: Counsel, just a quick question before you move on.

`MS. MCCULLOUGH: Yes, Your Honor.
`
JUDGE TURNER: I want to make sure you can hear me -- had some
microphone problems earlier. I'm assuming that the Petitioner isn't taking a
`position that Anwar is not enabled, correct?
`
`MS. MCCULLOUGH: That's correct, Your Honor.
`
JUDGE TURNER: And so, one could make and use the system; but
`you're just saying that it doesn't teach this particular aspect?
`
`MS. MCCULLOUGH: So, Anwar, specifically, says that a person of
`skill in the art can use whatever methods are appropriate for the context. If
`you actually look at the claims of Anwar, those claims are generally directed
`to a velocity detector process; and it says -- I think it recites an interface
`process, generally.
`
`So, this would, certainly, be within the level of skill in the art, and
`both experts agree -- the named inventors agree in Exhibits 1027 and 1028 --
`that using touch duration was well known. This was a common way in
`which user interface designers and touch systems differentiated between
`touch gestures. So, certainly, that would have been something that a person
`of skill in the art would have known, and understood, to be an obvious
`addition to Anwar.
`
JUDGE TURNER: But if I include this process from Narutaka --
even if it is, indeed -- wouldn't Anwar be more complex? I mean, maybe there's
a tradeoff there and the benefits of making it more complex are, you know,
evident to one of ordinary skill in the art; but it seems like we are making
Anwar more complex by including this process, yes?

MS. MCCULLOUGH: So, that is one of the arguments that Philips
makes, that incorporating the ability to sense touch duration before starting a
`scroll would not only have made Anwar more complex, but would have
`outweighed the benefit that we've identified -- the benefit of allowing the
`system a way to differentiate between different types of touch input, in a
`scrolling context.
`
And I would say -- I think -- two different things. The first is that
Anwar has to have some way of making this distinction; Anwar doesn't say
how it does it -- how, specifically, when the system detects a touch and drag
on a document, the scrolling tool interprets that as a scroll
as opposed to a select, or some other process entirely.
`
`So, some -- as Dr. Porter said -- some additional information is going
`to need to be used. There's going to be some additional complexity in order
`for Anwar to provide all the different features that it discloses, that it wants
`to provide for the user. So that's, I think, the first point is that, necessarily,
`something is going to have to be added to Anwar's disclosure in order to
`make a differentiation that would allow Anwar to provide these different
features. But, I think, more fundamentally -- and this is shown on Slide 18
of our presentation -- this argument that adding the single, well-known
selection criterion of sensing touch duration before scrolling into Anwar
would have rendered the system so complex that it would be unusable,
and would outweigh the benefit we've identified -- this was really contrary to
the state of the art at the time.
`
`So, as Dr. Terveen testified, this is 2001 -- late 2001 that we're talking
`about. At this time it was conventional in user interfaces to not just have
multiple commands -- multiple functionalities -- but to have huge numbers
`of commands and functionalities. Anwar, specifically, talks about Palm
Pilots. At this time in 2001, Macs and PCs -- these computing systems
`were ubiquitous. And all of these different systems successfully integrated
`lots of different techniques in the user interfaces in a way that had become
`conventional -- in a way that people could understand.
`
`So, adding additional functionalities or capabilities into a system, that
`was common. These systems, including the Palm Pilots that Anwar,
`specifically, talks about as being known and analogous to the device that it's
`putting forward, those all had multiple, large numbers of commands and
`functionalities that they offered to a user. And I'll note that even Anwar --
`Anwar has one such interface. Anwar, specifically, invites the addition of
`even more tools. There are probably a dozen different user interface tools
`already disclosed in Anwar. Anwar says -- for example in column 8, lines
`37 to 41 -- that it would be obvious to those of ordinary skill that other types
`of tools could be provided using the systems and methods described, and all
`such tools would be understood to fall within the scope of the invention.
`
So, the single criterion -- adding this ability to sense time duration,
which was a well-known technique that was present in many touch systems
at the time this patent was filed -- this is not the type of criterion that would
`have rendered Anwar's interface so complex as to be unusable.
`
`Now, moving to slide 19 -- we get to the next reason Philips says that
`it would not have been obvious to incorporate Narutaka's teaching of sensing
time duration into Anwar's scrolling tool. And what Philips does here is
conjecture that the user would not be able to know how to operate this
`system. If you pull the ability to sense time duration into Anwar, a user
would find that very complex, hard to understand -- they wouldn't know how
`to operate the device. And this is simply inconsistent with the record.
`
Dr. Terveen testified that if users had ever used a touchscreen before --
this is on pages 191 and 192 of his transcript, shown on slide 19 -- if users
`had ever used a touch system before, they would likely have already been
`familiar with the ability of a system to differentiate gestures based on the
`duration of a touch. That was used broadly in touch systems. And then both
`experts agreed that even if a user did not have prior touch system
`experience, they would have been able to figure out how to use the system
`just like they would figure out any other new system -- through
`experimentation and product documentation. Dr. Porter agreed that a user
`could look at a manual -- product documentation -- to figure out how exactly
`it worked.
`
`Dr. Terveen said this is the type of facile addition that could be picked
`up using a tiny bit of experimentation. As Dr. Terveen said, the amount of
`experimentation the user would have to do is so little that it's almost
`unconscious. This would be a micro-experimentation. A user could do it
`quickly to understand how the gesture recognition worked in the scrolling
`tool; and after they figured it out, the ongoing cognitive burden, or difficulty
`in operating the system going forward, that would be minimal.
`
`So, here we have testimony from both experts and the named
`inventors, that using touch duration in touch systems was known; users
would likely have been familiar with this already; and it's a system that
`they could have learned easily -- either by referring to documentation or
`experimenting with the system.
`
`JUDGE WORMMEESTER: About the testimony -- it seems to be
about touch sensing in isolation. What about when you add it to a system
that already incorporates, you know, the number of clicks -- and, I think, it
was, you know, the shape of your swipe or the presence of a drag, I think, is
something else Philips had mentioned. Does that change at all? Does that
`change the complexity or how a user would learn how to use the touch
`duration feature to distinguish commands?
`
`MS. MCCULLOUGH: And this testimony from Dr. Terveen, in
`particular, is talking about the combined Narutaka/Anwar system. It is
`talking about the combination of sensing the duration of a touch into the
`scrolling tool. So, it is the context of this testimony. But I would say, more
`broadly, Your Honor, we are talking about the scrolling tool, in particular.
`That is the basis for the read on claim 9. Anwar says -- and both experts
`agreed -- that a person of skill in the art would understand that you could
`incorporate more or less functionality into a device, as appropriate. But
`here, I think, the same arguments would apply.
`
`We see this all the time when we get a new version of Windows or a
new iPhone, right. We play around with it a little bit and the different apps --
the different contexts -- we have to experiment with each of those; but the
`fact that I have 20 different applications on my smartphone doesn't mean
`that I can't figure out how to use each of those individually when I
`experiment on it, or when I read the product manual relating to that device.
`
The same was true in 2001. This was a time when we had Palm
`Pilots -- a very early ancestor, but still an ancestor to smartphones; it was the
same when we had PCs and we had Macs. Users would have been able to
`pick those up.
`
`JUDGE WORMMEESTER: Okay; thank you.
`
JUDGE TURNER: Before you go --

MS. MCCULLOUGH: Yes.
`
JUDGE TURNER: -- sort of hypothetically here -- if I keep adding
features, are we going to reach a level of complexity? I mean, is Patent
Owner just completely off base here? So, if I add, like, 10 new features, that's
`not going to, perhaps, provide undue complexity? I mean is there -- I
`understand you're saying that this wouldn't have been a bridge too far; but is
there a bridge too far, I guess, is my question.
`
`MS. MCCULLOUGH: Certainly, you could absolutely think of a
`bridge too far. I mean there's a continuum at which point it makes sense and
which point it doesn't make sense. If you go to slide 20, Dr. Terveen,
`specifically, talked about this. He, specifically, talked about what design
`considerations a person of skill in the art would need to consider when
`deciding whether it made sense to incorporate additional functionality into
`Anwar's scrolling tool, and in deciding how -- how to do that. This is
`something that Philips spends a lot of time on in its briefing. And as Dr.
`Terveen testified -- for example, at pages 108 and 109 of his transcript -- a
`person of skill in the art would understand that it depends on how you
`integrate it. You're going to need to fit additional functionality into the
`system in a way that makes sense with what's already there. That's
`something that a person with skill in the art would know, and would be
`bringing to the table in making any combination.
`
`And in this case, Dr. Terveen, specifically, identified how, exactly, he
`was going to be combining the ability to sense touch duration -- as disclosed
`in Narutaka -- into Anwar's tool. He described the Anwar device -- it
`already includes a processor; and that processor is what provides the timer
and counter functionality necessary to measure touch duration. So, Anwar
`already had hardware that was capable of measuring touch duration. Given
`that, the add would be a single line of code; it would be an if-statement --
`and this is described earlier in our demonstratives. I want to say it's in the
`140's of Dr. Terveen's declaration -- but he said it would be a single if-
`statement, interposed just before the scrolling starts, that says has this touch
`lasted a threshold amount of time; if so, I'm going to order an appropriate
`command -- for example, I'm going to order the system to start scrolling.
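
As an editor's illustration only, the single if-statement the testimony describes might look like the following sketch. All names and the threshold value here are hypothetical assumptions for readability; they are not drawn from Anwar, Narutaka, or Dr. Terveen's declaration:

```python
# Illustrative sketch of a duration check interposed just before
# scrolling starts. The function names and the 300 ms threshold are
# assumptions, not taken from the record.

SCROLL_THRESHOLD_MS = 300  # assumed "predetermined fixed amount of time"


def handle_touch_release(touch_duration_ms, drag_velocity):
    """Decide, on release, whether a click-and-drag was a scroll command."""
    if touch_duration_ms >= SCROLL_THRESHOLD_MS:
        # The touch lasted the predetermined time: interpret it as a
        # scroll and order the system to start scrolling at the sensed
        # velocity.
        return ("scroll", drag_velocity)
    # Otherwise, interpret the input as some other instruction.
    return ("other", None)
```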
`
`So, Dr. Terveen has identified, specifically, why it would make sense
`to bring this functionality into Anwar because it would allow Anwar to
`provide the various different tools that it describes; and, specifically, would
`allow Anwar to differentiate between touches intended to scroll, and the
`scrolling tool; and touches to i