Exhibit 12

U.S. Patent No. 8,526,767 (“’767 Patent”)

Invalidity Chart Based On Primary Reference U.S. Patent Application Publication No. 2007/0176906 (“WARREN”)

WARREN qualifies as prior art to U.S. Patent No. 8,526,767 (“’767 Patent”) at least under 35 U.S.C. § 102(a) and anticipates and,
alone or with other references, renders obvious one or more of claims 1-3, 6, and 11-14. To the extent WARREN does not disclose
one or more limitations of the claims, it would have been obvious to combine the teachings of WARREN with the knowledge of one
of ordinary skill in the art and with one or more of the references below to render the claims at issue in the ’767 Patent invalid.

• U.S. Patent Application Publication No. 2009/0284478 (“BALTIERRA”)
• U.S. Patent Application Publication No. 2007/0247435 (“BENKO”)
• U.S. Patent No. 8,519,965 (“CADY”)
• U.S. Patent Application Publication No. 2009/0325643 (“HAMADENE”)
• Japanese Laid-Open Patent Application Gazette H09-231004 (“KATOU”)
• U.S. Patent Application Publication No. 2009/0213084 (“KRAMER”)
• U.S. Patent Application Publication No. 2010/0020025 (“LEMORT”)
• U.S. Patent Application Publication No. 2008/0046425 (“PERSKI”)
• International Patent Publication No. WO 00/63874 (“STRINGER”)
• U.S. Patent Application Publication No. 2008/0036743 (“WESTERMAN”)
• U.S. Patent Application Publication No. 2009/0225039 (“WILLIAMSON”)
• U.S. Patent Application Publication No. 2007/0046643 (“HILLIS”) (prior art under at least 35 U.S.C. § 102(b))
• U.S. Patent Application Publication No. 2006/0066582 (“LYON”) (prior art under at least 35 U.S.C. § 102(b))
• U.S. Patent Application Publication No. 2007/0152984 (“ORDING”) (prior art under at least 35 U.S.C. § 102(a))
• U.S. Patent Application Publication No. 2007/0291009 (“WRIGHT”) (prior art under at least 35 U.S.C. § 102(a))
• Admitted Prior Art

The excerpts cited herein are exemplary. For any claim limitation, Samsung may rely on excerpts cited for any other limitation and/or
additional excerpts not set forth fully herein to the extent necessary to provide a more comprehensive explanation of a reference’s
disclosure of a limitation. Where an excerpt refers to or discusses a figure or figure items, that figure and any additional descriptions
of that figure should be understood to be incorporated by reference as if set forth fully herein. Similarly, where an excerpt cites to
particular text referring to a figure, the citation should be understood to include the figure and related figures as well.
These invalidity contentions are not an admission by Samsung that the accused products or components, including any current or past
version of these products or components, are covered by, or infringe the asserted claims, particularly when these claims are properly
construed and applied. These invalidity assertions are also not an admission that Samsung concedes or acquiesces to any claim
construction(s) implied or suggested by Plaintiff in its Complaint or the associated infringement claim charts. Nor is Samsung
asserting any claim construction positions through these charts, including whether the preamble is a limitation. Samsung also does not
concede or acquiesce that any asserted claim satisfies the requirements of 35 U.S.C. §§ 112 or 101 and submits these invalidity
contentions only to the extent Plaintiff’s assertions may be understood.
Asserted Claims and Exemplary Disclosures

Claim 1

[1.pre] A touch sensor device comprising:

WARREN, alone or in combination with the knowledge of a person of ordinary skill in the art,
discloses and/or renders obvious the touch sensor device recited in claim 1.

WARREN at Abstract:
“A proximity sensor device and method is provided that facilitates improved system usability.
Specifically, the proximity sensor device and method provide a user with the ability to easily
cause different results in an electronic system using a proximity sensor device as a user interface.
For example, it can be used to facilitate user interface navigation, such as dragging and scrolling.
As another example, it can be used to facilitate value adjustments, such as changing a device
parameter. In general, the proximity sensor device is adapted to distinguish between different
object combination motions, determine relative temporal relationships between those motions,
and generate user interface results responsive to the motions. This allows a user to selectively
generate different results using the motion of two different object combinations.”

WARREN at [0002]:
“This invention generally relates to electronic devices, and more specifically relates to proximity
sensor devices and using a touch sensor device for producing user interface inputs.”

WARREN at [0003]:
“Proximity sensor devices (also commonly called touch pads or touch sensor devices) are widely
used in a variety of electronic systems. A proximity sensor device typically includes a sensing
region, often demarked by a surface, which uses capacitive, resistive, inductive, optical, acoustic
and/or other technology to determine the presence, location and/or motion of one or more
fingers, styli, and/or other objects. The proximity sensor device, together with finger(s) and/or
other object(s), can be used to provide an input to the electronic system. For example, proximity
sensor devices are used as input devices for larger computing systems, such as those found
integral within notebook computers or peripheral to desktop computers. Proximity sensor devices
are also used in smaller systems, including: handheld systems such as personal digital assistants
(PDAs), remote controls, communication systems such as wireless telephones and text
messaging systems. Increasingly, proximity sensor devices are used in media systems, such as
CD, DVD, MP3, video or other media recorders or players.”

WARREN at [0011]:
“The present invention provides a proximity sensor device and method that facilitates improved
system usability. Specifically, the proximity sensor device and method provide a user with the
ability to easily cause different results in an electronic system using a proximity sensor device as
a user interface. For example, it can be used to facilitate user interface navigation, such as
dragging and scrolling. As another example, it can be used to facilitate value adjustments, such
as changing a device parameter. In general, the proximity sensor device is adapted to distinguish
between different object combination motions, determine relative temporal relationships between
those motions, and generate user interface results responsive to the motions. Specifically, the
proximity sensor device is adapted to indicate a first result responsive to detected motion of the
first object combination, indicate a second result responsive to detected motion of the second
object combination, the second result different from the first result, and indicate a third result
responsive to detected motion of the first object combination following the detected motion of
the second object combination, the third result different from first result and the second result.
This allows a user to selectively generate different results using the motion of two different
object combinations.”

WARREN at [0012]:
“In one specific embodiment, the proximity sensor device is implemented to facilitate continued
cursor movement with selection, commonly referred to as “dragging” using motion of different
object combinations. For example, the proximity sensor device is implemented to indicate
selection with cursor movement responsive to detected motion of two adjacent objects across the
sensing region, indicate selection without cursor movement responsive to detected motion of one
object across the sensing region when the detected motion of one object across the sensing
region followed the detected motion of two adjacent objects across the sensing region without an
intervening termination event, and indicate further selection with cursor movement responsive to
detected motion of two adjacent objects across the sensing region when the detected motion of
two adjacent objects across the sensing region followed the detected motion of one object across
the sensing region that followed the detected motion of the adjacent objects across the sensing
region. This facilitates use of the proximity sensor device by a user to indicate results such as
extended dragging, and is particularly useful for indicating continuing adjustments, for example,
to facilitate dragging an object over a large distance or scrolling through a large document. This
allows a user to continue to drag an object without requiring the user to perform more complex
gestures on the proximity sensor device or activate extra control buttons.”

WARREN at [0020]:
“The present invention provides a proximity sensor device and method that facilitates improved
system usability. Specifically, the proximity sensor device and method provide a user with the
ability to easily cause different results in an electronic system using a proximity sensor device as
a user interface. For example, it can be used to facilitate user interface navigation, such as
dragging and scrolling.”

WARREN at [0021]:
“To cause selective results the proximity sensor device is adapted to distinguish between
different object combination motions, determine relative temporal relationships between those
motions, and generate user interface results responsive to the motions. Specifically, the
proximity sensor device is adapted to indicate a first result responsive to detected motion of the
first object combination, indicate a second result responsive to detected motion of the second
object combination, the second result different from the first result, and indicate a third result
responsive to detected motion of the first object combination following the detected motion of
the second object combination, the third result different from first result and the second result.
This allows a user to selectively generate different results using the motion of two different
object combinations.”

WARREN at [0024]:
“In operation, proximity sensor device 116 suitably detects a position of stylus 114, finger or
other input object within sensing region 118, and using processor 119, provides electrical or
electronic indicia of the position to the electronic system 100. The system 100 appropriately
processes the indicia to accept inputs from the user, to move a cursor or other object on a display,
or for any other purpose.”

WARREN at [0027]:
“The processor 119, sometimes referred to as a proximity sensor processor or touch sensor
controller, is coupled to the sensor and the electronic system 100. In general, the processor 119
receives electrical signals from the sensor, processes the electrical signals, and communicates
with the electronic system. The processor 119 can perform a variety of processes on the signals
received from the sensor to implement the proximity sensor device 116. For example, the
processor 119 can select or connect individual sensor electrodes, detect presence/proximity,
calculate position or motion information, and report a position or motion when a threshold is
reached, and/or interpret and wait for a valid tap/stroke/character/button/gesture sequence before
reporting it to the electronic system 100, or indicating it to the user. The processor 119 can also
determine when certain types or combinations of object motions occur proximate the sensor. For
example, the processor 119 can distinguish between motion of a first object combination (e.g.,
one finger, a relatively small object, etc.) and motion of a second object combination (e.g., two
adjacent fingers, a relatively large object, etc.) proximate the sensing region, and can generate
the appropriate indication in response to that motion. Additionally, the processor can distinguish
the temporal relationship between motions of object combinations. For example, it can determine
when motion of the first object combination has followed motion of the second object
combination, and provide a different result responsive to the motions and their temporal
relationship.”

WARREN at [0032]:
“It should be noted that although the various embodiments described herein are referred to as
“proximity sensor devices”, “touch sensor devices”, “proximity sensors”, or “touch pads”, these
terms as used herein are intended to encompass not only conventional proximity sensor devices,
but also a broad range of equivalent devices that are capable of detecting the position of a one or
more fingers, pointers, styli and/or other objects. Such devices may include, without limitation,
touch screens, touch pads, touch tablets, biometric authentication devices, handwriting or
character recognition devices, and the like. Similarly, the terms “position” or “object position” as
used herein are intended to broadly encompass absolute and relative positional information, and
also other types of spatial-domain information such as velocity, acceleration, and the like,
including measurement of motion in one or more directions. Various forms of positional
information may also include time history components, as in the case of gesture recognition and
the like. Accordingly, proximity sensor devices can appropriately detect more than the mere
presence or absence of an object and may encompass a broad range of equivalents.”

WARREN at [0033]-[0034]:
“In the embodiments of the present invention, the proximity sensor device 116 is adapted to
provide the ability for a user to easily cause different results in an electronic system using a
proximity sensor device 116 as part of a user interface. For example, it can be used to facilitate
user interface navigation, such as cursor control, dragging and scrolling. As another example, it
can be used to facilitate value adjustments, such as changing a device parameter. To cause
selective results the proximity sensor device 116 is adapted to distinguish between different
object combination motions, determine relative temporal relationships between those motions,
and generate user interface results responsive to the motions. This allows a user to selectively
generate different results using the motion of two different object combinations.
In one specific embodiment, the proximity sensor device 116 is implemented to facilitate
continued cursor movement with selection, a type of “dragging,” using motion of different object
combinations. For example, the proximity sensor device 116 can be implemented to indicate
selection with cursor movement (e.g., dragging) responsive to detected motion of two adjacent
objects across the sensing region, indicate selection without cursor movement responsive to
detected motion of one object across the sensing region when the detected motion of one object
across the sensing region followed the detected motion of two adjacent objects across the sensing
region without an intervening termination event, and indicate further selection with cursor
movement responsive to detected motion of two adjacent objects across the sensing region when
the detected motion of two adjacent objects across the sensing region followed the detected
motion of one object across the sensing region that followed the detected motion of the adjacent
objects across the sensing region. This facilitates use of the proximity sensor device 116 by a
user to indicate results such as extended dragging over long distances. Thus, the proximity sensor
device 116 allows a user to continue to drag an object without requiring the user to perform more
complex gestures on the proximity sensor device or activate extra control buttons.”

WARREN at [0036]-[0037]:
“As described above, in the embodiments of the invention the proximity sensor device is adapted
to distinguish between different object combination motions, determine relative temporal
relationships between those motions, and generate user interface results responsive to the
motions. The different object combinations can be distinguished based on a variety of different
parameters, such as object type, object size, object proximity, pressure on the sensing region, and
the number of objects proximate the sensing region, to list several non-limiting examples.
As one specific example, the proximity sensor device is adapted to distinguish the number of
objects proximate sensing region. Turning now to FIGS. 2-4, side views of exemplary object
combinations are illustrated. Specifically, FIGS. 2-4 illustrate an embodiment where the
proximity sensor device is adapted to distinguish between the number of fingers or other objects
proximate the sensing region. In FIG. 2, the first object combination 202 comprises one finger
204 proximate the sensing region 200. In FIG. 3, a second object combination 212 comprises two
fingers 214 and 216 proximate the sensing region 200. Finally, in FIG. 4, the third object
combination 222 comprises three fingers 224, 226 and 228. In this embodiment, the object
position detector detects the position of the fingers proximate the sensing region, and determines
the number of fingers present. Thus, the object position detector can determine if one, two or
more fingers are proximate the touch sensor and generate a result responsive to the number of
fingers and the motions of the fingers. Of course, the system could also be adapted to distinguish
between any other quantities of objects, such as between three and four fingers, etc. It should be
noted that such a system allows the user to easily change the object combination presented to the
proximity sensor by selectively placing and lifting any combination of different fingers on the
sensing area.”

WARREN at [0041]:
“As a further variation on these embodiments, the proximity sensor device can be adapted to
determine the proximity of objects in distinguishing between object combinations. For example,
the proximity sensor device can determine if two objects are within a specified proximity (e.g.,
substantially adjacent) and distinguish the object combinations based on the number and/or
proximity.”

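The classification just quoted in [0036]-[0037] and [0041] can be illustrated in a few lines. The
following Python sketch is illustrative only: the function name, the position format, and the
adjacency threshold are assumptions, not disclosures of WARREN.

    # Hypothetical sketch: distinguish WARREN's object combinations (FIGS. 2-4)
    # by the number of objects proximate the sensing region, with the [0041]
    # variation that two objects must also be within a specified proximity
    # ("substantially adjacent"). The threshold value is an assumption.
    ADJACENCY_MM = 25.0

    def classify_combination(positions_mm):
        """Return 0 for no objects, else combination number 1, 2, or 3 for one,
        two adjacent, or three-plus objects at the given positions."""
        count = len(positions_mm)
        if count == 2 and abs(positions_mm[0] - positions_mm[1]) > ADJACENCY_MM:
            return 1  # design choice here: two non-adjacent objects fall back to the one-object case
        return min(count, 3)

For example, classify_combination([10.0, 20.0]) returns 2 (two adjacent fingers, FIG. 3), while
classify_combination([10.0, 90.0]) returns 1 under the assumed threshold.
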
WARREN at [0042]-[0049]:
“Turning now to FIG. 5, a state diagram 500 is illustrated schematically. The state diagram 500
illustrates an implementation of a proximity sensor device adapted to distinguish between
different object combination motions, determine relative temporal relationships between those
motions, and generate user interface results responsive to the motions. As such, the state diagram
500 can be implemented as part of a proximity sensor program, or as part of complete proximity
sensor device. The state diagram 500 includes four states, an IDLE state 501, a FIRST RESULT
state 502, a SECOND RESULT state 503, and a THIRD RESULT state 504. Each of these
various states corresponds to a user interface action being performed in response to the various
motions. Specifically, when in the FIRST RESULT state 502 a first user interface result is
indicated responsive to object motion, when in the SECOND RESULT state 503 a second user
interface result is indicated responsive to object motion, and when in the THIRD RESULT state
504, a third user interface result is indicated responsive to object motion. The IDLE state 501
provides an idle result.
These indicated results can be any type of user interface action or adjustment. For example, the
first result can comprise cursor movement without selection, the second result can comprise
selection with cursor movement, and third result can comprise selection without cursor
movement (e.g., with cursor movement inhibited). In such an implementation, the first result is
providing “pointing”, the second result is providing “dragging”, the third result is providing
continued selection without cursor motion, and the idle result occurring after a termination event.
…
Transitions between states are determined by detected object combination motion. For example,
when in the IDLE state 501, detected first object combination motion (1) causes a transition to
the FIRST RESULT state 502. Conversely, when in the IDLE state 501, detected second object
combination motion (2) causes a transition to the SECOND RESULT state 503. When in the
IDLE state 501, the detection of no object presence (0) causes the process to stay in the IDLE
state 501.
As a second example, when in the FIRST RESULT state 502, detected second object
combination motion (2) causes a transition to the SECOND RESULT state 503. Conversely,
when in the FIRST RESULT state 502, the detection of no object presence (0) causes a transition
to the IDLE state 501. Finally, when in the FIRST RESULT state 502, the detection of first
object combination motion (1) causes the process to stay in the FIRST RESULT state 502.
It should be emphasized that the process illustrated in state diagram 500 can result in two
different results responsive to the first object combination motion (1). Specifically, when in the
FIRST RESULT state 502, first object combination motion (1) causes the first result.
Conversely, when in the THIRD RESULT state 504, first object combination motion (1) causes
the third result. This difference in result is determined in part by the relative temporal
relationships between those motions. Specifically, if the first object combination motion (1) is
following a termination event (i.e., removing the objects from the sensing region) the process
transitions to the FIRST RESULT state 502 and the first result is indicated. If instead, the first
object combination motion (1) is following a second result (i.e., SECOND RESULT state 503
without an intervening termination event) the process transitions to the THIRD RESULT state
504 and the third result is indicated.
However, if in the SECOND RESULT state 503, and no object presence (0) for a period of time
is detected, the no object presence (0) causes a transition to the IDLE state 501. Thus, no object
presence of a period of time serves as a termination event, causing the next detection of a first
object combination motion (1) to instigate a transition to FIRST RESULT state 502 instead of
the THIRD RESULT state 504 that would have occurred without the termination event.”

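The transitions quoted from [0042]-[0049] reduce to a small transition function. The Python
sketch below is a minimal illustration, not WARREN's implementation: the names are invented,
and it simplifies the termination event by treating any report of no object presence as
terminating, where WARREN specifies no object presence for a period of time.

    # Sketch of the FIG. 5 state diagram ([0042]-[0049]). The per-step input is
    # the detected object combination motion, using WARREN's labels:
    # 0 = no object presence, 1 = first object combination motion,
    # 2 = second object combination motion.
    IDLE, FIRST, SECOND, THIRD = "IDLE", "FIRST_RESULT", "SECOND_RESULT", "THIRD_RESULT"

    def next_state(state, motion):
        if motion == 0:
            return IDLE    # termination event: all states return to IDLE 501
        if motion == 2:
            return SECOND  # second-combination motion indicates the second result
        # motion == 1: the temporal relationship decides the result. One-object
        # motion following the second result without an intervening termination
        # event indicates the third result; otherwise, the first result.
        return THIRD if state in (SECOND, THIRD) else FIRST

The point the quoted passage emphasizes falls out directly: the same input (1) yields
FIRST_RESULT when it follows a termination event but THIRD_RESULT when it follows the
SECOND RESULT state.
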
WARREN at [0052]-[0056]:
“The system illustrated in FIGS. 5-13 operates as follows. After a period of inactivity, the
proximity sensor device 600 would be in the IDLE state 501. When a user causes motion of a
one object combination 604 (e.g., the first object combination), the process transitions to the
FIRST RESULT state 502, and a cursor motion (a first result) is generated in response to the
motion of the single object. This is illustrated in FIGS. 6 and 7, where the motion of the one
object combination 604 across the sensing region 602 causes the cursor 702 to move across the
program interface 700 and towards the icon 704.
Upon moving the cursor 702 to the icon 704, the user can select the icon 704 and commence
dragging the icon 704 by placing a second object (e.g., a second finger) on the sensing region
602. The resulting detected motion of the two object combination 606 causes the process to
transition to the SECOND RESULT state 503, and cursor motion with selection, referred to as
“dragging” (a second result) is generated in response to the motion of the two object combination
606. This is illustrated in FIGS. 8 and 9, where the motion of the two object combination 606
across the sensing region 602 causes the cursor 702 and the selected icon 704 to be dragged
across the program interface 700. This process continues until the object motion across the
sensing region 602 is stopped (e.g., because the object motion has reached the edge of the
sensing region 602 or simply by user choice).
When continued dragging is desired, the user can reposition the objects on the sensing region
602 without losing selection of the icon 704. Specifically, the user can remove one object from
the sensing region 602, and move the remaining one object combination 604 back across the
sensing region 602. This allows the user to reposition the objects without losing selection.
Specifically, the detected motion of the one object combination 604 causes the process to
transition to the THIRD RESULT state 504, and continued selection without cursor motion (a
third result) is generated in response to the motion of the one object combination 604. This is
illustrated in FIGS. 10 and 11, where the motion of the one object combination 604 across the
sensing region 602 allows object repositioning while the icon 704 remains selected without
causing cursor motion. This process continues until the user reaches the desired location on the
sensing region.
When the user reaches the desired object location and intends to continue dragging with cursor
motion, the user can return the second object to the sensing region 602 and again moves the two
object combination 606 across the sensing region. The detected motion of the two object
combination 606 causes the process to again transition to the SECOND RESULT state 503, and
dragging is continued in response to the motion of the two object combination 606. This is
illustrated in FIGS. 12 and 13, where the motion of the two object combination 606 across the
sensing region 602 again causes the cursor 702 and the selected icon 704 to be dragged across
the program interface 700 until the icon 704 reaches the desired location.
This process can be repeated as often as needed by the user. When dragging over a very large
distance the process can be repeated many times, all without selection of the icon 704 being lost.
However, if the user wishes to return to cursor motion without selection at any time, a termination
event can be triggered, causing the process to return to the IDLE state 501. Again, in one
embodiment the termination event can be triggered by lifting all objects from the sensing region
for a period of time. The detection of no object presence for the period of time causes the process
to return to the IDLE state 501, and the next detected motion of one object 604 will again trigger
cursor motion, the first result. It should also be emphasized that an actual separate IDLE state
501 is not required, and that process can be implemented by transitioning directly from the
SECOND RESULT state 503 to the FIRST RESULT state 502 when a termination event occurs
while in the SECOND RESULT state 503, and likewise transitioning directly from the THIRD
RESULT state 504 to the FIRST RESULT state 502 when a termination event occurs while in
the THIRD RESULT state 504.”

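The sequence just quoted can be traced through the next_state() sketch given after
[0042]-[0049] above; the motion sequence below is a hypothetical reading of FIGS. 6-13, not
data taken from WARREN.

    # Hypothetical trace of the extended-drag sequence of [0052]-[0056]:
    # point with one finger, drag with two, lift one finger to reposition
    # without losing selection, drag again, then lift both fingers.
    state = IDLE
    for motion in (1, 2, 1, 2, 0):
        state = next_state(state, motion)
        print(motion, "->", state)
    # Output:
    # 1 -> FIRST_RESULT    pointing: cursor motion without selection (FIGS. 6-7)
    # 2 -> SECOND_RESULT   dragging: selection with cursor motion (FIGS. 8-9)
    # 1 -> THIRD_RESULT    repositioning: selection held, cursor motion inhibited (FIGS. 10-11)
    # 2 -> SECOND_RESULT   dragging resumes; selection never lost (FIGS. 12-13)
    # 0 -> IDLE            termination event
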
WARREN at Claims 1-13.

WARREN at FIGS. 2-13.

[1.a] a sensor having a sensitive area extending in at least one dimension and arranged to output
sense signals responsive to proximity of an object to the sensitive area;

WARREN, alone or in combination with the knowledge of a person of ordinary skill in the art,
discloses and/or renders obvious “a sensor having a sensitive area extending in at least one-
dimension and arranged to output sense signals responsive to proximity of an object to the
sensitive area.”

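Purely to illustrate the claim language, this limitation can be modeled as a sensitive area that
emits one sense signal per proximate object. WARREN contains no code; every name below,
and the inverse-distance signal model, is a hypothetical stand-in.

    # Hypothetical model of limitation [1.a]: a sensitive area extending in one
    # dimension that outputs sense signals responsive to object proximity.
    from dataclasses import dataclass

    @dataclass
    class SenseSignal:
        position_mm: float  # where along the sensitive area the object is sensed
        strength: float     # larger when the object is nearer the surface

    class ProximitySensor:
        def __init__(self, length_mm):
            self.length_mm = length_mm  # extent of the one-dimensional sensitive area

        def read(self, objects):
            """objects: (position_mm, distance_mm) pairs for objects near the area.
            Returns one sense signal per object; strength grows as distance shrinks."""
            return [SenseSignal(position_mm=p, strength=1.0 / (1.0 + d))
                    for p, d in objects]
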
WARREN at Abstract:
“A proximity sensor device and method is provided that facilitates improved system usability.
Specifically, the proximity sensor device and method provide a user with the ability to easily
cause different results in an electronic system using a proximity sensor device as a user interface.
For example, it can be used to facilitate user interface navigation, such as dragging and scrolling.
As another example, it can be used to facilitate value adjustments, such as changing a device
parameter. In general, the proximity sensor device is adapted to distinguish between different
object combination motions, determine relative temporal relationships between those motions,
and generate user interface results responsive to the motions. This allows a user to selectively
generate different results using the motion of two different object combinations.”

WARREN at [0003]:
“Proximity sensor devices (also commonly called touch pads or touch sensor devices) are widely
used in a variety of electronic systems. A proximity sensor device typically includes a sensing
region, often demarked by a surface, which uses capacitive, resistive, inductive, optical, acoustic
and/or other technology to determine the presence, location and/or motion of one or more
fingers, styli, and/or other objects. The proximity sensor device, together with finger(s) and/or
other object(s), can be used to provide an input to the electronic system. For example, proximity
sensor devices are used as input devices for larger computing systems, such as those found
integral within notebook computers or peripheral to desktop computers. Proximity sensor devices
are also used in smaller systems, including: handheld systems such as personal digital assistants
(PDAs), remote controls, communication systems such as wireless telephones and text
messaging systems. Increasingly, proximity sensor devices are used in media systems, such as
CD, DVD, MP3, video or other media recorders or players.”

WARREN at [0011]:
“The present invention provides a proximity sensor device and method that facilitates improved
system usability. Specifically, the proximity sensor device and method provide a user with the
ability to easily cause different results in an electronic system using a proximity sensor device as
a user interface. For example, it can be used to facilitate user interface navigation, such as
dragging and scrolling. As another example, it can be used to facilitate value adjustments, such
as changing a device parameter. In general, the proximity sensor device is adapted to distinguish
between different object combination motions, determine relative temporal relationships between
those motions, and generate user interface results responsive to the motions. Specifically, the
proximity sensor device is adapted to indicate a first result responsive to detected motion of the
first object combination, indicate a second result responsive to detected motion of the second
object combination, the second result different from the first result, and indicate a third result
responsive to detected motion of the first object combination following the detected motion of
the second object combination, the third result different from first result and the second result.
This allows a user to selectively generate different results using the motion of two different
object combinations.”

WARREN at [0012]:
“In one specific embodiment, the proximity sensor device is implemented to facilitate continued
cursor movement with selection, commonly referred to as “dragging” using motion of different
object combinations. For example, the proximity sensor device is implemented to indicate
selection with cursor movement responsive to detected motion of two adjacent objects across the
sensing region, indicate selection without cursor movement responsive to detected motion of one
object across the sensing region when the detected motion of one object across the sensing
region followed the detected motion of two adjacent objects across the sensing region without an
intervening termination event, and indicate further selection with cursor movement responsive to
detected motion of two adjacent objects across the sensing region when the detected motion of
two adjacent objects across the sensing region followed the detected motion of one object across
the sensing region that followed the detected motion of the adjacent objects across the sensing
region. This facilitates use of the proximity sensor device by a user to indicate results such as
extended dragging, and is particularly useful for indicating continuing adjustments, for example,
to facilitate dragging an object over a large distance or scrolling through a large document. This
allows a user to continue to drag an object without requiring the user to perform more complex
gestures on the proximity sensor device or activate extra control buttons.”
