(19) United States

(12) Patent Application Publication    (10) Pub. No.: US 2007/0176906 A1
     WARREN                            (43) Pub. Date: Aug. 2, 2007

(54) PROXIMITY SENSOR AND METHOD FOR INDICATING EXTENDED INTERFACE RESULTS

(75) Inventor: Andrew I. WARREN, San Jose, CA (US)

Correspondence Address:
INGRASSIA, FISHER & LORENZ, P.C.
7150 E. CAMELBACK ROAD, SUITE 325
SCOTTSDALE, AZ 85251

(73) Assignee: SYNAPTICS INCORPORATED, Santa Clara, CA (US)

(21) Appl. No.: 11/613,063

(22) Filed: Dec. 19, 2006

Related U.S. Application Data

(60) Provisional application No. 60/764,406, filed on Feb. 1, 2006.

Publication Classification

(51) Int. Cl.
     G06F 3/041 (2006.01)

(52) U.S. Cl. ..................................... 345/173

(57) ABSTRACT
A proximity sensor device and method are provided that facilitate improved system usability. Specifically, the proximity sensor device and method provide a user with the ability to easily cause different results in an electronic system using a proximity sensor device as a user interface. For example, it can be used to facilitate user interface navigation, such as dragging and scrolling. As another example, it can be used to facilitate value adjustments, such as changing a device parameter. In general, the proximity sensor device is adapted to distinguish between different object combination motions, determine relative temporal relationships between those motions, and generate user interface results responsive to the motions. This allows a user to selectively generate different results using the motion of two different object combinations.
[Front-page representative drawing: excerpt of the FIG. 5 state diagram, with states 501, 502 and 504 and the legend: 0 = no object presence, 1 = first object combination motion, 2 = second object combination motion]
[Sheet 1 of 7: FIG. 1, block diagram of an exemplary system including a proximity sensor device]

[Sheet 2 of 7: FIGS. 2-4, side views of exemplary object combinations in the sensing region 200]

[Sheet 3 of 7: FIG. 5, state diagram with legend: 0 = no object presence, 1 = first object combination motion, 2 = second object combination motion]

[Sheet 4 of 7: FIGS. 6 and 7, a proximity sensor device with object combination motion and the resulting program interface]

[Sheet 5 of 7: FIGS. 8 and 9, a proximity sensor device with object combination motion and the resulting program interface]

[Sheet 6 of 7: FIGS. 10 and 11, a proximity sensor device with object combination motion and the resulting program interface]

[Sheet 7 of 7: FIGS. 12 and 13, a proximity sensor device with object combination motion and the resulting program interface]
`

PROXIMITY SENSOR AND METHOD FOR INDICATING EXTENDED INTERFACE RESULTS

PRIORITY DATA

[0001] This application claims priority of U.S. Provisional Patent Application Ser. No. 60/764,406, filed on Feb. 1, 2006, which is hereby incorporated herein by reference.
FIELD OF THE INVENTION

[0002] This invention generally relates to electronic devices, and more specifically relates to proximity sensor devices and using a touch sensor device for producing user interface inputs.
BACKGROUND OF THE INVENTION

[0003] Proximity sensor devices (also commonly called touch pads or touch sensor devices) are widely used in a variety of electronic systems. A proximity sensor device typically includes a sensing region, often demarked by a surface, which uses capacitive, resistive, inductive, optical, acoustic and/or other technology to determine the presence, location and/or motion of one or more fingers, styli, and/or other objects. The proximity sensor device, together with finger(s) and/or other object(s), can be used to provide an input to the electronic system. For example, proximity sensor devices are used as input devices for larger computing systems, such as those found integral within notebook computers or peripheral to desktop computers. Proximity sensor devices are also used in smaller systems, including handheld systems such as personal digital assistants (PDAs), remote controls, and communication systems such as wireless telephones and text messaging systems. Increasingly, proximity sensor devices are used in media systems, such as CD, DVD, MP3, video or other media recorders or players.
[0004] Many electronic devices include a user interface, or UI, and an input device for interacting with the UI (e.g., interface navigation). A typical UI includes a screen for displaying graphical and/or textual elements. The increasing use of this type of UI has led to a rising demand for proximity sensor devices as pointing devices. In these applications the proximity sensor device can function as a value adjustment device, cursor control device, selection device, scrolling device, graphics/character/handwriting input device, menu navigation device, gaming input device, button input device, keyboard and/or other input device.
[0005] One issue with past touch sensor devices has been enabling dragging, scrolling, and similar functions with gestures. Specifically, many users cite difficulty in using touch sensor devices for "dragging". In general, "dragging" comprises continued selection, optionally with motion. For example, dragging occurs when an icon is selected and moved using a mouse. Another example is when a portion of text is selected and highlighted. A third example is when a scrollbar thumb on a scrollbar is selected and moved to scroll through text. In all three examples, dragging is accomplished with continued selection (e.g., pressing a button) combined with motion (e.g., cursor motion). Continued selection with zero motion, often referred to as a "press gesture," may be viewed either as a special case of the drag gesture or as a distinct gesture.
[0006] With a mouse, dragging is simple: one moves the cursor to a start point, presses and holds a mouse button, then moves the cursor to an end point, optionally "rowing" (lifting the mouse when it reaches the edge of the mouse pad and setting it back down away from the edge) to drag for long distances, and finally releases the mouse button to stop dragging. With a traditional touch sensor device, dragging is much more awkward, particularly for dragging long distances. Dragging for long distances is typically more awkward on touch sensor devices because it can require "rowing", e.g., lifting the finger when the edge of the touch sensor is reached to reposition the finger on the touch sensor. Specifically, while some previous techniques have facilitated the use of two fingers to initiate dragging, dragging ends when both the fingers are removed, and these techniques have failed to provide any mechanism for maintaining dragging selection without cursor movement. Thus, in these and other systems, maintaining dragging with a touch sensor device requires simultaneously pressing another input device (e.g., a button) while moving the cursor with the touch sensor device. Pressing a button while moving the cursor using the touch sensor device can be difficult for some users.
[0007] The motion in a dragging action often consists of a straight line from a start point to an end point, but some uses of dragging involve other kinds of motions. An effective dragging gesture for touch sensor devices must accommodate all these usage patterns. Some prior art solutions, such as edge motion, help with simple linear drags but are less helpful with the kinds of drag motions used, for example, when operating a scroll bar.
[0008] Some current techniques facilitate touch sensor device dragging without requiring input to other buttons. For example, the current market standard is a gesture called "tap-and-a-half" dragging. To utilize tap-and-a-half dragging, once the user has ensured that the cursor or other indicator is at a desired start point, the user lifts any finger that is on the sensitive surface of the touch sensor device, taps once, and quickly places the finger back down on the sensitive surface. This gesture activates dragging. The user then moves the cursor by moving the finger to an end point and then lifts the finger to stop dragging. Typically the same finger is used for the entire dragging motion, but different fingers or objects may be used for different portions of the gesture.
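For concreteness, the following sketch shows one way a tap-and-a-half sequence might be recognized from touch-down and lift-off events. It is an illustration only, not any vendor's implementation; the timing thresholds, state names, and event handlers are all assumptions.

import time

# Illustrative thresholds; real devices tune these empirically (assumed values).
TAP_MAX_S = 0.3   # longest touch that still counts as the initial tap
GAP_MAX_S = 0.3   # longest lift between the tap and the re-touch

class TapAndAHalf:
    """Toy recognizer: touch, quick lift, quick re-touch starts a drag;
    the drag then ends when the finger lifts again."""

    def __init__(self):
        self.state = "idle"      # idle -> touching -> tapped -> dragging
        self.t_down = 0.0
        self.t_up = 0.0
        self.dragging = False

    def on_touch_down(self, t=None):
        t = time.monotonic() if t is None else t
        if self.state == "tapped" and (t - self.t_up) <= GAP_MAX_S:
            self.state = "dragging"   # gesture complete: drag is active
            self.dragging = True
        else:
            self.state = "touching"
        self.t_down = t

    def on_lift_off(self, t=None):
        t = time.monotonic() if t is None else t
        if self.state == "touching" and (t - self.t_down) <= TAP_MAX_S:
            self.state = "tapped"     # quick tap seen; await the re-touch
            self.t_up = t
        else:
            self.state = "idle"       # any other lift ends the gesture/drag
            self.dragging = False

Note how the drag flag is cleared on every lift-off: this is exactly why, as the following paragraph explains, "rowing" during a long drag forces the whole sequence to be restarted.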
[0009] While the use of the basic tap-and-a-half gesture to initiate dragging is an improvement, its efficiency in facilitating dragging over long distances is limited. Again, when dragging for long distances the user can be required to "row", e.g., lift the finger when the edge of the touch sensor is reached to reposition the finger on the touch sensor. When the user lifts the finger to "row" the cursor, the selection will be lost and the drag will end, and typically must be restarted with another tap-and-a-half gesture, greatly complicating the gestures required to perform a long distance drag. Many solutions have been used and proposed to enhance the tap-and-a-half gesture for long distance drags. For example, U.S. Pat. No. 5,880,411 discloses locking drags, extended drags, and edge motion. However, all of these solutions, and indeed the tap-and-a-half gesture itself, have the disadvantage that performing them involves distinctly different and complicated hand and finger actions than are used with a mouse, hence making dragging difficult for users familiar with mice.
[0010] Thus, while many different techniques have been used to facilitate dragging, there remains a continuing need for improvements in device usability. Particularly, there is a continuing need for improved techniques for facilitating dragging with proximity sensor devices.
BRIEF SUMMARY OF THE INVENTION

[0011] The present invention provides a proximity sensor device and method that facilitates improved system usability. Specifically, the proximity sensor device and method provide a user with the ability to easily cause different results in an electronic system using a proximity sensor device as a user interface. For example, it can be used to facilitate user interface navigation, such as dragging and scrolling. As another example, it can be used to facilitate value adjustments, such as changing a device parameter. In general, the proximity sensor device is adapted to distinguish between different object combination motions, determine relative temporal relationships between those motions, and generate user interface results responsive to the motions. Specifically, the proximity sensor device is adapted to indicate a first result responsive to detected motion of the first object combination, indicate a second result responsive to detected motion of the second object combination, the second result different from the first result, and indicate a third result responsive to detected motion of the first object combination following the detected motion of the second object combination, the third result different from the first result and the second result. This allows a user to selectively generate different results using the motion of two different object combinations.
[0012] In one specific embodiment, the proximity sensor device is implemented to facilitate continued cursor movement with selection, commonly referred to as "dragging," using motion of different object combinations. For example, the proximity sensor device is implemented to indicate selection with cursor movement responsive to detected motion of two adjacent objects across the sensing region, indicate selection without cursor movement responsive to detected motion of one object across the sensing region when the detected motion of one object across the sensing region followed the detected motion of two adjacent objects across the sensing region without an intervening termination event, and indicate further selection with cursor movement responsive to detected motion of two adjacent objects across the sensing region when the detected motion of two adjacent objects across the sensing region followed the detected motion of one object across the sensing region that followed the detected motion of the adjacent objects across the sensing region. This facilitates use of the proximity sensor device by a user to indicate results such as extended dragging, and is particularly useful for indicating continuing adjustments, for example, to facilitate dragging an object over a large distance or scrolling through a large document. This allows a user to continue to drag an object without requiring the user to perform more complex gestures on the proximity sensor device or activate extra control buttons.
BRIEF DESCRIPTION OF DRAWINGS

[0013] The preferred exemplary embodiment of the present invention will hereinafter be described in conjunction with the appended drawings, where like designations denote like elements, and:

[0014] FIG. 1 is a block diagram of an exemplary system that includes a proximity sensor device in accordance with an embodiment of the invention;

[0015] FIGS. 2-4 are side views of exemplary object combinations in the sensing area of a proximity sensor device in accordance with embodiments of the invention;

[0016] FIG. 5 is a state diagram of a proximity sensor device process in accordance with embodiments of the invention;

[0017] FIGS. 6, 8, 10 and 12 are schematic views of a proximity sensor device with object combination motions in accordance with embodiments of the invention; and

[0018] FIGS. 7, 9, 11 and 13 are schematic views of results on a program interface in accordance with embodiments of the invention.
DETAILED DESCRIPTION OF THE INVENTION

[0019] The following detailed description is merely exemplary in nature and is not intended to limit the invention or the application and uses of the invention. Furthermore, there is no intention to be bound by any expressed or implied theory presented in the preceding technical field, background, brief summary or the following detailed description.

[0020] The present invention provides a proximity sensor device and method that facilitates improved system usability. Specifically, the proximity sensor device and method provide a user with the ability to easily cause different results in an electronic system using a proximity sensor device as a user interface. For example, it can be used to facilitate user interface navigation, such as dragging and scrolling.
[0021] To cause selective results, the proximity sensor device is adapted to distinguish between different object combination motions, determine relative temporal relationships between those motions, and generate user interface results responsive to the motions. Specifically, the proximity sensor device is adapted to indicate a first result responsive to detected motion of the first object combination, indicate a second result responsive to detected motion of the second object combination, the second result different from the first result, and indicate a third result responsive to detected motion of the first object combination following the detected motion of the second object combination, the third result different from the first result and the second result. This allows a user to selectively generate different results using the motion of two different object combinations.
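As a hedged illustration of this three-result scheme (a sketch, not code from the patent), the dispatcher below keeps just enough history to tell whether first-combination motion follows second-combination motion. The result names and the termination handling are assumptions chosen for clarity.

class ResultDispatcher:
    """Maps object-combination motions to the three results described
    above, using the relative temporal relationship of the motions."""

    def __init__(self):
        self.prev_combo = None   # last combination whose motion was seen

    def on_motion(self, combo):
        """combo is "first" or "second"; returns the result to indicate."""
        if combo == "second":
            result = "second_result"
        elif self.prev_combo == "second":
            result = "third_result"   # first-combo motion after second-combo
        else:
            result = "first_result"
        self.prev_combo = combo
        return result

    def on_termination_event(self):
        self.prev_combo = None   # assumed: e.g., all objects lifted

The single prev_combo field is the minimal "temporal relationship" state the scheme needs; richer embodiments could of course keep a longer motion history.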
[0022] Turning now to the drawing figures, FIG. 1 is a block diagram of an exemplary electronic system 100 that is coupled to a proximity sensor device 116. Electronic system 100 is meant to represent any type of personal computer, portable computer, workstation, personal digital assistant, video game player, communication device (including wireless phones and messaging devices), media device, including recorders and players (including televisions, cable boxes, music players, and video players), or other device capable of accepting input from a user and of processing information. Accordingly, the various embodiments of system 100 may include any type of processor, memory or display. Additionally, the elements of system 100 may communicate via a bus, network or other wired or wireless interconnection. The proximity sensor device 116 can be connected to the system 100 through any type of interface or connection, including I2C, SPI, PS/2, Universal Serial Bus (USB), Bluetooth, RF, IrDA, or any other type of wired or wireless connection, to list several non-limiting examples.
[0023] Proximity sensor device 116 includes a processor 119 and a sensing region 118. Proximity sensor device 116 is sensitive to the position of a stylus 114, finger and/or other input object within the sensing region 118. "Sensing region" 118 as used herein is intended to broadly encompass any space above, around, in and/or near the proximity sensor device 116 wherein the sensor of the touchpad is able to detect a position of the object. In a conventional embodiment, sensing region 118 extends from the surface of the sensor in one or more directions for a distance into space until signal-to-noise ratios prevent object detection. This distance may be on the order of less than a millimeter, millimeters, centimeters, or more, and may vary significantly with the type of position sensing technology used and the accuracy desired. Accordingly, the planarity, size, shape and exact locations of the particular sensing regions 118 will vary widely from embodiment to embodiment.
[0024] In operation, proximity sensor device 116 suitably detects a position of stylus 114, finger or other input object within sensing region 118, and using processor 119, provides electrical or electronic indicia of the position to the electronic system 100. The system 100 appropriately processes the indicia to accept inputs from the user, to move a cursor or other object on a display, or for any other purpose.
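The patent does not define the format of these indicia; the sketch below assumes a simple absolute-position report and shows how a host might turn consecutive reports into relative cursor motion, as a pointing-device driver typically does. The field names are hypothetical.

from dataclasses import dataclass

@dataclass
class PositionReport:
    present: bool    # whether any object is in the sensing region
    x: float         # reported horizontal position
    y: float         # reported vertical position

def cursor_delta(prev, curr):
    """Host-side sketch: derive relative cursor motion from two
    consecutive reports; no motion unless an object is present in both."""
    if prev.present and curr.present:
        return (curr.x - prev.x, curr.y - prev.y)
    return (0.0, 0.0)

# Example: an object moved two units to the right between reports.
print(cursor_delta(PositionReport(True, 10.0, 5.0),
                   PositionReport(True, 12.0, 5.0)))  # (2.0, 0.0)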
[0025] The proximity sensor device 116 can use a variety of techniques for detecting the presence of an object. As several non-limiting examples, the proximity sensor device 116 can use capacitive, resistive, inductive, surface acoustic wave, or optical techniques. In a common capacitive implementation of a touch sensor device, a voltage is typically applied to create an electric field across a sensing surface. A capacitive proximity sensor device 116 would then detect the position of an object by detecting changes in capacitance caused by the changes in the electric field due to the object. Likewise, in a common resistive implementation, a flexible top layer and a bottom layer are separated by insulating elements, and a voltage gradient is created across the layers. Pressing the flexible top layer creates electrical contact between the top layer and the bottom layer. The resistive proximity sensor device 116 would then detect the position of the object by detecting the voltage output due to changes in resistance caused by the contact of the object. In an inductive implementation, the sensor might pick up loop currents induced by a resonating coil or pair of coils, and use some combination of the magnitude, phase and/or frequency to determine distance, orientation or position. In all of these cases the proximity sensor device 116 detects the presence of the object and delivers position information to the system 100. Examples of the types of technologies that can be used to implement the various embodiments of the invention can be found in U.S. Pat. No. 5,543,591, U.S. Pat. No. 6,259,234 and U.S. Pat. No. 5,815,091, each assigned to Synaptics Inc.
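To make the capacitive case concrete, one common approach (offered as an illustrative assumption, not necessarily what the cited patents do) is to subtract a no-touch baseline from each electrode reading and compute a weighted centroid of the remaining signal along the electrode array. All values and thresholds below are invented for the example.

def centroid_position(deltas, pitch_mm=4.0, noise_floor=5):
    """Estimate object position (in mm) along a 1-D electrode array from
    baseline-subtracted capacitance deltas; None if nothing detected."""
    signal = [d if d > noise_floor else 0 for d in deltas]
    total = sum(signal)
    if total == 0:
        return None                       # everything below the noise floor
    weighted = sum(i * s for i, s in enumerate(signal))
    return (weighted / total) * pitch_mm  # electrode index -> millimeters

# Example: a finger centered between electrodes 3 and 4 of an 8-electrode strip.
print(centroid_position([0, 2, 8, 40, 40, 8, 2, 0]))  # prints 14.0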
[0026] Proximity sensor device 116 includes a sensor (not shown) that utilizes any combination of sensing technology to implement one or more sensing regions. For example, the sensor of proximity sensor device 116 can use arrays of capacitive sensor electrodes to support any number of sensing regions. As another example, the sensor can use capacitive sensing technology in combination with resistive sensing technology to support the same sensing region or different sensing regions. Depending on the sensing technique used for detecting object motion, the size and shape of the sensing region, the desired performance, the expected operating conditions, and the like, proximity sensor device 116 can be implemented in a variety of different ways. The sensing technology can also vary in the type of information provided, such as to provide "one-dimensional" position information (e.g., along a sensing region) as a scalar, "two-dimensional" position information (e.g., horizontal/vertical axes, angular/radial axes, or any other axes that span the two dimensions) as a combination of values, and the like.
[0027] The processor 119, sometimes referred to as a proximity sensor processor or touch sensor controller, is coupled to the sensor and the electronic system 100. In general, the processor 119 receives electrical signals from the sensor, processes the electrical signals, and communicates with the electronic system. The processor 119 can perform a variety of processes on the signals received from the sensor to implement the proximity sensor device 116. For example, the processor 119 can select or connect individual sensor electrodes, detect presence/proximity, calculate position or motion information, and report a position or motion when a threshold is reached, and/or interpret and wait for a valid tap/stroke/character/button/gesture sequence before reporting it to the electronic system 100, or indicating it to the user. The processor 119 can also determine when certain types or combinations of object motions occur proximate the sensor. For example, the processor 119 can distinguish between motion of a first object combination (e.g., one finger, a relatively small object, etc.) and motion of a second object combination (e.g., two adjacent fingers, a relatively large object, etc.) proximate the sensing region, and can generate the appropriate indication in response to that motion. Additionally, the processor can distinguish the temporal relationship between motions of object combinations. For example, it can determine when motion of the first object combination has followed motion of the second object combination, and provide a different result responsive to the motions and their temporal relationship.
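A minimal sketch of such a distinction, assuming the processor already resolves individual contact positions and using an invented adjacency threshold, might classify combinations by contact count and spread. Feeding each classified motion into a dispatcher like the one sketched after paragraph [0021] then yields the temporal behavior described here.

ADJACENT_MAX_MM = 25.0   # assumed: two fingertips within this span are "adjacent"

def classify_combination(contacts):
    """contacts: list of (x_mm, y_mm) resolved touch positions.
    Returns "first" (one finger or small object), "second" (two adjacent
    fingers or a large object), or None when nothing is present."""
    if not contacts:
        return None
    if len(contacts) == 1:
        return "first"
    if len(contacts) == 2:
        (x1, y1), (x2, y2) = contacts
        span = ((x1 - x2) ** 2 + (y1 - y2) ** 2) ** 0.5
        if span <= ADJACENT_MAX_MM:
            return "second"   # two adjacent fingers moving together
    return None               # other combinations are ignored in this sketch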
[0028] In this specification, the term "processor" is defined to include one or more processing elements that are adapted to perform the recited operations. Thus, the processor 119 can comprise all or part of one or more integrated circuits, firmware code, and/or software code that receive electrical signals from the sensor and communicate with the electronic system 100. In some embodiments, the elements that comprise the processor 119 would be located with or near the sensor. In other embodiments, some elements of the processor 119 would be with the sensor and other elements of the processor 119 would reside on or near the electronic system 100. In this embodiment, minimal processing could be performed near the sensor, with the majority of the processing performed on the electronic system 100.
[0029] Furthermore, the processor 119 can be physically separate from the part of the electronic system that it communicates with, or the processor 119 can be implemented integrally with that part of the electronic system. For example, the processor 119 can reside at least partially on a processor performing other functions for the electronic system aside from implementing the proximity sensor device 116.
[0030] Again, as the term is used in this application, the term "electronic system" broadly refers to any type of device that communicates with proximity sensor device 116. The electronic system 100 could thus comprise any type of device or devices in which a touch sensor device can be implemented or to which it can be coupled. The proximity sensor device could be implemented as part of the electronic system 100, or coupled to the electronic system using any suitable technique. As non-limiting examples, the electronic system 100 could thus comprise any type of computing device, media player, communication device, or another input device (such as another touch sensor device or keypad). In some cases the electronic system 100 is itself a peripheral to a larger system. For example, the electronic system 100 could be a data input or output device, such as a remote control or display device, that communicates with a computer or media system (e.g., remote control for television) using a suitable wired or wireless technique. It should also be noted that the various elements (processor, memory, etc.) of the electronic system 100 could be implemented as part of an overall system, as part of the touch sensor device, or as a combination thereof. Additionally, the electronic system 100 could be a host or a slave to the proximity sensor device 116.
[0031] In the illustrated embodiment the proximity sensor device 116 is implemented with buttons 120. The buttons 120 can be implemented to provide additional input functionality to the proximity sensor device 116. For example, the buttons 120 can be used to facilitate selection of items using the proximity sensor device 116. Of course, this is just one example of how additional input functionality can be added to the proximity sensor device 116, and in other implementations the proximity sensor device 116 could include alternate or additional input devices, such as physical or virtual switches, or additional proximity sensing regions. Conversely, the proximity sensor device 116 can be implemented with no additional input devices.
[0032] It should be noted that although the various embodiments described herein are referred to as "proximity sensor devices", "touch sensor devices", "proximity sensors", or "touch pads", these terms as used herein are intended to encompass not only conventional proximity sensor devices, but also a broad range of equivalent devices that are capable of detecting the position of one or more fingers, pointers, styli and/or other objects. Such devices may include, without limitation, touch screens, touch pads, touch tablets, biometric authentication devices, handwriting or character recognition devices, and the like. Similarly, the terms "position" or "object position" as used herein are intended to broadly encompass absolute and relative positional information, and also other types of spatial-domain information such as velocity, acceleration, and the like, including measurement of motion in one or more directions. Various forms of positional information may also include time history components, as in the case of gesture recognition and the like. Accordingly, proximity sensor devices can appropriately detect more than the mere presence or absence of an object and may encompass a broad range of equivalents.
[0033] In the embodiments of the present invention, the proximity sensor device 116 is adapted to provide the ability for a user to easily cause different results in an electronic system using a proximity sensor device 116 as part of a user interface. For example, it can be used to facilitate user interface navigation, such as cursor control, dragging and scrolling. As another example, it can be used to facilitate value adjustments, such as changing a device parameter. To cause selective results the proximity sensor device 116 is adapted to distinguish between different object combination motions, determine relative temporal relationships between those motions, and generate user interface results responsive to the motions. This allows a user to selectively generate different results using the motion of two different object combinations.
[0034] In one specific embodiment, the proximity sensor device 116 is implemented to facilitate continued cursor movement with selection, a type of "dragging," using motion of different object combinations. For example, the proximity sensor device 116 can be implemented to indicate selection with cursor movement (e.g., dragging) responsive to detected motion of two adjacent objects across the sensing region, indicate selection without cursor movement responsive to detected motion of one object across the sensing region when the detected motion of one object across the sensing region followed the detected motion of two adjacent objects across the sensing region without an intervening termination event, and indicate further selection with cursor movement responsive to detected motion of two adjacent objects across the sensing region when the detected motion of two adjacent objects across the sensing region followed the detected motion of one object across the sensing region that followed the detected motion of the adjacent objects across the sensing region. This facilitates use of the proximity sensor device 116 by a user to indicate results such as extended dragging over long distances. Thus, the proximity sensor device 116 allows a user to continue to drag an object without requiring the user to perform more complex gestures on the proximity sensor device or activate extra control buttons.
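This behavior can be modeled directly on the states of FIG. 5 (0 = no object presence, 1 = first object combination motion, 2 = second object combination motion). The sketch below is an interpretation under stated assumptions: one finger with no drag in progress is treated as plain cursor motion, and the termination event is taken to be lifting all fingers.

class ExtendedDrag:
    """Toy model of the extended-drag behavior described above."""

    def __init__(self):
        self.dragging = False   # is a selection currently maintained?

    def on_motion(self, n_fingers, dx, dy):
        """Returns (cursor_dx, cursor_dy, selection_active)."""
        if n_fingers == 2:
            self.dragging = True      # start or resume: selection + motion
            return (dx, dy, True)
        if n_fingers == 1 and self.dragging:
            return (0.0, 0.0, True)   # selection held, cursor parked
        if n_fingers == 1:
            return (dx, dy, False)    # assumed: ordinary cursor motion
        return (0.0, 0.0, self.dragging)

    def on_all_lifted(self):
        self.dragging = False         # assumed termination event ends the drag

# A long-distance drag: drag, hold while the hand repositions, then resume.
d = ExtendedDrag()
print(d.on_motion(2, 5, 0))   # (5, 0, True): dragging
print(d.on_motion(1, -5, 0))  # (0.0, 0.0, True): selection kept, no motion
print(d.on_motion(2, 5, 0))   # (5, 0, True): drag resumed

The key design point is the middle case: dropping to one moving finger keeps the selection alive without moving the cursor, which is what lets the user "row" without losing the drag.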
[0035] It should also be understood that while the embodiments of the invention are described herein in the context of a fully functioning proximity sensor device, the mechanisms of the present invention are capable of being distributed as a program product in a variety of forms. For example, the mechanisms of the present invention can be implemented and distributed as a proximity sensor program on a computer-readable signal bearing media. Additionally, the embodiments of the present invention apply equally regardless of the particular type of signal bearing media used to carry out the distribution. Examples of signal bearing media include recordable media such as memory cards, optical and magnetic disks, and hard drives.
[0036] As described above, in the embodiments of the invention the proximity sensor device is adapted to distinguish between different object combination motions, determine relative temporal relationships between those motions, and generate user interface results responsive to the motions. The different object combinations can be distinguished based on a variety of different parameters, such as object type, object size, object proximity, pressure on the sensing region, and the number of objects proximate the sensing region, to list several non-limiting examples.
[0037] As one specific example, the proximity sensor device is adapted to distinguish the number of objects proximate the sensing region. Turning now to FIGS. 2-4, side views of exemplary object combinations are illustrated. Specifically, FIGS. 2-4 illustrate an embodiment where the proximity sensor device is adapted to distinguish between the number of fingers or other objects proximate the sensing region. In FIG. 2, the first object combination 202 comprises one finger 204 proximate the sensing region 200. In FIG. 3, a second object combination 212 comprises two fingers 214 and 216 proximate the sensing region 200. Finally, in FIG. 4, the third object combination 222 comprises three fingers 224, 226 and 228. In this embodiment, the object position detector detects the position of the fingers proximate the sensing region, and determines the number of fingers present. Thus, the object position detector can determine if one, two or more fingers are proximate the touch sensor and generate a result responsive to the number of fingers and the motions of the fingers. Of course, the system could also be adapted to distinguish between any other quantities of objects, such as between three and four fingers, etc. It should be noted that such a system allo
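One plausible way to determine the number of fingers, offered here only as an illustrative sketch (the patent leaves the detection method open), is to count contiguous above-threshold runs in a profile of per-electrode signal strengths. Threshold and profile values are assumptions.

def count_fingers(profile, threshold=10):
    """Count distinct above-threshold blobs along an electrode profile;
    each contiguous run above the threshold is taken as one finger."""
    count, in_blob = 0, False
    for value in profile:
        if value > threshold and not in_blob:
            count += 1          # rising edge: a new blob begins
            in_blob = True
        elif value <= threshold:
            in_blob = False
    return count

# Two separated fingers over an eight-electrode strip (values assumed).
print(count_fingers([0, 3, 40, 45, 6, 2, 38, 41]))  # prints 2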
