US008526767B2

Bowens

(10) Patent No.:     US 8,526,767 B2
(45) Date of Patent: Sep. 3, 2013
`
`(54) GESTURE RECOGNITION
`(75) Inventor: Alan Bowens, Southampton (GB)
`(73) Assignee: Atmel Corporation, San Jose, CA (US)
`
(*) Notice: Subject to any disclaimer, the term of this patent is extended or
    adjusted under 35 U.S.C. 154(b) by 1010 days.
`
`(21) Appl. No.: 12/254,043
(22) Filed: Oct. 20, 2008
`
`(65)
`
`Prior Publication Data
`US 2009/0273571 A1
`Nov. 5, 2009
`
`Related U.S. Application Data
(60) Provisional application No. 61/049,453, filed on May 1, 2008.
(51) Int. Cl.
     G06K 9/22          (2006.01)
(52) U.S. Cl.
     USPC ........................................... 345/173
(58) Field of Classification Search
     USPC .......... 382/312-314; 348/140-143; 715/700,
                     715/719; 345/157, 173-177, 181
`See application file for complete search history.
`
`(56)
`
`References Cited
`
`U.S. PATENT DOCUMENTS
3,593,115 A     7/1971  Dym et al.
5,133,076 A *   7/1992  Hawkins et al. ............ 708/141
5,396,443 A *   3/1995  Mese et al. ............... 713/321
5,570,113 A *  10/1996  Zetts ..................... 345/173
5,650,597 A     7/1997  Redmayne
5,730,165 A     3/1998  Philipp
5,825,352 A    10/1998  Bisset et al.
6,028,271 A     2/2000  Gillespie et al.
`
6,297,811 B1   10/2001  Kent et al.
6,414,671 B1    7/2002  Gillespie et al.
6,452,514 B1    9/2002  Philipp
6,466,036 B1   10/2002  Philipp
6,750,852 B2    6/2004  Gillespie et al.
6,888,536 B2    5/2005  Westerman et al.
7,663,607 B2    2/2010  Hotelling
8,031,094 B2   10/2011  Hotelling
8,031,174 B2   10/2011  Hamblin
8,040,326 B2   10/2011  Hotelling
8,049,732 B2   11/2011  Hotelling
8,179,381 B2    5/2012  Frey
8,368,653 B2 *  2/2013  Han et al. ................ 345/173
`(Continued)
`FOREIGN PATENT DOCUMENTS
`
`9, 2012
`WO WO 2012/1292.47
`OTHER PUBLICATIONS
`U.S. Appl. No. 61/454,936, Mar. 21, 2011, Myers.
`U.S. Appl. No. 61/454,950, Mar. 21, 2011, Lynch.
`
`(Continued)
Primary Examiner — Jason M Repko
Assistant Examiner — Shervin Nakhjavan
(74) Attorney, Agent, or Firm — Baker Botts L.L.P.
`(57)
`ABSTRACT
A state machine gesture recognition algorithm for interpreting streams of
coordinates received from a touch sensor. The gesture recognition code can be
written in a high level language such as C and then compiled and embedded in a
microcontroller chip, or CPU chip as desired. The gesture recognition code can
be loaded into the same chip that interprets the touch signals from the touch
sensor and generates the time series data, e.g. a microcontroller, or other
programmable logic device such as a field programmable gate array.
`
`14 Claims, 11 Drawing Sheets
`
[Representative drawing: gesture recognition state machine with Tap Pending, Flick Pending, Drag Pending and DoubleTap Pending states, entry/timer, entry/save, do/delta and do/check-timeout actions, and press, release, timeout and multitouch transitions producing tap, flick, drag, pinch, rotate and stretch outputs (see FIG. 2).]
`
`
`
`
`(56)
`
`References Cited
`
`U.S. PATENT DOCUMENTS
2002/0015024 A1    2/2002  Westerman et al.
2003/0214481 A1*  11/2003  Xiong ..................... 345/157
2005/0041018 A1    2/2005  Philipp
2006/0066582 A1    3/2006  Lyon et al.
2006/0097991 A1    5/2006  Hotelling
2006/0132460 A1*   6/2006  Kolmykov-Zotov et al. ..... 345/173
2006/0250377 A1   11/2006  Zadesky et al.
2007/0046643 A1    3/2007  Hillis et al. ............. 345/173
2007/0097096 A1*   5/2007  Rosenberg ................. 345/173
2007/0152979 A1    7/2007  Jobs et al.
2007/0152984 A1    7/2007  Ording et al.
2007/0176906 A1    8/2007  Warren
2007/0177804 A1    8/2007  Elias et al.
2007/0247443 A1   10/2007  Philipp
2007/0257890 A1   11/2007  Hotelling et al.
2007/0257894 A1   11/2007  Philipp
2007/0265081 A1*  11/2007  Shimura et al. ............. 463/37
2007/0279395 A1   12/2007  Philipp
2007/0291007 A1*  12/2007  Forlines et al. ........... 345/173
2007/0291009 A1*  12/2007  Wright et al. ............. 345/173
2008/0165141 A1*   7/2008  Christie .................. 345/173
2008/0309626 A1*  12/2008  Westerman et al. .......... 345/173
2009/0315854 A1   12/2009  Matsuo
2012/0242588 A1    9/2012  Myers
2012/0242592 A1    9/2012  Rothkopf
2012/0243151 A1    9/2012  Lynch
2012/0243719 A1    9/2012  Franklin

OTHER PUBLICATIONS

U.S. Appl. No. 61/454,894, Mar. 21, 2011, Rothkopf.

* cited by examiner
`
`
`
`
[Sheet 1 of 11, FIG. 1: gesture recognition state machine according to the first embodiment, including a Second Tap Pending state and press, press-near-t, move/drag and release transitions.]
`
`
`
[Sheet 2 of 11, FIG. 2: gesture recognition state machine according to the second embodiment.]
`
`
`
`
`
`
`
`
`
`
`
`
`
`
[Sheet 3 of 11, FIG. 3: press superstate of the third embodiment with Press Complete, Short Pending (entry/timer(t short)), Long Pending (entry/timer(t long)) and Repeat Pending (entry/timer(t repeat delay), entry/timer(t repeat interval)) states, transitions on timeout/short, timeout/long and timeout/repeat, and an entry/store touch location action.]
`
`
`
[Sheet 4 of 11, FIG. 4: Touch 1, Touch 2, ... Touch N inputs to the N-touch arrangement of the fourth embodiment. FIG. 5: N-Touch Idle and N-Touch Touched states, with events generated when touch 1 through touch N are pressed or released and while in the touched state.]
`
`
`
[Sheet 5 of 11, FIG. 6: generation of 2-touch events using two 1-touch FSMs feeding a 2-touch FSM; 2-Touch Idle and 2-Touch Touched states with entry/save(t1, t2), entry/save(angle(t1, t2)), do/delta(t1, t2) and do/angle(t1, t2) actions producing /pinch, /rotate and /stretch outputs.]
`
`
`
[Sheet 6 of 11, FIG. 7: 3-touch gesture handling; three 1-touch FSMs feed a 3-touch FSM whose 3-Touch state is entered when touch 1, touch 2 and touch 3 are all pressed and left when any is released, combining 2-touch, tap and double tap events into gestures such as 3-touch drag.]
`
`
`
[Sheet 7 of 11, FIG. 8: schematic plan view of parts of an electrode pattern for a two-dimensional capacitive touch screen (2DCT).]
`
`
`
`
`
[Sheet 8 of 11, FIG. 9: plan view of the 2DCT of FIG. 8 showing the electrode pattern and a first layer of connections to the y-electrodes.]
`
`
`
[Sheet 9 of 11, FIG. 10: plan view of the 2DCT of FIG. 9 showing the electrode pattern and a second layer of connections to the x-electrodes and the y-electrode external feed lines.]
`
`
`
`
`
[Sheet 10 of 11, FIG. 11: schematic system level drawing of drive and data acquisition circuitry for the 2DCT of FIGS. 8-10.]
`
`
`
`
[Sheet 11 of 11, FIG. 12: display monitor with an input device according to the invention. FIG. 13: cellular telephone according to the invention (reference numerals 905, 906, 907).]
`
`
`
`1.
`GESTURE RECOGNITION
`
CROSS-REFERENCES TO RELATED APPLICATIONS

This application claims the benefit under 35 U.S.C. § 119(e) of U.S. Provisional
Application Ser. No. 61/049,453 filed May 1, 2008.

BACKGROUND OF THE INVENTION

The invention relates to gesture recognition, in particular gesture recognition
by processing of time series of positional inputs received by a two-dimensional
(2D) touch sensor, such as a capacitive or resistive touch sensor. The invention
may also be applied to one-dimensional (1D) touch sensors, and the principles
could also be applied to three-dimensional sensors. It may also be applied to
proximity sensors, where no physical contact, i.e. touch, with a sensing surface
is involved. The invention can be applied to sensing surfaces operable by a
human finger, or a stylus.

1D and 2D capacitive and resistive touch sensors have been in widespread use for
many years. Examples include the screens of personal digital assistants (PDAs),
MP3 audio player controls, mobile phone keypads and/or displays, and multimedia
devices. The touchpad in notebook computers provided in place of a mouse is
another form of 2D capacitive touch sensor. 2D sensors are also provided in many
domestic appliances, so-called "white goods", such as ovens and blenders.

Detailed descriptions of 2D capacitive sensors have been given many times, for
example in patents and patent applications with the inventor Harald Philipp such
as US 2005/0041018 A1, US 2007/0247443 A1, US 2007/0257894 A1, and
US 2007/0279395 A1, the contents of which are incorporated herein in their
entirety.

Other prior art examples of touch screens are as follows.

U.S. Pat. No. 3,593,115 shows a touch element having triangulated shapes for
determining object position. However this scheme requires numerous secondary
electrode connections as well as two or more layers of construction, increasing
construction costs and reducing transparency.

U.S. Pat. No. 5,650,597 shows a 2D sensing method which in its active area
requires only one layer but requires large numbers of electrode connections.
Resistive strips resolve one axis of position, and the accuracy is dependent on
the tolerance of large numbers of resistive strips. This method however does
suppress hand shadow effects.

U.S. Pat. No. 6,297,811 describes a touch screen using triangulated wire outline
electrode shapes to create field gradients. However this patent suffers from the
problem that it is difficult to scale up the screen size, as the number of
electrode connections to a sensing circuit is one per triangle. It is desirable
to dramatically reduce the number of connections in order to reduce cost and
simplify construction. Also it is desirable to use solid shapes rather than wire
outlines which are more expensive to construct. This method however does
suppress hand shadow effects.

Gesture recognition has also been used for many years in such devices. An early
example is character recognition in PDAs, such as the original machines from
Palm Inc. Tracking finger motion, and single and double taps on a notebook
touchpad is another long used example. More recently, gesture recognition has
been incorporated into handheld devices such as the Apple iPhone®. Prior art
patent publications on touchscreens that involve gesture recognition are also
large in
number, with significant numbers of publications from Synaptics, Inc. and also
more recently Apple Computer, Inc., for example.

US 2007/152984 A1 assigned to Apple Computer, Inc. discloses a portable
communication device with multi-touch input which detects one or more
multi-touch contacts and motions and performs one or more operations on an
object based on the one or more multi-touch contacts and/or motions.

US 2002/015024 A1 assigned to University of Delaware discloses simultaneously
tracking multiple finger and palm contacts as hands approach, touch, and slide
across a proximity-sensor. Segmentation processing extracts shape, position and
surface proximity features for each contact, and a persistent path tracker is
used to detect individual contact touchdown and liftoff. Combinatorial
optimization modules associate each contact's path with a particular fingertip,
thumb, or palm of either hand on the basis of biomechanical constraints and
contact features. Classification of intuitive hand configurations and motions
enables unprecedented integration of typing, resting, pointing, scrolling, 3D
manipulation, and handwriting into a versatile, ergonomic computer input device.

U.S. Pat. No. 5,825,352 discloses a touch panel which is capable of detecting
multiple touches simultaneously. In an xy electrode array, maxima and minima are
identified in each of the x and y signals, wherein maxima are designated as
finger touches. Peak and valley data in the x and y directions are then
interpolated to identify the location of one or more fingers on the sensor
array.

U.S. Pat. No. 6,028,271, U.S. Pat. No. 6,414,671 and U.S. Pat. No. 6,750,852 are
related patents assigned to Synaptics, Inc. which disclose gesture recognition
of an object on a touch-sensor pad and for cursor motion. Tapping, drags,
pushes, extended drags and variable drags gestures are recognized by analyzing
the position, pressure, and movement of the conductive object on the sensor pad
during the time of a suspected gesture, and signals are sent to a host
indicating the occurrence of these gestures.

US 2007/176906 A1 assigned to Synaptics, Inc. discloses a touch sensor having a
signal processor adapted to distinguish between three gestures based on
different finger motions on the sensing device by providing a workflow with an
idle state and three gesture-specific states referred to as first, second and
third result states, as illustrated in FIG. 5 of US 2007/176906 A1.

Generally, the raw output from the 2D touch sensor will be a time series of x,y
coordinates, which are then processed by software, or firmware generated from
higher level software, to distinguish the nature of the gesture that has been
input. Generally, the raw data is split into contiguous touch segments and then
processed to determine what if any gestures can be deduced. The processing of
the raw data to identify the gestures may be carried out in the same chip as
generates the raw data, or the raw data may be exported to an external chip, for
example by transmission over a communication bus to the device's central
processing unit (CPU). The former approach is preferred by Synaptics, the latter
by Apple as exemplified by US 2006/0066582 A1.
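As an illustration only, the pre-processing step described above, splitting the raw time series into contiguous touch segments, can be sketched as a scan for runs of samples in which a touch is reported; the sample type and function below are hypothetical and not taken from any cited disclosure.

#include <stddef.h>

/* Hypothetical raw sample from the touch sensor chip: one x,y report per
 * time increment, with a flag saying whether anything is touching. */
typedef struct { int x, y; unsigned t_ms; int touched; } raw_sample_t;

typedef struct { size_t first, last; } segment_t;   /* inclusive indices */

/* Split a time-ordered series into contiguous touch segments.  Returns the
 * number of segments written, up to max_segments. */
size_t split_segments(const raw_sample_t *s, size_t n,
                      segment_t *out, size_t max_segments)
{
    size_t count = 0;
    size_t i = 0;
    while (i < n && count < max_segments) {
        while (i < n && !s[i].touched) i++;          /* skip untouched gap  */
        if (i >= n) break;
        out[count].first = i;
        while (i < n && s[i].touched) i++;           /* consume the segment */
        out[count].last = i - 1;
        count++;
    }
    return count;
}

Each segment can then be passed on to whichever logic interprets it as a gesture, whether conditional code or the state machines described below.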
Most of the patent literature is unspecific about how the raw time series data
are converted into gestures. The straightforward approach is to write
appropriate high level code, for example in C or another suitable programming
language, in which the interpretation of the time series data is analyzed using
conditional statements, such as if . . . then . . . else.

However, it is difficult to reliably and efficiently add code to identify a new
gesture into an existing block of code for
`
`
`
distinguishing between a significant number of gestures, for example at least
3 or 4, perhaps 10 to 20. Testing of the code is a particular difficulty. This
is because in general at any intermediate point in a time series of x,y,t data
the input may relate to a plurality of possible gestures, thereby making the
coding for recognizing one gesture generally dependent on or linked to the
coding for recognizing another gesture.
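By way of illustration only, a minimal sketch of the conditional-statement approach criticized here might look like the following; the sample struct, thresholds and gesture names are hypothetical and not taken from the patent. Every additional gesture forces further conditions to be woven into the same block, which is what makes such code hard to extend and test.

#include <stdlib.h>

/* Hypothetical time-stamped touch sample: coordinates plus touch timestamp. */
typedef struct { int x, y; unsigned t_ms; } sample_t;

typedef enum { GESTURE_NONE, GESTURE_TAP, GESTURE_PRESS, GESTURE_DRAG } gesture_t;

/* Naive "if ... then ... else" interpretation of one contiguous touch
 * segment.  Thresholds are illustrative only. */
gesture_t classify_segment(const sample_t *s, size_t n)
{
    if (n == 0)
        return GESTURE_NONE;

    unsigned duration = s[n - 1].t_ms - s[0].t_ms;
    int dx = s[n - 1].x - s[0].x;
    int dy = s[n - 1].y - s[0].y;
    int moved = (abs(dx) > 10) || (abs(dy) > 10);

    if (!moved && duration < 200)
        return GESTURE_TAP;
    else if (!moved)
        return GESTURE_PRESS;
    else if (moved)
        return GESTURE_DRAG;
    /* Each new gesture (double tap, flick, pinch, ...) forces more
     * conditions to be threaded through the branches above. */
    return GESTURE_NONE;
}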
`
`SUMMARY OF THE INVENTION
`
`4
`of simultaneous touches. A key advantage of this approach is
`that the same code base is used for handling single touches,
`and each of 2-, 3- or higher numbers of simultaneous touches
`are processed using separate additional code embodied in
`separate state machines.
`A touch is usually only output as a valid touch, if certain
`criteria are satisfied, typically that there are a succession of
`touch at a stable x,y location or X.y region over multiple time
`sample increments. If a touch of a duration longer than a
`threshold duration is sensed in the touch state, then control
`flow passes to a press state module, wherein the press state is
`for handling longer touches. The press state is preferably a
`Superstate comprising multiple sub-states to distinguish
`between different durations of press and/or to allow a very
`long press to be interpreted as being repeat presses, which
`may be useful for alphanumeric key entry applications for
`example.
`The state machine preferably also has a plurality of state
`modules for interpreting higher level gestures, such as one or
`more states for interpreting double taps, flicks, drags and any
`other gestures. The gestures include those specifically
`described in this document as well as other gestures known in
`the art, specifically all those disclosed in the above-refer
`enced prior art documents.
`The invention provides in one aspect a touch sensor device
`comprising: a sensor having a sensitive area extending in at
`least one-dimension and arranged to output sense signals
`responsive to proximity of an object to the sensitive area; a
`position processing unit operable to calculate positions of
`interactions with the sensitive area from an analysis of the
`sense signals, and output a time series of data indicative of the
`interaction positions on the sensor, and thus touches; and a
`gesture processing unit operable to analyze the time series
`data to distinguish one or more gesture inputs therefrom,
`wherein the gesture processing unit is coded with gesture
`recognition code comprising a plurality of linked State mod
`ules.
`Further aspects of the invention relate to the gesture pro
`cessing unit on its own and the gesture processing unit in
`combination with the position processing unit, but without
`the sensor.
`The plurality of state modules preferably includes an idle
`state module and a plurality of gesture interpretation state
`modules, wherein the idle state module is entered at the start
`of operation, and is returnable to from at least some of the
`gesture interpretation state modules. The plurality of gesture
`interpretation state modules may include a touch state module
`for single touches, and wherein, responsive to a touch, the idle
`state passes control to the touch state.
`In some embodiments, the plurality of gesture interpreta
`tion state modules includes at least one multitouch state mod
`ule operable to process multiple simultaneous touches, and
`wherein the gesture processing unit is operable to pass control
`to the appropriate touch state module based on the number of
`simultaneous touches defined by the time series data at the
`time. A multitouch state module for each of two simultaneous
`touches and three simultaneous touches may be provided, and
`optionally also higher numbers of touches.
`The plurality of gesture interpretation state modules may
`advantageously include a press state module to which control
`can pass from a touch state module if a touch of a duration
`longer than a threshold duration is sensed in the touch state
`module. The press state is preferably a Superstate comprising
`multiple sub-states to distinguish between different durations
`of press.
`In some embodiments, the plurality of gesture interpreta
`tion state modules includes a plurality of state modules oper
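As a sketch only, with illustrative names, events and output, the linked state modules just described might be expressed in C roughly as follows: an idle state module that is the start state and a touch state module to which control passes on a press event, each module owning its entry and handler actions. This is an assumed rendering, not the patent's code.

#include <stdio.h>

/* Events derived from the time series of touch coordinates (illustrative). */
typedef enum { EV_PRESS, EV_MOVE, EV_RELEASE, EV_TIMEOUT } event_t;

typedef struct fsm fsm_t;

/* A linked state module: an entry action plus a handler returning the next state. */
typedef struct state {
    void (*entry)(fsm_t *);
    const struct state *(*handle)(fsm_t *, event_t, int x, int y);
} state_t;

struct fsm {
    const state_t *current;
    int x, y;                              /* last reported coordinates */
};

static void idle_entry(fsm_t *f);
static const struct state *idle_handle(fsm_t *f, event_t e, int x, int y);
static void touch_entry(fsm_t *f);
static const struct state *touch_handle(fsm_t *f, event_t e, int x, int y);

static const state_t state_idle  = { idle_entry,  idle_handle  };
static const state_t state_touch = { touch_entry, touch_handle };

/* Idle: the start state, returned to when a gesture module exits. */
static void idle_entry(fsm_t *f) { (void)f; }
static const struct state *idle_handle(fsm_t *f, event_t e, int x, int y)
{
    if (e == EV_PRESS) { f->x = x; f->y = y; return &state_touch; }
    return &state_idle;
}

/* Touch: entered from idle on a press; exits back to idle on release. */
static void touch_entry(fsm_t *f) { printf("touch at %d,%d\n", f->x, f->y); }
static const struct state *touch_handle(fsm_t *f, event_t e, int x, int y)
{
    f->x = x; f->y = y;
    if (e == EV_RELEASE) { printf("tap\n"); return &state_idle; }
    return &state_touch;   /* press, drag and other modules would hang off here */
}

void fsm_init(fsm_t *f) { f->x = f->y = 0; f->current = &state_idle; }

/* Dispatch one event: run the transition, then the entry action of any new state. */
void fsm_dispatch(fsm_t *f, event_t e, int x, int y)
{
    const state_t *next = f->current->handle(f, e, x, y);
    if (next != f->current) { f->current = next; if (next->entry) next->entry(f); }
}

Because each state module is a self-contained pair of functions, adding a new gesture module only adds new states and transitions rather than new branches inside existing conditional code.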
`
In a multi-touch environment, the state machine is implemented in the second
embodiment described below such that there are multiple touch states, one for a
single touch, one for a double touch, one for a triple touch etc., with control
passing to the appropriate touch state based on the number of simultaneous
touches defined by the time series data at the time.

Although the above approach for handling multitouch gestures by having two-touch
and three-touch states linked to one-touch states operates well, redesigning the
state machine to, for example, add a new multitouch gesture is difficult in view
of the increasingly complex web of states and transitions. This problem is
addressed by a fourth embodiment of the invention described below according to
which there is provided a plurality of state machines limited to single-touch
gesture recognition. If the gesture recognition code is configured to recognize
gestures having up to, say, 3 simultaneous touches, then 3 such single-touch
state machines are provided. Further state machines are provided for multi-touch
gesture recognition, each catering for a certain number of simultaneous touches,
so there is a two-touch state machine and optionally a three-touch state
machine, and further optionally additional state machines for still higher
numbers of simultaneous touches. A key advantage of this approach is that the
same code base is used for handling single touches, and each of 2-, 3- or higher
numbers of simultaneous touches are processed using separate additional code
embodied in separate state machines.

A touch is usually only output as a valid touch if certain criteria are
satisfied, typically that there is a succession of touches at a stable x,y
location or x,y region over multiple time sample increments. If a touch of a
duration longer than a threshold duration is sensed in the touch state, then
control flow passes to a press state module, wherein the press state is for
handling longer touches. The press state is preferably a superstate comprising
multiple sub-states to distinguish between different durations of press and/or
to allow a very long press to be interpreted as being repeat presses, which may
be useful for alphanumeric key entry applications for example.
`touches.
The state machine preferably also has a plurality of state modules for
interpreting higher level gestures, such as one or more states for interpreting
double taps, flicks, drags and any other gestures. The gestures include those
specifically described in this document as well as other gestures known in the
art, specifically all those disclosed in the above-referenced prior art
documents.

The invention provides in one aspect a touch sensor device comprising: a sensor
having a sensitive area extending in at least one dimension and arranged to
output sense signals responsive to proximity of an object to the sensitive
area; a position processing unit operable to calculate positions of interactions
with the sensitive area from an analysis of the sense signals, and output a time
series of data indicative of the interaction positions on the sensor, and thus
touches; and a gesture processing unit operable to analyze the time series data
to distinguish one or more gesture inputs therefrom, wherein the gesture
processing unit is coded with gesture recognition code comprising a plurality of
linked state modules.

Further aspects of the invention relate to the gesture processing unit on its
own and the gesture processing unit in combination with the position processing
unit, but without the sensor.

The plurality of state modules preferably includes an idle state module and a
plurality of gesture interpretation state modules, wherein the idle state module
is entered at the start of operation, and is returnable to from at least some of
the gesture interpretation state modules. The plurality of gesture
interpretation state modules may include a touch state module for single
touches, and wherein, responsive to a touch, the idle state passes control to
the touch state.

In some embodiments, the plurality of gesture interpretation state modules
includes at least one multitouch state module operable to process multiple
simultaneous touches, and wherein the gesture processing unit is operable to
pass control to the appropriate touch state module based on the number of
simultaneous touches defined by the time series data at the time. A multitouch
state module for each of two simultaneous touches and three simultaneous touches
may be provided, and optionally also higher numbers of touches.

The plurality of gesture interpretation state modules may advantageously include
a press state module to which control can pass from a touch state module if a
touch of a duration longer than a threshold duration is sensed in the touch
state module. The press state is preferably a superstate comprising multiple
sub-states to distinguish between different durations of press.

In some embodiments, the plurality of gesture interpretation state modules
includes a plurality of state modules
`10
`
`15
`
`25
`
`30
`
`35
`
`40
`
`45
`
`50
`
`55
`
`60
`
`65
`
`Petitioner Samsung Ex-1001, 0015
`
`
`
`5
`able to recognize motion related gestures derived from one or
`more moving touches. In other embodiments, only static ges
`tures, such as press, tap and double tap are catered for.
The best mode of implementing multitouch gesture interpretation according to the
invention provides gesture recognition code configured to recognize gestures
having up to N simultaneous touches, wherein N is at least 2, and comprises N
single-touch state machines operable to recognize only single-touch gestures,
and N-1 multi-touch state machines each operable to recognize only n-touch
gestures, wherein n=2 to N.
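A structural sketch of that arrangement for N = 3 is shown below; the struct and function names are illustrative assumptions. Three single-touch state machines, one per touch channel, sit alongside a 2-touch and a 3-touch state machine, and the sample feed selects the multi-touch machine matching the current number of simultaneous touches.

#define N_TOUCHES 3   /* illustrative: gestures with up to three simultaneous touches */

/* One single-touch state machine per touch channel (internals sketched earlier). */
typedef struct { int state; int x, y; } touch_fsm_t;

/* One n-touch state machine for each n = 2..N. */
typedef struct { int state; } ntouch_fsm_t;

typedef struct {
    touch_fsm_t  single[N_TOUCHES];      /* N single-touch machines          */
    ntouch_fsm_t multi[N_TOUCHES - 1];   /* N-1 machines: 2-touch and 3-touch */
} gesture_engine_t;

/* Feed one time sample: every single-touch machine sees its own touch, then
 * the machine matching the current touch count sees the combined events. */
void engine_feed(gesture_engine_t *g, const int x[N_TOUCHES],
                 const int y[N_TOUCHES], const int touched[N_TOUCHES])
{
    int count = 0;
    for (int i = 0; i < N_TOUCHES; i++) {
        if (touched[i]) count++;
        g->single[i].x = x[i];
        g->single[i].y = y[i];
        /* run single-touch transitions for channel i here */
    }
    if (count >= 2) {
        ntouch_fsm_t *m = &g->multi[count - 2];   /* 2-touch or 3-touch machine */
        (void)m;
        /* run n-touch transitions (pinch, rotate, stretch, n-touch drag) here */
    }
}

The single-touch code base is shared across all channels, and each additional touch count only adds one further, separate machine.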
The position processing unit and the gesture processing unit may be accommodated
in, and run on, a single integrated circuit, for example a microcontroller.
Alternatively, the position processing unit may be accommodated in, and run on,
a first integrated circuit, such as a microcontroller, and the gesture
processing unit accommodated in, and run on, one or more separate integrated
circuits, such as a personal computer or other complex system having its own
central processing unit, graphics processing unit and/or digital signal
processor with associated memory and bus communications.
The invention provides in another aspect a method of recognizing gestures from a
time series of touch data comprising coordinates of interaction positions on a
touch sensor, the method comprising: receiving touch coordinates labeled with,
or ordered by, time; analyzing the touch coordinates in a state machine
comprising a plurality of linked state modules to recognize any one of a
plurality of defined gestures therefrom; and outputting the recognized gestures.
The invention provides in a still further aspect a single integrated circuit
having a memory on which is loaded the above-referenced gesture state machine
and which is operable to carry out the method of gesture recognition defined
thereby.
`The invention provides in yet another aspect a computer
`having a memory on which is loaded the above-referenced
`gesture state machine and which is operable to carry out the
`method of gesture recognition defined thereby.
It will be appreciated that the gesture state machine approach for gesture
recognition can be applied to any hardware platform. Capacitive touch sensors,
in particular one-dimensional and two-dimensional capacitive touch sensors, are
one important sensor type which can provide a hardware platform for a gesture
recognition state machine according to the invention. In particular, the
invention is equally applicable to so-called passive or active capacitive
sensing techniques.
Passive capacitive sensing devices rely on measuring the capacitance of a
sensing electrode to a system reference potential (earth). The principles
underlying this technique are described in U.S. Pat. No. 5,730,165 and U.S. Pat.
No. 6,466,036, for example. In broad summary, passive capacitive sensors employ
sensing electrodes coupled to capacitance measurement circuits. Each capacitance
measurement circuit measures the capacitance (capacitive coupling) of its
associated sensing electrode to a system ground. When there is no pointing
object near to the sensing electrode, the measured capacitance has a background
or quiescent value. This value depends on the geometry and layout of the sensing
electrode and the connection leads to it, and so on, as well as the nature and
location of neighboring objects, e.g. the sensing electrode's proximity to
nearby ground planes. When a pointing object, e.g. a user's finger, approaches
the sensing electrode, the pointing object appears as a virtual ground. This
serves to increase the measured capacitance of the sensing electrode to ground.
Thus an increase in measured capacitance is taken to indicate the presence of a
pointing object. U.S. Pat. No. 5,730,165 and U.S. Pat. No. 6,466,036 are
primarily directed to discrete (single button) measurements, and not to 2D
position sensor applications. However the principles described in U.S. Pat. No.
5,730,165 and U.S. Pat. No. 6,466,036 are readily applicable to 2D capacitive
touch sensors (2DCTs), e.g. by providing electrodes to define either a 2D array
of discrete sensing areas, or rows and columns of electrodes in a matrix
configuration.
Active 2DCT sensors are based on measuring the capacitive coupling between two
electrodes (rather than between a single sensing electrode and a system ground).
The principles underlying active capacitive sensing techniques are described in
U.S. Pat. No. 6,452,514. In an active-type sensor, one electrode, the so-called
drive electrode, is supplied with an oscillating drive signal. The degree of
capacitive coupling of the drive signal to the sense electrode is determined by
measuring the amount of charge transferred to the sense electrode by the
oscillating drive signal. The amount of charge transferred, i.e. the strength of
the signal seen at the sense electrode, is a measure of the capacitive coupling
between the electrodes. When there is no pointing object near to the electrodes,
the measured signal on the sense electrode has a background or quiescent value.
However, when a pointing object, e.g. a user's finger, approaches the electrodes
(or more particularly approaches near to the region separating the electrodes),
the pointing object acts as a virtual ground and sinks some of the drive signal
(charge) from the drive electrode. This acts to reduce the strength of the
component of the drive signal coupled to the sense electrode. Thus a decrease in
measured signal on the sense electrode is taken to indicate the presence of a
pointing object.
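The detection logic common to the two techniques can be sketched as a comparison against a quiescent reference; the signal names and the threshold below are assumptions for illustration. A rise in measured capacitance indicates a touch for a passive sensor, while a fall in the coupled signal does so for an active sensor.

#include <stdbool.h>

#define TOUCH_THRESHOLD 40   /* illustrative delta in acquisition counts */

/* Passive sensing: the measured capacitance to ground rises above its
 * background (quiescent) value when a finger approaches the electrode. */
bool passive_touch_detected(int measured, int quiescent)
{
    return (measured - quiescent) > TOUCH_THRESHOLD;
}

/* Active sensing: the drive signal coupled onto the sense electrode falls
 * below its quiescent value when a finger sinks charge from the drive
 * electrode. */
bool active_touch_detected(int sense_signal, int quiescent)
{
    return (quiescent - sense_signal) > TOUCH_THRESHOLD;
}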
It will be appreciated that there are several other touch sensing technologies,
such as those based on resistive screens, which typically operate with stylus
input, and technologies developed for large areas, such as those based on
ultrasonics or other acoustic techniques, and those based on total internal
reflection or other optical techniques. All of these touch technologies may
benefit from the present invention.
It is noted that the term of art "state machine" is used throughout this
document. A synonym is finite state machine (FSM), the acronym FSM appearing in
some of the figures.
`
`BRIEF DESCRIPTION OF THE DRAWINGS
`
`For a better understanding of the invention, and to show
`how the same may be carried into effect, reference is now
`made by way of example to the accompanying drawings.
`FIG. 1 illustrates a gesture recognition state machine
`according to a first embodiment.
`FIG. 2 illustrates a gesture recognition state machine
`according to a second embodiment.
`FIG. 3 illustrates features of a gesture recognition state
`machine according to a third embodiment.
`FIG. 4 illustrates a gesture recognition state machine
`according to a fourth embodiment for handling N touches.
`FIG. 5 illustrates internal states of an N-touch state
`machine according to the fourth embodiment.
`FIG. 6 is an example of the fourth embodiment showing
`generation of 2-touch events using two 1-touch state
`machines.
`FIG. 7 is an example of 3-touch gesture handling according
`to the fourth embodiment.
FIG. 8 is a schematic plan view showing parts of an electrode pattern for a
two-dimensional capacitive touch screen (2DCT).
`
`
`
`
`7
`FIG. 9 is a plan view of a 2DCT of FIG. 8 showing the
`electrode patternanda first layer of connections at the periph
`ery of the electrode pattern area to connect to they-electrodes.
`FIG. 10 is a plan view of the 2DCT of FIG.9 showing the
`electrode pattern and a second layer of connections at the
`periphery of the electrode pattern area to connect to the
`X-electrodes and also to connect the y-electrode external feed
`lines to the y-electrode connections shown in FIG. 9.
`FIG. 11 is a schematic system level drawing of drive and
`data acquisition circuitry for the 2DCT of FIGS. 8-10.
`FIG. 12 schematically shows a display monitor and an
`input device according to the present invention.
`FIG. 13 schematically shows a cellular telephone accord
`ing to the present invention.
`
`10
`
`15
`
`DETAILED DESCRIPTION
`
Before describing embodiments of the invention, we first define each of the
gestures referred to in the detailed description of the embodiments.
Tap: A tap happens when the user quickly touches and releases the touch surface.
No significant movement takes place while the user's finger is on the touch
surface. It is characterized by a short touch duration. This could be used, for
example, to activate a hyperlink on a displayed web page.
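Expressed as a sketch, with thresholds that are illustrative rather than the patent's, a tap can be reported when a completed touch ends quickly and without significant movement:

#include <stdbool.h>
#include <stdlib.h>

#define TAP_MAX_DURATION_MS 200   /* "short touch duration" (illustrative)   */
#define TAP_MAX_MOVEMENT    8     /* "no significant movement" (illustrative) */

/* Returns true if a completed touch, described by its press and release
 * samples, qualifies as a tap. */
bool is_tap(int x_down, int y_down, unsigned t_down_ms,
            int x_up,   int y_up,   unsigned t_up_ms)
{
    unsigned duration = t_up_ms - t_down_ms;
    return duration < TAP_MAX_DURATION_MS &&
           abs(x_up - x_down) <= TAP_MAX_MOVEMENT &&
           abs(y_up - y_down) <= TAP_MAX_MOVEMENT;
}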
DoubleTap: A double tap happens when the user quickly touches and releases the
touch surface twice in quick succession. No significant movement takes place
while the user's finger is on the touch surface, or between successive touches.
It is characterized by sh