`
`
(19) United States
(12) Patent Application Publication          (10) Pub. No.: US 2006/0026521 A1
     Hotelling et al.                        (43) Pub. Date: Feb. 2, 2006
(54) GESTURES FOR TOUCH SENSITIVE INPUT DEVICES

(75) Inventors: Steve Hotelling, San Jose, CA (US); Joshua A. Strickon, San Jose, CA (US); Brian Q. Huppi, San Francisco, CA (US); Imran Chaudhri, San Francisco, CA (US); Greg Christie, San Jose, CA (US); Bas Ording, San Francisco, CA (US); Duncan Robert Kerr, San Francisco, CA (US); Jonathan P. Ive, San Francisco, CA (US)
`
`
`
`
`
`
`
Correspondence Address:
BEYER WEAVER & THOMAS LLP
P.O. BOX 70250
OAKLAND, CA 94612-0250 (US)

(73) Assignee: Apple Computer, Inc.
`
`
(21) Appl. No.: 10/903,964

(22) Filed: Jul. 30, 2004

Publication Classification

(51) Int. Cl.
     G06F 17/00            (2006.01)
(52) U.S. Cl. ............................................ 715/702; 715/863
`
`
`
`(57)
`
`
`
`ABSTRACT
`
`
`
`
`
`
`
`
`
`
`
`
`Methods and systems for processing touch inputs are dis-
`
`
`
`
`
`
`
`
`closed. The invention in one respect includes reading data
`
`
`
`
`
`
`
`from a multipoint sensing device such as a multipoint touch
`
`
`
`
`
`
`
`
`
`screen Where the data pertains to touch input with respect to
`
`
`
`
`
`
`
`
`the multipoint sensing device, and identifying at least one
`
`
`
`
`
`
`
`
`multipoint gesture based on the data from the multipoint
`
`
`sensing device.
`
`
`
`
`69\
`
`
`
`DISPLAY
`
`
`
`GUI
`
`85
`
`
`
`
` l/O DEVICE
`
`88
`INPUT
`
`
` GESTURE
`PROCESSOR
`
`
`
`DEVICE
`
`
`
`
`
`
`
`
`
`
`
`
`
`
`OUTPUT
`
`
`
`
`
`
[Drawing sheets 1 through 37 are figures and are not reproduced in this text extraction. The following figure content is recoverable:]

FIG. 1 (Sheet 1): block diagram of a computer system (processor, gesture operational program, input device, display with GUI, I/O device, output).

FIG. 2 (Sheet 2, multipoint processing method 100): read image from touchscreen (102); convert image to a collection or list of features (104); classify and group features (106); calculate key parameters of feature groups (108); associate group(s) to UI element(s) (110); perform action (112); provide user feedback (114).

FIGS. 3A-3B (Sheet 3, image 120): two touches, Touch 1 (ID1) and Touch 2 (ID2), shown as features 122A and 122B with positions (X1, Y1) and (X2, Y2), angles, and areas A1 and A2.

FIG. 4 (Sheet 4): a group of features ID1 and ID2 (130) with distances D1 and D2 to a centroid C; DAVG = (D2 + D1) / 2.

FIG. 5 (Sheet 5, parameter calculation method 150): receive group of features; test whether the number of features changed; calculate initial or current parameter values accordingly (158); report initial and current parameter values.

FIGS. 6A-6F (Sheets 6-11): rotate gesture sequence (reference numerals 172, 175, 188, 190).

FIG. 7 (Sheet 12, touch-based method 200): detect user input (202); classify the user input as tracking/selection input or gesture input (204); perform tracking during user input (206) or perform gesture control actions during user input (208).

FIG. 9 (Sheet 12, touch-based method 300): output GUI object (302); receive gesture input over GUI object (304); modify GUI object based on and in unison with the gesture input (306).

FIG. 8 (Sheet 13, touch-based method 250): capture initial image (252); determine touch mode based on initial image (254); capture next image (256); determine if mode changed; if changed, set next image as initial image (260); if not, compare initial and next images (262).

FIG. 10 (Sheet 14, zoom gesture method 350): detect presence of a first finger and a second finger on a touch sensitive surface at the same time (352); compare the distance between the two fingers (354); generate a zoom-in signal if the distance increases and a zoom-out signal if it decreases (356, 358).

FIG. 12 (Sheet 14, pan method 400): detect presence of a first finger and a second finger on a touch sensitive surface at the same time; monitor the position of the two objects when moved together across the touch sensitive surface; generate a pan signal when their position changes relative to an initial position.

FIGS. 11A-11H (Sheets 15-18): zooming sequence over a map of Northern California, down to San Francisco and San Jose (elements 364, 366, 368, 370, 372, 376, 378).

FIGS. 13A-13D (Sheets 19-20): panning sequence (elements 366, 368).

FIG. 14 (Sheet 21, rotate method 450): detect presence of a first finger and a second finger on a touch sensitive surface at the same time (452); generate an initial angle at set down (454); generate a rotate signal when the angle changes (456).

FIG. 16 (Sheet 21, method 500): detect presence of an object on a touch sensitive surface (502); recognize the object (504); display an image in the vicinity of the object (506).

FIGS. 15A-15C (Sheet 22): rotating sequence (elements 364, 366, 368).

FIGS. 17A-17E (Sheets 23-24): GUI sequence (elements 512, 516, 520).

FIG. 18 (Sheet 25, method 550): display a control box having control buttons (552); enlarge the control box and control buttons when the presence of an object is detected over the control box (554); generate the control signal associated with a selected button when the presence of the object is detected over one of the enlarged control buttons.

FIG. 20 (Sheet 25, method 600): display a page from a group of pages (602); detect presence of an object over a predetermined region of the displayed page (604); generate a page turn signal when the detected object is slid horizontally across the page in the predetermined region (606).

FIGS. 19A-19D (Sheets 26-27): sequence showing a window (elements 512, 516, 576, 578, 580).

FIGS. 21A-21D (Sheets 28-29): sequence (elements 512, 516).

FIG. 22 (Sheet 30, method 650): display image (652); detect scrolling or panning stroke (654); determine speed and direction of the stroke (656); move the image in accordance with that speed and direction (658); slow the motion of the image in accordance with inertia principles when the stroke is no longer detected (660).

FIG. 24 (Sheet 30, method 700): display keyboard (702); detect presence of a first object over a first key and a second object over a second key at the same time (706); generate a single control function when both objects are detected over their keys.

FIGS. 23A-23D (Sheets 31-32): media player sequence with a scrollable song list (elements 520, 576, 678, 679, 680, 681).

FIGS. 25A-25D (Sheets 33-34): sequence showing a virtual scroll wheel (elements 512, 520, 576A, 576B, 730).

FIG. 26 (Sheet 35, method 750): present virtual scroll wheel (752); detect presence of finger(s) on touchscreen (754); set initial position of finger(s) on virtual scroll wheel (756); generate rotate signal when finger(s) move about the virtual scroll wheel (758).

FIGS. 27A-27D (Sheets 36-37): sequence (elements 512, 762, 764).
`
`
GESTURES FOR TOUCH SENSITIVE INPUT DEVICES
`
`
`
`
`
`
`
`
BACKGROUND OF THE INVENTION

[0001] 1. Field of the Invention

[0002] The present invention relates generally to gesturing associated with touch sensitive devices.
`
`
`
`
`
[0003] 2. Description of the Related Art

[0004] There exist today many styles of input devices for performing operations in a computer system. The operations generally correspond to moving a cursor and making selections on a display screen. The operations may also include paging, scrolling, panning, zooming, etc. By way of example, the input devices may include buttons, switches, keyboards, mice, trackballs, touch pads, joy sticks, touch screens and the like. Each of these devices has advantages and disadvantages that are taken into account when designing the computer system.
`
`
`
`
[0005] Buttons and switches are generally mechanical in nature and provide limited control with regard to the movement of the cursor and making selections. For example, they are generally dedicated to moving the cursor in a specific direction (e.g., arrow keys) or to making specific selections (e.g., enter, delete, number, etc.).
`
`
`
`
`
`
`
`[0006]
`In mice, the movement of the input pointer corre-
`
`
`
`
`
`
`
`
`
`sponds to the relative movements of the mouse as the user
`
`
`
`
`
`
`
`
`moves the mouse along a surface. In trackballs, the move-
`
`
`
`
`
`
`
`ment of the input pointer corresponds to the relative move—
`
`
`
`
`
`
`
`
`ments of a ball as the user moves the ball within a housing.
`
`
`
`
`
`
`
`
`Mice and trackballs also include one or more buttons for
`
`
`
`
`
`
`
`
`
`making selections. Mice may also include scroll wheels that
`
`
`
`
`
`
`
`
`
`allow a user to move through the GUI by simply rolling the
`
`
`
`
`
`
`
`
`
`wheel forward or backward.
`
`
`
[0007] With touch pads, the movement of the input pointer corresponds to the relative movements of the user's finger (or stylus) as the finger is moved along a surface of the touch pad. Touch screens, on the other hand, are a type of display screen that has a touch-sensitive transparent panel covering the screen. When using a touch screen, a user makes a selection on the display screen by pointing directly to GUI objects on the screen (usually with a stylus or finger).
`[0008]
`In order to provide additionally functionality, ges—
`
`
`
`
`
`
`
`tures have been implemented with some of these input
`
`
`
`
`
`
`
`
`devices. By way of example, in touch pads, selections may
`
`
`
`
`
`
`
`
`be made when one or more taps are detected on the surface
`
`
`
`
`
`
`
`
`
`of the touch pad. In some cases, any portion of the touch pad
`
`
`
`
`
`
`
`may be tapped, and in other cases a dedicated portion of the
`
`
`
`
`
`
`
`
`touch pad may be tapped. In addition to selections, scrolling
`
`
`
`
`
`
`may be initiated by using finger motion at the edge of the
`
`
`
`
`
`
`
`
`touch pad.
`
`
[0009] U.S. Pat. Nos. 5,612,719 and 5,590,219, assigned to Apple Computer, Inc., describe some other uses of gesturing. U.S. Pat. No. 5,612,719 discloses an onscreen button that is responsive to at least two different button gestures made on the screen on or near the button. U.S. Pat. No. 5,590,219 discloses a method for recognizing an ellipse-type gesture input on a display screen of a computer system.
`
`
`
`
`
`
`[0010]
`In recent times, more advanced gestures have been
`
`
`
`
`
`
`
`
`implemented. For example, scrolling may be initiated by
`
`
`
`
`
`
`
`placing four fingers on the touch pad so that the scrolling
`
`
`
`
`
`
`
`
`
`gesture is recognized and thereafter moving these fingers on
`
`
`
`
`
`
`
`the touch pad to perform scrolling events. The methods for
`
`
`
`
`
`
`
`
`implementing these advanced gestures, however, has several
`
`
`
`
`
`
`
`drawbacks. By way of example, once the gesture is set, it
`
`
`
`
`
`
`
`
`cannot be Changed until the user resets the gesture state. In
`
`
`
`
`
`
`
`
`
`
`touch pads, for example, if four fingers equals scrolling, and
`
`
`
`
`
`
`
`
`
`the user puts a thumb down after the four fingers are
`
`
`
`
`
`
`
`
`
`
`recognized, any action associated with the new gesture
`
`
`
`
`
`
`
`
`including four fingers and the thumb will not be performed
`
`
`
`
`
`
`
`
`
`until the entire hand is lifted off the touch pad and put back
`
`
`
`
`
`
`
`
`
`
`
`
`
`
`
`
`
`
`
`
`
`down again (e.g., reset). Simply put, the user cannot change
`gesture states midstream. Along a similar vein, only one
`
`
`
`
`
`
`
`
`gesture may be performed at any given time. That
`is,
`
`
`
`
`
`
`
`
`multiple gestures cannot be performed simultaneously.
`
`
`
`
`
[0011] Based on the above, there is a need for improvements in the way gestures are performed on touch sensitive devices.
`
`
`
`
`
`SUMMARY OF THE INVENTION
`
`
`
[0012] The invention relates, in one embodiment, to a computer implemented method for processing touch inputs. The method includes reading data from a multipoint touch screen. The data pertains to touch input with respect to the touch screen. The method also includes identifying at least one multipoint gesture based on the data from the multipoint touch screen.
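By way of illustration only, and not as part of the original disclosure, the read-then-identify method of [0012] might be sketched in Python as follows. The Touch record, the two-contact zoom test, and the 5% thresholds are assumptions introduced here for the example, not details taken from the patent.

    import math
    from dataclasses import dataclass

    @dataclass
    class Touch:
        uid: int   # unique identifier of the contact
        x: float
        y: float

    def identify_multipoint_gesture(prev, curr):
        """Identify a multipoint gesture from two successive reads of
        multipoint touch screen data. Hypothetical rule: with exactly two
        contacts, a change in their separation is reported as a zoom."""
        if len(prev) == 2 and len(curr) == 2:
            d0 = math.dist((prev[0].x, prev[0].y), (prev[1].x, prev[1].y))
            d1 = math.dist((curr[0].x, curr[0].y), (curr[1].x, curr[1].y))
            if d1 > d0 * 1.05:
                return "zoom-in"
            if d1 < d0 * 0.95:
                return "zoom-out"
        return None

    # Two fingers spreading apart between reads -> zoom-in.
    before = [Touch(1, 100, 100), Touch(2, 140, 100)]
    after = [Touch(1, 90, 100), Touch(2, 150, 100)]
    print(identify_multipoint_gesture(before, after))  # "zoom-in"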
`
`
[0013] The invention relates, in another embodiment, to a gestural method. The method includes detecting multiple touches at different points on a touch sensitive surface at the same time. The method also includes segregating the multiple touches into at least two separate gestural inputs occurring simultaneously. Each gestural input has a different function such as zooming, panning, rotating and the like.
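A minimal sketch of the segregation step in [0013], assuming (this is an assumption made here, not the patent's stated method) that simultaneous contacts can be split into separate gestural inputs by clustering on horizontal proximity, e.g. one cluster per hand:

    from dataclasses import dataclass

    @dataclass
    class Touch:
        uid: int
        x: float
        y: float

    def segregate(touches, gap=200.0):
        """Split simultaneous touches into separate gestural inputs by
        starting a new group wherever the x-gap between neighboring
        contacts exceeds `gap` pixels (illustrative heuristic)."""
        groups = []
        for t in sorted(touches, key=lambda t: t.x):
            if groups and t.x - groups[-1][-1].x <= gap:
                groups[-1].append(t)
            else:
                groups.append([t])
        return groups

    # Two fingers on the left (e.g. rotating) while two on the right pan:
    left, right = segregate([Touch(1, 80, 300), Touch(2, 120, 340),
                             Touch(3, 700, 310), Touch(4, 740, 280)])
    print(len(left), len(right))  # 2 2 -> two simultaneous gestural inputs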
`
`
`
`
`
`
`
`
[0014] The invention relates, in another embodiment, to a gestural method. The method includes concurrently detecting a plurality of gestures that are concurrently performed with reference to a touch sensing device. The method also includes producing different commands for each of the gestures that have been detected.
`
`
`
`
`
[0015] The invention relates, in another embodiment, to a gestural method. The method includes displaying a graphical image on a display screen. The method also includes detecting a plurality of touches at the same time on a touch sensitive device. The method further includes linking the detected multiple touches to the graphical image presented on the display screen.
`
`
`
`
[0016] The invention relates, in another embodiment, to a method of invoking a user interface element on a display via a multipoint touch screen of a computing system. The method includes detecting and analyzing the simultaneous presence of two or more objects in contact with the multipoint touch screen. The method also includes selecting a user interface tool, from a plurality of available tools, to display on a display for interaction by a user of the computing system based at least in part on the analyzing. The method further includes controlling the interface tool based at least in part on the further movement of the objects in relation to the multipoint touch screen.
`
`
`
`
`
[0017] The invention relates, in another embodiment, to a touch-based method. The method includes detecting a user input that occurs over a multipoint sensing device. The user input includes one or more inputs. Each input has a unique identifier. The method also includes, during the user input, classifying the user input as a tracking or selecting input when the user input includes one unique identifier, or as a gesture input when the user input includes at least two unique identifiers. The method further includes performing tracking or selecting during the user input when the user input is classified as a tracking or selecting input. The method additionally includes performing one or more control actions during the user input when the user input is classified as a gesturing input, the control actions being based at least in part on changes that occur between the at least two unique identifiers.
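The classification rule of [0017] reduces to counting unique contact identifiers; a direct transcription follows (the string labels are editorial, not from the patent):

    def classify_user_input(unique_ids):
        """One unique identifier -> tracking or selecting input;
        at least two unique identifiers -> gesture input."""
        if len(unique_ids) >= 2:
            return "gesture"
        if len(unique_ids) == 1:
            return "tracking/selecting"
        return "no input"

    print(classify_user_input({7}))      # tracking/selecting
    print(classify_user_input({7, 12}))  # gesture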
`
`
`
`
[0018] The invention relates, in another embodiment, to a touch-based method. The method includes outputting a GUI on a display. The method also includes detecting a user input on a touch sensitive device. The method further includes analyzing the user input for characteristics indicative of tracking, selecting or gesturing. The method additionally includes categorizing the user input as a tracking, selecting or gesturing input. The method further includes performing tracking or selecting in the GUI when the user input is categorized as a tracking or selecting input. Moreover, the method includes performing control actions in the GUI when the user input is categorized as a gesturing input, the actions being based on the particular gesturing input.
`
`
`
`
`
`
`
[0019] The invention relates, in another embodiment, to a touch-based method. The method includes capturing an initial touch image. The method also includes determining the touch mode based on the touch image. The method further includes capturing the next touch image. The method further includes determining if the touch mode changed between the initial and next touch images. The method additionally includes, if the touch mode changed, setting the next touch image as the initial touch image and determining the touch mode based on the new initial touch image. Moreover, the method includes, if the touch mode stayed the same, comparing the touch images and performing a control function based on the comparison.
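A sketch of the capture/compare loop in [0019], under two simplifying assumptions made here: a "touch image" is reduced to a tuple of contact positions, and the "touch mode" is simply the number of contacts.

    def process(frames):
        """Keep an initial image; when the mode (contact count) changes,
        the next image becomes the new initial image; when it stays the
        same, compare the images and drive a control function (here,
        just report how far the first contact moved)."""
        initial = frames[0]
        for nxt in frames[1:]:
            if len(nxt) != len(initial):      # touch mode changed
                initial = nxt                 # rebase on the next image
            else:                             # mode unchanged: compare
                dx = nxt[0][0] - initial[0][0]
                dy = nxt[0][1] - initial[0][1]
                print(f"control function input: moved ({dx}, {dy})")

    process([((10, 10),),            # one finger down
             ((18, 12),),            # same mode -> compare, moved (8, 2)
             ((18, 12), (40, 40))])  # second finger -> mode changed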
`
`
`
`
[0020] The invention relates, in another embodiment, to a computer implemented method for processing touch inputs. The method includes reading data from a touch screen, the data pertaining to touch input with respect to the touch screen, and the touch screen having a multipoint capability. The method also includes converting the data to a collection of features. The method further includes classifying the features and grouping the features into one or more feature groups. The method additionally includes calculating key parameters of the feature groups and associating the feature groups to user interface elements on a display.
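The pipeline in [0020] (read data, convert to features, classify and group, calculate parameters, associate to UI elements) can be sketched as below. The raw point format, the area-based finger/palm split, and the single feature group are all assumptions made for illustration, not the patent's definitions.

    def to_features(raw):
        """Convert raw touch data to a collection of features."""
        return [{"x": x, "y": y, "area": a} for (x, y, a) in raw]

    def classify_and_group(features):
        """Classify features (finger vs. palm by area, an assumed rule)
        and group the finger-sized contacts together."""
        fingers = [f for f in features if f["area"] < 400]
        return [fingers] if fingers else []

    def key_parameters(group):
        """Calculate key parameters of a feature group: its centroid and
        contact count (a representative choice, not an exhaustive one)."""
        n = len(group)
        return {"centroid": (sum(f["x"] for f in group) / n,
                             sum(f["y"] for f in group) / n),
                "count": n}

    raw = [(100, 120, 180), (160, 130, 200), (300, 400, 900)]  # last = palm
    for group in classify_and_group(to_features(raw)):
        print(key_parameters(group))  # then associate the group to the
                                      # UI element under its centroid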
`
`
`
`
`
[0021] The invention relates, in another embodiment, to a computer implemented method. The method includes outputting a graphical image. The method also includes receiving a multitouch gesture input over the graphical image. The method further includes changing the graphical image based on and in unison with the multitouch gesture input.
`
`
`
`
`
`
`
[0022] The invention relates, in another embodiment, to a touch based method. The method includes receiving a gestural input over a first region. The method also includes generating a first command when the gestural input is received over the first region. The method further includes receiving the same gestural input over a second region. The method additionally includes generating a second command when the same gestural input is received over the second region, the second command being different than the first command.
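The region-dependent mapping of [0022] amounts to a lookup keyed on the (gesture, region) pair; the region and command names below are invented here purely for illustration.

    def command_for(gesture, region):
        """Map the same gestural input to different commands depending on
        the region over which it is received."""
        table = {
            ("rotate", "virtual-knob"): "adjust-volume",
            ("rotate", "photo-canvas"): "rotate-photo",
        }
        return table.get((gesture, region), "ignore")

    print(command_for("rotate", "virtual-knob"))  # first command
    print(command_for("rotate", "photo-canvas"))  # a different command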
`
[0023] The invention relates, in another embodiment, to a method for recognizing multiple gesture inputs. The method includes receiving a multitouch gestural stroke on a touch sensitive surface, the multitouch gestural stroke maintaining continuous contact on the touch sensitive surface. The method also includes recognizing a first gesture input during the multitouch gestural stroke. The method further includes recognizing a second gesture input during the multitouch gestural stroke.
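One way to realize [0023] (again a sketch, with thresholds and gesture labels chosen here rather than taken from the patent) is to re-evaluate the gesture on every frame of the stroke, so a second gesture can be recognized without the fingers ever lifting:

    import math

    def gesture_for_frame(prev, curr):
        """Name the dominant gesture between two frames of the same
        two-finger stroke."""
        d0, d1 = math.dist(*prev), math.dist(*curr)
        if abs(d1 - d0) > 5:                       # separation changed
            return "zoom"
        c0 = ((prev[0][0] + prev[1][0]) / 2, (prev[0][1] + prev[1][1]) / 2)
        c1 = ((curr[0][0] + curr[1][0]) / 2, (curr[0][1] + curr[1][1]) / 2)
        return "pan" if math.dist(c0, c1) > 5 else "hold"

    # One continuous stroke: the fingers first spread (zoom), then
    # translate together (pan), with contact maintained throughout.
    stroke = [((0, 0), (10, 0)), ((0, 0), (40, 0)), ((30, 30), (70, 30))]
    for prev, curr in zip(stroke, stroke[1:]):
        print(gesture_for_frame(prev, curr))  # zoom, then pan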
`
`
[0024] The invention relates, in another embodiment, to a computer implemented method. The method includes detecting a plurality of touches on a touch sensing device. The method also includes forming one or more touch groups with the plurality of touches. The method further includes monitoring the movement of and within each of the touch groups. The method additionally includes generating control signals when the touches within the touch groups are moved or when the touch groups are moved in their entirety.
`
`
`
`
`
`
`
`
`[0025]
`It should be noted that in each of the embodiments
`
`
`
`
`
`
`
`described above, the methods may be implemented using a
`
`
`
`
`
`
`
`touch based input device such as a touch screen or touch pad,
`
`
`
`
`
`
`
`
`more particularly a multipoint touch based input device, and
`
`
`
`
`
`
`
`
`even more particularly a multipoint touch screen. It should
`
`
`
`
`
`
`
`also be noted that
`the gestures, gesture modes, gestural
`
`
`
`
`
`
`
`
`
`inputs, etc. may correspond to any of those described below
`
`
`
`
`
`
`
`
`in the detailed description. For example, the gestures may be
`
`
`
`
`
`
`
`
`
`associated with zooming, panning,
`scrolling,
`rotating,
`
`
`
`
`
`
`enlarging, floating controls, zooming targets, paging, inertia,
`
`
`
`
`
`
`
`keyboarding, wheeling, and/or the like.
`
`
`
`
`
`BRIEF DESCRIPTION OF THE DRAWINGS
`
`
`
`
[0026] The invention will be readily understood by the following detailed description in conjunction with the accompanying drawings, wherein like reference numerals designate like structural elements, and in which:
`
`
`
`
`
`
[0027] FIG. 1 is a block diagram of a computer system, in accordance with one embodiment of the present invention.

[0028] FIG. 2 is a multipoint processing method, in accordance with one embodiment of the present invention.

[0029] FIGS. 3A and 3B illustrate an image, in accordance with one embodiment of the present invention.

[0030] FIG. 4 illustrates a group of features, in accordance with one embodiment of the present invention.

[0031] FIG. 5 is a parameter calculation method, in accordance with one embodiment of the present invention.

[0032] FIGS. 6A-6H illustrate a rotate gesture, in accordance with one embodiment of the present invention.

[0033] FIG. 7 is a diagram of a touch-based method, in accordance with one embodiment of the present invention.

[0034] FIG. 8 is a diagram of a touch-based method, in accordance with one embodiment of the present invention.

[0035] FIG. 9 is a diagram of a touch-based method, in accordance with one embodiment of the present invention.

[0036] FIG. 10 is a diagram of a zoom gesture method, in accordance with one embodiment of the present invention.

[0037] FIGS. 11A-11J illustrate a zooming sequence, in accordance with one embodiment of the present invention.

[0038] FIG. 12 is a diagram of a pan method, in accordance with one embodiment of the present invention.

[0039] FIGS. 13A-13D illustrate a panning sequence, in accordance with one embodiment of the present invention.

[0040] FIG. 14 is a diagram of a rotate method, in accordance with one embodiment of the present invention.

[0041] FIGS. 15A-15C illustrate a rotating sequence, in accordance with one embodiment of the present invention.

[0042] FIG. 16 is a diagram of a GUI operational method, in accordance with one embodiment of the present invention.