(19) United States
(12) Patent Application Publication    (10) Pub. No.: US 2015/0153913 A1
     Ballard et al.                    (43) Pub. Date: Jun. 4, 2015

(54) SYSTEMS AND METHODS FOR INTERACTING WITH A VIRTUAL MENU

(71) Applicants: Brian Adams Ballard, Herndon, VA (US); James Leighton Athey, Washington, DC (US); Jeffrey Edward Jenkins, Clarksburg, MD (US); Todd Richard Reily, Stoneham, MA (US); Harold Ronald Villanueva Tagunicar, Falls Church, VA (US); Michael Anthony Sciscenti, Ashburn, VA (US)

(72) Inventors: Brian Adams Ballard, Herndon, VA (US); James Leighton Athey, Washington, DC (US); Jeffrey Edward Jenkins, Clarksburg, MD (US); Todd Richard Reily, Stoneham, MA (US); Harold Ronald Villanueva Tagunicar, Falls Church, VA (US); Michael Anthony Sciscenti, Ashburn, VA (US)

(73) Assignee: APX LABS, LLC, Herndon, VA (US)

(21) Appl. No.: 14/556,622

(22) Filed: Dec. 1, 2014

Related U.S. Application Data

(60) Provisional application No. 61/910,419, filed on Dec. 1, 2013; provisional application No. 61/910,425, filed on Dec. 1, 2013; provisional application No. 62/043,759, filed on Aug. 29, 2014.

Publication Classification

(51) Int. Cl.
     G06F 3/0482  (2006.01)
     G02B 27/01   (2006.01)
     G06F 3/01    (2006.01)
     G06F 3/16    (2006.01)
(52) U.S. Cl.
     CPC: G06F 3/0482 (2013.01); G06F 3/167 (2013.01); G02B 27/017 (2013.01); G06F 3/013 (2013.01); G02B 2027/0178 (2013.01)

(57) ABSTRACT

Systems and methods allow a user to interact with an augmented reality device. In one implementation, a wearable device for providing a virtual menu to a user includes a display; at least one sensor configured to provide an output indicative of a viewing direction of the user; and at least one processing device. The at least one processing device is configured to cause a virtual menu to be shown on the display; monitor a viewing direction of the user based on the output of the at least one sensor; determine, based on the monitored viewing direction, whether the user is looking in a direction of a selectable element of the virtual menu; determine an amount of time that the user looks in the direction of the selectable element; and cause at least one action associated with the selectable element if the amount of time exceeds a predetermined dwell time threshold.
`Supercell
`Exhibit 1008
`Page 1
[FIG. 1 (Sheet 1 of 33): exemplary system, showing a device with a processing device, memory, communication interface, and display, in communication with a server system and database.]
[FIG. 2 (Sheet 2 of 33): exemplary Augmented Reality (AR) device.]
[FIG. 3 (Sheet 3 of 33): software architecture, including Third Party Applications 362; Augmented Reality Shell 364; Rendering Services Module; visual and graphics processing modules; Geolocational Processing Module; Positional Processing Module; Network Interaction Services 372; Operating System 360 (Android, iOS, Windows Mobile, etc.).]
[FIG. 4 (Sheet 4 of 33): user wearing an AR device.]
[FIG. 5A (Sheet 5 of 33): virtual menu displayed by an AR device.]
[FIG. 6 (Sheet 6 of 33): flowchart. 610: monitor an orientation of the head of the user; 620: determine whether the user is looking upward with respect to a predetermined horizontal threshold; 630: cause a virtual menu to be shown on a display.]
[Sheet 7 of 33: patent drawing.]
[Sheet 8 of 33: patent drawing.]
[FIG. 8 (Sheet 9 of 33): flowchart. Cause a virtual menu to be shown on a display; 820: monitor an orientation of the head of the user; determine whether the user is looking in a direction of a selectable element of the virtual menu; 840: determine an amount of time the user looks in the direction of the selectable element; 850: cause at least one action associated with the selectable element of the virtual menu; end.]
[Sheet 10 of 33: patent drawing.]
[Sheet 11 of 33: patent drawing.]
[FIG. 11 (Sheet 12 of 33): flowchart. 1110: monitor an orientation of the head of the user; 1130: determine whether the user is looking in a direction of a location of the menu on the display.]
[FIG. 12 (Sheet 13 of 33): user of a wearable AR device initiating communication with another user.]
[FIG. 13 (Sheet 14 of 33): user of a wearable AR device initiating communication with a group of users (labeled "user view").]
[FIG. 14 (Sheet 15 of 33): flowchart. Monitor a viewing direction of the user; determine whether the user is looking in a direction of an individual; establish a communication link between the user and the individual.]
[FIG. 15 (Sheet 16 of 33): user unlocking a wearable AR device.]
[FIGS. 16A and 16B (Sheet 17 of 33): display of a wearable AR device in locked and unlocked states.]
[Sheet 18 of 33: FIG. 17-series patent drawing.]
[Sheet 19 of 33: FIG. 17-series patent drawing.]
[Sheet 20 of 33: FIG. 17-series patent drawings.]
[FIGS. 17G and 17H (Sheet 21 of 33): patent drawings.]
[Sheet 22 of 33: FIG. 17-series patent drawing.]
[Sheet 23 of 33: patent drawing.]
[FIG. 18 (Sheet 24 of 33): flowchart. Track a pattern of movement of the orientation of the head of the user; 1820: determine that the tracked pattern of movement matches a predetermined pattern of movement; 1830: unlock the wearable device to provide the user with access to information on a display.]
[FIG. 19 (Sheet 25 of 33): message flow between an AR device and a server system. Send captured information relating to progress of a task; provide information relating to a next step of the task; send updated information relating to progress of the task; indicate that the task is completed.]
[FIG. 20 (Sheet 26 of 33): message flow between an AR device and a server system. Send captured information relating to progress of a task; provide video for performing a next step in the task; send an indication that the provided task step is completed.]
[FIG. 21 (Sheet 27 of 33): flowchart. Capture information related to progress of a task; 2120: provide to a server system the captured information relating to progress of a task; 2130: receive from the server system information relating to a next step in the task; cause the information relating to the next step in the task to be shown on the display.]
[FIG. 22 (Sheet 28 of 33): user wearing an AR device to control operation of an on-board component.]
[FIGS. 23A-23C (Sheet 29 of 33): patent drawings.]
[FIG. 24 (Sheet 30 of 33): flowchart. 2410: cause a graphical icon associated with a control of an aspect of an on-board component to be shown on a display; determine whether the user is looking in the direction of the graphical icon; initiate the control of the aspect of the on-board component.]
[FIG. 25 (Sheet 31 of 33): example environment for sharing information between users of AR devices via a server system.]
[FIG. 26 (Sheet 32 of 33): flowchart. Receive information from a first device; select one or more designated devices to receive information; send the information to the one or more designated devices.]
[FIG. 27 (Sheet 33 of 33): flowchart. 2710: receive user profiles of a plurality of users; 2720: identify a commonality of the plurality of users; 2730: select one or more designated devices to send information to based on the commonality.]
US 2015/0153913 A1
Jun. 4, 2015

SYSTEMS AND METHODS FOR INTERACTING WITH A VIRTUAL MENU
CROSS-REFERENCES TO RELATED APPLICATIONS

0001 This application claims the benefit of priority of U.S. Provisional Patent Application No. 61/910,419, filed on Dec. 1, 2013, U.S. Provisional Patent Application No. 61/910,425, filed on Dec. 1, 2013, and U.S. Provisional Patent Application No. 62/043,759, filed on Aug. 29, 2014, all of which are incorporated herein by reference in their entirety.
TECHNICAL FIELD

0002 The present disclosure relates generally to an augmented reality device and, more particularly, to methods and systems for representing and interacting with augmented reality content using the augmented reality device.
BACKGROUND

0003 Technology advances have enabled mobile personal computing devices to become more capable and ubiquitous. In many cases, these devices have both a display and a combination of sensors. For example, the devices may include GPS, accelerometers, gyroscopes, cameras, light meters, and compasses, or some combination thereof. These devices may include mobile computing devices as well as head mounted displays.

0004 Additionally, these mobile personal computing devices are increasingly capable of both displaying information for the user and supplying contextual information to other systems and applications on the device. Such contextual information can be used to determine the location, orientation, and movement of the user interface display of the device.
SUMMARY

0005 Embodiments consistent with the present disclosure provide an apparatus and methods for representing and interacting with augmented reality content.
0006 Consistent with a disclosed embodiment, a wearable device provides a virtual menu to a user. The wearable device may include a display; at least one sensor configured to provide an output indicative of an orientation of a head of the user; and at least one processing device. The at least one processing device may be configured to monitor an orientation of the head of the user based on the output of the at least one sensor; determine, based on the monitored orientation of the head, whether the user is looking upward or downward with respect to a predetermined horizontal threshold; and cause the virtual menu to be shown on the display if the user is determined to be looking upward or downward with respect to the predetermined horizontal threshold.
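The head-orientation test in this embodiment reduces to a threshold comparison on head pitch. A minimal sketch follows; the 15-degree default, the units, and the sign convention are illustrative assumptions, not values from the disclosure:

```python
def should_show_menu(pitch_deg: float, threshold_deg: float = 15.0) -> bool:
    """Decide whether to show the virtual menu based on head pitch.

    pitch_deg: head pitch reported by an orientation sensor, where
    0 degrees is level with the horizon, positive values look upward,
    and negative values look downward (assumed convention).
    """
    # Looking sufficiently far above or below the horizontal threshold
    # triggers the menu, matching the "upward or downward" language above.
    return abs(pitch_deg) > threshold_deg
```

In practice the pitch value would come from the wearable's gyroscope or accelerometer fusion, sampled each frame.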
0007 Consistent with another disclosed embodiment, a method provides a virtual menu to a user of a wearable device. The method may include monitoring, based on output of at least one sensor, an orientation of a head of the user; determining, based on the monitored orientation of the head, whether the user is looking upward or downward with respect to a predetermined horizontal threshold; and causing the virtual menu to be shown on a display of the wearable device if the user is determined to be looking upward or downward with respect to the predetermined horizontal threshold.
0008 Consistent with another disclosed embodiment, a wearable device provides a virtual menu to a user. The wearable device may include a display; at least one sensor configured to provide an output indicative of a viewing direction of the user; and at least one processing device. The at least one processing device may be configured to cause a virtual menu to be shown on the display, the virtual menu including at least one selectable element; monitor a viewing direction of the user based on the output of the at least one sensor; determine, based on the monitored viewing direction, whether the user is looking in a direction of the at least one selectable element of the virtual menu; determine an amount of time that the user looks in the direction of the at least one selectable element of the virtual menu; and cause at least one action associated with the at least one selectable element of the virtual menu if the amount of time exceeds a predetermined dwell time threshold.
0009 Consistent with another disclosed embodiment, a method provides a virtual menu to a user of a wearable device. The method may include causing a virtual menu to be shown on a display of the wearable device, wherein the virtual menu includes at least one selectable element; monitoring, based on output of at least one sensor, a viewing direction of the user; determining, based on the monitored viewing direction, whether the user is looking in a direction of the at least one selectable element of the virtual menu; determining an amount of time that the user looks in the direction of the at least one selectable element of the virtual menu; and causing at least one action associated with the at least one selectable element of the virtual menu if the amount of time exceeds a predetermined dwell time threshold.
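The dwell-time selection described above can be sketched as a per-frame timer that resets whenever the gaze leaves the element. This is a minimal illustration; the one-second default threshold and the frame-based update API are assumptions, not details from the disclosure:

```python
class DwellSelector:
    """Fire a menu element's action only after the user's gaze has
    stayed on it longer than a dwell-time threshold."""

    def __init__(self, dwell_threshold_s: float = 1.0):
        self.dwell_threshold_s = dwell_threshold_s
        self._gazed_element = None
        self._elapsed_s = 0.0

    def update(self, element_under_gaze, dt_s: float):
        """Call once per frame with the menu element under the user's gaze
        (or None). Returns the element to activate, or None."""
        if element_under_gaze == self._gazed_element:
            self._elapsed_s += dt_s
        else:
            # Gaze moved to a different element (or away): restart the timer.
            self._gazed_element = element_under_gaze
            self._elapsed_s = 0.0
        if self._gazed_element is not None and self._elapsed_s >= self.dwell_threshold_s:
            self._elapsed_s = 0.0  # avoid re-triggering every frame
            return self._gazed_element
        return None
```

Resetting the accumulated time after triggering prevents the same element from firing repeatedly while the user's gaze lingers.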
0010 Consistent with another disclosed embodiment, a wearable device provides a virtual menu to a user. The wearable device may include a display; at least one sensor configured to provide an output indicative of a viewing direction of the user; and at least one processing device. The at least one processing device may be configured to monitor a viewing direction of the user based on the output of the at least one sensor; provide a menu on the display; determine, based on the monitored viewing direction, whether the user is looking in a direction of the location of the menu on the display; and expand the menu if the user is determined to be looking in the direction of the location of the menu on the display.
0011 Consistent with another disclosed embodiment, a method provides a virtual menu to a user of a wearable device. The method may include monitoring, based on output of at least one sensor, a viewing direction of the user; providing a menu on a display of the wearable device; determining, based on the monitored viewing direction, whether the user is looking in a direction of the location of the menu on the display; and expanding the menu if the user is determined to be looking in the direction of the location of the menu on the display.
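The look-to-expand behavior amounts to a hit-test of the gaze point against the menu's on-screen bounds. A sketch in display coordinates; the rectangle and dictionary representations are illustrative assumptions:

```python
def gaze_over_menu(gaze_xy, menu_rect) -> bool:
    """Return True when the gaze point falls within the menu's bounds.

    gaze_xy:   (x, y) gaze point in display coordinates.
    menu_rect: (x, y, width, height) of the menu on the display.
    """
    gx, gy = gaze_xy
    mx, my, mw, mh = menu_rect
    return mx <= gx <= mx + mw and my <= gy <= my + mh

def maybe_expand(menu, gaze_xy):
    """Expand the menu while the user looks at its location (sketch)."""
    if gaze_over_menu(gaze_xy, menu["rect"]):
        menu["expanded"] = True
    return menu
```

A real implementation would likely combine this with the dwell-time check so a brief glance across the menu does not expand it.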
0012 Consistent with a disclosed embodiment, a wearable device establishes a communication path between a user of the wearable device and at least one individual. The wearable device includes a display; at least one sensor configured to provide an output indicative of a viewing direction of the user; and at least one processing device. The at least one processing device is configured to monitor the viewing direction of the user based on the output of the at least one sensor; determine, based on the monitored viewing direction, whether the user is looking in the direction of the at least one individual; and establish the communication path between the user and the at least one individual if the user is determined to be looking in the direction of the at least one individual.
0013 Consistent with another disclosed embodiment, a method establishes a communication path between a user of a wearable device and at least one individual. The method includes monitoring, based on an output of at least one sensor, a viewing direction of the user; determining, based on the monitored viewing direction, whether the user is looking in the direction of the at least one individual; and establishing the communication path between the user and the at least one individual if the user is determined to be looking in the direction of the at least one individual.
0014 Consistent with a disclosed embodiment, a lockable, wearable device is provided. The wearable device comprises a display; at least one sensor configured to provide an output indicative of a viewing direction of a user; and at least one processing device. The at least one processing device is configured to track a pattern of movement of the viewing direction of the user; and unlock the lockable, wearable device to provide the user with access to information on the display of the device when the tracked pattern of movement matches a predetermined pattern of movement.
`0.015
`Consistent with another disclosed embodiment, a
`lockable, wearable device is provided. The wearable device
`comprise a display; at least one sensor configured to provide
`an output indicative of a viewing direction of a user, and at
`least one processing device. The at least one processing
`device is configured to cause an array of graphical objects to
`be shown on the display of the wearable device; detect selec
`tion by the user of at least two graphical objects from among
`the array of graphical objects based on the output indicative of
`the viewing direction of the user; and unlock the lockable,
`wearable device to provide the user with access to informa
`tion on the display of the device based on whether the
`detected selection of the at least two graphical objects
`matches a predetermined object selection sequence.
0016 Consistent with another disclosed embodiment, a method unlocks a wearable device. The method includes tracking, using at least one sensor of the wearable device, a viewing direction of a user of the wearable device; and unlocking the wearable device to provide the user with access to information on a display of the wearable device when the tracked viewing direction matches a predetermined pattern of movement.
0017 Consistent with another disclosed embodiment, a method unlocks a wearable device. The method includes causing an array of graphical objects to be shown on a display of the wearable device; detecting selection by the user of at least two graphical objects from among the array of graphical objects based on an output of at least one sensor of the wearable device, wherein the output is indicative of a viewing direction of a user of the wearable device; and unlocking the wearable device to provide the user with access to information on a display of the device based on whether the detected selection of the at least two graphical objects matches a predetermined object selection sequence.
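One way to sketch the pattern-matching unlock from these embodiments: collapse the noisy per-frame stream of gaze directions into its sequence of distinct moves, then compare it with the stored pattern. Representing directions as strings is an illustrative assumption:

```python
def matches_unlock_pattern(sampled_directions, stored_pattern) -> bool:
    """Compare a tracked gaze pattern against a predetermined pattern.

    sampled_directions: per-frame coarse gaze directions, e.g.
                        ["up", "up", "left", "left", "down"].
    stored_pattern:     the predetermined sequence, e.g. ["up", "left", "down"].
    """
    collapsed = []
    for direction in sampled_directions:
        # Consecutive identical samples are one sustained move.
        if not collapsed or collapsed[-1] != direction:
            collapsed.append(direction)
    return collapsed == list(stored_pattern)
```

The same shape works for the object-selection variant: replace direction strings with the identifiers of the graphical objects the gaze dwelled on.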
0018 Consistent with another disclosed embodiment, a wearable device provides task-based instructions to a user. The wearable device may include a display; a network interface; a data input device configured to capture information relating to progress of a task; and at least one processing device. The at least one processing device may be configured to provide to a server system, via the network interface, the captured information relating to progress of the task; receive from the server system, via the network interface, information relating to a next step in the task; and cause the information relating to the next step in the task to be shown on the display.
0019 Consistent with another disclosed embodiment, a method provides task-based instructions to a user of a wearable device. The method may include capturing, via a data input device of the wearable device, information relating to progress of a task; providing to a server system, via a network interface, the captured information relating to progress of the task; receiving from the server system, via the network interface, information relating to a next step in the task; and causing the information relating to the next step in the task to be shown on a display of the wearable device.
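The capture, send, receive, display cycle in these embodiments can be sketched as a simple client loop. The `next_step` method, the stub classes, and the end-of-task signal (the server returning None) are assumptions for illustration only:

```python
def run_task_loop(capture_progress, server, display):
    """Repeat: capture task progress, send it to the server, receive the
    next step, and show it on the display, until the task is complete."""
    while True:
        progress = capture_progress()
        step = server.next_step(progress)   # send progress, get next step
        if step is None:                    # server signals task complete
            return
        display.show(step)

class StubServer:
    """Stand-in for the server system, serving a fixed list of steps."""
    def __init__(self, steps):
        self.steps = list(steps)
    def next_step(self, progress):
        return self.steps.pop(0) if self.steps else None

class StubDisplay:
    """Stand-in for the wearable display; records what was shown."""
    def __init__(self):
        self.shown = []
    def show(self, step):
        self.shown.append(step)
```

Running the loop with the stubs walks the display through each step in order until the stub server runs out of steps.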
0020 Consistent with another disclosed embodiment, a wearable device controls operation of an on-board component. The wearable device may include a display; at least one sensor configured to provide an output indicative of a viewing direction of a user; and at least one processing device. The at least one processing device may be configured to cause at least one graphical icon associated with a control of at least one aspect of the on-board component to be shown on the display such that the user perceives the location of the at least one graphical icon as fixed relative to real world coordinates; determine, based on the output of the at least one sensor, whether the user is looking in a direction of the at least one graphical icon; and initiate the control of the at least one aspect of the on-board component when the user is determined to be looking in the direction of the at least one graphical icon.
0021 Consistent with another disclosed embodiment, a method controls operation of an on-board component of a wearable device. The method may include causing at least one graphical icon associated with a control of at least one aspect of the on-board component to be shown on a display of the wearable device such that the user perceives the location of the at least one graphical icon as fixed relative to real world coordinates; determining, based on an output of at least one sensor of the wearable device configured to provide an output indicative of a viewing direction of the user of the wearable device, whether the user is looking in a direction of the at least one graphical icon; and initiating the control of the at least one aspect of the on-board component when the user is determined to be looking in the direction of the at least one graphical icon.
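An icon "fixed relative to real world coordinates" must have its screen position recomputed from head orientation every frame. A one-axis sketch follows; the field of view, display resolution, and linear projection are illustrative assumptions:

```python
def icon_screen_x(icon_world_yaw_deg, head_yaw_deg,
                  fov_deg=40.0, display_width_px=1280):
    """Horizontal pixel position of a world-fixed icon, or None when the
    icon is outside the wearer's current field of view."""
    # Relative bearing of the icon, wrapped into (-180, 180].
    rel = (icon_world_yaw_deg - head_yaw_deg + 180.0) % 360.0 - 180.0
    if abs(rel) > fov_deg / 2.0:
        return None  # icon not currently visible on the display
    # Linear mapping: center of the view maps to the center of the display.
    return (rel / fov_deg + 0.5) * display_width_px
```

As the wearer turns toward the icon's world bearing, the icon slides toward the display center; once the bearing leaves the field of view, the icon drops off the display entirely.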
0022 Consistent with another disclosed embodiment, a system interacts with and shares information between a plurality of users of a corresponding plurality of wearable devices. The system may include a network interface and at least one processing device. The at least one processing device may be configured to receive, via the network interface, information from a first wearable device; select from the plurality of wearable devices one or more designated wearable devices to receive the information; and send, via the network interface, the information to the one or more designated wearable devices.
0023 Consistent with another disclosed embodiment, a method interacts with and shares information between a plurality of users of a corresponding plurality of wearable devices. The method may include receiving, via a network interface, information from a first wearable device; selecting from the plurality of wearable devices one or more designated wearable devices to receive the information; and sending, via the network interface, the information to the one or more designated wearable devices.
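The receive/select/send flow on the server side can be sketched as below. The device interface (a `send` method), the stub class, and the selection callback are assumptions for illustration:

```python
def share_information(info, sender_id, devices, select_targets):
    """Forward information from one wearable device to designated others.

    devices:        mapping of device id -> device object with a send() method.
    select_targets: callable choosing designated device ids (for example,
                    by a commonality in user profiles, as in FIG. 27).
    """
    targets = [dev_id for dev_id in select_targets(devices) if dev_id != sender_id]
    for dev_id in targets:
        devices[dev_id].send(info)
    return targets

class StubDevice:
    """Stand-in wearable device that records received information."""
    def __init__(self):
        self.received = []
    def send(self, info):
        self.received.append(info)
```

Excluding the sender from the target list keeps the originating device from echoing its own information back to itself.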
0024 Consistent with other disclosed embodiments, non-transitory computer-readable storage media may store program instructions, which are executed by at least one processor and perform any of the methods described herein.

0025 The foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the claims.
BRIEF DESCRIPTION OF THE DRAWINGS

0026 The accompanying drawings, which are incorporated in and constitute a part of this disclosure, illustrate various disclosed embodiments. In the drawings:
0027 FIG. 1 illustrates an exemplary system for implementing disclosed embodiments.
0028 FIG. 2 illustrates an exemplary Augmented Reality (AR) device consistent with disclosed embodiments.
0029 FIG. 3 is a block diagram illustrating a configuration of a software-driven system consistent with disclosed embodiments.
0030 FIG. 4 illustrates an example of a user wearing an AR device consistent with disclosed embodiments.
0031 FIG. 5A illustrates an example of a virtual menu being displayed by an AR device consistent with disclosed embodiments.
0032 FIG. 5B illustrates an example of displaying a submenu by an AR device consistent with disclosed embodiments.
0033 FIG. 6 is a flowchart of an exemplary process for providing a virtual menu to a user by an AR device consistent with disclosed embodiments.
0034 FIG. 7A illustrates an example of a selectable element of a virtual menu being displayed by an AR device consistent with disclosed embodiments.
0035 FIG. 7B illustrates another example of a selectable element of a virtual menu being displayed by an AR device consistent with disclosed embodiments.
0036 FIG. 7C illustrates an example of expanding an element of a virtual menu being displayed by an AR device consistent with disclosed embodiments.
0037 FIG. 8 is a flowchart of an exemplary process for causing an action associated with an element of a virtual menu to be performed by an AR device consistent with disclosed embodiments.
0038 FIG. 9A illustrates an example of a nested menu that is displayed by an AR device consistent with disclosed embodiments.
0039 FIG. 9B illustrates another example of a nested menu that is displayed by an AR device consistent with disclosed embodiments.
0040 FIG. 10 illustrates an example of accessing a nested menu that is displayed by an AR device consistent with disclosed embodiments.
0041 FIG. 11 is a flowchart of an exemplary process for accessing a nested menu by an AR device consistent with disclosed embodiments.
0042 FIG. 12 illustrates an example of a user of a wearable AR device initiating communication with another user consistent with disclosed embodiments.
0043 FIG. 13 illustrates an example of a user of a wearable AR device initiating communication with a group of users consistent with disclosed embodiments.
0044 FIG. 14 is a flowchart of an exemplary process for initiating communication with a user of a wearable AR device consistent with disclosed embodiments.
0045 FIG. 15 illustrates an example of a user unlocking a wearable AR device consistent with disclosed embodiments.
0046 FIG. 16A illustrates an example of a display showing a wearable AR device in a locked state consistent with disclosed embodiments.
0047 FIG. 16B illustrates an example of a display showing a wearable AR device in an unlocked state consistent with disclosed embodiments.
0048 FIGS. 17A-17K illustrate another example of a display showing a wearable AR device changing from a locked to an unlocked state consistent with disclosed embodiments.
0049 FIG. 18 is a flowchart of an exemplary process for unlocking a wearable AR device consistent with disclosed embodiments.
0050 FIG. 19 illustrates an example of providing a user of an AR device with task-based instructions consistent with disclosed embodiments.
0051 FIG. 20 illustrates another example of providing a user of an AR device with task-based instructions consistent with disclosed embodiments.
0052 FIG. 21 is a flowchart of an exemplary process for providing task-based instructions via an AR device consistent with disclosed embodiments.
0053 FIG. 22 illustrates an example of a user wearing an AR device to control operation of an on-board component consistent with disclosed embodiments.
0054 FIG. 23A illustrates an example of a graphical icon associated with controlling an on-board component of an AR device consistent with disclosed embodiments.
0055 FIG. 23B illustrates an example of accessing a graphical icon associated with controlling an on-board component of an AR device consistent with disclosed embodiments.
0056 FIG. 23C illustrates an example of a menu of operations associated with an on-board camera associated with an AR device consistent with disclosed embodiments.
0057 FIG. 24 is a flowchart of an exemplary process for controlling an on-board component of an AR device consistent with disclosed embodiments.
0058 FIG. 25 illustrates an example environment for sharing information between users of AR devices consistent with disclosed embodiments.
0059 FIG. 26 is a flowchart of an exemplary process for sharing information between users of AR devices consistent with disclosed embodiments.
0060 FIG. 27 is a flowchart of an exemplary process for selecting designated AR devices to receive information consistent with disclosed embodiments.
DETAILED DESCRIPTION

0061 Mobile personal computing devices may include one or more portable displays used to overlay virtual objects with real world objects. Virtual content that relates to one or more real world objects (e.g., places, things, people, etc.) and that may be provided on a display may be referred to as Augmented Reality (AR) content. Such AR content may be provided on a display together with the real world objects to which the AR content relates. Further, the views of the real world objects on a display may correspond to computer-generated representations of those objects or, in some embodiments (e.g., where at least a portion of the display passes or channels light), may correspond to actual, non-computer-generated views of the objects.

0062 In some embodiments, a device may provide various menus from which a user may select. For example, in one embodiment, a wearable augmented reality device may provide a menu that appears to hover over a user's head and outside of the user's field of view when the user is looking at the horizon. To access the menu, the user looks up toward the spot where the user perceives the menu to be located. For example, in one embodiment, the wearable augmented reality device may provide a user interface that enables a user to select a menu item by looking at the menu item. In another embodiment, the wearable augmented reality device may provide a nested menu system that enables the user to look toward a menu shown on a display, select the menu, and expand the menu upon selection. In another embodiment, a system may provide the capability to monitor the progress of a task assigned to a particular user of a wearable augmented reality device. As steps in the task are completed, information rela
