Invalidity of U.S. Patent No. 10,406,432
in View of
U.S. Pat. Pub. No. 20150153913 to Ballard (“Ballard”), published June 4, 2015

1a

U.S. Patent No. 10,406,432: 1. A computer program product embodied on a non-transitory computer-readable medium, comprising code executable by a virtual image display apparatus having at least a processor and a memory, the memory being configured to store an information providing condition of the virtual image display apparatus and being further configured to store to-be-provided information, to cause the virtual image display apparatus to carry out the following steps:

Ballard

Ballard expressly or inherently discloses these claim element(s).
[A computer program product embodied on a non-transitory computer-readable medium, comprising code executable by a virtual image display apparatus having at least a processor and a memory,]

“The present disclosure relates generally to an augmented reality device and, more particularly, to methods and systems for representing and interacting with augmented reality content using the augmented reality device.” [0002]

“Consistent with other disclosed embodiments, non-transitory computer-readable storage media may store program instructions, which are executed by at least one processor and perform any of the methods described herein.” [0024]

“FIG. 1 illustrates an exemplary system 100 for implementing the disclosed embodiments. In one aspect, system 100 may include a server system 110, a user system 120, and network 130. It should be noted that although a single user system 120 is shown in FIG. 1, more than one user system 120 may exist in system 100. Furthermore, although a single server system 110 is shown in FIG. 1, more than one server system 110 may exist in system 100.” [0066]

“User system 120 may include a system associated with a user (e.g., a consumer, field technician, equipment operator, or any other individual that may benefit from received AR content) that is configured to perform one or more operations consistent with the disclosed embodiments. In one embodiment, a user may operate user system 120 to perform one or more such operations. User system 120 may include a communication interface 121, a processor device 123, a memory 124, a sensor array 125, a display 122, and/or any other components that may facilitate the display of AR content to the user. The processor device 123 may be configured to execute software instructions to perform aspects of the disclosed embodiments. User system 120 may be configured in the form of an AR device, such as a head mounted display (HMD). Although in the present disclosure user system 120 is described in connection with a HMD, user system 120 may include tablet devices, mobile phone(s), laptop computers, a wearable device, such as a smart watch, and any other computing device(s) known to those skilled in the art.” [0069]
Accordingly, in view of the above, the Ballard reference anticipates the claim element of A computer program product embodied on a non-transitory computer-readable medium (e.g., non-transitory computer-readable storage media may store program instructions), comprising code executable by a virtual image display apparatus (e.g., the augmented reality device) having at least a processor and a memory (e.g., the processor device 123 and storage media).
[the memory being configured to store an information providing condition of the virtual image display apparatus and being further configured to store to-be-provided information, to cause the virtual image display apparatus to carry out the following steps:]

“In some embodiments, a memory of AR device 200 (e.g., positional processing module 378) may be configured to store instructions that when executed by a processing device (e.g., microprocessor 208) of AR device 200, determine the viewing direction of user 1201 (e.g., the orientation of the head of user 1201 and/or the gaze direction of the eyes of user 1201) based on output from the one or more sensors. The processing device may be further configured to execute instructions to initiate a communication link between AR device 200 and another device (e.g., another AR device), based on the determined viewing direction of user 1201.” [0203]

“The predetermined horizontal threshold may be pre-configured by user 401 through a user interface of AR device 200 or be pre-set based on a default setting of AR device 200. For example, display 204 may display a menu with different values of the predetermined horizontal threshold to enable user 401 to make a selection. As another example, display 204 may display a field that enables user 401 to input a desirable value of the predetermined horizontal threshold. The predetermined horizontal threshold may be set in units of degrees, radians, or any other units of angular measurement. As an example, the predetermined horizontal threshold may be set as 20, 30, or 60 degrees or more according to a preference of user 401.” [0112]
“Consistent with a disclosed embodiment, a wearable device provides a virtual menu to a user. The wearable device may include a display; at least one sensor configured to provide an output indicative of an orientation of a head of the user; and at least one processing device. The at least one processing device may be configured to monitor an orientation of the head of the user based on the output of the at least one sensor; determine based on the monitored orientation of the head whether the user is looking upward or downward with respect to a predetermined horizontal threshold; and cause the virtual menu to be shown on the display if the user is determined to be looking upward or downward with respect to the predetermined horizontal threshold.” [0006]

Accordingly, in view of the above, the Ballard reference anticipates the claim element of the memory (e.g., the memory of the AR device) being configured to store an information providing condition of the virtual image display apparatus (e.g., the instructions that determine the viewing direction of the user and whether it is within the predetermined horizontal threshold) and being further configured to store to-be-provided information (e.g., the virtual menu that is caused to display), to cause the virtual image display apparatus to carry out the following steps.
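To make this mapping concrete, the following is a minimal Python sketch of a memory that holds both the information providing condition (a horizontal gaze-angle threshold, per [0112]) and the to-be-provided information (the virtual menu of [0006]). The class name, field names, and default values are illustrative assumptions, not Ballard's implementation:

```python
# Illustrative sketch only: the AR device's memory stores the information
# providing condition (a gaze-angle threshold, cf. [0112]) together with
# the to-be-provided information (the virtual menu, cf. [0006]).
from dataclasses import dataclass, field

@dataclass
class DeviceMemory:
    # Information providing condition: show the menu when the user's gaze
    # pitch moves beyond this many degrees above or below the horizon.
    horizontal_threshold_deg: float = 30.0
    # To-be-provided information: the virtual menu items to be displayed.
    menu_items: list = field(default_factory=lambda: ["Tasks", "Messages", "Settings"])

memory = DeviceMemory()
print(memory.horizontal_threshold_deg, memory.menu_items)
```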
1b

U.S. Patent No. 10,406,432: detecting, with a sensor operationally linked to the virtual image display apparatus, a movement of a body part of a player, the body part comprising at least one of a head of the player and an eye of the player, and the sensor being at least one of the set of: a gyro sensor configured to measure movement of the head of the player, an acceleration sensor configured to measure movement of the head of the player, a geomagnetic sensor configured to measure movement of the head of the player and a line-of-sight sensor configured to measure movement of the eye of the player; and

Ballard

Ballard expressly or inherently discloses these claim element(s).

[detecting, with a sensor operationally linked to the virtual image display apparatus, a movement of a body part of a player, the body part comprising at least one of a head of the player and an eye of the player,]
“At step 1110, AR device 200 may monitor a viewing direction of the user (e.g., an orientation of the head of the user and/or an orientation of the gaze direction of the user's eyes) based on the output of the at least one sensor associated with the AR device. For example, the processing device (e.g., microprocessor 208) of AR device 200 may execute instructions of positional processing module 378, discussed above in relation to FIG. 3, to perform this step. The sensor may be included in AR device 200 and be configured to provide an output indicative of the orientation of the user's head. For example, the sensor may be configured to provide an output indicative of the viewing direction of the user by tracking a pattern of movement of an orientation of the head of the user. As another example, the sensor may be configured to provide an output indicative of the viewing direction of the user by tracking a gaze of the user's eyes.” [0193]

Accordingly, in view of the above, the Ballard reference anticipates the claim element of detecting, with a sensor (e.g., the at least one sensor associated with the AR device) operationally linked (e.g., associated with the AR device) to the virtual image display apparatus (e.g., the AR device), a movement of a body part of a player (e.g., gaze direction of the user’s eyes), the body part comprising at least one of a head of the player and an eye of the player (e.g., the orientation of the head of the user).
[and the sensor being at least one of the set of: a gyro sensor configured to measure movement of the head of the player, an acceleration sensor configured to measure movement of the head of the player, a geomagnetic sensor configured to measure movement of the head of the player and a line-of-sight sensor configured to measure movement of the eye of the player;]

“As described above in connection with FIG. 4, the processing device of AR device 200 may be configured to monitor a viewing direction of user 401 (e.g., an orientation of the head of user 401 and/or an orientation of the gaze of user 401) based on output received from the one or more sensors. Such sensors may include, for example, one or more components associated with IMU 201 and/or sensor array 125. Such sensors may also include one or more accelerometers, gyroscopes, magnetometers, eye tracking sensors, etc. as discussed in detail above. For example, the detected orientation of the head of user 401 may include an angle of the user's head formed with respect to the horizontal plane, which is associated with upward or downward movement of the head of the user, along with a direction of the user's head in the horizontal plane, which may be associated with left or right movement of the head of the user. For example, the one or more sensors of AR device 200 may output three-dimensional coordinates of multiple points of AR device 200 to the processing device, and the processing device may determine the angle of the user's head with respect to the horizontal plane and the direction of the user's head within the horizontal plane based on the received coordinates.” [0135]

Accordingly, in view of the above, the Ballard reference anticipates the claim element of and the sensor being at least one of the set of: a gyro sensor configured to measure movement of the head of the player, an acceleration sensor configured to measure movement of the head of the player, a geomagnetic sensor configured to measure movement of the head of the player and a line-of-sight sensor configured to measure movement of the eye of the player (e.g., sensors may include IMU 201, sensor array 125, accelerometers, gyroscopes, magnetometers, eye tracking sensors).
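The coordinate-based computation described in [0135] can be illustrated with a short sketch. The two-point device model and the function name are assumptions made for illustration, not Ballard's code:

```python
# Sketch of [0135]-style processing: given 3-D coordinates of two points on
# the AR device, derive the head's angle with respect to the horizontal plane
# (pitch) and its direction within that plane (yaw). Hypothetical code.
import math

def head_angles(rear, front):
    dx, dy, dz = (f - r for f, r in zip(front, rear))
    horizontal = math.hypot(dx, dy)                    # projection onto the horizontal plane
    pitch = math.degrees(math.atan2(dz, horizontal))   # upward/downward angle vs. the horizon
    yaw = math.degrees(math.atan2(dy, dx))             # left/right direction in the plane
    return pitch, yaw

# Device tilted upward: the front point sits higher than the rear point.
print(head_angles((0.0, 0.0, 0.0), (1.0, 0.0, 0.5)))   # pitch ~26.6 degrees, yaw 0.0
```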
1c

U.S. Patent No. 10,406,432: determining, based on the movement of the body part of the player, a position and direction of the body part of a player;

Ballard

Ballard expressly or inherently discloses these claim element(s).
“At step 1110, AR device 200 may monitor a viewing direction of the user (e.g., an orientation of the head of the user and/or an orientation of the gaze direction of the user's eyes) based on the output of the at least one sensor associated with the AR device. For example, the processing device (e.g., microprocessor 208) of AR device 200 may execute instructions of positional processing module 378, discussed above in relation to FIG. 3, to perform this step. The sensor may be included in AR device 200 and be configured to provide an output indicative of the orientation of the user's head. For example, the sensor may be configured to provide an output indicative of the viewing direction of the user by tracking a pattern of movement of an orientation of the head of the user. As another example, the sensor may be configured to provide an output indicative of the viewing direction of the user by tracking a gaze of the user's eyes.” [0193]

Accordingly, in view of the above, the Ballard reference anticipates the claim element of determining (e.g., using the output of the at least one sensor), based on the movement of the body part of the player (e.g., orientation of the head of the user), a position (e.g., orientation of the head) and direction of the body part of a player (e.g., gaze direction of the eyes).
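A minimal sketch of the determining step of [0193], assuming hypothetical sensor objects that each report a (pitch, yaw) pair; combining head orientation with eye gaze by simple addition is an illustrative simplification, not Ballard's disclosed method:

```python
# Sketch of step 1110 ([0193]): determine the viewing direction from head
# orientation and, when available, eye gaze. Sensor interfaces are assumed.
def viewing_direction(head_sensor, eye_sensor=None):
    pitch, yaw = head_sensor.read()          # position/orientation of the head
    if eye_sensor is not None:               # refine with the gaze of the user's eyes
        gaze_pitch, gaze_yaw = eye_sensor.read()
        pitch, yaw = pitch + gaze_pitch, yaw + gaze_yaw
    return pitch, yaw                        # (position, direction) of the body part
```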
1d

U.S. Patent No. 10,406,432: displaying, on a display operationally linked to the virtual image display apparatus, in accordance with the position and direction of the body part of the player, an image of a virtual space including a first area and a second area; and

Ballard

[displaying, on a display operationally linked to the virtual image display apparatus, in accordance with the position and direction of the body part of the player,]
“Consistent with a disclosed embodiment, a wearable device provides a virtual menu to a user. The wearable device may include a display; at least one sensor configured to provide an output indicative of an orientation of a head of the user; and at least one processing device. The at least one processing device may be configured to monitor an orientation of the head of the user based on the output of the at least one sensor; determine based on the monitored orientation of the head whether the user is looking upward or downward with respect to a predetermined horizontal threshold; and cause the virtual menu to be shown on the display if the user is determined to be looking upward or downward with respect to the predetermined horizontal threshold.” [0006]

Accordingly, in view of the above, the Ballard reference anticipates the claim element of displaying, on a display (e.g., the display of the wearable device) operationally linked to the virtual image display apparatus (e.g., the wearable AR device), in accordance with the position and direction of the body part of the player (e.g., the virtual menu is shown on the display based on the monitored orientation of the head of the user).
[an image of a virtual space including a first area and a second area;]

“Consistent with a disclosed embodiment, a wearable device provides a virtual menu to a user. The wearable device may include a display; at least one sensor configured to provide an output indicative of an orientation of a head of the user; and at least one processing device. The at least one processing device may be configured to monitor an orientation of the head of the user based on the output of the at least one sensor; determine based on the monitored orientation of the head whether the user is looking upward or downward with respect to a predetermined horizontal threshold; and cause the virtual menu to be shown on the display if the user is determined to be looking upward or downward with respect to the predetermined horizontal threshold.” [0006]

“In some embodiments, a device may provide various menus from which a user may select. For example, in one embodiment, a wearable augmented reality device may provide a menu that appears to hover over a user's head and outside of the user's field of view when the user is looking at the horizon. To access the menu, the user looks up toward the spot where the user perceives the menu to be located. For example, in one embodiment, the wearable augmented reality device may provide a user interface that enables a user to select a menu item by looking at the menu item. In another embodiment, the wearable augmented reality device may provide a nested menu system that enables the user to look toward a menu shown on a display, select the menu, and expand the menu upon selection. In another embodiment, a system may provide the capability to monitor the progress of a task assigned to a particular user of a wearable augmented reality device. As steps in the task are completed, information relating to the next steps is passed to the user.” [0062]

“The predetermined horizontal threshold may be pre-configured by user 401 through a user interface of AR device 200 or be pre-set based on a default setting of AR device 200. For example, display 204 may display a menu with different values of the predetermined horizontal threshold to enable user 401 to make a selection. As another example, display 204 may display a field that enables user 401 to input a desirable value of the predetermined horizontal threshold. The predetermined horizontal threshold may be set in units of degrees, radians, or any other units of angular measurement. As an example, the predetermined horizontal threshold may be set as 20, 30, or 60 degrees or more according to a preference of user 401.” [0112]
[FIG. 4]

Here, the display in Ballard reads upon the image of the virtual space of the ‘432 claim. The first area of the claim may be an area of the display in Ballard which the user views when the user’s gaze is within the predetermined horizontal threshold. The second area of the ‘432 claim may be an area of the display in Ballard which the user views when the user’s gaze exceeds the predetermined horizontal threshold, and is also the spot where the user perceives the virtual menu to be located.

Accordingly, in view of the above, the Ballard reference anticipates the claim element of an image of a virtual space (e.g., image shown on the display) including a first area (e.g., the area of the display viewed by the user when the user’s gaze angle is within the predetermined horizontal threshold, e.g., angle alpha in FIG. 4) and a second area (e.g., the area of the display viewed by the user when the user’s gaze angle exceeds the predetermined horizontal threshold, e.g., angle alpha in FIG. 4; this is the spot where the user perceives the menu to be located).
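On this reading, the two areas reduce to a comparison of the gaze pitch against angle alpha. A minimal sketch, where the threshold value and area labels are assumptions for illustration:

```python
# Sketch of the first/second area mapping: the first area is what the user
# views while the gaze angle stays within the predetermined horizontal
# threshold (angle alpha in FIG. 4); the second area is where the menu is
# perceived once that threshold is exceeded. Hypothetical names/values.
def area_for_gaze(pitch_deg, threshold_deg=30.0):
    return "second" if abs(pitch_deg) >= threshold_deg else "first"

assert area_for_gaze(5.0) == "first"     # looking toward the horizon
assert area_for_gaze(45.0) == "second"   # looking up toward the perceived menu
```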
1e

U.S. Patent No. 10,406,432: with the virtual image display apparatus, providing, when the information providing condition is satisfied, the to-be-provided information to the player by displaying the to-be-provided information in the second area;

Ballard

Ballard expressly or inherently discloses these claim element(s).
“Consistent with a disclosed embodiment, a wearable device provides a virtual menu to a user. The wearable device may include a display; at least one sensor configured to provide an output indicative of an orientation of a head of the user; and at least one processing device. The at least one processing device may be configured to monitor an orientation of the head of the user based on the output of the at least one sensor; determine based on the monitored orientation of the head whether the user is looking upward or downward with respect to a predetermined horizontal threshold; and cause the virtual menu to be shown on the display if the user is determined to be looking upward or downward with respect to the predetermined horizontal threshold.” [0006]

“In some embodiments, a device may provide various menus from which a user may select. For example, in one embodiment, a wearable augmented reality device may provide a menu that appears to hover over a user's head and outside of the user's field of view when the user is looking at the horizon. To access the menu, the user looks up toward the spot where the user perceives the menu to be located. For example, in one embodiment, the wearable augmented reality device may provide a user interface that enables a user to select a menu item by looking at the menu item. In another embodiment, the wearable augmented reality device may provide a nested menu system that enables the user to look toward a menu shown on a display, select the menu, and expand the menu upon selection. In another embodiment, a system may provide the capability to monitor the progress of a task assigned to a particular user of a wearable augmented reality device. As steps in the task are completed, information relating to the next steps is passed to the user.” [0062]
As the angle of the user’s gaze exceeds the predetermined horizontal threshold in Ballard, such that the user’s gaze is in the second area, new information in the form of the virtual menu is to be shown to the user in the second area. This virtual menu corresponds to the to-be-provided information of the ‘432 claims.
Accordingly, in view of the above, the Ballard reference anticipates the claim element of with the virtual image display apparatus, providing (e.g., cause the virtual menu to be shown), when the information providing condition is satisfied (e.g., when the user’s gaze angle exceeds the predetermined horizontal threshold), the to-be-provided information (e.g., the virtual menu) to the player by displaying the to-be-provided information in the second area (e.g., in the area/spot where the user perceives the menu to be located).
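The providing step can be sketched as a display-update rule keyed to the same threshold. The function name and return convention are assumptions; in Ballard the menu is simply caused to be shown on the display:

```python
# Sketch of the providing step: when the information providing condition is
# satisfied (gaze angle beyond the threshold), the to-be-provided information
# (the virtual menu) is displayed in the second area. Hypothetical code.
def update_display(pitch_deg, threshold_deg, menu_items):
    if abs(pitch_deg) >= threshold_deg:   # information providing condition satisfied
        return menu_items                  # menu shown in the second area
    return None                            # gaze in the first area: no menu shown
```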
1f

U.S. Patent No. 10,406,432: wherein the information providing condition is a condition of a gaze position moving to the second area from the first area, the gaze position being specified by at least one of the body part of the player being in a specified position or the direction of the body part of the player being at least a specified direction.

Ballard

Ballard expressly or inherently discloses these claim element(s).
“Consistent with a disclosed embodiment, a wearable device provides a virtual menu to a user. The wearable device may include a display; at least one sensor configured to provide an output indicative of an orientation of a head of the user; and at least one processing device. The at least one processing device may be configured to monitor an orientation of the head of the user based on the output of the at least one sensor; determine based on the monitored orientation of the head whether the user is looking upward or downward with respect to a predetermined horizontal threshold; and cause the virtual menu to be shown on the display if the user is determined to be looking upward or downward with respect to the predetermined horizontal threshold.” [0006]

“In some embodiments, a device may provide various menus from which a user may select. For example, in one embodiment, a wearable augmented reality device may provide a menu that appears to hover over a user's head and outside of the user's field of view when the user is looking at the horizon. To access the menu, the user looks up toward the spot where the user perceives the menu to be located. For example, in one embodiment, the wearable augmented reality device may provide a user interface that enables a user to select a menu item by looking at the menu item. In another embodiment, the wearable augmented reality device may provide a nested menu system that enables the user to look toward a menu shown on a display, select the menu, and expand the menu upon selection. In another embodiment, a system may provide the capability to monitor the progress of a task assigned to a particular user of a wearable augmented reality device. As steps in the task are completed, information relating to the next steps is passed to the user.” [0062]

“The predetermined horizontal threshold may be pre-configured by user 401 through a user interface of AR device 200 or be pre-set based on a default setting of AR device 200. For example, display 204 may display a menu with different values of the predetermined horizontal threshold to enable user 401 to make a selection. As another example, display 204 may display a field that enables user 401 to input a desirable value of the predetermined horizontal threshold. The predetermined horizontal threshold may be set in units of degrees, radians, or any other units of angular measurement. As an example, the predetermined horizontal threshold may be set as 20, 30, or 60 degrees or more according to a preference of user 401.” [0112]
As noted above, in Ballard, the first area may be the area of the display where the user’s gaze angle is within the predetermined horizontal threshold, and the second area is the area outside this first region, which is also the spot where the user perceives the virtual menu to be located. Ballard also discloses that the menu is shown when the user’s gaze moves to this second area where the user perceives the virtual menu to be. This is the information providing condition, i.e., the movement of the user’s gaze from the area within the predetermined horizontal threshold to the area where the virtual menu would be perceived.

Accordingly, in view of the above, the Ballard reference anticipates the claim element of wherein the information providing condition is a condition of a gaze position moving to the second area from the first area (e.g., moving from an area of the display where the user’s gaze is within the predetermined horizontal threshold to an area where the user’s gaze exceeds this threshold), the gaze position (e.g., the monitored orientation of the head that determines where the user is looking) being specified by at least one of the body part of the player (e.g., the user’s head) being in a specified position (e.g., where the user is looking) or the direction of the body part of the player (e.g., where the user is looking) being at least a specified direction (e.g., looking up or down).
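Because the claim recites the gaze position moving to the second area from the first, the condition is naturally modeled as a transition rather than a state. A minimal sketch under that assumption (names and values hypothetical):

```python
# Sketch of the information providing condition as a transition: the trigger
# fires when the gaze crosses from the first area into the second, not merely
# while the gaze already rests in the second area. Hypothetical code.
def moved_to_second_area(prev_pitch_deg, pitch_deg, threshold_deg=30.0):
    was_in_first = abs(prev_pitch_deg) < threshold_deg
    now_in_second = abs(pitch_deg) >= threshold_deg
    return was_in_first and now_in_second

assert moved_to_second_area(10.0, 35.0)        # gaze moved up past the threshold
assert not moved_to_second_area(40.0, 45.0)    # gaze was already in the second area
```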
2a

U.S. Patent No. 10,406,432: 2. The computer program product of claim 1, wherein

Ballard

Ballard expressly or inherently discloses these claim element(s).

“Consistent with another disclosed embodiment, a wearable device provides a virtual menu to a user. The wearable device may include a display; at least one sensor configured to provide an output indicative of a viewing direction of the user; and at least one processing device. The at least one processing device may be configured to cause a virtual menu to be shown on the display, the virtual menu including at least one selectable element; monitor a viewing direction of the user based on the output of the at least one sensor; determine, based on the monitored viewing direction, whether the user is looking in a direction of the at least one selectable element of the virtual menu; determine an amount of time that the user looks in the direction of the at least one selectable element of the virtual menu; and cause at least one action associated with the at least one selectable element of the virtual menu if the amount of time exceeds a predetermined dwell time threshold.” [0008]

Here, Ballard discloses a method of selecting an item by looking at that item for a predetermined dwell time. In order to trigger the menu item, instructions should be stored in memory such that the system knows to trigger the menu item when the user looks at this area.

Accordingly, in view of the above, the Ballard reference anticipates the claim element of the virtual space includes a target object (e.g., menu items) selectable by a gaze of the player (e.g., selecting a menu item by looking at it for a predetermined dwell time), wherein an area including the target object (e.g., the trigger area/look direction of the menu item) is recorded in the memory (e.g., storage).
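The dwell-time selection of [0008] can be sketched as a timer that resets whenever the gaze leaves the selectable element. The polling interval, threshold value, and callback interfaces are assumptions for illustration:

```python
# Sketch of [0008]-style dwell-time selection: an action associated with a
# selectable menu element fires once the user has looked at it for longer
# than a predetermined dwell time threshold. Hypothetical interfaces.
import time

def dwell_select(is_looking_at_element, on_select, dwell_threshold_s=1.5):
    gaze_start = None
    while True:
        if is_looking_at_element():                      # gaze on the element?
            if gaze_start is None:
                gaze_start = time.monotonic()            # start the dwell timer
            if time.monotonic() - gaze_start >= dwell_threshold_s:
                on_select()                              # dwell threshold exceeded
                gaze_start = None                        # re-arm after selection
        else:
            gaze_start = None                            # gaze left: reset the timer
        time.sleep(0.05)                                 # poll at roughly 20 Hz
```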
2b

U.S. Patent No. 10,406,432: the virtual space includes a target object selectable by a gaze of the player, wherein an area including the target object is recorded in the memory, wherein the information providing condition includes a condition wherein a gaze position of the player, the gaze position being identified from the position and the direction of the body part of the player, becomes directed outside the area, and further comprising providing, with the virtual image display apparatus, the to-be-provided information to the player when the gaze position of the player becomes directed outside the area.

Ballard

Ballard expressly or inherently discloses these claim element(s).

“Consistent with a disclosed embodiment, a wearable device provides a virtual menu to a user. The wearable device may include a display; at least one sensor configured to provide an output indicative of an orientation of a head of the user; and at least one processing device. The at least one processing device may be configured to monitor an orientation of the head of the user based on the output of the at least one sensor; determine based on the monitored orientation of the head whether the user is looking upward or downward with respect to a predetermined horizontal threshold; and cause the virtual menu to be shown on the display if the user is determined to be looking upward or downward with respect to the predetermined horizontal threshold.” [0006]

“In some embodiments, a device may provide various menus from which a user may select. For example, in one embodiment, a wearable augmented reality device may provide a menu that appears to hover over a user's head and outside of the user's field of view when the user is looking at the horizon. To access the menu, the user looks up toward the spot where the user perceives the menu to be located. For example, in one embodiment, the wearable augmented reality device may provide a user interface that enables a user to select a menu item by looking at the menu item. In another embodiment, the wearable augmented reality device may provide a nested menu system that enables the user to look toward a menu shown on a display, select the menu, and expand the menu upon selection. In another embodiment, a system may provide the capability to monitor the progress of a task assigned to a particular user of a wearable augmented reality device. As steps in the task are completed, information relating to the next steps is passed to the user.” [0062]

“…if the detected viewing direction of the user changes from the previously detected orientation for equal to or more than the predetermined orientation threshold, the processing device may determine that the user is not looking into the direction of the location of the nested menu anymore and may not cause the menu to be expanded.” [0171]

“The predetermined horizontal threshold may be pre-configured by user 401 through a user interface of AR device 200 or be pre-set based on a default setting of AR device 200. For example, display 204 may display a menu with different values of the predetermined horizontal threshold to enable user 401 to make a selection. As another example, display 204 may display a field that enables user 401 to input a desirable value of the predetermined horizontal threshold. The predetermined horizontal threshold may be set in units of degrees, radians, or any other units of angular measurement. As an example, the predetermined horizontal threshold may be set as 20, 30, or 60 degrees or more according to a preference of user 401.” [0112]

Here, Ballard discloses that when the user is looking at the horizon, and thus not at the selectable menu item or target object, a menu indicator appears. Thus, the area where the menu indicator appears is the area of the target object/menu item, and the area of the horizon, i.e., the a
