(12) United States Patent
     Ross

(10) Patent No.: US 9,392,212 B1
(45) Date of Patent: Jul. 12, 2016

(54) SYSTEM AND METHOD FOR PRESENTING VIRTUAL REALITY CONTENT TO A USER

(71) Applicant: VISIONARY VR, INC., Los Angeles, CA (US)

(72) Inventor: Jonathan Michael Ross, Santa Monica, CA (US)

(73) Assignee: VISIONARY VR, INC., Los Angeles, CA (US)

(*) Notice: Subject to any disclaimer, the term of this patent is extended or adjusted under 35 U.S.C. 154(b) by 16 days.

(21) Appl. No.: 14/489,128

(22) Filed: Sep. 17, 2014

Related U.S. Application Data

(60) Provisional application No. 61/980,658, filed on Apr. 17, 2014.

(51) Int. Cl.
     H04N 9/47   (2006.01)
     H04N 5/93   (2006.01)
     H04N 5/74   (2006.01)
     G11B 27/00  (2006.01)

(52) U.S. Cl.
     CPC: H04N 5/9305 (2013.01); G11B 27/005 (2013.01); H04N 5/7491 (2013.01)

(58) Field of Classification Search
     CPC: G11B 27/105; G11B 27/005; H04N 13/0497; H04N 5/7491; H04N 5/9305; G06F 3/0346; G06T 19/006
     USPC: 386/230, 239, 240; 348/51, 53; 345/158, 633
     See application file for complete search history.

(56) References Cited

     U.S. PATENT DOCUMENTS

     2009/0256904 A1*  10/2009  Krill ............ G02B 27/0172 (348/47)
     2012/0133884 A1*   5/2012  Ishida ........... G02B 27/2264 (351/158)
     2012/0212414 A1     8/2012  Osterhout et al.
     2013/0050260 A1     2/2013  Reitan ........... G06F 3/011 (345/633)
     2014/0009476 A1     1/2014  Venkitaraman .... H04N 21/4126 (345/502)

     OTHER PUBLICATIONS

     Shubber, Kadhim, "Are you ready for the virtual reality revolution?" The Guardian, Aug. 2, 2014, 5 pages.

     * cited by examiner

Primary Examiner — Thai Tran
Assistant Examiner — Mishawn Hunter
(74) Attorney, Agent, or Firm — Sheppard Mullin Richter & Hampton LLP

(57) ABSTRACT

This disclosure describes a system configured to present primary and secondary, tertiary, etc., virtual reality content to a user. Primary virtual reality content may be displayed to a user, and, responsive to the user turning his view away from the primary virtual reality content, a sensory cue is provided to the user that indicates to the user that his view is no longer directed toward the primary virtual reality content, and secondary, tertiary, etc., virtual reality content may be displayed to the user. Primary virtual reality content may resume when the user returns his view to the primary virtual reality content. Primary virtual reality content may be adjusted based on a user's interaction with the secondary, tertiary, etc., virtual reality content. Secondary, tertiary, etc., virtual reality content may be adjusted based on a user's progression through the primary virtual reality content, or interaction with the primary virtual reality content.

23 Claims, 9 Drawing Sheets

[Representative drawing: First View Direction, First Field of View 400, Field Boundary 404, Second Field of View 402, Second View Direction 410]
[Drawing Sheet 1 of 9, FIG. 1: implementation of a system 10 configured to present virtual reality content to a user]
[Drawing Sheet 2 of 9, FIG. 2: virtual reality headset computing device worn by User 200]
[Drawing Sheet 3 of 9, FIG. 3: server communicating with a computing device 12 via a network; External Resources 300]
[Drawing Sheet 4 of 9: one of FIGS. 4A-4D (fields of view, field boundaries, and view directions)]
[Drawing Sheet 5 of 9: one of FIGS. 4A-4D (fields of view, field boundaries, and view directions)]
[Drawing Sheet 6 of 9: one of FIGS. 4A-4D (fields of view, field boundaries, and view directions)]
[Drawing Sheet 7 of 9: one of FIGS. 4A-4D (fields of view, field boundaries, and view directions)]
[Drawing Sheet 8 of 9, FIG. 5: content creation graphical user interface View 500 on User Interface 14, with Virtual Reality Content Field 502, Virtual Reality Content Relationship Field 504, Field of View Definition Field 506, Boundary Field 508, and Content Creation Preview Field 510]
[Drawing Sheet 9 of 9, FIG. 6: METHOD 600 flowchart: Generate output signals; Determine view direction; Present virtual reality content; Provide a sensory cue; Facilitate interaction with virtual reality content; Adjust the virtual reality content based on the interaction]
SYSTEM AND METHOD FOR PRESENTING VIRTUAL REALITY CONTENT TO A USER

FIELD OF THE DISCLOSURE

This disclosure relates to a system and method for presenting virtual reality content to a user. The system and method may be configured such that the virtual reality content is user configurable.

BACKGROUND
Virtual reality headset display devices are known. These devices visually simulate a user's physical presence in virtual spaces. Simulations typically include a 360° view of the user's surrounding virtual space such that the user may turn his head to view different portions of the surrounding space. Activity in the virtual space continues to progress regardless of which direction the user is facing, and the user does not receive any indication of his view direction.

Virtual reality presents a problem for storytellers. How does a storyteller maintain the attention of an audience member so he can be told a story if he can look wherever he wants and go wherever he wants in the virtual space? (Note that the term "he" is used generically throughout the application to indicate both male and female.) In traditional forms of storytelling there is typically a content narrative presented within the boundaries of a view screen. In a virtual space, there is a continuum. The audience member may simply wander and/or look away during a story, thus breaking a rhythm, pacing, and style of the narrative, and/or the narrative altogether, thus disrupting the intended flow of a story, information, emotion, etc., to the user, as intended by the storyteller. Storytellers have therefore resorted to gimmicks and tricks to re-direct the attention of the audience member (e.g., loud noises and/or characters telling and/or showing the audience member where to look), which does not allow for more engaged and/or focused storytelling, and becomes progressively more problematic and transparent the longer the story.

SUMMARY

The present system uses boundaries to divide a virtual space into areas where primary content, secondary content, tertiary content, etc., may be presented to the user. These boundaries may be indicated to a user (e.g., an audience member) via sensory cues so that the user understands what is considered on stage (e.g., primary content) and off stage (secondary, tertiary, etc., content), for example. In some implementations, the boundaries may indicate areas in which a passive narrative content experience is taking place versus areas where interactive content experiences may be available. This may facilitate maintaining a user's attention while narrative content and/or other content is presented to the user in the virtual space.

As such, one or more aspects of the disclosure relate to a system configured to present virtual reality content to a user. The system may include software components, hardware components, and/or other components operating together to cause the system to function as described herein. The system may be configured such that the virtual reality content is user configurable. For example, the system may enable a content creator, a storyteller, a filmmaker, a game maker, a game creator, and/or other content creators to maintain an audience member's (e.g., a user's) attention in virtual reality by determining how to deliver content (described below) to the audience member and enable the audience member to adjust their experience within parameters allowed by the creator. The virtual reality content may be presented to the user in a virtual space. The virtual reality content may include primary virtual reality content, secondary virtual reality content, tertiary virtual reality content (etc.), and/or other virtual reality content.

As used herein, "virtual reality" may refer to what is traditionally considered virtual reality as well as augmented reality and/or other similar concepts. In some implementations, "virtual reality" may refer to a form of virtual reality/augmented reality hybrid and/or include an aspect and/or ability to view content in an augmented reality way. For example, creators may generate traditional virtual reality content but use augmented reality cameras to keep the user's peripheral vision open so they can keep an eye on the physical world around them.

The terms "primary virtual reality content" and "secondary virtual reality content" used herein are not intended to be limiting. The system may include any number of different types of virtual reality content. "Primary" and "secondary" may be used generically throughout this disclosure to represent various different types of virtual reality content. The functionality described herein may be applied to any number of different types (e.g., primary, secondary, tertiary, etc.) of virtual reality content. In some implementations, secondary virtual reality content may refer to any virtual reality content that is not primary virtual reality content (e.g., secondary virtual reality content may include several different types of virtual reality content). In some implementations, the term virtual reality may include virtual reality as described herein, augmented reality, mixed reality, and/or other forms of virtual reality.

The system may be configured such that primary virtual reality content may be displayed to a user, and, responsive to the user turning his view away from the primary virtual reality content, moving away from the primary virtual reality content, and/or taking other actions, one or more sensory cues may be provided to the user that indicate to the user that his view is no longer directed toward the primary virtual reality content. At such times the primary virtual reality content may be paused and secondary, tertiary, etc., virtual reality content may be displayed to the user. For example, the system may be configured to determine that a view direction of the user has moved across a field boundary between a first field of view (where the primary virtual reality content is displayed) and a second field of view (e.g., where the secondary virtual reality content is displayed). The view direction of the user may refer to a direction toward which the user's gaze is directed, an orientation of the user's gaze (e.g., the user may tilt his head and/or lean over), a position of the user within the virtual space, and/or other directional and/or positional information (e.g., a user may move his position in the virtual space across a boundary). The primary virtual reality content may resume (e.g., automatically) when the user returns his view (e.g., and/or moves within the virtual space) to the primary virtual reality content first field of view. The primary virtual reality content subsequently displayed to the user may be adjusted based on a user's interaction with the secondary, tertiary, etc., virtual reality content and/or other information. These features are not limited to the primary virtual reality content. For example, the system may be configured such that secondary, tertiary, etc., virtual reality content may pause when the user looks and/or moves away and then resume when the user returns. The secondary, tertiary, etc., virtual reality content may be adjusted based on a user's progression through the other (e.g., primary) virtual reality content, interaction with the other (e.g., primary) virtual reality content, and/or other information.
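By way of a non-limiting illustration only (this sketch is not part of the original disclosure, and the angles and names in it are assumptions chosen for the example), the division of a virtual space into an on-stage field and an off-stage field by two yaw-angle boundaries might look like the following:

    # Minimal illustrative sketch (not from the patent): two field boundaries at
    # yaw 60 and 300 degrees split the space into a forward "on stage" field and
    # an "off stage" remainder. The specific angles are assumptions for this example.

    LEFT_BOUNDARY_DEG = 300.0   # boundary on the left side of the forward field
    RIGHT_BOUNDARY_DEG = 60.0   # boundary on the right side of the forward field

    def on_stage(yaw_deg: float) -> bool:
        """True if a view direction falls in the forward (primary-content) field."""
        yaw = yaw_deg % 360.0
        return yaw >= LEFT_BOUNDARY_DEG or yaw < RIGHT_BOUNDARY_DEG

    print(on_stage(15.0))    # True:  primary ("on stage") content is shown
    print(on_stage(180.0))   # False: secondary ("off stage") content is shown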
In some implementations, the system may be configured such that a user may create, customize, and/or adjust the virtual reality content, the fields of view, view boundaries, and/or other characteristics of the system. The system may be configured such that a user may set and/or adjust, and/or the system may be configured to automatically adjust, the fields of view, the field boundaries, relationships between the primary virtual reality content and the secondary, tertiary, etc., virtual reality content, types of interactions between the user and the primary virtual reality content and/or between the user and secondary, tertiary, etc., virtual reality content, and/or other content creation and/or adjustment activities. In some implementations, the system may be configured such that the creating, customizing, and/or adjusting enhances a user's physical comfort while viewing the virtual reality content. In some implementations, the system may be configured such that a user may create, customize, and/or adjust within previously determined parameters for creating, customizing, and/or adjusting (e.g., the previously determined parameters determined by a content creator at manufacture).
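By way of a non-limiting illustration (this sketch and its field names are assumptions, not part of the disclosure), the creator-adjustable settings described above might be grouped in a single configuration object:

    # Illustrative only: one way a content creator's adjustable settings could be grouped.
    # None of these names or values come from the patent; they are assumptions for the sketch.

    creation_settings = {
        "fields_of_view": {                      # angular sectors, degrees of yaw
            "primary":   {"start": 300.0, "end": 60.0},
            "secondary": {"start": 60.0,  "end": 300.0},
        },
        "field_boundaries": [60.0, 300.0],       # where sensory cues are triggered
        "content_relationships": {               # how content in one field affects another
            "secondary_unlocks_with_primary_progress": True,
            "primary_adjusts_to_secondary_interactions": True,
        },
        "allowed_interactions": {
            "primary":   ["watch"],              # e.g. a passive narrative experience
            "secondary": ["watch", "talk", "pick_up"],
        },
    }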
In some implementations, the system may comprise one or more of a sensor, a user interface, a processor, electronic storage, and/or other components.
The sensor may be configured to generate output signals conveying information related to a view direction of the user and/or other information. The view direction of the user may correspond to a physical direction toward which a gaze of the user is directed, an orientation of the user's body (e.g., the user may lean over) and/or a part of the user's body, a position of the user within the virtual space, and/or other directional information. For example, the view direction may include a first view direction that corresponds to a first physical direction toward which the gaze of the user is directed (e.g., the user may be looking in a forward direction) and a second view direction that corresponds to a second physical direction toward which the gaze of the user is directed (e.g., the user may turn around and look in a reverse direction).
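Purely as an illustrative sketch (the patent does not prescribe any particular data format), the directional and positional information conveyed by the sensor's output signals could be grouped as follows; the field names and units are assumptions:

    # Illustrative only: one possible grouping of the directional/positional
    # information described above. The patent does not define these names or units.
    from dataclasses import dataclass

    @dataclass
    class ViewSample:
        yaw_deg: float          # gaze direction, e.g. 0 = forward, 180 = turned around
        pitch_deg: float        # up/down tilt of the gaze
        head_tilt_deg: float    # orientation of the head (e.g. the user leaning over)
        position_xyz: tuple     # position of the user within the virtual space

    # e.g. a sample for a user who has turned around and stepped backward
    sample = ViewSample(yaw_deg=180.0, pitch_deg=-5.0, head_tilt_deg=0.0,
                        position_xyz=(0.0, 0.0, -1.2))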
The user interface may include a display and/or other components. The display may be configured to present the virtual reality content to the user. The display may be controlled by the processor to present the virtual reality content to the user such that the presented virtual reality content corresponds to a view direction of the user. In some implementations, the display may be included in a virtual reality headset worn by the user. It should be noted that the description of the display provided herein is not intended to be limiting. Rather, the description of the display is intended to include future evolutions of virtual reality display technology (which may not even be display based, for example). For example, the display may include cameras and/or systems for augmented reality and/or other augmented reality components, light field imaging devices that project an image onto the back of a user's retina (e.g., near-eye light field displays, etc.), virtual reality technology that utilizes contact lenses, virtual reality technology that communicates directly with the brain, and/or other display technology.
The processor may be configured to execute computer program components. The computer program components may be configured to enable an expert and/or user to interface with the system and/or provide other functionality attributed herein to the user interface, the sensor, the electronic storage, and/or the processor. The computer program components may include a direction component, a content component, a display component, an interaction component, and/or other components.
The direction component may be configured to determine the view direction of the user based on the output signals from the sensor and/or based on other information. The direction component may be configured to determine whether the view direction of the user falls within a field of view for displaying virtual reality content to the user. Fields of view may be predetermined at manufacture (e.g., during creation of software, at physical manufacture, etc.), determined and/or adjusted by a user via the system, determined by the system based on the virtual reality content presented to the user, and/or determined by other methods. In some implementations, the direction component may be configured such that the fields of view share field boundaries. The direction component may determine whether the view direction of the user moves across a given field boundary between two different fields of view.
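As a minimal illustrative sketch (assumed names and logic, not the patent's implementation), a direction component of this kind might map each view-direction sample to a field of view and report when consecutive samples fall in different fields, i.e., when the view direction has moved across a field boundary:

    # Illustrative sketch of a "direction component" (names and thresholds are assumptions):
    # maps a view direction to a field of view and reports boundary crossings between
    # consecutive sensor samples.

    class DirectionComponent:
        def __init__(self, fields):
            # fields: {name: (start_yaw_deg, end_yaw_deg)}; sectors may wrap past 360
            self.fields = fields
            self._last_field = None

        def field_for(self, yaw_deg):
            """Return the name of the field of view containing this view direction."""
            yaw = yaw_deg % 360.0
            for name, (start, end) in self.fields.items():
                inside = (start <= yaw < end) if start <= end else (yaw >= start or yaw < end)
                if inside:
                    return name
            return None

        def update(self, yaw_deg):
            """Return (current_field, crossed_boundary) for the newest sample."""
            field = self.field_for(yaw_deg)
            crossed = self._last_field is not None and field != self._last_field
            self._last_field = field
            return field, crossed

    direction = DirectionComponent({"first": (300.0, 60.0), "second": (60.0, 300.0)})
    print(direction.update(10.0))    # ('first', False): looking forward
    print(direction.update(90.0))    # ('second', True): view moved across the boundary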
The content component may be configured to cause the display to present the virtual reality content to the user based on the determined view direction, the fields of view, and/or other information. The content component may be configured to cause the display to present primary virtual reality content, secondary virtual reality content, tertiary virtual reality content, etc., and/or other virtual reality content to the user. The content component may be configured such that primary virtual reality content is presented to the user responsive to the user's view direction being within a first field of view (e.g., a forward-looking field of view), and secondary virtual reality content may be presented to the user when the user's view direction is within a second field of view (e.g., a rearward-looking field of view), and so on for tertiary, etc., virtual reality content.
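A minimal sketch of such a content component (assumed names; the pause/resume objects are stand-ins, not the patent's implementation) might present whichever content is assigned to the current field of view, pausing the content the user looks away from and resuming it when the user returns:

    # Illustrative sketch only: content selection keyed to the current field of view,
    # with pause/resume behavior. Names and structure are assumptions.

    class Clip:
        """Stand-in for a piece of virtual reality content that can pause and resume."""
        def __init__(self, name):
            self.name = name
            self.playing = False
        def pause(self):
            self.playing = False
        def resume(self):
            self.playing = True

    class ContentComponent:
        def __init__(self, content_by_field):
            # content_by_field: {"first": primary_clip, "second": secondary_clip, ...}
            self.content_by_field = content_by_field
            self.active = None

        def on_view_field(self, field_name):
            """Present the content assigned to the field the user is now looking into."""
            clip = self.content_by_field[field_name]
            if clip is not self.active:
                if self.active is not None:
                    self.active.pause()     # e.g. the primary movie pauses
                clip.resume()               # e.g. resumes where it previously paused
                self.active = clip
            return clip

    content = ContentComponent({"first": Clip("primary movie"),
                                "second": Clip("secondary scene")})
    print(content.on_view_field("first").name)   # primary movie plays
    print(content.on_view_field("second").name)  # primary pauses, secondary resumes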
The display component may be configured to cause the display and/or other components of the system to provide one or more sensory cues (e.g., visual, auditory, somatosensory, olfactory, etc.) to the user responsive to the view direction of the user changing between fields of view. In some implementations, the sensory cue(s) may comprise a pause in the primary virtual reality content, a visually perceptible darkening of the primary virtual reality content and a brightening of the secondary, tertiary, etc., virtual reality content, and/or other sensory cues.
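As an illustrative sketch only (the renderer interface below is an assumption, not an API described in the patent), a display component might provide the darkening/brightening cue by lowering the brightness of the field the user left and raising the brightness of the field the user entered:

    # Illustrative sketch: a visual sensory cue triggered when the view direction
    # changes fields. The renderer and its method are assumed stand-ins.

    def provide_visual_cue(renderer, old_field, new_field, dim_level=0.3):
        """Darken the field the user looked away from, brighten the field entered."""
        if old_field is not None and old_field != new_field:
            renderer.set_brightness(old_field, dim_level)   # e.g. primary movie darkens
            renderer.set_brightness(new_field, 1.0)         # e.g. secondary content brightens

    class FakeRenderer:
        def set_brightness(self, field, level):
            print(f"{field}: brightness -> {level}")

    provide_visual_cue(FakeRenderer(), old_field="first", new_field="second")
    # first: brightness -> 0.3
    # second: brightness -> 1.0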
The interaction component may be configured to facilitate interaction between the user and the primary virtual reality content, interaction between the user and the secondary virtual reality content, interaction between the user and tertiary virtual reality content, etc., and/or other interaction. In some implementations, the interaction component may be configured to adjust the primary virtual reality content presented to the user based on interaction between the user and the secondary, tertiary, etc., virtual reality content, and/or based on other information. In some implementations, the interaction component may be configured to adjust the secondary, tertiary, etc., virtual reality content presented to the user based on interaction between the user and the primary virtual reality content, a progression of the user through the primary virtual reality content, and/or other information.
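A minimal illustrative sketch of such an interaction component (assumed structure, not the claimed implementation) might record interactions with secondary content and use them to adjust the primary content, while using progression through the primary content to decide which secondary content to offer:

    # Illustrative sketch only: names, fields, and the unlock rule are assumptions.

    class InteractionComponent:
        def __init__(self):
            self.secondary_interactions = []   # e.g. characters the user interacted with
            self.primary_progress = 0.0        # e.g. minutes of the primary movie watched

        def record_secondary_interaction(self, what):
            self.secondary_interactions.append(what)

        def adjust_primary(self, primary_scene):
            # e.g. a character the user interacted with off stage appears in the movie
            primary_scene["extra_characters"] = list(self.secondary_interactions)
            return primary_scene

        def adjust_secondary(self, all_secondary_items):
            # e.g. only offer secondary content unlocked by progression through the movie
            return [item for item in all_secondary_items
                    if item["unlock_at"] <= self.primary_progress]

    ic = InteractionComponent()
    ic.primary_progress = 12.0
    ic.record_secondary_interaction("robot sidekick")
    print(ic.adjust_primary({"name": "scene 3"}))
    print(ic.adjust_secondary([{"name": "robot sidekick", "unlock_at": 5.0},
                               {"name": "villain", "unlock_at": 30.0}]))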
In some implementations, the interaction component may be configured to facilitate the creation, customization, and/or adjustment of the fields of view, the view boundaries, the virtual reality content, and/or other characteristics of the system. The interaction component may be configured to facilitate the creation, customization, and/or adjustment via the user interface and/or other components of the system. The interaction component may cause the user interface to present one or more views of a graphical user interface to the user to facilitate the creation, customization, and/or adjustment. The one or more views may include one or more fields and/or overlays (e.g., the customization views may appear as a separate field and/or as overlays on a particular field and/or in a particular scene in the virtual space) for creating, customizing, and/or adjusting the fields of view, the view boundaries, the virtual reality content, and/or other characteristics of the system. The one or more views may include one or more fields and/or overlays for previewing changes to the system before implementation.
These and other features, and characteristics of the present technology, as well as the methods of operation and functions of the related elements of structure and the combination of parts and economies of manufacture, will become more apparent upon consideration of the following description and the appended claims with reference to the accompanying drawings, all of which form a part of this specification, wherein like reference numerals designate corresponding parts in the various figures. It is to be expressly understood, however, that the drawings are for the purpose of illustration and description only and are not intended as a definition of the limits of the invention. As used in the specification and in the claims, the singular form of "a", "an", and "the" include plural referents unless the context clearly dictates otherwise.

BRIEF DESCRIPTION OF THE DRAWINGS

FIG. 1 illustrates an implementation of a system configured to present virtual reality content to a user.
FIG. 2 illustrates a virtual reality headset computing device.
FIG. 3 illustrates a server configured to communicate with a computing device via a network.
FIG. 4A illustrates a top view of individual fields of view that share a field boundary.
FIG. 4B illustrates a first example of a view of virtual reality content presented to a user on a display.
FIG. 4C illustrates a second example of a view of virtual reality content presented to a user on a display.
FIG. 4D illustrates multiple fields of view that a user may look toward and/or into.
FIG. 5 illustrates an example of a possible view of a content creation graphical user interface.
FIG. 6 illustrates an implementation of a method for presenting virtual reality content to a user.

DETAILED DESCRIPTION

The present system may facilitate storytelling using virtual reality via boundaries that divide a virtual space into areas where primary content, secondary content, tertiary content, etc., may be presented to the user. These boundaries may be indicated to a user in the virtual space via sensory cues so that the user understands and/or may become accustomed to what is considered on stage (e.g., primary content) and off stage (secondary, tertiary, etc., content), for example.

The present system may facilitate content governance by a content creator in one or more ways (e.g., defining the location of boundaries, creating virtual reality content that is presented to a user based on a user's virtual presence on one side of a boundary or another) that allow the content creator to direct a user's attention and/or point of view throughout a specific narrative, while allowing the user to pause and/or opt out and/or into other areas of the virtual space and/or narratives as defined by the boundaries, and also allowing the user to come back into and/or resume the original narrative. In some implementations, the present system may allow the content creator to maintain the attention of audience members (e.g., users) throughout the duration of a story and/or storytelling (e.g., during the entire length of a movie presented to a user in the virtual space).

It should be noted that the word "movie" as used herein is not intended to be limiting. It may refer to any passive and/or active narrative content and/or experience that system 10 is capable of displaying to a user.

For example, the present system may be configured such that the user's sense of boundaries becomes a thing that they may sense because they can hear/see/feel (described below) when they are approaching a boundary. Without the content creator having to tell and/or warn the user verbally (for example), the user may "sense" when they are going to cross a boundary, which gives the user an opportunity to choose whether or not they want to cross a particular boundary.

In some implementations, the present system may facilitate a content creator's ability to use classic storytelling techniques such as cuts and/or dissolves. In some implementations, the present system may facilitate a user's ability to choose between boundary-defined areas of content (e.g., some of which may be narrative experiences) without having to lose track of primary, secondary, tertiary, etc., virtual reality content presented on the other side of a boundary. For example, a user may enjoy multiple areas of different types of experiences (e.g., passive and/or interactive) within a virtual world, while having individual narratives pause when the user disengages to move to another experience and then resume when the user returns to the narrative. The present system may facilitate storytelling at a rhythm, pacing, and/or style intended by the content creator.

The terms "primary virtual reality content" and "secondary virtual reality content" used herein are not intended to be limiting. The system may include any number of different types of virtual reality content. "Primary" and "secondary" may be used generically throughout this disclosure to represent various different types of virtual reality content. The functionality described herein may be applied to any number of different types (e.g., primary, secondary, tertiary, etc.) of virtual reality content. In some implementations, secondary virtual reality content may refer to any virtual reality content that is not primary virtual reality content (e.g., secondary virtual reality content may include several different types of virtual reality content).

FIG. 1 illustrates an implementation of a system 10 configured to present virtual reality content to a user. The virtual reality content may be presented to the user in a virtual space. The virtual reality content may include primary virtual reality content, secondary virtual reality content, tertiary virtual reality content, etc., and/or other virtual reality content. System 10 may be configured such that primary virtual reality content may be displayed to a user, and, responsive to the user turning his view and/or otherwise moving away from the primary virtual reality content, one or more sensory cues may be provided to the user that indicate to the user that his view is no longer directed toward the primary virtual reality content. At such times the primary virtual reality content may be paused and secondary, tertiary, etc., and/or other virtual reality content may be displayed to the user. System 10 may be configured to determine that a view direction of the user has moved across a field boundary between a first field of view where the primary virtual reality content is displayed and a second field of view where, for example, secondary virtual reality content may be displayed. (This applies, for example, to any one of several boundaries that separate any two fields of view.) The view direction of the user may refer to a direction toward which the user's gaze is directed, an orientation of the user's gaze (e.g., the user may tilt his head and/or lean over), a position of the user within the virtual space, and/or other directional and/or positional information (e.g., a user may move his position in the virtual space across a boundary).
The primary virtual reality content may resume when the user returns his view to the primary virtual reality content first field of view. The primary virtual reality content subsequently displayed to the user may be adjusted based on a user's interaction with the secondary (tertiary, etc.) virtual reality content and/or other information. The secondary (tertiary, etc.) virtual reality content may be adjusted based on a user's progression through the primary virtual reality content, interaction with the primary virtual reality content, and/or other information.
By way of a non-limiting example, system 10 may be configured such that the virtual reality content is displayed to the user via a virtual reality headset. In this example, the primary virtual reality content may be a movie (2D and/or 3D displayed content, captured video, 2D and/or 3D generated content, digitally created characters, objects, and spaces, algorithmically created content), and/or any other primary virtual reality content displayed to the user. The movie may be displayed to the user while the user looks in a forward direction (e.g., the user's view direction is within the first field of view). Responsive to the user turning his view from the primary virtual reality content (e.g., left or right, up or down, and/or other directions) and/or otherwise moving away from the primary virtual reality content in the virtual space, a sensory cue comprising a visually perceptible darkening of the primary virtual reality content (e.g., a movie), a brightening of related (e.g., movie characters, objects seen in the movie, a setting of the movie, an extension of the setting of the movie, etc.) secondary, tertiary, etc., virtual reality content, a slowing to a stop of the primary virtual reality content (e.g., a slowing to a stop of the movie and/or the movie sound), and/or other cues (described below, for example) may be displayed to the user. The secondary, tertiary, etc., virtual reality content may be determined by system 10 based on the user's progression through the movie (e.g., as characters are introduced in the movie they may be added to the secondary, tertiary, etc., virtual reality content), based on a user's interaction with the movie (e.g., a user may repeatedly look over to a particular character in the movie which the user is then able to interact with as secondary, tertiary, etc., virtual reality content), and/or based on other information. The user may interact with the secondary, tertiary, etc., virtual reality content and then return his view back to the movie, wherein the movie has been adjusted to reflect the user's interaction with the secondary, tertiary, etc., virtual reality content. In some implementations, while a user views primary virtual reality content, system 10 may be configured to display a "pop-up" cue indicating that additional virtual reality content (e.g., additional characters) is available in a different (e.g., secondary, tertiary, etc.) content area.
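As a small illustrative sketch of the progression-based behavior described in this example (the character names, times, and function names are assumptions, not part of the disclosure), secondary content could become available as characters are introduced in the movie, with a pop-up cue raised whenever something new becomes available:

    # Illustrative only: secondary content unlocked by progression through the movie,
    # plus the characters that newly became available since the last check.

    INTRODUCTIONS = [(4.0, "captain"), (12.0, "robot sidekick")]  # (minute introduced, character)

    def available_secondary(minutes_watched):
        """Characters already introduced in the movie and offered as secondary content."""
        return [name for t, name in INTRODUCTIONS if t <= minutes_watched]

    def popup_cues(prev_minutes, now_minutes):
        """Characters that became available since the last check: pop-up cues to show."""
        before = set(available_secondary(prev_minutes))
        return [name for name in available_secondary(now_minutes) if name not in before]

    print(popup_cues(0.0, 5.0))    # ['captain']: show pop-up, captain now available off stage
    print(popup_cues(5.0, 15.0))   # ['robot sidekick']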
In some implementations, system 10 may be configured such that a user may create, customize, and/or adjust the virtual reality content, the fields of view, view boundaries, and/or other characteristics of system 10. System 10 may be configured such that a user may set and/or adjust the fields of view, the field boundaries, relationships between the primary virtual reality content and the secondary, tertiary, etc., virtual reality content, types of interactions between the user and the primary virtual reality content and/or between the user and secondary, tertiary, etc., virtual reality content, and/or other content creation and/or adjustment activities (described in greater detail below).

In some implementations, system 10 may comprise one or more of a user interface 14 (which may include a display 16 and/or other components as described herein), a sensor 18, a processor 20, electronic storage 30, and/or other components. In some implementations, one or more components of system