`Bar-Nahum
`
US006496183B1
`(10) Patent No.:
`US 6,496,183 B1
`(45) Date of Patent:
`Dec. 17, 2002
`
(54) FILTER FOR TRANSFORMING 3D DATA IN
`A HARDWARE ACCELERATED RENDERING
`ARCHITECTURE
`
`5,867,210 A 2/1999 Rod ............................ 34.8/51
`5,929,859 A * 7/1999 Meijers ...................... 345/419
`5,949,420 A * 9/1999 Terlutter ..................... 345/419
`
`(75) Inventor: Guy Bar-Nahum, Palo Alto, CA (US)
`(73) Assignee: Koninklijke Philips Electronics N.V.,
`Eindhoven (NL)
(*) Notice: Subject to any disclaimer, the term of this patent is extended or adjusted under 35 U.S.C. 154(b) by 0 days.
`
`(21) Appl. No.: 09/107,918
(22) Filed: Jun. 30, 1998
`(51) Int. Cl. ................................................ G06T 15/00
`(52) U.S. Cl. ....................................................... 345/419
(58) Field of Search ................................. 345/419, 427, 302, 421, 426, 619, 220, 621
`
`(56)
`
`References Cited
`U.S. PATENT DOCUMENTS
`5,594,843 A * 1/1997 O'Neill ...................... 345/427
`
* cited by examiner
`
`Primary Examiner Phu K. Nguyen
`(74) Attorney, Agent, or Firm Michael Schmitt
`(57)
`ABSTRACT
`
`A method and apparatus for generating multiple views for
`graphics objects utilizes a filter for transforming the 3D
`content of the data. The filter is configured to receive
`function calls from a 3D rendering engine and generate
`multiple viewpoint data based on the 3D content of the
graphics object. The filter then transmits the viewpoint data
`to a display driver for display to a display device.
`
`28 Claims, 8 Drawing Sheets
`
[Front-page representative drawing: block diagram of computer system 100. A 3D application (independent software vendor component) calls the OS 3D rendering engine (OS vendor component) through the 3D rendering API; a stereoscopic filter (3rd party component) sits behind the 3D engine interface (OS vendor API); a 3D hardware acceleration driver (OEM component) drives the 3D accelerator (OEM hardware, with frame buffer and DAC 32); the stereoscopic display 34 (3rd party hardware) receives an industry-standard analog video signal carrying left and right views in a 3rd party frame buffer format.]
`
`IPR2018-01045
`Sony EX1005 Page 1
`
`
`
[Sheet 1 of 8: FIG. 1 (PRIOR ART): block diagram of a conventional computer display system using software rendering: 3D application 16 (independent software vendor component), OS 3D rendering engine (OS vendor component) with OS-vendor-defined 3D and 2D structures, 2D display driver (OEM component) reached through the 2D driver interface, 2D display adapter (OEM hardware), and monitor 24 receiving an industry-standard analog video signal.]
`
`
`
[Sheet 2 of 8: FIG. 2: block diagram of an exemplary computer system with the stereoscopic filter (3rd party component) interposed at the 3D engine interface between the OS 3D rendering engine (OS vendor component) and the 3D hardware acceleration driver (OEM component); the 3D accelerator (OEM hardware) with DAC 32 outputs an industry-standard analog video signal, with a 3rd party frame buffer format holding left and right views, to stereoscopic display 34 (3rd party hardware).]
`
`
`
[Sheet 3 of 8: FIG. 3, flow diagram: load stereoscopic filter; define stereoscopic mode; execute run time stage; generate left and right viewpoint data; transmit viewpoint data to display driver; transmit viewpoint data to frame buffer; output RGB signals to stereoscopic display.]
`
`
`
[Sheet 4 of 8: FIGS. 4A-4C, horizontal stereoscopic modes.]
`
`
`
[Sheet 5 of 8: FIGS. 5A-5C, vertical stereoscopic modes.]
`
`
`
[Sheet 6 of 8: FIGS. 6A-6B, viewpoint rendering.]
`
`
`
[Sheet 7 of 8: FIGS. 7A-7C, viewpoint rendering with the stereoscopic filter.]
`
`
`
[Sheet 8 of 8: FIG. 8, virtual focal eye point model; FIG. 9, modified projection for the object in FIG. 8; FIG. 10, left view coordinate system; FIG. 11, simplified block diagram of a computer system with processor.]
`
`
`
FILTER FOR TRANSFORMING 3D DATA IN A HARDWARE ACCELERATED RENDERING ARCHITECTURE

CROSS-REFERENCE TO RELATED APPLICATIONS

This application is related to the following commonly assigned, copending applications: Ser. No. 08/972,511, filed Nov. 18, 1997, entitled FILTER BETWEEN GRAPHICS ENGINE AND DRIVER FOR EXTRACTING INFORMATION, and Ser. No. 09/002,139, filed Dec. 31, 1997, entitled APPARATUS AND METHOD FOR DYNAMICALLY CONTROLLING BRIGHTNESS OF OBJECTS ON A DISPLAY MONITOR, both of which are incorporated herein by reference.

TECHNICAL FIELD

The present invention relates to computer systems that generate graphics, and more particularly to generating 3-dimensional (3D) graphics.

BACKGROUND ART

Software application programs that utilize increased graphics have become more prevalent. For example, video games often utilize an increasing amount of 3D graphics for display on a typical personal computer (PC) monitor. These graphics applications require the computer system to include 3D rendering software to support the graphical content. FIG. 1 is a block diagram illustrating a conventional computer system for executing typical graphics application programs and displaying the graphics.

Referring to FIG. 1, computer system 10 may be implemented, for example, as a Pentium-based PC utilizing a Windows 9x or Windows NT operating system from Microsoft Corporation. Computer system 10 includes a software portion 12 and a hardware display portion 14. The software portion 12 includes an application program 16, an operating system 3D rendering engine 18 and a third party 2-dimensional (2D) display driver 20. The application 16 generates function calls to the operating system (OS) of computer system 10 to perform OS services. Specifically, the application 16 generates function calls to 3D rendering engine 18, also referred to as a graphics module, via the operating system's defined applications programmer interface (API).

The 3D rendering engine 18 performs operations associated with graphics, for example tracking the state of a scene, caching geometry and textures into their internal representations, customizing rendering engine 18 according to the application calls, etc. The 3D rendering engine 18 manipulates the 3D objects it processes using temporary buffers, such as stencil buffers and Z-buffers, and one or more final result 2D buffers for a rendered frame.

After performing 3D rendering, the application 16 instructs 3D rendering engine 18 to render the scene into a resultant frame buffer via the 2D display driver 20. The 3D rendering engine communicates with the 2D display driver 20 via the operating system's API. The 3D rendering engine 18 renders the 3D scene into a final 2D image in the frame buffer, typically located in 2D display adapter 22. The 2D display adapter then converts the video data in the frame buffer to analog video (RGB) signals, and display monitor 24 displays the final image.

SUMMARY OF THE INVENTION

A drawback with the software rendering architecture illustrated in FIG. 1 is that the output from 3D rendering engine 18 is merely a 2-dimensional (2D) image in a frame buffer of the 2D display adapter 22. The 2D image is then output to monitor 24, such as a conventional PC monitor. Accordingly, the software rendering architecture is unable to generate stereoscopic 3D content.

Additionally, in other systems utilizing hardware rendering architectures, 3D acceleration devices may be used to execute some or all of the rendering requests in hardware. However, the resulting output of the 3D rendering is still merely a 2D image in the frame buffer of the 3D accelerator. Accordingly, in systems using hardware rendering architectures, the system is unable to generate stereoscopic 3D content.

There exists a need for an arrangement in a computer system that enables the generation of stereoscopic 3D images for graphic objects, e.g., for head-mounted displays. There is also a need for an arrangement that enables stereoscopic 3D images to be generated and output using conventional hardware drivers and 3D acceleration hardware.

These and other needs are met by the present invention, where a stereoscopic filter intercepts calls from an operating system graphics module to a display driver requesting 3D rendering operations on a graphic object. The filter then generates stereoscopic image data for a left eye viewpoint and a right eye viewpoint that is then stored in a frame buffer. The output from the display driver can then be converted to analog video signals and output to a stereoscopic display device.

According to one aspect of the invention, a method is provided for generating 3D data for a graphic object. The method includes generating, in a 3D rendering module, a function call request for a 3D rendering operation for the graphic object. The method also includes receiving the function call request by a filter and generating a plurality of viewpoint data for the 3D graphic object. Another aspect of the present invention provides a computer-readable medium that includes stored sequences of instructions that are executed by a processor. The instructions cause the processor to receive a function call generated by a 3D rendering module requesting a 3D operation for a graphics object. The instructions also cause the processor to generate a plurality of viewpoint data for the graphics object.

Other objects and advantages of the present invention will become readily apparent to those skilled in this art from the following detailed description. The embodiments shown and described provide illustration of the best mode contemplated for carrying out the invention. The invention is capable of modifications in various obvious respects, all without departing from the invention. Accordingly, the drawings are to be regarded as illustrative in nature, and not as restrictive.

BRIEF DESCRIPTION OF THE DRAWINGS

Reference is made to the attached drawings, wherein elements having the same reference numeral designations represent like elements throughout.

FIG. 1 is a block diagram of a conventional computer display system using software rendering.

FIG. 2 is a block diagram of an exemplary computer system upon which the present invention can be implemented.

FIG. 3 is a flow diagram illustrating the operation of an exemplary computer system in accordance with an embodiment of the invention.

FIGS. 4A-4C illustrate various horizontal stereoscopic modes that can be used with the present invention.
`
`
`
`
`3
`FIGS. 5A-5C illustrate various vertical stereoscopic
`modes that can be used with the present invention.
`FIGS. 6A-6B illustrate viewpoint rendering.
`FIGS. 7A-7C illustrate viewpoint rendering in accor
`dance with an embodiment of the present invention.
`FIG. 8 illustrates a virtual focal eye point model used in
`a viewer model.
`FIG. 9 illustrates a modified projection for the object in
`FIG. 8.
`FIG. 10 illustrates a left view coordinate system for the
`projection in FIG. 9.
`FIG. 11 is a simplified block diagram of a computer
`System in which the present invention can be employed.
DETAILED DESCRIPTION OF INVENTION

The present invention provides an apparatus and method for generating stereoscopic 3D graphics using a stereoscopic filter. The stereoscopic filter is a software module integrated in the operating system of a computer system and is configured to intercept function calls to a 3D hardware acceleration driver. The filter generates left eye and right eye viewpoint data for a graphics object and outputs the data to a display driver for storage in a frame buffer.

According to an embodiment of the present invention illustrated in FIG. 2, computer system 100 includes software 40 employing an operating system that supports execution of application 16 and a hardware portion 50, described below. The operating system includes 3D rendering engine 18, described in detail above in connection with FIG. 1, for creating graphics for the software application 16. Specifically, the application 16 generates function calls to 3D graphics rendering engine 18, also referred to as a graphics module, via the operating system's defined applications programmer interface (API).
The 3D rendering engine 18 performs operations associated with graphics and is configured in a hardware accelerated rendering mode. That is, the 3D rendering engine 18 is configured to generate function calls to a display driver. Specifically, the 3D rendering engine 18 transmits function calls, via an API, intended for 3D acceleration driver 28 requesting the performing of a graphics-related operation for a graphic object.

The stereoscopic filter 26 is a software component and is loaded by the OS of computer system 100 in a manner similar to loading a conventional display driver. The stereoscopic filter 26 appears to 3D rendering engine 18 as a conventional third party display driver configured for executing and performing the graphic operations called by the 3D rendering engine 18. As such, the stereoscopic filter 26 is transparent to the 3D rendering engine 18.

The hardware acceleration driver 28 is implemented as a separate code module that extends the 3D rendering engine 18. Accordingly, it runs in the context of the OS 3D rendering engine 18. The 3D acceleration driver 28 also exposes an interface through an array of entry points that are called as a routine call by the OS 3D rendering engine 18. The stereoscopic filter 26 utilizes this interface and uses the same entry points as the 3D acceleration driver 28.
For example, in a Microsoft Windows OS, the 3D rendering engine 18 and the 3D hardware acceleration driver 28 are implemented as a dynamic link library (DLL). As such, each executable code module has entry points of exported functions that external code can link to and call. By implementing the same required entry points in the stereoscopic filter 26, the 3D rendering engine 18 uses the stereoscopic filter 26 as if it were the conventional OEM driver.
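The entry-point mechanism can be sketched in C. The table layout, names, and signatures below are hypothetical stand-ins for the OS vendor's actual driver interface; the point is only that the filter exports the same entry points the engine expects and forwards each call to the real driver.

```c
#include <stddef.h>

/* Hypothetical table of driver entry points that the OS 3D rendering
 * engine resolves and calls; the real interface is OS-vendor defined. */
typedef struct {
    int (*set_mode)(int mode);
    int (*render)(const double *vertices, int count);
} DriverEntryPoints;

/* Stub standing in for the OEM 3D hardware acceleration driver 28. */
static int oem_set_mode(int mode) { return mode; }
static int oem_render(const double *vertices, int count)
{
    (void)vertices;
    return count;               /* pretend: number of vertices accepted */
}
static const DriverEntryPoints oem_driver = { oem_set_mode, oem_render };

/* The stereoscopic filter exports the SAME entry points, so the engine
 * loads and calls it exactly as it would the OEM driver; each call is
 * intercepted and then forwarded to the real driver. */
static const DriverEntryPoints *forward = &oem_driver;

static int filter_set_mode(int mode)
{
    /* here: switch the frame buffer into a stereoscopic mode */
    return forward->set_mode(mode);
}
static int filter_render(const double *vertices, int count)
{
    /* here: generate left and right eye viewpoint geometry */
    return forward->render(vertices, count);
}

const DriverEntryPoints stereo_filter = { filter_set_mode, filter_render };
```

Because `stereo_filter` has the same shape as `oem_driver`, the engine is unaware that a filter, rather than the OEM driver, is handling its calls.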
`
`15
`
`25
`
`35
`
`40
`
`45
`
`50
`
`55
`
`60
`
`65
`
Accordingly, the stereoscopic filter 26 intercepts the function calls intended for the 3D hardware acceleration driver 28 and performs various graphic operations, e.g., initialization, rendering and mode change activities. Although the 3D rendering engine is configured to generate function calls to 3D hardware acceleration driver 28, the actual 3D accelerator 30 is not required. The actual 3D rendering can be done in software by calling back to the 3D rendering engine 18 to emulate the requests in software. However, for illustrative purposes, the embodiment illustrated in FIG. 2 includes 3D accelerator 30.

Referring to FIG. 2, the 3D hardware acceleration driver 28 controls the 3D accelerator 30. The 3D acceleration driver 28 outputs data to the 3D accelerator in the form of video data and control data. The 3D accelerator typically includes a frame buffer 31 for storing the video data and a digital-to-analog converter (DAC) 32 for converting the video data into analog RGB signals.
The stereoscopic filter 26 software operation uses three major sets of entry points:

1) Load entry points, which include loading the stereoscopic filter 26 by computer system 100 and other initialization activities that ensure the correct filtering of the requests from the 3D rendering engine 18.

2) Mode set entry points, which enable a special stereoscopic mode to be set. This changes the management of the 2D frame buffer 31 from one viewpoint into left and right viewpoints for the stereoscopic display.

3) Run time entry points, which include the necessary conversion of a single viewpoint 3D object geometry into a left and right viewpoint in the defined stereoscopic mode.
FIG. 3 is a flow diagram illustrating the operation of the computer system 100 of FIG. 2. At step 60, the stereoscopic filter 26 is loaded into computer system 100 in a manner similar to loading a conventional hardware acceleration driver. For example, the stereoscopic filter 26 install procedure registers the stereoscopic filter 26 in the system database as the 3D acceleration driver 28 for the system. Other miscellaneous initialization may be needed depending on the particular 3D rendering engine 18 used. For example, different 3D rendering engines 18 have different mechanisms to ensure that the engine always calls the driver, in this case the stereoscopic filter 26, for any rendering activity.

The stereoscopic filter 26 operates to ensure that the 3D rendering engine does not try to render the image into the 2D frame buffer 31 directly. There are a variety of ways in which the stereoscopic filter 26 can accomplish this. For example, the stereoscopic filter 26 can declare itself as a device managed surface driver or a fully accelerated driver to the 3D rendering engine 18. The stereoscopic filter 26 then processes the geometry function calls from the 3D rendering engine 18.
The geometry calls map the 3D objects into 2D pixels in the frame buffer 31. The stereoscopic filter 26 then passes the input 3D objects into the viewer model to generate two views, and from there into reformatted 2D buffers, as discussed in more detail below.
After loading the stereoscopic filter 26, the stereoscopic filter 26 defines a special stereoscopic mode in the frame buffer 31 at step 62. For a given stereoscopic mode, the left and right eye viewpoint frames are packed into the frame buffer 31. The particular mode can be a line-by-line horizontal stereoscopic mode. For example, referring to FIGS. 4A-4C, FIG. 4A represents a graphic object as stored in a conventional frame buffer. FIG. 4B represents a mode in
`
`
`
`
which the frame buffer is double the length of the conventional frame buffer. In this case, the physical frame buffer 31 is divided into two frames and each line in the physical frame buffer 31 is divided to contain a left line and a right line, as shown in FIG. 4B. Alternatively, if frame buffer size constraints do not allow for doubling the actual length of the frame buffer, the left and right frames can be compressed in the X-direction, as shown in FIG. 4C, to maintain the original frame buffer line length.
Alternatively, the stereoscopic mode can be a frame-by-frame vertical stereoscopic mode. For example, referring to FIGS. 5A-5C, FIG. 5A represents a graphic object as stored in a conventional frame buffer. FIG. 5B represents a mode in which the frame buffer is double the height of the conventional frame buffer. In this case, the physical frame buffer 31 is divided into two areas. The upper half holds the left eye frame (or the right eye frame) and the lower half holds the right eye frame (or the left eye frame), as shown in FIG. 5B. Alternatively, if frame buffer 31 size constraints do not allow for doubling the actual height of the frame buffer, the left and right frames can be compressed in the Y-direction, as shown in FIG. 5C, to maintain the original frame buffer column size.
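The two uncompressed packings can be sketched as plain row and block copies. The one-byte-per-pixel layout and the function names are illustrative assumptions; a real driver would work on the adapter's actual pixel format.

```c
#include <string.h>

/* Line-by-line horizontal mode (as in FIG. 4B): each output line holds
 * a left-eye line followed by a right-eye line, doubling the line
 * length. Buffers are w*h pixels, one byte per pixel (assumed). */
void pack_horizontal(const unsigned char *left, const unsigned char *right,
                     unsigned char *fb, int w, int h)
{
    for (int y = 0; y < h; ++y) {
        memcpy(fb + (size_t)y * 2 * w,     left  + (size_t)y * w, (size_t)w);
        memcpy(fb + (size_t)y * 2 * w + w, right + (size_t)y * w, (size_t)w);
    }
}

/* Frame-by-frame vertical mode (as in FIG. 5B): the upper half of the
 * doubled-height buffer holds one eye frame, the lower half the other. */
void pack_vertical(const unsigned char *left, const unsigned char *right,
                   unsigned char *fb, int w, int h)
{
    memcpy(fb,                 left,  (size_t)w * h);
    memcpy(fb + (size_t)w * h, right, (size_t)w * h);
}
```

The compressed variants of FIGS. 4C and 5C would additionally downsample each eye frame by two in X or Y before the copy, keeping the original buffer dimensions.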
Other known stereoscopic modes that hold right and left views can also be implemented in the stereoscopic filter 26, depending on the particular stereoscopic display 34 being used. The digital-to-analog converter (DAC) 32, typically located on the 3D accelerator 30, scans the frame buffer 31 and generates an analog video signal that consists of the left frame followed by the right frame. The stereoscopic display 34 decodes the analog signal and extracts the separate views from the signal.
Since the frame buffer 31 configuration may be changed by the stereoscopic filter 26, more memory or different memory allocation may be needed. However, the stereoscopic filter 26 is configured to define a mode that has memory capacities that satisfy its requirements. At step 64, the stereoscopic filter 26 run time stage is executed. That is, the stereoscopic filter 26 intercepts the geometry requests from the 3D rendering engine 18. In an exemplary embodiment, as discussed previously, the stereoscopic filter 26 can declare itself as a device managed driver to eliminate any direct writes to the frame buffer 31; the 3D rendering engine 18 then will not try to emulate directly into the frame buffer 31 and bypass the stereoscopic filter 26.
The stereoscopic filter 26 receives the rendering requests from the 3D rendering engine 18. The 3D rendering engine 18 typically delivers the rendering requests with 3D objects represented as arguments. The particular details of how the 3D data is encoded vary between different engine implementations. However, each 3D engine must define its atomic 3D geometry terminology, since these are the building blocks for the 3D scene. Common 3D object representations known in the art use points in space that represent polygons, usually triangular, that tile the object surface to form a 3D object.
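As a concrete sketch of such building blocks (the layout is hypothetical, not a specific engine's format):

```c
/* Minimal triangle-mesh representation of a 3D object: points in space
 * grouped into triangles that tile the object's surface. */
typedef struct { double x, y, z; } Point3;
typedef struct { Point3 v[3]; } Triangle;
typedef struct { const Triangle *tris; int count; } Mesh3D;

/* A rendering request would carry a mesh like this as its argument;
 * per-vertex processing walks every Point3 of every Triangle. As a
 * trivial example of such a walk, compute one triangle's centroid. */
static Point3 triangle_centroid(const Triangle *t)
{
    Point3 c = { (t->v[0].x + t->v[1].x + t->v[2].x) / 3.0,
                 (t->v[0].y + t->v[1].y + t->v[2].y) / 3.0,
                 (t->v[0].z + t->v[1].z + t->v[2].z) / 3.0 };
    return c;
}
```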
Next, at step 66, the stereoscopic filter 26 modifies the 3D object geometry requests by generating geometry values for each eye. Every 3D object in the scene is modified by a viewer model algorithm in the stereoscopic filter 26 to reflect a projection from another point in space.

The present invention will be described in conjunction with a viewer model representation of a 3D object in relation to the final rendering viewpoints. There are many possible viewer models that can be implemented in stereoscopic filter 26. Each particular viewer model highlights different aspects of 3D depth perception and models different head and eye configurations. In the discussion below, only the inputs
`
`15
`
`25
`
`35
`
`40
`
`45
`
`50
`
`55
`
`60
`
`65
`
needed for the stereoscopic filter 26 and the outputs from it are discussed, in order not to unduly obscure the invention. However, given the guidance and objectives disclosed herein, the particular details of the viewer model processing can be readily implemented based on the specific 3D rendering engine 18 being used, as well as other system constraints.

In viewer model processing, every 3D object is separated into two instances, or viewpoints, i.e., a left viewpoint and a right viewpoint. The given target projection plane of the 3D geometry is replaced with two new projection planes, one for each eye. The exact relation of the 3D location of the new planes to the original projection plane is the viewer model. The particular details of the viewer model are dependent on the OS 3D rendering engine 18 geometry representation and on the way the 3D rendering engine 18 specifies the target 2D rendering plane.

For example, in a system that renders the 3D scene into a projection of a point in space, the stereoscopic filter 26 viewer model representation replaces the rendering request from the 3D rendering engine 18 by two different rendering requests to the 3D acceleration driver 28.
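That replacement of one request by two can be sketched as follows. The request structure, its fields, and the driver call are hypothetical; only the one-in, two-out pattern comes from the description above.

```c
/* Hypothetical rendering request: a viewpoint position plus the
 * destination eye frame in the stereoscopic frame buffer. */
typedef struct {
    double eye_x;               /* viewpoint (focal point) X position */
    int    dest;                /* 0 = left frame, 1 = right frame    */
} RenderRequest;

/* Stub acceleration driver that records what it was asked to render. */
static int issued;
static RenderRequest last[2];
static void driver_render(const RenderRequest *r) { last[issued++ % 2] = *r; }

/* Run-time interception: the filter displaces the requested viewpoint
 * by -d for the left eye and +d for the right eye, and routes each
 * resulting request to its own eye frame. */
void filter_render_stereo(double eye_x, double d)
{
    RenderRequest left  = { eye_x - d, 0 };
    RenderRequest right = { eye_x + d, 1 };
    driver_render(&left);
    driver_render(&right);
}
```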
FIGS. 6A-6B illustrate 3D rendering using regular viewpoint rendering without the stereoscopic filter, and FIGS. 7A-7C illustrate viewpoint rendering using the stereoscopic filter 26. As shown in FIG. 7C, the stereoscopic filter 26 displaces the requested viewpoint to the left for the left viewpoint and assigns the result into the left frame in the frame buffer 31. The second request handles the right viewpoint in a similar way. In the stereoscopic filter 26 viewer model, only viewpoint rendering requests generate two viewpoints, one for each eye, and the resultant eye frames are stored in the frame buffer 31.
Alternatively, in a fixed point rendering engine, the scene is always rendered into the XY plane, where geometry is in XYZ Cartesian space, and every point in the scene is relocated to represent a left viewpoint and again for the right viewpoint. The exact displacement is dependent on the location of the eyes in space.
There are other various models known in the art that can be used to generate physical viewpoints for 3D objects in space. However, for illustrative purposes, the following viewer model transform is an example that demonstrates processing steps by stereoscopic filter 26. In the example, the model is tuned for a case where the 3D rendering engine requests a rendering of the 3D objects from their XYZ representation into the XY plane.
FIG. 8 illustrates a virtual focal eye point model representation. Given the width and height of the target 2D plane W x H, the point in space in location h, as shown in FIG. 8, is set to be far from the projection plane and in the middle of the target XY plane. This point, h, emulates the eye focal point. In order to create a separate left and right view, the focal eye point is relocated to the left and right, and the stereoscopic filter 26 recalculates the XYZ location of the 3D object with respect to the relocated point.

The focal point is moved in the X-axis direction by the amount of d. The left eye coordinates become (W/2-d, H/2, h) and (W/2+d, H/2, h) for the right eye. The stereoscopic filter 26 then uses the new focal point to generate a new coordinate system X'Y'Z', as shown in FIG. 9, that shares the Y axis with the original XYZ coordinate system, but is offset by angle α towards the new focal point to restore its central position W/2 on the X axis and H/2 on the Y axis. For the new eye view, the converted 3D object is represented in the new coordinate system X'Y'Z' and rendered into the eye view 2D buffer. Although the above model was defined
`
`
`
`
by the h and d parameters that profile the depth perception of the viewer, the stereoscopic filter 26 can parameterize the model based on the tilt angle α alone.

FIG. 10 illustrates a top view of the left view projection plane from FIG. 9. Because the new X'Y'Z' coordinate system shares the Y-axis with the original coordinate system, the Y values of the 3D point stay the same. From FIG. 10, the following equations regarding α are defined:

iii) tan α = l/x = k/z'

From equation iii), l = x tan α, and from equation ii) z' = (z + l)cos α = (z + x tan α)cos α = z cos α + x sin α. Further, from equation iii) k = z' tan α = z sin α + x sin α tan α, and from equation ii) x' = (x - k cos α)/cos α = x/cos α - k = x(1/cos α - sin α sin α/cos α) - z sin α = x cos α - z sin α. Using the above equations, the stereoscopic filter 26 obtains the following new coordinates for the left viewpoint:

iv) x' = x cos α - z sin α, z' = z cos α + x sin α, and y' = y.

The stereoscopic filter 26 then calculates the right view for the same point by mirroring the results from the left view. That is, the stereoscopic filter 26 mirrors the point to the left, from (x, y, z) to (W-x, y, z). Next, the stereoscopic filter 26 calculates the left value for the point, using the equation iv) transformation above. Finally, the stereoscopic filter mirrors the resultant point back to the right again, from (x, y, z) to (W-x, y, z).

After the calculation of the left and right values for the point, the stereoscopic filter 26 displaces the left and right point to their proper place in the stereoscopic mode. The transformation details depend on the exact format of the frame buffer 31 in the stereoscopic mode, which is set by the stereoscopic filter 26 as discussed previously. For example, if the same length buffer horizontal mode illustrated in FIG. 4B is used, the stereoscopic filter locates the left view in the left half of the original frame buffer and the right view in the right half. So the left point (x, y, z) would move into (x/2, y, z), while a right point would move into (W/2 + x'/2, y, z). Other modes would change the Y value with or without scale. Finally, the stereoscopic filter 26 passes the new geometry into the rendering stage in the 3D acceleration driver 28.

Referring back to FIG. 3, at step 68, after the left and right eye viewpoint data has been

For example, a head mounted display that has a separate monitor for each eye may be used. In this type of display, the RGB signal is modulated to create an effect that simulates depth vision.

Alternatively, a projection system display that projects two views may also be used. In such a display, the viewer uses special glasses that block one view and let the other view pass to the eye, thereby delivering a different view to each eye. A common implementation uses a different color or polarization, with color or polarization filters on the glasses.

In any event, the stereoscopic filter 26 can be used with any stereoscopic display device since the RGB video signal output from DAC 32 is generated to meet industry standards.

Described above has been a method and apparatus for generating stereoscopic 3D images for graphic objects using a hardware accelerated rendering architecture. Computer system 100 of FIG. 2 may be implemented, for example, as a Pentium-based PC utilizing a Windows 9x or Windows NT operating system from Microsoft Corporation. However, the stereoscopic filter is not limited to use with any particular operating system. For example, FIG. 11 is a simplified block diagram of a computer system 200 upon which an embodiment of the present invention can be implemented. Computer system 200 includes a bus 210 or other communication medium for communicating information, and a processor 202 coupled to bus 210 for processing information. Computer system 200 further comprises a random access memory (RAM) or other dynamic storage device 204 (referred to as main memory), coupled to bus 210 for storing information and instructions to be executed by processor 202. Main memory 204 also may be used for storing temporary variables or other intermediate information during execution of instructions by processor 202. Computer system 200 also comprises a read only memory (ROM) and/or other static storage device 206 coupled to bus 210 for storing static information and instructions for processor 202. A data storage device 208, such as a magnetic disk or optical disk and its corresponding disk drive, can be coupled to bus 210 for storing information and instructions.

Advantageously, the stereoscopic filter of the present invention can be configured to interface with any vendor 3D rendering engine, such as OpenGL or Direct3D by Microsoft Corporation. However, a specific stereoscopic filter implementation is to be used with a specific 3D rendering engine, since the filter interfaces with the rendering engine's specific interface points. Further, the stereoscopic filter 26 is not limited to use with any particular 3D acceleration driver, since the communication between the filter and the acceleration driver is over the same published API.

Additionally, the stereoscopic filter can be advantageously used with any third party 3D accelerator or video adapter. The stereoscopic filter is configured to set the stereoscopic mode based on the actual device utilized. Further, the stereoscopic filter is usable with any stereoscopic display device that supports its stereoscopic mode. Another advantage of the invention is that OS software and third party components do not have to be modified to accommodate the stereoscopic filter, thereby saving considerable time and costs associated with customizing various software and hardware components.

An important aspect of the invention as related to the viewer model discussed above is the ability to change the viewer's point of view to any place in the 3D space. The user then controls the stereoscopic filter in order to be able to move within the scene in a manner that is independent of the 3D content at the application level. The invention thus