Luken

US005923334A

[11] Patent Number: 5,923,334
[45] Date of Patent: Jul. 13, 1999
`[54] POLYHEDRAL ENVIRONMENT MAP
`UTILIZING A TRIANGULAR DATA
`STRUCTURE
`
Primary Examiner: Mark K. Zimmerman
Attorney, Agent, or Firm: Jay P. Sbrollini; Perman & Green, LLP

[75] Inventor: William Louis Luken, Yorktown Heights, N.Y.

[73] Assignee: International Business Machines Corporation, Armonk, N.Y.
`
[21] Appl. No.: 08/720,321

[22] Filed: Sep. 27, 1996
`
`[60]
`
`[51]
`[52]
`[58]
`
`[56]
`
`Related U.S. Application Data
`
`Provisional application No. 60/023,143, Aug. 5, 1996, pro(cid:173)
`visional application No. 60/022,424, Aug. 5, 1996, and
`provisional application No. 60/022,428, Aug. 5, 1996.
`Int. Cl.6
`..................................................... G06T 11/40
`U.S. Cl. ............................................. 345/423; 345/425
`Field of Search ..................................... 345/419, 420,
`345/421, 423, 425, 430, 431
`
`References Cited
`
`U.S. PATENT DOCUMENTS
`
`5,396,583
`5,495,562
`5,561,756
`5,704,024
`
`3/1995 Chen et a!. ............................. 345/427
`2/1996 Denney et a!. ......................... 345/421
`10/1996 Miller et a!. ............................ 345/437
`12/1997 Voorhies et a!. ........................ 345/426
`
`OTHER PUBLICATIONS
`
QuickTime VR-An Image-Based Approach to Virtual Environment Navigation, Shenchang Eric Chen, Apple Computer, Inc., SIGGRAPH Computer Graphics Proceedings, Annual Conference Series, 1995, pp. 29-38.
`
`[57]
`
`ABSTRACT
`
The present invention generates an environment map by storing in memory color values associated with pixels of an image representing the panoramic scene. For at least one element of each facet of the environment map, a mapping operation is performed that comprises the following steps. A direction vector is generated that corresponds to the element. A pixel of the image that corresponds to the direction vector is determined. A color value is derived based upon a stored color value associated with the pixel of the image, and the derived color value is stored at a location in memory associated with the element of the environment map. A view of the environment map is generated by determining a view window corresponding to a field of view. The view window comprises an array of pixels identified by a plurality of rows and columns. The environment map is mapped to the view window for display. The mapping step includes the following steps for each pixel of the view window. A direction vector is generated that corresponds to the pixel of the view window. A facet of the environment map intersected by the direction vector is determined. An element of the intersected facet which corresponds to the direction vector is determined. A color value is derived based upon a color value of the element of the environment map corresponding to the direction vector, and the derived color value is stored as the color value of the pixel of the view window for display.
`
`22 Claims, 14 Drawing Sheets
`
`701
`
`For each Direction Vector Di, J
`
`-707
`
`711
`
`For each Direction Vector Di,
`Determine Pixel of Rectangular
`Image Intersected by Direction
`Vector Di that corresponds
`to Direction Vector Di
`
`store color value of pixel
`identified in step 707 as
`color value of pixel
`(irow,icol) of face Fi
`
`-713
`
`_N
`
`y
`r
`
`
`
`U.S. Patent
`
`Jul. 13, 1999
`
`Sheet 1 of 14
`
`5,923,334
`
`101
`
`Memory
`
`103
`
`f""
`- - - -
`
`CPU
`
`~. --·--- ----·---
`
`Frame
`Buffer
`
`~104
`
`I
`
`Display
`Device
`
`/-105
`
`'
`
`-
`~
`108
`
`Static
`Storage
`Device
`
`~
`107
`
`Input
`Device
`
`·---- ------
`
`/
`109
`
`Comm
`Link
`
`----------
`
`Fig. 1
`
`
`
[FIG. 2: rectangular data structure — elements A(0,0), A(1,0), A(2,0), ... A(ncol-1,0) through A(0,nrow-1), ... A(ncol-1,nrow-1), arranged in nrow rows of ncol columns.]
`
`
[FIG. 3: triangular data structure — row j holds elements T(0,j) through T(j,j), running from T(0,0) down to T(nrow-1,nrow-1) over nrow rows.]
`
`
`
[FIG. 4: octahedron with the three coordinate axes x, y, z, each split into (+) and (-) directions at the six vertices.]
[FIG. 5: the eight triangular faces of the octahedron, with corners labelled by the corresponding coordinate axes and directions.]
`
`
`
[FIG. 6(A): the six rectangular images — top image, bottom image, and side images 0-3 — forming an axis-aligned rectangular solid.]
[FIG. 6(B): top view of the xy plane, with side 0 on the positive x axis, side 1 on the positive y axis, side 2 on the negative x axis, and side 3 on the negative y axis.]
`
`
`
[FIG. 7 flow chart: 701 select irow; 703 select icol; 705 generate direction vectors D0, D1, ... D7; 707 for each direction vector Di, determine the pixel of the rectangular image intersected by Di; 711 for each Di, store the color value of the pixel identified in step 707 as the color value of pixel (irow,icol) of face Fi; then loop over the remaining columns and rows.]
`
`
`
[FIG. 8 flow chart: 801 test whether (irow,icol) = (0,0); 803 set (Dx,Dy,Dz) = (0,0,1); 805 test whether icol = 0; 807 set DzN = DzN-1 - 1/(nrow-1), DxN = 1 - DzN, DyN = 0; 809 set DxN = DxN-1 - 1/(nrow-1), DyN = DyN-1 + 1/(nrow-1), DzN = DzN-1.]
`
`
`
[FIG. 9 flow chart: 901 test Dx > Dy; 903/905 compare Dz against scaled components (e.g., Dz > Dy*AR) to select among cases 1-4.]
`
`
`
[FIG. 10 flow chart: determine field of view; map octahedral environment map to view window; map view window to display (1005); end.]
`
`
`
[FIG. 11: the view vector V relative to the eye (or camera) and the x, y, z axes.]
`
`
`
[FIG. 12: the viewing window and the view vector V.]
[FIG. 13: the viewing window, the up vector U, and the view vector V.]
`
`
`
[FIG. 14 flow chart: 1401 partition view window into ncol columns and nrow rows; 1403 determine dR; 1405 determine dU; 1407 select column i; 1409 for each row j = 0 to (nrow-1) in the selected column i: determine D(i,j); determine the face F intersected by D(i,j); determine the pixel element of face F corresponding to D(i,j); and write the color value of that pixel element as the color value of pixel (i,j) of the view window; then loop over the remaining columns.]
`
`
`
[FIG. 15: the ncol columns and nrow rows of the view plane, the eye, and the vector R.]
[FIG. 16: the view plane, the eye, and the vector U.]
`
`
`
[FIG. 17: the view window (ncol columns, nrow rows), the direction vectors corresponding to its elements, and the vector U.]
`
`
`
`5,923,334
`
`1
`POLYHEDRAL ENVIRONMENT MAP
`UTILIZING A TRIANGULAR DATA
`STRUCTURE
`
`CROSS REFERENCE TO RELATED
`APPLICATIONS
`
The present application claims the benefit of U.S. patent application Ser. No. 60/023,143, U.S. patent application Ser. No. 60/022,424, and U.S. patent application Ser. No. 60/022,428, filed on Aug. 5, 1996 and assigned to the common assignee of the present invention, herein incorporated by reference in their entirety.
`
`BACKGROUND OF THE INVENTION
`
`1. Technical Field
The invention relates generally to image processing systems, and, more particularly, to image processing systems that utilize polyhedral environment maps to create and view three dimensional images from data representing multiple views of a scene.
`2. Description of the Related Art
An environment map is a data structure representing the colors and/or other information that is characteristic of a set of samples defined with respect to a fixed point in space. A complete environment map contains data for a set of directional samples distributed over all possible angles (4π steradians). A partial environment map may represent any subset of the possible solid angles. The directional samples are all defined with respect to a single point in space which forms the origin or center of the environment map.
The data contained in an environment map may be used to determine an image that is characteristic of any direction and field of view contained within the solid angles represented by the environment map. The image is typically composed of a rectangular grid of elements or pixels, each of which is characteristic of a direction with respect to the origin of the environment map. The characteristic direction of the elements of the image may be associated with one or more directional samples of the environment map. In the simplest case, known as a point-sampled image, each element of the image is associated with the single sample of the environment map that corresponds to a direction that most closely matches the direction associated with the element of the image. More elaborate sampling and filtering techniques are also possible in which each element of the image is associated with multiple samples of the environment map that correspond to directions that match the directions associated with the element of the image.
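The point-sampled case described above amounts to a nearest-direction lookup. A minimal sketch, assuming unit-length sample directions (the helper name is hypothetical, not from the patent):

```python
def nearest_sample(direction, samples):
    # Point-sampled lookup: choose the sample whose direction most closely
    # matches the query direction, i.e. the one with the largest dot product.
    # Assumes the sample directions are unit length.
    def dot(a, b):
        return sum(x * y for x, y in zip(a, b))
    return max(samples, key=lambda s: dot(s, direction))
```

Filtered variants would instead gather the several best-matching samples and blend their colors.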
An environment map may also be used to approximate the effects of light reflected by the surface of a three dimensional object located at or near the origin of the environment map. For example, consider a point on a three dimensional object which is characterized by coordinates (xp,yp,zp) and a unit normal vector N. In order to generate a view of the object from a direction D, the color associated with the object at the coordinates (xp,yp,zp) may be determined partly or entirely by the values of the environment map associated with the reflected direction De = D - 2N*(N·D).
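The reflected direction above follows directly from the formula. A minimal sketch (helper name hypothetical; N is assumed to be unit length):

```python
def reflect(d, n):
    # Reflected direction De = D - 2*N*(N . D), with N a unit normal.
    dot = sum(di * ni for di, ni in zip(d, n))
    return tuple(di - 2.0 * ni * dot for di, ni in zip(d, n))
```

For example, a view direction pointing straight into a surface whose normal is +z reflects straight back out along +z.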
In principle, any set of flat or curved surfaces may be used to define an environment map. For example, the QuickTime VR product from Apple Corp. of Cupertino, Calif., utilizes a cylindrical environment map to support panoramic image generation, and the RenderMan product from Pixar Animation Studios of Port Richmond, Calif., uses both spherical environment maps and environment maps based
on six sides of a cube. All of these examples take advantage of the ability to represent the constituent surfaces with a rectangular two dimensional coordinate system. The cylindrical environment map is based on azimuth and elevational coordinates. The spherical environment map uses latitude and longitude coordinates. The cube-face environment map utilizes standard row and column addressing for each of the six faces of the cube.

Each of these environment maps has distinct limitations that restrict its usefulness.
The spherical environment map has an inefficient distribution of samples, with higher densities of samples near the poles of the sphere than around the equator of the sphere. In principle, a sphere treats all directions in space equally. However, imposition of a latitude and longitude coordinate system on a sphere breaks this symmetry and requires designation of a preferred axis (i.e., the polar axis) in space. This leads to unequal treatment of the polar regions with respect to the equatorial regions. The use of a latitude and longitude coordinate system also tends to require frequent evaluation of trigonometric functions, which may be prohibitively intensive for use in some applications and/or systems.
More specifically, the choice of the direction of the polar axis of the spherical environment map is completely arbitrary; and, after choosing the direction of the polar axis, the resulting combination of a sphere and polar axis no longer has the full symmetry of the sphere itself. Only the symmetry of a cylinder remains. Thus, it is possible to rotate the sphere and polar axis combination about the polar axis without changing the polar axis, but rotating this combination about any other direction moves the polar axis. Moving the polar axis is equivalent to choosing a new polar axis, and this does not preserve the original polar axis choice.
After choosing a polar axis, a latitude and longitude coordinate system may be defined. The latitude and longitude coordinate system includes an angle θ running from π/2 at one pole to -π/2 at the other pole, and a second (azimuthal) angle φ running from 0 to 2π starting at some arbitrary direction perpendicular to the polar axis. The resulting lines of constant θ and constant φ converge on the poles. Near the "equator" (the line with θ=0), the lines of constant θ and φ form a rectangular grid. Near the poles, however, the lines of constant θ (latitude) form concentric circles around the poles, and lines of constant φ (longitude) form spokes radiating from the poles. The circles-and-spokes pattern seen near the poles is quite distinct from the grid pattern seen near the equator, but the underlying sphere has exactly the same shape in both regions. The circles and spokes patterns seen near the poles are not properties of the sphere itself. These are only consequences of the arbitrary decision to impose a polar axis on the sphere.
In contrast, the cylindrical and cube-face environment maps have relatively uniform distributions of directional samples. However, the cylindrical environment map is open at both ends, which prevents it from being used for view directions approaching the axis of the cylinder. The ends of the cylinder may be capped with additional top and bottom images as set forth in U.S. patent application Ser. No. 60/023,143, filed on Aug. 5, 1996 and commonly assigned to the assignee of the present application, incorporated by reference above in its entirety. However, the resulting data structure requires special treatment for directions having elevations above, within, or below the cylindrical data. In addition, the algorithms required for viewing the top and bottom images may be more laborious than the algorithms required to view the data within the cylindrical region.
`
`
`
`5,923,334
`
`3
With respect to the cube-face environment map, the algorithms required to determine which faces of the cube are intersected by the viewing frustum are computationally intensive, and may impact the performance of the system.

Thus, there is a need in the art for an efficient system for generating and viewing three dimensional panoramic images based on environment maps, and thus providing an improved level of interactive graphical feedback.
`
`SUMMARY OF THE INVENTION
`
The above-stated problems and related problems of the prior art are solved with the principles of the present invention, a polyhedral environment map utilizing a triangular data structure. The environment map of the present invention represents a panoramic scene and comprises a plurality of triangular facets, each partitioned into a triangular grid of elements. Each element is associated with a color value representing the color of the corresponding element. The environment map of the present invention is generated by storing in memory color values associated with elements of an image representing the panoramic scene. For at least one element of each facet of the environment map, a mapping operation is performed that comprises the following steps. A direction vector is generated that corresponds to the element. An element of the image representing the panoramic scene that corresponds to the direction vector is determined. A color value is derived based upon a stored color value associated with the element of the image, and the derived color value is stored at a location in memory associated with the element of the environment map.

A view of the environment map of the present invention is generated by determining a view window corresponding to a field of view. The view window comprises an array of pixels identified by a plurality of rows and columns. The environment map is mapped to the view window for display. The mapping step includes the following steps for each pixel of the view window. A direction vector is generated that corresponds to the pixel of the view window. A facet of the environment map intersected by the direction vector is determined. An element of the intersected facet which corresponds to the direction vector is determined. A color value is derived based upon a color value of the element of the environment map corresponding to the direction vector, and the derived color value is stored as the color value of the pixel of the view window for display.

The environment map of the present invention as described above provides for improved performance in rendering the environment map. This improved performance results from the use of a triangular data structure in defining the environment map, which provides an efficient mechanism for determining which face of the environment map is intersected by the direction vectors that correspond to the pixels of the view window when mapping the environment map to the view window.
`
`20
`
`4
`FIG. 5 illustrates the mapping of the triangular data
`structure of FIG. 3 to each of the eight faces of the
`octahedral environment map of FIG. 4 according to the
`present invention.
`FIGS. 6(A) and (B) illustrate six rectangular images that
`may be mapped to the octahedral environment map of the
`present invention.
`FIG. 7 is a flow chart illustrating operation of mapping the
`rectangular images of FIGS. 6(A) and (B) to the octahedral
`10 environment map of the present invention.
`FIG. 8 is a flow chart illustrating the operation of gener(cid:173)
`ating the direction vectors DO,Dl, ... D7 for a given pixel
`element of the faces of the octahedral environment map.
`FIG. 9 is a flow chart illustrating the operation of deter-
`15 mining the rectangular image intersected by each of the
`direction vectors DO,Dl. .. D7, and for determining the
`pixel of the intersected image that corresponds to each of the
`direction vectors DO,Dl. .. D7.
`FIG. 10 is a flow chart illustrating operation of the system
`in rendering the polyhedral environment map of the present
`invention for display.
`FIG. 11 illustrates the relationship of the view vector, up
`vector and right vector with respect to the eye.
`FIG. 12 illustrates the relationship between the view
`vector, right vector, eye, horizontal field of view and the
`viewing window.
`FIG. 13 illustrates the relationship between the view
`vector, up vector, eye, vertical field of view and the viewing
`30 window.
`FIG. 14 is a flow chart illustrating the operation of
`mapping the polyhedral environment map of the present
`invention to the view window.
`FIG. 15 illustrates the neal columns of the view plane and
`35 the horizontal step vector.
`FIG.16 illustrates the nrow rows of the view plane and the
`vertical step vector.
`FIG. 17 illustrates the direction vectors that correspond to
`the elements of the view window, and the relationship of the
`40 horizontal and vertical step vectors of FIGS. 15 and 16 to
`such direction vectors.
`
`DETAILED DESCRIPTION OF THE PRESENT
`INVENTION
`A method and apparatus for generating an image from a
`polyhedral environment map that utilizes a triangular data
`structure is set forth herein. The present invention may be
`implemented on any computer processing system including,
`for example, a personal computer, a workstation, or a
`50 graphics adapter that works in conjunction with a personal
`computer or workstation. As shown in FIG. 1, a computer
`processing system as may be utilized by the present inven(cid:173)
`tion generally comprises memory 101, at least one central
`processing unit (CPU) 103 (one shown), and at least one
`55 user input device 107 (such as a keyboard, mouse, joystick,
`voice recognition system, or handwriting recognition
`system). In addition, the computer processing system
`includes non-volatile storage, such as a read only memory
`(ROM), and/or other non-volatile storage devices 108, such
`60 as a fixed disk drive, that stores an operating system and one
`or more application programs that are loaded into the
`memory 101 and executed by the CPU 103. In the execution
`of the operating system and application program(s), the CPU
`may use data stored in the non-volatile storage device 108
`65 and/or memory 101.
`In addition, the computer processing system includes a
`frame buffer 104 coupled between the CPU 103 and a
`
`BRIEF DESCRIPTION OF THE DRAWINGS
`
`FIG. 1 is a functional block diagram of a computer
`processing system that may be utilized by the preferred
`embodiment of the present invention.
`FIG. 2 is a pictorial illustration of a rectangular data
`structure.
`FIG. 3 is a pictorial illustration of a triangular data
`structure.
`FIG. 4 is a pictorial illustration of the eight faces of an
`octahedral environment map.
`
`
`
`5
`display device 105 such as a CRT display or LCD display.
`The frame buffer 104 contains pixel data for driving the
`display device 105. In some systems, a rendering device (not
`shown), also known as a graphics accelerator, may be
`coupled between the CPU 103 and the frame buffer 104.
`In addition, the computer processing system may include
`a communication link 109 (such as a network adapter, RF
`link, or modem) coupled to the CPU 103 that allows the
`CPU 103 to communicate with other computer processing
`systems over the communication link, for example over the
`Internet. The CPU 103 may receive portions of the operating
`system, portions of the application program(s), or portions
`of the data used by the CPU 103 in executing the operating
`system and application program(s).
`It should be noted that the application program(s) 15
`executed by the CPU 103 may perform the rendering
`methods of the present invention described below.
`Alternatively, portions or all of the rendering methods
`described below may be embodied in hardware that works in
`conjunction with the application program executed by the 20
`CPU 103.
`What will now be described is the polyhedral environ(cid:173)
`ment map that utilizes a triangular data structure, techniques
`for creating such polyhedral environment maps, and tech-
`niques for rendering views of such polyhedral environment
`maps.
`In order to illustrate a polyhedron environment map that
`utilizes a triangular data structure, first consider a conven(cid:173)
`tional rectangular array. As shown in FIG. 2, a rectangular
`array includes data values stored in a computer's memory as
`nrow successive blocks of memory each having neal ele(cid:173)
`ments. The address of any element may be calculated as
`follows:
`
`5,923,334
`
`6
`of the polyhedral environment map. For the sake of
`description, an octahedral environment map that utilizes
`such a triangular data structure is set forth below. However,
`the present invention is not limited in this respect and can be
`applied to other polyhedral environment maps, for example
`environment maps based on a tetrahedron or an icosahedron.
`However, the use of a triangular data structure for the
`octahedral environment map is particularly advantageous in
`terms of limiting the computations required to determine
`10 which face of the octahedron is associated with a given
`direction vector (i.e., look at the signs of the x, y, and z
`values).
`FIG. 3 shows an octahedron with its eight faces and the
`three coordinate axes x,y,z. Each coordinate axis is divided
`into ( +) and (-) directions associated with the six vertices of
`the octahedron. FIG. 4 shows the eight faces of the octahe-
`dron with corners labelled by the corresponding coordinate
`axes and directions. Each face is partitioned into a triangular
`grid of elements (or pixels), for example a 4x4 triangular
`grid of elements as shown. For each element of the grid of
`elements, the triangular data structure preferably stores
`information that is associated with the element. The infor(cid:173)
`mation preferably includes color data that identifies the color
`of the element. The color data may represent the red, green
`and blue components of the color of the pixel (true color) or
`an index to a color palette (palette color). The information
`stored for a given element may also include Z data that
`represents the depth of the image at the given element. In
`
`25
`
`30 :~~~~~~~~{~:ti:;~~~:~i~: t~~:~~~~~~a~~rp:~:~;l~: t:~:a~:
`
`may be actions related to the element, including, but not
`limited to, a hyperlink to another panorama, a command to
`play a sound, or a command to launch an external applica-
`tion.
`Having described a octahedral environment map that
`utilizes a triangular data structure to store the information
`related to the elements of each facet of the octahedron, a
`technique to map six rectangular images to the octahedral
`40 environment map of the present invention is now set forth
`with reference to FIG. 7. Preferably, the six rectangular
`images are faces of an axis-aligned rectangular solid as
`shown in FIG. 6(A). The six images will be referred to as the
`top image, bottom image, and four side images (side 0, side
`1 side 2 and side 3 ). FIG. 6(B) is a top view of xy plane of
`the rectangular solid of FIG. 6(A) illustrating the four side
`images. Side 0 is on the positive x axis, side 1 along the
`positive y axis, side 2 along the negative x axis, and side 3
`along the negative y axis.
`As shown in FIG. 7, the operation of mapping the six
`rectangular images to the octahedral environment map of the
`present invention begins in step 701 by selecting a row index
`irow. In step 703 a column index icol is selected. The value
`of irow runs from 0 to nrow-1 where nrow is the dimension
`55 of the triangular arrays comprising the octahedron. For each
`value of irow, the value of icol runs from 0 to irow. Each pair
`of irow and icol values identifies a pixel in each face of the
`octahedron.
`In step 705, for each face of the octahedron, a direction
`vector is generated that corresponds to the pixel (irow,icol)
`in the particular face of the octahedron. Preferably, the
`direction vector corresponding to the pixel (irow,icol) rep(cid:173)
`resents the direction from the origin of the octahedron to the
`pixel (irow, icol). Moreover, the direction vector preferably
`has the form D=(SxDx,SyDy,SzDz) where Sx,Sy,Sz corre(cid:173)
`sponds to a particular face of the octahedron and Dx,Dy,Dz
`corresponds to a particular pixel in the faces of the octahe-
`
`A(i,j)~address of A(O,O)+(size of each element)*(i+j*(ncol)),
`
`where i and j represent the column and row of the element,
`respectively.
`In order to find the data for the element (i, j), one must know
`the column dimension (neal), as well as the values of (i) and
`G).
`A triangular data structure is illustrated in FIG. 3. Instead
`of every row having neal elements, the first row has one
`element, the second row has two elements, etc. Like the
`rectangular array, a triangular array is mapped into linear
`memory using a well-defined algorithm. The triangular
`array, however, uses a different algorithm. Thus, the location
`of element T(i,j) is determined as follows:
`
`T(i,j)~address of T(O,O)+(size of each element)*(i+j*(i+1)/2),
`
`35
`
`45
`
`50
`
`where i and j represent the column and row of the element,
`respectively.
`Unlike the rectangular array, this does not have independent
`row and column dimensions. In effect nrow=ncol. Moreover,
`unlike the rectangular array, it is not necessary to know the
`array dimension(s) to find the address of any array element
`(Since the address of the element is based on j*Q+1)/2
`instead of j*ncol. Moreover, because j and j+1 are succes(cid:173)
`sive integers, one is even and one is odd, so the product 60
`j*Q+1) is always even and exactly divisible by 2. In addition,
`the values of j*Q+1)/2 may be stored in an array to reduce
`the number of operations needed to generate an address for
`an element of the array.
`According to the present invention, a polyhedral environ- 65
`ment map is provided that utilizes a triangular data structure
`to store the information related to the elements of each facet
`
`
`
`5,923,334
`
`7
`dron. Preferably Sx,Sy,Sz is mapped to the faces of the
`octahedron as follows:
`
Face            Sx    Sy    Sz

F0 (+x,+y,+z)   +1    +1    +1
F1 (-x,+y,+z)   -1    +1    +1
F2 (-x,-y,+z)   -1    -1    +1
F3 (+x,-y,+z)   +1    -1    +1
F4 (+x,+y,-z)   +1    +1    -1
F5 (-x,+y,-z)   -1    +1    -1
F6 (-x,-y,-z)   -1    -1    -1
F7 (+x,-y,-z)   +1    -1    -1
`
This results in the generation of eight direction vectors D0, D1, D2, ... D7 that correspond to the pixel (irow,icol) in each of the eight faces of the octahedron, where

D0 = (Dx, Dy, Dz)
D1 = (-Dx, Dy, Dz)
D2 = (-Dx, -Dy, Dz)
D3 = (Dx, -Dy, Dz)
D4 = (Dx, Dy, -Dz)
D5 = (-Dx, Dy, -Dz)
D6 = (-Dx, -Dy, -Dz)
D7 = (Dx, -Dy, -Dz)

An example of a technique that generates the components Dx, Dy, Dz of the direction vectors D0, D1, D2, ... D7 is set forth below with respect to FIG. 8.
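The two bookkeeping pieces described above — the triangular-array offset i + j*(j+1)/2 and the sign-based selection of the octahedron face — can be sketched as follows (the helper names are hypothetical; the face numbering follows the Sx, Sy, Sz table):

```python
def tri_offset(i, j):
    # Element offset of T(i,j) within a linear triangular array:
    # rows 0..j-1 contribute j*(j+1)/2 elements, which is always an integer.
    return i + j * (j + 1) // 2

def face_index(x, y, z):
    # Octahedron face F0..F7 from the signs of the components alone:
    # quadrant in the xy plane, plus 4 for the lower (z < 0) half.
    if y >= 0:
        quadrant = 0 if x >= 0 else 1
    else:
        quadrant = 2 if x < 0 else 3
    return quadrant + (4 if z < 0 else 0)
```

No trigonometry or face-plane intersection tests are needed, which is the efficiency claim made for the octahedral layout.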
In step 707, for each of the eight direction vectors D0, D1, D2, ... D7, it is determined which of the six rectangular images is intersected by the direction vector Di (where i ranges from 0 to 7), and which pixel of the particular rectangular image intersected by the direction vector Di corresponds to the direction vector Di. An example of a technique to identify the pixel of the rectangular image that corresponds to the direction vector Di is set forth below with respect to FIG. 9.

In step 711, for each direction vector Di, the color value of the pixel of the rectangular image that corresponds to the direction vector Di is stored as the color value of the pixel (irow,icol) of the face Fi of the octahedral environment map, where face Fi corresponds to the direction vector Di.

In step 713, it is determined if the last column (icol=irow) of the row (irow) of the faces of the octahedron has been processed in steps 705-711. If not, operation returns to step 703 to select the next column index in the range. As described above, the column index ranges from 0 to irow. Otherwise, operation continues to step 715.

In step 715, it is determined if the last row (irow=nrow-1) of the faces of the octahedron has been processed in steps 705-711. If not, operation returns to step 701 to select the next row index in the range. As described above, the row index ranges from 0 to nrow-1. Otherwise, when the last row of the faces of the octahedron has been processed, the operation ends.
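Taken together, steps 701-715 amount to a doubly nested loop over (irow, icol) with an inner loop over the eight faces. A minimal sketch, assuming nrow > 1 and a caller-supplied lookup_pixel function standing in for steps 707 and 711 (both names hypothetical), using a closed form of the FIG. 8 recurrence for the base components:

```python
def build_octahedron_map(nrow, lookup_pixel):
    # Sign triples for faces F0..F7, as in the Sx, Sy, Sz table.
    signs = [(1, 1, 1), (-1, 1, 1), (-1, -1, 1), (1, -1, 1),
             (1, 1, -1), (-1, 1, -1), (-1, -1, -1), (1, -1, -1)]
    faces = [[] for _ in signs]          # one flat triangular array per face
    step = 1.0 / (nrow - 1)
    for irow in range(nrow):             # step 701: select row
        for icol in range(irow + 1):     # step 703: icol runs 0..irow
            # step 705: base components (closed form of FIG. 8)
            dz = 1.0 - irow * step
            dy = icol * step
            dx = 1.0 - dz - dy
            for f, (sx, sy, sz) in enumerate(signs):
                di = (sx * dx, sy * dy, sz * dz)   # direction vector Di
                faces[f].append(lookup_pixel(di))  # steps 707/711
    return faces
```

Each face list ends up with nrow*(nrow+1)/2 entries, matching the triangular addressing.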
FIG. 8 illustrates a technique for generating the components Dx, Dy, Dz of the direction vectors D0, D1, D2, ... D7 for the pixel identified by the indices (irow,icol). The operation begins in step 801 by checking whether the indices (irow,icol) are set to (0,0), thus indicating that the current pixel is the initial pixel of the faces of the octahedron. If in step 801 the indices (irow,icol) are set to (0,0), operation continues to step 803 where the components Dx, Dy, Dz are initialized to the values 0, 0, 1, respectively, and the operation ends.

If in step 801 the indices (irow,icol) are not set to (0,0), operation continues to step 805 to check whether the index icol is set to 0, thus indicating that the current pixel is located at the beginning of the row irow on the faces of the octahedron.

If in step 805 it is determined that the index icol is set to 0, then in step 807 the components Dx, Dy, Dz of the current pixel are set as follows:

DzN = DzN-1 - (1/(nrow-1))
DxN = 1 - DzN
DyN = 0

where DxN, DyN, DzN are the Dx, Dy, Dz components for the current pixel (irow,icol) and DzN-1 is the Dz component for the pixels of the previous row, which is identified by the row index (irow-1).

If in step 805 it is determined that the index icol is not set to 0, then in step 809 the components are set as follows:

DxN = DxN-1 - (1/(nrow-1))
DyN = DyN-1 + (1/(nrow-1))
DzN = DzN-1

where DxN, DyN, DzN are the Dx, Dy, Dz components for the current pixel (irow,icol) and DxN-1, DyN-1, DzN-1 are the Dx, Dy, Dz components for the previous pixel, which is identified by the indices (irow,icol-1).
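The FIG. 8 recurrence can be sketched directly (a hypothetical illustration; the branches follow steps 801-809):

```python
def direction_components(nrow):
    # Returns {(irow, icol): (Dx, Dy, Dz)} for all pixels of a face,
    # computed by the FIG. 8 recurrence.
    comps = {}
    for irow in range(nrow):
        for icol in range(irow + 1):
            if (irow, icol) == (0, 0):
                dx, dy, dz = 0.0, 0.0, 1.0            # step 803
            elif icol == 0:
                _, _, dz_prev = comps[(irow - 1, 0)]  # previous row, col 0
                dz = dz_prev - 1.0 / (nrow - 1)       # step 807
                dx, dy = 1.0 - dz, 0.0
            else:
                dx_p, dy_p, dz_p = comps[(irow, icol - 1)]
                dx = dx_p - 1.0 / (nrow - 1)          # step 809
                dy = dy_p + 1.0 / (nrow - 1)
                dz = dz_p
            comps[(irow, icol)] = (dx, dy, dz)
    return comps
```

Note that Dx + Dy + Dz stays equal to 1 for every pixel, so each base vector lies on the plane of face F0; the sign flips of D0-D7 then carry it to the other seven faces.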
`FIG. 9 illustrates a technique for identifying the pixel of
`the rectangular image that c