PTO/SB/16
Approved for use through 1/31/99. OMB 0651-0037
Patent and Trademark Office; U.S. DEPARTMENT OF COMMERCE
Under the Paperwork Reduction Act of 1995, no persons are required to respond to a collection of information unless it displays a valid OMB control number.

PROVISIONAL APPLICATION FOR PATENT COVER SHEET
This is a request for filing a PROVISIONAL APPLICATION FOR PATENT under 37 CFR 1.53(c).

INVENTOR(S)
Given Name (first and middle [if any]): Ford
Family Name or Surname: Oxaal
Residence: 42 Western Avenue, Cohoes, NY 12047
[ ] Additional inventors are being named on the ___ separately numbered sheets attached hereto.

TITLE OF THE INVENTION (280 characters max)
METHOD FOR INTERACTIVELY VIEWING FULL-SURROUND IMAGE DATA AND APPARATUS THEREFOR

CORRESPONDENCE ADDRESS
Direct all correspondence to:
[ ] Customer Number (Place Customer Number Bar Code Label here / Type Customer Number here)
    OR
[X] Firm or Individual Name: Raymond H. J. Powell, Jr.
Address: P.O. Box 30269
City: Alexandria    State: Virginia    ZIP: 22310-0269
Country: USA    Telephone: (703) 924-8864    Fax: (703) 924-8891

ENCLOSED APPLICATION PARTS (check all that apply)
[X] Specification    Number of Pages: 6
[X] Drawing(s)    Number of Sheets: 4
[X] Small Entity Statement
[X] Other (specify): Appendix - 15 Pages

METHOD OF PAYMENT OF FILING FEES FOR THIS PROVISIONAL APPLICATION FOR PATENT (check one)
[X] A check or money order is enclosed to cover the filing fees.
[X] The Commissioner is hereby authorized to charge filing fees or credit any overpayment to Deposit Account Number 16-2372.
FILING FEE AMOUNT ($): $75.00

The invention was made by an agency of the United States Government or under a contract with an agency of the United States Government.
[X] No.
[ ] Yes, the name of the U.S. Government agency and the Government contract number are: ___

Respectfully submitted,

SIGNATURE: [signature]    Date: January 12, 1998
TYPED or PRINTED NAME: Raymond H. J. Powell, Jr.    REGISTRATION NO. (if appropriate): 34,231
TELEPHONE: (703) 924-8864    Docket Number: RP98-0001

USE ONLY FOR FILING A PROVISIONAL APPLICATION FOR PATENT

Burden Hour Statement: This form is estimated to take 0.2 hours to complete. Time will vary depending upon the needs of the individual case. Any comments on the amount of time you are required to complete this form should be sent to the Chief Information Officer, Patent and Trademark Office, Washington, DC 20231. DO NOT SEND FEES OR COMPLETED FORMS TO THIS ADDRESS. SEND TO: Box Provisional Application, Assistant Commissioner for Patents, Washington, DC 20231.
`
`
`
`
PTO/SB/09 (12-97)
Approved for use through 9/30/00. OMB 0651-0031
Patent and Trademark Office; U.S. DEPARTMENT OF COMMERCE
Under the Paperwork Reduction Act of 1995, no persons are required to respond to a collection of information unless it displays a valid OMB control number.

STATEMENT CLAIMING SMALL ENTITY STATUS
(37 CFR 1.9(f) & 1.27(b)) -- INDEPENDENT INVENTOR
Docket Number (Optional): RP98-0001

Applicant, Patentee, or Identifier: Ford OXAAL, 42 Western Ave., Cohoes, NY 12047
Application or Patent No.: New Provisional Patent Application Attached Hereto
Filed or Issued: January 12, 1998
Title: METHOD FOR INTERACTIVELY VIEWING FULL-SURROUND IMAGE DATA AND APPARATUS THEREFOR

As a below named inventor, I hereby state that I qualify as an independent inventor as defined in 37 CFR 1.9(c) for purposes of paying reduced fees to the Patent and Trademark Office described in:
[X] the specification filed herewith with title as listed above.
[ ] the application identified above.
[ ] the patent identified above.

I have not assigned, granted, conveyed, or licensed, and am under no obligation under contract or law to assign, grant, convey, or license, any rights in the invention to any person who would not qualify as an independent inventor under 37 CFR 1.9(c) if that person had made the invention, or to any concern which would not qualify as a small business concern under 37 CFR 1.9(d) or a nonprofit organization under 37 CFR 1.9(e).

Each person, concern, or organization to which I have assigned, granted, conveyed, or licensed or am under an obligation under contract or law to assign, grant, convey, or license any rights in the invention is listed below:
[X] No such person, concern, or organization exists.
[ ] Each such person, concern, or organization is listed below.

Separate statements are required from each named person, concern, or organization having rights to the invention stating their status as small entities. (37 CFR 1.27)

I acknowledge the duty to file, in this application or patent, notification of any change in status resulting in loss of entitlement to small entity status prior to paying, or at the time of paying, the earliest of the issue fee or any maintenance fee due after the date on which status as a small entity is no longer appropriate. (37 CFR 1.28(b))

NAME OF INVENTOR: Ford OXAAL    Signature of inventor: [signature]    Date: January 12, 1998
NAME OF INVENTOR:               Signature of inventor:                Date:
NAME OF INVENTOR:               Signature of inventor:                Date:

Burden Hour Statement: This form is estimated to take 0.2 hours to complete. Time will vary depending upon the needs of the individual case. Any comments on the amount of time you are required to complete this form should be sent to the Chief Information Officer, Patent and Trademark Office, Washington, DC 20231. DO NOT SEND FEES OR COMPLETED FORMS TO THIS ADDRESS. SEND TO: Assistant Commissioner for Patents, Washington, DC 20231.
`
`
`
`
METHOD FOR INTERACTIVELY VIEWING FULL-SURROUND IMAGE DATA
AND APPARATUS THEREFOR

The method and corresponding apparatus according to the present invention are predicated on the following starting, i.e., given, conditions:

(1) the set of all rays V from a given point VP, as illustrated in Fig. 1;

(2) a set of points P not including VP, each point in P being contained by one and only one ray in V, as illustrated in Fig. 2; and

(3) the set of color values C, each color in C being associated with one and only one ray in V, and also thereby associated with the point in P contained by said ray.

Moreover, the following definitions apply:
`
(1) The visible world: points P;

(2) A projection of P: any number of points Pn contained in P may be slid closer to or further from point VP along their containing rays. The resultant new configuration of points P is called a projection of P. The concept can best be understood by referring to Fig. 3;

(3) Magic point, viewpoint, or point of projection: point VP. Please note, no matter how points P are projected, their appearance will remain the same when viewed from point VP. This latter concept may best be understood by referring to Fig. 4;

(4) Full-surround image data: data which samples the points P. This data encodes, explicitly or implicitly, the association of a color value with a given direction from a given point of projection. Full-surround image data is useful because, when delivered to many viewers, it enables the building of an independent viewing system, defined below;
`
(5) p-sphere: a computer graphics representation of any polyhedron where there exists at least one point x inside (neither intersecting, nor lying outside) the polyhedron which may be connected to every point of the polyhedron with a distinct line segment, no portion of which said line segment lies outside the polyhedron or intersects the polyhedron at a point not an endpoint. The union of all such points x forms the region X of the p-sphere. For a convex p-sphere, the region X is all points of the interior of the p-sphere. Examples of computer graphics objects which may be modeled as p-spheres: tetrahedron, cube, dodecahedron, faceted sphere;
`
(6) p-surface: a computer graphics representation of any surface with a well-defined inside and outside, where there exists at least one point x inside (neither intersecting, nor lying outside) the surface which may be connected to every point of the surface with a distinct line segment, no portion of which said line segment lies outside the surface or intersects the surface at a point not an endpoint. The union of all such points x forms the region X of the p-surface. For a convex p-surface, the region X is all points of the interior of the p-surface. Examples of computer graphics objects which may be modeled as p-surfaces: tetrahedron, cube, sphere, ellipsoid, cylinder, apple torus, lemon torus, b-spline surfaces closed or periodic in u and v. A p-sphere is a p-surface;

(7) linear perspective: the projection of a portion of the visible world onto a plane or portion of a plane, as illustrated in Fig. 5;
`
(8) circular perspective: the projection of the visible world, or portion thereof, onto a plane, or a portion of a plane, after performing the perspective transformation of the visible world according to patent 5,684,937, where the constant used is 2. After such a transformation, when the direction of vision specified in said transformation is perpendicular to the projection plane, there is a one-to-one mapping of all the points P defining the visible world and all the points of an infinite plane. This definition is illustrated in Fig. 6;

(9) stereographic projection: the one-to-one mapping of each point on a sphere to each point of an infinite tangent plane, said mapping performed by constructing the set of all rays from the antipodal point of the tangent point, the intersection of said rays with the plane defining said mapping. The understanding of this definition will be facilitated by reference to Fig. 7 (and by the illustrative sketch following these definitions). Please note that circular perspective and stereographic projection produce geometrically similar (identically proportioned) mappings when the direction of vision specified in the perspective transformation of circular perspective contains the tangent point specified in stereographic projection;
`
(10) independent viewing system: an interactive viewing system in which multiple viewers can freely, independently of one another, and independently of the source of the image data, pan that image data in all directions with the effect that each viewer feels like they are "inside" of that imagery, or present at the location from which the imagery was produced, recorded, or transmitted; and

(11) standard computer graphics system: a computer graphics system which supports linear perspective viewing, including the changing of the focal length or the altering of the view angle, the apparent rotation of viewed objects, and/or the apparent changing of direction of vision, and the texture mapping of image data onto objects within the class of p-surface.
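For purposes of illustration only (this sketch is not part of the original disclosure), the following C fragment computes the stereographic projection of definition (9) for the unit sphere, taking the tangent point to be (0, 0, -1) so that rays are cast from its antipodal point (0, 0, 1) onto the plane z = -1; the function name and the particular choice of tangent point are assumptions of the example.

    #include <stdio.h>

    /* Illustrative sketch: stereographic projection of a point (x, y, z) on the
       unit sphere onto the tangent plane z = -1, using rays cast from the
       antipodal point (0, 0, 1) of the tangent point (0, 0, -1); see definition (9). */
    static void stereographic(double x, double y, double z, double* u, double* v)
    {
        /* the ray from (0, 0, 1) through (x, y, z) meets z = -1 where
           1 + t * (z - 1) = -1, i.e. t = 2 / (1 - z) */
        double t = 2.0 / (1.0 - z);
        *u = t * x;
        *v = t * y;
    }

    int main(void)
    {
        double u, v;
        stereographic(0.0, 1.0, 0.0, &u, &v);   /* an "equator" point maps to (0, 2) */
        printf("u = %f, v = %f\n", u, v);
        return 0;
    }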
`
It will be appreciated that, in a standard computer graphics system, by texture mapping full-surround image data onto a p-surface such that the resultant texture map is effectively roughly equivalent to projecting said full-surround imagery onto the p-surface from some point Q contained in the region X of the p-surface, a representation of the visible world is achieved.
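As an illustration of this texture-mapping step (a sketch added for clarity, not part of the original filing), the fragment below uses a GLU quadric sphere as the p-surface and assumes a full-surround texture has already been loaded into texName with glTexImage2D; viewed from a point Q inside the sphere, the texture-mapped sphere stands in for the visible world. The function name and texture layout are assumptions of the example.

    #include <GL/gl.h>
    #include <GL/glu.h>

    /* Minimal sketch: texture map a full-surround image onto a p-sphere
       (here a GLU quadric sphere) whose faces point inward, toward Q. */
    static void draw_p_sphere(GLuint texName)
    {
        GLUquadric* quad = gluNewQuadric();
        gluQuadricTexture(quad, GL_TRUE);        /* generate texture coordinates */
        gluQuadricOrientation(quad, GLU_INSIDE); /* faces visible from inside, i.e. from Q */
        glEnable(GL_TEXTURE_2D);
        glBindTexture(GL_TEXTURE_2D, texName);
        gluSphere(quad, 1.0, 50, 50);            /* the p-sphere, radius 1 */
        gluDeleteQuadric(quad);
    }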
`
Now, by setting the viewpoint of the graphics system close to the point Q, and then enabling the viewer to rotate that p-sphere around a point close to point Q, an independent viewing system in linear perspective is achieved.
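A minimal sketch of such an independent viewing system (assuming the draw_p_sphere() fragment above, a fixed 60-degree view angle, and hypothetical pan/tilt variables driven by the viewer's input) is:

    #include <GL/gl.h>
    #include <GL/glu.h>

    /* Sketch only: the eye sits essentially at Q (the origin), and the viewer's
       accumulated pan/tilt angles rotate the textured p-sphere about Q. */
    float pan = 0.0f, tilt = 0.0f;   /* updated by mouse or keyboard input */

    static void draw_view(float aspect)
    {
        glMatrixMode(GL_PROJECTION);
        glLoadIdentity();
        gluPerspective(60.0, aspect, 0.1, 10.0);  /* viewpoint at Q = origin */

        glMatrixMode(GL_MODELVIEW);
        glLoadIdentity();
        glRotatef(tilt, 1.0f, 0.0f, 0.0f);        /* viewer-controlled rotation */
        glRotatef(pan,  0.0f, 1.0f, 0.0f);
        draw_p_sphere(1);                         /* assumed texture object id */
    }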
`
By further adding the capability for the viewer to alter the focal length or angle of view, zoom ability is achieved.
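A corresponding zoom control can be sketched as follows (the variable names and clamping limits are assumptions; the point is only that the view angle handed to gluPerspective is what changes, rather than the geometry):

    #include <GL/gl.h>
    #include <GL/glu.h>

    /* Sketch only: narrowing the view angle zooms in (longer focal length),
       widening it zooms out. */
    float viewangle = 60.0f;                        /* degrees */

    static void zoom(float delta_degrees)
    {
        viewangle += delta_degrees;                 /* e.g. zoom(-1.0f) zooms in */
        if (viewangle < 10.0f)  viewangle = 10.0f;  /* assumed limits */
        if (viewangle > 170.0f) viewangle = 170.0f;
    }

    static void apply_projection(float aspect)
    {
        glMatrixMode(GL_PROJECTION);
        glLoadIdentity();
        gluPerspective(viewangle, aspect, 0.1, 10.0);
    }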
`
In the case where the p-surface used to model the visible world is a good approximation of a sphere, that is, a substantially better model than a tetrahedron or a cube, and where the point Q of that representation is close to the approximate center of that p-surface, then by allowing the viewer to move the viewpoint away from point Q to a point close to the surface of the p-surface, an independent viewing system is achieved in circular perspective. This is astounding when the native graphics system only supports viewing in linear perspective. It works because such an independent viewing system models stereographic projection, which is geometrically similar to circular perspective.
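The move from Q toward the p-surface can likewise be sketched (assumed variable names and sign convention; this mirrors the tz offset used in Appendix A below, and is illustrative only):

    #include <GL/gl.h>
    #include <GL/glu.h>

    /* Sketch only: slide the eye from Q toward the surface of a near-spherical
       p-surface of radius 1.  With eyeOffset near 0 the view is ordinary linear
       perspective; as eyeOffset approaches the radius, the configuration models
       a stereographic projection and so approximates circular perspective. */
    float eyeOffset = 0.0f;   /* 0 = at Q; near 1.0 = near the p-surface */

    static void apply_eye_offset(float aspect)
    {
        glMatrixMode(GL_PROJECTION);
        glLoadIdentity();
        gluPerspective(120.0, aspect, 0.01, 10.0);  /* wide view angle assumed */
        glTranslatef(0.0f, 0.0f, eyeOffset);        /* move the eye off of Q */
    }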
`
By letting the viewer move the viewpoint outside the p-surface, the viewer can get a feeling for how the independent viewing works. This can be useful for designers of systems containing many hyper-linked full-surround surfaces. For example, many p-spheres picturing the penthouse terraces of New York may be linked together so that the viewer may hop from p-sphere to p-sphere, simulating a tour of the terraces.
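One possible sketch of such hopping between hyper-linked p-spheres (hypothetical names throughout; not from the original filing) is simply to make a different full-surround texture, or model, current and redraw:

    #include <GL/gl.h>

    /* Sketch only: one full-surround texture per linked p-sphere. */
    #define NUM_TERRACES 3
    GLuint terrace[NUM_TERRACES];   /* hypothetical texture objects, one per p-sphere */
    int current_terrace = 0;

    static void hop_to_next_terrace(void)
    {
        current_terrace = (current_terrace + 1) % NUM_TERRACES;
        /* the next redraw texture maps terrace[current_terrace] onto the p-sphere */
    }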
`
The above described method of the invention may be performed, for example, by the apparatus shown in FIG. 8. This viewing apparatus is composed of a central processing unit (CPU) 10 for controlling the components of the system in accordance with a control program stored in read-only memory (ROM) 60 or the like. The CPU 10 stores temporary data used during execution of the inventive method, i.e., viewing method, in random-access memory (RAM) 40. After the majority of the method steps are performed, the generated visible points are displayed on display device 20 (e.g., cathode ray tube (CRT) or liquid crystal display (LCD)) as visible points in accordance with the appropriate color values stored in color table 30. Advantageously, a spherical data generator device 70, such as a camera or the like, and preferably the data generator system disclosed in pending application Serial No. 08/749,166 (November 14, 1996), which application is incorporated herein by reference for all purposes, may be used to generate different color values corresponding to the visible points of a viewed object, image or picture. An input device 50 is provided for entering data such as the viewer's viewing direction, reference plane, configuration (scaling) factor k, and other pertinent information used in the inventive viewing method.
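For illustration, the kind of per-viewer state that input device 50 supplies and that CPU 10 keeps in RAM 40 while executing the viewing method might be collected as follows (the field names are assumptions added for clarity, not the original implementation):

    /* Illustrative sketch only: viewer parameters entered via input device 50. */
    typedef struct viewer_state {
        float view_dir[3];      /* viewer's viewing direction */
        float ref_plane[4];     /* reference plane */
        float k;                /* configuration (scaling) factor */
        float view_angle;       /* focal length / zoom */
    } ViewerState;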
`
Appendix A shows an example implementation of a viewer which exercises claim 10 (below).
`
`
`
`
WHAT IS CLAIMED IS:

1. A method of modeling of the visible world by texture mapping full-surround image data onto a p-surface such that the resultant texture map is roughly equivalent to projecting full-surround image data onto the p-surface from a point Q inside the region X of the p-surface.
`
2. An independent viewing system using the method of modeling the visible world as recited in claim 1, wherein the viewer is allowed to interactively rotate the model.

3. An independent viewing system using the method of modeling the visible world as recited in claim 1, wherein the viewer is allowed to interactively change the direction of vision.

4. An independent viewing system using the method of modeling the visible world as recited in claim 1, wherein the viewer is allowed to interactively alter the focal length or view angle.

5. An independent viewing system using the method of modeling the visible world as recited in claim 1, wherein the viewer is allowed to interactively move the viewpoint.

6. An independent viewing system using the method of modeling the visible world as recited in claim 1, wherein the viewpoint is close to the surface of the p-sphere.

7. An independent viewing system using the method of modeling the visible world as recited in claim 1, wherein the viewer is allowed to interactively move the viewpoint.

8. An independent viewing system using the method of modeling the visible world as recited in claim 1, wherein the viewer is allowed to select an area of the image and cause another model of the visible world to be loaded into said viewing system.

9. An independent viewing system using the method of modeling the visible world as recited in claim 1, wherein the viewer is allowed to perform any combination of actions specified in claims 2 through 8.

10. An independent viewing system using the method of modeling the visible world as recited in claim 1 and a standard computer graphics system to perform any combination of actions specified in claims 2 through 8.
`
`
`
`
[Drawing sheet: FIGURE 1, FIGURE 2, and FIGURE 3 (rays V from point VP, points P, and a projection of P); hand-drawn figures not reproducible in this text copy.]
`
`
`
[Drawing sheet: FIGURE 4A and FIGURE 4B (points P viewed from point VP); hand-drawn figures not reproducible in this text copy.]
`
`
`
`
`
`
`
[Drawing sheet: figure annotated "antipodal point" and "tangent point" (cf. FIG. 7, stereographic projection); hand-drawn figure not reproducible in this text copy.]
`
`
`
[Drawing sheet: Fig. 8, block diagram of the viewing apparatus: CPU 10, DISPLAY 20, COLOR TABLE 30, RAM 40, INPUT 50, ROM 60, SPHERICAL DATA SET GENERATOR 70.]
`
`
/* Includes required */
#include <GL/gl.h>
#include <GL/glut.h>
#include <stdio.h>
#include <stdlib.h>   /* for malloc() */
#include <ppm.h>
#include <math.h>

/**
 * something because of windows
 */
void _eprintf() {
}
`
/**
 * our data structure of choice
 */
typedef struct obj {
    /* other parameters */
    float matrix[16];

    /* view angle */
    float viewangle;

    /* aspect ratio */
    float aspect;

    /* z of the camera */
    float tz;

    /* ry of the camera */
    float ry;
} Obj;

/* hold the display lists for textures */
typedef struct texture {
    int tex1;
    int tex2;
} Texture;

/**
 * our global variables
 */
/* camera settings */
Obj scene;

/* texture stuff */
Texture def;
Texture* current_texture = &def;

/* track the next display list number */
int nextDLnum = 2;

/* stuff for lighting */
float lightPos[4]  = {2.0, 4.0, 2.0, 0};
float lightDir[4]  = {0, 0, 1.0, 1.0};
float lightAmb[4]  = {0.4, 0.4, 0.4, 1.0};
float lightDiff[4] = {0.8, 0.8, 0.8, 1.0};
float lightSpec[4] = {0.8, 0.8, 0.8, 1.0};
int lights = 0;
int outsideView = 0;
int parent;

#define HEMISPHERE 1
void createHemisphere(int listNum, int numPts, int geom);
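/*
 * EDITOR'S ILLUSTRATIVE SKETCH -- not part of the original appendix.  The
 * listing declares createHemisphere() above, but its body was not legible in
 * this copy.  The version below is one plausible implementation (assumed
 * orientation about the +z axis and an assumed orthographic texture mapping
 * onto the unit disk), provided only so the listing is self-contained.
 */
void createHemisphere(int listNum, int numPts, int geom)
{
    int i, j;

    glNewList(listNum, GL_COMPILE);
    for (i = 0; i < numPts; i++) {
        /* two adjacent rings between the pole (phi = 0) and the rim (phi = pi/2) */
        float phi0 = (float)(M_PI / 2.0 * i / numPts);
        float phi1 = (float)(M_PI / 2.0 * (i + 1) / numPts);

        glBegin(geom);   /* e.g. GL_TRIANGLE_STRIP, as passed by initialize_objects() */
        for (j = 0; j <= numPts; j++) {
            float theta = (float)(2.0 * M_PI * j / numPts);
            float x0 = sin(phi0) * cos(theta), y0 = sin(phi0) * sin(theta), z0 = cos(phi0);
            float x1 = sin(phi1) * cos(theta), y1 = sin(phi1) * sin(theta), z1 = cos(phi1);

            /* texture coordinates: project the vertex onto the unit disk and
               remap to [0,1] x [0,1] (an assumption; adjust to the source imagery) */
            glTexCoord2f(0.5f + 0.5f * x0, 0.5f + 0.5f * y0);
            glVertex3f(x0, y0, z0);
            glTexCoord2f(0.5f + 0.5f * x1, 0.5f + 0.5f * y1);
            glVertex3f(x1, y1, z1);
        }
        glEnd();
    }
    glEndList();
}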
`
/**
 * Read in the ppm files and create display lists for a texture
 * returns the dimension of the image
 */
pixel **map1, **map2;
GLubyte *tex1, *tex2, **tmpPP, *tmpP;
void readTexture(Texture* t, char* file1, char* file2) {
    FILE *fp1, *fp2;
    int cols, rows, i, j, index;
    pixval maxval;

    /* open the files */
    fp1 = fopen(file1, "r");
    fp2 = fopen(file2, "r");
    if (!fp1) {
        fprintf(stderr, "Couldn't open %s\n", file1);
    }
    if (!fp2) {
        fprintf(stderr, "Couldn't open %s\n", file2);
    }

    /* read the ppm files */
    map1 = ppm_readppm(fp1, &cols, &rows, &maxval);
    fprintf(stderr, "%s: rows = %d \t cols = %d\n", file1, rows, cols, maxval);
    map2 = ppm_readppm(fp2, &cols, &rows, &maxval);
    fprintf(stderr, "%s: rows = %d \t cols = %d\n", file2, rows, cols, maxval);

    /* convert them */
    tex1 = malloc(sizeof(GLubyte) * rows * cols * 3);
    tex2 = malloc(sizeof(GLubyte) * rows * cols * 3);
    index = 0;
    for (i = 0; i < rows; i++) {
        for (j = 0; j < cols; j++) {
            /* R */
            tex1[index] = PPM_GETR(map1[i][j]);
            tex2[index] = PPM_GETR(map2[i][j]);
            index++;

            /* G */
            tex1[index] = PPM_GETG(map1[i][j]);
            tex2[index] = PPM_GETG(map2[i][j]);
            index++;

            /* B */
            tex1[index] = PPM_GETB(map1[i][j]);
            tex2[index] = PPM_GETB(map2[i][j]);
            index++;
        }
    }

    /* create the textures */
    /* new display list */
    glNewList(nextDLnum, GL_COMPILE);
    t->tex1 = nextDLnum;
    nextDLnum++;
    glTexImage2D(GL_TEXTURE_2D, 0, 3, cols, rows, 0, GL_RGB, GL_UNSIGNED_BYTE,
                 tex1);
    glEndList();

    /* new display list */
    glNewList(nextDLnum, GL_COMPILE);
    t->tex2 = nextDLnum;
    nextDLnum++;
    glTexImage2D(GL_TEXTURE_2D, 0, 3, cols, rows, 0, GL_RGB, GL_UNSIGNED_BYTE,
                 tex2);
    glEndList();
}
`
/**
 * this will initialize the display lists for the objects
 */
void initialize_objects(int argc, char** argv)
{
    float tmp[4];

    /* read in the texture */
    readTexture(&def, argv[1], argv[2]);

    /* create hemisphere */
    createHemisphere(1, 50, GL_TRIANGLE_STRIP);

    /* scene */
    scene.viewangle = 130;
    scene.tz = 0;
    scene.ry = 0;
}
`
/*
 * Clear the screen, draw the objects
 */
void display()
{
    float tmp[4];
    float height;

    /* clear the screen */
    glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);

    /* adjust for scene orientation */
    glMatrixMode(GL_PROJECTION);
    if (outsideView) {
        glLoadIdentity();
        gluPerspective(45, scene.aspect, 0.1, 10.0);
        glTranslatef(0, 0, -3);
        glRotatef(45, 1, 0, 0);
        glRotatef(45, 0, 1, 0);
        glDisable(GL_TEXTURE_2D);
        glColor3f(.8, .8, .8);
    } else {
        glLoadIdentity();
        gluPerspective(scene.viewangle, scene.aspect, 0.1, 10.0);
        glTranslatef(0, 0, scene.tz);
        glRotatef(scene.ry, 0, 1, 0);
    }

    /* draw our models */
    glMatrixMode(GL_MODELVIEW);
    glPushMatrix();

    if (outsideView) {
        /* transform to where the camera would be */
        glPushMatrix();

        /* draw a cube for the camera */
        glLoadIdentity();
        glRotatef(180, 1, 0, 0);
        glTranslatef(0, 0, scene.tz);
        tmp[0] = tmp[1] = tmp[2] = .8;
        tmp[3] = 1;
        glMaterialfv(GL_FRONT_AND_BACK, GL_SPECULAR, tmp);
        glMaterialf(GL_FRONT_AND_BACK, GL_SHININESS, 0.0);
        glMaterialfv(GL_FRONT_AND_BACK, GL_AMBIENT_AND_DIFFUSE, tmp);
        glutSolidCube(.1);

        /* draw a cone for the view frustum */
        glLoadIdentity();
        height = 1 - scene.tz;
        glRotatef(45, 0, 0, 1);
        glTranslatef(0, 0, -1);
        tmp[0] = tmp[1] = 1;
        tmp[2] = 0;
        tmp[3] = .3;
        glMaterialfv(GL_FRONT_AND_BACK, GL_SPECULAR, tmp);
        glMaterialf(GL_FRONT_AND_BACK, GL_SHININESS, 0.0);
        glMaterialfv(GL_FRONT_AND_BACK, GL_AMBIENT_AND_DIFFUSE, tmp);
        glutSolidCone(tan(scene.viewangle * 3.14 / 360.0) * height, height, 20, 1);
        glPopMatrix();
        glEnable(GL_TEXTURE_2D);
    }

    /* now draw the semisphere */
    if (lights) {
        tmp[0] = tmp[1] = tmp[2] = .8;
        tmp[3] = .8;
        glMaterialfv(GL_FRONT_AND_BACK, GL_SPECULAR, tmp);
        glMaterialf(GL_FRONT_AND_BACK, GL_SHININESS, 10.0);
        glMaterialfv(GL_FRONT_AND_BACK, GL_AMBIENT_AND_DIFFUSE, tmp);
    }
    glCallList(current_texture->tex1);
    glCallList(HEMISPHERE);

    if (lights) {
        tmp[0] = tmp[1] = tmp[2] = .5;
        tmp[3] = .5;
        glMaterialfv(GL_FRONT_AND_BACK, GL_SPECULAR, tmp);
        glMaterialf(GL_FRONT_AND_BACK, GL_SHININESS, 10.0);
        glMaterialfv(GL_FRONT_AND_BACK, GL_AMBIENT_AND_DIFFUSE, tmp);
    }
    glRotatef(180.0, 0.0, 0.0, 1.0);
    glCallList(current_texture->tex2);
    glCallList(HEMISPHERE);
    glPopMatrix();

    fprintf(stderr, "%s\n", gluErrorString(glGetError()));
    glutSwapBuffers();
}
`
/*
 * Handle Menus
 */
#define M_QUIT 1
void Select(int value)
{
    switch (value) {
    case M_QUIT:
        exit(0);
        break;
    }
    glutPostRedisplay();
}

void create_menu() {
    fprintf(stderr, "Press ? for help\n");
    glutCreateMenu(Select);
    glutAddMenuEntry("Quit", M_QUIT);
    glutAttachMenu(GLUT_RIGHT_BUTTON);
}
`
/* Initializes shading model */
void myInit(void)
{
    glEnable(GL_DEPTH_TEST);
    glShadeModel(GL_SMOOTH);

    /* texture stuff */
    glPixelStorei(GL_UNPACK_ALIGNMENT, sizeof(GLubyte));
    glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP);
    glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP);
    glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);
    glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
    glTexEnvf(GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_DECAL);
    glEnable(GL_TEXTURE_2D);
}
`
/*
 * Called when the window is first opened and whenever
 * the window is reconfigured (moved or resized).
 */
void myReshape(int w, int h)
{
    /* define the viewport */
    glViewport(0, 0, w, h);
    scene.aspect = 1.0 * (GLfloat)w / (GLfloat)h;
    glMatrixMode(GL_PROJECTION);
    glLoadIdentity();
    gluPerspective(scene.viewangle, scene.aspect, 0.1, 10.0);
    glMultMatrixf(scene.matrix);
    glMatrixMode(GL_MODELVIEW);   /* back to modelview matrix */
}
`
/*
 * Keyboard handler
 */
void
Key(unsigned char key, int x, int y)
{
    float matrix[16];
    glMatrixMode(GL_MODELVIEW);
    glGetFloatv(GL_MODELVIEW_MATRIX, matrix);
    glLoadIdentity();
    fprintf(stderr, "%d - %c  ", key, key);
    switch (key) {
    case 'o':
        if (!outsideView) {
            fprintf(stderr, "outside on  ");
            outsideView = 1;

            /* turn on blending */
            glEnable(GL_BLEND);
            glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);

            /* We want to see color */
            glTexEnvf(GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_MODULATE);

            /* turn on our spotlight */
            glEnable(GL_LIGHT1);
            glLightfv(GL_LIGHT1, GL_AMBIENT,  lightAmb);
            glLightfv(GL_LIGHT1, GL_DIFFUSE,  lightDiff);
            glLightfv(GL_LIGHT1, GL_SPECULAR, lightSpec);
            glLightfv(GL_LIGHT1, GL_SPOT_DIRECTION, lightDir);
        } else {
            fprintf(stderr, "outside off ");
            outsideView = 0;
            glTexEnvf(GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_DECAL);
            glDisable(GL_BLEND);
        }
        break;
    case 'F':
        fprintf(stderr, "flat ");
        glShadeModel(GL_FLAT);
        break;
    case 'f':
        fprintf(stderr, "smooth ");
        glShadeModel(GL_SMOOTH);
        break;
    case 'y':
        printf("ry = %f\n", scene.ry);
        scene.ry -= 5;
        break;
    case 'Y':
        scene.ry += 5;
        break;
    case 'z':
        scene.tz -= .02;
        fprintf(stderr, " tz = %f ", scene.tz);
        break;
    case 'Z':
        scene.tz += .02;
        fprintf(stderr, " tz = %f ", scene.tz);
        break;
    case 'a':
        scene.viewangle -= 1;
        fprintf(stderr, " angle: %f ", scene.viewangle);
        break;
    case 'A':
        scene.viewangle += 1;
        fprintf(stderr, "angle: %f ", scene.viewangle);
        break;
    case 55:                               /* numeric keypad 7 */
        glRotatef(-5, 0.0, 0.0, 1.0);
        break;
    case 57:                               /* numeric keypad 9 */
        glRotatef(5, 0.0, 0.0, 1.0);
        break;
    case 52:                               /* numeric keypad 4 */
        glRotatef(-5, 0.0, 1.0, 0.0);
        break;
    case 54:                               /* numeric keypad 6 */
        glRotatef(5, 0.0, 1.0, 0.0);
        break;
    case 56:                               /* numeric keypad 8 */
        glRotatef(5, 1.0, 0.0, 0.0);
        break;
    case 50:                               /* numeric keypad 2 */
        glRotatef(-5, 1.0, 0.0, 0.0);
        break;
    case 'q':
        if (lights) {
            glDisable(GL_LIGHT0);
            glDisable(GL_LIGHTING);
            lights = 0;
            fprintf(stderr, "no lights ");
        } else {
            glEnable(GL_LIGHTING);
            glEnable(GL_LIGHT0);
            glLightfv(GL_LIGHT0, GL_POSITION, lightPos);
            glLightfv(GL_LIGHT0, GL_AMBIENT,  lightAmb);
            glLightfv(GL_LIGHT0, GL_DIFFUSE,  lightDiff);
            glLightfv(GL_LIGHT0, GL_SPECULAR, lightSpec);
            lights = 1;
            fprintf(stderr, "lights ");
        }
        break;
    case 't':
        fprintf(stderr, "texture off ");
        glDisable(GL_TEXTURE_2D);
        break;
    case 'T':
        fprintf(stderr, "texture on ");
        glEnable(GL_TEXTURE_2D);
        break;
    case '?':
        fprintf(stderr, "hjkl - rotate current object\n");
        fprintf(stderr, "s/S - shrink / grow the object or zoom the scene\n");
        fprintf(stderr, "a/A viewangle\n");
        fprintf(stderr, "z/Z camera position\n");
        fprintf(stderr, "f/F flat smooth\n");
        fprintf(stderr, "Escape quits\n");
        break;
    case 27:                               /* Esc will quit */
        exit(1);
        break;
    default:
        fprintf(stderr, "Unbound key - %d ", key);
        break;
    }
    fprintf(stderr, "\n");
    glMultMatrixf(matrix);
    glutPostRedisplay();
}
`
/*
 * Main Loop
 * Open window with initial window size, title bar,
 * RGBA display mode, and handle input events.
 */
int main(int argc, char** argv)
{
    glutInit(&argc, argv);
    glutInitDisplayMode(GLUT_DOUBLE | GLUT_RGBA);
    parent = glutCreateWindow(argv[0]);
    myInit();
    glutKeyboardFunc(Key);
    glutReshapeFunc(myReshape);
    glutDisplayFunc(display);
    create_menu();
    initialize_objects(argc, argv);
    glutMainLoop();
    return 0;
}
`
`
#ifndef GGCONSTANTS_H
#define GGCONSTANTS_H

const double ggFourPi = 12.566371;        /* needs more digits */
const double ggTwoPi = 6.2831853;         /* needs more digits */
const double ggPi = 3.14159265358979323846;
const double ggHalfPi = 1.57079632679489661923;
const double ggThirdPi = 1.0471976;       /* needs more digits */
const double ggQuarterPi = 0.78539816;    /* needs more digits */
const double ggInversePi = 0.31830989;
const double ggSqrtTwo = 1.4142135623730950488;
const double ggInverseSqrtTwo = 0.70710678;
const double ggSqrtThree = 1.7320508075688772935;
const double ggSqrtFive = 2.2360679774997896964;
const double ggE = 2.718281828459045235360287;

const double ggRad = 57.29577951308232;

#ifdef sun
const double ggInfinity = 1.0e10;
#else
#include <float.h>
const double ggInfinity = DBL_MAX;
#endif

#ifndef M_PI
#define M_PI ggPi
#endif

const double ggBigEpsilon = 0.0001;
const double ggEpsilon = 0.000001;
const double ggSmallEpsilon = 0.000000001;
const double ggTinyEpsilon = 0.000000000001;

const double ggColorRatio = 0.0039215686274509803;

#define ggMin(x,y) ((x < y) ? x : y)
#define ggMax(x,y) ((x > y) ? x : y)

#endif
`
`
/* Includes required */
#ifdef WINDOWS
#include <windows.h>
#endif

#include <GL/gl.h>
#include <GL/glut.h>
#include <stdio.h>
#include <stdlib.h>   /* for malloc() */
#include <math.h>
#ifdef USE_NETPBM
#include <ppm.h>
#endif

/**
 * something because of windows
 */
void _eprintf() {
}
/**
 * our data structure of choice
 */
typedef struct obj {
    /* other parameters */
    float matrix[16];

    /* view angle */
    float viewangle;

    /* aspect ratio */
    float aspect;

    /* z of the camera */
    float tz;

    /* ry of the camera */
    float ry;
} Obj;

/* hold the display lists for textures */
typedef struct texture {
    int tex1;
    int tex2;
} Texture;

/**
 * our global variables
 */
/* camera settings */
Obj scene;

/* texture stuff */
Texture def;
Texture* current_texture = &def;

/* track the next display list number */
int nextDLnum = 2;

/* stuff for lighting */
float lightPos[4]  = {2.0, 4.0, 2.0, 0};
float lightDir[4]  = {0, 0, 1.0, 1.0};
float lightAmb[4]  = {0.3, 0.3, 0.3, 1.0};
float lightDiff[4] = {0.6, 0.6, 0.6, 1.0};
float lightSpec[4] = {0.6, 0.6, 0.6, 1.0};
float clipDistance = 2.14;

int left, right, parent;
int width, height;
#ifdef USE_NETPBM
pixel** ppmPixels = 0;
#endif
GLubyte* sgiPixels = 0;
FILE* commands;
int doTakeSnapshot = 0;

#define HEMISPHERE 1
void createHemisphere(int listNum, int numPts, int geom);
void draw_left();
void draw_right();
void Key(unsigned char, int, int);
`
/**
 * read the frame buffer and write out a ppm file
 */
void takesnapshot() {
#ifdef USE_NETPBM
    static int shotNum = 0;
    FILE* file;
    char name[50];
    int index, i, j;

    /* draw everything again */
    draw_right();
    glFlush();
    draw_left();
    glFlush();

    /* read the pixels from the frame buffer */
    glReadPixels(0, 0, width, height, GL_RGB, GL_UNSIGNED_BYTE, sgiPixels);

    /* convert them to the ppm */
    index = 0;
    for (i = height - 1; i >= 0; i--) {
        for (j = 0; j < width; j++) {
            PPM_ASSIGN(ppmPixels[i][j],
                       sgiPixels[index],
                       sgiPixels[index + 1],
                       sgiPixels[index + 2]);
            index += 3;
        }
    }

    /* open a file */
    sprintf(name, "%d.ppm", shotNum);
    shotNum++;
    file = fopen(name, "w");

    /* write the ppm file */
    ppm_writeppm(file, ppmPixels, width, height, 255, 0);

    /* close the file */
    fclose(file);
#endif
}
`
/**
 * Read in the ppm files and create display lists for a texture
 * returns the dimension of the image
 */
void readTexture(Texture* t, char* file1, char* file2) {
#ifdef USE_NETPBM
    FILE *fp1, *fp2;
    int cols, rows, i, j, index;
    pixel **map1, **map2;
    GLubyte *tex1, *tex2;
    pixval maxval;

    /* open the files */
    fp1 = fopen(file1, "r");
    fp2 = fopen(file2, "r");
    if (!fp1) {
        fprintf(stderr, "Couldn't open %s\n", file1);
    }
    if (!fp2) {
        fprintf(stderr, "Couldn't open %s\n", file2);
    }

    /* read the ppm files */
    map1 = ppm_readppm(fp1, &cols, &rows, &maxval);
    fprintf(stderr, "%s: rows = %d \t cols = %d\n", file1, rows, cols, maxval);
    map2 = ppm_readppm(fp2, &cols, &rows, &maxval);
    fprintf(stderr, "%s: rows = %d \t cols = %d\n", file2, rows, cols, maxval);

    /* convert them */
    tex1 = malloc(sizeof(GLubyte) * rows * cols * 3);
    tex2 = malloc(sizeof(GLubyte) * rows * cols * 3);
    index = 0;
    for (i = 0; i < rows; i++) {
        for (j = 0; j < cols; j++) {
            /* R */
            tex1[index] = PPM_GETR(map1[i][j]);
            tex2[index] = PPM_GETR(map2[i][j]);
            index++;

            /* G */
            tex1[index] = PPM_GETG(map1[i][j]);
            tex2[index] = PPM_GETG(map2[i][j]);
            index++;

            /* B */
            tex1[index] = PPM_GETB(map1[i][j]);
            tex2[index] = PPM_GETB(map2[i][j]);
            index++;
        }
    }

    /* create the textures