Oxaal

(10) Patent No.:     US 6,243,099 B1
(45) Date of Patent: Jun. 5, 2001
`
`(54) METHOD FOR INTERACTIVE VIEWING
`FULL-SURROUND IMAGE DATA AND
`APPARATUS THEREFOR
`
`(76)
`
`Inventor: Ford Oxaal, 42 Western Ave., Cohoes,
`NY (US) 12047
`
( * ) Notice: Subject to any disclaimer, the term of this
      patent is extended or adjusted under 35
      U.S.C. 154(b) by 0 days.
`
`(21) Appl. No.: 09/228,760
`
(22) Filed: Jan. 12, 1999
`
`Related U.S. Application Data
`
(63) Continuation-in-part of application No. 08/749,166, filed on
Nov. 14, 1996, now Pat. No. 5,903,782.
`(60) Provisional application No. 60/071,148, filed on Jan. 12,
`1998.
(51) Int. Cl.7 ........................ G09G 3/04; G06T 15/00
(52) U.S. Cl. ......................... 345/430; 345/427; 382/293;
                                       382/294
(58) Field of Search .................. 345/430, 431,
         345/427, 424, 425, 428, 419; 382/181,
         190, 154, 285, 276, 164, 162, 293, 294;
         396/584; 428/141; 348/48; 600/411, 425
`
`(56)
`
`References Cited
`
`U.S. PATENT DOCUMENTS
`5,396,583 * 3/1995 Chen et a!. .......................... 345/427
`5,699,497 * 12/1997 Erdahl et a!. ........................ 345/428
`5,951,475 * 9/1999 Gueziec et a!. ...................... 600/425
`5,973,700 * 10/1999 Taylor eta!. ........................ 345/427
`
`6,016,439 * 1!2000 Acker ................................... 600/411
`6,028,955 * 2/2000 Cohen et a!. ........................ 382/154
`6,031,540 * 2/2000 Golin eta!. .......................... 345/419
`6,084,979 * 7/2000 Kanade eta!. ....................... 382/154
`
OTHER PUBLICATIONS

Suya You, "Interactive Volume Rendering for Virtual
Colonoscopy", Visualization '97, Proceedings, 1997, pp.
433-436, 571.*
`
`* cited by examiner
`
Primary Examiner-Mark R. Powell
Assistant Examiner-Thu-Thao Havan
(74) Attorney, Agent, or Firm-Westerlund & Powell, P.C.;
Raymond H. J. Powell, Jr.; Robert A. Westerlund
`
`(57)
`
`ABSTRACT
`
A method of modeling of the visible world using full-surround
image data includes steps for selecting a view point within a
p-surface, and texture mapping full-surround image data onto
the p-surface such that the resultant texture map is
substantially equivalent to projecting full-surround image
data onto the p-surface from the view point to thereby
generate a texture mapped p-surface. According to one aspect
of the invention, the method also includes a step for either
rotating the texture mapped p-surface or changing the
direction of view to thereby expose a new portion of the
texture mapped p-surface. According to another aspect of the
invention, a first texture mapped p-sphere is replaced by a
second texture mapped p-sphere by interactively selecting the
new viewpoint from viewpoints within the second texture
mapped p-sphere. A corresponding apparatus is also described.
`
`22 Claims, 16 Drawing Sheets
`
[Front-page figure: stereographic projection, with the antipodal point
and tangent point labeled]
`
`
`
`
FIG. 1 [drawing: the set of all rays from a predetermined viewpoint]

FIG. 2 [drawing: a set of points, excluding the viewpoint, each located
on a corresponding one of the rays]

FIG. 3 [drawing: the formation of a projection of the set of points of
FIG. 2]
`
`
`
FIG. 4A [drawing: the resultant image generated by a first projection,
with labeled points A, B, C and viewpoint VP]

FIG. 4B [drawing: the resultant image generated by a second projection
from the same viewpoint VP]
`
`
`
FIG. 5 [drawing: the concept of linear perspective]

FIG. 6 [drawing: the concept of circular perspective]

FIG. 7 [drawing: the concept of stereographic projection, with the
antipodal point and tangent point labeled]
`
`
`
FIG. 8 [high-level block diagram of the circular perspective viewing
system; the recoverable block labels are INPUT, CPU, ROM, and DISPLAY]
`
`
`
FIG. 9A

/* Includes required */
#include <GL/gl.h>
#include <GL/glut.h>
#include <stdio.h>
#include <stdlib.h>
#include <ppm.h>
#include <math.h>

/**
 * something because of windows
 */
void eprintf() {}

/**
 * our data structure of choice
 */
typedef struct obj {
    /* other parameters */
    float matrix[16];

    /* view angle */
    float viewangle;

    /* aspect ratio */
    float aspect;

    /* z of the camera */
    float tz;

    /* ry of the camera */
    float ry;
} Obj;

/* hold the display lists for textures */
typedef struct texture {
    int tex1;
    int tex2;
} Texture;

/**
 * our global variables
 */
/* camera settings */
Obj scene;
`
`
`
FIG. 9B

/* texture stuff */
Texture def;
Texture* current_texture = &def;

/* track the next display list number */
int nextDLnum = 2;

/* stuff for lighting */
float lightPos[4]  = {2.0, 4.0, 2.0, 0};
float lightDir[4]  = {0, 0, 1.0, 1.0};
float lightAmb[4]  = {0.4, 0.4, 0.4, 1.0};
float lightDiff[4] = {0.8, 0.8, 0.8, 1.0};
float lightSpec[4] = {0.8, 0.8, 0.8, 1.0};
int lights = 0;
int outsideView = 0;
int parent;

#define HEMISPHERE 1
void createHemisphere(int listNum, int numPts, int geom);

/**
 * Read in the ppm files and create display lists for a texture;
 * returns the dimension of the image
 */
pixel **map1, **map2;
GLubyte *tex1, *tex2, **tmpPP, *tmpP;
void readTexture(Texture* t, char* file1, char* file2)
{
    FILE *fp1, *fp2;
    int cols, rows, i, j, index;
    pixval maxval;

    /* open the files */
    fp1 = fopen(file1, "r");
    fp2 = fopen(file2, "r");
    if (!fp1) {
        fprintf(stderr, "Couldn't open %s\n", file1);
    }
    if (!fp2) {
        fprintf(stderr, "Couldn't open %s\n", file2);
    }

    /* read the ppm files */
    map1 = ppm_readppm(fp1, &cols, &rows, &maxval);
    fprintf(stderr, "%s: rows = %d \t cols = %d\n", file1, rows,
            cols, maxval);
    map2 = ppm_readppm(fp2, &cols, &rows, &maxval);
`
`
`
FIG. 9C

    fprintf(stderr, "%s: rows = %d \t cols = %d\n", file2, rows,
            cols, maxval);

    /* convert them */
    tex1 = malloc(sizeof(GLubyte) * rows * cols * 3);
    tex2 = malloc(sizeof(GLubyte) * rows * cols * 3);
    index = 0;
    for (i = 0; i < rows; i++) {
        for (j = 0; j < cols; j++) {
            /* R */
            tex1[index] = PPM_GETR(map1[i][j]);
            tex2[index] = PPM_GETR(map2[i][j]);
            index++;

            /* G */
            tex1[index] = PPM_GETG(map1[i][j]);
            tex2[index] = PPM_GETG(map2[i][j]);
            index++;

            /* B */
            tex1[index] = PPM_GETB(map1[i][j]);
            tex2[index] = PPM_GETB(map2[i][j]);
            index++;
        }
    }

    /* create the textures */
    /* new display list */
    glNewList(nextDLnum, GL_COMPILE);
    t->tex1 = nextDLnum;
    nextDLnum++;
    glTexImage2D(GL_TEXTURE_2D, 0, 3, cols, rows, 0, GL_RGB,
                 GL_UNSIGNED_BYTE, tex1);
    glEndList();

    /* new display list */
    glNewList(nextDLnum, GL_COMPILE);
    t->tex2 = nextDLnum;
    nextDLnum++;
    glTexImage2D(GL_TEXTURE_2D, 0, 3, cols, rows, 0, GL_RGB,
                 GL_UNSIGNED_BYTE, tex2);
    glEndList();
}
`
`
`
FIG. 9D

/**
 * this will initialize the display lists for the objects
 */
void initialize_objects(int argc, char** argv)
{
    float tmp[4];

    /* read in the texture */
    readTexture(&def, argv[1], argv[2]);

    /* create hemisphere */
    createHemisphere(1, 50, GL_TRIANGLE_STRIP);

    /* scene */
    scene.viewangle = 130;
    scene.tz = 0;
    scene.ry = 0;
}

/*
 * Clear the screen, draw the objects
 */
void display()
{
    float tmp[4];
    float height;

    /* clear the screen */
    glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);

    /* adjust for scene orientation */
    glMatrixMode(GL_PROJECTION);
    if (outsideView) {
        glLoadIdentity();
        gluPerspective(45, scene.aspect, 0.1, 10.0);
        glTranslatef(0, 0, -3);
        glRotatef(45, 1, 0, 0);
        glRotatef(45, 0, 1, 0);
        glDisable(GL_TEXTURE_2D);
        glColor3f(.8, .8, .8);
    } else {
        glLoadIdentity();
        gluPerspective(scene.viewangle, scene.aspect, 0.1, 10.0);
        glTranslatef(0, 0, scene.tz);
        glRotatef(scene.ry, 0, 1, 0);
    }
`
`
`
FIG. 9E

    /* draw our models */
    glMatrixMode(GL_MODELVIEW);
    glPushMatrix();

    if (outsideView) {
        /* transform to where the camera would be */
        glPushMatrix();

        /* draw a cube for the camera */
        glLoadIdentity();
        glRotatef(180, 1, 0, 0);
        glTranslatef(0, 0, scene.tz);
        tmp[0] = tmp[1] = tmp[2] = .8;
        tmp[3] = 1;
        glMaterialfv(GL_FRONT_AND_BACK, GL_SPECULAR, tmp);
        glMaterialf(GL_FRONT_AND_BACK, GL_SHININESS, 0.0);
        glMaterialfv(GL_FRONT_AND_BACK, GL_AMBIENT_AND_DIFFUSE, tmp);
        glutSolidCube(.1);

        /* draw a cone for the view frustum */
        glLoadIdentity();
        height = 1 - scene.tz;
        glRotatef(45, 0, 0, 1);
        glTranslatef(0, 0, -1);
        tmp[0] = tmp[1] = 1;
        tmp[2] = 0;
        tmp[3] = .3;
        glMaterialfv(GL_FRONT_AND_BACK, GL_SPECULAR, tmp);
        glMaterialf(GL_FRONT_AND_BACK, GL_SHININESS, 0.0);
        glMaterialfv(GL_FRONT_AND_BACK, GL_AMBIENT_AND_DIFFUSE, tmp);
        glutSolidCone(tan(scene.viewangle * 3.14 / 360.0) * height,
                      height, 20, 1);
        glPopMatrix();
        glEnable(GL_TEXTURE_2D);
    }

    /* now draw the hemisphere */
    if (lights) {
        tmp[0] = tmp[1] = tmp[2] = .8;
        tmp[3] = .8;
        glMaterialfv(GL_FRONT_AND_BACK, GL_SPECULAR, tmp);
        glMaterialf(GL_FRONT_AND_BACK, GL_SHININESS, 10.0);
        glMaterialfv(GL_FRONT_AND_BACK, GL_AMBIENT_AND_DIFFUSE, tmp);
    }
    glCallList(current_texture->tex1);
    glCallList(HEMISPHERE);
`
`
`
FIG. 9F

    if (lights) {
        tmp[0] = tmp[1] = tmp[2] = .5;
        tmp[3] = .5;
        glMaterialfv(GL_FRONT_AND_BACK, GL_SPECULAR, tmp);
        glMaterialf(GL_FRONT_AND_BACK, GL_SHININESS, 10.0);
        glMaterialfv(GL_FRONT_AND_BACK, GL_AMBIENT_AND_DIFFUSE, tmp);
    }
    glRotatef(180.0, 0.0, 0.0, 1.0);
    glCallList(current_texture->tex2);
    glCallList(HEMISPHERE);
    glPopMatrix();

    fprintf(stderr, "%s\n", gluErrorString(glGetError()));
    glutSwapBuffers();
}

/*
 * Handle Menus
 */
#define M_QUIT 1
void Select(int value)
{
    switch (value) {
    case M_QUIT:
        exit(0);
        break;
    }
    glutPostRedisplay();
}

void create_menu() {
    fprintf(stderr, "Press ? for help\n");
    glutCreateMenu(Select);
    glutAddMenuEntry("Quit", M_QUIT);
    glutAttachMenu(GLUT_RIGHT_BUTTON);
}

/* Initializes shading model */
void myinit(void)
{
    glEnable(GL_DEPTH_TEST);
    glShadeModel(GL_SMOOTH);

    /* texture stuff */
    glPixelStorei(GL_UNPACK_ALIGNMENT, sizeof(GLubyte));
    glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP);
    glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP);
`
`
`
FIG. 9G

    glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER,
                    GL_NEAREST);
    glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER,
                    GL_NEAREST);
    glTexEnvf(GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_DECAL);
    glEnable(GL_TEXTURE_2D);
}

/*
 * Called when the window is first opened and whenever
 * the window is reconfigured (moved or resized).
 */
void myReshape(int w, int h)
{
    /* define the viewport */
    glViewport(0, 0, w, h);
    scene.aspect = 1.0 * (GLfloat)w / (GLfloat)h;
    glMatrixMode(GL_PROJECTION);
    glLoadIdentity();
    gluPerspective(scene.viewangle, scene.aspect, 0.1, 10.0);
    glMultMatrixf(scene.matrix);
    glMatrixMode(GL_MODELVIEW);   /* back to modelview matrix */
}

/*
 * Keyboard handler
 */
void Key(unsigned char key, int x, int y)
{
    float matrix[16];
    glMatrixMode(GL_MODELVIEW);
    glGetFloatv(GL_MODELVIEW_MATRIX, matrix);
    glLoadIdentity();
    fprintf(stderr, "%d - %c   ", key, key);
    switch (key) {
    case 'o':
        if (!outsideView) {
            fprintf(stderr, "outside on ");
            outsideView = 1;

            /* turn on blending */
            glEnable(GL_BLEND);
            glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);
`
`
`
FIG. 9H

            /* We want to see color */
            glTexEnvf(GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_MODULATE);

            /* turn on our spotlight */
            glEnable(GL_LIGHT1);
            glLightfv(GL_LIGHT1, GL_AMBIENT, lightAmb);
            glLightfv(GL_LIGHT1, GL_DIFFUSE, lightDiff);
            glLightfv(GL_LIGHT1, GL_SPECULAR, lightSpec);
            glLightfv(GL_LIGHT1, GL_SPOT_DIRECTION, lightDir);
        } else {
            fprintf(stderr, "outside off ");
            outsideView = 0;
            glTexEnvf(GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_DECAL);
            glDisable(GL_BLEND);
        }
        break;
    case 'F':
        fprintf(stderr, "flat ");
        glShadeModel(GL_FLAT);
        break;
    case 'f':
        fprintf(stderr, "smooth ");
        glShadeModel(GL_SMOOTH);
        break;
    case 'y':
        printf("ry = %f\n", scene.ry);
        scene.ry -= 5;
        break;
    case 'Y':
        scene.ry += 5;
        break;
    case 'z':
        scene.tz -= .02;
        fprintf(stderr, " tz = %f ", scene.tz);
        break;
    case 'Z':
        scene.tz += .02;
        fprintf(stderr, " tz = %f ", scene.tz);
        break;
    case 'a':
        scene.viewangle -= 1;
        fprintf(stderr, " angle: %f ", scene.viewangle);
`
`
`
FIG. 9I

        break;
    case 'A':
        scene.viewangle += 1;
        fprintf(stderr, "angle: %f ", scene.viewangle);
        break;
    case 55:
        glRotatef(-5, 0.0, 0.0, 1.0);
        break;
    case 57:
        glRotatef(5, 0.0, 0.0, 1.0);
        break;
    case 52:
        glRotatef(-5, 0.0, 1.0, 0.0);
        break;
    case 54:
        glRotatef(5, 0.0, 1.0, 0.0);
        break;
    case 56:
        glRotatef(5, 1.0, 0.0, 0.0);
        break;
    case 50:
        glRotatef(-5, 1.0, 0.0, 0.0);
        break;
    case 'q':
        if (lights) {
            glDisable(GL_LIGHT0);
            glDisable(GL_LIGHTING);
            lights = 0;
            fprintf(stderr, "no lights ");
        } else {
            glEnable(GL_LIGHTING);
            glEnable(GL_LIGHT0);
            glLightfv(GL_LIGHT0, GL_POSITION, lightPos);
            glLightfv(GL_LIGHT0, GL_AMBIENT, lightAmb);
            glLightfv(GL_LIGHT0, GL_DIFFUSE, lightDiff);
            glLightfv(GL_LIGHT0, GL_SPECULAR, lightSpec);
            lights = 1;
            fprintf(stderr, "lights ");
        }
        break;
    case 't':
        fprintf(stderr, "texture off ");
        glDisable(GL_TEXTURE_2D);
        break;
    case 'T':
        fprintf(stderr, "texture on ");
        glEnable(GL_TEXTURE_2D);
        break;
`
`
`
FIG. 9J

    case '?':
        fprintf(stderr, "hjkl - rotate current object\n");
        fprintf(stderr, "s/S - shrink / grow the object or zoom the scene\n");
        fprintf(stderr, "a/A - viewangle\n");
        fprintf(stderr, "z/Z - camera position\n");
        fprintf(stderr, "f/F - flat / smooth\n");
        fprintf(stderr, "Escape quits\n");
        break;
    case 27:                        /* Esc will quit */
        exit(1);
        break;
    default:
        fprintf(stderr, "Unbound key - %d ", key);
        break;
    }
    fprintf(stderr, "\n");
    glMultMatrixf(matrix);
    glutPostRedisplay();
}

/*
 * Main Loop
 * Open window with initial window size, title bar,
 * RGBA display mode, and handle input events.
 */
int main(int argc, char** argv)
{
    glutInit(&argc, argv);
    glutInitDisplayMode(GLUT_DOUBLE | GLUT_RGBA);
    parent = glutCreateWindow(argv[0]);
    myinit();
    glutKeyboardFunc(Key);
    glutReshapeFunc(myReshape);
    glutDisplayFunc(display);
    create_menu();
    initialize_objects(argc, argv);
    glutMainLoop();
    return 0;
}

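(Usage note, not part of the patent listing: main() passes argv[1] and
argv[2] through initialize_objects() to readTexture(), so the viewer
expects two PPM images on the command line, one per hemisphere of the
full-surround data, e.g., "viewer front.ppm back.ppm"; the file names
here are illustrative.)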
`
`
`
FIG. 10A

#ifdef WINDOWS
#include <windows.h>
#endif
#include <GL/gl.h>
#include <GL/glut.h>

#include "warp.h"
#include <stdio.h>

/**
 * Triangulate a hemisphere and texture coordinates.
 * listNum - display list number
 * numPts  - number of points to a side
 * return the display list
 */
void createHemisphere(int listNum, int numPts, int geom)
{
    double incr = 1.0 / numPts;
    double u, v, x, y, z;
    float tx, tz;
    int i, j;

    /* start the display list */
    glNewList(listNum, GL_COMPILE_AND_EXECUTE);

    /* create the coordinates */
    /* use the square to circle map */
    /* across then down */
    v = 0;
    for (j = 0; j < numPts; j++) {
        /* start the tri strip */
        glBegin(geom);
        u = 0;
        for (i = 0; i <= numPts; i++) {
            /* do the top point */
            /* get the XYZ coords */
            map(u, v, &x, &y, &z);

            /* create the texture coord */
            tx = x / 2 + .5;
            tz = z / 2 + .5;
            if (tx > 1.0 || tz > 1.0 || tx < 0.0 || tz < 0.0) {
                printf("not in range %f %f\n", tx, tz);
            }
            glTexCoord2f(tx, tz);
`
`
`
FIG. 10B

            /* normal */
            glNormal3f(x, y, z);

            /* create the coord */
            glVertex3f(x, y, z);

            /* get the XYZ coords */
            map(u, v + incr, &x, &y, &z);

            /* create the texture coord */
            tx = x / 2 + .5;
            tz = z / 2 + .5;
            if (tx > 1.0 || tz > 1.0 || tx < 0.0 || tz < 0.0) {
                printf("not in range %f %f\n", tx, tz);
            }
            glTexCoord2f(tx, tz);

            /* normal */
            glNormal3f(x, y, z);

            /* create the coord */
            glVertex3f(x, y, z);

            /* adjust u */
            u += incr;
        }

        /* done with the list */
        glEnd();

        /* adjust v */
        v += incr;
    }

    /* all done with the list */
    glEndList();
}

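(The square-to-circle map() routine called by createHemisphere() is
declared in "warp.h", which the patent does not reproduce. The
following is a minimal sketch of one plausible implementation,
assuming a unit hemisphere about the origin so that x and z land in
[-1, 1], consistent with the texture-coordinate arithmetic above; it
is an assumption, not the patentee's actual routine.)

    #include <math.h>

    /* hypothetical stand-in for map() from "warp.h": stretch the unit
     * square radially onto the unit disc, then lift the disc onto the
     * hemisphere y >= 0 */
    void map(double u, double v, double *x, double *y, double *z)
    {
        double a = 2.0 * u - 1.0;           /* recenter square to [-1,1]^2 */
        double b = 2.0 * v - 1.0;
        double m = fabs(a) > fabs(b) ? fabs(a) : fabs(b);
        double r = sqrt(a * a + b * b);     /* distance from square center */
        double s = (r > 0.0) ? m / r : 0.0; /* squash the square onto the disc */

        *x = a * s;
        *z = b * s;
        *y = sqrt(1.0 - (*x) * (*x) - (*z) * (*z)); /* lift onto the dome */
    }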
`
`
`
METHOD FOR INTERACTIVE VIEWING
FULL-SURROUND IMAGE DATA AND
APPARATUS THEREFOR

This application is a provisional of 60/071,148 filed Jan.
12, 1998 and a Continuation-in-Part of Ser. No. 08/749,166,
which was filed on Nov. 14, 1996 (now U.S. Pat. No.
5,903,782).

BACKGROUND OF THE INVENTION

`2
`ating three-dimensional images from data generated by
`diagnostic equipment, such as magnetic resonance imaging.
`However, none of the above described methods or sys(cid:173)
`tems permit viewing in circular perspective, which is the
`best way to view spherical data. Circular perspective does all
`that linear perspective does when zoomed in, but it allows
`the view to zoom out to the point where the viewer can see
`almost everything in the spherical data simultaneously in a
`visually palatable and coherent way.
`What is needed is a method for viewing full-surround,
`e.g., spherical, image data employing circular perspective.
`Moreover, what is needed is an apparatus for viewing
`fall-surround, e.g., spherical, image data employing circular
`perspective. What is also needed is a method for viewing
`15 full-surround, e.g., spherical, image data employing circular
`perspective which is computationally simple. Preferably, the
`method for viewing full-surround, e.g., spherical, image data
`employing circular perspective can be employed on any
`personal computer (PC) system possessing a three dimen-
`20 sional (3-D) graphics capability.
`
`10
`
`The present invention relates generally to a method and
`corresponding apparatus for viewing images. More
`specifically, the present invention relates to a method and
`corresponding apparatus for viewing full-surround, e.g.,
`spherical, image data;
`Systems and techniques for changing the perspective of a
`visible image in producing a resultant image, or systems and
`methods of transforming an image from one perspective
`form to another have been the subject of scientific thought
`and research for many years. Systems and techniques for
`transforming visible images can generally be divided into
`three separate categories:
`(1) perspective generation systems and methods suitable
`for applications such as flight simulators;
`(2) three-dimensional (3D) to two-dimensional (2D) con(cid:173)
`version systems and methods; and
`(3) miscellaneous systems and methods.
`The first category includes U.S. Pat. No. 3,725,563, which 30
`discloses a method of and apparatus for raster scan trans(cid:173)
`formations using rectangular coordinates which are suitable
`for electronically generating images for flight simulators and
`the like. More specifically, the patent discloses a technique
`for raster shaping, whereby an image containing information 35
`from one viewpoint is transformed to a simulated image
`from another viewpoint. On the other hand, U.S. Pat. No.
`4,763,280 discloses a curvilinear dynamic image generation
`system for projecting rectangular coordinate images onto a
`spherical display surface. In the disclosed system, rectan- 40
`gular coordinates are converted to spherical coordinates and
`then the spherical coordinates are distorted for accomplish(cid:173)
`ing the desired simulation of curvature.
`The second category of systems and techniques perform
`3D-to-2D conversion, or vice versa. For example, U.S. Pat.
`No. 4,821,209 discloses a method of and apparatus for data
`transformation and clipping in a graphic display system,
`wherein data transformation is accomplished by matrix
`multiplication. On the other hand, U.S. Pat. No. 4,667,236
`discloses a television perspective effects system for provid(cid:173)
`ing perspective projection whereby each point of a three(cid:173)
`dimensional object is projected onto a two-dimensional
`plane. New coordinates X' and Y' are prepared from the
`original coordinates X, Y and Z, and the viewing distanceD,
`using the general formulas X'=XD/Z and Y'= YD/Z. As the 55
`object to be displayed is rotated around the X or Y axis, the
`viewing distance D is changed for each point.
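As a worked example of these formulas (a sketch prepared for this
text; the function name is illustrative and does not appear in the
'236 patent), a point at (X, Y, Z) = (2, 1, 4) viewed at distance
D = 2 projects to (X', Y') = (1, 0.5):

    /* project (X, Y, Z) onto the viewing plane at distance D using
     * X' = XD/Z and Y' = YD/Z; Z must be nonzero, i.e., the point
     * must lie in front of the eye */
    void project(double X, double Y, double Z, double D,
                 double *Xp, double *Yp)
    {
        *Xp = X * D / Z;
        *Yp = Y * D / Z;
    }
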
In the third category, miscellaneous systems and methods
are disclosed by, for example, U.S. Pat. No. 5,027,287, which
describes a device for the digital processing of images to
obtain special geometrical effects wherein digital image data
corresponding to intersection points on a rectangular X,Y
grid are transposed by interpolation with respect to
intersection points of a curved surface. U.S. Pat. No.
4,882,679, on the other hand, discloses a system and
associated method of reformatting images for three-
dimensional display. The disclosed system is particularly
useful for generating three-dimensional images from data
generated by diagnostic equipment, such as magnetic
resonance imaging.

However, none of the above described methods or systems
permit viewing in circular perspective, which is the best way
to view spherical data. Circular perspective does all that
linear perspective does when zoomed in, but it allows the
view to zoom out to the point where the viewer can see
almost everything in the spherical data simultaneously in a
visually palatable and coherent way.

What is needed is a method for viewing full-surround,
e.g., spherical, image data employing circular perspective.
Moreover, what is needed is an apparatus for viewing
full-surround, e.g., spherical, image data employing circular
perspective. What is also needed is a method for viewing
full-surround, e.g., spherical, image data employing circular
perspective which is computationally simple. Preferably, the
method for viewing full-surround, e.g., spherical, image data
employing circular perspective can be employed on any
personal computer (PC) system possessing a three-
dimensional (3-D) graphics capability.

SUMMARY OF THE INVENTION

Based on the above and foregoing, it can be appreciated
that there presently exists a need in the art for viewing
methods and corresponding apparatuses which overcome the
above-described deficiencies. The present invention was
motivated by a desire to overcome the drawbacks and
shortcomings of the presently available technology, and
thereby fulfill this need in the art.

The present invention implements a novel and practical
circular perspective viewer for spherical data. Moreover, it
implements the circular perspective viewer within the
context of existing 3D graphics utilities native to personal
computers (PCs). Thus, the method and corresponding
apparatus for circular perspective viewing is practical for a
broad market.

One object according to the present invention is to provide
a method and corresponding apparatus for modeling the
visible world by texture mapping full-surround image data.

Another object according to the present invention is to
provide a method and corresponding apparatus for modeling
the visible world by texture mapping full-surround image
data onto a p-surface whereby the resultant texture map is
substantially equivalent to projecting full-surround image
data onto the p-surface from a point Q inside the region X
of the p-surface.

Still another object according to the present invention is
to provide a method and corresponding apparatus for
modeling the visible world by texture mapping full-surround
image data wherein the viewer is allowed to interactively
rotate the model.

Yet another object according to the present invention is to
provide a method and corresponding apparatus for modeling
the visible world by texture mapping full-surround image
data wherein the viewer is allowed to interactively change
the direction of vision.

A still further object according to the present invention is
to provide a method and corresponding apparatus for
modeling the visible world by texture mapping full-surround
image data, wherein the viewer is allowed to interactively
alter the focal length or view angle.

Another object according to the present invention is to
provide a method and corresponding apparatus for modeling
the visible world by texture mapping full-surround image
data, wherein the viewer is allowed to interactively move the
viewpoint.
`
`
`
Still another object according to the present invention is
to provide a method and corresponding apparatus for
modeling the visible world by texture mapping full-surround
image data, wherein the viewpoint is close to the surface of
the p-sphere.

Another object according to the present invention is to
provide a method and corresponding apparatus for modeling
the visible world by texture mapping full-surround image
data, wherein the viewer is allowed to interactively select the
direction of view.

A further object according to the present invention is to
provide a method and corresponding apparatus for modeling
the visible world by texture mapping full-surround image
data, wherein the viewer is allowed to select an area of the
image and cause another model of the visible world to be
loaded into said viewing system.

Another object according to the present invention is to
provide a method and corresponding apparatus for modeling
the visible world by texture mapping full-surround image
data, wherein the viewer is allowed to perform any
combination of actions specified immediately above.

It will be appreciated that none of the above-identified
objects need actually be present in the invention defined by
the appended claims. In other words, only certain, and not
all, objects of the invention have been specifically described
above. Numerous other objects advantageously may be
provided by the invention, as defined in the appended claims,
without departing from the spirit and scope of the invention.

These and other objects, features and advantages according
to the present invention are provided by a method of
modeling the visible world using full-surround image data.
Preferably, the method includes steps for selecting a view
point within a p-surface, and texture mapping full-surround
image data onto the p-surface such that the resultant texture
map is substantially equivalent to projecting full-surround
image data onto the p-surface from the view point to thereby
generate a texture mapped p-surface.

According to one aspect of the invention, the method also
includes a step for either rotating the texture mapped
p-surface or changing the direction of view to thereby
expose a new portion of the texture mapped p-surface.

According to another aspect of the invention, a first
texture mapped p-sphere is replaced by a second texture
mapped p-sphere by interactively selecting the new
viewpoint from viewpoints within the second texture
mapped p-sphere.

These and other objects, features and advantages according
to the present invention are provided by a method of
modeling of the visible world using full-surround image
data, the method comprising steps for providing the
full-surround image data, selecting a view point within a
p-surface, texture mapping full-surround image data onto the
p-surface such that the resultant texture map is substantially
equivalent to projecting full-surround image data onto the
p-surface from the view point to thereby generate a texture
mapped p-surface, and displaying a predetermined portion of
the texture mapped p-sphere.

These and other objects, features and advantages according
to the present invention are provided by an apparatus for
modeling the visible world using full-surround image data,
comprising first circuitry for selecting a view point within a
p-surface, second circuitry for texture mapping full-surround
image data onto the p-surface such that the resultant texture
map is substantially equivalent to projecting full-surround
image data onto the p-surface from the view point to thereby
generate a texture mapped p-surface, and third circuitry for
displaying a predetermined portion of the texture mapped
p-sphere.

These and other objects, features and advantages of the
invention are disclosed in or will be apparent from the
following description of preferred embodiments.

BRIEF DESCRIPTION OF THE DRAWINGS
`
These and various other features and aspects of the
present invention will be readily understood with reference
to the following detailed description taken in conjunction
with the accompanying drawings, in which like or similar
numbers are used throughout, and in which:

FIG. 1 illustrates a set of all rays from a predetermined
viewpoint, which illustration facilitates an understanding of
the present invention;

FIG. 2 illustrates a set of points, excluding the viewpoint,
located on a corresponding one of the rays, which
illustration facilitates an understanding of the present
invention;

FIG. 3 illustrates the formation of a projection of the set
of points, or a subset thereof, illustrated in FIG. 2;

FIGS. 4A and 4B illustrate the resultant images generated
by two different projections, respectively, given a constant
viewpoint;

FIG. 5 illustrates the concept of linear perspective;

FIG. 6 illustrates the concept of circular perspective;

FIG. 7 illustrates the concept of stereographic projection;

FIG. 8 is a high level block diagram of a circular
perspective viewing system according to the present
invention;

FIGS. 9A through 9J collectively form a listing of the
dedicated code for converting a general purpose computer
system into the circular perspective viewing system
illustrated in FIG. 8; and

FIGS. 10A and 10B collectively form a listing of an
exemplary code block for triangulating a hemisphere and
texture coordinates.

DETAILED DESCRIPTION OF THE
PREFERRED EMBODIMENTS

The method and corresponding apparatus according to the
present invention are similar to that disclosed in U.S. Pat.
No. 5,684,937, which patent is incorporated herein by
reference for all purposes, in that it generates perspective
views derived for constants less than or equal to two, and
greater than or equal to one, i.e., 1.0 ≤ X ≤ 2.0. However, it
will be appreciated that the inventive method and apparatus
are different from U.S. Pat. No. 5,684,937 in that the method
for deriving perspective is different than 'explicitly' dividing
all angles by a selected constant, as disclosed in that patent.
Instead, the angles are 'implicitly' divided by a constant by
moving the viewpoint around inside a "p-sphere". Additional
details will be provided below.

By employing the method and corresponding apparatus
according to the present invention, it is possible to create a
virtual pictosphere using a conventional 3-D graphics
system. Preferably, the inventive method and apparatus
texture map the visible world onto a sphere. It should be
mentioned that when the user selects a viewpoint at the
center of this sphere and renders the view using the
primitives of a conventional 3D graphics system, the user
implicitly divides all angles by one, resulting in a linear
perspective view. However, when the user selects a
viewpoint on the surface of this sphere, selects a direction of
view towards the center,
`
`
`
and renders the view using the primitives of a conventional
3D graphics system, the user implicitly divides all angles by
two, thus creating a circular perspective view. Moreover, by
allowing the viewpoint to move around within or on the
sphere, the user achieves results virtually identical to those
achieved by U.S. Pat. No. 5,684,937 for constants ranging
from 1.0 to 2.0.

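In terms of the listing of FIGS. 9A through 9J, this motion of the
viewpoint is simply the translation scene.tz that display() applies to
the projection matrix. The fragment below is a sketch under that
reading (the helper function is illustrative, not from the patent),
assuming the pictosphere is modeled with unit radius about the origin:

    extern Obj scene;   /* the global camera settings of FIG. 9A */

    /* choose the perspective constant k by placing the eye inside the
     * unit pictosphere: k = 1 leaves the eye at the center (linear
     * perspective); k = 2 puts it on the surface looking back through
     * the center (circular perspective); values between 1 and 2 give
     * the intermediate views of U.S. Pat. No. 5,684,937 */
    void set_perspective_constant(double k)
    {
        /* the eye sits (k - 1) from the center, so tz runs from 0 to -1 */
        scene.tz = -(k - 1.0);
    }
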
`It will be appreciated that the method and corresponding
`apparatus according to the present invention implement a
`novel and practical circular perspective viewer of spherical
`data. The inventive method and apparatus advantageously
`can be achieved within the context of existing 3-D graphics
`utilities and hardware native to PCs. It will be noted from the
`statement immediately above that the inventive method and
`apparatus advantageously can be implemented in a broad
`range of existing systems.
`The method and corresponding apparatus according to the
`present invention are predicated on the following starting,
`i.e., given, conditions:
`(1) the set of all rays V from a given point VP, as
`illustrated in FIG. 1;
`(2) a set of points P not including VP, each point in P being
`contained by one and only one ray in V, as illustrated in FIG.
`2; and
`(3) the set of color values C, each color in C being
`associated with one and only one ray in V, and also thereby
`associated with the point in P contained by said ray.
`Moreover, the following definitions apply:
`(1) POINTS P: The visible world.
`(2) A PROJECTION OF P: A subset of points P. Any
`number of points Pn contained in P may be slid closer t