(19) United States
(12) Patent Application Publication (10) Pub. No.: US 2003/0025710 A1
Fukushima et al. (43) Pub. Date: Feb. 6, 2003

(54) RENDERING PROCESSING METHOD

(76) Inventors: Takashi Fukushima, Saitama (JP);
     Kentaro Motomura, Tokyo (JP)

Correspondence Address:
KATTEN MUCHIN ZAVIS ROSENMAN
575 MADISON AVENUE
NEW YORK, NY 10022-2585 (US)

(21) Appl. No.: 10/179,908
(22) Filed: Jun. 24, 2002
(30) Foreign Application Priority Data
Aug. 3, 2001 (JP) ...................................... 2001-236567

Publication Classification
(51) Int. Cl.7 ....................................................... G09G 5/02
(52) U.S. Cl. ........................... 345/592; 345/582; 345/629
`(57)
`ABSTRACT
`A first texture is used for determining color and design of a
`polygon Structuring an object rendered upon a two-dimen
`Sional Screen. A Second texture has a pattern of dense
`distribution of color with a predetermined slant relative to
`the two-dimensional Screen. An rendering processing device
`first applies a first texture to a polygon Structuring an object,
`and thereafter performs translucent Synthesis of a Second
`texture on an object applied with the first texture, thereby
`making it possible to easily render an image in a hand-drawn
`illustration Style in, for example, home video games and
`computer graphics.
[Representative drawing: block diagram of the rendering processor 52, showing the geometry processor, memory 51, polygon setup/rasterizing unit, texture buffer (normal texture 67, oblique line texture 68), temporary buffer, pixel pipeline, frame buffer, and display 56.]
IPR2020-01218
Sony EX1016 Page 1

[FIG. 1 (Sheet 1 of 6): an example of an oblique line texture used for hand-drawn illustration style image rendering.]

[FIG. 3 (Sheet 2 of 6): an example of an illustrated image.]

[FIG. 4 (Sheet 3 of 6): block diagram of the rendering processing device, showing the geometry processor 50, memory 51, rendering processor 52 (polygon setup/rasterizing unit, parts process unit, pixel pipeline unit, temporary buffer, texture buffer with normal texture 67 and oblique line texture 68, frame buffer), display controller 54, and display 56.]
[FIG. 5 (Sheet 4 of 6): block diagram of a computer, showing the bus, communications unit 121, communications interface 122, CPU 123, storage unit 128 with program 129 and data 130, input unit and its interface, drive unit 135 and disc media 140, display drive 136, and display 137.]

[FIG. 6 (Sheet 5 of 6): flowchart.
S1: take in graphic information for polygon plotting.
S2: calculate geometry, calculate light source vectors, determine U, V parameters.
S3: rasterizing.
S4: perform parts processing and texture mapping.
S5: produce screen image.
S6: output and display.]

[FIG. 7 (Sheet 6 of 6): flowchart.
S11: classify pixel data of character for every part and perform perspective transformation.
S12: determine pixel color of each part using normal texture; store in RAM the data for each part having its pixel color determined.
S13: generate subtraction value corresponding to intensity of light hitting each part and store in RAM.
S14: perform perspective transformation of the oblique line synchronizing point for every part and calculate the reference point to be applied with the oblique line texture.
S15: center on the reference point and apply the oblique line texture on the part using subtracting translucent processing, then store in RAM.
S16: add a border to each part after applying the oblique line texture and generate frame data, offsetting the XY coordinates for every part when generating frame data and rendering every border.
S17: processing for all parts of all characters completed? If NO, return to S11; if YES, go to next step.]

RENDERING PROCESSING METHOD

0001. This application is related to Japanese Patent Application No. 2001-236567 filed on Aug. 3, 2001, based on which this application claims priority under the Paris Convention and the contents of which are incorporated herein by reference.
BACKGROUND OF THE INVENTION

0002. 1. Field of the Invention

0003. The present invention relates to a rendering processing method and apparatus which render, for example, three-dimensional image information upon a two-dimensional screen such as a television monitor, a recording medium having recorded therein a rendering processing program, and the rendering processing program itself.
0004. 2. Description of the Related Art

0005. In recent years, video game units and personal computers continue to see advances in, for example, the degree of integration and speed of processors, memory, and the like. Accordingly, a rendering processing device configured with such game console units and personal computers produces, from three-dimensional image information, finer, high-definition two-dimensional images which are rich with diversity and which appear more life-like and give a higher sense of realism, and is capable of rendering these images upon a two-dimensional screen.
0006. Meanwhile, a video game user, for example, desires not only games having life-like images, but also games using images in styles such as handwritten cel animation. Images in the cel animation style mentioned above are generally produced through the use of a rendering process called cartoon shading (or cel shading).

0007. Most recently, on the other hand, images in a hand-drawn style giving an even more interesting flavor than cel animation-styled images have been desired. Nevertheless, while conventional cartoon shading processes can produce images close to hand-drawn cel images, expression of flavorful, hand-drawn illustration styled images is difficult.
SUMMARY OF THE INVENTION

0008. The present invention has come about in consideration of such issues, and the objective thereof lies in providing a rendering processing method and apparatus, a recording medium having recorded therein a rendering processing program, and the rendering processing program, all of which allow eased production of images in a hand-drawn illustration style in, for example, a home video game or computer graphics.
0009. The present invention applies a first texture that determines the color or design of a polygon structuring an object to be rendered upon a two-dimensional screen, and applies to that object, through translucent synthesis, a second texture having a pattern of dense distribution of color with a predetermined slant relative to the coordinates of the two-dimensional screen.

0010. More specifically, by merely performing translucent synthesis and application of a second texture on an object applied with a first texture, it is possible to easily produce an image in a hand-drawn illustration style.
0011. Other and further objects and features of the present invention will become obvious upon an understanding of the illustrative embodiments about to be described in connection with the accompanying drawings or will be indicated in the appended claims, and various advantages not referred to herein will occur to one skilled in the art upon employing the invention in practice.
BRIEF DESCRIPTION OF DRAWINGS

0012. FIG. 1 is a diagram showing an example of an oblique line texture used when performing hand-drawn illustration style image rendering.

0013. FIG. 2 is a diagram showing an example of another oblique line texture used together with the oblique line texture of FIG. 1 when performing hand-drawn illustration style image rendering.

0014. FIG. 3 is a diagram showing an example of an illustrated image.

0015. FIG. 4 is a block diagram showing a schematic configuration for implementing rendering processing according to an embodiment of the present invention.

0016. FIG. 5 is a block diagram showing a schematic configuration of a computer that implements the rendering processing according to the embodiment of the present invention.

0017. FIG. 6 is a flowchart of processing in the case where the rendering processing according to the embodiment of the present invention is implemented.

0018. FIG. 7 is a flowchart showing the details of the processing in step S4 of the flowchart in FIG. 6.
DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENTS

0019. Various embodiments of the present invention will be described with reference to the accompanying drawings. It is to be noted that the same or similar reference numerals are applied to the same or similar parts and elements throughout the drawings, and the description of the same or similar parts and elements will be omitted or simplified.
0020. A rendering processing device according to an embodiment of the present invention, together with performing image rendering in the cel animation style that is based on cartoon shading, renders oblique lines for the individual main structural components to be rendered upon a two-dimensional screen. The rendering processing device is capable of expressing portions where light hits and portions in the shadows by adjusting the depth of the oblique lines corresponding to the manner in which light hits each object upon the screen, and further of expressing an illustration image in the hand-drawn style by rendering a border around the outline portion of each object.
0021. The following embodiment is described using the example of video game images.
0022. The rendering processing device of this embodiment is made capable of implementing illustration style rendering through application of a texture (hereafter referred to as an oblique line texture) wherein a color tint pattern is arranged at a slant, for example as shown in FIG. 1 and FIG. 2, relative to the XY coordinates on a two-dimensional screen, for all of the main structural elements that form the game screen, such as the background and the game characters. In particular, the rendering processing device according to this embodiment implements rendering closer to illustrated images by dividing a game character into parts (objects of each part structuring a game character) such as the entire body, head, arms, and legs, and applying an oblique line texture individually to every one of those parts. In the case where an oblique line texture is applied to each part as described above, the rendering processing device of this embodiment first calculates the oblique line synchronizing point of each part (the center coordinates of the head, arms, legs, hands, etc.) and calculates the respective reference point by subjecting the oblique line synchronizing point of every one of these parts to perspective transformation. Then the rendering processing device of this embodiment matches this reference point with the center point of the oblique line texture to apply the oblique line texture onto the already rendered parts.
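The reference point calculation of paragraph 0022 can be sketched as follows. This is an illustrative assumption rather than the patented implementation: the function names, the simple pinhole projection, and the focal length, screen, and texture sizes are all invented for the example.

```python
# Sketch of paragraph 0022: project a part's "oblique line synchronizing
# point" (its 3-D center) to screen space, then center the oblique line
# texture on the resulting 2-D reference point. Names are hypothetical.

def project_sync_point(center, focal_len=256.0, screen_w=640, screen_h=480):
    """Perspective-transform a part's 3-D center into 2-D screen coordinates."""
    x, y, z = center
    if z <= 0:
        raise ValueError("point must lie in front of the viewpoint")
    sx = screen_w / 2 + focal_len * x / z   # screen X of the reference point
    sy = screen_h / 2 - focal_len * y / z   # screen Y (screen Y axis flipped)
    return sx, sy

def texture_origin(ref_point, tex_w=64, tex_h=64):
    """Place the texture so that its center coincides with the reference point."""
    rx, ry = ref_point
    return rx - tex_w / 2, ry - tex_h / 2

ref = project_sync_point((0.0, 0.0, 4.0))   # e.g. a head centered straight ahead
# ref == (320.0, 240.0): the screen center
```

Each part gets its own reference point, so the hatching pattern stays anchored to the part as it moves.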
0023. Here, it is desirable that the oblique line textures in FIG. 1 and FIG. 2 have, for example, the oblique line portions randomly arranged, with each line having a random length (namely, like hand-drawn oblique lines). By using such oblique line textures, the rendering processing device of this embodiment is able to implement hand-drawn illustration style rendering. In addition, the oblique line textures in FIG. 1 and FIG. 2 are situated so that the arranged positions of the respective oblique line portions are not aligned relative to each other. When the oblique line texture is applied for each part, the rendering processing device of this embodiment is able to represent movement (in other words, an "active feel") by, for example, switching the oblique line textures of FIG. 1 and FIG. 2 back and forth with each other at intervals of approximately one second. Moreover, the rendering processing device of this embodiment sets the oblique line movement in each part to be varied; in other words, it sets the timing of the switching between the two oblique line textures, and the switching cycle between them, at random for every one of the parts. This allows the obtained image to have a "soft illustration touch" unlike the already existing computer graphics style or animation style, without causing interference among the oblique lines of characters when, for example, two characters appear on the same screen. Note that the oblique line textures to be switched are not limited to the two oblique line textures shown in FIG. 1 and FIG. 2, but may be any larger number of textures. Furthermore, the timing of the switching need not be fixed, but may also be a variable length of time.
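The randomized per-part switching of paragraph 0023 can be sketched as follows. The class name, the two-texture assumption, and the 0.8 to 1.2 second interval range are illustrative assumptions standing in for the "approximately one second", per-part random timing the paragraph describes.

```python
# Sketch of paragraph 0023: each part flips between two oblique line
# textures on its own randomized schedule, so the hatching of different
# parts (and characters) never moves in lockstep.
import random

class PartHatching:
    def __init__(self, rng, lo=0.8, hi=1.2):
        self.rng = rng
        self.lo, self.hi = lo, hi            # randomized switch period, seconds
        self.index = 0                       # which of the two textures is active
        self.next_switch = rng.uniform(lo, hi)

    def texture_at(self, t):
        """Return 0 or 1: the oblique line texture to apply at time t (seconds)."""
        while t >= self.next_switch:
            self.index ^= 1                  # swap the FIG. 1 / FIG. 2 textures
            self.next_switch += self.rng.uniform(self.lo, self.hi)
        return self.index

head = PartHatching(random.Random(1))        # each part seeds its own schedule
arm = PartHatching(random.Random(2))
```

Because every part draws its switch times from its own random stream, two characters on the same screen never hatch in unison, giving the "active feel" without interference.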
0024. In addition, the rendering processing device of this embodiment performs ray tracing (calculation of illumination), and calculates the intensity of the light that hits every pixel. The rendering processing device then sets a subtraction value in response to the light intensity for every one of those pixels. The subtraction value here is set so that the value is low for a pixel having high light intensity and high for a pixel having low light intensity. Basically, the rendering device sets the above-mentioned subtraction value so that the amount to be subtracted becomes larger when the color of the above-mentioned oblique line texture is subtracted from the color of the portions of the parts that become shadow. Then, when this oblique line texture is to be applied to each of the parts, the rendering processing device of this embodiment references the subtraction value set for every one of the pixels of each of these parts and subtracts the above-mentioned color of the oblique line texture from the color of the earlier rendered parts. Through such subtracting translucent processing, the shadow portion of each part of the game character darkens and the oblique lines of the oblique line texture become deeply noticeable. Meanwhile, the portion of each body part that the light hits becomes lighter and the oblique lines of the oblique line texture become lighter.
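The subtracting translucent processing of paragraph 0024 can be sketched as follows. The inverse-intensity formula and the clamped per-channel subtraction are illustrative assumptions; the patent specifies only the qualitative relationship (low value where light is strong, high where it is weak).

```python
# Sketch of paragraph 0024: scale the hatching texel by a per-pixel
# subtraction value and subtract it from the already-rendered pixel.
# Bright pixels keep faint oblique lines; shadowed pixels get dark ones.

def subtraction_value(intensity):
    """Low where light intensity (0..1) is high, high where it is low."""
    return 1.0 - max(0.0, min(1.0, intensity))

def subtract_blend(pixel_rgb, texel_rgb, sub):
    """Subtracting translucent processing: pixel - sub * texel, clamped at 0."""
    return tuple(max(0.0, p - sub * t) for p, t in zip(pixel_rgb, texel_rgb))

gray = (0.5, 0.5, 0.5)                       # hatching texel color (assumed)
lit = subtract_blend((0.9, 0.8, 0.7), gray, subtraction_value(0.9))
shadow = subtract_blend((0.9, 0.8, 0.7), gray, subtraction_value(0.1))
# the shadowed pixel darkens far more than the lit one
```

The same idea maps onto a hardware subtractive blend (destination minus scaled source), which is why the embodiment can realize it in a GP pixel pipeline.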
0025. Moreover, the rendering processing device of this embodiment locates by calculation the outline of each of the parts comprising the game character, and the border of every one of those parts can be represented by rendering that outline with, for example, a black line. In other words, by showing the border of each of these parts, the rendering processing device is able to clearly represent the boundary (outline) of the torso of the body and the arms even in circumstances where the body and arms, which are all the same color, overlap.
0026. This allows the rendering processing device to clearly render, as shown in FIG. 3, the respective outlines 101 of the characters 100, 110, as well as to make the intense oblique lines 103 in the shadow portions 102 of the characters 100, 110 prominent, while the rendering processing device renders the portion 104 where light hits as a hand-drawn illustration style image drawn with light oblique lines 105. In addition, with the rendering processing device of this embodiment, an image having a soft illustration touch becomes possible by giving each of the oblique lines a feeling of movement through the switching back and forth between the oblique line textures applied to each part at intervals of, for example, approximately one second as described above.
`
Configuration Example

0027. FIG. 4 shows a specific configuration of a rendering processing device which performs the subtracting translucent processing of the oblique line texture described above. Note that the configuration shown in FIG. 4 is an example of the case where the rendering processing device of the present invention is implemented using hardware such as a digital signal processor (DSP) or a graphics processor (GP). Each of the structural elements of FIG. 4 corresponds to the respective internal processing units of such a DSP or GP. In addition, in the following description, only the portions that are characteristic of the present invention are particularly highlighted.
0028. In FIG. 4, the memory 51 stores the graphics information (node information or node connection information such as node coordinate values, RGB node color values, map coordinate values, and vector values) of a polygon, etc. Note that the graphics information is pre-imported from various recording media such as a CD-ROM, DVD-ROM, or semiconductor memory, or via wired or wireless telecommunication media or transmission media.
0029. The geometry processor 50 reads out the above-mentioned graphics information stored in the above-mentioned memory 51, and performs affine transformation, projection transformation to screen coordinates, and geometry calculation such as ray tracing for the nodes. The above-mentioned post-projection-transformation graphics information (polygon data) is sent to the rendering processor 52.
0030. The rendering processor 52 is the portion that performs arithmetic processing for rendering a polygon upon the screen, and forms the polygon data sent from the above-mentioned geometry processor 50 into pixels. This rendering processor 52 is generally divided into a polygon setup/rasterizing unit (hereafter the PSR unit 61), a parts process unit 62, a pixel pipeline unit 63, a frame buffer 64, a temporary buffer 65, and a Z buffer 66, the latter two to be described later. Note that the Z value, which represents the distance from the viewing point in the direction of the depth of the image, is stored in the Z buffer 66. In addition, the temporary buffer 65, which is described in more detail later, temporarily stores each pixel color of each part that has already been rendered, and the subtraction value for the subtracting translucent processing, which subtracts the color of the oblique line texture from the color of each part.
0031. In addition, a texture buffer 53 is deployed in this rendering processor 52. The texture buffer 53 stores the three primary colors R (red), G (green), and B (blue) and an alpha value (A), which are for determining the texel color of the texture, or more specifically, the pixel color of the polygon. In the case of this embodiment, this texture buffer 53 stores a normal texture 67 for determining the color, pattern, etc. of each part of a character, and an oblique line texture 68 to be applied to each part of that character. The normal texture 67 and oblique line texture 68 stored in the texture buffer 53, and the Z value stored in the previously mentioned Z buffer 66, are pre-imported from, for example, various storage media such as a CD-ROM, DVD-ROM, or semiconductor memory, or via wired or wireless telecommunication media or transmission media.
0032. The above-mentioned PSR unit 61 comprises a configuration called a digital differential analyzer (DDA) for performing linear interpolation calculation, and performs import and buffering of the polygon data sent from the above-mentioned geometry processor 50 as well as pixelization through rasterizing and calculation of the texel coordinate values. The PSR unit 61 then sends this pixel data, the texel coordinate values, and light source information to the pixel pipeline unit 63 via the parts process unit 62, and also sends the texture UV coordinate values (the address for referencing a texel) corresponding to the above-mentioned texel coordinate values to the texture buffer 53.
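The DDA mentioned in paragraph 0032 is an incremental interpolator: it walks a value across a pixel span by adding a fixed per-pixel step. A minimal sketch, with the function name and float arithmetic as assumptions (real GP hardware typically uses fixed-point):

```python
# Sketch of a digital differential analyzer: linearly interpolate a value
# (e.g. a texel U coordinate) across a rasterized span of pixels by
# repeatedly adding a constant per-pixel increment.

def dda_span(v0, v1, n_pixels):
    """Values of a linearly varying quantity at each pixel of a span."""
    if n_pixels < 2:
        return [v0]
    step = (v1 - v0) / (n_pixels - 1)      # constant per-pixel increment
    out, v = [], v0
    for _ in range(n_pixels):
        out.append(v)
        v += step
    return out

us = dda_span(0.0, 1.0, 5)                 # U coordinate across a 5-pixel span
# us == [0.0, 0.25, 0.5, 0.75, 1.0]
```

The same add-per-step structure interpolates edge X coordinates, colors, and texel coordinates, which is why a single DDA block serves the whole setup/rasterizing stage.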
0033. Here, the parts process unit 62 classifies the pixel data and texel coordinate values supplied from the above-mentioned PSR unit 61 for every part of the character, and performs perspective transformation for each of these parts. The pixel pipeline unit 63 at this point determines the pixel color for the polygons of each of the above-mentioned parts by referencing the texture color from the normal texture 67 within the texture buffer 53 in response to the texel referencing address. The pixel color determined by this pixel pipeline unit 63 is temporarily stored in the temporary buffer 65.
0034. In addition, the parts process unit 62 uses the light source information supplied from the rendering processor 52 via the PSR unit 61 to calculate, pixel by pixel, the intensity with which light hits each individual pixel, and further calculates the subtraction value to be used for the subtracting translucent processing described above for every pixel of each part and stores it in the temporary buffer 65. At the same time, the parts process unit 62 calculates the reference point in conformity with the perspective projection of the oblique line synchronizing point of each one of the parts (the center point of the head, hand, etc.), and sends the coordinate values of that reference point to the pixel pipeline unit 63.
0035. Next, the pixel pipeline unit 63 reads out from the temporary buffer 65 the pixel color and the subtraction value of each part that has already been rendered, and at the same time reads out from the texture buffer 53 each texel color of the oblique line texture 68. The pixel pipeline unit 63 then matches the reference point with the center point of the oblique line texture, and, in response to the subtraction value, performs the subtracting translucent processing by subtracting the texel color of the oblique line texture from the pixel color of the part in question. Note that here the pixel pipeline unit 63 is made so that it switches back and forth every fixed length of time (for example, every second) between two oblique line textures in which the arranged positions of the respective hatching portions are not aligned with each other, as shown in FIG. 1 and FIG. 2, which were described earlier. The pixel color of each of the parts after the subtracting translucent processing of this oblique line texture is again stored in the temporary buffer 65.
0036. Next, the pixel color of each of the parts after the subtracting translucent processing of the oblique line texture is read out from the temporary buffer 65 and sent to the pixel pipeline unit 63. The pixel pipeline unit 63 then, after giving a border to each of the parts, draws the pixel data of each of these parts into the frame buffer 64. Note that at this point, each of the parts is drawn upon the frame buffer 64 with the XY coordinate values of every border (of every part) offset.
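One common way to realize the offset bordering of paragraph 0036 is to stamp a part's silhouette in the border color at small XY offsets before drawing the part itself, leaving a dark rim around its outline. The offset-silhouette trick, the grid representation, and all names here are assumptions about the described step, not the patent's circuitry:

```python
# Sketch of offset bordering: draw the part's silhouette shifted by one
# cell in each direction using the border color, then draw the part on
# top, so a rim remains visible around its outline.

BORDER = "#"                                 # border color (assumed)
EMPTY = "."

def draw_with_border(frame, mask, x0, y0, fill):
    """Draw `mask` (a set of (x, y) cells) with a 1-cell border into `frame`."""
    for dx, dy in ((-1, 0), (1, 0), (0, -1), (0, 1)):   # offset silhouettes
        for x, y in mask:
            px, py = x0 + x + dx, y0 + y + dy
            if 0 <= py < len(frame) and 0 <= px < len(frame[0]):
                frame[py][px] = BORDER
    for x, y in mask:                                    # the part itself, on top
        frame[y0 + y][x0 + x] = fill

frame = [[EMPTY] * 7 for _ in range(5)]
draw_with_border(frame, {(0, 0), (1, 0), (2, 0)}, 2, 2, "o")
# middle row reads ".#ooo#.": the part, rimmed by its border
```

Drawing the border first and the part last means overlapping same-color parts still show a visible boundary, which is exactly the effect paragraph 0025 describes.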
0037. The frame buffer 64 comprises memory space corresponding to a display (screen) 56 of a television monitor, and in that memory space are written the color values, etc. for each pixel structuring, for example, each of the parts of the character or the background. The screen data formed frame by frame upon this frame buffer 64 is thereafter read out as needed by a display controller 54.
0038. The display controller 54 produces the horizontal synchronization signal and vertical synchronization signal for the television monitor, and in addition, linearly dispenses in order the pixel data from the frame buffer 64 in accordance with the display timing of that monitor. The two-dimensional image comprising these linearly, sequentially given color values is displayed upon a display 56 such as a television monitor.
Alternative Example

0039. The rendering process of this embodiment can be implemented not only in conformity with a hardware configuration such as shown in FIG. 4 above, but may naturally also be implemented through software (i.e., an application program for a computer, etc.).
0040. FIG. 5 through FIG. 7 show a configuration and operation in the case of implementing the rendering processing of the present invention in a computer. FIG. 5 shows a structural example of the main elements of a computer. FIG. 6 shows the general process flow in the case where the CPU 123 of the computer of FIG. 5 executes a rendering processing program of the present invention. FIG. 7 shows the detailed process flow of the processing in step S4 of FIG. 6.
0041. In FIG. 5, a storage unit 128 comprises, for example, a hard disk or a drive thereof. This storage unit 128 stores an operating system program; a computer program 129, which includes the rendering processing program of this embodiment that is imported from any variety of storage media such as a CD-ROM, DVD-ROM, etc., or via a telecommunication line; and the various data 130 to be used for polygon plotting, such as graphical information, the RGBA values of the textures, the Z value, and the oblique line texture to be applied to every one of the parts.
0042. The communications unit 121 is a communications device for performing external data communications, such as a modem for connecting to an analog public telephone line, a cable modem for connecting to a cable television network, a terminal adapter for connecting to an integrated services digital network (ISDN), or a modem for connecting to an asymmetric digital subscriber line (ADSL). The communications interface unit 122 is an interface device for performing protocol conversion to allow the exchange of data between the communications unit 121 and an internal bus (BUS).
`0043. The input unit 133 is an input device such as a
`keyboard, mouse, or touch pad. The user interface unit 132
`is an interface device for providing a signal from the input
`unit 133.
0044. The drive unit 135 is a drive device capable of reading out various programs or data from, for example, disc media 140 such as a CD-ROM or DVD-ROM, or card-shaped semiconductor memory. The drive interface unit 134 is an interface device for providing a signal from the drive unit 135.
0045. The display unit 137 is a display device such as a cathode ray tube (CRT) or liquid crystal display device. The display drive unit 136 is a drive device which drives the display unit 137.

0046. The CPU 123 provides general control of the personal computer based upon an operating system program or a computer program 129 of the present invention stored in the storage unit 128.
0047. The ROM 124 comprises, for example, rewritable and nonvolatile memory such as flash memory and stores the basic input/output system (BIOS) and various initial settings of this computer. The RAM 125 is loaded with application programs and various data read out from the hard disk of the storage unit 128, and in addition, is used as the working RAM of the CPU 123, the texture buffer, the temporary buffer, the Z buffer, and the frame buffer.
0048. With the structure shown in FIG. 5, the CPU 123 realizes the rendering process of the earlier-described embodiment through the execution of the computer program of this embodiment, which is one of the application programs read out from the hard disk of the above-mentioned storage unit 128 and loaded into RAM 125.
0049. General Flow of the Rendering Program

0050. Next, the process flow when the CPU 123 of the computer shown in FIG. 5 operates based on the application program of this embodiment used for rendering (the rendering processing program) is described forthwith using FIG. 6 and FIG. 7.
0051. In FIG. 6, the CPU 123, as the processing of step S1, reads out from the storage unit 128 the graphics information, the RGBA values of the textures, the Z value, and the oblique line texture, which have been read out in advance from the disk media 140 and accumulated as data 130 in the storage unit 128 and are to be used for polygon plotting, holding them in RAM 125.
0052. Next, the CPU 123, as the processing of step S2, reads out the graphics information held in RAM 125, which is then subjected to affine transformation, projection transformation to screen coordinates, geometry calculation such as ray tracing for the nodes, and perspective transformation.
0053. Next, the CPU 123, as the processing of step S3, performs rasterizing using the polygon data obtained through the geometry calculation, and further, as the processing of step S4, performs classification of the parts of the character, performs coloring of each part using the normal texture (texture mapping), and performs the subtracting translucent processing in response to the application of the oblique line texture to every one of the parts and the manner in which the light hits. The details of the processing involved in this step S4 are described forthwith with FIG. 7.
0054. Thereafter, the CPU 123, as the processing of step S5, produces a screen image from the pixel data processed in step S4, and further, as the processing of step S6, sends this screen image information to the display drive 136. This allows an image to be displayed upon the display unit 137.
0055. Details of Step S4

0056. In the following, the detailed process flow of the processing in step S4 of FIG. 6 is described using FIG. 7.
0057. In FIG. 7, as the CPU 123 proceeds to the processing of step S4 of FIG. 6, the CPU 123 first, as the processing of step S11, classifies the pixel data and texel coordinate values produced in step S3 of FIG. 6 for every part of a character, and further performs perspective transformation on every one of these parts.
0058. Next, the CPU 123, as the processing of step S12, determines the pixel color of each of the parts using the normal texture loaded into RAM 125. The CPU 123 then stores in RAM 125 the data of each of the parts for which this pixel color was determined.
0059. Next, the CPU 123, as the processing of step S13, calculates the intensity of the light hitting each part by performing ray tracing, finds the subtraction value for each one of these parts from this light intensity, and stores it in RAM 125.
0060. In addition, the CPU 123, as the processing of step S14, calculates a reference point by subjecting the oblique line synchronizing point of each one of the parts to perspective transformation.
0061. Next, the CPU 123, as the processing of step S15, reads out the pixel color and the subtraction value for each of the parts stored earlier in the RAM 125, as well as the texel color of the oblique line texture. The CPU 123 then matches the reference point with the center point of the oblique line texture, subtracts the texel color of the oblique line texture from the pixel color of each of the parts in response to the subtraction value, and thereafter returns the result to RAM 125. Note that during this subtracting translucent processing of the oblique line texture, the CPU 123 switches back and forth every fixed length of time (for example, every second), alternatingly applying two oblique line textures in which the arranged positions of the respective oblique line portions are not aligned with each other, as shown in the earlier-described FIG. 1 and FIG. 2.
0062. Next, the CPU 123, as the processing of step S16, gives a border to every one of the parts after applying an oblique line texture, and structures frame data by rendering each of these parts with a border attached. Note that when producing this frame data, the CPU 123 produces frame data in which each part is arranged within the frame by offsetting the XY coordinate values.
0063. Thereafter, the CPU 123, as the processing of step S17, determines whether the rendering process for all of the parts of all of the characters arranged in a frame, as well as for other objects, the background, etc., has been completed. When processing has not been completed, the processing of the CPU 123 returns to step S11; otherwise, when completed, the processing proceeds to step S5 in FIG. 6.
0064. Synopsis of the Embodiments of the Present Invention

0065. According to these embodiments, by displaying a border for each of the parts of a character, for example, even if parts having the same color overlap, the boundary between parts can be
