`
`Exhibit 1
`
`
`
`
US006972774B2
`
(12) United States Patent
Eguchi

(10) Patent No.: US 6,972,774 B2
(45) Date of Patent: Dec. 6, 2005
`
`(54) IMAGE PROCESSING SYSTEM FOR
`INSERTING PLURALITY OF IMAGES INTO
`COMPOSITE AREA, AND MEDIUM
`
`(75) Inventor: Harutaka Eguchi, Kawasaki (JP)
`(73) Assignee: Fujitsu Limited, Kawasaki (JP)
`
(*) Notice: Subject to any disclaimer, the term of this patent is extended or adjusted under 35 U.S.C. 154(b) by 131 days.
(21) Appl. No.: 09/737,489
(22) Filed: Dec. 18, 2000
(65) Prior Publication Data
     US 2001/0015729 A1    Aug. 23, 2001
(30) Foreign Application Priority Data
     Feb. 21, 2000 (JP) ............................. 2000-043672
(51) Int. Cl.7 ................................................ G09G 5/00
(52) U.S. Cl. ...................................................... 345/629
(58) Field of Search ................................ 345/629, 634,
         345/635, 55, 56, 419, 531, 540, 641, 790,
         345/794, 555, 565, 420, 421, 422, 423, 424,
         345/426, 427, 428, 619, 625, 628, 630, 638,
         345/473, 474, 475; 715/790, 794
`
(56) References Cited

U.S. PATENT DOCUMENTS

4,602,286 A     7/1986  Kellar et al. ................ 348/597
5,381,518 A *   1/1995  Drebin et al. ................ 345/424
5,627,651 A *   5/1997  Seto et al. .................. 358/3.15
5,636,334 A *   6/1997  Hidaka ....................... 345/419
5,732,230 A *   3/1998  Cullen et al. ................ 345/764
5,819,103 A *  10/1998  Endoh et al. ................. 710/1
5,861,888 A *   1/1999  Dempsey ...................... 345/582
5,977,965 A *  11/1999  Davis et al. ................. 345/723
5,979,424 A *  11/1999  Alvarez et al. ............... 124/16
5,982,350 A *  11/1999  Hekmatpour et al. ............ 345/629
6,031,542 A *   2/2000  Wittig ....................... 345/426
6,137,498 A *  10/2000  Silvers
6,266,068 B1 *  7/2001  Kang et al. .................. 345/629
6,268,935 B1 *  7/2001  Kingetsu et al. .............. 358/2.1
6,323,876 B1 * 11/2001  Rao et al. ................... 345/634
6,342,900 B1 *  1/2002  Ejima et al. ................. 345/698
6,348,953 B1 *  2/2002  Rybczynski ................... 348/584
6,417,848 B1 *  7/2002  Battle ....................... 345/419
6,469,701 B1 * 10/2002  Gumhold ...................... 345/419
6,480,199 B1 * 11/2002  Oomori et al. ................ 345/536
6,496,189 B1   12/2002  Yaron et al. ................. 345/428
6,590,586 B1 *  7/2003  Swenton-Wall et al. .......... 345/730

* cited by examiner
Primary Examiner-Kee M. Tung
Assistant Examiner-Enrique L. Santiago
(74) Attorney, Agent, or Firm-Staas & Halsey LLP

(57) ABSTRACT
An image processing system according to the present invention, aiming at providing a function capable of efficiently managing and utilizing image data, comprises a display unit for displaying on a screen a composite area as an aggregation of unit areas into which images are inserted, and an operation unit for inserting a processing target image into the unit area within the composite area.

20 Claims, 17 Drawing Sheets
`
`
`
`
`
[Representative drawing: flowchart of the environment setting process - display environment setting menu screen; wait for user's operations (such as setting consecutive photographing interval, background color, block size in insert area, and number of blocks); store setting condition; end of process.]
`
`
[Drawing sheets 1-10 of 17: figures reproduced as drawings; only scattered labels survive the OCR.]
FIG. 1: external configuration of the image processing system.
FIG. 2: hardware block diagram (image acquisition device, CPU, memory, touch panel, hard disk, keyboard).
FIG. 3: principle of the image insert process.
FIG. 4: memory map of the insert area.
FIG. 5: screen layout and typical image insert process (menu items NORMAL, TAKE OFF, SEE, INSERT; "51 FRAMES" counter).
FIG. 6: consecutive image insert process.
FIG. 7: data structure for image data management.
FIG. 8: transfer of an inserted image.
FIG. 9: deletion of an inserted image.
FIG. 10: storing of the inserted images.
[Drawing sheet 11 of 17]
FIG. 11: flowchart of the insert process. Legible steps:
read number of insert blocks, (lengthwise, crosswise) dimensions of block, and background color of insert area (S31)
calculate (lengthwise, crosswise) dimensions of insert area; ensure image memory area for insert area (S32)
develop before-insert area and display it on screen (S33)
detect user's drag & drop (S34)
is image dropped in insert area? (S35)
detect number of insert targets (S36)
calculate position of next insert block (S37)
is block position within insert area? (S38)
modify dimensions of original image to dimensions of next insert block, copy them to block position, and display them on screen
decrement number of insert targets by "1"
end of process
[Drawing sheet 12 of 17]
FIG. 12: flowchart of the inserted image transferring/deleting process. Legible steps:
detect selection of inserted image (to be transferred) within insert area (S51)
detect drag & drop to transfer destination (S52)
is transfer destination within insert area? (S53)
if yes: calculate position of transfer destination block (S54); transfer inserted image to block position (S55)
if no: delete transfer-originating inserted image (S56)
end of process
[Drawing sheet 13 of 17]
FIG. 13: flowchart of the inserted image storing process. Legible steps:
reduce image in insert area down to size of one frame (S71)
display reduced image on photographed image display unit (S72)
store synthesized image in hard disk 204 (S73)
end of process
[Drawing sheet 14 of 17]
FIG. 14: flowchart of the environment setting process. Legible steps:
display environment setting menu screen
wait for user's operations (such as setting consecutive photographing interval, background color, block size in insert area, and number of blocks)
decision S93 (N loops back to waiting; Y continues)
store setting condition
end of process
[Drawing sheets 15-17 of 17: figures reproduced as drawings; only scattered labels survive the OCR.]
FIG. 15: example of a variation of the insert area (menu items TAKE OFF, SEE, INSERT; "51 FRAMES" counter).
FIG. 16: example of a variation of the memory map of the insert area (blocks B5-B9 labeled).
FIG. 17: insert area management table.
`IMAGE PROCESSING SYSTEM FOR
`INSERTING PLURALITY OF IMAGES INTO
`COMPOSITE AREA, AND MEDIUM
`
`BACKGROUND OF THE INVENTION
`
`1. Field of the Invention
`The present invention relates to a technology of operating
`image data.
`2. Description of the Related Art
What is exemplified as this type of conventional technology may be a technology of managing a plurality of images by laying out these images on a sheet, as in the case of a photo album.
This technology is capable of laying out the images in any positions on the sheet. Further, according to this conventional technology, it is feasible to lay out the images so that they overlap with each other, and the user may specify which target image appears as the first image when viewed, and which image is placed behind the first image where the images overlap. Further, the user may arbitrarily specify a different size for each image that is laid out.
According to this conventional technology, it is required that a system (software) manage positional data of the respective images laid out on the sheet, overlap data (front surface/rear surface) between the images, and size data of the images in order to actualize the function described above. A technique of managing the overlapped state between the images is generally known as layer management.
On this type of sheet, when a new image is disposed on the sheet, or when a position of an image that has already been disposed on the sheet shifts, the respective images are re-formed by determining, from the various items of management data stored, the respective image segments which should be displayed.
Hence, this type of conventional technology requires a great quantity of resources for retaining a multiplicity of pieces of management data, and the process of re-forming each image is a load on the system (software). Therefore, a system having poor resources and a low throughput cannot sufficiently utilize that technology.
On the other hand, with the development of the Internet and the spread of digital cameras, image data have come to be dealt with in daily life. Further, a mobile environment of portable terminals, cellular phones, etc. that are capable of operating on image data has been remarkably developed.
Under such a trend, it is of much importance to users who operate the images to efficiently manage and utilize the created image data. Particularly under the resource-poor environment of mobile equipment, a function of efficiently operating and editing the image data is also required.
The prior arts did not, however, provide such a function that the user is able to efficiently operate and edit the image data, such as combining and synthesizing a plurality of images into a new image. Moreover, there was not provided a function of efficiently managing and utilizing the images under an environment with poor resources such as memory and disk capacity.
`
`
`SUMMARY OF THE INVENTION
`
It is a primary object of the present invention, which was devised to obviate the above problems inherent in the prior arts, to provide such a function that a user can efficiently manage and utilize image data.
`
It is another object of the present invention to provide an image processing technology of processing a plurality of images which can be sufficiently utilized in a system with a low throughput and poor resources.
To accomplish the above objects, according to one aspect of the present invention, an image processing system comprises a display unit for displaying, on a screen, a composite area as an aggregation of unit areas into which images are inserted, and an operation unit for inserting a processing target image into the unit area within the composite area.
The image inserted into the unit area may be transferable to another unit area within the composite area.
The image inserted into the unit area may be deleted by transferring the same image to a position outside the composite area.
The processing target image may be inserted into the unit area by a drag-and-drop operation.
The image processing system described above may further comprise a transfer detection unit for indicating a processing target image and detecting a transfer of the indicated image, and the indicated image may be inserted into the unit area.
In the image processing system described above, the composite area into which the images are inserted may be stored as an image having predetermined dimensions. Herein, the image having the predetermined dimensions implies an image having a predetermined number of pixels.
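For illustration, the sketch below shows one way such a fixed-dimension copy could be produced before saving; the frame size, file names, and the use of the Pillow library are assumptions made for the example, not details taken from the patent.

```python
# Illustrative sketch only (not the patent's implementation): reduce a composite
# (insert-area) image to a fixed frame size before saving. The frame size, file
# names, and the use of the Pillow library are assumptions.
from PIL import Image

FRAME_SIZE = (160, 120)  # assumed "predetermined dimensions" in pixels

def store_composite(composite: Image.Image, path: str) -> None:
    """Scale the composite area down to one frame and save it to disk."""
    reduced = composite.resize(FRAME_SIZE, Image.BILINEAR)
    reduced.save(path)

# Hypothetical usage:
# store_composite(Image.open("insert_area.png"), "frame_0001.png")
```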
The image processing system described above may further comprise a related image indicating module for relating a plurality of images to each other, and when the processing target image is related to other images, the related images may be consecutively inserted, together with the processing target image, into the plurality of unit areas. In this case, when the number of images to be inserted exceeds the number of insertable unit areas, the image insertion may be finished.
The composite area may be composed of unit areas having different dimensions.
According to another aspect of the present invention, an image processing system comprises a plurality of unit storage areas for storing processing target images, and a control unit for controlling an access to each of the unit storage areas, and the control unit may store the processing target unit images in the plurality of unit storage areas and access the unit storage areas in a predetermined sequence, thereby generating a composite image from the unit images. In this case, the unit storage areas may have different capacities, and the composite image may be composed of unit images having different dimensions.
According to a further aspect of the present invention, there is provided a readable-by-computer recording medium recorded with a program for making a computer actualize the functions described above.
As explained above, according to the present invention, the image processing system comprises the display unit for displaying, on the screen, the composite area as the aggregation of unit areas into which the images are inserted, and the operation unit for inserting the processing target image into the unit area within the composite area. Further, a composite area into which one or more images are inserted is stored as an image having predetermined dimensions. A user is therefore able to efficiently manage and utilize the image data.
`
`
`
`BRIEF DESCRIPTION OF THE DRAWINGS
`
FIG. 1 is a view showing an external configuration of an image processing system in an embodiment of the present invention;
FIG. 2 is a block diagram showing a hardware architecture of the image processing system;
FIG. 3 is a view showing a principle of an image insert process;
FIG. 4 is a diagram showing a memory map of an insert area;
FIG. 5 is a view showing a typical image insert process;
FIG. 6 is a view showing a consecutive image insert process;
FIG. 7 is a diagram showing a data structure for image data management;
FIG. 8 is a view showing how the inserted image is transferred;
FIG. 9 is a view showing how the inserted image is deleted;
FIG. 10 is a view showing how the inserted image is stored;
FIG. 11 is a flowchart showing the insert process;
FIG. 12 is a flowchart showing an inserted image transferring/deleting process;
FIG. 13 is a flowchart showing an inserted image storing process;
FIG. 14 is a flowchart showing an environment setting process;
FIG. 15 is a view showing an example of a variation of the insert area;
FIG. 16 is a diagram showing an example of a variation of the memory map of the insert area; and
FIG. 17 is a chart showing an insert area management table.
`
`DETAILED DESCRIPTION OF THE
`PREFERRED EMBODIMENT
`
An embodiment of an image processing system according to the present invention will hereinafter be described with reference to FIGS. 1 through 17. FIG. 1 is a view showing an external configuration of the image processing system in this embodiment. FIG. 2 is a block diagram showing a hardware architecture of this image processing system. FIG. 3 is a view showing a principle of an image insert process. FIG. 4 is a diagram showing a memory map of an insert area 2 shown in FIG. 3. FIG. 5 is a view showing a typical image insert process. FIG. 6 is a view showing a consecutive image insert process. FIG. 7 is a diagram showing a data structure for image data management. FIG. 8 is a view showing how the inserted image is transferred. FIG. 9 is a view showing how the inserted image is deleted. FIG. 10 is a view showing how the inserted image is stored. FIG. 11 is a flowchart showing the insert process. FIG. 12 is a flowchart showing an inserted image transferring/deleting process. FIG. 13 is a flowchart showing an inserted image storing process. FIG. 14 is a flowchart showing an environment setting process. FIG. 15 is a view showing an example of a variation of the insert area. FIG. 16 is a diagram showing an example of a variation of the memory map of the insert area. FIG. 17 is a chart showing an insert area management table for managing the insert area.
`<Architecture>
FIG. 1 is a view showing an external configuration of an image processing system 200 in this embodiment. The image processing system 200 is connected via a universal serial bus (which will hereinafter be abbreviated to USB) cable 300 to an image acquisition device 101.
The image processing system 200 is constructed to function in such a way that a predetermined control program is executed by a portable terminal.
The image acquisition device 101 includes a button 102, provided on a front surface of a device body 100, for indicating an execution of an image taking process, and a lens 103 for forming an image inside the device body 100. Further, the image acquisition device 101 has an unillustrated CCD imaging device incorporated into the device body 100.
FIG. 2 is the block diagram showing the hardware architecture of the image processing system 200. As shown in FIG. 2, the image processing system 200 includes a CPU 201 for obtaining digital image signals generated by the image acquisition device 101 and controlling the image process, and a memory 202 for storing the control program executed by the CPU 201 and data processed by the CPU 201. The image processing system 200 further includes a touch panel 203 for detecting an operation by a user, a hard disk 204 for recording the data, and a keyboard 206.
The CPU 201 controls photographing by executing the control program stored in the memory 202. The CPU 201 normally monitors a photographing instruction by the user or an image operation on the touch panel 203.
The CPU 201, when detecting the user's photographing instruction, stores the image data obtained by the image acquisition device 101 on the hard disk 204.
Further, the CPU 201, when detecting the user's image operation, i.e., a drag-and-drop etc. of an image on the touch panel 203, executes a process corresponding to this operation. The drag-and-drop herein connotes an operation in which an operation target displayed on the screen, pinpointed by a depression on the touch panel 203 or by depressing a mouse button, is moved (drag) in this depressed state to a desired position, and the depression is then released (drop).
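As an illustration of that press-move-release interpretation, the sketch below tracks a drag-and-drop from raw pointer events; the event names and the move_to interface on the dragged target are assumptions, not part of the patent.

```python
# Minimal sketch (assumed event names, not the patent's code): interpreting
# press / move / release events on the touch panel or mouse as a drag-and-drop.
class DragDropDetector:
    def __init__(self):
        self.dragging = None  # operation target picked up at the press position

    def on_press(self, target, x, y):
        # the depression pinpoints the operation target; the drag begins
        self.dragging = target

    def on_move(self, x, y):
        # while still depressed, the target follows the pointer
        if self.dragging is not None:
            self.dragging.move_to(x, y)

    def on_release(self, x, y):
        # releasing the depression is the drop; report what landed where
        dropped, self.dragging = self.dragging, None
        return dropped, (x, y)
```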
The memory 202 stores the control program executed by the CPU 201 and the image data processed by the CPU 201.
The touch panel 203 is constructed of a combination of a liquid crystal display and a sensor for detecting a position of the depressing operation by the user. Icons and image data, which are manipulated by the user, are displayed on the touch panel 203. The CPU 201 detects the user's manipulations with respect to the icons and the image data through the touch panel 203.
What is well known as the sensor of this type of touch panel 203 may be a pressure sensitive type sensor, an electrostatic type sensor, or an ultrasonic type sensor.
The device for detecting the manipulating position by the user is not, however, limited to this touch panel in terms of actualizing the present invention; as a matter of course, other devices such as a mouse, a keyboard, etc. are usable (a device such as the touch panel 203, the mouse, or the keyboard corresponds to a transfer detecting unit).
The hard disk 204 stores the image data of a photographed object and the image data processed by the CPU 201.
`<Outline of Insert Operation>
The image processing system 200 in this embodiment executes the image insert process as a characteristic function thereof.
FIG. 3 is the explanatory view showing the principle of the image insert process. The image insert process is a process of inserting and displaying an image specified by the user into and on an insert area 2, in scale-down or in scale-up or in equal size.
The insert area 2 (corresponding to a composite area) is configured by combining a plurality of blocks 1 in a tile-like shape. The user may specify an arbitrary color as a background color of the insert area 2 among colors prepared in the image processing system 200. Further, the user may specify dimensions of the insert area 2.
The block(s) (corresponding to a unit area(s)) indicates a position and dimensions in which the image is inserted. The user may set arbitrary values for defining the dimensions of the block(s). The user may also specify an arbitrary value for setting the number of blocks within the insert area 2. Referring to FIG. 3, the block 1 is framed in bold at a right upper corner of the insert area 2.
In the insert process, the user specifies, by the drag-and-drop, an insert target original image 3 and a block 4 into which the image is inserted. With this operation, at first the original image is copied. Next, the copied image is displayed in the insert destination block 4 in scale-up or in scale-down or in equal size corresponding to the dimensions of this block 4.
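A minimal sketch of that copy-and-fit step is given below, assuming the Pillow library; the resampling method is likewise an assumption, since the patent does not specify one.

```python
# Sketch of the copy-and-fit step, assuming the Pillow library; the patent does
# not name a library or a resampling method.
from PIL import Image

def insert_into_block(original: Image.Image, block_size: tuple) -> Image.Image:
    """Copy the original image and fit the copy to the destination block."""
    copy = original.copy()          # the original image is copied first
    if copy.size != block_size:     # equal size needs no scaling
        # scale up or scale down to the block's dimensions
        copy = copy.resize(block_size, Image.BILINEAR)
    return copy
```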
Referring again to FIG. 3, the drag-and-drop (which will hereinafter be called an insert operation) is represented by use of an arrow line A. With this insert operation, the original image 3 is inserted into the insert destination block 4. The user may specify any block within the insert area 2 to define a position of this insert destination block. Further, if the user executes the insert operation outside the insert area 2, this operation is ruled out, with the result that the original image 3 is inserted into none of those blocks 1 within the insert area 2.
`On the other hand, the user is able to transfer an inserted
`image 4 to a desired block by dragging and dropping (as
`indicated by an arrow line B) the inserted image in the insert
`area 2. Further, the user is able to delete the inserted image
`4 out of the insert area 2 by dragging and dropping (as
`indicated by an arrow line C) the inserted image outside the
`insert area 2.
FIG. 4 shows the memory map for retaining the respective inserted images displayed in the insert area 2. Blocks B1 through B9 configure the insert area 2. Image memory areas 40 (corresponding to a plurality of unit storage areas) corresponding to those blocks B1-B9 are ensured on the memory 202. With the insert operation described above, data of the inserted image is stored in the image memory area 40.
The data in the image memory areas 40 is displayed directly as the inserted image on the touch panel 203. Further, the image data in the image memory area 40 is transferred, deleted, or stored in the hard disk 204 by the operations described above, such as transferring, deleting, and storing. These processes are executed under the control of the CPU 201. The CPU 201 corresponds to a control unit controlling an access to the unit storage area.
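One way this block-per-memory-area arrangement could be modeled is sketched below; the 3x3 layout, block dimensions, and method names are assumptions used only for illustration.

```python
# Rough model of the FIG. 4 memory map: one image memory area per block B1-B9.
# The 3x3 layout, block dimensions, and method names are assumptions.
class InsertArea:
    def __init__(self, rows=3, cols=3, block_w=80, block_h=60):
        self.rows, self.cols = rows, cols
        self.block_w, self.block_h = block_w, block_h
        # one unit storage area per block; None means the block is still empty
        self.blocks = [None] * (rows * cols)

    def store(self, index, image_data):
        """Store inserted-image data in the block's image memory area."""
        self.blocks[index] = image_data

    def clear(self, index):
        """Delete an inserted image (e.g. after a drop outside the area)."""
        self.blocks[index] = None

    def transfer(self, src, dst):
        """Move an inserted image from one block's area to another's."""
        self.blocks[dst], self.blocks[src] = self.blocks[src], None
```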
<Layout on Screen>
FIG. 5 shows a screen for showing the insert operations on the touch panel 203 (which corresponds to a display unit displaying the composite area on the screen). The touch panel 203 includes the insert area 2, a photographing button icon 22, a photographed image display frame module 23, a photographing mode menu 24, and an environment setting menu 26.
The photographing button icon 22 serves to detect a user's photograph instruction through the touch panel 203. The user specifies photographing by depressing either the photographing button icon 22 or the button provided on the image acquisition device 101.
The photographed image display frame module 23 includes film frames that simulate a configuration of a film as a whole, an image display area consisting of three film frames, and a scroll button icon for scrolling the 3-frame image display area. The photographed image data are displayed on the 3-frame image display area.
The photographing mode menu 24 is used for switching the photographing mode between a normal photographing mode and a consecutive photographing mode.
The environment setting menu 26 serves to set an environment of this image photographing system, such as a consecutive photographing interval, a background color of the insert area 2, etc.
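A hypothetical container for the values handled by this menu is sketched below; the field names, types, and defaults are assumptions, not taken from the patent.

```python
# Hypothetical container for the settings handled by the environment setting
# menu 26; field names, types, and defaults are assumptions.
from dataclasses import dataclass

@dataclass
class EnvironmentSettings:
    shooting_interval_s: float = 1.0       # consecutive photographing interval
    background_color: str = "#FFFFFF"      # background color of the insert area 2
    block_size: tuple = (80, 60)           # block (unit area) dimensions
    block_count: tuple = (3, 3)            # number of blocks (columns, rows)
```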
<Normal Insert Operation>
FIG. 5 further shows a transition of the screen when executing a normal insert operation. The normal insert operation is an operation of inserting one piece of original image displayed in the photographed image display frame module 23 into one block in the insert area 2.
Referring to FIG. 5, an arrow line D represents the drag-and-drop implemented by the user. An image 23b existing at a start point of the arrow line D is an insert target original image. The image 23b is selected as a drag-and-drop target and is therefore framed in bold (frame 31). Further, a block 21b positioned at an end point of the arrow line D is an insert destination block. With this drag-and-drop, the image 23b is inserted into the block 21b.
Note that if a position outside the insert area 2 is specified as an insert destination, this insert operation is ruled out.
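The sketch below shows one way a drop position could be resolved to a destination block, with drops outside the insert area ruled out; the coordinate layout and parameter names are assumptions made for the example.

```python
# Sketch of resolving a drop position to an insert destination block; the grid
# layout and parameter names are assumptions for the example.
def block_at(x, y, area_x, area_y, block_w, block_h, cols, rows):
    """Return the block index under (x, y), or None when the drop falls
    outside the insert area (in which case the insert is ruled out)."""
    col = (x - area_x) // block_w
    row = (y - area_y) // block_h
    if 0 <= col < cols and 0 <= row < rows:
        return row * cols + col
    return None
```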
<Consecutive Insert Operation>
FIG. 6 shows a transition of the screen when executing a consecutive insert operation. The consecutive insert operation is a process of consecutively inserting a set of plural images by one insert operation (the CPU 201, which provides this function, corresponds to a related image indicating module).
This set of images consists of a plurality of images photographed in the consecutive photographing mode. If the image positioned at a head of this set of images is specified as an insert target, the whole set of images becomes the insert target.
Referring to FIG. 6, an arrow line E represents a user's drag-and-drop. An image 23c existing at a start point of the arrow line E is an insert target original image specified by the user. The image 23c is selected as a drag-and-drop target and is therefore framed in bold (frame 31).
Further, a block 21c positioned at an end point of the arrow line E is an insert destination block. An arrow line F, of which a start point exists in the block 21c, indicates a transition direction of a consecutive insert target block. Thus, according to the consecutive insert operation in the present image processing system 200, the images are inserted consecutively into the blocks disposed in the right direction from the insert destination block 21c. Further, when the insert destination reaches the block 21d disposed at a right end of the insert area 2 midway through the consecutive insert operation, the insert destination moves round down to a block 21e disposed at a left end of the next lower row. Then, the insert operation continues as an arrow line G indicates. Namely, the consecutive insert operation proceeds in a right downward direction with priority given to the blocks disposed in the horizontal direction.
Further, when the insert position arrives at a block 21f disposed at a right lower corner of the insert area 2 during the consecutive insert operation, this consecutive insert operation comes to an end. Therefore, remaining images among the plurality of consecutive insert target images photographed in the consecutive photographing mode are ruled out of the insert targets.
On the other hand, if all the images contained in the above set of images are inserted before the insert position arrives at the block 21f disposed at the right lower corner of the insert area 2, the insert operation is finished at this point of time.
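The traversal just described (left to right, wrapping to the next lower row, stopping at the bottom-right block or when the set runs out) could look like the sketch below, reusing the hypothetical InsertArea from the earlier example.

```python
# Sketch of the consecutive insert order: fill blocks left to right, wrap to the
# left end of the next lower row, and stop when the right lower block has been
# filled or the set of images runs out. Reuses the hypothetical InsertArea above.
def consecutive_insert(images, start_index, area):
    """Insert a set of images starting at start_index, in row-major order."""
    index = start_index
    last = area.rows * area.cols - 1        # the block at the right lower corner
    for image in images:
        if index > last:                    # area exhausted: remaining images
            break                           # are ruled out of the insert targets
        area.store(index, image)
        index += 1                          # next block to the right, wrapping rows
    return index
```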
FIG. 7 shows a data structure for managing the insert target image data in the present image processing system 200. This data structure is generally known as a list structure. In the list structure, a plurality of elements are linked by next addresses 54a, 54b, 54c, etc., and a sequence relation between the elements is thus expressed. In this list structure, a next address 54g is NULL. This element is a tail element of the list structure. Further, the tail element is indicated by a list terminal address 51.
On the other hand, the elements corresponding to the image data displayed in the photographed image display frame module 23 on the touch panel 203 are indicated by an intra-screen head address 52 and an intra-screen terminal address 53.
Each element in this data structure retains a file name, a date, and a consecutive photographing attribute 55 in addition to the next address 54. Herein, the file name is a name of a file in which the image data are stored. This file is created on the hard disk 204. The date is a photographing date when the image data are created.
Further, the consecutive photographing attribute 55 indicates that the image concerned is consecutively photographed. The consecutive photographing attribute 55 has three categories: a start of the consecutive photographing, an on-consecutive-photographing state, and an end of the consecutive photographing.
The start of the consecutive photographing indicates that the image concerned is a start image of the consecutive photographing. Further, the end of the consecutive photographing indicates that the image concerned is an end image of the consecutive photographing. The on-consecutive-photographing state is set as the consecutive photographing attribute 55 of an image positioned between the start of the consecutive photographing and the end of the consecutive photographing. These elements form the set of consecutive insert target images.
When the image exhibiting the start-of-consecutive-photographing attribute is specified as an insert target, the whole set of images containing this image becomes the insert target.
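A rough model of this list structure, and of gathering a consecutive set from its start element, is sketched below; the enum names and the traversal helper are illustrative assumptions.

```python
# Rough model of the FIG. 7 list structure: each element holds a file name, a
# date, a consecutive photographing attribute 55, and a next address 54. Enum
# names and the traversal helper are illustrative assumptions.
from dataclasses import dataclass
from enum import Enum
from typing import Optional

class ShotAttr(Enum):
    NONE = 0
    START = 1        # start of the consecutive photographing
    CONTINUING = 2   # on-consecutive-photographing state
    END = 3          # end of the consecutive photographing

@dataclass
class ImageElement:
    file_name: str                          # file created on the hard disk 204
    date: str                               # photographing date
    attr: ShotAttr = ShotAttr.NONE
    next: Optional["ImageElement"] = None   # next address 54; None at the tail

def consecutive_set(head: ImageElement):
    """If head starts a consecutive set, yield the whole set as insert targets."""
    if head.attr is not ShotAttr.START:
        yield head
        return
    node = head
    while node is not None:
        yield node
        if node.attr is ShotAttr.END:
            break
        node = node.next
```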
<Transfer of Inserted Image>
FIG. 8 shows a transition of the screen when transferring the inserted image. The transfer of the inserted image is an operation of transferring the inserted image to an arbitrary block within the insert area 2.
Referring to FIG. 8, an arrow line H indicates the drag-and-drop by the user. The inserted image in a block 21g disposed at a start of this arrow line H is defined as a transfer
`target image. The in