`(12) Patent Application Publication (10) Pub. No.: US 2011/0007177 A1
`Kang
(43) Pub. Date: Jan. 13, 2011
`
`
(54) PHOTOGRAPHING APPARATUS AND METHOD
`
(75) Inventor: Tae-hoon Kang, Seoul (KR)
`
Correspondence Address:
DRINKER BIDDLE & REATH LLP
ATTN: PATENT DOCKET DEPT.
191 N. WACKER DRIVE, SUITE 3700
CHICAGO, IL 60606 (US)
`
(73) Assignee: Samsung Electronics Co., Ltd.
`
(21) Appl. No.: 12/831,422

(22) Filed: Jul. 7, 2010
`
(30) Foreign Application Priority Data
`Jul. 7, 2009 (KR) ........................ 10-2009-0061722
`Publication Classification
`
`(51) Int. Cl.
`H04N 5/228
`(2006.01)
`(52) U.S. Cl. .............................. 348/222.1; 348/E05.031
`(57)
`ABSTRACT
`A photographing apparatus includes an imaging device that
`converts light of an image into an electrical signal, an image
`conversion unit that converts the electrical signal into image
`data, a scene recognition unit that recognizes the type of a
`scene to be photographed by analyzing the image data, a
`display unit that displays scene information regarding the
`recognized scene, a user input unit that receives user input,
`and a condition setting unit that locks a shooting condition as
`a shooting mode corresponding to the recognized type of the
`scene for photographing, according to the user input received
`via the user input unit.
`
FIG. 1 (Sheet 1 of 9)
[Block diagram of the photographing apparatus: an optical system with a focus driving unit, a zoom driving unit, and an aperture driving unit 16; an imaging device 20; a controller 40 containing an image conversion unit, a driving circuit unit, a shooting controller, a scene recognition unit, a flash controller, a condition setting unit 48, a mode recommendation unit 49, an image processing unit, a memory controller, a touch screen controller, and a user interface 81; a memory; a database 60; and a touch screen 50 with a display unit 51, an input unit 52, and a user input unit.]
`IA1008
`
`Page 1 of 17
`
`
`
`
FIG. 2 (Sheet 2 of 9)
[Block diagram of the condition setting unit 48: portrait condition setting unit 91, landscape condition setting unit 92, day/night condition setting unit 93, vibration condition setting unit 94, text condition setting unit 95, and close-up condition setting unit 96.]
`
`
`
FIG. 3 (Sheet 3 of 9)
[Flowchart: START → CAPTURE PREVIEW IMAGE (S100) → IS TYPE OF SCENE RECOGNIZED? (S110) → PROCESS THE PREVIEW IMAGE (S120) → DISPLAY THE PREVIEW IMAGE AND SCENE INFORMATION (S130) → IS MODE MATCHING THE RECOGNIZED SCENE TO BE LOCKED AS SHOOTING CONDITION? (S140) → LOCK CONDITION FOR PHOTOGRAPHING (S150) → PERFORM PHOTOGRAPHING (S160) → PROCESS AND STORE IMAGE DATA OBTAINED BY PERFORMING PHOTOGRAPHING (S170).]
`
`
`
FIG. 4 (Sheet 4 of 9)
[Preview image on the touch screen 50 with "LANDSCAPE" scene information displayed; caption: "PERFORM IMAGE PROCESSING IMMEDIATELY AFTER LANDSCAPE MODE IS RECOGNIZED".]

FIG. 5
[Preview image with a confirmation dialog beginning "DO YOU WANT TO ...".]
`
`
`
FIG. 6 (Sheet 5 of 9)
[Captured image; caption fragment: "... PERFORMED ACCORDING TO LANDSCAPE MODE".]
`
`
`
FIG. 7 (Sheet 6 of 9)
[Flowchart: START → CAPTURE PREVIEW IMAGE → IS TYPE OF SCENE RECOGNIZED? → DISPLAY THE PREVIEW IMAGE AND SCENE INFORMATION (S220) → IS MODE MATCHING THE RECOGNIZED SCENE TO BE LOCKED AS SHOOTING CONDITION? (S230) → LOCK CONDITION FOR PHOTOGRAPHING (S240) → PERFORM PHOTOGRAPHING (S250) → PROCESS AND STORE IMAGE DATA OBTAINED BY PERFORMING PHOTOGRAPHING (S260).]
`
`
`
FIG. 8 (Sheet 7 of 9)
[Flowchart: CAPTURE PREVIEW IMAGE (S300) → IS TYPE OF SCENE RECOGNIZED? (S310, NO/YES) → DISPLAY THE PREVIEW IMAGE AND SCENE INFORMATION (S320) → IS MODE MATCHING THE RECOGNIZED SCENE TO BE LOCKED AS SHOOTING CONDITION? (S330, NO/YES) → LOCK CONDITION FOR PHOTOGRAPHING (S340) → HALF-PRESS SHUTTER BUTTON (S350) → PROCESS THE PREVIEW IMAGE (S360) → PERFORM PHOTOGRAPHING (S370) → PROCESS AND STORE IMAGE DATA OBTAINED BY PERFORMING PHOTOGRAPHING (S380) → END.]
`
`
`
FIG. 9 (Sheet 8 of 9)
[Flowchart: CAPTURE PREVIEW IMAGE (S410) → IS SHOOTING MODE FOR PARTICULAR SCENE LOCKED? (S420; NO: RECOGNIZE TYPE OF SCENE (S430); YES: PROCESS THE PREVIEW IMAGE (S440), RECOGNIZE TYPE OF SCENE (S445)) → DISPLAY THE PREVIEW IMAGE AND RECOMMENDED SCENE INFORMATION (S450) → IS MODE MATCHING THE RECOGNIZED SCENE TO BE LOCKED AS SHOOTING CONDITION? (S460, NO/YES) → LOCK CONDITION FOR PHOTOGRAPHING (S470) → PERFORM PHOTOGRAPHING (S480) → PROCESS AND STORE IMAGE DATA OBTAINED BY PERFORMING PHOTOGRAPHING (S490) → END.]
`
`
`
FIG. 10 (Sheet 9 of 9)
[Diagram illustrating the photographing method of FIG. 9.]
`
`
`
`
PHOTOGRAPHING APPARATUS AND METHOD
`
`CROSS-REFERENCE TO RELATED PATENT
`APPLICATION
[0001] This application claims the priority benefit of Korean Patent Application No. 10-2009-0061722, filed on Jul. 7, 2009, in the Korean Intellectual Property Office, the disclosure of which is incorporated herein in its entirety by reference.
`
`BACKGROUND
[0002] 1. Field of the Invention

[0003] Embodiments relate to a photographing apparatus and method, and more particularly, to a photographing apparatus and method in which the type of a scene that is to be photographed may be automatically recognized to set a shooting condition suitable for the recognized type of the scene, and photographing may be performed by locking the shooting condition according to a user's selection.

[0004] 2. Description of the Related Art

[0005] Recently, cameras having a 'shooting mode' function have come into widespread use. The shooting mode provides, for example, a portrait mode, a landscape mode, or a night-view mode, which is an automatically set shooting condition according to the environment of a scene that is to be photographed. In general, a shooting mode is selected by manipulating a mode dial. However, beginners who are inexperienced in handling cameras may have difficulty determining a mode that matches the current state of a scene that is to be photographed.

[0006] To solve this problem, a scene recognition function has been introduced, by which a current shooting state of a desired scene is analyzed. Via the scene recognition function, a shooting mode for photographing the scene is automatically determined, a shooting condition of a camera is automatically set according to the determined shooting mode, and then photographing is performed.

[0007] The scene recognition function can be very useful for beginners but may inconvenience users who desire to photograph a scene according to a particular shooting mode. That is, when the scene recognition function is used, scene recognition is continuously performed even after the type of a scene has been automatically recognized and the particular shooting mode has been selected. Thus, a shooting mode change occurs when the scene changes even slightly.

[0008] Also, since scene recognition is continuously performed even after the type of a scene has been automatically recognized and a particular shooting mode has been selected, an algorithm for scene recognition is repeatedly executed. This increases the load on the camera, thereby reducing the operating speed of the camera.
`
`SUMMARY
[0009] Embodiments provide a photographing apparatus and method in which the type of a scene that is to be photographed may be automatically recognized to set a shooting condition that matches the recognized type of the scene, and photographing may be performed by locking the shooting condition according to a user's selection.

[0010] Embodiments also provide a photographing apparatus and method in which, when a shooting condition matching the original type of a scene has been locked in, a user may be recommended a shooting mode that matches a subsequent change in the type of the scene.

[0011] According to an embodiment, a photographing apparatus includes an imaging device that converts light of an image into an electrical signal; an image conversion unit that converts the electrical signal into image data; a scene recognition unit that recognizes the type of a scene to be photographed by analyzing the image data; a display unit that displays scene information regarding the recognized scene; a user input unit that receives user input; and a condition setting unit that locks a shooting condition as a shooting mode corresponding to the recognized type of the scene for photographing, according to the user input received via the user input unit.

[0012] The photographing apparatus may further include a shooting controller that controls the imaging device to capture the image according to the locked shooting condition.

[0013] The photographing apparatus may further include an image processing unit that processes the image data generated by the image conversion unit according to the locked shooting condition and under control of the shooting controller.

[0014] The photographing apparatus may further include a storage unit that stores the image data processed by the image processing unit.

[0015] The display unit may display the recognized type of the scene in the form of text.

[0016] The display unit may display the recognized type of the scene in the form of an icon.

[0017] The display unit may display a preview image generated by the imaging device according to the locked shooting condition corresponding to the type of the scene recognized by the scene recognition unit.

[0018] The photographing apparatus may further include a mode recommendation unit that displays, on the display unit, a recommended mode that matches the type of the scene recognized by the scene recognition unit, after the condition setting unit locks the shooting condition.

[0019] According to another embodiment, a photographing method includes capturing a preview image by converting an electrical signal received from an imaging device into image data; recognizing the type of a scene to be photographed by analyzing the image data; displaying scene information regarding the recognized type of the scene on a display unit; locking a shooting condition, wherein the imaging device captures images according to the shooting condition, according to user input; and performing photographing to obtain image data by operating the imaging device according to the locked shooting condition.

[0020] The photographing method may further include processing the image data obtained by the performing of the photographing according to the locked shooting condition.

[0021] The displaying of the scene information may include displaying the preview image and the scene information on the display unit, and the photographing method may further include processing the preview image according to the locked shooting condition.
`
`BRIEF DESCRIPTION OF THE DRAWINGS
[0022] The above and other features and advantages will become more apparent by describing in detail exemplary embodiments with reference to the attached drawings, in which:

[0023] FIG. 1 is a block diagram schematically illustrating a photographing apparatus according to an embodiment;

[0024] FIG. 2 is a block diagram schematically illustrating a condition setting unit included in the photographing apparatus of FIG. 1, according to an embodiment;

[0025] FIG. 3 is a flowchart illustrating a photographing method according to an embodiment;

[0026] FIG. 4 is a diagram illustrating an operation of the photographing method of FIG. 3, in which a preview image and scene recognition status are displayed, according to an embodiment;

[0027] FIG. 5 is a diagram illustrating an operation of the photographing method of FIG. 3, in which a shooting condition is locked after performing the scene recognition of FIG. 4, according to an embodiment;

[0028] FIG. 6 is a diagram illustrating an operation of the photographing method of FIG. 3, in which photographing is performed after the locking of the shooting condition in FIG. 5, according to an embodiment;

[0029] FIG. 7 is a flowchart illustrating a photographing method according to another embodiment;

[0030] FIG. 8 is a flowchart illustrating a photographing method according to another embodiment;

[0031] FIG. 9 is a flowchart illustrating a photographing method according to another embodiment; and

[0032] FIG. 10 is a diagram illustrating the photographing method of FIG. 9, according to an embodiment.
`
`DETAILED DESCRIPTION
[0033] Hereinafter, exemplary embodiments of a photographing apparatus and method will be described in detail with reference to the accompanying drawings.

[0034] FIG. 1 is a block diagram schematically illustrating a photographing apparatus according to an embodiment. Referring to FIG. 1, the photographing apparatus includes an imaging device 20, an image conversion unit 41, a scene recognition unit 47, a condition setting unit 48, a display unit 51, and a user input unit 82. The imaging device 20 captures an image of a subject to be photographed and converts light of the image into an electrical signal. The image conversion unit 41 receives and converts the electrical signal into image data. The scene recognition unit 47 recognizes the type of a scene to be photographed. The display unit 51 displays scene information regarding the recognized type of the scene. The user input unit 82 receives user input. The condition setting unit 48 sets a shooting condition.

[0035] The photographing apparatus of FIG. 1 may perform a preview function of displaying a real-time preview image of the subject by processing the image captured by the imaging device 20 and then displaying the preview image on the display unit 51, or may perform a photographing operation of processing the captured image and storing the processing result in a memory 15.

[0036] An optical system 10 that is disposed in front of the imaging device 20 includes a plurality of lenses 12 and an aperture 14, and focuses external image light onto a surface of the imaging device 20. The lenses 12 are arranged such that the distances between the lenses 12 may be changed. The zoom magnification and focus of the optical system 10 may be adjusted by changing the distances between the lenses 12.

[0037] When the plurality of lenses 12 are driven by either a zoom driving unit 11 having a driving unit, such as a zoom motor, or a focus driving unit 13, the distances between the lenses 12 may be changed within the optical system 10. The plurality of lenses 12 may include a zoom lens that increases or decreases the size of an image of a subject and a focus lens that adjusts the focus of an image of the subject. The zoom driving unit 11 and the focus driving unit 13 operate when a driving circuit unit 42 of a controller 40 respectively supplies control signals to the zoom driving unit 11 and the focus driving unit 13. Thus, the zoom driving unit 11 may drive the optical system 10 such that the optical system 10 has one of a plurality of magnification factors. Also, the focus driving unit 13 may drive the optical system 10 such that the optical system 10 adjusts the focus of an image of the subject. The aperture 14 is a device that controls the amount of light that may be incident on the imaging device 20, and is driven by an aperture driving unit 16.

[0038] The imaging device 20 includes a photoelectric conversion device, such as a charge-coupled device (CCD) or a complementary metal oxide semiconductor (CMOS), and converts image light incident thereon via the optical system 10 into an electrical signal. A process of converting the image light into the electrical signal may include converting the image light into an analog signal, converting the analog signal into a digital signal, and performing signal processing, e.g., pixel defect correction, gain correction, white balance correction, or gamma correction, on the digital signal.
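The gamma correction step named in the signal chain above can be illustrated with a minimal sketch (illustrative only: the embodiments do not specify a transfer function, and the default `gamma` value here is an assumed, typical value rather than one taken from the disclosure):

```python
def gamma_correct(sample: int, gamma: float = 2.2) -> int:
    """Encode one 8-bit sample with a 1/gamma power curve."""
    # Normalize to [0, 1], apply the power curve, and rescale to 8 bits.
    return round(255 * (sample / 255) ** (1 / gamma))
```

In the chain described above, pixel defect correction, gain correction, and white balance correction would be applied to the digital signal before a step like this.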
[0039] The image conversion unit 41 converts the electrical signal into image data that may be image processed. The image data generated by the image conversion unit 41 may be stored in the memory 15, and may be used by the scene recognition unit 47 or an image processing unit 46.

[0040] In order to generate image data by capturing image light of a subject, the imaging device 20 may operate largely in two ways. That is, the imaging device 20 may operate to generate a highest-resolution image when a user manipulates a shutter button to photograph the subject, or may operate to generate a preview image to be displayed on the display unit 51 before the user manipulates the shutter button.

[0041] In general, the resolution of the display unit 51 is lower than the highest resolution that the photographing apparatus can capture and record. Thus, as a preview image, the imaging device 20 generates an image having a resolution that is lower than the highest resolution that the photographing apparatus can capture and record.

[0042] While the photographing apparatus performs the preview function, images of the subject are repeatedly captured at predetermined intervals and processed so that they may be displayed on the display unit 51 in real time.

[0043] The controller 40 is electrically connected to the imaging device 20, the zoom driving unit 11, the focus driving unit 13, the aperture driving unit 16, a flash 72, the memory 15, and a touch screen 50 in order to exchange control signals with them for controlling their operations or in order to process data. The controller 40 includes the image conversion unit 41, the driving circuit unit 42, a touch screen controller 44, a memory controller 43 that manages data to be stored in the memory 15, a shooting controller 45, the condition setting unit 48, the scene recognition unit 47, a mode recommendation unit 49, the image processing unit 46, a flash controller 71, and a user interface 81.

[0044] The controller 40 may be embodied as a microchip or a printed circuit board having a microchip, and elements of the controller 40 may be embodied using software or circuits installed in the controller 40.
`
`
[0045] The memory controller 43 manages image data to be written to the memory 15 or manages image data or setting information to be read from the memory 15. The memory 15 may be, for example, a semiconductor memory device, such as synchronous dynamic random access memory (SDRAM), that is capable of storing captured image data.

[0046] The flash controller 71 outputs a control signal for driving the flash 72, wherein the flash 72 is a light-emitting device. The flash 72 emits a predetermined amount of light toward the subject for a predetermined duration according to a command given from the flash controller 71.

[0047] The photographing apparatus may include the display unit 51, which displays an image corresponding to image data, and the touch screen 50 having an input unit 52 for selecting a part of the image displayed on the display unit 51. The touch screen 50 may display an image captured by the imaging device 20. When a surface of the touch screen 50 is touched, the touch screen 50 may sense the touch and may generate a signal corresponding to the location of the touch.

[0048] The touch screen 50 is an input device that may replace a keyboard and a mouse. A user may perform a desired operation by touching a surface of the display unit 51 of the touch screen 50 directly with a hand or with a pen, so that the operation may be performed intuitively in a graphical user interface (GUI)-based environment. The display unit 51 of the touch screen 50 may be a liquid crystal display (LCD) or an organic light-emitting diode (OLED) display.

[0049] The input unit 52 may be installed on the display unit 51 to sense touching of a surface of the display unit 51, and is an example of a user input unit included in the photographing apparatus according to the current embodiment. The input unit 52 may be embodied in various forms, e.g., a resistive sense unit, a capacitive sense unit, a surface-acoustic-wave-based sense unit, an infrared-ray (IR)-based sense unit, or an optical sense unit.

[0050] A user may either select an option displayed on the display unit 51 to execute a desired operation or designate a part of the displayed images by touching the input unit 52 of the touch screen 50.

[0051] Referring to FIG. 1, the photographing apparatus includes the user input unit 82, which is installed separately from the touch screen 50 to receive user input. The user input unit 82 may be embodied in various forms. For example, the user input unit 82 may be embodied in the form of a menu button or a dial. Alternatively, the user input unit 82 may be a speech recognition device that recognizes a user's speech, e.g., "lock" and "cancel".

[0052] If the imaging device 20 captures an image of a subject and generates an electrical signal and the image conversion unit 41 converts the electrical signal into image data, then the image processing unit 46 processes the image data in order to generate a preview image to be displayed on the display unit 51 or in order to generate data to be stored in the memory 15.

[0053] The scene recognition unit 47 analyzes the image data to recognize the type of a scene to be photographed. The display unit 51 may display scene information regarding the recognized type of the scene.

[0054] The scene recognition unit 47 may recognize the type of the scene in various ways. For example, the scene recognition unit 47 may recognize the type of the scene by setting an object region in the image represented by the image data, wherein the object region corresponds to a target subject to be photographed, and determining the type of the object region according to predetermined conditions. If the object region corresponds to a person, a portrait mode may be set as the shooting condition. If the object region corresponds to a flower, a close-up shooting mode may be set as the shooting condition. If the object region corresponds to a mountain, a field, or a river, then a landscape mode may be set as the shooting condition. If the object region corresponds to text, a text mode may be set as the shooting condition.
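The object-region-to-mode selection described above can be sketched as a simple lookup (an illustrative sketch, not the disclosed implementation; the class labels and the "auto" fallback are assumptions):

```python
# Hypothetical mapping from a recognized object-region class to a shooting mode,
# following the examples given in the paragraph above.
SCENE_TO_MODE = {
    "person": "portrait",
    "flower": "close-up",
    "mountain": "landscape",
    "field": "landscape",
    "river": "landscape",
    "text": "text",
}

def select_shooting_mode(object_class: str) -> str:
    """Return the shooting mode matching the recognized object region."""
    # Fall back to a generic automatic mode when no rule applies (assumption).
    return SCENE_TO_MODE.get(object_class, "auto")
```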
[0055] The scene recognition unit 47 may recognize the type of the scene based on object recognition that uses geometrical characteristics, color information, or edge information of objects stored in a database 60. In the object recognition, the type of a subject that is to be photographed is recognized based on geometrical information, e.g., the locations and sizes of the minutiae of an object (a flower, a face, a mountain, etc.) and the distances between the minutiae; edge information; or color information of an object, such as the sky or a sea, which has unique colors.
[0056] The user input unit 82 receives user input and generates a signal according to the user input. The controller 40 may determine from the user input received via the user interface 81 whether a user intends to perform photographing while the shooting condition is locked in a mode matching the type of the scene recognized by the scene recognition unit 47.

[0057] If it is determined from the user input received via the user input unit 82 that the user intends to lock the shooting condition as the mode matching the currently recognized type of the scene, the condition setting unit 48 locks the shooting condition. Then, photographing is performed according to the locked shooting condition after the locking operation of the condition setting unit 48.

[0058] The mode recommendation unit 49 may recommend to a user a shooting mode that matches a change in the type of a scene to be photographed even after the condition setting unit 48 has locked the shooting condition as a specific shooting mode. Even after the condition setting unit 48 has locked the shooting condition, the scene recognition unit 47 may continuously analyze image data obtained by capturing images. If a type of scene that does not match the currently locked shooting condition is recognized, the mode recommendation unit 49 may cause information regarding the recognized type of the scene to be displayed on the display unit 51 in order to recommend a new shooting mode to the user.
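The behavior just described, in which recognition continues after the lock but a mismatch only yields a recommendation rather than a mode change, can be sketched as follows (an illustrative sketch; the class and method names are hypothetical, not from the disclosure):

```python
class ModeRecommender:
    """Keeps the locked mode fixed; a mismatching recognition result only
    produces a recommendation string for the display unit."""

    def __init__(self, locked_mode: str):
        self.locked_mode = locked_mode

    def on_recognized(self, recognized_mode: str):
        if recognized_mode != self.locked_mode:
            # Recommend the newly recognized mode instead of switching to it.
            return f"recommend: {recognized_mode}"
        return None  # locked mode still matches; nothing to display
```

The key design point is that `locked_mode` is never reassigned by recognition results, so a slight scene change cannot silently alter the shooting condition.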
[0059] The shooting controller 45 controls the imaging device 20 to capture images according to the shooting condition locked by the condition setting unit 48. When a user manipulates a shutter button (not shown), the shooting controller 45 may generate a photographing signal for controlling the imaging device 20 to capture images. The shooting controller 45 may also control the imaging device 20 to capture images even when the user does not manipulate the shutter button. For example, the shooting controller 45 may control the imaging device 20 to start capturing images when a preview image received from the imaging device 20 matches the current shooting mode, or may control the imaging device 20 to automatically capture images according to the current shooting mode when a preview image that matches the current shooting mode has been constantly maintained for several seconds, e.g., for three or five seconds.
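The automatic capture rule at the end of this paragraph, triggering once matching preview frames persist for several seconds, can be sketched with a consecutive-frame counter standing in for the timer (illustrative only; the frame-count threshold and the returned index are assumptions):

```python
def auto_capture_index(frame_modes, locked_mode, hold_frames):
    """Return the index of the preview frame at which capture would trigger,
    i.e. once `hold_frames` consecutive frames match the locked mode,
    or None if the condition is never met."""
    consecutive = 0
    for i, mode in enumerate(frame_modes):
        # Reset the counter whenever a frame stops matching the locked mode.
        consecutive = consecutive + 1 if mode == locked_mode else 0
        if consecutive >= hold_frames:
            return i
    return None
```

With preview frames arriving at a known interval, `hold_frames` would be chosen so that the run corresponds to the "three or five seconds" mentioned above.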
[0060] The image processing unit 46 may process image data generated by the image conversion unit 41 according to the shooting condition. Then the image data may be displayed on the display unit 51 or stored in the memory 15. The image processing unit 46 may process image data of a preview image according to the shooting condition locked by the condition setting unit 48 so that the preview image may be displayed on the display unit 51.
[0061] Also, the image processing unit 46 may not only process a preview image according to the current shooting condition but may also process image data obtained by performing a photographing operation so that the processing result may be displayed on the display unit 51 or stored in the memory 15.
[0062] FIG. 2 is a block diagram schematically illustrating the condition setting unit 48 of FIG. 1, according to an embodiment. The condition setting unit 48 may include a portrait condition setting unit 91, a landscape condition setting unit 92, a day/night condition setting unit 93, a vibration condition setting unit 94, a text condition setting unit 95, and a close-up condition setting unit 96 that respectively set conditions for photographing based on the type of a scene recognized by the scene recognition unit 47 of FIG. 1. In detail, the portrait condition setting unit 91 sets a condition for photographing a person. The landscape condition setting unit 92 sets a condition for photographing a landscape. The day/night condition setting unit 93 sets a condition for photographing at daytime or nighttime. The vibration condition setting unit 94 sets a condition for compensating for vibration when a scene such as a sports activity is photographed. The text condition setting unit 95 sets a condition for photographing text. The close-up condition setting unit 96 sets a condition for close-up photographing of a flower or an insect. The condition setting unit 48 may further include various other functions suitable for various scenes.
[0063] With the photographing apparatus described above with reference to FIGS. 1 and 2, it is possible to perform photographing while locking the shooting condition that matches the type of a scene recognized by the scene recognition unit 47. Thus, it is possible to lessen the inconvenience caused by a slight change in the type of a scene to be photographed when photographing is performed using automatic scene recognition. Also, the mode recommendation unit 49 may recommend to a user a shooting mode that matches a change in the type of the scene while photographing is performed according to the locked shooting condition.
[0064] FIG. 3 is a flowchart illustrating a photographing method according to an embodiment. The photographing method of FIG. 3 includes capturing a preview image by converting an electrical signal received from an imaging device into image data (operation S100), recognizing the type of a scene that is to be photographed from the image data (operation S110), displaying scene information regarding the recognized type of the scene on a display unit (operation S130), locking a shooting condition for photographing (operation S150), and obtaining image data by operating the imaging device (operation S160).

[0065] Specifically, in operation S100, the imaging device captures a preview image of a subject, wherein the resolution of the image is fit for the display unit. The imaging device repeatedly captures images of the subject at predetermined intervals of time so that the images may be continuously processed in order to display a real-time preview image. Image data used to display the preview image is obtained in operation S100.

[0066] In operation S110, the image data captured in operation S100 is analyzed to recognize the type of the scene that is to be photographed.
`
[0067] In operation S110, various methods may be used to recognize the type of the scene. For example, the type of the scene may be recognized by setting an object region, wherein the object region corresponds to a target subject to be photographed, in the image represented by the image data, and determining the type of the object region according to predetermined conditions. If the object region corresponds to a person, a portrait mode may be set as the shooting condition. If the object region corresponds to a flower, a close-up shooting mode may be set as the shooting condition. If the object region corresponds to a mountain, a field, or a river, then a landscape mode may be set as the shooting condition. If the object region corresponds to text, a text mode may be set as the shooting condition.
[0068] For example, the type of the scene may be recognized using object recognition that uses geometrical characteristics, color information, or edge information corresponding to objects stored in a database. In the object recognition, the type of a subject that is to be photographed is recognized based on geometrical information, e.g., the locations and sizes of the minutiae of an object (a flower, a face, a mountain, etc.) and the distances between the minutiae; edge information; or color information of an object, such as the sky or a sea, which has unique colors.
[0069] When the type of the scene is recognized in operation S110, the image data is processed according to the recognized type of the scene in order to obtain a preview image (operation S120). Then, the preview image and scene information regarding the type of the scene are displayed on the display unit (operation S130).
[0070] Next, a user may check the scene information di