US008027523B2
(12) United States Patent
Sun et al.

(10) Patent No.: US 8,027,523 B2
(45) Date of Patent: Sep. 27, 2011
(54) IMAGE PROCESSING APPARATUS, IMAGE PROCESSING METHOD, AND PROGRAM

(75) Inventors: Yun Sun, Tokyo (JP); Tamaki Kojima, Kanagawa (JP); Tomohiko Gotoh, Kanagawa (JP); Makoto Murata, Tokyo (JP); Masatomo Kurata, Tokyo (JP)

(73) Assignee: Sony Corporation, Tokyo (JP)
(*) Notice: Subject to any disclaimer, the term of this patent is extended or adjusted under 35 U.S.C. 154(b) by 115 days.

(21) Appl. No.: 12/484,643
(22) Filed: Jun. 15, 2009

(65) Prior Publication Data
     US 2009/0316962 A1    Dec. 24, 2009

(30) Foreign Application Priority Data
     Jun. 18, 2008 (JP) ................................ P2008-159782
(51) Int. Cl.
     G06K 9/00   (2006.01)
     H04N 9/07   (2006.01)
(52) U.S. Cl. ........................................ 382/118; 348/267
(58) Field of Classification Search .................. 382/115, 116, 117, 118; 348/154, 155, 169, 170, 171, 172, 267; 379/93.03, 207.13
     See application file for complete search history.
(56) References Cited

U.S. PATENT DOCUMENTS

6,526,158 B1*   2/2003  Goldberg ...................... 382/115
7,239,725 B2*   7/2007  Dobashi ....................... 382/118
7,734,072 B2*   6/2010  Yamaguchi ..................... 382/118
FOREIGN PATENT DOCUMENTS

JP  11-175730          7/1999
JP  2008-17042         1/2008
JP  2008-77536         4/2008
WO  WO 2006/025185 A1  3/2006
OTHER PUBLICATIONS

Japanese Office Action in corresponding Japanese Patent Application No. 2008-159782, dated Apr. 5, 2010.
Okamura et al., "Clustering of Face Features for Listing Performers in TV-Programs", FIT 2006, Public Lecture Memoirs, Third Separate Volume, pp. 29-30 (Aug. 21, 2006).
T. F. Cootes et al., "Active Appearance Models", Proc. Fifth European Conf. Computer Vision, H. Burkhardt and B. Neumann, eds., vol. 2, pp. 484-498 (1998).
* cited by examiner

Primary Examiner - Abolfazl Tabatabai
(74) Attorney, Agent, or Firm - Finnegan, Henderson, Farabow, Garrett & Dunner, L.L.P.

(57) ABSTRACT
An image processing apparatus includes a face detector detecting face images from still-image frames successively extracted from a moving-image stream in accordance with image information items regarding the still-image frames, a face-feature-value calculation unit calculating face feature values of the face images in accordance with image information items regarding the face images, an identity determination unit determining whether a first face image in a current frame and a second face image in a previous frame represent an identical person in accordance with at least face feature values of the first and second face images, and a merging processor which stores one of the first and second face images when the first face image and the second face image represent an identical person, and which stores the first and second face images when the first face image and the second face image do not represent an identical person.

20 Claims, 46 Drawing Sheets
[Representative drawing: block diagram of the image processing apparatus 100, including an image file, a decoding unit, a face detector, a noise-face removing unit, an identical-faces merging processor, a face clustering unit, character data, and a user setting unit]
`
`

`

[Sheet 1 of 46, FIG. 1: block diagram of the image processing apparatus 100 (decoding unit, face detector, and related units)]
[Sheet 2 of 46, FIG. 2: detection frame DF and height within a still-image frame]
[Sheet 3 of 46, FIG. 3: illustration of face angles (yaw angle)]
[Sheet 4 of 46, FIG. 4: face data items, including a face ID unique to each detected face image and the face rotation angle]
[Sheet 5 of 46, FIG. 5: cluster data items, including a cluster ID, the face IDs of the face images included in the cluster, and the face ID of the representative face image]
[Sheet 6 of 46, FIG. 6: face detection and calculation of face feature values]
[Sheet 7 of 46, FIG. 7]
[Sheet 8 of 46, FIG. 8: flowchart of the overall processing: decode the moving-image stream; detect faces in the still-image frame; if a face is detected, calculate face feature values; remove noise faces; determine identical faces and perform identical-faces merging processing; go to the next frame until the end of the moving-image stream is reached; then perform face-clustering processing]
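Read as pseudocode, the flow of FIG. 8 is a simple per-frame loop. The sketch below shows only that control flow; the helper callables stand in for the units of FIG. 1 and are hypothetical, not implementations taken from the patent.

```python
# Minimal sketch of the FIG. 8 control flow. The helper functions are
# hypothetical stand-ins for the units shown in FIG. 1; only the loop
# structure follows the flowchart.
def process_stream(frames, detect_faces, calc_features, is_noise_face,
                   merge_identical, cluster_faces):
    stored_faces = []                     # face data kept by the merging processor
    for frame in frames:                  # still-image frames decoded from the stream
        faces = detect_faces(frame)       # detect faces in the frame
        if not faces:
            continue                      # no face detected: go to the next frame
        for face in faces:
            face = calc_features(face)    # calculate face feature values
            if is_noise_face(face):       # remove noise faces
                continue
            merge_identical(stored_faces, face)  # identical-faces merging processing
    return cluster_faces(stored_faces)    # face-clustering processing at end of stream
```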
`
`

`

[Sheet 9 of 46, FIG. 9: (A) image in the storage unit and (B) detection by the face detector]
[Sheet 10 of 46, FIG. 10]
[Sheet 11 of 46, FIG. 11: (A) face score measurement and (B) face dictionary of several hundreds of pairs (pix_fa1(i), pix_fa2(i), theta_fa(i), alpha_fa(i)), with AdaBoost learning used for adjustment; score_fa is increased or decreased by alpha_fa(i) depending on whether pix_fa1(i) - pix_fa2(i) < theta_fa(i)]
[Sheet 12 of 46, FIG. 12: pixel positions pix_fa1 and pix_fa2 in the detection frame]
[Sheet 13 of 46, FIG. 14: flowchart of face detection: scale the still-image frame, set the detection frame at the upper left, and move the detection frame until the last position is reached]
[Sheet 14 of 46, FIG. 15]
[Sheet 15 of 46, FIG. 16: smile score SCORE_sm measured on a 48 x 48 face image (IM-2, IM-3) using the smile dictionary]
[Sheet 16 of 46, FIG. 17: (A) smile score measurement and (B) smile dictionary of several hundreds of pairs (pix_sm1(i), pix_sm2(i), theta_sm(i), alpha_sm(i)), with AdaBoost learning used for adjustment]
[Sheet 17 of 46, FIG. 18: flowchart of smile score measurement: set SCORE_sm to 0, select luminance values for pix_sm1(i) and pix_sm2(i), calculate pix_sm1(i) - pix_sm2(i), and add or subtract alpha_sm(i) from SCORE_sm accordingly]
[Sheet 18 of 46, FIG. 19]
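The score measurement outlined in FIGS. 11, 17 and 18 reduces to accumulating weighted votes over pairs of pixel luminance values. The following sketch assumes a dictionary of (position pair, threshold, weight) entries such as AdaBoost learning would produce; the toy image and the dictionary entries in the example are made up for illustration.

```python
# Sketch of the score accumulation suggested by FIGS. 11, 17 and 18:
# each dictionary entry holds two pixel positions, a threshold theta and
# a weight alpha; the score is increased or decreased by alpha depending
# on the luminance difference. Runs on a grayscale image given as a
# 2-D list of luminance values.
def measure_score(gray, dictionary):
    score = 0.0
    for (y1, x1), (y2, x2), theta, alpha in dictionary:
        diff = gray[y1][x1] - gray[y2][x2]      # pix1(i) - pix2(i)
        if diff < theta:                        # condition shown in the figures
            score += alpha                      # SCORE = SCORE + alpha(i)
        else:
            score -= alpha                      # SCORE = SCORE - alpha(i)
    return score

# Example with a toy 48 x 48 image and a two-entry dictionary.
if __name__ == "__main__":
    img = [[(x + y) % 256 for x in range(48)] for y in range(48)]
    face_dict = [((10, 12), (30, 12), 8.0, 0.6), ((5, 40), (20, 20), -3.0, 0.4)]
    print(measure_score(img, face_dict))
```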
`
`

`

[Sheet 19 of 46, FIG. 20: luminance values of the pixels in each block (2 x 2 pixels) are added to one another]
[Sheet 20 of 46, FIG. 21: flowchart of per-frame processing: convert the still-image frame into grayscale; detect a face image and detect the face rotation angle; calculate the smile score; calculate the contrast score; calculate the face feature value (identification feature vector); repeat until the last face image]
[Sheet 21 of 46, FIG. 22: flowchart of noise-face determination from face data: check for a side-face image (yaw angle within a threshold range) and check for a blurring face image (contrast score larger than a threshold value); otherwise the image is a noise face image and is removed]
[Sheet 22 of 46, FIGS. 23A and 23B (yaw angle of -5 degrees: the image is not a side-face image; yaw angle of 50 degrees: the image is removed) and FIGS. 24A and 24B (contrast score of 350: not a blurring face image; contrast score of 20: blurring face image, removed)]
[Sheet 23 of 46, FIG. 25: flowchart of identity determination: calculate the degree of similarity between the face image to be processed and a previous face image; compare it with a threshold value Th1; if smaller, compare it with a threshold value Th2 and check whether the face detection frames and the time between frames satisfy a certain condition; if determined to be identical faces, determine the representative face image of the two face images, store its face data, and discard the face data of the other; otherwise determine the faces to be different and store the face data]
`
`

`

[Sheet 24 of 46, FIGS. 26A and 26B (previous and current face images, degree of similarity: 88) and FIGS. 27A and 27B (previous and current face images, degree of similarity: 3)]
[Sheet 25 of 46, FIGS. 28A and 28B: detection frames DF1 and DF2 in frame 1 and frame 3]
[Sheet 26 of 46, FIG. 29: flowchart of face clustering: from the stored face data, calculate the similarity degree matrix and generate a face pair list; perform clustering for layering; determine clusters; determine representative faces; output cluster data]
[Sheet 27 of 46, FIGS. 30 and 31: similarity degree matrix and face pair list sorted by degree of similarity (for example, pairs (f7, f12), (f11, f9), (f15, f9))]
[Sheet 28 of 46, FIG. 32: flowchart of face pair list generation: calculate the degree of similarity between the two face images of every pair, then generate the face pair list by sorting in order of degree of similarity using the similarity degree matrix]
[Sheet 29 of 46, FIG. 33: node data items: node number (unique to each node), upper-layer node information (pointer to the upper-layer node, NULL for the uppermost node), lower-layer node information (pointers to the lower-layer nodes, NULL for lowermost-layer nodes), and a leaf list of the face IDs of all leaves (terminal nodes) in the lower layers (a single face ID in the case of a terminal node)]
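FIGS. 29 to 32 describe building a similarity degree matrix and a face pair list sorted by degree of similarity. A compact sketch of that step, with an assumed feature representation and an illustrative similarity function passed in as a parameter, could look like this:

```python
# Sketch of the face pair list generation outlined in FIGS. 29 to 32:
# compute a degree of similarity for every pair of stored face images and
# sort the pairs in descending order of similarity. The similarity
# measure is passed in as a function because the figures do not fix one;
# the dot product used in the example is only an illustration.
from itertools import combinations

def face_pair_list(features, similarity):
    """features: dict mapping face ID -> feature vector."""
    pairs = [((a, b), similarity(features[a], features[b]))
             for a, b in combinations(sorted(features), 2)]
    pairs.sort(key=lambda item: item[1], reverse=True)   # most similar pair first
    return pairs

# Example with toy two-dimensional feature vectors.
dot = lambda u, v: sum(x * y for x, y in zip(u, v))
print(face_pair_list({"f7": [1.0, 0.1], "f9": [0.9, 0.2], "f12": [0.1, 1.0]}, dot))
```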
`
`

`

[Sheet 30 of 46, FIG. 34: example of the layered (tree) structure of nodes generated from the face pairs (face IDs f1 to f15)]
[Sheet 31 of 46, FIG. 35: example of node data (node number, upper-layer node information, lower-layer node information, leaf list)]
[Sheet 32 of 46, FIG. 36: flowchart of layering: generate leaves in the lowermost layer corresponding to the number of faces; refer to the face pair list; if no node includes a certain pair in its leaf list, generate a parent node from the two nodes including the pair; repeat until the last pair, producing information on the layered configuration of nodes]
[Sheet 33 of 46, FIG. 37: layered node structure processed for each node]
`
`

`

[Sheet 34 of 46, FIG. 38: flowchart of cluster determination using a stack: push the uppermost node to the stack; pop a node; perform the over-merging determination; store the current node in the cluster list as a cluster, or push the nodes in its lower layer to the stack; uses the information on the layered configuration of nodes and produces cluster data]
[Sheet 35 of 46, FIG. 39: flowchart of the over-merging determination for a node: obtain the face feature values for the face IDs in the node, calculate the average face feature value, calculate individual similarity degrees against the average and the average similarity degree, and determine the node as over-merging or non-over-merging by comparison with thresholds from the clustering setting parameters]
[Sheet 36 of 46, FIG. 40: local-feature-value vector]
[Sheet 37 of 46, FIG. 41]
`
`

`

[Sheet 38 of 46, FIG. 42: flowchart of representative-face determination: sort the face images in ascending order of angle score and narrow down to the face images having angle scores smaller than a front-face determination threshold value; sort in descending order of smile score and narrow down to a second face image group having smile scores larger than a smile threshold value; sort in descending order of contrast score and determine the face image having the largest contrast score as the representative face image (the threshold values are clustering setting parameters)]
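The selection in FIG. 42 can be sketched as three successive narrowing steps. The dictionary keys and the two threshold values below are assumptions; only the ordering (angle score, then smile score, then contrast score) follows the flowchart.

```python
# Sketch of the representative-face selection in FIG. 42. Each face is a
# dict with hypothetical keys "angle_score", "smile_score" and
# "contrast_score"; the two thresholds stand in for the clustering
# setting parameters. The fallbacks keep the sketch from failing when a
# filter would leave no candidates.
def pick_representative(faces, front_face_threshold, smile_threshold):
    # Narrow down to near-front faces (small angle score).
    group = [f for f in sorted(faces, key=lambda f: f["angle_score"])
             if f["angle_score"] < front_face_threshold] or faces
    # Narrow down to smiling faces (large smile score).
    group2 = [f for f in sorted(group, key=lambda f: f["smile_score"], reverse=True)
              if f["smile_score"] > smile_threshold] or group
    # The face with the largest contrast score becomes the representative.
    return max(group2, key=lambda f: f["contrast_score"])

faces = [
    {"angle_score": 5, "smile_score": 120, "contrast_score": 350},
    {"angle_score": 40, "smile_score": 300, "contrast_score": 500},
    {"angle_score": 8, "smile_score": 200, "contrast_score": 280},
]
print(pick_representative(faces, front_face_threshold=20, smile_threshold=100))
```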
`
`

`

[Sheet 39 of 46, FIGS. 43A to 43D: examples of face images (front face, non-front face, front face with uneven luminance)]
[Sheet 40 of 46, FIG. 44: over-merging and over-dividing of clusters for a sequence of persons A, A, A, B, C, A, D, A, A]
`
`

`

[Sheet 41 of 46, FIG. 45: predicting result versus actual result]
[Sheet 42 of 46, FIG. 46: table of clusters (cluster 1, cluster 2, cluster 3) versus persons]
[Sheet 43 of 46, FIG. 47: entropy of a cluster containing person A and person B]
[Sheet 44 of 46, FIG. 48: cluster A containing person A and person B, its entropy, and its share in the entire result]
`
`

`

[Sheet 45 of 46, FIG. 49: clusters A and B containing persons A and B, with the entropy E(p) = -p log p - (1 - p) log(1 - p)]
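FIG. 49 uses the binary entropy E(p) = -p log p - (1 - p) log(1 - p) of the share p of one person within a cluster. A one-function sketch (using base-2 logarithms, which the figure does not specify) is:

```python
# Sketch of the binary entropy recovered from FIG. 49, applied to the
# share p of one person within a cluster; log base 2 is an assumption.
import math

def cluster_entropy(p):
    if p in (0.0, 1.0):          # a pure cluster has zero entropy
        return 0.0
    return -p * math.log2(p) - (1.0 - p) * math.log2(1.0 - p)

# A cluster holding 3 images of person A and 1 of person B: p = 0.75.
print(cluster_entropy(0.75))     # about 0.811
```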
`
`
`

`

[Sheet 46 of 46, FIG. 50: hardware configuration of a computer (CPU, I/O interface, and related components)]
`
`
`

`

IMAGE PROCESSING APPARATUS, IMAGE PROCESSING METHOD, AND PROGRAM

BACKGROUND OF THE INVENTION

1. Field of the Invention

The present invention relates to image processing apparatuses, image processing methods, and programs. More particularly, the present invention relates to an image processing apparatus which extracts characters through the following analysis performed within a short period of time. The image processing apparatus detects face images (images of a predetermined object) included in still-image frames successively obtained from a moving-image stream, determines whether a person corresponding to a face image detected in a current frame is the same as a person corresponding to a face image which was detected in a previous frame and which has been stored, in accordance with face feature values of the two face images, and stores only one of the two face images when the determination is affirmative.

2. Description of the Related Art

In recent years, opportunities to capture moving images have increased since camcorders and digital still cameras which employ hard disks and memory cards as recording media have come into wide use. Various methods, such as a method for detecting highlights using moving-image analysis, have been proposed in order to quickly retrieve and view desired moving-image files and scenes from the many moving-image files which have been recorded. One example of such a method for improving ease of retrieval and ease of viewing of moving images is a method for extracting the characters in a moving-image file employing a face detection technique and a face identifying technique. Other similar methods have also been proposed.

Japanese Unexamined Patent Application Publication No. 2008-77536, for example, discloses a method for performing face tracking on adjacent frames in a still-image sequence obtained by decoding a moving-image file so that face areas of identical persons are determined, and finally performing clustering in order to distinguish characters.

SUMMARY OF THE INVENTION

In the method disclosed in Japanese Unexamined Patent Application Publication No. 2008-77536, all frames of the moving-image file, or almost all frames, should be input so that the face tracking is accurately performed. This method is suitable for a case where the face tracking is performed during shooting. However, when a moving-image file is to be processed after shooting, the moving-image file should be fully decoded. When full decoding is performed on a moving-image file for a high-definition television, which has come into wide use in recent years, a considerably long analysis time is necessary. Therefore, the method disclosed in Japanese Unexamined Patent Application Publication No. 2008-77536 is not practical.

It is desirable to effectively extract characters within a short period of time for analysis.

According to an embodiment of the present invention, there is provided an image processing apparatus including a face detector configured to detect face images from still-image frames successively extracted from a moving-image stream in accordance with image information items regarding the still-image frames, a face-feature-value calculation unit configured to calculate face feature values of the face images in accordance with image information items regarding the face images detected by the face detector, an identity determination unit configured to determine whether a first face image which is included in a current frame and which is detected by the face detector and a second face image which is included in a previous frame and which has been detected and stored represent an identical person in accordance with at least face feature values of the first and second face images calculated by the face-feature-value calculation unit, and a merging processor configured to store only one of the first and second face images when the identity determination unit determines that the first face image and the second face image represent an identical person, and to store both the first and second face images when the identity determination unit determines that the first face image and the second face image do not represent an identical person.

In this embodiment, the face detector detects the face images included in the still-image frames successively extracted from the moving-image stream in accordance with the image information items regarding the still-image frames. Note that, although face images are detected in this embodiment, images of a certain object may be detected more generally.

For example, the moving-image stream includes intraframes at predetermined intervals. The image information items regarding the still-image frames are successively extracted from the moving-image stream by performing data decompression processing on the image information items of the intraframes.

The face-feature-value calculation unit calculates the face feature values of the face images detected by the face detector. The face-feature-value calculation unit detects face-feature positions, such as the positions of both ends of an eyebrow, both ends of an eye, the center of the eyebrow, and the center of the eye, and calculates face feature values (local-feature-value vectors) at the face-feature positions using a convolution operation such as a Gabor filter.
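As a concrete illustration of this step, the sketch below samples the responses of a small Gabor filter bank at a few face-feature positions and concatenates them into a local-feature-value vector. OpenCV and NumPy are assumed to be available, and the kernel parameters, the number of orientations, and the example feature points are illustrative choices rather than values taken from this patent.

```python
# A minimal sketch of the local-feature-value calculation: responses of a
# bank of Gabor filters sampled at detected face-feature positions (eye
# corners, eyebrow centers, and so on). Parameter values are illustrative.
import cv2
import numpy as np

def gabor_bank(ksize=17, sigma=3.0, lambd=8.0, gamma=0.5, orientations=4):
    kernels = []
    for k in range(orientations):
        theta = k * np.pi / orientations
        kernels.append(cv2.getGaborKernel((ksize, ksize), sigma, theta,
                                          lambd, gamma, 0, ktype=cv2.CV_32F))
    return kernels

def local_feature_vector(gray_face, feature_points, kernels):
    """Concatenate Gabor responses at each (x, y) face-feature position."""
    responses = [cv2.filter2D(gray_face.astype(np.float32), -1, k) for k in kernels]
    vec = [resp[y, x] for (x, y) in feature_points for resp in responses]
    return np.asarray(vec, dtype=np.float32)

# Example on a synthetic 64 x 64 face crop with two dummy feature points.
face = (np.random.rand(64, 64) * 255).astype(np.uint8)
vec = local_feature_vector(face, [(20, 24), (44, 24)], gabor_bank())
print(vec.shape)   # 2 points x 4 orientations = 8 values
```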
An identical person may appear repeatedly in the moving-image stream. Therefore, a plurality of face images representing an identical person are included in the face images detected in accordance with the image information items regarding the still-image frames successively extracted from the moving-image stream. When a character included in the moving-image stream is to be extracted, only a single face image is finally determined for that character.

The identity determination unit determines whether the first face image detected in the current frame and the second face image which was detected in the previous frame and has been stored represent an identical person in accordance with at least the face feature values of the first and second face images calculated by the face-feature-value calculation unit. The identity determination unit may obtain a degree of similarity between the first and second face images in accordance with the face feature values of the first and second face images, and may compare the degree of similarity with a threshold value so as to determine whether the first and second face images represent an identical person.
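A minimal sketch of this comparison is shown below; the normalized correlation used as the degree of similarity and the threshold value of 0.8 are assumptions for illustration, since the text does not fix either.

```python
# Sketch of the similarity-based identity test: compute a degree of
# similarity between two feature vectors and compare it with a threshold.
import numpy as np

def degree_of_similarity(vec_a, vec_b):
    a = np.asarray(vec_a, dtype=np.float64)
    b = np.asarray(vec_b, dtype=np.float64)
    denom = np.linalg.norm(a) * np.linalg.norm(b)
    return float(a @ b / denom) if denom else 0.0

def same_person(vec_a, vec_b, threshold=0.8):
    return degree_of_similarity(vec_a, vec_b) >= threshold

print(same_person([0.2, 0.9, 0.1], [0.25, 0.88, 0.12]))   # True for similar vectors
```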
The identity determination unit may determine whether the first and second face images represent an identical person in accordance with, in addition to the face feature values of the first and second face images, at least detection-frame information items regarding the first and second face images or information on an interval between the frames of the first and second face images.
The identity determination unit may obtain a degree of similarity between the first and second face images in accordance with the face feature values of the first and second face images, determine that the first and second face images represent an identical person when the degree of similarity is equal to or larger than a first threshold value, and determine that the first and second face images represent an identical person when the detection-frame information items regarding the first and second face images and the information on the interval between the frames of the first and second face images satisfy predetermined conditions and the degree of similarity is smaller than the first threshold value and equal to or larger than a second threshold value.

The predetermined conditions for the detection-frame information items may include a first condition in which the distance between the center of the detection frame of the first face image and the center of the detection frame of the second face image is smaller than a threshold value, and a second condition in which the ratio of the area of the detection frame of the first face image to the area of the detection frame of the second face image is in a range from a first threshold value to a second threshold value. The predetermined condition for the information on the frame interval may be a condition in which the interval between the frames of the first and second face images is smaller than a threshold value.
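The decision logic just described can be sketched as follows. The record fields, the concrete threshold values, and the tolerances on frame distance, area ratio, and frame gap are all illustrative assumptions; only the structure of the test (first threshold, second threshold plus frame conditions) follows the text.

```python
# Sketch of the two-threshold identity determination. A face record is
# assumed to carry its detection-frame center, detection-frame area and
# frame index; all threshold values are illustrative.
from dataclasses import dataclass
import math

@dataclass
class FaceRecord:
    center: tuple      # (x, y) of the detection frame center
    area: float        # area of the detection frame
    frame_index: int   # index of the still-image frame
    features: list     # face feature values

def frames_compatible(cur, prev, max_center_dist=40.0,
                      area_ratio_range=(0.5, 2.0), max_frame_gap=3):
    dist = math.dist(cur.center, prev.center)
    ratio = cur.area / prev.area
    gap = abs(cur.frame_index - prev.frame_index)
    return (dist < max_center_dist
            and area_ratio_range[0] <= ratio <= area_ratio_range[1]
            and gap < max_frame_gap)

def identical_person(similarity, cur, prev, th1=0.80, th2=0.55):
    if similarity >= th1:                       # high similarity alone decides
        return True
    if th2 <= similarity < th1:                 # borderline: consult frame info
        return frames_compatible(cur, prev)
    return False
```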
When the identity determination unit determines that the first and second face images represent an identical person, the merging processor stores only one of the first and second face images. When the identity determination unit determines that the first and second face images do not represent an identical person, the merging processor stores both the first and second face images.

In this embodiment, the face images included in the still-image frames successively extracted from the moving-image stream are detected, and a determination as to whether the face image detected in the current frame and the face image detected in the previous frame represent an identical person is made in accordance with the face feature values of the face images. When the determination is affirmative, only one of the face images is stored.

In this case, the still-image frames from which the face images are detected are extracted every one second, for example. Therefore, since the number of frames to be analyzed is small, characters are extracted with a short analysis time. For example, when an MPEG stream or an AVC stream is employed, only the intraframes included in the stream at predetermined intervals are decoded and used. That is, so-called full decoding is not necessary, and therefore, the analysis time is reduced.
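The sketch below approximates this sparse extraction by grabbing roughly one frame per second with OpenCV. Seeking by timestamp is a stand-in for decoding only the intraframes, since OpenCV's VideoCapture does not expose I-frame selection; the file name in the example is hypothetical.

```python
# A rough approximation of the sparse decoding described above: grab one
# still-image frame per second instead of fully decoding the stream.
import cv2

def sample_frames(path, interval_sec=1.0):
    cap = cv2.VideoCapture(path)
    frames, t = [], 0.0
    while cap.isOpened():
        cap.set(cv2.CAP_PROP_POS_MSEC, t * 1000.0)   # jump ~interval_sec ahead
        ok, frame = cap.read()
        if not ok:
            break
        frames.append(frame)
        t += interval_sec
    cap.release()
    return frames

# frames = sample_frames("movie.mp4")   # hypothetical input file
```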
As described above, since the identity determination unit determines whether the first face image and the second face image represent an identical person in accordance with at least the detection-frame information items regarding the first and second face images or the interval between the frames of the first and second face images, determination accuracy is enhanced.

In a case where the degree of similarity between the first and second face images, calculated in accordance with the face feature values of the first and second face images, is low due to a lighting condition even though the first face image and the second face image represent an identical person, it can still be determined that the first face image and the second face image represent an identical person by taking into consideration whether the detection-frame information items regarding the first and second face images and the information on the interval between the frames of the first and second face images satisfy the predetermined conditions.

The image processing apparatus may further include a face-rotation-angle detector configured to detect face-rotation angles representing the angles of the faces represented by the face images detected by the face detector, and a noise-face removing unit configured to remove, from among all the face images detected by the face detector, face images having face-rotation angles in a predetermined direction relative to the front which are larger than a threshold value, in accordance with the information items regarding the face-rotation angles detected by the face-rotation-angle detector.

As for images representing a face which faces considerably sideways, a face which faces considerably upward, and a face which faces considerably downward, it is possible that face feature values are not accurately obtained by the face-feature-value calculation unit, and accordingly, the determination accuracy of the identity determination unit may be degraded. As described above, by removing the face images having face-rotation angles in a predetermined direction relative to the front which are larger than the threshold value, images representing a face which faces considerably sideways, upward, or downward are removed in advance. Accordingly, the determination accuracy of the identity determination unit is prevented from being degraded.

The image processing apparatus may further include a contrast-score calculation unit configured to calculate contrast scores representing the contrast of the face images in accordance with the image information items regarding the face images detected by the face detector, and a noise-face removing unit configured to remove face images having contrast scores, calculated by the contrast-score calculation unit, smaller than a threshold value from among all the face images detected by the face detector.

It is highly possible that the face feature values of blurred face images having considerably low contrast scores are not accurately calculated, resulting in deterioration of the determination accuracy of the identity determination unit. As described above, by removing the face images having contrast scores smaller than a threshold value, the blurred face images having considerably low contrast scores are removed in advance. Accordingly, the determination accuracy of the identity determination unit is prevented from being degraded.
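A combined sketch of these two noise-face checks is shown below. Treating the luminance variance as the contrast score, the dictionary keys on the face records, and the two threshold values are assumptions for illustration.

```python
# Sketch of the noise-face removal described above: discard face images
# whose yaw angle exceeds a threshold (strong side faces) or whose
# contrast score falls below a threshold (blurred faces).
import numpy as np

def contrast_score(gray_face):
    return float(np.var(gray_face.astype(np.float64)))

def remove_noise_faces(faces, yaw_threshold_deg=45.0, contrast_threshold=100.0):
    """faces: list of dicts with keys 'yaw' (degrees) and 'gray' (2-D array)."""
    kept = []
    for face in faces:
        if abs(face["yaw"]) > yaw_threshold_deg:
            continue                       # side face: treated as a noise face
        if contrast_score(face["gray"]) < contrast_threshold:
            continue                       # blurred face: treated as a noise face
        kept.append(face)
    return kept
```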
The image processing apparatus may include a face clustering unit configured to assign the face images stored by the merging processor to clusters, at least in accordance with the face feature values calculated by the face-feature-value calculation unit, so that face images representing an identical person are assigned to a single cluster.

When the end of the moving-image stream is reached, the merging processor stores a predetermined number of face images in accordance with the image data items corresponding to the still-image frames successively extracted from the moving-image stream. The face clustering unit performs clustering processing in accordance with at least the feature values calculated by the face-feature-value calculation unit so that, among the face images stored by the merging processor, face images representing an identical person are assigned to a single cluster.

As described above, when the merging processor determines that the face image of the current frame and the stored face image of a previous frame represent an identical person, only one of the face images is stored. In this way, when the end of the moving-image stream is reached, the number of face images ultimately stored in the merging processor is reduced. Therefore, the processing time of the face clustering unit is reduced.

The face clustering unit may include a similarity degree calculation unit, a layering/clustering unit, and a cluster determination unit. The similarity degree calculation unit may calculate degrees of similarity of individual pairs of face images extracted from the face images stored by the merging processor, in accordance with the face feature values of the corresponding pairs of face images. The layering/clustering unit may assign the face images stored by the merging processor to individual clusters, and successively merge clusters including each of the pairs of face images in accordance with the degrees of similarity of the pairs of face images calculated by the similarity degree calculation unit, in a descending order of the degrees of similarity.
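A minimal sketch of this layering step, under assumed data structures, is shown below: each stored face image starts as its own cluster, and the face pair list (most similar pairs first) drives successive merges, with each merge recorded as a parent node over the two merged clusters. The cluster determination (for example, the over-merging check outlined in FIGS. 38 and 39) would then decide where to cut this hierarchy; that part is not shown.

```python
# Sketch of the layering/clustering step: start with one cluster per
# stored face image, then walk the face pair list in descending order of
# similarity and merge the clusters containing the two faces of each
# pair, recording the merge order as layers of nodes. Only the control
# flow follows the text; the data layout is an assumption.
def layer_clusters(face_ids, sorted_pairs):
    """sorted_pairs: [((id_a, id_b), similarity), ...], most similar first."""
    cluster_of = {fid: {fid} for fid in face_ids}   # each face starts alone
    layers = []                                     # merge history (the "layers")
    for (a, b), sim in sorted_pairs:
        ca, cb = cluster_of[a], cluster_of[b]
        if ca is cb:
            continue                                # already in the same cluster
        merged = ca | cb
        for fid in merged:
            cluster_of[fid] = merged
        layers.append((sim, frozenset(merged)))     # parent node over the two clusters
    clusters = {frozenset(c) for c in cluster_of.values()}
    return clusters, layers

ids = ["f1", "f2", "f3", "f4"]
pairs = [(("f1", "f2"), 0.9), (("f3", "f4"), 0.8), (("f1", "f3"), 0.2)]
print(layer_clusters(ids, pairs))
```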
