(12) Patent Application Publication    (10) Pub. No.: US 2016/0187199 A1
Brunk et al.                            (43) Pub. Date: Jun. 30, 2016
(54) SENSOR-SYNCHRONIZED SPECTRALLY-STRUCTURED-LIGHT IMAGING

(71) Applicant: Digimarc Corporation, Beaverton, OR (US)

(72) Inventors: Hugh L. Brunk, Portland, OR (US); Geoffrey B. Rhoads, West Linn, OR (US); Cynthia Archer, Sherwood, OR (US); Arlie Conner, Portland, OR (US)
(21) Appl. No.: 14/836,878

(22) Filed: Aug. 26, 2015

Related U.S. Application Data

(60) Provisional application No. 62/042,127, filed on Aug. 26, 2014; provisional application No. 62/054,294, filed on Sep. 23, 2014.
Publication Classification

(51) Int. Cl.
    G01J 3/28 (2006.01)
    G06K 9/46 (2006.01)
    H04N 9/04 (2006.01)
    H04N 5/235 (2006.01)
    G06K 9/62 (2006.01)
    H04N 5/225 (2006.01)
(52) U.S. Cl.
    CPC: G01J 3/2823 (2013.01); G06K 9/6269 (2013.01); H04N 5/2256 (2013.01); H04N 9/045 (2013.01); H04N 5/2354 (2013.01); G06K 9/4652 (2013.01); G06K 9/466 (2013.01); G06K 2009/4657 (2013.01); G01J 2003/2826 (2013.01)
`
(57) ABSTRACT

An image capture device, such as a smartphone or point of sale scanner, is adapted for use as an imaging spectrometer, by synchronized pulsing of different LED light sources as different image frames are captured by the image sensor. A particular implementation employs the CIE color matching functions, and/or their orthogonally transformed functions, to enable direct chromaticity capture. These and various other configurations of spectral capture devices are employed to capture spectral images comprised of spectral vectors having multi-dimensions per pixel. These spectral images are processed for use in object identification, classification, and a variety of other applications. Particular applications include produce (e.g., fruit or vegetable) identification. A great variety of other features and arrangements are also detailed.
`
`
`
`
`
`
[Drawing sheets 1-73 (FIGS. 1-76, executed in color) are reproduced as images; only scattered labels are legible in this text rendering. Recoverable sheet labels include: FIG. 2 (typical spectral response curves of modern Bayer-filtered pixels); FIGS. 3-6; FIGS. 14-16; FIG. 17 (g = Hf); FIG. 18 (flexibility on the sensor side: sample digital camera pixel-response spectral profiles); FIGS. 19-21 (flexibility of the LED spectral-shape side, including typical "white" LEDs at various effective color temperatures and spectral shapes aimed at particular applications such as fluorescence microscopy); FIG. 23 (Bayer CMOS spectral response: measured R, G and B curves versus CIE color matching functions); FIG. 31A (smartphone illumination accessory block diagram: flash portion, drive circuit, interface, I/O connector, camera sensor, camera control module, wireless (Bluetooth, WiFi, RFID, etc.), processor); FIG. 38 (raw and ambient-raw images, reverse gamma, compute color temperature of ambient light, spectricity vectors); FIG. 40 (sum over patch, normalize by max: coupling factor for channel); FIG. 61 (powerful classifier) and the following sheet (poor classifier); FIG. 67; FIG. 72 (calculate spectral vectors; determine distribution of spectral values from images captured of object; extract spectral distribution values; calculate texture features; determine distribution of texture features; extract texture feature distribution values; to classifier (train/classify)); FIG. 73 (global classifier, mapping service, derivative classifiers); FIG. 74 (LED controller board and HW controller, control signals, multispectral images, classification result); FIG. 75.]
SENSOR-SYNCHRONIZED SPECTRALLY-STRUCTURED-LIGHT IMAGING
`
RELATED APPLICATION DATA

0001 This application claims benefit of provisional application 62/042,127, filed Aug. 26, 2014, and provisional application 62/054,294, filed Sep. 23, 2014.
0002 This application is related to application Ser. No. 14/201,852, filed Mar. 8, 2014, which is a continuation-in-part of Ser. No. 13/840,451, filed Mar. 15, 2013, which is a non-provisional of co-pending provisional applications 61/688,722, filed May 21, 2012, and 61/706,982, filed Sep. 28, 2012. This application is also related to provisional application 61/906,886, filed Nov. 20, 2013, and provisional application 61/907,362, filed Nov. 21, 2013, all with the same title, all of which are incorporated herein by reference.
`
REFERENCE TO COMPUTER PROGRAM LISTING APPENDIX

0003 This application includes a computer program listing appendix including the following Matlab computer program files: SpectricityV11 multiday set2-code appendix.txt, Spectralmg-code appendix.txt, spectrald-code appendix.txt, configParser-code appendix.txt, ClassifierTSVQ appendix.txt, basicClassify appendix.txt, VQ appendix.txt and DBCapture appendix, all incorporated into this specification.
`
TECHNICAL FIELD

0004 The present technology concerns, e.g., imaging spectrometry.
`
BACKGROUND AND INTRODUCTION OF THE TECHNOLOGY

0005 Both natural light (ambient) photography and flash-assisted (read broadly: human-assisted light supplementation) photography have been around since the Daguerreotype. The technology of this disclosure concerns how primarily the latter form of lighting, call it flash for conciseness, can be so designed and implemented as to effectively qualify it within the general art of imaging spectrometry or hyper-spectral imaging.
0006 In a nutshell, by illuminating a scene with several different brief (frame-synchronized) spectrally structured light sources, even a common Bayer-pattern CMOS camera can effectively become an imaging spectrometer with N bands. N, in very early days, is practically on the order of 5 to 10 bands, but with fine prospects of going higher, especially as design principles behind Bayer patterns (and RGBW, e.g., from Sony) are reconsidered in light of this technology.
0007 An introduction of the technology must make note of multi-chip LEDs (see, e.g., Edison's 2012-era Federal FM series, depicted in FIG. 7) as being at least a seed for creating spectrally structured light. A core approach, exploited in several embodiments, is to synchronize pulsing of different LED light sources with individual frames of a CMOS sensor, thereby creating the informational basis for N-band imaging. Light sources other than LEDs can certainly be considered, but by 2012 standards, multi-chip and/or dual LEDs are leading candidates to realize this technology.
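By way of illustration only, the following Python sketch shows the general shape of such a frame-synchronized capture loop. The camera and led_bank objects are hypothetical placeholders standing in for device-specific firmware interfaces; they, the ambient-frame subtraction, and the function name are assumptions made for this sketch, not an implementation taken from this disclosure.

```python
# Illustrative sketch only: pulse one LED band per sensor frame and stack the
# results into an N-band image cube. `camera` and `led_bank` are hypothetical
# device abstractions, not APIs defined in this disclosure.
import numpy as np

def capture_n_band_stack(camera, led_bank, n_bands):
    """Return an (n_bands, rows, cols, ...) stack of ambient-corrected frames."""
    frames = []
    for band in range(n_bands):
        led_bank.enable(band)          # light the scene with only this LED band
        frame = camera.grab_frame()    # exposure occurs while the band is lit
        led_bank.disable(band)
        frames.append(np.asarray(frame, dtype=np.float32))
    # Optionally capture an ambient-only frame so it can be subtracted out.
    ambient = np.asarray(camera.grab_frame(), dtype=np.float32)
    return np.stack(frames, axis=0) - ambient
```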
`
0008 A particularly intriguing choice of bands is the three very well-known 1931 CIE color matching functions and/or their orthogonally transformed functions. With such choices, the stage is set for taking color photography to its multiverse destiny: referred to as 'direct chromaticity capture' in this disclosure.
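For illustration, a minimal sketch of the final chromaticity step is given below, assuming (an assumption of this sketch) that the three captured bands already approximate per-pixel CIE X, Y and Z tristimulus values; the standard CIE 1931 projection x = X/(X+Y+Z), y = Y/(X+Y+Z) then yields chromaticity coordinates directly.

```python
import numpy as np

def chromaticity_from_xyz(X, Y, Z, eps=1e-9):
    """Convert per-pixel tristimulus images X, Y, Z into CIE 1931 (x, y)."""
    total = X + Y + Z + eps      # eps guards against division by zero in dark pixels
    x = X / total
    y = Y / total
    return x, y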
0009 One part of this disclosure describes the design principles and physical realizations of turning virtually any electronic imaging sensor into an imaging spectrometer via specific coordination with some supplemental light source. With the core 'how' then elucidated, applications are presented and described, including A) the niche application of hyper-spectral imaging, B) the medical imaging potential of this technology, C) radically improved color photography for both digital cameras and smartphones (as 2012 still draws pretty sharp lines between the two), and D) uses of N-band imaging within the mature technology of digital watermarking and image fingerprinting.
0010 Subsequent to the initial disclosure, this disclosure has been expanded significantly in several areas, including:
0011 methods and systems for classifying and recognizing various types of objects;
0012 such systems employing various imaging configurations, with various options on spectral light sources, optical filters, polarimetric sensing, sensing of these spectral and polarimetric pixel samples at 3 spatial dimensions (including plenoptic sensing and structured light 3D sensing), scanning techniques, and synchronizing controlled capture under various lighting and sensing states;
0013 training and applying classifiers for particular fields, including produce identification, produce ripening, etc.;
0014 advances in illumination, sensing and post processing to address various environmental effects, including specular reflections and product package layers (e.g., plastic packaging or bags that hamper object identification); and
0015 advances in sensing and post processing, prior to training and applying a classifier to obtain vectors per pixel, that combine spectral, polarimetric, and spatial relationships among pixel elements.
0016 Many more system configurations, lighting and sensing devices, and pixel post processing techniques and device configurations are detailed further below. A myriad of inventive combinations of these and other aspects of the disclosure are contemplated and not limited to the particular example embodiments. We provide source code samples as examples. It is contemplated that the various signal processing described may be implemented as software instructions for execution on general purpose computing devices or special purpose processors, including devices with DSPs, GPUs, etc. These software instructions may be ported into processor-device-specific firmware versions, ASICs, FPGAs, etc., in various combinations, as well as leverage cloud computing services for execution (particularly for training, classifying and recognition services).
0017 The foregoing and other features and advantages of the present technology will be more readily apparent from the following Detailed Description, which proceeds with reference to the accompanying drawings.
`0018 Classifiers for Produce
0019 Several research groups have investigated methods using digital color (Red, Green, and Blue) cameras to classify fruits, or fruits and vegetables. One such effort was made by IBM in the late 1990s. See Bolle, Connell, Hass, Mohan, Taubin, "VeggieVision: A Produce Recognition System," Proceedings of the Third IEEE Workshop on Applications of Computer Vision, pp. 224-251, 1996. For this effort, the researchers tried to classify 48 different produce items. They used a combination of color and texture features. Color features were three concatenated histograms of the produce item, computed in the Hue-Saturation-Intensity (HSI) space. For the texture measure, they tried a couple of different gradient measures. The texture features were histograms of the gradient taken over the image. Both gradient measures performed similarly. They used a nearest neighbor classifier. The correct classification was one of the top four predicted classes 90% of the time for color only (with hue being most important), 63% of the time for texture only, and 97% of the time for color and texture. This result indicates that good category separation should be possible with a fast, simple classifier operating on a single feature vector per image.
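For readers wanting a concrete picture of that style of pipeline, the following is a loose sketch, not the cited authors' code: concatenated color histograms plus a gradient-magnitude histogram form one feature vector per image, classified by a nearest-neighbor rule. HSV is used here as a stand-in for HSI, and the bin counts, the L1 distance, and the gradient operator are assumptions of this sketch.

```python
import numpy as np
from matplotlib.colors import rgb_to_hsv   # HSV stands in for HSI (assumption)

def produce_features(rgb, bins=32):
    """Concatenated hue/saturation/value histograms plus a gradient-magnitude
    histogram: one feature vector per image, in the spirit of VeggieVision."""
    hsv = rgb_to_hsv(rgb.astype(np.float32) / 255.0)
    hists = [np.histogram(hsv[..., k], bins=bins, range=(0, 1), density=True)[0]
             for k in range(3)]
    gy, gx = np.gradient(hsv[..., 2])              # gradient of the intensity plane
    grad_mag = np.hypot(gx, gy)
    hists.append(np.histogram(grad_mag, bins=bins, range=(0, 1.5), density=True)[0])
    return np.concatenate(hists)

def nearest_neighbor_label(query_vec, train_vecs, train_labels):
    """1-nearest-neighbor classification on the concatenated histogram features."""
    dists = np.abs(train_vecs - query_vec).sum(axis=1)   # L1 distance
    return train_labels[int(np.argmin(dists))]
```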
0020 Several more recent publications by university researchers provide guidance on potential color and texture features for grouping produce into categories. A group in Brazil working with Cornell performed a study of a variety of features and classifier types using a set of 15 different produce items. See Rocha, Hauagge, Wainer, Goldenstein, "Automatic fruit and vegetable classification from images," Computers and Electronics in Agriculture, 70, 96-104, 2010. The images showed one or more examples of each item against a uniform white background. A digital RGB camera was used to capture the images. Their color and texture descriptors included:
0021 1. General Color Histogram. A color histogram is a 3-dimensional matrix that measures the probability of each RGB vector, rather than building three separate histograms, one for each color. Typically, each color is quantized to 4 levels to create a 4x4x4=64 element feature vector. (A code sketch of this descriptor, and of items 2 and 4 below, follows this list.)
0022 2. Unser Features. Unser features are a texture measure that operates on the intensity channel. It involves taking the sum and difference of pairs of pixels at a selected scale. Histograms are then formed for the sum and difference images.
0023 3. Color Coherence Vectors. Color coherence vectors are frequently used in image searches of the type "find other pictures like this one." They are comparable to the color histogram in terms of classification power.
0024 4. Border/Interior Color Histogram. This method uses two color histograms, one for pixels on the interior of regions and one for pixels on the edges of a region. This metric captures both color and texture information, and is the best of the features explored in this work.
0025 5. Appearance Descriptors. This feature matches small regions of the intensity image to a set of appearance (edge/texture) descriptors that are similar to the Haar features used for face detection. This feature set performed poorly and its evaluation was dropped early in the paper.
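The following sketch illustrates items 1, 2 and 4 above. It is not the cited authors' implementation; the bin counts, the pixel-pair offset used as the "selected scale," and the gradient-based edge test that splits border from interior pixels are assumptions made for illustration only.

```python
import numpy as np

def general_color_histogram(rgb, levels=4):
    """Item 1: joint RGB histogram, each channel quantized to `levels`
    (4 x 4 x 4 = 64 bins), normalized to a probability vector."""
    q = np.clip((rgb.astype(np.int32) * levels) // 256, 0, levels - 1)
    idx = (q[..., 0] * levels + q[..., 1]) * levels + q[..., 2]
    hist = np.bincount(idx.ravel(), minlength=levels ** 3).astype(np.float64)
    return hist / max(hist.sum(), 1.0)

def unser_features(intensity, offset=1, bins=32):
    """Item 2: histograms of the sum and difference of horizontal pixel pairs
    at a chosen offset (the 'selected scale'); intensity is assumed 0-255."""
    a = intensity[:, :-offset].astype(np.float64)
    b = intensity[:, offset:].astype(np.float64)
    s_hist, _ = np.histogram(a + b, bins=bins, range=(0, 510), density=True)
    d_hist, _ = np.histogram(a - b, bins=bins, range=(-255, 255), density=True)
    return np.concatenate([s_hist, d_hist])

def border_interior_histograms(rgb, levels=4, thresh=20):
    """Item 4: one quantized color histogram for 'edge' pixels and one for
    'interior' pixels, split here by a crude gradient-magnitude test."""
    gray = rgb.astype(np.float64).mean(axis=2)
    gy, gx = np.gradient(gray)
    edge_mask = np.hypot(gx, gy) > thresh
    border = general_color_histogram(rgb[edge_mask].reshape(-1, 1, 3), levels)
    interior = general_color_histogram(rgb[~edge_mask].reshape(-1, 1, 3), levels)
    return np.concatenate([border, interior])
```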
0026 The researchers investigated a number of classifier methodologies, with one-versus-one Support Vector Machines (SVMs) being the clear winner. Using the Border/Interior color histograms, the classification matched one of the top two 95.8% of the time, and using a combination of features, they were able to bring top-two correct classification up to 97%.
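As an illustration only, and not the classifiers used in the cited study, scikit-learn's SVC, which decomposes a multiclass problem into one-versus-one binary SVMs, can play this role with a per-image feature vector such as the border/interior histogram sketched above; the kernel and regularization settings below are assumptions.

```python
import numpy as np
from sklearn.svm import SVC

def train_one_vs_one_svm(X, y):
    """Fit a one-versus-one multiclass SVM on per-image feature vectors X
    (n_images x n_features) with integer produce-class labels y."""
    clf = SVC(kernel="rbf", C=10.0, gamma="scale", decision_function_shape="ovo")
    clf.fit(np.asarray(X), np.asarray(y))
    return clf

# Usage sketch (hypothetical data): clf = train_one_vs_one_svm(X_train, y_train)
# predictions = clf.predict(X_test)
```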
0027 An Indian university group using the same data set performed a different set of experiments, but with less success. See Arivazhagan, Shebiah, Nidhyanandhan, Ganesan, "Fruit Recognition using Color and Texture Features," Journal of Emerging Trends in Computing and Information Sciences, 90-94, 2010. They used a co-occurrence histogram on low-pass filtered intensity values to measure texture. Rather than use the histogram directly, they computed several statistics, including contrast, energy, and local homogeneity, and used these statistics as features. Similarly, they computed histograms on hue and saturation for color measurement and derived statistics from those histograms. Their final feature vector had 13 statistical features. Color statistics performed particularly poorly, with only 45% correct classification. The texture feature was better, with 70% average correct classification. Combining the features worked best, giving 86% correct classification. This work indicates that while color histograms are effective at capturing important produce characteristics, reducing the histograms to statistics is less effective.
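A sketch of reducing a gray-level co-occurrence histogram to the contrast, energy and local-homogeneity statistics mentioned above follows, using the standard GLCM definitions. It is not the cited authors' code; the quantization level and the pixel offset are assumptions.

```python
import numpy as np

def cooccurrence_stats(intensity, levels=16, offset=(0, 1)):
    """Build a gray-level co-occurrence histogram (GLCM) for 0-255 intensity
    values and reduce it to contrast, energy, and local homogeneity."""
    q = np.clip((intensity.astype(np.int64) * levels) // 256, 0, levels - 1)
    dr, dc = offset
    a = q[: q.shape[0] - dr, : q.shape[1] - dc].ravel()
    b = q[dr:, dc:].ravel()
    glcm = np.zeros((levels, levels), dtype=np.float64)
    np.add.at(glcm, (a, b), 1.0)            # count co-occurring level pairs
    glcm /= glcm.sum()                      # normalize to joint probabilities
    i, j = np.indices(glcm.shape)
    contrast = np.sum(glcm * (i - j) ** 2)
    energy = np.sum(glcm ** 2)
    homogeneity = np.sum(glcm / (1.0 + np.abs(i - j)))
    return np.array([contrast, energy, homogeneity])
```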
0028 Most recently, a group in China performed an independent study similar to that performed by Rocha et al. on a set of 18 fruits (no vegetables). See Zhang and Wu, "Classification of Fruits Using Computer Vision and a Multiclass Support Vector Machine," Sensors, pp. 12489-12505, 2012. They used several variants of SVMs and a combination of color, texture, and shape features. The color feature was a color histogram. They used the Unser feature vector, but reduced the pair of histograms to seven features using statistical measures (mean, contrast, homogeneity, energy, variance, correlation, and entropy). They also made eight shape measurements, including area, perimeter, convex hull area, and minor and major axes of a fitted ellipse. Unfortunately, they performed no analysis of the relative value of each feature type (color, texture, shape), so it is difficult to ascertain the effectiveness of their different features. It would have been particularly useful to understand which, if any, of the shape features provided discriminability. They performed PCA on the feature set, reducing it from dimension 79 to dimension 14. The researchers performed tests using one-versus-all and one-versus-one classifiers, with the one-versus-one approach the clear winner. Their classifiers had 53.5% classification correctness using a linear SVM and 88.2% correct using a radial basis function (RBF) SVM. The PCA operation may be partially responsible for the relatively poor performance of the linear classifier. The reduction of the Unser features to statistics may have also had a negative effect on classification accuracy.
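For illustration, a comparable pipeline, again not the cited authors' code, can be sketched with scikit-learn: PCA reduces an assumed 79-dimensional feature vector to 14 dimensions before either a linear or an RBF SVM is trained, allowing the two kernels to be compared on the same held-out data.

```python
from sklearn.decomposition import PCA
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

def pca_svm_pipelines(n_components=14):
    """PCA to `n_components` dimensions followed by either a linear or an RBF
    SVM (SVC handles multiclass data one-versus-one)."""
    linear = make_pipeline(StandardScaler(), PCA(n_components=n_components),
                           SVC(kernel="linear"))
    rbf = make_pipeline(StandardScaler(), PCA(n_components=n_components),
                        SVC(kernel="rbf", gamma="scale"))
    return linear, rbf

# Usage sketch (X: n_samples x 79 features, y: fruit labels, both hypothetical):
# linear, rbf = pca_svm_pipelines()
# linear.fit(X_train, y_train); rbf.fit(X_train, y_train)
# print(linear.score(X_test, y_test), rbf.score(X_test, y_test))
```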
0029 A quick clarification on what constitutes the classification performance minimum: with two equal-sized classes, you can get 50% correct by "flipping a coin" to select the class. However, when there are more than two classes, 50% is no longer the chance-level floor. For three classes the floor is 33%, for four classes 25%, for 20 classes 5%, and so on.
`
BRIEF DESCRIPTION OF THE DRAWINGS

0030 The patent or application file contains at least one drawing executed in color. Copies of this patent or patent application publication with color drawing(s) will be provided by the Office upon request and payment of the necessary fee.
0031 FIG. 1 illustrates how most modern cameras distinguish red apples from green apples.
0032 FIG. 2 presents a plot of three spectral detection profiles of an illustrative Bayer-pattern CMOS sensor.
`
0033 FIG. 3 is similar to FIG. 1, but includes information about an idealized spectral reflectance profile of a green apple, and of a red apple.
0034 FIG. 4 introduces an idealized ambient lighting source spectral curve.
0035 FIG. 5 presents a case involving slight green-ish, mainly blue-ish illumination.
0036 FIG. 6 shows how an apple may be mis-colored when rendered on a screen, due to illumination.
0037 FIGS. 7 and 8 introduce the notion of multi-colored flash.
0038 FIG. 9 is similar to FIG. 5, but incorporating insight from FIG. 8.
0039 FIG. 10 shows another family of spectral curves.
0040 FIG. 11 illustrates different spectral samplings of an apple.
0041 FIG. 12 illustrates how data gathered in FIG. 11 can be used to produce spectral information for the apple.
0042 FIG. 13 shows a linear function estimation arrangement that can be used with the spectral information of FIG. 12.
0043 FIGS. 14-17 show the evolution from a five-band rectangular solution set to a linear algebra representation of the spectral data.
0044 FIG. 18 introduces some of the considerations from a sensor side of the system.
0045 FIGS. 19-22 delve into considerations concerning the illumination LEDs.
0046 FIG. 23 illustrates a relationship between Bayer filters and orthogonal color matching functions.
0047 FIG. 24 details use of a CIE matrix to generate chromaticity coordinates.
0048 FIG. 25 shows how the present technology resolves an apple's color to particular coordinates on a chromaticity diagram.
0049 FIG. 26 delves further into ambient illumination combined with the LED illumination.
0050 FIG. 27 illustrates uses of the technology in medical applications.
0051 FIG. 28 introduces use of the technology in food safety, item inspection, and anti-counterfeiting applications.
0052 FIG. 29 illustrates use of the technology in digital watermarking and related applications.
0053 FIG. 30 details how conventional form-factor flash units can employ the present technology.
0054 FIGS. 31 and 31A illustrate an implementation using a clip-on illumination accessory.
0055 FIG. 32 addresses aspects of the technology concerning motion.
0056 FIGS. 33-36 further elaborate considerations involving ambient lighting.
0057 FIG. 37 details how unknown ambient lighting spectral coefficients can be removed from aggregate mathematical equations.
0058 FIG. 38 is a diagram illustrating a process of generating spectral images in response to pulsing a target object with illumination in the presence of ambient light.
0059 FIG. 39 depicts a matrix with the color channels of the sensor, R, G and B, on the vertical axis, and the LED light source co