Case 1:20-cv-00909-LMM Document 1-3 Filed 02/27/20 Page 1 of 19

EXHIBIT B
Case 1:20-cv-00909-LMM Document 1-3 Filed 02/27/20 Page 2 of 19

US010127688B2

(12) United States Patent                         (10) Patent No.:     US 10,127,688 B2
     Atsmon et al.                                (45) Date of Patent: *Nov. 13, 2018

(54) SYSTEM AND PROCESS FOR AUTOMATICALLY FINDING OBJECTS OF A SPECIFIC COLOR

(71) Applicants: Alon Atsmon, Ganey-Tikva (IL); Dan Atsmon, Rehovot (IL)

(72) Inventors: Alon Atsmon, Ganey-Tikva (IL); Dan Atsmon, Rehovot (IL)

(*) Notice: Subject to any disclaimer, the term of this patent is extended or adjusted under 35 U.S.C. 154(b) by 0 days. This patent is subject to a terminal disclaimer.

(21) Appl. No.: 15/650,993

(22) Filed: Jul. 17, 2017

(65) Prior Publication Data
     US 2017/0316584 A1     Nov. 2, 2017

     Related U.S. Application Data

(63) Continuation of application No. 15/230,433, filed on Aug. 7, 2016, now Pat. No. 9,710,928, which is a (Continued)

(51) Int. Cl.
     G06K 9/00    (2006.01)
     G06T 7/90    (2017.01)
     (Continued)

(52) U.S. Cl.
     CPC ............ G06T 7/90 (2017.01); G06F 17/3025 (2013.01); G06K 9/4652 (2013.01); (Continued)

(58) Field of Classification Search
     CPC ................ G06K 9/6201; G06K 9/4652; G06F 17/3025; H04N 1/6058; H04N 1/60; (Continued)

(56) References Cited

     U.S. PATENT DOCUMENTS

     7,522,768 B2    4/2009  Bhatti et al.
     7,844,140 B2*  11/2010  Fujita ................ G06F 17/30265
                                                         348/211.3
     (Continued)

     FOREIGN PATENT DOCUMENTS

     WO   WO 2004/027695    4/2004
     WO   WO 2009/137830   11/2009

     OTHER PUBLICATIONS

Applicant-Initiated Interview Summary dated Feb. 19, 2016 From the US Patent and Trademark Office Re. U.S. Appl. No. 14/833,099.
     (Continued)

Primary Examiner - Ali Bayat

(57) ABSTRACT

A computer implemented method, system and computer program product for identifying the Main Colors and the matching colors of a visual object, and then viewing on a mobile device select items comprising the matching colors, such as from a merchant's catalog. A visual object is analyzed for color content, and the results are stored on a system database located on the device or on a remote server. The color analysis of the objects comprise advanced image processing techniques, such as Main Color extraction using color space transformation comprising HSV, RGB and CYMK to map between pixels in the image. The user can subsequently view a display on their mobile identifying the visual object's Main Colors and at least one Harmonic Color; and then select and view all items (i.e. products in a database) comprising one Harmonic Color, and/or all items of a specific type and Harmonic Color.

20 Claims, 9 Drawing Sheets

[Representative drawing: a terminal captures a visual object (apartment wall 122, shirt 124) over the Internet 106; the feedback screen 141 reads "Found Red and Orange".]
Case 1:20-cv-00909-LMM Document 1-3 Filed 02/27/20 Page 3 of 19

US 10,127,688 B2
Page 2

     Related U.S. Application Data

     continuation of application No. 14/833,099, filed on Aug. 23, 2015, now Pat. No. 9,412,182, which is a continuation of application No. 14/292,914, filed on Jun. 1, 2014, now Pat. No. 9,117,143, which is a continuation of application No. 13/356,815, filed on Jan. 24, 2012, now Pat. No. 8,744,180.

(60) Provisional application No. 61/438,993, filed on Feb. 3, 2011, provisional application No. 61/435,358, filed on Jan. 24, 2011.

(51) Int. Cl.
     G06Q 10/00   (2012.01)
     G06F 17/30   (2006.01)
     G06K 9/46    (2006.01)
     G06K 9/62    (2006.01)
     G06T 7/40    (2017.01)
     H04N 5/232   (2006.01)

(52) U.S. Cl.
     CPC ........... G06K 9/6201 (2013.01); G06Q 10/00 (2013.01); G06T 7/408 (2013.01); H04N 5/23229 (2013.01); H05K 999/99 (2013.01); G06T 2207/10024 (2013.01)

(58) Field of Classification Search
     CPC ................. H04N 1/64; H04N 1/32106; H04N 2201/325; G06T 11/60; G06T 2207/20021; G06T 2207/20144; G06T 2207/30108; G06T 7/0081
     See application file for complete search history.

(56) References Cited

     U.S. PATENT DOCUMENTS

     8,311,369 B2*  11/2012  Sambongi ........... G06F 17/3025
                                                         358/403
     8,369,616 B2    2/2013  Marchesotti et al.
     8,538,149 B2*   9/2013  Iwasaki ................. G06T 11/60
                                                         358/1.9
     8,633,986 B1*   1/2014  Hughes ............ H04N 21/25825
                                                         348/135
     8,744,180 B2    6/2014  Atsmon et al.
     9,412,182 B2    8/2016  Atsmon et al.
  2003/0208754 A1  11/2003  Sridhar et al.
  2005/0156942 A1   7/2005  Jones
  2008/0046410 A1*  2/2008  Lieb .................. G06F 17/3025
  2009/0024580 A1*  1/2009  Obrador ............ G06F 17/30256
  2009/0202147 A1   8/2009  Sambongi
  2009/0252371 A1  10/2009  Rao
  2009/0290762 A1* 11/2009  Guan ................ G06K 9/00362
                                                         382/107
  2013/0022264 A1   1/2013  Atsmon et al.
  2014/0270509 A1   9/2014  Atsmon et al.
  2015/0363945 A1  12/2015  Atsmon et al.
  2016/0343149 A1  11/2016  Atsmon et al.

     OTHER PUBLICATIONS

Official Action dated Nov. 4, 2016 From the US Patent and Trademark Office Re. U.S. Appl. No. 15/230,433.
Official Action dated Dec. 22, 2015 From the US Patent and Trademark Office Re. U.S. Appl. No. 14/833,099.
Official Action dated Sep. 22, 2014 From the US Patent and Trademark Office Re. U.S. Appl. No. 14/292,914.
Official Action dated Jul. 31, 2013 From the US Patent and Trademark Office Re. U.S. Appl. No. 13/356,815.

* cited by examiner
Case 1:20-cv-00909-LMM Document 1-3 Filed 02/27/20 Page 4 of 19

U.S. Patent          Nov. 13, 2018          Sheet 1 of 9          US 10,127,688 B2

[FIG. 1 (drawing): system 100, with a terminal capturing visual objects 120/122/124 over the Internet 106; feedback screen 141 reads "Found Red and Orange" and offers harmony selections Complementary 142, Analogous 144, Triadic 146, Split-Comp 148, Tetradic 150 and Square 152.]
Case 1:20-cv-00909-LMM Document 1-3 Filed 02/27/20 Page 5 of 19

U.S. Patent          Nov. 13, 2018          Sheet 2 of 9          US 10,127,688 B2

[Figure 2 (flowchart 200): Image DB 201 -> Capture Image 202 -> Analyze captured image 204 -> Send to remote server 206 -> Server Analysis 207 -> Harmonics 208 -> Search DB 210 -> Overlay on background 212.]
Case 1:20-cv-00909-LMM Document 1-3 Filed 02/27/20 Page 6 of 19

U.S. Patent          Nov. 13, 2018          Sheet 3 of 9          US 10,127,688 B2

[FIG. 3 (drawing): color wheel 300 with red, orange, yellow, green, blue and violet segments, divided into warm and cool halves.]
Case 1:20-cv-00909-LMM Document 1-3 Filed 02/27/20 Page 7 of 19

U.S. Patent          Nov. 13, 2018          Sheet 4 of 9          US 10,127,688 B2

[FIG. 4 (drawing): color wheel illustrating the selection of a complementary color.]
Case 1:20-cv-00909-LMM Document 1-3 Filed 02/27/20 Page 8 of 19

U.S. Patent          Nov. 13, 2018          Sheet 5 of 9          US 10,127,688 B2

[FIG. 5 (drawing): color wheel illustrating the selection of analogous colors.]
Case 1:20-cv-00909-LMM Document 1-3 Filed 02/27/20 Page 9 of 19

U.S. Patent          Nov. 13, 2018          Sheet 6 of 9          US 10,127,688 B2

[FIG. 6 (drawing): color wheel illustrating the selection of triadic colors.]
Case 1:20-cv-00909-LMM Document 1-3 Filed 02/27/20 Page 10 of 19

U.S. Patent          Nov. 13, 2018          Sheet 7 of 9          US 10,127,688 B2

[FIG. 7 (drawing): color wheel illustrating the selection of split-complementary colors.]
Case 1:20-cv-00909-LMM Document 1-3 Filed 02/27/20 Page 11 of 19

U.S. Patent          Nov. 13, 2018          Sheet 8 of 9          US 10,127,688 B2

[FIG. 8 (drawing): color wheel illustrating the selection of tetradic colors.]
Case 1:20-cv-00909-LMM Document 1-3 Filed 02/27/20 Page 12 of 19

U.S. Patent          Nov. 13, 2018          Sheet 9 of 9          US 10,127,688 B2

[FIG. 9 (drawing): color wheel illustrating the selection of square colors.]
Case 1:20-cv-00909-LMM Document 1-3 Filed 02/27/20 Page 13 of 19

US 10,127,688 B2

SYSTEM AND PROCESS FOR AUTOMATICALLY FINDING OBJECTS OF A SPECIFIC COLOR

RELATED APPLICATIONS

This application is a continuation of U.S. patent application Ser. No. 15/230,433 filed on Aug. 7, 2016, which is a continuation of U.S. patent application Ser. No. 14/833,099 filed on Aug. 23, 2015, now U.S. Pat. No. 9,412,182, which is a continuation of U.S. patent application Ser. No. 14/292,914 filed on Jun. 1, 2014, now U.S. Pat. No. 9,117,143, which is a continuation of U.S. patent application Ser. No. 13/356,815 filed on Jan. 24, 2012, now U.S. Pat. No. 8,744,180, which claims the benefit of priority under 35 USC § 119(e) of U.S. Provisional Patent Application Nos. 61/438,993 filed on Feb. 3, 2011 and 61/435,358 filed on Jan. 24, 2011. The contents of the above applications are all incorporated by reference as if fully set forth herein in their entirety.

FIELD AND BACKGROUND OF THE INVENTION

1. Technical Field

The present invention relates to systems and processes for automatically analyzing and matching an object's colors using a digital image captured on an electronic communications device.

2. Discussion of the Related Art

The prior art discloses a number of color matching applications for use on mobile devices, such as to match colors of wall paints, furniture, clothing, etc. In most cases the user captures and stores an image of the item with their mobile device camera or laptop webcam. They then capture another image at a store, and run a mobile application to compare the two images to determine if their colors match.

The mobile application may run the comparison in a variety of manners. For example, the device may show the two images side-by-side so that the user can subjectively make the decision. Or the mobile device can conduct an image analysis to determine to what degree they match. For example, U.S. Patent Application 20090252371, entitled "Mobile device with color detection capabilities", will break down the component colors into percentages (i.e. "Red: 10%, Blue 47%, Green 43% and Purchase Item Image: Red: 12%, Blue 47%, Green 41%"). It will then display a closeness of color match based upon preset overall percentage margins such as "Overall Result: Compares within 10%, Close Enough" or "Overall Result: Compares within 10%, Almost the Same" or "Overall Result: Compares within 40%, Does not Match".
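
For illustration only, the following Python sketch shows this style of percentage-based comparison. The per-channel classification, the function names, and the 10% margin are assumptions for the sake of the example, not details taken from the cited application.

# Illustrative sketch (not from the patent): comparing two images by their
# component-color percentages and reporting closeness within a preset margin.
def color_percentages(pixels):
    """Classify each (r, g, b) pixel to its dominant channel and return percentages."""
    counts = {"Red": 0, "Green": 0, "Blue": 0}
    for r, g, b in pixels:
        dominant = max((("Red", r), ("Green", g), ("Blue", b)), key=lambda t: t[1])[0]
        counts[dominant] += 1
    total = max(len(pixels), 1)
    return {name: 100.0 * n / total for name, n in counts.items()}

def closeness(stored, purchase, margin=10.0):
    """Report whether two percentage breakdowns agree within a preset margin."""
    worst = max(abs(stored[c] - purchase[c]) for c in stored)
    verdict = "Close Enough" if worst <= margin else "Does not Match"
    return f"Overall Result: Compares within {worst:.0f}%, {verdict}"

# Example with the percentages quoted above (the breakdowns could also come
# from color_percentages() applied to the two images):
stored = {"Red": 10.0, "Blue": 47.0, "Green": 43.0}
purchase = {"Red": 12.0, "Blue": 47.0, "Green": 41.0}
print(closeness(stored, purchase))   # "Overall Result: Compares within 2%, Close Enough"
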
The prior art also discloses the use of a remote server to analyze a color match. For example, in 2008 Hewlett-Packard Laboratories launched a service using a mobile device photograph to enable a woman to select her hue of foundation makeup (see U.S. Pat. No. 7,522,768). The consumer takes a photograph of herself using a phone camera while holding a specially designed color chart. The image is then sent by the consumer via multimedia messaging service (MMS) to an advisory service host at a backend server. The system uses color science to correct the image color, image processing algorithms to locate and extract the face from the image, and statistical classifiers to determine the user's foundation makeup color with accuracy close to that of a makeup expert. The consumer then receives an SMS (Short Message Service) text message containing the foundation shade recommendation that best matches her complexion.

The prior art does not, though, disclose color analysis of images captured on a mobile device using various image processing algorithms wherein the user can select what type of color hues they will receive from the system, such as analogous, triadic, tetradic, square, complementary, and split-complementary colors.

SUMMARY OF THE INVENTION

The present invention comprises a computer implemented method, system and computer program product for identifying matching colors of a visual object captured in a digital image on a mobile device, such as with a mobile phone camera or a laptop webcam. The visual object is compared to a reference object that the mobile device user or another entity has previously captured, analyzed for color content, and stored on a system database. The user can then be provided a display on their mobile identifying the primary colors in the visual object, and other colors that would coordinate with the object for a "color match", such as analogous, triadic, tetradic, square, complementary, and split-complementary colors.

In a preferred embodiment of the present invention, the user captures an image on their mobile device of a reference object (i.e. furniture, clothing, wall paint, etc. . . . ) that they wish to color coordinate with a similar object (i.e. pillows for furniture, shoes for clothing, wall paper for wall paint, etc. . . . ). The system and software will conduct a color analysis of the reference object, which will identify its Main Colors (i.e., Base, Primary, Secondary and Tertiary Colors), and optionally create a color harmonics of it. The system will query an image database, then return and display matching color combinations and/or harmonics (such as analogous, triadic, tetradic, square, complementary, and split-complementary colors) based on the query, on the user's mobile device.

The computer implemented method as conducted by the software of the present invention comprises the steps of: 1) capturing an image on a terminal device, wherein the image is associated with a visual object; 2) conducting a color analysis, i.e. determining the Main Colors, on the reference image and optionally constructing a color wheel based on the analysis; 3) querying an Image Database against the color analysis using one or more harmonics, wherein the user may manually select a particular type of harmonic; and, 4) electronically transmitting and displaying results to the terminal based on the image color analysis, e.g. Main Colors; wherein the results comprise all harmonics if the user did not select the particular type of harmonic in step (3).

The present invention uses various image enhancing and processing algorithms and techniques to detect and analyze the different color hues in a digital image, such as HSV (Hue, Saturation, Value) color histograms, RGB color histograms, CYMK color histograms, and multi-space color clustering. The color analysis may also comprise separating the object from its background, compensating for distortions such as shading and/or flash light, classifying each pixel to a predefined color set and finding the elements of the color set with the highest number of pixels.
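
As an informal illustration of the last two steps described above (classifying pixels to a predefined color set and picking the entries with the most pixels), the following Python sketch is one possible reading. The specific color set, the use of the Pillow library, and all names are assumptions; the patent does not prescribe any of them.

# Hypothetical sketch of Main Color extraction by nearest-color classification
# against a predefined color set, using Pillow (PIL).
from collections import Counter
from PIL import Image

# Assumed predefined color set (name -> RGB anchor); not specified by the patent.
COLOR_SET = {
    "red": (255, 0, 0), "orange": (255, 128, 0), "yellow": (255, 255, 0),
    "green": (0, 160, 0), "blue": (0, 0, 255), "violet": (138, 43, 226),
    "black": (0, 0, 0), "white": (255, 255, 255), "gray": (128, 128, 128),
}

def classify(pixel):
    """Assign an RGB pixel to the nearest color in the predefined set."""
    r, g, b = pixel
    return min(COLOR_SET,
               key=lambda name: sum((p - a) ** 2 for p, a in zip((r, g, b), COLOR_SET[name])))

def main_colors(path, top_n=3):
    """Return the top_n color-set entries with the highest pixel counts."""
    img = Image.open(path).convert("RGB").resize((64, 64))  # downsample for speed
    counts = Counter(classify(p) for p in img.getdata())
    return counts.most_common(top_n)

# Example usage (hypothetical file name):
# print(main_colors("wall_photo.jpg"))  # e.g. [('green', 2900), ('gray', 700), ('white', 496)]
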
Other aspects of the invention may include a system arranged to execute the aforementioned methods and a computer readable program to include a mobile application
Case 1:20-cv-00909-LMM Document 1-3 Filed 02/27/20 Page 14 of 19

US 10,127,688 B2

configured to execute the aforementioned methods. These, additional, and/or other aspects and/or advantages of the embodiments of the present invention are set forth in the detailed description which follows; possibly inferable from the detailed description; and/or learnable by practice of the embodiments of the present invention.

BRIEF DESCRIPTION OF THE SEVERAL VIEWS OF THE DRAWINGS

The present invention will now be described in the following detailed description of exemplary embodiments of the invention and with reference to the attached drawings, in which dimensions of components and features shown are chosen for convenience and clarity of presentation and are not necessarily shown to scale. Generally, only structures, elements or parts that are germane to the discussion are shown in the figure.

FIG. 1 is a scheme describing the system and process in accordance with an exemplary embodiment of the invention.

FIG. 2 is a flowchart of acts performed in capturing and matching a visual object, in accordance with an exemplary embodiment of the invention.

FIG. 3 is a scheme describing a color wheel in accordance with an exemplary embodiment of the invention.

FIG. 4 is a scheme describing the selection of a complementary color in accordance with an exemplary embodiment of the invention.

FIG. 5 is a scheme describing the selection of analogous colors in accordance with an exemplary embodiment of the invention.

FIG. 6 is a scheme describing the selection of triadic colors in accordance with an exemplary embodiment of the invention.

FIG. 7 is a scheme describing the selection of split-complementary colors in accordance with an exemplary embodiment of the invention.

FIG. 8 is a scheme describing the selection of tetradic colors in accordance with an exemplary embodiment of the invention.

FIG. 9 is a scheme describing the selection of square colors in accordance with an exemplary embodiment of the invention.

DESCRIPTION OF SPECIFIC EMBODIMENTS OF THE INVENTION

Provided herein is a detailed description of this invention. It is to be understood, however, that this invention may be embodied in various forms, and that the suggested (or proposed) embodiments are only possible implementations (or examples of feasible embodiments, or materializations) of this invention. Therefore, specific details disclosed herein are not to be interpreted as limiting, but rather as a basis and/or principle for the claims, and/or as a representative basis for teaching one skilled in the art to employ this invention in virtually any appropriately detailed system, structure or manner.

Glossary of Terms

To facilitate understanding the present invention, the following glossary of terms is provided. It is to be noted that terms used in the specification but not included in this glossary are considered as defined according to the normal usage of the computer science art, or alternatively according to normal dictionary usage.

The term "image" as used herein in this application is defined as a visual representation that can be presented on two dimensional or three dimensional surfaces. Images can be taken in any part of the electromagnetic spectrum such as visible light, infrared, ultraviolet, X-rays, Terahertz, Microwaves, and Radio frequency waves. The reference images (i.e. color wheel) are stored in an "image database (DB)" on the system server or on the mobile device.

The term "photo" as used herein in this application is defined as an image in the visible light.

The term "GPS" as used herein in this application is defined as a system based on satellites that allows a user with a receiver to determine precise coordinates for their location on the earth's surface.

The term "GPU" as used herein in this application is defined as an apparatus adapted to reduce the time it takes to produce images on the computer screen by incorporating its own processor and memory, having more than 16 CPU cores, such as GeForce 8800.

The term "Keypoint" as used herein in this application is defined as interest points in an object. For example, in the SIFT framework, the image is convolved with Gaussian filters at different scales, and then the differences of successive Gaussian-blurred images are taken. Keypoints are then taken as maxima/minima of the Difference of Gaussians. Such keypoints can be calculated for the original image or for a transformation of the original image, such as an affine transform of the original images.

The term "Keypoint descriptor" as used herein in this application is defined as a descriptor of a keypoint. For example, in the SIFT framework the feature descriptor is computed as a set of orientation histograms on neighborhoods. The orientation histograms are relative to the keypoint orientation and the orientation data comes from the Gaussian image closest in scale to the keypoint's scale. Just like before, the contribution of each pixel is weighted by the gradient magnitude, and by a Gaussian with 1.5 times the scale of the keypoint. Histograms contain 8 bins each, and each descriptor contains an array of 4×4 histograms around the keypoint. This leads to a SIFT feature vector with 4×4×8 = 128 elements.
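
As a quick, informal check of the 128-element figure above, the following Python sketch uses OpenCV's SIFT implementation; the choice of OpenCV and the file name are assumptions, since the patent does not name any particular library.

# Sketch only: SIFT keypoints and descriptors with OpenCV. Each descriptor row
# has 128 values, matching the 4x4 histogram grid with 8 orientation bins each.
import cv2

img = cv2.imread("reference_object.jpg", cv2.IMREAD_GRAYSCALE)  # hypothetical file
sift = cv2.SIFT_create()
keypoints, descriptors = sift.detectAndCompute(img, None)

print(len(keypoints))      # number of detected keypoints
print(descriptors.shape)   # (number_of_keypoints, 128)
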
The term "Visual content item" as used herein in this application is defined as an object with visual characteristics such as an image file like BMP, JPG, JPEG, GIF, TIFF, PNG files; a screenshot; a video file like AVI, MPG, MPEG, MOV, WMV, FLV files; or one or more frames of a video.

The term "visual object" as used herein in this application is defined as content that includes visual information such as a visual content item, images, photos, videos, IR image, magnified image, an image sequence or TV broadcast.

The term "camera" as used herein in this application is defined as a means of capturing a visual object.

The term "terminal" as used herein in this application is defined as an apparatus adapted to show visual content such as a computer, a laptop computer, mobile phone or a TV.

The term "visual similarity" as used herein in this application is defined as the measure of resemblance between two visual objects, which can be comprised of:

The fit between their color distributions, such as the correlation between their HSV color histograms
The fit between their textures
The fit between their shapes
The correlation between their edge histograms
Face similarity
Methods that include local descriptors such as Scale-invariant feature transform (SIFT), Affine-SIFT
Case 1:20-cv-00909-LMM Document 1-3 Filed 02/27/20 Page 15 of 19

US 10,127,688 B2

(ASIFT), Speeded Up Robust Feature (SURF), and Multi-Scale Retinex (MSR)
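
To make the first similarity measure in the list above concrete, here is an informal Python sketch that correlates the HSV color histograms of two images with OpenCV. The library choice, bin counts and file names are assumptions, not details drawn from the patent.

# Sketch: visual similarity via correlation of HSV color histograms (OpenCV).
import cv2

def hsv_hist(path, bins=(30, 32)):
    """Build a normalized 2D histogram over Hue and Saturation."""
    img = cv2.imread(path)                           # hypothetical file path
    hsv = cv2.cvtColor(img, cv2.COLOR_BGR2HSV)
    hist = cv2.calcHist([hsv], [0, 1], None, list(bins), [0, 180, 0, 256])
    return cv2.normalize(hist, hist).flatten()

def hsv_similarity(path_a, path_b):
    """Return histogram correlation in [-1, 1]; 1.0 means identical distributions."""
    return cv2.compareHist(hsv_hist(path_a), hsv_hist(path_b), cv2.HISTCMP_CORREL)

# Example usage (hypothetical file names):
# print(hsv_similarity("stored_item.jpg", "purchase_item.jpg"))
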
The term "Visual analysis" as used herein in this application is defined as the analysis of the characteristics of visual objects, such as visual similarity, coherence, hierarchical organization, concept load or density, feature extraction and noise removal.

The term "Capturing data analysis" as used herein in this application is defined as the analysis of capturing data such as:

X-Y-Z coordinates
3 angles
Manufacturer
Model
Orientation (rotation) top-left
Software
Date and Time
YCbCr Positioning centered
Compression
x-Resolution
y-Resolution
Resolution Unit
Exposure Time
FNumber
ExposureProgram
Exif Version
Date and Time (original)
Date and Time (digitized)
ComponentsConfiguration Y Cb Cr
Compressed Bits per Pixel
Exposure Bias
MaxApertureValue
Metering Mode Pattern
Flash fired or not
Focal Length
MakerNote
FlashPixVersion
Color Space
PixelXDimension
PixelYDimension
File Source
InteroperabilityIndex
InteroperabilityVersion
Derivatives of the above, such as acceleration in the X-axis
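
As a rough illustration of reading this kind of capture metadata (EXIF tags) from a photo, the sketch below uses Pillow. The library choice and the printed fields are assumptions; the patent does not prescribe any particular tool or tag set.

# Sketch: reading capturing data (EXIF tags such as Model, Orientation,
# Date and Time) from an image file with Pillow.
from PIL import Image, ExifTags

def capturing_data(path):
    """Return a dict of human-readable EXIF tag names to values."""
    exif = Image.open(path).getexif()               # hypothetical file path
    return {ExifTags.TAGS.get(tag_id, tag_id): value for tag_id, value in exif.items()}

# Example usage:
# data = capturing_data("captured_object.jpg")
# print(data.get("Model"), data.get("Orientation"), data.get("DateTime"))
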
The term "Service location" as used herein in this application is defined as a physical place where objects can be serviced and/or fixed, such as a mobile carrier service center.

The term "Location based analysis" as used herein in this application is defined as analysis of local data such as GPS location, triangulation data, RFID data, and street address. Location data can, for example, identify the service location or even the specific part of the service location in which the visual object was captured.

The term "Color analysis" as used herein in this application is defined as the combination of visual analysis, capturing data analysis, location based analysis and/or analysis of other data and analysis history to extract a color from a visual object. Color analysis can include the steps of separating the main object from its background, compensating for distortions such as shading and/or flash light, classifying each pixel to a predefined color set and finding the elements of the color set with the highest number of pixels.

The term "marketplace" as used herein in this application is defined as a physical place where objects can be bought, such as a bank, a change point, a supermarket, a convenience store and a grocery store.

The term "color wheel" as used herein in this application, and as further described in FIG. 3, is defined as an abstract illustrative organization of color hues around a circle that shows relationships between primary colors, secondary colors, and complementary colors. In the RYB (or subtractive) color model, the primary colors are red, yellow and blue. The three secondary colors (green, orange and purple) are created by mixing two primary colors. Another six tertiary colors are created by mixing primary and secondary colors.

The term "color harmonies" as used herein in this application is defined as color combinations that are considered especially pleasing. They consist of two or more colors with a fixed relation in the color wheel.

The term "color impact" as used herein in this application is defined as the dynamic creation of a color wheel to match a visual object's Base, Primary, Secondary and Tertiary Colors.

The term "warm colors" as used herein in this application is defined as vivid and energetic colors.

The term "cool colors" as used herein in this application is defined as colors that give an impression of calm, and create a soothing impression. White, black and gray are considered to be neutral.

The term "complementary colors" as used herein in this application, and as further shown in FIG. 4, is defined as colors that are opposite each other on the color wheel.

The term "analogous colors" as used herein in this application, and as further shown in FIG. 5, is defined as colors that are next to each other on the color wheel.

The term "triadic colors" as used herein in this application, and as further shown in FIG. 6, is defined as colors that are evenly spaced around the color wheel.

The term "split-complementary colors" as used herein in this application, and as further shown in FIG. 7, is defined as a set of a base color on the color wheel and the two colors adjacent to its complementary color.

The term "tetradic colors" as used herein in this application, and as further shown in FIG. 8, is defined as four colors arranged into two complementary pairs on the color wheel.

The term "square colors" as used herein in this application, and as further shown in FIG. 9, is defined as four colors arranged into two complementary pairs on the color wheel, with all four colors spaced evenly around the color wheel.
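
For readers who want the arithmetic behind these harmonies, the sketch below derives each harmony from a base hue expressed as an angle on the color wheel (0 to 360 degrees). The specific offsets (30 degrees for analogous and split-complementary neighbors, 60 degrees for the tetradic pair) are common color-theory conventions and an assumption here; the patent does not fix exact angles.

# Sketch: deriving harmonic hues from a base hue in degrees on the color wheel.
# Offsets follow common conventions and are not values taken from the patent.
def harmonics(base_hue):
    h = lambda *offsets: [(base_hue + o) % 360 for o in offsets]
    return {
        "complementary":       h(0, 180),            # opposite on the wheel
        "analogous":           h(-30, 0, 30),        # neighbors of the base
        "triadic":             h(0, 120, 240),       # evenly spaced thirds
        "split-complementary": h(0, 150, 210),       # base plus neighbors of its complement
        "tetradic":            h(0, 60, 180, 240),   # two complementary pairs
        "square":              h(0, 90, 180, 270),   # two pairs, evenly spaced
    }

# Example: a base hue of 0 degrees
# harmonics(0)["complementary"] -> [0, 180]
# harmonics(0)["square"]        -> [0, 90, 180, 270]
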
System for Analyzing Color Images

FIG. 1 is a scheme describing the system and process in accordance with an exemplary embodiment of the invention. System 100 performs the process described hereinafter:

Terminal 101, such as a mobile phone with camera 102 or a computer webcam, captures a visual object 120 representing physical objects such as a man with a shirt 124 or an apartment wall 122.

The capturing can be performed in several ways:

Taking a photograph
Recording a video
Continuously capturing an image while local or remote processing provides real time feedback such as "color not decided" or "a problem was found". The continuous capturing can be done while moving the camera, such as moving in the directions shown in 103.

Said visual object can be captured from a static camera placed in the marketplace or from a camera held by person 112. Person 112 can be a crowd of people that were incentivized to capture the object.

Said visual object can be processed locally using terminal 101 or it can be sent to a remote server 108, as further described in step 206 in FIG. 2, over a network 106, such as the internet.
Case 1:20-cv-00909-LMM Document 1-3 Filed 02/27/20 Page 16 of 19

US 10,127,688 B2

Server 108 or device 101 calculates a color feedback 140 that is sent over the internet or created locally. Feedback 140 shows:

Main colors found 141 on the visual object using steps 204 to 207, as further described in FIG. 2.

An option to select further images of the image DB, as further described in step 201 in FIG. 2, according to different color harmonies using one or more of the main colors found:

Complementary colors 142 for color harmony, further described in FIG. 4;
Analogous colors 144 for color harmony, further described in FIG. 5;
Triadic colors 146 for color harmony, further described in FIG. 6;
Split-complementary colors 148 for color harmony, further described in FIG. 7;
Tetradic colors 150 for color harmony, further described in FIG. 8; and,
Square colors 152 for color harmony, further described in FIG. 9.

An example would be that a person takes a photo of a green wall using a mobile device camera 102 or computer webcam, in which green is found as the main color. The user selects complementary colors 142, and wall art having red as its main color is presented for the user to choose. One or more of the matching wall art items can later be presented on the terminal display 101 to demonstrate the results to the user, comprising red art on the green wall.

Method of Capturing and Matching a Visual Object's Colors

FIG. 2 is a flowchart of acts performed in capturing and matching a visual object, in accordance with an exemplary embodiment of the invention.

The flowchart describes a process and system 200 to capture and match visual objects.

An image database (DB) is loaded 201, including photos of a plurality of visual objects from one or more sides. For example, a database of shirts is loaded and their main colors are extracted using color analysis techniques of the present invention. Visual object representative 120 is then captured 202 using a mobile device camera 102 or computer webcam.

The captured object is optionally analyzed 204 locally, as further described in step 207, to get a match using color or color harmonics analysis or to reduce the size of the data to be sent to remote or local servers in step 206.

Optionally the image itself or a processed part of it is sent 206 to a remote server 108 or locally processed on a server at device 101. The server performs color analysis 207 to generate color feedback 140. Such analysis uses the visual object and optionally other data, such as GPS data, the history of the sender, history of similar types of visual objects, and predefined categories. Main colors found in the visual object are displayed on color feedback 140.

In case the user manually selects harmonics 208, the main colors found in steps 204-207 are used to find images in the Image DB 210 that fit the relevant harmonics (i.e. complementary, analogous, triadic, split-complementary, tetradic, or square color matches). In case the user does not manually select harmonics, a sample from each harmonic is displayed to the user, and then he/she chooses the harmonic he/she prefers. Feedback disclosing results of the harmonic colors that match the visual object of interest is then displayed in step 212 using a device such as 101. The feedback report 140 may comprise various forms. For example, the harmonic color(s) may be displayed on top of the original image captured in step 202. Optionally, further commercial ads are displayed in step 212 on device 101 in addition to the color match.
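
A condensed, informal sketch of the flow just described (steps 201 through 212) might look as follows. The in-memory "DB", the hue positions, the tolerance and all names are hypothetical, illustrating one way such a pipeline could be wired together rather than the patented implementation itself.

# Sketch of the capture-and-match flow. Hue positions follow the RYB wheel of
# FIG. 3 (red 0, orange 60, yellow 120, green 180, blue 240, violet 300 degrees);
# the catalog and helper names are hypothetical.

# Step 201: image DB of catalog items with pre-extracted main hues (degrees).
IMAGE_DB = [
    {"item": "red wall art", "main_hue": 0},
    {"item": "orange rug", "main_hue": 60},
    {"item": "blue vase", "main_hue": 240},
]

def find_matches(main_hue, harmonic="complementary", tolerance=15):
    """Steps 208-210: return DB items whose main hue fits the chosen harmonic."""
    if harmonic == "complementary":
        offsets = (180,)
    elif harmonic == "triadic":
        offsets = (120, 240)
    elif harmonic == "square":
        offsets = (90, 180, 270)
    else:
        raise ValueError("harmonic not covered by this sketch")
    targets = [(main_hue + o) % 360 for o in offsets]

    def close(a, b):
        d = abs(a - b) % 360
        return min(d, 360 - d) <= tolerance

    return [row["item"] for row in IMAGE_DB if any(close(row["main_hue"], t) for t in targets)]

# Steps 202-207 would extract the main color of the captured image; assume the
# green wall of the example above (hue 180). Step 212 displays the results.
print(find_matches(180, "complementary"))   # -> ['red wall art']
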
Optionally a straightforward search of an object of the same color and/or tone is performed.