Case 1:20-cv-00909-LMM Document 1-2 Filed 02/27/20 Page 1 of 19

EXHIBIT A
US009710928B2

(12) United States Patent          (10) Patent No.:     US 9,710,928 B2
     Atsmon et al.                 (45) Date of Patent:  *Jul. 18, 2017

(54) SYSTEM AND PROCESS FOR AUTOMATICALLY FINDING OBJECTS OF A SPECIFIC COLOR

(71) Applicants: Alon Atsmon, Ganey-Tikva (IL); Dan Atsmon, Rehovot (IL)

(72) Inventors: Alon Atsmon, Ganey-Tikva (IL); Dan Atsmon, Rehovot (IL)

(*) Notice: Subject to any disclaimer, the term of this patent is extended or adjusted under 35 U.S.C. 154(b) by 0 days. This patent is subject to a terminal disclaimer.

(21) Appl. No.: 15/230,433

(22) Filed: Aug. 7, 2016

(65) Prior Publication Data
     US 2016/0343149 A1     Nov. 24, 2016

     Related U.S. Application Data
(63) Continuation of application No. 14/833,099, filed on Aug. 23, 2015, now Pat. No. 9,412,182, which is a (Continued)

(51) Int. Cl.
     G06K 9/00   (2006.01)
     G06T 7/40   (2017.01)
     (Continued)

(52) U.S. Cl.
     CPC .......... G06T 7/408 (2013.01); G06F 17/3025 (2013.01); G06K 9/4652 (2013.01); (Continued)

(58) Field of Classification Search
     CPC G06K 9/6201; G06K 9/4652; G06F 17/3025; H04N 1/6058; H04N 1/60; (Continued)

(56) References Cited

     U.S. PATENT DOCUMENTS

     7,522,768 B2    4/2009  Bhatti et al.
     8,369,616 B2*   2/2013  Marchesotti ......... G06K 9/4652
                                                         382/162
     (Continued)

     FOREIGN PATENT DOCUMENTS

     WO   WO 2004/027695     4/2004
     WO   WO 2009/137830    11/2009

     OTHER PUBLICATIONS

     Applicant-Initiated Interview Summary Dated Feb. 19, 2016 From the US Patent and Trademark Office Re. U.S. Appl. No. 14/833,099.
     (Continued)

     Primary Examiner - Ali Bayat

(57) ABSTRACT

A computer implemented method, system and computer program product for identifying the Main Colors and the matching colors of a visual object, and then viewing on a mobile device select items comprising the matching colors, such as from a merchant's catalogue. A visual object is analyzed for color content, and the results are stored on a system database located on the device or on a remote server. The color analysis of the objects comprises advanced image processing techniques, such as Main Color extraction using color space transformation comprising HSV, RGB and CYMK to map between pixels in the image. The user can subsequently view a display on their mobile identifying the visual object's Main Colors and at least one Harmonic Color; and then select and view all items (i.e. products in a database) comprising one Harmonic Color, and/or all items of a specific type and Harmonic Color.

20 Claims, 9 Drawing Sheets

[Front-page drawing: terminal 101 connected over the Internet (106) captures visual object 120/124; feedback screen reads "Found Red and Orange" with harmony selections 141-152: Analogous, Complementary, Triadic, Split-Comp., Tetradic, Square]
US 9,710,928 B2
Page 2

     Related U.S. Application Data

     continuation of application No. 14/292,914, filed on Jun. 1, 2014, now Pat. No. 9,117,143, which is a continuation of application No. 13/356,815, filed on Jan. 24, 2012, now Pat. No. 8,744,180.

(60) Provisional application No. 61/438,993, filed on Feb. 3, 2011, provisional application No. 61/435,358, filed on Jan. 24, 2011.

(51) Int. Cl.
     G06Q 10/00   (2012.01)
     G06F 17/30   (2006.01)
     G06K 9/46    (2006.01)
     G06K 9/62    (2006.01)
     H04N 5/232   (2006.01)
     G06T 7/90    (2017.01)

(52) U.S. Cl.
     CPC ........... G06K 9/6201 (2013.01); G06Q 10/00 (2013.01); G06T 7/90 (2017.01); H04N 5/23229 (2013.01); G06T 2207/10024 (2013.01)

(58) Field of Classification Search
     CPC ................. H04N 1/64; H04N 1/32106; H04N 2201/325; G06T 11/60; G06T 2207/20021; G06T 2207/20144; G06T 2207/30108; G06T 7/0081
     See application file for complete search history.

(56) References Cited

     U.S. PATENT DOCUMENTS

     8,744,180 B2        6/2014  Atsmon et al.
     2003/0208754 A1*   11/2003  Sridhar .................. G06Q 30/02
                                                                725/34
     2005/0156942 A1     7/2005  Jones
     2008/0046410 A1*    2/2008  Lieb ..................... G06F 17/3025
     2009/0024580 A1*    1/2009  Obrador ............. G06F 17/30256
     2009/0202147 A1     8/2009  Sambongi
     2009/0252371 A1    10/2009  Rao
     2009/0290762 A1*   11/2009  Guan ................. G06K 9/00362
                                                                382/107
     2013/0022264 A1     1/2013  Atsmon et al.
     2014/0270509 A1     9/2014  Atsmon et al.
     2015/0363945 A1    12/2015  Atsmon et al.

     OTHER PUBLICATIONS

     Notice of Allowance Dated Apr. 1, 2016 From the US Patent and Trademark Office Re. U.S. Appl. No. 14/833,099.
     Notice of Allowance Dated Apr. 20, 2015 From the US Patent and Trademark Office Re. U.S. Appl. No. 14/292,914.
     Notice of Allowance Dated Jan. 31, 2014 From the US Patent and Trademark Office Re. U.S. Appl. No. 13/356,815.
     Official Action Dated Dec. 22, 2015 From the US Patent and Trademark Office Re. U.S. Appl. No. 14/833,099.
     Official Action Dated Sep. 22, 2014 From the US Patent and Trademark Office Re. U.S. Appl. No. 14/292,914.
     Official Action Dated Jul. 31, 2013 From the US Patent and Trademark Office Re. U.S. Appl. No. 13/356,815.

     * cited by examiner
U.S. Patent     Jul. 18, 2017     Sheet 1 of 9     US 9,710,928 B2

[FIG. 1 (drawing): color feedback display reading "Found Red and Orange" (141) with harmony selections: Complementary (142), Analogous (144), Triadic (146), Split-Comp. (148), Tetradic (150), Square (152)]
U.S. Patent     Jul. 18, 2017     Sheet 2 of 9     US 9,710,928 B2

[FIG. 2 (drawing): flowchart 200: load image DB (201), capture image (202), analyze captured image (204), send to remote server (206), color analysis (207), select harmonics (208), query image DB (210), display results overlaid on background (212)]
U.S. Patent     Jul. 18, 2017     Sheet 3 of 9     US 9,710,928 B2

[FIG. 3 (drawing): color wheel 300 of 12 hues (302-324), including Red, Red-Orange, Orange, Yellow-Orange, Yellow, Yellow-Green, Green, Blue-Green, Blue, Blue-Violet, Violet and Red-Violet, divided by line 330 into warm and cool halves]
U.S. Patent     Jul. 18, 2017     Sheet 4 of 9     US 9,710,928 B2

[FIG. 4 (drawing): selecting a complementary color on the color wheel]
U.S. Patent     Jul. 18, 2017     Sheet 5 of 9     US 9,710,928 B2

[FIG. 5 (drawing): selecting analogous colors on the color wheel]
U.S. Patent     Jul. 18, 2017     Sheet 6 of 9     US 9,710,928 B2

[FIG. 6 (drawing): selecting triadic colors on the color wheel]
U.S. Patent     Jul. 18, 2017     Sheet 7 of 9     US 9,710,928 B2

[FIG. 7 (drawing): selecting split-complementary colors on the color wheel]
U.S. Patent     Jul. 18, 2017     Sheet 8 of 9     US 9,710,928 B2

[FIG. 8 (drawing): selecting tetradic colors on the color wheel]
U.S. Patent     Jul. 18, 2017     Sheet 9 of 9     US 9,710,928 B2

[FIG. 9 (drawing): selecting square colors on the color wheel]
SYSTEM AND PROCESS FOR AUTOMATICALLY FINDING OBJECTS OF A SPECIFIC COLOR

RELATED APPLICATIONS

This application is a continuation of U.S. patent application Ser. No. 14/833,099 filed on Aug. 23, 2015, which is a continuation of U.S. patent application Ser. No. 14/292,914 filed on Jun. 1, 2014, now U.S. Pat. No. 9,117,143, which is a continuation of U.S. patent application Ser. No. 13/356,815 filed on Jan. 24, 2012, now U.S. Pat. No. 8,744,180, which claims the benefit of priority under 35 U.S.C. §119(e) of U.S. Provisional Patent Application Nos. 61/438,993 filed on Feb. 3, 2011 and 61/435,358 filed on Jan. 24, 2011. The contents of the above applications are all incorporated by reference as if fully set forth herein in their entirety.
FIELD AND BACKGROUND OF THE INVENTION

1. Technical Field

The present invention relates to systems and processes for automatically analyzing and matching an object's colors using a digital image captured on an electronic communications device.
2. Discussion of the Related Art

The prior art discloses a number of color matching applications for use on mobile devices, such as to match colors of wall paints, furniture, clothing, etc. In most cases the user captures and stores an image of the item with their mobile device camera or laptop webcam. They then capture another image at a store, and run a mobile application to compare the two images to determine if their colors match.

The mobile application may run the comparison in a variety of manners. For example, the device may show the two images side-by-side so that the user can subjectively make the decision. Or the mobile device can conduct an image analysis to determine to what degree they match. For example, United States Patent Application 20090252371, entitled "Mobile device with color detection capabilities," will break down the component colors into percentages (i.e. "Red: 10%, Blue 47%, Green 43% and Purchase Item Image: Red: 12%, Blue 47%, Green 41%"). It will then display a closeness of color match based upon preset overall percentage margins, such as "Overall Result: Compares within 10%, Close Enough" or "Overall Result: Compares within 10%, Almost the Same" or "Overall Result: Compares within 40%, Does not Match".
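A percentage-breakdown comparison of the kind quoted above can be sketched as follows. This is a hedged illustration, not code from the cited application: the dominant-channel classification rule and the verdict phrasing are condensed from the quoted example, and the function names are ours.

```python
from collections import Counter

CHANNELS = ("red", "green", "blue")

def color_percentages(pixels):
    """Classify each (r, g, b) pixel by its dominant channel and return the
    share of each class as a percentage of the image."""
    counts = Counter(CHANNELS[px.index(max(px))] for px in pixels)
    return {c: 100.0 * counts[c] / len(pixels) for c in CHANNELS}

def match_verdict(ref, item):
    """Report closeness using preset margins (10% / 40%, as in the example)."""
    diff = max(abs(ref[c] - item[c]) for c in CHANNELS)
    if diff <= 10:
        return "Compares within 10%, Close Enough"
    if diff <= 40:
        return "Compares within 40%, Almost the Same"
    return "Does not Match"
```

With the quoted breakdowns (reference 10/47/43 versus item 12/47/41, so a worst channel gap of 2 points), the verdict falls inside the 10% margin.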
The prior art also discloses the use of a remote server to analyze a color match. For example, in 2008 Hewlett-Packard Laboratories launched a service using a mobile device photograph to enable a woman to select her hue of foundation makeup (see U.S. Pat. No. 7,522,768). The consumer takes a photograph of herself using a phone camera while holding a specially designed color chart. The image is then sent by the consumer via multimedia messaging service (MMS) to an advisory service host at a backend server. The system uses color science to correct the image color, image processing algorithms to locate and extract the face from the image, and statistical classifiers to determine the user's foundation makeup color with accuracy close to that of a makeup expert. The consumer then receives an SMS (Short Message Service) text message containing the foundation shade recommendation that best matches her complexion.
The prior art does not, though, disclose color analysis of images captured on a mobile device using various image processing algorithms wherein the user can select what type of color hues they will receive from the system, such as analogous, triadic, tetradic, square, complementary, and split-complementary colors.
SUMMARY OF THE INVENTION

The present invention comprises a computer implemented method, system and computer program product for identifying matching colors of a visual object captured in a digital image on a mobile device, such as with a mobile phone camera or a laptop webcam. The visual object is compared to a reference object that the mobile device user or another entity has previously captured, analyzed for color content, and stored on a system database. The user can then be provided a display on their mobile identifying the primary colors in the visual object, and other colors that would coordinate with the object for a "color match", such as analogous, triadic, tetradic, square, complementary, and split-complementary colors.
In a preferred embodiment of the present invention, the user captures an image on their mobile device of a reference object (i.e. furniture, clothing, wall paint, etc.) that they wish to color coordinate with a similar object (i.e. pillows for furniture, shoes for clothing, wall paper for wall paint, etc.). The system and software will conduct a color analysis of the reference object, which will identify its Main Colors (i.e., Base, Primary, Secondary and Tertiary Colors), and optionally create a color harmonics of it. The system will query the image database, then return and display matching color combinations and/or harmonics (such as analogous, triadic, tetradic, square, complementary, and split-complementary colors) based on the query, on the user's mobile device.
The computer implemented method as conducted by the software of the present invention comprises the steps of: 1) capturing an image on a terminal device, wherein the image is associated with a visual object; 2) conducting a color analysis, i.e. determining the Main Colors, on the reference image and optionally constructing a color wheel based on the analysis; 3) querying an Image Database against the color analysis using one or more harmonics, wherein the user may manually select a particular type of harmonic; and, 4) electronically transmitting and displaying results to the terminal based on the image color analysis, e.g. Main Colors; wherein the results comprise all harmonics if the user did not select the particular type of harmonic in step (3).
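The four steps above can be sketched as one small pipeline. This is only an illustrative skeleton: the function and table names are ours, the color set is reduced to three classes, and the harmonic table is truncated to the complementary case so that the all-harmonics fallback of step (4) is visible.

```python
from collections import Counter

# One-harmonic lookup table for the sketch; a full implementation would also
# cover analogous, triadic, split-complementary, tetradic and square.
COMPLEMENT = {"red": "green", "green": "red", "blue": "orange", "orange": "blue"}
ALL_HARMONICS = ("complementary",)

def main_color(pixels):
    """Step 2: classify each (r, g, b) pixel to a coarse predefined color set
    and return the most frequent class."""
    names = ("red", "green", "blue")
    counts = Counter(names[px.index(max(px))] for px in pixels)
    return counts.most_common(1)[0][0]

def query_items(pixels, image_db, harmonic=None):
    """Steps 3-4: query the image DB (item -> main color name) against the
    color analysis; if the user selected no harmonic, cover every harmonic."""
    base = main_color(pixels)
    chosen = [harmonic] if harmonic else list(ALL_HARMONICS)
    return {h: [item for item, color in image_db.items()
                if color == COMPLEMENT.get(base)]
            for h in chosen}
```

For a mostly-green image queried against a two-item database, the complementary bucket returns the red item, mirroring the green-wall example given later in the specification.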
The present invention uses various image enhancing and processing algorithms and techniques to detect and analyze the different color hues in a digital image, such as HSV (Hue, Saturation, Value) color histograms, RGB color histograms, CYMK color histograms, and multi-space color clustering. The color analysis may also comprise separating the object from its background, compensating for distortions such as shading and/or flash light, classifying each pixel to a predefined color set and finding the elements of the color set with the highest number of pixels.
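The pixel-classification and histogram steps can be sketched in HSV terms as follows. The thresholds, the six-hue bin set, and the neutral classes are illustrative assumptions, not values from the patent:

```python
import colorsys
from collections import Counter

HUES = ("red", "yellow", "green", "cyan", "blue", "magenta")  # coarse 6-hue set

def classify(px):
    """Map an (r, g, b) pixel (0-255) to a named bin via HSV; dark pixels go
    to black and desaturated ones to white/gray (thresholds illustrative)."""
    h, s, v = colorsys.rgb_to_hsv(px[0] / 255, px[1] / 255, px[2] / 255)
    if v < 0.15:
        return "black"
    if s < 0.15:
        return "white" if v > 0.85 else "gray"
    return HUES[int(h * 6 + 0.5) % 6]  # snap to the nearest of 6 hue centers

def main_colors(pixels, top=2):
    """Histogram the classified pixels and keep the bins with the most pixels,
    i.e. the elements of the color set with the highest pixel counts."""
    return [name for name, _ in Counter(map(classify, pixels)).most_common(top)]
```

Background separation and shading/flash compensation would run before this classification; they are omitted here.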
Other aspects of the invention may include a system arranged to execute the aforementioned methods and a computer readable program to include a mobile application configured to execute the aforementioned methods. These, additional, and/or other aspects and/or advantages of the embodiments of the present invention are set forth in the detailed description which follows; possibly inferable from the detailed description; and/or learnable by practice of the embodiments of the present invention.
BRIEF DESCRIPTION OF THE SEVERAL VIEWS OF THE DRAWING(S)

The present invention will now be described in the following detailed description of exemplary embodiments of the invention and with reference to the attached drawings, in which dimensions of components and features shown are chosen for convenience and clarity of presentation and are not necessarily shown to scale. Generally, only structures, elements or parts that are germane to the discussion are shown in the figure.
FIG. 1 is a scheme describing the system and process in accordance with an exemplary embodiment of the invention.

FIG. 2 is a flowchart of acts performed in capturing and matching a visual object, in accordance with an exemplary embodiment of the invention.

FIG. 3 is a scheme describing a color wheel in accordance with an exemplary embodiment of the invention.

FIG. 4 is a scheme describing selecting a complementary color in accordance with an exemplary embodiment of the invention.

FIG. 5 is a scheme describing the selection of analogous colors in accordance with an exemplary embodiment of the invention.

FIG. 6 is a scheme describing the selection of triadic colors in accordance with an exemplary embodiment of the invention.

FIG. 7 is a scheme describing the selection of split complementary colors in accordance with an exemplary embodiment of the invention.

FIG. 8 is a scheme describing the selection of tetradic colors in accordance with an exemplary embodiment of the invention.

FIG. 9 is a scheme describing the selection of square colors in accordance with an exemplary embodiment of the invention.
DESCRIPTION OF SPECIFIC EMBODIMENTS OF THE INVENTION

Provided herein is a detailed description of this invention. It is to be understood, however, that this invention may be embodied in various forms, and that the suggested (or proposed) embodiments are only possible implementations (or examples of feasible embodiments, or materializations) of this invention. Therefore, specific details disclosed herein are not to be interpreted as limiting, but rather as a basis and/or principle for the claims, and/or as a representative basis for teaching one skilled in the art to employ this invention in virtually any appropriately detailed system, structure or manner.
Glossary of Terms

To facilitate understanding the present invention, the following glossary of terms is provided. It is to be noted that terms used in the specification but not included in this glossary are considered as defined according to the normal usage of the computer science art, or alternatively according to normal dictionary usage.

The term "image" as used herein in this application is defined as a visual representation that can be presented on two dimensional or three dimensional surfaces. Images can be taken in any part of the electromagnetic spectrum such as visible light, infrared, ultraviolet, X-rays, Terahertz, Microwaves, and Radio frequency waves. The reference images (i.e. color wheel) are stored in an "image database (DB)" on the system server or on the mobile device.

The term "photo" as used herein in this application is defined as an image in the visible light.

The term "GPS" as used herein in this application is defined as a system based on satellites that allows a user with a receiver to determine precise coordinates for their location on the earth's surface.
The term "GPU" as used herein in this application is defined as an apparatus adapted to reduce the time it takes to produce images on the computer screen by incorporating its own processor and memory, having more than 16 CPU cores, such as GeForce 8800.

The term "Keypoint" as used herein in this application is defined as interest points in an object. For example, in the SIFT framework, the image is convolved with Gaussian filters at different scales, and then the differences of successive Gaussian-blurred images are taken. Keypoints are then taken as maxima/minima of the Difference of Gaussians. Such keypoints can be calculated for the original image or for a transformation of the original image, such as an affine transform of the original images.
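The Difference-of-Gaussians idea in this definition can be illustrated on a 1-D signal; real SIFT operates on 2-D images across a full scale pyramid, so this sketch (with illustrative sigmas and a clamped-border convolution) only shows the blur-subtract-find-extrema mechanics:

```python
import math

def gauss_kernel(sigma):
    """Normalized 1-D Gaussian kernel with radius 3*sigma."""
    r = max(1, int(3 * sigma))
    k = [math.exp(-x * x / (2 * sigma * sigma)) for x in range(-r, r + 1)]
    s = sum(k)
    return [w / s for w in k]

def blur(signal, sigma):
    """Convolve with a Gaussian, clamping indices at the borders."""
    k = gauss_kernel(sigma)
    r = len(k) // 2
    return [sum(w * signal[min(max(i + j - r, 0), len(signal) - 1)]
                for j, w in enumerate(k))
            for i in range(len(signal))]

def dog_keypoints(signal, sigma1=1.0, sigma2=2.0):
    """Difference of two successive Gaussian blurs; keypoints are taken as
    the local maxima/minima of that difference."""
    d = [a - b for a, b in zip(blur(signal, sigma1), blur(signal, sigma2))]
    return [i for i in range(1, len(d) - 1)
            if (d[i] - d[i - 1]) * (d[i + 1] - d[i]) < 0]
```

A lone spike in an otherwise flat signal produces a DoG maximum at the spike's position, which is exactly the kind of localized structure the detector is meant to flag.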
The term "Keypoint descriptor" as used herein in this application is defined as a descriptor of a keypoint. For example, in the SIFT framework the feature descriptor is computed as a set of orientation histograms on neighborhoods. The orientation histograms are relative to the keypoint orientation and the orientation data comes from the Gaussian image closest in scale to the keypoint's scale. Just like before, the contribution of each pixel is weighted by the gradient magnitude, and by a Gaussian with σ 1.5 times the scale of the keypoint. Histograms contain 8 bins each, and each descriptor contains an array of 4×4 histograms around the keypoint. This leads to a SIFT feature vector with 4×4×8 = 128 elements.
The term "Visual content item" as used herein in this application is defined as an object with visual characteristics such as an image file like BMP, JPG, JPEG, GIF, TIFF, PNG files; a screenshot; a video file like AVI, MPG, MPEG, MOV, WMV, FLV files; or one or more frames of a video.

The term "visual object" as used herein in this application is defined as content that includes visual information such as a visual content item, images, photos, videos, IR image, magnified image, an image sequence or TV broadcast.

The term "camera" as used herein in this application is defined as a means of capturing a visual object.

The term "terminal" as used herein in this application is defined as an apparatus adapted to show visual content, such as a computer, a laptop computer, mobile phone or a TV.
The term "visual similarity" as used herein in this application is defined as the measure of resemblance between two visual objects that can be comprised of:

The fit between their color distributions, such as the correlation between their HSV color histograms
The fit between their textures
The fit between their shapes
The correlation between their edge histograms
Face similarity
Methods that include local descriptors such as Scale-invariant feature transform (SIFT), Affine-SIFT (ASIFT), Speeded Up Robust Feature (SURF), and Multi-Scale Retinex (MSR)
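The first item in the list above, correlation between color histograms, can be sketched as a plain Pearson correlation over equal-length histograms (the function name is ours):

```python
def histogram_correlation(h1, h2):
    """Pearson correlation between two equal-length color histograms;
    1.0 means identically shaped distributions, -1.0 an inverted one."""
    n = len(h1)
    m1, m2 = sum(h1) / n, sum(h2) / n
    num = sum((a - m1) * (b - m2) for a, b in zip(h1, h2))
    den = (sum((a - m1) ** 2 for a in h1)
           * sum((b - m2) ** 2 for b in h2)) ** 0.5
    return num / den if den else 0.0
```

Two histograms that differ only by overall pixel count (a uniform scale factor) correlate at 1.0, which is why correlation is a reasonable similarity measure for images of different sizes.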
The term "Visual analysis" as used herein in this application is defined as the analysis of the characteristics of visual objects, such as visual similarity, coherence, hierarchical organization, concept load or density, feature extraction and noise removal.
The term "Capturing data analysis" as used herein in this application is defined as the analysis of capturing data such as:

X-Y-Z coordinates
3 angles
Manufacturer
Model
Orientation (rotation) top-left
Software
Date and Time
YCbCr Positioning centered
Compression
x-Resolution
y-Resolution
Resolution Unit
Exposure Time
FNumber
ExposureProgram
Exif Version
Date and Time (original)
Date and Time (digitized)
ComponentsConfiguration Y Cb Cr
Compressed Bits per Pixel
Exposure Bias
MaxApertureValue
Metering Mode Pattern
Flash fired or not
Focal Length
MakerNote
FlashPixVersion
Color Space
PixelXDimension
PixelYDimension
File Source
InteroperabilityIndex
InteroperabilityVersion
Derivates of the above such as acceleration in the X-axis

The term "Service location" as used herein in this application is defined as a physical place where objects can be serviced and/or fixed, such as a mobile carrier service center.

The term "Location based analysis" as used herein in this application is defined as analysis of local data such as GPS location, triangulation data, RFID data, and street address. Location data can for example identify the service location or even the specific part of the service location in which the visual object was captured.

The term "Color analysis" as used herein in this application is defined as the combination of visual analysis, capturing data analysis, location based analysis and/or analysis of other data and analysis history to extract a color from a visual object. Color analysis can include the steps of separating the main object from its background, compensating for distortions such as shading and/or flash light, classifying each pixel to a predefined color set and finding the elements of the color set with the highest number of pixels.

The term "marketplace" as used herein in this application is defined as a physical place where objects can be bought, such as a bank, a change point, a supermarket, a convenience store and a grocery store.

The term "color wheel" as used herein in this application, and as further described in FIG. 3, is defined as an abstract illustrative organization of color hues around a circle that shows relationships between primary colors, secondary colors, and complementary colors. In the RYB (or subtractive) color model, the primary colors are red, yellow and blue. The three secondary colors (green, orange and purple) are created by mixing two primary colors. Another six tertiary colors are created by mixing primary and secondary colors.

The term "color harmonies" as used herein in this application is defined as color combinations that are considered especially pleasing. They consist of two or more colors with a fixed relation in the color wheel.

The term "color impact" as used herein in this application is defined as the dynamic creation of a color wheel to match a visual object's Base, Primary, Secondary and Tertiary Colors.

The term "warm colors" as used herein in this application is defined as vivid and energetic colors.

The term "cool colors" as used herein in this application is defined as colors that give an impression of calm, and create a soothing impression. White, black and gray are considered to be neutral.

The term "complementary colors" as used herein in this application, and as further shown in FIG. 4, is defined as colors that are opposite each other on the color wheel.

The term "analogous colors" as used herein in this application, and as further shown in FIG. 5, is defined as colors that are next to each other on the color wheel.

The term "triadic colors" as used herein in this application, and as further shown in FIG. 6, is defined as colors that are evenly spaced around the color wheel.

The term "split-complementary colors" as used herein in this application, and as further shown in FIG. 7, is defined as the set of a base color on the color wheel and the two colors adjacent to its complementary color.

The term "tetradic colors" as used herein in this application, and as further shown in FIG. 8, is defined as four colors arranged into two complementary pairs on the color wheel.

The term "square colors" as used herein in this application, and as further shown in FIG. 9, is defined as four colors arranged into two complementary pairs on the color wheel, with all four colors spaced evenly around the color wheel.

System for Analyzing Color Images

FIG. 1 is a scheme describing the system and process in accordance with an exemplary embodiment of the invention. System 100 performs the process described hereinafter:

Terminal 101, such as a mobile phone with camera 102 or a computer webcam, captures a visual object 120 representing physical objects such as a man with a shirt 124 or an apartment wall 122.

The capturing can be performed in several ways:

Taking a photograph
Recording a video
Continuously capturing an image while local or remote processing provides real time feedback such as "color not decided" or "a problem was found". The continuous capturing can be done while moving the camera, such as moving in the directions shown in 103.

Said visual object can be captured from a static camera placed in the marketplace or from a camera held by person 112. Person 112 can be a crowd of people that were incentivized to capture the object.

Said visual object can be processed locally using terminal 101 or it can be sent to a remote server 108, as further described in step 206 in FIG. 2, over a network 106, such as the internet.

Server 108 or device 101 calculates a color feedback 140 that is sent over the internet or created locally. Feedback 140 shows:

Main colors found 141 on the visual object using steps 204 to 207 as further described in FIG. 2.
An option to select further images of the image DB, as further described in step 201 in FIG. 2, according to different color harmonies using one or more of the main colors found:

Complementary colors 142 for color harmony, further described in FIG. 4;
Analogous colors 144 for color harmony, further described in FIG. 5;
Triadic colors 146 for color harmony, further described in FIG. 6;
Split-complementary colors 148 for color harmony, further described in FIG. 7;
Tetradic colors 150 for color harmony, further described in FIG. 8; and,
Square colors 152 for color harmony, further described in FIG. 9.

An example would be that a person takes a photo of a green wall using a mobile device camera 102 or computer webcam, in which green is found as the main color. The user selects complementary colors 142 and wall art having red as its main color is presented for the user to choose. One or more of the matching wall art can later be presented on the terminal display 101 to demonstrate the results to the user comprising red art on the green wall.
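The six harmony options 142-152 can be expressed as fixed hue offsets on the wheel. The exact offsets below are our illustrative reading of the glossary definitions (30° steps assume the 12-hue wheel of FIG. 3), not values stated in the patent:

```python
# Hue offsets (degrees) per harmony, relative to a base hue.
OFFSETS = {
    "complementary":       (180,),
    "analogous":           (-30, 30),
    "triadic":             (120, 240),
    "split-complementary": (150, 210),      # the two neighbors of the complement
    "tetradic":            (60, 180, 240),  # two complementary pairs
    "square":              (90, 180, 270),  # two pairs, evenly spaced
}

def harmonic_hues(base_hue, scheme):
    """Return the harmonic hues, in degrees on the wheel, for a base hue."""
    return [(base_hue + off) % 360 for off in OFFSETS[scheme]]
```

Each option in the feedback list then reduces to one table lookup plus modular addition around the wheel.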
Method of Capturing and Matching a Visual Object's Colors

FIG. 2 is a flowchart of acts performed in capturing and matching a visual object, in accordance with an exemplary embodiment of the invention. The flowchart describes a process and system 200 to capture and match visual objects.

An image database (DB) is loaded 201, including photos of a plurality of visual objects from one or more sides. For example, a database of shirts is loaded and their main colors are extracted using color analysis techniques of the present invention. Visual object representative 120 is then captured 202 using a mobile device camera 102 or computer webcam. The captured object is optionally analyzed 204 locally, as further described in step 207, to get a match using color or color harmonics analysis or to reduce the size of the data to be sent to remote or local servers in step 206.

Optionally the image itself or a processed part of it is sent 206 to a remote server 108 or locally processed on a server at device 101. The server performs color analysis 207 to generate color feedback 140. Such analysis uses the visual object and optionally other data, such as GPS data, the history of the sender, history of similar types of visual objects, and predefined categories. Main colors found in the visual object are displayed on color feedback 140.

In case the user manually selects harmonics 208, then the main colors found in steps 204-207 are used to find images in the Image DB 210 that fit the relevant harmonics (i.e. complementary, analogous, triadic, split-complementary, tetradic, or square color matches). In case the user does not manually select harmonics, a sample from each harmonic is displayed to the user, and then he/she chooses the harmonic he/she prefers. Feedback disclosing results of the harmonic colors that match the visual object of interest is then displayed in step 212 using a device such as 101. The feedback report 140 may comprise various forms. For example, the harmonic color(s) may be displayed on top of the original image captured in step 202. Optionally, further commercial ads are displayed in step 212 on device 101 in addition to the color match.

Optionally a straightforward search of an object of the same color and/or tone is performed and displayed as well.
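The DB lookup of step 210, finding images that fit the relevant harmonics, can be sketched as a tolerance match against harmonic hues. Everything here is an assumption for illustration: hues are taken as degrees on the RYB wheel with red at 0° (so green, its complement, sits at 180°), the harmonic table is truncated to two schemes, and the 15° tolerance is arbitrary:

```python
def find_matches(main_hue, image_db, scheme="complementary", tol=15):
    """Keep DB items (item -> stored main hue, in degrees) whose hue lies
    within `tol` degrees of one of the harmonic hues of `main_hue`."""
    offsets = {"complementary": (180,), "analogous": (-30, 30)}[scheme]
    targets = [(main_hue + o) % 360 for o in offsets]

    def near(hue):
        # circular distance on the wheel
        return any(min(abs(hue - t), 360 - abs(hue - t)) <= tol
                   for t in targets)

    return [item for item, hue in image_db.items() if near(hue)]
```

Run on a green wall (hue 180° under this convention), the complementary query returns the red item, matching the red-art-on-green-wall example above.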
FIG. 3 is a scheme describing a color wheel in accordance with an exemplary embodiment of the invention.

Color wheel 300 is comprised of 12 colors 302-324. The color wheel can further include more colors to create a continuum of hues between every pair of hues. The wheel is further divided by imaginary line 330 into warm colors 340 and cool colors 350.

FIG. 4 is a scheme describing selecting a
`FIG. 4 is a scheme describing selecting a