Case 1:21-cv-00111-UNA   Document 1-3   Filed 01/28/21   PageID #: 84

EXHIBIT C
US008982109B2

(12) United States Patent                                (10) Patent No.:      US 8,982,109 B2
     Vilcovsky et al.                                    (45) Date of Patent:       Mar. 17, 2015

(54) DEVICES, SYSTEMS AND METHODS OF CAPTURING AND DISPLAYING APPEARANCES

(71) Applicants: Nissi Vilcovsky, Tokyo (JP); Ofer Saban, Vienna, VA (US)

(72) Inventors:  Nissi Vilcovsky, Tokyo (JP); Ofer Saban, Vienna, VA (US)

(73) Assignee:   Eyesmatch Ltd, Road Town, Tortola (VG)

( * ) Notice:    Subject to any disclaimer, the term of this patent is extended or adjusted
                 under 35 U.S.C. 154(b) by 169 days.

(21) Appl. No.:  13/843,001

(22) Filed:      Mar. 15, 2013

(65) Prior Publication Data
     US 2013/0229482 A1        Sep. 5, 2013

     Related U.S. Application Data

(63) Continuation-in-part of application No. 13/088,369, filed on Apr. 17, 2011, now Pat.
     No. 8,624,883, which is a continuation of application No. 11/817,411, filed as
     application No. PCT/IL2006/000281 on Mar. 1, 2006, now Pat. No. 7,948,481.

(60) Provisional application No. 60/656,884, filed on Mar. 1, 2005, provisional
     application No. 60/656,885, filed on Mar. 1, 2005, provisional application No.
     61/738,957, filed on Dec. 18, 2012.

(51) Int. Cl.
     G06F 3/038       (2013.01)
     H04N 7/15        (2006.01)
     (Continued)

(52) U.S. Cl.
     CPC .... H04N 7/15 (2013.01); G02B 5/08 (2013.01); G09F 19/16 (2013.01);
             G09F 27/00 (2013.01); H04N 7/144 (2013.01); G06F 3/011 (2013.01);
             H04N 1/622 (2013.01); G06T 3/00 (2013.01); H04N 5/2624 (2013.01);
             A47F 2007/195 (2013.01); H04N 2005/2726 (2013.01); H04N 5/2628 (2013.01)
     USPC ........................... 345/204; 345/212; 345/214

(58) Field of Classification Search
     CPC ............ G06F 3/005; G06F 3/01; G06F 3/011
     USPC ........... 345/204-215; 348/333.01; 434/395
     See application file for complete search history.

(56) References Cited

     U.S. PATENT DOCUMENTS

     5,230,039 A        7/1993   Grossman et al.
     5,551,021 A        8/1996   Harada et al.
     (Continued)

     FOREIGN PATENT DOCUMENTS

     DE    19943355 A1       3/2001
     DE    10031965 A1       1/2002
     (Continued)

     OTHER PUBLICATIONS

     Search Report for European Patent Application No. 06711263.1 dated Aug. 18, 2011.
     (Continued)

Primary Examiner - Vijay Shankar
(74) Attorney, Agent, or Firm - Nixon Peabody LLP; Joseph Bach, Esq.

(57) ABSTRACT

Systems, devices and methods enabling appearance comparison. The system includes at least one interactive imaging and display station. The station includes a mirror-display device capable of selectably operating in either or both a mirror mode or a display mode; an imaging device to capture one or more appearances appearing in a field of view in front of the mirror-display device; and/or an image control unit to select the mode of operation of the mirror-display device according to a user command.

18 Claims, 18 Drawing Sheets

[Representative front-page figure: station 110 with unit 120 (controller 121, network interface 122, storage device 123/124, input device 125, storage interface) and labeled items 10, 20, 30, 40, 50.]
US 8,982,109 B2
Page 2

(51) Int. Cl.
     G02B 5/08       (2006.01)
     G09F 19/16      (2006.01)
     G09F 27/00      (2006.01)
     H04N 7/14       (2006.01)
     G06F 3/01       (2006.01)
     H04N 1/62       (2006.01)
     G06T 3/00       (2006.01)
     H04N 5/262      (2006.01)
     A47F 7/19       (2006.01)
     H04N 5/272      (2006.01)

(56) References Cited

     U.S. PATENT DOCUMENTS

     5,923,776 A         7/1999   Kamgar-Parsi
     5,937,081 A         8/1999   O'Brill et al.
     6,195,467 B1        2/2001   Asimopoulos et al.
     6,366,694 B1        4/2002   Acharya
     6,417,850 B1        7/2002   Kang
     7,500,755 B2 *      3/2009   Ishizaki et al. .................. 353/28
     7,874,481 B2        1/2011   Silverbrook et al.
     7,874,681 B2 *      1/2011   Huebner ......................... 353/28
     7,948,481 B2        5/2011   Vilcovsky
     8,000,727 B1        8/2011   Bushman et al.
     8,624,883 B2        1/2014   Vilcovsky
     2002/0049546 A1     4/2002   Shimomura
     2002/0196333 A1    12/2002   Gorischek
     2003/0085866 A1     5/2003   Bimber et al.
     2003/0110099 A1     6/2003   Trajkovic et al.
     2003/0146901 A1     8/2003   Ryan
     2005/0018140 A1     1/2005   Ishizaki et al.
     2005/0047629 A1     3/2005   Farrell et al.
     2005/0259158 A1    11/2005   Jacob et al.
     2006/0007303 A1     1/2006   Milton
     2006/0158534 A1     7/2006   Gotohda
     2006/0178902 A1     8/2006   Vicars et al.
     2007/0040033 A1     2/2007   Rosenberg
     2007/0120954 A1     5/2007   Allen et al.
     2007/0132863 A1     6/2007   Deguchi
     2008/0056564 A1     3/2008   Lindbloom
     2008/0151092 A1     6/2008   Vilcovsky
     2009/0051779 A1     2/2009   Rolston
     2009/0091710 A1     4/2009   Huebner
     2010/0097442 A1     4/2010   Lablans
     2010/0169411 A1     7/2010   Colton et al.
     2010/0190510 A1     7/2010   Maranhas et al.
     2010/0191578 A1     7/2010   Tran et al.
     2011/0199294 A1     8/2011   Vilcovsky
     2011/0210970 A1     9/2011   Segawa
     2012/0120184 A1     5/2012   Fornell et al.
     2012/0154872 A1     6/2012   Reddy
     2012/0169850 A1     7/2012   Kim et al.
     2012/0177284 A1     7/2012   Wang
     2012/0229637 A1     9/2012   Mooradian et al.
     2012/0233089 A1     9/2012   Calman et al.
     2013/0083015 A1     4/2013   Hernandez Esteban
     2013/0088562 A1     4/2013   Hong et al.
     2014/0225977 A1     8/2014   Vilcovsky et al.
     2014/0225978 A1     8/2014   Saban et al.
     2014/0226000 A1     8/2014   Vilcovsky et al.
     2014/0226900 A1     8/2014   Saban et al.

     FOREIGN PATENT DOCUMENTS

     EP    1372092 A1        12/2003
     EP    1376207 A1         1/2004
     EP    1859432 A2        11/2007
     WO    00/22955 A1        4/2000
     WO    2006/092793 A2     9/2006
     WO    2014/100250 A2     6/2014

     OTHER PUBLICATIONS

Examination Report for European Patent Application No. 06711263.1 dated Jun. 21, 2012.
2nd Examination Report for European Patent Application No. 06711263.1 dated Jun. 25, 2013.
International Search Report and Written Opinion for PCT/IL2006/000281 mailed on Jun. 12, 2007.
International Preliminary Report on Patentability for PCT/IL2006/000281 mailed on Sep. 20, 2007.
Office Action for U.S. Appl. No. 13/088,369 mailed on Nov. 2, 2012.
Office Action for U.S. Appl. No. 13/088,369 mailed on Apr. 9, 2013.
Notice of Allowance for U.S. Appl. No. 13/088,369 mailed on Sep. 3, 2013.
Office Action for U.S. Appl. No. 11/817,411 mailed on Jul. 21, 2010.
Notice of Allowance for U.S. Appl. No. 11/817,411 mailed on Feb. 3, 2011.
Decision to Refuse for European Patent Application No. 06711263.1 dated Nov. 4, 2014.
International Search Report and Written Opinion for PCT/US2013/076253 dated May 6, 2014.
Notice of Allowance for U.S. Appl. No. 14/253,800 dated Oct. 24, 2014.
Invitation to Pay Additional Fees and, Where Applicable, Protest Fee for PCT/US2014/034333 dated Aug. 8, 2014.
International Search Report and Written Opinion for PCT/US2014/034333 dated Dec. 17, 2014.
Notice of Allowance for U.S. Appl. No. 14/253,827 dated Dec. 15, 2014.
Notice of Allowance for U.S. Appl. No. 14/253,831 dated Dec. 3, 2014.

* cited by examiner
U.S. Patent          Mar. 17, 2015          Sheet 1 of 18          US 8,982,109 B2

[FIG. 1: Schematic of an interactive system 100 enabling appearance comparison, with elements 110, 130, 131, unit 120 (storage device 123/124, input device 125, storage interface), a portable storage device 180, and a control center 190.]
U.S. Patent          Mar. 17, 2015          Sheet 2 of 18          US 8,982,109 B2

[FIG. 2A: First of two sequential stages of appearance comparison. Labeled elements include 110, 130, 140, 141, 131, unit 120 (controller 121, network interface 122, storage device 123/124, input device 125, storage interface), and items 10, 40, 50.]
U.S. Patent          Mar. 17, 2015          Sheet 3 of 18          US 8,982,109 B2

[FIG. 2B: Second of two sequential stages of appearance comparison. Labeled elements include 110, 130, 140, 192, 141, 131, unit 120 (controller 121, network interface 122, storage device 123/124, input device 125, storage interface), and items 10, 40, 50.]
U.S. Patent          Mar. 17, 2015          Sheet 4 of 18          US 8,982,109 B2

[FIG. 3A: First of three sequential stages of appearance comparison. Labeled elements include 110, 130, 131, 141, unit 120 (controller 121, network interface 122, storage device 123/124, input device 125, storage interface), and item 10.]
U.S. Patent          Mar. 17, 2015          Sheet 5 of 18          US 8,982,109 B2

[FIG. 3B: Second of three sequential stages of appearance comparison. Labeled elements include 110, 131, 141, unit 120 (controller 121, network interface 122, storage device 123/124, input device 125, storage interface), and item 10.]
U.S. Patent          Mar. 17, 2015          Sheet 6 of 18          US 8,982,109 B2

[FIG. 3C: Third of three sequential stages of appearance comparison. Labeled elements include 110, 131, 141, unit 120 (controller 121, network interface 122, storage device 123/124, input device 125, storage interface), and items 10, 20, 30, 40, 50.]
U.S. Patent          Mar. 17, 2015          Sheet 7 of 18          US 8,982,109 B2

[FIG. 4: Flow chart of a method enabling comparison of user-appearances: set the mirror-display device to mirror mode (410); pose in front of the mirror-display device (420); capture an image of the user appearance of the first trial (430); pose in front of the mirror-display device in a different appearance (440); switch between operating modes of the mirror-display (450).]
U.S. Patent          Mar. 17, 2015          Sheet 8 of 18          US 8,982,109 B2

[FIG. 5A: User at distance D1 from the mirror sees his image at distance 2xD1. FIG. 5B: User at distance D2 from the mirror sees his image at distance 2xD2.]
U.S. Patent          Mar. 17, 2015          Sheet 9 of 18          US 8,982,109 B2

[FIG. 6: Embodiment with one camera, or multiple cameras in a vertical array, at the user's eye level so as to obtain low image distortion (screen and camera positions shown).]
U.S. Patent          Mar. 17, 2015          Sheet 10 of 18          US 8,982,109 B2

[FIG. 7: Effect of the user approaching or moving away from the mirror when the camera is mounted above the screen and pointed horizontally (screen and position A shown).]
U.S. Patent          Mar. 17, 2015          Sheet 11 of 18          US 8,982,109 B2

[FIG. 8: System with the camera positioned above the monitor screen and tilted downwards (positions A, B, C shown).]
U.S. Patent          Mar. 17, 2015          Sheet 12 of 18          US 8,982,109 B2

[FIG. 9: Block diagram of an embodiment that performs image transformation to mimic a mirror. Video/still/IR cameras 2D/3D 1:n (930) feed an image grabbing module (932) providing enhancement filters, format conversion, video frame separation, image cropping or resize, and image stitching if needed. An eyes-match transformation module (934) applies to the image the mapping that matches the camera point of view with the theoretical mirror point of view (the user's eye reflection) and fills blind pixels remaining after the mapping. Outputs go to a virtual dressing / augmented reality module (936), a video/still recording module (938) that records a single image or a short take under software control, and a screen 1:m (940), as well as to the cloud (950), web/store (952), and the user's smart phone (954). A trigger event module (960) handles user presence in front of the mirror, face recognition, user gesture commands, item recognition, distance measurement, user body measurements/assessment (height, age, weight, ethnic group, sex, etc.), and calculation of the user's theoretical point of view in the theoretical mirror. A control and management element (962) sets the camera for optimized quality, sets other hardware elements, interfaces between the algorithm modules and higher code/application/user interfaces, and pushes factory-calibrated data into the algorithm elements. A factory calibration module (964) defines the mapping transformation between the camera and the user point of view in front of the screen, based on distance, spatial location, user height, or any combination.]
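To make the data flow of FIG. 9 easier to follow, the following is a minimal pipeline skeleton written as plain functions. It is an illustrative sketch only and not code from the patent; the function names, the trigger-event dictionary, and the use of NumPy arrays for frames are assumptions made for illustration.

    # Minimal pipeline sketch (illustrative only, not from the patent).
    # Frames are NumPy arrays; each stage loosely mirrors a block of FIG. 9.
    import numpy as np

    def grab_frame(camera_frame: np.ndarray) -> np.ndarray:
        """Image grabbing module (932): filtering / cropping / resizing would go here."""
        return camera_frame

    def eyes_match_transform(frame: np.ndarray, mapping) -> np.ndarray:
        """Eyes-match transformation (934): remap the camera viewpoint toward the
        theoretical mirror viewpoint; 'mapping' would come from factory calibration (964)."""
        return mapping(frame)

    def trigger_events(frame: np.ndarray) -> dict:
        """Trigger event module (960): presence detection, distance, body measurements, etc."""
        return {"user_present": True, "distance_m": 2.0}

    def run_station(camera_frame, mapping, screen, recorder=None):
        frame = grab_frame(camera_frame)
        events = trigger_events(frame)
        if events["user_present"]:
            frame = eyes_match_transform(frame, mapping)
            if recorder is not None:
                recorder.append(frame)     # video/still recording (938)
            screen(frame)                  # screen 1:m (940)

    if __name__ == "__main__":
        dummy = np.zeros((480, 640, 3), dtype=np.uint8)
        run_station(dummy, mapping=lambda f: f, screen=lambda f: print(f.shape), recorder=[])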
U.S. Patent          Mar. 17, 2015          Sheet 13 of 18          US 8,982,109 B2

[FIGS. 10A and 10B: Schematics of calibration processes (elements 10, 12, 14). FIG. 11: Processing chain of an embodiment - Imaging (1100), Factory Calibration (1105), Live Measure / Scaling (1110), Transform (1120), Fill, Display (1130).]
U.S. Patent          Mar. 17, 2015          Sheet 14 of 18          US 8,982,109 B2

[FIG. 12: Block diagram of calibration and image transformation ("EyesMatch"). A camera-optimization setting (1232: e.g., zone, FOV, to optimize quality) and a trigger event module (distance measurement and/or actual point of view, 1260) work with the live video/still/IR camera 2D/3D 1:n (1230). Calibration: set pointers (e.g., stickers) on a target face and body that the cameras can detect (1261); record a set of base and offset images - input images R=1:n, H=1:m and reference (base) images R=1:n, H=1:m (1262); registration point generator - specify control points in the input and base images using the cpselect tool, an equivalent tool, or an automated algorithm, optionally adding more points around the eyes, and obtain a corresponding set of input (Xi, Yi) and base (Xi, Yi) points (1263); fine-tune control points around the eyes (optional) (1264); determine the parameters of the spatial transformation that best match the control points in both input and base, e.g., using cp2tform or an equivalent module (1265); structure the mapping and generate the transformation tform (1:n x 1:m) (1266). At run time: interpolate the required transformation based on estimated user distance or actual point of view and camera setting (optional) (1234); perform the image transformation mapping with blind-pixel filling / under-sampling when needed (1237); output the EyesMatch image (1240).]
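The control-point calibration and mapping of FIG. 12 (cpselect/cp2tform in the figure) has a close analogue in common imaging libraries. The sketch below is only an illustration of that idea, assuming OpenCV is available; the control-point coordinates are made-up example values, and findHomography/warpPerspective stand in for, rather than reproduce, the patent's transformation.

    # Illustrative analogue of FIG. 12's control-point calibration (not the patent's code).
    # Matched control points in a camera ("input") image and a reference ("base") image
    # are used to fit a projective transform, which is then applied to live frames.
    import cv2
    import numpy as np

    # Assumed example control points: (x, y) in the input image and in the base image.
    input_pts = np.float32([[100, 200], [500, 210], [480, 620], [120, 600]])
    base_pts  = np.float32([[ 90, 180], [510, 185], [505, 640], [ 95, 635]])

    # Analogue of cp2tform: estimate the spatial transformation from the point pairs.
    tform, _ = cv2.findHomography(input_pts, base_pts, method=0)

    def eyes_match(frame: np.ndarray) -> np.ndarray:
        """Warp a camera frame toward the reference (mirror-like) viewpoint.
        Blind pixels left by the mapping appear as black and would be filled next."""
        h, w = frame.shape[:2]
        return cv2.warpPerspective(frame, tform, (w, h))

    if __name__ == "__main__":
        demo = np.full((720, 640, 3), 128, dtype=np.uint8)
        print(eyes_match(demo).shape)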
U.S. Patent          Mar. 17, 2015          Sheet 15 of 18          US 8,982,109 B2

[FIG. 13: Geometric sketch of an embodiment in which calibration and transformation mapping are performed in the field, after installation of the system.]
U.S. Patent          Mar. 17, 2015          Sheet 16 of 18          US 8,982,109 B2

[FIG. 14: Block diagram of an embodiment for extracting data from the camera image. A live video/still/IR camera 2D/3D 1:n (1430) with camera-optimization setting (1432) feeds a trigger event module (1450) that performs progressive background learning and subtraction and detects that a user is located in front of the camera. The user is separated from the background and converted to a binary image (1462); the central mass (j, k) is calculated; the user's location on the floor (shoe position) is taken as the minimum j around k +/- 1/2 body width, and the distance is interpolated from factory-calibration measurements or calculated directly from the resolution and known geometry (1466); the user's height is taken as the maximum j around k +/- 1/2 body width, and the eye location is estimated (1468). The output (1420) includes the user's distance, height, and point of view.]
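A rough sketch of the measurements in FIG. 14 follows. It is illustrative only: OpenCV's MOG2 background subtractor stands in for the figure's progressive background learning, and the simplified foot/head estimates use whole-silhouette extremes rather than the figure's "around k +/- 1/2 body width" windows.

    # Illustrative sketch of FIG. 14's measurements (not the patent's code): separate the
    # user from a learned background, then read the foot row, head row and centroid from
    # the binary silhouette. Thresholds are arbitrary example values.
    import cv2
    import numpy as np

    bg_model = cv2.createBackgroundSubtractorMOG2(history=200)   # progressive background learning

    def measure_user(frame: np.ndarray):
        mask = bg_model.apply(frame)                      # background subtraction -> foreground mask
        mask = (mask > 127).astype(np.uint8)              # binary silhouette
        ys, xs = np.nonzero(mask)
        if ys.size == 0:
            return None                                   # nobody detected in front of the camera
        j_center, k_center = ys.mean(), xs.mean()         # central mass (j, k)
        foot_row = ys.max()                               # lowest silhouette row ~ shoe position
        head_row = ys.min()                               # highest silhouette row ~ top of head
        height_px = foot_row - head_row                   # pixel height; converting to metres and
        return j_center, k_center, foot_row, height_px    # distance needs the calibration data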
U.S. Patent          Mar. 17, 2015          Sheet 17 of 18          US 8,982,109 B2

[FIG. 15: Block diagram of stitching images from n cameras. Live video/still/IR cameras 1 through N (1530, 1532) with camera-optimization settings (1533, 1536) feed a trigger event module (distance measurement and/or actual point of view, 1560) and per-camera Eyes Match algorithms (1) through (N) (1560, 1562) with reference points (j, k) and (jN, kN) (1564, 1566). An overlay decision (1570) drives stitching based on the overlay decision, redundancy and best performance (1572), followed by optional boundary smoothing (average filter or equivalent) (1574), producing the stitched image output (1576).]
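The stitching and boundary-smoothing step of FIG. 15 can be illustrated under the simplifying assumption that two views have already been mapped to a common viewpoint and overlap by a known number of columns. The linear blend below is a generic technique used only for illustration, not the patent's method.

    # Illustrative sketch of the stitching / boundary-smoothing idea in FIG. 15
    # (not the patent's code): blend the overlapping columns, then concatenate.
    import numpy as np

    def stitch_pair(left: np.ndarray, right: np.ndarray, overlap: int) -> np.ndarray:
        """Blend 'overlap' columns where the two views meet, then join them side by side."""
        alpha = np.linspace(1.0, 0.0, overlap)[None, :, None]          # weight 1 -> 0 across the seam
        seam = left[:, -overlap:] * alpha + right[:, :overlap] * (1 - alpha)
        return np.concatenate(
            [left[:, :-overlap], seam.astype(left.dtype), right[:, overlap:]], axis=1)

    if __name__ == "__main__":
        a = np.full((480, 320, 3), 200, dtype=np.uint8)
        b = np.full((480, 320, 3),  50, dtype=np.uint8)
        print(stitch_pair(a, b, overlap=40).shape)      # (480, 600, 3)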
U.S. Patent          Mar. 17, 2015          Sheet 18 of 18          US 8,982,109 B2

[FIG. 16: Block diagram of an embodiment that presents the eyes to their fullest, mimicking the user looking directly at himself in the mirror. The flow follows FIG. 12 - camera-optimization setting (1632), trigger event module (1660), pointer setup on the target face and body (1661), recording of base and offset images (1662), registration point generator (1663), optional fine-tuning of control points around the eyes (1664), fitting of the spatial transformation, e.g., using cp2tform or an equivalent module (1665), generation of the transformation tform (1:n x 1:m) (1666), interpolation of the required transformation based on estimated user distance or actual point of view and camera setting (1634), and image transformation mapping with blind-pixel filling / under-sampling when needed (1637) - with an added step (1680) that estimates the eye pose after the geometric mapping and reconstructs the eyes for a forward look, producing the EyesMatch image (1640).]
US 8,982,109 B2

DEVICES, SYSTEMS AND METHODS OF CAPTURING AND DISPLAYING APPEARANCES

CROSS REFERENCE TO RELATED APPLICATIONS
This application is a Continuation-in-Part of U.S. application Ser. No. 13/088,369, filed Apr. 17, 2011, which is a continuation of U.S. application Ser. No. 11/817,411, now U.S. Pat. No. 7,948,481, issued May 24, 2011, and which was a National Phase Application of PCT International Application No. PCT/IL06/000281, International Filing Date Mar. 1, 2006, which claims the benefit of U.S. Provisional Application No. 60/656,884, filed Mar. 1, 2005, and U.S. Provisional Application No. 60/656,885, filed Mar. 1, 2005. This application further claims the priority benefit of U.S. Provisional Application No. 61/738,957, filed Dec. 18, 2012. The entire disclosures of all of the above-listed applications are incorporated herein by reference.
BACKGROUND

1. Field

The invention relates generally to imaging and display systems and, more particularly, to monitors and interactive displays, e.g., in retail and/or service environments, medical or home situations, video conferencing, gaming, etc. Specific implementations relate to making a flat panel display appear as a mirror. Another specific implementation relates to making a flat panel display provide a video of a person looking at his eyes, to create an eye-to-eye video conference.

2. Related Art

Customers may shop for consumer articles, for example, apparel such as clothes, e.g., shirts, pants, coats and other garments, as well as shoes, glasses, and/or any other items or products, such as cosmetics, furniture and the like. Shopping normally takes place at a shopping facility, for example, a retail store. Prior to deciding which article to buy, a customer may try on various articles (e.g., apparel, cosmetics) and/or pose with other background articles (e.g., furniture), and may view for each trial a user-appearance in front of a mirror, which may be located, for example, at a trial area of the retail store. For example, the customer may try on a first article, e.g., a suit, and view for that first trial his/her user-appearance in front of the mirror. Then, the customer may try on a second article, e.g., another suit. The customer may then need to memorize his/her user-appearance from the first trial in order to perform a mental comparison between the first article and the second article, thereby to evaluate which of the two articles might be a better fit for the customer.

Unfortunately, since the customer may try on numerous articles and/or since the second trial may take place a considerable amount of time after the first trial, or even at a different store, the customer may not be able to recall his/her appearance for each trial and may therefore be required to repeatedly retry articles, e.g., items of apparel, previously tried on. This may result in a frustrating and inefficient shopping experience.

The conventional mirror (i.e., reflective surface) is the common and most reliable tool for an individual to explore actual self-appearance in real time. A few alternatives have been proposed in the prior art around the combination of a camera and a screen to replace the conventional mirror. However, these techniques are not convincing and are not yet accepted as a reliable image of the individual as if he were looking at himself in a conventional mirror. This is mainly because the image generated by a camera is very different from an image generated by a mirror.
When a user looks at himself in the mirror, what he actually sees is the reflection of himself as if he were standing at a distance that is double the distance from him to the mirror. This is illustrated in FIG. 5A, wherein the user standing at distance D1 sees himself at a distance equal to twice D1. Similarly, as shown in FIG. 5B, a user standing at distance D2 will see himself at a distance 2xD2. In addition, the angle of the user's field of view (FOV) changes when the user changes his distance to the mirror, e.g., gets closer. The FOV is limited by the specular reflection angle from the mirror to the user's eye and to the edge of the visible image on all sides of the mirror (four sides for a rectangular or square mirror). In FIG. 5B the bottom of the vertical FOV is illustrated as double the angle formed by the lines connecting the user's eyes to the bottom of the mirror and reflecting to the user's shoes. Consequently, as illustrated in FIG. 5B, when the user approaches the mirror, the FOV angle increases (FOV1 < FOV2), which is why he continues to see the same size reflection, so that the user actually sees himself at roughly the same size, but closer. This is a noticeable difference from a camera, wherein as the user gets closer to the camera he appears larger in the image. This is mainly because the FOV of a camera is fixed and is determined mainly by the size of the camera lens, or focal length.
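A small numeric sketch of this geometry may help; it is illustrative only, and the eye height and mirror-edge height below are assumed values rather than figures from the patent.

    # Illustrative sketch (not from the patent): plane-mirror geometry behind FIGS. 5A-5B.
    # Assumed setup: eyes at height EYE_H above the floor, mirror bottom edge at height
    # MIRROR_BOTTOM, user standing a horizontal distance d from the mirror plane.
    import math

    EYE_H = 1.60          # m, assumed eye height
    MIRROR_BOTTOM = 0.80  # m, assumed height of the mirror's bottom edge

    def mirror_view(d):
        """Return (apparent distance to the reflection, downward angle to the mirror's
        bottom edge in degrees, lowest body point visible via that edge)."""
        apparent_distance = 2 * d                                   # reflection appears at twice the distance
        beta = math.degrees(math.atan2(EYE_H - MIRROR_BOTTOM, d))   # grows as the user approaches
        lowest_visible = 2 * MIRROR_BOTTOM - EYE_H                  # independent of d (specular reflection)
        return apparent_distance, beta, lowest_visible

    for d in (3.0, 1.0):
        dist, beta, low = mirror_view(d)
        print(f"d={d} m -> image at {dist} m, angle to bottom edge {beta:.1f} deg, "
              f"lowest visible point {low:.2f} m")

The downward angle (and hence the vertical FOV) increases as the distance shrinks, while the lowest visible body point stays fixed, so the user keeps seeing the same extent of himself; a fixed-FOV camera, by contrast, simply makes him look larger as he approaches.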
There are other phenomena to note regarding the reflection of a mirror. For example, when the user approaches the mirror, the reflection of his eyes will always stay on the same virtual line into the mirror. Conversely, depending on a camera's height, as the user gets closer to the camera, the user's eyes may appear at different levels. Another difference from a camera is that when one looks at a mirror, one's image appears to be reversed (e.g., if one raises one's right hand, his left hand will appear to go up in the mirror). However, a mirror does not "swap" left and right any more than it swaps top and bottom. A mirror reverses the forward/backward axis (i.e., what's in front of the mirror appears to be behind the mirror), and we define left and right relative to front and back. Also, because the image in the mirror is virtual, the mirror can be smaller than the full body and the user will still see the reflection of his full body. The reason is that the specular reflection (in FIG. 5A the angle of incidence equals the reflection angle) can increase the effective field of view while the user approaches the mirror. Moreover, although the mirror is a two-dimensional object, the user sees his appearance in three dimensions.
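As a worked illustration of the last point, using the standard plane-mirror argument rather than text from the patent: with the user's feet at height 0, eyes at height h_e, and the top of the head at height h_t, the ray from the feet to the eyes meets the mirror at height h_e/2 and the ray from the top of the head meets it at (h_e + h_t)/2, so a mirror spanning only

\[
\left[\frac{h_e}{2},\ \frac{h_e + h_t}{2}\right], \qquad \text{of height } \frac{h_e + h_t}{2} - \frac{h_e}{2} = \frac{h_t}{2},
\]

shows the entire body, and this holds at every distance from the mirror.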
For at least some of the reasons noted above, so far no system has been provided for imitating a mirror convincingly. Imitating a mirror can have many applications in retail and other fields, and opens the possibility of incorporating real-life experiences with virtual-life experiences, such as sharing on social networks and other mobile technologies.
SUMMARY

Some demonstrative embodiments of the invention include devices, systems and/or methods enabling appearance comparison.

According to some demonstrative embodiments of the invention, a system enabling appearance comparison may include at least one interactive imaging and display station. The station may include, for example, a mirror-display device capable of operating in either or both a mirror or a display mode; an imaging device to capture one or more appearances from a field of view in front of the mirror-display device;
and/or an image control unit to select the mode of operation of the mirror-display device according to a user command. The mirror-display device may be in the form of a flat panel TV, wherein during mirror mode the TV presents a transposed live video feed from the camera, while during display mode it presents a transposed video taken at an earlier time and fetched from a memory.
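The two operating modes described here can be sketched in a few lines of code. The snippet below is an illustrative sketch only, assuming OpenCV for capture and display; the key bindings, the in-memory frame store, and the plain horizontal flip standing in for the patent's full "transposed" (mirror-like) rendering are assumptions, not details from the specification.

    # Illustrative sketch only: toggle a flat-panel display between a "mirror mode"
    # (live, horizontally flipped camera feed) and a "display mode" (playback of the
    # previously stored, flipped frames). Press 'm' / 'd' to switch modes, 'q' to quit.
    import cv2

    cam = cv2.VideoCapture(0)
    recorded = []                # frames stored during mirror mode (kept in memory for brevity)
    mode, idx = "mirror", 0

    while True:
        if mode == "mirror":
            ok, frame = cam.read()
            if not ok:
                break
            frame = cv2.flip(frame, 1)      # horizontal flip approximates the mirror view
            recorded.append(frame)          # store the appearance for later comparison
        elif recorded:                      # display mode: replay the stored appearance
            frame = recorded[idx % len(recorded)]
            idx += 1
        cv2.imshow("mirror-display", frame)
        key = cv2.waitKey(30) & 0xFF
        if key == ord("m"):
            mode = "mirror"
        elif key == ord("d"):
            mode, idx = "display", 0
        elif key == ord("q"):
            break

    cam.release()
    cv2.destroyAllWindows()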
According to some demonstrative embodiments of the invention, the image control unit may include an input device to receive the user command.

According to some demonstrative embodiments of the invention, the image control unit may include a storage device to store data of one or more images which may correspond to one or more appearances.

According to some demonstrative embodiments of the invention, the mirror-display device may be capable of being partitioned into at least first and second simultaneously-displayable frames. The first frame may be selectably operable, for example, both in a mirror mode and a display mode. The second frame may be operable, for example, in a mirror mode.

According to some demonstrative embodiments of the invention, the imaging device may be capable of capturing three-dimensional images of appearances.

According to some demonstrative embodiments of the invention, the mirror-display device may be capable of displaying images of appearances at predefined sequences.

According to some demonstrative embodiments of the invention, the image control unit may be able to selectively enable a user access to images of appearances authorized to the user, e.g., based on user-identifying data received from the user.

According to some demonstrative embodiments of the invention, the at least one interactive imaging and display system may include two or more interactive imaging and display stations able to communicate over a network. For example, the two or more stations may be able to communicate between each other data representing images of appearances.

According to some demonstrative embodiments of the invention, the image control unit may control the mirror-display device to display, e.g., during the display mode, one or more images corresponding to the appearances. The one or more images may include, for example, one or more mirrored appearances. The mirrored appearances are obtained by transposing the images or video feed obtained from a camera to generate images and video that, when presented on a monitor, resemble an appearance in a mirror.

According to some demonstrative embodiments of the invention, a method enabling appearance comparison may comprise using a mirror mode of operation of a mirror-display device capable of being selectably operated in either a mirror or a display mode; capturing an image corresponding to an appearance of a first trial in front of the mirror-display device; storing the image of the first trial; selecting the display mode of operation of the mirror-display device; and/or retrieving the image of the first trial and displaying the image on the mirror-display device.

According to further embodiments, methods and apparatus are provided utilizing a camera and a flat screen display to create a convincing mirror appearance.

BRIEF DESCRIPTION OF THE DRAWINGS

advantages thereof, may best be understood by reference to the following detailed description when read with the accompanied drawings in which:

FIG. 1 is a schematic illustration of an interactive system enabling appearance comparison in accordance with some demonstrative embodiments of the invention;

FIGS. 2A and 2B are schematic illustrations of two sequential stages of appearance comparison using an interactive system in accordance with some demonstrative embodiments of the invention;

FIGS. 3A, 3B and 3C are schematic illustrations of three sequential stages of appearance comparison using an interactive system in accordance with some demonstrative embodiments of the invention; and

FIG. 4 is a schematic flow chart of a method enabling comparison of one or more user-appearances in accordance with some demonstrative embodiments of the invention.

FIGS. 5A and 5B schematically illustrate mirror reflection.

FIG. 6 illustrates an embodiment having one camera, or multiple cameras in a vertical array, at the user's eye level, so as to obtain low image distortion.

FIG. 7 is a schematic illustrating what happens when the user approaches the mirror or moves away from it when using a camera mounted above the screen and pointed horizontally.

FIG. 8 is a schematic illustrating a system wherein the camera is positioned above a monitor screen and is tilted downwards.

FIG. 9 is a block diagram of an embodiment of the invention which performs image transformation to generate an image that mimics a mirror.

FIGS. 10A and 10B are schematics illustrating calibration processes according to embodiments of the invention.

FIG. 11 is a block diagram illustrating a process according to an embodiment of the invention.

FIG. 12 is a block diagram illustrating modules and processes to perform the calibration and image transformation according to an embodiment of the invention.

FIG. 13 illustrates another embodiment, wherein calibration and transformation mapping is performed in the field after installation of the system.

FIG. 14 illustrates an embodiment for extracting data from the image of the camera.

FIG. 15 illustrates an embodiment wherein stitching of images from n cameras is performed.

FIG. 16 illustrates an embodiment for presenting the eyes to their fullest, mimicking the user looking directly at himself in the mirror.

It will be appreciated that for simplicity and clarity of illustration, elements shown in the figures have not necessarily been drawn accurately or to scale. For example, the dimensions of some of the elements may be exaggerated relative to other elements for clarity, or several physical components may be included in one element. Further, where considered appropriate, reference numerals may be repeated among the figures to indicate corresponding or analogous elements. It will be appreciated that these figures present examples of embodiments of the present invention and are not intended to limit the scope of the invention.

DETAILED DESCRIPTION
