EXHIBIT NO. 1

IN THE UNITED STATES DISTRICT COURT
FOR THE WESTERN DISTRICT OF TEXAS
WACO DIVISION

SLYCE ACQUISITION INC.,

                    Plaintiff,

          V.

SYTE – VISUAL CONCEPTION LTD.
AND KOHL’S CORPORATION,

                    Defendants.

Case No. 6:19-cv-00257-ADA

DECLARATION OF MICHAEL I. SHAMOS, PH.D., J.D. IN SUPPORT OF
SLYCE ACQUISITION INC.’S REPLY CLAIM CONSTRUCTION BRIEF

Contents

I.    INTRODUCTION AND QUALIFICATIONS ........................................ 1
      A.  Engagement ............................................................ 1
      B.  Background and Qualifications ........................................ 2
II.   MATERIALS CONSIDERED ..................................................... 4
III.  SUMMARY OF OPINIONS ...................................................... 5
IV.   LEVEL OF ORDINARY SKILL IN THE ART ...................................... 5
V.    THE BOVIK DECLARATION .................................................... 6
      A.  “measure of distinction” ............................................. 12
      B.  “alignment of categories” ............................................ 18
VI.   THE ROSS DECLARATION ..................................................... 22
VII.  CONCLUDING STATEMENT ..................................................... 26

I, Michael Shamos, a resident of the Commonwealth of Pennsylvania, hereby declare under penalty of perjury under the laws of the United States of America as follows:

I.  INTRODUCTION AND QUALIFICATIONS

A.  Engagement

1. I have been retained by Plaintiff Slyce Acquisition Inc. (“Slyce”) to provide opinions regarding certain terms appearing in the claims of U.S. Patent No. 9,152,624 (“the ’624 Patent” or the “Patent”) in this proceeding. In particular, I have been asked to respond to the “Declaration of Alan Bovik in Support of Defendants’ Responsive Claim Construction Brief,” Dkt. 43-1 (“Bovik Dec.”), and the “Declaration of Kenneth Ross in Support of Defendants’ Responsive Claim Construction Brief,” Dkt. 43-2, dated March 17, 2020 (“Ross Dec.”).

2. I am being compensated at my standard consulting rate of $600/hour. My compensation is in no way dependent on the outcome of this proceeding and my compensation in no way affects the substance of my statements and opinions in this Declaration.

3. I have no financial interest in any of the parties or in the ’624 Patent.

4. I make this declaration based on my personal knowledge and experience, and I am competent to testify about the matters set forth herein.

B.  Background and Qualifications

5. A detailed description of my professional qualifications, including a listing of my specialties, expertise and professional activities, and a list of cases in which I have testified in the last ten years, is contained in my curriculum vitae, a copy of which is attached as Appendix B. Below is a short summary of my professional qualifications.

6. I hold the title of Distinguished Career Professor in the School of Computer Science at Carnegie Mellon University in Pittsburgh, Pennsylvania. I am a member of two departments in that School, the Institute for Software Research and the Language Technologies Institute. I was a founder and Co-Director of the Institute for eCommerce at Carnegie Mellon from 1998 to 2004, and from 2004 to 2018 I was Director of the eBusiness Technology graduate program in the Carnegie Mellon University School of Computer Science. Since 2018, I have been Director of the M.S. in Artificial Intelligence and Innovation degree program at Carnegie Mellon.

7. I received an A.B. (1968) from Princeton University in Physics; an M.A. (1970) from Vassar College in Physics; an M.S. (1972) from American University in Technology of Management, a field that covers quantitative tools used in managing organizations, such as statistics, operations research and cost-benefit analysis; an M.S. (1973), an M.Phil. (1974) and a Ph.D. (1978) from Yale University in Computer Science; and a J.D. (1981) from Duquesne University.

8. I was a founder of the subfield of computer science known as “computational geometry,” which provides a theoretical foundation for image processing.

9. I have taught graduate courses at Carnegie Mellon in Electronic Commerce, including eCommerce Technology, Electronic Payment Systems, Electronic Voting, Internet of Things, and eCommerce Law and Regulation, as well as Analysis of Algorithms. Since 2007 I have taught an annual course in Law of Computer Technology. I currently also teach Artificial Intelligence and Future Markets.

10. Since 2001 I have been a Visiting Professor at the University of Hong Kong, where I teach an annual course on Electronic Payment Systems. It is one of only a few university courses in the world on this subject.

11. I am the author and lecturer in a 24-hour video course on Internet protocols and have taught computer networking, wireless communication and Internet architecture since 1999.

12. From 1979 to 1987 I was the founder and president of two computer software development companies in Pittsburgh, Pennsylvania, Unilogic, Ltd. and Lexeme Corporation.

13. I am a named co-inventor on the following five issued patents relating to electronic commerce: U.S. Patent Nos. 7,330,839, 7,421,278, 7,747,465, 8,195,197 and 8,280,773.

14. I am an attorney admitted to practice in Pennsylvania and have been admitted to the Bar of the U.S. Patent and Trademark Office since 1981. I have not been asked to offer any opinions on patent law in this proceeding.

15. I have served as an expert in over 285 cases concerning computer technology. In particular, I have been involved in multiple cases involving distribution of software over networks and multiple cases involving access to secure systems. A current copy of my curriculum vitae setting forth details of my background and relevant experience, including a full list of my relevant publications and a listing of cases for which I have provided expert testimony, is attached hereto as Exhibit B.

16. Throughout this Declaration, all emphasis and annotations are added unless noted.

II.  MATERIALS CONSIDERED

17. In performing my analysis, I have reviewed the materials listed in Appendix A, which includes “Defendants’ Responsive Claim Construction Brief” (Dkt. 43) (“Defendants’ Response”).

18. I have also relied on my education, skill, training, and experience in the relevant fields of technology in forming my opinions. I have further considered the viewpoint of a person of ordinary skill in the art as of the time of invention of the ’624 Patent (“POSITA”).

19. I reserve the right to supplement my opinions as expressed in this Declaration to address any new information obtained in the course of this proceeding, or based on any new positions taken by Patent Owner.

III.  SUMMARY OF OPINIONS

20. The Bovik Dec. and the Ross Dec. do not support the conclusion that any term or claim of the ’624 Patent is indefinite.

IV.  LEVEL OF ORDINARY SKILL IN THE ART

21. The ’624 Patent describes its field of invention at 1:15-21:

     The present invention relates to systems and methods for providing visual presentation and navigation of content using data-based image analysis. More particularly, the present invention relates to systems and methods for allowing users of a browsing software application to search, browse, and navigate collections of data or other content of interest using data-based image analysis.

22. The specification of the ’624 Patent discusses image processing, determining similarity of databases, proximity searching, and presenting images on websites.

23. The claims of the ’624 Patent are drawn to methods, systems and storage media for navigating image-based content using a graphical user interface (GUI), comparing attributes of images, allowing a user to generate an image, and providing a user with an opportunity to purchase an item associated with an image.

24. It is my opinion that a person of ordinary skill in the art at the time the ’624 Patent was filed, in order to read and understand the specification and to make and use the claimed inventions without undue experimentation, would have had at least a Bachelor’s degree in computer engineering or computer science or an equivalent field, or equivalent work experience and, in addition, at least two years of experience with image processing, including experience with graphical user interfaces, electronic commerce and content databases.

25. I note that, remarkably, Dr. Bovik and Dr. Ross both give identical characterizations of the level of skill of a POSITA as “a person with a Master’s Degree in Electrical or Computer Engineering, or Computer Science, or a Bachelor’s Degree and equivalent industry experience.” Bovik Declaration ¶ 9; Ross Declaration ¶ 10. That characterization is clearly incorrect because it does not require any familiarity at all with image processing, which is a central technology of the Patent. The characterizations also omit familiarity with GUIs and databases. Further, neither expert defines in which “industry” the POSITA is supposed to have experience.

V.  THE BOVIK DECLARATION

26. I note that the Bovik declaration consists of 2.5 pages of supposed opinion and 98 pages of CV. I also note that Dr. Bovik has not identified the prior cases in
which he has given declarations or testimony, which I find unusual for someone with significant expert witness experience.

27. I understand that a supposed expert opinion unsupported by any facts or evidence is commonly referred to as “ipse dixit.” The substantive portion of the Bovik declaration (¶¶ 9-14) consists entirely of ipse dixit. Dr. Bovik provides no support whatsoever for his opinions.

28. In ¶ 10, Dr. Bovik states that “[i]n 2003, analyzing an image to detect attributes and comparing those attributes to attributes in another image, was not a function that an off-the-shelf computer or processor could perform without specialized programming.” He provides no support for that opinion. If the point is that image processing was not a vanilla function of common processors, I agree. Specialized hardware to perform image processing functions did exist, however, at least as early as the early 1990s. See, e.g., Robert et al., “Design of an image processing integrated circuit for real time edge detection,” Proceedings Euro ASIC ’92.

29. In ¶ 11, Dr. Bovik states: “I am familiar with histograms of color usage, edge detection techniques and outline processing, as described in the patent specification. These techniques do not refer to specific algorithms. Rather they are entire classes of different possible algorithms that can be used for image analysis.” He provides no support for that opinion. While it is true that multiple algorithms are included in
the terms “edge detection techniques” and “outline processing,” Defendants appear to rely improperly on that fact on p. 7 of Defendants’ Response. They cite Biomedino for the proposition that a “bare statement that known techniques or methods can be used does not disclose structure. To conclude otherwise would vitiate the language of the statute requiring ‘corresponding structure, material, or acts described in the specification’.” Even assuming the statement to be correct, applicant here did not offend Biomedino because applicant did not merely assert that “known techniques or methods” could be used, but listed specific techniques or methods.

30. I understand that, with regard to structure corresponding to the function of an MPF element, disclosure of a class of algorithms “that places no limitations on how values are calculated, combined, or weighted is insufficient to make the bounds of the claims understandable.” However, the claims at issue here do place limitations on how the values are calculated. For example, not just any “measure of distinction” will suffice to correspond to the “means for calculating a measure of distinction” MPF element, for two reasons: (1) claim 12 contains additional express limitations, as the full element reads “means for calculating a measure of distinction between the first item and a second item based on the plurality of categories, wherein the measure of distinction represents an alignment of categories between the first
and second items” (emphasis added); and (2) the specification discloses certain algorithms for performing the function. For example,

     A proximity searching technique may be used by browsing application 104 to search for and retrieve items that are most similar, but not identical, to the selected item. For example, browsing application 104 may calculate some measure of distinction between items in terms of descriptive item characteristics (e.g., color, pattern, material, size, price, manufacturer or brand, etc.). The measure of distinction may be used to locate similar items. For example, browsing application 104 may calculate the number of common attributes between the selected item(s) and every other item stored in database 106. Browsing application 104 may also limit the search by excluding obvious non-matching items according to the product category of the selected item (e.g., if the user selects a men's shoe, there is no need to determine the similarity between the selected men's shoe and a coat). The item(s) with the least level of distinction from the selected item(s) may be retrieved from the search and images associated with the retrieved items may be displayed. ’624 Patent, 9:43-60 (emphasis added)
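
To illustrate (purely as my own illustrative sketch, not as code appearing in the ’624 Patent; the attribute names, the dictionary data layout, and the helper names are my own assumptions), the attribute-counting approach described in the passage quoted above can be expressed in only a few lines:

    # Illustrative sketch only; attribute names and data layout are assumed.
    def measure_of_distinction(first_item, second_item, categories):
        """Count the categories on which the two items do not share the same value."""
        return sum(1 for c in categories if first_item.get(c) != second_item.get(c))

    def find_similar_items(selected, database, categories):
        """Exclude items from a different product category, then order the remaining
        items from least to greatest distinction relative to the selected item."""
        candidates = [item for item in database
                      if item is not selected
                      and item.get("product_category") == selected.get("product_category")]
        return sorted(candidates,
                      key=lambda item: measure_of_distinction(selected, item, categories))

Under this reading, the item with the least level of distinction from the selected item is simply the first element of the returned list.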

31. In ¶ 12, Dr. Bovik states: “Using color histograms in conjunction with edge-detection techniques does not result in a measure of distinction, alignment of categories, or determining similarity of items; additional programming would be needed, which is not disclosed in the ‘624 patent.” He provides absolutely no
support for that statement. Even if it were true, however, it is always the case that “additional programming” is required to realize a functional system. The fact that “additional programming” might be needed is therefore not material except as to the requirement for disclosure of an algorithm to support any MPF element. The specification provides such support. It states:

     Fully automated image detection and analysis, which is important for implementation in a large item database environment, may be achieved using known systems and methods for detecting and storing descriptive information of an item automatically determined from a sample (e.g., image or physical specimen) of an item. For example, histograms of color usage may be used in conjunction with edge detection techniques to determine the color, number, and orientation of lines and curves in an item’s pattern. ’624 Patent, 7:58-66.

32. The algorithm is explained in the above paragraph. One is to process an image using edge detection to determine the boundary lines and curves in the image. That is one set of attributes. One also creates color histograms, which are essentially bar charts showing the distribution of colors in the image. That information is stored as “descriptive information” about the image. The “descriptive information” can then be compared with analogous descriptive information for another image to determine how distinct the two images are (“measure of distinction”). That much is clear simply from the specification.
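
As a purely illustrative sketch of the comparison just described (my own example, not code from the ’624 Patent; the use of NumPy, a gradient-based edge measure, and a 16-bin histogram are my assumptions), the stored descriptive information and its comparison could look like the following:

    # Illustrative sketch only; library choice and parameters are assumed.
    import numpy as np

    def descriptive_information(gray_image, bins=16):
        """Store a color/intensity histogram plus a crude gradient-based
        edge measure as the image's 'descriptive information'."""
        histogram, _ = np.histogram(gray_image, bins=bins, range=(0, 256), density=True)
        gy, gx = np.gradient(gray_image.astype(float))   # simple edge-detection stand-in
        edge_density = float(np.hypot(gx, gy).mean())
        return {"histogram": histogram, "edge_density": edge_density}

    def measure_of_distinction(desc_a, desc_b):
        """Compare two stored descriptors; larger values indicate more distinct images."""
        hist_diff = float(np.abs(desc_a["histogram"] - desc_b["histogram"]).sum())
        edge_diff = abs(desc_a["edge_density"] - desc_b["edge_density"])
        return hist_diff + edge_diff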

33. In ¶ 13, Dr. Bovik offers the unsupported opinion that “none of the features of image analysis disclosed in the patent specification, such as, color histograms, ‘edge detection techniques,’ and ‘[o]utline processing techniques,’ explain how to use this information and transform it into recognizable parameters (e.g., that describe a pair of shoes), or that choose relevant categories ‘automatically,’ and/or disaggregate this information and convert it into data relevant to these categories.” Even assuming the statement to be true, it is of no significance. No claim includes the word “automatically,” and no claim requires a computer to “choose relevant categories.” Further, no claim requires transformation of anything into “recognizable parameters.” No claim even includes the word “parameter.”

34. In ¶ 14, Dr. Bovik offers the unsupported statement that “the terms: ‘calculating a measure of distinction’ and ‘alignment of categories’ were not used routinely in 2003 or since.” The significance of that statement, even if it were true, is elusive. A term can be perfectly clear even if it is not used “routinely.” He goes on to offer the further unsupported opinion that “neither term would inform a POSITA, with reasonable certainty, about the scope of the patent claims.”

35. In ¶ 14, Dr. Bovik states that “[n]either has a precise meaning.” Even if that were true, it would not render the terms indefinite. Precision is not required – what is needed is for a POSITA to understand the scope of a claim with “reasonable certainty.” Few terms in English have a “precise meaning.” Furthermore, it does
not appear that Dr. Bovik undertook any investigation at all to determine how these terms were used in the art. Instead, he offered a personal, off-the-cuff opinion that the terms were indefinite.

36. By contrast, I have undertaken such an investigation.

A.  “measure of distinction”

37. The term “measure of distinction” has a plain meaning, which is “an amount by which items differ.” It is the opposite of “measure of similarity” in that items that have a high measure of distinction would have a low measure of similarity.
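
As a purely illustrative relationship (my own example, not drawn from the Patent), if similarity between two items were expressed on a normalized scale from 0 to 1, a corresponding measure of distinction could be taken simply as 1 minus that value, so that a similarity of 0.9 corresponds to a distinction of 0.1.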

38. As part of my undergraduate minor in mathematics during the 1960s, I personally developed a formula for the measure of distinction between two closed plane curves.

39. Further, the phrase “measure of distinction” was in common use in the patent and technical literature, especially in the field of image processing.

40. In 1999, an entire paper was devoted to measures of distinction: Zlotnikov et al., “New-distinction measure for pattern recognition in fuzzy features space,” Proc. SPIE 3837, Intelligent Robots and Computer Vision XVIII: Algorithms, Techniques, and Active Vision (26 August 1999). I take “distinction measure” to be synonymous with “measure of distinction.” The paper, beginning at p. 407, contains a section entitled “The Definition of Distinction Measures.” The paper provides several measures of distinction, including this one at p. 412:

     Value of a fuzzy distinction measure is calculated to estimate a membership degree of the recognized object to each of classes, defined in item 1.

This measure of distinction is similar to one of the measures in the ’624 Patent, namely that of claim 12: “wherein the measure of distinction represents an alignment of categories between the first and second items.”

41. There are numerous uses of “measure of distinction” in the patent literature.

42. Benitez et al. U.S. Patent 7,460,985, entitled “Three-Dimensional Simultaneous Multiple-Surface Method and Free-Form Illumination-Optics Designed Therefrom,” claims priority to a provisional application filed July 28, 2003. It states at 26:17-21:

     A measure of their distinction, in some embodiments, is determined by taking the mean distance between the two defined surfaces, as defined by the averaged value of the minimum distance between each point of surface Si to the surface Si'.

43. Marshall et al. U.S. Patent 7,675,655, entitled “Moving Object Scanning Apparatus and Method,” claims priority to a Great Britain application filed March 7, 2003. It states at 28:13-19:

     The motion estimation processor 166 processes image data within the image buffer 162 and gradient processor 164 in accordance with the measure of distinction provided by the image analyser
     168. This generates an estimate of object motion between the current and immediately preceding image frames It and It-1.

44. Ranguelova U.S. Patent 8,655,080, entitled “Method and Apparatus for Identifying Combinations of Matching Regions in Images,” claims priority to a European application dated December 22, 2008. It states at 5:32-57:

     The salience of a region may be measured for example as a measure of distinction between the image contents of the region and another region that entirely surrounds it, or of versions of the region that are shrunken or grown by a predetermined amount. A region may be detected as salient for example if it has at least a predetermined size and its pixels have values (e.g. grey values or binary values obtained by thresholding) from one range of values and the pixels of a region that entirely surrounds it have values from a different, non overlapping range of values, or if shrinking or growing the region by a predetermined amount results in such a situation.

     …

     Other measures of the distinction between a region and another region that entirely surrounds it may be used to measure salience.

45. D’Amico et al. U.S. Patent 9,473,708, entitled “Devices and Methods for an Imaging System with a Dual Camera Architecture,” was filed August 7, 2013. It uses color histograms to compute a measure of distinction at 6:17-23:

     For example, spatial resolution can be a measure of how closely lines (e.g., edges) can be resolved in the image (e.g., perceived by
     person looking at the image). In another example, spectral resolution can be a measure of distinction between light including more than one spectrum (e.g., multiple wavelengths, different colors, etc.).

46. Cigla European Patent Specification EP 2 466 903 B1, entitled “A method and device for disparity range detection,” states at [0036]:

     Apart from vertical edge pixels, the salient pixels can also be exploited in the following way. The salient pixels are determined based on the centre-surround differences which correspond to a measure of distinction of pixels from their surrounds which means that the salient pixels are compared to neighboring pixels. In that manner, the intensities of pixels are subtracted from the mean value of their surrounds. The surround mean value of each pixel is calculated by taking the average intensity among certain windows.
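
The centre-surround computation described in that passage is straightforward to express in code. The following is purely my own illustrative sketch (the window size, the box average, and the use of NumPy are my assumptions, not taken from the Cigla specification):

    # Illustrative sketch only; parameters and library choice are assumed.
    import numpy as np

    def centre_surround_distinction(gray_image, window=9):
        """Subtract each pixel's intensity from the mean intensity of the window
        around it (the box average includes the centre pixel for simplicity)."""
        img = gray_image.astype(float)
        pad = window // 2
        padded = np.pad(img, pad, mode="edge")
        surround_mean = np.zeros_like(img)
        for dy in range(window):
            for dx in range(window):
                surround_mean += padded[dy:dy + img.shape[0], dx:dx + img.shape[1]]
        surround_mean /= window * window
        return np.abs(img - surround_mean)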

47. Particularly relevant to the ’624 Patent is Hinloopen et al., “Integration of ordinal and cardinal information in multi-criteria ranking with imperfect compensation,” European Journal of Operational Research 158 (2004) 317–338, which states at 327:

     As the measure of distinction between alternatives the probability that the overall value difference function is greater than 0 is used:

     Prob(Dii' > 0) = Prob((Wii - Wi'i) > 0).     (3.23)

     Since the distributions of all stochastical variables of Wii and Wi'i are known, by means of a Monte Carlo procedure, Prob(Dii' > 0) can be calculated.

     …

     As stated in Section 3.3, we use the probability that alternative i wins a paired comparison from alternative i' as the measure of distinction between alternatives i and i'. In other words: the probability that Dii' > 0 is used as the measure of distinction between alternatives i and i'. Let

     Pii' = Prob(Dii' > 0)     (4.1)

     Then Pii', the probability that alternative i dominates alternative i', is the measure of distinction between alternatives i and i'. Based on the Pii', the final ranking of alternatives is established.

48. Keserci et al., “Computerized detection of pulmonary nodules in chest radiographs based on morphological features and wavelet snake model,” Medical Image Analysis 6 (2002) 431–447, deals with recognizing tumors in chest x-rays. It states at 440:

     In general, the weighted overlap for nodules appears to be higher than those of false positives because the wavelet snake is designed to capture the boundary of a nodule-like object. On the other hand, if the edges consist of a large number of irregular curves due to normal structures, they may not be fitted well by the wavelet snake, and yield a divergent snake with a low degree of weighted overlap. Therefore, the weighted overlap was used as a measure for distinction between true nodules and false positives.

49. Mukherjee et al., “Corroborating the subjective classification of ultrasound images of normal and fatty human livers by the radiologist through texture analysis
and SOM,” 15th International Conference on Advanced Computing and Communications (2007), deals with recognizing features of ultrasound images. It states at 202:

     We have estimated the degree of distinction between the normal and fatty clusters with the help of quality factor and identified the optimal combination of pixel pair distance and orientation.

I take “degree of distinction” to be synonymous with “measure of distinction.”

50. Reyer et al., “Comparison of Face Profiles Based on Homeomorphism,” Pattern Recognition and Image Analysis, 2006, Vol. 16, No. 1, pp. 43–45, addresses measuring the distinction between two human faces. It states at 44:

     Therefore, to find a measure of distinction or distance between two polygonal lines, it is necessary to find a transformation with a minimal work among all possible transformations of P into Q.

51. Sánchez-Yáñez et al., “One-class texture classifier in the CCR feature space,” Pattern Recognition Letters 24 (2003) 1503–1511, defines a measure of distinction between visual textures. It states at 1507:

     The L1 distance between points in the space of CCR distribution functions is used as the measure of distinction between images

52. Shilane et al., “Distinctive Regions of 3D Surfaces,” ACM Trans. Graph. 26, 2, Article 7 (June 2007), discusses analyzing images to separate classes of objects and defines a measure of distinction. It states at 5:

     In our system, we compute and store this measure of distinction for every shape descriptor of every object during an offline processing phase.

53. And at 12:

     However, it should be clearly stated that our measure of distinction is based on 3D shape matching not 2D image matching, and thus it is not guaranteed that the regions determined to be most distinctive by our method will match the ones most visually recognizable by a human. Nonetheless, we find that our simple method based on mesh distinction produces good icons in most cases.

54. It is clear from these examples that “measure of distinction” was a well-understood term in the art of image analysis.

B.  “alignment of categories”

55. The phrase “alignment of categories” has a plain meaning, which is “closeness of matching of categories.” It is also used in the technical literature. While not as common as “measure of distinction,” the term is used in both the Patent and the literature in a readily understandable way.

56. For example, Arbib et al., “Vision and Action in the Language-Ready Brain: From Mirror Neurons to SemRep,” Advances in Brain, Vision, and Artificial Intelligence 2007, Lecture Notes in Computer Science, 4729, pp. 104–123 (2007), deals with robot-to-robot communication and considers “categories” of words, which can be understood as classes of semantically related words. It states at p. 119:

     Letter strings can be randomly generated to provide “words”, and weighted, many-to-many links between words and categories can be stored in a bidirectional associative memory [28]. However, from this random initial state, interactions between 2 or more robots allow them to end up with a set of categories, and a set of words associated with those categories, that allow any 2 robots to communicate effectively about a scene, adopting either their own perspective or that of the other robot.

     …

     This adjustment acts as a reinforcement learning mechanism and also as priming mechanism so that agents gradually align their lexicons in consecutive games. Similar mechanisms apply to the updating – and eventual alignment – of categories in each robot on the basis of success or failure in each exchange.

57. Miller et al., “Considerations in the Development of Procedures for Prioritizing Transportation Improvement Projects in Virginia,” Report VTRC 02-R6 of the Virginia Transportation Research Council (March 2002), uses the term “alignment of categories” in a readily understandable way. It presents a table on p. 4 aligning Virginia’s categories with those of the Transportation Equity Act (TEA-21):