`
`_________________
`
`BEFORE THE PATENT TRIAL AND APPEAL BOARD
`
`_________________
`
`GOOGLE LLC,
`Petitioner,
`
`v.
`
`BUFFALO PATENTS, LLC,
`Patent Owner.
`
`_________________
`
`Case No. IPR2023-01387
`U.S. Patent No. 8,204,737
`_________________
`
`Declaration of Shauna L. Wiest Regarding Manke
`
`Page 1 of 48
`
`GOOGLE EXHIBIT 1024
`
`
`
`Declaration of Shauna L. Wiest
`
`I, Shauna L. Wiest, state and declare as follows:
`
I. Introduction

1. I have prepared this Declaration in connection with Google LLC’s (“Petitioner”) Petition for Inter Partes Review of U.S. Patent No. 8,204,737, which I understand will be filed concurrently with this Declaration.
`
2. I am currently a senior research analyst with the Research & Information Services team at Finnegan, Henderson, Farabow, Garrett & Dunner, LLP, located at 3300 Hillview Avenue, Palo Alto, CA 94304 (“Finnegan”).
`
3. I am over eighteen years of age, and I am competent to make this Declaration. I make this Declaration based on my own personal knowledge, and my professional knowledge of library science practices.
`
4. I earned a Master of Science in Library Science degree from the University of North Carolina at Chapel Hill in 1999, and a Bachelor of Arts in Political Science degree from the University of California at San Diego in 1989. I have worked as a law librarian for eighteen years. I have been employed in the Research & Information Services Department at Finnegan since 2021. Before that, from 2000 to 2015, I was employed as a Law Librarian at Stoel Rives LLP, and from 2015 to 2016, I was employed as a Competitive Intelligence Specialist for Nossaman LLP.
`
II. Standard Library Practice for Receiving, Cataloging, Shelving, and Making Materials Publicly Available
`
5. I have knowledge of and experience with standard library practices regarding receiving, cataloging, shelving, and making materials, including conference publications, available to the public. I am fully familiar with and have knowledge of and experience with the Machine-Readable Cataloging (MARC) system, an industry-wide standard that libraries use to catalog materials.
`
6. The MARC system was developed during the 1960s to standardize bibliographic catalog records so they could be read by computers and shared among libraries. By the mid-1970s, MARC had become the international standard for cataloging bibliographic materials and is still used today. Many libraries provide public access to their MARC records via the Internet and/or their electronic cataloging systems at the library. As discussed further in this Declaration, each field in a MARC record provides specific information about the cataloged item, including when materials are received and made available to the public.
`
`III. MARC Records
`
7. The MARC record system uses a specific three-digit numeric code (“field tag”) (from 001-999) to identify each field in a catalog record. For example, MARC field tag 008 provides the six-digit date the item was received and catalogued (Date entered on file). The first six characters of field tag 008 are always in the “YYMMDD” format. Descriptions and definitions of all of the character positions of field tag 008 are outlined here: https://www.loc.gov/marc/bibliographic/bd008a.html.
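For illustration only, the “YYMMDD” encoding described above can be parsed with a short Python sketch. The function name and the two-digit-year pivot below are my own illustrative assumptions, not part of the MARC standard:

```python
from datetime import date

def parse_marc_008_date(field_008: str) -> date:
    """Return the 'Date entered on file' from a MARC 008 field.

    The first six characters are always YYMMDD.  MARC uses
    two-digit years, so a century pivot must be assumed; here
    00-49 maps to the 2000s and 50-99 to the 1900s (an
    illustrative convention only).
    """
    yy = int(field_008[0:2])
    mm = int(field_008[2:4])
    dd = int(field_008[4:6])
    year = 2000 + yy if yy < 50 else 1900 + yy
    return date(year, mm, dd)

# The field 008 value reported in Appendix C for this volume:
print(parse_marc_008_date("890619c19899999mauar1a0engc"))  # 1989-06-19
```

Applied to the Appendix C record discussed below, this yields June 19, 1989, consistent with paragraph 17.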
`
8. As is relevant to this Declaration, MARC field tag 050 provides the Library of Congress Call Number. MARC field tag 245 identifies the title and statement of responsibility for the work. MARC field tag 260 identifies the place of publication, name of publisher, and publication date of the resource. MARC field tag 500 provides general notes including language, resource type, and local holdings information for the desired resource.
`
9. Based on standard library practice, when a library receives an item, it generally stamps (and/or labels) the item with the library name and a barcode, often with a date that is within a few days or weeks of receipt. Next, the library will catalog the item within a matter of a few days or weeks of receiving it. As a general practice, cataloguing is centralized and performed by a cataloguing department within a library or university setting. In certain circumstances the catalogued item may be subsequently sent to a library location within the library or university setting, where it may be stamped and/or labeled after it has been catalogued.
`
10. Generally, after an item is cataloged, the public may access the item by searching a library catalog, browsing the library shelves, and either requesting or electronically accessing the item from the library. Standard library practice is to make the item available to the public within a few days or weeks of cataloging it.
`
`IV. Public Availability of Manke
`
11. This Declaration relates to the dates of receipt and public availability of the following reference: S. Manke, M. Finke, Alex Waibel, “The Use of Dynamic Writing Information in a Connectionist On-Line Cursive Handwriting Recognition System” (“Manke”) in Advances in Neural Information Processing Systems 7, Proceedings of the 1994 Conference, G. Tesauro, D. Touretzky, and T. Leen (eds.), The MIT Press, 1995 (“MIT Publication”). I understand that Manke has been submitted as Exhibit 1015 in this proceeding. That same reference is appended to my Declaration as Appendix A.
`
12. As detailed below, I have reviewed the print reference, public holdings information, and Library of Congress MARC record for the MIT Publication containing Manke to determine the date of public availability of this reference.
`
13. Appendix A to this Declaration is a true and accurate copy of the print book cover, title pages, table of contents, date stamp, and handwritten call number for the copy of the MIT Publication containing Manke initially held by the University of Wisconsin at Madison, Kurt F. Wendt Library. The handwritten call number is QA76.87 A38 and the ISSN is 1049-5258. The date stamp on the copy of the MIT Publication containing Manke indicates that Manke was received by the University of Wisconsin at Madison on August 17, 1995.
`
14. The University of Wisconsin at Madison indicated that “[i]n 2018, the Wendt Library for Engineering relocated all its collections and staff to Steenbock Library” at the following webpage: https://www.library.wisc.edu/steenbock/about-steenbock/library-history/.
`
15. Appendix B to this Declaration is a true and accurate copy of the University of Wisconsin at Madison Library public catalog record for its copy of the MIT Publication containing Manke, which was downloaded from https://search.library.wisc.edu/catalog/999723890002121 on September 6, 2023. The University of Wisconsin at Madison Library public catalog record sets forth the public holdings and onsite location information for members of the public seeking a physical copy of the MIT Publication containing Manke. The public catalog record indicates that the copy of the MIT Publication containing Manke is located in the Steenbock Library stacks at Call Number QA76.87 A38. Additionally, the public catalog record states that the Steenbock Library holds v.1 (1988) through v.25 (2012) of the MIT Publication, which includes volume 7 containing Manke. The record also identifies the ISSN as 1049-5258. Based on my experience as a librarian, the public catalog record (Appendix B) references the MIT Publication containing Manke (Appendix A).
`
16. Appendix C to this Declaration is a true and accurate copy of the University of Wisconsin at Madison Library MARC record for its holdings of the MIT Publication containing Manke, which was downloaded from the “Library Staff Details – Staff View” link at https://search.library.wisc.edu/catalog/999723890002121 on September 5, 2023. This record also indicates that the copy of the MIT Publication containing Manke is located in the Steenbock Library stacks at Call Number QA76.87 A38. The University of Wisconsin MARC record field tag 050 notes the Call Number as QA76.87 A38. MARC field tag 245 identifies the full title statement for the work as: $a Advances in neural information processing systems. MARC field tag 260 sets forth the places of publication, publishers, and publication dates for “Advances in Neural Information Processing Systems” as:

260     $31989-1994 ;$aSan Mateo, CA :$bMorgan Kaufmann Publishers,$cc1989-

260 2_  $31995-2006 ;$aCambridge, Mass :$bMIT Press.

260 3_  $32007- ;$a[Place of publication not identified] :$bNeural Information Processing Systems Foundation.
`
Finally, MARC field tag 500 sets forth the local holdings and volume information for the collected papers comprising volumes 1-7 of the MIT Publication held by the University of Wisconsin at Madison Library as “Vols. 1-2 contain the collected papers of the 1988-1989 IEEE Conference on Neural Information Processing Systems--Natural and Synthetic; v. 3-6, has the papers of the 1990-1993 NIPS Conference; v. 7- has the papers of the 1994- Conference on Neural Information Processing Systems.”
`
17. Appendix C confirms the fixed data elements of MARC field tag 008 as 890619c19899999mauar1a0engc. As discussed above, the first six characters “890619” are in typical “YYMMDD” format and indicate that the resource “Advances in Neural Information Processing Systems” was first received and catalogued by the University of Wisconsin at Madison Library on June 19, 1989, and is a continuing resource currently published. Based on my experience as a librarian, the MARC record (Appendix C) references the MIT Publication containing Manke (Appendix A).
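As an illustrative sketch (the helper name is my own), the fixed-position layout of field tag 008 used in the paragraph above can be split out programmatically. The positions shown follow the Library of Congress MARC 21 bibliographic documentation for field 008:

```python
def decode_marc_008(value: str) -> dict:
    """Split out the MARC 21 field 008 positions relevant here.

    Per the Library of Congress MARC 21 bibliographic
    documentation: positions 00-05 are the date entered on file
    (YYMMDD), 06 is the publication status, 07-10 the beginning
    date, and 11-14 the ending date ('9999' indicates a
    continuing resource that is still being published).
    """
    return {
        "date_entered": value[0:6],
        "publication_status": value[6],   # 'c' = currently published
        "date1": value[7:11],
        "date2": value[11:15],
    }

# The value confirmed in Appendix C:
fields = decode_marc_008("890619c19899999mauar1a0engc")
# fields["date_entered"] == "890619"; fields["publication_status"] == "c"
```

Decoding the Appendix C value this way gives date entered “890619”, status “c”, beginning date “1989”, and ending date “9999”, matching the reading in paragraph 17.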
`
18. Based on the information in Appendices A, B, and C, the MIT Publication containing Manke was received by the University of Wisconsin at Madison Library on August 17, 1995. Based on standard library practices, Manke would have been processed and catalogued by the University of Wisconsin at Madison Library within a matter of a few days or weeks of August 17, 1995.
`
19. Accordingly, Manke would have been made available to the public within a few days or weeks of being checked in and catalogued. The public could have accessed Manke within a few weeks of August 17, 1995 by (i) searching the University of Wisconsin at Madison Library catalog online by, for example, searching by title, author, and/or subject keywords, or (ii) asking a library staff member and being directed to the Kurt F. Wendt Library or the Steenbock Library stacks at Call Number QA76.87 A38.
`
`V. Conclusion
`
20. In signing this Declaration, I understand it will be filed as evidence in a contested case before the Patent Trial and Appeal Board of the United States Patent and Trademark Office. I understand I may be subject to cross-examination in this case and that cross-examination will take place within the United States. If cross-examination is required of me, I will appear for cross-examination within the United States during the time allotted for cross-examination.
`
21. I declare that all statements made herein of my knowledge are true, that all statements made on information and belief are believed to be true, and that these statements were made with the knowledge that willful false statements and the like so made are punishable by fine or imprisonment, or both, under Section 1001 of Title 18 of the United States Code.
`
Executed on September 12, 2023.

Shauna L. Wiest
`
`
`
`
`
`
`
`
`
`
`
`
`
`
`
`
APPENDIX A
`
`
`
`
`
QA76.87
A38

COLLEGE OF ENGINEERING

AUG 17 1995

UW-MADISON, WI 53706
`
`
`
`
`
ADVANCES IN NEURAL INFORMATION
PROCESSING SYSTEMS 7

Edited by
Gerald Tesauro, David Touretzky, Todd Leen

The MIT Press

Cambridge, Massachusetts
London, England
`
`
`
`Contents
`
`Preface
`Contributors
`
`PART I
`COGNITIVE SCIENCE
`
`DIRECTION SELECTIVITY IN PRIMARY VISUAL CORTEX USING
`MASSIVE INTRACORTICAL CONNECTIONS
`Humbert Suarez, Christof Koch, Rodney Douglas
`
`ON THE COMPUTATIONAL UTILITY OF CONSCIOUSNESS
`Donald Mathis, Michael C. Mozer
`
`CATASTROPHIC INTERFERENCE IN HUMAN MOTOR LEARNING
`Tom Brashers-Krug, Reza Shadmehr, Emanuel Todorov
`
GRAMMAR LEARNING BY A SELF-ORGANIZING NETWORK
`Michiro Negishi
`
PATTERNS OF DAMAGE IN NEURAL NETWORKS: THE EFFECTS OF
LESION AREA, SHAPE AND NUMBER
Eytan Ruppin, James A. Reggia
`
`FORWARD DYNAMIC MODELS IN HUMAN MOTOR CONTROL:
`PSYCHOPHYSICAL EVIDENCE
Daniel M. Wolpert, Zoubin Ghahramani, Michael I. Jordan
`
`A SOLVABLE CONNECTIONIST MODEL OF IMMEDIATE RECALL OF
`ORDERED LISTS
`Neil Burgess
`
`XVE
`
`i
`
`19
`
`35
`
`B
`
`3i
`
`
`
`PART II
`NEUROSCIENCE
`
`A MODEL FOR CHEMOSENSORY RECEPTION
`
`Rainer Malaka, Thomas Ragg, Martin Hammer
`
THE ELECTROTONIC TRANSFORMATION: A TOOL FOR
`RELATING NEURONAL FORM TO FUNCTION
`Nicholas Carnevale, Kenneth Y. Tsai, Brenda J. Claiborne, Thomas H. Brown
`
`A MODEL OF THE HIPPOCAMPUS COMBINING SELF-ORGANIZATION
`AND ASSOCIATIVE MEMORY FUNCTION
`Michael E. Hasselmo, Eric Schnell, Joshua Berke, Edi Barkai
`
`MODEL OF BIOLOGICAL NEURON AS A TEMPORAL NEURAL NETWORK
`
`Sean D. Murphy, Edward W. Kairiss
`
`A CRITICAL COMPARISON OF MODELS FOR ORIENTATION AND
OCULAR DOMINANCE COLUMNS IN THE STRIATE CORTEX
`
`E. Erwin, K. Obermayer, K. Schulten
`
`A NOVEL REINFORCEMENT MODEL OF BIRDSONG VOCALIZATION
`LEARNING
`
`Kenji Doya, Terrence J. Sejnowski
`
OCULAR DOMINANCE AND PATTERNED LATERAL CONNECTIONS IN A
SELF-ORGANIZING MODEL OF THE PRIMARY VISUAL CORTEX

Joseph Sirosh, Risto Miikkulainen
`
`ANATOMICAL ORIGIN AND COMPUTATIONAL ROLE OF DIVERSITY IN
`THE RESPONSE PROPERTIES OF CORTICAL NEURONS
`
`Kalanit Grill Spector, Shimon Edelman, Rafael Malach
`
REINFORCEMENT LEARNING PREDICTS THE SITE OF PLASTICITY FOR
AUDITORY REMAPPING IN THE BARN OWL
`
`Alexandre Pouget, Cedric Deffayet, Terrence J. Sejnowski
`
`MORPHOGENESIS OF THE LATERAL GENICULATE NUCLEUS:
`HOW SINGULARITIES AFFECT GLOBAL STRUCTURE
`
`Svilen Tzonev, Klaus Schulten, Joseph G. Malpeli
`
`A COMPUTATIONAL MODEL OF PREFRONTAL CORTEX FUNCTION
Todd S. Braver, Jonathan D. Cohen, David Servan-Schreiber
`
`
61
`
`69
`
`77
`
`85
`
`93
`
`101
`
`109
`
`117
`
`125
`
`133
`
141
`
`
`
`
`
A NEURAL MODEL OF DELUSIONS AND HALLUCINATIONS IN
`SCHIZOPHRENIA
`Eytan Ruppin, James A. Reggia, David Horn
`
SPATIAL REPRESENTATIONS IN THE PARIETAL CORTEX MAY
USE BASIS FUNCTIONS
`Alexandre Pouget, Terrence J. Sejnowski
`
`GROUPING COMPONENTS OF THREE-DIMENSIONAL MOVING OBJECTS
IN AREA MST OF VISUAL CORTEX
`Richard S. Zemel, Terrence J. Sejnowski
`
A MODEL OF THE NEURAL BASIS OF THE RAT'S SENSE OF DIRECTION
`William Skaggs, James J. Knierim, Hemant S. Kudrimoti, Bruce L. McNaughton
`
`PART III
`LEARNING THEORY AND DYNAMICS
`
ON THE COMPUTATIONAL COMPLEXITY OF NETWORKS OF SPIKING
`NEURONS
`Wolfgang Maass
`
H∞ OPTIMAL TRAINING ALGORITHMS AND THEIR RELATION
`TO BACKPROPAGATION
`Babak Hassibi, Thomas Kailath
`
`SYNCHRONY AND DESYNCHRONY IN NEURAL OSCILLATOR NETWORKS
`DeLiang Wang, David Terman
`
LEARNING IN LARGE LINEAR PERCEPTRONS AND WHY THE
THERMODYNAMIC LIMIT IS RELEVANT TO THE REAL WORLD
`Peter Sollich
`
`GENERALISATION IN FEEDFORWARD NETWORKS
`Adam Kowalczyk, Herman Ferra
`
FROM DATA DISTRIBUTIONS TO REGULARIZATION IN INVARIANT
`LEARNING
`Todd Leen
`
`NEURAL NETWORK ENSEMBLES, CROSS VALIDATION, AND ACTIVE
`LEARNING
`Anders Krogh, Jesper Vedelsby
`
LIMITS ON LEARNING MACHINE ACCURACY IMPOSED BY DATA QUALITY
`Corinna Cortes, L. D. Jackel, Wan-Ping Chiang
`
`
`149
`
`157
`
`165
`
`173
`
`183
`
191
`
`199
`
`207
`
`215
`
`223
`
`231
`
`239
`
`
`
`
`
`
`HIGHER ORDER STATISTICAL DECORRELATION WITHOUT
`INFORMATION LOSS
`Gustavo Deco, Wilfried Brauer
`
`HYPERPARAMETERS, EVIDENCE AND GENERALISATION FOR AN
`UNREALISABLE RULE
`Glenn Marion, David Saad
`
`TEMPORAL DYNAMICS OF GENERALIZATION IN NEURAL NETWORKS
`Changfeng Wang, Santosh S. Venkatesh
`
`STOCHASTIC DYNAMICS OF THREE-STATE NEURAL NETWORKS
`Toru Ohira, Jack D. Cowan
`
`LEARNING STOCHASTIC PERCEPTRONS UNDER K-BLOCKING
`DISTRIBUTIONS
`Mario Marchand, Saeed Hadjifaradji
`
`LEARNING FROM QUERIES FOR MAXIMUM INFORMATION GAIN
`IN IMPERFECTLY LEARNABLE PROBLEMS
`Peter Sollich, David Saad
`
`BIAS, VARIANCE AND THE COMBINATION OF LEAST SQUARES
`ESTIMATORS
`Ronny Meir
`
`ON-LINE LEARNING OF DICHOTOMIES
`N. Barkai, H. S. Seung, H. Sompolinsky
`
`DYNAMIC MODELLING OF CHAOTIC TIME SERIES WITH NEURAL NETWORKS
`Jose Principe, Jyh-Ming Kuo
`
A RIGOROUS ANALYSIS OF LINSKER-TYPE HEBBIAN LEARNING
`Jianfeng Feng, H. Pan, V. P. Roychowdhury
`
`SAMPLE SIZE REQUIREMENTS FOR FEEDFORWARD NEURAL NETWORKS
`Michael Turmon, Terrence L. Fine
`
`ASYMPTOTICS OF GRADIENT-BASED NEURAL NETWORK TRAINING
`ALGORITHMS
`Sayandev Mukherjee, Terrence L. Fine
`
`247
`
`255
`
`263
`
`271
`
`279
`
`287
`
`295
`
`303
`
311
`
`319
`
`327
`
`335
`
`
`
`
`REINFORCEMENT LEARNING ALGORITHM FOR PARTIALLY OBSERVABLE
MARKOV DECISION PROBLEMS
`Tommi Jaakkola, Satinder P. Singh, Michael I. Jordan
`
`ADVANTAGE UPDATING APPLIED TO A DIFFERENTIAL GAME
Mance E. Harmon, Leemon C. Baird III, A. Harry Klopf
`
`REINFORCEMENT LEARNING WITH SOFT STATE AGGREGATION
`Satinder Singh, Tommi Jaakkola, Michael I. Jordan
`
`GENERALIZATION IN REINFORCEMENT LEARNING: SAFELY
`APPROXIMATING THE VALUE FUNCTION
`
`Justin Boyan, Andrew W. Moore
`
`INSTANCE-BASED STATE IDENTIFICATION FOR REINFORCEMENT
`LEARNING
R. Andrew McCallum
`
`FINDING STRUCTURE IN REINFORCEMENT LEARNING
`Sebastian Thrun, Anton Schwartz
`
`REINFORCEMENT LEARNING METHODS FOR CONTINUOUS-TIME
`MARKOV DECISION PROBLEMS
`
`Steven Bradtke, Michael O. Duff
`
AN ACTOR/CRITIC ALGORITHM THAT IS EQUIVALENT TO Q-LEARNING
`Robert Crites, Andrew G. Barto
`
`PART V
`ALGORITHMS AND ARCHITECTURES
`
`FINANCIAL APPLICATIONS OF LEARNING FROM HINTS
`Yaser S. Abu-Mostafa (Invited Paper)
`
`COMBINING ESTIMATORS USING NON-CONSTANT WEIGHTING FUNCTIONS
`Volker Tresp, Michiaki Taniguchi
`
`AN INPUT OUTPUT HMM ARCHITECTURE
`
`Yoshua Bengio, Paolo Frasconi
`
`BOLTZMANN CHAINS AND HIDDEN MARKOV MODELS
`Lawrence K. Saul, Michael I. Jordan
`
`345
`
`353
`
`361
`
`369
`
`377
`
`385
`
`393
`
`401
`
411
`
`419
`
`427
`
`435
`
`
`
`
`
`
`BAYESIAN QUERY CONSTRUCTION FOR NEURAL NETWORK MODELS
`Gerhard Paass, Jorg Kindermann
`
`USING A SALIENCY MAP FOR ACTIVE SPATIAL SELECTIVE ATTENTION:
`IMPLEMENTATION & INITIAL RESULTS
`Shumeet Baluja, Dean A. Pomerleau
`
`MULTIDIMENSIONAL SCALING AND DATA CLUSTERING
`
`Thomas Hofmann, Joachim Buhmann
`
`A NON-LINEAR INFORMATION MAXIMISATION ALGORITHM THAT
PERFORMS BLIND SEPARATION
`Anthony J. Bell, Terrence J. Sejnowski
`
`PLASTICITY-MEDIATED COMPETITIVE LEARNING
`Nicol Schraudolph, Terrence J. Sejnowski
`
`PHASE-SPACE LEARNING
`
`Fu-Sheng Tsung, Garrison W. Cottrell
`
`LEARNING LOCAL ERROR BARS FOR NONLINEAR REGRESSION
`
`David A. Nix, Andreas S. Weigend
`
`DYNAMIC CELL STRUCTURES
`Jorg Bruske, Gerald Sommer
`
EXTRACTING RULES FROM ARTIFICIAL NEURAL NETWORKS WITH
`DISTRIBUTED REPRESENTATIONS
`Sebastian Thrun
`
CAPACITY AND INFORMATION EFFICIENCY OF A BRAIN-LIKE
`ASSOCIATIVE NET
`Bruce Graham, David Willshaw
`
BOOSTING THE PERFORMANCE OF RBF NETWORKS WITH DYNAMIC
`DECAY ADJUSTMENT
`
`Michael R. Berthold, Jay Diamond
`
`SIMPLIFYING NEURAL NETS BY DISCOVERING FLAT MINIMA
Sepp Hochreiter, Jürgen Schmidhuber
`
`LEARNING WITH PRODUCT UNITS
`Laurens Leerink, C. Lee Giles, Bill G. Horne, Marwan A. Jabri
`
`DETERMINISTIC ANNEALING VARIANT OF THE EM ALGORITHM
`Naonori Ueda, Ryohei Nakano
`
`443
`
451
`
`459
`
`467
`
`475
`
`481
`
`489
`
`497
`
`505
`
513
`
`521
`
`529
`
`537
`
`545
`
`
`
`
`
`DIFFUSION OF CREDIT IN MARKOVIAN MODELS
`Yoshua Bengio, Paolo Frasconi
`
`FACTORIAL LEARNING BY CLUSTERING FEATURES
`Joshua B. Tenenbaum, Emmanuel V. Todorov
`
`INTERIOR POINT IMPLEMENTATIONS OF ALTERNATING MINIMIZATION
`TRAINING
`Michael Lemmon, Peter T. Szymanski
`
`SARDNET: A SELF-ORGANIZING FEATURE MAP FOR SEQUENCES
`Daniel L. James, Risto Miikkulainen
`
`CONVERGENCE PROPERTIES OF THE K-MEANS ALGORITHMS
`Léon Bottou, Yoshua Bengio
`
`ACTIVE LEARNING FOR FUNCTION APPROXIMATION
`Kah Kay Sung, Partha Niyogi
`
ANALYSIS OF UNSTANDARDIZED CONTRIBUTIONS IN CROSS
`CONNECTED NETWORKS
`Thomas R. Shultz, Yuriko Oshima-Takane, Yoshio Takane
`
`TEMPLATE-BASED ALGORITHMS FOR CONNECTIONIST RULE EXTRACTION
Jay A. Alexander, Michael C. Mozer
`
`FACTORIAL LEARNING AND THE EM ALGORITHM
`Zoubin Ghahramani
`
`A GROWING NEURAL GAS NETWORK LEARNS TOPOLOGIES
`Bernd Fritzke
`
`AN ALTERNATIVE MODEL FOR MIXTURES OF EXPERTS
`Lei Xu, Michael I. Jordan, Geoffrey E. Hinton
`
`ESTIMATING CONDITIONAL PROBABILITY DENSITIES FOR PERIODIC
`VARIABLES
`Chris M. Bishop, Claire Legleye
`
`EFFECTS OF NOISE ON CONVERGENCE AND GENERALIZATION
`IN RECURRENT NETWORKS
`Kam Jim, Bill G. Horne, C. Lee Giles
`
`LEARNING MANY RELATED TASKS AT THE SAME TIME
`WITH BACKPROPAGATION
`Rich Caruana
`
`
`553
`
`561
`
`569
`
577
`
`585
`
`593
`
601
`
`609
`
`617
`
`625
`
`633
`
`641
`
`649
`
`657
`
`
`
`
`
`
`A RAPID GRAPH-BASED METHOD FOR ARBITRARY
`TRANSFORMATION-INVARIANT PATTERN CLASSIFICATION
`Alessandro Sperduti, David G. Stork
`
`RECURRENT NETWORKS: SECOND ORDER PROPERTIES AND PRUNING
`Morten With Pedersen, Lars Kai Hansen
`
`CLASSIFYING WITH GAUSSIAN MIXTURES AND CLUSTERS
`Nanda Kambhatla, Todd K. Leen
`
`EFFICIENT METHODS FOR DEALING WITH MISSING DATA
`IN SUPERVISED LEARNING
Volker Tresp, Ralph Neuneier, Subutai Ahmad

AN EXPERIMENTAL COMPARISON OF RECURRENT NEURAL NETWORKS
`Bill G. Horne, C. Lee Giles
`
ACTIVE LEARNING WITH STATISTICAL MODELS
David Cohn, Zoubin Ghahramani, Michael I. Jordan
`
`LEARNING WITH PREKNOWLEDGE: CLUSTERING WITH POINT AND
`GRAPH MATCHING DISTANCE MEASURES
`Steven Gold, Anand Rangarajan, Eric Mjolsness
`
DIRECT MULTI-STEP TIME SERIES PREDICTION USING TD(λ)
`Peter Kazlas, Andreas S. Weigend
`
`PART VI
`IMPLEMENTATIONS
`
ICEG MORPHOLOGY CLASSIFICATION USING AN ANALOGUE VLSI
`NEURAL NETWORK
`Richard Coggins, Marwan Jabri, Barry Flower, Stephen Pickard
`
`A SILICON AXON
`Bradley A. Minch, Paul Hasler, Chris Diorio, Carver Mead
`
`THE NI1000: HIGH SPEED PARALLEL VLSI FOR IMPLEMENTING
`MULTILAYER PERCEPTRONS
`Michael P. Perrone, Leon N. Cooper
`
A REAL TIME CLUSTERING CMOS NEURAL ENGINE
`T. Serrano-Gotarredona, B. Linares-Barranco, J. L. Huertas
`
`665
`
`673
`
`681
`
`689
`
`697
`
`705
`
`713
`
721
`
`731
`
`739
`
`747
`
`755
`
`
`
`
`
`PULSESTREAM SYNAPSES WITH NON-VOLATILE ANALOGUE
`AMORPHOUS-SILICON MEMORIES
`A. J. Holmes, A. F. Murray, S. Churcher, J. Hajto, M. J. Rose
`
`A LAGRANGIAN FORMULATION FOR OPTICAL BACKPROPAGATION
`TRAINING IN KERR-TYPE OPTICAL NETWORKS
`James E. Steck, Steven R. Skinner, Alvaro A. Cruz-Cabrara,
`Elizabeth C. Behrman
`
`A CHARGE-BASED CMOS PARALLEL ANALOG VECTOR QUANTIZER
`Gert Cauwenberghs, Volnei Pedroni
`
`AN AUDITORY LOCALIZATION AND COORDINATE TRANSFORM CHIP
`Timothy Horiuchi
`
AN ANALOG NEURAL NETWORK INSPIRED BY FRACTAL BLOCK CODING
`Fernando Pineda, Andreas G. Andreou
`
`A STUDY OF PARALLEL PERTURBATIVE GRADIENT DESCENT
`D. Lippe, J. Alspector
`
IMPLEMENTATION OF NEURAL HARDWARE WITH THE NEURAL VLSI
OF URAN IN APPLICATIONS WITH REDUCED REPRESENTATIONS
`Il-Song Han, Hwang-Soo Lee, Ki-Chul Kim
`
`SINGLE TRANSISTOR LEARNING SYNAPSES
`Paul Hasler, Chris Diorio, Bradley A. Minch, Carver Mead
`
`PART VII
`SPEECH AND SIGNAL PROCESSING
`
PATTERN PLAYBACK IN THE '90S
Malcolm Slaney (Invited Paper)
`
`NON-LINEAR PREDICTION OF ACOUSTIC VECTORS USING
`HIERARCHICAL MIXTURES OF EXPERTS
`S. R. Waterhouse, A. J. Robinson
`
GLOVE-TALK II: MAPPING HAND GESTURES TO SPEECH USING
`NEURAL NETWORKS
`S. Sidney Fels, Geoffrey Hinton
`
`VISUAL SPEECH RECOGNITION WITH STOCHASTIC NETWORKS
`Javier Movellan
`
`
`763
`
`771
`
`779
`
`787
`
`795
`
`803
`
811
`
`817
`
`827
`
`835
`
`843
`
`851
`
`
`
`
`
`
`HIERARCHICAL MIXTURES OF EXPERTS METHODOLOGY
`APPLIED TO CONTINUOUS SPEECH RECOGNITION
`Ying Zhao, Richard Schwartz, Jason Sroka, John Makhoul
`
`CONNECTIONIST SPEAKER NORMALIZATION WITH GENERALIZED
`RESOURCE ALLOCATING NETWORKS
`Cesare Furlanello, Diego Giuliani, Edmondo Trentin
`
USING VOICE TRANSFORMATIONS TO CREATE ADDITIONAL
`TRAINING TALKERS FOR WORD SPOTTING
`Eric I. Chang, Richard P. Lippmann
`
`A COMPARISON OF DISCRETE-TIME OPERATOR MODELS FOR NONLINEAR
`SYSTEM IDENTIFICATION
`Andrew D. Back, Ah Chung Tsoi
`
`PART VIII
`VISUAL PROCESSING
`
`LEARNING SACCADIC EYE MOVEMENTSUSING MULTISCALE
`SPATIAL FILTERS
Rajesh P. N. Rao, Dana H. Ballard
`
`A CONVOLUTIONAL NEURAL NETWORK HAND TRACKER
`Steven J. Nowlan, John C. Platt
`
`CORRELATION AND INTERPOLATION NETWORKS FOR REAL-TIME
`EXPRESSION ANALYSIS/SYNTHESIS
`Trevor Darrell, Irfan Essa, Alex Pentland
`
`LEARNING DIRECTION IN GLOBAL MOTION: TWO CLASSES OF
`PSYCHOPHYSICALLY-MOTIVATED MODELS
`V. Sundareswaran, Lucia M. Vaina
`
`ASSOCIATIVE DECORRELATION DYNAMICS: A THEORY OF
`SELF-ORGANIZATION AND OPTIMIZATION IN FEEDBACK NETWORKS
`Dawei W. Dong
`
`JPMAX: LEARNING TO RECOGNIZE MOVING OBJECTS AS A MODEL-
`FITTING PROBLEM
`Suzanna Becker
`
`PCA-PYRAMIDS FOR IMAGE COMPRESSION
`Horst Bischof, Kurt Hornik
`
`859
`
`867
`
`875
`
`883
`
`893
`
`901
`
`909
`
`917
`
`925
`
`933
`
`941
`
`
`
`
`
`UNSUPERVISED CLASSIFICATION OF 3D OBJECTS FROM 2D VIEWS
`Satoshi Suzuki, Hiroshi Ando
`
`NEW ALGORITHMSFOR 2D AND 3D POINT MATCHING:
`POSE ESTIMATION AND CORRESPONDENCE
`Steven Gold, Chien Ping Lu, Anand Rangarajan,
`Suguna Pappu, Eric Mjolsness
`
`USING A NEURAL NET TO INSTANTIATE A DEFORMABLE MODEL
`Christopher K. I. Williams, Michael D. Revow, Geoffrey E. Hinton
`
`NONLINEAR IMAGE INTERPOLATION USING MANIFOLD LEARNING
`Christoph Bregler, Stephen M. Omohundro
`
`COARSE-TO-FINE IMAGE SEARCH USING NEURAL NETWORKS
`Clay D. Spence, John C. Pearson, Jim Bergen
`
`PART IX
`APPLICATIONS
`
`TRANSFORMATION INVARIANT AUTOASSOCIATION WITH
`APPLICATION TO HANDWRITTEN CHARACTER RECOGNITION
`Holger Schwenk, Maurice Milgram
`
`LEARNING PROTOTYPE MODELS FOR TANGENT DISTANCE
`Trevor Hastie, Patrice Simard, Eduard Sackinger
`
REAL-TIME CONTROL OF TOKAMAK PLASMA USING NEURAL NETWORKS
`Chris M. Bishop, Paul S. Haynes, Mike E. U. Smith, Tom N. Todd,
`David L. Trotman, Colin G. Windsor
`
`RECOGNIZING HANDWRITTEN DIGITS USING MIXTURES OF LINEAR MODELS
`Geoffrey E. Hinton, Michael Revow, Peter Dayan
`
`OPTIMAL MOVEMENT PRIMITIVES
`Terence Sanger
`
`AN INTEGRATED ARCHITECTURE OF ADAPTIVE NEURAL
`NETWORK CONTROL FOR DYNAMIC SYSTEMS
`Liu Ke, Robert L. Tokar, Brian D. McVey
`
A CONNECTIONIST TECHNIQUE FOR ACCELERATED TEXTUAL INPUT:
`LETTING A NETWORK DO THE TYPING
`Dean Pomerleau
`
`949
`
`957
`
`965
`
`973
`
`981
`
`991
`
`999
`
`1007
`
`1015
`
`1023
`
`1031
`
`1039
`
`
`
`
`
`
`PREDICTIVE CODING WITH NEURAL NETS: APPLICATION TO
`TEXT COMPRESSION
`
Jürgen Schmidhuber, Stefan Heil
`
`PREDICTING THE RISK OF COMPLICATIONS IN CORONARY
`ARTERY BYPASS OPERATIONS USING NEURAL NETWORKS
`
`Richard P. Lippmann, Linda Kukolich, David Shahian
`
COMPARING THE PREDICTION ACCURACY OF ARTIFICIAL NEURAL
NETWORKS AND OTHER STATISTICAL MODELS FOR BREAST CANCER
`SURVIVAL
`
`Harry B. Burke, David B. Rosen, Philip H. Goodman
`
`LEARNING TO PLAY THE GAME OF CHESS
`Sebastian Thrun
`
`A MIXTURE MODEL SYSTEM FOR MEDICAL AND MACHINE DIAGNOSIS
`
`Magnus Stensmo, Terrence J. Sejnowski
`
`INFERRING GROUND TRUTH FROM SUBJECTIVE LABELLING OF VENUS
`IMAGES
`
`Padhraic Smyth, Usama Fayyad, Michael Burl, Pietro Perona, Pierre Baldi
`
THE USE OF DYNAMIC WRITING INFORMATION IN A CONNECTIONIST
`ON-LINE CURSIVE HANDWRITING RECOGNITION SYSTEM
`
`Stefan Manke, Michael Finke, Alex Waibel
`
`ADAPTIVE ELASTIC INPUT FIELD FOR RECOGNITION IMPROVEMENT
`
`Minoru Asogawa
`
PAIRWISE NEURAL NETWORK CLASSIFIERS WITH PROBABILISTIC OUTPUTS
`David Price, Stefan Knerr, Léon Personnaz, Gérard Dreyfus
`
`INTERFERENCE IN LEARNING INTERNAL MODELS OF INVERSE
`DYNAMICS IN HUMANS
`
Reza Shadmehr, Tom Brashers-Krug, Ferdinando Mussa-Ivaldi
`
`COMPUTATIONAL STRUCTURE OF COORDINATE TRANSFORMATIONS:
`A GENERALIZATION STUDY
`
`Zoubin Ghahramani, Daniel M. Wolpert, Michael I. Jordan
`
`Author Index
`
`Keyword Index
`
`1047
`
`1055
`
`1063
`
`1069
`
`1077
`
`1085
`
`1093
`
`1101
`
`1109
`
`1117
`
`1125
`
`1133
`1137
`
`
`
`
`
`This material may be protected by Copyright law (Title 17 U.S. Code)
`
The Use of Dynamic Writing Information
in a Connectionist On-Line Cursive
Handwriting Recognition System
`
`Stefan Manke
`University of Karlsruhe
`Computer Science Department
`D-76128 Karlsruhe, Germany
`manke@ira.uka.de, finkem@ira.uka.de
`
`Michael Finke
`Alex Waibel
`Carnegie Mellon University
`School of Computer Science
`Pittsburgh, PA 15213-3890, U.S.A.
`waibel@cs.cmu.edu
`
`Abstract
`
In this paper we present NPen++, a connectionist system for writer independent, large vocabulary on-line cursive handwriting recognition. This system combines a robust input representation, which preserves the dynamic writing information, with a neural network architecture, a so called Multi-State Time Delay Neural Network (MS-TDNN), which integrates recognition and segmentation in a single framework. Our preprocessing transforms the original coordinate sequence into a (still temporal) sequence of feature vectors, which combine strictly local features, like curvature or writing direction, with a bitmap-like representation of the coordinate's proximity. The MS-TDNN architecture is well suited for handling temporal sequences as provided by this input representation. Our system is tested both on writer dependent and writer independent tasks with vocabulary sizes ranging from 400 up to 20,000 words. For example, on a 20,000 word vocabulary we achieve word recognition rates up to 88.9% (writer dependent) and 84.1% (writer independent) without using any language models.
`
`
`
`
Several preprocessing and recognition approaches for on-line handwriting recognition have been developed during the past years. The main advantage of on-line handwriting recognition in comparison to optical character recognition (OCR) is the temporal information of handwriting, which can be recorded and used for recognition. In general this dynamic writing information (i.e. the time-ordered sequence of coordinates) is not available in OCR, where input consists of scanned text. In this paper we present the NPen++ system, which is designed to preserve the dynamic writing information as long as possible in the preprocessing and recognition process.

During preprocessing a temporal sequence of N-dimensional feature vectors is computed from the original coordinate sequence, which is recorded on the digitizer. These feature vectors combine strictly local features, like curvature and writing direction [4], with so-called context bitmaps, which are bitmap-like representations of a coordinate's proximity.

The recognition component of NPen++ is well suited for handling temporal sequences of patterns, as provided by this kind of input representation. The recognizer, a so-called Multi-State Time Delay Neural Network (MS-TDNN), integrates recognition and segmentation of words into a single network architecture. The MS-TDNN, which was originally proposed for continuous speech recognition tasks [6, 7], combines shift-invariant, high accuracy pattern recognition capabilities of a TDNN [8, 4] with a non-linear alignment procedure for aligning strokes into character sequences.

Our system is applied both to different writer dependent and writer independent, large vocabulary handwriting recognition tasks with vocabulary sizes up to 20,000 words. Writer independent word recognition rates range from 92.9% with a 400 word vocabulary to 84.1% with a 20,000 word vocabulary. For the writer dependent system, word recognition rates for the same tasks range from 98.6% to 88.9% [1].

In the following section we give a description of our preprocessing performed on the raw coordinate sequence, provided by the digitizer. In section 3 the architecture and training of the recognizer is presented. A description of the experiments to evaluate the system and the results we have achieved on different tasks can be found in section 4. Conclusions and future work are described in section 5.
`
`2 PREPROCESSING
`
The dynamic writing information, i.e. the temporal order of the data points, is preserved throughout all preprocessing steps. The original coordinate sequence {(x(t), y(t))}, t ∈ {0, ..., T}, recorded on the digitizer is transformed into a new temporal sequence x_1^T = x_1 ... x_T, where each frame x_t consists of an N-dimensional real-valued feature vector (f_1(t), ...,