


Proceedings of the 1993 Connectionist Models Summer School

by Michael C. Mozer
  • ISBN13:

    9780805815900

  • ISBN10:

    0805815902

  • Edition: 1st
  • Format: Paperback
  • Copyright: 1993-11-01
  • Publisher: Psychology Press

Note: Supplemental materials are not guaranteed with Rental or Used book purchases.

Purchase Benefits

  • Free Shipping on Orders Over $35!
    Your order must be $35 or more to qualify for free economy shipping. Bulk sales, POs, Marketplace items, eBooks, and apparel do not qualify for this offer.
  • Get Rewarded for Ordering Your Textbooks! Enroll Now
List Price: $125.00, Save up to $78.80
  • Rent Book $84.38
    Add to Cart (Free Shipping)

    Usually ships in 3-5 business days.
    *This item is part of an exclusive publisher rental program and requires an additional convenience fee. This fee will be reflected in the shopping cart.

How To: Textbook Rental

Looking to rent a book? Rent Proceedings of the 1993 Connectionist Models Summer School [ISBN: 9780805815900] for the semester, quarter, or short term, or search our site for other textbooks by Michael C. Mozer. Renting a textbook can save you up to 90% off the cost of buying.

Summary

The result of the 1993 Connectionist Models Summer School, the papers in this volume exemplify the tremendous breadth and depth of research underway in the field of neural networks. Although the summer school has always leaned toward cognitive science and artificial intelligence, the diverse scientific backgrounds and research interests of the accepted students and invited faculty reflect the broad spectrum of areas contributing to neural networks, including artificial intelligence, cognitive science, computer science, engineering, mathematics, neuroscience, and physics. Providing an accurate picture of the state of the art in this fast-moving field, the proceedings of this intense two-week program of lectures, workshops, and informal discussions contain timely, high-quality work by the best and the brightest in the neural networks field.

Table of Contents

Foreword, p. vii
Participants in the 1993 Connectionist Models Summer School, p. ix
Neuroscience, p. 1
Sigma vs. Pi Properties of Spiking Neurons, p. 3
Towards a Computational Theory of Rat Navigation, p. 11
Conclusion, p. 18
References, p. 18
Evaluating Connectionist Models in Psychology and Neuroscience, p. 20
Acknowledgments, p. 25
References, p. 26
Vision, p. 29
Self-Organizing Feature Maps with Lateral Connections: Modeling Ocular Dominance, p. 31
References, p. 37
Joint Solution of Low, Intermediate and High-Level Vision Tasks by Global Optimization: Application to Computer Vision at Low SNR, p. 39
Learning Global Spatial Structures from Local Associations, p. 48
References, p. 54
Acknowledgments, p. 54
Cognitive Modeling, p. 55
Connectionist Model of Auditory Morse Code Perception, p. 57
Acknowledgments, p. 64
References, p. 64
Competitive Neural Network Model for the Process of Recurrent Choice, p. 65
Acknowledgements, p. 72
A Neural Network Simulation of Numerical Verbal-to-Arabic Transcoding, p. 73
Conclusions, p. 78
Acknowledgments, p. 79
References, p. 79
Combining Models of Single-Digit Arithmetic and Magnitude Comparison, p. 81
Acknowledgements, p. 86
References, p. 86
Neural Network Models as Tools for Understanding High-Level Cognition: Developing Paradigms for Cognitive Interpretation of Neural Network Models, p. 87
Conclusions, p. 92
Acknowledgments, p. 92
References, p. 93
Language, p. 95
Modeling Language as Sensorimotor Coordination, p. 97
Structure and Content in Word Production: Why It's Hard to Say Dlorm, p. 105
Acknowledgments, p. 111
Investigating Phonological Representations: A Modeling Agenda, p. 113
Acknowledgments, p. 120
Part-of-Speech Tagging Using a Variable Context Markov Model, p. 122
References, p. 129
Acknowledgment, p. 129
Quantitative Predictions from a Constraint-Based Theory of Syntactic Ambiguity Resolution, p. 130
Optimality Semantics, p. 138
Symbolic Computation and Rules, p. 147
What's in a Rule? The Past Tense by Some Other Name Might Be Called a Connectionist Net, p. 149
References, p. 156
Acknowledgements, p. 156
On the Proper Treatment of Symbolism: A Lesson from Linguistics, p. 157
Conclusion, p. 160
Acknowledgments, p. 160
References, p. 161
Structure Sensitivity in Connectionist Models, p. 162
Conclusion, p. 168
Acknowledgments, p. 168
References, p. 169
Looking for Structured Representations in Recurrent Networks, p. 170
Conclusion, p. 176
Acknowledgements, p. 176
References, p. 177
Back Propagation with Understandable Results, p. 178
Introduction, p. 178
Conclusion, p. 183
References, p. 183
Understanding Neural Networks via Rule Extraction and Pruning, p. 184
Acknowledgements, p. 190
Conclusions, p. 190
References, p. 191
Rule Learning and Extraction with Self-Organizing Neural Networks, p. 192
Concluding Remarks, p. 199
Acknowledgements, p. 199
Recurrent Networks and Temporal Pattern Processing, p. 201
Recurrent Networks: State Machines or Iterated Function Systems?, p. 203
Acknowledgments, p. 210
References, p. 210
On the Treatment of Time in Recurrent Neural Networks, p. 211
Acknowledgements, p. 217
References, p. 218
Finding Metrical Structure in Time, p. 219
Conclusions, p. 225
Acknowledgments, p. 226
Representations of Tonal Music: A Case Study in the Development of Temporal Relationships, p. 228
Conclusions, p. 233
Acknowledgments, p. 234
Applications of Radial Basis Function Fitting to the Analysis of Dynamical Systems, p. 236
Acknowledgments, p. 243
References, p. 243
Event Prediction: Faster Learning in a Layered Hebbian Network with Memory, p. 245
Acknowledgments, p. 250
References, p. 250
Control, p. 253
Issues in Using Function Approximation for Reinforcement Learning, p. 255
Approximating Q-Values with Basis Function Representations, p. 264
References, p. 271
Efficient Learning of Multiple Degree-of-Freedom Control Problems with Quasi-Independent Q-Agents, p. 272
References, p. 279
Neural Adaptive Control of Systems with Drifting Parameters, p. 280
Learning Algorithms and Architectures, p. 289
Temporally Local Unsupervised Learning: The MaxIn Algorithm for Maximizing Input Information, p. 291
Minimizing Disagreement for Self-Supervised Classification, p. 300
Acknowledgements, p. 307
References, p. 307
Comparison of Two Unsupervised Neural Network Models for Redundancy Reduction, p. 308
Conclusion, p. 314
References, p. 315
Acknowledgments, p. 315
Solving Inverse Problems Using an EM Approach to Density Estimation, p. 316
Estimating A-Posteriori Probabilities Using Stochastic Network Models, p. 324
References, p. 331
Learning Theory, p. 333
On Overfitting and the Effective Number of Hidden Units, p. 335
Increase of Apparent Complexity Is Due to Decrease of Training Set Error, p. 343
Acknowledgments, p. 348
References, p. 348
Momentum and Optimal Stochastic Search, p. 351
References, p. 356
Acknowledgments, p. 356
Scheme to Improve the Generalization Error, p. 358
Acknowledgements, p. 363
References, p. 363
General Averaging Results for Convex Optimization, p. 364
Acknowledgements, p. 369
Summary, p. 369
References, p. 370
Multitask Connectionist Learning, p. 372
Acknowledgements, p. 378
Summary, p. 378
References, p. 379
Estimating Learning Performance Using Hints, p. 380
Conclusion, p. 385
References, p. 386
Acknowledgement, p. 386
Simulation Tools, p. 387
A Simulator for Asynchronous Hopfield Models, p. 389
An Object-Oriented Dataflow Approach for Better Designs of Neural Net Architectures, p. 397
Acknowledgments, p. 404
References, p. 404
Index, p. 405
Table of Contents provided by Publisher. All Rights Reserved.

Supplemental Materials

What is included with this book?

The New copy of this book will include any supplemental materials advertised. Please check the title of the book to determine if it should include any access cards, study guides, lab manuals, CDs, etc.

The Used, Rental, and eBook copies of this book are not guaranteed to include any supplemental materials. Typically, only the book itself is included. This is true even if the title states that it includes access cards, study guides, lab manuals, CDs, etc.
