Proceedings of the 1993 Connectionist Models Summer School [ISBN: 9780805815900], by Michael C. Mozer.
Foreword | p. vii |
Participants in the 1993 Connectionist Models Summer School | p. ix |
Neuroscience | p. 1 |
Sigma vs. Pi Properties of Spiking Neurons | p. 3 |
Towards a Computational Theory of Rat Navigation | p. 11 |
Conclusion | p. 18 |
References | p. 18 |
Evaluating Connectionist Models in Psychology and Neuroscience | p. 20 |
Acknowledgments | p. 25 |
References | p. 26 |
Vision | p. 29 |
Self-Organizing Feature Maps with Lateral Connections: Modeling Ocular Dominance | p. 31 |
References | p. 37 |
Joint Solution of Low, Intermediate and High-Level Vision Tasks by Global Optimization: Application to Computer Vision at Low SNR | p. 39 |
Learning Global Spatial Structures from Local Associations | p. 48 |
References | p. 54 |
Acknowledgments | p. 54 |
Cognitive Modeling | p. 55 |
Connectionist Model of Auditory Morse Code Perception | p. 57 |
Acknowledgments | p. 64 |
References | p. 64 |
Competitive Neural Network Model for the Process of Recurrent Choice | p. 65 |
Acknowledgements | p. 72 |
A Neural Network Simulation of Numerical Verbal-to-Arabic Transcoding | p. 73 |
Conclusions | p. 78 |
Acknowledgments | p. 79 |
References | p. 79 |
Combining Models of Single-Digit Arithmetic and Magnitude Comparison | p. 81 |
Acknowledgements | p. 86 |
References | p. 86 |
Neural Network Models as Tools for Understanding High-Level Cognition: Developing Paradigms for Cognitive Interpretation of Neural Network Models | p. 87 |
Conclusions | p. 92 |
Acknowledgments | p. 92 |
References | p. 93 |
Language | p. 95 |
Modeling Language as Sensorimotor Coordination | p. 97 |
Structure and Content in Word Production: Why It's Hard to Say Dlorm | p. 105 |
Acknowledgments | p. 111 |
Investigating Phonological Representations: A Modeling Agenda | p. 113 |
Acknowledgments | p. 120 |
Part-of-Speech Tagging Using a Variable Context Markov Model | p. 122 |
References | p. 129 |
Acknowledgment | p. 129 |
Quantitative Predictions from a Constraint-Based Theory of Syntactic Ambiguity Resolution | p. 130 |
Optimality Semantics | p. 138 |
Symbolic Computation and Rules | p. 147 |
What's in a Rule? The Past Tense by Some Other Name Might Be Called a Connectionist Net | p. 149 |
References | p. 156 |
Acknowledgements | p. 156 |
On the Proper Treatment of Symbolism: A Lesson from Linguistics | p. 157 |
Conclusion | p. 160 |
Acknowledgments | p. 160 |
References | p. 161 |
Structure Sensitivity in Connectionist Models | p. 162 |
Conclusion | p. 168 |
Acknowledgments | p. 168 |
References | p. 169 |
Looking for Structured Representations in Recurrent Networks | p. 170 |
Conclusion | p. 176 |
Acknowledgements | p. 176 |
References | p. 177 |
Back Propagation with Understandable Results | p. 178 |
Introduction | p. 178 |
Conclusion | p. 183 |
References | p. 183 |
Understanding Neural Networks Via Rule Extraction and Pruning | p. 184 |
Acknowledgements | p. 190 |
Conclusions | p. 190 |
References | p. 191 |
Rule Learning and Extraction with Self-Organizing Neural Networks | p. 192 |
Concluding Remarks | p. 199 |
Acknowledgements | p. 199 |
Recurrent Networks and Temporal Pattern Processing | p. 201 |
Recurrent Networks: State Machines or Iterated Function Systems? | p. 203 |
Acknowledgments | p. 210 |
References | p. 210 |
On the Treatment of Time in Recurrent Neural Networks | p. 211 |
Acknowledgements | p. 217 |
References | p. 218 |
Finding Metrical Structure in Time | p. 219 |
Conclusions | p. 225 |
Acknowledgments | p. 226 |
Representations of Tonal Music: A Case Study in the Development of Temporal Relationships | p. 228 |
Conclusions | p. 233 |
Acknowledgments | p. 234 |
Applications of Radial Basis Function Fitting to the Analysis of Dynamical Systems | p. 236 |
Acknowledgments | p. 243 |
References | p. 243 |
Event Prediction: Faster Learning in a Layered Hebbian Network with Memory | p. 245 |
Acknowledgments | p. 250 |
References | p. 250 |
Control | p. 253 |
Issues in Using Function Approximation for Reinforcement Learning | p. 255 |
Approximating Q-Values with Basis Function Representations | p. 264 |
References | p. 271 |
Efficient Learning of Multiple Degree-of-Freedom Control Problems with Quasi-Independent Q-Agents | p. 272 |
References | p. 279 |
Neural Adaptive Control of Systems with Drifting Parameters | p. 280 |
Learning Algorithms and Architectures | p. 289 |
Temporally Local Unsupervised Learning: The MaxIn Algorithm for Maximizing Input Information | p. 291 |
Minimizing Disagreement for Self-Supervised Classification | p. 300 |
Acknowledgements | p. 307 |
References | p. 307 |
Comparison of Two Unsupervised Neural Network Models for Redundancy Reduction | p. 308 |
Conclusion | p. 314 |
References | p. 315 |
Acknowledgments | p. 315 |
Solving Inverse Problems Using an EM Approach to Density Estimation | p. 316 |
Estimating A-Posteriori Probabilities Using Stochastic Network Models | p. 324 |
References | p. 331 |
Learning Theory | p. 333 |
On Overfitting and the Effective Number of Hidden Units | p. 335 |
Increase of Apparent Complexity is Due to Decrease of Training Set Error | p. 343 |
Acknowledgments | p. 348 |
References | p. 348 |
Momentum and Optimal Stochastic Search | p. 351 |
References | p. 356 |
Acknowledgments | p. 356 |
Scheme to Improve the Generalization Error | p. 358 |
Acknowledgements | p. 363 |
References | p. 363 |
General Averaging Results for Convex Optimization | p. 364 |
Acknowledgements | p. 369 |
Summary | p. 369 |
References | p. 370 |
Multitask Connectionist Learning | p. 372 |
Acknowledgements | p. 378 |
Summary | p. 378 |
References | p. 379 |
Estimating Learning Performance Using Hints | p. 380 |
Conclusion | p. 385 |
References | p. 386 |
Acknowledgement | p. 386 |
Simulation Tools | p. 387 |
A Simulator for Asynchronous Hopfield Models | p. 389 |
An Object-Oriented Dataflow Approach for Better Designs of Neural Net Architectures | p. 397 |
Acknowledgments | p. 404 |
References | p. 404 |
Index | p. 405 |
Table of Contents provided by Publisher. All Rights Reserved. |