Amazon no longer offers textbook rentals. We do!

We're the #1 textbook rental company. Let us show you why.

Information Theory and the Brain

  • ISBN13:

    9780521631976

  • ISBN10:

    0521631971

  • Edition: 1st
  • Format: Hardcover
  • Copyright: 2000-05-15
  • Publisher: Cambridge University Press

Note: Supplemental materials are not guaranteed with Rental or Used book purchases.

Purchase Benefits

  • Free Shipping On Orders Over $35!
    Your order must be $35 or more to qualify for free economy shipping. Bulk sales, POs, Marketplace items, eBooks and apparel do not qualify for this offer.
  • Get Rewarded for Ordering Your Textbooks! Enroll Now
List Price: $140.00, save up to $42.00
  • Rent Book: $98.00 (free shipping)

    Availability: special order, ships in 1-2 weeks.
    *This item is part of an exclusive publisher rental program and requires an additional convenience fee. This fee will be reflected in the shopping cart.

Summary

Information Theory and the Brain deals with an expanding area of neuroscience that provides a framework for understanding neuronal processing. The book grew out of a conference held in Newquay, UK, where a select group of scientists from around the world met to discuss the topic. It begins with an introduction to the basic concepts of information theory and then illustrates those concepts with examples drawn from 40 years of research. Throughout, the contributors highlight current work in four areas: (1) biological networks, (2) information theory and artificial networks, (3) information theory and psychology, and (4) formal analysis. Each section includes an introduction and a glossary covering the basic concepts. The book will appeal to graduate students and researchers in neuroscience, as well as to computer scientists and cognitive scientists. Neuroscientists interested in any aspect of neural networks or information processing will find it a useful addition to the literature in this rapidly growing field.
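The summary above refers to the basic concepts of information theory that open the book. As a purely illustrative sketch (not taken from the text, and with function names of our own choosing), the two central quantities, Shannon entropy and mutual information, can be computed for discrete distributions in a few lines of Python:

```python
import math

def entropy(probs):
    """Shannon entropy in bits of a discrete probability distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

def mutual_information(joint):
    """Mutual information I(X;Y) in bits, from a joint probability table
    given as a list of rows (rows index X, columns index Y)."""
    px = [sum(row) for row in joint]            # marginal P(X)
    py = [sum(col) for col in zip(*joint)]      # marginal P(Y)
    mi = 0.0
    for i, row in enumerate(joint):
        for j, pxy in enumerate(row):
            if pxy > 0:
                mi += pxy * math.log2(pxy / (px[i] * py[j]))
    return mi

# A fair coin carries one bit of entropy.
print(entropy([0.5, 0.5]))                            # 1.0
# A noiseless binary channel (response always equals stimulus)
# transmits one bit per symbol.
print(mutual_information([[0.5, 0.0], [0.0, 0.5]]))   # 1.0
```

In a neuroscience setting, the joint table would be estimated from stimulus/response counts; the binning and sampling issues that estimation raises are exactly the "Potential Problems and Pitfalls" the introductory chapter discusses.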

Table of Contents

List of Contributors xi
Preface xiii

Introductory Information Theory and the Brain (Roland Baddeley) 1(20)
  Introduction 1(1)
  What Is Information Theory? 1(3)
  Why Is This Interesting? 4(1)
  Practical Use of Information Theory 5(8)
  Maximising Information Transmission 13(4)
  Potential Problems and Pitfalls 17(2)
  Conclusion 19(2)

Part One: Biological Networks 21(58)

Problems and Solutions in Early Visual Processing (Brian G. Burton) 25(16)
  Introduction 25(1)
  Adaptations of the Insect Retina 26(4)
  The Nature of the Retinal Image 30(1)
  Theories for the RFs of Retinal Cells 31(5)
  The Importance of Phase and the Argument for Sparse, Distributed Coding 36(2)
  Discussion 38(3)

Coding Efficiency and the Metabolic Cost of Sensory and Neural Information (Simon B. Laughlin, John C. Anderson, David O'Carroll, Rob De Ruyter Van Steveninck) 41(21)
  Introduction 41(1)
  Why Code Efficiently? 42(3)
  Estimating the Metabolic Cost of Transmitting Information 45(3)
  Transmission Rates and Bit Costs in Different Neural Components of the Blowfly Retina 48(1)
  The Energetic Cost of Neural Information is Substantial 49(1)
  The Costs of Synaptic Transfer 50(2)
  Bit Costs Scale with Channel Capacity – Single Synapses Are Cheaper 52(1)
  Graded Potentials Versus Action Potentials 53(1)
  Costs, Molecular Mechanisms, Cellular Systems and Neural Codes 54(3)
  Investment in Coding Scales with Utility 57(1)
  Phototransduction and the Cost of Seeing 58(1)
  Investment in Vision 59(1)
  Energetics – a Unifying Principle? 60(2)

Coding Third-Order Image Structure (Mitchell Thompson) 62(17)
  Introduction 62(2)
  Higher-Order Statistics 64(1)
  Data Acquisition 65(1)
  Computing the SCF and Power Spectrum 66(2)
  Computing the TCF and Bispectrum 68(2)
  Spectral Measures and Moments 70(2)
  Channels and Correlations 72(5)
  Conclusions 77(2)

Part Two: Information Theory and Artificial Networks 79(122)

Experiments with Low-Entropy Neural Networks (George Harpur, Richard Prager) 84(17)
  Introduction 84(1)
  Entropy in an Information-Processing System 84(2)
  An Unsupervised Neural Network Architecture 86(2)
  Constraints 88(5)
  Linear ICA 93(2)
  Image Coding 95(2)
  Speech Coding 97(3)
  Conclusions 100(1)

The Emergence of Dominance Stripes and Orientation Maps in a Network of Firing Neurons (Stephen P. Luttrell) 101(21)
  Introduction 101(1)
  Theory 102(2)
  Dominance Stripes and Orientation Maps 104(5)
  Simulations 109(9)
  Conclusions 118(4)
  Appendix 119(3)

Dynamic Changes in Receptive Fields Induced by Cortical Reorganization (German Mato, Nestor Parga) 122(17)
  Introduction 122(2)
  The Model 124(3)
  Discussion of the Model 127(3)
  Results 130(7)
  Conclusions 137(2)

Time to Learn About Objects (Guy Wallis) 139(25)
  Introduction 139(3)
  Neurophysiology 142(7)
  A Neural Network Model 149(4)
  Simulating Fractal Image Learning 153(3)
  Psychophysical Experiments 156(6)
  Discussion 162(2)

Principles of Cortical Processing Applied to and Motivated by Artificial Object Recognition (Norbert Kruger, Michael Potzsch, Gabriele Peters) 164(16)
  Introduction 164(2)
  Object Recognition with Banana Wavelets 166(5)
  Analogies to Visual Processing and Their Functional Meaning 171(7)
  Conclusion and Outlook 178(2)

Performance Measurement Based on Usable Information (Martin Elliffe) 180(21)
  Introduction 181(5)
  Information Theory: Simplistic Application 186(1)
  Information Theory: Binning Strategies 187(4)
  Usable Information: Refinement 191(3)
  Result Comparison 194(4)
  Conclusion 198(3)

Part Three: Information Theory and Psychology 201(54)

Modelling Clarity Change in Spontaneous Speech (Matthew Aylett) 204(17)
  Introduction 204(2)
  Modelling Clarity Variation 206(1)
  The Model in Detail 207(6)
  Using the Model to Calculate Clarity 213(2)
  Evaluating the Model 215(3)
  Summary of Results 218(2)
  Discussion 220(1)

Free Gifts from Connectionist Modelling (John A. Bullinaria) 221(20)
  Introduction 221(1)
  Learning and Developmental Bursts 222(1)
  Regularity, Frequency and Consistency Effects 223(4)
  Modelling Reaction Times 227(4)
  Speed–Accuracy Trade-offs 231(1)
  Reaction Time Priming 232(2)
  Cohort and Left–Right Seriality Effects 234(1)
  Lesion Studies 235(4)
  Discussion and Conclusions 239(2)

Information and Resource Allocation (Janne Sinkkonen) 241(14)
  Introduction 241(1)
  Law for Temporal Resource Allocation 242(4)
  Statistical Information and Its Relationships to Resource Allocation 246(2)
  Utility and Resource Sharing 248(1)
  Biological Validity of the Resource Concept 248(1)
  An MMR Study 249(2)
  Discussion 251(4)

Part Four: Formal Analysis 255(63)

Quantitative Analysis of a Schaffer Collateral Model (Simon Schultz, Stefano Panzeri, Edmund Rolls, Alessandro Treves) 257(16)
  Introduction 257(2)
  A Model of the Schaffer Collaterals 259(3)
  Technical Comments 262(2)
  How Graded is Information Representation on the Schaffer Collaterals? 264(3)
  Non-uniform Convergence 267(1)
  Discussion and Summary 268(5)
  Expression from the Replica Evaluation 270(2)
  Parameter Values 272(1)

A Quantitative Model of Information Processing in CA1 (Carlo Fulvi Mari, Stefano Panzeri, Edmund Rolls, Alessandro Treves) 273(17)
  Introduction 273(1)
  Hippocampal Circuitry 274(2)
  The Model 276(4)
  Statistical–Informational Analysis 280(1)
  Results 281(2)
  Discussion 283(7)
  Results of the Analytical Evaluation 283(7)

Stochastic Resonance and Bursting in a Binary-Threshold Neuron with Intrinsic Noise (Paul C. Bressloff, Peter Roper) 290(15)
  Introduction 290(3)
  The One-Vesicle Model 293(1)
  Neuronal Dynamics 294(6)
  Periodic Modulation and Response 300(1)
  Conclusions 301(4)
  The Continuous-Time CK Equation 303(1)
  Derivation of the Critical Temperature 303(2)

Information Density and Cortical Magnification Factors (M.D. Plumbley) 305(13)
  Introduction 305(1)
  Artificial Neural Feature Maps 306(2)
  Information Theory and Information Density 308(1)
  Properties of Information Density and Information Distribution 309(2)
  Symmetrical Conditional Entropy 311(1)
  Example: Two Components 312(1)
  Alternative Measures 312(2)
  Continuous Domain 314(1)
  Continuous Example: Gaussian Random Function 314(2)
  Discussion 316(1)
  Conclusions 316(2)

Bibliography 318(23)
Index 341

Page references are given as "first page (number of pages)".

Supplemental Materials

What is included with this book?

The New copy of this book will include any supplemental materials advertised. Please check the title of the book to determine if it should include any access cards, study guides, lab manuals, CDs, etc.

The Used, Rental and eBook copies of this book are not guaranteed to include any supplemental materials. Typically, only the book itself is included. This is true even if the title states it includes any access cards, study guides, lab manuals, CDs, etc.
