Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems

by Dayan, Peter; Abbott, L. F.
  • ISBN13: 9780262541855
  • ISBN10: 0262541858
  • Format: Paperback
  • Copyright: 2005-08-12
  • Publisher: The MIT Press

Note: Supplemental materials are not guaranteed with Rental or Used book purchases.

Purchase Benefits

List Price: $58.67 (save up to $17.60)
  • Rent Book: $41.07 (free shipping; usually ships in 3-5 business days)

    *This item is part of an exclusive publisher rental program and requires an additional convenience fee. This fee will be reflected in the shopping cart.


Summary

Theoretical neuroscience provides a quantitative basis for describing what nervous systems do, determining how they function, and uncovering the general principles by which they operate. This text introduces the basic mathematical and computational methods of theoretical neuroscience and presents applications in a variety of areas including vision, sensory-motor integration, development, learning, and memory. The book is divided into three parts. Part I discusses the relationship between sensory stimuli and neural responses, focusing on the representation of information by the spiking activity of neurons. Part II discusses the modeling of neurons and neural circuits on the basis of cellular and synaptic biophysics. Part III analyzes the role of plasticity in development and learning. An appendix covers the mathematical methods used, and exercises are available on the book's Web site.
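As a small illustration of the kind of computational modeling the summary refers to in Part II (the table of contents below lists "Integrate-and-Fire Models"), here is a minimal sketch of a leaky integrate-and-fire neuron driven by a constant current. It is illustrative only: the parameter values and the function name simulate_lif are assumptions made for this example, not material taken from the book.

    import numpy as np

    # Minimal leaky integrate-and-fire neuron (illustrative sketch; parameter
    # values below are assumptions, not taken from the book).
    def simulate_lif(I_ext=2.0e-9, T=0.5, dt=1e-4,
                     tau_m=20e-3, R_m=1e7, V_rest=-70e-3,
                     V_thresh=-54e-3, V_reset=-80e-3):
        """Euler-integrate dV/dt = (V_rest - V + R_m * I_ext) / tau_m,
        emitting a spike and resetting V whenever it crosses V_thresh."""
        steps = int(T / dt)
        V = np.full(steps, V_rest)
        spike_times = []
        for t in range(1, steps):
            dV = (V_rest - V[t - 1] + R_m * I_ext) / tau_m
            V[t] = V[t - 1] + dt * dV
            if V[t] >= V_thresh:
                spike_times.append(t * dt)
                V[t] = V_reset
        return V, spike_times

    if __name__ == "__main__":
        V, spikes = simulate_lif()
        print(f"{len(spikes)} spikes in 0.5 s -> ~{len(spikes) / 0.5:.1f} Hz")

Running the sketch prints an approximate firing rate for the chosen constant input current; the book develops this class of model, and far more detailed conductance-based ones, in Part II.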

Author Biography

Peter Dayan is Professor of Computational Neuroscience at the Gatsby Computational Neuroscience Unit, University College London.

Table of Contents

Preface xiii
I Neural Encoding and Decoding 1(150)
  Neural Encoding I: Firing Rates and Spike Statistics 3(42)
    Introduction 3(5)
    Spike Trains and Firing Rates 8(9)
    What Makes a Neuron Fire? 17(7)
    Spike-Train Statistics 24(10)
    The Neural Code 34(5)
    Chapter Summary 39(1)
    Appendices 40(3)
    Annotated Bibliography 43(2)
  Neural Encoding II: Reverse Correlation and Visual Receptive Fields 45(42)
    Introduction 45(1)
    Estimating Firing Rates 45(6)
    Introduction to the Early Visual System 51(9)
    Reverse-Correlation Methods: Simple Cells 60(14)
    Static Nonlinearities: Complex Cells 74(3)
    Receptive Fields in the Retina and LGN 77(2)
    Constructing V1 Receptive Fields 79(2)
    Chapter Summary 81(1)
    Appendices 81(3)
    Annotated Bibliography 84(3)
  Neural Decoding 87(36)
    Encoding and Decoding 87(2)
    Discrimination 89(8)
    Population Decoding 97(16)
    Spike-Train Decoding 113(5)
    Chapter Summary 118(1)
    Appendices 119(3)
    Annotated Bibliography 122(1)
  Information Theory 123(28)
    Entropy and Mutual Information 123(7)
    Information and Entropy Maximization 130(15)
    Entropy and Information for Spike Trains 145(4)
    Chapter Summary 149(1)
    Appendix 150(1)
    Annotated Bibliography 150(1)
II Neurons and Neural Circuits 151(128)
  Model Neurons I: Neuroelectronics 153(42)
    Introduction 153(1)
    Electrical Properties of Neurons 153(8)
    Single-Compartment Models 161(1)
    Integrate-and-Fire Models 162(4)
    Voltage-Dependent Conductances 166(7)
    The Hodgkin-Huxley Model 173(2)
    Modeling Channels 175(3)
    Synaptic Conductances 178(10)
    Synapses on Integrate-and-Fire Neurons 188(3)
    Chapter Summary 191(1)
    Appendices 191(2)
    Annotated Bibliography 193(2)
  Model Neurons II: Conductances and Morphology 195(34)
    Levels of Neuron Modeling 195(1)
    Conductance-Based Models 195(8)
    The Cable Equation 203(14)
    Multi-compartment Models 217(7)
    Chapter Summary 224(1)
    Appendices 224(4)
    Annotated Bibliography 228(1)
  Network Models 229(50)
    Introduction 229(2)
    Firing-Rate Models 231(10)
    Feedforward Networks 241(3)
    Recurrent Networks 244(21)
    Excitatory-Inhibitory Networks 265(8)
    Stochastic Networks 273(3)
    Chapter Summary 276(1)
    Appendix 276(1)
    Annotated Bibliography 277(2)
III Adaptation and Learning 279(120)
  Plasticity and Learning 281(50)
    Introduction 281(3)
    Synaptic Plasticity Rules 284(9)
    Unsupervised Learning 293(20)
    Supervised Learning 313(13)
    Chapter Summary 326(1)
    Appendix 327(1)
    Annotated Bibliography 328(3)
  Classical Conditioning and Reinforcement Learning 331(28)
    Introduction 331(1)
    Classical Conditioning 332(8)
    Static Action Choice 340(6)
    Sequential Action Choice 346(8)
    Chapter Summary 354(1)
    Appendix 355(2)
    Annotated Bibliography 357(2)
  Representational Learning 359(40)
    Introduction 359(9)
    Density Estimation 368(5)
    Causal Models for Density Estimation 373(16)
    Discussion 389(5)
    Chapter Summary 394(1)
    Appendix 395(1)
    Annotated Bibliography 396(3)
Mathematical Appendix 399(20)
  Linear Algebra 399(9)
  Finding Extrema and Lagrange Multipliers 408(2)
  Differential Equations 410(3)
  Electrical Circuits 413(2)
  Probability Theory 415(3)
  Annotated Bibliography 418(1)
References 419(20)
Index 439

Supplemental Materials

What is included with this book?

The New copy of this book will include any supplemental materials advertised. Please check the title of the book to determine if it should include any access cards, study guides, lab manuals, CDs, etc.

The Used, Rental and eBook copies of this book are not guaranteed to include any supplemental materials. Typically, only the book itself is included. This is true even if the title states it includes any access cards, study guides, lab manuals, CDs, etc.
