Methods and Models in Neurophysics

  • ISBN13: 9780444517920
  • ISBN10: 0444517928

  • Edition: 1st
  • Format: Hardcover
  • Copyright: 2005-02-09
  • Publisher: Elsevier Science

Summary

Neuroscience is an interdisciplinary field that strives to understand the functioning of neural systems at levels ranging from biomolecules and cells to behaviour and higher brain functions (perception, memory, cognition). Over the past three decades, neurophysics has flourished into an integral part of neuroscience and has arguably reached maturity. It encompasses a vast array of approaches drawn from theoretical physics, computer science, and applied mathematics. This book provides a detailed review of the field, from basic concepts to its most recent developments.

Table of Contents

Experimenting with theory
1(16)
E. Marder
Overcoming communication barriers
6(2)
Modeling with biological neurons-the dynamic clamp
8(1)
The traps inherent in building conductance-based models
9(3)
Theory can drive new experiments
12(1)
Conclusions
13(4)
References
14(3)
Understanding neuronal dynamics by geometrical dissection of minimal models
17(56)
A. Borisyuk
J. Rinzel
Introduction
21(4)
Nonlinear behaviors, time scales, our approach
21(1)
Electrical activity of cells
22(3)
Revisiting the Hodgkin-Huxley equations
25(13)
Background and formulation
25(3)
Hodgkin-Huxley gating equations as idealized kinetic models
28(1)
Dissection of the action potential
29(1)
Current-voltage relations
29(1)
Qualitative view of fast-slow dissection
30(2)
Stability of the fast subsystem's steady states
32(1)
Repetitive firing
33(1)
Stability of the four-variable model's steady state
33(1)
Stability of periodic solutions
34(1)
Bistability
35(3)
Morris-Lecar model
38(13)
Excitable regime
41(1)
Post-inhibitory rebound
42(1)
Single steady state. Onset of repetitive firing, Type II
43(3)
Three steady states
46(1)
Large φ. Bistability of steady states
46(1)
Small φ. Onset of repetitive firing, Type I
47(3)
Intermediate φ. Bistability of rest state and a depolarized oscillation
50(1)
Similar phenomena in the Hodgkin-Huxley model
50(1)
Summary: onset of repetitive firing, Types I and II
51(1)
Bursting, cellular level
51(7)
Geometrical analysis and fast-slow dissection of bursting dynamics
52(1)
Examples of bursting behavior
53(1)
Square wave bursting
53(2)
Parabolic bursting
55(1)
Elliptic bursting
56(1)
Other types of bursting
57(1)
Bursting, network generated. Episodic rhythms in the developing spinal cord
58(7)
Experimental background
58(1)
Firing rate model
59(1)
Basic recurrent network
60(1)
Full model
61(2)
Predictions of the model
63(2)
Chapter summary
65(8)
Appendix A. Mathematical formulation of fast-slow dissection
65(2)
Appendix B. Stability of periodic solutions
67(2)
References
69(4)
Geometric singular perturbation analysis of neuronal dynamics
73(50)
D. Terman
Introduction
77(1)
Introduction to dynamical systems
78(12)
First order equations
78(1)
Bifurcations
79(3)
Bistability and hysteresis
82(1)
The phase plane
83(1)
Oscillations
84(1)
Local bifurcations
85(2)
Global bifurcations
87(1)
Geometric singular perturbation theory
88(2)
Properties of a single neuron
90(7)
Reduced models
91(1)
Response to injected current
92(2)
Traveling wave solutions
94(3)
Two mutually coupled cells
97(11)
Introduction
97(1)
Synaptic coupling
98(1)
Geometric approach
99(2)
Synchrony with excitatory synapses
101(4)
Desynchrony with inhibitory synapses
105(3)
Excitatory-inhibitory networks
108(7)
Introduction
108(3)
Synchronous solution
111(3)
Clustered solutions
114(1)
Activity patterns in the basal ganglia
115(8)
The basal ganglia
115(2)
The model
117(2)
Activity patterns
119(1)
Concluding remarks
119(2)
References
121(2)
Theory of neural synchrony
123(56)
G. Mato
Introduction
127(1)
Weakly coupled oscillators
128(25)
Stability of cluster states
131(1)
One-cluster state
131(1)
Two-cluster states
132(1)
N-cluster state
132(2)
Stochastic dynamics
134(1)
Evaluation of the phase interaction function
135(1)
Synaptic interactions
136(1)
Conductance-based neurons
136(8)
Integrate-and-fire neurons
144(2)
Gap junctions
146(1)
Numerical simulations of conductance-based models
146(3)
Phase interaction for electrotonic interactions
149(4)
Strongly coupled oscillators: mechanisms of synchrony
153(16)
The two-population QIF model
153(3)
Stability of the asynchronous state
156(3)
Mechanisms of synchrony
159(1)
The symmetric case
159(5)
The general case
164(3)
Application: persistent states
167(2)
Conclusion
169(10)
Appendix A. Hodgkin-Huxley and Wang-Buzsáki models
172(2)
Appendix B. Measure of synchrony and variability in numerical simulations
174(1)
Appendix C. Reduction of a conductance-based model to the QIF model
175(2)
References
177(2)
Some useful numerical techniques for simulating integrate-and-fire networks
179(18)
M. Shelley
Introduction
183(1)
The conductance-based I&F model
184(1)
Modified time-stepping schemes
185(5)
Synaptic interactions
190(2)
Simulating a V1 model
192(5)
References
195(2)
Propagation of pulses in cortical networks: the single-spike approximation
197(48)
D. Golomb
Introduction
202(1)
Propagating pulses in networks of excitatory neurons
203(14)
The model
203(2)
Continuous and lurching pulses
205(1)
Existence, stability and velocity of continuous pulses
206(6)
Velocity of lurching pulses
212(3)
The nature of lurching pulses
215(1)
Finite axonal conduction velocity
216(1)
Propagating pulses in networks of excitatory and inhibitory neurons
217(20)
The model
217(2)
Fast and slow pulses
219(1)
Analysis of traveling pulse solutions
220(1)
Volterra representation
220(1)
Existence of traveling pulses
220(2)
Stability of traveling pulses
222(1)
Voltage profile
223(1)
Theory of propagation of fast and slow pulses
223(1)
Effects of I-to-E inhibition and slow E-to-E excitation
224(3)
Response to shock initial conditions
227(2)
Effects of strength of fast E-to-E excitation
229(2)
I-to-I conductance and irregular pulses
231(2)
Lurching pulses
233(1)
Finite axonal conduction velocity
234(1)
Traveling pulse solution with finite axonal conduction velocity
234(2)
Pulse velocity may decrease with c
236(1)
Discussion
237(8)
Types of propagating pulses
237(1)
Effects of approximations
238(1)
Application to biological pulses
239(1)
Network without inhibition
239(1)
Network with inhibition
240(1)
Appendix A. Stability of the lower branch
241(2)
References
243(2)
Activity-dependent transmission in neocortical synapses
245(22)
M. Tsodyks
Introduction
249(1)
Phenomenological model of synaptic depression and facilitation
250(3)
Synaptic depression
250(1)
Modeling synaptic facilitation
251(2)
Dynamic synaptic transmission on the population level
253(3)
Recurrent networks with synaptic depression
256(7)
Conclusion
263(4)
References
264(3)
Theory of large recurrent networks: from spikes to behavior
267(74)
H. Sompolinsky
O.L. White
Introduction
271(1)
From spikes to rates I: rates in asynchronous states
272(12)
Introduction: rate models
272(1)
Synchronous and asynchronous states
273(3)
Synchronous and asynchronous states in highly connected networks
276(3)
Rate equations for asynchronous networks
279(5)
From spikes to rates II: dynamics and conductances
284(11)
Introduction
284(1)
Dynamics of firing rates: linear response theory
285(4)
Synaptic inputs as conductances
289(3)
Rate dynamics for slow synapses
292(1)
Summary
293(2)
Persistent activity and neural integration in the brain
295(17)
Introduction
295(2)
Dynamics of linear recurrent networks
297(7)
Recurrent network model of persistence and integration
304(3)
Single neuron model for integration and persistence
307(5)
Feature selectivity in recurrent networks: the ring model
312(12)
Introduction
312(1)
The ring network
313(2)
Stationary solution of the ring model
315(1)
Symmetry breaking in the ring model
316(8)
Models of associative memory
324(14)
Introduction
324(1)
Binary networks
324(3)
The Hopfield model
327(5)
Networks with sparse memories
332(6)
Concluding remarks
338(3)
References
339(2)
Irregular activity in large networks of neurons
341(66)
C. van Vreeswijk
H. Sompolinsky
Introduction
345(2)
A simple binary model
347(19)
The model
348(1)
Population averaged activity
349(4)
Heterogeneity of the firing rate
353(4)
Temporal correlations
357(3)
Dependence on initial conditions
360(2)
Non-linearity of response of individual cells
362(4)
A memory model
366(6)
A modified Willshaw model
367(1)
Mean field equations
368(2)
Multiple solutions
370(2)
A model of visual cortex hypercolumn
372(12)
The model
373(1)
Population averaged response
374(3)
Quenched disorder
377(3)
Contrast invariance
380(4)
Adding realism: integrate-and-fire network
384(16)
Current-based synapses
384(4)
Conductance-based synapses
388(2)
Networks with two populations
390(2)
A model of a hypercolumn
392(6)
Beyond the Poisson assumption
398(2)
Discussion
400(7)
References
402(5)
Network models of memory
407(70)
N. Brunel
Introduction
411(1)
Persistent neuronal activity during delayed response experiments
412(9)
Discrete working memory: persistent activity in the delayed match to sample experiments
412(2)
Plasticity of persistent activity patterns induced by learning
414(3)
Spatial short-term memory: persistent activity in the oculomotor delayed response task
417(1)
Parametric short-term memory
418(2)
Persistent activity in slices
420(1)
Scenarios for multistability in neural systems
421(2)
Single cell
421(1)
Local network
422(1)
Networks involving several areas
422(1)
Networks of binary neurons with discrete attractors
423(16)
The Hopfield model
423(2)
Signal-to-noise ratio (SNR) analysis
425(1)
Statistical mechanics of the Hopfield model
426(3)
Critiques of the Hopfield model
429(1)
Robustness to perturbations of the synaptic matrix
430(2)
Models with 0,1 neurons and low coding levels
432(1)
Summary: models with binary neurons and analog synapses
433(1)
A model with binary synapses: the Willshaw model
434(2)
Optimal capacity of networks of binary neurons
436(3)
Other generalizations of the Hopfield model
439(1)
Learning
439(6)
Supervised learning: the perceptron learning algorithm
440(2)
Unsupervised learning: palimpsest models
442(1)
Palimpsest models with analog synapses
442(1)
Unsupervised learning in networks with discrete synapses
443(2)
Summary
445(1)
Networks of spiking neurons with discrete attractors
445(14)
A cortical network model
447(1)
Dynamics of networks of spiking neurons: single population analysis
448(3)
Two population networks
451(1)
Storing binary non-overlapping patterns in the cortical network model
452(4)
Spontaneous and persistent activity in the large C, sparse coding limit
456(2)
Capacity
458(1)
Stability of persistent states vs synchronized oscillations
458(1)
Plasticity of persistent activity
459(6)
Learning correlations between patterns separated by delay periods
459(2)
Network subjected to DMS task with a fixed sequence of samples
461(2)
Network subjected to pair-associate tasks
463(1)
Transitions between states in the absence of external inputs in networks of spiking neurons
463(2)
Models with continuous attractors
465(3)
Conclusions
468(9)
References
470(7)
Pattern formation in visual cortex
477(98)
P.C. Bressloff
Introduction
481(4)
The functional architecture of V1
485(8)
Retinotopic map
485(3)
Feature maps
488(3)
Long-range horizontal and feedback connections
491(2)
Large-scale models of V1
493(18)
Planar model of V1
493(2)
Receptive fields and the feedforward input h
495(6)
Orientation pinwheels as organizing centers for feature map F(r)
501(2)
Long-range connections and the weight distribution w
503(3)
Coupled hypercolumn model of V1
506(5)
Pattern formation in a single hypercolumn
511(17)
Orientation tuning in a one-population ring model
511(2)
Derivation of amplitude equation using the Fredholm alternative
513(4)
Bifurcation theory and O(2) symmetry
517(2)
Two-population model
519(2)
Amplitude equation for oscillatory patterns
521(3)
Mean-field theory
524(4)
Pattern formation in a coupled hypercolumn model of V1
528(19)
Linear stability analysis
529(3)
Marginal stability
532(3)
Doubly-periodic planforms
535(3)
Amplitude equation
538(2)
Bifurcation theory and shift-twist symmetry
540(2)
Selection and stability of patterns
542(3)
From cortical patterns to geometric visual hallucinations
545(2)
Pattern formation in a planar model of V1
547(14)
Homogeneous weights
547(3)
Pinning of cortical patterns by spatially periodic horizontal connections
550(6)
Ginzburg-Landau equation
556(2)
Commensurate-incommensurate transitions in cortex
558(3)
Pattern formation in a model of cortical development
561(7)
Future directions
568(7)
References
570(5)
Symmetry breaking and pattern selection in visual cortical development
575(66)
F. Wolf
Introduction
579(3)
The pattern of orientation preference columns
582(1)
Symmetries in the development of orientation columns
583(4)
From learning to dynamics
587(1)
Generation and motion of pinwheels
588(9)
Random orientation maps
588(5)
Gaussian fields from a dynamic instability
593(3)
Predicting pinwheel annihilation
596(1)
The problem of pinwheel stability
597(1)
Weakly nonlinear analysis of pattern selection
598(16)
Amplitude equations
599(2)
Fixed points of the amplitude equations and their stability
601(5)
Linking amplitude equations and field dynamics
606(6)
Extrinsic stability
612(1)
Swift-Hohenberg models
613(1)
A Swift-Hohenberg model with stable pinwheel patterns
614(21)
Construction of the model
615(5)
Essentially complex planforms
620(5)
Stability ranges and phase diagram
625(8)
The spectrum of pinwheel densities
633(2)
Discussion
635(6)
References
638(3)
Of the evolution of the brain
641(50)
A. Treves
Y. Roudi
Introduction and summary
645(1)
The phase transition that made us mammals
645(5)
An information-theoretical advantage in the hippocampus
646(1)
An information-theoretical hypothesis about layers and maps
647(3)
Maps and patterns of threshold-linear units
650(9)
A model with geometry in its connections
651(1)
Retrieval states
652(3)
The network without structure
655(2)
Appearance of bumps of activity
657(2)
The main points
659(1)
Validation of the lamination hypothesis
659(4)
Differentiation among isocortical layers
661(2)
What do we need DG and CA1 for?
663(5)
Distinguishing storage from retrieval
664(2)
CA1 in search of a role
666(2)
Infinite recursion and the origin of cognition
668(6)
Infinite recursion and its ambiguities
669(1)
Memory: statics and dynamics
670(2)
Memory latching as a model of recursive dynamics
672(2)
Reducing local networks to Potts units
674(17)
A discrete-valued model
675(1)
Storage capacity
676(1)
Sparse coding
677(1)
A Potts model with graded response
678(1)
Correlated patterns
679(2)
Scheme of the simulations
681(3)
Conclusions
684(1)
References
685(6)
Theory of point processes for neural systems
691(38)
E.N. Brown
Neural spike trains as point processes
694(1)
Integrate and fire models and interspike interval distributions
695(6)
Non-leaky integrator with excitatory Poisson inputs
695(2)
Non-leaky integrator with excitatory and inhibitory Poisson inputs
697(1)
Non-leaky integrator with random walk inputs
698(2)
Remarks
700(1)
The conditional intensity function and interevent time probability density
701(3)
Joint probability density of a point process
704(4)
Derivation of the joint probability density
704(3)
An alternative derivation of the joint probability density of a spike train
707(1)
Special point process models
708(7)
Poisson processes
709(1)
Axioms for a Poisson process
709(1)
Interevent and waiting time probability densities for a Poisson process
709(2)
Inhomogeneous Poisson process
711(2)
Renewal processes
713(1)
Counting process associated with a renewal process
713(1)
Asymptotic distribution of N(t) for large t
714(1)
Stationary process
714(1)
Covariance function
714(1)
Spectral density function
715(1)
The time-rescaling theorem
715(5)
An elementary proof of the time-rescaling theorem
716(2)
The time-rescaling theorem: assessing model goodness-of-fit
718(1)
Kolmogorov-Smirnov test
719(1)
Quantile-quantile plot
719(1)
Normalized point process residuals
720(1)
Simulation of point processes
720(4)
Thinning algorithm 1
720(1)
Simulating a multivariate point process model
721(2)
Simulating a univariate point process by time-rescaling
723(1)
Poisson limit theorems
724(1)
Poisson approximation to the binomial probability mass function
724(1)
Superposition of point processes
725(1)
Problems
725(4)
References
726(3)
Technique(s) for spike-sorting
729(58)
C. Pouzat
Introduction
733(1)
The problem to solve
734(1)
Two features of single neuron data we would like to include in the spike-sorting procedure
735(4)
Spike waveforms from a single neuron are usually not stationary on a short time-scale
735(1)
An experimental illustration with cerebellar Purkinje cells
735(1)
A phenomenological description by an exponential relaxation
735(1)
Neurons' inter-spike interval probability density functions carry a lot of information we would like to exploit
736(1)
An experimental illustration from projection neurons in the locust antennal lobe
736(1)
A phenomenological description by a log-normal pdf
737(1)
Noise properties
738(1)
Noise whitening
739(2)
Probabilistic data generation model
741(1)
Model assumptions
741(8)
Likelihood computation for single neuron data
741(4)
Complications with multi-neuron data
745(1)
Notations for the multi-neuron model parameters
745(1)
Configuration and data augmentation
745(2)
Posterior density
747(1)
Remarks on the use of the posterior
747(1)
The normalizing constant problem and its solution
748(1)
The problem
748(1)
Its solution
748(1)
Markov chains
749(8)
Some notations and definitions
750(2)
The fundamental theorem and the ergodic theorem
752(5)
The Metropolis-Hastings algorithm and its relatives
757(5)
The Metropolis-Hastings algorithm
757(1)
Second fundamental theorem
757(1)
Metropolis-Hastings and Gibbs algorithms for multi-dimensional spaces
758(1)
An example: the Gibbs sampler for the parameters of the ISI density
759(2)
Generation of the amplitude parameters of our model
761(1)
Generation of the configuration
762(1)
Our complete MC step
762(1)
Priors choice
762(2)
Proper use of the ergodic theorem: a warning
764(1)
Autocorrelation functions and confidence intervals
764(1)
Initialization bias
765(1)
Slow relaxation and the replica exchange method
765(3)
An example from a simulated data set
768(10)
Data properties
769(1)
Algorithm dynamics and parameter estimates without the REM
769(1)
Initialization
769(1)
Energy evolution
770(1)
Model parameter evolution
771(1)
Model parameter estimates
771(1)
Algorithm dynamics and parameter estimates with the REM
772(1)
Making sure the REM "works"
773(1)
Posterior estimates with the REM
774(1)
Configuration estimate
775(2)
A more detailed illustration of the REM dynamics
777(1)
Conclusions
778(2)
Solutions to exercises
780(7)
Cholesky decomposition (or factorization)
780(1)
Bayesian posterior densities for the parameters of a log-normal pdf
781(1)
Stochastic property of a Markov matrix
782(1)
Detailed balance
783(1)
References
783(4)
The emergence of relevant data representations: an information theoretic approach
787
N. Tishby
Part I: the fundamental dilemma
792
Fitting models to data---what is the goal?
792
Accuracy versus complexity
793
Quantifying complexity and accuracy
794
Part II: Shannon's information theory---a new perspective
795
Formalizing the problem: distortion versus cost
797
Solving the cost-distortion tradeoff: the emergence of mutual information
798
The rate-distortion function
801
Example: the Gaussian source and channel
802
Perfect matching of source and channel
803
Part III: relevant data representation
804
The "inverse Shannon problem"
805
Looking into the black box: the information bottleneck method
806
IB as perfectly matched source-channel
809
Alternating projections and the IB algorithm
810
Part IV: applications and extensions
812
IB clustering
812
IB dimensionality reduction: the Gaussian IB
813
Towards multivariate network IB
817
Bayesian networks and multi-information
817
Network information bottleneck principle
820
Characterization of the IB fixed points
823
Deterministic annealing algorithm
825
References
827
