
NEURAL NETWORKS FOR PATTERN RECOGNITION

by Bishop, Christopher M.
  • ISBN13: 9780198538646
  • ISBN10: 0198538642
  • Format: Paperback
  • Copyright: 1996-01-18
  • Publisher: Clarendon Press

Note: Supplemental materials are not guaranteed with Rental or Used book purchases.

Purchase Benefits

  • Free Shipping On Orders Over $35!
    Your order must be $35 or more to qualify for free economy shipping. Bulk sales, POs, Marketplace items, eBooks and apparel do not qualify for this offer.
  • Get Rewarded for Ordering Your Textbooks!
List Price: $117.33 (save up to $53.97)
  • Rent Book: $63.36 (free shipping; usually ships in 24-48 hours)
    *This item is part of an exclusive publisher rental program and requires an additional convenience fee. This fee will be reflected in the shopping cart.

How To: Textbook Rental

Looking to rent a book? Rent NEURAL NETWORKS FOR PATTERN RECOGNITION [ISBN: 9780198538646] for the semester, quarter, or short term, or search our site for other textbooks by Bishop, Christopher M. Renting a textbook can save you up to 90% of the cost of buying.

Summary

This book provides the first comprehensive treatment of feed-forward neural networks from the perspective of statistical pattern recognition. After introducing the basic concepts of pattern recognition, the book describes techniques for modelling probability density functions, and discusses the properties and relative merits of the multi-layer perceptron and radial basis function network models. It also motivates the use of various forms of error functions, and reviews the principal algorithms for error function minimization. As well as providing a detailed discussion of learning and generalization in neural networks, the book also covers the important topics of data pre-processing, feature extraction, and prior knowledge. The book concludes with an extensive treatment of Bayesian techniques and their applications to neural networks.
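As a rough illustration of the kind of model the summary describes (this is not code from the book itself), the sketch below implements a two-layer feed-forward network with sigmoidal hidden units and a sum-of-squares error using NumPy; the dimensions, weights, and data are arbitrary assumptions chosen only for demonstration.

```python
import numpy as np

def sigmoid(a):
    """Logistic sigmoid activation for the hidden units."""
    return 1.0 / (1.0 + np.exp(-a))

def forward(x, W1, b1, W2, b2):
    """Forward pass of a two-layer feed-forward network.

    x  : input vector, shape (d,)
    W1 : first-layer weights (h, d); b1: hidden biases (h,)
    W2 : second-layer weights (c, h); b2: output biases (c,)
    Returns the network output, shape (c,).
    """
    z = sigmoid(W1 @ x + b1)   # hidden-unit activations
    y = W2 @ z + b2            # linear output units (regression case)
    return y

def sum_of_squares_error(y_pred, t):
    """Sum-of-squares error E = 1/2 * sum (y - t)^2."""
    return 0.5 * np.sum((y_pred - t) ** 2)

# Tiny illustrative run with random weights and a single data point.
rng = np.random.default_rng(0)
d, h, c = 3, 5, 2                       # input, hidden, output dimensions (assumed)
W1, b1 = rng.normal(size=(h, d)), np.zeros(h)
W2, b2 = rng.normal(size=(c, h)), np.zeros(c)
x, t = rng.normal(size=d), rng.normal(size=c)
y = forward(x, W1, b1, W2, b2)
print("output:", y, "error:", sum_of_squares_error(y, t))
```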

Table of Contents

Statistical Pattern Recognition 1(33)
  An example -- character recognition 1(4)
  Classification and regression 5(1)
  Pre-processing and feature extraction 6(1)
  The curse of dimensionality 7(2)
  Polynomial curve fitting 9(5)
  Model complexity 14(1)
  Multivariate non-linear functions 15(2)
  Bayes' theorem 17(6)
  Decision boundaries 23(4)
  Minimizing risk 27(6)
  Exercises 28(5)
Probability Density Estimation 33(44)
  Parametric methods 34(5)
  Maximum likelihood 39(3)
  Bayesian inference 42(4)
  Sequential parameter estimation 46(3)
  Non-parametric methods 49(10)
  Mixture models 59(18)
  Exercises 73(4)
Single-Layer Networks 77(39)
  Linear discriminant functions 77(8)
  Linear separability 85(3)
  Generalized linear discriminants 88(1)
  Least-squares techniques 89(9)
  The perceptron 98(7)
  Fisher's linear discriminant 105(11)
  Exercises 112(4)
The Multi-layer Perceptron 116(48)
  Feed-forward network mappings 116(5)
  Threshold units 121(5)
  Sigmoidal units 126(7)
  Weight-space symmetries 133(1)
  Higher-order networks 133(2)
  Projection pursuit regression 135(2)
  Kolmogorov's theorem 137(3)
  Error back-propagation 140(8)
  The Jacobian matrix 148(2)
  The Hessian matrix 150(14)
  Exercises 161(3)
Radial Basis Functions 164(30)
  Exact interpolation 164(3)
  Radial basis function networks 167(3)
  Network training 170(1)
  Regularization theory 171(5)
  Noisy interpolation theory 176(1)
  Relation to kernel regression 177(2)
  Radial basis function networks for classification 179(3)
  Comparison with the multi-layer perceptron 182(1)
  Basis function optimization 183(7)
  Supervised training 190(4)
  Exercises 191(3)
Error Functions 194(59)
  Sum-of-squares error 195(13)
  Minkowski error 208(3)
  Input-dependent variance 211(1)
  Modelling conditional distributions 212(10)
  Estimating posterior probabilities 222(3)
  Sum-of-squares for classification 225(5)
  Cross-entropy for two classes 230(6)
  Multiple independent attributes 236(1)
  Cross-entropy for multiple classes 237(3)
  Entropy 240(5)
  General conditions for outputs to be probabilities 245(8)
  Exercises 248(5)
Parameter Optimization Algorithms 253(42)
  Error surfaces 254(3)
  Local quadratic approximation 257(2)
  Linear output units 259(1)
  Optimization in practice 260(3)
  Gradient descent 263(9)
  Line search 272(2)
  Conjugate gradients 274(8)
  Scaled conjugate gradients 282(3)
  Newton's method 285(2)
  Quasi-Newton methods 287(3)
  The Levenberg--Marquardt algorithm 290(5)
  Exercises 292(3)
Pre-processing and Feature Extraction 295(37)
  Pre-processing and post-processing 296(2)
  Input normalization and encoding 298(3)
  Missing data 301(1)
  Time series prediction 302(2)
  Feature selection 304(6)
  Principal component analysis 310(9)
  Invariances and prior knowledge 319(13)
  Exercises 329(3)
Learning and Generalization 332(53)
  Bias and variance 333(5)
  Regularization 338(8)
  Training with noise 346(3)
  Soft weight sharing 349(4)
  Growing and pruning algorithms 353(11)
  Committees of networks 364(5)
  Mixtures of experts 369(2)
  Model order selection 371(6)
  Vapnik--Chervonenkis dimension 377(8)
  Exercises 380(5)
Bayesian Techniques 385(55)
  Bayesian learning of network weights 387(11)
  Distribution of network outputs 398(5)
  Application to classification problems 403(3)
  The evidence framework for α and β 406(9)
  Integration over hyperparameters 415(3)
  Bayesian model comparison 418(4)
  Committees of networks 422(2)
  Practical implementation of Bayesian techniques 424(1)
  Monte Carlo methods 425(4)
  Minimum description length 429(11)
  Exercises 433(7)
A Symmetric Matrices 440(4)
B Gaussian Integrals 444(4)
C Lagrange Multipliers 448(3)
D Calculus of Variations 451(3)
E Principal Components 454(3)
References 457(20)
Index 477

Supplemental Materials

What is included with this book?

The New copy of this book will include any supplemental materials advertised. Please check the title of the book to determine if it should include any access cards, study guides, lab manuals, CDs, etc.

The Used, Rental and eBook copies of this book are not guaranteed to include any supplemental materials. Typically, only the book itself is included. This is true even if the title states it includes any access cards, study guides, lab manuals, CDs, etc.
