Amazon no longer offers textbook rentals. We do!

We're the #1 textbook rental company. Let us show you why.

Neural Network Learning: Theoretical Foundations

by Martin Anthony and Peter L. Bartlett
  • ISBN13: 9780521573535
  • ISBN10: 052157353X
  • Format: Hardcover
  • Copyright: 1999-11-13
  • Publisher: Cambridge University Press

Note: Supplemental materials are not guaranteed with Rental or Used book purchases.

Purchase Benefits

  • Free Shipping On Orders Over $35!
    Your order must be $35 or more to qualify for free economy shipping. Bulk sales, POs, Marketplace items, eBooks, and apparel do not qualify for this offer.
  • Get Rewarded for Ordering Your Textbooks! Enroll Now

List Price: $142.00 (save up to $73.39)

  • Rent Book: $89.46 (Free Shipping)

    Availability: special order, 1-2 weeks.
    *This item is part of an exclusive publisher rental program and requires an additional convenience fee. This fee will be reflected in the shopping cart.

Summary

This book describes recent theoretical advances in the study of artificial neural networks. It explores probabilistic models of supervised learning problems and addresses the key statistical and computational questions. The authors survey research on pattern classification with binary-output networks, including a discussion of the relevance of the Vapnik-Chervonenkis dimension and estimates of that dimension for several neural network models. They develop a model of classification by real-output networks and demonstrate the usefulness of classification with a 'large margin'. The authors explain the role of scale-sensitive versions of the Vapnik-Chervonenkis dimension in large margin classification and in real-valued prediction. They also discuss the computational complexity of neural network learning, describing a variety of hardness results and outlining two efficient constructive learning algorithms. The book is self-contained and accessible to researchers and graduate students in computer science, engineering, and mathematics.
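
The summary leans on two combinatorial quantities, so here is a conventional statement of both for orientation (the notation below is standard, not quoted from the book's text): for a class $H$ of binary-valued functions, the growth function is

    $\Pi_H(m) = \max_{x_1, \ldots, x_m} \left| \{ (h(x_1), \ldots, h(x_m)) : h \in H \} \right|$,

and the Vapnik-Chervonenkis dimension $\mathrm{VCdim}(H)$ is the largest $m$ for which $\Pi_H(m) = 2^m$. If $\mathrm{VCdim}(H) = d$, the Sauer-Shelah lemma gives

    $\Pi_H(m) \le \sum_{i=0}^{d} \binom{m}{i} = O(m^d)$,

and polynomial bounds of this form drive the sample complexity results surveyed in Parts one and two.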

Table of Contents

Preface xiii
Introduction 1
  Supervised learning 1
  Artificial neural networks 2
  Outline of the book 7
  Bibliographical notes 9
Part one: Pattern Classification with Binary-Output Neural Networks 11
The Pattern Classification Problem 13
  The learning problem 13
  Learning finite function classes 19
  Applications to perceptrons 22
  Restricted model 23
  Remarks 25
  Bibliographical notes 27
The Growth Function and VC-Dimension 29
  Introduction 29
  The growth function 29
  The Vapnik-Chervonenkis dimension 35
  Bibliographical notes 41
General Upper Bounds on Sample Complexity 42
  Learning by minimizing sample error 42
  Uniform convergence and learnability 43
  Proof of uniform convergence result 45
  Application to the perceptron 50
  The restricted model 52
  Remarks 53
  Bibliographical notes 58
General Lower Bounds on Sample Complexity 59
  Introduction 59
  A lower bound for learning 59
  The restricted model 65
  VC-dimension quantifies sample complexity 69
  Remarks 71
  Bibliographical notes 72
The VC-Dimension of Linear Threshold Networks 74
  Feed-forward neural networks 74
  Upper bound 77
  Lower bounds 80
  Sigmoid networks 83
  Bibliographical notes 85
Bounding the VC-Dimension using Geometric Techniques 86
  Introduction 86
  The need for conditions on the activation functions 86
  A bound on the growth function 89
  Proof of the growth function bound 92
  More on solution set components bounds 102
  Bibliographical notes 106
Vapnik-Chervonenkis Dimension Bounds for Neural Networks 108
  Introduction 108
  Function classes that are polynomial in their parameters 108
  Piecewise-polynomial networks 112
  Standard sigmoid networks 122
  Remarks 128
  Bibliographical notes 129
Part two: Pattern Classification with Real-Output Networks 131
Classification with Real-Valued Functions 133
  Introduction 133
  Large margin classifiers 135
  Remarks 138
  Bibliographical notes 138
Covering Numbers and Uniform Convergence 140
  Introduction 140
  Covering numbers 140
  A uniform convergence result 143
  Covering numbers in general 147
  Remarks 149
  Bibliographical notes 150
The Pseudo-Dimension and Fat-Shattering Dimension 151
  Introduction 151
  The pseudo-dimension 151
  The fat-shattering dimension 159
  Bibliographical notes 163
Bounding Covering Numbers with Dimensions 165
  Introduction 165
  Packing numbers 165
  Bounding with the pseudo-dimension 167
  Bounding with the fat-shattering dimension 174
  Comparing the two approaches 181
  Remarks 182
  Bibliographical notes 183
The Sample Complexity of Classification Learning 184
  Large margin SEM algorithms 184
  Large margin SEM algorithms as learning algorithms 185
  Lower bounds for certain function classes 188
  Using the pseudo-dimension 191
  Remarks 191
  Bibliographical notes 192
The Dimensions of Neural Networks 193
  Introduction 193
  Pseudo-dimension of neural networks 194
  Fat-shattering dimension bounds: number of parameters 196
  Fat-shattering dimension bounds: size of parameters 203
  Remarks 213
  Bibliographical notes 216
Model Selection 218
  Introduction 218
  Model selection results 220
  Proofs of the results 223
  Remarks 225
  Bibliographical notes 227
Part three: Learning Real-Valued Functions 229
Learning Classes of Real Functions 231
  Introduction 231
  The learning framework for real estimation 232
  Learning finite classes of real functions 234
  A substitute for finiteness 236
  Remarks 239
  Bibliographical notes 240
Uniform Convergence Results for Real Function Classes 241
  Uniform convergence for real functions 241
  Remarks 245
  Bibliographical notes 246
Bounding Covering Numbers 247
  Introduction 247
  Bounding with the fat-shattering dimension 247
  Bounding with the pseudo-dimension 250
  Comparing the different approaches 254
  Remarks 255
  Bibliographical notes 256
Sample Complexity of Learning Real Function Classes 258
  Introduction 258
  Classes with finite fat-shattering dimension 258
  Classes with finite pseudo-dimension 260
  Results for neural networks 261
  Lower bounds 262
  Remarks 265
  Bibliographical notes 267
Convex Classes 269
  Introduction 269
  Lower bounds for non-convex classes 270
  Upper bounds for convex classes 277
  Remarks 280
  Bibliographical notes 282
Other Learning Problems 284
  Loss functions in general 284
  Convergence for general loss functions 285
  Learning in multiple-output networks 286
  Interpolation models 289
  Remarks 295
  Bibliographical notes 296
Part four: Algorithmics 297
Efficient Learning 299
  Introduction 299
  Graded function classes 299
  Efficient learning 301
  General classes of efficient learning algorithms 302
  Efficient learning in the restricted model 305
  Bibliographical notes 306
Learning as Optimization 307
  Introduction 307
  Randomized algorithms 307
  Learning as randomized optimization 311
  A characterization of efficient learning 312
  The hardness of learning 312
  Remarks 314
  Bibliographical notes 315
The Boolean Perceptron 316
  Introduction 316
  Learning is hard for the simple perceptron 316
  Learning is easy for fixed fan-in perceptrons 319
  Perceptron learning in the restricted model 322
  Remarks 328
  Bibliographical notes 329
Hardness Results for Feed-Forward Networks 331
  Introduction 331
  Linear threshold networks with binary inputs 331
  Linear threshold networks with real inputs 335
  Sigmoid networks 337
  Remarks 338
  Bibliographical notes 339
Constructive Learning Algorithms for Two-Layer Networks 342
  Introduction 342
  Real estimation with convex combinations 342
  Classification learning using boosting 351
  Bibliographical notes 355
Appendix 1 Useful Results 357
Bibliography 365
Author index 379
Subject index 382

Supplemental Materials

What is included with this book?

The New copy of this book will include any supplemental materials advertised. Please check the title of the book to determine if it should include any access cards, study guides, lab manuals, CDs, etc.

The Used, Rental and eBook copies of this book are not guaranteed to include any supplemental materials. Typically, only the book itself is included. This is true even if the title states it includes any access cards, study guides, lab manuals, CDs, etc.
