
Ten Lectures on Statistical and Structural Pattern Recognition

by Michail I. Schlesinger; Václav Hlaváč
  • ISBN13:

    9781402006425

  • ISBN10:

    140200642X

  • Format: Hardcover
  • Copyright: 2002-07-01
  • Publisher: Kluwer Academic Publishers
List Price: $99.99 (save up to $81.43)
  • Digital: $40.22


Summary

This monograph explores the close relationship between various well-known pattern recognition problems that have so far been considered independent. These relationships became apparent with the discovery of formal procedures for addressing the known problems and their generalisations: the generalised problem formulations were analysed mathematically, and unified algorithms were found. The book's main scientific contribution is the unification of the two main streams in pattern recognition, the statistical one and the structural one. The material is presented in the form of ten lectures, each of which concludes with a discussion between the authors and a student. The book offers new views and numerous original results in the field. Written in an easily accessible style, it introduces the basic building blocks of pattern recognition, demonstrates both the beauty and the pitfalls of scientific research, and encourages good habits in reading mathematical texts.

Table of Contents

Preface xi
Preface to the English edition xi
A letter from the doctoral student Jiri Pecha prior to publication of the lectures xii
A letter from the authors to the doctoral student Jiri Pecha xiii
Basic concepts and notations xv
Acknowledgements xix
Lecture 1. Bayesian statistical decision making 1
  Introduction to the analysis of the Bayesian task 1
  Formulation of the Bayesian task 1
  Two properties of Bayesian strategies 3
  Two particular cases of the Bayesian task 7
  Probability of the wrong estimate of the state 7
  Bayesian strategy with possible rejection 9
  Discussion 11
  Bibliographical notes 22
Lecture 2. Non-Bayesian statistical decision making 25
  Severe restrictions of the Bayesian approach 25
  Penalty function 25
  A priori probability of situations 26
  Conditional probabilities of observations 27
  Formulation of the known and new non-Bayesian tasks 28
  Neyman-Pearson task 28
  Generalised task with two dangerous states 31
  Minimax task 31
  Wald task 32
  Statistical decision tasks with non-random interventions 33
  The pair of dual linear programming tasks, properties and solutions 35
  The solution of non-Bayesian tasks using duality theorems 40
  Solution of the Neyman-Pearson task 41
  Solution of generalised Neyman-Pearson task with two dangerous states 44
  Solution of the minimax task 46
  Solution of Wald task for the two states case 48
  Solution of Wald task in the case of more states 50
  Testing of complex random hypotheses 52
  Testing of complex non-random hypotheses 53
  Comments on non-Bayesian tasks 53
  Discussion 54
  Bibliographical notes 71
Lecture 3. Two statistical models of the recognised object 73
  Conditional independence of features 73
  Gaussian probability distribution 75
  Discussion 77
  Bibliographical notes 99
Lecture 4. Learning in pattern recognition 101
  Myths about learning in pattern recognition 101
  Three formulations of learning tasks in pattern recognition 102
  Learning according to the maximal likelihood 104
  Learning according to a non-random training set 105
  Learning by minimisation of empirical risk 106
  Basic concepts and questions of the statistical theory of learning 108
  Informal description of learning in pattern recognition 108
  Foundations of the statistical learning theory according to Chervonenkis and Vapnik 113
  Critical view of the statistical learning theory 120
  Outlines of deterministic learning 122
  Discussion 127
  Bibliographical notes 136
Lecture 5. Linear discriminant function 137
  Introductory notes on linear decomposition 137
  Guide through the topic of the lecture 138
  Anderson tasks 141
  Equivalent formulation of generalised Anderson task 141
  Informal analysis of generalised Anderson task 142
  Definition of auxiliary concepts for Anderson tasks 145
  Solution of Anderson original task 147
  Formal analysis of generalised Anderson task 150
  Outline of a procedure for solving generalised Anderson task 157
  Linear separation of finite sets of points 159
  Formulation of tasks and their analysis 159
  Algorithms for linear separation of finite sets of points 163
  Algorithm for ε-optimal separation of finite sets of points by means of the hyperplane 167
  Construction of Fisher classifiers by modifying Kozinec and perceptron algorithms 169
  Further modification of Kozinec algorithms 171
  Solution of the generalised Anderson task 175
  ε-solution of Anderson task 175
  Linear separation of infinite sets of points 179
  Discussion 182
  Link to a toolbox 213
  Bibliographical notes 213
Lecture 6. Unsupervised learning 215
  Introductory comments on the specific structure of the lecture 215
  Preliminary and informal definition of unsupervised learning 217
  Unsupervised learning in a perceptron 219
  Empirical Bayesian approach after H. Robbins 226
  Quadratic clustering and formulation of a general clustering task 232
  Unsupervised learning algorithms and their analysis 238
  Formulation of a recognition task 238
  Formulation of a learning task 238
  Formulation of an unsupervised learning task 240
  Unsupervised learning algorithm 241
  Analysis of the unsupervised learning algorithm 242
  Algorithm solving Robbins task and its analysis 251
  Discussion 253
  Link to a toolbox 273
  Bibliographical notes 274
Lecture 7. Mutual relationship of statistical and structural recognition 275
  Statistical recognition and its application areas 275
  Why is structural recognition necessary for image recognition? 277
  Set of observations 277
  Set of hidden parameter values for an image 280
  The role of learning and unsupervised learning in image recognition 281
  Main concepts necessary for structural analysis 284
  Discussion 288
  Bibliographical notes 305
Lecture 8. Recognition of Markovian sequences 307
  Introductory notes on sequences 307
  Markovian statistical model of a recognised object 308
  Recognition of the stochastic automaton 312
  Recognition of the stochastic automaton; problem formulation 312
  Algorithm for a stochastic automaton recognition 313
  Matrix representation of the calculation procedure 314
  Statistical interpretation of matrix multiplication 316
  Recognition of the Markovian object from incomplete data 318
  The most probable sequence of hidden parameters 321
  Difference between recognition of an object as a whole and recognition of parts that form the object 321
  Formulation of a task seeking the most probable sequence of states 321
  Representation of a task as seeking the shortest path in a graph 321
  Seeking the shortest path in a graph describing the task 323
  On the necessity of formal task analysis 326
  Generalised matrix multiplications 327
  Seeking the most probable subsequence of states 330
  Seeking sequences composed of the most probable hidden parameters 333
  Markovian objects with acyclic structure 338
  Statistical model of an object 338
  Calculating the probability of an observation 339
  The most probable ensemble of hidden parameters 343
  Formulation of supervised and unsupervised learning tasks 344
  The maximum likelihood estimation of a model during learning 345
  Minimax estimate of the model 345
  Tuning of the recognition algorithm 346
  Task of unsupervised learning 347
  Maximum likelihood estimate of the model 347
  Minimax estimate of a statistical model 353
  Formulation of an algorithm and its properties 353
  Analysis of a minimax estimate 356
  Proof of the minimax estimate algorithm of a Markovian model 366
  Tuning the algorithm that recognises sequences 366
  The maximum likelihood estimate of statistical model 368
  Discussion 372
  Link to a toolbox 395
  Bibliographical notes 395
Lecture 9. Regular languages and corresponding pattern recognition tasks 397
  Regular languages 397
  Other ways to express regular languages 399
  Regular languages and automata 399
  Regular languages and grammars 400
  Regular languages and regular expressions 401
  Example of a regular language expressed in different ways 402
  Regular languages respecting faults; best and exact matching 404
  Fuzzy automata and languages 405
  Penalised automata and corresponding languages 406
  Simple best matching problem 407
  Partial conclusion after one part of the lecture 409
  Levenstein approximation of a sentence 410
  Preliminary formulation of the task 410
  Levenstein dissimilarity 411
  Known algorithm calculating Levenstein dissimilarity 412
  Modified definition of Levenstein dissimilarity and its properties 414
  Formulation of the problem and comments to it 417
  Formulation of main results and comments to them 418
  Generalised convolutions and their properties 420
  Formulation of a task and main results in convolution form 427
  Proof of the main result of this lecture 429
  Nonconvolution interpretation of the main result 440
  Discussion 443
  Link to a toolbox 477
  Bibliographical notes 477
Lecture 10. Context-free languages, their 2-D generalisation, related tasks 479
  Introductory notes 479
  Informal explanation of two-dimensional grammars and languages 480
  Two-dimensional context-free grammars and languages 484
  Exact matching problem. Generalised algorithm of C-Y-K 486
  General structural construction 489
  Structural construction defining observed sets 490
  Basic problem in structural recognition of images 493
  Computational procedure for solving the basic problem 494
  Discussion 498
  Bibliographical notes 505
Bibliography 507
Index 514

Supplemental Materials

What is included with this book?

The New copy of this book will include any supplemental materials advertised. Please check the title of the book to determine if it should include any access cards, study guides, lab manuals, CDs, etc.

The Used, Rental and eBook copies of this book are not guaranteed to include any supplemental materials. Typically, only the book itself is included. This is true even if the title states it includes any access cards, study guides, lab manuals, CDs, etc.
