
Optimisation in Signal and Image Processing

by Patrick Siarry (Ed.)

  • ISBN-13: 9781848210448
  • ISBN-10: 1848210442
  • Format: Hardcover
  • Copyright: 2009-10-12
  • Publisher: ISTE/Hermes Science Publishing


Summary

This book describes the optimization methods most commonly encountered in signal and image processing: artificial evolution and the Parisian approach; wavelets and fractals; information criteria; machine learning and quadratic programming; Bayesian formalism; probabilistic modeling; the Markovian approach; hidden Markov models; and metaheuristics (genetic algorithms, ant colony algorithms, cross-entropy, particle swarm optimization, estimation of distribution algorithms, and artificial immune systems).
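As a taste of the metaheuristics the book surveys, here is a minimal particle swarm optimization sketch in Python. This is a generic textbook variant, not code from the book itself; the function names and parameter values (inertia w, attraction coefficients c1/c2, swarm size) are illustrative defaults.

```python
import random

def pso(f, dim, bounds, n_particles=20, iters=100, w=0.7, c1=1.5, c2=1.5):
    """Minimize f over a box [lo, hi]^dim with basic particle swarm optimization."""
    lo, hi = bounds
    # Initialize particle positions uniformly in the box, velocities at zero.
    pos = [[random.uniform(lo, hi) for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]            # each particle's best-seen position
    pbest_val = [f(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]  # swarm-wide best

    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = random.random(), random.random()
                # Velocity update: inertia plus random pulls toward the
                # personal best and the global best.
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                # Move, clamping to the search box.
                pos[i][d] = min(hi, max(lo, pos[i][d] + vel[i][d]))
            val = f(pos[i])
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], val
                if val < gbest_val:
                    gbest, gbest_val = pos[i][:], val
    return gbest, gbest_val
```

For example, `pso(lambda x: sum(t * t for t in x), dim=3, bounds=(-5.0, 5.0))` drives the sphere function close to its minimum at the origin; the book's chapters apply the same family of stochastic searches to much harder signal- and image-processing objectives.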

Author Biography

Patrick Siarry is Professor of Automatics and Informatics at the University of Paris 12, France.

Table of Contents

Introduction

Modeling and Optimization in Image Analysis
  • Modeling at the source of image analysis and synthesis
  • From image synthesis to analysis
  • Scene geometric modeling and image synthesis
  • Direct model inversion and the Hough transform
  • The deterministic Hough transform
  • Stochastic exploration of parameters: evolutionary Hough
  • Examples of generalization
  • Optimization and physical modeling
  • Photometric modeling
  • Motion modeling
  • Conclusion
  • Acknowledgements
  • Bibliography

Artificial Evolution and the Parisian Approach: Applications in the Processing of Signals and Images
  • Introduction
  • The Parisian approach for evolutionary algorithms
  • Applying the Parisian approach to inverse IFS problems
  • Choosing individuals for the evaluation process
  • Retribution of individuals
  • Results obtained on the inverse problems of IFS
  • Conclusion on the usage of the Parisian approach for inverse IFS problems
  • Collective representation: the Parisian approach and the Fly algorithm
  • The principles
  • Results on real images
  • Application to robotics: fly-based robot planning
  • Sensor fusion
  • Artificial evolution and real time
  • Conclusion about the fly algorithm
  • Conclusion
  • Acknowledgements
  • Bibliography

Wavelets and Fractals for Signal and Image Analysis
  • Introduction
  • Some general points on fractals
  • Fractals and paradox
  • Fractal sets and self-similarity
  • Fractal dimension
  • Multifractal analysis of signals
  • Regularity
  • Multifractal spectrum
  • Distribution of singularities based on wavelets
  • Qualitative approach
  • A rough guide to the world of wavelets
  • Wavelet Transform Modulus Maxima (WTMM) method
  • Spectrum of singularities and wavelets
  • WTMM and some didactic signals
  • Experiments
  • Fractal analysis of structures in images: applications in microbiology
  • Using WTMM for the classification of textures: application in the field of medical imagery
  • Conclusion
  • Bibliography

Information Criteria: Examples of Applications in Signal and Image Processing
  • Introduction and context
  • Overview of the different criteria
  • The case of auto-regressive (AR) models
  • Origin, written form and performance of different criteria on simulated examples
  • AR and the segmentation of images: a first approach
  • Extension to 2D AR and application to the modeling of textures
  • AR and the segmentation of images: second approach using 2D AR
  • Applying the process to unsupervised clustering
  • Law approximation with the help of histograms
  • Theoretical aspects
  • Two applications used for encoding images
  • Other applications
  • Estimation of the order of Markov models
  • Data fusion
  • Conclusion
  • Appendix
  • Kullback(-Leibler) information
  • Nishii's convergence criteria
  • Bibliography

Quadratic Programming and Machine Learning: Large-Scale Problems and Sparsity
  • Introduction
  • Learning processes and optimization
  • General framework
  • Functional framework
  • Cost and regularization
  • The aims of realistic learning processes
  • From learning methods to quadratic programming
  • Primal and dual forms
  • Methods and resolution
  • Properties to be used: sparsity
  • Tools to be used
  • Structures of resolution algorithms
  • Decomposition methods
  • Solving quadratic problems
  • Online and non-optimized methods
  • Comparisons
  • Experiments
  • Comparison of empirical complexity
  • Very large databases
  • Conclusion
  • Bibliography

Probabilistic Modeling of Policies and Application to Optimal Sensor Management
  • Continuum, a path toward oblivion
  • The cross-entropy (CE) method
  • Probability of rare events
  • CE applied to optimization
  • Examples of implementation of CE for surveillance
  • Introducing the problem
  • Optimizing the distribution of resources
  • Allocating sensors to zones
  • Implementation
  • Example of implementation of CE for exploration
  • Definition of the problem
  • Applying the CE
  • Analyzing a simple example
  • Optimal control under partial observation
  • Decision-making in partially observed environments
  • Implementing CE
  • Example
  • Conclusion
  • Bibliography

Optimizing Emissions for Tracking and Pursuit of Mobile Targets
  • Introduction
  • Elementary modeling of the problem (deterministic case)
  • Estimability measurement of the problem
  • Framework for computing exterior products
  • Application to the optimization of emissions (deterministic case)
  • The case of a maneuvering target
  • The case of a target with a Markov trajectory
  • Conclusion
  • Appendix: monotonous functional matrices
  • Bibliography

Bayesian Inference and Markov Models
  • Introduction and application framework
  • Detection, segmentation and classification
  • General modeling
  • Markov modeling
  • Bayesian inference
  • Segmentation using the causal-in-scale Markov model
  • Segmentation into three classes
  • The classification of objects
  • The classification of seabeds
  • Conclusion and perspectives
  • Bibliography

The Use of Hidden Markov Models for Image Recognition: Learning with Artificial Ants, Genetic Algorithms and Particle Swarm Optimization
  • Introduction
  • Hidden Markov models (HMMs)
  • Definition
  • The criteria used in programming hidden Markov models
  • Using metaheuristics to learn HMMs
  • The different types of solution spaces used for the training of HMMs
  • The metaheuristics used for the training of the HMMs
  • Description, parameter setting and evaluation of the six approaches that are used to train HMMs
  • Genetic algorithms
  • The API algorithm
  • Particle swarm optimization
  • A behavioral comparison of the metaheuristics
  • Parameter setting of the algorithms
  • Comparing the algorithms' performances
  • Conclusion
  • Bibliography

Biological Metaheuristics for Road Sign Detection
  • Introduction
  • Relationship to existing works
  • Template and deformations
  • Estimation problem
  • A priori energy
  • Image energy
  • Three biological metaheuristics
  • Evolution strategies
  • Clonal selection (CS)
  • Particle swarm optimization
  • Experimental results
  • Preliminaries
  • Evaluation on the CD3 sequence
  • Conclusion
  • Bibliography

Metaheuristics for Continuous Variables: The Registration of Retinal Angiogram Images
  • Introduction
  • Metaheuristics for difficult optimization problems
  • Difficult optimization
  • Optimization algorithms
  • Image registration of retinal angiograms
  • Existing methods
  • A possible optimization method for image registration
  • Optimizing the image registration process
  • The objective function
  • The Nelder-Mead algorithm
  • The hybrid continuous interacting ant colony (HCIAC)
  • The continuous hybrid estimation of distribution algorithm
  • Algorithm settings
  • Results
  • Preliminary tests
  • Accuracy
  • Typical cases
  • Additional problems
  • Analysis of the results
  • Conclusion
  • Acknowledgements
  • Bibliography

Joint Estimation of the Dynamics and Shape of Physiological Signals through Genetic Algorithms
  • Introduction
  • Brainstem Auditory Evoked Potentials (BAEPs)
  • BAEP generation and their acquisition
  • Processing BAEPs
  • Genetic algorithms
  • BAEP dynamics
  • Validation of the simulated signal approach
  • Validating the approach on real signals
  • Acceleration of the GA's convergence time
  • The non-stationarity of the shape of the BAEPs
  • Conclusion
  • Bibliography

Using Interactive Evolutionary Algorithms to Help Fit Cochlear Implants
  • Introduction
  • Finding good parameters for the processor
  • Interacting with the patient
  • Choosing an optimization algorithm
  • Adapting an evolutionary algorithm to the interactive fitting of cochlear implants
  • Population size and the number of children per generation
  • Initialization
  • Parent selection
  • Crossover
  • Mutation
  • Replacement
  • Evaluation
  • Experiments
  • The first experiment with patient A
  • Analyzing the results
  • Second set of experiments: verifying the hypotheses
  • Third set of experiments with other patients
  • Medical issues which were raised during the experiments
  • Algorithmic conclusions for patient A
  • Conclusion
  • Bibliography

List of Authors
Index
Table of Contents provided by Ingram. All Rights Reserved.
