Introduction | p. xiii |
Modeling and Optimization in Image Analysis | p. 1 |
Modeling at the source of image analysis and synthesis | p. 1 |
From image synthesis to analysis | p. 2 |
Scene geometric modeling and image synthesis | p. 3 |
Direct model inversion and the Hough transform | p. 4 |
The deterministic Hough transform | p. 4 |
Stochastic exploration of parameters: evolutionary Hough | p. 5 |
Examples of generalization | p. 7 |
Optimization and physical modeling | p. 9 |
Photometric modeling | p. 9 |
Motion modeling | p. 10 |
Conclusion | p. 12 |
Acknowledgements | p. 13 |
Bibliography | p. 13 |
Artificial Evolution and the Parisian Approach: Applications in the Processing of Signals and Images | p. 15 |
Introduction | p. 15 |
The Parisian approach for evolutionary algorithms | p. 15 |
Applying the Parisian approach to inverse IFS problems | p. 17 |
Choosing individuals for the evaluation process | p. 18 |
Rewarding individuals | p. 18 |
Results obtained on the inverse problems of IFS | p. 20 |
Conclusion on the usage of the Parisian approach for inverse IFS problems | p. 22 |
Collective representation: the Parisian approach and the Fly algorithm | p. 23 |
The principles | p. 23 |
Results on real images | p. 27 |
Application to robotics: fly-based robot planning | p. 30 |
Sensor fusion | p. 34 |
Artificial evolution and real time | p. 37 |
Conclusion about the fly algorithm | p. 39 |
Conclusion | p. 40 |
Acknowledgements | p. 41 |
Bibliography | p. 41 |
Wavelets and Fractals for Signal and Image Analysis | p. 45 |
Introduction | p. 45 |
Some general points on fractals | p. 46 |
Fractals and paradox | p. 46 |
Fractal sets and self-similarity | p. 47 |
Fractal dimension | p. 49 |
Multifractal analysis of signals | p. 54 |
Regularity | p. 54 |
Multifractal spectrum | p. 58 |
Distribution of singularities based on wavelets | p. 60 |
Qualitative approach | p. 60 |
A rough guide to the world of wavelets | p. 60 |
Wavelet Transform Modulus Maxima (WTMM) method | p. 63 |
Spectrum of singularities and wavelets | p. 66 |
WTMM and some didactic signals | p. 68 |
Experiments | p. 70 |
Fractal analysis of structures in images: applications in microbiology | p. 70 |
Using WTMM for the classification of textures: application in the field of medical imagery | p. 72 |
Conclusion | p. 76 |
Bibliography | p. 76 |
Information Criteria: Examples of Applications in Signal and Image Processing | p. 79 |
Introduction and context | p. 79 |
Overview of the different criteria | p. 81 |
The case of auto-regressive (AR) models | p. 83 |
Origin, written form and performance of different criteria on simulated examples | p. 84 |
AR and the segmentation of images: a first approach | p. 87 |
Extension to 2D AR and application to the modeling of textures | p. 89 |
AR and the segmentation of images: second approach using 2D AR | p. 92 |
Applying the process to unsupervised clustering | p. 95 |
Law approximation with the help of histograms | p. 98 |
Theoretical aspects | p. 98 |
Two applications used for encoding images | p. 99 |
Other applications | p. 103 |
Estimation of the order of Markov models | p. 103 |
Data fusion | p. 104 |
Conclusion | p. 106 |
Appendix | p. 106 |
Kullback-Leibler information | p. 106 |
Nishii's convergence criteria | p. 107 |
Bibliography | p. 107 |
Quadratic Programming and Machine Learning: Large-Scale Problems and Sparsity | p. 111 |
Introduction | p. 111 |
Learning processes and optimization | p. 112 |
General framework | p. 112 |
Functional framework | p. 114 |
Cost and regularization | p. 115 |
The aims of realistic learning processes | p. 116 |
From learning methods to quadratic programming | p. 117 |
Primal and dual forms | p. 117 |
Methods and resolution | p. 119 |
Properties to be used: sparsity | p. 120 |
Tools to be used | p. 120 |
Structures of resolution algorithms | p. 121 |
Decomposition methods | p. 121 |
Solving quadratic problems | p. 123 |
Online and non-optimized methods | p. 126 |
Comparisons | p. 127 |
Experiments | p. 128 |
Comparison of empirical complexity | p. 128 |
Very large databases | p. 130 |
Conclusion | p. 132 |
Bibliography | p. 133 |
Probabilistic Modeling of Policies and Application to Optimal Sensor Management | p. 137 |
Continuum, a path toward oblivion | p. 137 |
The cross-entropy (CE) method | p. 138 |
Probability of rare events | p. 139 |
CE applied to optimization | p. 143 |
Examples of implementation of CE for surveillance | p. 146 |
Introducing the problem | p. 147 |
Optimizing the distribution of resources | p. 149 |
Allocating sensors to zones | p. 150 |
Implementation | p. 151 |
Example of implementation of CE for exploration | p. 153 |
Definition of the problem | p. 153 |
Applying the CE | p. 156 |
Analyzing a simple example | p. 157 |
Optimal control under partial observation | p. 158 |
Decision-making in partially observed environments | p. 159 |
Implementing CE | p. 162 |
Example | p. 163 |
Conclusion | p. 166 |
Bibliography | p. 166 |
Optimizing Emissions for Tracking and Pursuit of Mobile Targets | p. 169 |
Introduction | p. 169 |
Elementary modeling of the problem (deterministic case) | p. 170 |
Estimability measurement of the problem | p. 170 |
Framework for computing exterior products | p. 173 |
Application to the optimization of emissions (deterministic case) | p. 175 |
The case of a maneuvering target | p. 180 |
The case of a target with a Markov trajectory | p. 181 |
Conclusion | p. 189 |
Appendix: monotone functional matrices | p. 189 |
Bibliography | p. 192 |
Bayesian Inference and Markov Models | p. 195 |
Introduction and application framework | p. 195 |
Detection, segmentation and classification | p. 196 |
General modeling | p. 199 |
Markov modeling | p. 199 |
Bayesian inference | p. 200 |
Segmentation using the causal-in-scale Markov model | p. 201 |
Segmentation into three classes | p. 203 |
The classification of objects | p. 206 |
The classification of seabeds | p. 212 |
Conclusion and perspectives | p. 214 |
Bibliography | p. 215 |
The Use of Hidden Markov Models for Image Recognition: Learning with Artificial Ants, Genetic Algorithms and Particle Swarm Optimization | p. 219 |
Introduction | p. 219 |
Hidden Markov models (HMMs) | p. 220 |
Definition | p. 220 |
The criteria used in programming hidden Markov models | p. 221 |
Using metaheuristics to learn HMMs | p. 223 |
The different types of solution spaces used for the training of HMMs | p. 223 |
The metaheuristics used for the training of the HMMs | p. 225 |
Description, parameter setting and evaluation of the six approaches that are used to train HMMs | p. 226 |
Genetic algorithms | p. 226 |
The API algorithm | p. 228 |
Particle swarm optimization | p. 230 |
A behavioral comparison of the metaheuristics | p. 233 |
Parameter setting of the algorithms | p. 234 |
Comparing the algorithms' performances | p. 237 |
Conclusion | p. 240 |
Bibliography | p. 240 |
Biological Metaheuristics for Road Sign Detection | p. 245 |
Introduction | p. 245 |
Relationship to existing works | p. 246 |
Template and deformations | p. 248 |
Estimation problem | p. 248 |
A priori energy | p. 248 |
Image energy | p. 249 |
Three biological metaheuristics | p. 252 |
Evolution strategies | p. 252 |
Clonal selection (CS) | p. 255 |
Particle swarm optimization | p. 259 |
Experimental results | p. 259 |
Preliminaries | p. 259 |
Evaluation on the CD3 sequence | p. 261 |
Conclusion | p. 265 |
Bibliography | p. 266 |
Metaheuristics for Continuous Variables: The Registration of Retinal Angiogram Images | p. 269 |
Introduction | p. 269 |
Metaheuristics for difficult optimization problems | p. 270 |
Difficult optimization | p. 270 |
Optimization algorithms | p. 272 |
Image registration of retinal angiograms | p. 275 |
Existing methods | p. 275 |
A possible optimization method for image registration | p. 277 |
Optimizing the image registration process | p. 279 |
The objective function | p. 280 |
The Nelder-Mead algorithm | p. 281 |
The hybrid continuous interacting ant colony (HCIAC) | p. 283 |
The continuous hybrid estimation of distribution algorithm | p. 285 |
Algorithm settings | p. 288 |
Results | p. 288 |
Preliminary tests | p. 288 |
Accuracy | p. 291 |
Typical cases | p. 291 |
Additional problems | p. 293 |
Analysis of the results | p. 295 |
Conclusion | p. 296 |
Acknowledgements | p. 296 |
Bibliography | p. 296 |
Joint Estimation of the Dynamics and Shape of Physiological Signals through Genetic Algorithms | p. 301 |
Introduction | p. 301 |
Brainstem Auditory Evoked Potentials (BAEPs) | p. 302 |
BAEP generation and their acquisition | p. 303 |
Processing BAEPs | p. 303 |
Genetic algorithms | p. 305 |
BAEP dynamics | p. 307 |
Validation of the simulated signal approach | p. 313 |
Validating the approach on real signals | p. 320 |
Acceleration of the GA's convergence time | p. 321 |
The non-stationarity of the shape of the BAEPs | p. 324 |
Conclusion | p. 327 |
Bibliography | p. 327 |
Using Interactive Evolutionary Algorithms to Help Fit Cochlear Implants | p. 329 |
Introduction | p. 329 |
Finding good parameters for the processor | p. 330 |
Interacting with the patient | p. 331 |
Choosing an optimization algorithm | p. 333 |
Adapting an evolutionary algorithm to the interactive fitting of cochlear implants | p. 335 |
Population size and the number of children per generation | p. 336 |
Initialization | p. 336 |
Parent selection | p. 336 |
Crossover | p. 337 |
Mutation | p. 337 |
Replacement | p. 337 |
Evaluation | p. 338 |
Experiments | p. 339 |
The first experiment with patient A | p. 339 |
Analyzing the results | p. 343 |
Second set of experiments: verifying the hypotheses | p. 345 |
Third set of experiments with other patients | p. 349 |
Medical issues which were raised during the experiments | p. 350 |
Algorithmic conclusions for patient A | p. 352 |
Conclusion | p. 354 |
Bibliography | p. 354 |
List of Authors | p. 357 |
Index | p. 359 |