

Image Processing: Dealing with Texture

by Maria Petrou; Sei-ichiro Kamata
  • ISBN13:

    9781119618553

  • ISBN10:

    111961855X

  • Edition: 2nd
  • Format: Hardcover
  • Copyright: 2021-03-22
  • Publisher: Wiley

Note: Supplemental materials are not guaranteed with Rental or Used book purchases.

Purchase Benefits

  • Free Shipping On Orders Over $35!
    Your order must be $35 or more to qualify for free economy shipping. Bulk sales, POs, Marketplace items, eBooks and apparel do not qualify for this offer.
  • Get Rewarded for Ordering Your Textbooks!

List Price: $152.48 (save up to $56.41)

  • Rent Book: $96.07 (free shipping; usually ships in 3-4 business days)

    *This item is part of an exclusive publisher rental program and requires an additional convenience fee. This fee will be reflected in the shopping cart.

Summary

The classic text covering practical image processing methods and theory for image texture analysis, now in an updated second edition

The revised second edition of Image Processing: Dealing with Texture updates the classic work on texture analysis theory and methods without abandoning the foundational essentials of this landmark work. Like the first edition, the new edition offers an analysis of texture in digital images that is essential to a diverse range of applications, such as robotics, defense, medicine and the geosciences.

Designed so that readers can easily locate information on specific problems, the text is structured around a series of helpful questions and answers. Updated to include the most recent developments in the field, many chapters have been completely revised, including Fractals and Multifractals, Image Statistics, Texture Repair, Local Phase Features, Dual Tree Complex Wavelet Transform, Ridgelets and Curvelets, and Deep Texture Features. The book takes a two-level mathematical approach: lighter mathematics is covered in the main text, with harder mathematics identified in separate boxes. This important text:

  • Contains an update of the classic advanced text that reviews practical image processing methods and theory for image texture analysis
  • Puts the focus exclusively on an in-depth exploration of texture
  • Contains a companion website with exercises and algorithms
  • Includes fully worked examples that enhance the learning experience

Written for students and researchers of image processing, the second edition of Image Processing: Dealing with Texture has been revised and updated to retain the foundational material on the topic while incorporating the latest advances.

Author Biography

The late Maria Petrou was a Greek-born British scientist who specialised in the fields of artificial intelligence and machine vision.

Sei-ichiro Kamata is Professor at the Graduate School of Information, Production and Systems, Waseda University, Japan.

Table of Contents

Preface to the Second Edition

Preface to the First Edition

Acknowledgements

 

1 Introduction. 1

What is texture? 1

Why are we interested in texture? 1

How do we cope with texture when texture is a nuisance? 3

How does texture give us information about the material of the imaged object? 4

Are there non-optical images? 4

What is the meaning of texture in non-optical images? 4

What is the albedo of a surface? 4

Can a surface with variable albedo appear non-textured? 5

Can a rough surface of uniform albedo appear non-textured? 5

What are the problems of texture which image processing is trying to solve? 5

What are the challenges in trying to solve the above problems? 5

How may the limitations of image processing be overcome for recognising textures? 6

What is this book about? 7

Box 1.1. An algorithm for the isolation of textured regions. 7

 

2 Binary textures. 13

Why are we interested in binary textures? 13

What is this chapter about? 13

Are there any generic tools appropriate for all types of texture? 14

Can we at least distinguish classes of texture? 14

Which are the texture classes? 14

Which tools are appropriate for each type of texture? 14

 

2.1 Shape grammars. 15

What is a shape grammar? 15

Box 2.1. Shape grammars. 15

What happens if the placement of the primitive pattern is not regular? 23

What happens if the primitive pattern itself is not always the same? 24

What happens if the primitive patterns vary in a continuous way? 24

 

2.2 Boolean models. 25

What is a 2D Boolean model? 25

Box 2.2. How can we draw random numbers according to a given probability density function? 25

Box 2.3. What is a Poisson process? 31

How can we use the 2D Boolean model to describe a binary texture? 34

How can we estimate some aggregate parameters of the 2D Boolean model? 34

What is the area fraction? 34

What is the specific boundary length? 34

What types of image connectivity may we have? 34

How does image connectivity influence the estimation of the specific boundary length? 35

How do we deal with border effects? 36

Is there a way to estimate the specific boundary length L, bypassing the problem of 4- or 8-connectivity? 36

What is the specific convexity N+? 36

How can we estimate some individual parameters of the 2D Boolean model? 41

Box 2.4. How can we relate the individual parameters to the aggregate parameters of the 2D Boolean model? 42

What is the simplest possible primitive pattern we may have in a Boolean model? 48

What is a 1D Boolean model? 48

How may the 1D Boolean model be used to describe textures? 49

How can we create 1D strings from a 2D image? 49

Box 2.5. How to construct Hilbert curves 50

How can we estimate the parameters of the 1D Boolean model? 55

Box 2.6. Parameter estimation for the discrete 1D Boolean model. 58

What happens if the primitive patterns are very irregular? 60

 

2.3 Mathematical morphology. 61

What is mathematical morphology? 61

What is dilation? 61

What is erosion? 63

Is there any way to lose details smaller than a certain size but leave the size of larger details unaffected? 63

What is opening? 63

What is closing? 64

How do we do morphological operations if the structuring element is not symmetric about its centre? 64

Since the structuring element looks like a small image, can we exchange the roles of object and structuring element? 67

Is closing a commutative operation? 68

Can we use different structuring elements for the erosion and the dilation parts of the opening and closing operators? 68

Can we apply morphological operators to the white pixels of an image instead of applying them to the black pixels? 71

Can we apply more than one morphological operator to the same image? 71

Is erosion an associative operation as well? 72

How can we use morphological operations to characterise a texture? 74

Box 2.7. Formal definitions in mathematical morphology. 76

Box 2.8. Hit-or-miss algorithm 88

What is the “take home” message of this chapter? 88

 

3 Stationary grey texture images 91

What is a stationary texture image? 91

What is this chapter about? 91

Are any of the methods appropriate for classifying binary textures useful for the analysis of grey textures? 91

 

3.1 Image binarisation. 94

How may a grey image be analysed into a set of binary images by thresholding? 94

How can we deal with the problem of unequal number of pixels in each range of grey values? 94

How may a grey image be analysed into a set of binary images by bit-slicing? 94

Is there any relationship between the binary planes produced by thresholding and the bit planes? 98

 

3.2 Grey scale mathematical morphology. 102

How is mathematical morphology generalised for grey images? 102

How is the complement of an image defined for grey images? 104

What is a non-flat structuring element? 104

What is the relationship between the morphological operations applied to an image and those applied to its complement? 108

What is the purpose of using a non-flat structuring element? 110

How can we perform granulometry with a grey image? 111

Can we extract in one go the details of a signal, peaks or valleys, smaller than a certain size? 115

What is the blanket method? 115

How can we deal with border effects without losing part of the image? 117

How do we apply the blanket method in practice? 117

How can we use the pattern spectrum to classify textures? 120

 

3.3 Fractals and Multifractals 122

What is a fractal? 122

What is the fractal dimension? 122

Box 3.1. The box-counting method for binary images 137

Box 3.2. The box-counting algorithm for grey images 140

What is the generalised fractal dimension of a binary shape? 143

What is the role of q in the definition of the generalised fractal dimension? 144

Box 3.3. Least squares error fitting of data with a straight line 149

Box 3.4. Robust line fitting using the RanSac method 150

How do we compute the local connected fractal dimension of a binary image? 150

Which statistical properties remain the same at all scales in non-deterministic fractals? 156

Box 3.5. What is self-affine scaling? 157

Box 3.6. What is the relationship between the fractal dimension and exponent H? 158

Box 3.7. What is the range of values of H? 159

What is a fractional Brownian motion? 161

Box 3.8. Prove that the range of values of H for a fractional Brownian motion is (0,1) 169

Box 3.9. What is the correlation between two increments of a fractional Brownian motion? 171

Box 3.10. Synthesising a 1D fractal 178

Box 3.11. The Schur algorithm for Cholesky decomposition of a symmetric Toeplitz matrix 179

Box 3.12. What is the power spectrum of a fractal? 182

Box 3.13. What is the autocorrelation function of a fractal? 184

Box 3.14. Construction of a 2D fractal 185

Is fractal dimension a good texture descriptor? 190

What is lacunarity? 190

How can we compute the multifractal spectrum of a grey image? 194

What is a multifractal? 197

What is the Hölder regularity? 197

What is the difference between a fractal and a multifractal? 197

How can we characterise an image using multifractals? 201

How can we calculate the Hölder exponent of each pixel? 201

Box 3.15. The 1D binomial multifractal 205

Box 3.16. Extension to 2D of the 1D binomial multifractal 208

Are multifractals good image models? 210

 

3.4 Image Statistics 211

What is an image statistic? 211

What is a one-point statistic? 211

What is a two-point statistic? 211

What is a many-point statistic? 211

How do we construct features from one-point, two-point or many-point image statistics? 211

What is the histogram of the image? 212

Can we have rolling bins? 212

Box 3.17. Constructing the 3D orientation histogram of a 3D image 218

What is the rank-frequency plot? 220

What is the run-length matrix? 222

How do we compute the run-length matrix in practice? 223

How do we compute features from the run-length matrix? 223

What is the variogram of an image? 225

How do we compute the variogram of an image in practice? 229

How do we compute the variogram of irregularly sampled data in practice? 229

How can we use the variogram to characterise textures? 232

How may we compute automatically the Nugget, Sill and Range of the variogram? 232

Can we use the variogram for anisotropic textures? 234

How can we model an anisotropic variogram? 234

Can we use the autocorrelation function itself to characterise a texture? 238

How can we use the autocorrelation function directly for texture characterisation? 243

How can we infer the periodicity of a texture from the autocorrelation function? 245

How can we extract parametric features from image statistics? 246

Box 3.18. Least square error fitting of a second order surface in 2D 246

Box 3.19. Least square error fitting of a second order curve in 1D 248

Box 3.20. Least square error fitting of an exponential function to a set of data 248

What is the generalised Zipf distribution? 249

Box 3.21. What is the relationship of the Zipf distribution and fractals? 250

What is the Weibull distribution? 250

How can we estimate the parameters of the Weibull distribution from a set of data? 253

Which parametric models are commonly used to fit the variogram over its Range? 255

How can we fit automatically a model to a variogram? 257

Can we use non-parametric descriptions of texture? 258

What is a histogram moment? 259

What is the co-occurrence matrix? 259

What are the generalised co-occurrence matrices? 259

What is a high order co-occurrence matrix? 259

How is a co-occurrence matrix defined? 259

How do we compute the co-occurrence matrix in practice? 265

How can we recognise textures with the help of the co-occurrence matrix? 266

How can we choose the parameters of the co-occurrence matrix? 268

 

3.5 Texture features from the Fourier transform 272

How can we use the Fourier transform to construct texture features? 272

How is the power spectrum defined? 272

How can we infer the periodicity of a texture from its power spectrum? 272

What is the phase and magnitude of the Fourier transform? 274

Box 3.22. Contour integration 281

Does the phase of the Fourier transform convey any useful information? 300

Since the phase conveys more information for a pattern than its power spectrum, why don’t we use the phase to describe textures? 306

Is it possible to compute from the image phase a function the value of which changes only due to genuine image changes? 306

How do we perform phase unwrapping? 307

What are the drawbacks of the simple phase unwrapping algorithm? 309

How do we perform phase-unwrapping in 2D? 311

What is the slice transform of an image? 313

Box 3.23. The slice transform 313

 

3.6 Markov random fields 316

What is a Markov random field? 316

What is a random field? 316

Which are the neighbouring pixels of a pixel? 316

How can we create a Markov random field? 316

How can we use MRFs to characterise textures? 318

What is texture synthesis by analysis? 319

How can we apply the Markov model to create textures? 322

Can we apply the method discussed in the previous section to create images with 256 grey levels? 324

What is the auto-normal Markov random field model? 330

How can we estimate the Markov parameters of a texture? 332

What is maximum likelihood estimation? 332

What is the log-likelihood? 334

Box 3.24. What is the relationship between maximum likelihood estimation and Bayesian estimation? 335

How can we apply maximum likelihood estimation to estimate the parameters of a Markov random field? 335

How do we know which parameter values to try when we apply MLE to estimate the Markov parameters? 336

How can we estimate the Markov parameters with the least square error estimation method? 340

Box 3.25. Least square parameter estimation for the MRF parameters 342

How can we work out the size of the Markov neighbourhood? 348

What is the Laplacian pyramid? 348

Why is the creation of a Laplacian pyramid associated with the application of a Gaussian function at different scales, and the subtraction of the results? 348

Why may the second derivative of a Gaussian function be used as a filter to estimate the second derivative of a signal? 348

How can we use the Laplacian pyramid representation of an image to model it as a Markov field? 355

How can we synthesise a texture if we know the Markov parameters of the levels of its Laplacian pyramid? 355

Is a Markov random field always realisable given that we define it arbitrarily? 356

What conditions make an MRF self-consistent? 356

What is a clique in a neighbourhood structure? 356

 

3.7 Gibbs distributions 359

What is a Gibbs distribution? 359

What is a clique potential? 359

Can we have a Markov random field with only singleton cliques? 362

What is the relationship between Gibbs distributions and co-occurrence matrices? 372

What is the relationship between the clique potentials and the Markov parameters? 373

Box 3.26. Prove the equivalence of Markov random fields and Gibbs distributions (Hammersley-Clifford theorem). 377

How can we use the Gibbs distribution to create textures? 383

How can we create an image compatible with a Gibbs model if we are not interested in fixing the histogram of the image? 390

What is the temperature of a Gibbs distribution? 394

How does the temperature parameter of the Gibbs distribution determine how distinguishable one configuration is from another? 394

What is the critical temperature of a Markov random field? 403

 

3.8 Texture repair 411

What is texture repair? 411

How can we perform texture repair? 411

What is Kriging? 411

How does Kriging work? 411

How can we estimate the correlation matrix of the data? 416

How do we perform Kriging interpolation in practice? 416

Can we use another statistic to guide us in selecting the missing pixel values? 416

How can we repair a texture while preserving its co-occurrence matrices? 416

What is in-painting? 421

What is normalised convolution? 430

What is the error correction method? 431

What is the non-parametric MRF method for texture repair? 432

What is the “take home” message of this chapter? 435

 

4 Non-stationary grey texture images. 437

What is a non-stationary texture image? 437

What is this chapter about? 437

Why can’t we use the methods developed in the previous chapter here? 437

How can we be sure that the texture inside an image window is stationary? 437

 

4.1 The uncertainty principle and its implications in signal and image processing. 438

What is the uncertainty principle in signal processing? 438

Box 4.1. Prove the uncertainty principle in signal processing. 442

Does the window we choose in order to extract local information influence the result? 446

How can we estimate “what is happening where” in a digital signal? 456

How can we deal with the variability of the values of a feature? 459

How do we know which size window we should use? 464

How is the uncertainty principle generalised to 2D? 467

 

4.2 Gabor functions. 470

What is a Gabor function? 470

Why are Gabor functions useful in analysing a signal? 471

How can we use the Gabor functions in practice? 477

How is a Gabor function generalised in 2D? 482

How may we use the 2D Gabor functions to analyse an image? 486

Can we have alternative tessellations of the frequency domain? 495

How can we define a Gaussian window in polar coordinates in the frequency domain? 496

What is an octave? 498

How do we express a frequency in octaves? 498

How may we choose the parameters of the Gaussian window in the frequency space? 499

How do we perform the analysis of an image in terms of Gabor functions in practice? 501

 

4.3 Prolate spheroidal sequence functions. 528

Is it possible to have a window with sharp edges in one domain which has minimal side ripples in the other domain? 528

Box 4.2. Of all the band-limited sequences one can define, which sequence has the maximum energy concentration between a given set of indices? 529

Box 4.3. Do prolate spheroidal wave functions exist in the digital domain? 532

How can we construct a filter which is band-limited in two bands which are symmetrically placed about the origin of the axes in the frequency domain? 543

Box 4.4. How may we generalise the prolate spheroidal sequence functions to 2D? 553

Box 4.5. What is the difference between projection onto basis functions and convolution with the corresponding filters? 570

Box 4.6. Could we construct the 2D prolate spheroidal sequence filters as separable filters? 578

Box 4.7. What is the advantage of using separable filters? 583

How do we perform the analysis of an image using prolate spheroidal sequence functions in practice? 583

 

4.4 Local phase features 593

Can we use the phase of the DFT to compute from it texture features? 593

What is a Hilbert transform pair? 593

Why may two filters that constitute a Hilbert transform pair be used for local phase estimation? 593

What if the harmonics of a non-symmetric and non-antisymmetric function have all different phases? 594

How can we construct two filters so they constitute a Hilbert transform pair? 594

How do we compute the Riesz transform of an image? 596

How can the monogenic signal be used? 596

How can we select the even filter we use in the Riesz transform? 597

Can we tessellate the frequency space using polar coordinates like we did for the Gabor functions? 597

Can we use prolate spheroidal sequence functions instead of the monogenic signal to extract local symmetry and local orientation? 599

 

4.5 Wavelets 611

Is there a way other than using Gabor functions to span the whole spatio-frequency space? 611

What is a wavelet? 614

How can we use wavelets to analyse a signal? 615

Box 4.8. How should we choose the mother wavelet? 617

Box 4.9. Does the wavelet function minimise the uncertainty inequality? 624

How is the wavelet transform adapted for digital signals? 637

How do we compute the wavelet coefficients in practice? 640

Why is the continuous wavelet transform invertible and the discrete wavelet transform non-invertible? 652

How can we span the part of the “what happens when” space which contains the direct component of the signal? 653

Can we span the whole “what is where” space by using only the scaling function? 656

How can we extract the coarse resolution content of a signal from its content at a finer resolution? 657

How can we choose the scaling function? 662

How do we perform the multiresolution analysis of a signal in practice? 666

Why in tree wavelet analysis do we always analyse the part of the signal which contains the low frequencies only? 668

Box 4.10. How do we recover the original signal from its wavelet coefficients in practice? 677

How many different wavelet filters exist? 685

How may we use wavelets to process images? 685

How may we use wavelets to construct texture features? 692

What is the maximum overlap algorithm? 692

What is the relationship between Gabor functions and wavelets? 703

 

4.6 The dual tree complex wavelet transform 706

What is the dual tree complex wavelet transform? 706

Why was the dual tree complex wavelet transform developed? 706

How do we select the filters used by the dual tree complex wavelet transform? 708

How can we select two wavelet filters that form a Hilbert transform pair? 709

Box 4.11. How should two scaling filters be chosen, so that the corresponding wavelets they generate form a Hilbert transform pair? 723

How do we construct the filters we use in the dual tree complex wavelet transform? 724

Box 4.12. The quarter shift idea 724

How do we apply the Dual Tree Complex Wavelet transform to analyse 1D signals? 730

How do we use the Dual Tree Complex Wavelet transform to analyse images? 730

How do we apply the Dual Tree Complex Wavelet transform to images in practice? 736

What are the drawbacks of wavelets in image processing? 739

 

4.7 Ridgelets and Curvelets 740

What is a ridgelet? 740

What is the continuous ridgelet transform? 740

How do we calculate the continuous ridgelet transform? 740

What is the Radon transform of a function? 740

What exactly does the ridgelet transform do to the function? 741

Can the second stage of the ridgelet transform be performed with the help of a transform other than the wavelet transform? 743

How can we apply the ridgelet transform to digital images? 744

What is the finite Radon transform? 744

How do we form the lines that we use in the finite Radon transform? 745

Why does N have to be prime for the lines of a certain slope to fill the image? 745

How many lines made up from pixels can we have in an N × N image? 745

How many pixels does each line in the finite Radon transform consist of? 749

If the lines we use are made up from pixels, how sure are we that all pixels are used in an unbiased way? 749

How do we compute the finite Radon transform? 750

Is the finite Radon transform invertible? 751

How do we compute the inverse finite Radon transform? 753

Can we perform the finite Radon transform using matrix multiplications? 753

Box 4.13. Circulant matrices 763

Box 4.14. Elements of frame theory 765

Does the order by which we consider the tracing lines in the finite Radon transform matter? 767

What determines the order by which we consider the lines in the finite Radon transform? 767

How may a line be parametrised? 767

What is the best way to parametrise the Radon lines for use in the ridgelet transform? 770

How can we select the tracing lines so low frequency information is captured best? 784

How do we use the finite Radon transform to calculate the finite ridgelet transform? 789

How do we create a wavelet basis for a signal that is not a power of 2 in length? 789

What are the basis images in terms of which the finite Radon transform expands an image? 791

Is the finite ridgelet transform invertible? 794

What are the basis images in terms of which the finite ridgelet transform expands an image? 794

Does the finite ridgelet transform extract local image information? 799

How do we apply the curvelet transform in practice? 801

 

4.8 Where image processing and pattern recognition meet 804

Why in wavelet analysis do we always split the band with the maximum energy? 804

What is feature selection? 804

How can we visualise the histogram of more than one feature in order to decide whether they constitute a good feature set? 806

What is the feature space? 806

What is the histogram of distances in a feature space? 806

Is it possible that the histogram of distances does not pick up the presence of clusters, even though clusters are present? 808

How do we segment the image once we have produced a set of features for each pixel? 810

What is the K-means algorithm? 810

What is deterministic annealing? 811

Box 4.15. Maximum entropy clustering 812

How may we assess the quality of a segmentation? 815

How is the Bhattacharyya distance defined? 815

How can we compute the Bhattacharyya distance in practice? 815

How may we assess the quality of a segmentation using a manual segmentation as reference? 816

What is a confusion matrix? 817

What are the over- and under-detection errors? 818

How can we search for a pattern in an image? 819

How do we compute the cross-correlation between two image patches? 819

How do we compute the mutual information between two images? 819

How do we perform matching by tone mapping between two images? 820

Box 4.16. Fast tone matching algorithm 827

Box 4.17. From mutual information to a point similarity measure 828

How can we compare two histograms? 831

What is the χ² test? 831

What is the earth mover’s distance? 831

What is the Hungarian algorithm? 831

Box 4.18. The formal definition of earth mover’s distance 832

 

4.9 Laws’ masks and the “what looks like where” space 834

Is it possible to extract image features without referring to the frequency domain? 834

How are Laws’ masks defined? 834

Is there a systematic way to construct features that span the “what looks like where” space completely? 843

How can we expand a local image neighbourhood in terms of the Walsh elementary images? 851

Can we use convolution to compute the coefficients of the expansion of a sub-image in terms of a set of elementary images? 857

Is there any other way to express the local structure of the image? 868

 

4.10 Local binary patterns 869

What is the local binary pattern approach to texture representation? 869

How can we make this representation rotationally invariant? 869

How can we make this representation appropriate for macro-textures? 870

How can we use the local binary patterns to characterise textures? 871

What is a metric? 871

What is a pseudo-metric? 871

Why should one wish to use a pseudo-metric and not a metric? 872

How can we measure the difference between two histograms? 872

How can we use the local binary patterns to segment textures? 874

How can we overcome the shortcomings of the LBP segmentation? 875

4.11 The Wigner distribution 878

What is the Wigner distribution? 878

How is the Wigner distribution used for the analysis of digital signals? 886

What is the pseudo-Wigner distribution? 886

What is the Kaiser window? 887

What is the Nyquist frequency? 889

Why does the use of the pseudo-Wigner distribution require signals which have been sampled at twice their Nyquist frequency? 890

Should we worry about aliasing when we use the pseudo-Wigner distribution for texture analysis? 890

How is the pseudo-Wigner distribution defined for the analysis of images? 892

How can the pseudo-Wigner distribution be used for texture segmentation? 892

4.12 Convolutional Neural Networks for Texture Feature Extraction 901

Why do we require convolutional neural networks for texture analysis? 901

The basis of neural networks is the McCulloch and Pitts neuron model 902

Box 4.19. McCulloch and Pitts neuron model 902

Box 4.20. Hebbian theory 903

What is Rosenblatt's learning model, the perceptron? 903

What is the Widrow and Hoff learning model? 906

Box 4.21. Steepest descent method 910

What are the findings from physiology? 911

Box 4.22. Hubel and Wiesel discoveries 911

What are local patterns: primitives and textons? 912

Box 4.23. Julesz texton 912

What kind of information do we extract as local patterns or features? 913

Box 4.24. What is Principal Component Analysis (PCA)? 914

How do we find primitive patterns? 918

What is the backpropagation learning method? 919

Box 4.25. Sigmoid function 920

Box 4.26. Three-layer neural networks 920

How do we solve the vanishing gradient problem? 927

We have difficulties in texture classification: intra-class variations and inter-class similarities. 927

What is Fisher's discriminating feature space? 928

How was texture analysis developed in the past? 932

What is the Fisher Information Matrix (FIM)? 935

How do we implement a co-occurrence matrix feature map? 936

Backbone CNN structure 938

There are several texture datasets available. 939

We set up several deep networks for texture classification 940

We have several existing problems. 942

What is the “take-home” message of this chapter? 944

Bibliographical Notes 945

References 947

Index

Supplemental Materials

What is included with this book?

The New copy of this book will include any supplemental materials advertised. Please check the title of the book to determine if it should include any access cards, study guides, lab manuals, CDs, etc.

The Used, Rental and eBook copies of this book are not guaranteed to include any supplemental materials. Typically, only the book itself is included. This is true even if the title states it includes any access cards, study guides, lab manuals, CDs, etc.
