
Hyperspectral Data Processing : Algorithm Design and Analysis

by Chein-I Chang
Edition: 1st
ISBN-13: 9780471690566
ISBN-10: 0471690562
Format: Hardcover
Pub. Date: 4/15/2013
Publisher(s): Wiley-Interscience
List Price: $207.99

Rent Textbook (Recommended): $176.79

Buy New Textbook: $198.59 (currently available; usually ships in 24-48 hours)

Used Textbook: sold out

eTextbook: not available

More new and used copies available from private sellers, starting at $185.17

Questions About This Book?

Why should I rent this book?
Renting is easy, fast, and cheap! Renting from eCampus.com can save you hundreds of dollars compared to the cost of new or used books each semester. At the end of the semester, simply ship the book back to us with a free UPS shipping label! No need to worry about selling it back.
How do rental returns work?
Returning books is as easy as possible. As your rental due date approaches, we will email you several courtesy reminders. When you are ready to return, you can print a free UPS shipping label from our website at any time. Then, just return the book to your UPS driver or any staffed UPS location. You can even use the same box we shipped it in!
What version or edition is this?
This is the 1st edition with a publication date of 4/15/2013.
What is included with this book?
  • The New copy of this book will include any supplemental materials advertised. Please check the title of the book to determine if it should include any CDs, lab manuals, study guides, etc.
  • The Rental copy of this book is not guaranteed to include any supplemental materials. You may receive a brand new copy, but typically, only the book itself.

Summary

This book is intended as a sequel to the author's previous work, Hyperspectral Imaging: Techniques for Spectral Detection and Classification. It comprises eight major parts and covers a wealth of information, including a discussion of orthogonal subspace projection (OSP), FPGA designs for OSP and constrained energy minimization (CEM), Kalman filter-based linear unmixing, least-squares fully constrained linear mixture analysis, interference rejection for linear unmixing, signal-composed interference-annihilated theory, techniques for linear unmixing, spectral coding, and more.
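As light background for the techniques named above, here is a minimal NumPy sketch of the orthogonal subspace projection (OSP) idea. It is illustrative only, not code from the book: the function name osp_detector and the toy data are assumptions, while the projector P = I - U(U^T U)^{-1} U^T is the standard formulation used in the OSP literature.

```python
import numpy as np

def osp_detector(r, d, U):
    """Score pixel spectrum r for the desired signature d while
    suppressing the undesired signatures stacked as columns of U.

    Returns d^T P r, where P = I - U (U^T U)^{-1} U^T projects onto
    the orthogonal complement of the column space of U.
    """
    L = U.shape[0]
    # pinv(U) equals (U^T U)^{-1} U^T when U has full column rank
    P = np.eye(L) - U @ np.linalg.pinv(U)
    return float(d @ P @ r)

# Toy usage (hypothetical data): a 50-band pixel mixing the target
# with two background signatures.
rng = np.random.default_rng(0)
d = rng.random(50)                       # desired target signature
U = rng.random((50, 2))                  # undesired (background) signatures
r = 0.4 * d + U @ np.array([0.3, 0.3])   # simulated mixed pixel
print(osp_detector(r, d, U))
```

Because P U = 0, the background contribution U @ [0.3, 0.3] drops out exactly and the score reduces to 0.4 * (d^T P d), responding only to the target component.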

Author Biography

CHEIN-I CHANG, PhD, is a Professor in the Department of Computer Science and Electrical Engineering at the University of Maryland, Baltimore County, where he established the Remote Sensing Signal and Image Processing Laboratory. He conducts research in designing and developing signal processing algorithms for hyperspectral imaging, medical imaging, and document analysis. A Fellow of IEEE and SPIE, Dr. Chang has published over 125 refereed journal articles, including more than forty papers in the IEEE Transactions on Geoscience and Remote Sensing. In addition to authoring Hyperspectral Imaging: Techniques for Spectral Detection and Classification, he has edited two books, Hyperspectral Data Exploitation: Theory and Applications and Recent Advances in Hyperspectral Signal and Image Processing, and co-edited High Performance Computing in Remote Sensing. He holds five patents and has several more pending.

Table of Contents

PREFACE xxiii

1 OVERVIEW AND INTRODUCTION 1

1.1 Overview 2

1.2 Issues of Multispectral and Hyperspectral Imageries 3

1.3 Divergence of Hyperspectral Imagery from Multispectral Imagery 4

1.3.1 Misconception: Hyperspectral Imaging is a Natural Extension of Multispectral Imaging 4

1.3.2 Pigeon-Hole Principle: Natural Interpretation of Hyperspectral Imaging 5

1.4 Scope of This Book 7

1.5 Book’s Organization 10

1.5.1 Part I: Preliminaries 10

1.5.2 Part II: Endmember Extraction 12

1.5.3 Part III: Supervised Linear Hyperspectral Mixture Analysis 13

1.5.4 Part IV: Unsupervised Hyperspectral Analysis 13

1.5.5 Part V: Hyperspectral Information Compression 15

1.5.6 Part VI: Hyperspectral Signal Coding 16

1.5.7 Part VII: Hyperspectral Signal Feature Characterization 17

1.5.8 Applications 17

1.5.8.1 Chapter 30: Applications of Target Detection 17

1.5.8.2 Chapter 31: Nonlinear Dimensionality Expansion to Multispectral Imagery 18

1.5.8.3 Chapter 32: Multispectral Magnetic Resonance Imaging 19

1.6 Laboratory Data to be Used in This Book 19

1.6.1 Laboratory Data 19

1.6.2 Cuprite Data 19

1.6.3 NIST/EPA Gas-Phase Infrared Database 19

1.7 Real Hyperspectral Images to be Used in This Book 20

1.7.1 AVIRIS Data 20

1.7.1.1 Cuprite Data 21

1.7.1.2 Purdue’s Indiana Indian Pine Test Site 25

1.7.2 HYDICE Data 26

1.8 Notations and Terminologies to be Used in This Book 29

I: PRELIMINARIES 31

2 FUNDAMENTALS OF SUBSAMPLE AND MIXED SAMPLE ANALYSES 33

2.1 Introduction 33

2.2 Subsample Analysis 35

2.2.1 Pure-Sample Target Detection 35

2.2.2 Subsample Target Detection 38

2.2.2.1 Adaptive Matched Detector (AMD) 39

2.2.2.2 Adaptive Subspace Detector (ASD) 41

2.2.3 Subsample Target Detection: Constrained Energy Minimization (CEM) 43

2.3 Mixed Sample Analysis 45

2.3.1 Classification with Hard Decisions 45

2.3.1.1 Fisher’s Linear Discriminant Analysis (FLDA) 46

2.3.1.2 Support Vector Machines (SVM) 48

2.3.2 Classification with Soft Decisions 54

2.3.2.1 Orthogonal Subspace Projection (OSP) 54

2.3.2.2 Target-Constrained Interference-Minimized Filter (TCIMF) 56

2.4 Kernel-Based Classification 57

2.4.1 Kernel Trick Used in Kernel-Based Methods 57

2.4.2 Kernel-Based Fisher’s Linear Discriminant Analysis (KFLDA) 58

2.4.3 Kernel Support Vector Machine (K-SVM) 59

2.5 Conclusions 60

3 THREE-DIMENSIONAL RECEIVER OPERATING CHARACTERISTICS (3D ROC) ANALYSIS 63

3.1 Introduction 63

3.2 Neyman–Pearson Detection Problem Formulation 65

3.3 ROC Analysis 67

3.4 3D ROC Analysis 69

3.5 Real Data-Based ROC Analysis 72

3.5.1 How to Generate ROC Curves from Real Data 72

3.5.2 How to Generate Gaussian-Fitted ROC Curves 73

3.5.3 How to Generate 3D ROC Curves 75

3.5.4 How to Generate 3D ROC Curves for Multiple Signal Detection and Classification 77

3.6 Examples 78

3.6.1 Hyperspectral Imaging 79

3.6.1.1 Hyperspectral Target Detection 79

3.6.1.2 Linear Hyperspectral Mixture Analysis 80

3.6.2 Magnetic Resonance (MR) Breast Imaging 83

3.6.2.1 Breast Tumor Detection 84

3.6.2.2 Brain Tissue Classification 87

3.6.3 Chemical/Biological Agent Detection 91

3.6.4 Biometric Recognition 95

3.7 Conclusions 99

4 DESIGN OF SYNTHETIC IMAGE EXPERIMENTS 101

4.1 Introduction 102

4.2 Simulation of Targets of Interest 103

4.2.1 Simulation of Synthetic Subsample Targets 103

4.2.2 Simulation of Synthetic Mixed-Sample Targets 104

4.3 Six Scenarios of Synthetic Images 104

4.3.1 Panel Simulations 104

4.3.2 Three Scenarios for Target Implantation (TI) 106

4.3.2.1 Scenario TI1 (Clean Panels Implanted into Clean Background) 106

4.3.2.2 Scenario TI2 (Clean Panels Implanted into Noisy Background) 107

4.3.2.3 Scenario TI3 (Gaussian Noise Added to Clean Panels Implanted into Clean Background) 108

4.3.3 Three Scenarios for Target Embeddedness (TE) 108

4.3.3.1 Scenario TE1 (Clean Panels Embedded in Clean Background) 109

4.3.3.2 Scenario TE2 (Clean Panels Embedded in Noisy Background) 109

4.3.3.3 Scenario TE3 (Gaussian Noise Added to Clean Panels Embedded in Background) 110

4.4 Applications 112

4.4.1 Endmember Extraction 112

4.4.2 Linear Spectral Mixture Analysis (LSMA) 113

4.4.2.1 Mixed Pixel Classification 114

4.4.2.2 Mixed Pixel Quantification 114

4.4.3 Target Detection 114

4.4.3.1 Subpixel Target Detection 114

4.4.3.2 Anomaly Detection 122

4.5 Conclusions 123

5 VIRTUAL DIMENSIONALITY OF HYPERSPECTRAL DATA 124

5.1 Introduction 124

5.2 Reinterpretation of VD 126

5.3 VD Determined by Data Characterization-Driven Criteria 126

5.3.1 Eigenvalue Distribution-Based Criteria 127

5.3.1.1 Thresholding Energy Percentage 127

5.3.1.2 Thresholding Difference between Normalized Correlation Eigenvalues and Normalized Covariance Eigenvalues 128

5.3.1.3 Finding First Sudden Drop in the Normalized Eigenvalue Distribution 128

5.3.2 Eigen-Based Component Analysis Criteria 128

5.3.2.1 Singular Value Decomposition (SVD) 128

5.3.2.2 Principal Components Analysis (PCA) 129

5.3.3 Factor Analysis: Malinowski’s Error Theory 129

5.3.4 Information Theoretic Criteria (ITC) 130

5.3.4.1 AIC 131

5.3.4.2 MDL 131

5.3.5 Gershgorin Radius-Based Methods 131

5.3.5.1 Thresholding Gershgorin Radii 134

5.3.5.2 Thresholding Difference of Gershgorin Radii between R_{L×L} and K_{L×L} 134

5.3.6 HFC Method 135

5.3.7 Discussions on Data Characterization-Driven Criteria 138

5.4 VD Determined by Data Representation-Driven Criteria 140

5.4.1 Orthogonal Subspace Projection (OSP) 140

5.4.2 Signal Subspace Estimation (SSE) 142

5.4.3 Discussions on OSP and SSE/HySime 143

5.5 Synthetic Image Experiments 144

5.5.1 Data Characterization-Driven Criteria 144

5.5.1.1 Target Implantation (TI) Scenarios 145

5.5.1.2 Target Embeddedness (TE) Scenarios 146

5.5.2 Data Representation-Driven Criteria 149

5.6 VD Estimated for Real Hyperspectral Images 155

5.7 Conclusions 163

6 DATA DIMENSIONALITY REDUCTION 168

6.1 Introduction 168

6.2 Dimensionality Reduction by Second-Order Statistics-Based Component Analysis Transforms 170

6.2.1 Eigen Component Analysis Transforms 170

6.2.1.1 Principal Components Analysis 170

6.2.1.2 Standardized Principal Components Analysis 172

6.2.1.3 Singular Value Decomposition 174

6.2.2 Signal-to-Noise Ratio-Based Components Analysis Transforms 176

6.2.2.1 Maximum Noise Fraction Transform 176

6.2.2.2 Noise-Adjusted Principal Component Transform 177

6.3 Dimensionality Reduction by High-Order Statistics-Based Components Analysis Transforms 179

6.3.1 Sphering 179

6.3.2 Third-Order Statistics-Based Skewness 181

6.3.3 Fourth-Order Statistics-Based Kurtosis 182

6.3.4 High-Order Statistics 182

6.3.5 Algorithm for Finding Projection Vectors 183

6.4 Dimensionality Reduction by Infinite-Order Statistics-Based Components Analysis Transforms 184

6.4.1 Statistics-Prioritized ICA-DR (SPICA-DR) 187

6.4.2 Random ICA-DR 188

6.4.3 Initialization Driven ICA-DR 189

6.5 Dimensionality Reduction by Projection Pursuit-Based Components Analysis Transforms 190

6.5.1 Projection Index-Based Projection Pursuit 191

6.5.2 Random Projection Index-Based Projection Pursuit 192

6.5.3 Projection Index-Based Prioritized Projection Pursuit 193

6.5.4 Initialization Driven Projection Pursuit 194

6.6 Dimensionality Reduction by Feature Extraction-Based Transforms 195

6.6.1 Fisher’s Linear Discriminant Analysis 195

6.6.2 Orthogonal Subspace Projection 196

6.7 Dimensionality Reduction by Band Selection 196

6.8 Constrained Band Selection 197

6.9 Conclusions 198

II: ENDMEMBER EXTRACTION 201

7 SIMULTANEOUS ENDMEMBER EXTRACTION ALGORITHMS (SM-EEAs) 207

7.1 Introduction 208

7.2 Convex Geometry-Based Endmember Extraction 209

7.2.1 Convex Geometry-Based Criterion: Orthogonal Projection 209

7.2.2 Convex Geometry-Based Criterion: Minimal Simplex Volume 214

7.2.2.1 Minimal-Volume Transform (MVT) 214

7.2.2.2 Convex Cone Analysis (CCA) 214

7.2.3 Convex Geometry-Based Criterion: Maximal Simplex Volume 215

7.2.3.1 Simultaneous N-FINDR (SM N-FINDR) 216

7.2.3.2 Iterative N-FINDR (IN-FINDR) 216

7.2.3.3 Various Versions of Implementing IN-FINDR 218

7.2.3.4 Discussions on Various Implementation Versions of IN-FINDR 222

7.2.3.5 Comparative Study Among Various Versions of IN-FINDR 222

7.2.3.6 Alternative SM N-FINDR 223

7.2.4 Convex Geometry-Based Criterion: Linear Spectral Mixture Analysis 225

7.3 Second-Order Statistics-Based Endmember Extraction 228

7.4 Automated Morphological Endmember Extraction (AMEE) 230

7.5 Experiments 231

7.5.1 Synthetic Image Experiments 231

7.5.1.1 Scenario TI1 (Endmembers Implanted in a Clean Background) 232

7.5.1.2 Scenario TI2 (Endmembers Implanted in a Noisy Background) 233

7.5.1.3 Scenario TI3 (Noisy Endmembers Implanted in a Noisy Background) 234

7.5.1.4 Scenario TE1 (Endmembers Embedded into a Clean Background) 235

7.5.1.5 Scenario TE2 (Endmembers Embedded into a Noisy Background) 235

7.5.1.6 Scenario TE3 (Noisy Endmembers Embedded into a Noisy Background) 236

7.5.2 Cuprite Data 237

7.5.3 HYDICE Data 237

7.6 Conclusions 239

8 SEQUENTIAL ENDMEMBER EXTRACTION ALGORITHMS (SQ-EEAs) 241

8.1 Introduction 241

8.2 Successive N-FINDR (SC N-FINDR) 244

8.3 Simplex Growing Algorithm (SGA) 244

8.4 Vertex Component Analysis (VCA) 247

8.5 Linear Spectral Mixture Analysis-Based SQ-EEAs 248

8.5.1 Automatic Target Generation Process-EEA (ATGP-EEA) 248

8.5.2 Unsupervised Nonnegativity Constrained Least-Squares-EEA (UNCLS-EEA) 249

8.5.3 Unsupervised Fully Constrained Least-Squares-EEA (UFCLS-EEA) 250

8.5.4 Iterative Error Analysis-EEA (IEA-EEA) 251

8.6 High-Order Statistics-Based SQ-EEAs 252

8.6.1 Third-Order Statistics-Based SQ-EEA 252

8.6.2 Fourth-Order Statistics-Based SQ-EEA 252

8.6.3 Criterion for kth Moment-Based SQ-EEA 253

8.6.4 Algorithm for Finding Projection Vectors 253

8.6.5 ICA-Based SQ-EEA 254

8.7 Experiments 254

8.7.1 Synthetic Image Experiments 255

8.7.2 Real Hyperspectral Image Experiments 258

8.7.2.1 Cuprite Data 258

8.7.2.2 HYDICE Data 260

8.8 Conclusions 262

9 INITIALIZATION-DRIVEN ENDMEMBER EXTRACTION ALGORITHMS (ID-EEAs) 265

9.1 Introduction 265

9.2 Initialization Issues 266

9.2.1 Initial Conditions to Terminate an EEA 267

9.2.2 Selection of an Initial Set of Endmembers for an EEA 267

9.2.3 Issues of Random Initial Conditions Demonstrated by Experiments 268

9.2.3.1 HYDICE Experiments 268

9.2.3.2 AVIRIS Experiments 270

9.3 Initialization-Driven EEAs 271

9.3.1 Initial Endmember-Driven EEAs 272

9.3.1.1 Finding Maximum Length of Data Sample Vectors 272

9.3.1.2 Finding Sample Mean of Data Sample Vectors 273

9.3.2 Endmember Initialization Algorithm for SM-EEAs 274

9.3.2.1 SQ-EEAs 274

9.3.2.2 Maxmin-Distance Algorithm 275

9.3.2.3 ISODATA 275

9.3.3 EIA-Driven EEAs 275

9.4 Experiments 278

9.4.1 Synthetic Image Experiments 278

9.4.2 Real Image Experiments 281

9.5 Conclusions 283

10 RANDOM ENDMEMBER EXTRACTION ALGORITHMS (REEAs) 287

10.1 Introduction 287

10.2 Random PPI (RPPI) 288

10.3 Random VCA (RVCA) 290

10.4 Random N-FINDR (RN-FINDR) 290

10.5 Random SGA (RSGA) 292

10.6 Random ICA-Based EEA (RICA-EEA) 292

10.7 Synthetic Image Experiments 293

10.7.1 RPPI 293

10.7.2 Various Random Versions of IN-FINDR 296

10.7.2.1 Scenario TI2 297

10.7.2.2 Scenario TI3 299

10.7.2.3 Scenario TE2 301

10.7.2.4 Scenario TE3 303

10.8 Real Image Experiments 305

10.8.1 HYDICE Image Experiments 305

10.8.1.1 RPPI 306

10.8.1.2 RN-FINDR 306

10.8.2 AVIRIS Image Experiments 309

10.8.2.1 RPPI 309

10.8.2.2 RN-FINDR 310

10.9 Conclusions 313

11 EXPLORATION ON RELATIONSHIPS AMONG ENDMEMBER EXTRACTION ALGORITHMS 316

11.1 Introduction 316

11.2 Orthogonal Projection-Based EEAs 318

11.2.1 Relationship among PPI, VCA, and ATGP 319

11.2.1.1 Relationship Between PPI and ATGP 319

11.2.1.2 Relationship Between PPI and VCA 320

11.2.1.3 Relationship Between ATGP and VCA 321

11.2.1.4 Discussions 322

11.2.2 Experiments-Based Comparative Study and Analysis 323

11.2.2.1 Synthetic Image Experiment: TI2 323

11.2.2.2 Real Image Experiments 325

11.3 Comparative Study and Analysis Between SGA and VCA 330

11.4 Does an Endmember Set Really Yield Maximum Simplex Volume? 339

11.5 Impact of Dimensionality Reduction on EEAs 344

11.6 Conclusions 348

III: SUPERVISED LINEAR HYPERSPECTRAL MIXTURE ANALYSIS 351

12 ORTHOGONAL SUBSPACE PROJECTION REVISITED 355

12.1 Introduction 355

12.2 Three Perspectives to Derive OSP 358

12.2.1 Signal Detection Perspective Derived from (d,U)-Model and OSP-Model 359

12.2.2 Fisher’s Linear Discriminant Analysis Perspective from OSP-Model 360

12.2.3 Parameter Estimation Perspective from OSP-Model 362

12.2.4 Relationship Between d^LS_ap(r) and Least-Squares Linear Spectral Mixture Analysis 362

12.3 Gaussian Noise in OSP 364

12.3.1 Signal Detector in Gaussian Noise Using OSP-Model 365

12.3.2 Gaussian Maximum Likelihood Classifier Using OSP-Model 366

12.3.3 Gaussian Maximum Likelihood Estimator 367

12.3.4 Examples 367

12.4 OSP Implemented with Partial Knowledge 372

12.4.1 CEM 373

12.4.1.1 d Is Orthogonal to U (i.e., P_U^⊥ d = d) and R = I (i.e., Spectral Correlation Is Whitened) 374

12.4.1.2 An Alternative Approach to Implementing CEM 374

12.4.1.3 CEM Implemented in Conjunction with P_U^⊥ 375

12.4.1.4 CEM Implemented in Conjunction with P_U^⊥ in White Noise 376

12.4.2 TCIMF 377

12.4.2.1 D = m_p = d with n_D = 1 and U = [m_1 m_2 ... m_{p-1}] with n_U = p - 1 378

12.4.2.2 D = m_p = d with n_D = 1 and U = [m_1 m_2 ... m_{p-1}] with n_U = p - 1 and R = I 378

12.4.2.3 D = d and U = ∅ (i.e., Only the Desired Signature d Is Available) 378

12.4.3 Examples 379

12.5 OSP Implemented Without Knowledge 383

12.6 Conclusions 390

13 FISHER’S LINEAR SPECTRAL MIXTURE ANALYSIS 391

13.1 Introduction 391

13.2 Feature Vector-Constrained FLSMA (FVC-FLSMA) 392

13.3 Relationship Between FVC-FLSMA and LCMV, TCIMF, and CEM 395

13.4 Relationship Between FVC-FLSMA and OSP 396

13.5 Relationship Between FVC-FLSMA and LCDA 396

13.6 Abundance-Constrained Least Squares FLDA (ACLS-FLDA) 397

13.7 Synthetic Image Experiments 398

13.8 Real Image Experiments 402

13.8.1 Image Background Characterized by Supervised Knowledge 402

13.8.2 Image Background Characterized by Unsupervised Knowledge 405

13.9 Conclusions 409

14 WEIGHTED ABUNDANCE-CONSTRAINED LINEAR SPECTRAL MIXTURE ANALYSIS 411

14.1 Introduction 411

14.2 Abundance-Constrained LSMA (AC-LSMA) 413

14.3 Weighted Least-Squares Abundance-Constrained LSMA 413

14.3.1 Weighting Matrix Derived from a Parameter Estimation Perspective 414

14.3.1.1 MD-Weighted AC-LSMA 415

14.3.1.2 LCMV-Weighted AC-LSMA 415

14.3.2 Weighting Matrix Derived from Fisher’s Linear Discriminant Analysis Perspective 416

14.3.3 Weighting Matrix Derived from an Orthogonal Subspace Projection Perspective 417

14.3.3.1 OSP-Weighted AC-LSMA 417

14.3.3.2 SSP-Weighted AC-LSMA 418

14.4 Synthetic Image-Based Computer Simulations 419

14.5 Real Image Experiments 426

14.6 Conclusions 432

15 KERNEL-BASED LINEAR SPECTRAL MIXTURE ANALYSIS 434

15.1 Introduction 434

15.2 Kernel-Based LSMA (KLSMA) 436

15.2.1 Kernel Least Squares Orthogonal Subspace Projection (KLSOSP) 436

15.2.2 Kernel-Based Non-Negative Constraint Least Square (KNCLS) 438

15.2.3 Kernel-Based Fully Constraint Least Square (KFCLS) 439

15.2.4 A Note on Kernelization 440

15.3 Synthetic Image Experiments 441

15.4 AVIRIS Data Experiments 444

15.4.1 Radial Basis Function Kernels 449

15.4.2 Polynomial Kernels 452

15.4.3 Sigmoid Kernels 454

15.5 HYDICE Data Experiments 460

15.6 Conclusions 462

IV: UNSUPERVISED HYPERSPECTRAL IMAGE ANALYSIS 465

16 HYPERSPECTRAL MEASURES 469

16.1 Introduction 469

16.2 Signature Vector-Based Hyperspectral Measures for Target Discrimination and Identification 470

16.2.1 Euclidean Distance 471

16.2.2 Spectral Angle Mapper 471

16.2.3 Orthogonal Projection Divergence 471

16.2.4 Spectral Information Divergence 471

16.3 Correlation-Weighted Hyperspectral Measures for Target Discrimination and Identification 472

16.3.1 Hyperspectral Measures Weighted by A Priori Correlation 473

16.3.1.1 OSP-Based Hyperspectral Measures for Discrimination 473

16.3.1.2 OSP-Based Hyperspectral Measures for Identification 473

16.3.2 Hyperspectral Measures Weighted by A Posteriori Correlation 474

16.3.2.1 Covariance Matrix-Weighted Hyperspectral Measures 474

16.3.2.2 Correlation Matrix-Weighted Hyperspectral Measures 475

16.3.2.3 Covariance Matrix-Weighted Matched Filter Distance 475

16.3.2.4 Correlation Matrix-Weighted Matched Filter Distance 476

16.4 Experiments 477

16.4.1 HYDICE Image Experiments 477

16.4.2 AVIRIS Image Experiments 478

16.5 Conclusions 482

17 UNSUPERVISED LINEAR HYPERSPECTRAL MIXTURE ANALYSIS 483

17.1 Introduction 483

17.2 Least Squares-Based ULSMA 486

17.3 Component Analysis-Based ULSMA 488

17.4 Synthetic Image Experiments 490

17.4.1 LS-ULSMA 491

17.4.2 CA-ULSMA 499

17.5 Real-Image Experiments 503

17.5.1 LS-ULSMA 503

17.5.2 CA-ULSMA 505

17.5.3 Qualitative and Quantitative Analyses between ULSMA and SLSMA 511

17.6 ULSMA Versus Endmember Extraction 517

17.7 Conclusions 524

18 PIXEL EXTRACTION AND INFORMATION 526

18.1 Introduction 526

18.2 Four Types of Pixels 527

18.3 Algorithms Selected to Extract Pixel Information 528

18.4 Pixel Information Analysis via Synthetic Images 528

18.5 Real Image Experiments 534

18.5.1 AVIRIS Image Data 534

18.5.2 DAIS 7915 Image Data 537

18.6 Conclusions 539

V: HYPERSPECTRAL INFORMATION COMPRESSION 541

19 EXPLOITATION-BASED HYPERSPECTRAL DATA COMPRESSION 545

19.1 Introduction 545

19.2 Hyperspectral Information Compression Systems 547

19.3 Spectral/Spatial Compression 549

19.3.1 Dimensionality Reduction by Transform-Based Spectral Compression 550

19.3.1.1 Determination of Number of PCs/ICs to be Retained 551

19.3.1.2 PCA (ICA)/2D Compression 551

19.3.1.3 PCA (ICA)/3D Compression 552

19.3.1.4 Inverse PCA (Inverse ICA)/2D Compression 553

19.3.1.5 Inverse PCA (Inverse ICA)/3D Compression 553

19.3.1.6 Mixed Component Transforms for Hyperspectral Compression 554

19.3.2 Dimensionality Reduction by Band Selection-Based Spectral Compression 556

19.4 Progressive Spectral/Spatial Compression 557

19.5 3D Compression 557

19.5.1 3D-Multicomponent JPEG 557

19.5.2 3D-SPIHT Compression 558

19.6 Exploration-Based Applications 559

19.6.1 Linear Spectral Mixture Analysis 559

19.6.2 Subpixel Target Detection 559

19.6.3 Anomaly Detection 560

19.6.4 Endmember Extraction 561

19.7 Experiments 561

19.7.1 Synthetic Image Experiments 562

19.7.2 Real Image Experiments 567

19.8 Conclusions 580

20 PROGRESSIVE SPECTRAL DIMENSIONALITY PROCESS 581

20.1 Introduction 582

20.2 Dimensionality Prioritization 584

20.3 Representation of Transformed Components for DP 585

20.3.1 Projection Index-Based PP 585

20.3.2 Mixed Projection Index-Based Prioritized PP (M-PIPP) 587

20.3.3 Projection Index-Based Prioritized PP (PI-PRPP) 587

20.3.4 Initialization-Driven PIPP (ID-PIPP) 588

20.4 Progressive Spectral Dimensionality Process 589

20.4.1 Progressive Principal Components Analysis 591

20.4.1.1 Simultaneous PCA 591

20.4.1.2 Progressive PCA 592

20.4.1.3 Sequential PCA 593

20.4.1.4 Initialization-Driven PCA 595

20.4.2 Progressive High-Order Statistics Component Analysis 596

20.4.3 Progressive Independent Component Analysis 596

20.5 Hyperspectral Compression by PSDP 597

20.5.1 Progressive Spectral Dimensionality Reduction 597

20.5.2 Progressive Spectral Dimensionality Expansion 597

20.6 Experiments for PSDP 598

20.6.1 Endmember Extraction 598

20.6.2 Land Cover/Use Classification 599

20.6.3 Linear Spectral Mixture Analysis 603

20.7 Conclusions 608

21 PROGRESSIVE BAND DIMENSIONALITY PROCESS 613

21.1 Introduction 614

21.2 Band Prioritization 615

21.3 Criteria for Band Prioritization 617

21.3.1 Second-Order Statistics-Based BPC 617

21.3.1.1 Variance-Based BPC 617

21.3.1.2 Signal-to-Noise-Ratio-Based BPC 618

21.3.2 High-Order Statistics-Based BPC 618

21.3.2.1 Skewness 618

21.3.2.2 Kurtosis 618

21.3.3 Infinite-Order Statistics-Based BPC 618

21.3.3.1 Entropy 619

21.3.3.2 Information Divergence 619

21.3.4 Classification-Based BPC 619

21.3.4.1 Fisher’s Linear Discriminant Analysis (FLDA)-Based BPC 619

21.3.4.2 OSP-Based BPC 620

21.3.5 Constrained Band Correlation/Dependence Minimization 620

21.3.5.1 Band Correlation/Dependence Minimization 621

21.3.5.2 Band Correlation Constraint 622

21.4 Experiments for BP 624

21.4.1 Applications Using Highest-Prioritized Bands 625

21.4.1.1 Unsupervised Linear Spectral Mixture Analysis 626

21.4.1.2 Endmember Extraction 632

21.4.2 Applications Using Least-Prioritized Bands 635

21.4.2.1 Unsupervised Linear Spectral Mixture Analysis 636

21.4.2.2 Endmember Extraction 637

21.4.3 Applications Using Mixing Highest-Prioritized and Least-Prioritized Bands 646

21.4.3.1 Unsupervised Linear Spectral Mixture Analysis 646

21.4.3.2 Endmember Extraction 646

21.5 Progressive Band Dimensionality Process 651

21.6 Hyperspectral Compression by PBDP 653

21.6.1 Progressive Band Dimensionality Reduction Via BP 654

21.6.2 Progressive Band Dimensionality Expansion Via BP 655

21.7 Experiments for PBDP 656

21.7.1 Endmember Extraction 656

21.7.2 Land Cover/Use Classification 658

21.7.3 Linear Spectral Mixture Analysis 660

21.8 Conclusions 662

22 DYNAMIC DIMENSIONALITY ALLOCATION 664

22.1 Introduction 664

22.2 Dynamic Dimensionality Allocation 665

22.3 Signature Discriminatory Probabilities 667

22.4 Coding Techniques for Determining DDA 667

22.4.1 Shannon Coding-Based DDA 667

22.4.2 Huffman Coding-Based DDA 668

22.4.3 Hamming Coding-Based DDA 669

22.4.4 Notes on DDA 669

22.5 Experiments for Dynamic Dimensionality Allocation 669

22.5.1 Reflectance Cuprite Data 670

22.5.2 Purdue’s Data 672

22.5.3 HYDICE Data 674

22.6 Conclusions 682

23 PROGRESSIVE BAND SELECTION 683

23.1 Introduction 683

23.2 Band De-Correlation 684

23.2.1 Spectral Measure-Based BD 684

23.2.2 Orthogonalization-Based BD 685

23.3 Progressive Band Selection 686

23.3.1 PBS: BP Followed by BD 687

23.3.2 PBS: BD Followed by BP 687

23.4 Experiments for Progressive Band Selection 688

23.5 Endmember Extraction 688

23.6 Land Cover/Use Classification 690

23.7 Linear Spectral Mixture Analysis 694

23.8 Conclusions 715

VI: HYPERSPECTRAL SIGNAL CODING 717

24 BINARY CODING FOR SPECTRAL SIGNATURES 719

24.1 Introduction 719

24.2 Binary Coding 720

24.2.1 SPAM Binary Coding 720

24.2.2 Median Partition Binary Coding 721

24.2.3 Halfway Partition Binary Coding 722

24.2.4 Equal Probability Partition Binary Coding 722

24.3 Spectral Feature-Based Coding 723

24.4 Experiments 725

24.4.1 Computer Simulations 725

24.4.2 Real Hyperspectral Image Data 730

24.5 Conclusions 740

25 VECTOR CODING FOR HYPERSPECTRAL SIGNATURES 741

25.1 Introduction 741

25.2 Spectral Derivative Feature Coding 743

25.2.1 Re-interpretation of SPAM and SFBC 743

25.2.2 Spectral Derivative Feature Coding 744

25.2.3 AVIRIS Data Experiments 746

25.2.3.1 Signature Discrimination 747

25.2.3.2 Mixed Signature Classification 748

25.2.4 NIST Gas Data Experiments 749

25.2.4.1 Signature Discrimination 750

25.2.4.2 Mixed Signature Classification 751

25.3 Spectral Feature Probabilistic Coding 755

25.3.1 Arithmetic Coding 755

25.3.2 Spectral Feature Probabilistic Coding 756

25.3.3 AVIRIS Data Experiments 758

25.3.4 NIST Gas Data Experiments 760

25.4 Real Image Experiments 764

25.4.1 SDFC 764

25.4.2 SFPC 766

25.5 Conclusions 771

26 PROGRESSIVE CODING FOR SPECTRAL SIGNATURES 772

26.1 Introduction 772

26.2 Multistage Pulse Code Modulation 774

26.3 MPCM-Based Progressive Spectral Signature Coding 783

26.3.1 Spectral Discrimination 784

26.3.2 Spectral Identification 785

26.4 NIST-GAS Data Experiments 786

26.5 Real Image Hyperspectral Experiments 790

26.6 Conclusions 796

VII: HYPERSPECTRAL SIGNAL CHARACTERIZATION 797

27 VARIABLE-NUMBER VARIABLE-BAND SELECTION FOR HYPERSPECTRAL SIGNALS 799

27.1 Introduction 799

27.2 Orthogonal Subspace Projection-Based Band Prioritization Criterion 801

27.3 Variable-Number Variable-Band Selection 803

27.4 Experiments 806

27.4.1 Hyperspectral Data 806

27.4.1.1 Signature Discrimination 806

27.4.1.2 Signature Classification/Identification 809

27.4.1.3 Noise Effect on VNVBS 811

27.4.2 NIST-Gas Data 813

27.4.2.1 Signature Discrimination 813

27.4.2.2 Signature Classification/Identification 814

27.4.2.3 Signature Discrimination between Two Signatures with Different Numbers of Bands 816

27.5 Selection of Reference Signatures 819

27.6 Conclusions 819

28 KALMAN FILTER-BASED ESTIMATION FOR HYPERSPECTRAL SIGNALS 820

28.1 Introduction 820

28.2 Kalman Filter-Based Linear Unmixing 822

28.3 Kalman Filter-Based Spectral Characterization Signal-Processing Techniques 824

28.3.1 Kalman Filter-based Spectral Signature Estimator 825

28.3.2 Kalman Filter-Based Spectral Signature Identifier 826

28.3.3 Kalman Filter-Based Spectral Signature Quantifier 828

28.4 Computer Simulations Using AVIRIS Data 831

28.4.1 KFSSE 831

28.4.2 KFSSI 832

28.4.2.1 Subpixel Target Identification by KFSSI 832

28.4.2.2 Mixed Target Identification by KFSSI 838

28.4.3 KFSSQ 839

28.4.3.1 Subpixel Target Quantification by KFSSQ 839

28.4.3.2 Mixed Target Quantification by KFSSQ 840

28.5 Computer Simulations Using NIST-Gas Data 843

28.5.1 KFSSE 843

28.5.2 KFSSI 843

28.5.2.1 Subpixel Target Identification by KFSSI 843

28.5.2.2 Mixed Target Identification by KFSSI 848

28.5.3 KFSSQ 849

28.5.3.1 Subpixel Target Identification by KFSSQ 849

28.5.3.2 Mixed Target Quantification by KFSSQ 849

28.6 Real Data Experiments 852

28.6.1 KFSSE 852

28.6.2 KFSSI 852

28.6.3 KFSSQ 856

28.7 Conclusions 857

29 WAVELET REPRESENTATION FOR HYPERSPECTRAL SIGNALS 859

29.1 Introduction 859

29.2 Wavelet Analysis 860

29.2.1 Multiscale Approximation 860

29.2.2 Scaling Function 861

29.2.3 Wavelet Function 862

29.3 Wavelet-Based Signature Characterization Algorithm 863

29.3.1 Wavelet-Based Signature Characterization Algorithm for Signature Self-Tuning 863

29.3.2 Wavelet-Based Signature Characterization Algorithm for Signature Self-Correction 866

29.3.3 Signature Self-Discrimination, Classification, and Identification 867

29.4 Synthetic Image-Based Computer Simulations 868

29.4.1 Signature Self-Tuning and Self-Denoising 869

29.4.2 Signature Self-Discrimination, Self-Classification, and Self-Identification 870

29.5 Real Image Experiments 871

29.6 Conclusions 875

VIII: APPLICATIONS 877

30 APPLICATIONS OF TARGET DETECTION 879

30.1 Introduction 879

30.2 Size Estimation of Subpixel Targets 880

30.3 Experiments 881

30.3.1 Synthetic Image Experiments 881

30.3.2 HYDICE Image Experiments 886

30.4 Concealed Target Detection 891

30.5 Computer-Aided Detection and Classification Algorithm for Concealed Targets 892

30.6 Experiments for Concealed Target Detection 893

30.7 Conclusions 895

31 NONLINEAR DIMENSIONALITY EXPANSION TO MULTISPECTRAL IMAGERY 897

31.1 Introduction 897

31.2 Band Dimensionality Expansion 899

31.2.1 Rationale for Developing BDE 899

31.2.2 Band Expansion Process 901

31.3 Hyperspectral Imaging Techniques Expanded by BDE 902

31.3.1 BEP-Based Orthogonal Subspace Projection 903

31.3.2 BEP-Based Constrained Energy Minimization 903

31.3.3 BEP-Based RX-Detector 903

31.4 Feature Dimensionality Expansion by Nonlinear Kernels 904

31.4.1 FDE by Transformation 905

31.4.2 FDE by Classification 907

31.4.2.1 FDE by Classification using Sample Spectral Correlation 907

31.4.2.2 FDE by Classification using Intrapixel Spectral Correlation 908

31.5 BDE in Conjunction with FDE 909

31.6 Multispectral Image Experiments 909

31.7 Conclusion 918

32 MULTISPECTRAL MAGNETIC RESONANCE IMAGING 920

32.1 Introduction 920

32.2 Linear Spectral Mixture Analysis for MRI 923

32.2.1 Orthogonal Subspace Projection to MRI 925

32.2.2 Band Expansion Process-Based OSP 927

32.2.3 Unsupervised Orthogonal Subspace Projection 928

32.3 Linear Spectral Random Mixture Analysis for MRI 928

32.3.1 Source Separation-Based OC-ICA for MR Image Analysis 930

32.3.2 Band Expansion Process Overcomplete ICA for MR Image Analysis 931

32.3.2.1 Eigenvector-Prioritized ICA 931

32.3.2.2 High-Order Statistics-Based PICA 932

32.3.2.3 ATGP-Prioritized PCA 932

32.4 Kernel-Based Linear Spectral Mixture Analysis 933

32.5 Synthetic MR Brain Image Experiments 933

32.6 Real MR Brain Image Experiments 951

32.7 Conclusions 955

33 CONCLUSIONS 956

33.1 Design Principles for Nonliteral Hyperspectral Imaging Techniques 956

33.1.1 Pigeon-Hole Principle 956

33.1.1.1 Multispectral Imagery Versus Hyperspectral Imagery 957

33.1.1.2 Virtual Dimensionality 957

33.1.2 Principle of Orthogonality 963

33.2 Endmember Extraction 964

33.3 Linear Spectral Mixture Analysis 970

33.3.1 Supervised LSMA 970

33.3.2 Unsupervised LSMA 973

33.4 Anomaly Detection 974

33.5 Support Vector Machines and Kernel-Based Approaches 977

33.6 Hyperspectral Compression 981

33.7 Hyperspectral Signal Processing 984

33.7.1 Signal Coding 986

33.7.2 Signal Estimation 986

33.8 Applications 987

33.9 Further Topics 987

33.9.1 Causal Processing 987

33.9.2 Real-Time Processing 988

33.9.3 FPGA Designs for Hardware Implementation 989

33.9.4 Parallel Processing 990

33.9.5 Progressive Hyperspectral Processing 990

GLOSSARY 993

APPENDIX: ALGORITHM COMPENDIUM 997

REFERENCES 1052

INDEX 1071


