Detection Estimation and Modulation Theory, Part I

by Harry L. Van Trees

  • Edition: 2nd
  • Format: eBook
  • Copyright: 2013-04-26
  • Publisher: Wiley



Originally published in 1968, Harry Van Trees’s Detection, Estimation, and Modulation Theory, Part I is one of the great time-tested classics in the field of signal processing. Highly readable and practically organized, it is as essential today for professionals, researchers, and students in optimum signal processing as it was over thirty years ago. The second edition is a thorough revision and expansion: it nearly doubles the size of the first edition and accounts for developments since its publication, making it once again the most comprehensive and up-to-date treatment of the subject.

With applications spanning radar, sonar, communications, seismology, biomedical engineering, and radar astronomy, among others, the important field of detection and estimation has rarely received such expert treatment as it does here. Each chapter includes section summaries, realistic examples, and a large number of challenging problems that provide excellent study material. This volume, Part I of a four-volume set, is the most important and widely used textbook and professional reference in the field.

Table of Contents

1 Introduction

1.1 Introduction

1.2 Topical Outline

1.3 Possible Approaches

1.4 Organization

2 Classic Detection Theory

2.1 Introduction

2.2 Simple Binary Hypothesis Tests

2.2.1 Decision Criteria

2.2.2 Performance: Receiver Operating Characteristic

2.3 M Hypotheses

2.4 Performance Bounds and Approximations

2.5 Monte Carlo Simulation

2.5.1 Monte Carlo Simulation Techniques

2.5.2 Importance Sampling: Simulation of PF; Simulation of PM; Independent Observations; Simulation of the ROC; Examples; Iterative Importance Sampling

2.5.3 Summary

2.6 Summary

2.7 Problems

3 General Gaussian Detection

3.1 Detection of Gaussian Random Vectors

3.1.1 Real Gaussian Random Vectors

3.1.2 Circular Complex Gaussian Random Vectors

3.1.3 General Gaussian Detection: Real Gaussian Vectors; Circular Complex Gaussian Vectors

3.2 Equal Covariance Matrices

3.2.1 Independent Components with Equal Variance

3.2.2 Independent Components with Unequal Variances

3.2.3 General Case: Eigendecomposition

3.2.4 Optimum Signal Design

3.2.5 Interference Matrix: Estimator-Subtractor

3.2.6 Low-Rank Models

3.2.7 Summary

3.3 Equal Mean Vectors

3.3.1 Diagonal Covariance Matrix on H0, Equal Variance: Independent, Identically Distributed Signal Components; Independent Signal Components, Unequal Variances; Correlated Signal Components; Low-Rank Signal Model; Symmetric Hypotheses, Uncorrelated Noise

3.3.2 Non-diagonal Covariance Matrix on H0: Signal on H1 Only; Signal on Both Hypotheses

3.3.3 Summary

3.4 General Gaussian

3.4.1 Real Gaussian Model

3.4.2 Circular Complex Gaussian Model

3.4.3 Single Quadratic Form

3.4.4 Summary

3.5 M Hypotheses

3.6 Summary

3.7 Problems

4 Classical Parameter Estimation

4.1 Introduction

4.2 Scalar Parameter Estimation

4.2.1 Random Parameters: Bayes Estimation

4.2.2 Nonrandom Parameter Estimation

4.2.3 Bayesian Bounds: Lower Bound on the MSE; Asymptotic Behavior

4.2.4 Case Study

4.2.5 Exponential Family: Nonrandom Parameters; Random Parameters

4.2.6 Summary of Scalar Parameter Estimation

4.3 Multiple Parameter Estimation

4.3.1 Estimation Procedures: Random Parameters; Nonrandom Parameters

4.3.2 Measures of Error: Nonrandom Parameters; Random Parameters

4.3.3 Bounds on Estimation Error: Nonrandom Parameters; Random Parameters

4.3.4 Exponential Family: Nonrandom Parameters; Random Parameters

4.3.5 Nuisance Parameters: Nonrandom Parameters; Random Parameters; Hybrid Parameters

4.3.6 Hybrid Parameters: Joint ML and MAP Estimation; Nuisance Parameters

4.3.7 Summary of Multiple Parameter Estimation

4.4 Global Bayesian Bounds

4.4.1 Covariance Inequality Bounds: Covariance Inequality; Bayesian Bounds; Scalar Parameters; Vector Parameters; Combined Bayesian Bounds; Functions of the Parameter Vector; Summary of Covariance Inequality Bounds

4.4.2 Method of Interval Estimation

4.4.3 Summary of Global Bayesian Bounds

4.5 Composite Hypotheses

4.5.1 Introduction

4.5.2 Random Parameters

4.5.3 Nonrandom Parameters

4.5.4 Simulation

4.5.5 Summary of Composite Hypotheses

4.6 Summary

4.7 Problems

5 General Gaussian Estimation

5.1 Introduction

5.2 Nonrandom Parameters

5.2.1 General Gaussian Estimation Model

5.2.2 Maximum Likelihood Estimation

5.2.3 Cramér-Rao Bound

5.2.4 Fisher Linear Gaussian Model: Introduction; White Noise; Low-Rank Interference

5.2.5 Separable Models for Mean Parameters

5.2.6 Covariance Matrix Parameters: White Noise; Colored Noise; Rank One Signal Matrix Plus White Noise; Rank One Signal Matrix Plus Colored Noise

5.2.7 Linear Gaussian Mean and Covariance Matrix Parameters: White Noise; Colored Noise; General Covariance Matrix

5.2.8 Computational Algorithms: Introduction; Gradient Techniques; Alternating Projection Algorithm; Expectation Maximization Algorithm; Summary

5.2.9 Equivalent Estimation Algorithms: Least Squares; Minimum Variance Distortionless Response; Summary

5.2.10 Sensitivity, Mismatch, and Diagonal Loading: Sensitivity and Array Perturbations; Diagonal Loading

5.2.11 Summary

5.3 Random Parameters

5.3.1 Model, MAP Estimation, and the BCRB

5.3.2 Bayesian Linear Gaussian Model

5.3.3 Summary

5.4 Sequential Estimation

5.4.1 Sequential Bayes Estimation

5.4.2 Recursive Maximum Likelihood

5.4.3 Summary

5.5 Summary

5.6 Problems

6 Representation of Random Processes

6.1 Introduction

6.2 Orthonormal Expansions: Deterministic Signals

6.3 Random Process Characterization

6.3.1 Random Processes: Conventional Characterizations

6.3.2 Series Representation of Sample Functions of Random Processes

6.3.3 Gaussian Processes

6.4 Homogeneous Integral Equations and Eigenfunctions

6.4.1 Rational Spectra

6.4.2 Bandlimited Spectra

6.4.3 Nonstationary Processes

6.4.4 White Noise Processes

6.4.5 Low Rank Kernels

6.4.6 The Optimum Linear Filter

6.4.7 Properties of Eigenfunctions and Eigenvalues: Monotonic Property; Asymptotic Behavior Properties

6.5 Vector Random Processes

6.6 Summary

6.7 Problems

7 Detection of Signals – Estimation of Signal Parameters

7.1 Introduction

7.1.1 Models: Detection; Estimation

7.1.2 Format

7.2 Detection and Estimation in White Gaussian Noise

7.2.1 Detection of Signals in Additive White Gaussian Noise: Simple Binary Detection; General Binary Detection in White Gaussian Noise; M-ary Detection in White Gaussian Noise; Sensitivity

7.2.2 Linear Estimation

7.2.3 Nonlinear Estimation

7.2.4 Summary of Known Signals in White Gaussian Noise: Detection; Estimation

7.3 Detection and Estimation in Nonwhite Gaussian Noise

7.3.1 “Whitening” Approach: Structures; Construction of Qn(t; u) and g(t); Summary

7.3.2 A Direct Derivation Using the Karhunen-Loève Expansion

7.3.3 A Direct Derivation with a Sufficient Statistic

7.3.4 Detection Performance: Performance, Simple Binary Detection Problem; Optimum Signal Design, Coincident Intervals; Singularity; General Binary Receivers

7.3.5 Estimation

7.3.6 Solution Techniques for Integral Equations: Infinite Observation Interval, Stationary Noise; Finite Observation Interval, Rational Spectra; Finite Observation Time, Separable Kernels

7.3.7 Sensitivity, Mismatch, and Diagonal Loading

7.3.8 Known Linear Channels: Summary

7.4 Signals with Unwanted Parameters: The Composite Hypothesis Problem

7.4.1 Random Phase Angles

7.4.2 Random Amplitude and Phase

7.4.3 Other Target Models

7.4.4 Nonrandom Parameters: Summary

7.5 Multiple Channels

7.5.1 Vector Karhunen-Loève Application

7.6 Multiple Parameter Estimation

7.6.1 Known Signal in Additive White Gaussian Noise

7.6.2 Separable Models

7.6.3 Summary

7.7 Summary

7.8 Problems

8 Estimation of Continuous–Time Random Processes

8.1 Optimum Linear Processes

8.2 Realizable Linear Filters: Stationary Processes, Infinite Past: Wiener Filters

8.2.1 Solution of Wiener-Hopf Equation

8.2.2 Errors in Optimum Systems

8.2.3 Unrealizable Filters

8.2.4 Closed-Form Error Expressions

8.3 Gaussian-Markov Processes: Kalman Filter

8.3.1 Differential Equation Representation of Linear Systems and Random Process Generation

8.3.2 Kalman Filter

8.3.3 Realizable Whitening Filter

8.3.4 Generalizations

8.3.5 Implementation Issues

8.4 Bayesian Estimation of Non-Gaussian Models

8.4.1 The Extended Kalman Filter: Linear AWGN Process and Observations; Linear AWGN Process, Nonlinear AWGN Observations; Nonlinear AWGN Process and Observations; Nonlinear Process and Observations

8.4.2 Bayesian Cramér-Rao Bounds: Continuous-Time

8.4.3 Summary

8.5 Summary

8.6 Problems

9 Estimation of Discrete–Time Random Processes

9.1 Introduction

9.2 Discrete-time Wiener Filtering

9.2.1 Model

9.2.2 Random Process Models

9.2.3 Optimum FIR Filters

9.2.4 Unrealizable IIR Wiener Filters

9.2.5 Realizable IIR Wiener Filters

9.2.6 Summary: Discrete-time Wiener Filter

9.3 Discrete-time Kalman filter

9.3.1 Random process models

9.3.2 Kalman Filter Derivation: Reduced Dimension Implementations; Applications; Estimation in Non-white Noise; Sequential Processing of Estimators; Square-root Filters; Divergence; Sensitivity and Model Mismatch; Summary: Kalman Filters

9.3.3 Kalman Predictors: Fixed-Lead Prediction; Fixed-Point Prediction; Fixed-Interval Prediction; Summary: Kalman Predictors

9.3.4 Kalman Smoothing: Fixed Interval Smoothing; Fixed Lag Smoothing; Summary: Kalman Smoothing

9.3.5 Bayesian Estimation of Nonlinear Models: General Nonlinear Model, MMSE and MAP Estimation; Extended Kalman Filter; Recursive Bayesian Cramér-Rao Bounds; Applications; Joint State and Parameter Estimation; Continuous-Time Processes and Discrete-Time Observations; Summary

9.3.6 Summary: Kalman Filters

9.4 Summary

9.5 Problems

10 Detection of Gaussian Signals

10.1 Introduction

10.2 Detection of Continuous-time Gaussian Processes

10.2.1 Sampling

10.2.2 Optimum Continuous-Time Receivers

10.2.3 Performance of Optimum Receivers

10.2.4 State-Variable Realization

10.2.5 Stationary Process – Long Observation Time (SPLOT) Receiver

10.2.6 Low-rank Kernels

10.2.7 Summary

10.3 Detection of Discrete-Time Gaussian Processes

10.3.1 Second Moment Characterization: Known Means and Covariance Matrices; Means and Covariance Matrices with Unknown Parameters

10.3.2 State Variable Characterization

10.3.3 Summary

10.4 Summary

10.5 Problems

11 Epilogue

11.1 Classical Detection and Estimation Theory

11.1.1 Classical Detection Theory

11.1.2 General Gaussian Detection

11.1.3 Classical Parameter Estimation

11.1.4 General Gaussian Estimation

11.2 Representation of Random Processes

11.3 Detection of Signals and Estimation of Signal Parameters

11.4 Linear Estimation of Random Processes

11.5 Observations

11.5.1 Models and Mismatch

11.5.2 Bayes vis-à-vis Fisher

11.5.3 Bayesian and Fisher Bounds

11.5.4 Eigenspace

11.5.5 Whitening

11.5.6 The Gaussian Model

11.6 Conclusion
