
Detection Estimation and Modulation Theory, Part I

by Harry L. Van Trees; Kristine L. Bell; Zhi Tian
Edition:
2nd
ISBN13:
9780470542965

ISBN10:
0470542969
Format:
Hardcover
Pub. Date:
4/15/2013
Publisher(s):
Wiley
List Price: $117.33

Rent Textbook

(Recommended)
 
Price: $99.73

Buy New Textbook

Currently Available, Usually Ships in 24-48 Hours
$112.46

Used Textbook

We're Sorry
Sold Out

eTextbook

We're Sorry
Not Available

More New and Used from Private Sellers
Starting at $97.49

Questions About This Book?

Why should I rent this book?
Renting is easy, fast, and cheap! Renting from eCampus.com can save you hundreds of dollars compared to the cost of new or used books each semester. At the end of the semester, simply ship the book back to us with a free UPS shipping label! No need to worry about selling it back.
How do rental returns work?
Returning books is as easy as possible. As your rental due date approaches, we will email you several courtesy reminders. When you are ready to return, you can print a free UPS shipping label from our website at any time. Then, just return the book to your UPS driver or any staffed UPS location. You can even use the same box we shipped it in!
What version or edition is this?
This is the 2nd edition with a publication date of 4/15/2013.
What is included with this book?
  • The New copy of this book will include any supplemental materials advertised. Please check the title of the book to determine if it should include any CDs, lab manuals, study guides, etc.
  • The Rental copy of this book is not guaranteed to include any supplemental materials. You may receive a brand new copy, but typically, only the book itself.

Summary

Originally published in 1968, Harry Van Trees's Detection, Estimation, and Modulation Theory, Part I is one of the great time-tested classics in the field of signal processing. Highly readable and practically organized, it is as essential today for professionals, researchers, and students in optimum signal processing as it was over thirty years ago. The second edition is a thorough revision and expansion, almost doubling the size of the first edition and accounting for new developments, making it once again the most comprehensive and up-to-date treatment of the subject. With a wide range of applications such as radar, sonar, communications, seismology, biomedical engineering, and radar astronomy, among others, the important field of detection and estimation has rarely been given such expert treatment as it is here. Each chapter includes section summaries, realistic examples, and a large number of challenging problems that provide excellent study material. This volume, Part I of a set of four, is the most important and widely used textbook and professional reference in the field.

Author Biography

HARRY L. VAN TREES, Sc.D., received his B.S. from the United States Military Academy and his Sc.D. from the Massachusetts Institute of Technology. During his fourteen years as a Professor of Electrical Engineering at MIT, he wrote Parts I, II, and III of the DEMT series. On loan from MIT, he served in four senior DoD positions, including Chief Scientist of the U.S. Air Force and Principal Deputy Assistant Secretary of Defense (C3I). Returning to academia as an endowed professor at George Mason University, he founded the C3I Center and published Part IV of the DEMT series, Optimum Array Processing. He is currently a University Professor Emeritus.

KRISTINE L. BELL, PhD, is a Senior Scientist at Metron, Inc., and an affiliate faculty member in the Statistics Department at George Mason University. She coedited with Dr. Van Trees the Wiley-IEEE book Bayesian Bounds for Parameter Estimation and Nonlinear Filtering/Tracking.

ZHI TIAN, PhD, is a Professor of Electrical and Computer Engineering at Michigan Technological University. She is a Fellow of the IEEE.

Table of Contents

1 Introduction

1.1 Introduction

1.2 Topical Outline

1.3 Possible Approaches

1.4 Organization

2 Classic Detection Theory

2.1 Introduction

2.2 Simple Binary Hypothesis Tests

2.2.1 Decision Criteria

2.2.2 Performance: Receiver Operating Characteristic

2.3 M Hypotheses

2.4 Performance Bounds and Approximations

2.5 Monte Carlo Simulation

2.5.1 Monte Carlo Simulation Techniques

2.5.2 Importance Sampling

2.5.2.1 Simulation of PF

2.5.2.2 Simulation of PM

2.5.2.3 Independent Observations

2.5.2.4 Simulation of the ROC

2.5.2.5 Examples

2.5.2.6 Iterative Importance Sampling

2.5.3 Summary

2.6 Summary

2.7 Problems

3 General Gaussian Detection

3.1 Detection of Gaussian Random Vectors

3.1.1 Real Gaussian Random Vectors

3.1.2 Circular Complex Gaussian Random Vectors

3.1.3 General Gaussian Detection

3.1.3.1 Real Gaussian Vectors

3.1.3.2 Circular Complex Gaussian Vectors

3.2 Equal Covariance Matrices

3.2.1 Independent Components with Equal Variance

3.2.2 Independent Components with Unequal Variances

3.2.3 General Case: Eigendecomposition

3.2.4 Optimum Signal Design

3.2.5 Interference Matrix: Estimator-Subtractor

3.2.6 Low-Rank Models

3.2.7 Summary

3.3 Equal Mean Vectors

3.3.1 Diagonal Covariance Matrix on H0: Equal Variance

3.3.1.1 Independent, Identically Distributed Signal Components

3.3.1.2 Independent Signal Components: Unequal Variances

3.3.1.3 Correlated Signal Components

3.3.1.4 Low-Rank Signal Model

3.3.1.5 Symmetric Hypotheses, Uncorrelated Noise

3.3.2 Non-diagonal Covariance Matrix on H0

3.3.2.1 Signal on H1 Only

3.3.2.2 Signal on Both Hypotheses

3.3.3 Summary

3.4 General Gaussian

3.4.1 Real Gaussian Model

3.4.2 Circular Complex Gaussian Model

3.4.3 Single Quadratic Form

3.4.4 Summary

3.5 M Hypotheses

3.6 Summary

3.7 Problems

4 Classical Parameter Estimation

4.1 Introduction

4.2 Scalar Parameter Estimation

4.2.1 Random Parameters: Bayes Estimation

4.2.2 Nonrandom Parameter Estimation

4.2.3 Bayesian Bounds

4.2.3.1 Lower Bound on the MSE

4.2.3.2 Asymptotic Behavior

4.2.4 Case Study

4.2.5 Exponential Family

4.2.5.1 Nonrandom Parameters

4.2.5.2 Random Parameters

4.2.6 Summary of Scalar Parameter Estimation

4.3 Multiple Parameter Estimation

4.3.1 Estimation Procedures

4.3.1.1 Random Parameter

4.3.1.2 Nonrandom Parameters

4.3.2 Measures of Error

4.3.2.1 Nonrandom Parameters

4.3.2.2 Random Parameters

4.3.3 Bounds on Estimation Error

4.3.3.1 Nonrandom Parameters

4.3.3.2 Random Parameters

4.3.4 Exponential Family

4.3.4.1 Nonrandom Parameters

4.3.4.2 Random Parameters

4.3.5 Nuisance Parameters

4.3.5.1 Nonrandom Parameters

4.3.5.2 Random Parameters

4.3.5.3 Hybrid Parameters

4.3.6 Hybrid Parameters

4.3.6.1 Joint ML and MAP Estimation

4.3.6.2 Nuisance Parameters

4.3.7 Summary of Multiple Parameter Estimation

4.4 Global Bayesian Bounds

4.4.1 Covariance Inequality Bounds

4.4.1.1 Covariance Inequality

4.4.1.2 Bayesian Bounds

4.4.1.3 Scalar Parameters

4.4.1.4 Vector Parameters

4.4.1.5 Combined Bayesian Bounds

4.4.1.6 Functions of the Parameter Vector

4.4.1.7 Summary of Covariance Inequality Bounds

4.4.2 Method of Interval Estimation

4.4.3 Summary of Global Bayesian Bounds

4.5 Composite Hypotheses

4.5.1 Introduction

4.5.2 Random Parameters

4.5.3 Nonrandom Parameters

4.5.4 Simulation

4.5.5 Summary of Composite Hypotheses

4.6 Summary

4.7 Problems

5 General Gaussian Estimation

5.1 Introduction

5.2 Nonrandom Parameters

5.2.1 General Gaussian Estimation Model

5.2.2 Maximum Likelihood Estimation

5.2.3 Cramér-Rao Bound

5.2.4 Fisher Linear Gaussian Model

5.2.4.1 Introduction

5.2.4.2 White Noise

5.2.4.3 Low-Rank Interference

5.2.5 Separable Models for Mean Parameters

5.2.6 Covariance Matrix Parameters

5.2.6.1 White Noise

5.2.6.2 Colored Noise

5.2.6.3 Rank One Signal Matrix Plus White Noise

5.2.6.4 Rank One Signal Matrix Plus Colored Noise

5.2.7 Linear Gaussian Mean and Covariance Matrix Parameters

5.2.7.1 White Noise

5.2.7.2 Colored Noise

5.2.7.3 General Covariance Matrix

5.2.8 Computational Algorithms

5.2.8.1 Introduction

5.2.8.2 Gradient Techniques

5.2.8.3 Alternating Projection Algorithm

5.2.8.4 Expectation Maximization Algorithm

5.2.8.5 Summary

5.2.9 Equivalent Estimation Algorithms

5.2.9.1 Least Squares

5.2.9.2 Minimum Variance Distortionless Response

5.2.9.3 Summary

5.2.10 Sensitivity, Mismatch, and Diagonal Loading

5.2.10.1 Sensitivity and Array Perturbations

5.2.10.2 Diagonal Loading

5.2.11 Summary

5.3 Random Parameters

5.3.1 Model, MAP Estimation, and the BCRB

5.3.2 Bayesian Linear Gaussian Model

5.3.3 Summary

5.4 Sequential Estimation

5.4.1 Sequential Bayes Estimation

5.4.2 Recursive Maximum Likelihood

5.4.3 Summary

5.5 Summary

5.6 Problems

6 Representation of Random Processes

6.1 Introduction

6.2 Orthonormal Expansions: Deterministic Signals

6.3 Random Process Characterization

6.3.1 Random Processes: Conventional Characterizations

6.3.2 Series Representation of Sample Functions of Random Processes

6.3.3 Gaussian Processes

6.4 Homogeneous Integral Equations and Eigenfunctions

6.4.1 Rational Spectra

6.4.2 Bandlimited Spectra

6.4.3 Nonstationary Processes

6.4.4 White Noise Processes

6.4.5 Low Rank Kernels

6.4.6 The Optimum Linear Filter

6.4.7 Properties of Eigenfunctions and Eigenvalues

6.4.7.1 Monotonic property

6.4.7.2 Asymptotic behavior properties

6.5 Vector Random Processes

6.6 Summary

6.7 Problems

7 Detection of Signals – Estimation of Signal Parameters

7.1 Introduction

7.1.1 Models

7.1.1.1 Detection

7.1.1.2 Estimation

7.1.2 Format

7.2 Detection and Estimation in White Gaussian Noise

7.2.1 Detection of Signals in Additive White Gaussian Noise

7.2.1.1 Simple binary detection

7.2.1.2 General binary detection in white Gaussian noise

7.2.1.3 M-ary detection in white Gaussian noise

7.2.1.4 Sensitivity

7.2.2 Linear Estimation

7.2.3 Nonlinear Estimation

7.2.4 Summary of Known Signals in White Gaussian Noise

7.2.4.1 Detection

7.2.4.2 Estimation

7.3 Detection and Estimation in Nonwhite Gaussian Noise

7.3.1 “Whitening” Approach

7.3.1.1 Structures

7.3.1.2 Construction of Qn(t; u) and g(t)

7.3.1.3 Summary

7.3.2 A Direct Derivation Using the Karhunen-Loève Expansion

7.3.3 A Direct Derivation with a Sufficient Statistic

7.3.4 Detection Performance

7.3.4.1 Performance: Simple binary detection problem

7.3.4.2 Optimum signal design: Coincident intervals

7.3.4.3 Singularity

7.3.4.4 General binary receivers

7.3.5 Estimation

7.3.6 Solution Techniques for Integral Equations

7.3.6.1 Infinite observation interval: Stationary noise

7.3.6.2 Finite observation interval: Rational spectra

7.3.6.3 Finite observation time: Separable kernels

7.3.7 Sensitivity, Mismatch, and Diagonal Loading

7.3.7.1 Sensitivity

7.3.7.2 Mismatch and diagonal loading

7.3.8 Known Linear Channels

7.3.8.1 Summary

7.4 Signals with Unwanted Parameters: The Composite Hypothesis Problem

7.4.1 Random Phase Angles

7.4.2 Random Amplitude and Phase

7.4.3 Other Target Models

7.4.4 Nonrandom Parameters

7.4.4.1 Summary

7.5 Multiple Channels

7.5.1 Vector Karhunen-Loève

7.5.1.1 Application

7.6 Multiple Parameter Estimation

7.6.1 Known Signal in Additive White Gaussian Noise

7.6.2 Separable Models

7.6.3 Summary

7.7 Summary

7.8 Problems

8 Estimation of Continuous-Time Random Processes

8.1 Optimum Linear Processes

8.2 Realizable Linear Filters: Stationary Processes, Infinite Past: Wiener Filters

8.2.1 Solution of Wiener-Hopf Equation

8.2.2 Errors in Optimum Systems

8.2.3 Unrealizable Filters

8.2.4 Closed-Form Error Expressions

8.3 Gaussian-Markov Processes: Kalman Filter

8.3.1 Differential Equation Representation of Linear Systems and Random Process Generation

8.3.2 Kalman Filter

8.3.3 Realizable Whitening Filter

8.3.4 Generalizations

8.3.5 Implementation Issues

8.4 Bayesian Estimation of Non-Gaussian Models

8.4.1 The Extended Kalman Filter

8.4.1.1 Linear AWGN process and observations

8.4.1.2 Linear AWGN process, nonlinear AWGN observations

8.4.1.3 Nonlinear AWGN process and observations

8.4.1.4 Nonlinear process and observations

8.4.2 Bayesian Cramér-Rao Bounds: Continuous-Time

8.4.3 Summary

8.5 Summary

8.6 Problems

9 Estimation of Discrete-Time Random Processes

9.1 Introduction

9.2 Discrete-time Wiener Filtering

9.2.1 Model

9.2.2 Random Process Models

9.2.3 Optimum FIR Filters

9.2.4 Unrealizable IIR Wiener Filters

9.2.5 Realizable IIR Wiener Filters

9.2.6 Summary: Discrete-time Wiener Filter

9.3 Discrete-time Kalman Filter

9.3.1 Random process models

9.3.2 Kalman Filter

9.3.2.1 Derivation

9.3.2.2 Reduced Dimension Implementations

9.3.2.3 Applications

9.3.2.4 Estimation in Non-white Noise

9.3.2.5 Sequential Processing of Estimators

9.3.2.6 Square-root Filters

9.3.2.7 Divergence

9.3.2.8 Sensitivity and Model Mismatch

9.3.2.9 Summary: Kalman Filters

9.3.3 Kalman Predictors

9.3.3.1 Fixed-lead prediction

9.3.3.2 Fixed-point prediction

9.3.3.3 Fixed-Interval Prediction

9.3.3.4 Summary: Kalman Predictors

9.3.4 Kalman Smoothing

9.3.4.1 Fixed Interval Smoothing

9.3.4.2 Fixed Lag Smoothing

9.3.4.3 Summary: Kalman Smoothing

9.3.5 Bayesian Estimation of Nonlinear Models

9.3.5.1 General Nonlinear Model: MMSE and MAP Estimation

9.3.5.2 Extended Kalman Filter

9.3.5.3 Recursive Bayesian Cramér-Rao Bounds

9.3.5.4 Applications

9.3.5.5 Joint State And Parameter Estimation

9.3.5.6 Continuous-Time Processes and Discrete-Time Observations

9.3.5.7 Summary

9.3.6 Summary: Kalman Filters

9.4 Summary

9.5 Problems

10 Detection of Gaussian Signals

10.1 Introduction

10.2 Detection of Continuous-time Gaussian Processes

10.2.1 Sampling

10.2.2 Optimum Continuous-Time Receivers

10.2.3 Performance of Optimum Receivers

10.2.4 State-Variable Realization

10.2.5 Stationary Process-Long Observation Time (SPLOT) Receiver

10.2.6 Low-rank Kernels

10.2.7 Summary

10.3 Detection of Discrete-Time Gaussian Processes

10.3.1 Second Moment Characterization

10.3.1.1 Known means and covariance matrices

10.3.1.2 Means and covariance matrices with unknown parameters

10.3.2 State Variable Characterization

10.3.3 Summary

10.4 Summary

10.5 Problems

11 Epilogue

11.1 Classical Detection and Estimation Theory

11.1.1 Classical Detection Theory

11.1.2 General Gaussian Detection

11.1.3 Classical Parameter Estimation

11.1.4 General Gaussian Estimation

11.2 Representation of Random Processes

11.3 Detection of Signals and Estimation of Signal Parameters

11.4 Linear Estimation of Random Processes

11.5 Observations

11.5.1 Models and Mismatch

11.5.2 Bayes vis-a-vis Fisher

11.5.3 Bayesian and Fisher Bounds

11.5.4 Eigenspace

11.5.5 Whitening

11.5.6 The Gaussian Model

11.6 Conclusion


