Amazon no longer offers textbook rentals. We do!

We're the #1 textbook rental company. Let us show you why.

Probability, Random Variables, and Random Processes: Theory and Signal Processing Applications

by John J. Shynk
  • ISBN13:

    9780470242094

  • ISBN10:

    0470242094

  • Edition: 1st
  • Format: Hardcover
  • Copyright: 2012-11-06
  • Publisher: Wiley-Interscience
  • Purchase Benefits
List Price: $177.01 Save up to $0.89
  • Buy New
    $176.12
    Free Shipping

    PRINT ON DEMAND: 2-4 WEEKS. THIS ITEM CANNOT BE CANCELLED OR RETURNED.

Summary

Probability is ubiquitous in every branch of science and engineering. This text on probability and random processes assumes basic prior knowledge of the subject at the undergraduate level. Targeted at first- and second-year graduate students in engineering, the book develops a more rigorous understanding of probability, via measure theory and fields, and of random processes, with extensive coverage of correlation and its applications. It also provides the background needed for the study of topics such as digital communications, information theory, adaptive filtering, linear and nonlinear estimation and detection, and more.
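As a flavor of the statistical signal processing the book covers (see, e.g., Section 1.3 on random number generation and Chapter 8 on correlation), here is a minimal sketch, not taken from the book and written in Python/NumPy rather than the book's MATLAB, that estimates the autocorrelation of a white Gaussian random sequence:

```python
import numpy as np

# Sketch (not from the book): generate an i.i.d. N(0, 1) sequence and
# estimate its autocorrelation sequence r[k] ≈ E{x[n] x[n+k]}.
rng = np.random.default_rng(0)
x = rng.standard_normal(10_000)

def autocorr(x, max_lag):
    """Biased sample autocorrelation estimate for lags 0..max_lag."""
    n = len(x)
    return np.array([np.dot(x[:n - k], x[k:]) / n for k in range(max_lag + 1)])

r = autocorr(x, 5)
# For white noise, r[0] approximates the variance (here 1), while
# r[k] for k > 0 is near zero, reflecting the lack of correlation.
```

The same experiment in MATLAB would use `randn` and `xcorr`, as discussed in the text's Section 1.3.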

Author Biography

JOHN J. SHYNK, PhD, is Professor of Electrical and Computer Engineering at the University of California, Santa Barbara. He was a Member of Technical Staff at Bell Laboratories, and received degrees in systems engineering, electrical engineering, and statistics from Boston University and Stanford University.

Table of Contents

PREFACE xxi

NOTATION xxv

1 Overview and Background 1

1.1 Introduction 1

1.1.1 Signals, Signal Processing, and Communications 3

1.1.2 Probability, Random Variables, and Random Vectors 9

1.1.3 Random Sequences and Random Processes 11

1.1.4 Delta Functions 16

1.2 Deterministic Signals and Systems 19

1.2.1 Continuous Time 20

1.2.2 Discrete Time 25

1.2.3 Discrete-Time Filters 29

1.2.4 State-Space Realizations 32

1.3 Statistical Signal Processing with MATLAB® 35

1.3.1 Random Number Generation 35

1.3.2 Filtering 38

Problems 39

Further Reading 45

PART I Probability, Random Variables, and Expectation

2 Probability Theory 49

2.1 Introduction 49

2.2 Sets and Sample Spaces 50

2.3 Set Operations 54

2.4 Events and Fields 58

2.5 Summary of a Random Experiment 64

2.6 Measure Theory 64

2.7 Axioms of Probability 68

2.8 Basic Probability Results 69

2.9 Conditional Probability 71

2.10 Independence 73

2.11 Bayes’ Formula 74

2.12 Total Probability 76

2.13 Discrete Sample Spaces 79

2.14 Continuous Sample Spaces 83

2.15 Nonmeasurable Subsets of R 84

Problems 87

Further Reading 90

3 Random Variables 91

3.1 Introduction 91

3.2 Functions and Mappings 91

3.3 Distribution Function 96

3.4 Probability Mass Function 101

3.5 Probability Density Function 103

3.6 Mixed Distributions 104

3.7 Parametric Models for Random Variables 107

3.8 Continuous Random Variables 109

3.8.1 Gaussian Random Variable (Normal) 110

3.8.2 Log-Normal Random Variable 113

3.8.3 Inverse Gaussian Random Variable (Wald) 114

3.8.4 Exponential Random Variable (One-Sided) 116

3.8.5 Laplace Random Variable (Double-Sided Exponential) 119

3.8.6 Cauchy Random Variable 122

3.8.7 Continuous Uniform Random Variable 124

3.8.8 Triangular Random Variable 125

3.8.9 Rayleigh Random Variable 127

3.8.10 Rice Random Variable 129

3.8.11 Gamma Random Variable (Erlang for r ∈ N) 131

3.8.12 Beta Random Variable (Arcsine for α = β = 1/2, Power Function for β = 1) 133

3.8.13 Pareto Random Variable 136

3.8.14 Weibull Random Variable 137

3.8.15 Logistic Random Variable (Sigmoid for {μ = 0, α = 1}) 139

3.8.16 Chi Random Variable (Maxwell–Boltzmann, Half-Normal) 141

3.8.17 Chi-Square Random Variable 144

3.8.18 F-Distribution 147

3.8.19 Student’s t Distribution 149

3.8.20 Extreme Value Distribution (Type I: Gumbel) 150

3.9 Discrete Random Variables 151

3.9.1 Bernoulli Random Variable 152

3.9.2 Binomial Random Variable 154

3.9.3 Geometric Random Variable (with Support Z+ or N) 157

3.9.4 Negative Binomial Random Variable (Pascal) 160

3.9.5 Poisson Random Variable 162

3.9.6 Hypergeometric Random Variable 165

3.9.7 Discrete Uniform Random Variable 167

3.9.8 Logarithmic Random Variable (Log-Series) 168

3.9.9 Zeta Random Variable (Zipf) 170

Problems 173

Further Reading 176

4 Multiple Random Variables 177

4.1 Introduction 177

4.2 Random Variable Approximations 177

4.2.1 Binomial Approximation of Hypergeometric 177

4.2.2 Poisson Approximation of Binomial 179

4.2.3 Gaussian Approximations 181

4.2.4 Gaussian Approximation of Binomial 181

4.2.5 Gaussian Approximation of Poisson 181

4.2.6 Gaussian Approximation of Hypergeometric 183

4.3 Joint and Marginal Distributions 183

4.4 Independent Random Variables 186

4.5 Conditional Distribution 187

4.6 Random Vectors 190

4.6.1 Bivariate Uniform Distribution 193

4.6.2 Multivariate Gaussian Distribution 193

4.6.3 Multivariate Student’s t Distribution 196

4.6.4 Multinomial Distribution 197

4.6.5 Multivariate Hypergeometric Distribution 198

4.6.6 Bivariate Exponential Distributions 200

4.7 Generating Dependent Random Variables 201

4.8 Random Variable Transformations 205

4.8.1 Transformations of Discrete Random Variables 205

4.8.2 Transformations of Continuous Random Variables 207

4.9 Important Functions of Two Random Variables 218

4.9.1 Sum: Z = X + Y 218

4.9.2 Difference: Z = X − Y 220

4.9.3 Product: Z = XY 221

4.9.4 Quotient (Ratio): Z = X/Y 224

4.10 Transformations of Random Variable Families 226

4.10.1 Gaussian Transformations 226

4.10.2 Exponential Transformations 227

4.10.3 Chi-Square Transformations 228

4.11 Transformations of Random Vectors 229

4.12 Sample Mean X̄ and Sample Variance S² 232

4.13 Minimum, Maximum, and Order Statistics 234

4.14 Mixtures 238

Problems 240

Further Reading 243

5 Expectation and Moments 244

5.1 Introduction 244

5.2 Expectation and Integration 244

5.3 Indicator Random Variable 245

5.4 Simple Random Variable 246

5.5 Expectation for Discrete Sample Spaces 247

5.6 Expectation for Continuous Sample Spaces 250

5.7 Summary of Expectation 253

5.8 Functional View of the Mean 254

5.9 Properties of Expectation 255

5.10 Expectation of a Function 259

5.11 Characteristic Function 260

5.12 Conditional Expectation 265

5.13 Properties of Conditional Expectation 267

5.14 Location Parameters: Mean, Median, and Mode 276

5.15 Variance, Covariance, and Correlation 280

5.16 Functional View of the Variance 283

5.17 Expectation and the Indicator Function 284

5.18 Correlation Coefficients 285

5.19 Orthogonality 291

5.20 Correlation and Covariance Matrices 294

5.21 Higher Order Moments and Cumulants 296

5.22 Functional View of Skewness 302

5.23 Functional View of Kurtosis 303

5.24 Generating Functions 304

5.25 Fourth-Order Gaussian Moment 309

5.26 Expectations of Nonlinear Transformations 310

Problems 313

Further Reading 316

PART II Random Processes, Systems, and Parameter Estimation

6 Random Processes 319

6.1 Introduction 319

6.2 Characterizations of a Random Process 319

6.3 Consistency and Extension 324

6.4 Types of Random Processes 325

6.5 Stationarity 326

6.6 Independent and Identically Distributed 329

6.7 Independent Increments 331

6.8 Martingales 333

6.9 Markov Sequence 338

6.10 Markov Process 350

6.11 Random Sequences 352

6.11.1 Bernoulli Sequence 352

6.11.2 Bernoulli Scheme 352

6.11.3 Independent Sequences 353

6.11.4 Bernoulli Random Walk 354

6.11.5 Binomial Counting Sequence 356

6.12 Random Processes 359

6.12.1 Poisson Counting Process 359

6.12.2 Random Telegraph Signal 365

6.12.3 Wiener Process 368

6.12.4 Gaussian Process 371

6.12.5 Pulse Amplitude Modulation 372

6.12.6 Random Sine Signals 373

Problems 375

Further Reading 379

7 Stochastic Convergence, Calculus, and Decompositions 380

7.1 Introduction 380

7.2 Stochastic Convergence 380

7.3 Laws of Large Numbers 388

7.4 Central Limit Theorem 390

7.5 Stochastic Continuity 394

7.6 Derivatives and Integrals 404

7.7 Differential Equations 414

7.8 Difference Equations 422

7.9 Innovations and Mean-Square Predictability 423

7.10 Doob–Meyer Decomposition 428

7.11 Karhunen–Loève Expansion 433

Problems 441

Further Reading 444

8 Systems, Noise, and Spectrum Estimation 445

8.1 Introduction 445

8.2 Correlation Revisited 445

8.3 Ergodicity 448

8.4 Eigenfunctions of RXX(τ) 456

8.5 Power Spectral Density 457

8.6 Power Spectral Distribution 463

8.7 Cross-Power Spectral Density 465

8.8 Systems with Random Inputs 468

8.8.1 Nonlinear Systems 469

8.8.2 Linear Systems 471

8.9 Passband Signals 476

8.10 White Noise 479

8.11 Bandwidth 484

8.12 Spectrum Estimation 487

8.12.1 Periodogram 487

8.12.2 Smoothed Periodogram 493

8.12.3 Modified Periodogram 497

8.13 Parametric Models 500

8.13.1 Autoregressive Model 500

8.13.2 Moving-Average Model 505

8.13.3 Autoregressive Moving-Average Model 509

8.14 System Identification 513

Problems 515

Further Reading 518

9 Sufficient Statistics and Parameter Estimation 519

9.1 Introduction 519

9.2 Statistics 519

9.3 Sufficient Statistics 520

9.4 Minimal Sufficient Statistic 525

9.5 Exponential Families 528

9.6 Location-Scale Families 533

9.7 Complete Statistic 536

9.8 Rao–Blackwell Theorem 538

9.9 Lehmann–Scheffé Theorem 540

9.10 Bayes Estimation 542

9.11 Mean-Square-Error Estimation 545

9.12 Mean-Absolute-Error Estimation 552

9.13 Orthogonality Condition 553

9.14 Properties of Estimators 555

9.14.1 Unbiased 555

9.14.2 Consistent 557

9.14.3 Efficient 559

9.15 Maximum A Posteriori Estimation 561

9.16 Maximum Likelihood Estimation 567

9.17 Likelihood Ratio Test 569

9.18 Expectation–Maximization Algorithm 570

9.19 Method of Moments 576

9.20 Least-Squares Estimation 577

9.21 Properties of LS Estimators 582

9.21.1 Minimum ξWLS 582

9.21.2 Uniqueness 582

9.21.3 Orthogonality 582

9.21.4 Unbiased 584

9.21.5 Covariance Matrix 584

9.21.6 Efficient: Achieves CRLB 585

9.21.7 BLU Estimator 585

9.22 Best Linear Unbiased Estimation 586

9.23 Properties of BLU Estimators 590

Problems 592

Further Reading 595

A Note on Part III of the Book 595

APPENDICES

Introduction to Appendices 597

A Summaries of Univariate Parametric Distributions 599

A.1 Notation 599

A.2 Further Reading 600

A.3 Continuous Random Variables 601

A.3.1 Beta (Arcsine for α = β = 1/2, Power Function for β = 1) 601

A.3.2 Cauchy 602

A.3.3 Chi 603

A.3.4 Chi-Square 604

A.3.5 Exponential (Shifted by c) 605

A.3.6 Extreme Value (Type I: Gumbel) 606

A.3.7 F-Distribution 607

A.3.8 Gamma (Erlang for r ∈ N with Γ(r) = (r − 1)!) 608

A.3.9 Gaussian (Normal) 609

A.3.10 Half-Normal (Folded Normal) 610

A.3.11 Inverse Gaussian (Wald) 611

A.3.12 Laplace (Double-Sided Exponential) 612

A.3.13 Logistic (Sigmoid for {μ = 0, α = 1}) 613

A.3.14 Log-Normal 614

A.3.15 Maxwell–Boltzmann 615

A.3.16 Pareto 616

A.3.17 Rayleigh 617

A.3.18 Rice 618

A.3.19 Student’s t Distribution 619

A.3.20 Triangular 620

A.3.21 Uniform (Continuous) 621

A.3.22 Weibull 622

A.4 Discrete Random Variables 623

A.4.1 Bernoulli (with Support {0, 1}) 623

A.4.2 Bernoulli (Symmetric with Support {−1, 1}) 624

A.4.3 Binomial 625

A.4.4 Geometric (with Support Z+) 626

A.4.5 Geometric (Shifted with Support N) 627

A.4.6 Hypergeometric 628

A.4.7 Logarithmic (Log-Series) 629

A.4.8 Negative Binomial (Pascal) 630

A.4.9 Poisson 631

A.4.10 Uniform (Discrete) 632

A.4.11 Zeta (Zipf) 633

B Functions and Properties 634

B.1 Continuity and Bounded Variation 634

B.2 Supremum and Infimum 640

B.3 Order Notation 640

B.4 Floor and Ceiling Functions 641

B.5 Convex and Concave Functions 641

B.6 Even and Odd Functions 641

B.7 Signum Function 643

B.8 Dirac Delta Function 644

B.9 Kronecker Delta Function 645

B.10 Unit-Step Functions 646

B.11 Rectangle Functions 647

B.12 Triangle and Ramp Functions 647

B.13 Indicator Functions 648

B.14 Sinc Function 649

B.15 Logarithm Functions 650

B.16 Gamma Functions 651

B.17 Beta Functions 653

B.18 Bessel Functions 655

B.19 Q-Function and Error Functions 655

B.20 Marcum Q-Function 659

B.21 Zeta Function 659

B.22 Rising and Falling Factorials 660

B.23 Laguerre Polynomials 661

B.24 Hypergeometric Functions 662

B.25 Bernoulli Numbers 663

B.26 Harmonic Numbers 663

B.27 Euler–Mascheroni Constant 664

B.28 Dirichlet Function 664

Further Reading 664

C Frequency-Domain Transforms and Properties 665

C.1 Laplace Transform 665

C.2 Continuous-Time Fourier Transform 669

C.3 z-Transform 670

C.4 Discrete-Time Fourier Transform 676

Further Reading 677

D Integration and Integrals 678

D.1 Review of Riemann Integral 678

D.2 Riemann–Stieltjes Integral 681

D.3 Lebesgue Integral 684

D.4 Pdf Integrals 688

D.5 Indefinite and Definite Integrals 690

D.6 Integral Formulas 692

D.7 Double Integrals of Special Functions 692

Further Reading 696

E Identities and Infinite Series 697

E.1 Zero and Infinity 697

E.2 Minimum and Maximum 697

E.3 Trigonometric Identities 698

E.4 Stirling’s Formula 698

E.5 Taylor Series 699

E.6 Series Expansions and Closed-Form Sums 699

E.7 Vandermonde’s Identity 702

E.8 Pmf Sums and Functional Forms 703

E.9 Completing the Square 704

E.10 Summation by Parts 705

Further Reading 706

F Inequalities and Bounds for Expectations 707

F.1 Cauchy–Schwarz and Hölder Inequalities 707

F.2 Triangle and Minkowski Inequalities 708

F.3 Bienaymé, Chebyshev, and Markov Inequalities 709

F.4 Chernoff’s Inequality 711

F.5 Jensen’s Inequality 713

F.6 Cramér–Rao Inequality 714

Further Reading 718

G Matrix and Vector Properties 719

G.1 Basic Properties 719

G.2 Four Fundamental Subspaces 721

G.3 Eigendecomposition 722

G.4 LU, LDU, and Cholesky Decompositions 724

G.5 Jacobian Matrix and the Jacobian 726

G.6 Kronecker and Schur Products 728

G.7 Properties of Trace and Determinant 728

G.8 Matrix Inversion Lemma 729

G.9 Cauchy–Schwarz Inequality 730

G.10 Differentiation 730

G.11 Complex Differentiation 731

Further Reading 732

GLOSSARY 733

REFERENCES 743

INDEX 755

PART III Applications in Signal Processing and Communications

Chapters at the Web Site www.wiley.com/go/randomprocesses

10 Communication Systems and Information Theory 771

10.1 Introduction 771

10.2 Transmitter 771

10.2.1 Sampling and Quantization 772

10.2.2 Channel Coding 777

10.2.3 Symbols and Pulse Shaping 778

10.2.4 Modulation 781

10.3 Transmission Channel 783

10.4 Receiver 786

10.4.1 Receive Filter 786

10.4.2 Demodulation 787

10.4.3 Gram–Schmidt Orthogonalization 789

10.4.4 Maximum Likelihood Detection 794

10.4.5 Matched Filter Receiver 797

10.4.6 Probability of Error 802

10.5 Information Theory 803

10.5.1 Mutual Information and Entropy 804

10.5.2 Properties of Mutual Information and Entropy 810

10.5.3 Continuous Distributions: Differential Entropy 813

10.5.4 Channel Capacity 818

10.5.5 AWGN Channel 820

Problems 821

Further Reading 824

11 Optimal Filtering 825

11.1 Introduction 825

11.2 Optimal Linear Filtering 825

11.3 Optimal Filter Applications 827

11.3.1 System Identification 827

11.3.2 Inverse Modeling 827

11.3.3 Noise Cancellation 828

11.3.4 Linear Prediction 828

11.4 Noncausal Wiener Filter 829

11.5 Causal Wiener Filter 831

11.6 Prewhitening Filter 837

11.7 FIR Wiener Filter 839

11.8 Kalman Filter 844

11.8.1 Evolution of the Mean and Covariance 846

11.8.2 State Prediction 846

11.8.3 State Filtering 848

11.9 Steady-State Kalman Filter 851

11.10 Linear Predictive Coding 857

11.11 Lattice Prediction-Error Filter 861

11.12 Levinson–Durbin Algorithm 865

11.13 Least-Squares Filtering 868

11.14 Recursive Least-Squares 872

Problems 876

Further Reading 879

12 Adaptive Filtering 880

12.1 Introduction 880

12.2 MSE Properties 880

12.3 Steepest Descent 889

12.4 Newton’s Method 894

12.5 LMS Algorithm 895

12.5.1 Convergence in the Mean 899

12.5.2 Convergence in the Mean-Square 901

12.5.3 Misadjustment 906

12.6 Modified LMS Algorithms 911

12.6.1 Sign-Error LMS Algorithm 911

12.6.2 Sign-Data LMS Algorithm 912

12.6.3 Sign-Sign LMS Algorithm 914

12.6.4 LMF Algorithm 914

12.6.5 Complex LMS Algorithm 916

12.6.6 “Leaky” LMS Algorithm 917

12.6.7 Normalized LMS Algorithm 918

12.6.8 Perceptron 920

12.6.9 Convergence of Modified LMS Algorithms 922

12.7 Adaptive IIR Filtering 923

12.7.1 Output-Error Formulation 924

12.7.2 Output-Error IIR Filter Algorithm 928

12.7.3 Equation-Error Formulation 932

12.7.4 Equation-Error Bias 933

Problems 936

Further Reading 939

13 Equalization, Beamforming, and Direction Finding 940

13.1 Introduction 940

13.2 Channel Equalization 941

13.3 Optimal Bussgang Algorithm 943

13.4 Blind Equalizer Algorithms 949

13.4.1 Sato’s Algorithm 949

13.4.2 Constant Modulus Algorithm 950

13.5 CMA Performance Surface 952

13.6 Antenna Arrays 958

13.7 Beampatterns 960

13.8 Optimal Beamforming 962

13.8.1 Known Look Direction 962

13.8.2 Multiple Constraint Beamforming 964

13.8.3 Training Signal 966

13.8.4 Maximum Likelihood 968

13.8.5 Maximum SNR and SINR 969

13.9 Adaptive Beamforming 970

13.9.1 LMS Beamforming 970

13.9.2 Constant Modulus Array 970

13.9.3 Decision-Directed Mode 973

13.9.4 Multistage CM Array 974

13.9.5 Output SINR and SNR 977

13.10 Direction Finding 981

13.10.1 Beamforming Approaches 981

13.10.2 MUSIC Algorithm 984

Problems 985

Further Reading 989

Supplemental Materials

What is included with this book?

The New copy of this book will include any supplemental materials advertised. Please check the title of the book to determine if it should include any access cards, study guides, lab manuals, CDs, etc.

The Used, Rental and eBook copies of this book are not guaranteed to include any supplemental materials. Typically, only the book itself is included. This is true even if the title states it includes any access cards, study guides, lab manuals, CDs, etc.
