
Probability and Random Processes with Applications to Signal Processing

Edition: 3rd
ISBN13: 9780130200716
ISBN10: 0130200719
Format: Hardcover
Pub. Date: 1/1/2002
Publisher(s): Prentice Hall

Related Products

  • Probability, Statistics, and Random Processes for Engineers

Summary

Provides an accessible yet mathematically solid treatment of probability and random processes, with many computer examples integrated throughout, including random-process examples in MATLAB. Includes expanded discussions of fundamental principles, especially basic probability, and new problems that apply the basic theory in areas such as medical imaging, percolation theory in fractals, and the generation of random numbers. Several new topics are covered: failure rates, the Chernoff bound, interval estimation and the Student t-distribution, and power spectral density estimation. Functions of random variables is treated in a separate chapter, and mean-square convergence and an introduction to martingales appear in the latter half of the book. Intended for electrical and computer engineers seeking a solid treatment of probability and random processes.
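The summary above mentions MATLAB-based computer examples and power spectral density estimation. As a rough illustration of that kind of example (a sketch of my own, not taken from the book, and written in Python rather than MATLAB), the following estimates the PSD of white noise with a simple periodogram:

```python
import numpy as np

# Illustrative sketch, not from the book: a periodogram estimate of the
# power spectral density (PSD) of white noise, the kind of computer
# example the text works through in MATLAB.
rng = np.random.default_rng(0)
N = 4096
x = rng.standard_normal(N)        # zero-mean, unit-variance white noise

# Periodogram: |DFT(x)|^2 / N; for unit-variance white noise the true
# PSD is flat and equal to 1 at every frequency.
X = np.fft.rfft(x)
psd = np.abs(X) ** 2 / N

mean_psd = psd[1:-1].mean()       # average over interior frequencies
print(f"mean periodogram level: {mean_psd:.2f}")  # near 1 for white noise
```

Averaging many such periodograms over data segments reduces the estimate's variance, which is the idea behind Bartlett's procedure listed in the table of contents.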

Table of Contents

Preface xiii
Introduction to Probability
1(57)
Introduction: Why Study Probability?
1(1)
The Different Kinds of Probability
2(3)
Probability as Intuition
2(1)
Probability as the Ratio of Favorable to Total Outcomes (Classical Theory)
3(1)
Probability as a Measure of Frequency of Occurrence
4(1)
Probability Based on an Axiomatic Theory
5(1)
Misuses, Miscalculations, and Paradoxes in Probability
5(2)
Sets, Fields, and Events
7(4)
Examples of Sample Spaces
7(4)
Axiomatic Definition of Probability
11(5)
Joint, Conditional, and Total Probabilities; Independence
16(6)
Bayes' Theorem and Applications
22(2)
Combinatorics
24(8)
Occupancy Problems
28(2)
Extensions and Applications
30(2)
Bernoulli Trials---Binomial and Multinomial Probability Laws
32(7)
Multinomial Probability Law
36(3)
Asymptotic Behavior of the Binomial Law: The Poisson Law
39(6)
Normal Approximation to the Binomial Law
45(2)
Summary
47(11)
Problems
48(9)
References
57(1)
Random Variables
58(58)
Introduction
58(1)
Definition of a Random Variable
59(3)
Probability Distribution Function
62(4)
Probability Density Function (PDF)
66(9)
Four Other Common Density Functions
71(3)
More Advanced Density Functions
74(1)
Continuous, Discrete, and Mixed Random Variables
75(5)
Examples of Probability Mass Functions
77(3)
Conditional and Joint Distributions and Densities
80(25)
Failure Rates
105(3)
Summary
108(8)
Problems
109(6)
References
115(1)
Additional Reading
115(1)
Functions of Random Variables
116(53)
Introduction
116(4)
Functions of a Random Variable (Several Views)
119(1)
Solving Problems of the Type Y = g(X)
120(14)
General Formula of Determining the pdf of Y = g(X)
129(5)
Solving Problems of the Type Z = g(X, Y)
134(18)
Solving Problems of the Type V = g(X, Y), W = h(X, Y)
152(5)
Fundamental Problem
152(2)
Obtaining fVW Directly from fXY
154(3)
Additional Examples
157(4)
Summary
161(8)
Problems
162(6)
References
168(1)
Additional Reading
168(1)
Expectation and Introduction to Estimation
169(75)
Expected Value of a Random Variable
169(14)
On the Validity of Equation 4.1-8
172(11)
Conditional Expectations
183(9)
Conditional Expectation as a Random Variable
190(2)
Moments
192(13)
Joint Moments
196(2)
Properties of Uncorrelated Random Variables
198(3)
Jointly Gaussian Random Variables
201(2)
Contours of Constant Density of the Joint Gaussian pdf
203(2)
Chebyshev and Schwarz Inequalities
205(6)
Random Variables with Nonnegative Values
207(1)
The Schwarz Inequality
208(3)
Moment-Generating Functions
211(3)
Chernoff Bound
214(2)
Characteristic Functions
216(14)
Joint Characteristic Functions
222(3)
The Central Limit Theorem
225(5)
Estimators for the Mean and Variance of the Normal Law
230(6)
Confidence Intervals for the Mean
231(3)
Confidence Interval for the Variance
234(2)
Summary
236(8)
Problems
237(6)
References
243(1)
Additional Reading
243(1)
Random Vectors and Parameter Estimation
244(60)
Joint Distribution and Densities
244(4)
Multiple Transformation of Random Variables
248(3)
Expectation Vectors and Covariance Matrices
251(3)
Properties of Covariance Matrices
254(5)
Simultaneous Diagonalization of Two Covariance Matrices and Applications in Pattern Recognition
259(10)
Projection
262(1)
Maximization of Quadratic Forms
263(6)
The Multidimensional Gaussian Law
269(8)
Characteristic Functions of Random Vectors
277(5)
The Characteristic Function of the Normal Law
280(2)
Parameter Estimation
282(4)
Estimation of E[X]
284(2)
Estimation of Vector Means and Covariance Matrices
286(4)
Estimation of μ
286(1)
Estimation of the Covariance K
287(3)
Maximum Likelihood Estimators
290(4)
Linear Estimation of Vector Parameters
294(3)
Summary
297(7)
Problems
298(5)
References
303(1)
Additional Reading
303(1)
Random Sequences
304(97)
Basic Concepts
304(30)
Infinite-Length Bernoulli Trials
310(5)
Continuity of Probability Measure
315(2)
Statistical Specification of a Random Sequence
317(17)
Basic Principles of Discrete-Time Linear Systems
334(6)
Random Sequences and Linear Systems
340(8)
WSS Random Sequences
348(14)
Power Spectral Density
351(1)
Interpretation of the PSD
352(3)
Synthesis of Random Sequences and Discrete-Time Simulation
355(3)
Decimation
358(1)
Interpolation
359(3)
Markov Random Sequences
362(10)
ARMA Models
365(1)
Markov Chains
366(6)
Vector Random Sequences and State Equations
372(3)
Convergence of Random Sequences
375(8)
Laws of Large Numbers
383(4)
Summary
387(14)
Problems
388(11)
References
399(2)
Random Processes
401(86)
Basic Definitions
402(4)
Some Important Random Processes
406(24)
Asynchronous Binary Signaling
406(2)
Poisson Counting Process
408(4)
Alternative Derivation of Poisson Process
412(2)
Random Telegraph Signal
414(2)
Digital Modulation Using Phase-Shift Keying
416(2)
Wiener Process or Brownian Motion
418(3)
Markov Random Processes
421(4)
Birth-Death Markov Chains
425(4)
Chapman-Kolmogorov Equations
429(1)
Random Process Generated from Random Sequences
430(1)
Continuous-Time Linear Systems with Random Inputs
430(7)
White Noise
436(1)
Some Useful Classifications of Random Processes
437(2)
Stationarity
437(2)
Wide-Sense Stationary Processes and LSI Systems
439(19)
Wide-Sense Stationary Case
440(3)
Power Spectral Density
443(1)
An Interpretation of the psd
444(4)
More on White Noise
448(7)
Stationary Processes and Differential Equations
455(3)
Periodic and Cyclostationary Processes
458(6)
Vector Processes and State Equations
464(5)
State Equations
466(3)
Summary
469(18)
Problems
469(17)
References
486(1)
Advanced Topics in Random Processes
487(65)
Mean-Square (m.s.) Calculus
487(15)
Stochastic Continuity and Derivatives [8-1]
487(10)
Further Results on m.s. Convergence [8-1]
497(5)
m.s. Stochastic Integrals
502(4)
m.s. Stochastic Differential Equations
506(5)
Ergodicity [8-3]
511(7)
Karhunen-Loeve Expansion [8-5]
518(6)
Representation of Bandlimited and Periodic Processes
524(11)
Bandlimited Processes
525(3)
Bandpass Random Processes
528(2)
WSS Periodic Processes
530(3)
Fourier Series for WSS Processes
533(2)
Summary
535(1)
Appendix: Integral Equations
535(17)
Existence Theorem
536(4)
Problems
540(11)
References
551(1)
Applications to Statistical Signal Processing
552(89)
Estimation of Random Variables
552(18)
More on the Conditional Mean
558(2)
Orthogonality and Linear Estimation
560(8)
Some Properties of the Operator E
568(2)
Innovation Sequences and Kalman Filtering
570(15)
Predicting Gaussian Random Sequences
574(1)
Kalman Predictor and Filter
575(6)
Error-Covariance Equations
581(4)
Wiener Filters for Random Sequences
585(4)
Unrealizable Case (Smoothing)
585(2)
Causal Wiener Filter
587(2)
Expectation-Maximization Algorithm
589(11)
Log-Likelihood for the Linear Transformation
592(2)
Summary of the E-M Algorithm
594(1)
E-M Algorithm for Exponential Probability Functions
594(1)
Application to Emission Tomography
595(3)
Log-likelihood Function of Complete Data
598(1)
E-step
598(1)
M-step
599(1)
Hidden Markov Models (HMM)
600(10)
Specification of an HMM
601(3)
Application to Speech Processing
604(1)
Efficient Computation of P[E|M] with a Recursive Algorithm
605(2)
Viterbi Algorithm and the Most Likely State Sequence for the Observations
607(3)
Spectral Estimation
610(13)
The Periodogram
611(3)
Bartlett's Procedure---Averaging Periodograms
614(2)
Parametric Spectral Estimate
616(4)
Maximum Entropy Spectral Density
620(3)
Simulated Annealing
623(10)
Gibbs Sampler
624(1)
Noncausal Gauss-Markov Models
625(4)
Compound Markov Models
629(1)
Gibbs Line Sequence
630(3)
Summary
633(8)
Problems
635(4)
References
639(2)
Appendix A Review of Relevant Mathematics 641(17)
A.1 Basic Mathematics
641(3)
Sequences
641(1)
Convergence
642(1)
Summations
643(1)
Z-Transform
643(1)
A.2 Continuous Mathematics
644(5)
Definite and Indefinite Integrals
645(1)
Differentiation of Integrals
645(1)
Integration by Parts
646(1)
Completing the Square
647(1)
Double Integration
647(1)
Functions
648(1)
A.3 Residue Method for Inverse Fourier Transformation
649(7)
Fact
650(3)
Inverse Fourier Transform for psd of Random Sequence
653(3)
A.4 Mathematical Induction [A-4]
656(2)
Axiom of Induction
656(1)
References
657(1)
Appendix B Gamma and Delta Functions 658(4)
B.1 Gamma Function
658(1)
B.2 Dirac Delta Function
659(3)
Appendix C Functional Transformations and Jacobians 662(6)
C.1 Introduction
662(1)
C.2 Jacobians for n = 2
663(1)
C.3 Jacobian for General n
664(4)
Appendix D Measure and Probability 668(4)
D.1 Introduction and Basic Ideas
668(2)
Measurable Mappings and Functions
670(1)
D.2 Application of Measure Theory to Probability
670(2)
Distribution Measure
671(1)
Appendix E Sampled Analog Waveforms and Discrete-time Signals 672(2)
Index 674

Excerpts

Preface

The first edition of this book (1986) grew out of a set of notes used by the authors to teach two one-semester courses on probability and random processes at Rensselaer Polytechnic Institute (RPI). At that time the probability course at RPI was required of all students in the Computer and Systems Engineering Program and was a highly recommended elective for students in closely related areas. While many undergraduate students took the course in the junior year, many seniors and first-year graduate students took the course for credit as well. Then, as now, most of the students were engineering students. To serve these students well, we felt that we should be rigorous in introducing fundamental principles while furnishing many opportunities for students to develop their skills at solving problems.

There are many books in this area, and they range widely in their coverage and depth. At one extreme are the very rigorous and authoritative books that view probability from the point of view of measure theory and relate probability to rather exotic theorems such as the Radon-Nikodym theorem (see, for example, Probability and Measure by Patrick Billingsley, Wiley, 1978). At the other extreme are books that usually combine probability and statistics and largely omit the underlying theory and the more advanced types of applications of probability. In the middle are the large number of books that combine probability and random processes, largely avoiding a measure-theoretic approach and preferring to emphasize the axioms upon which the theory is based. It would be fair to say that our book falls into this latter category. Nevertheless this begs the question: why write or revise another book in this area if there are already several good texts out there that use the same approach and provide roughly the same coverage?

Of course, back in 1986 there were few books that emphasized the engineering applications of probability and random processes and that integrated the latter into one volume. Now there are several such books. Both authors have been associated (both as students and faculty) with colleges and universities that have demanding programs in engineering and applied science. Thus their experience and exposure have been to superior students who would not be content with a text that furnished a shallow discussion of probability. At the same time, however, the authors wanted to write a book on probability and random processes for engineering and applied science students. A measure-theoretic book, or one that avoided the engineering applications of probability and the processing of random signals, was regarded as not suitable for such students. At the same time the authors felt that the book should have enough depth so that students taking second-year graduate courses in advanced topics such as estimation and detection, pattern recognition, voice and image processing, networking, and queuing would not be handicapped by insufficient knowledge of the fundamentals and applications of random phenomena. In a nutshell, we tried to write a book that combined rigor with accessibility and had a strong self-teaching orientation. To that end we included a large number of worked-out examples, MATLAB codes, and special appendices that include a review of the kind of basic math needed for solving problems in probability as well as an introduction to measure theory and its relation to probability. The MATLAB codes, as well as other useful material such as multiple-choice exams that cover each of the book's sections, can be found at the book's web site, http://www.prenhall.com/stark. The normal use of this book would be as follows: for a first course in probability at, say, the junior or senior year, a reasonable goal is to cover Chapters 1 through 4.

Nevertheless we have found that this may be too mu

