


An Introduction to Bayesian Analysis

by J.K. Ghosh; Mohan Delampady; Tapas Samanta
  • ISBN13: 9780387400846
  • ISBN10: 0387400842

  • Format: Hardcover
  • Copyright: 2006-07-21
  • Publisher: Springer Verlag
  • Purchase Benefits
  • Free Shipping On Orders Over $35!
    Your order must be $35 or more to qualify for free economy shipping. Bulk sales, PO's, Marketplace items, eBooks and apparel do not qualify for this offer.
  • Get Rewarded for Ordering Your Textbooks!
List Price: $109.99 Save up to $70.75
  • Digital: $85.02*
    *To support the delivery of the digital material to you, a digital delivery fee of $3.99 will be charged on each digital item.

Summary

This is a graduate-level textbook on Bayesian analysis that blends modern Bayesian theory, methods, and applications. Starting from basic statistics, undergraduate calculus, and linear algebra, it develops the ideas of both subjective and objective Bayesian analysis to a level where real-life data can be analyzed using current techniques of statistical computing. Advances in both low-dimensional and high-dimensional problems are covered, as well as important topics such as empirical Bayes and hierarchical Bayes methods and Markov chain Monte Carlo (MCMC) techniques. Many topics are at the cutting edge of statistical research. Solutions to common inference problems appear throughout the text, along with discussion of which prior to choose. There is a discussion of elicitation of a subjective prior as well as of the motivation, applicability, and limitations of objective priors. By way of important applications, the book presents microarrays, nonparametric regression via wavelets and Dirichlet multinomial allocation (DMA) mixtures of normals, and spatial analysis, with illustrations using simulated and real data. Theoretical topics at the cutting edge include high-dimensional model selection and Intrinsic Bayes Factors, which the authors have successfully applied to geological mapping. The style is informal but clear. Asymptotics is used to supplement simulation or to understand some aspects of the posterior.
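
The MCMC techniques mentioned above are treated in Chapter 7. As a purely illustrative sketch that is not taken from the book, the following Python snippet shows a minimal random-walk Metropolis-Hastings sampler for the posterior mean of a normal model; the simulated data, the N(0, 10^2) prior, the proposal scale, and all function names are assumptions made here for the example only.

    import numpy as np

    # Illustrative sketch (not from the book): random-walk Metropolis-Hastings
    # for the mean mu of a normal model with known unit variance and a
    # N(0, 10^2) prior on mu. All settings below are assumptions.
    rng = np.random.default_rng(0)
    data = rng.normal(loc=1.5, scale=1.0, size=50)   # simulated observations

    def log_posterior(mu):
        log_prior = -0.5 * (mu / 10.0) ** 2           # N(0, 10^2) prior, up to a constant
        log_lik = -0.5 * np.sum((data - mu) ** 2)     # N(mu, 1) likelihood, up to a constant
        return log_prior + log_lik

    def metropolis_hastings(n_iter=5000, proposal_sd=0.5):
        samples = np.empty(n_iter)
        mu = 0.0                                      # arbitrary starting value
        current_lp = log_posterior(mu)
        for i in range(n_iter):
            proposal = mu + proposal_sd * rng.normal()
            proposal_lp = log_posterior(proposal)
            # Accept with probability min(1, posterior ratio), computed on the log scale
            if np.log(rng.uniform()) < proposal_lp - current_lp:
                mu, current_lp = proposal, proposal_lp
            samples[i] = mu
        return samples

    draws = metropolis_hastings()
    print("posterior mean estimate:", draws[1000:].mean())   # discard burn-in

Working on the log scale avoids numerical underflow when many likelihood terms are multiplied, and discarding an initial burn-in reduces the influence of the starting value.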

Author Biography

J.K. Ghosh is currently a professor of statistics at Purdue University and professor emeritus at the Indian Statistical Institute. Mohan Delampady and Tapas Samanta are both professors of statistics at the Indian Statistical Institute.

Table of Contents

1 Statistical Preliminaries 1(28)
1.1 Common Models 1(6)
1.1.1 Exponential Families 4(1)
1.1.2 Location-Scale Families 5(1)
1.1.3 Regular Family 6(1)
1.2 Likelihood Function 7(2)
1.3 Sufficient Statistics and Ancillary Statistics 9(2)
1.4 Three Basic Problems of Inference in Classical Statistics 11(10)
1.4.1 Point Estimates 11(5)
1.4.2 Testing Hypotheses 16(4)
1.4.3 Interval Estimation 20(1)
1.5 Inference as a Statistical Decision Problem 21(2)
1.6 The Changing Face of Classical Inference 23(1)
1.7 Exercises 24(5)
2 Bayesian Inference and Decision Theory 29(36)
2.1 Subjective and Frequentist Probability 29(1)
2.2 Bayesian Inference 30(5)
2.3 Advantages of Being a Bayesian 35(2)
2.4 Paradoxes in Classical Statistics 37(1)
2.5 Elements of Bayesian Decision Theory 38(2)
2.6 Improper Priors 40(1)
2.7 Common Problems of Bayesian Inference 41(9)
2.7.1 Point Estimates 41(1)
2.7.2 Testing 42(6)
2.7.3 Credible Intervals 48(1)
2.7.4 Testing of a Sharp Null Hypothesis Through Credible Intervals 49(1)
2.8 Prediction of a Future Observation 50(1)
2.9 Examples of Cox and Welch Revisited 51(1)
2.10 Elimination of Nuisance Parameters 51(2)
2.11 A High-dimensional Example 53(1)
2.12 Exchangeability 54(1)
2.13 Normative and Descriptive Aspects of Bayesian Analysis, Elicitation of Probability 55(1)
2.14 Objective Priors and Objective Bayesian Analysis 55(2)
2.15 Other Paradigms 57(1)
2.16 Remarks 57(1)
2.17 Exercises 58(7)
3 Utility, Prior, and Bayesian Robustness 65(34)
3.1 Utility, Prior, and Rational Preference 65(2)
3.2 Utility and Loss 67(1)
3.3 Rationality Axioms Leading to the Bayesian Approach 68(2)
3.4 Coherence 70(1)
3.5 Bayesian Analysis with Subjective Prior 71(1)
3.6 Robustness and Sensitivity 72(2)
3.7 Classes of Priors 74(2)
3.7.1 Conjugate Class 74(1)
3.7.2 Neighborhood Class 75(1)
3.7.3 Density Ratio Class 75(1)
3.8 Posterior Robustness: Measures and Techniques 76(15)
3.8.1 Global Measures of Sensitivity 76(5)
3.8.2 Belief Functions 81(2)
3.8.3 Interactive Robust Bayesian Analysis 83(1)
3.8.4 Other Global Measures 84(1)
3.8.5 Local Measures of Sensitivity 84(7)
3.9 Inherently Robust Procedures 91(1)
3.10 Loss Robustness 92(1)
3.11 Model Robustness 93(1)
3.12 Exercises 94(5)
4 Large Sample Methods 99(22)
4.1 Limit of Posterior Distribution 100(7)
4.1.1 Consistency of Posterior Distribution 100(1)
4.1.2 Asymptotic Normality of Posterior Distribution 101(6)
4.2 Asymptotic Expansion of Posterior Distribution 107(6)
4.2.1 Determination of Sample Size in Testing 109(4)
4.3 Laplace Approximation 113(6)
4.3.1 Laplace's Method 113(2)
4.3.2 Tierney-Kadane-Kass Refinements 115(4)
4.4 Exercises 119(2)
5 Choice of Priors for Low-dimensional Parameters 121(38)
5.1 Different Methods of Construction of Objective Priors 122(25)
5.1.1 Uniform Distribution and Its Criticisms 123(2)
5.1.2 Jeffreys Prior as a Uniform Distribution 125(1)
5.1.3 Jeffreys Prior as a Minimizer of Information 126(3)
5.1.4 Jeffreys Prior as a Probability Matching Prior 129(3)
5.1.5 Conjugate Priors and Mixtures 132(3)
5.1.6 Invariant Objective Priors for Location-Scale Families 135(1)
5.1.7 Left and Right Invariant Priors 136(2)
5.1.8 Properties of the Right Invariant Prior for Location-Scale Families 138(1)
5.1.9 General Group Families 139(1)
5.1.10 Reference Priors 140(5)
5.1.11 Reference Priors Without Entropy Maximization 145(1)
5.1.12 Objective Priors with Partial Information 146(1)
5.2 Discussion of Objective Priors 147(2)
5.3 Exchangeability 149(1)
5.4 Elicitation of Hyperparameters for Prior 149(6)
5.5 A New Objective Bayes Methodology Using Correlation 155(1)
5.6 Exercises 156(3)
6 Hypothesis Testing and Model Selection 159(46)
6.1 Preliminaries 159(4)
6.1.1 BIC Revisited 161(2)
6.2 P-value and Posterior Probability of H0 as Measures of Evidence Against the Null 163(1)
6.3 Bounds on Bayes Factors and Posterior Probabilities 164(12)
6.3.1 Introduction 164(1)
6.3.2 Choice of Classes of Priors 165(3)
6.3.3 Multiparameter Problems 168(4)
6.3.4 Invariant Tests 172(4)
6.3.5 Interval Null Hypotheses and One-sided Tests 176(1)
6.4 Role of the Choice of an Asymptotic Framework 176(3)
6.4.1 Comparison of Decisions via P-values and Bayes Factors in Bahadur's Asymptotics 178(1)
6.4.2 Pitman Alternative and Rescaled Priors 179(1)
6.5 Bayesian P-value 179(6)
6.6 Robust Bayesian Outlier Detection 185(3)
6.7 Nonsubjective Bayes Factors 188(11)
6.7.1 The Intrinsic Bayes Factor 190(1)
6.7.2 The Fractional Bayes Factor 191(3)
6.7.3 Intrinsic Priors 194(5)
6.8 Exercises 199(6)
7 Bayesian Computations 205(34)
7.1 Analytic Approximation 207(1)
7.2 The E-M Algorithm 208(3)
7.3 Monte Carlo Sampling 211(4)
7.4 Markov Chain Monte Carlo Methods 215(18)
7.4.1 Introduction 215(1)
7.4.2 Markov Chains in MCMC 216(2)
7.4.3 Metropolis-Hastings Algorithm 218(2)
7.4.4 Gibbs Sampling 220(3)
7.4.5 Rao-Blackwellization 223(2)
7.4.6 Examples 225(6)
7.4.7 Convergence Issues 231(2)
7.5 Exercises 233(6)
8 Some Common Problems in Inference 239(16)
8.1 Comparing Two Normal Means 239(2)
8.2 Linear Regression 241(4)
8.3 Logit Model, Probit Model, and Logistic Regression 245(7)
8.3.1 The Logit Model 246(5)
8.3.2 The Probit Model 251(1)
8.4 Exercises 252(3)
9 High-dimensional Problems 255(34)
9.1 Exchangeability, Hierarchical Priors, Approximation to Posterior for Large p, and MCMC 256(4)
9.1.1 MCMC and E-M Algorithm 259(1)
9.2 Parametric Empirical Bayes 260(3)
9.2.1 PEB and HB Interval Estimates 262(1)
9.3 Linear Models for High-dimensional Parameters 263(1)
9.4 Stein's Frequentist Approach to a High-dimensional Problem 264(4)
9.5 Comparison of High-dimensional and Low-dimensional Problems 268(1)
9.6 High-dimensional Multiple Testing (PEB) 269(4)
9.6.1 Nonparametric Empirical Bayes Multiple Testing 271(1)
9.6.2 False Discovery Rate (FDR) 272(1)
9.7 Testing of a High-dimensional Null as a Model Selection Problem 273(3)
9.8 High-dimensional Estimation and Prediction Based on Model Selection or Model Averaging 276(8)
9.9 Discussion 284(1)
9.10 Exercises 285(4)
10 Some Applications 289(14)
10.1 Disease Mapping 289(3)
10.2 Bayesian Nonparametric Regression Using Wavelets 292(7)
10.2.1 A Brief Overview of Wavelets 293(3)
10.2.2 Hierarchical Prior Structure and Posterior Computations 296(3)
10.3 Estimation of Regression Function Using Dirichlet Multinomial Allocation 299(3)
10.4 Exercises 302(1)
A Common Statistical Densities 303(4)
A.1 Continuous Models 303(3)
A.2 Discrete Models 306(1)
B Birnbaum's Theorem on Likelihood Principle 307(4)
C Coherence 311(2)
D Microarray 313(2)
E Bayes Sufficiency 315(2)
References 317(22)
Author Index 339(6)
Subject Index 345

Supplemental Materials

What is included with this book?

The New copy of this book will include any supplemental materials advertised. Please check the title of the book to determine if it should include any access cards, study guides, lab manuals, CDs, etc.

The Used, Rental and eBook copies of this book are not guaranteed to include any supplemental materials. Typically, only the book itself is included. This is true even if the title states it includes any access cards, study guides, lab manuals, CDs, etc.
