Applied Linear Regression

by Sanford Weisberg
  • ISBN13: 9780471663799
  • ISBN10: 0471663794

  • Edition: 3rd
  • Format: Hardcover
  • Copyright: 2005-02-11
  • Publisher: Wiley
List Price: $162.00

Summary

"Applied Linear Regression, Third Edition is thoroughly updated to help students master the theory and applications of linear regression modeling. Focusing on model building, assessing fit and reliability, and drawing conclusions, the text demonstrates how to develop estimation, confidence, and testing procedures primarily through the use of least squares regression. To facilitate quick learning, this "Third Edition stresses using graphical methods to find appropriate models and to better understand them. In that spirit, most analyses and homework problems use graphs for the discovery of structure as well as for the summarization of results. This text is an excellent tool for learning how to use linear regression analysis techniques to solve and gain insight into real-life problems.

Author Biography

SANFORD WEISBERG, PhD, is Professor of Statistics and Director of the Statistical Consulting Service at the University of Minnesota. He has authored or coauthored three popular texts for John Wiley & Sons, Inc. and is a Fellow of the American Statistical Association.

Table of Contents

Preface xiii
1 Scatterplots and Regression 1
1.1 Scatterplots, 1
1.2 Mean Functions, 9
1.3 Variance Functions, 11
1.4 Summary Graph, 11
1.5 Tools for Looking at Scatterplots, 12
1.5.1 Size, 13
1.5.2 Transformations, 14
1.5.3 Smoothers for the Mean Function, 14
1.6 Scatterplot Matrices, 15
Problems, 17
2 Simple Linear Regression 19
2.1 Ordinary Least Squares Estimation, 21
2.2 Least Squares Criterion, 23
2.3 Estimating σ², 25
2.4 Properties of Least Squares Estimates, 26
2.5 Estimated Variances, 27
2.6 Comparing Models: The Analysis of Variance, 28
2.6.1 The F-Test for Regression, 30
2.6.2 Interpreting p-Values, 31
2.6.3 Power of Tests, 31
2.7 The Coefficient of Determination, R², 31
2.8 Confidence Intervals and Tests, 32
2.8.1 The Intercept, 32
2.8.2 Slope, 33
2.8.3 Prediction, 34
2.8.4 Fitted Values, 35
2.9 The Residuals, 36
Problems, 38
3 Multiple Regression 47
3.1 Adding a Term to a Simple Linear Regression Model, 47
3.1.1 Explaining Variability, 49
3.1.2 Added-Variable Plots, 49
3.2 The Multiple Linear Regression Model, 50
3.3 Terms and Predictors, 51
3.4 Ordinary Least Squares, 54
3.4.1 Data and Matrix Notation, 54
3.4.2 Variance-Covariance Matrix of e, 56
3.4.3 Ordinary Least Squares Estimators, 56
3.4.4 Properties of the Estimates, 57
3.4.5 Simple Regression in Matrix Terms, 58
3.5 The Analysis of Variance, 61
3.5.1 The Coefficient of Determination, 62
3.5.2 Hypotheses Concerning One of the Terms, 62
3.5.3 Relationship to the t-Statistic, 63
3.5.4 t-Tests and Added-Variable Plots, 63
3.5.5 Other Tests of Hypotheses, 64
3.5.6 Sequential Analysis of Variance Tables, 64
3.6 Predictions and Fitted Values, 65
Problems, 65
4 Drawing Conclusions 69
4.1 Understanding Parameter Estimates, 69
4.1.1 Rate of Change, 69
4.1.2 Signs of Estimates, 70
4.1.3 Interpretation Depends on Other Terms in the Mean Function, 70
4.1.4 Rank Deficient and Over-Parameterized Mean Functions, 73
4.1.5 Tests, 74
4.1.6 Dropping Terms, 74
4.1.7 Logarithms, 76
4.2 Experimentation Versus Observation, 77
4.3 Sampling from a Normal Population, 80
4.4 More on R², 81
4.4.1 Simple Linear Regression and R², 83
4.4.2 Multiple Linear Regression, 84
4.4.3 Regression through the Origin, 84
4.5 Missing Data, 84
4.5.1 Missing at Random, 84
4.5.2 Alternatives, 85
4.6 Computationally Intensive Methods, 87
4.6.1 Regression Inference without Normality, 87
4.6.2 Nonlinear Functions of Parameters, 89
4.6.3 Predictors Measured with Error, 90
Problems, 92
5 Weights, Lack of Fit, and More 96
5.1 Weighted Least Squares, 96
5.1.1 Applications of Weighted Least Squares, 98
5.1.2 Additional Comments, 99
5.2 Testing for Lack of Fit, Variance Known, 100
5.3 Testing for Lack of Fit, Variance Unknown, 102
5.4 General F Testing, 105
5.4.1 Non-null Distributions, 107
5.4.2 Additional Comments, 108
5.5 Joint Confidence Regions, 108
Problems, 110
6 Polynomials and Factors 115
6.1 Polynomial Regression, 115
6.1.1 Polynomials with Several Predictors, 117
6.1.2 Using the Delta Method to Estimate a Minimum or a Maximum, 120
6.1.3 Fractional Polynomials, 122
6.2 Factors, 122
6.2.1 No Other Predictors, 123
6.2.2 Adding a Predictor: Comparing Regression Lines, 126
6.2.3 Additional Comments, 129
6.3 Many Factors, 130
6.4 Partial One-Dimensional Mean Functions, 131
6.5 Random Coefficient Models, 134
Problems, 137
7 Transformations 147
7.1 Transformations and Scatterplots, 147
7.1.1 Power Transformations, 148
7.1.2 Transforming Only the Predictor Variable, 150
7.1.3 Transforming the Response Only, 152
7.1.4 The Box and Cox Method, 153
7.2 Transformations and Scatterplot Matrices, 153
7.2.1 The 1D Estimation Result and Linearly Related Predictors, 156
7.2.2 Automatic Choice of Transformation of Predictors, 157
7.3 Transforming the Response, 159
7.4 Transformations of Nonpositive Variables, 160
Problems, 161
8 Regression Diagnostics: Residuals 167
8.1 The Residuals, 167
8.1.1 Difference Between ê and e, 168
8.1.2 The Hat Matrix, 169
8.1.3 Residuals and the Hat Matrix with Weights, 170
8.1.4 The Residuals When the Model Is Correct, 171
8.1.5 The Residuals When the Model Is Not Correct, 171
8.1.6 Fuel Consumption Data, 173
8.2 Testing for Curvature, 176
8.3 Nonconstant Variance, 177
8.3.1 Variance Stabilizing Transformations, 179
8.3.2 A Diagnostic for Nonconstant Variance, 180
8.3.3 Additional Comments, 185
8.4 Graphs for Model Assessment, 185
8.4.1 Checking Mean Functions, 186
8.4.2 Checking Variance Functions, 189
Problems, 191
9 Outliers and Influence 194
9.1 Outliers, 194
9.1.1 An Outlier Test, 194
9.1.2 Weighted Least Squares, 196
9.1.3 Significance Levels for the Outlier Test, 196
9.1.4 Additional Comments, 197
9.2 Influence of Cases, 198
9.2.1 Cook's Distance, 198
9.2.2 Magnitude of Di, 199
9.2.3 Computing Di, 200
9.2.4 Other Measures of Influence, 203
9.3 Normality Assumption, 204
Problems, 206
10 Variable Selection 211
10.1 The Active Terms, 211
10.1.1 Collinearity, 214
10.1.2 Collinearity and Variances, 216
10.2 Variable Selection, 217
10.2.1 Information Criteria, 217
10.2.2 Computationally Intensive Criteria, 220
10.2.3 Using Subject-Matter Knowledge, 220
10.3 Computational Methods, 221
10.3.1 Subset Selection Overstates Significance, 225
10.4 Windmills, 226
10.4.1 Six Mean Functions, 226
10.4.2 A Computationally Intensive Approach, 228
Problems, 230
11 Nonlinear Regression 233
11.1 Estimation for Nonlinear Mean Functions, 234
11.2 Inference Assuming Large Samples, 237
11.3 Bootstrap Inference, 244
11.4 References, 248
Problems, 248
12 Logistic Regression 251
12.1 Binomial Regression, 253
12.1.1 Mean Functions for Binomial Regression, 254
12.2 Fitting Logistic Regression, 255
12.2.1 One-Predictor Example, 255
12.2.2 Many Terms, 256
12.2.3 Deviance, 260
12.2.4 Goodness-of-Fit Tests, 261
12.3 Binomial Random Variables, 263
12.3.1 Maximum Likelihood Estimation, 263
12.3.2 The Log-Likelihood for Logistic Regression, 264
12.4 Generalized Linear Models, 265
Problems, 266
Appendix 270
A.1 Web Site, 270
A.2 Means and Variances of Random Variables, 270
A.2.1 E Notation, 270
A.2.2 Var Notation, 271
A.2.3 Cov Notation, 271
A.2.4 Conditional Moments, 272
A.3 Least Squares for Simple Regression, 273
A.4 Means and Variances of Least Squares Estimates, 273
A.5 Estimating E(Y|X) Using a Smoother, 275
A.6 A Brief Introduction to Matrices and Vectors, 278
A.6.1 Addition and Subtraction, 279
A.6.2 Multiplication by a Scalar, 280
A.6.3 Matrix Multiplication, 280
A.6.4 Transpose of a Matrix, 281
A.6.5 Inverse of a Matrix, 281
A.6.6 Orthogonality, 282
A.6.7 Linear Dependence and Rank of a Matrix, 283
A.7 Random Vectors, 283
A.8 Least Squares Using Matrices, 284
A.8.1 Properties of Estimates, 285
A.8.2 The Residual Sum of Squares, 285
A.8.3 Estimate of Variance, 286
A.9 The QR Factorization, 286
A.10 Maximum Likelihood Estimates, 287
A.11 The Box-Cox Method for Transformations, 289
A.11.1 Univariate Case, 289
A.11.2 Multivariate Case, 290
A.12 Case Deletion in Linear Regression, 291
References 293
Author Index 301
Subject Index 305

Supplemental Materials

What is included with this book?

A new copy of this book will include any supplemental materials advertised. Please check the title of the book to determine whether it should include any access cards, study guides, lab manuals, CDs, or similar items.

Used, rental, and eBook copies of this book are not guaranteed to include any supplemental materials; typically, only the book itself is included. This is true even if the title states that such materials are included.
