Preface | p. xiii |
Introduction | p. 1 |
Regression and Model Building | p. 1 |
Data Collection | p. 7 |
Uses of Regression | p. 11 |
Role of the Computer | p. 12 |
Simple Linear Regression | p. 13 |
Simple Linear Regression Model | p. 13 |
Least-Squares Estimation of the Parameters | p. 14 |
Estimation of β₀ and β₁ | p. 14 |
Properties of the Least-Squares Estimators and the Fitted Regression Model | p. 20 |
Estimation of σ² | p. 22 |
An Alternate Form of the Model | p. 24 |
Hypothesis Testing on the Slope and Intercept | p. 24 |
Use of t-Tests | p. 25 |
Testing Significance of Regression | p. 26 |
The Analysis of Variance | p. 28 |
Interval Estimation in Simple Linear Regression | p. 32 |
Confidence Intervals on β₀, β₁, and σ² | p. 32 |
Interval Estimation of the Mean Response | p. 34 |
Prediction of New Observations | p. 37 |
Coefficient of Determination | p. 39 |
Some Considerations in the Use of Regression | p. 41 |
Regression Through the Origin | p. 44 |
Estimation by Maximum Likelihood | p. 50 |
Case Where the Regressor x is Random | p. 52 |
x and y Jointly Distributed | p. 52 |
x and y Jointly Normally Distributed: The Correlation Model | p. 53 |
Problems | p. 58 |
Multiple Linear Regression | p. 67 |
Multiple Regression Models | p. 67 |
Estimation of the Model Parameters | p. 71 |
Least-Squares Estimation of the Regression Coefficients | p. 71 |
A Geometrical Interpretation of Least Squares | p. 81 |
Properties of the Least-Squares Estimators | p. 82 |
Estimation of σ² | p. 82 |
Inadequacy of Scatter Diagrams in Multiple Regression | p. 84 |
Maximum-Likelihood Estimation | p. 85 |
Hypothesis Testing in Multiple Linear Regression | p. 87 |
Test for Significance of Regression | p. 87 |
Tests on Individual Regression Coefficients | p. 91 |
Special Case of Orthogonal Columns in X | p. 96 |
Testing the General Linear Hypothesis | p. 98 |
Confidence Intervals in Multiple Regression | p. 101 |
Confidence Intervals on the Regression Coefficients | p. 102 |
Confidence Interval Estimation of the Mean Response | p. 103 |
Simultaneous Confidence Intervals on Regression Coefficients | p. 104 |
Prediction of New Observations | p. 108 |
Hidden Extrapolation in Multiple Regression | p. 109 |
Standardized Regression Coefficients | p. 112 |
Multicollinearity | p. 117 |
Why Do Regression Coefficients Have the Wrong Sign? | p. 120 |
Problems | p. 122 |
Model Adequacy Checking | p. 131 |
Introduction | p. 131 |
Residual Analysis | p. 132 |
Definition of Residuals | p. 132 |
Methods for Scaling Residuals | p. 132 |
Residual Plots | p. 138 |
Partial Regression and Partial Residual Plots | p. 146 |
Other Residual Plotting and Analysis Methods | p. 150 |
The PRESS Statistic | p. 152 |
Detection and Treatment of Outliers | p. 154 |
Lack of Fit of the Regression Model | p. 158 |
A Formal Test for Lack of Fit | p. 158 |
Estimation of Pure Error from Near-Neighbors | p. 162 |
Problems | p. 166 |
Transformations and Weighting to Correct Model Inadequacies | p. 173 |
Introduction | p. 173 |
Variance-Stabilizing Transformations | p. 174 |
Transformations to Linearize the Model | p. 178 |
Analytical Methods for Selecting a Transformation | p. 186 |
Transformations on y: The Box-Cox Method | p. 186 |
Transformations on the Regressor Variables | p. 189 |
Generalized and Weighted Least Squares | p. 193 |
Generalized Least Squares | p. 193 |
Weighted Least Squares | p. 195 |
Some Practical Issues | p. 196 |
Problems | p. 200 |
Diagnostics for Leverage and Influence | p. 207 |
Importance of Detecting Influential Observations | p. 207 |
Leverage | p. 209 |
Measures of Influence: Cook's D | p. 210 |
Measures of Influence: DFFITS and DFBETAS | p. 213 |
A Measure of Model Performance | p. 216 |
Detecting Groups of Influential Observations | p. 217 |
Treatment of Influential Observations | p. 218 |
Problems | p. 219 |
Polynomial Regression Models | p. 221 |
Introduction | p. 221 |
Polynomial Models in One Variable | p. 221 |
Basic Principles | p. 221 |
Piecewise Polynomial Fitting (Splines) | p. 228 |
Polynomial and Trigonometric Terms | p. 236 |
Nonparametric Regression | p. 237 |
Kernel Regression | p. 238 |
Locally Weighted Regression (Loess) | p. 239 |
Final Cautions | p. 243 |
Polynomial Models in Two or More Variables | p. 244 |
Orthogonal Polynomials | p. 253 |
Problems | p. 258 |
Indicator Variables | p. 265 |
The General Concept of Indicator Variables | p. 265 |
Comments on the Use of Indicator Variables | p. 279 |
Indicator Variables versus Regression on Allocated Codes | p. 279 |
Indicator Variables as a Substitute for a Quantitative Regressor | p. 280 |
Regression Approach to Analysis of Variance | p. 281 |
Problems | p. 287 |
Variable Selection and Model Building | p. 291 |
Introduction | p. 291 |
The Model-Building Problem | p. 291 |
Consequences of Model Misspecification | p. 292 |
Criteria for Evaluating Subset Regression Models | p. 296 |
Computational Techniques for Variable Selection | p. 302 |
All Possible Regressions | p. 302 |
Stepwise Regression Methods | p. 310 |
Some Final Recommendations for Practice | p. 317 |
Problems | p. 318 |
Multicollinearity | p. 325 |
Introduction | p. 325 |
Sources of Multicollinearity | p. 325 |
Effects of Multicollinearity | p. 328 |
Multicollinearity Diagnostics | p. 334 |
Examination of the Correlation Matrix | p. 334 |
Variance Inflation Factors | p. 337 |
Eigensystem Analysis of X'X | p. 339 |
Other Diagnostics | p. 343 |
Methods for Dealing with Multicollinearity | p. 345 |
Collecting Additional Data | p. 345 |
Model Respecification | p. 346 |
Ridge Regression | p. 348 |
Other Methods | p. 363 |
Comparison and Evaluation of Biased Estimators | p. 375 |
Problems | p. 378 |
Robust Regression | p. 382 |
The Need for Robust Regression | p. 382 |
M-Estimators | p. 386 |
Properties of Robust Estimators | p. 400 |
Breakdown Point | p. 400 |
Efficiency | p. 401 |
Survey of Other Robust Regression Estimators | p. 401 |
High-Breakdown-Point Estimators | p. 401 |
Bounded Influence Estimators | p. 406 |
Other Procedures | p. 407 |
Computing Robust Regression Estimators | p. 409 |
Problems | p. 410 |
Introduction to Nonlinear Regression | p. 414 |
Linear and Nonlinear Regression Models | p. 414 |
Linear Regression Models | p. 414 |
Nonlinear Regression Models | p. 415 |
Nonlinear Least Squares | p. 416 |
Transformation to a Linear Model | p. 420 |
Parameter Estimation in a Nonlinear System | p. 423 |
Linearization | p. 423 |
Other Parameter Estimation Methods | p. 431 |
Starting Values | p. 432 |
Computer Programs | p. 433 |
Statistical Inference in Nonlinear Regression | p. 434 |
Examples of Nonlinear Regression Models | p. 437 |
Problems | p. 438 |
Generalized Linear Models | p. 443 |
Introduction | p. 443 |
Logistic Regression Models | p. 444 |
Models with a Binary Response Variable | p. 444 |
Estimating the Parameters in a Logistic Regression Model | p. 447 |
Interpretation of the Parameters in a Logistic Regression Model | p. 450 |
Hypothesis Tests on Model Parameters | p. 453 |
Poisson Regression | p. 459 |
The Generalized Linear Model | p. 466 |
Link Functions and Linear Predictors | p. 467 |
Parameter Estimation and Inference in the GLM | p. 468 |
Prediction and Estimation with the GLM | p. 472 |
Residual Analysis in the GLM | p. 474 |
Overdispersion | p. 475 |
Problems | p. 477 |
Other Topics in the Use of Regression Analysis | p. 488 |
Regression Models with Autocorrelated Errors | p. 488 |
Source and Effects of Autocorrelation | p. 488 |
Detecting the Presence of Autocorrelation | p. 489 |
Parameter Estimation Methods | p. 494 |
Effect of Measurement Errors in the Regressors | p. 500 |
Simple Linear Regression | p. 501 |
The Berkson Model | p. 502 |
Inverse Estimation--The Calibration Problem | p. 503 |
Bootstrapping in Regression | p. 508 |
Bootstrap Sampling in Regression | p. 509 |
Bootstrap Confidence Intervals | p. 510 |
Classification and Regression Trees (CART) | p. 516 |
Neural Networks | p. 518 |
Designed Experiments for Regression | p. 521 |
Problems | p. 524 |
Validation of Regression Models | p. 529 |
Introduction | p. 529 |
Validation Techniques | p. 530 |
Analysis of Model Coefficients and Predicted Values | p. 530 |
Collecting Fresh Data--Confirmation Runs | p. 532 |
Data Splitting | p. 534 |
Data from Planned Experiments | p. 545 |
Problems | p. 545 |
Statistical Tables | p. 549 |
Data Sets for Exercises | p. 567 |
Supplemental Technical Material | p. 582 |
Background on Basic Test Statistics | p. 582 |
Background from the Theory of Linear Models | p. 585 |
Important Results on SS_R and SS_Res | p. 588 |
The Gauss-Markov Theorem, Var(ε) = σ²I | p. 594 |
Computational Aspects of Multiple Regression | p. 595 |
A Result on the Inverse of a Matrix | p. 597 |
Development of the PRESS Statistic | p. 598 |
Development of S²₍ᵢ₎ | p. 600 |
An Outlier Test Based on R-Student | p. 601 |
The Gauss-Markov Theorem, Var(ε) = V | p. 604 |
The Bias in MS_Res When the Model is Underspecified | p. 606 |
Computation of Influence Diagnostics | p. 608 |
Generalized Linear Models | p. 610 |
References | p. 621 |
Index | p. 637 |
Table of Contents provided by Syndetics. All Rights Reserved. |