
Regression Analysis by Example, 4th Edition

by Samprit Chatterjee; Ali S. Hadi
  • ISBN13: 9780471746966
  • ISBN10: 0471746967

  • Format: Hardcover
  • Copyright: 2006-07-01
  • Publisher: Wiley-Interscience
List Price: $152.00

Summary

The essentials of regression analysis through practical applications. Regression analysis is a conceptually simple method for investigating relationships among variables. Carrying out a successful application of regression analysis, however, requires a balance of theoretical results, empirical rules, and subjective judgment. Regression Analysis by Example, Fourth Edition has been expanded and thoroughly updated to reflect recent advances in the field. The emphasis continues to be on exploratory data analysis rather than statistical theory. The book offers in-depth treatment of regression diagnostics, transformation, multicollinearity, logistic regression, and robust regression.

This new edition features the following enhancements:

  • Chapter 12, Logistic Regression, is expanded to reflect the increased use of logit models in statistical analysis
  • A new chapter entitled Further Topics discusses advanced areas of regression analysis
  • Reorganized, expanded, and upgraded exercises appear at the end of each chapter
  • A fully integrated Web page provides data sets
  • Numerous graphical displays highlight the significance of visual appeal

Regression Analysis by Example, Fourth Edition is suitable for anyone with an understanding of elementary statistics. Methods of regression analysis are clearly demonstrated, and examples containing the types of irregularities commonly encountered in the real world are provided. Each example isolates one or two techniques and features detailed discussions of the techniques themselves, the required assumptions, and an evaluation of each technique's success. The methods described throughout the book can be carried out with most currently available statistical software packages, such as R. An Instructor's Manual presenting detailed solutions to all the problems in the book is available online from the Wiley editorial department.
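To give a flavor of the "by example" approach the summary describes, here is a minimal sketch of simple linear regression (the subject of the book's Chapter 2), fitting a line by ordinary least squares. The data and function name are illustrative inventions, not taken from the book, which uses packages such as R for its examples.

```python
def fit_simple_ols(x, y):
    """Return (intercept, slope) minimizing the sum of squared residuals
    for the simple linear model y = b0 + b1 * x."""
    n = len(x)
    x_bar = sum(x) / n
    y_bar = sum(y) / n
    # slope = sum((x - x_bar)(y - y_bar)) / sum((x - x_bar)^2)
    sxy = sum((xi - x_bar) * (yi - y_bar) for xi, yi in zip(x, y))
    sxx = sum((xi - x_bar) ** 2 for xi in x)
    slope = sxy / sxx
    intercept = y_bar - slope * x_bar
    return intercept, slope

# Made-up example data in which y grows roughly twice as fast as x.
x = [1.0, 2.0, 3.0, 4.0]
y = [2.1, 4.0, 6.2, 7.9]
b0, b1 = fit_simple_ols(x, y)
print(b0, b1)  # intercept 0.15, slope 1.96 (up to floating-point rounding)
```

The closed-form estimates used here are the standard least-squares formulas developed in any introductory treatment, including Chapter 2 of this book.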

Author Biography

SAMPRIT CHATTERJEE, PHD, is Professor of Health Policy at Mount Sinai School of Medicine. He is also Professor Emeritus of Statistics at New York University. A well-known research scientist and Fulbright scholar, Dr. Chatterjee has co-authored Sensitivity Analysis in Linear Regression (with Dr. Hadi) and A Casebook for a First Course in Statistics and Data Analysis, both published by Wiley.

ALI S. HADI, PHD, is Vice Provost and Professor of Mathematical, Statistical, and Computing Sciences at The American University in Cairo. He is also a Stephen H. Weiss Presidential Fellow and Professor Emeritus at Cornell University. Dr. Hadi is the author/co-author of four other books, a Fellow of the American Statistical Association, and an elected member of the International Statistical Institute.

Table of Contents

Preface xiii
1 Introduction 1(20)
1.1 What Is Regression Analysis? 1(1)
1.2 Publicly Available Data Sets 2(1)
1.3 Selected Applications of Regression Analysis 3(4)
1.3.1 Agricultural Sciences 3(1)
1.3.2 Industrial and Labor Relations 3(1)
1.3.3 History 4(2)
1.3.4 Government 6(1)
1.3.5 Environmental Sciences 6(1)
1.4 Steps in Regression Analysis 7(10)
1.4.1 Statement of the Problem 11(1)
1.4.2 Selection of Potentially Relevant Variables 11(1)
1.4.3 Data Collection 11(1)
1.4.4 Model Specification 12(2)
1.4.5 Method of Fitting 14(1)
1.4.6 Model Fitting 14(2)
1.4.7 Model Criticism and Selection 16(1)
1.4.8 Objectives of Regression Analysis 16(1)
1.5 Scope and Organization of the Book 17(1)
Exercises 18(3)
2 Simple Linear Regression 21(32)
2.1 Introduction 21(1)
2.2 Covariance and Correlation Coefficient 21(5)
2.3 Example: Computer Repair Data 26(2)
2.4 The Simple Linear Regression Model 28(1)
2.5 Parameter Estimation 29(3)
2.6 Tests of Hypotheses 32(5)
2.7 Confidence Intervals 37(1)
2.8 Predictions 37(2)
2.9 Measuring the Quality of Fit 39(3)
2.10 Regression Line Through the Origin 42(2)
2.11 Trivial Regression Models 44(1)
2.12 Bibliographic Notes 45(1)
Exercises 45(8)
3 Multiple Linear Regression 53(32)
3.1 Introduction 53(1)
3.2 Description of the Data and Model 53(1)
3.3 Example: Supervisor Performance Data 54(3)
3.4 Parameter Estimation 57(1)
3.5 Interpretations of Regression Coefficients 58(2)
3.6 Properties of the Least Squares Estimators 60(1)
3.7 Multiple Correlation Coefficient 61(1)
3.8 Inference for Individual Regression Coefficients 62(2)
3.9 Tests of Hypotheses in a Linear Model 64(10)
3.9.1 Testing All Regression Coefficients Equal to Zero 66(3)
3.9.2 Testing a Subset of Regression Coefficients Equal to Zero 69(2)
3.9.3 Testing the Equality of Regression Coefficients 71(2)
3.9.4 Estimating and Testing of Regression Parameters Under Constraints 73(1)
3.10 Predictions 74(1)
3.11 Summary 75(1)
Exercises 75(7)
Appendix: Multiple Regression in Matrix Notation 82(3)
4 Regression Diagnostics: Detection of Model Violations 85(36)
4.1 Introduction 85(1)
4.2 The Standard Regression Assumptions 86(2)
4.3 Various Types of Residuals 88(2)
4.4 Graphical Methods 90(3)
4.5 Graphs Before Fitting a Model 93(4)
4.5.1 One-Dimensional Graphs 93(1)
4.5.2 Two-Dimensional Graphs 93(3)
4.5.3 Rotating Plots 96(1)
4.5.4 Dynamic Graphs 96(1)
4.6 Graphs After Fitting a Model 97(1)
4.7 Checking Linearity and Normality Assumptions 97(1)
4.8 Leverage, Influence, and Outliers 98(5)
4.8.1 Outliers in the Response Variable 100(1)
4.8.2 Outliers in the Predictors 100(1)
4.8.3 Masking and Swamping Problems 100(3)
4.9 Measures of Influence 103(4)
4.9.1 Cook's Distance 103(1)
4.9.2 Welsch and Kuh Measure 104(1)
4.9.3 Hadi's Influence Measure 105(2)
4.10 The Potential-Residual Plot 107(1)
4.11 What to Do with the Outliers? 108(1)
4.12 Role of Variables in a Regression Equation 109(5)
4.12.1 Added-Variable Plot 109(1)
4.12.2 Residual Plus Component Plot 110(4)
4.13 Effects of an Additional Predictor 114(1)
4.14 Robust Regression 115(1)
Exercises 115(6)
5 Qualitative Variables as Predictors 121(30)
5.1 Introduction 121(1)
5.2 Salary Survey Data 122(3)
5.3 Interaction Variables 125(3)
5.4 Systems of Regression Equations 128(11)
5.4.1 Models with Different Slopes and Different Intercepts 130(7)
5.4.2 Models with Same Slope and Different Intercepts 137(1)
5.4.3 Models with Same Intercept and Different Slopes 138(1)
5.5 Other Applications of Indicator Variables 139(1)
5.6 Seasonality 140(1)
5.7 Stability of Regression Parameters Over Time 141(2)
Exercises 143(8)
6 Transformation of Variables 151(28)
6.1 Introduction 151(2)
6.2 Transformations to Achieve Linearity 153(2)
6.3 Bacteria Deaths Due to X-Ray Radiation 155(4)
6.3.1 Inadequacy of a Linear Model 156(2)
6.3.2 Logarithmic Transformation for Achieving Linearity 158(1)
6.4 Transformations to Stabilize Variance 159(5)
6.5 Detection of Heteroscedastic Errors 164(2)
6.6 Removal of Heteroscedasticity 166(1)
6.7 Weighted Least Squares 167(1)
6.8 Logarithmic Transformation of Data 168(1)
6.9 Power Transformation 169(4)
6.10 Summary 173(1)
Exercises 174(5)
7 Weighted Least Squares 179(18)
7.1 Introduction 179(1)
7.2 Heteroscedastic Models 180(3)
7.2.1 Supervisors Data 180(2)
7.2.2 College Expense Data 182(1)
7.3 Two-Stage Estimation 183(2)
7.4 Education Expenditure Data 185(9)
7.5 Fitting a Dose-Response Relationship Curve 194(2)
Exercises 196(1)
8 The Problem of Correlated Errors 197(24)
8.1 Introduction: Autocorrelation 197(1)
8.2 Consumer Expenditure and Money Stock 198(2)
8.3 Durbin-Watson Statistic 200(2)
8.4 Removal of Autocorrelation by Transformation 202(2)
8.5 Iterative Estimation With Autocorrelated Errors 204(1)
8.6 Autocorrelation and Missing Variables 205(1)
8.7 Analysis of Housing Starts 206(4)
8.8 Limitations of Durbin-Watson Statistic 210(1)
8.9 Indicator Variables to Remove Seasonality 211(3)
8.10 Regressing Two Time Series 214(2)
Exercises 216(5)
9 Analysis of Collinear Data 221(38)
9.1 Introduction 221(1)
9.2 Effects on Inference 222(6)
9.3 Effects on Forecasting 228(5)
9.4 Detection of Multicollinearity 233(6)
9.5 Centering and Scaling 239(4)
9.5.1 Centering and Scaling in Intercept Models 240(1)
9.5.2 Scaling in No-Intercept Models 241(2)
9.6 Principal Components Approach 243(3)
9.7 Imposing Constraints 246(2)
9.8 Searching for Linear Functions of the β's 248(4)
9.9 Computations Using Principal Components 252(2)
9.10 Bibliographic Notes 254(1)
Exercises 254(1)
Appendix: Principal Components 255(4)
10 Biased Estimation of Regression Coefficients 259(22)
10.1 Introduction 259(1)
10.2 Principal Components Regression 260(2)
10.3 Removing Dependence Among the Predictors 262(2)
10.4 Constraints on the Regression Coefficients 264(1)
10.5 Principal Components Regression: A Caution 265(3)
10.6 Ridge Regression 268(1)
10.7 Estimation by the Ridge Method 269(3)
10.8 Ridge Regression: Some Remarks 272(3)
10.9 Summary 275(1)
Exercises 275(2)
Appendix: Ridge Regression 277(4)
11 Variable Selection Procedures 281(36)
11.1 Introduction 281(1)
11.2 Formulation of the Problem 282(1)
11.3 Consequences of Variables Deletion 282(2)
11.4 Uses of Regression Equations 284(1)
11.4.1 Description and Model Building 284(1)
11.4.2 Estimation and Prediction 284(1)
11.4.3 Control 284(1)
11.5 Criteria for Evaluating Equations 285(3)
11.5.1 Residual Mean Square 285(1)
11.5.2 Mallows Cp 286(1)
11.5.3 Information Criteria: Akaike and Other Modified Forms 287(1)
11.6 Multicollinearity and Variable Selection 288(1)
11.7 Evaluating All Possible Equations 288(1)
11.8 Variable Selection Procedures 289(2)
11.8.1 Forward Selection Procedure 289(1)
11.8.2 Backward Elimination Procedure 290(1)
11.8.3 Stepwise Method 290(1)
11.9 General Remarks on Variable Selection Methods 291(1)
11.10 A Study of Supervisor Performance 292(4)
11.11 Variable Selection With Collinear Data 296(1)
11.12 The Homicide Data 296(3)
11.13 Variable Selection Using Ridge Regression 299(1)
11.14 Selection of Variables in an Air Pollution Study 300(7)
11.15 A Possible Strategy for Fitting Regression Models 307(1)
11.16 Bibliographic Notes 308(1)
Exercises 308(5)
Appendix: Effects of Incorrect Model Specifications 313(4)
12 Logistic Regression 317(24)
12.1 Introduction 317(1)
12.2 Modeling Qualitative Data 318(1)
12.3 The Logit Model 318(2)
12.4 Example: Estimating Probability of Bankruptcies 320(3)
12.5 Logistic Regression Diagnostics 323(1)
12.6 Determination of Variables to Retain 324(3)
12.7 Judging the Fit of a Logistic Regression 327(2)
12.8 The Multinomial Logit Model 329(7)
12.8.1 Multinomial Logistic Regression 329(1)
12.8.2 Example: Determining Chemical Diabetes 330(4)
12.8.3 Ordered Response Category: Ordinal Logistic Regression 334(1)
12.8.4 Example: Determining Chemical Diabetes Revisited 335(1)
12.9 Classification Problem: Another Approach 336(1)
Exercises 337(4)
13 Further Topics 341(12)
13.1 Introduction 341(1)
13.2 Generalized Linear Model 341(1)
13.3 Poisson Regression Model 342(1)
13.4 Introduction of New Drugs 343(2)
13.5 Robust Regression 345(1)
13.6 Fitting a Quadratic Model 346(2)
13.7 Distribution of PCB in U.S. Bays 348(4)
Exercises 352(1)
Appendix A: Statistical Tables 353(10)
References 363(8)
Index 371

Supplemental Materials

What is included with this book?

The New copy of this book will include any supplemental materials advertised. Please check the title of the book to determine if it should include any access cards, study guides, lab manuals, CDs, etc.

The Used, Rental and eBook copies of this book are not guaranteed to include any supplemental materials. Typically, only the book itself is included. This is true even if the title states it includes any access cards, study guides, lab manuals, CDs, etc.
