Note: Supplemental materials are not guaranteed with Rental or Used book purchases.
Deborah Rumsey, PhD, is a Statistics Education Specialist and Auxiliary Faculty Member in the Department of Statistics at Ohio State University. She is also a Fellow of the American Statistical Association and has received the Presidential Teaching Award from Kansas State University. Dr. Rumsey has published numerous papers and given many professional presentations on the subject of statistics education.
Introduction | p. 1 |
About This Book | p. 1 |
Conventions Used in This Book | p. 2 |
What You're Not to Read | p. 3 |
Foolish Assumptions | p. 3 |
How This Book Is Organized | p. 3 |
Tackling Data Analysis and Model-Building Basics | p. 4 |
Using Different Types of Regression to Make Predictions | p. 4 |
Analyzing Variance with ANOVA | p. 4 |
Building Strong Connections with Chi-Square Tests | p. 5 |
Nonparametric Statistics: Rebels without a Distribution | p. 5 |
The Part of Tens | p. 5 |
Icons Used in This Book | p. 5 |
Where to Go from Here | p. 6 |
Tackling Data Analysis and Model-Building Basics | p. 7 |
Beyond Number Crunching: The Art and Science of Data Analysis | p. 9 |
Data Analysis: Looking before You Crunch | p. 9 |
Nothing (not even a straight line) lasts forever | p. 11 |
Data snooping isn't cool | p. 11 |
No (data) fishing allowed | p. 12 |
Getting the Big Picture: An Overview of Stats II | p. 13 |
Population parameter | p. 13 |
Sample statistic | p. 14 |
Confidence interval | p. 14 |
Hypothesis test | p. 15 |
Analysis of variance (ANOVA) | p. 15 |
Multiple comparisons | p. 16 |
Interaction effects | p. 16 |
Correlation | p. 17 |
Linear regression | p. 18 |
Chi-square tests | p. 19 |
Nonparametrics | p. 20 |
Finding the Right Analysis for the Job | p. 21 |
Categorical versus Quantitative Variables | p. 22 |
Statistics for Categorical Variables | p. 23 |
Estimating a proportion | p. 23 |
Comparing proportions | p. 24 |
Looking for relationships between categorical variables | p. 25 |
Building models to make predictions | p. 26 |
Statistics for Quantitative Variables | p. 27 |
Making estimates | p. 27 |
Making comparisons | p. 28 |
Exploring relationships | p. 28 |
Predicting y using x | p. 30 |
Avoiding Bias | p. 31 |
Measuring Precision with Margin of Error | p. 33 |
Knowing Your Limitations | p. 34 |
Reviewing Confidence Intervals and Hypothesis Tests | p. 37 |
Estimating Parameters by Using Confidence Intervals | p. 38 |
Getting the basics: The general form of a confidence interval | p. 38 |
Finding the confidence interval for a population mean | p. 39 |
What changes the margin of error? | p. 40 |
Interpreting a confidence interval | p. 43 |
What's the Hype about Hypothesis Tests? | p. 44 |
What Ho and Ha really represent | p. 44 |
Gathering your evidence into a test statistic | p. 45 |
Determining strength of evidence with a p-value | p. 45 |
False alarms and missed opportunities: Type I and II errors | p. 46 |
The power of a hypothesis test | p. 48 |
Using Different Types of Regression to Make Predictions | p. 53 |
Getting in Line with Simple Linear Regression | p. 55 |
Exploring Relationships with Scatterplots and Correlations | p. 56 |
Using scatterplots to explore relationships | p. 57 |
Collating the information by using the correlation coefficient | p. 58 |
Building a Simple Linear Regression Model | p. 60 |
Finding the best-fitting line to model your data | p. 60 |
The y-intercept of the regression line | p. 61 |
The slope of the regression line | p. 62 |
Making point estimates by using the regression line | p. 63 |
No Conclusion Left Behind: Tests and Confidence Intervals for Regression | p. 63 |
Scrutinizing the slope | p. 64 |
Inspecting the y-intercept | p. 66 |
Building confidence intervals for the average response | p. 68 |
Making the band with prediction intervals | p. 69 |
Checking the Model's Fit (The Data, Not the Clothes!) | p. 71 |
Defining the conditions | p. 71 |
Finding and exploring the residuals | p. 73 |
Using r² to measure model fit | p. 76 |
Scoping for outliers | p. 77 |
Knowing the Limitations of Your Regression Analysis | p. 79 |
Avoiding slipping into cause-and-effect mode | p. 79 |
Extrapolation: The ultimate no-no | p. 80 |
Sometimes you need more than one variable | p. 81 |
Multiple Regression with Two X Variables | p. 83 |
Getting to Know the Multiple Regression Model | p. 83 |
Discovering the uses of multiple regression | p. 84 |
Looking at the general form of the multiple regression model | p. 84 |
Stepping through the analysis | p. 85 |
Looking at x's and y's | p. 85 |
Collecting the Data | p. 86 |
Pinpointing Possible Relationships | p. 88 |
Making scatterplots | p. 88 |
Correlations: Examining the bond | p. 89 |
Checking for Multicollinearity | p. 91 |
Finding the Best-Fitting Model for Two x Variables | p. 92 |
Getting the multiple regression coefficients | p. 93 |
Interpreting the coefficients | p. 94 |
Testing the coefficients | p. 95 |
Predicting y by Using the x Variables | p. 97 |
Checking the Fit of the Multiple Regression Model | p. 98 |
Noting the conditions | p. 98 |
Plotting a plan to check the conditions | p. 98 |
Checking the three conditions | p. 100 |
How Can I Miss You If You Won't Leave? Regression Model Selection | p. 103 |
Getting a Kick out of Estimating Punt Distance | p. 104 |
Brainstorming variables and collecting data | p. 104 |
Examining scatterplots and correlations | p. 106 |
Just Like Buying Shoes: The Model Looks Nice, But Does It Fit? | p. 109 |
Assessing the fit of multiple regression models | p. 110 |
Model selection procedures | p. 111 |
Getting Ahead of the Learning Curve with Nonlinear Regression | p. 115 |
Anticipating Nonlinear Regression | p. 116 |
Starting Out with Scatterplots | p. 117 |
Handling Curves in the Road with Polynomials | p. 119 |
Bringing back Polynomials | p. 119 |
Searching for the best polynomial model | p. 122 |
Using a second-degree polynomial to pass the quiz | p. 123 |
Assessing the fit of a polynomial model | p. 126 |
Making predictions | p. 129 |
Going Up? Going Down? Go Exponential! | p. 130 |
Recollecting exponential models | p. 130 |
Searching for the best exponential model | p. 131 |
Spreading secrets at an exponential rate | p. 133 |
Yes, No, Maybe So: Making Predictions by Using Logistic Regression | p. 137 |
Understanding a Logistic Regression Model | p. 138 |
How is logistic regression different from other regressions? | p. 138 |
Using an S-curve to estimate probabilities | p. 139 |
Interpreting the coefficients of the logistic regression model | p. 140 |
The logistic regression model in action | p. 141 |
Carrying Out a Logistic Regression Analysis | p. 142 |
Running the analysis in Minitab | p. 142 |
Finding the coefficients and making the model | p. 144 |
Estimating p | p. 145 |
Checking the fit of the model | p. 146 |
Fitting the Movie Model | p. 147 |
Analyzing Variance with ANOVA | p. 151 |
Testing Lots of Means? Come On Over to ANOVA! | p. 153 |
Comparing Two Means with a t-Test | p. 154 |
Evaluating More Means with ANOVA | p. 155 |
Spitting seeds: A situation just waiting for ANOVA | p. 155 |
Walking through the steps of ANOVA | p. 156 |
Checking the Conditions | p. 157 |
Verifying independence | p. 157 |
Looking for what's normal | p. 158 |
Taking note of spread | p. 159 |
Setting Up the Hypotheses | p. 162 |
Doing the F-Test | p. 162 |
Running ANOVA in Minitab | p. 163 |
Breaking down the variance into sums of squares | p. 164 |
Locating those mean sums of squares | p. 165 |
Figuring the F-statistic | p. 166 |
Making conclusions from ANOVA | p. 168 |
What's next? | p. 169 |
Checking the Fit of the ANOVA Model | p. 170 |
Sorting Out the Means with Multiple Comparisons | p. 173 |
Following Up after ANOVA | p. 174 |
Comparing cellphone minutes: An example | p. 174 |
Setting the stage for multiple comparison procedures | p. 176 |
Pinpointing Differing Means with Fisher and Tukey | p. 177 |
Fishing for differences with Fisher's LSD | p. 178 |
Using Fisher's new and improved LSD | p. 179 |
Separating the turkeys with Tukey's test | p. 182 |
Examining the Output to Determine the Analysis | p. 183 |
So Many Other Procedures, So Little Time! | p. 184 |
Controlling for baloney with the Bonferroni adjustment | p. 185 |
Comparing combinations by using Scheffe's method | p. 186 |
Finding out whodunit with Dunnett's test | p. 186 |
Staying cool with Student Newman-Keuls | p. 187 |
Duncan's multiple range test | p. 187 |
Going nonparametric with the Kruskal-Wallis test | p. 188 |
Finding Your Way through Two-Way ANOVA | p. 191 |
Setting Up the Two-Way ANOVA Model | p. 192 |
Determining the treatments | p. 192 |
Stepping through the sums of squares | p. 193 |
Understanding Interaction Effects | p. 194 |
What is interaction, anyway? | p. 195 |
Interacting with interaction plots | p. 195 |
Testing the Terms in Two-Way ANOVA | p. 198 |
Running the Two-Way ANOVA Table | p. 199 |
Interpreting the results: Numbers and graphs | p. 200 |
Are Whites Whiter in Hot Water? Two-Way ANOVA Investigates | p. 202 |
Regression and ANOVA: Surprise Relatives! | p. 207 |
Seeing Regression through the Eyes of Variation | p. 208 |
Spotting variability and finding an "x-planation" | p. 208 |
Getting results with regression | p. 209 |
Assessing the fit of the regression model | p. 211 |
Regression and ANOVA: A Meeting of the Models | p. 212 |
Comparing sums of squares | p. 212 |
Dividing up the degrees of freedom | p. 214 |
Bringing regression to the ANOVA table | p. 215 |
Relating the F- and t-statistics: The final frontier | p. 216 |
Building Strong Connections with Chi-Square Tests | p. 219 |
Forming Associations with Two-Way Tables | p. 221 |
Breaking Down a Two-Way Table | p. 222 |
Organizing data into a two-way table | p. 222 |
Filling in the cell counts | p. 223 |
Making marginal totals | p. 224 |
Breaking Down the Probabilities | p. 225 |
Marginal probabilities | p. 226 |
Joint probabilities | p. 227 |
Conditional probabilities | p. 228 |
Trying To Be Independent | p. 233 |
Checking for independence between two categories | p. 233 |
Checking for independence between two variables | p. 235 |
Demystifying Simpson's Paradox | p. 236 |
Experiencing Simpson's Paradox | p. 236 |
Figuring out why Simpson's Paradox occurs | p. 239 |
Keeping one eye open for Simpson's Paradox | p. 240 |
Being Independent Enough for the Chi-Square Test | p. 241 |
The Chi-square Test for Independence | p. 242 |
Collecting and organizing the data | p. 243 |
Determining the hypotheses | p. 245 |
Figuring expected cell counts | p. 245 |
Checking the conditions for the test | p. 246 |
Calculating the Chi-square test statistic | p. 247 |
Finding your results on the Chi-square table | p. 249 |
Drawing your conclusions | p. 253 |
Putting the Chi-square to the test | p. 255 |
Comparing Two Tests for Comparing Two Proportions | p. 257 |
Getting reacquainted with the Z-test for two population proportions | p. 257 |
Equating Chi-square tests and Z-tests for a two-by-two table | p. 258 |
Using Chi-Square Tests for Goodness-of-Fit (Your Data, Not Your Jeans) | p. 263 |
Finding the Goodness-of-Fit Statistic | p. 264 |
What's observed versus what's expected | p. 264 |
Calculating the goodness-of-fit statistic | p. 266 |
Interpreting the Goodness-of-Fit Statistic Using a Chi-Square | p. 268 |
Checking the conditions before you start | p. 270 |
The steps of the Chi-square goodness-of-fit test | p. 270 |
Nonparametric Statistics: Rebels without a Distribution | p. 273 |
Going Nonparametric | p. 275 |
Arguing for Nonparametric Statistics | p. 275 |
No need to fret if conditions aren't met | p. 276 |
The median's in the spotlight for a change | p. 277 |
So, what's the catch? | p. 279 |
Mastering the Basics of Nonparametric Statistics | p. 280 |
Sign | p. 280 |
Rank | p. 282 |
Signed rank | p. 283 |
Rank sum | p. 284 |
All Signs Point to the Sign Test and Signed Rank Test | p. 287 |
Reading the Signs: The Sign Test | p. 288 |
Testing the median | p. 290 |
Estimating the median | p. 292 |
Testing matched pairs | p. 294 |
Going a Step Further with the Signed Rank Test | p. 296 |
A limitation of the sign test | p. 296 |
Stepping through the signed rank test | p. 297 |
Losing weight with signed ranks | p. 298 |
Pulling Rank with the Rank Sum Test | p. 303 |
Conducting the Rank Sum Test | p. 303 |
Checking the conditions | p. 303 |
Stepping through the test | p. 304 |
Stepping up the sample size | p. 306 |
Performing a Rank Sum Test: Which Real Estate Agent Sells Homes Faster? | p. 307 |
Checking the conditions for this test | p. 307 |
Testing the hypotheses | p. 309 |
Do the Kruskal-Wallis and Rank the Sums with the Wilcoxon | p. 313 |
Doing the Kruskal-Wallis Test to Compare More than Two Populations | p. 313 |
Checking the conditions | p. 315 |
Setting up the test | p. 317 |
Conducting the test step by step | p. 317 |
Pinpointing the Differences: The Wilcoxon Rank Sum Test | p. 320 |
Pairing off with pairwise comparisons | p. 320 |
Carrying out comparison tests to see who's different | p. 321 |
Examining the medians to see how they're different | p. 323 |
Pointing Out Correlations with Spearman's Rank | p. 325 |
Pickin' On Pearson and His Precious Conditions | p. 326 |
Scoring with Spearman's Rank Correlation | p. 327 |
Figuring Spearman's rank correlation | p. 328 |
Watching Spearman at work: Relating aptitude to performance | p. 329 |
The Part of Tens | p. 333 |
Ten Common Errors in Statistical Conclusions | p. 335 |
Ten Ways to Get Ahead by Knowing Statistics | p. 347 |
Ten Cool Jobs That Use Statistics | p. 357 |
Appendix: Reference Tables | p. 367 |
Index | p. 379 |
Table of Contents provided by Ingram. All Rights Reserved.