JENNIFER M. ROTHGEB is a Social Science Statistician at the Center for Survey Methods Research in the Statistical Research Division of the U.S. Census Bureau.
MICK P. COUPER is an Associate Research Professor at the Institute for Social Research, University of Michigan, and at the Joint Program in Survey Methodology, University of Maryland.
JUDITH T. LESSLER is Vice President, Partnership for Genomics and Molecular Epidemiology, Research Triangle Institute.
ELIZABETH MARTIN is Senior Survey Methodologist at the U.S. Census Bureau.
JEAN MARTIN is Director of the Social Analysis and Reporting Division of the Office for National Statistics, U.K.
ELEANOR SINGER is a Research Professor at the Institute for Social Research, University of Michigan.
Contributors | p. xi |
Preface | p. xiii |
Methods for Testing and Evaluating Survey Questions | p. 1 |
Cognitive Interviews
Cognitive Interviewing Revisited: A Useful Technique, in Theory? | p. 23 |
The Dynamics of Cognitive Interviewing | p. 45 |
Data Quality in Cognitive Interviews: The Case of Verbal Reports | p. 67 |
Do Different Cognitive Interview Techniques Produce Different Results? | p. 89 |
Supplements to Conventional Pretests
Evaluating Survey Questions by Analyzing Patterns of Behavior Codes and Question-Answer Sequences: A Diagnostic Approach | p. 109 |
Response Latency and (Para)Linguistic Expressions as Indicators of Response Error | p. 131 |
Vignettes and Respondent Debriefing for Questionnaire Design and Evaluation | p. 149 |
Experiments
The Case for More Split-Sample Experiments in Developing Survey Instruments | p. 173 |
Using Field Experiments to Improve Instrument Design: The SIPP Methods Panel Project | p. 189 |
Experimental Design Considerations for Testing and Evaluating Questionnaires | p. 209 |
Statistical Modeling
Modeling Measurement Error to Identify Flawed Questions | p. 225 |
Item Response Theory Modeling for Questionnaire Evaluation | p. 247 |
Development and Improvement of Questionnaires Using Predictions of Reliability and Validity | p. 275 |
Mode of Administration
Testing Paper Self-Administered Questionnaires: Cognitive Interview and Field Test Comparisons | p. 299 |
Methods for Testing and Evaluating Computer-Assisted Questionnaires | p. 319 |
Usability Testing to Evaluate Computer-Assisted Instruments | p. 337 |
Development and Testing of Web Questionnaires | p. 361 |
Special Populations
Evolution and Adaptation of Questionnaire Development, Evaluation, and Testing Methods for Establishment Surveys | p. 385 |
Pretesting Questionnaires for Children and Adolescents | p. 409 |
Developing and Evaluating Cross-National Survey Instruments | p. 431 |
Survey Questionnaire Translation and Assessment | p. 453 |
Multimethod Applications
A Multiple-Method Approach to Improving the Clarity of Closely Related Concepts: Distinguishing Legal and Physical Custody of Children | p. 475 |
Multiple Methods for Developing and Evaluating a Stated-Choice Questionnaire to Value Wetlands | p. 503 |
Does Pretesting Make a Difference? An Experimental Test | p. 525 |
References | p. 547 |
Index | p. 603 |