Details

Advances in Questionnaire Design, Development, Evaluation and Testing


1st edition

by: Paul C. Beatty, Debbie Collins, Lyn Kaye, Jose-Luis Padilla, Gordon B. Willis, Amanda Wilmot

103,99 €

Publisher: Wiley
Format: EPUB
Published: 24.10.2019
ISBN/EAN: 9781119263647
Language: English
Number of pages: 816

DRM-protected eBook; to read it you will need e.g. Adobe Digital Editions and an Adobe ID.

Description

A new and updated definitive resource for survey questionnaire testing and evaluation

Building on the success of the first Questionnaire Development, Evaluation, and Testing (QDET) conference in 2002, this book brings together leading papers from the Second International Conference on Questionnaire Design, Development, Evaluation, and Testing (QDET2), held in 2016. The volume assesses the current state of the art and science of QDET; examines the importance of methodological attention to the questionnaire in the present world of information collection; and considers how the QDET field can anticipate new trends and directions as information needs and data collection methods continue to evolve.

Featuring contributions from international experts in survey methodology, Advances in Questionnaire Design, Development, Evaluation and Testing includes the latest insights on question characteristics, usability testing, web probing, and other pretesting approaches, as well as:

- Recent developments in the design and evaluation of digital and self-administered surveys
- Strategies for comparing and combining questionnaire evaluation methods
- Approaches for cross-cultural and cross-national questionnaire development
- New data sources and methodological innovations during the last 15 years
- Case studies and practical applications

Advances in Questionnaire Design, Development, Evaluation and Testing serves as a forum to prepare researchers to meet the next generation of challenges, making it an excellent resource for researchers and practitioners in government, academia, and the private sector.
Table of Contents

List of Contributors xvii
Preface xxiii

Part I: Assessing the Current Methodology for Questionnaire Design, Development, Testing, and Evaluation 1

1 Questionnaire Design, Development, Evaluation, and Testing: Where are We, and Where are We Headed? 3
Gordon B. Willis
1.1 Current State of the Art and Science of QDET 3
1.2 Relevance of QDET in the Evolving World of Surveys 11
1.3 Looking Ahead: Further Developments in QDET 16
1.4 Conclusion 19
References 20

2 Asking the Right Questions in the Right Way: Six Needed Changes in Questionnaire Evaluation and Testing Methods 25
Don A. Dillman
2.1 Personal Experiences with Cognitive Interviews and Focus Groups 25
2.2 My 2002 Experience at QDET 29
2.3 Six Changes in Survey Research that Require New Perspectives on Questionnaire Evaluation and Testing 33
2.4 Conclusion 42
References 43

3 A Framework for Making Decisions about Question Evaluation Methods 47
Roger Tourangeau, Aaron Maitland, Darby Steiger, and Ting Yan
3.1 Introduction 47
3.2 Expert Reviews 48
3.3 Laboratory Methods 51
3.4 Field Methods 55
3.5 Statistical Modeling for Data Quality 59
3.6 Comparing Different Methods 63
3.7 Recommendations 67
References 69

4 A Comparison of Five Question Evaluation Methods in Predicting the Validity of Respondent Answers to Factual Items 75
Aaron Maitland and Stanley Presser
4.1 Introduction 75
4.2 Methods 76
4.3 Results 79
4.4 Discussion 84
References 85

5 Combining Multiple Question Evaluation Methods: What Does It Mean When the Data Appear to Conflict? 91
Jo d'Ardenne and Debbie Collins
5.1 Introduction 91
5.2 Questionnaire Development Stages 92
5.3 Selection of Case Studies 93
5.4 Case Study 1: Conflicting Findings Between Focus Groups and Cognitive Interviews 95
5.5 Case Study 2: Conflicting Findings Between Eye-Tracking, Respondent Debriefing Questions, and Interviewer Feedback 97
5.6 Case Study 3: Complementary Findings Between Cognitive Interviews and Interviewer Feedback 100
5.7 Case Study 4: Combining Qualitative and Quantitative Data to Assess Changes to a Travel Diary 104
5.8 Framework of QT Methods 110
5.9 Summary and Discussion 110
References 114

Part II: Question Characteristics, Response Burden, and Data Quality 117

6 The Role of Question Characteristics in Designing and Evaluating Survey Questions 119
Jennifer Dykema, Nora Cate Schaeffer, Dana Garbarski, and Michael Hout
6.1 Introduction 119
6.2 Overview of Some of the Approaches Used to Conceptualize, Measure, and Code Question Characteristics 120
6.3 Taxonomy of Question Characteristics 127
6.4 Case Studies 132
6.5 Discussion 141
Acknowledgments 147
References 148

7 Exploring the Associations Between Question Characteristics, Respondent Characteristics, Interviewer Performance Measures, and Survey Data Quality 153
James M. Dahlhamer, Aaron Maitland, Heather Ridolfo, Antuane Allen, and Dynesha Brooks
7.1 Introduction 153
7.2 Methods 157
7.3 Results 174
7.4 Discussion 182
Disclaimer 191
References 191

8 Response Burden: What Is It and What Predicts It? 193
Ting Yan, Scott Fricker, and Shirley Tsai
8.1 Introduction 193
8.2 Methods 197
8.3 Results 202
8.4 Conclusions and Discussion 206
Acknowledgments 210
References 210

9 The Salience of Survey Burden and Its Effect on Response Behavior to Skip Questions: Experimental Results from Telephone and Web Surveys 213
Frauke Kreuter, Stephanie Eckman, and Roger Tourangeau
9.1 Introduction 213
9.2 Study Designs and Methods 216
9.3 Manipulating the Interleafed Format 219
9.4 Discussion and Conclusion 224
Acknowledgments 226
References 227

10 A Comparison of Fully Labeled and Top-Labeled Grid Question Formats 229
Jolene D. Smyth and Kristen Olson
10.1 Introduction 229
10.2 Data and Methods 236
10.3 Findings 243
10.4 Discussion and Conclusions 253
Acknowledgments 254
References 255

11 The Effects of Task Difficulty and Conversational Cueing on Answer Formatting Problems in Surveys 259
Yfke Ongena and Sanne Unger
11.1 Introduction 259
11.2 Factors Contributing to Respondents' Formatting Problems 262
11.3 Hypotheses 267
11.4 Method and Data 268
11.5 Results 275
11.6 Discussion and Conclusion 278
11.7 Further Expansion of the Current Study 281
11.8 Conclusions 282
References 283

Part III: Improving Questionnaires on the Web and Mobile Devices 287

12 A Compendium of Web and Mobile Survey Pretesting Methods 289
Emily Geisen and Joe Murphy
12.1 Introduction 289
12.2 Review of Traditional Pretesting Methods 290
12.3 Emerging Pretesting Methods 294
References 308

13 Usability Testing Online Questionnaires: Experiences at the U.S. Census Bureau 315
Elizabeth Nichols, Erica Olmsted-Hawala, Temika Holland, and Amy Anderson Riemer
13.1 Introduction 315
13.2 History of Usability Testing Self-Administered Surveys at the US Census Bureau 316
13.3 Current Usability Practices at the Census Bureau 317
13.4 Participants: "Real Users, Not User Stories" 320
13.5 Building Usability Testing into the Development Life Cycle 323
13.6 Measuring Accuracy 327
13.7 Measuring Efficiency 331
13.8 Measuring Satisfaction 335
13.9 Retrospective Probing and Debriefing 337
13.10 Communicating Findings with the Development Team 339
13.11 Assessing Whether Usability Test Recommendations Worked 340
13.12 Conclusions 341
References 341

14 How Mobile Device Screen Size Affects Data Collected in Web Surveys 349
Daniele Toninelli and Melanie Revilla
14.1 Introduction 349
14.2 Literature Review 350
14.3 Our Contribution and Hypotheses 352
14.4 Data Collection and Method 355
14.5 Main Results 361
14.6 Discussion 368
Acknowledgments 369
References 370

15 Optimizing Grid Questions for Smartphones: A Comparison of Optimized and Non-Optimized Designs and Effects on Data Quality on Different Devices 375
Trine Dale and Heidi Walsoe
15.1 Introduction 375
15.2 The Need for Change in Questionnaire Design Practices 376
15.3 Contribution and Research Questions 378
15.4 Data Collection and Methodology 380
15.5 Main Results 386
15.6 Discussion 392
Acknowledgments 397
References 397

16 Learning from Mouse Movements: Improving Questionnaires and Respondents' User Experience Through Passive Data Collection 403
Rachel Horwitz, Sarah Brockhaus, Felix Henninger, Pascal J. Kieslich, Malte Schierholz, Florian Keusch, and Frauke Kreuter
16.1 Introduction 403
16.2 Background 404
16.3 Data 409
16.4 Methodology 410
16.5 Results 415
16.6 Discussion 420
References 423

17 Using Targeted Embedded Probes to Quantify Cognitive Interviewing Findings 427
Paul Scanlon
17.1 Introduction 427
17.2 The NCHS Research and Development Survey 431
17.3 Findings 433
17.4 Discussion 445
References 448

18 The Practice of Cognitive Interviewing Through Web Probing 451
Stephanie Fowler and Gordon B. Willis
18.1 Introduction 451
18.2 Methodological Issues in the Use of Web Probing for Pretesting 452
18.3 Testing the Effect of Probe Placement 453
18.4 Analyses of Responses to Web Probes 455
18.5 Qualitative Analysis of Responses to Probes 459
18.6 Qualitative Coding of Responses 459
18.7 Current State of the Use of Web Probes 462
18.8 Limitations 465
18.9 Recommendations for the Application and Further Evaluation of Web Probes 466
18.10 Conclusion 468
Acknowledgments 468
References 468

Part IV: Cross-Cultural and Cross-National Questionnaire Design and Evaluation 471

19 Optimizing Questionnaire Design in Cross-National and Cross-Cultural Surveys 473
Tom W. Smith
19.1 Introduction 473
19.2 The Total Survey Error Paradigm and Comparison Error 474
19.3 Cross-Cultural Survey Guidelines and Resources 477
19.4 Translation 478
19.5 Developing Comparative Scales 480
19.6 Focus Groups and Pretesting in Cross-National/Cultural Surveys 483
19.7 Tools for Developing and Managing Cross-National Surveys 484
19.8 Resources for Developing and Testing Cross-National Measures 485
19.9 Pre- and Post-Harmonization 486
19.10 Conclusion 488
References 488

20 A Model for Cross-National Questionnaire Design and Pretesting 493
Rory Fitzgerald and Diana Zavala-Rojas
20.1 Introduction 493
20.2 Background 493
20.3 The European Social Survey 495
20.4 ESS Questionnaire Design Approach 496
20.5 Critique of the Seven-Stage Approach 497
20.6 A Model for Cross-National Questionnaire Design and Pretesting 497
20.7 Evaluation of the Model for Cross-National Questionnaire Design and Pretesting Using the Logical Framework Matrix (LFM) 501
20.8 Conclusions 512
References 514

21 Cross-National Web Probing: An Overview of Its Methodology and Its Use in Cross-National Studies 521
Dorothée Behr, Katharina Meitinger, Michael Braun, and Lars Kaczmirek
21.1 Introduction 521
21.2 Cross-National Web Probing – Its Goal, Strengths, and Weaknesses 523
21.3 Access to Respondents Across Countries: The Example of Online Access Panels and Probability-Based Panels 526
21.4 Implementation of Standardized Probes 527
21.5 Translation and Coding Answers to Cross-Cultural Probes 532
21.6 Substantive Results 533
21.7 Cross-National Web Probing and Its Application Throughout the Survey Life Cycle 536
21.8 Conclusions and Outlook 538
Acknowledgments 539
References 539

22 Measuring Disability Equality in Europe: Design and Development of the European Health and Social Integration Survey Questionnaire 545
Amanda Wilmot
22.1 Introduction 545
22.2 Background 546
22.3 Questionnaire Design 548
22.4 Questionnaire Development and Testing 553
22.5 Survey Implementation 560
22.6 Lessons Learned 563
22.7 Final Reflections 566
Acknowledgments 567
References 567

Part V: Extensions and Applications 571

23 Regression-Based Response Probing for Assessing the Validity of Survey Questions 573
Patrick Sturgis, Ian Brunton-Smith, and Jonathan Jackson
23.1 Introduction 573
23.2 Cognitive Methods for Assessing Question Validity 574
23.3 Regression-Based Response Probing 577
23.4 Example 1: Generalized Trust 579
23.5 Example 2: Fear of Crime 580
23.6 Data 581
23.7 Discussion 586
References 588

24 The Interplay Between Survey Research and Psychometrics, with a Focus on Validity Theory 593
Bruno D. Zumbo and José-Luis Padilla
24.1 Introduction 593
24.2 An Over-the-Shoulder Look Back at Validity Theory and Validation Practices with an Eye toward Describing Contemporary Validity Theories 595
24.3 An Approach to Validity that Bridges Psychometrics and Survey Design 602
24.4 Closing Remarks 606
References 608

25 Quality-Driven Approaches for Managing Complex Cognitive Testing Projects 613
Martha Stapleton, Darby Steiger, and Mary C. Davis
25.1 Introduction 613
25.2 Characteristics of the Four Cognitive Testing Projects 614
25.3 Identifying Detailed, Quality-Driven Management Approaches for Qualitative Research 615
25.4 Identifying Principles for Developing Quality-Driven Management Approaches 616
25.5 Applying the Concepts of Transparency and Consistency 617
25.6 The 13 Quality-Driven Management Approaches 618
25.7 Discussion and Conclusion 632
References 634

26 Using Iterative, Small-Scale Quantitative and Qualitative Studies: A Review of 15 Years of Research to Redesign a Major US Federal Government Survey 639
Joanne Pascale
26.1 Introduction 639
26.2 Measurement Issues in Health Insurance 641
26.3 Methods and Results 645
26.4 Discussion 660
26.5 Final Reflections 663
References 664

27 Contrasting Stylized Questions of Sleep with Diary Measures from the American Time Use Survey 671
Robin L. Kaplan, Brandon Kopp, and Polly Phipps
27.1 Introduction 671
27.2 The Sleep Gap 672
27.3 The Present Research 674
27.4 Study 1: Behavior Coding 675
27.5 Study 2: Cognitive Interviews 678
27.6 Study 3: Quantitative Study 682
27.7 Study 4: Validation Study 686
27.8 General Discussion 689
27.9 Implications and Future Directions 692
References 692

28 Questionnaire Design Issues in Mail Surveys of All Adults in a Household 697
Douglas Williams, J. Michael Brick, W. Sherman Edwards, and Pamela Giambo
28.1 Introduction 697
28.2 Background 698
28.3 The NCVS and Mail Survey Design Challenges 699
28.4 Field Test Methods and Design 704
28.5 Outcome Measures 706
28.6 Findings 708
28.7 Summary 716
28.8 Discussion 716
28.9 Conclusion 719
References 720

29 Planning Your Multimethod Questionnaire Testing Bento Box: Complementary Methods for a Well-Balanced Test 723
Jaki S. McCarthy
29.1 Introduction 723
29.2 A Questionnaire Testing Bento Box 725
29.3 Examples from the Census of Agriculture Questionnaire Testing Bento Box 733
29.4 Conclusion 743
References 744

30 Flexible Pretesting on a Tight Budget: Using Multiple Dependent Methods to Maximize Effort-Return Trade-Offs 749
Matt Jans, Jody L. Herman, Joseph Viana, David Grant, Royce Park, Bianca D.M. Wilson, Jane Tom, Nicole Lordi, and Sue Holtby
30.1 Introduction 749
30.2 Evolution of a Dependent Pretesting Approach for Gender Identity Measurement 752
30.3 Analyzing and Synthesizing Results 759
30.4 Discussion 764
Acknowledgments 766
References 766

Index 769
About the Authors

PAUL C. BEATTY is Chief of the Center for Behavioral Science Methods at the U.S. Census Bureau.

DEBBIE COLLINS is a Senior Research Director at the National Centre for Social Research, UK.

LYN KAYE is a consultant in survey research methods and formerly Senior Researcher at Statistics New Zealand.

JOSE-LUIS PADILLA is Professor of Methodology of Behavioral Sciences at the University of Granada, Spain.

GORDON B. WILLIS is a Cognitive Psychologist at the National Cancer Institute, National Institutes of Health, USA.

AMANDA WILMOT is a Senior Study Director at Westat, USA.

You may also be interested in these products:

Statistics for Microarrays
by: Ernst Wit, John McClure
PDF ebook
90,99 €