WILEY SERIES IN SURVEY METHODOLOGY
Established in Part by Walter A. Shewhart and Samuel S. Wilks
Editors: Mick P. Couper, Graham Kalton, Lars Lyberg, J. N. K. Rao, Norbert Schwarz,
Christopher Skinner
Editor Emeritus: Robert M. Groves
A complete list of the titles in this series appears at the end of this volume.
Edited by Paul J. Lavrakas, Michael W. Traugott, Courtney Kennedy, Allyson L. Holbrook, Edith D. de Leeuw, and Brady T. West
This edition first published 2019
© 2019 John Wiley & Sons, Inc.
All rights reserved. No part of this publication may be reproduced, stored in a retrieval system, or transmitted, in any form or by any means, electronic, mechanical, photocopying, recording or otherwise, except as permitted by law. Advice on how to obtain permission to reuse material from this title is available at http://www.wiley.com/go/permissions.
The right of Paul J. Lavrakas, Michael W. Traugott, Courtney Kennedy, Allyson L. Holbrook, Edith D. de Leeuw, and Brady T. West to be identified as the authors of the editorial material in this work has been asserted in accordance with law.
Registered Office
John Wiley & Sons, Inc., 111 River Street, Hoboken, NJ 07030, USA
Editorial Office
111 River Street, Hoboken, NJ 07030, USA
For details of our global editorial offices, customer services, and more information about Wiley products visit us at www.wiley.com.
Wiley also publishes its books in a variety of electronic formats and by print‐on‐demand. Some content that appears in standard print versions of this book may not be available in other formats.
Limit of Liability/Disclaimer of Warranty
While the publisher and authors have used their best efforts in preparing this work, they make no representations or warranties with respect to the accuracy or completeness of the contents of this work and specifically disclaim all warranties, including without limitation any implied warranties of merchantability or fitness for a particular purpose. No warranty may be created or extended by sales representatives, written sales materials or promotional statements for this work. The fact that an organization, website, or product is referred to in this work as a citation and/or potential source of further information does not mean that the publisher and authors endorse the information or services the organization, website, or product may provide or recommendations it may make. This work is sold with the understanding that the publisher is not engaged in rendering professional services. The advice and strategies contained herein may not be suitable for your situation. You should consult with a specialist where appropriate. Further, readers should be aware that websites listed in this work may have changed or disappeared between when this work was written and when it is read. Neither the publisher nor authors shall be liable for any loss of profit or any other commercial damages, including but not limited to special, incidental, consequential, or other damages.
Library of Congress Cataloging‐in‐Publication Data
Names: Lavrakas, Paul J., editor.
Title: Experimental methods in survey research : techniques that combine
random sampling with random assignment / edited by Paul J. Lavrakas, [and
five others].
Description: New Jersey : Wiley, 2020. | Series: Wiley series in survey
methodology | Includes bibliographical references and index. |
Identifiers: LCCN 2019015585 (print) | LCCN 2019021945 (ebook) | ISBN
9781119083757 (Adobe PDF) | ISBN 9781119083764 (ePub) | ISBN 9781119083740
(hardback)
Subjects: LCSH: Social surveys. | Social sciences–Research–Methodology. |
Surveys–Evaluation. | BISAC: SOCIAL SCIENCE / Statistics.
Classification: LCC HM538 (ebook) | LCC HM538 .E97 2020 (print) | DDC
300.72/3–dc23
LC record available at https://lccn.loc.gov/2019015585
Cover design: Wiley
Cover image: © Angelina Babii/Shutterstock
We dedicate this book with gratitude to the contributions made to the use of experimental design by Donald T. Campbell, Thomas D. Cook, Robert Boruch, Marilyn B. Brewer, Jeanne Converse, Stephen Fienberg, Stanley Presser, Howard Schuman, and Eleanor Singer, and to their influence on our own usage of survey‐based experiments.
Katrin Auspurg
Ludwig Maximilian University of Munich
Munich, Germany
Paul Beatty
Center for Behavioral Science Methods
U.S. Census Bureau
Washington, DC, USA
A. Bianchi
Department of Management, Economics and Quantitative Methods
University of Bergamo
via dei Caniana 2, 24127 Bergamo, Italy
S. Biffignandi
Department of Management, Economics and Quantitative Methods
University of Bergamo
via dei Caniana 2, 24127 Bergamo, Italy
Philip S. Brenner
Department of Sociology and Center for Survey Research
University of Massachusetts
100 Morrissey Blvd., Boston, MA 02125, USA
Alexandru Cernat
Social Statistics Department
University of Manchester
Manchester, M13 9PL, UK
David J. Ciuk
Department of Government
Franklin & Marshall College
Lancaster, PA, USA
Carol Cosenza
Center for Survey Research
University of Massachusetts
Boston, MA, USA
Mathew J. Creighton
School of Sociology, Geary Institute for Public Policy
University College Dublin
Stillorgan Road, Dublin 4, Ireland
Andrew W. Crosby
Department of Public Administration
Pace University
New York, NY 10038, USA
Richard Curtin
Institute for Social Research
University of Michigan
Ann Arbor, MI 48106, USA
Edith D. de Leeuw
Department of Methodology & Statistics
Utrecht University
Utrecht, the Netherlands
Stefanie Eifler
Department of Sociology
Catholic University of Eichstätt‐Ingolstadt
85072 Eichstätt, Germany
Mahmoud Elkasabi
ICF, The Demographic and Health Surveys Program
Rockville, MD 20850, USA
Floyd J. Fowler Jr.
Center for Survey Research
University of Massachusetts
Boston, MA, USA
Marek Fuchs
Darmstadt University of Technology
Institute of Sociology
Karolinenplatz 5, 64283 Darmstadt, Germany
Thomas Hinz
University of Konstanz
Konstanz, Germany
Allyson L. Holbrook
Departments of Public Administration and Psychology and the Survey Research Laboratory
University of Illinois at Chicago
Chicago, IL 60607, USA
Joop Hox
Department of Methodology and Statistics
Utrecht University
Utrecht, the Netherlands
Ronaldo Iachan
ICF International
Fairfax, VA 22031, USA
Annette Jäckle
University of Essex
Institute for Social and Economic Research
Colchester CO4 3SQ, UK
Timothy P. Johnson
Department of Public Administration and the Survey Research Laboratory
University of Illinois at Chicago
Chicago, IL 60607, USA
Jenny Kelly
NORC, University of Chicago
55 East Monroe Street, Chicago, IL 60603, USA
Courtney Kennedy
Pew Research Center
Washington, DC, USA
Florian Keusch
Department of Sociology, School of Social Sciences
University of Mannheim
Mannheim, Germany
Samara Klar
School of Government & Public Policy
University of Arizona
Tucson, AZ 85721, USA
Maria Krysan
Department of Sociology and the Institute of Government & Public Affairs
University of Illinois at Chicago
Chicago, IL, USA
Tanja Kunz
Leibniz‐Institute for the Social Sciences
P.O. Box 12 21 55, 68072 Mannheim, Germany
Paul J. Lavrakas
NORC, University of Chicago
55 East Monroe Street, Chicago, IL 60603, USA
Thomas J. Leeper
Department of Methodology
London School of Economics and Political Science
London WC2A 2AE, UK
James M. Lepkowski
Institute for Social Research
University of Michigan
Ann Arbor, MI 48106, USA
Mingnan Liu
Facebook, Inc.
Menlo Park, CA, USA
Peter Lynn
University of Essex
Institute for Social and Economic Research
Colchester CO4 3SQ, UK
Jaki S. McCarthy
United States Department of Agriculture
National Agricultural Statistics Service (USDA/NASS)
1400 Independence Avenue, SW, Washington, DC 20250‐2054, USA
Colleen McClain
Program in Survey Methodology at the University of Michigan, Institute for Social Research
426 Thompson Street, Ann Arbor, MI 48104, USA
Daniel L. Oberski
Department of Methodology & Statistics
Utrecht University
Utrecht, 3584 CH, the Netherlands
Kristen Olson
Department of Sociology
University of Nebraska‐Lincoln
Lincoln, NE, USA
Colm O'Muircheartaigh
Harris School of Public Policy
University of Chicago and NORC at the University of Chicago
Chicago, IL, USA
Linda K. Owens
Stephens Family Clinical Research Institute
Carle Foundation Hospital, 611 W. Park Street
Urbana, IL 61801, USA
Jennifer A. Parsons
Survey Research Laboratory
University of Illinois at Chicago
Chicago, IL, USA
Knut Petzold
Sociology Section
Ruhr‐Universität Bochum
44801 Bochum, Germany
Annette Scherpenzeel
Chair for Economics of Aging, School of Management
Technical University of Munich
Munich, Germany
Peter Schmidt
Department of Political Science and Centre for Environment and Development (ZEU)
University of Giessen
Karl‐Glöcknerstrasse 21 E, 35394 Giessen, Germany
Barbara Schneider
College of Education and Sociology Department
Michigan State University
East Lansing, MI 48824, USA
Stephen Smith
NORC at the University of Chicago
Chicago, IL, USA
Tom W. Smith
Center for the Study of Politics and Society
NORC at the University of Chicago
1155 East 60th Street, Chicago, IL 60637, USA
Jolene D. Smyth
Department of Sociology
University of Nebraska‐Lincoln
Lincoln, NE, USA
Jaesok Son
Center for the Study of Politics and Society
NORC at the University of Chicago
1155 East 60th Street, Chicago, IL 60637, USA
Mathew Stange
Mathematica Policy Research
Ann Arbor, MI, USA
Marina Stavrakantonaki
Department of Public Administration
University of Illinois at Chicago
Chicago, IL 60607, USA
David Sterrett
NORC at the University of Chicago
Chicago, IL 60603, USA
Z. Tuba Suzer‐Gurtekin
Institute for Social Research
University of Michigan
Ann Arbor, MI 48106, USA
Elizabeth Tipton
Human Development Department, Teachers College
Columbia University
New York, NY 10027, USA
and
Statistics Department
Northwestern University
Evanston, IL 60201, USA
Michael W. Traugott
Center for Political Studies
Institute for Social Research, University of Michigan
Ann Arbor, MI, USA
Jan A. van den Brakel
Department of Statistical Methods
Statistics Netherlands
Heerlen, the Netherlands
and
Department of Quantitative Economics
Maastricht University School of Business and Economics
Maastricht, the Netherlands
Susanne Vogl
Department of Education
Department of Sociology, University of Vienna
Vienna, Austria
Sandra Walzenbach
University of Konstanz,
Konstanz, Germany
and
ISER/University of Essex
Colchester, UK
Xiaoheng Wang
Department of Public Administration
University of Illinois at Chicago
Chicago, IL 60607, USA
Brady T. West
Survey Research Center
Institute for Social Research, University of Michigan
Ann Arbor, MI, USA
Diane K. Willimack
United States Department of Commerce
U.S. Census Bureau
4600 Silver Hill Road, Washington, DC 20233-0001, USA
Jaclyn S. Wong
Department of Sociology
University of South Carolina
Columbia, SC, USA
Ting Yan
Westat
Rockville, MD, USA
David S. Yeager
Psychology Department
University of Texas at Austin
Austin, TX 78712, USA
Berwood A. Yost
Franklin & Marshall College
Floyd Institute for Public Policy and Center for Opinion Research
Lancaster, PA, USA
Diana Zavala‐Rojas
Research and Expertise Centre for Survey Methodology and European Social Survey ERIC, Department of Political and Social Sciences
Universitat Pompeu Fabra
C/de Ramon Trias Fargas, 25‐27, 08005 Barcelona, Spain
Tianshu Zhao
Department of Public Administration
University of Illinois at Chicago
Chicago, IL 60607, USA
I am enormously flattered to be asked to supply a preface for this path‐breaking volume of essays about experiments embedded in surveys. I had hoped to contribute a chapter to the volume with my friend and long‐term collaborator, Stephen Fienberg, but Steve, active to the end, lost his long battle with cancer before he was able to make the time to work on the chapter we had planned to write. So, I would like to use this opportunity to write something about the work we had done, and had hoped to continue, on the parallels between experimental and survey methodology, on experiments embedded in surveys (and vice versa), and some of the considerations for analysis occasioned by such embedding.
We had long noted (e.g. Fienberg and Tanur 1987, 1988, 1989, 1996) that there are a great many parallels between elements of survey design and experimental design. Although in fact surveys and experiments had developed very long and independent traditions by the start of the twentieth century, it was only with the rise of ideas associated with mathematical statistics in the 1920s that the tools for major progress in these areas became available. The key intellectual idea was the role of randomization or random selection, both in experimentation and in sampling, and both R.A. Fisher and Jerzy Neyman utilized that idea, although in different ways. The richness of the two separate literatures continues to offer new opportunities for cross‐fertilization of theory and tools for survey practice in particular.
Although the ideas are parallel, the purposes these ideas serve often differ in the two domains. For example, randomization is used in experimental design to justify the assumption that the experimental and control groups are equivalent a priori, whereas its analog in surveys, probability sampling, is used to assure the “representativeness” of the sample vis-à-vis the population to which generalizations are to be made. On the other hand, similar processes for creating homogeneous groups occur in both experimental and sampling designs and serve similar purposes. In experimentation, blocking creates homogeneous groups, controlling experimental error by segregating the effects of extraneous sources of variation. In sampling, stratification serves the same purpose by drawing sample members from each of the homogeneous groups into which the population can be divided. The list could be lengthened (as indeed Steve and I did in the papers cited above) to include such parallels in design as those between Latin and Graeco‐Latin squares on the one hand and lattice sampling or “deep stratification” on the other, and between split‐plot designs and cluster sampling. At the analysis stage, we pointed to the parallel between covariance adjustment in experiments and poststratification in surveys. We embarked on a project to find and describe more modern parallels and to encourage researchers in each of these fields to look in the literature of the other for ideas about design and analysis.
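The blocking/stratification parallel can be made concrete with a small simulation. The sketch below is purely illustrative and uses entirely hypothetical numbers (the strata, population shares, base rates, and treatment effect are all invented for this example, not drawn from any study discussed here): a two-arm question-wording experiment is randomized within each stratum of a stratified sample, the within-stratum treatment differences play the role of within-block comparisons, and the population-share-weighted combination of those differences is the survey-style estimate for the full population.

```python
import random
from statistics import mean

random.seed(2)

# Hypothetical illustration: a two-arm experiment embedded in a stratified survey.
# Strata are the sampling analog of blocks; within each stratum, respondents are
# randomly assigned to a control or treatment question wording.
strata = {"urban": 0.5, "suburban": 0.3, "rural": 0.2}          # population shares
base_rate = {"urban": 0.60, "suburban": 0.45, "rural": 0.30}    # P(favorable), control
TREATMENT_EFFECT = 0.10  # assumed uniform shift in P(favorable response)

def simulate_stratum(name, n=2000):
    """Randomly assign wording within one stratum; return each arm's mean outcome."""
    control, treated = [], []
    for _ in range(n):
        arm = random.choice(["control", "treatment"])
        p = base_rate[name] + (TREATMENT_EFFECT if arm == "treatment" else 0.0)
        y = 1 if random.random() < p else 0
        (treated if arm == "treatment" else control).append(y)
    return mean(control), mean(treated)

# Within-stratum differences (the blocked, within-experiment comparison), then a
# population-weighted combination (the stratified, survey-inference view).
diffs = {}
weighted_diff = 0.0
for name, share in strata.items():
    c, t = simulate_stratum(name)
    diffs[name] = t - c
    weighted_diff += share * (t - c)

print({k: round(v, 3) for k, v in diffs.items()})
print("weighted estimate:", round(weighted_diff, 3))
```

Because assignment is randomized within strata, each within-stratum difference estimates the same effect free of between-stratum variation, and the weighted combination generalizes it to the population — the two inferential roles the parallel describes.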
We wrote a series of papers and envisioned a book, Reaching Conclusions: The Role of Randomized Experiments and Sample Surveys, which will now, unfortunately, remain a draft. That work explored the ramifications of these parallels, pointed out newer and less obvious parallels between experiments and surveys, and noted the frequent embedding of experiments in surveys and vice versa. In particular, we urged – sometimes by example – that when such embedding took place, the analyst should take advantage of the embedded structure in planning and carrying out the analysis.
In our 1989 Science paper, we addressed these issues of analysis of embedded experiments, reiterating the point that although there are formal parallels between the structures of surveys and experiments there are fundamental inferential differences as described above. We pointed out three inferential stances that could be used. In the context of this volume, it is perhaps worth quoting that discussion at some length (p. 243).
(1) One can use the standard experiment paradigm, which relies largely on internal validity based on randomization and local control (for example, the device of blocking) and on the assumption that the unique effects of experimental units and the treatment effects can be expressed in a simple additive form, without interaction (Fisher 1935, Ch. 4; Kempthorne 1952, Ch. 9). Then inference focuses on within‐experiment treatment differences.
(2) One can use the standard sampling paradigm, which, for a two‐treatment experiment embedded in a survey relies largely on external validity and generalizes the observations for each of the treatments to separate but paired populations of values. Each unit or individual in the original population from which the sample was drawn is conceived to have a pair of values, one for each treatment. But only one of these is observable, depending on which treatment is given. Then inferences focus on the mean difference or the difference in the means of the two populations.
(3) One can conceptualize a population of experiments, of which the present embedded experiment is a unit or sample of units, and thus capitalize on the internal validity created by the design of the present embedded experiment as well as the external validity created by the generalization from the present experiment to the conceptual population of experiments. Then inferences focus on treatment differences in a broader context than simply the present embedded experiment.
Then we took advantage of the generosity of Roger Tourangeau and Kenneth Rasinski to reanalyze data they had collected in a probability sample survey on context effects in attitude surveys (1989). They had crossed four issues at differing levels of familiarity with four orders of presentation (balanced in the form of a Latin Square), two versions of the context questions, and two methods of structuring the context question. Thus, there were 16 versions of the questionnaire plus two extra versions with neutral context questions. These 18 versions of the questionnaire were crossed with four interviewers. We considered the interviewers as blocks, and so within each block, we had five replications of an 18‐treatment experiment, where 16 of the treatments represent a 4 × 2 × 2 factorial design. Focusing separately on two of the issues (abortion and welfare), we had four treatment combinations (positive vs. negative contexts by scattered vs. massed structure of the context‐setting questions). The context‐setting questions for the abortion question dealt with women's rights or traditional values, while those for the welfare question concerned fraud and waste in government programs or government responsibility to provide services. Using logit models to predict a favorable response on the target question and contrasting the four basic treatment combinations with the neutral context, we carried out detailed analyses and found very complicated results. I shall try to sketch only a fraction of the results of those analyses – I urge the reader with a serious interest in these issues to refer to the original paper (Fienberg and Tanur, 1987). In short, we found that context matters – but its effect depends on how often the respondent agrees with the context‐setting questions. And we found that all three of the inferential stances detailed above gave similar results for the abortion question, while the first and third gave similar results for the welfare question.
This volume concentrates on the embedding of experiments within surveys, often to answer questions about survey methodology, as did the Tourangeau/Rasinski experiment discussed above. I would guess that such procedures are the most common uses of embedding. But perhaps it is worth bearing in mind that there are many good examples of the reverse – the embedding of surveys in experiments – and such embedded surveys often serve a more substantive purpose. A particularly good set of examples was the massive social experiments of the mid‐twentieth century, such as the Negative Income Tax experiments and the Health Insurance Study. Many of the measurements of the outcome variables in those randomized experiments were necessarily carried out via surveys of the participants.
I find the contents of this volume fascinating, both for the broad sweep of the topics examined and for the variety of disciplines represented by the contributing authors and what I know of their expertise. I look forward to reading many of the articles.
I wish Steve's work could have been included in a more specific and current way than I have been able to achieve in this preface – he had plans for updating that I am not able to carry out. In that context, I am especially pleased to note the inclusion of a chapter by Jan van den Brakel, whose work I know Steve admired and whose chapter I expect will contain much of the new material that would have informed our chapter had Steve lived to complete it.
Prolific as Steve was, he left so much good work unfinished and so much more not even yet contemplated. I miss him.
Judith Tanur
Montauk, New York
December 2017
This book is accompanied by a companion website:
www.wiley.com/go/Lavrakas/survey-research
The website includes: