
Cross-National Replication of Question Design Experiments




Principal Investigator(s):

Jon A. Krosnick
Stanford University
Email: krosnick@stanford.edu
Home page: https://comm.stanford.edu/faculty-krosnick/

Henning Silber
GESIS – Leibniz-Institute for the Social Sciences, Germany
Email: henning.silber@gesis.org

Tobias Stark
Utrecht University, Netherlands
Email: t.h.stark@uu.nl
Home page: http://www.tobiasstark.nl/

Annelies Blom
University of Mannheim, Germany
Email: blom@uni-mannheim.de
Home page: http://blom.sowi.uni-mannheim.de/english/

Sample size: 1029
Field period: 02/10/2015-04/17/2015

 

Abstract:

The Multi-National Study of Questionnaire Design (MSQD) explored whether principles of question design, derived primarily from American research conducted decades ago, still apply today. Moreover, it utilized the availability of probability-based online panels around the world to test whether these principles can be legitimately generalized across countries.

Hypotheses:

Should survey researchers from other countries expect the same effects across countries, or should they expect different effects? The answer to this question ultimately hinges on the mechanism(s) that explain why the effects occurred in the US. If some question design effects occur because of culture-specific aspects of response behavior (e.g., the tendency to defer to seemingly higher-status researchers; the tendency to express opinions regardless of confidence in them), then we expect to see the same effects in countries whose cultures work according to the same social norms and might not see them in countries with different social norms. If some question design effects occur because of global strategies that respondents implement when they lack the motivation and/or ability to answer questions optimally (e.g., Krosnick 1991), then we expect to see the same effects across all countries.

Experimental Manipulations:

The experiments tested for differences in response behavior by (a) altering the order in which the response options are presented, (b) altering the order in which questions are asked, (c) varying question wording to test for acquiescence response bias (the tendency to agree with a presented statement), (d) varying the presence or absence of various no-opinion filters (filter 1: not enough information; filter 2: no opinion; filter 3: don't know), (e) varying whether a question is balanced by mentioning "some people" and "other people" (e.g., "Some people feel the government should see to it that all people have adequate housing, while others feel each person should provide for his or her own housing. Which comes closest to how you feel about this?"), and (f) adding or omitting a counter-argument to test its impact.
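As a rough illustration, a between-subjects design like the one described above can be implemented by independently randomizing each manipulation for every respondent. The sketch below is a minimal assumption-laden example; the factor names, level labels, and structure are illustrative placeholders, not the study's actual variables or procedure:

```python
import random

# Hypothetical factors and levels, loosely mirroring manipulations (a)-(f) above.
# These names are illustrative only.
CONDITIONS = {
    "response_order": ["original", "reversed"],
    "question_order": ["block_A_first", "block_B_first"],
    "acquiescence_form": ["agree_disagree", "forced_choice"],
    "no_opinion_filter": ["none", "not_enough_info", "no_opinion", "dont_know"],
    "balanced_wording": ["one_sided", "some_people_others"],
    "counter_argument": ["absent", "present"],
}

def assign_conditions(rng: random.Random) -> dict:
    """Independently draw one level per factor for a single respondent."""
    return {factor: rng.choice(levels) for factor, levels in CONDITIONS.items()}

# Example: seeded assignment for reproducibility.
respondent = assign_conditions(random.Random(42))
```

Independent randomization of each factor yields a full factorial design, so the effect of each manipulation can be estimated without confounding from the others.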

Key Dependent Variables:

The outcome variables covered a wide range of political issues including oil prices, climate change, and unions.

Summary of Findings:

Our research is still in progress.

 

 


Copyright © 2014, TESS