
ESRA 2025 Preliminary Program

All time references are in CEST

Open Forms of Pretesting

Session Organisers Dr Arne Bethmann (SHARE Germany and SHARE Berlin Institute)
Ms Charlotte Hunsicker (SHARE Germany and SHARE Berlin Institute)
Ms Dörte Naber (Universidad de Granada)
Time Thursday 17 July, 09:00 - 10:30
Room Ruppert 111

The pretesting of survey questionnaires is crucial for ensuring high quality data in social research. In order to establish the validity of survey responses, it is essential to assess how respondents understand the questions and whether their understanding is consistent with the researcher's intended meaning. Over the years, Willis' (2005, 2015) original approach of cognitive pretesting, rooted in cognitive psychology, has been further developed and merged with other survey methodological and qualitative research approaches into new forms of open pretesting. These allow us to focus on respondents' understanding of survey questions by listening, to varying degrees, to what respondents actually think about the questions.

However, there is a trade-off between the richness of the data and the efficiency of data collection and analysis. At one end of the spectrum is web-probing, which is highly efficient at collecting data on a large scale, but yields a limited amount of information. At the other end are qualitative-interpretive approaches, such as grounded theory, which provide in-depth understanding at a high cost in time and resources. Classical cognitive pretesting, using, for example, standardised and emergent probes, falls between the two extremes, as do qualitative pretesting methods with more focused methodologies, such as problem-centred interviews.

In this session we bring together researchers who are developing and using different approaches across the spectrum of open pretesting. We are particularly interested in exploring the added complexity of different cultures and languages across countries. In addition, we will discuss how data collected using these methods can be used for further substantive research beyond the original purpose of pretesting survey questionnaires.

Keywords: open pretesting, questionnaire development, question meaning, cognitive pretesting

Papers

QPIs in Large-Scale Surveys: Lessons Learned

Ms Charlotte Hunsicker (SHARE Berlin Institute (SBI)) - Presenting Author
Dr Arne Bethmann (SHARE Berlin Institute (SBI))
Dr Christina Buschle (IU International University of Applied Sciences)
Dr Herwig Reiter (German Youth Institute (DJI))
Ms Theresa Fabel (SHARE Berlin Institute (SBI))
Dr Barbara Thumann (SHARE Berlin Institute (SBI))

The use of Qualitative Pretest Interviews (QPIs) in large-scale, multinational surveys like the Survey of Health, Ageing and Retirement in Europe (SHARE) provides valuable insights into respondents’ understanding of survey questions. Reflecting on the implementation of QPIs within SHARE, this presentation examines lessons learned across three key phases: conducting interviews, analysis, and implementing results.

1. Conducting Interviews: We reflect on the challenges and opportunities of conducting QPIs with an in-house team of interviewers who had specialized training and subject-matter familiarity. This phase highlights the importance of interviewer expertise in fostering intersubjective understanding with respondents, ensuring the collection of meaningful pretest data.
2. Analysis: Different qualitative approaches were explored to analyse the collected interviews, focusing on aligning these methods with the specific aims and contexts of the pretest. A key insight was balancing interpretive depth with the practical demands of conducting a large-scale, multinational survey.
3. Implementing Results: This section discusses how findings from QPIs were used to implement actionable changes in questionnaire design, highlighting the process of integrating qualitative feedback into the development of standardized survey instruments.

By examining these three phases, the presentation provides a nuanced reflection on the role of QPIs in large-scale surveys, addressing both their potential and the challenges encountered.


Innovation in pre-test methods: is pop-up testing a fruitful approach?

Dr Vivian Meertens (Statistics Netherlands) - Presenting Author

In this presentation we reflect on pop-up testing, in comparison with traditional cognitive interviewing, as a potential method for evaluating and testing the quality of data collection instruments. The presentation shows results of several pop-up tests conducted by the Qlab of Statistics Netherlands as an alternative and innovative pre-testing method. Several data collection products were tested, such as different versions of a landing page for logging in to household surveys, different mobile designs of grid questions, versions of advance letters, and statistical output products like figures and visualisations. Aspects such as recruitment methods, test locations, and relevant practicalities will be discussed, as well as reporting formats and data templates. The main question we reflect on is whether this innovative testing method can be a fruitful approach to testing and evaluating data collection products.


Respondent-Centered Design for Survey Question Development

Professor Kristen Miller (National Center for Health Statistics) - Presenting Author

Recently, the concept of respondent-driven survey practice has become a focus of attention. This presentation provides a framework specifically for respondent-driven question design. Using examples from several question design projects, the presentation lays out a definition for such a design along with methodological considerations.

Question and questionnaire design is often associated with a certain mystique: that to develop quality measures and instruments requires ‘artistic know-how’ and ‘common wisdom.’ This understanding solidified with Payne’s 1951 classic, The Art of Asking Questions, as he brought attention to the intricacy of question wording, arguing that what went into a question mattered. In the 1980s, the field of psychology brought attention to the cognitive aspects of survey measurement which, in part, opened the door for understanding question design as a scientific endeavor. By recognizing question response as a complex cognitive process, research into its four stages (comprehension, retrieval, judgement, response) could help to identify specific design features that would optimize respondents’ engagement and the quality of their responses. Survey designers could then draw on a body of research about such topics as number of scale points, ‘don’t know’ options, order effects, and recall error. Still, despite these efforts, an element of mystery remains, for in the Handbook of Survey Research (2010), Krosnick and Presser conclude that “the design of questions and questionnaires is an art as well as a science.”

This paper argues that respondent-driven design, along with interpretive cognitive interviewing methodology, is another avenue for reducing ‘art,’ making question design a more objective process. Indeed, the ways in which respondents interpret and process survey questions are rooted in personal experience that is tied to social location. Drawing upon respondents’ approaches to questions can improve question design by ensuring that a question is similarly meaningful across contexts.