
ESRA 2025 Preliminary Program

All time references are in CEST

Evaluating Mixed-Mode in Panel Surveys: Where do we stand and where do we go from here?

Session Organisers: Dr Patricia Hadler (GESIS - Leibniz Institute for the Social Sciences)
Dr Steffen Pötzschke (GESIS - Leibniz Institute for the Social Sciences)
Time: Wednesday 16 July, 09:00 - 10:30
Room: Ruppert 042

Mixed-mode surveys combine different data collection methods such as CAPI, CATI, PAPI, and CAWI. Mixed-mode designs may offer respondents the opportunity to respond to a survey in their preferred mode and can decrease nonresponse bias, among other benefits. At the same time, mixing modes makes survey design more complex and raises questions regarding the comparability of measurements across modes.

Because of their longitudinal character and the need to safeguard data quality and measurement equivalence over time, panel studies face particularly consequential mode choices. While some panel surveys have used mixed-mode designs from the start, other, often long-running panel surveys have switched from mainly interviewer-administered data collection to mixed-mode designs that add self-administered approaches, or are even considering transitioning to web-only data collection in light of declining response rates and rising costs.

This session invites submissions exploring the use of mixed-mode data collection in panel surveys. Authors are encouraged to present research on the advantages, limitations, and implications of mixed-mode designs. Topics may include, but are not limited to:

- Mode effects: How do different modes impact data quality, measurement error, and response patterns?
- Sampling bias: Do mixed-mode designs improve sample quality?
- Cost-benefit analysis: What are the trade-offs between cost savings and data quality in mixed-mode approaches?
- Attrition: How do mixed-mode designs influence participant retention over time?
- Mode sequencing: What is the optimal mode sequence, and how do transitions affect respondent behaviour?
- Data harmonization: How can researchers ensure data comparability across modes?
- Innovation: How are emerging technologies—like mobile apps and passive data collection—transforming mixed-mode panel surveys?

We welcome empirical studies, theoretical analyses, case studies, and innovative methodologies that evaluate the state of the art of mixed-mode panel surveys and provide insight on possible future developments.

Keywords: mixed-mode surveys, mode effects, survey modes, panel surveys, longitudinal surveys, attrition, data quality

Papers

ON THE TRAIL OF MODE EFFECTS: MIXING MODES IN A LARGE GERMAN EMPLOYMENT SURVEY

Mr Marcel Lück (Federal Institute for Occupational Safety and Health) - Presenting Author
Dr Sophie-Charlotte Meyer (Federal Institute for Occupational Safety and Health)

In recent years, mixed-mode surveys – combining various data collection methods, often including online surveys – have become a prominent practice in survey research, mainly driven by technological advancements and societal changes (DeLeeuw, 2018; Dillman, 2017). While mixed-mode designs can improve representation and participation, they may introduce mode effects that impair measurement comparability (Luijkx et al., 2021; Wolf et al., 2021). Although the literature studying these effects is growing, mode effects in large-scale employment surveys have so far remained largely underexplored (e.g., Liedl & Steiber, 2024; Mackeben, 2020).
This paper examines response behavior and measurement variation between Computer Assisted Web Interviews (CAWI) and Computer Assisted Telephone Interviews (CATI) using data from the second wave of the survey "Digitalization and Change in Employment" (DiWaBe) (for details, see Arntz et al., 2020). The survey was conducted in 2024 and includes approximately 9,000 German employees from some 2,000 German manufacturing and service companies that had already participated in a representative company survey in 2022. The main part of the employee survey is cross-sectional (n=8,000), with a panel sample (n=1,000) whose members already participated in the first wave in 2019. While initial contact was made via a postal invitation, participants could choose CAWI or CATI. A majority (75%) opted for CAWI, while 25% chose CATI.
Focusing on indicators related to working conditions and health, we assess selection and mode measurement effects (Vannieuwenhuyze & Loosveldt, 2013) through comparative analyses of means, as well as regression models. Additionally, we evaluate measurement invariance to ensure comparability between CAWI and CATI (Hox et al., 2015).
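The logic of separating selection from measurement effects can be illustrated with a minimal simulation; this is our own hedged sketch under a selection-on-observables assumption, not the authors' actual specification, and all variable names and parameter values are invented for illustration.

```python
# Illustrative sketch: a raw CAWI/CATI mean difference mixes selection and
# measurement effects; controlling for an observable selection variable (age)
# in a regression isolates an estimate of the measurement component.
import numpy as np

rng = np.random.default_rng(42)
n = 1000
cati = rng.random(n) < 0.25                 # mode indicator: True = chose CATI
age = rng.normal(45, 12, n) + 5 * cati      # selection: CATI respondents older
# outcome depends on age (selection channel) plus a genuine
# measurement shift of +0.30 in the CATI mode
y = 0.05 * age + 0.30 * cati + rng.normal(0, 1, n)

raw_diff = y[cati].mean() - y[~cati].mean()  # total mode effect (both channels)

# OLS with age as covariate: the mode-dummy coefficient approximates the
# measurement effect once the observable selection difference is held fixed
X = np.column_stack([np.ones(n), cati.astype(float), age])
beta = np.linalg.lstsq(X, y, rcond=None)[0]
adj_diff = beta[1]

print(f"raw difference: {raw_diff:.2f}, adjusted (~measurement): {adj_diff:.2f}")
```

In this toy setup the raw difference is close to 0.55 (selection plus measurement) while the adjusted difference recovers roughly the 0.30 measurement shift; real analyses, as in the abstract, additionally test measurement invariance rather than relying on a single covariate adjustment.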
Our findings contribute to the understanding of mode effects in employment research and offer important implications for designing future mixed-mode surveys, particularly in the context of working conditions and health.


Is Online-Only the Future of General Population Surveys? Findings from a German Probability-Based Mixed-Mode Panel Survey

Ms Lena Rembser (GESIS – Leibniz Institute for the Social Sciences) - Presenting Author
Professor Tobias Gummer (GESIS – Leibniz Institute for the Social Sciences)

Due to concerns about bias stemming from the undercoverage of non-internet users, most probability-based surveys try to include offliners (i.e., respondents unable or unwilling to participate online). Often, this is accomplished by adding a costly and labor-intensive mail mode. Previous research shows that including offliners results in more accurate estimates for some socio-demographic characteristics, while estimates for others remain unchanged or even worsen. However, these prior studies did not consider substantive variables. We address this research gap by examining whether the inclusion of offliners in a probability-based panel affects estimates of substantive variables.

We use data from the GESIS Panel.pop, a probability-based self-administered mixed-mode panel of the German general population, surveyed via web and mail mode. We analyze around 150 substantive variables from six core studies collected since 2014, which we compare between the whole sample and a sample of only onliners (i.e., without offliners). To assess the impact of including offliners, we compute differences between both samples for each substantive variable and compute average absolute relative bias (AARB) for each variable and by (sub-)topic. In addition, we re-run these analyses for different definitions of onliners and offliners and for different recruitment cohorts.

Our study contributes to the practical challenge of deciding whether to include the offline population in surveys by employing a costly and labor-intensive mail mode. We go beyond previous research by examining a wide range of substantive variables, enabling us to draw conclusions on topical areas in which including offliners is more warranted than in others. We expect our findings to be relevant for survey practitioners and substantive researchers alike.



“Push to Face-to-Face” – Assessing the Effects of a Mixed-Mode Design on Response Probability in a Dwelling Panel

Mr Marius Maul (Rheinische Friedrich-Wilhelms-Universität Bonn) - Presenting Author

Survey modes have different implications for data quality, nonresponse and cost efficiency. On the one hand, face-to-face interviews continue to yield some of the most reliable results regarding data quality, and direct interaction between respondent and interviewer may increase response rates. On the other hand, surveys are increasingly conducted online to leverage features such as automated data collection and reduced costs. In addition, some sample members may find it more comfortable to participate at their convenience rather than scheduling a face-to-face interview. As a result, mixed-mode surveys, particularly those utilizing "push-to-web" designs, have become more common.
In this presentation, preliminary results from a different new mixed-mode design (“push to face-to-face”) will be introduced. Our data source is the “Cologne Dwelling Panel”, a face-to-face panel study investigating neighborhood changes in two residential areas in Cologne. The sample comprises approximately 1,500 dwellings, with one current resident interviewed per dwelling in each wave. In 2022, the fifth panel wave was conducted as a face-to-face survey with 915 valid interviews. The sixth wave in 2025 adopts a mixed-mode approach: sample members who are not available or who refuse to participate face-to-face receive a postal reminder with a link to an online survey.
By comparing the socio-demographic characteristics of participants in the different modes, I aim to identify potential biases and uncover the factors driving web-only participation. Additionally, the presentation will explore the impact of the “push to face-to-face” design on reducing unit nonresponse by comparing mode outcomes and leveraging the panel design to analyze the socio-demographic profiles of non-respondents.


Digging deep into the answering process of the IAB Establishment Panel: How visualizing paradata can help us understand differences in data quality by mode

Mrs Marieke Volkert (IAB) - Presenting Author
Mrs Rebecca Schmitz (IAB)
Mrs Corinna König (IAB)
Mr Joe Sakshaug (IAB)

The IAB Establishment Panel, conducted by the Institute for Employment Research (IAB), is an annual employer survey in Germany that has collected longitudinal data since 1993, providing deep insights into macroeconomic developments. Until 2018, the primary mode of data collection was face-to-face, with self-administration available upon request, both conducted with paper-and-pencil questionnaires.
Since 2018, we have been working on transitioning the panel to allow for both self-administered web interviewing and computer-assisted personal interviewing. To promote comparability, we designed the survey software identically for both modes. The software is tailored to the special needs of establishments answering a survey, which include, for example, answering the survey non-linearly, sharing the questionnaire among multiple respondents, and elements of dependent interviewing. These special features are available to every respondent taking part in the survey, independent of their mode choice.
Starting in 2022, we collected detailed paradata in which every click and every answer given is recorded together with a timestamp accurate to the second. These paradata help to understand the response process step-by-step on the respondent level.
In our presentation, we visualize various respondent behaviors of answering the survey and pay special attention to differences between mode, establishment size, and industry. This sheds light on the answering process itself, allowing us to identify behaviors which lead to shorter or longer interviews. We also observe behavior patterns, such as looping through the questionnaire, pauses, and skipping parts of the questionnaire.
In addition, we aim to answer the question of whether the choice of survey mode favors a certain response style and whether these styles influence different data quality indicators, such as item non-response, the duration of the response process, and the consistency of responses given a year earlier.


Health in Germany – A new mixed-mode panel infrastructure for Public Health Research in Germany: study design, outcome rates and non-response

Mr Johannes Lemcke (Robert Koch-Institut) - Presenting Author
Mr Stefan Damerow (Robert Koch-Institut)
Mr Marcel Hintze (Robert Koch-Institut)
Mr Ilter Öztürk (Robert Koch-Institut)
Mr Tobias Heller (Robert Koch-Institut)
Mr Matthias Wetzstein (Robert Koch-Institut)
Mrs Jennifer Allen (Robert Koch-Institut)

Background
The Robert Koch Institute (RKI) developed a new panel infrastructure, "Health in Germany," to enhance public health research through continuous, rapid data collection from questionnaire surveys, measurements, and laboratory analyses. Using online and offline questionnaire modes (CAWI and PAPI), the first probability-based recruitment sample reached 47,000 registered active panelists.
Methods
The sample of the recruitment study is based on address data from Germany's local registration offices, whereby a mixed-mode design is used for data collection. Individuals aged 16-69 receive an initial online-only invitation, followed by postal reminders that include a paper questionnaire. For those aged 70 and above, both online and paper options are available from the outset. Incentives include an unconditional €5 cash payment for all invitees and an additional €10 for successful registration into the panel infrastructure.
Results
This presentation will illustrate the recruitment process using a range of quality indicators; outcome rates, selection effects, and non-response analyses are presented and discussed. The initial response rate of the recruitment study is 37.5%. Approximately 90% of CAWI participants expressed a willingness to participate again, and around 80% of these individuals successfully completed the online registration process. The overall recruitment rate for both CAWI and PAPI panelists is 27.5%. The analysis of selection effects through panel registration shows distortions in sociodemographic characteristics, e.g., related to school education, while age shows no clear pattern. Health status and health behavior, as relevant influences on registration, show no clear distortion pattern. Further detailed results will be presented at the conference.
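As a back-of-the-envelope plausibility check (our own calculation, not part of the abstract), the reported CAWI figures roughly multiply through to the overall recruitment rate:

```python
# Rough consistency check of the reported rates (illustrative only;
# the overall 27.5% additionally includes PAPI panelists).
response_rate = 0.375   # initial response rate of the recruitment study
willing_cawi = 0.90     # share of CAWI participants willing to participate again
completed_reg = 0.80    # share of those completing the online registration

cawi_recruitment = response_rate * willing_cawi * completed_reg
print(f"implied CAWI recruitment rate: {cawi_recruitment:.1%}")
```

The implied CAWI-only recruitment rate of about 27% sits just below the reported overall 27.5%, which is consistent once PAPI panelists are added.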


Lessons Learned from the DAB Panel Study: Tracking Educational Transitions of Swiss Adolescents Over a Decade

Dr Sara Möser (Department of Sociology of Education, University of Bern) - Presenting Author
Dr David Glauser (Department of Sociology of Education, University of Bern)
Professor Rolf Becker (Department of Sociology of Education, University of Bern)

Since 2012, the DAB panel study has tracked the educational and occupational trajectories of adolescents in German-speaking Switzerland, following individuals from grade 8 through their transitions to the labour market. Over 11 survey waves, the study has navigated methodological challenges in collecting and validating detailed biographical data, evolving from classroom-based surveys in waves 1-3 to an individual sequential mixed-mode design (CAWI, CATI, PAPI) in waves 4-11.
Over time, we have repeatedly adapted our questionnaire design to incorporate lessons learned and optimise data quality. To capture comprehensive educational biographies, DAB has tested different approaches to episodic data collection. This presentation evaluates the methods used in waves 4-11, emphasising the trade-offs between data accuracy and compatibility with questionnaire standardisation and complexity.
A particular challenge was the validation of the documented biographies, which was carried out in three waves of the survey by presenting the respondents with their complex episode history data. This process allowed respondents to check, correct and complete their biographical data, providing valuable insight into the reliability of self-reported information.
As the project enters what is likely to be its final phase, we plan to incorporate these findings into four additional waves of the survey (waves 12 to 15), redesigning the data collection and validation processes to further improve the data quality and questionnaire design.
This contribution draws on DAB as a case study to explore practical strategies and potential pitfalls in longitudinal data collection of educational episodes. It reflects on lessons learned from managing a complex survey design and offers practical guidance for researchers. By encouraging discussion of challenges and best practices, this presentation aims to advance methodologies for the study of educational transitions and trajectories.