All time references are in CEST
Evaluating Mixed-Mode in Panel Surveys: Where do we stand and where do we go from here?
Session Organisers: Dr Patricia Hadler (GESIS - Leibniz Institute for the Social Sciences), Dr Steffen Pötzschke (GESIS - Leibniz Institute for the Social Sciences)
Time: Tuesday 18 July, 09:00 - 10:30
Mixed-mode surveys combine different data collection methods such as CAPI, CATI, PAPI, and CAWI. Mixed-mode designs may offer respondents the opportunity to complete a survey in their preferred mode and can decrease biases such as coverage and nonresponse bias. At the same time, mixing modes makes survey design more complex and raises questions regarding the comparability of measurements across modes.
Because of their longitudinal character and the need to safeguard data quality and measurement equivalence over time, mode choice is particularly important for panel studies. While some panel surveys have used mixed-mode designs from the start, other, often long-running, panel surveys have switched from mainly interviewer-administered data collection to mixed-mode designs that add self-administered approaches, or are even considering a transition to web-only data collection in light of declining response rates and rising costs.
This session invites submissions exploring the use of mixed-mode data collection in panel surveys. Authors are encouraged to present research on the advantages, limitations, and implications of mixed-mode designs. Topics may include, but are not limited to:
- Mode effects: How do different modes impact data quality, measurement error, and response patterns?
- Sampling bias: Do mixed-mode designs improve sample quality?
- Cost-benefit analysis: What are the trade-offs between cost savings and data quality in mixed-mode approaches?
- Attrition: How do mixed-mode designs influence participant retention over time?
- Mode sequencing: What is the optimal mode sequence, and how do transitions affect respondent behaviour?
- Data harmonization: How can researchers ensure data comparability across modes?
- Innovation: How are emerging technologies—like mobile apps and passive data collection—transforming mixed-mode panel surveys?
We welcome empirical studies, theoretical analyses, case studies, and innovative methodologies that evaluate the state of the art of mixed-mode panel surveys and provide insight into possible future developments.
Keywords: mixed-mode surveys, mode effects, survey modes, panel surveys, longitudinal surveys, attrition, data quality
Dr Susanne Kohaut (Institute for Employment Research) - Presenting Author
Dr Iris Möller (Institute for Employment Research)
The IAB-Establishment Panel is the most comprehensive establishment survey in Germany, with 15,000 firms participating every year. New firms are added each year to compensate for panel attrition and to reflect change in the economy. Until 2018 all interviews were conducted face-to-face with paper and pencil. In 2018 a computer-assisted instrument (CAWI/CAPI) was introduced in an experiment, and in the following years sequential modes were offered to respondents, starting with a web questionnaire followed by face-to-face follow-ups.
In recent years response rates have dropped considerably. We would therefore like to gain insight into the unit nonresponse process and to find out what role data collection mode and other factors play. We analyse firms that took part in the survey from 2021 to 2023, which means that we focus on the panel sample of the IAB-Establishment Panel. The advantage of a panel survey is that information is available from previous years. Our conceptual framework is based on the model of the decision process in establishments presented by Willimack and Snijkers (2013).
Using probit models, we analyse unit nonresponse in 2023 to identify which factors drive the probability of taking part in the survey again. The data of the IAB-Establishment Panel provide a wide range of firm characteristics as well as information on the respondent. We can also use paradata, such as the duration of the interview, to model the response process. First results show that a change in the survey mode is associated with a lower response rate in the following year, as are variables that point to a tight economic situation for the firm. On the other hand, firms that are quick to respond and have the same respondent over time are more likely to stay in the survey.
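A minimal sketch of the kind of probit model described above, assuming a hypothetical input file and hypothetical variable names (response_2023, mode_switch, same_respondent, interview_duration, firm_size, tight_economy); this is an illustration, not the authors' actual specification:

import pandas as pd
import statsmodels.formula.api as smf

# one row per establishment from the 2021-2023 panel sample (hypothetical file)
df = pd.read_csv("iab_panel_sample.csv")

# probit model of re-participation in 2023 on firm characteristics and paradata
model = smf.probit(
    "response_2023 ~ mode_switch + same_respondent + interview_duration"
    " + firm_size + tight_economy",
    data=df,
).fit()

print(model.summary())                # coefficients on the probit scale
print(model.get_margeff().summary())  # average marginal effects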
Dr Brady West (University of Michigan) - Presenting Author
Ms Chendi Zhao (University of Michigan)
Mrs Heather Schroeder (University of Michigan)
Mr Paul Burton (University of Michigan)
Mrs Eva Leissou (University of Michigan)
Mr Andrew Hupp (University of Michigan)
With declining response rates in panel surveys worldwide, researchers have been exploring optimal mixed-mode approaches for panel member recruitment. This study considers the household screening stage of recruitment in the 2022 Health and Retirement Study (HRS). It investigates the effects of different initial contact protocols and invitation letter envelope types on completion outcomes and contact attempts, focusing specifically on how these effects vary across race/ethnicity and age groups. A total of 16,452 sampled households were randomly divided into two groups. One group received a "web-first" protocol, in which sampled individuals were invited to complete the screening questionnaire online and nonrespondents were eventually followed up face-to-face (FTF); the other group received a "field-first" (FTF) protocol. Within the "web-first" group, two random subgroups were created. One subgroup received a regular HRS mailing envelope containing the invitation letter and a pre-paid $2 cash incentive. The other received an envelope with the cash visible.
We analyzed completion outcomes by screening protocol and envelope type and tested interactions in multivariable models to explore differences in these effects across race and age groups. We also evaluated predictors of the number of contact attempts needed among completed cases. The results indicate that the "web-first" protocol increased completion rates and reduced the contact attempts needed for a completed screener among the youngest individuals. However, this protocol had the opposite effect among Hispanic individuals. The envelope type also significantly influenced screener completion: the regular envelope increased completion rates among younger and non-Hispanic Black individuals, and the youngest age group also generally required fewer contact attempts before completion. Overall, the findings highlight the effectiveness of employing different contact methods and modes to enhance panel survey participation among different socio-demographic subgroups.
Ms Lena Rembser (GESIS – Leibniz Institute for the Social Sciences) - Presenting Author
Professor Tobias Gummer (GESIS – Leibniz Institute for the Social Sciences)
Due to concerns about bias stemming from the undercoverage of non-internet users, most probability-based surveys try to include offliners (i.e., respondents not able or not willing to participate online). Often, this is accomplished by adding a costly and labor-intensive mail mode. Previous research shows that including offliners results in more accurate estimates for some socio-demographic characteristics, while estimates for others remain unchanged or get worse. However, these prior studies did not include substantive variables. We aim to address this research gap by analyzing whether including offliners is necessary to improve measures of substantive variables. Specifically, we examine whether the inclusion of offliners in a probability-based panel affects measures of substantive variables.
We use data from the GESIS Panel.pop, a probability-based self-administered mixed-mode panel of the German general population, surveyed via web and mail mode. We analyze around 150 substantive variables from six core studies collected since 2014, which we compare between the whole sample and a sample of only onliners (i.e., without offliners). To assess the impact of including offliners, we compute differences between both samples for each substantive variable and compute average absolute relative bias (AARB) for each variable and by (sub-)topic. In addition, we re-run these analyses for different definitions of onliners and offliners and for different recruitment cohorts.
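As a rough sketch of the bias measure described above (notation ours, not necessarily the authors'): for each substantive variable $k$, the relative bias of the onliner-only estimate against the full-sample estimate, and the AARB averaged over the $K$ variables of a (sub-)topic, can be written as

\[
RB_k = \frac{\bar{y}_k^{\,\text{online}} - \bar{y}_k^{\,\text{full}}}{\bar{y}_k^{\,\text{full}}},
\qquad
\text{AARB} = \frac{1}{K}\sum_{k=1}^{K}\left|RB_k\right|.
\]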
Our study speaks to the practical challenge of deciding whether it is worth including the offline population in surveys via a costly and labor-intensive mail mode. We go beyond previous research by examining a wide range of substantive variables, enabling us to draw conclusions about the topical areas in which including offliners is more warranted than in others. We expect our findings to be relevant for survey practitioners and substantive researchers alike.
Miss Marie Stjerne Grønkjær (Center for Clinical Research and Prevention, Copenhagen University Hospital – Bispebjerg and Frederiksberg, Copenhagen) - Presenting Author
Mrs Marie Holm Eliasen (Center for Clinical Research and Prevention, Copenhagen University Hospital – Bispebjerg and Frederiksberg, Copenhagen)
Mr Peter Elsborg (Center for Clinical Research and Prevention, Copenhagen University Hospital – Bispebjerg and Frederiksberg, Copenhagen)
Mrs Anne Helms Andreasen (Center for Clinical Research and Prevention, Copenhagen University Hospital – Bispebjerg and Frederiksberg, Copenhagen)
Background: Participation is crucial in large-scale population surveys that aim to draw valid conclusions about the general population. In a previous study, we found that participation rates were substantially influenced by the survey administration mode and the number of reminders, while sociodemographic representativeness was largely unaffected. Using the same study sample and a randomized design, this study aimed to examine whether the survey administration mode influenced key survey outcomes related to well-being, health, and illness.
Methods: We used data from the panel study of The Danish Capital Region Health Survey, including 6564 individuals (≥20 years) who participated in both 2017 and 2021. A sequential mixed-mode survey administration was used in 2017, whereas individuals were randomized into two groups in 2021: a single-mode group (three digital letters) and a sequential mixed-mode group (two digital and then three physical letters). Self-reported survey data on well-being, health, health behavior, and illness in 2021 were compared between the two groups.
Results: When comparing key outcomes between the single-mode group (N=2876) and the sequential mixed-mode group (N=3688), poor quality of life was more prevalent in the single-mode group (4.4%) than in the mixed-mode group (3.3%; p=0.016). The groups showed no significant differences in loneliness, stress symptoms, health-related outcomes (self-rated health, obesity, sleep problems, smoking, alcohol consumption, diet, physical activity), or chronic disease prevalence.
Conclusion: Differences between the survey administration groups were observed in only one of several key outcomes. Thus, in this panel study sample, key outcomes related to well-being, health, and illness were largely unaffected by survey administration mode.
Ms Theresa Müller (infas Institut für angewandte Sozialwissenschaft GmbH (Institute for Applied Social Sciences)) - Presenting Author
Mr Michael Ruland (infas Institut für angewandte Sozialwissenschaft GmbH (Institute for Applied Social Sciences))
Dr Elena Sommer (Socio-Economic Panel at DIW)
Professor Sabine Zinn (DIW-SOEP / Humboldt University Berlin)
The IAB-BAMF-SOEP Survey of Refugees in Germany is an annual household panel conducted as part of the Socio-Economic Panel (SOEP) since 2016. Its primary data collection method has always been CAPI, at both the household and the individual level. Households are approached by interviewers and, until recently, at least the household interview had to be conducted face-to-face, although there were options to switch to a self-administered mode (CAWI or CASI) at the individual level.
In the 2024 survey wave, households were, for the first time, invited to complete the entire survey online as part of a CAWI follow-up. This sequential mixed-mode design at the household level was introduced during the final phase of fieldwork to boost response among households that were hard to reach or motivate.
On the one hand, this paper evaluates which households could be reached through the CAWI follow-up and whether panel attrition could be reduced by offering this additional mode. On the other hand, we compare data quality indicators between households surveyed via CAPI and CAWI. Furthermore, we analyze which households should be offered the option to switch to CAWI at the household level and at which point in the field period this switch offer could be most effective.
Dr Larissa Pople (UCL Centre for Longitudinal Studies) - Presenting Author
Mr Matt Brown (UCL Centre for Longitudinal Studies)
Professor Emla Fitzsimons (UCL Centre for Longitudinal Studies)
Mrs Lucy Haselden-Garcez (UCL Centre for Longitudinal Studies)
Mr Nicholas Gilby (Ipsos)
Mrs Kirsty Burston (Ipsos)
In recent years longitudinal studies have made increasing use of mixed-mode data collection strategies, especially those involving the web. This shift has been driven by falling response rates, increasing survey costs for face-to-face surveys, and almost universal internet access in the population.
The UK Millennium Cohort Study (MCS) is a multi-disciplinary longitudinal study following the lives of around 19,000 children born in 2000-2002 across the UK.
The 8th sweep, the Age 23 Survey, involves a multifaceted questionnaire that includes cognitive assessments, the collection of consents for administrative data linkage and other elements. All previous sweeps were conducted face-to-face, enabling the collection of anthropometric and other complex measurements.
Finding and engaging participants at this critical juncture - the first major adult wave in which participants will be approached fully independently from parents - is vital to the ongoing success of the study. Participants are expected to be highly mobile and living busy lives. Is now the right time to switch to a web-first, mixed-mode approach?
To inform this decision, we conducted an experiment in the first phase of fieldwork, randomly allocating half of the sample to take part face-to-face and half to take part via web, with non-respondents followed up face-to-face.
In this paper we explore the findings of this experiment. We compare the overall response rates achieved in the two arms of the experiment. We also explore differences in sample composition, data quality, item non-response, interview length, mode effects on measurement, levels of interviewer effort, and fieldwork costs across the two approaches.
Dr Pablo Cabrera-Álvarez (University of Essex) - Presenting Author
Dr Jonathan Burton (University of Essex)
Professor Annette Jäckle (University of Essex)
Professor Gabriele Durrant (University of Southampton)
Dr Jamie Moore (University of Essex)
Professor Peter Smith (University of Southampton)
The last decade has seen some high-quality surveys adopt web as the primary mode of data collection, a trend that has been accelerated by the pandemic. To reach segments of the population without internet access, most high-quality surveys employ mixed-mode designs, pairing web surveys with an interviewer-administered mode such as face-to-face or telephone. However, mixed-mode designs entail higher fixed costs than a web-only survey and might introduce measurement differences affecting data comparability. As internet access has grown in the UK to cover almost the whole population and levels of digital literacy have increased considerably, a critical question is whether it is now feasible to conduct web-only surveys of the general population without compromising representativeness.
This research first examines the characteristics of the UK population that remains offline and how they differ from those who use the internet. Second, it evaluates differences between web survey respondents and non-respondents, exploring how these differences affect representativeness. This analysis aims to disentangle the bias due to the offline population from the bias produced by the rest of the web non-respondents. The analysis uses data from the Innovation Panel and the main study of Understanding Society, the United Kingdom Household Longitudinal Study (UKHLS). We use the Innovation Panel, which includes a random subsample of households allocated to a sequential web-first design with CAPI follow-up since 2012, to explore the research questions from a longitudinal perspective. Furthermore, we use a larger random subsample of the UKHLS main study that transitioned to a web-first design in 2020 due to the COVID-19 pandemic to investigate the effect of using a web-only design on the representation of some subgroups, such as ethnic minorities.
Mr Marius Maul (Rheinische Friedrich-Wilhelms-Universität Bonn) - Presenting Author
Survey modes have different implications for data quality, nonresponse, and cost efficiency. On the one hand, face-to-face interviews continue to yield some of the most reliable results regarding data quality, and direct interaction between respondent and interviewer may increase response rates. On the other hand, surveys are increasingly conducted online to leverage features such as automated data collection and reduced costs. In addition, some sample members may find it more comfortable to participate at their convenience rather than scheduling a face-to-face interview. As a result, mixed-mode surveys, particularly those utilizing "push-to-web" designs, have become more common.
In this presentation, preliminary results from a new mixed-mode design that reverses this logic ("push to face-to-face") will be introduced. Our data source is the "Cologne Dwelling Panel", a face-to-face panel study investigating neighborhood change in two residential areas in Cologne. The sample comprises approximately 1,500 dwellings, with one current resident interviewed per dwelling in each wave. In 2022, the fifth panel wave was conducted as a face-to-face survey with 915 valid interviews. The sixth wave in 2025 adopts a mixed-mode approach: sample members who are not available or who refuse to participate face-to-face receive a postal reminder with a link to an online survey.
By comparing the socio-demographic characteristics of participants in the different modes, I aim to identify potential biases and uncover the factors driving web-only participation. Additionally, the presentation will explore the impact of the "push to face-to-face" design on reducing unit nonresponse by comparing mode outcomes and leveraging the panel design to analyze the socio-demographic profiles of non-respondents.
Mrs Marieke Volkert (IAB) - Presenting Author
Mrs Rebecca Schmitz (IAB)
Mrs Corinna König (IAB)
Mr Joe Sakshaug (IAB)
The IAB Establishment Panel, conducted by the Institute for Employment Research (IAB), is an annual employer survey in Germany that has collected longitudinal data since 1993, providing deep insights into macroeconomic developments. Until 2018, the primary mode of data collection was face-to-face, with self-administration available upon request, both conducted with paper-and-pencil questionnaires.
From 2018 onwards, we have been transitioning the panel to allow for both self-administered web interviewing and computer-assisted personal interviewing. To promote comparability, we designed the survey software to be identical in both modes. The software is tailored to the special needs of establishments answering a survey, which include, for example, answering the survey non-linearly, sharing the questionnaire among multiple respondents, and elements of dependent interviewing. These special features are available to every respondent taking part in the survey, independent of their mode choice.
Starting in 2022, we have collected detailed paradata in which every click and every answer given is recorded together with a timestamp accurate to the second. These paradata help us understand the response process step by step at the respondent level.
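A minimal sketch of how such click-level paradata could be turned into answer durations, assuming a hypothetical file and column layout (respondent_id, item_id, timestamp); the actual IAB paradata schema is not documented here:

import pandas as pd

# click-level paradata: one row per click, with a timestamp accurate to the second
clicks = pd.read_csv("paradata_clicks.csv", parse_dates=["timestamp"])
clicks = clicks.sort_values(["respondent_id", "timestamp"])

# time elapsed since the previous click of the same respondent, in seconds
clicks["duration"] = (
    clicks.groupby("respondent_id")["timestamp"].diff().dt.total_seconds()
)

# flag long gaps (here: > 30 minutes) as pauses rather than answering time
clicks["pause"] = clicks["duration"] > 30 * 60

# answering time per item and total active time per respondent
per_item = (
    clicks.loc[~clicks["pause"]]
    .groupby(["respondent_id", "item_id"])["duration"]
    .sum()
)
total_active = per_item.groupby(level="respondent_id").sum()
print(total_active.describe())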
In our presentation, we visualize various respondent behaviors when answering the survey and pay special attention to differences by mode, establishment size, and industry. This sheds light on the answering process itself, allowing us to identify behaviors that lead to shorter or longer interviews. We also observe behavior patterns such as looping through the questionnaire, pausing, and skipping parts of the questionnaire.
In addition, we aim to answer the question of whether the choice of survey mode favors a certain response style and whether these styles influence data quality indicators such as item non-response, the duration of the response process, and the consistency of responses with those given a year earlier.
Mr Johannes Lemcke (Robert Koch-Institut) - Presenting Author
Mr Stefan Damerow (Robert Koch-Institut)
Mr Marcel Hintze (Robert Koch-Institut)
Mr Ilter Öztürk (Robert Koch-Institut)
Mr Tobias Heller (Robert Koch-Institut)
Mr Matthias Wetzstein (Robert Koch-Institut)
Mrs Jennifer Allen (Robert Koch-Institut)
Background
The Robert Koch Institute (RKI) has developed a new panel infrastructure, "Health in Germany", to enhance public health research through continuous, rapid data collection from questionnaire surveys, measurements, and laboratory analyses. Using online and offline questionnaire modes (CAWI and PAPI), the first probability-based recruitment sample reached 47,000 registered active panelists.
Methods
The sample of the recruitment study is based on address data from Germany's local registration offices, whereby a mixed-mode design is used for data collection. Individuals aged 16-69 receive an initial online-only invitation, followed by postal reminders that include a paper questionnaire. For those aged 70 and above, both online and paper options are available from the outset. Incentives include an unconditional €5 cash payment for all invitees and an additional €10 for successful registration into the panel infrastructure.
Results
The presentation will illustrate the recruitment process using a range of quality indicators. Outcome rates, selection effects, and non-response analyses are presented and discussed. The initial response rate of the recruitment study is 37.5%. Approximately 90% of CAWI participants expressed a willingness to participate again, and around 80% of these individuals successfully completed the online registration process. The overall recruitment rate for both CAWI and PAPI panelists is 27.5%. The analysis of selection effects through panel registration shows distortions in sociodemographic characteristics, e.g. related to school education, while age shows no clear pattern. Health status and health behavior, as factors relevant to registration, show no clear distortion pattern. Further detailed results will be presented at the conference.