ESRA 2025 Preliminary Program

All time references are in CEST

Innovative Uses of Multimode and Multidata in Surveys: Challenges, Success, and Error Trade-offs 2

Session Organisers: Dr Ting Yan (NORC at the University of Chicago)
Dr Leah Christian (NORC at the University of Chicago)
Time: Friday 18 July, 11:00 - 12:30
Room: Ruppert paars - 0.44

Declining response rates and the rising cost of data collection continue to challenge survey practitioners, survey researchers, and government agencies. At the same time, there is a growing need for more data, collected more frequently, in a more timely manner, and more cost-effectively. As a result, surveys are increasingly employing multiple modes both to contact sampled persons and to collect data from them. For instance, sampled persons may be mailed postal letters and also visited by field interviewers inviting them to participate in a survey. They may also be offered the option of participating in the survey online or by telephone. Furthermore, data from multiple sources are used to assist survey data collection and to supplement and complement data from traditional surveys for inference. For instance, satellite images are used to prescreen for buildings with certain desired characteristics, and administrative data can be linked to obtain additional information without burdening survey respondents.

This session explores innovative uses of multiple modes and multiple data sources to assist survey data collection, reduce burden, improve survey quality, and reduce the cost of data collection. It focuses on the challenges, successes, and error trade-offs of working in a multimode and multidata environment. Researchers are invited to submit papers on any of the following topics:

• Challenges, success, and error trade-offs of using multiple modes to contact/reach sampled persons
• Challenges, success, and error trade-offs of using multiple modes to collect data from sampled persons
• Challenges, success, and error trade-offs of using data from different sources to improve sampling efficiency
• Challenges, success, and error trade-offs of using data from different sources to assist and supplement data collection
• Challenges and success of combining survey data across multiple modes and from multiple sources
• Evaluating error trade-offs arising from combining data from multiple modes and/or multiple data sources

Papers

Asking panel respondents to complete additional data collection tasks: Which types of tasks increase panel dropout and which types of respondents are we more likely to lose?

Miss Jasmine Mitchell (University of Essex) - Presenting Author
Professor Annette Jäckle (University of Essex)

Surveys are increasingly asking respondents to complete additional data collection tasks that go beyond answering survey questionnaires. These might include tasks embedded within the survey, such as consents to data linkage, and tasks respondents have to complete after the interview, such as a diary or a mobile app. In previous research we began to examine the cumulative effects of such additional tasks on dropout in a panel survey. Our findings suggest that each invitation to an additional task increases the probability of dropout by, on average, two percentage points. This suggests that asking respondents to complete additional data collection tasks might be detrimental to panel surveys.

In this paper we examine (1) which types of additional tasks increase dropout from the annual interviews of a household panel, and (2) which types of respondents are more likely to drop out of the panel if they are invited to additional tasks. We use data on 15 additional tasks across 16 waves of the Understanding Society Innovation Panel, a clustered and stratified sample of approximately 1,500 households in Great Britain with refreshment samples added about every three years. The 15 additional tasks include data linkage consent questions, mobile app studies, bio measures and samples, a time-use diary, monthly surveys, and consent to receive survey questions by SMS. Our analysis sample includes 6,712 sample members who completed at least one of the annual interviews. Using these data, we will conduct survival analyses to determine which types of tasks increase the probability of subsequent dropout from the panel and which types of people are more likely to drop out because of additional tasks. The findings will inform decisions on how best to gather data on different concepts using different methods, in ways that sustain sample members' cooperation.
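A discrete-time survival specification is one common way to set up this kind of analysis. The following Python sketch is purely illustrative: the synthetic data, effect sizes, and variable names such as n_task_invites are invented for the example and are not the study's actual specification.

# Illustrative discrete-time survival model of panel dropout:
# one row per person-wave, dropout (0/1) as the event.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Synthetic person-wave data; all columns and effects are made up.
rng = np.random.default_rng(0)
n = 5000
df = pd.DataFrame({
    "n_task_invites": rng.integers(0, 6, n),   # cumulative task invitations
    "invited_app": rng.integers(0, 2, n),      # invited to a mobile app study
    "wave": rng.integers(1, 17, n),
})
logit_p = -3 + 0.1 * df["n_task_invites"] + 0.2 * df["invited_app"]
df["dropout"] = rng.binomial(1, 1 / (1 + np.exp(-logit_p)))

# Logit hazard of dropping out at wave t, given participation up to t.
model = smf.logit(
    "dropout ~ n_task_invites + invited_app + C(wave)", data=df
).fit()
print(model.summary())

# Average marginal effect of one more task invitation on the per-wave
# dropout probability (a percentage-point-scale quantity).
print(model.get_margeff(at="overall").summary())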


A Mixed Mode Case Management (MMCM) system and dashboard

Mr Lon Hofman (Statistics Netherlands) - Presenting Author
Mrs Gina Cheung (Independent)

Over the past 40 years, Blaise, the advanced data collection tool developed by Statistics Netherlands (CBS), has been widely adopted and successfully used by prominent national statistical institutes (NSIs), universities, and survey research organizations. Its robust paradata capabilities have significantly influenced survey research methodologies and data collection strategies worldwide, underscoring the growing importance of paradata in contemporary research.

In response to the pressing challenges of declining response rates faced by many NSIs, Blaise has introduced a Mixed Mode Case Management (MMCM) system and dashboard. This system addresses various challenges by creating a unified platform that integrates all modes of data collection—CAWI, CATI, and CAPI—into a cohesive framework. The initiative is designed to facilitate seamless collaboration among these modes, permitting both sequential and concurrent usage.

A key aspect of this effort is the development of a comprehensive mixed-mode dashboard that provides a clear, unified overview of all data collection activities and outcomes. By streamlining processes and harmonizing terminologies and workflows, we aim to significantly reduce data discrepancies and enhance overall system efficiency. Through this integration, we anticipate improvements in data collection processes and a more intuitive user experience, as well as more reliable and accurate data outputs.
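To make the idea of a unified case record concrete, here is a minimal, purely illustrative Python sketch of how a mixed-mode case and a sequential mode switch might be represented. This is not Blaise's actual data model or API; all names are invented for the illustration.

# Illustrative mixed-mode case record -- not Blaise's data model or API.
from dataclasses import dataclass, field
from enum import Enum

class Mode(Enum):
    CAWI = "web"
    CATI = "telephone"
    CAPI = "face-to-face"

@dataclass
class Attempt:
    mode: Mode
    outcome: str  # e.g. "no contact", "refusal", "complete"

@dataclass
class Case:
    case_id: str
    attempts: list[Attempt] = field(default_factory=list)

    def next_mode(self, sequence: list[Mode]) -> Mode | None:
        # Sequential design: advance to the first mode not yet attempted.
        tried = {a.mode for a in self.attempts}
        return next((m for m in sequence if m not in tried), None)

# A dashboard view is then an aggregation over such case records:
case = Case("0001", [Attempt(Mode.CAWI, "no response")])
print(case.next_mode([Mode.CAWI, Mode.CATI, Mode.CAPI]))  # Mode.CATI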

In a later stage, we also plan to integrate CAVI (video interviewing) and other modes, such as paper-based surveys.

In the presentation, we will demonstrate how to effectively use MMCM to manage a mixed-mode project.


Going Online with a Telephone Employee Survey: Effects on Coverage, Nonresponse, and Total Selection Bias

Dr Joseph Sakshaug (IAB; LMU-Munich) - Presenting Author
Dr Jan Mackeben (IAB)

Telephone surveys have historically been a popular form of data collection in labor market research and continue to be used to this day. Yet, telephone surveys are confronted with many challenges, including imperfect coverage of the target population, low response rates, risk of nonresponse bias, and rising data collection costs. To address these challenges, many telephone surveys have shifted to online and mixed-mode data collection to reduce costs and minimize the risk of coverage and nonresponse biases. However, empirical evaluations of the intended effects of introducing online and mixed-mode data collection in ongoing telephone surveys are lacking. We address this research gap by analyzing a telephone employee survey in Germany, the Linked Personnel Panel (LPP), which experimentally introduced a sequential web-to-telephone mixed-mode design in the refreshment samples of the 4th and 5th waves of the panel. By utilizing administrative data available for the sampled individuals with and without known telephone numbers, we estimate the before-and-after effects of introducing the web mode on coverage and nonresponse rates and biases. We show that the LPP was affected by known-telephone-number coverage bias for various employee subgroups prior to introducing the web mode, though many of these biases were partially offset by nonresponse bias. Introducing the web-to-telephone design improved the response rate but increased total selection bias, on average, compared to the standard telephone single-mode design. This result was driven by larger nonresponse bias in the web-to-telephone design and partial offsetting of coverage and nonresponse biases in the telephone single-mode design. Significant cost savings (up to 50% per respondent) were evident in the web-to-telephone design.
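The coverage/nonresponse offsetting described here follows the standard decomposition of total selection bias for a respondent mean; a minimal numerical sketch with made-up values (not the LPP estimates) is:

# Decomposition of total selection bias for a sample mean,
# using hypothetical values -- not LPP results.
full_sample_mean = 0.40  # benchmark from administrative data, all sampled cases
covered_mean     = 0.44  # cases with a known telephone number
respondent_mean  = 0.42  # actual respondents

coverage_bias    = covered_mean - full_sample_mean      # +0.04
nonresponse_bias = respondent_mean - covered_mean       # -0.02, partially offsetting
total_bias       = respondent_mean - full_sample_mean   # +0.02

print(f"coverage {coverage_bias:+.2f}, nonresponse {nonresponse_bias:+.2f}, "
      f"total {total_bias:+.2f}")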


Dialing the Ideal Design: Optimizing Modes and Practices for Effective Establishment Surveys

Ms Sophie Hensgen (Institute for Employment Research) - Presenting Author
Dr Joseph Sakshaug (Institute for Employment Research)

Achieving high response rates is a recurring challenge for many surveys. Voluntary establishment surveys, however, face unique obstacles, such as intricate business structures, data security protocols, participation during work hours, and complex questionnaires, all of which make it harder to collect high-quality interviews. Further, these surveys lack the enforcement power of mandatory surveys.

Face-to-face interviews can overcome several of the challenges associated with data collection in voluntary establishment surveys, offering higher response rates and good data quality. However, they are expensive, and the shrinking interviewer workforce further necessitates exploring alternatives such as web-based surveys or sequential mixed-mode designs. While the latter can help limit costs and increase response rates, it also introduces mode effects. Considering all these factors, identifying the most effective mode design for establishment surveys is essential.

In the 2024 wave of the IAB Establishment Panel, an experimental setup was employed in which randomly selected groups were assigned to different single-mode or sequential mixed-mode designs to investigate their effectiveness for the refreshment sample.

The aim of this study is to compare response rates, data quality, and selection bias across the assigned groups to determine the most effective survey design. Additionally, we examine the impact of specific design elements, such as dividing the sample into tranches for better management or setting deadlines, on improving response rates. This study provides practical insights and lessons learned for optimizing establishment data collection in an evolving survey landscape.
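As a sketch of how such a group comparison might be run (with fabricated counts, not results from the IAB Establishment Panel), one could tabulate response rates per experimental group and test their equality:

# Illustrative response-rate comparison across mode-design groups;
# all counts are made up for the example.
from scipy.stats import chi2_contingency

groups = {
    "face-to-face only": (420, 580),   # (respondents, nonrespondents)
    "web only":          (290, 710),
    "web-then-F2F":      (400, 600),
}

chi2, p, dof, expected = chi2_contingency([list(v) for v in groups.values()])

for name, (r, nr) in groups.items():
    print(f"{name}: response rate {r / (r + nr):.1%}")
print(f"chi-square test of equal response rates: p = {p:.3f}")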