Survey Participation and Breakoffs
Session Organiser |
Mr Julian B. Axenfeld (University of Mannheim) |
Time: Friday 16 July, 13:15 - 14:45
Achieving and maintaining high survey response rates is a core concern of survey research, both to secure high data quality and to limit costs. This concern only grows as response rates continue to decline globally and pressure to cut costs intensifies. The increasing use of online data collection helps limit costs, but it also brings new challenges, such as higher breakoff rates, noncoverage, and multi-device use.
This session explores new developments in recent research on the prediction and monitoring of survey participation and breakoff, as well as interventions aimed at increasing and improving survey participation. The research presented revolves around when to contact respondents, the effectiveness of push-to-web strategies, and how to present the questionnaire with respect to mobile optimisation, survey topics, and filter questions.
Keywords: response rates, nonresponse, data quality, mixed-mode surveys, web surveys
Mr Marc Asensio Manjon (University of Lausanne) - Presenting Author
Professor Carolina Roberts (University of Lausanne)
Dr Jessica Herzing (University of Lausanne)
Smartphone penetration around the world has increased continuously in recent years. Nevertheless, respondents who choose to complete web surveys on their mobile devices make up only a small proportion of all web survey respondents. Considering the many advantages that smartphones can offer survey practitioners for data collection, increasing smartphone participation is an important focus for research. We assume that web surveys can be fully optimized for smartphones and still achieve levels of participation similar to those of regular web surveys. Moreover, optimization strategies for mobile devices can attract new respondents who would not otherwise access the survey. In this study, we therefore investigate the implementation of two strategies to optimize the mobile-device experience in a web survey: adapting the survey invitation to smartphones by adding a QR code, and conducting the survey through a smartphone application. For this purpose, we compare two web surveys conducted in Switzerland, one including both optimization strategies and the other without them. We address the following questions: 1) Can a web survey designed to attract smartphone respondents reach levels of participation similar to those of a regular web survey? 2) What are the sociodemographic characteristics of these respondents, and how do they differ from the others, if at all? 3) Can we observe differences in the survey process based on fieldwork progress indicators? To answer these questions, we report overall response rates per survey and by device; we compare respondents from both surveys on their sociodemographic characteristics; and we analyze a set of fieldwork progress indicators to estimate the performance of both surveys and assess which one reaches an optimal level of performance sooner.
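A minimal sketch of the kind of fieldwork comparison described above, assuming a combined case-level file; the file name and column names (survey, device, completed) are illustrative assumptions, not the authors' data:

    # Sketch: response rates per survey and by device (hypothetical data layout).
    import pandas as pd

    # Assumed case-level file: one row per sampled person, with invented columns
    # 'survey' (optimized vs. regular), 'device' (smartphone/desktop/none) and
    # 'completed' (1 = responded, 0 = nonrespondent).
    cases = pd.read_csv("fieldwork_cases.csv")

    # Overall response rate per survey arm.
    rr_overall = cases.groupby("survey")["completed"].mean()

    # Response rate by device among those who accessed the survey.
    rr_device = (cases[cases["device"] != "none"]
                 .groupby(["survey", "device"])["completed"].mean())

    print(rr_overall, rr_device, sep="\n")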
Mr Zeming Chen (University of Manchester) - Presenting Author
Mr Alexandru Cernat (University of Manchester)
Mrs Natalie Shlomo (University of Manchester)
Ms Stephanie Eckman (RTI International)
Web surveys have become increasingly popular over the last decade, but they tend to suffer from more breakoffs, which occur when respondents start the survey but do not complete it. Many studies have investigated the factors driving breakoff, but they often ignore breakoff timing and give scant attention to two factors: question topic and filter question format (grouped vs. interleafed, defined by whether filter questions are presented upfront or not). This study examines the effect of these two factors on the breakoff event and its timing using survival analysis. Drawing on a web survey that experimentally manipulates the filter question format and randomly orders the question topics, we find that presenting filter questions in the grouped format leads to fewer breakoffs at the beginning than the interleafed format, but the breakoff risk in the grouped format catches up quickly once respondents realise that their previous answers will trigger more questions. We also find that the insurance topic produces more breakoffs, while the demographic and income topics produce fewer, despite their perceived sensitivity. The study demonstrates the importance of taking the time dimension into account when studying breakoff and offers practical guidance on questionnaire design and breakoff mitigation.
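As an illustration of the survival-analysis approach described above, the following sketch compares breakoff curves between filter-question formats using the lifelines package; the data file and column names (filter_format, pages_completed, broke_off) are assumptions, not the study's actual setup:

    # Sketch: comparing breakoff timing between filter-question formats with
    # lifelines (file and columns are illustrative, not the study's data).
    import pandas as pd
    from lifelines import KaplanMeierFitter

    df = pd.read_csv("breakoff_data.csv")  # one row per respondent

    km = KaplanMeierFitter()
    for fmt, grp in df.groupby("filter_format"):   # 'grouped' vs 'interleafed'
        # Duration = number of pages completed; event = breakoff (1) vs finish (0).
        km.fit(grp["pages_completed"], event_observed=grp["broke_off"], label=fmt)
        km.plot_survival_function()                # survival = still in the survey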
Mr Benjamin Küfner (Institute for Employment Research (IAB)) - Presenting Author
Professor Joseph W. Sakshaug (Institute for Employment Research (IAB))
Dr Stefan Zins (Institute for Employment Research (IAB))
The IAB-Job Vacancy Survey is a voluntary, nationally representative establishment survey that quantifies the size and structure of job vacancies and other worker flows in Germany. Since 2011, it has been carried out using a concurrent mixed-mode design, with establishments receiving paper questionnaires along with the option of online completion. This mode design is facing increasing costs and declining response rates. To counteract these trends, a more pronounced push-to-web strategy offers a promising alternative. However, a change of mode design might affect nonresponse bias and data quality. To test the implementation of a mode design switch, a large-scale experiment comparing four self-administered mode designs was conducted with 155,000 establishments in the fourth quarter of 2020: just online, just paper, a sequential mixed-mode design with an online invitation followed by a paper questionnaire for nonrespondents, and the standard concurrent mixed-mode design. Further, we experimented with a pre-due-date reminder as an additional response enhancement measure intended to motivate establishments to respond earlier. In this paper, we present first results of these experiments on response rates and costs.
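A minimal sketch of how response rates and costs per completed case might be compared across the four experimental arms; the arm labels, cost column, and file name are invented for illustration and are not the authors' materials:

    # Sketch: response rates and cost per completed case across the four
    # experimental mode designs (column names and costs are assumptions).
    import pandas as pd

    cases = pd.read_csv("mode_experiment.csv")  # one row per establishment

    summary = cases.groupby("mode_design").agg(
        n=("completed", "size"),
        response_rate=("completed", "mean"),
        total_cost=("fieldwork_cost", "sum"),   # assumed per-case cost column
    )
    summary["cost_per_complete"] = summary["total_cost"] / (
        summary["n"] * summary["response_rate"])
    print(summary)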
Professor Peter Lynn (University of Essex)
Professor Annamaria Bianchi (University of Bergamo)
Dr Alessandra Gaia (University of Milano-Bicocca) - Presenting Author
Survey researchers using online data collection methods continue to invest in efforts to identify ways of improving response rates. Response speed is also of importance, particularly for surveys that use methods other than email to send reminders or to seek participation from initial nonrespondents (i.e. push-to-web mixed mode surveys), due to the additional costs associated with slow response. Meanwhile, the tools used to improve response rates and response speed have become more sophisticated, particularly various types of adaptive designs. Researchers no longer focus on the average effect of survey design features but are instead interested in the effect on subgroups of particular interest, namely those with otherwise low response rates or a propensity for slow response. This reflects a recognition that both outcomes of interest (response rate, response speed) and the effectiveness of design features that influence the outcomes may vary substantially over sample subgroups.
Panel surveys provide a particularly rich environment for the application of targeted or static adaptive designs, as the wealth of prior information available can be used to identify subgroups with (likely) variation in the outcomes of interest and to inform the choice of design features that might improve outcomes. The design feature of interest in this article is the day of the week on which an invitation to participate is mailed. Using experimental data from the Understanding Society Innovation Panel, we study the interaction between the mailing day of the invitation and the socio-demographic and survey participation characteristics of sample members. The socio-demographic characteristic of interest is economic activity status, as we hypothesise that the association between mailing day and outcomes may vary with it. We also examine the role of prior participation in the panel and previous provision of an email address as moderators of the interaction between mailing day and economic activity status. The latter affects the frequency and volume of reminders that can be sent and could therefore interact with mailing day. Our study thus provides evidence on whether any interaction between mailing day and economic activity status depends on whether an email address is available, allowing invitations and reminders to be sent by email.
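The interaction described here could, for instance, be examined with a logistic regression along the following lines; the variable names and the single-equation specification are assumptions for illustration, not the authors' model:

    # Sketch: does the effect of mailing day on response vary by economic
    # activity status? (variable names and data file are assumptions).
    import pandas as pd
    import statsmodels.formula.api as smf

    panel = pd.read_csv("innovation_panel.csv")

    # Logistic regression of response on mailing day, economic activity status,
    # and their interaction, with email availability as an additional covariate.
    model = smf.logit(
        "responded ~ C(mail_day) * C(econ_activity) + C(has_email)",
        data=panel,
    ).fit()
    print(model.summary())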
Dr Anke Metzler (Darmstadt University of Technology)
Professor Marek Fuchs (Darmstadt University of Technology) - Presenting Author
Web surveys suffer from substantial breakoff, which has the potential to induce breakoff bias, decrease data quality, and produce inaccurate survey estimates. Interactive features have been successfully incorporated into web surveys to compensate for potential increases in item missingness, non-differentiation, or speeding. To adopt this approach for survey breakoff, it is essential to identify respondents at risk of breaking off before they actually drop out of the survey.
For the analyses reported in this paper, we use five web surveys of university applicants conducted in 2013 (n=7,395), 2014 (n=5,996), 2015 (n=4,034), 2016 (n=944) and 2017 (n=545). We assess previous response behavior, such as response time and item nonresponse, as well as characteristics of previous pages and of the breakoff page, such as the number of characters, number of questions, number of answer categories, and question type, to predict subsequent survey breakoff at the question level.
Previous findings suggest that item nonresponse and response time alone do not reliably predict breakoff, in part because their effects on breakoff are non-linear: "careless respondents" (very fast, no item missing), "failing optimizers" (very slow, very high item missing rates) and "lurking dropouts" (very fast, very high item missing rates) are more likely to break off than respondents with intermediate response times and moderate levels of item nonresponse.
In this presentation, we report more sophisticated multilevel analyses modelling the nonlinear effects and the interaction effects of response time and item nonresponse. In addition, we include indicators of page characteristics to allow a more reliable prediction of breakoff. Results suggest that item nonresponse and response time can predict survey breakoff more reliably when indicators of the response burden of the particular survey page are considered in parallel.
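A simplified single-level sketch of this prediction idea (the authors fit multilevel models; this plain logit with quadratic and interaction terms, and all variable names, are assumptions for illustration only):

    # Sketch: predicting page-level breakoff from response time, item
    # nonresponse (nonlinear + interaction) and page-burden indicators.
    import pandas as pd
    import statsmodels.formula.api as smf

    pages = pd.read_csv("page_level_data.csv")  # one row per respondent-page

    model = smf.logit(
        "breakoff ~ resp_time + I(resp_time**2)"
        " + item_nr + I(item_nr**2) + resp_time:item_nr"
        " + n_characters + n_questions + n_categories + C(question_type)",
        data=pages,
    ).fit()
    print(model.summary())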