All time references are in CEST
Improving the representativeness, response rates and data quality of longitudinal surveys

Session Organisers: Dr Jason Fields (US Census Bureau), Dr Nicole Watson (University of Melbourne)
Time: Friday 21 July, 09:00 - 10:30
Room: U6-10
Longitudinal survey managers are increasingly finding it difficult to achieve their statistical objectives within available resources, especially with the changes to the survey landscape brought by the COVID-19 pandemic. Tasked with interviewing in different populations, measuring diverse substantive issues, and using mixed or multiple modes, survey managers look for ways to improve survey outcomes. We encourage submission of papers on the following topics related to improving the representativeness, response rates and data quality of all, but especially longitudinal, surveys:
1. Adaptive survey designs that leverage the strength of administrative records, big data, census data, or paradata. For instance, what cost-quality tradeoff paradigm can be operationalized to guide development of cost and quality metrics and their use around the survey life cycle? Under what conditions can administrative records or big data be adaptively used to supplement survey data collection and improve data quality?
2. Adaptive survey designs that address the triple drivers of cost, respondent burden, and data quality. For instance, what indicators of data quality can be integrated to monitor the course of the data collection process? What stopping rules for data collection can be used across a multi-mode survey life cycle? (A minimal sketch of one such rule follows this list.)
3. Papers involving longitudinal survey designs focused on improving the quality of measures of change over time. How can survey managers best engage with the complexities of such designs? How are overrepresented or low-priority cases handled in a longitudinal context?
4. Survey designs involving targeted procedures for sub-groups of the sample aimed at improving representativeness, such as sending targeted letters, prioritising contact of hard-to-get cases, timing calls to the most productive windows for certain types of cases, or assigning the hardest cases to the most experienced interviewers.
5. Papers involving experimental designs or simulations aimed at improving the representativeness, response rates and data quality of longitudinal surveys.
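To make topic 2 concrete, here is a minimal, hypothetical sketch (in Python) of a phased stopping rule that combines a response rate target with a balance indicator computed from estimated response propensities. The function names, thresholds, and the choice of the coefficient of variation (the companion of the R-indicator) are illustrative assumptions, not a prescribed method:

```python
import numpy as np

def cv_of_propensities(p):
    """Coefficient of variation of estimated response propensities.

    Lower values indicate a more balanced respondent pool; the related
    R-indicator is R = 1 - 2 * sd(p), so both track representativeness.
    """
    return np.std(p) / np.mean(p)

def should_stop(propensities, achieved_rr, target_rr=0.55, cv_threshold=0.20):
    # Hypothetical rule: end a data-collection phase once the target
    # response rate is met AND the respondent pool is sufficiently
    # balanced, i.e. further effort is unlikely to improve representativeness.
    return achieved_rr >= target_rr and cv_of_propensities(propensities) <= cv_threshold

# Illustration with simulated propensities standing in for estimates
# from a response model fitted on frame data or paradata.
rng = np.random.default_rng(42)
p_hat = rng.beta(4.0, 6.0, size=1000)  # placeholder estimates
print(should_stop(p_hat, achieved_rr=0.58))
```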
Dr Sara Möser (University of Bern) - Presenting Author
Dr David Glauser (University of Bern)
Professor Rolf Becker (University of Bern)
Since 2012, the DAB Panel Study has collected longitudinal data on the vocational and educational trajectories of adolescents in German-speaking Switzerland. The sampled individuals have been observed since their eighth school year (2012) and have so far been surveyed ten times (most recently in 2022).
Different material incentives have been used in the DAB Panel Study to achieve the highest possible response rate. The effectiveness of these measures was evaluated in three analyses.
In the fourth wave, a classic randomised experiment was conducted to test the effectiveness of monetary prepaid incentives. A supermarket voucher worth CHF 10 was included in the invitation letter sent by post to the DAB sample. Our evaluation shows that response latency was lower and the probability of completing the questionnaire higher in the treatment group than in the control group.
In subsequent waves, the incentives were systematically varied: in the fifth wave, the same CHF 10 voucher was given to all contactable panellists; in the sixth wave, an engraved pen; and in the seventh wave, CHF 10 in cash. The impact of the tested incentives on willingness to participate and on response rates depends on respondents' subjective evaluation of these gifts. As expected, given its value and universal applicability, cash had the strongest effect on the overall response rate and on latency.
In the tenth survey wave, selected panellists who had either responded late or not at all in the previous survey were promised an additional post-paid incentive of CHF 10 or CHF 20, on top of a prepaid incentive of CHF 10, if they completed the survey within 7 days. The combination of unconditional and conditional incentives reduced both the time to participation and the overall refusal rate.
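For readers unfamiliar with how a randomised incentive experiment of this kind is typically evaluated, the sketch below compares completion rates between a treatment and a control arm with a two-proportion z-test. All counts are invented for illustration; the DAB analyses themselves are not reproduced here:

```python
import numpy as np
from statsmodels.stats.proportion import proportions_ztest

# Hypothetical counts (not the DAB figures): completions among invitees
# in the prepaid-voucher treatment arm vs the no-incentive control arm.
completed = np.array([640, 560])
invited = np.array([1000, 1000])

stat, pval = proportions_ztest(completed, invited)
diff = completed[0] / invited[0] - completed[1] / invited[1]
print(f"completion rate difference: {diff:.3f} (z = {stat:.2f}, p = {pval:.4f})")
```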
Dr Pablo Cabrera-Álvarez (ISER, University of Essex) - Presenting Author
Professor Peter Lynn (ISER, University of Essex)
Early bird incentives (EBIs) are increasingly used to prompt response in push-to-web surveys. Previous studies have shown the effectiveness of this type of conditional, time-limited incentive in fostering early response and, in some instances, increasing final response rates. In addition, in a sequential mixed-mode design that starts with web followed by an interviewer-administered mode, EBIs can reduce survey costs by prompting response during the web stage, thus minimising interviewer effort in the subsequent phase. Although several experiments testing the effectiveness of EBIs have been published in recent years, there is little evidence on the impact of changing or increasing their value in the context of a longitudinal study, where participants will be aware of the change in value.
In this paper, we present an analysis of an experiment embedded in wave 12 of Understanding Society, in which we offered £10 and £20 EBIs to two randomly selected groups of panel members. Results are presented for two subsamples: 1) panel members transitioning from CAPI to a web-first design, who were offered the EBI for the first time, and 2) panel members who had previously taken part in a web-first design and had been offered the £10 EBI. This analysis provides valuable insights for designers of longitudinal surveys on whether it is effective to 1) offer higher EBIs from the start and 2) raise the incentive at a later wave. We evaluate the short-term effect of the manipulation, i.e. in the wave where the experiment was implemented, using response rates at the end of the web-only period and after the CATI stage, the full household response rate, and sample composition.
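As a hypothetical sketch of the design described above (the column names, group labels, and assignment scheme are assumptions for illustration, not the study's actual implementation), random allocation to the £10 and £20 arms can be stratified by subsample:

```python
import numpy as np
import pandas as pd

# Illustrative only: randomise the EBI value within each subsample so the
# two arms are balanced across CAPI-transition and continuing web-first cases.
rng = np.random.default_rng(12)
frame = pd.DataFrame({
    "pidp": range(8),  # person identifier (assumed)
    "subsample": ["capi_transition"] * 4 + ["web_first"] * 4,
})
frame["ebi_arm"] = frame.groupby("subsample")["pidp"].transform(
    lambda s: rng.permutation(np.resize(["EBI_10", "EBI_20"], len(s)))
)
print(frame)
```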
Ms Lisa Marie Natter (Max Planck Institute for the Study of Crime, Security and Law) - Presenting Author
Given the steady decline in response rates over the past few decades, many attempts have been made to counter this trend. Incentives, in particular, have proven very successful. Research on the effects of incentives in web/mail mixed-mode surveys is only just developing; most studies are cross-sectional, and only a few are panel studies, in which response rates are particularly important.
We investigate the effects of a prepaid monetary incentive (5 €) on retention rates. We use data from the second wave of a mixed web/paper panel survey on social capital and insecurity perceptions in urban neighborhoods of two large German cities, fielded in autumn 2020 and 2021 (N ≈ 4,000). At the first contact of the second wave, participants were randomly allocated to a group receiving an unconditional monetary incentive or to a control group. Based on their mode preference in the first wave, participants received either a web invitation only or additionally a paper questionnaire with the first letter, but could choose between both modes during the survey.
We show that the incentive significantly boosted the retention rate by 17.7 percentage points. Incentives helped reduce social selectivity, as they worked especially well for younger persons, second-generation migrants, and persons with low subjective income. However, we did not find differences by gender, education, or welfare dependency, and the effects of incentives did not vary by survey mode. Our results show that incentives not only increased the retention rate but also led to faster responses.
This study helps to maximise the effectiveness of approach strategies by addressing whether monetary incentives can boost retention rates in panel surveys and whether they work equally well in web and paper modes.
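The mode-invariance question can be examined with an incentive × mode interaction in a logistic retention model. The sketch below uses synthetic data with invented variable names and effect sizes purely to show the model form, not the study's actual analysis:

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Synthetic illustration: does a prepaid incentive raise wave-2 retention,
# and does the effect differ by invitation mode? All values are invented.
rng = np.random.default_rng(7)
n = 4000
df = pd.DataFrame({
    "incentive": rng.integers(0, 2, n),  # 1 = received 5 EUR prepaid
    "web_mode": rng.integers(0, 2, n),   # 1 = web-only invitation
})
logit_p = -0.1 + 0.75 * df["incentive"] + 0.1 * df["web_mode"]
df["responded"] = (rng.random(n) < 1 / (1 + np.exp(-logit_p))).astype(int)

# The interaction term tests whether the incentive effect varies by mode.
fit = smf.logit("responded ~ incentive * web_mode", data=df).fit(disp=False)
print(fit.summary())
```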
Dr Oliver Lipps (FORS, Lausanne) - Presenting Author
Mr Max Felder (FORS, Lausanne)
Mr Lukas Lauener (FORS, Lausanne)
Mrs Anna Meisser (FORS, Lausanne)
Mr Nicolas Pekari (FORS, Lausanne)
Dr Line Rennwald (FORS, Lausanne)
Professor Anke Tresch (FORS, Lausanne)
While web surveys often suffer from high attrition in the first waves, less is known about whether this continues in later waves. There is some evidence that while low-participation propensity respondents may only be motivated by high incentives, high-participation propensity respondents in later waves are motivated by factors other than incentives. The question arises whether survey designers can save on incentive costs for the latter group without harming sample size and composition.
In this presentation, we report the results from two experiments in the context of the Swiss Election Survey (Selects): incentives of different values were tested among high-participation propensity respondents (measured in wave 1: the politically more interested who responded quickly) on one hand, and low-participation propensity respondents (all others) on the other:
1.) In wave 5, high-participation propensity respondents were randomised into a group receiving an (expensive) conditional CHF 10 cash incentive and a group receiving an (inexpensive) lottery entry (5 × CHF 300).
2.) In wave 6, low-participation propensity respondents were randomised into a group (continuing to) receive an (expensive) conditional CHF 20 incentive and a group receiving an (inexpensive) lottery entry (5 × CHF 300).
First results show:
1.) for high-participation propensity respondents, only small response rate differences in wave 5, which persist as slightly higher response rates in wave 6 (a carryover effect);
2.) for low-participation propensity respondents, a response rate about 8 percentage points higher under the expensive incentive design in wave 6;
3.) similar sample compositions across the designs.
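A gap of roughly 8 percentage points, as in finding 2, is usually reported with an interval estimate. The sketch below shows the computation with invented arm counts; it does not reproduce the Selects figures:

```python
from statsmodels.stats.proportion import confint_proportions_2indep

# Invented counts for illustration: wave-6 respondents among
# low-participation propensity panellists, CHF 20 cash arm vs lottery arm.
low, high = confint_proportions_2indep(430, 800, 366, 800, method="wald")
print(f"95% CI for the response rate difference: [{low:.3f}, {high:.3f}]")
```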