All time references are in CEST
Boost that respondent motivation! 1

Session Organisers: Dr Marieke Haan (University of Groningen), Dr Yfke Ongena (University of Groningen)
Time: Thursday 20 July, 09:00 - 10:30
Room: U6-01f
Conducting surveys is harder than ever before: the overwhelming number of surveys has led to survey fatigue, and people generally feel less responsible for participating in surveys. The downward trend in survey response rates is a major threat to conducting high-quality surveys, because it introduces the potential for nonresponse bias, leading to distorted conclusions. Moreover, even when respondents decide to participate, they may be reluctant to disclose information because they dislike the topic, find questions too sensitive or too difficult, or are annoyed by the length of the survey.
Therefore, surveyors need to come up with innovative strategies to motivate (potential) respondents to participate. These strategies may be designed for the general population but can also be targeted at specific hard-to-survey groups. For instance, machine learning methods may improve data collection processes (Buskirk & Kircher, 2021), the survey setting can be made more attractive (e.g., by using interactive features or videos), and reluctance to disclose sensitive information may be reduced by using face-saving question wording (Daoust et al., 2021).
In this session we invite you to submit abstracts on strategies that may help to boost respondent motivation. Abstracts can focus on motivating respondents to start a survey, but we also welcome abstracts that focus on survey design that prevents respondents from dropping out or giving suboptimal responses. More theoretically oriented abstracts, for example literature reviews, also fit within this session.
Keywords: nonresponse, innovation, motivation
Miss Elisabeth Falk (The SOM Institute)
Dr Sebastian Lundmark (The SOM Institute) - Presenting Author
Mrs Frida Sandelin (The SOM Institute)
For some time now, surveys have struggled with declining response rates, which in turn have increased the risk of non-response bias (Groves, 2006). Although studies have investigated the impact of nonresponse (e.g., Groves, 2006), fewer have suggested remedies for improving response rates, especially among harder-to-recruit subgroups. In this study, experiments were administered in an attempt to increase response propensities among two hard-to-reach subgroups in Sweden: young people and people born outside the Nordics. The experiments were administered during the fall of 2021. The treatment group among young people was offered a monetary incentive in the form of a digital gift card valid in a wide range of stores (retail value 50 SEK), sent to their email address, whereas the treatment group among people born outside the Nordics was offered an equivalent gift card with a retail value of 99 SEK. Respondents in the control groups were sent a lottery ticket by physical mail. The results indicated that young people who were offered the monetary incentive showed a lower response propensity than the group offered the lottery incentive, whereas no difference in response propensity was detected in the experiment among people born outside the Nordics. In the fall of 2022, two follow-up experiments were administered. The treatment groups among young people were offered either a cinema gift card (retail value ≈130 SEK) or a gift card for Sweden’s largest grocery store chain with a retail value of 75 SEK. The treatment group among people born outside the Nordics was offered a gift card for a well-known café chain (retail value 100 SEK). Respondents in the control group in both experiments were sent a lottery ticket by physical mail.
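The abstract does not describe the statistical test used; purely as an illustrative sketch, a difference in response propensities between a gift-card treatment and a lottery control could be checked with a two-proportion z-test, as below. All counts are hypothetical placeholders, not the study's data.

```python
# Hypothetical illustration only: placeholder counts, not the SOM Institute's data.
from statsmodels.stats.proportion import proportions_ztest

# Number of completed interviews and invited sample persons per condition
responded = [412, 467]    # [gift-card treatment, lottery control]
invited = [1500, 1500]

stat, pval = proportions_ztest(count=responded, nobs=invited)
rates = [r / n for r, n in zip(responded, invited)]
print(f"Response propensity (treatment): {rates[0]:.3f}")
print(f"Response propensity (control):   {rates[1]:.3f}")
print(f"z = {stat:.2f}, p = {pval:.4f}")
```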
Mrs Julia Bergquist (The SOM Institute) - Presenting Author
Dr Sebastian Lundmark (The SOM Institute)
Declining response rates have been an increasing problem in survey research, which may lead to an increased risk of non-response bias and, subsequently, misrepresentations of reality.
Although studies have investigated the impact of nonresponse (Blumberg and Luke, 2007; Groves, 2006; Groves et al., 2012; Groves and Peytcheva, 2008), fewer have suggested remedies for improving response rates, especially among harder-to-recruit subgroups.
In the present study, the impact of adding an unconditional symbolic incentive to the survey was assessed. The assessment was made on a self-administered mixed-mode survey (paper-and-pencil mail-back and web questionnaire). The experiment randomly assigned one group of respondents to be given an unconditional symbolic incentive in the first invitation to complete the questionnaire, while the other group did not receive the symbolic incentive. One main sample containing 44,250 individuals was invited to participate; only individuals between 16 and 90 years of age were invited. The main sample was divided into four sub-samples based on geographical areas. Analyses were done on the main sample as well as on the four sub-samples separately. Prior to being invited to complete the questionnaire, each sample person in all four sub-samples was randomly assigned to one of the two groups.
The results of the experiment will shed light on whether a symbolic unconditional incentive affects response propensities, non-response bias, measurement error, data quality and cost of administration.
Dr Andre Pirralha (LIfBi) - Presenting Author
Dr Roman Auriga (LIfBi)
Dr Friederike Schlücker (LIfBi)
Dr Götz Lechner (LIfBi)
Ms Anna Passmann (LIfBi)
Retaining longitudinal survey participants and keeping them engaged is important because it directly impacts both the quality and the cost of survey data. Research has shown that unconditional incentives, given before the respondent answers the survey, are effective in promoting higher response rates. While between-wave contacts in longitudinal surveys are originally designed to track panel members, they can also have a differential impact on response rates. Whereas the literature on incentives is large, very few studies have explored the effects of between-wave contacts on response rates. Furthermore, to the best of our knowledge, no study has focused on the long-term impact of these contacts on response rates, nor on their effects on the response rates of other members of the household.
In this paper, we evaluate the impact on response rates of between-wave keeping-in-touch mailing contacts, both on their own and combined with tailored gifts and incentives, using data from the German National Educational Panel Study (NEPS). We designed a randomized survey experiment in which schoolchildren were randomly assigned to five different treatment conditions. The effect of the treatment on the response rate is assessed both for the short term, in the subsequent wave (t+1), and for the longer term, two waves after the treatment (t+2). In addition, we assess whether between-wave contacts affect the response rate of parents, who are also asked to participate in the NEPS survey. The preliminary results show that between-wave mailings combined with incentives have a significant positive effect on the schoolchildren’s response rate. Finally, we also discuss the steps taken to tailor both the incentives and the communication materials to the specific target groups of schoolchildren and parents, aiming to optimize respondent engagement.
Miss Mara Verheijen (Centerdata) - Presenting Author
Mr Joris Mulder (Centerdata)
Mr Joost Leenen (Centerdata)
Online panels are nowadays routinely used around the world as a method for gathering survey data for many purposes, and survey research relies heavily on panel members and their responses. Therefore, non-response bias poses a serious threat to drawing reliable conclusions from survey data collected in online panels, as non-respondents can differ from respondents in terms of their characteristics and attitudes.
Even though surveys in the Dutch LISS panel (Longitudinal Internet studies for the Social Sciences) also face the risk of potential survey fatigue and nonresponse, several measures have been successfully deployed to mitigate these risks and keep respondent participation high. The LISS panel, active since 2007, is the only panel in the Netherlands based on a true probability sample of households drawn from the population register by Statistics Netherlands. Households not included in the sample cannot participate, so no self-selection can take place. The panel consists of 5,000 households, comprising approximately 7,500 individuals. The monthly response rate lies between 70% and 85%, depending on the survey topic and the target group. Respondent attrition is about 10 to 12% per year.
In this presentation we discuss the initial procedures for setting up such a highly responsive panel (i.e., sample and recruitment), how to maintain the panel, and how to keep panel members motivated. We show the results of several incentive experiments and discuss how incentives help during the recruitment phase, how they can help prevent attrition during panel participation, and their effects on overall response. We will also briefly discuss what causes attrition in the LISS panel and the effects and results of our ‘sleepers study’, an experiment on stimulating and retaining participation in the LISS panel (via letters, incentives, feedback and language use). Finally, we will touch on applying machine learning to predict optimal survey length.
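The presentation mentions machine learning for predicting optimal survey length only in passing; the following is a minimal sketch of one way such a prediction could be framed, assuming entirely synthetic features and data rather than Centerdata's actual model.

```python
# Hypothetical sketch: synthetic data and features, not the LISS panel's actual model.
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier

rng = np.random.default_rng(42)
n = 5000

# Synthetic panel-member features and survey length (minutes)
age = rng.integers(16, 91, n)
past_response_rate = rng.uniform(0.3, 1.0, n)
survey_minutes = rng.integers(5, 46, n)

# Synthetic "completed the survey" outcome: longer surveys lower completion odds
logit = 2.0 + 2.5 * past_response_rate - 0.08 * survey_minutes + 0.01 * (age - 50)
completed = rng.random(n) < 1 / (1 + np.exp(-logit))

X = np.column_stack([age, past_response_rate, survey_minutes])
model = GradientBoostingClassifier().fit(X, completed)

# For one (hypothetical) respondent, estimate completion probability across candidate lengths
candidate_lengths = np.arange(5, 46, 5)
respondent = np.column_stack([
    np.full_like(candidate_lengths, 25),       # age
    np.full(candidate_lengths.shape, 0.6),     # past response rate
    candidate_lengths,
])
probs = model.predict_proba(respondent)[:, 1]
for length, p in zip(candidate_lengths, probs):
    print(f"{length:2d} min -> predicted completion probability {p:.2f}")
```

In practice, the features, model family, and outcome definition would of course depend on the panel's actual paradata.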
Ms Anouk Zabal (GESIS – Leibniz Institute for the Social Sciences) - Presenting Author
Ms Silke Martin (GESIS – Leibniz Institute for the Social Sciences)
Dr Britta Gauly (GESIS – Leibniz Institute for the Social Sciences)
Dr Sanja Kapidzic (GESIS – Leibniz Institute for the Social Sciences)
Ms Natascha Massing (GESIS – Leibniz Institute for the Social Sciences)
After more than two years of the COVID-19 pandemic, the data collection for the second cycle of PIAAC, the Programme for the International Assessment of Adult Competencies, took place from September 2022 to April 2023. PIAAC is an international survey that measures key skills in a face-to-face interview based on a random sample of adults 16 to 65 years of age. Because face-to-face fieldwork in Germany was severely reduced in the preceding pandemic years, new challenges were to be expected. This contribution will present the German PIAAC approach to motivating survey participation and discuss strategies and experiences during fieldwork.
On the respondent side, one focus of fieldwork preparation was placed on developing appropriate outreach materials and fieldwork measures to boost respondent cooperation, the idea being that with a varied bouquet, target persons from all walks of life would find something that appealed to them.
On the interviewer side, a five-day in-person interviewer training was carried out. Beyond providing comprehensive training on the survey protocols, one of the objectives was to motivate the interviewers and make them enthusiastic ambassadors for the study, and, as such, excellent recruiters. Given that the PIAAC interview takes over two hours on average, another important aspect was to equip interviewers with strategies to motivate respondents not only to participate in such a long interview, but also to maintain their engagement during the interview.
The COVID-19 pandemic did leave its trace on the face-to-face survey field, and the data collection was challenging. Various strategies were explored during fieldwork to tailor and intensify measures to reach and gain cooperation from target persons, although reaching under-represented target groups remained a challenge.