All time references are in CEST
Reducing respondent burden
Session Organiser: Ms Susanne Göttlinger (Statistics Austria)
Time: Tuesday 18 July, 14:00 - 15:30
Room: U6-01f
The lectures allocated to the session “Reducing respondent burden” offer insights into a variety of aspects of the issue.
Georg Wittenburg and Josef Hartmann (Inspirient, Kantar) focus on the evaluation of the impact of sample size on burdens and costs.
Susanne Göttlinger, Andrea Schrott and Pamela Kultscher (Statistics Austria) discuss how to reduce respondent burden using three main approaches: respondent centred communication, respondent centred questionnaire design, and respondent centred survey and workflow design in practice.
Ben Stewart (UK’s Office for National Statistics) elaborates on the importance of focusing on respondents’ needs when designing surveys by listening to them, and demonstrates the practical implementation of the concept of respondent centred design.
Katharina Roßbach, Dag Gravem and Sara Grimstad (Statistics Norway) show how they managed to construct and analyse user journeys during surveys and how they gained useful insights into user experience within the Adult Education Survey.
Altogether, the session will show different approaches to coping with the challenges of nonresponse, dropouts and respondent burden.
Keywords: nonresponse, dropouts, respondent burden
Miss Katharina Roßbach (Statistics Norway) - Presenting Author
Mr Dag F. Gravem (Statistics Norway)
Miss Sara Grimstad (Statistics Norway)
At Statistics Norway, various methods and techniques are commonly used to improve surveys, such as expert reviews, cognitive and usability testing, and focus groups. For a EUROSTAT grant project regarding the Adult Education Survey (AES) 2022 (EU Grants: Application form (SMP ESS): V1.0 – 15.04.2021), we attempted a relatively new approach by conducting a user journey analysis. It can offer unique insights not only into how respondents experience the survey questionnaire, but also into the whole process of participating in a survey.
For the AES survey, our goal was to map the respondent experience for two demographic groups of specific interest for that survey: respondents aged 25-34, and respondents with an educational level below ISCED 3. This provides us with a better perspective on how users in these groups, often underrepresented in net survey samples, experience participation in surveys, what challenges they face, and what influences their motivation to participate.
In this paper, we will present how we managed to construct and analyse a realistic user journey and gain useful insights into users’ experience of participating in the AES. We will discuss the challenges we experienced and give recommendations for future user journey analyses for surveys in general. We will present how the modes and the survey itself affected respondent burden for the two selected groups and present suggestions to reduce response burden.
User journey mappings can help to improve the survey design as well as communication strategies both before and during survey completion. Ultimately, they can shed light on reasons for nonresponse and on how the survey, the mode and other factors affect respondent burden. By tailoring communication strategies and offering help to specific groups, we aim to increase representativity and the positive experience of participating in the AES, as well as to improve data quality.
Dr Georg Wittenburg (Inspirient) - Presenting Author
Dr Josef Hartmann (Kantar)
It is intuitively clear that more can be learned from the initial survey interview of a sample than from the 10,000th interview: assuming random sampling, each new sample element contributes in smaller and smaller increments to the information of all previously collected interviews. While it is a fact that – given a sufficiently high sample size – a significant result may eventually be achieved even for trivial relations, this point of view only partly describes the issue at hand, simply because in real-world surveys each additional sample element comes at a cost. The cost per sample element may be constant in simpler setups, but in practice it increases, as hard-to-reach or hard-to-convince subpopulations may require additional effort. For real-world surveys, conducted under economic constraints, the question of when to stop sampling is thus very much worth revisiting.
The question of optimal sample size is commonly modeled as an a priori problem: the required sample size n is estimated for a given set of parameters, including effect size, significance level and statistical power (power analysis). Complementing this perspective, we propose to treat the determination of sample size as an adaptive problem, i.e., one that tracks effect size and significance metrics as sampled elements come in. We propose to observe the rate of convergence of these metrics while the survey is still in progress, and thus have the opportunity to stop as soon as saturation sets in. We have validated this approach on a number of real-world survey datasets and found that in some cases comparable results regarding effect size and overall significance levels could be reached with fewer than half of the number of cases actually taken. The results imply fewer respondents and thus less respondent burden.
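To make the idea concrete, the following minimal sketch (not the authors’ actual implementation) illustrates one possible adaptive stopping rule for a simple two-group comparison of proportions: the effect-size estimate is recomputed after each incoming interview, and sampling stops once the estimate has moved by less than a small tolerance over a window of recent cases. The simulated response probabilities, the window size and the tolerance are hypothetical choices made purely for illustration.

```python
# Minimal sketch of adaptive stopping for a two-group comparison of proportions.
# The response probabilities, window size and tolerance are illustrative
# assumptions, not the authors' actual saturation criterion.
import random

random.seed(42)

def simulate_interview(group):
    """Hypothetical response: group A agrees with prob. 0.55, group B with 0.45."""
    p = 0.55 if group == "A" else 0.45
    return 1 if random.random() < p else 0

def effect_size(successes_a, n_a, successes_b, n_b):
    """Difference in sample proportions as a simple effect-size metric."""
    return successes_a / n_a - successes_b / n_b

def adaptive_sample(max_n=10_000, window=200, tol=0.005):
    """Keep interviewing until the effect-size estimate stops moving."""
    sa = sb = na = nb = 0
    history = []
    for i in range(1, max_n + 1):
        group = "A" if i % 2 else "B"
        outcome = simulate_interview(group)
        if group == "A":
            sa, na = sa + outcome, na + 1
        else:
            sb, nb = sb + outcome, nb + 1
        if na and nb:
            history.append(effect_size(sa, na, sb, nb))
        # Saturation check: the estimate has changed by less than `tol`
        # over the last `window` interviews.
        if len(history) > window and abs(history[-1] - history[-window]) < tol:
            return i, history[-1]
    return max_n, history[-1]

n_used, est = adaptive_sample()
print(f"Stopped after {n_used} interviews; estimated effect size {est:.3f}")
```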
Ms Susanne Göttlinger (Statistics Austria) - Presenting Author
Dr Andrea Schrott (Statistics Austria)
Ms Pamela Kultscher (Statistics Austria)
Statistics Austria strives to reduce respondent burden by applying three main approaches: respondent centred communication, respondent centred questionnaire design, and respondent centred survey and workflow design. Our goal is not only to reduce effort and boredom among respondents but also to raise their motivation and interest. Welcome side effects are better response rates, representativity, and data quality.
Our respondent communication follows the principles of plain language. Visual design by professional graphic designers optimizes readability, comprehensibility and attractiveness. Tailored icons and colours, QR codes, boxes, and well-considered fonts and formatting structure and shape our message. We vary paper formats, linguistic styles and the depth of information to reach a variety of respondent groups.
Statistics Austria’s respondent-centred questionnaire design guidelines and standards are to be observed by all social surveys wherever possible. We focus on the respondents’ perspectives and use the respondents’ vocabulary. Our goal is to address even hard-to-reach groups, and thus to reduce selection bias and enhance motivation, data quality, and response rates.
Our in-house developed software enables online questionnaires on mobile devices and on desktop or laptop computers without the interposition of an interviewer. The Covid-19 pandemic has shown the great advantages of this strategy. Tailored questionnaire design allows us to apply exact automatic routing. We even adapt the wording of questions to previous answers on parameters like sex, age or number of household members.
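As a purely illustrative sketch (the in-house software and its actual rules are not described here), answer-dependent routing and wording might look as follows; the question texts, attribute names and thresholds are hypothetical assumptions.

```python
# Hypothetical sketch of automatic routing and answer-dependent wording.
# The question texts, attribute names and rules are illustrative assumptions only,
# not Statistics Austria's actual in-house implementation.

def next_question(respondent):
    """Route and tailor the wording of the next question from earlier answers."""
    # Routing: skip the employment block for respondents below working age.
    if respondent["age"] < 15:
        return None

    # Wording adapted to the household size recorded earlier in the interview.
    if respondent["household_size"] == 1:
        return "Were you employed last week?"
    return (f"Thinking of the {respondent['household_size']} persons in your "
            f"household: were you yourself employed last week?")

# Example: a 34-year-old respondent in a three-person household.
print(next_question({"age": 34, "household_size": 3}))
```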
We frequently use different channels for contacting and re-contacting respondents in (panel) surveys, for example mail, e-mail, text messages, telephone calls and personally handed-over invitation cards. A predesigned automatic workflow for each survey schedules the distribution of these means of contact, the start and end of the survey phase, the different modes and the mode switches.
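Such a predesigned contact workflow could be represented as a simple list of scheduled steps, as in the sketch below; the channels, day offsets and the mode switch shown are hypothetical and only illustrate the concept.

```python
# Hypothetical contact workflow for one survey phase; the channels, day offsets
# and the mode switch are illustrative assumptions, not an actual Statistics
# Austria schedule.
contact_workflow = [
    {"day": 0,  "channel": "mail",  "action": "invitation letter with web login"},
    {"day": 7,  "channel": "email", "action": "reminder with survey link"},
    {"day": 14, "channel": "sms",   "action": "short reminder text message"},
    {"day": 21, "channel": "phone", "action": "mode switch to telephone interview"},
    {"day": 35, "channel": None,    "action": "end of survey phase"},
]

for step in contact_workflow:
    channel = step["channel"] or "-"
    print(f"Day {step['day']:>2}: {channel:<5} -> {step['action']}")
```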
Mr Ben Stewart (Office For National Statistics) - Presenting Author
Too often, surveys are designed for the analyst rather than the respondent, resulting in cumbersome surveys with low response rates. Increasing demand for self-completion surveys means we can no longer rely on skilled interviewers to improve the experience and secure responses. It’s time to achieve survey goals by empowering our respondents. The speaker shares an innovative approach to survey design, called ‘Respondent Centred Design’ (RCD). It challenges the status quo by putting respondents’ needs at the heart of survey development. It encourages the designer to stop designing based on assumptions and instead to listen to respondents’ needs. Only then can we design a survey that provides a great experience, alongside collecting more accurate data and without compromising the analyst’s needs. Overall, this approach aims to refocus our investment on the beginning of the data lifecycle, the design phase, by prioritising the respondent and enhancing their end-to-end experience.
This method is being used at the UK’s Office for National Statistics (ONS) to transform the Labour Force Survey. The theory of the RCD process has been presented to several international audiences over the last few years. In this session, the speaker will focus on demonstrating RCD in action, using case studies which showcase the process. The case studies discussed are based on projects completed during 2022. These projects were undertaken with considerable constraints on resources and time. This will allow the audience to understand how RCD might be applied in their own work, especially to smaller or fast-paced projects.
Drawing on their experience, the speaker will:
• Briefly share why moving to RCD is necessary
• Explain how to design respondent centred surveys (including the RCD Framework)
• Demonstrate practical application of the RCD Framework through ONS case studies
• Focus on application to projects with tight deadlines