All time references are in CEST
Responsive and Adaptive Surveys: Are they really addressing current Data Collection challenges?

Session Organisers: Dr Dimitri Prandner (Johannes Kepler University of Linz), Professor Patrick Kutschar (Paracelsus Medical University), Professor Martin Weichbold (Paris Lodron University of Salzburg), Mr Christopher Etter (Paris Lodron University of Salzburg)
Time: Tuesday 18 July, 09:00 - 10:30
Room:
Recent evidence challenges the longstanding reliance on rigid, single-mode surveys. Individual differences in motivation to participate, preferred survey modes, and preferred question formats suggest the need for more flexible, participant-tailored approaches. Responsive and adaptive designs (RAD), which allow sampling and surveying methods to be tailored to different populations, survey topics, and data collection contexts, have therefore gained traction over the last few years.
Methodological research has consistently shown that both pre-planned conditional survey paths and situational, dynamically adjusting data collection procedures can improve cost efficiency and data quality. It is often argued that using RAD to adapt survey methods in real time, so that design features are optimally aligned with respondent characteristics, improves overall measurement quality.
However, while RAD can mitigate certain sources of error and bias, it may also introduce new ones. Within the total survey error (TSE) framework, RAD-related design trade-offs can affect error sources in both TSE components: representation (e.g., refusals, nonresponse) and measurement (e.g., interviewer effects, context effects).
We invite theoretical, conceptual, and empirical papers from laboratory and field research (small to large scale) that address the implications of RAD for data quality. Topics of interest include, but are not limited to:
• Tailored contact strategies and survey modes (e.g., integration of RAD and push-to-web approaches, individualized incentives)
• Adaptive changes to design features during the interview (e.g., proxies, mixed mode/methods, instruments, question difficulty, format or layout, visuals and pictures)
• Predictors for and RAD application in certain respondent groups and specific populations (e.g., vulnerable populations)
• The role of advanced technologies in real-time data monitoring and adjustment (e.g., AI-assisted adaptive procedures, machine learning)
• The use of auxiliary data to inform adaptive survey design
• Strategies and
Keywords: responsive and adaptive designs; sampling; response rate; data quality
Mr Simon Moss (National Centre for Social Research (NatCen))
Ms Line Knudsen (National Centre for Social Research (NatCen)) - Presenting Author
Ms Noémie Bourguignon (National Centre for Social Research (NatCen))
Ongoing technical education reforms in England, initiated by the previous government, aim to improve the quality of technical education. The Technical Education Learners’ Survey (‘Tech Ed’) is designed to monitor the impact of these reforms. To maximise the use of a limited budget for telephone fieldwork, cases were prioritised on the basis of their modelled likelihood of responding online. Implementing responsive design improved the achieved sample profile and response rates among sub-groups of the target population.
The initial waves of the ‘Tech Ed’ Study followed up with different cohorts of learners in multiple waves of longitudinal data collection using a ‘web-first’ approach, with a series of reminders sent to prompt self-completion. Follow-up Computer Assisted Telephone Interviewing (CATI) was then used to increase response rates.
To prioritise cases for follow-up telephone interviewing, unproductive cases were assigned to batches after the start of fieldwork, based on their modelled likelihood of responding online. The final model variables include sex, age, ethnicity, deprivation rank and additional auxiliary variables from the National Pupil Database (NPD). Cases were ordered from lowest predicted productivity to highest and then contacted in that priority order by telephone interviewers. This approach made the best use of interviewer effort and resources, with the added benefit of improving response rates and sample quality.
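The prioritisation step described above can be sketched as follows. This is a minimal illustration, not the study's actual code: the coefficients, variable names (`age`, `deprivation_rank`) and batch logic are invented for the example, whereas the real model was fitted on NPD auxiliary data.

```python
import math

# Hypothetical coefficients for a logistic response-propensity model.
# The real Tech Ed model used sex, age, ethnicity, deprivation rank and
# National Pupil Database variables; these weights are invented.
COEFS = {"intercept": -0.2, "age": 0.03, "deprivation_rank": -0.15}

def online_propensity(case):
    """Predicted probability of responding online (illustrative only)."""
    z = (COEFS["intercept"]
         + COEFS["age"] * case["age"]
         + COEFS["deprivation_rank"] * case["deprivation_rank"])
    return 1.0 / (1.0 + math.exp(-z))

def prioritise_for_cati(unproductive_cases, batch_size):
    """Order cases from lowest to highest predicted online propensity,
    then split them into batches for telephone follow-up."""
    ordered = sorted(unproductive_cases, key=online_propensity)
    return [ordered[i:i + batch_size]
            for i in range(0, len(ordered), batch_size)]

cases = [
    {"id": 1, "age": 17, "deprivation_rank": 2.0},
    {"id": 2, "age": 19, "deprivation_rank": 0.5},
    {"id": 3, "age": 18, "deprivation_rank": 3.0},
]
batches = prioritise_for_cati(cases, batch_size=2)
```

Sorting ascending means interviewers first contact the cases least likely to ever complete online, which is where telephone effort adds the most value.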
With an emphasis on response, the presentation will outline and discuss the approach taken to prioritise cases for follow-up telephone interviewing, as well as its impact on response rates among sub-groups of the target population. It will also reflect on using responsive design to reduce bias and improve sample quality cost-effectively in longitudinal and panel studies that use a web-CATI approach.
Ms Vanessa Schmieja (Forschungszentrum Jülich) - Presenting Author
Dr Hawal Shamon (Forschungszentrum Jülich)
Professor Dirk Temme (Bergische Universität Wuppertal)
Standardized surveys face the challenge that participants may have individual preferences and needs that deviate from standard recommendations for questionnaire design. Customization options are a means of responding to these individual preferences and needs. On the one hand, tailoring the questionnaire to each participant (e.g., Dillman et al. 2014) might promote optimizing behavior during survey participation. On the other hand, customization is also likely to increase respondent burden and to limit the comparability of participants who have chosen different adaptations. Given these advantages and disadvantages, it should be examined to what extent customization options contribute to higher data quality beyond following standard recommendations for questionnaire design.
To investigate this research question, we randomly assigned participants of an online survey conducted in 2024 to four groups, which differed in questionnaire quality (i.e., high vs. low) and in customization options (i.e., with vs. without). In the two groups with customization options, participants could make changes to the survey themselves: for example, they were offered additional comment fields and the opportunity to compare their own responses with other people's responses from previous surveys. In addition to a regular version, special versions for people with color blindness and for people who use a screen reader were selectable. Analyses will be completed in the coming weeks and presented at the ESRA conference.
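The 2x2 factorial design (questionnaire quality x customization options) can be sketched as a balanced random assignment. This is an illustrative sketch only; the abstract does not describe the study's actual assignment mechanics, and the function and seed here are invented.

```python
import random
from itertools import product

# The two experimental factors described in the abstract.
QUALITY = ("high", "low")            # questionnaire quality
CUSTOMIZATION = ("with", "without")  # customization options offered

def assign_groups(participant_ids, seed=2024):
    """Shuffle participants, then deal them round-robin into the four
    2x2 cells so that group sizes stay balanced (illustrative only)."""
    rng = random.Random(seed)
    ids = list(participant_ids)
    rng.shuffle(ids)
    cells = list(product(QUALITY, CUSTOMIZATION))
    return {pid: cells[i % len(cells)] for i, pid in enumerate(ids)}

groups = assign_groups(range(8))
```

Shuffling before the round-robin deal keeps assignment random while guaranteeing equal cell sizes, which simplifies the planned between-group comparisons.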