ESRA 2025 Preliminary Program
All time references are in CEST
Optimizing Probability-Based Web Panel Performance 3
Session Organisers:
Professor Vasja Vehovar (University of Ljubljana, Faculty of Social Sciences)
Dr Gregor Čehovin (University of Ljubljana, Faculty of Social Sciences)
Ms Andreja Praček (University of Ljubljana, Faculty of Social Sciences)
Time: Wednesday 16 July, 09:00 - 10:30
Room: Ruppert A - 0.21
In contemporary social science research, web surveys have become increasingly prevalent, driven by the rising costs of traditional survey modes. Due to the high costs of recruiting sample units, panels are frequently employed for web surveys. Probability-based web panels play a particularly important role by selecting participants with a known chance of inclusion, thereby offering a more accurate representation of the population. These panels may also combine web surveys with other modes (e.g., phone or face-to-face) to reach nonrespondents. Panel studies face numerous challenges, including recruiting respondents, establishing and maintaining panels, optimizing data quality and cost-efficiency, minimizing biases, managing mixed survey modes, and retaining respondents across waves while ensuring response quality.
Submissions are invited on methods and strategies related to the improvement and optimization of probability-based panels, focusing on the following processes:
• Assessing and reducing item nonresponse and unit nonresponse, and mitigating the impact of nonresponse bias;
• Investigating the relationship between costs and overall response quality, including careless responding and satisficing;
• Enhancing sampling techniques, incentive strategies, and recruitment methods to improve initial response rates and respondent retention;
• Comparing response quality between web surveys and other modes, as well as across different device types;
• Assessing response quality and nonresponse bias across panel waves;
• Refining questionnaire design and layout to improve response quality.
Keywords: probability-based panels, web surveys, mixed-mode surveys, survey costs, response quality, nonresponse bias, experimental survey research, survey design
Papers
Applying a Panel Member-First Approach on KnowledgePanel: Lessons to Improve Panel Performance
Mr Nick Bertoni (Ipsos Public Affairs) - Presenting Author
KnowledgePanel in the U.S. is an online, probability-based panel launched in 1999 that has undergone numerous transformative changes since its inception. The platform has seen significant advancements in areas such as recruitment, sampling, weighting, and surveying, making it a highly accurate and trusted online, probability-based data collection vehicle over the last two decades. Federal agencies such as the CDC, NIH, and FDA have relied upon KnowledgePanel for their data collection needs.
Benchmarking against federal point estimates is one common way that researchers can evaluate and quantify the quality of online probability panels. This is an area where KnowledgePanel has excelled. Beyond this tangible measure is an intriguing question: Are there other management strategies that can directly enhance the quality of data collection, even if they are harder to define or measure?
In 2023, the Ipsos KnowledgePanel team set out to rethink panel management by adopting a “panel member-first" design. Being panel member-first means being intentional about managing the panel member experience in a way that promotes and improves member engagement. The tone, cadence, and overall content of messaging was overhauled in an attempt to increase engagement. Occasional extra rewards along the way can also be a pleasant surprise for members. The impact of this approach was clearly seen within the first year of implementation, as this presentation will demonstrate. KnowledgePanel saw substantial improvements in panel retention and completion rates in 2023. The panel performance in 2024 has been even better. Changes in methodology, messaging, and incentivization will be presented to demonstrate how treating panel members better can directly lead to better panel performance. These lessons learned can be used to inform panel managers of all varieties.
Improving the Catalan Citizen Panel through Adaptive Survey Design
Mr Joel Ardiaca (Universitat de Barcelona) - Presenting Author
Professor Jordi Muñoz (Universitat de Barcelona)
Dr Raül Tormos (Centre d'Estudis d'Opinió)
We have initiated a large-scale probabilistic panel project in Catalonia, utilizing both web and paper survey modes. This approach allows us to systematically study nonresponse biases across various sociodemographic profiles, providing detailed insights into the factors influencing participation rates. To improve response rates and mitigate nonresponse biases, we have conducted a series of recruitment experiments focusing on incentives and reminders. These experiments evaluate the effectiveness of diverse strategies, such as varying the type and amount of incentives offered and the frequency of reminders. Leveraging the data collected from these experiments, we have developed predictive models to estimate nonresponse probabilities based on participants’ characteristics and their previous responses in refreshment samples.
With access to an extensive dataset of nearly 90,000 cases, we utilize machine learning models to better understand and predict nonresponse behavior. Our focus extends beyond improving response rates; instead, we prioritize enhancing the representativity of the panel to achieve a more balanced and accurate sample. We also evaluate data quality to ensure that methodological innovations lead to more reliable results. Crucially, the Catalan Citizen Panel provides a unique opportunity to empirically test the effectiveness of Adaptive Survey Design (ASD), demonstrating how tailored protocols can optimize treatment allocation and improve samples for public opinion research.
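The abstract does not spell out how predicted nonresponse probabilities feed into treatment allocation. As a purely illustrative sketch of the ASD step it describes, the Python snippet below fits a propensity model on simulated past-wave data and routes low-propensity cases to an enhanced protocol; all variable names, features, and the 0.5 cutoff are hypothetical, not the project's actual specification.

```python
# Hypothetical sketch of an adaptive-survey-design step: predict response
# propensity from past-wave data, then target a costlier protocol (e.g., a
# higher incentive) at low-propensity cases. Data are simulated stand-ins.
import numpy as np
import pandas as pd
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)
n = 1000
panel = pd.DataFrame({
    "age": rng.integers(18, 90, n),
    "urban": rng.integers(0, 2, n),
    "prior_waves_completed": rng.integers(0, 6, n),
})
# 1 = responded in the last wave, 0 = did not (simulated here)
panel["responded"] = rng.binomial(1, 0.6, n)

features = ["age", "urban", "prior_waves_completed"]
model = RandomForestClassifier(n_estimators=200, random_state=0)
model.fit(panel[features], panel["responded"])

# Predicted response propensity for the next wave
panel["p_respond"] = model.predict_proba(panel[features])[:, 1]

# Simple allocation rule: low-propensity cases get the enhanced protocol
panel["protocol"] = np.where(panel["p_respond"] < 0.5,
                             "high_incentive", "standard")
print(panel["protocol"].value_counts())
```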
Testing the impact of a financial incentive for early bird registration for a probability panel
Dr Amelie Van Pottelberge (Universiteit Gent) - Presenting Author
Miss Katrien Vandenbroeck (Katholieke Universiteit Leuven)
Dr Gert Thielemans (Universiteit Antwerpen)
Dr Bart Meuleman (Katholieke Universiteit Leuven)
Dr John Lievens (Universiteit Gent)
Empirical evidence has demonstrated that cash incentives can be an effective way to increase participation in survey research (e.g. Göritz, 2006; Singer & Ye, 2013). Although unconditional incentives are known to produce the strongest effects, several studies have argued that providing an early bird incentive – that is, an additional incentive that is conditional upon participation before a specific deadline – can further boost response (Friedel et al., 2023; McGonagle, Sastry & Freedman, 2023).
This paper discusses the effectiveness of introducing a conditional monetary early bird incentive, in addition to an unconditional monetary incentive, in recruiting panelists for The Social Study. The Social Study is a Belgian mixed-mode probability panel facilitating survey research by offering panelists the option to complete questionnaires online or on paper. The early bird incentive is paid conditionally upon panel registration within 18 days of receiving the initial postal mail invitation. During the first stage of recruitment in 2023 and 2024, a large-scale experiment (N = 6,066) was conducted, offering half of the sample units the early bird incentive. The experiment tests the effect of introducing an early bird cash incentive on response rates, recruitment rates, the cost-effectiveness of the recruitment strategy, sample representativeness, and panelists’ engagement after recruitment.
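The abstract reports the experimental design but not the analysis. As a minimal illustration of how a two-arm incentive experiment like this one could be evaluated, the sketch below runs a two-sample proportions test on registration rates; all counts are invented placeholders, not the study's results.

```python
# Illustrative analysis of a two-arm early-bird experiment: compare
# registration rates between treatment and control. Counts are made up.
from statsmodels.stats.proportion import proportions_ztest

registered = [820, 740]   # hypothetical registrations per arm
invited = [3033, 3033]    # half of N = 6,066 in each arm

z, p = proportions_ztest(count=registered, nobs=invited)
print(f"early-bird rate: {registered[0] / invited[0]:.3f}")
print(f"control rate:    {registered[1] / invited[1]:.3f}")
print(f"z = {z:.2f}, p = {p:.4f}")
```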
Improving Nonresponse Prediction in Online Probability-Based Panels: Evaluating Machine Learning Approaches in the Context of Varying Engagement Histories
Ms Ziyue Tang (The University of Manchester) - Presenting Author
Dr Alexandru Cernat (The University of Manchester)
Mr Curtis Jessop (National Centre for Social Research (NatCen))
Professor Natalie Shlomo (The University of Manchester)
Predicting nonresponse in complex panel surveys, where respondents join at different stages or participate in various studies over time, presents persistent challenges due to the diverse and dynamic nature of participation patterns. Many probability-based panels regularly recruit new participants, either continuously or periodically, to mitigate biases caused by attrition and ageing. Among them, the NatCen Opinion Panel collects extensive information from both existing participants and newly recruited ones, who are recruited annually through sources such as the British Social Attitudes Survey. Despite the richness of these data, only a small portion is typically leveraged in prediction models, constrained by issues such as missing values, missing variables, and complex participation histories. This study leverages machine learning techniques to enhance nonresponse prediction models in panel surveys. Logistic regression is adopted as a baseline model, incorporating multiple variables (e.g., demographic information, socioeconomic characteristics, and response history). Advanced machine learning models, including Random Forests, Gradient Boosting, and Recurrent Neural Networks, are applied to better capture temporal dependencies and response patterns. Furthermore, this study explores the moderating role of panel tenure, investigating how the length of a respondent’s participation moderates the relationship between response propensities and other predictors. Model performance is evaluated using metrics such as Area Under the Curve and prediction error rates to assess accuracy across subgroups. These prediction models are designed to improve data collection by addressing nonresponse challenges and enhancing representativeness. The findings are expected to demonstrate how machine learning can effectively address challenges in nonresponse prediction, providing actionable insights for improving adaptive survey designs in large, complex panel studies (e.g., varying incentives or optimising reminder frequencies). This study seeks to provide a solid foundation for improving the design and implementation of survey research, while advancing the application of machine learning in social surveys.
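To make the model-comparison step concrete, here is a minimal sketch of the kind of evaluation the abstract describes: a logistic regression baseline against gradient boosting for predicting wave nonresponse, compared by AUC. The data, features, and effect sizes below are simulated stand-ins, not the NatCen Opinion Panel data.

```python
# Sketch: baseline logistic regression vs. gradient boosting for
# nonresponse prediction, evaluated by AUC on a held-out split.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(1)
n = 5000
X = np.column_stack([
    rng.integers(18, 90, n),   # age
    rng.integers(0, 10, n),    # waves since recruitment (panel tenure)
    rng.binomial(1, 0.7, n),   # responded to previous wave
])
# Simulated nonresponse depending weakly on tenure and past response
logit = -0.5 + 0.1 * X[:, 1] - 1.0 * X[:, 2]
y = rng.binomial(1, 1 / (1 + np.exp(-logit)))

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
for name, model in [("logistic", LogisticRegression(max_iter=1000)),
                    ("boosting", GradientBoostingClassifier(random_state=0))]:
    model.fit(X_tr, y_tr)
    auc = roc_auc_score(y_te, model.predict_proba(X_te)[:, 1])
    print(f"{name}: AUC = {auc:.3f}")
```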
The role of survey experience in determining subsequent nonresponse in an online probability panel: a survival analysis
Mrs Katya Kostadintcheva (London School of Economics and Political Science) - Presenting Author
Professor Patrick Sturgis (London School of Economics and Political Science)
Professor Jouni Kuha (London School of Economics and Political Science)
Online probability panels are an increasingly common feature of the modern survey landscape. Their design is based on recruiting a random sample of respondents who agree to complete surveys at regular intervals for small incentives. However, compared to interviewer-based survey panels, they are characterised by considerably higher rates of nonresponse at each panel wave, in addition to the already low initial recruitment rates, thus making the cross-sectional response rate for this type of design very low. Given their increasing prevalence, it is essential to better understand the factors that lead online panel respondents to decline survey invitations. In this paper we examine how different measures of survey experience affect subsequent nonresponse to survey invitations.
This research uses data from the Verian Public Voice, a commercially operated online probability panel, which is used for repeated cross-sectional studies. We employ a discrete-time survival analysis where the outcome is respondents’ first nonresponse to a survey invitation, following an earlier survey completion. This approach accommodates the unbalanced data structure typical of such panels, where some panel members receive more frequent survey invitations than others based on their response propensity.
We find that several aspects of the survey experience influence respondents’ propensity to respond to the next survey invitation. These include the extent to which respondents report enjoying the survey, the number of days since the last invitation, the average survey duration and the individual survey length.
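For readers unfamiliar with the setup, a discrete-time survival model of this kind can be fit as a logistic regression on person-period data, where each panel member contributes one row per invitation until their first nonresponse. The sketch below illustrates this structure with simulated placeholder data and hypothetical covariates (enjoyment, days since invitation); it is not the authors' model or data.

```python
# Sketch of a discrete-time survival analysis: one row per person per
# invitation until first nonresponse; logistic regression on the hazard.
# All data here are simulated placeholders.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(2)
rows = []
for person in range(500):
    enjoyment = rng.uniform(0, 10)          # reported survey enjoyment
    for wave in range(1, rng.integers(2, 12)):
        days_since_invite = rng.integers(7, 60)
        hazard = 1 / (1 + np.exp(-(-2.0 - 0.15 * enjoyment
                                   + 0.02 * days_since_invite)))
        event = rng.binomial(1, hazard)     # 1 = first nonresponse occurs
        rows.append((person, wave, enjoyment, days_since_invite, event))
        if event:
            break                           # censored after the event

df = pd.DataFrame(rows, columns=["person", "wave", "enjoyment",
                                 "days_since_invite", "event"])
# Discrete-time hazard model: logit of first nonresponse per person-period
model = smf.logit("event ~ wave + enjoyment + days_since_invite", df).fit()
print(model.summary())
```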