ESRA 2025 Preliminary Glance Program
All time references are in CEST
Good, fast, cheap: Pick two – Optimizing Sampling Strategies for Modern Survey Research 3
Session Organisers: Professor Sabine Zinn (DIW-SOEP / Humboldt University Berlin), Dr Hans Walter Steinhauer (Socio-Economic Panel at DIW)
Time: Thursday 17 July, 13:45 - 15:00
Room: Ruppert A - 0.21
Survey research is increasingly adapting to the demands of fast-paced environments where timely, reliable data is crucial, often within limited budgets. To meet these demands, researchers frequently use non-random sampling and online data collection, which provide quick results but may lack reliability. Traditional methods that ensure accuracy are slower and more costly, yet essential for scientific research and policymaking.
This session invites contributions on the practical use of sampling frames for generating random samples, such as population registers or geo-referenced databases. We are also interested in research on non-probability sampling methods, including data from social media, affordable access panels like Prolific, and respondent-driven sampling schemes. Our goal is to examine the pros and cons of these sampling strategies, focusing on coverage, bias, generalizability, cost, and speed.
We seek discussions on optimal sampling approaches tailored to specific study needs, where researchers must balance the urgency of obtaining rapid results with the need for high-quality studies that can inform policy recommendations.
We invite submissions on:
- Innovative sampling frames for social science surveys
- Combining different sampling frames to enhance data quality and timeliness
- Methods for improving accuracy while enabling quick data access
- Cost analyses of various sampling strategies
- Experiences using fast-access data from web providers like Prolific and Respondi for social science research
Through these discussions, we aim to guide the development of more effective and efficient approaches to survey research in today’s fast-paced data environment.
Keywords: random sampling, non-random sampling, combining sampling frames
Papers
A Comparison of Screening Techniques to Increase the Efficiency of Mobile RDD Samples in Europe
Ms Carolyn Lau (Pew Research Center) - Presenting Author
Ms Georgina Pizzolitto (Pew Research Center)
Ms Sofi Sinozich (Pew Research Center)
Dr Patrick Moynihan (Pew Research Center)
Random-digit-dial (RDD) surveys have become less popular over the past decade or so as response rates dropped to the low single digits and the cost and effort required to reach respondents rose significantly. Yet unlike samples for other probability-based methods, RDD samples are widely available throughout Europe and easily accessible to researchers, offering high population coverage and the potential for relatively quick data collection. RDD surveys can therefore still hold an important place in the researcher’s toolkit, particularly if some of their inefficiencies are addressed.
One such inefficiency is the need to dial thousands of numbers to reach the target number of respondents. If the sample can be screened ahead of time to remove likely nonworking numbers, fieldwork becomes much more manageable. Pew Research Center’s annual Global Attitudes Project (GAP), which has traditionally used unfiltered RDD samples in Europe, provides an opportunity to test three screening methods for mobile samples:
(1) Home Location Register (HLR) lookup, which queries a mobile network operator’s database of numbers to determine their working status,
(2) Activity flags that are assigned to a number based on use of social media and messaging apps, and
(3) Silent SMS, which sends an SMS message to a mobile number without any notification on the recipient’s end and uses the delivery report to determine its working status.
These methods will be tested, as available, during the GAP 2025 cycle in France, Germany, Italy, the Netherlands, Spain, and the UK. The analysis will assess the accuracy of these methods in identifying working and nonworking numbers and examine the potential noncoverage implications of using screened samples – that is, how would sample demographics and attitudinal estimates change if the sample were restricted to the numbers flagged as working by each of the three methods?
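To make the screening step concrete, a minimal sketch follows; it is not the authors' code, and the record fields, flag semantics, and example numbers are assumptions made purely for illustration. It shows how an RDD number bank might be filtered by each of the three signals and how the retained share of numbers could be compared across methods.

```python
# Illustrative sketch only: field names, flag semantics, and example numbers are
# hypothetical and not taken from the GAP 2025 design.
from dataclasses import dataclass

@dataclass
class SampledNumber:
    phone: str
    hlr_status: str             # result of an HLR lookup: "working", "nonworking", "unknown"
    has_activity_flag: bool     # activity observed on social media / messaging apps
    silent_sms_delivered: bool  # delivery report from a silent SMS ping

def flagged_working(n: SampledNumber, method: str) -> bool:
    """Return True if the given screening method flags the number as working."""
    if method == "hlr":
        return n.hlr_status == "working"
    if method == "activity":
        return n.has_activity_flag
    if method == "silent_sms":
        return n.silent_sms_delivered
    raise ValueError(f"unknown screening method: {method}")

sample = [
    SampledNumber("+49151000001", "working", True, True),
    SampledNumber("+49151000002", "nonworking", False, False),
    SampledNumber("+49151000003", "unknown", True, False),
    SampledNumber("+49151000004", "working", False, True),
]

for method in ("hlr", "activity", "silent_sms"):
    kept = [n for n in sample if flagged_working(n, method)]
    print(f"{method}: {len(kept)}/{len(sample)} numbers retained ({len(kept) / len(sample):.0%})")
```

In the same spirit, demographic distributions and attitudinal estimates could be recomputed on each screened subset to gauge possible noncoverage.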
Piggybacking Strategies in Survey Recruitment: The Role of Salience and Incentives in Cross-Sectional vs. Longitudinal Designs
Dr Jessica Daikeler (GESIS - Leibniz Institute for the Social Sciences) - Presenting Author
Dr Joachim Piepenburg (GESIS - Leibniz Institute for the Social Sciences)
Dr Barbara Binder (GESIS - Leibniz Institute for the Social Sciences)
Dr Henning Silber (University of Michigan)
Recruiting survey participants has become increasingly difficult due to declining response rates. Piggybacking, which recruits participants for additional studies through existing surveys, offers a promising strategy. However, research on its effectiveness remains limited. This study examines piggybacking's potential using the ALLBUS 2023 (a cross-sectional survey) and the GESIS Panel.pop (a longitudinal survey) to recruit participants for the GESIS Panel.dbd under varying incentives and salience conditions.
Two research questions guided the study: (1) How do salience and financial incentives influence willingness to participate in piggybacking recruitment? and (2) Does piggybacking effectiveness differ between longitudinal and cross-sectional surveys? A 2x2 experimental design varied incentive levels (€5 or €10) and salience (highlighting the incentive or not). The ALLBUS sample, limited to self-administered modes, included 3,297 respondents (1,710 PAPI and 1,587 CAWI). Recruitment outcomes—agreement to future contact—were analyzed across experimental conditions. The same experiment was conducted in the full GESIS Panel.pop sample.
Preliminary ALLBUS results show marginal effects of salience and incentive levels on participation. Agreement was highest (70.2%) among CAWI respondents offered €10 with salience, but overall differences between conditions were small. Among respondents offered €5, higher salience reduced agreement by 5 percentage points (PP), while higher salience increased agreement by 5 PP for those offered €10. These findings suggest minor effects of incentives and salience, with other factors, such as initial survey characteristics, likely playing a larger role.
Results from the GESIS Panel.pop recruitment (expected in early 2025) will provide comparative insights into longitudinal survey respondents. This study underscores the potential of piggybacking and highlights the need for further research to uncover the mechanisms driving its effectiveness, advancing recruitment methodologies and informing future survey design.
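As a minimal illustration of how a 2x2 incentive-by-salience comparison of agreement rates can be summarised, the sketch below uses invented cell counts; it is not the study's analysis code and the numbers do not reproduce the reported results.

```python
# Illustrative 2x2 incentive x salience comparison; cell counts are invented.
cells = {
    # (incentive, salience): (n_agreed, n_total)
    ("EUR 5",  "no salience"):   (610, 950),
    ("EUR 5",  "high salience"): (565, 940),
    ("EUR 10", "no salience"):   (620, 960),
    ("EUR 10", "high salience"): (668, 955),
}

rates = {key: agreed / total for key, (agreed, total) in cells.items()}
for (incentive, salience), rate in rates.items():
    print(f"{incentive}, {salience}: {rate:.1%} agreed")

# Salience effect within each incentive level, in percentage points (PP)
for incentive in ("EUR 5", "EUR 10"):
    effect_pp = 100 * (rates[(incentive, "high salience")] - rates[(incentive, "no salience")])
    print(f"Salience effect at {incentive}: {effect_pp:+.1f} PP")
```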
From Clicks to Quality: Assessing Advertisement Design’s Impact on Social Media Survey Response Quality
Ms Jessica Donzowa (Max Planck Institute for Demographic Research / Bielefeld University) - Presenting Author
Professor Simon Kühne (Bielefeld University)
Ms Zaza Zindel (Bielefeld University)
Researchers are increasingly using social media platforms for survey recruitment. However, empirical evidence remains sparse on how the content and design characteristics of recruitment advertisements affect response quality in surveys. Building on leverage-salience and self-determination theory, we assess the effects of advertisement design on response quality. We argue that different advertisement designs may resonate with specific social groups who vary in their commitment to the survey, resulting in differences in observed response quality.
We use data from a survey experiment conducted via ads placed on Facebook in Germany and the United States in June 2023. A commercial access panel company was contracted to field identical survey questions, allowing comparison with the Facebook-recruited survey data. The survey, focusing on attitudes toward climate change and immigration, featured images with varying thematic associations with the topics (strong, loose, neutral). The Facebook sample consisted of 4,170 respondents in Germany and 5,469 respondents in the United States. We compare several data quality indicators across advertisement features, including break-off rate, completion time, non-differentiation, item non-response, passing an attention check question, and follow-up availability.
Regression analyses indicate differences in response quality across advertisement designs, with strongly themed designs generally associated with poorer response quality: higher attrition, non-differentiation, and item non-response, and a lower probability of passing an attention check and providing an e-mail address for future survey inquiries. Our study advances the literature by highlighting the substantial impact of advertisement design on survey data quality and emphasizing the importance of tailored decision-making in recruitment design for social media-based survey research.
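To make two of the listed quality indicators concrete, the sketch below shows one way item non-response and non-differentiation (straightlining) could be computed from a respondent's answers to an attitude grid; it is not the authors' code, and the grid layout and scoring choices are assumptions for illustration.

```python
# Illustrative computation of two response-quality indicators for one respondent.
# The grid layout and scoring rules are assumptions for demonstration only.
from statistics import pstdev

def item_nonresponse_rate(answers):
    """Share of grid items left unanswered (None)."""
    return sum(a is None for a in answers) / len(answers)

def non_differentiation(answers):
    """Standard deviation of the answered items; 0 indicates perfect straightlining."""
    answered = [a for a in answers if a is not None]
    return pstdev(answered) if len(answered) > 1 else 0.0

grid_answers = [4, 4, None, 4, 4, 5, 4]  # e.g. a 7-item grid on a 5-point scale
print("Item non-response rate:", round(item_nonresponse_rate(grid_answers), 2))
print("Non-differentiation (SD of answers):", round(non_differentiation(grid_answers), 2))
```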