ESRA 2025 Preliminary Program
All time references are in CEST
Adapting survey mode in a changing survey landscape: Experiences from repeat cross-national, cross-sectional, and general social surveys 3
Session Organisers:
Dr Gijs van Houten (Eurofound)
Dr René Bautista (NORC at the University of Chicago)
Professor Rory Fitzgerald (ESS HQ; City St George's, University of London)
Mr Tim Hanson (ESS HQ; City St George's, University of London)
Mr Nathan Reece (ESS HQ; City St George's, University of London)
Ms Daphne Ahrendt (Eurofound)
Ms Jodie Smylie (NORC at the University of Chicago)
Time: Wednesday 16 July, 09:00 - 10:30
Room: Ruppert Blauw - 0.53
Studies to measure attitudes, opinions, and behaviors are critical to understanding societies around the world. In the face of social developments, changing trends in respondent recruitment methods, budget constraints, national infrastructure disruptions, and public health concerns, many repeat cross-sectional social surveys are experimenting with self-completion and mixed-mode approaches. The European Social Survey (launched 2001), the United States’ General Social Survey (launched 1972), and the European Quality of Life Surveys (launched 2003) are examples of longstanding studies collecting data to inform research on changes over time that are now exploring and transitioning to new modes. This session brings together cross-sectional social surveys to share experiences in survey mode transition.
The session's aims are to: (1) share results and lessons from recent mode experiments and mixed-mode applications by general social studies, and identify potential ways to improve methods; (2) highlight how different cross-sectional studies have recently modified survey protocols to adapt to changing public conditions; (3) provide space for data creators, data users, and survey practitioners to discuss methodological and statistical challenges for cross-sectional studies considering such moves; (4) discuss the integrity and comparability of data collected using new data collection methods with the existing time series; and (5) explore applications of emergent technologies to new modes.
We invite submissions from those involved in transitioning repeat, cross-sectional, and cross-national social surveys to new data collection approaches. Topics of interest include: results from pilots or feasibility studies based on self-completion or mixed-mode approaches; findings from experimental research testing aspects of self-completion/mixed-mode designs (e.g., incentive and mailing strategies, survey length adaptations, sequential vs. concurrent designs); impacts of mode switches on measurement and survey time series; and discussions of experiences and challenges with adapting cross-sectional surveys to new modes across different cultural/national contexts.
Keywords: general surveys, survey methodology, data collection, data collection modes, mixed mode, self-administration
Papers
Are People Likely to Accept PI's Words at "Face" Value?: A Test to Boost Respondents’ Willingness to Participate
Dr Takayuki Sasaki (Tsuda University) - Presenting Author
It has become difficult to maintain sample representativeness with traditional survey methods, such as direct visits to respondents' homes. As a result, many existing repeated surveys have shifted from the gold standard to mixed-mode approaches. In the midst of this mode transition, important questions remain unanswered. Are there ways to increase respondents’ willingness to participate without face-to-face interactions? The aim of the present study is to investigate the effects of invitation letters on response rates using a randomized controlled trial.
Traditionally, Japanese researchers mail an invitation letter to a sample randomly drawn from the basic resident registration. Approximately one week later, interviewers ask individuals to participate in the survey by explaining the research purposes during a home visit. With new survey modes, however, researchers have little opportunity to explain the importance of each individual's participation in the survey. Thus, I argue that principal investigators should speak directly to participants to increase their willingness to participate (WTP). One way to do this is to include a QR code in the invitation letter linking to a video clip.
Data from the National Survey on Family-friendly Society (NSFS), conducted by Tsuda University in February 2025, will be used for this study. A national sample of 3,200 individuals will be randomly split into two halves. One group receives an invitation letter with a link to the PI’s video clip, and the other group receives an invitation letter without it. Respondents are asked to answer the questionnaire using their own smartphones. The sampling method and many questions parallel those of the Japanese General Social Survey, so response rates and response patterns can be compared. Findings from this study should highlight the pros and cons of the new survey design.
Experimental approaches in transitioning from face-to-face to push-to-web: Learnings from the Childcare and early years survey of parents
Dr Tom Huskinson (Ipsos) - Presenting Author
The Childcare and early years survey of parents is a high-quality random probability face-to-face survey of around 6,000 parents per year in England. Commissioned by the Department for Education, it started in 2004 and is published as an Official Statistic. In line with increasing pressure for face-to-face surveys to transition to predominantly online data collection, this research investigated the extent to which survey estimates could be collected using a push-to-web methodology, and the implications for maintaining trend data and for value for money.
The face-to-face questionnaire was adapted to online administration following ‘Mobile First’ design principles.
Two features of the push-to-web survey were experimentally manipulated to explore the optimal design: incentivisation (none, £5, £10 or £15), and deadline for completion (stated vs not stated). In addition, split-ballot experiments were embedded in the questionnaire to inform aspects of questionnaire design, including collecting continuous data via open numeric questions versus banded pre-codes, displaying versus hiding “Don’t know” codes, and varying the position of certain response options. Measures of respondent experience were collected at the end of the questionnaire. The face-to-face survey was fielded as usual, providing a 'parallel run' against which survey estimates could be compared.
Incentivisation raised the response rate from 12% (none) to 27% (£15), increased representativeness, and delivered value for money. Making the survey deadline explicit slightly reduced response. Open numeric data were of high quality, but deciding whether to display “Don’t know” codes online remains a challenge. Changes to code positions had major implications for response distributions. A comparison of weighted key survey estimates between the two modes found significant differences for most, with the extent of these differences varying by whether the questions measured awareness, perceptions, preferences, or behaviours.
Still the gold standard in survey research? Comparing face-to-face and self-completion data collection in a repeat cross-sectional general social survey
Professor Rory Fitzgerald (European Social Survey ERIC) - Presenting Author
Mr Tim Hanson (European Social Survey ERIC)
Professor Olga Maslovskaya (Southampton University)
Professor Peter Lynn (University of Essex)
Dr Ruxandra Comanaru (European Social Survey ERIC)
Dr Cristian Domarchi (Southampton University)
Dr Nhlanhla Ndebele (City St George's, University of London)
Surveys aim to provide estimates of the behaviour, social conditions, or attitudes of the population they seek to represent. To do this well, total survey error needs to be kept as small as possible; otherwise conclusions might reflect methodological artefacts of data collection rather than the true population score. Since modern surveys of the general population were first established, face-to-face interviewing among probability samples of households or individuals has been regarded as the best way to collect high-quality data. More recently, however, face-to-face data collection has seen declining response rates, increasing interviewer effects, rising costs, and a reduction in the number of commercial providers of this service, casting doubt on whether it remains the ‘gold standard’. At the same time, self-completion surveys offer an increasingly convincing alternative, with growing web penetration and digital literacy, no interviewer effects, relative cost efficiency, and promising response rates and representativeness.
This paper compares face-to-face data collection in the 10th round of the European Social Survey in Great Britain with an experimental self-completion survey. The paper finds that the self-completion approach achieved a considerably higher response rate than the face-to-face survey, similar representativeness, and a substantially lower cost per interview, whilst being completed far more quickly. In terms of data comparability between the modes, the authors find that, whilst there are differences in point estimates between the data collected face-to-face and via self-completion, the correlations between variables are similar regardless of data collection approach. The paper concludes that self-completion data collection, combining both web and paper approaches, offers a high-quality alternative to face-to-face data collection with the potential to become a new gold standard.
Who sent the questionnaire? Assessing the impact of adding a second logo of a more well-known government agency as a visible survey sponsor on the envelopes of mailed surveys
Mr Marcus Weissenbilder (The SOM-institute, University of Gothenburg) - Presenting Author
Ms Cornelia Andersson (The SOM-institute, University of Gothenburg)
Dr Sebastian Lundmark (The SOM-institute, University of Gothenburg)
Ms Elisabeth Falk (Nordicom, University of Gothenburg)
Declining response rates in surveys are a well-documented issue. One potential explanation is survey fatigue, as the number of market research surveys administered to each adult is thought to have increased immensely (Groves, 2006; Kreuter, 2013; Peytchev, 2013; Leeper, 2019). To stand out amid this flood of market surveys, government agencies, universities, and research institutes might be able to increase the likelihood of respondents opening their envelopes by printing the logos of trusted and well-known government agencies sponsoring the survey.
Since 1986, the SOM Institute at the University of Gothenburg has conducted annual mailed paper-and-pencil surveys using probability samples, each carrying the SOM Institute logo. This paper presents one preregistered pilot and two preregistered replications of an experiment that varied whether an additional survey sponsor's logo was printed on the envelope. Printing the additional logo of a better-known and more trusted government agency or university may increase the likelihood of respondents opening and completing the questionnaire.
In 2023, 9,000 respondents in Gothenburg were mailed a survey; half of them received an envelope showing only the SOM Institute logo, whereas the other half received an envelope showing both the Institute's logo and the Gothenburg municipality logo. In 2024, the experiment was directly replicated in Gothenburg and extended to assess the impact of another government logo (Region Västra Götaland).
The pilot study showed that adding the logo of a better-known government sponsor increased response rates by 2.8% and reduced the number of reminders that had to be sent, albeit without decreasing nonresponse bias. Data collection for the two replications will be completed by the end of 2024, so preliminary results of those studies are not yet available.
The effect of incentive strategies on yield, sample composition and data quality: findings from the 2024 European Working Conditions Survey
Mrs Tanja Kimova (Verian) - Presenting Author
Mr Gijs van Houten (Eurofound)
Mr Christopher White (Eurofound)
Miss Hajar Gad (Verian)
Mr Jamie Burnett (Verian)
The European Foundation for the Improvement of Living and Working Conditions (Eurofound) commissioned Verian to conduct the eighth edition of the European Working Conditions Survey (EWCS) in spring 2024 in 37 countries. As part of Eurofound’s strategy for future-proofing its surveys, the EWCS 2024 was conducted both face-to-face and online in all EU Member States (using a telephone push-to-web or a postal push-to-web approach), and the implementation of the online component included a range of test elements.
A key test element is the approach to incentives. As part of the pilot test, different approaches to incentives were trialled: a combination of a small unconditional and a larger conditional incentive, an “early bird” approach to conditional incentives in which the value of the incentive was reduced after a certain period, and different values for the conditional incentive. The unconditional incentive and the early-bird approach were found not to increase the yield sufficiently to warrant the additional cost and complexity, whereas offering a higher conditional incentive did improve cost efficiency. Therefore, in mainstage fieldwork in most countries, only the higher-value conditional incentive was offered. In seven countries, respondents in the online segment of the mainstage survey were randomly allocated to this higher value or to an even higher incentive value, allowing further calibration of the most effective incentive level in terms of yield, response profile, and data quality.
In this paper we discuss the results of the pilot test, focusing mainly on the cost efficiency of the different incentive strategies, and the results from mainstage fieldwork, assessing the effect of the different values of the conditional incentive on yield, as well as on sample composition and data quality.