All time references are in CEST
Improving the representativeness of longitudinal surveys

Session Organisers | Dr Nicole Watson (University of Melbourne), Dr Pablo Cabrera Alvarez (University of Essex)
Time | Tuesday 18 July, 09:00 - 10:30
Room |
Managers of household panel studies and cohort studies face increasing challenges in maintaining or improving the representativeness of their studies. These challenges include declining response rates, changing preferences in how respondents want to be contacted or interviewed, a shortage of interviewers, maintaining and improving the engagement and commitment of respondents under less engaging modes of data collection, and determining how best to tailor or mix modes and methods for the greatest benefit.
We invite submissions to this session that focus on the representativeness of longitudinal surveys (rather than just overall response rates). We are more interested in exploring and understanding the heterogeneous effects of various modes and methods on subgroups (rather than the main effects) and how these can best be used to improve overall representativeness. Submissions are encouraged on the following topics:
- Experiments in improving response or representativeness
- Refreshment samples of new immigrants or other groups not represented or under-represented
- Adaptive survey design approaches to improve representativeness
- Optimally mixed modes and methods to improve participation amongst hard-to-reach or hard-to-interview sample members
- New methods or approaches to improve response probabilities among specific groups
Keywords: representativity, refresh, re-engage, responsive design
Dr Pablo Cabrera-Álvarez (University of Essex) - Presenting Author
Professor Peter Lynn (University of Essex)
Longitudinal studies that extend over time may need to add a refreshment sample to mitigate the effects of declining response rates and panel attrition and to capture changes in the target population (e.g., immigrants). In the past decade, and especially since COVID-19, push-to-web strategies, which use mail to invite responses to a web questionnaire, have been increasingly used to survey the general population in the UK. This methodology has also been used to recruit refreshment samples in some longitudinal studies. However, when only address frames are available, it is particularly challenging to achieve good response rates using push-to-web. This makes response maximisation strategies, such as the use of incentives or enhanced contact strategies, valuable tools in this type of design.
This presentation will show the results of two experiments embedded in the recruitment of a general population refreshment sample for the Understanding Society Innovation Panel, a sample of the Great Britain household population devoted to testing methodological innovations. The refreshment sample consisted of more than 6,000 addresses and was recruited using a push-to-web design. The first experiment tested the use of an early bird incentive versus a standard conditional incentive. The second experiment explored the effect of augmenting the contact strategy of an invitation letter and two reminder letters with a pre-notification letter, a third reminder letter, or both. In the analysis we examine the effects of these design features, and the interactions between them, on response rates, web completion, costs and sample composition. We interpret the findings in light of social exchange theory.
Dr Nicole Watson (University of Melbourne) - Presenting Author
High response rates in longitudinal studies are important to ensure that the sample size remains as large as possible, the representativeness of the sample is upheld, and the number of respondents for whom there is a rich history of data is maximised. This is especially true for young people in long-running household panel surveys, as they are the future of such studies.
This paper examines the response patterns of young people over the first 23 years of the Household, Income and Labour Dynamics in Australia (HILDA) Survey. Participation histories are examined over the first six waves after young people become eligible for interview at age 15 (i.e., from age 15 to 20). Sample members are pooled into two groups, both to ensure sufficient sample for analysis and to test whether response patterns differ between those turning 15 early in the panel (waves 1 to 8) and those turning 15 late in the panel (waves 11 to 18). A latent class analysis model is used to identify the response patterns. How distinct are the different response classes? Do people who are missed for interview when they first turn 15 mostly participate in later waves, or does this set a pattern of non-response that persists?
The link between the response rates of young people and fieldwork changes (such as changes in incentives, a change of fieldwork provider, and changes due to the COVID-19 pandemic) is also examined. Finally, several strategies used to engage and motivate young people to participate in the study are discussed.
Ms Nursel Alkoç (University of Lausanne) - Presenting Author
Professor Anke Daniela Tresch (FORS/University of Lausanne)
Dr Line Rennwald (FORS/University of Lausanne)
Mr Lukas Lauener (FORS/University of Lausanne)
Professor Georg Lutz (FORS/University of Lausanne)
Political surveys are subject to response bias in measures of voter turnout and party choice: they overestimate voter turnout by a significant margin and, in some cases, underestimate levels of support for far-right parties. This paper presents findings from an experiment in which the salience of the topic of a political survey was manipulated at the recruitment phase to bring estimates of voter turnout and party choice closer to actual rates. The treatment group was informed that the survey was about the upcoming federal elections, while the control group was informed that it was about current societal and political issues. The results from the first two waves of the panel survey of the Swiss Election Study (Selects) show that this manipulation (1) worked well in the first wave but led control group respondents to abandon the survey in the second wave, (2) helped to recruit more non-voters, albeit not significantly, and (3) significantly mitigated the underrepresentation of the far-right party. We conclude that framing a political survey in a more generic and less political way may provide some improvements in representativeness but is not a panacea for minimizing turnout bias due to self-selection.