All time references are in CEST
Improving the representativeness, response rates and data quality of longitudinal surveys 2
Session Organisers | Dr Jason Fields (US Census Bureau), Dr Nicole Watson (University of Melbourne)
Time | Wednesday 19 July, 16:00 - 17:30
Room | U6-01b |
Longitudinal survey managers are increasingly finding it difficult to achieve their statistical objectives within available resources, especially with the changes to the survey landscape brought by the COVID-19 pandemic. Tasked with interviewing in different populations, measuring diverse substantive issues, and using mixed or multiple modes, survey managers look for ways to improve survey outcomes. We encourage submission of papers on the following topics related to improving the representativeness, response rates and data quality of all, but especially longitudinal, surveys:
1. Adaptive survey designs that leverage the strength of administrative records, big data, census data, or paradata. For instance, what cost-quality tradeoff paradigm can be operationalized to guide the development of cost and quality metrics and their use throughout the survey life cycle? Under what conditions can administrative records or big data be adaptively used to supplement survey data collection and improve data quality?
2. Adaptive survey designs that address the triple drivers of cost, respondent burden, and data quality. For instance, what indicators of data quality can be integrated to monitor the course of the data collection process? What stopping rules of data collection can be used across a multi-mode survey life cycle?
3. Papers involving longitudinal survey designs focused on improving the quality of measures of change over time. How can survey managers best engage with the complexities of such designs? How are overrepresented or low-priority cases handled in a longitudinal context?
4. Survey designs involving targeted procedures for sub-groups of the sample aimed at improving representativeness, such as sending targeted letters, prioritising contact of hard-to-get cases, timing calls to the most productive windows for certain types of cases, or assigning the hardest cases to the most experienced interviewers.
5. Papers involving experimental designs or simulations aimed at improving the representativeness, response rates and data quality of longitudinal surveys.
Dr Anne Elevelt (Statistics Netherlands) - Presenting Author
Dr Annemieke Luiten (Statistics Netherlands)
Ms Michelle Creemers (Statistics Netherlands)
Dr Maaike Kompier (Statistics Netherlands)
Our society is becoming increasingly multilingual. As a result, a growing proportion of the nonresponse to Dutch-language questionnaires is caused by language problems. However, according to the interviewers, many of these people are willing to participate: had they been able to respond in a language other than Dutch, they would have done so. This affects not only the response rate and representativeness, but also the (inclusive) image of Statistics Netherlands, whose ambition is to map all of the Netherlands and thus to be inclusive in all modes of observation. Implementing multilingual questionnaires, however, requires considerable extra programming and translation work and costs. The question is whether multilingual questionnaires yield (enough) additional respondents, and especially respondents who would not participate in a Dutch-only questionnaire, thereby improving representativeness.
We conducted three pilot studies, all fielded in general population samples, to investigate the impact of multilingual questionnaires on response and response bias. In a preliminary study, sample members who could not complete a questionnaire in Dutch were asked in which language they could have participated. Most indicated English, followed by Polish, Arabic and Turkish. In the three pilot studies we investigated:
1) A Dutch–English questionnaire, both online and via an interviewer (CAPI/CASI).
2) A Dutch–English online questionnaire.
3) An online questionnaire in Dutch, English, Polish, Arabic and Turkish.
In this presentation we will show how these three multilingual questionnaires affected response rates and representativeness. Who are the additional respondents? In addition, we held focus groups with interviewers from study 1: how did people react at the door? We will use all of these results to make recommendations for future research.
Ms Marieke Volkert (IAB) - Presenting Author
Ms Corinna König (IAB)
The IAB Establishment Panel, conducted by the Institute for Employment Research (IAB), is an annual employer survey in Germany that has collected longitudinal data since 1993, providing deep insights into macroeconomic developments. Until 2018, the primary mode of data collection was face-to-face, with self-administration available upon request, both conducted with paper-and-pencil questionnaires.
From 2018 on, a team of survey experts has been working on transitioning the panel to allow for both self-administered web interviewing and computer-assisted personal interviewing (CAPI). Due to the COVID-19 pandemic, computerized data collection was pushed forward in 2020: the majority of respondents were offered a web-first option, while face-to-face interviewing was replaced by telephone interviewing. As a consequence of switching to mainly computer-assisted data collection, since 2020 we can draw on a large pool of paradata that was not previously collected, including time stamps for every click on hyperlinks as well as change-logs of every single answer.
Using these paradata, we are now able to observe the digital response process of self-administered and interviewer-assisted respondents, who differ in their behavior. This leads to different outcomes, namely giving or changing an answer, taking a pause, breaking off, or delegating parts of the questionnaire to a colleague. We are interested in how respondents make use of the different functionalities of our software. By comparing durations and time stamps across questions for different establishment types, we can draw inferences about the burden that single questions place on our respondents.
In this paper, we gather knowledge and insights about the survey response process that will serve as a basis for further modernization efforts in the IAB Establishment Panel, with the aim of decreasing respondent burden and enhancing data quality.
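As a rough sketch of this kind of duration analysis, the following R code derives per-question response times from click time stamps. The toy event log and its column names (case_id, question, timestamp) are invented for illustration and do not reflect the IAB Establishment Panel's actual paradata format.

library(dplyr)

# Toy event log: one row per logged click (all names and values invented)
paradata <- tibble::tribble(
  ~case_id, ~question, ~timestamp,
  1, "q1", as.POSIXct("2020-06-01 10:00:00"),
  1, "q2", as.POSIXct("2020-06-01 10:00:45"),
  1, "q3", as.POSIXct("2020-06-01 10:02:10"),
  2, "q1", as.POSIXct("2020-06-01 11:00:00"),
  2, "q2", as.POSIXct("2020-06-01 11:01:30")
)

# Time spent on a question = gap until the respondent's next logged event
question_burden <- paradata %>%
  arrange(case_id, timestamp) %>%
  group_by(case_id) %>%
  mutate(secs = as.numeric(lead(timestamp) - timestamp, units = "secs")) %>%
  ungroup() %>%
  group_by(question) %>%
  summarise(median_secs = median(secs, na.rm = TRUE), .groups = "drop")

question_burden

In practice one would also need to handle pauses, break-offs and answer changes before interpreting such gaps as item-level burden; this sketch only shows the basic timestamp-differencing step.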
Miss Intifar Chowdhury (Australian National University) - Presenting Author
Professor Ben Edwards (Australian National University)
Dr Nikki Honey (Social Research Centre)
Mr Lachlan Hugo (Social Research Centre)
Dr Kylie Hillman (Australian Council for Educational Research)
Dr Daniel Edwards (Australian Council for Educational Research)
Professor Matthew Gray (Australian National University)
Professor Andrew Norton (Australian National University)
An increasing trend towards lower retention in longitudinal surveys of young people reduces representativeness, increases costs and presents ongoing methodological issues in subsequent waves. Beyond a handful of small-n qualitative studies, most questionnaire surveys in Australia do not ask participants (both those who have agreed to be recontacted and those who have indicated that they will not participate in future waves) direct questions about what would motivate them to reengage in subsequent survey waves. Using 2022 Wave 1 data from the Australian Post-School Destination (GENERATION) survey and an R package for the quantitative analysis of textual data (quanteda), we analyse the open-ended responses of over 5,000 students to the following question: “What is one thing we could do to make it more likely that other people like you would complete the survey in the future?” Further, we provide a snapshot of the core themes that recur in post-survey deliberations in online youth advisory groups. Employing insights gathered from this mixed-methods approach, we also test whether responses vary by equity group membership, such as youth from low-SES backgrounds, who identify as Indigenous or as non-binary, or who report a disability. We discuss the implications of our findings for future cohort studies and the extent to which youth ideas align with best survey practice.
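As a hedged illustration of the kind of open-ended text analysis described above, the following R sketch tokenises a few invented answers with quanteda and tabulates the most frequent terms. The example responses are made up, and the actual GENERATION analysis is presumably more elaborate.

library(quanteda)

# Invented example answers to the open-ended question (not GENERATION data)
answers <- c(
  "make the survey shorter",
  "offer a gift card or voucher",
  "shorter survey and send text message reminders"
)

corp <- corpus(answers)                       # build a quanteda corpus
toks <- tokens(corp, remove_punct = TRUE)     # tokenise, dropping punctuation
toks <- tokens_remove(toks, stopwords("en"))  # drop English stopwords
dfmat <- dfm(toks)                            # document-feature matrix

topfeatures(dfmat, 10)  # most frequent terms across all responses

Frequency counts like these are only a first step; grouping the document-feature matrix by equity group membership would then allow the kind of subgroup comparison the abstract describes.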