All time references are in CEST
Reducing and measuring nonresponse bias in times of crisis: Challenges, opportunities, and new directions 2

Session Organiser: Mr Tom Krenzke (Westat)
Time: Wednesday 19 July, 11:00 - 12:30
Room: U6-11
With response rates for sample surveys continuing to decline, much focus has been placed on reducing and measuring nonresponse bias. The COVID-19 pandemic, geopolitical conflicts, and humanitarian crises have made it difficult to reverse the decline in response rates. Methods exist for reducing nonresponse bias both during and after data collection, including incentives for respondents and interviewers, hiring tips, training, outreach materials, and nonresponse strategies such as reassigning cases, tracking progress, and communication. Methods also exist for measuring nonresponse bias in order to gauge the quality of the collected data, including comparing demographic distributions before and after nonresponse adjustments, comparing respondent distributions to frame distributions, computing correlations between weighting variables and outcome variables, and conducting level-of-effort analyses. The presentations will describe emerging approaches for improving response rates and for reducing and measuring nonresponse bias. The session covers a variety of surveys (e.g., the Programme for the International Assessment of Adult Competencies) from different countries and sectors (government, university).
Keywords: Total survey error, analysis, adaptive survey design
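A purely illustrative sketch of two of the measurement diagnostics listed in the session description (comparing respondent distributions to frame distributions, and correlating a weighting variable with an outcome). It assumes a frame-level pandas DataFrame; the column names (respondent, final_weight, outcome) and the grouping variable are invented for illustration, not taken from any of the surveys in this session.

```python
import pandas as pd

def frame_vs_respondent_distribution(frame: pd.DataFrame, group: str) -> pd.DataFrame:
    """Compare a demographic distribution on the full frame with the
    respondent subset; large differences hint at nonresponse bias."""
    full = frame[group].value_counts(normalize=True).rename("frame_share")
    resp = (frame.loc[frame["respondent"] == 1, group]
                 .value_counts(normalize=True).rename("respondent_share"))
    out = pd.concat([full, resp], axis=1)
    out["difference"] = out["respondent_share"] - out["frame_share"]
    return out

def weight_outcome_correlation(respondents: pd.DataFrame) -> float:
    """Correlation between the final weight and a survey outcome; values
    far from zero suggest weighting adjustments move the estimate."""
    return respondents["final_weight"].corr(respondents["outcome"])
```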
Mr Hafsteinn Einarsson (University of Iceland) - Presenting Author
Dr Alexandru Cernat (University of Manchester)
Professor Natalie Shlomo (University of Manchester)
Cross-national surveys run the risk of differential survey errors, where the quality of the data collected varies from country to country. Adaptive survey designs have been proposed as a way to reduce survey errors by leveraging auxiliary variables to inform fieldwork efforts, but they have rarely been considered in the context of cross-national surveys. Here, demographic variables, paradata (interviewer observations), and contact data from the European Social Survey are used to inform adaptive survey design protocols in which fieldwork efforts are ended early for selected units in the final stage of data collection. Eight combinations of response propensity models and selection mechanisms are evaluated in terms of sample composition (as measured by the coefficient of variation of response propensities), response rates, number of contact attempts saved, and effects on estimates of target variables in the survey. We find that sample balance can be improved in most country-round combinations. Response rates can be increased marginally, and targeting high-propensity respondents could lead to significant cost savings from making fewer contact attempts. Estimates of target variables are not changed by the case prioritisations used in the simulations, indicating that they do not reduce nonresponse bias. We conclude that adaptive survey designs should be considered in cross-national surveys, but that more work is needed to identify suitable covariates to inform fieldwork efforts.
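A minimal, hypothetical sketch of the sample-balance indicator mentioned above, the coefficient of variation (CV) of estimated response propensities. The column names ("responded" and the covariate list) are assumptions, and the actual propensity models evaluated in the paper may differ.

```python
import pandas as pd
from sklearn.linear_model import LogisticRegression

def propensity_cv(df: pd.DataFrame, covariates: list[str]) -> float:
    """Fit a logistic response propensity model on auxiliary variables and
    return the coefficient of variation of the estimated propensities."""
    model = LogisticRegression(max_iter=1000).fit(df[covariates], df["responded"])
    p = model.predict_proba(df[covariates])[:, 1]  # estimated propensities
    return p.std() / p.mean()  # lower CV = more balanced sample composition
```

A selection mechanism in this spirit would end fieldwork early for the cases least likely to respond, concentrating the remaining contact attempts on high-propensity respondents, as in the cost-saving scenario described above.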
Professor Gabriele Durrant (University of Southampton) - Presenting Author
Dr Olga Maslovskaya (University of Southampton)
Dr Rose Lindsey (University of Southampton)
The Covid-19 pandemic had a significant impact on survey data collection methods, forcing survey agencies to stop collecting information via face-to-face (f2f) interviewing, which for some studies led to a rapid move to online data collection. The pandemic was, in particular, a catalyst for the exploration, development, and adoption of (relatively) new, innovative approaches to survey data collection. Examples include video-assisted personal interviewing; the use of electronic questionnaire devices; other Covid-secure contact approaches such as ‘knock-to-nudge’, where an interviewer reminds respondents about the survey and/or collects their phone numbers but does not conduct an interview; and greater use of web and mixed-mode designs and respondent-centred designs. Often these approaches were applied heuristically, without much theoretical grounding or even pilot testing, yet they are hoped to offer breakthroughs or improvements to survey designs. Their consequences are not well understood at present, and there are large gaps in the literature on the data quality of the new data collection approaches that have been trialled and/or tested since the onset of the pandemic.
Given these significant changes, it is important to better understand the current survey data collection landscape, specifically recent innovations in surveys and their effects on data quality, selection, and measurement. The presentation will focus on results from a review of innovations employed in the UK during the pandemic. A series of qualitative interviews was conducted with a range of leading survey methodologists and UK large-scale survey investments. We will report on findings from the ESRC-funded Survey Data Collection Network, drawing particularly on evidence from UK social surveys. If available by then, we will also report on findings from a new UK survey data collection collaboration project.
Professor Ismet Koc (Hacettepe University Institute of Population Studies) - Presenting Author
Dr Melike Sarac (Hacettepe University Institute of Population Studies)
As in most countries across the world, Turkey is no exception to declining response rates over time. The decline is more visible in clusters with high socio-economic status in urban settlements. The reduced response in high socio-economic groups, combined with persistently high response in low socio-economic groups, may affect the reliability of estimates. For instance, the infant mortality rate, which had been declining since 1993 and was estimated at 13 per thousand in 2013, increased to 17 per thousand in the 2018 Turkey Demographic and Health Survey (TDHS), while it declined to less than 10 per thousand according to the registration system. This result seems to be associated with an increasing share of interviewed households with low welfare. The primary objective of this study is to examine the effect of household welfare on response behavior. The data come from six nationwide repeated cross-sectional surveys conducted in the period 1993-2018. Descriptive findings showed that almost half of the interviews in 1993 were completed in clusters that achieved the median number of interviews, while this share declined to 37 percent in 2018. Multivariate analyses controlling only for household welfare found that a low welfare level increases the odds of completing a household interview by up to 10.3 times in 2018. The final model, which controls for region, type of settlement, number of visits, number of eligible women, household size, number of children under five, and mean years of education, estimated a remarkable effect, particularly for the last three surveys. These findings suggest several measures, such as taking steps to ease gaining cooperation with high-security housing units, organising special training sessions focusing on nonresponse, sending prenotification letters to selected units, conducting media activities designed to create awareness among households, and collecting paradata for nonrespondents.
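A hedged, minimal sketch of the kind of welfare-only logistic response model described above, using statsmodels; the variable names and simulated data are invented for illustration and do not reproduce the TDHS analysis.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(42)
# Simulated stand-in data; the real analysis uses TDHS household records.
df = pd.DataFrame({
    "interviewed": rng.binomial(1, 0.7, 500),
    "welfare": rng.choice(["low", "middle", "high"], 500),
})
# Logit of interview completion on welfare, with 'high' as the reference
# category, so exponentiated coefficients are odds ratios vs. high welfare.
fit = smf.logit("interviewed ~ C(welfare, Treatment('high'))", data=df).fit()
print(np.exp(fit.params))  # odds ratios for low and middle welfare
```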
Dr Jason Fields (U.S. Census Bureau) - Presenting Author
Mr David Hornick (U.S. Census Bureau)
The Census Household Pulse Survey (HPS), created as an experimental rapid response to the COVID-19 pandemic, leverages electronic (email and SMS text) messaging to contact household members and internet collection to quickly provide information on emerging and rapidly changing issues. While the design allows for the quick collection and dissemination of information and is based on probability-based sample selection from a robust sampling frame, there are concerns over response, coverage, and biases in the estimates (Peterson, Toribio, Farber, & Hornick, 2021). In some ways, there are similarities between opt-in non-probability samples and this type of low-response probability sample (Bradley et al., 2021; Nishimura, Wagner, & Elliott, 2016). While increasing nonresponse and related biases are not unique to surveys like the HPS (Brick & Williams, 2013; Kreuter, 2013), these concerns raise a critical need to evaluate and incorporate diagnostic measures for the assessment of, and correction for, biases in estimates from this type of collection. Addressing these quality concerns will be critical to advancing HPS-type data collection from the experimental realm to a more mainstream resource. This analysis combines administrative records at the individual and address level with survey responses and the sample frame to assess coverage and bias, and to propose adjustments to improve data quality. Based on the evaluation and the correlates available at the frame level, we consider bias-minimizing adjustments to sampling and weighting. Measures like the data defect correlation (ddc) framework (Meng, 2018) for single dimensions, or those of Little, West, Boonstra, and Hu (2020), may provide the necessary tools for evaluating, adjusting, and increasing confidence in measures generated from data sources like the HPS. This evaluation will have broad applicability as more mainstream household surveys continue to experience declining response and likely increasing selection and response biases.
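For context, the ddc framework referenced above rests on Meng's (2018) exact decomposition of the error of an unweighted respondent mean, shown here in its standard published form, where R is the response indicator, Y the outcome, and f = n/N the realized response fraction:

```latex
\[
\bar{Y}_n - \bar{Y}_N
  = \underbrace{\rho_{R,Y}}_{\text{data defect correlation}}
    \times \underbrace{\sqrt{\frac{1-f}{f}}}_{\text{data quantity}}
    \times \underbrace{\sigma_Y}_{\text{problem difficulty}}
\]
```

The first factor is the ddc itself: with a low response fraction f, even a small correlation between responding and the outcome can produce a large error, which is why diagnostics of this kind matter for low-response collections like the HPS.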