New Developments in the Use of Adaptive Survey Designs in Longitudinal Studies
|
Session Organisers |
Ms Nicole Watson (University of Melbourne)
Dr Alexandru Cernat (University of Manchester)
Time | Tuesday 16th July, 11:00 - 12:30 |
Room | D24 |
As the costs of survey data collection and non-response increase, survey methodologists are searching for innovative approaches to make data collection more efficient. One of the most promising approaches, and one that has received considerable attention recently, is adaptive survey design. Adaptive designs change data collection procedures during fieldwork in order to target particular groups that might be underrepresented or harder to interview. While this approach is increasingly popular in cross-sectional studies, its use in longitudinal studies has received less attention.
In this session we aim to bring together talks that investigate the use of adaptive survey designs in longitudinal studies. We especially encourage papers that discuss some of the special characteristics of longitudinal studies, such as:
- The use of prior wave information, current wave information and administrative data to target respondents (illustrated in the sketch below)
- Trade-offs between retaining and changing data collection procedures for the same units
- Impact of adaptive survey designs on attrition patterns
- Impact of adaptive survey designs on measurement error (e.g., due to changes in mode, interviewer, etc.)
- Long-run effects of the implementation of adaptive survey designs
- Metrics used to measure the comparative performance of adaptive survey design options
- Cost implications of using adaptive survey designs in longitudinal studies
Keywords: longitudinal data, adaptive survey design, non-response error, measurement error
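To fix ideas on the first bullet above, the following minimal sketch shows one common way prior-wave information can drive targeting: fit a response-propensity model on prior-wave indicators and flag the lowest-propensity cases for a more intensive follow-up protocol. All variable names, thresholds, and data here are invented for illustration and are not taken from any study in this session.

```python
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(42)

# Hypothetical prior-wave data: one row per continuing sample member.
n = 5_000
prior = pd.DataFrame({
    "calls_last_wave": rng.poisson(3, n),           # contact attempts needed last wave
    "moved_since_last_wave": rng.binomial(1, 0.1, n),
    "item_nonresponse_rate": rng.beta(1, 9, n),     # share of items skipped last wave
})

# Simulate current-wave response with a made-up dependence on the
# prior-wave indicators, just so there is something to fit.
logit = (1.5
         - 0.2 * prior["calls_last_wave"]
         - 1.0 * prior["moved_since_last_wave"]
         - 2.0 * prior["item_nonresponse_rate"])
responded = rng.binomial(1, 1 / (1 + np.exp(-logit)))

# Fit a response-propensity model on the prior-wave predictors.
model = LogisticRegression().fit(prior, responded)
propensity = model.predict_proba(prior)[:, 1]

# Adaptive step: flag the lowest-propensity cases for a targeted
# protocol (e.g. earlier issue to field, higher incentive, mode switch).
threshold = np.quantile(propensity, 0.2)   # bottom 20% -- an arbitrary cut-off
prior["targeted"] = propensity < threshold
print(prior["targeted"].mean())            # roughly 0.2 by construction
```

In production one would fit the model on a completed wave and score the upcoming one, and the arbitrary bottom-20% cut-off would give way to a rule based on costs or on representativeness metrics such as R-indicators; the sketch only illustrates the mechanics.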
Dr Kevin Tolliver (U.S. Census Bureau) - Presenting Author
Dr Jason Fields (U.S. Census Bureau)
One criticism of adaptive survey designs, and in particular of case prioritization in face-to-face surveys, is that they may lead to inefficient routing and time use for interviewers, thereby increasing costs. The Survey of Income and Program Participation (SIPP) uses adaptive design protocols in data collection to assist in achieving the survey’s data quality goals, including retaining more movers, increasing response among household members who cannot be linked to administrative data, and fighting attrition bias. Experiments conducted in Wave 3 and Wave 4 of the 2014 panel showed moderate success in achieving these goals. By separately testing the impact of case prioritization randomized across interviewers in Wave 3 and across cases in Wave 4, we were able to make critical decisions for the implementation of adaptive design during production interviewing for the initial wave of the 2018 SIPP. This research summarizes what was learned during the development and implementation of adaptive design procedures in the 2014 SIPP. We discuss the trade-offs between data quality and costs for the adaptive survey design and consider possible opportunities for improvement in future data collections.
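As a purely illustrative aside (none of this code comes from the SIPP program; the workload sizes and field names are invented), the design difference the abstract describes, randomizing prioritization across interviewers versus across cases, can be sketched as:

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(0)

# Invented workload: 40 interviewers with 50 cases each.
cases = pd.DataFrame({
    "case_id": np.arange(2_000),
    "interviewer_id": np.repeat(np.arange(40), 50),
})

# Wave 3-style design: randomize at the *interviewer* level, so an
# interviewer's whole workload is either prioritized or not.
treated_interviewers = rng.choice(40, size=20, replace=False)
cases["prioritized_w3"] = cases["interviewer_id"].isin(treated_interviewers)

# Wave 4-style design: randomize at the *case* level, so every
# interviewer works a mix of prioritized and non-prioritized cases.
cases["prioritized_w4"] = rng.random(len(cases)) < 0.5
```

One way to read this trade-off: interviewer-level randomization keeps each workload homogeneous, so effects on routing and time use can be measured cleanly but there are only as many randomization units as interviewers; case-level randomization yields many more units but mixes prioritized and non-prioritized cases within a single interviewer's routing.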
Dr Jason Fields (U.S. Census Bureau) - Presenting Author
Dr Kevin Tolliver (U.S. Census Bureau)
The Survey of Income and Program Participation (SIPP) is a longitudinal panel survey that uses adaptive design protocols in data collection to assist in managing survey progress and cost, and in achieving the survey’s data quality goals. The SIPP program has transitioned from experimental implementation of adaptive design during the 2014 SIPP panel to a more comprehensive production implementation starting in 2018. In 2018, the SIPP program initiated a new sample. As in many previous years, the program faced budgetary uncertainty related to continuing federal budget resolutions, in addition to continuing challenges in gaining cooperation. In 2018, the adaptive survey design had to account for a shift in how the sample was released for fieldwork: five monthly samples instead of one four-month sample. For 2019, to combat the more severe non-response seen in 2018, additional new sample is being released for collection, initiating an overlapping panel design. This change places further demands on the adaptive design procedures but also affords additional opportunities: during the 2019 data collection, interviewers have to prioritize cases at different stages of their panel experience, each with a different prioritization scheme. This research discusses how the previous adaptive survey design was modified to account for the change in survey design and gives preliminary results for the adaptive survey design.
Dr Annette Scherpenzeel (Chair for the Economics of Aging, Technical University of Munich) - Presenting Author
Dr Arne Bethmann (Munich Center for the Economics of Aging (MEA), Max Planck Institute for Social Law and Social Policy)
Dr Michael Bergmann (Chair for the Economics of Aging, Technical University of Munich)
Mrs Sabine Friedel (Chair for the Economics of Aging, Technical University of Munich)
Prior research with the Survey of Health, Ageing and Retirement in Europe (SHARE) data has shown that survey respondents who gave no answer to the income questions have a significantly lower probability of participating in the next wave than any other group. This therefore seems to be a group for which adaptive fieldwork measures to prevent panel drop-out are especially valuable. Since such measures should address the common cause of income nonresponse in one wave and unit nonresponse in the next, we set out to reveal that cause in the present study.
Analyzing the observed relationship with available SHARE panel data showed who these respondents are and how they behave in our survey. However, it did not clearly reveal the reasons for not answering the income questions. We therefore conducted in-depth interviews with a small selection of respondents with income nonresponse, in order to better understand what respondents were thinking when not answering the income question and what considerations shaped their decision whether or not to continue participating in further waves. Prior to the in-depth interviews, we had the following hypotheses about the common cause of income item nonresponse and panel drop-out: 1) a general reluctance towards, or lack of motivation for, surveys and survey questions; 2) strong privacy concerns; 3) not knowing the answer due to deteriorating health or cognitive abilities. The results of the in-depth interviews elaborated on these hypotheses but also indicated some reasons we had not yet thought of. In the next step, the results were used to design a structured questionnaire for a larger quantitative study.
We will present which panel respondents show the response pattern from income item nonresponse to panel drop-out, the results of the in-depth interviews, and the design and pretest of the structured questionnaire.
Ms Nicole Watson (University of Melbourne)
Mr Mark O'Shea (University of Melbourne) - Presenting Author
Dr Alexandru Cernat (University of Manchester)
In recent years, the field of adaptive survey design has emerged as an important addition to the survey design literature. Much of the research undertaken to date focuses on repeated cross-sectional surveys, yet there is arguably greater potential for adaptive survey design within longitudinal surveys, as data on respondents and their survey experience build with each wave. While there has been some research into adaptive survey design approaches in longitudinal surveys, we add to this literature by considering the longer-term impacts of modifications to fieldwork processes in the context of two household panels. We use waves 11 to 16 of the Household, Income and Labour Dynamics in Australia (HILDA) Survey and waves 1 to 6 of the UK Household Longitudinal Study (also known as Understanding Society) to simulate alternative follow-up strategies that target cases based on a number of indicators. We focus on the extent to which these adjustments to fieldwork effort affect response rates, sample representativeness, and sample size. We also examine the long-run implications of these adaptive survey design strategies and assess whether the impact of reducing fieldwork effort can be mitigated by weighting.
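A stripped-down version of this style of simulation, with invented data and a deliberately crude age-band weighting adjustment rather than the indicators and weights the authors actually use, might look like this:

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(7)

# Invented panel wave: older sample members respond after fewer calls,
# so capping fieldwork effort skews the responding sample older.
n = 10_000
age = rng.integers(18, 90, n)
p_per_call = 0.10 + 0.25 * (age - 18) / 71         # made-up response process
calls_to_response = rng.geometric(p_per_call).astype(float)
calls_to_response[rng.random(n) < 0.10] = np.nan   # hard-core nonrespondents

def respond_under_cap(cap):
    """Response indicator if fieldwork effort were capped at `cap` calls."""
    return calls_to_response <= cap                # NaN compares as False

for cap in (2, 4, 8):
    r = respond_under_cap(cap)
    # Crude nonresponse adjustment: weight respondents by the inverse
    # of the response rate within coarse age bands.
    bands = pd.cut(age, bins=[17, 34, 54, 89])
    rate = pd.Series(r.astype(float)).groupby(bands, observed=True).transform("mean")
    w = np.where(r, 1 / rate, 0.0)
    print(cap,
          round(r.mean(), 3),                      # response rate under the cap
          round(np.average(age[r]), 1),            # unweighted respondent mean age
          round(np.average(age, weights=w), 1),    # weighted estimate
          round(age.mean(), 1))                    # full-sample benchmark
```

Under the tighter caps the unweighted respondent mean drifts away from the full-sample benchmark and the band weights recover part of the gap; the paper asks the same qualitative question with real panel indicators and production weighting schemes.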