Assessing the Extent of Nonresponse Bias
Session Organiser | Dr Marta Kolczynska (Institute of Philosophy and Sociology, Polish Academy of Sciences)
Time | Friday 19th July, 13:30 - 14:30 |
Room | D13 |
This session includes papers that showcase methods for the assessment of nonresponse bias.
Keywords: nonresponse error, refusals, representativeness
Dr Marta Kolczynska (Institute of Philosophy and Sociology, Polish Academy of Sciences) - Presenting Author
Dr Piotr Cichocki (Adam Mickiewicz University in Poznan)
Professor Piotr Jabkowski (Adam Mickiewicz University in Poznan)
Survey methodology needs standardised, easily applicable, and theory-based measures of sample quality in general population surveys. This demand is especially strong for cross-country survey projects, and it is satisfied neither by the handbook outcome rates (response, contact, cooperation, and refusal rates) nor by indicators based on the evaluation of survey documentation. Such measures prove insufficient because they are indirect and may be poorly correlated with total survey error. In this paper we present two ways of evaluating sample quality, based on external (Groves 2006) and internal (Kohler 2006) criteria of representativeness. We apply these criteria to a harmonised dataset of 1,721 national surveys from 22 international survey projects, using standardised meta-data on sample types and survey modes. Our analyses focus on: 1) cross-project sample quality evaluation; 2) longitudinal differences in sample quality; 3) sample quality by sample type, controlling for mode effects; 4) associations between project documentation quality and sample quality; 5) associations between response rates and sample quality; and 6) associations between external and internal evaluations of sample quality.
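The handbook outcome rates the abstract refers to are computed directly from final disposition counts. A minimal sketch in Python, using simplified AAPOR-style formulas (partial interviews omitted) and entirely hypothetical disposition counts:

```python
# Simplified AAPOR-style outcome rates from final disposition counts.
# The counts below are hypothetical, for illustration only.

def outcome_rates(I, R, NC, O):
    """I: complete interviews, R: refusals, NC: non-contacts, O: other non-interviews."""
    eligible = I + R + NC + O
    return {
        "response_rate": I / eligible,           # interviews over all eligible cases
        "contact_rate": (I + R + O) / eligible,  # cases where contact was made
        "cooperation_rate": I / (I + R + O),     # interviews among contacted cases
        "refusal_rate": R / eligible,            # refusals over all eligible cases
    }

rates = outcome_rates(I=1742, R=1500, NC=900, O=418)
print({k: round(v, 3) for k, v in rates.items()})
```

Because these rates measure fieldwork outcomes rather than error, two surveys with identical response rates can still differ sharply in nonresponse bias, which is the motivation for the external and internal representativeness criteria the paper proposes.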
Professor Jürgen Friedrichs (University of Cologne)
Mr Felix Lesske (University of Cologne) - Presenting Author
Mrs Vera Schwarzenberg (University of Cologne)
Systematic non-response is a classic problem in empirical social research. Even in major German studies such as the ALLBUS, response rates have declined continuously over the past decades (ALLBUS 1980: 69.5%; 1990: 60.4%; 2000: 46.9–56.5%; 2010: 34.4%; 2016: 34.6%). Since distortions caused by non-response cannot be assumed to be random, this decline affects data quality and thus the validity of interpretations.
We investigate non-response bias and ask whether there are systematic status differences between respondents and those who refused. Data come from our study “The Socio-spatial Integration of Refugees in Hamburg, Cologne and Mulheim”, comprising six residential areas with refugee accommodation. We used a probability sample; a total of 1,742 face-to-face interviews were completed, for a response rate of 38.2%.
Several interviewers noted that refusals were becoming more frequent in some streets, suggesting systematic bias. We test this assumption for two areas: Cologne Ostheim, with one refugee shelter for 400 people, and Mulheim Mitte, an inner-city area in which refugees are housed in dispersed apartments.
We examine whether geo-referenced data, supplemented by socio-economic data from the microm Corporation, can be used to better characterise the type of potential systematic error. Particularly for an issue as polarising as the housing of refugees, refusals cannot be expected to vary randomly. First results show that: 1) consistent with our assumption, participation in the interviews tends to be higher in high-status residential areas; and 2) in areas with a higher proportion of migrants, the participation rate is lower. We discuss further applications of this procedure to secure data quality.
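The street-level pattern described above amounts to comparing participation rates across geo-referenced units against an area-status measure. A hypothetical sketch of that comparison, with invented street identifiers, counts, and a made-up status index (none of these figures are from the study):

```python
# Hypothetical street-level fieldwork data:
# (street, addresses contacted, interviews completed, socio-economic status index)
streets = [
    ("A", 120, 55, 0.8),  # high-status area
    ("B", 150, 60, 0.6),
    ("C", 200, 62, 0.3),
    ("D", 180, 50, 0.2),  # low-status area
]

# Participation rate per street: completed interviews over contacted addresses
rates = {name: done / contacted for name, contacted, done, _ in streets}

for name, contacted, done, status in streets:
    print(f"street {name}: participation {done / contacted:.1%}, status index {status}")
```

In the invented data the participation rate falls with the status index, mirroring the reported first result; in practice one would relate the observed rates to the microm socio-economic indicators with a regression rather than by inspection.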