Undesirable Interviewer Behaviour and Data Falsification: Prevention, Detection and Intervention 2

Coordinator 1: Dr Joost Kappelhof (SCP)
Coordinator 2: Dr Roberto Briceño-Rosas (GESIS)
Coordinator 3: Dr Caroline Vandenplas (KU Leuven)
Coordinator 4: Professor Geert Loosveldt (KU Leuven)
Coordinator 5: Dr Ineke Stoop (SCP)
Coordinator 6: Miss Céline Wuyts (KU Leuven)
Undesirable interviewer behaviour can occur in various ways, of which data falsification is an extreme example. In general, undesirable interviewer behaviour can be defined as any interviewer behaviour that can have a negative impact on data quality. Such behaviour can be unintentional or intentional, and it can occur during the performance of the interviewer's various tasks. These tasks relate both to contacting, selecting and convincing the respondent to participate and to the interaction with the respondent during the interview. In the context of standardized interviewing, these tasks must be performed according to some basic rules, and deviations from these rules can be considered an important category of undesirable interviewer behaviour. Typical examples are interviewers who rush through the questionnaire, forcing respondents to satisfice, or interviewers who anticipate requests for clarification by suggesting an answer. They may also engage in side conversations or fail to maintain a neutral attitude. In the worst case, the interviewer fills in an answer without asking the question; this can be regarded as falsification, also called curbstoning.
Note that data falsification need not be carried out by interviewers alone: field supervisors, survey agencies and even researchers may be involved in undesirable practices such as data fabrication. The data processing step, during which 'advanced' data cleaning procedures are applied, seems particularly relevant in this context, as there appears to be a grey zone between data cleaning and data fabrication.
We are interested in papers discussing ways to detect and prevent undesirable behaviour or practices, addressing both theoretical aspects and current practice. Examples of detection tools are interaction analysis, interviewer effect analysis, unusual response patterns, partial duplicates, back checks, and the use of paradata such as contact forms, keystrokes or time stamps. Prevention techniques may relate to interviewer briefing and training, fair interviewer payment, early detection of suspicious cases through fieldwork monitoring, or the development of easy-to-administer, low-burden questionnaires. Papers may also consider possible interventions after the detection of fraud or undesirable interviewer behaviour, either during fieldwork or post-survey.
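By way of illustration only, the minimal Python sketch below shows one possible operationalisation of the partial-duplicates check mentioned above: flagging pairs of interviews whose answers agree on an unusually large share of items. The column names, the toy data and the similarity threshold are hypothetical assumptions for this example and are not drawn from any particular study or tool.

# Hypothetical sketch: flagging partial duplicates across interviews as a
# possible falsification indicator. Column names and the 0.85 threshold are
# illustrative assumptions only.
import pandas as pd
from itertools import combinations

def partial_duplicate_pairs(df, id_col="resp_id", interviewer_col="interviewer_id",
                            threshold=0.85):
    """Return pairs of interviews whose answers match on at least `threshold`
    of the substantive items (all columns except the identifier columns)."""
    item_cols = [c for c in df.columns if c not in (id_col, interviewer_col)]
    flagged = []
    for (_, row_a), (_, row_b) in combinations(df.iterrows(), 2):
        matches = sum(row_a[c] == row_b[c] for c in item_cols)
        share = matches / len(item_cols)
        if share >= threshold:
            flagged.append((row_a[id_col], row_b[id_col], share))
    return pd.DataFrame(flagged, columns=["resp_a", "resp_b", "match_share"])

# Toy usage with fabricated example data: respondents 1 and 2 (same
# interviewer) give identical answers and are flagged for follow-up.
df = pd.DataFrame({
    "resp_id": [1, 2, 3],
    "interviewer_id": ["A", "A", "B"],
    "q1": [1, 1, 2], "q2": [3, 3, 4], "q3": [5, 5, 1], "q4": [2, 2, 3],
})
print(partial_duplicate_pairs(df, threshold=0.75))

In practice such a check would be only one input into fieldwork monitoring; flagged pairs would typically be followed up with back checks rather than treated as proof of falsification.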