All time references are in CEST
Survey Data Harmonisation 1
Session Organisers | Dr Ruxandra Comanaru (European Social Survey, City University, London, UK); Ms Daniela Negoita (European Values Study, Tilburg University, The Netherlands)
Time | Tuesday 18 July, 11:00 - 12:30
Room | U6-01f |
The harmonisation of survey data has been a burgeoning research strand within the social sciences over the last few years. Harmonisation has at its core the attempt to make data more comparable, allowing for data linkage and analysis of distinct datasets that were not initially meant to be assessed together. Sound methodology in data harmonisation allows comparisons and harmonisation of instruments ahead of data collection, as well as evaluations of data from various sources that were not initially meant to be compared. As such, survey data can be harmonised ex-ante (during their design or after fieldwork, before issuing the collected data) or ex-post (combining surveys not meant to be compared at the point of data collection). Some countries have made concerted efforts to harmonise instruments at the point of data collection (ex-ante), such that, for example, the wellbeing of the nation can be tracked in all national statistics surveys, pushing for the same measures to be used regularly to assess the same concepts. The latter approach to data harmonisation (i.e., ex-post) has been used to tease out insights from sources not designed to be compared a priori. The SUSTAIN 2 (WP2, Task 1) project, for example, tried to bridge data from two long-standing surveys in the European social science context: the European Values Study (EVS) and the European Social Survey (ESS). It aimed to harmonise their data over several decades in order to allow cross-survey and cross-national comparisons, and thus to link measures that have conceptual, and potentially statistical, overlap.
This session, which aims to offer practical prompts for future avenues of cooperation between surveys, invites contributions on all aspects and challenges of data harmonisation: data collection mode, sampling design, translation method, and measurement instruments.
Keywords: data harmonisation, linear stretching, European Social Survey, European Values Study
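For readers unfamiliar with the "linear stretching" listed among the keywords, the sketch below shows the basic idea: responses recorded on different answer scales are linearly rescaled onto a common range so that they can be pooled ex-post. This is a generic illustration under assumed example items, not the procedure of any specific project mentioned here.

```python
import numpy as np

def linear_stretch(values, old_min, old_max, new_min=0.0, new_max=100.0):
    """Linearly rescale responses from [old_min, old_max] onto [new_min, new_max]."""
    values = np.asarray(values, dtype=float)
    return new_min + (values - old_min) * (new_max - new_min) / (old_max - old_min)

# A hypothetical 4-point item and an 11-point item mapped onto a common 0-100 range
print(linear_stretch([1, 2, 3, 4], old_min=1, old_max=4))   # approx. [0, 33.3, 66.7, 100]
print(linear_stretch([0, 5, 10], old_min=0, old_max=10))    # [0, 50, 100]
```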
Dr Kea Tijdens (WageIndicator Foundation, University of Amsterdam) - Presenting Author
Professor Ruud Luijkx (European Values Study/Tilburg University/University of Trento)
Dr Verena Ortmanns (German Institute for Adult Education – Leibniz Centre for Lifelong Learning)
Dr Silke Schneider (GESIS – Leibniz Institute for the Social Sciences)
Mrs Daniela Negoita (European Values Study/Tilburg University)
Mr Maurice Martens (Centerdata)
Socio-demographic attributes such as education, occupation, and religious denomination provide the background for studying attitudes, behaviours, and values. The use of these variables, however, poses a range of measurement challenges. Respondents can give vague or insufficient information during data collection, which makes it difficult to categorise these variables consistently at the post-coding stage. Changes in the social structure require continuous updates of existing classification systems, which increases the cost of survey implementation in terms of expert knowledge, questionnaire amendments, and interviewer training. In addition, face-to-face interviews have become more expensive than in the past, and new solutions, such as web-administered surveys, are being devised to ensure high-quality survey research.
In this context, SurveyCodings has developed tools aimed at reducing office coding and harmonisation logistics by giving free-of-charge access to ready-to-use codings of key socio-demographic variables (field and level of education, occupational titles, job tasks, religious denominations, regions, cost of living). From general and broad categories, the classifications drill down to country-specific characteristics translated into the languages spoken in each country. Database lookups can be linked directly to questionnaires, where the respondent or the interviewer can immediately identify the answer option of interest. Lookups can replace either open questions that would otherwise need to be coded after the interview or long-list questions.
In this presentation, we will showcase the resources developed within SurveyCodings so far and welcome suggestions for improvement.
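As a rough illustration of the lookup idea described above (not the SurveyCodings implementation, whose data and interface are not reproduced here), a type-ahead lookup against a coded list of occupational titles might look as follows; the titles and ISCO-08-style codes are shown purely for illustration.

```python
# Illustrative only: a generic occupation lookup of the kind described above,
# not the SurveyCodings interface. Codes follow the ISCO-08 pattern.
OCCUPATION_LOOKUP = {
    "primary school teacher": "2341",
    "secondary education teacher": "2330",
    "software developer": "2512",
}

def suggest_codes(free_text, lookup=OCCUPATION_LOOKUP, limit=5):
    """Return (title, code) pairs whose title contains the typed text, so the
    respondent or interviewer can pick a match during the interview instead of
    leaving an open answer to be coded afterwards."""
    query = free_text.strip().lower()
    return [(title, code) for title, code in lookup.items() if query in title][:limit]

print(suggest_codes("teacher"))
# [('primary school teacher', '2341'), ('secondary education teacher', '2330')]
```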
Dr Boris Heizmann (GESIS - Leibniz Institute for the Social Sciences) - Presenting Author
Survey data harmonization can greatly improve the analytical potential of survey data. While there are often ideas regarding what the optimal harmonization strategy would be (e.g. imputation or equipercentile equating), these approaches are not always feasible, for example due to data-related restrictions. Researchers are therefore sometimes left with a set of second-best strategies and no clear indication of which should be given preference. The present paper takes this situation as a starting point and asks: do the actual scientific conclusions differ when different harmonization approaches are employed?
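For readers unfamiliar with equipercentile equating, the following is a minimal sketch of the idea under simulated placeholder data (not the study's data): each point on a source scale is mapped to the value on a target scale that holds the same percentile rank.

```python
import numpy as np

def equipercentile_equate(source_scores, target_scores, grid):
    """Map each value in `grid` (points on the source scale) to the target-scale
    value with the same percentile rank: a basic equipercentile equating sketch."""
    source = np.sort(np.asarray(source_scores, dtype=float))
    target = np.asarray(target_scores, dtype=float)
    ranks = np.searchsorted(source, grid, side="right") / source.size
    return np.quantile(target, ranks)

rng = np.random.default_rng(1)
ten_point = rng.integers(0, 11, size=1000)   # e.g. a 0-10 item in one survey
four_point = rng.integers(1, 5, size=1000)   # a matched 1-4 item in another survey
print(equipercentile_equate(ten_point, four_point, grid=np.arange(0, 11)))
```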
The analytical setup of the paper will consist of a feasible set of three harmonization variants that are applied to heterogeneous, publicly available “real-world” survey data. The analysis will be conducted via a series of regressions based on different scenarios, such as varying sample sizes, modeling complexities, and interactions. These scenarios mirror typical empirical conditions. In order to mimic a typical harmonization situation, we will employ data from several sources, such as the German General Social Survey (ALLBUS), the German Longitudinal Election Study (GLES), the European Social Survey (ESS), and the European Values Study (EVS).
In the end, the result patterns obtained for these scenario combinations could imply that divergent harmonization strategies lead to roughly similar results. In this case, researchers could more confidently use data that would, from a purely methodological standpoint, be regarded as sub-optimally harmonized. The implication for data quality would then be that even a less-than-perfect harmonization can lead to valid results. Conversely, whenever the results diverge, this implies that data quality would benefit from more sophisticated methods. Researchers would then be well advised to include different harmonization strategies in their robustness checks.
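A minimal sketch of the kind of robustness comparison described above: the same substantive regression is fitted once per harmonization variant and the focal coefficient is compared across variants. The variable names, the three variants, and the simulated data are hypothetical placeholders, not the paper's actual setup.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(42)
n = 2000
latent = rng.normal(size=n)

# Toy pooled dataset: one outcome and one predictor harmonized in three different ways
df = pd.DataFrame({
    "outcome": 0.4 * latent + rng.normal(size=n),
    "trust_linear": 50 + 15 * latent,                         # e.g. linear stretching
    "trust_zscore": (latent - latent.mean()) / latent.std(),  # standardization
    "trust_ranked": pd.Series(latent).rank(pct=True),         # rank-based recoding
})

# Fit the same model under each variant and compare the focal coefficient
for variant in ["trust_linear", "trust_zscore", "trust_ranked"]:
    fit = smf.ols(f"outcome ~ {variant}", data=df).fit()
    print(variant, round(fit.params[variant], 3), round(fit.pvalues[variant], 4))
```

Coefficients will naturally differ in magnitude because the harmonized scales differ; the relevant comparison is whether the substantive conclusion (sign, significance, approximate standardized effect) changes across variants.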
Mrs Lisa Rutherford (National Centre for Social Research)
Ms Jo D'Ardenne (National Centre for Social Research) - Presenting Author
Since 2001, the cross-national European Social Survey (ESS) has been surveying the attitudes and behaviours of participants on a biennial basis. The ESS’ rigorous model of cross-national questionnaire design and pre-testing includes cognitive interviewing, which was incorporated into the ESS questionnaire development process at round 6 and precedes piloting.
Cognitive interviewing methods provide insights into participants’ thought processes as they are confronted with, and attempt to answer, survey questions. Survey questions are presented to participants in ways that mimic the intended survey presentation. The ESS has developed a cognitive interviewing protocol that captures observational and dialogic data generated from the use of cognitive probes during the interview. This protocol, along with protocols for the recruitment of test participants, translation, and data summarisation, aims to minimise deviation in practices between interviewers and countries.
The SUSTAIN 2 (WP2, Task 1) project attempts to bridge data from the ESS and another long-standing survey – the European Values Study (EVS). As part of this project, questions from the EVS are being considered for inclusion in the ESS. These questions were required to go through the same pre-testing that all potential ESS content is subject to. The National Centre for Social Research (UK) co-ordinated cross-national cognitive testing across three countries: Finland, France, and the UK. This paper discusses the ex-ante measures taken to harmonise the ESS and EVS question testing, such as the use of scripted probes, joint analysis meetings, and translation methods. It also considers the challenges in harmonising question testing methods and the survey questions being tested.
Miss Daniela Negoita (European Values Study - Tilburg University) - Presenting Author
Dr Ruxandra Comanaru (European Social Survey - City University)
The European Values Study (EVS) and the European Social Survey (ESS) are cross-national surveys that collect data in most European countries in the domains of social attitudes, family, gender roles, work, politics, and many more. The EVS is fielded every nine years (1981-present), while the ESS is conducted biennially (2002-present). As part of the ESS-SUSTAIN project, we explored the feasibility of fielding an EVS module within the ESS infrastructure. To achieve this goal, substantive and methodological differences between the two surveys were tackled by tapping into ex-post harmonisation techniques. The EVS Wave 5 (2017-2020) and ESS Round 9 (2018-2019) English source questionnaires were compared by identifying pairs of compatible items along criteria grouped into four domains: 1) question attributes, 2) interviewer role, 3) response attributes, and 4) showcards. We then selected 24 pairs of items and compared them on the basis of correlational patterns, frequency distributions (with relevant statistical tests), and non-substantive responses. To minimise further variability stemming from factors other than measurement, the pairs of items were compared in three countries (Germany, Slovenia, Norway) with similar sampling frames (individual registers), fieldwork timing (less than one year apart), and response rates. The correlations showed the most consistency, while averages and proportion distributions yielded the most discrepancies. These insights not only point to the potential harmonisability of EVS and ESS, but also add to the methodological knowledge in the field of ex-post harmonisation. To further strengthen the plausibility of bridging EVS and ESS, an experimental step was added to the process. The items were pre-tested in three countries (France, the United Kingdom, Finland) using online panels for quantitative analysis and cognitive testing for a more qualitative approach. The presentation will bring together the results of these different assessments to illustrate a sound methodological approach to harmonising social surveys.
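As a minimal illustration of the kind of distributional comparison described above, a chi-square test of homogeneity can be used to compare the response frequencies of one matched EVS-ESS item pair. The data below are simulated placeholders, not the actual survey responses.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
evs_item = rng.integers(1, 5, size=1200)   # hypothetical 4-point EVS item
ess_item = rng.integers(1, 5, size=1500)   # matched hypothetical 4-point ESS item

# Build a 2 x k contingency table of response frequencies and test homogeneity
categories = np.arange(1, 5)
table = np.vstack([
    [(evs_item == c).sum() for c in categories],
    [(ess_item == c).sum() for c in categories],
])
chi2, p, dof, _ = stats.chi2_contingency(table)
print(f"chi2={chi2:.2f}, df={dof}, p={p:.3f}")
```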