
ESRA 2025 Preliminary Program

              



All time references are in CEST

Understanding survey participation among children, teenagers and young adults: opportunities, challenges and gaps

Session Organisers: Dr Violetta Parutis (ISER, University of Essex)
Dr Jonathan Burton (ISER, University of Essex)
Time: Tuesday 15 July, 09:00 - 10:30
Room: Ruppert 119

With response rates declining worldwide, most research into survey participation focuses on the adult population, with very little attention devoted to the factors that influence survey response among children, teenagers and young people, especially in the longitudinal context. Collecting good-quality, robust longitudinal survey data about children and young adults is crucial, as research shows significant associations between experiences, attitudes and behaviours in childhood, adolescence and early adulthood, and outcomes later in life (Parutis, 2023). There is also a need to improve our understanding of the transition into adulthood, which is linked to numerous social, emotional, identity and behavioural changes in young people's lives at this time. Research is therefore needed to identify what drives young people's participation in surveys, including their recruitment and retention, and what hinders their engagement in survey research.
This panel aims to bring together researchers from a variety of disciplines interested in sharing their experiences of surveying young people. We are interested in hearing from cross-sectional and longitudinal surveys, as well as from qualitative research colleagues who would be willing to share their insights into this important but so far under-researched area.
Papers may include but are not limited to:
• reasons behind survey non-response among this population
• innovative methods of increasing youth survey response rates
• best ways of engaging young audiences in on-going research
• ideas around how to reach and retain an audience that is constantly changing
• creative, effective and ethical techniques for online and offline engagement with under 18s
• ethical use of social media, YouTube and other digital technologies in capturing and researching young people
The call for papers is aimed at those who have undertaken research with children and young adults. Comparisons with older respondents are also encouraged. Cross-national and cross-survey insights are welcome.

Keywords: youth, response

Papers

Emerging Adulthood, Participation and Response Behaviour in Longitudinal Surveys – Evidence from the Generations and Gender Surveys (GGS)

Mrs P. Linh Nguyen (French Institute for Demographic Studies (INED), University of Essex, University of Mannheim) - Presenting Author

Survey data need to be representative and of high quality to inform research and policy development on children and young people. However, previous research has demonstrated selective nonresponse among people in their emerging adulthood years, as they are more likely to drop out of a panel survey due to multiple life course changes (starting higher education, getting a first job, moving out of the parental home, etc.). Survey research therefore needs to pay more attention to how the representativeness of a panel is affected by the (non-)participation of young adults over the survey waves. This study explores nonresponse patterns of emerging adults aged 18 to 29, compared with those of older respondents, using panel data from GGS round one and the first waves of GGS round two. As the GGS provides important data for policymaking and research on family formation and dynamics, the survey participation of emerging adults is crucial.
The descriptive analysis captures response rates, as well as indicators of response behaviour and quality, cross-tabulated with respondent characteristics (e.g., sex, age, education) and past GGS survey participation. Nonresponse is defined as a failed attempt to contact the panel member or a refusal to participate; nonresponse and attrition among emerging adults are predicted using logit regressions. Several indicators of response behaviour and response quality are also considered (such as item nonresponse), with an additional focus on mode effects through analyses of device choice and break-off behaviour per device, as several GGS countries opted for mixed-mode administration. Lastly, the study provides an overview of the different recruitment, participant engagement and retention strategies across the GGS countries included in the analysis.
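As an illustration of the kind of logit specification described above, the following Python sketch fits a logistic regression of panel nonresponse on respondent characteristics and past participation. The variable names and the synthetic data are assumptions made for the example; this is not the authors' code or the GGS data.

```python
# Minimal sketch: logit regression of panel nonresponse (hypothetical data).
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(42)
n = 2000
df = pd.DataFrame({
    "age": rng.integers(18, 60, n),          # age at the current wave
    "female": rng.integers(0, 2, n),         # sex indicator
    "tertiary_edu": rng.integers(0, 2, n),   # education indicator
    "prior_waves": rng.integers(0, 5, n),    # earlier waves completed (hypothetical)
})
df["emerging_adult"] = (df["age"] <= 29).astype(int)

# Synthetic outcome: 1 = nonresponse (noncontact or refusal), 0 = interview
linpred = -1.0 + 0.6 * df["emerging_adult"] - 0.3 * df["prior_waves"]
df["nonresponse"] = rng.binomial(1, 1 / (1 + np.exp(-linpred)))

# Logit regression of nonresponse on characteristics and past participation
model = smf.logit(
    "nonresponse ~ emerging_adult + female + tertiary_edu + prior_waves",
    data=df,
).fit()
print(model.summary())
```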


Anchors' Consent to Survey Their Children within a Multi-Actor Survey Design in Self-administered Modes: The Role of Survey-related Factors within a Probability-based Refugee Panel

Dr Jean Philippe Décieux (University of Bonn & Federal Institute for Population Research (BiB)) - Presenting Author
Dr Andreas Ette (Federal Institute for Population Research (BiB))
Dr Ludovica Gambaro (Federal Institute for Population Research (BiB))
Dr Lenore Sauer (Federal Institute for Population Research (BiB))

Refugee children represent a challenging combination of two hard-to-reach populations: children and refugees. Refugees form a relatively small, highly mobile group that may encounter language barriers, face stigmatization, and be hesitant to identify themselves as refugees or may be shielded by gatekeepers. For children, a central challenge lies in obtaining parental consent, as minors are not legally autonomous. Consequently, parents must first be convinced of the survey's value before their children can participate. Hence, parental consent is a critical factor in surveys involving minors, as parents often view their children as particularly vulnerable.
An interesting approach to addressing this challenge is a multi-actor survey design. However, in self-administered surveys, where interviewers are absent and cannot offer persuasive arguments, securing consent becomes more difficult. To overcome these obstacles and improve anchors' consent rates for their children in self-administered surveys, it is crucial to identify factors that enhance parents' willingness to allow their children's participation. Ideally, these factors should be under the control of researchers, such as elements of the survey design and questionnaire.
This study therefore examines factors in self-administered surveys that influence parental consent for surveying children, as well as the actual participation behavior of the children. Based on data from the 4th wave of the probability-based BiB/FReDA survey “Refugees from Ukraine,” initial findings suggest that both actual and perceived survey duration, overall survey attitude, consistent panel participation, and lower item non-response are associated with higher parental consent rates and actual participation of children. Further analysis of survey attitudes indicates that perceptions of the questionnaire as “interesting,” “too long,” or “too personal” had the most significant impact.


Youth Nonresponse in the Understanding Society Survey: Investigating the Impact of Life Events

Dr Camilla Salvatore (Utrecht University) - Presenting Author
Dr Peter Lugtig (Utrecht University)
Dr Bella Struminskaya (Utrecht University)

Survey response rates are declining worldwide, particularly among young individuals. This trend is evident in both cross-sectional and longitudinal surveys, such as Understanding Society, where young people exhibit a higher likelihood of either missing waves or dropping out entirely.
This paper aims to explore why young individuals exhibit lower participation rates in Understanding Society. Specifically, we investigate the hypothesis that young people experience more life events, such as a change of job, a change in relationship status or a house move, and that it is the occurrence of such life events that is associated with a higher likelihood of not participating in the survey.
The data source is Understanding Society, a mixed-mode probability-based general population panel study in the UK. We analyze individuals aged 18-44 at Understanding Society's Wave 1 and follow them until Wave 12. We consider four age groups: 18-24 (youth), 25-31 (early adulthood), 32-38 (late adulthood) and 39-45 (middle age, the reference group for comparison). To study the effect of life events on attrition, we apply a discrete-time multinomial hazard model. In this model, time is entered as a covariate and the outcome variable is the survey participation indicator (interview, noncontact, refusal or other). The outcome is modeled as a function of lagged covariates, including demographics, labor market participation, qualifications, household structure and characteristics, marital status and mobility, as well as binary indicators for life event-related status changes.
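To illustrate the model set-up described above, the sketch below builds a hypothetical person-period file, enters the wave number as a covariate, and fits a multinomial logit for the wave outcome on lagged life-event indicators. All variable names, status codes and synthetic data are assumptions for the example, not the Understanding Society data or the authors' code.

```python
# Minimal sketch: discrete-time multinomial hazard model on person-period data.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(7)
n_persons, n_waves = 500, 12
rows = []
for pid in range(n_persons):
    for wave in range(2, n_waves + 1):
        row = {
            "pid": pid,
            "wave": wave,                          # time enters as a covariate
            # hypothetical lagged life-event indicators (measured at wave - 1)
            "moved_lag": rng.integers(0, 2),
            "job_change_lag": rng.integers(0, 2),
            # wave outcome: 0 = interview, 1 = noncontact, 2 = refusal, 3 = other
            "outcome": rng.choice(4, p=[0.75, 0.10, 0.10, 0.05]),
        }
        rows.append(row)
        if row["outcome"] != 0:                    # leaves the risk set after attrition
            break
person_period = pd.DataFrame(rows)

X = sm.add_constant(person_period[["wave", "moved_lag", "job_change_lag"]])
hazard_model = sm.MNLogit(person_period["outcome"], X).fit()
print(hazard_model.summary())
```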
Consistent with existing literature, our findings reveal that younger respondents, as well as those with an immigration background, lower education, and unemployment status, are less likely to participate. We also demonstrate that changes in job status and relocation contribute particularly to attrition, with age remaining a significant factor.


Emerging Challenges: Trends in Exclusion and Participation in Large-Scale Assessments in Education

Dr Sabine Meinck (IEA) - Presenting Author
Mr Umut Atasever (IEA)

International large-scale assessments in education (ILSA), such as TIMSS, PIRLS, and PISA, are essential tools for evaluating education systems and informing policy. They rely on unbiased, nationally representative data to enable valid cross-national comparisons. The data quality standards set by ILSA are stricter than those of most education or social science surveys. ILSA aim for 100% participation from randomly selected schools and students but require minimum participation rates of 75% at both stages. Additionally, they seek full coverage of the target population, such as all fourth-grade students in TIMSS, permitting a maximum exclusion rate of 5%. Education systems that fail to meet these standards are flagged in reports with annotations, or their results are reported separately, due to concerns about data reliability.
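As a concrete reading of the thresholds quoted above, the sketch below checks whether a hypothetical education system's rates meet the stated minimums. The actual ILSA adjudication rules are more nuanced (e.g., replacement schools and weighted participation rates), so this is only a simplified illustration of the figures given in the abstract.

```python
# Minimal sketch: checking the quoted ILSA thresholds (75% participation at
# each sampling stage, at most 5% exclusion) for a hypothetical system.
def meets_ilsa_standards(school_rate: float, student_rate: float,
                         exclusion_rate: float) -> bool:
    """Return True if the system meets the minimum standards stated above."""
    return (school_rate >= 0.75
            and student_rate >= 0.75
            and exclusion_rate <= 0.05)

# Example: 82% of sampled schools, 78% of sampled students, 6% exclusions
print(meets_ilsa_standards(0.82, 0.78, 0.06))  # False -> flagged in reports
```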
Most ILSA are conducted cyclically every three to five years, with many established over 25 years ago. While school participation rates have remained relatively stable, recent studies—particularly post-COVID-19—show a decline in student participation rates and an increase in exclusion rates. This raises concerns about the comparability of findings across education systems, as research indicates that nonrespondents and excluded students are often lower achievers within the target populations.
This paper examines trends in participation and exclusion rates across IEA studies, including TIMSS, PIRLS, ICCS, and ICILS. We analyze the potential impact of these trends on the validity of comparative statements about education system performance. Furthermore, we explore strategies to address these challenges, such as methods to improve response rates while respecting diverse national contexts and possibilities to enhance the inclusivity of these studies. By addressing these issues, ILSA can better uphold their rigorous standards and continue to provide valuable insights for educational improvement worldwide.


Who are the nonrespondents? Insights from the Swiss youth panel TREE

Mrs Ellen Laupper (Swiss Federal University for Vocational Education and Training SFUVET) - Presenting Author
Dr Barbara Müller (University of Berne)

Despite increased efforts to counteract declining response rates, nonresponse bias often remains largely unchanged, making it crucial to better understand who is hard to reach in surveys. While traditional nonresponse bias analyses reveal various differences in sociodemographic variables between respondents and nonrespondents, psychological differences remain underexplored.

This study leverages a large-scale longitudinal representative sample of Swiss 9th graders surveyed at the baseline in classrooms, capturing a rich set of sociodemographic information, cognitive skills, personality traits, and other relevant psychological constructs. We aim to investigate how individuals who refused follow-up participation differ from panel participants regarding psychological characteristics. We also examine how survey participation patterns, such as tendencies for early versus late responses and survey attrition, can be predicted by such psychological traits.
For the analysis, data from the second cohort of the TREE (Transitions from Education to Employment) panel study are used. The study began with a 2016 baseline survey at the end of lower secondary education (age 15). The aim of the TREE-2 study is to track approximately 8,000 participants over seven waves, via annual telephone interviews and supplementary web surveys, to monitor educational and labour market trajectories.

Based on response behaviour across the seven follow-up waves (e.g., early response, late response, nonresponse, attrition, and nonparticipation), we identify distinct respondent groups or clusters by using techniques such as latent class analysis and sequence pattern analysis. Multivariate techniques explore the role of personality traits—such as the Big Five, locus of control, self-esteem, risk aversion, trust, and time preferences—alongside sociodemographic variables (e.g., gender, migration background, parental education, and HISEI).
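To make the sequence-based grouping concrete, the sketch below one-hot encodes hypothetical wave-by-wave response statuses and clusters them with k-means. This is a simplified stand-in for the latent class and sequence pattern analyses the authors describe; the status codes, data and cluster count are all invented for illustration.

```python
# Minimal sketch: clustering wave-by-wave response sequences (hypothetical data).
import numpy as np
import pandas as pd
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
statuses = ["early", "late", "nonresponse"]   # hypothetical wave-level response codes
n_respondents, n_waves = 1000, 7
sequences = pd.DataFrame(
    rng.choice(statuses, size=(n_respondents, n_waves)),
    columns=[f"wave_{w}" for w in range(1, n_waves + 1)],
)

# One-hot encode the per-wave statuses and group the resulting sequences
encoded = pd.get_dummies(sequences)
clusters = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(encoded)
sequences["cluster"] = clusters
print(sequences.groupby("cluster").size())
```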

Our analysis offers insights into the interplay between psychological and sociodemographic factors in shaping survey participation behaviours, contributing to a deeper understanding of survey (non)participation.