
ESRA 2023 Glance Program


All time references are in CEST

Understanding survey participation among children, teenagers and young adults: opportunities, challenges and gaps

Session Organisers: Dr Violetta Parutis (ISER, University of Essex)
Dr Jonathan Burton (ISER, University of Essex)
Time: Tuesday 18 July, 09:00 - 10:30

With response rates declining worldwide, most research into survey participation tends to focus on the adult population, with very little attention devoted to the factors that influence survey response among children, teenagers and young people, especially in the longitudinal context. Collecting good-quality, robust longitudinal survey data about children and young adults is crucial, as research shows significant associations between experiences, attitudes and behaviours in childhood, adolescence and early adulthood, and outcomes later in life (Parutis, 2023). There is also a need to improve our understanding of the transition into adulthood, which is linked to the numerous social, emotional, identity and behavioural changes that happen in young people's lives at that time. Therefore, research is needed to identify what drives young people's participation in surveys, including recruitment and retention, and what hinders their engagement in survey research.
This panel aims to bring together researchers from a variety of disciplines interested in sharing their experiences of surveying young people. We are interested in hearing from colleagues working on cross-sectional and longitudinal surveys, as well as from qualitative researchers willing to share their insights into this important but so far under-researched area.
Papers may include, but are not limited to:
• reasons behind survey non-response among this population
• innovative methods of increasing youth survey response rates
• best ways of engaging young audiences in on-going research
• ideas around how to reach and retain an audience that is constantly changing
• creative, effective and ethical techniques for online and offline engagement with under 18s
• ethical use of social media, YouTube and other digital technologies in capturing data on and researching young people
The call for papers is aimed at those who have undertaken research with children and young adults. Comparisons with older respondents are also encouraged. Cross-national and cross-survey insights are welcome.

Keywords: youth, response

Papers

Engaging children and young people in surveys using remote modes: the role of parents and caregivers

Ms Line Knudsen (National Centre for Social Research) - Presenting Author
Mr Martin Wood (National Centre for Social Research)
Ms Samantha Spencer (National Centre for Social Research)

In many cohort and longitudinal surveys that follow children and young people over time, parents and caregivers provide important insights and information, yet hearing from the child or young person directly is key. However, the move towards remote modes (online and CATI) and away from face-to-face in many longitudinal studies – whether at the initial wave and/or subsequent wave(s) – poses some additional challenges in reaching and surveying children and young people.

With a face-to-face approach, interviewers are in the household; they can speak to both the young person and parent/caregiver and adjust their approach as needed – be it in relation to encouraging participation, gaining informed consent, and/or administering the questionnaire in accordance with study protocols. In contrast, with an online approach (and to a lesser extent a CATI approach), the dynamics and practicalities of the household are largely unknown. For example, who (if anyone!) reads the invitation materials? Are materials passed on to/discussed with the young person? Has the young person actually understood the information, and do they understand the survey questions? Has their parent/caregiver? And have they given their consent? Do they help them complete the survey?

Many of these issues are, of course, not unique to surveying children and young people. Nevertheless, the role of the parent/caregiver as gatekeeper and possible advocate introduces particular challenges for engaging children and young people in online and remote surveys. This presentation will outline some of these challenges and consider potential approaches for addressing them, drawing predominantly on the SEND Futures study, a longitudinal mixed-mode study which surveyed around 3,000 young people with special educational needs and disabilities and their parents/caregivers.


Emerging Adulthood, Participation and Response Behaviour in Longitudinal Surveys – Evidence from the Generations and Gender Surveys (GGS)

Mrs P. Linh Nguyen (French Institute for Demographic Studies (INED), University of Essex, University of Mannheim) - Presenting Author

Survey data need to be representative and of high quality to inform research and policy development on children and young people. However, previous research has demonstrated selective nonresponse among people in their emerging adulthood years, who are more likely to drop out of a panel survey due to multiple life course changes (starting higher education, getting a first job, moving out of the parental home, etc.). Therefore, survey research needs to pay more attention to how the representativeness of a panel is affected by the (non-)participation of young adults over the survey waves. This study explores nonresponse patterns of emerging adults aged 18 to 29, compared with those of older respondents, using both panel data from GGS round one and the first waves of GGS round two. As the GGS provides important data for policymaking and research on family formation and dynamics, the survey participation of emerging adults is crucial.
The descriptive analysis captures response rates, as well as indicators of response behaviour and quality, cross-tabulated with respondent characteristics (e.g., sex, age, education) and past GGS survey participation. With nonresponse defined as a failed attempt to contact the panel member or a refusal to participate, nonresponse and attrition of emerging adults are predicted using logit regressions. Several indicators of response behaviour and response quality are also considered (such as item nonresponse), with an additional focus on mode effects through analyses of device choice and break-off behaviour per device, as several GGS countries opted for mixed-mode administration. Lastly, the study provides an overview of the different recruitment, participant engagement and retention strategies across the GGS countries included in the analysis.
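As a concrete illustration of the logit setup described above, a minimal sketch might look as follows. This is not the authors' code: the input file and the variable names (nonresponse, age, sex, education, prior_waves) are hypothetical placeholders rather than actual GGS fields.

```python
# Minimal sketch of a nonresponse logit, assuming a flat analysis file
# with one row per panel member. All column names are hypothetical.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("ggs_panel.csv")  # hypothetical analysis file

# Flag emerging adults (aged 18-29) versus older respondents
df["emerging_adult"] = df["age"].between(18, 29).astype(int)

# P(nonresponse at follow-up) as a function of age group and covariates
model = smf.logit(
    "nonresponse ~ emerging_adult + C(sex) + C(education) + prior_waves",
    data=df,
).fit()
print(model.summary())
```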


Anchors' Consent to Survey Their Children within a Multi-Actor Survey Design in Self-administered Modes: The Role of Survey-related Factors within a Probability-based Refugee Panel

Dr Jean Philippe Décieux (University of Bonn & Federal Institute for Population Research (BiB)) - Presenting Author
Dr Andreas Ette (Federal Institute for Population Research (BiB))
Dr Ludovica Gambaro (Federal Institute for Population Research (BiB))
Dr Lenore Sauer (Federal Institute for Population Research (BiB))

Refugee children represent a challenging combination of two hard-to-reach populations: children and refugees. Refugees form a relatively small, highly mobile group whose members may encounter language barriers, face stigmatization, be hesitant to identify themselves as refugees, or be shielded by gatekeepers. For children, a central challenge lies in obtaining parental consent, as minors are not legally autonomous. Consequently, parents must first be convinced of the survey's value before their children can participate. Hence, parental consent is a critical factor in surveys involving minors, as parents often view their children as particularly vulnerable.
An interesting approach for addressing this challenge is a multi-actor survey design. However, in self-administered surveys, where interviewers are absent and cannot offer persuasive arguments, securing consent becomes more difficult. To overcome these obstacles and improve anchors' consent rates for their children in self-administered surveys, it is crucial to identify factors that enhance parents' willingness to allow their children's participation. Ideally, these factors should be under the control of researchers, such as elements of the survey design and questionnaire.
This study therefore examines factors in self-administered surveys that influence parental consent for surveying children, as well as the actual participation behavior of the children. Based on data from the 4th wave of the probability-based BiB/FReDA survey “Refugees from Ukraine,” initial findings suggest that both actual and perceived survey duration, overall attitude towards the survey, consistent panel participation, and lower item non-response are associated with higher parental consent rates and higher actual participation of children. Further analysis of survey attitudes indicates that perceptions of the questionnaire as “interesting,” “too long,” or “too personal” had the greatest impact.


The challenges of youth self-completion surveys in a mixed-mode survey

Dr Violetta Parutis (ISER, University of Essex) - Presenting Author
Dr Jonathan Burton (ISER, University of Essex)

Understanding Society: The UK Household Longitudinal Study is now a mixed-mode survey: most adults are invited to complete online first, with a face-to-face or telephone follow-up (“web-first”). This mixed-mode design was introduced at Wave 8 (2016-17).

One consequence of this move to online interviewing is that the response rate for the youth (age 10-15) self-completion has fallen dramatically. This is a concern for longitudinal studies where researchers may want to use data from childhood to analyse future outcomes. There is also a risk that non-response to the youth survey will lead to non-response when the sample member is eligible for an adult interview.

When the survey was face-to-face only, youth response rates were around 75-80%. But where the parental interview is done online, the paper youth self-completion questionnaire is sent to the household. Under this design the youth response rate declined to around 55% at Waves 11-13. We instituted a number of changes to the reminder strategy to try to increase youth response rates.

This presentation explores the effectiveness of inviting young people to complete their annual interview online, describing experiments carried out on Waves 16 and 17 of the Innovation Panel. At IP16 we implemented two experiments: (1) an additional conditional incentive; (2) an information leaflet targeted either at the young person or at their parent. At IP17 we implemented a more child-friendly design for the online survey and experimented with the way in which the young person was invited: the standard way (via a letter addressed to the parent), or with an envelope addressed to the young person included in the letter sent to the parent.

We would welcome information from other studies on optimising surveys aimed at young people.


Data Privacy and Other Concerns: Preliminary Testing of Assent Forms for the National Longitudinal Survey of Youth

Dr Tywanquila Walker (U.S. Bureau of Labor Statistics) - Presenting Author
Dr Robin Kaplan (U.S. Bureau of Labor Statistics)
Ms Rebecca L. Morrison (U.S. Bureau of Labor Statistics)
Ms Safia Abdirizak (U.S. Bureau of Labor Statistics)

The National Longitudinal Survey of Youth (NLSY) is a longitudinal project that collects information on American respondents’ labor market behavior and educational experiences. The survey also covers topics such as income, assets, marital status, fertility, and health. The U.S. Bureau of Labor Statistics (BLS) is planning to begin a new youth cohort of respondents born between 2011 and 2016. Parents and guardians provide consent to allow their children to participate, and youth provide assent to agree to take part in the study. Using qualitative and quantitative approaches, including interviews and intercept testing, we evaluated the youth assent form language over multiple testing rounds. For intercept testing, we spoke to youth in public settings and gathered quick feedback on the form. After each round, we modified the form based on participant feedback.

Our research focused on improving the assent materials to 1) ensure youth understand why they were selected for the survey; 2) explain the commitment to their privacy and data confidentiality; and 3) provide an overview of survey topics, procedures, duration, and frequency. Moderated cognitive interviews were conducted with parents (N=10) and their youth (N=13); parent and youth interviews were conducted separately to allow youth to provide feedback without parental input. Intercept testing was conducted with youth (N=31).

Throughout testing, we examined what information participants found helpful, important, or confusing and assessed differences between younger (ages 11 to 14) and older (ages 15 to 17) youth. We discuss comprehension of the form language, willingness to participate in the study, and youth concerns regarding assent, data use, and privacy. We focus specifically on the youth’s responses to the assent form process and provide insight into the perspectives of younger and older youth.


Engaging children and young people in survey research

Mrs Amy Tallett (Picker Institute Europe) - Presenting Author

Every child has the right to express their views and wishes, and surveys are one method of enabling this. In healthcare settings, it is vital to seek children’s views about their care to ensure services can be planned and delivered in the most suitable way for them.

There are many important considerations when designing surveys for children, to ensure they are appropriate for a younger audience and maximise participation. This presentation will talk through some of those considerations, sharing learning from recent national patient experience surveys in England that the author and her colleagues have designed, such as the Under 16 Cancer Patient Experience Survey (which has been running for four years). Topics to be covered are:

* Survey content. Content should be informed by what is important to children and young people – if we are going to ask for their feedback then we should be asking about what matters to them. This can be achieved by conducting qualitative research with them before the survey is designed.
* Survey design. Surveys must be engaging. Imagery and colour can help make surveys more attractive, and children should be involved in any design decisions.
* Tailoring of design and content to different ages. Different surveys for varying age groups are recommended. The views of parents or carers should also be sought using separate, appropriately designed surveys.
* Cognitive testing. Surveys must be cognitively tested to ensure questions are interpreted as intended. This is particularly important with children who might interpret questions differently to adult researchers.
* Pilot testing. Data collection methods should be trialled, keeping in mind that many children may prefer digital approaches.
* Ethics and accessibility. Accessible survey versions should be considered to ensure children are able to adequately access and respond to the survey.


How does it feel to run a survey 365 days a year? The UK Graduate Outcomes survey experience.

Dr Gosia Turner (The Higher Education Statistics Agency (part of Jisc)) - Presenting Author


The Graduate Outcomes survey is the largest social survey in the UK. Every year it collects information from 900k+ graduates on their activities 15 months post-graduation: whether they are working, studying, or doing something else. If they are in employment, we ask about their job title, employer's name, job duties, and annual salary. This information is used by the UK higher education regulator and various UK government departments; it informs the choices of prospective students in the UK and around the world, features in league tables, and is frequently and widely quoted in the media. The Graduate Outcomes data provide the most reliable and complete picture of graduate employability in the UK.
The Graduate Outcomes survey is a population survey based on the census of UK students collected by the Higher Education Statistics Agency. This session will introduce delegates to our unique methodological approach, including details on the sampling frame and the collection of contact details, mode of collection, continuous fieldwork, engagement strategy, data quality control processes, and the technological solutions used to share data between us and our survey partners, including UK universities.


Youth Nonresponse in the Understanding Society Survey: Investigating the Impact of Life Events

Dr Camilla Salvatore (Utrecht University) - Presenting Author
Dr Peter Lugtig (Utrecht University)
Dr Bella Struminskaya (Utrecht University)

Survey response rates are declining worldwide, particularly among young individuals. This trend is evident in both cross-sectional and longitudinal surveys, such as Understanding Society, where young people exhibit a higher likelihood of either missing waves or dropping out entirely.
This paper aims to explore why young individuals exhibit lower participation rates in Understanding Society. Specifically, we investigate the hypothesis that young people experience more life events, such as a change in job or relationship status or a move of house, and that it is the occurrence of such life events that is associated with a higher likelihood of not participating in the survey.
The data source is Understanding Society, a mixed-mode probability-based general population panel study in the UK. We analyze individuals aged 18-44 at Understanding Society's Wave 1, and we follow them until Wave 12. We consider four age groups: 18-24 (youth), 25-31 (early adulthood), 32-38 (late adulthood) and 39-45 (middle age; the reference group for comparison). To study the effect of life events on attrition, we apply a discrete-time multinomial hazard model, in which time enters as a covariate and the outcome variable is the survey participation indicator (interview, noncontact, refusal or other). The outcome is modeled as a function of lagged covariates, including demographics, labor market participation, qualifications, household structure and characteristics, marital status and mobility, as well as binary indicators for life event-related status changes.
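A discrete-time hazard model of this kind is typically estimated as a multinomial logit on a person-period file (one row per respondent per wave at risk). The sketch below is an illustration under that assumption, not the authors' code; the file and column names are hypothetical, not Understanding Society variables.

```python
# Minimal sketch of a discrete-time multinomial hazard model, fitted as
# a multinomial logit on person-period data. All names are hypothetical.
import pandas as pd
import statsmodels.formula.api as smf

# One row per person per wave; outcome codes the participation
# indicator: 0=interview, 1=noncontact, 2=refusal, 3=other
pp = pd.read_csv("person_period.csv")  # hypothetical analysis file

# Time (wave) enters as a covariate alongside lagged life-event flags
model = smf.mnlogit(
    "outcome ~ C(wave) + C(age_group) + job_change_lag + moved_lag",
    data=pp,
).fit()
print(model.summary())
```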
Consistent with the existing literature, our findings reveal that younger respondents, as well as those with an immigration background, lower education, or unemployed status, are less likely to participate. We also demonstrate that changes in job status and relocation contribute particularly to attrition, with age remaining a significant factor.


Emerging Challenges: Trends in Exclusion and Participation in Large-Scale Assessments in Education

Dr Sabine Meinck (IEA) - Presenting Author
Mr Umut Atasever (IEA)

International large-scale assessments in education (ILSA), such as TIMSS, PIRLS, and PISA, are essential tools for evaluating education systems and informing policy. They rely on unbiased, nationally representative data to enable valid cross-national comparisons. The data quality standards set by ILSA are stricter than those of most education or social science surveys. ILSA aim for 100% participation from randomly selected schools and students but require minimum participation rates of 75% at both stages. Additionally, they seek full coverage of the target population, such as all fourth-grade students in TIMSS, permitting a maximum exclusion rate of 5%. Education systems that fail to meet these standards are flagged in reports with annotations, or their results are reported separately, due to concerns about data reliability.
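As a worked illustration of these thresholds, the check below flags a system against the 75% participation and 5% exclusion rules quoted above; the function and its inputs are my own sketch, not IEA code.

```python
# Minimal sketch of the ILSA quality thresholds quoted above.
# The function name and inputs are illustrative, not IEA code.
def meets_ilsa_standards(school_rate: float,
                         student_rate: float,
                         exclusion_rate: float) -> bool:
    """True if both participation stages reach 75% and exclusions stay at or below 5%."""
    return (school_rate >= 0.75
            and student_rate >= 0.75
            and exclusion_rate <= 0.05)

# Example: 80% school and 72% student participation, 3% exclusions
print(meets_ilsa_standards(0.80, 0.72, 0.03))  # False: student stage below 75%
```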
Most ILSA are conducted cyclically every three to five years, with many established over 25 years ago. While school participation rates have remained relatively stable, recent studies—particularly post-COVID-19—show a decline in student participation rates and an increase in exclusion rates. This raises concerns about the comparability of findings across education systems, as research indicates that nonrespondents and excluded students are often lower achievers within the target populations.
This paper examines trends in participation and exclusion rates across IEA studies, including TIMSS, PIRLS, ICCS, and ICILS. We analyze the potential impact of these trends on the validity of comparative statements about education system performance. Furthermore, we explore strategies to address these challenges, such as methods to improve response rates while respecting diverse national contexts and possibilities to enhance the inclusivity of these studies. By addressing these issues, ILSA can better uphold their rigorous standards and continue to provide valuable insights for educational improvement worldwide.


Who are the nonrespondents? Insights from the Swiss youth panel TREE

Mrs Ellen Laupper (Swiss Federal University for Vocational Education and Training SFUVET) - Presenting Author
Dr Barbara Müller (University of Berne)

Despite increased efforts to counteract declining response rates, nonresponse bias often remains largely stable, making it crucial to better understand who is hard to reach in surveys. While traditional nonresponse bias analyses reveal various differences in sociodemographic variables between respondents and nonrespondents, psychological differences remain underexplored.

This study leverages a large-scale longitudinal representative sample of Swiss 9th graders surveyed at the baseline in classrooms, capturing a rich set of sociodemographic information, cognitive skills, personality traits, and other relevant psychological constructs. We aim to investigate how individuals who refused follow-up participation differ from panel participants regarding psychological characteristics. We also examine how survey participation patterns, such as tendencies for early versus late responses and survey attrition, can be predicted by such psychological traits.
For the analysis, data from the second cohort of the TREE (Transitions from Education to Employment) panel study are used. This cohort was first surveyed in a 2016 baseline at the end of lower secondary education (age 15). The TREE-2 study aims to track approximately 8,000 participants over seven waves of annual telephone interviews and supplementary web surveys, monitoring their educational and labour market trajectories.

Based on response behaviour across the seven follow-up waves (e.g., early response, late response, nonresponse, attrition, and nonparticipation), we identify distinct respondent groups or clusters by using techniques such as latent class analysis and sequence pattern analysis. Multivariate techniques explore the role of personality traits—such as the Big Five, locus of control, self-esteem, risk aversion, trust, and time preferences—alongside sociodemographic variables (e.g., gender, migration background, parental education, and HISEI).
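As one possible concrete rendering of this step, the sketch below clusters seven-wave response-state sequences with hierarchical clustering on Hamming distances, a simple stand-in for the latent class and sequence-pattern analyses named above; the input file, state coding and cluster count are assumptions of mine, not TREE materials.

```python
# Minimal sketch: cluster panel members by their response-pattern
# sequences across seven waves. All file/column names and the state
# coding are hypothetical assumptions, not TREE data definitions.
import pandas as pd
from scipy.cluster.hierarchy import fcluster, linkage
from scipy.spatial.distance import pdist

# One column per wave; states: 0=early response, 1=late response,
# 2=nonresponse, 3=attrition
seq = pd.read_csv("tree2_sequences.csv")  # hypothetical file
waves = [f"wave{i}" for i in range(1, 8)]

# Pairwise Hamming distance between state sequences, then average-
# linkage hierarchical clustering cut into four groups
dist = pdist(seq[waves].to_numpy(), metric="hamming")
groups = fcluster(linkage(dist, method="average"), t=4, criterion="maxclust")
seq["cluster"] = groups

# Modal state per wave within each cluster, as a quick profile
print(seq.groupby("cluster")[waves].agg(lambda s: s.mode()[0]))
```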

Our analysis sheds light on the interplay between psychological and sociodemographic factors in shaping survey participation behaviours, providing a deeper understanding of survey (non)participation.