Surveying Children and Young People 1
Chair: Ms Kate Smith (Centre for Longitudinal Studies)
Coordinator 1: Dr Lisa Calderwood (Centre for Longitudinal Studies)
The Longitudinal Study of Australian Children (LSAC), also known as Growing Up in Australia, commenced in 2004, when parents of approximately 5,000 babies and 5,000 4-5-year-olds were recruited to the study. Now, in 2017, the older cohort is turning 18. Our challenge is to keep these new adults engaged enough to remain in a study that their parents agreed to take part in 14 years ago.
This presentation covers the various methods of engagement that LSAC is trying in order to maintain a robust sample. Examples include:
• The 18th birthday mail-out of a card, a certificate of participation and a gift card, followed by a phone call in which we confirm all contact information and begin promoting the future of the study;
• The changes to methodology being developed for the Wave 8 interview, in which the young adult is interviewed on their own for the first time, and the introduction of an online component prior to the interviewer's visit to the home; and
• Other engagement activities including our first foray into social media, and the making of a thank you video.
We have also recruited a respondent panel to assist us with all of these and other engagement ideas.
We are also grappling with the question of when and how to go back to past refusals and invite them to return to the study. The timing of this approach is complex, as we know that the parent who opted out of the study is probably still residing with the young adult we want to “re-recruit”. How to entice them back into the study is a question we are still working to answer, and we will seek other researchers’ experiences and advice at this session.
The increase in social media use amongst teenagers and young adults offers a new way to track members of a cohort study, and engage these typically harder-to-reach respondents in survey research. This presentation will discuss the use of Facebook and Twitter in tracking and engaging respondents in the Millennium Cohort Study (MCS) and Next Steps.
MCS is a large birth cohort study of young people born in 2000-01. Preparations are currently underway for the seventh sweep of the survey, when cohort members will be 17 years old. At age 17, the young person will be the main respondent for the first time. Given this change in dynamic, it is key to ensure that cohort members are engaged in the study and feel it is relevant to their lives. Social media was first used to engage cohort members in the run-up to the Age 14 Survey.
Next Steps (formerly known as LSYPE 1) is a longitudinal study of young people in England born in 1989/1990. The most recent sweep of data collection took place in 2015-2016 when respondents were aged 25/26. A large gap between the most recent sweep and the prior sweep at age 20 meant extra effort was, and continues to be, needed to re-engage participants in the study. The use of social media platforms offered a way to find and engage study members who had not been contacted for many years.
This paper will present the different ways social media has been used to track cohort members, specifically in Next Steps, and to help engage respondents in both studies. Social media is a platform that these age groups already use to receive their news and other updates. As such, it offers a way to trace and get in touch with respondents whom studies may not be able to reach by other means. Social media also allows us to update cohort members much more frequently than traditional postal mailings. Regular campaigns are run on the studies’ social media platforms to highlight recent findings and draw cohort members’ attention to study news, driving traffic to the study websites. It should be noted that we use social media as a way of informing respondents, and discourage its use as a way for cohort members to get in touch with us.
This paper will also discuss some ethical issues of using social media to track and engage respondents of a cohort study, such as confidentiality and participant disclosure. We will talk about the approaches we take to reduce these risks, including the creation of a social media code of conduct that all staff involved with the studies’ social media accounts must adhere to.
We will discuss the different approaches and challenges specific to each of the studies, and reflect on the successes and limitations of using social media as a participant tracking and engagement tool, using insights from participant website and social media traffic, and summarise the lessons learned.
Strauss and Howe (1991) define the Millennial Cohort as consisting of individuals born between 1982 and 2004. Even though different authors use different definitions, they usually agree that millennials are the first generation to have had access to the internet during their formative years. Moreover, this generation (sometimes called Generation Y) has the lowest rate of high sustained attention: 31%, according to the Microsoft Consumer Insights report “Attention Spans” (2015), versus 34% for the 35-54 cohort and 35% for the 54-and-older cohort.
To involve this generation in survey participation, and to obtain high-quality answers from them, survey designers require new tools. Several approaches have been used. In particular, gamification has emerged as an increasingly popular way to improve the motivation and engagement of young survey respondents (Mavletova, 2015). This approach assumes that including visually appealing or gamified elements can help to improve engagement.
In this study, we focus on emojis (the popular pictographs used in electronic messages) as a tool to make surveys more attractive to millennials. Six billion emojis are used every day, according to SwiftKey. According to an analysis we carried out using data from Twitter, 7,441,058 emojis were sent on Twitter in 24 hours (November 2, 2016). In addition, emojis, which were originally confined to the internet, have spread into the offline world too: there are mugs, t-shirts and all kinds of products featuring emojis. Emojis are used by brands. They are used by political parties in their campaigns. They have become part of everybody's life, online and offline, from birth to old age. For millennials, emojis are thoroughly integrated into the way they communicate. Thus, if we want to make surveys feel more natural to respondents, integrating emojis into surveys seems an interesting option. In addition, emojis are used all over the world, which arguably makes them a truly international language.
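As a rough illustration of how a tweet-based emoji count like the one above could be produced, here is a minimal sketch (our own illustration, not the authors' actual analysis code), assuming tweet texts are already available as plain strings:

```python
import re

# Rough emoji matcher covering the main Unicode emoji blocks.
# A production matcher would follow Unicode Technical Standard #51.
EMOJI_RE = re.compile(
    "["
    "\U0001F300-\U0001F5FF"  # Miscellaneous Symbols and Pictographs
    "\U0001F600-\U0001F64F"  # Emoticons
    "\U0001F680-\U0001F6FF"  # Transport and Map Symbols
    "\U0001F900-\U0001F9FF"  # Supplemental Symbols and Pictographs
    "\u2600-\u27BF"          # Miscellaneous Symbols, Dingbats
    "]"
)

def count_emojis(tweets):
    """Return the total number of emoji characters across tweet texts."""
    return sum(len(EMOJI_RE.findall(text)) for text in tweets)

sample = ["Good morning \U0001F600 \u2600", "No emoji in this one"]
print(count_emojis(sample))  # → 2
```

A count over a real 24-hour window would stream tweets from the Twitter API into the same function.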
To study this idea, we will collect data in Spain and Mexico through an online opt-in panel. We will ask both about millennials' current use of emojis (which emojis are most used, when and why they use them, etc.) and about their willingness to use them in surveys. Moreover, we will pilot questions in which respondents can answer using emojis, and we will examine how this affects data quality and satisfaction with survey participation.
Strauss, W., & Howe, N. (1991). Generations: The History of America's Future, 1584 to 2069. New York: Quill/William Morrow.
Microsoft Canada (2015). Attention Spans. Retrieved December 3, 2016, from: https://advertising.microsoft.com/en/WWDocs/User/display/cl/researchreport/31966/en/microsoft-attention-spans-research-report.pdf
Mavletova, A. (2015). Web surveys among children and adolescents: Is there a gamification effect? Social Science Computer Review, 33(3), 372-398.
Telephone interviewing is an appropriate mode for collecting reports on sensitive topics when respondents can find a private setting in which to complete the interview, and when questions are worded (or a response booklet is used) so that respondents need not give sensitive responses aloud. These conditions may be harder to achieve in telephone interviews with adolescents, for three reasons. First, adolescents may have less control than adults over the presence or interference of others during a telephone interview, increasing the risk that sensitive information will be disclosed to a parent or sibling. Second, the consequences of such disclosure may be uniquely detrimental for adolescents. Third, adolescents' greater tendency, compared to adults, to provide socially desirable responses in survey settings potentially compromises the quality of information on sensitive topics collected in an interviewer-administered telephone interview (Paulhus 1991; Reynolds and Richmond 1978).
Interactive voice response technology (IVR) provides one method to overcome these concerns. In the survey context, IVR technology uses a pre-recorded or computer-generated voice to deliver questionnaire content to respondents and allows respondents to use their telephone keypads to input responses. This method allows participants to respond to sensitive questionnaire content without disclosing their answers directly to an interviewer and without the risk of inadvertent or intentional verbal disclosure to others. Responses are recorded in an electronic database without individually identifying information and the database is delivered securely from the IVR vendor to the survey operations team.
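The keypad-response mechanic described above can be sketched in a few lines. This is a hypothetical illustration, not the study's actual system: the question IDs, valid keys, and in-memory store are all invented, but it shows the core idea of validating keypad input and recording it under an anonymous respondent ID only.

```python
# Valid keypad options per question (invented for illustration).
VALID_KEYS = {
    "Q1": {"1", "2"},     # a yes/no item
    "Q2": set("12345"),   # a 1-5 frequency scale
}

def record_response(store, anon_id, question_id, key):
    """Store a keypad press under an anonymous ID if it is a valid
    option for the question; return False so the caller can replay
    the voice prompt when the key is invalid."""
    if key not in VALID_KEYS[question_id]:
        return False
    store.setdefault(anon_id, {})[question_id] = key
    return True

db = {}
record_response(db, "R0001", "Q1", "2")  # accepted
record_response(db, "R0001", "Q2", "9")  # rejected -> reprompt
print(db)  # → {'R0001': {'Q1': '2'}}
```

Because only the anonymous ID and coded answers are stored, the resulting database can be delivered from the IVR vendor without individually identifying information, as the abstract describes.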
We describe the implementation of an IVR-administered questionnaire with a sample of adolescents aged 12 to 17 who participated in the Panel Study of Income Dynamics 2014 Child Development Supplement (CDS-2014, N=1,100). The questionnaire covered experience of bullying, physical development, sexual activity, drug and alcohol use, and delinquent behavior, and took an average of 18 minutes to complete. Approximately 73 percent of eligible respondents completed the IVR interview. While telephone-ACASI and IVR data collection methods have been used with small regional samples of adults (Beach et al. 2010; Cooley et al. 2000), we are aware of no other national study that has used IVR technology to collect information on sensitive topics from adolescents.
Our presentation covers the following topics:
* Adaptation of ACASI questionnaire content on sensitive topics to the IVR context
* Development of authentication credentials to ensure that the target respondent is the individual completing the questionnaire
* Operational strategies and challenges to connecting the respondent to the IVR instrument
* Training respondents to respond to the IVR questionnaire using their telephone keypads
* Nonresponse
* Breakoffs
* Data quality and completeness
* Respondent perceptions of task difficulty
* Lessons learned and recommendations