All time references are in CEST
Willingness to participate in data collection

Session Organiser | Dr Michael Weinhardt (German Centre of Gerontology)
Time | Wednesday 19 July, 11:00 - 12:30 |
Room | U6-09 |
Researchers find it increasingly difficult to encourage respondents to cooperate at the different steps of the survey process. For example, motivating respondents to participate in surveys has become harder over time, as reflected in declining response rates. However, initial participation is only the first step in data collection. Depending on the type of survey, a range of other steps may be involved, all of which require respondents’ willingness and motivation. Such steps include completing the survey, i.e., providing answers and filling in the questionnaire; fulfilling additional tasks such as keeping diaries or wearing electronic devices; giving informed consent to record linkage; and, last but not least, agreeing to be re-interviewed in the context of panel studies. Factors influencing willingness and motivation may relate to survey characteristics such as topic and interview duration, to the overall survey climate in a given community and society, and to time constraints or survey fatigue at the individual level. To increase motivation, it is essential, on the one hand, to investigate the respondent characteristics that influence the decision to cooperate. On the other hand, we need to identify measures researchers can take to boost respondents’ motivation and willingness to participate at the different stages of data collection, such as monetary incentives or an appealing survey design. The session brings together contributions looking at various aspects of survey participation and at what researchers can do to increase willingness to cooperate in the survey process.
Keywords: respondents' motivation, willingness to participate
Dr Paloma Raggo (Carleton University)
Ms Callie Mathieson (Carleton University) - Presenting Author
The long-standing need to develop and evaluate evidence-based solutions, programs, and services to effectively address complex social and environmental challenges became more urgent following the COVID-19 pandemic. Data collected by governments and researchers on the nonprofit sector and the needs of those it serves are subject to a stark publishing lag, typically being released 18 months after collection, and thus offer limited insight into real-time issues and trends affecting charities’ activities, especially in times of crisis, when charities often serve as front-line responders. Researchers at Carleton University recruited a representative rapid-response panel of over 1,000 charities across the country aimed at providing weekly insights on the needs of the charitable sector in Canada. The randomly selected participants, i.e., the highest-ranking executive available at each organization contacted, agreed to answer weekly surveys about their activities, challenges, and the trends they saw emerging for one full year.
The literature offers limited insights on panel recruitment outside STEM-related fields, and even less on surveying nonprofit organizations and their staff. How do we convince extremely busy, often under-resourced staff of organizations to participate in a year-long study without any monetary compensation? In this paper, we review the various recruitment strategies we tested (email, telephone, mail-in) and reflect on the efficiency of each method. Unexpectedly, initial results suggest that we were able to sustain a response rate of 67% (n=948). Recruitment strategies and commitment varied considerably with the formal nature of the organization, its perception of the nonprofit sector, and our ability to reach people within some of the hard-to-reach target groups. The lessons from our recruiting effort speak to scholars interested in studying hard-to-reach organizations and, more generally, suggest finding innovative ways to engage such participants.
Dr Michael Weinhardt (German Centre of Gerontology) - Presenting Author
Dr Mareike Bünning (German Centre of Gerontology)
Should individuals be contacted again for new waves of a panel study if they have not participated for several waves? We discuss this question on the basis of empirical data from the German Ageing Survey (DEAS). The DEAS was initially conducted every six years, starting in 1996, and every three years since 2008. The survey is based on probability samples of people aged 40 to 85 drawn from municipal registration offices (1996, 2002, 2008, 2014). It also includes a panel component that aims to re-interview all respondents in future waves. So far, all individuals remain in the DEAS sample until they explicitly withdraw panel consent or permanently drop out for other reasons (e.g., permanent illness, moving abroad, death). From a cost-benefit perspective, the question arises to what extent such a strategy should be maintained. On the one hand, some people can be won back for an interview even after many years of panel absence. On the other hand, persons with multiple spells of non-participation are much less likely to participate in future waves. Moreover, it is unclear what added value such cases with long gaps in the data series offer for longitudinal analyses. In this presentation, we investigate whether contacting repeated non-participants disproportionately increases the workload for interviewers and the survey institute (e.g., based on the number of contact attempts) and to what extent the probability of a successful interview can be estimated from the final disposition codes (e.g., whether unsuccessful contact attempts over several waves are a stronger predictor of repeated non-participation than temporary refusals or other reasons for non-participation). We discuss this question especially in the context of studies of older people, who are, for example, more prone to temporary absences due to illness.
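To illustrate the kind of estimation described above, here is a minimal sketch in Python, assuming a hypothetical contact-history file and illustrative column names (the DEAS team's actual variables and models may differ):

    import pandas as pd
    import statsmodels.formula.api as smf

    # Hypothetical data: one row per panelist, with the final disposition
    # code of the previous wave, the number of contact attempts, and the
    # number of waves missed so far. File and column names are assumptions.
    df = pd.read_csv("deas_contact_history.csv")

    # Logistic regression of re-participation on prior fieldwork outcomes:
    # does a history of unsuccessful contact attempts predict repeated
    # non-participation more strongly than a temporary refusal does?
    model = smf.logit(
        "participated_next_wave ~ C(last_disposition)"
        " + n_contact_attempts + n_waves_missed",
        data=df,
    ).fit()
    print(model.summary())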
Dr Julia Koltai (Centre for Social Sciences)
Dr Akos Huszar (Centre for Social Sciences)
Dr Akos Mate (Centre for Social Sciences) - Presenting Author
Ms Zsofia Rakovics (Centre for Social Sciences)
Ms Szilvia Rudas (Centre for Social Sciences)
Dr Bence Sagvari (Centre for Social Sciences)
The main question of our paper is whether and how people's demographic and socio-economic characteristics and smartphone usage behavior influence their willingness to participate in a smartphone application-based data collection in which participants would both fill out a questionnaire and let the app collect data on their smartphone usage.
Our sample, collected in 2021, consists of 1,000 respondents and is representative of Hungarian internet users. The questionnaire included hypothetical scenarios about a smartphone-based data collection, and respondents had to rate the likelihood of their participation in such research.
Our results show that, compared to the youngest age group, older users are less likely to participate in such a data collection, and women are less likely to participate in research collecting digital traces of their everyday lives. In more detailed analyses, we found that, compared to the educated and active group of respondents, for those who are skilled and retired, control over the data collection is less important in the decision to participate, whereas interruption of their smartphone use is a red flag. Regarding the role of the incentive, we observe that those who are educated and retired rely less on the amount of the incentive in their participation decision than those who are educated but active.
Regarding smartphone usage, users who only use the camera and basic options are less likely, while advanced users are more likely, to participate in such research compared to those who typically use their smartphones for social media and entertainment. We also observe that for broad non-social-media users, it is a stronger positive factor if the organizer of the data collection is a research institute, while the size of the incentive and the control over the data collection play a smaller role.
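As an illustration of how such stated-likelihood analyses can be set up, here is a sketch in Python with assumed file and column names (not the authors' actual data or code):

    import pandas as pd
    from statsmodels.miscmodels.ordinal_model import OrderedModel

    # Hypothetical data: stated likelihood of participating (e.g., a 1-5
    # scale) plus demographics and a smartphone usage typology.
    df = pd.read_csv("smartphone_willingness.csv")  # assumed file name

    # Dummy-code the categorical predictors; drop_first sets the reference
    # categories (e.g., the youngest age group, the social-media user type).
    X = pd.get_dummies(
        df[["age_group", "gender", "education", "usage_type"]],
        drop_first=True,
    ).astype(float)

    # Ordered logit of stated participation likelihood on the predictors.
    res = OrderedModel(df["stated_likelihood"], X, distr="logit").fit(method="bfgs")
    print(res.summary())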
Miss Norma Kok (Citizen Leader Lab) - Presenting Author
Miss Diemo Masuko (Citizen Leader Lab)
Miss Thandokazi Dlongwana (Citizen Leader Lab)
Over the years, it has become increasingly difficult to achieve good survey response rates. This stems from a number of reasons, including survey fatigue and lengthy surveys. Various methods have been used to increase survey response rates. Citizen Leader Lab, a non-profit organisation based in South Africa, uses a unique way of increasing survey responses.
Citizen Leader Lab facilitates the Partners for Possibility (PfP) programme to provide leadership development and support to school principals serving under-resourced communities in South Africa. The PfP programme creates partnerships between school principals and business leaders over a 12-month period. These partnerships are actively supported by a Learning Process Facilitator (LPF) who provides individual and group coaching and facilitates regular Community of Practice (CoP) meetings during the 12 months.
Citizen Leader Lab believes in “connection before content”, and therefore connection is established with participants before they are asked to complete surveys. As the LPFs build good relationships with the participants during the programme, they work closely with the monitoring and evaluation (M&E) team to encourage participating school principals and business leaders to complete the mid-year and end-of-year surveys for the PfP programme. When the M&E team administers the online surveys to the participants, the LPFs use coaching conversations to highlight the importance of completing the surveys. They motivate non-respondents during the CoP meetings and on PfP’s WhatsApp groups to complete the surveys. They also give participants the opportunity to complete the surveys during CoP meetings and at the end of coaching sessions. The surveys include a question asking participants for permission to be contacted should the M&E team need further information or in case sensitive information is provided.
Building connections and rapport with survey respondents leads to increased survey responses.
Ms Judith Gilsbach (GESIS-Leibniz Institute for the Social Sciences) - Presenting Author
Mr Joachim Piepenburg (GESIS-Leibniz Institute for the Social Sciences)
Mr Frank Mangold (GESIS-Leibniz Institute for the Social Sciences)
Mr Sebastian Stier (GESIS-Leibniz Institute for the Social Sciences)
Mr Bernd Weiß (GESIS-Leibniz Institute for the Social Sciences)
Linking digital behavioral data, such as web tracking data, with surveys opens new research areas for social scientists. Web tracking data are detailed recordings of study participants’ web browsing behavior; in our study, we record them using a browser plugin. This data collection approach allows for measuring behavior that survey participants tend to recall inaccurately, and it reduces the survey burden. However, previous studies found that participants were reluctant to take part in tracking studies. One effective way to increase participation rates is to offer monetary incentives. Therefore, paying higher incentives than in survey-only studies might be necessary to effectively recruit participants for linkage studies.
Generally, incentives can be granted prepaid (i.e., unconditionally), postpaid (i.e., conditional on participation), or as a combination of both. It is, however, unclear how large conditional postpaid incentives should be and whether unconditional prepaid incentives can additionally increase web tracking participation rates. To answer these questions, we conduct a 2x3 factorial experiment with approximately 3,000 panelists of a new non-probability online access panel. The first factor is whether panelists receive a prepaid incentive of 5 Euro or not. The second factor is the amount of the postpaid incentive (15, 30, or 45 Euro). Participants qualify for the postpaid incentive if they are active on at least 70% of the days during the data collection period.
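For concreteness, here is a minimal sketch of the 2x3 assignment and the qualification rule in Python; variable names and the randomization procedure are illustrative assumptions, not the study's actual implementation:

    import random

    PREPAID = [0, 5]          # factor 1: no prepaid vs. 5 Euro prepaid
    POSTPAID = [15, 30, 45]   # factor 2: postpaid amount in Euro

    def assign_conditions(panelist_ids, seed=2024):
        """Randomly assign each panelist to one of the six cells."""
        rng = random.Random(seed)
        cells = [(pre, post) for pre in PREPAID for post in POSTPAID]
        return {pid: rng.choice(cells) for pid in panelist_ids}

    def qualifies_for_postpaid(active_days, period_days):
        """The postpaid incentive requires activity on at least 70% of days."""
        return active_days / period_days >= 0.70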
We investigate three outcomes: (1) consent to participate in the web tracking study, (2) actual installation of the browser plugin, and (3) the number of active days of sending tracking data. We will present the results of our experiment and discuss the implications of different incentive schemes for data quality. Our research design enhances external validity compared to previous experimental research on the willingness to participate in hypothetical data collections.