
ESRA 2019 full program




Recruiting and Maintaining Probability-Based Online Panel Surveys 2

Session Organisers: Mr Ulrich Krieger (SFB 884, University of Mannheim)
Ms Sabine Friedel (SFB 884, University of Mannheim)
Ms Ines Schaurer (GESIS - Leibniz Institute for the Social Sciences)
Time: Thursday 18th July, 14:00 - 15:30
Room: D21

In recent years, there has been an increasing number of online studies based on probability samples. In addition, traditional large-scale panel surveys have increasingly gone online (e.g. UKHLS, ESS).
Recruiting and maintaining such online panel surveys creates its own unique challenges.

Probability-based online panels aim to combine the best of the offline and online survey worlds. Because no sampling frame of the online population exists, recruitment must take place offline, and each offline recruitment approach comes with its own challenges. Furthermore, the online survey needs to be adjusted constantly to the rapidly changing web and device infrastructure. As most online panel surveys have short intervals between waves, contact procedures for respondents need to be adjusted accordingly.

In this session we invite papers presenting solutions and discussing challenges when recruiting, refreshing or maintaining probability-based online panel surveys. This includes topics such as:
Sampling and contacting respondents for online participation.
Pushing respondents to web participation in online panel studies.
Improving sign-up rates to online panel studies.
Strategies for refreshing the samples of running online panel studies.
Techniques and interventions to prevent panel attrition.
Methods to improve the usability of the online surveys, especially for mobile devices.
In addition, all other topics relevant to the session theme are welcome.

Keywords: probability-based online survey, panel maintenance, panel recruitment

Using Targeted Design to Improve Sample Quality in a Probability-Based Mixed-Mode Panel

Mr Curtis Jessop (NatCen Social Research) - Presenting Author
Ms Klaudia Lubian (NatCen Social Research)


Most survey samples suffer from non-response bias to some extent. Historically, administrators of probability-based surveys have aimed to minimise the risk of non-response bias by maximising survey response rates - the proportion of people invited to take part who do so. However, as response rates continue to decline internationally, the resources required to do so have grown. In addition, response rates are only a proxy for non-response bias; increasing response rates may only succeed in attracting more of the types of people already over-represented in the achieved sample, rather than those who are missing, thereby actually increasing the bias in the sample.

A targeted design approach aims to use fieldwork resources in a more guided manner, increasing response rates among some groups but not others. This paper outlines how a targeted fieldwork design was applied on the NatCen Panel, a probability-based mixed-mode panel in Great Britain, and measures its impact on sample quality. By moving fieldwork resources away from the types of people who are over-represented among those completing surveys and already take part regularly, or who do not take part at all, and towards those who are under-represented among survey completers and take part only sometimes, this approach aimed to improve the sample profile and effective sample size produced by the NatCen Panel while remaining neutral with respect to costs and overall response rates.
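As background (our gloss; the abstract itself gives no formula), the effective sample size referred to above is conventionally computed from the respondents' weights w_i, for example via Kish's approximation:

\[
% Kish's effective sample size: the number of equal-weight
% interviews the weighted sample is worth; equal weights give n.
n_{\mathrm{eff}} = \frac{\left(\sum_{i=1}^{n} w_i\right)^{2}}{\sum_{i=1}^{n} w_i^{2}}
\]

The more a targeted design evens out response propensities across groups, the smaller the variance of the weights and the closer n_eff comes to the nominal n, which is why a cost-neutral reallocation of fieldwork effort can still raise the effective sample size.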


Factors Affecting Participation Rates in Probability-Based Online Panel Research: A Meta-Analysis

Mr Sebastian Kocar (Australian National University) - Presenting Author
Dr Lars Kaczmirek (ANU Centre for Social Research and Methods)
Dr Nicholas Biddle (ANU Centre for Social Research and Methods)

While there are numerous opt-in online panels around the world, the number of probability-based ones is much lower. Some of the most recognised probability-based panels are the LISS Panel (the Netherlands), the GESIS Panel and the German Internet Panel (Germany), the GfK KnowledgePanel (USA), the ELIPSS panel (France), and the NatCen Panel (UK). Some have a long tradition, with several rounds of respondent retirement, sample top-up and refreshment, while others are relatively new.
Since offline recruitment for online panel research is still a relatively new approach in survey methodology, more evidence is needed on how to increase recruitment and participation rates and how to maintain panels so as to decrease attrition rates. The existing probability-based online panels differ in many aspects – from recruitment modes and sample sizes to the frequency and intensity of interviewing. They also differ substantially in participation rates, and there is little evidence as to why. One way to study the factors affecting participation rates is to carry out a meta-analysis of the participation rates of as many eligible probability-based online panels around the world as possible.
My approach is similar to that of several meta-analyses of response rates in online surveys conducted over the last two decades. The purpose of this presentation is to discuss the results of synthesising information from different studies, reports and methodological papers that report participation rates in probability-based online panel surveys. The focus of this research is on all relevant rates as potential sources of bias, i.e. overall response, recruitment and attrition rates. To answer the question of which recruitment and data collection factors affect participation rates, a number of moderators are used: advance letter, recruitment mode, invitation to join the panel and recruitment incentives (to study recruitment rates); and survey incentives, frequency of data collection, survey topics, mixed-mode use and reminders (to study non-response and attrition rates).
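For orientation, and not as part of the authors' abstract: the online-panel literature (e.g. Callegaro and DiSogra, 2008) commonly decomposes the overall response rate of a probability-based online panel into the product of stage-specific rates, which is one natural way to organise the moderators listed above:

\[
% Cumulative response rate of a probability-based online panel:
% each recruitment and fieldwork stage multiplies in its own loss.
\mathrm{CUMRR} = \underbrace{\mathrm{RECR}}_{\text{recruitment rate}} \times \underbrace{\mathrm{PROR}}_{\text{profile rate}} \times \underbrace{\mathrm{COMR}}_{\text{completion rate}}
\]

Under this decomposition, moderators such as recruitment mode, advance letters and recruitment incentives act on RECR and PROR, while survey incentives, reminders and frequency of data collection act on COMR and on attrition across waves.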


Exclusive Recruitment Interview vs. Piggybacking: Comparison of Two Recruitment Strategies for a Probability-Based Self-Administered Panel

Mrs Ines Schaurer (GESIS - Leibniz-Institute for the Social Sciences) - Presenting Author
Mr Bernd Weiss (GESIS - Leibniz-Institute for the Social Sciences)
Mr Kai Weyandt (GESIS - Leibniz-Institute for the Social Sciences)
Mr Michael Blohm (GESIS - Leibniz-Institute for the Social Sciences)

In recent years, several probability-based self-administered panels have been established in the scientific community. A common characteristic of all of these survey infrastructures is the multi-step recruitment process. The majority of them are based on a personal, interviewer-administered interview aimed exclusively at recruiting respondents. As the recruitment interview represents a large share of the overall costs of setting up a panel infrastructure, alternative approaches have recently been tested. One alternative is to use established surveys of the general population as vehicles for recruitment (the piggybacking approach).
Using the German GESIS Panel, a probability-based mixed-mode panel, this paper presents a case study comparing the success of the two recruitment strategies: 1) the exclusive recruitment interview, and 2) the piggybacking approach. The first was applied when the GESIS Panel was initially set up in 2013, the second with the refreshment of the sample in 2016, using the German General Social Survey (ALLBUS) as a vehicle.
We compare the two recruitment strategies with respect to central quality indicators, namely recruitment success, participation rate, and attrition. Furthermore, we compare the selection mechanisms into the panel and evaluate the quality of the recruited samples in terms of biases. This paper adds to the methodological literature on the recruitment processes of probability-based online panels by presenting a first case study that enables the evaluation of the two different strategies.


Experiments with Non-Email Contact Strategies in a UK Probability-Based Online Panel Survey

Mrs Sally Horton (Ipsos MORI) - Presenting Author
Mr Nicholas Gilby (Ipsos MORI)
Miss Madalina Radu (Ipsos MORI)


The Taking Part web panel is the only probability-based online panel survey in the UK commissioned by a Government Department. A random sample of adults and youths is interviewed face-to-face over the course of a calendar year on topics such as culture, leisure and sporting activities. All respondents with internet access are asked to join a web panel, and do so by registering or completing an initial 15-minute survey online. Face-to-face fieldwork, web panel recruitment and web panel fieldwork all take place continuously. Web panel members are invited by email every 90 days to complete a 15-minute device-agnostic quarterly web questionnaire.

In this paper, we look at two experiments with techniques and interventions intended to prevent panel attrition once respondents have joined the web panel. Our objectives are to maximise the number of web panellists and ensure the composition of the web panel is as heterogeneous as possible.

In the first experiment, we use a telephone reminder to encourage “sleepers” to resume web panel participation. “Sleepers” are defined as panellists who join the web panel but never complete a quarterly web questionnaire, or who complete at least one quarterly web questionnaire but then fail to respond for two to four consecutive quarterly surveys. The telephone reminder collects the main reason respondents give for ceasing participation, and the interviewer also encourages them to resume, using a standardised script that explains how the data are used and the incentive offered. In the second experiment, we send a text reminder to a treatment group of non-respondents and compare them with a control group of non-respondents who are not sent a text reminder.

At the conference, we will present findings from each experiment as well as our recommendations for engaging respondents on a probability-based online panel survey using contact modes other than email.