ESRA 2019 full program


Recruiting and Maintaining Probability-Based Online Panel Surveys 1

Session Organisers: Mr Ulrich Krieger (SFB 884, University of Mannheim)
Ms Sabine Friedel (SFB 884, University of Mannheim)
Ms Ines Schaurer (GESIS – Leibniz Institute for the Social Sciences)
Time: Thursday 18th July, 09:00 - 10:30
Room: D21

In recent years, the number of online studies based on probability samples has been increasing. In addition, traditional large-scale panel surveys have increasingly gone online (e.g. UKHLS, ESS). Recruiting and maintaining such online panel surveys creates its own unique challenges.

Probability-based online panels aim to combine the best of the offline and online survey worlds. Because no sampling frame of the online population exists, recruitment typically has to take place offline, and the various offline recruitment approaches each come with their own challenges. Furthermore, the online survey needs to be adjusted continuously to the rapidly changing web and device infrastructure. And because most online panel surveys have short intervals between waves, contact procedures for respondents need to be adjusted accordingly.

In this session we invite papers presenting solutions and discussing challenges when recruiting, refreshing or maintaining probability-based online panel surveys. This includes topics such as:
Sampling and contacting respondents for online participation.
Pushing respondents to web participation in online panel studies.
Improving sign-up rates to online panel studies.
Strategies for refreshing the samples of running online panel studies.
Techniques and interventions to prevent panel attrition.
Methods to improve the usability of the online surveys, especially for mobile devices.
In addition, all other topics relevant to the session theme are welcome.

Keywords: probability-based online survey, panel maintenance, panel recruitment

What Are the Most Effective Strategies of Web-Push in a Probability-Based Panel?

Mr David Bretschi (GESIS – Leibniz-Institute for the Social Sciences) - Presenting Author
Dr Ines Schaurer (GESIS – Leibniz-Institute for the Social Sciences)
Dr Don Dillman (Washington State University)

In recent years, web-push strategies have been developed in several cross-sectional mixed-mode surveys in order to improve response rates and reduce the costs of data collection. However, pushing respondents into the more cost-effective web option has rarely been examined in the context of panel surveys. This study evaluates how different web-push strategies affect the willingness of mail-mode respondents in a mixed-mode panel to switch to the web.

We conducted a randomized web-push experiment in the October/November 2018 wave of the GESIS Panel, a probability-based mixed-mode panel in Germany (n=5,736). We used an incompletely crossed experimental design with two factors: A) the timing of presenting the web option and B) prepaid vs. promised incentives. We randomly assigned 1,986 mail-mode panelists to one of three conditions:
1) the web option was offered concurrently with the paper questionnaire, with a promised 10 € incentive for completing the survey on the web,
2) the web option was presented sequentially two weeks before the paper questionnaire was sent, and respondents were also promised a 10 € incentive,
3) the same sequential approach as in group 2, but with a prepaid 10 € incentive instead of a promised one.
We examine how the conditions differ in the web response rate of mail-mode respondents and in the proportion of respondents who agreed to switch to the web mode for future waves.
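
For illustration only, a minimal Python sketch of such a three-group random assignment (the equal allocation, the seed, and all names are assumptions for the example, not details taken from the study):

    # Sketch: random assignment of mail-mode panelists to the three web-push conditions.
    # Group labels, equal allocation, and the seed are illustrative assumptions.
    import random

    random.seed(2018)

    # Factor levels per condition in the incompletely crossed design
    CONDITIONS = {
        1: {"timing": "concurrent", "incentive": "promised 10 EUR"},
        2: {"timing": "sequential", "incentive": "promised 10 EUR"},
        3: {"timing": "sequential", "incentive": "prepaid 10 EUR"},
    }

    def assign(panelist_ids, n_groups=3):
        """Shuffle panelists and split them (near-)evenly across the conditions."""
        ids = list(panelist_ids)
        random.shuffle(ids)
        return {pid: (i % n_groups) + 1 for i, pid in enumerate(ids)}

    assignment = assign(range(1986))  # 1,986 mail-mode panelists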

Contrary to our expectations, the results show that prepaid incentives do not improve the web response rate compared to promised incentives. We did find, however, that presenting the web option sequentially significantly increases the web response rate for the single wave, as opposed to offering the web mode concurrently. This difference between the experimental groups decreases among respondents who agreed to switch to the web mode for future surveys.


Push-to-Web Recruitment of a Probability-Based Online Panel: Experimental Evidence

Dr Carina Cornesse (SFB 884, University of Mannheim)
Professor Annelies Blom (Department of Political Science and SFB 884, University of Mannheim)
Dr Barbara Felderer (SFB 884, University of Mannheim)
Mrs Marina Fikel (SFB 884, University of Mannheim)
Dr Ulrich Krieger (SFB 884, University of Mannheim) - Presenting Author

Past research has shown that pushing respondents to the web is a successful way to increase response rates, reduce data collection costs, and produce representative outcomes. However, studies in that literature are usually limited to cross-sectional surveys of small and homogeneous target populations. Our study moves beyond this limited scope to a broad and, so far, unique application: we investigate the relative success of pushing respondents to the web compared to alternative survey design strategies across the recruitment stages of a probability-based online panel. To do this, we implemented a large-scale experiment in the 2018 recruitment of the German Internet Panel (GIP).

In this experiment, we sampled 12,000 individuals and randomly assigned each individual to one of four experimental groups:
1) online-only: individuals received a mail invitation to participate in the web version of the GIP recruitment survey; nonrespondents were followed up with further invitations to the web version.
2) online-first: individuals received the same invitation letter as the online-only group asking them to participate in the web version; nonrespondents, however, were followed up with a reminder letter containing a paper-and-pencil version of the GIP recruitment survey.
3) offline-first: individuals received the paper-and-pencil questionnaire with the initial invitation letter and were followed up with invitations to the web version of the GIP recruitment survey.
4) concurrent-first: individuals were given the choice between participating in the web version or the paper-and-pencil version; nonrespondents were then followed up with invitations to the web version.
In our presentation, we will show the results of this experiment and discuss our findings.


Refreshment of the Life in Australia™ Panel

Dr Benjamin Phillips (The Social Research Centre) - Presenting Author
Dr Dina Neiger (The Social Research Centre)
Mr Andrew Ward (The Social Research Centre)
Mr Darren Pennay (The Social Research Centre)

Life in Australia™ is Australia’s only probability-recruited mixed-mode online panel. Initial recruitment was via dual-frame RDD. Offline panellists are accommodated via telephone surveys. We describe the design and weighting procedures of the first refreshment of the panel: replenishment with new panellists, removal of inactive panellists and updating of panel profiles for existing panellists. Our focus is on lessons learned.

Sample design for the recruitment of new panellists aimed to address the under-representation of younger adults, the tertiary-educated, and males. The replenishment sample was single-frame mobile RDD only, as the resultant respondent profile is generally younger and more gender-balanced than with landline RDD, although also more highly educated. To address the under-representation of those without tertiary qualifications, the intention was to sub-sample respondents with a tertiary-level education; due to a programming error, this was implemented as a quota instead.

Inactive panellists were also removed at the time of the replenishment. These were defined as those who had participated in fewer than three surveys since the inception of the panel and had not completed a survey in the most recent six waves.
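
As a sketch of how such an inactivity rule might be operationalised (the data layout and column ordering here are assumptions; the abstract does not describe the authors' implementation):

    # Sketch: flag inactive panellists from a panellist-by-wave completion matrix.
    # 'completed' is assumed to be a 0/1 DataFrame with one row per panellist and
    # one column per wave, ordered from oldest to newest wave.
    import pandas as pd

    def flag_inactive(completed: pd.DataFrame,
                      min_completes: int = 3, recent_waves: int = 6) -> pd.Series:
        few_completes = completed.sum(axis=1) < min_completes             # fewer than 3 surveys overall
        none_recent = completed.iloc[:, -recent_waves:].sum(axis=1) == 0  # none of the last 6 waves
        return few_completes & none_recent

    # Example: drop flagged panellists from the active panel
    # panel = panel.loc[~flag_inactive(completed)]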

Weighting was conducted in three steps. First, dual-frame design weights were calculated, including both the original sample and the replenishment. Second, weights were adjusted using propensity-score classes, with continuing panellists treated as respondents and inactive and withdrawn panellists treated as non-respondents in the response propensity model (Valliant, Dever and Kreuter 2013); these weights were then trimmed. Third, the adjusted design weights were calibrated to population distributions on key variables to account for non-response. (Additional steps take place for wave-level weights.) We describe the advantages and disadvantages of this procedure.
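
A compressed Python sketch of the second and third steps, propensity-class non-response adjustment followed by calibration via iterative raking; the data structure, model specification, number of classes, and trimming rule are illustrative assumptions, not the authors' implementation:

    # Sketch: propensity-class non-response adjustment and raking calibration.
    # 'df' is an assumed DataFrame with design weights ('d_wt'), a continuing-panellist
    # indicator ('active'), and categorical auxiliary variables; illustrative only.
    import numpy as np
    import pandas as pd
    from sklearn.linear_model import LogisticRegression

    def propensity_class_adjust(df, predictors, n_classes=5, trim_q=0.99):
        X = pd.get_dummies(df[predictors], drop_first=True)
        p = LogisticRegression(max_iter=1000).fit(X, df["active"]).predict_proba(X)[:, 1]
        classes = pd.qcut(p, n_classes, labels=False, duplicates="drop")
        resp_wt = df["d_wt"].where(df["active"], 0.0)
        # Within each class, transfer non-respondents' design weight to respondents
        factor = df["d_wt"].groupby(classes).transform("sum") / resp_wt.groupby(classes).transform("sum")
        wt = np.where(df["active"], df["d_wt"] * factor, 0.0)
        return np.minimum(wt, np.quantile(wt[wt > 0], trim_q))  # crude trimming at the 99th percentile

    def rake(df, wt, margins, iters=50):
        """Calibrate weights to population margins, e.g. {'age_group': {'18-34': 3.1e6, ...}}."""
        wt = wt.copy()
        for _ in range(iters):
            for var, targets in margins.items():
                for level, target in targets.items():
                    mask = (df[var] == level).to_numpy()
                    if wt[mask].sum() > 0:
                        wt[mask] *= target / wt[mask].sum()
        return wt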