Overcoming the Digital Divide in Web Surveys
Session Organisers |
Dr Carina Cornesse (University of Mannheim)
Ms Jessica Herzing (LINES/FORS, University of Lausanne)
Dr Lars Kaczmirek (University of Vienna and Australian National University)
Time: Tuesday 16th July, 14:00 - 15:30
Room: D20
In many parts of the world, the internet is now available to nearly everybody, almost anywhere and at any time. However, not everybody is willing and able to use it. Reasons for not using the internet include a lack of internet access (mostly in rural areas), data protection concerns and fear of online fraud, as well as low digital literacy. Furthermore, even people who are generally willing and able to use the internet vary with regard to the devices they use, the duration and frequency of their internet use, the variety of their internet activities, and their purpose of use (e.g., only for work or only for private purposes).
The literature commonly refers to this phenomenon as the “digital divide”, which is associated with inequalities between people at different socioeconomic levels or with differing demographic characteristics. In addition, the digital divide differentiates people with regard to digital devices, digital access, and digital usage. Hence, the digital divide poses a challenge to the survey landscape in general and to web surveys in particular. For example, undercoverage of people without internet access threatens the generalizability of survey findings to the general population. In addition, advanced technical features such as slider scales, drop-down lists, or avatars can confuse respondents with low technical skills and increase measurement error.
Survey methodologists are therefore working on overcoming the digital divide in web surveys. For example, to prevent undercoverage of the offline population, researchers apply strategies such as providing respondents with the necessary equipment to participate online or implementing mixed-mode designs to include people who cannot or do not want to participate online. Regarding measurement error, researchers are testing approaches that simplify responding and reduce errors, such as providing lookup databases and web probing.
For this session, we invite submissions from researchers who apply and test approaches to overcoming the digital divide in web surveys. We especially encourage submissions of papers that compare different approaches to overcoming the digital divide in web surveys (e.g., mixed-mode approaches, weighting). Furthermore, we are interested in submissions that take into account and empirically disentangle different types of errors due to the digital divide in web surveys (for example, coverage error and measurement error).
Keywords: online survey, offliner, mixed mode, IT literacy, digital literacy, measurement, nonresponse, coverage, human-computer interaction
Dr Ruben Bach (University of Mannheim) - Presenting Author
Dr Carina Cornesse (University of Mannheim)
Online panel surveys are often criticized for their inability to cover the offline population. In response to this criticism, several probability-based online panels equip offline households with an internet connection and a simple device to close this gap.
Previous research has shown that offline individuals differ from online individuals on several socio-demographic characteristics, such as age, education, and household size. Unsurprisingly, including offline individuals in online panel surveys increases the overall representativeness of the online panel regarding socio-demographic characteristics.
Much less is known, however, about whether recruiting offline individuals into an online panel leads to substantial changes in actual survey estimates. That is, it is unclear whether estimates derived from the survey data are affected by the socio-demographic differences between online and offline individuals. Previous research (e.g., Eckman 2016) indicates that substantive conclusions drawn from the survey data of online individuals only do not differ much from conclusions drawn from the combined data of offline and online individuals. That is, although offline and online individuals differ on several socio-demographic characteristics, these differences have little effect on substantive conclusions drawn from the actual survey data.
Against this background, we investigate how the inclusion of the offline population in the German Internet Panel (GIP) affects various estimates drawn from actual survey data, such as political interest, life satisfaction, and religiosity. While we replicate the finding that some socio-demographic covariates differ between offliners provided with online access and native onliners, we do not find that survey estimates derived from native onliners alone differ from estimates derived from the combined groups. That is, there is no evidence that equipping otherwise offline households with online access improves the estimates derived from native online households alone.
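To make this kind of comparison concrete, the following is a minimal sketch (in Python) of how an estimate from native onliners alone might be set against estimates including equipped offliners. The file and column names (gip_wave.csv, native_onliner, life_satisfaction) are hypothetical placeholders, not the GIP's actual data structure, and the sketch is an illustration rather than the authors' analysis.

    # Minimal sketch: compare an estimate computed from native onliners only
    # with the same estimate from the combined sample (native onliners plus
    # previously offline respondents equipped with internet access).
    # All file and column names below are hypothetical.
    import pandas as pd
    from scipy import stats

    gip = pd.read_csv("gip_wave.csv")  # hypothetical panel data extract

    onliners = gip.loc[gip["native_onliner"] == 1, "life_satisfaction"].dropna()
    offliners = gip.loc[gip["native_onliner"] == 0, "life_satisfaction"].dropna()
    combined = gip["life_satisfaction"].dropna()

    print(f"Native onliners only: mean = {onliners.mean():.2f}")
    print(f"Combined sample:      mean = {combined.mean():.2f}")

    # Welch's t-test between the two non-overlapping groups is one simple
    # check of whether including equipped offliners shifts the estimate.
    t, p = stats.ttest_ind(onliners, offliners, equal_var=False)
    print(f"Onliners vs. equipped offliners: t = {t:.2f}, p = {p:.3f}")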
Dr Tom Emery (NIDI) - Presenting Author
Professor Ivan Cipin (University of Zagreb)
The potential benefits of web-based surveys have seen limited application in cross-national surveys: internet penetration is far lower in some countries, and applying a push-to-web design across such a survey risks conflating cross-national differences with mode effects. In this presentation, we examine the challenges involved in deploying a push-to-web design in a country with limited experience of such designs, namely Croatia. The survey deployed is the Generations and Gender Survey, and the target sample population is aged 18-49.
Internet penetration in Croatia is around 70% (Eurostat, 2018), and there are underlying issues of trust in data infrastructure, polling institutions, and the research community. Furthermore, the web survey was deployed shortly after the GDPR came into force in Croatia, which raised awareness of data protection issues and introduced further complications into the data collection process. As part of the fieldwork, an experiment was conducted that compared the spacing of reminders in a push-to-web setting: half the sample received sequential reminders one week apart, and the other half received reminders two weeks apart. After three letters, respondents were followed up face-to-face by interviewers.
The fieldwork is nearing completion at the time of writing, but the results have been promising, with the overall response rate well above what is considered average for push-to-web designs. The presentation examines the results of the experimental protocols, but also general nonresponse in Croatia and how response rates varied across the country and the population. This is achieved by combining the survey data with a wide range of auxiliary and contextual data to provide a detailed picture of how a nationally representative web survey can be deployed in a country that is traditionally considered challenging for cross-national surveys.
Mr Joel Williams (Kantar Public) - Presenting Author
It is increasingly common for researchers to survey the general population using a variant of the ‘push-to-web’ design popularised by Don Dillman. One of these is the Community Life Survey, an annual UK Government study of >10,000 adults living in England, focusing on neighbourhood interactions, volunteering (formal and informal), charitable giving, and civic engagement. It uses an address sampling frame and a two-mode – web and paper – data collection system.
This paper seeks to answer the question: what difference does the offer of a paper questionnaire make to the survey results?
To answer this, we will show:
* what sorts of people are most likely to use the paper questionnaire rather than the web questionnaire,
* how different they are with respect to the survey topic when compared with demographically similar web respondents,
* whether such people can be ‘targeted’, given the limitations of an address sampling frame, and
* whether the web and paper data can safely be combined or are subject to between-mode measurement effects (see the sketch below).
We finish with a set of recommendations for evolving the survey design to reflect these findings.
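As a concrete illustration of the final bullet, the sketch below shows one common type of between-mode check: regressing a key survey outcome on a mode indicator while controlling for the demographics on which web and paper respondents differ, so that compositional differences are not mistaken for measurement effects. All file and column names are hypothetical, and this is a generic approach, not necessarily the authors' own method.

    # Minimal sketch of a between-mode measurement check. A mode coefficient
    # near zero, after demographic controls, is (weak) evidence that the web
    # and paper data can be combined. Column names are hypothetical.
    import pandas as pd
    import statsmodels.formula.api as smf

    cls = pd.read_csv("community_life.csv")  # hypothetical combined-mode file

    model = smf.logit(
        "volunteered ~ C(mode) + age + C(sex) + C(education) + C(region)",
        data=cls,
    ).fit()
    print(model.summary())
    # A substantively large C(mode)[T.web] (or [T.paper]) coefficient would
    # point to between-mode measurement effects rather than composition alone.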
Mr Joe Rose (National Centre for Social Research) - Presenting Author
The delivery of face-to-face (FtF) fieldwork on social research surveys is increasingly confronted with the challenges of sample apathy and interviewer panel management. Consequently, response rates across FtF surveys have declined over the past decade, in the UK and globally.
In response to this, web surveys are touted as a viable alternative in some situations. However, web surveys commonly achieve lower response rates than FtF surveys, an issue that must be overcome for a web survey to be a viable alternative to its FtF equivalent.
Intake24 is an online food diary which respondents to the Scottish Health Survey (SHeS) are asked to complete at the end of their SHeS interview, and again within 10 days of the interview. In 2018, respondents were sent an instant email containing a link giving direct access to their individual Intake24 food diary, without the need to type in a URL, username, or password. Respondents were also given a generic URL and a unique username and password as an alternative means of accessing their diaries.
This paper assesses the impact of the instant email on response propensity for Intake24. Overall response to Intake24 in 2018 is compared with the 2015 sweep, when no instant email was used. In addition, paradata are used to assess the proportions of 2018 respondents who received an instant email, opened their email(s), and used their link to access their food diaries (including how soon after receiving the email). The data are augmented by responses to a respondent feedback questionnaire and broken down by demographic characteristics. The findings are discussed, and proposals for further research on web survey response rates and propensities are presented.
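As an illustration of the kind of paradata breakdown described above, the following minimal sketch computes the share of respondents at each step from receiving the instant email to completing the diary. The file and indicator column names are hypothetical placeholders, not the actual SHeS paradata.

    # Minimal sketch: a step-by-step breakdown from hypothetical paradata,
    # where each column is a 0/1 indicator for one stage of the process.
    import pandas as pd

    para = pd.read_csv("intake24_paradata.csv")  # hypothetical paradata file

    steps = ["email_received", "email_opened", "link_clicked", "diary_completed"]
    for step in steps:
        print(f"{step}: {para[step].mean():.1%} of {len(para)} respondents")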
Dr Barbara Felderer (University of Mannheim) - Presenting Author
Ms Jessica Herzing (University of Lausanne)
In the absence of email lists of the general population, probability-based online surveys ensure representativeness by recruiting respondents offline, for example by sending invitation letters via postal mail. To gain information about people without internet access and less IT-savvy people, who will likely not participate in the online panel, these mail invitation letters can include paper questionnaires. Hence, a mixed-mode postal recruitment design can inform about potential nonresponse error.
In 2018, the German Internet Panel (GIP) used a mixed-mode postal recruitment design to recruit new respondents into the panel. In a first step, sample cases received an invitation letter including 1.) a URL and personalized login information for the first part of a short recruitment web survey, 2.) a paper questionnaire containing the same questions as the web survey, or 3.) both the login information and a paper questionnaire. All respondents to the first part (online or paper) were then invited to the second part of the questionnaire, which was provided online only. Both parts of the survey include questions on internet usage and experience. The information collected from the paper questionnaire is particularly useful for learning about respondents who are in principle willing to participate in the survey but unable to continue it online.
Our research questions are:
1.) Can the information about IT savviness in the paper questionnaire be used to create survey weights that compensate for the dropout of less IT-savvy respondents in the online part of the survey (see the sketch after this list)?
2.) For those who complete both parts of the online questionnaire: does IT savviness affect panel attrition in subsequent waves?
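To illustrate the weighting idea behind the first research question, the sketch below models the propensity to continue to the online-only second part as a function of IT savviness (and a demographic covariate) measured in the first part, and weights second-part respondents by the inverse of that propensity. Variable and file names (gip_recruitment.csv, it_savvy, completed_part2) are hypothetical, and the sketch omits the trimming and calibration that real survey weights would require.

    # Minimal sketch of inverse-propensity weighting for online dropout.
    # All file and column names below are hypothetical placeholders.
    import pandas as pd
    from sklearn.linear_model import LogisticRegression

    recruits = pd.read_csv("gip_recruitment.csv")  # hypothetical part-1 data

    X = recruits[["it_savvy", "age"]]
    y = recruits["completed_part2"]  # 1 = continued online, 0 = dropped out

    propensity = LogisticRegression().fit(X, y).predict_proba(X)[:, 1]
    recruits["weight"] = 1.0 / propensity

    # Weights are applied only to those who completed the second part; in
    # practice they would be trimmed and calibrated to population benchmarks.
    part2 = recruits[recruits["completed_part2"] == 1]
    est = (part2["weight"] * part2["it_savvy"]).sum() / part2["weight"].sum()
    print(f"Weighted mean IT savviness among part-2 respondents: {est:.2f}")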
Our findings are relevant for survey practitioners who want to recruit respondents into an online panel, as they can contribute to a better understanding of the sources of nonresponse error.