All time references are in CEST
Overcoming challenges in mobile questionnaire design 2
Session Organiser | Ms Joanna d'Ardenne (NatCen Social Research)
Time | Thursday 20 July, 09:00 - 10:30
Room | U6-01c
Research commissioners are increasingly interested in offering an online alternative to surveys that have historically relied on face-to-face data collection. Transitioning a CAPI survey to an online survey can pose practical challenges when it comes to questionnaire design. One challenge is the principle of mobile-first design. CAPI questionnaires will have been designed to be completed by a trained interviewer using a large-screen device. Web questionnaires, in contrast, must be accessible to novice users, including those participating on small-screen devices such as smartphones. Poor mobile design can increase respondent burden and frustration, which in turn increases the risk of missing data and break-off.
Although it is straightforward to render simple questions on mobile screens, some CAPI questions require redesign work to make them smartphone-appropriate. In this session we invite survey practitioners to showcase any work they have undertaken to develop mobile versions of more challenging CAPI questions. Examples could include, but are not limited to, the following:
- Questions that include lengthy text or complex instructions
- Household enumeration questions
- Grid-based or looped questions
- Questions that make use of hidden codes
- Event History Calendars
We invite practitioners to present findings on their redesign process, including which designs worked well, which did not, and the methods used to pre-test their refined mobile-first questions.
Keywords: Questionnaire, mobile, pre-testing, usability, web, online, mode, mode transition
Ms Deirdre Giesen (Statistics Netherlands) - Presenting Author
Dr Maaike Kompier (Statistics Netherlands)
Professor Jan van den Brakel (Statistics Netherlands/Maastricht University)
Smartphones are increasingly used to complete online surveys. Whereas in 2017 fewer than 10% of first logins in Statistics Netherlands household surveys were made on a smartphone, by 2022 this share had risen to about 35%. Designing questionnaires for smartphones means designing for a smaller screen and for touch-screen use. Yet, as different respondents use different devices, questionnaire designers also need to consider how questionnaires will look and function on these various types of devices and how this may affect the response process.
Statistics Netherlands uses standard style sheets for online surveys. These style sheets define the visual presentation of the questionnaire, such as the use of colours, fonts and the various question formats. Currently, we are developing a new style sheet that uses the smartphone-first principle: starting with the design requirements of a smartphone and then expanding the design to larger screens. The main goal is to develop a user-friendly style sheet that reduces device differences as much as possible.
In the first quarter of 2023 a large-scale field experiment will be conducted (N = 12,000) in which various alternatives of the new style sheet will be compared with each other as well as with the current style sheet. In this presentation, we will present the first results concerning two of the treatments in this experiment: the presentation of grid questions and the use of an alternative login screen. For these design alternatives, we will compare response rates, break-off rates, several quality indicators (such as straightlining) and the respondents' subjective evaluation of the surveys.
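To illustrate how quality indicators of this kind are often computed, the sketch below derives a simple straightlining flag and a break-off rate from respondent-level data. The data frame, column names and indicator definitions are hypothetical and are not taken from the Statistics Netherlands experiment.

```python
# Illustrative sketch only: the data, column names and indicator definitions
# below are hypothetical and do not come from the Statistics Netherlands experiment.
import pandas as pd

# Each row is one respondent; grid_q1..grid_q5 hold answers to a five-item
# grid question, and completed=False marks a break-off before the final page.
df = pd.DataFrame({
    "respondent_id": [1, 2, 3, 4],
    "grid_q1": [3, 5, 2, 4],
    "grid_q2": [3, 5, 1, 4],
    "grid_q3": [3, 5, 2, None],  # one missing item answer
    "grid_q4": [3, 5, 4, 4],
    "grid_q5": [3, 5, 2, 4],
    "completed": [True, True, True, False],
})

grid_items = [f"grid_q{i}" for i in range(1, 6)]

# Simple straightlining flag: identical answers on all answered items of the grid.
df["straightlined"] = df[grid_items].nunique(axis=1, dropna=True).eq(1)

# Break-off rate: share of respondents who started but did not complete the survey.
breakoff_rate = 1 - df["completed"].mean()

print(df[["respondent_id", "straightlined"]])
print(f"Straightlining rate: {df['straightlined'].mean():.0%}")
print(f"Break-off rate: {breakoff_rate:.0%}")
```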
Dr Antarika Sen (Senior Research Manager at Milieu Insight) - Presenting Author
With surveys increasingly being taken on mobile phones, it is imperative to re-examine how we think about and design surveys. Sitting across from and speaking to a human interviewer is a very different experience from answering questions on a small screen for the same amount of time, and poorly designed surveys can alienate respondents or, worse, harm the quality of their answers.
Here, I discuss findings from two experiments:
1. How does survey length affect response quality in mobile surveys, and is there an optimal survey length beyond which the quality of responses starts to deteriorate? The question of optimal survey length is not new. However, many previous experiments have focused on the link between survey length and drop-off rates. What we feel is missing is an exploration of the relationship between survey length and actual data quality in mobile surveys. Why does this matter? Focusing on drop-off rates overlooks potential data quality issues in long surveys among those who did not drop off. This is particularly relevant to online panel companies, where the core value exchange is typically rewards (via a redeemable points system) in return for survey participation. I discuss how survey length impacts response randomization (a drop in attention levels), the quality of open-ends, acquiescence bias, and the survey-taking experience (frustration levels across different survey lengths).
2. The second experiment focused on whether and how grid questions affect data quality and the survey-taking experience in mobile surveys. Grid questions are enticing to use, and understandably so, but they are not optimised for a smartphone screen. I discuss the impact on selection rates in multi-select questions, poor survey-taking behaviour, and respondent frustration levels.
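As a rough illustration of how a format comparison like this might be analysed, the sketch below compares selection rates for a multi-select option between a grid condition and an item-by-item condition. All counts and variable names are invented for the example and do not reflect the Milieu Insight experiments.

```python
# Illustrative sketch with invented counts; not Milieu Insight's data or analysis.
from scipy.stats import chi2_contingency

# Respondents who selected a given option vs. did not, by question format.
# Rows: grid format, item-by-item format; columns: selected, not selected.
counts = [
    [320, 680],  # grid format: hypothetical 32% selection rate
    [410, 590],  # item-by-item format: hypothetical 41% selection rate
]

chi2, p_value, dof, expected = chi2_contingency(counts)

grid_rate = counts[0][0] / sum(counts[0])
item_rate = counts[1][0] / sum(counts[1])

print(f"Grid selection rate:         {grid_rate:.1%}")
print(f"Item-by-item selection rate: {item_rate:.1%}")
print(f"Chi-square test of independence: chi2 = {chi2:.2f}, p = {p_value:.4f}")
```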
Dr Vivian Meertens (Statistics Netherlands) - Presenting Author
Dr Jeldrik Bakker (Statistics Netherlands)
Dr Margreet Geurden-Slis (Statistics Netherlands)
Although online surveys are not always well suited to small screens and mobile navigation, the number of respondents who start online surveys on a mobile device rather than on a PC or laptop is still growing. Statistics Netherlands has responded to this trend by developing and designing mixed-device surveys. In recent years, we redesigned the Education Survey using a smartphone-first approach.
The Dutch Education Survey is a yearly web-only survey among teenagers and young adults with a sample size of about 180,000. It was originally designed as a paper questionnaire. Since 2018, we have observed that many respondents tried to complete the survey on a smartphone, even though it was not designed for a small screen. As a consequence, many of these respondents dropped out instead of switching to a different device, and response rates declined. We therefore decided at Statistics Netherlands to redesign this survey according to a Smartphone First (SF) approach.
In this presentation we will discuss the steps taken in this research project to redesign the Dutch Education Survey according to the SF approach. These include changes to the questionnaire and visual design, the approach and communication strategy, pre-testing, piloting, and adaptations of output requirements. We will share the lessons learned during this redesign process and reflect on the outcomes, data quality aspects, and the implications and challenges for the future of social surveys using an SF approach.
Mr Daniel Knapp (Federal Statistical Office of Germany (Destatis)) - Presenting Author
Mrs Karen Blanke (Federal Statistical Office of Germany (Destatis))
The future of data collection in social surveys is an ongoing discussion, both at international and national levels. With online data collection becoming the preferred option in recent years (an “Online First” mode strategy) and an ever-increasing number of mobile users in online surveys, Destatis has strengthened its efforts towards a respondent-centered survey design.
This includes the introduction of mobile apps as a novel mode of data collection (Time-Use-Survey 2022 being the first survey to feature an app for respondents, followed by the Household Budget Survey in 2023) as well as an ongoing project to completely redesign our online surveys towards a fully responsive layout.
The goal of this redesign was to achieve a survey interface that not only accommodates mobile users but also increases general user-friendliness and provides a positive user experience, thereby increasing respondents’ motivation to participate and thus improving response rates. To accomplish such a questionnaire design, several iterative functionality and usability pretests have been conducted at Destatis.
This presentation will demonstrate the steps undertaken and the thought processes behind implementing the apps, and then present both general design recommendations for mobile questionnaires and the lessons we learned on specific aspects such as diaries, tables, and error-message design.