
ESRA 2025 Preliminary Program

All time references are in CEST

UX, user-centred, and respondent-centred design: its application and development 2

Session Organisers: Dr Aditi Das (National Centre for Social Research (NatCen))
Ms Sierra Mesplie-Escott (National Centre for Social Research)
Ms Sophie Pilley (National Centre for Social Research)
Time: Tuesday 15 July, 15:30 - 17:00
Room: Ruppert 005

Survey research commissioners and survey designers are increasingly paying attention to respondents' experiences in completing their surveys, especially as more surveys move to self-completion modes. As such, commissioners and designers are examining their respondents' journeys, measurement instruments, and data user needs to identify how to optimise the user experience by applying design thinking. Some surveys can feature design thinking from inception, allowing many study components to be optimised for the respondents' experience. Other long-standing surveys contend with redesigning components to meet established data-user needs and commissioner requirements.
In this session, we invite practitioners to showcase work undertaken using user- or respondent-centred design thinking or UX research methods in the development of a survey. Examples could include, but are not limited to, the following:
- Usability testing of survey questionnaires, respondent communications, interviewer training, or other survey elements
- Visual design of survey elements, including design for responsiveness and accessibility
- Applying UX principles or frameworks to:
o Survey instrument development and testing
o Mapping the respondent's or interviewer's journey through the survey
o Rethinking conventional survey methods practices
We invite practitioners to present findings from developing or re-developing a survey featuring UX methods, user-centred or respondent-centred design thinking.

Keywords: Respondent experience, Design thinking, UX, UI, Respondent-centred design, Usability testing, Visual design, Survey instrument development, Accessibility, Interviewer journey

Papers

Human-Centered Redesign to Improve Survey Quality and Reduce Respondent Burden

Mr Brad Edwards (Westat) - Presenting Author
Ms Danielle Mayclin (Westat)
Mr Jesus Arrue (Westat)
Ms Angie Kistler (Westat)
Ms Lena Centeno (Westat)
Ms Casey Fernandes (Westat)

While most surveys are about people, the individuals who contribute their data or collect it are often overlooked. This paper is a case study about an ongoing survey shifting its longstanding researcher-driven design to better address the needs of these people.
The US has a complex, fragmented health care system (18% of GDP). Since 1996 the Medical Expenditure Panel Survey (MEPS) has produced annual data on the system. About 10,000 households are sampled each year for five 90-minute face-to-face interviews over 30 months. The exceedingly high respondent burden includes keeping records of all health care events and completing several self-administered questionnaires (SAQs). The survey began converting the paper SAQs to a web + paper multimode design in 2022. This presentation is about initial experiences using a human-centered, UX approach to move some elements from the face-to-face interview to the web.
Human-centered design (HCD) shares many elements with respondent-centered design. HCD is a problem-solving approach with four main principles: Be people-centered; find the right problem; think of everything as a system; and intervene in small and simple ways. The HCD process includes rapid prototyping, user feedback, and frequent iteration. (We also call it human-centered rather than respondent-centered because we don’t envision eliminating the MEPS interviewer entirely and must keep both actors in mind.) HCD is especially well-suited to redesigns of complex systems.
We sketch the traditional respondent-interviewer journey in collecting medical event data. We highlight a key finding from a focus group with MEPS respondents, then describe a design process to capture key data elements about each event in a web questionnaire. We tested the resulting two-screen design with new respondents in two iterations. We present key findings and discuss next steps.


Incorporating Data Rotation into the UK's Transformed Labour Force Survey using Respondent Centred Design

Miss Lauren Carter (Office for National Statistics) - Presenting Author
Miss Sian Rosice (Office for National Statistics)

The UK Office for National Statistics (ONS) has been working to transform its Labour Force Survey. With an online-first approach, the Transformed Labour Force Survey (TLFS) is achieving improved response rates. However, partial completion rates remain an area for improvement. Enhancing the respondent experience across the entire survey journey is essential in tackling this issue.
The Research and Design (R&D) team at the ONS uses a Respondent Centred Design methodology to develop questionnaire content. Working within an Agile delivery approach, the team applies a variety of research methods to explore and understand the respondent experience. This presentation demonstrates how this approach was used to incorporate data rotation into the online version of the TLFS.
Data rotation refers to the ‘rolling forward’ of data from one survey wave to the next. Check questions can identify where no changes have occurred, allowing the respondent to skip sections of the survey, reducing survey length and respondent burden. The first step to enabling data rotation in the TLFS was to develop a method of establishing who is living in the sampled household. This is essential for linking individuals across the 5 waves of the survey. Following an evidence-based approach, the R&D team undertook a range of Discovery activities including a review of quantitative data, interviewer focus groups, and public acceptability testing. This allowed the team to gain insights into respondents' mental models which were used to inform initial designs. These designs were qualitatively tested during an Alpha phase.
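To make the mechanism concrete, here is a minimal sketch of wave-to-wave rotation logic in Python. The section-level data model and the check-question and re-collection functions are hypothetical placeholders for illustration, not the ONS implementation:

# Illustrative sketch only: the data model and function names are assumptions.

def ask_section(section: str) -> dict:
    """Placeholder: re-administer the full question section and return answers."""
    raise NotImplementedError

def confirm_no_change(section: str, previous_answers: dict) -> bool:
    """Placeholder check question, e.g. 'Is this information still correct?'"""
    raise NotImplementedError

def rotate_wave_data(previous_wave: dict) -> dict:
    """Roll answers forward from the previous wave, section by section.

    Where the check question confirms nothing has changed, the section is
    skipped and the rolled-forward data reused, reducing survey length and
    respondent burden.
    """
    current_wave = {}
    for section, answers in previous_wave.items():
        if confirm_no_change(section, answers):
            current_wave[section] = answers  # no change reported: reuse data
        else:
            current_wave[section] = ask_section(section)  # change: re-collect
    return current_wave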
This presentation will discuss the benefits of data rotation as well as the challenges of implementing it. The presenters will walk through the design process from the early exploratory work right through to the findings from their qualitative testing of designs with respondents. At the end, presenters will share the finalised question designs.


Transforming the Crime Survey for England & Wales (CSEW): Balancing respondent needs with complex data requirements

Miss Bethan Jones (Office for National Statistics) - Presenting Author

The Qualitative and Data Collection Methodology (QDCM) team at the Office for National Statistics (ONS) were tasked with redesigning the screener and victimisation modules of the Crime Survey for England and Wales (CSEW). These modules measure the incidence and prevalence of crime, which comprise the CSEW’s primary outputs, and collect other details on the nature and costs of crime. In 2022, the survey moved to a longitudinal design, where Wave 1 is collected face to face and Wave 2 by telephone. The questions have been largely unchanged for 40 years and can be difficult for both interviewers and respondents to complete. Additionally, a public consultation identified a desire to explore the feasibility of moving to an online, self-completion mode. Our programme of work used a Respondent Centred Design Framework (RCDF) to redesign these screener modules, address issues with the existing CSEW questions identified in our previous work (Discovery part 1), and appropriately adapt them for an online, self-completion mode.

We conducted mental models research as part of this redesign. Mental models research is a qualitative approach and an important part of following an RCDF. Collecting and understanding respondent mental models allows us to unpick respondents’ thought processes, understanding, and expectations in order to reduce burden and increase usability and participation. We aimed to understand how people conceptualised crime and recalled their experiences, including the terminology they used organically, to inform our question redesign.

Our Discovery part 2 work included:

- mental models research into how people articulated and recalled their experiences of crime
- reviewing data requirements
- examining past CSEW data
- creating user journeys

This presentation will outline our Discovery part 2 work, what we did next, and how these methods informed the substantial redesign of the screener module.


Challenging assumption-based design: red and green flag design statements

Miss Alice Toms (Office for National Statistics) - Presenting Author
Miss Laura Wilson (Office for National Statistics)

It can be easy to refer to opinions or make assumptions about how we should design our surveys. This is particularly easy to do when we are under increasing pressure to complete work quickly, or with limited resources. However, designing questionnaires, letters and respondent journeys based on opinions and assumptions can risk us:

- designing the wrong thing, content, or solution
- creating surveys that are not well understood and cannot be used, resulting in increased respondent burden
- collecting data that are inaccurate, unreliable, or of poor quality

It is important that our design choices are user centred and user led, and by ‘user’ we mean the respondent. This means that designs are based on research and evidence, not our personal views. This helps ensure the survey we are designing meets the needs of our respondents, thereby increasing the chances of meeting our survey’s goals.

There are some phrases you can listen out for when designing surveys that signal evidence is not being used to inform the design decisions. We have called these ‘red flag statements’ and have created a series of posters to help you identify them in your work. Each poster gives an example of red flag statements sometimes heard when designing surveys, e.g. “I think that would confuse respondents”. Red flag statements are informed by opinions and assumptions, which means they should be challenged. Next to each red flag statement is an example of a green flag statement: what you should expect to hear when decisions are being made on evidence rather than opinion.

In our talk we step through each of the posters and share the red and green flag statements. These have been published as UK Government Data Quality Hub survey design best practice guidance.