While we tend to lean toward more qualitative methods at STBY, we have been doing increasingly more mixed-methods research. This includes incorporating a lot more surveys into our research process. New tools are making it easier to design mixed-methods studies where qualitative and quantitative methods can both shine and complement each other. Below is a reflection on three surveying tools we have been using recently, how we have used them, and some pros and cons of each.
dscout
Though not marketed as a survey builder, dscout allows researchers to conduct diary studies at scale with ease and efficiency. Participants in these studies (called Scouts) complete a number of tasks over a fixed period of time. These tasks range from answering simple multiple-choice questions to recording short selfie videos. We have used dscout for a few projects now, and it has made mixed-methods research a joy.
dscout’s Recruit feature lets you rank potential participants by suitability. (Source)
This is how recruitment should always be done
Their recruitment tool is perhaps my favourite feature, as I am not typically the biggest fan of the laborious but essential process of finding good participants. In dscout, all you have to do is launch a screener survey on the platform, with multiple-choice, open-answer and even video questions. It is sent to their pool of over 100,000 participants, and researchers can view responses in real time, rate them based on suitability and invite them to participate. What usually takes weeks is cut down to a few days. And I feel much more confident in the quality and suitability of participants when I can select them myself based on rich responses and a good sense of articulation and engagement.
Pricey, but worth it?
The diary study set-up can be a little tricky the first time. You have to get used to their system and structure, but once you do, you quickly realise how flexible and robust it can be. They provide great support too, in case you get lost or have uncertainties. The Live feature, though expensive and only available to subscribers (at least when we used it), provides a really useful way to follow up with selected participants for richer, more in-depth 1-1 interviews. It also auto-transcribes all of your interviews and lets you easily highlight key quotes, which is a major bonus.
On dscout, in addition to diary studies, you can follow up with 1-1 video interviews via the Live feature. (Source)
We last used dscout for a project looking into how kids (and parents) experienced a prototype of a music app designed just for kids. As we wanted to follow up with selected participants for in-home interviews, we unfortunately could not use the Recruit feature, and instead had to go through our own recruiter and onboard participants on dscout ourselves. This was all quite a hassle. We are not quite sure why we couldn't make it work, but there are some logical reasons one can assume for why dscout is a bit guarded here. In the future, we'd like to see them rethink how to let participants opt in to the possibility of home visits.
We also do quite a bit of mixed-methods research in multiple international locations with the help of our partners in the Reach Network. Though I am sure they are working on it, dscout is not quite a global product yet. Diary studies can be localised in 8 different languages, but their Recruit panel is US-based only. So if you work across different markets, you will still have to use a separate partner or agency in each one to find and onboard participants. If dscout had a global panel and could help organise on-site interviews, it would be a game-changer.
UsabilityHub
We started using UsabilityHub about three years ago when we were working closely with Spotify's editorial design team to get feedback on new visual and typographic directions for Spotify playlist covers. It's a great tool for getting qualitative and quantitative feedback on visual designs and concepts. It is not as robust as SurveyMonkey in terms of question types and analysis features, but its simplicity and focus on visual and UX-type questions are big assets. It is also really user-friendly from a respondent perspective, and we have had feedback that completing our surveys was actually a fun experience (!).
UsabilityHub has various question types to choose from, most of them aimed at visual design or UX-type enquiries.
Targeting options and screening capabilities need work, but we value its simplicity and focus on the visual
UsabilityHub recently launched its own panel, but we have not used it yet, as our recruits have been quite specific and their targeting options are not yet sufficient. Unfortunately, UsabilityHub does not have a screening option like SurveyMonkey does, though it recently introduced test logic, which could be used to qualify and disqualify participants. However, this gets a bit complicated when working with an external panel provider, in terms of managing incidence rates and completes. We typically have our panel provider pre-screen participants with a mini survey on their end before sending them on to our survey. It's not an ideal set-up, so in the future, built-in screening capabilities would make life a lot easier.
Analysis on UsabilityHub is a simple, no-frills affair (left). Questions are highly visual (right). Source
If you want to design a sophisticated, highly branched survey and have the ability to do more robust analysis, I'd go with SurveyMonkey. UsabilityHub is clearly designed for simple visual design and UX research, and it really shines there. In the future, I'd be curious to think more about how it could be used to prompt and probe participants more visually for larger-n speculative studies.
Typeform + VideoAsk
Though we had known of Typeform for quite a while, we never thought to use it as a survey builder. That is, until we recently needed a really user-friendly survey tool with video-asking capabilities for a policy co-design project we ran in Mexico. Building the survey did not feel as easy as it does in SurveyMonkey or UsabilityHub. We had quite complex branching and conditional questions for this particular survey, which might have made things feel more complicated. Their logic jump features were exactly what we needed, but a little finicky to implement.
Typeform is known for letting you design really smooth, user-friendly forms
Ramped-up accessibility and a reimagined, more personal way of surveying mean we will keep an eye on Typeform
Accessibility was a big priority in this particular project, as we had a number of visually impaired participants, and Typeform has taken some major steps in this regard recently. Typeform's default, full-screen respondent experience is WCAG 2.1 Level AA compliant, which means it is accessible to those who use assistive technology. Big plus. Another recent perk is VideoAsk, a product developed by Typeform that allows you to have 'asynchronous' video conversations with participants. Think of it as a hybrid between a survey and a video interview.
Respondents click the link to your VideoAsk, record or write their answer, and send it. Source
In our latest project we connected our Typeform to a VideoAsk. The relationship between the two was a bit unclear, and we struggled a bit with embedding the video. We also made the question an optional, open-ended one at the end, with people able to opt in to us sharing their video publicly. As a result, we did not get too many video responses, but the ones we did get were really impressive and inspired us to use this product more. It's a lovely experience overall: all you have to do is record a video of yourself asking a question via the app, share or embed the link, and receive video answers from your participants. I'm looking forward to exploring how we can use VideoAsk in new and innovative ways.
What lies ahead? More conversational surveys and bespoke survey platforms for specific research projects
One Shared House 2030 by Anton + Irene and SPACE10 (Source).
There are plenty more tools that we use for surveying, but the ones included here feel the most novel and experimental. I am certain there are many more tools out there that design researchers are using for surveys, and we are keen to keep exploring. We also know researchers who are building their own bespoke survey platforms for specific research projects. Incorporating gamification and speculative design, Anton + Irene and SPACE10 collaborated on One Shared House 2030, a “playful research project that aims to get insights on the future of co-living through a collaborative survey.” If trends like these continue, gone are the days of dull, boring surveys (we hope).
Megan Anderson