The concept underlying Agile Software Development is rapid, limited design and development iterations with a continual feedback loop and ongoing involvement on the part of the users. The testing implication of Agile SW development is fairly obvious: frequent, iterative user testing of each design and development improvement.
Unmoderated remote user testing, whereby user experience and usability professionals can set up tests and have participants respond to them at scale, from their homes, anywhere in the world, is clearly a major part of the toolset for Agile UX.
One of the most active groups of professionals focused on Agile UX, with almost 15,000 members worldwide, is the Agile UX Group on LinkedIn (https://www.linkedin.com/groups/3803162). We decided to run a survey with them to see how they use unmoderated remote user testing, how they perceive its value proposition, and what features would have them use this methodology more extensively and help them obtain buy-in from colleagues and business owners.
The survey was launched and completed in early June. The results follow:
The most interesting takeaway from the chart is that cost is not an important differentiating factor between in-lab usability testing and unmoderated remote UX testing. What the group values most is the ability to reach potential participants anywhere in the world.
An example comment from one of the survey participants: “You can test overnight in other time zones and get feedback by 8am the next morning. On one sprint we tested un-moderated Mon, Tue, and Wed nights. And Friday did in-lab testing. Very few issues left.”
Another highly appreciated advantage is the speed at which responses and feedback can be gathered, a critical factor in Agile SW Development. An example comment: “It’s fast as hell. You can test everyone at the same time – zero onsite planning involved.”
Speed is also a key advantage of unmoderated usability testing over moderated testing. Many of the comments revolved around leveraging different countries and time zones with unmoderated remote user testing: “Can test while you’re sleeping. I always complement un-moderated with moderated.”
An interesting observation from the comments was that the quality of responses was superior in the case of unmoderated testing, or at least had the potential to uncover “unknown unknowns”: “Open to the unexpected” and “Participants likely to behave more naturally than having someone looking over their shoulder.”
We were surprised that competitor benchmarking is not as widely used as some of the other use cases. Given how easy and affordable it is to set up tests and receive quick responses with unmoderated remote user testing, we would have expected UX professionals to take the opportunity not only to test designs and prototypes throughout the process, but also to benchmark them for usability and user experience against the competition.
An interesting comment about the availability of time for these activities: “Our UX teams fight hard to keep one sprint ahead of the dev sprint. This gives us 2 weeks to dig into important unknown problems. We have a very small window to work within for the research effort. We aim to get findings in front of the team ahead of the sprint planning meetings. I’ve learned that Product Managers and Engineers usually need a few days to contemplate findings before they can make hard decisions the team will commit to for the next 1 or 2 sprints ahead. We don’t look much beyond that horizon, our scrum master is dogmatic about the agile/scrum process.”
Cost is the number one issue: reducing it would do the most to extend the use of unmoderated remote user testing, even though this methodology already compares very favorably with in-lab or moderated testing, especially once we assign a cost to our own time.
An interesting comment on the premise of the question: “Nothing I need to sell. It has proven itself over and over.”
We also asked participants what would make them use unmoderated user experience testing more often. The large number of responses varied significantly, making it difficult to establish clear-cut numerical categories.
That said, ease of use, ease of deployment, and picture-in-picture views of respondents as they interact with test assets were mentioned often.
The last question in the survey asked how many participant sessions per month respondents would conduct if the cost of doing so were zero and/or the budget for it were unlimited. An interesting comment from this section:
“There’s never enough time. The team doesn’t want to have to deal with another large growing backlog of issues they won’t be able to get to because they too are incredibly busy. Teams don’t want to deal with customers all the time, they just want the time to work making cool s…, so it’s hard to demand too much of their attention.”
Our number one takeaway from the survey data is:
The most important issue confronting Agile SW Development teams is time.
Unmoderated user experience testing offers clear advantages over both in-lab and moderated UX testing in terms of speed and ease of execution.
The iterative, time-constrained nature of Agile sprints, and the need for constant user involvement and feedback on each design iteration, make unmoderated remote user testing an essential tool for Agile UX professionals.