Usability Testing Hack: Speed Through Videos in Half the Time

There are two reactions to this usability testing hack: 

  1. Doesn’t everybody do it that way? OR, 
  2. I can’t believe the hours I’ve squandered! 

Ready to find out which side you’re on?

Watch your usability testing videos at a playback speed of 1.5x or 2x. An hour-long video will only take 30 to 40 minutes to watch. 
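The savings are just division: watch time equals the video's duration over the playback speed. A quick sketch of the arithmetic (the `watch_time` helper is my own name for illustration, not from any library):

```python
def watch_time(duration_min, speed):
    """Minutes needed to watch a video at a given playback speed."""
    return duration_min / speed

# For a one-hour session video:
for speed in (1.0, 1.5, 2.0):
    t = watch_time(60, speed)
    print(f"{speed}x: {t:.0f} min to watch, {60 - t:.0f} min saved")
# 1.5x: 40 min to watch, 20 min saved
# 2.0x: 30 min to watch, 30 min saved
```

Across ten one-hour sessions, 2x playback hands you back a five-hour workday.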

When I usability test, I always record and re-watch each session to make sure that I see all the behaviors that were invisible to me at the time, as well as to back up my own notes and assumptions. (Ever finish the sessions feeling like “everybody” missed something, only to discover that fewer than half actually did? This is why I re-watch.) If you’re doing unmoderated remote usability testing through usertesting.com (or similar), you’re also faced with hours of video to watch. Re-watching, though valuable to the process, makes usability testing more expensive for the client and lengthens your turnaround time for reporting results. It’s in everyone’s best interest to recover some of that time by adjusting the video’s speed. 

How to Adjust Playback Speed

Nearly every video player has a playback speed control. On a Mac, I like the VLC video player, because it’s not obvious how to change playback speed in iTunes or QuickTime (or maybe it’s not possible anymore). If you’re using Windows Media Player on a PC, you can find the playback speed controls by right-clicking the video and clicking “Enhancements” (I wish I were making this up). 

A speed somewhere between 1.5x and 2x works well for me: I can still follow along and take notes, and it’s even possible to grab user quotes at that speed. If I’m grabbing timestamps for a time study, and I have already collected my general usability findings, I’ll set the video to play as fast as possible (8-16x) and look only for the clicks that correspond to what I’m timing.

Once you know about this hack, you’ll find yourself watching YouTube at 1.5x, speeding through podcasts, and even taking online classes at warp speed - there are so many applications! 

Best UX Books

I believe you can tell a lot about a person by what they read. These books will round out your UX education. 

Utopian Entrepreneur | Brenda Laurel
Two words: signed copy. She's my hero. 

A Web for Everyone: Designing Accessible User Experiences | Sarah Horton & Whitney Quesenbery
A website isn't usable unless everyone can use it. Also: the personas will change the way you think about people with disabilities.  

Observing the User Experience | Mike Kuniavsky
The ultimate handbook for user research, contextual inquiry, and usability testing. An excellent companion book is Measuring the User Experience by Tullis and Albert. 

Content Strategy for the Web | Kristina Halvorson
She gave credibility to what hundreds of writers, webmasters, and user experience folks had been toiling at in the background for years, making content strategist or content manager a "real" job. 

Search Analytics for Your Site: Conversations with Your Customers | Lou Rosenfeld
Another hands-on pick, but FINALLY someone tells the UX field how to use site search analytics for the betterment of society.  

Employees First, Customers Second | Vineet Nayar
Who is first in your company? It's a knowledge economy, and after reading this, you'll realize what so many agencies and corporations are doing wrong. 

HTML & CSS: Design and Build Websites | Jon Duckett
This book makes HTML and CSS approachable, whether you're maintaining a website like mine or trying your hand at UX prototyping with real code. It's a beautiful book, written with its audience in mind.  

What, no Tufte, Norman, or Nielsen? Yes, they are the UX literary canon, and yes, been there, done that. (Tog is my favorite, anyway.) 

Not asking "why?"

In a recent post, I listed a few ways that you can accidentally end up with a bad user experience. But there's another way to add bloat to your design: failing to ask "why."

The other day, a business analyst asked me to add a checkbox to one of my screens so that an administrative user could indicate, once every 6 months, that someone had reviewed the screen for errors. Yet we already have proof that the screen's data is being maintained: it's in the change log.

On the surface, adding a checkbox is an easy update to make. But we don't know why we're being asked to make this update, or how this feature will be valuable to users. Simply put, our customer is requesting a solution without indicating what problem it solves. Though it might be a good solution, how do we know it is the BEST solution?

Why ask why?

The next step isn't to drop the checkbox onto a wireframe, write up a few requirements, and head home for the weekend. The next step is to call up the client and ask "why."

  • Why does the client need this feature?
  • What information is the client hoping to collect via this feature?
  • How does the client plan to use this information?
  • Does the client know about the information we're already collecting?  

Once all these questions have been answered (and maybe a few more), we'll know how to proceed.

Needs vs. design

Good requirements describe a user need, but the need is never "a checkbox." The need is bigger. In my example, maybe the need is reporting for an audit, or maybe that checkbox is really supposed to notify someone of something. But we'll never know unless we stop, ask why, and solve the real problem.