
A heuristic evaluation, or expert review, is the bread and butter of my design life. Yes, even more so than actually sitting and drawing stuff. Anyone well versed in the principles of design for a particular domain or platform simply looks at the product (preferably a functioning one, actually installed and running) and applies industry knowledge of best practices and expected user behaviors (the "heuristics") to identify key problems and estimate how well the product does or will work with actual users. Pretty much every project I work on gets at least one of these. Sometimes they are quite formal; other times one is just the underlying practice behind acceptance testing.

Which Heuristics?

There's been a bit of a secret crisis over these for the past 10 years or so. Many have promoted heuristic evaluation with long, quite specific checklists. Some of my first issues with these came from trying to apply web-centric tools to new technologies, or to the mobile web way back in 2003. It didn't work. Even simple specifications, like the amount of time users are willing to wait for a page to load, change over time and with the type of service or expected audience. Expectations change, so strict heuristics are difficult to define.

But that's fine, because I feel that's a totally misguided approach. Nielsen says to "judge its compliance with recognized usability principles," and seems to reinforce the focus on principles (not specifications) with his 10 usability heuristics, but then links to no fewer than 2,397 usability guidelines(!). That's far too many and far too precise, and it leads to problems in application. What about my B2B mobile social site? Do I apply them all?

Guidelines like those I offer on the previous page are better to start with (those are for mobile; go back to Nielsen for general guidelines and for other platforms). See other sections like General Touch Interaction Guidelines, and really lots of stuff in the Appendix, for additional guidance and details, with caveats and principles to help you apply them to your design.

When encountering a specific widget, like an Infinite List, you can refer to the pattern principles to determine whether the implementation was well done. This is a particular gripe of mine: infinite scrolling has gotten a bad name as a whole, but only because the pattern is not being applied correctly. Patterns are (usually) not evil, but bad implementations can ruin them. This is what heuristic evaluation is very good at discovering.

Plan & Setup

As a regular practitioner of this, I have been asked to share my heuristics, which I therefore interpret as sharing the basic principles, checkpoints, and methods that I use instead. Since I spent five hours last night reviewing a new Android app for a global audience, you are getting a lot of that as the for-instances. Interpret as needed for your domains.

All those devices you have identified (and acquired) need to be charged and laid out in front of you. If you worry it's expensive, get used ones with no service plan; WiFi works fine for all this. Check connectivity, and get the app installed if that's what you are testing. Go to the screen settings and make sure the devices do not sleep. If you are worried about power consumption, get cables and plug into the wall. I have a favorite 11-port USB hub, so I have less electrocution risk, but do arrange all this early so you don't interrupt the process with overhead like charging a phone.
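On Android, much of this bench setup can be scripted over adb instead of tapping through settings on every handset. A minimal sketch, assuming adb is installed and the devices are authorized for USB debugging; the serial and APK filename below are hypothetical placeholders:

```python
# Sketch: prep each Android test device over adb before a review session.
# Assumes adb is on PATH and the devices are already attached and authorized.
import subprocess

def prep_commands(serial, apk_path):
    """Build the adb commands to keep one device awake and install the build."""
    adb = ["adb", "-s", serial]
    return [
        adb + ["shell", "svc", "power", "stayon", "usb"],  # screen stays on while plugged in
        adb + ["install", "-r", apk_path],                 # (re)install the app under test
    ]

# Run them for real once a device is plugged in:
# for cmd in prep_commands("emulator-5554", "app-debug.apk"):
#     subprocess.run(cmd, check=True)
```

`svc power stayon usb` resets on reboot, so it saves you from changing (and later remembering to undo) the developer-settings toggle on each device.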

Prepare for screen capture. I strongly prefer Dropbox, as it can be set to automatically upload images (including screenshots) to a folder you can get to from your desktop computer. Easy. A lot easier than emailing, or trying to make iPhoto work.
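If a sync folder isn't an option, Android screenshots can also be pulled straight off the device over adb (`screencap -p` writes a PNG to stdout). A minimal sketch; the serial and output directory are placeholders:

```python
# Sketch: capture the current screen of an attached Android device to a
# timestamped PNG, so findings can be matched to evidence later.
import subprocess
from datetime import datetime

def shot_name(serial, out_dir="."):
    """Timestamped filename, so repeated captures never overwrite each other."""
    return f"{out_dir}/{serial}-{datetime.now():%Y%m%d-%H%M%S}.png"

def screencap(serial, out_dir="."):
    """Capture the screen over adb and save it; returns the saved filename."""
    png = subprocess.run(
        ["adb", "-s", serial, "exec-out", "screencap", "-p"],
        check=True, capture_output=True,
    ).stdout
    name = shot_name(serial, out_dir)
    with open(name, "wb") as f:
        f.write(png)
    return name
```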

A Process for Evaluation

Now you are sitting at your desk with an array of devices in front of you. Now what? Well, my method varies depending on what you are testing.

Don't forget the other devices you need to have around to test with:

Evaluate Views

Say it with me: there are no pages. We have to stop using this word. Believing there are pages leads to lots of missed opportunities and missed bugs. Think in terms of views and states. That means you evaluate every time the data changes: every time you open an accordion, press a multi-selector field, type into a text field, or open a dialog.

Test interaction as well. Do the tabs work? Great. Do they look good during the transition? Do alternative methods (gesture) work? Both ways?

Do not get hung up on being pixel perfect. Close enough is close enough, with all the variations. If you didn't specify the margin or size in your design document, it probably isn't critical. Think before opening a bug for every single thing that doesn't match your design.

This is, more or less (depending on the product), the order in which I do things:

For each view, I check these items, in more or less this order:

Yes, the above list is all-inclusive, so not every product or interface will do all of these. If you think I missed something, it may just be specific to your project alone. That's fine. This is a list of guidelines, but in many ways test plans are based on accepting the specification for your product. Test against that, certainly! Do be sure to measure when possible. Whether you use a screenshot (carefully; know your scales) or measure directly with a tool like a Touch Template, be sure not to trust your eyeballs. Confirm.
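Measuring from a screenshot means converting pixels back to physical size, which depends on the device's pixel density. A minimal sketch of the arithmetic; the 8 mm minimum below is an assumed target size for illustration, not a number from this page, so substitute whatever your guidelines specify:

```python
# Sketch: convert an on-screen measurement in pixels to millimeters, then
# check it against an assumed minimum touch-target size.
MM_PER_INCH = 25.4

def px_to_mm(px, dpi):
    """Physical size of `px` pixels on a screen of the given pixel density."""
    return px / dpi * MM_PER_INCH

def big_enough(px, dpi, min_mm=8.0):
    """True if the target meets the (assumed) minimum physical size."""
    return px_to_mm(px, dpi) >= min_mm

# A 48 px target on a 160 dpi screen is 7.62 mm, just under an 8 mm minimum,
# while the same 48 px on a 120 dpi screen would be 10.16 mm. Know your scales.
```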

In an ideal world, with infinite time, after doing the evaluation on the actual hardware it is good to take screenshots and compare the design to the actual built product side by side. I have discovered things this way that were not clear on the screen. My favorite example was a 5 px black line around the entire application that could not be seen on the handset (it blended into the device bezel).
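That side-by-side comparison can even be partially automated: diff the design comp against the device screenshot pixel by pixel and see where they disagree. A toy sketch on plain grayscale grids (a real pass would first load the two PNGs at matching scales):

```python
# Sketch: find every pixel where the built product differs from the design.
# Pixels here are plain grayscale values in nested lists, for illustration.

def diff_pixels(design, actual):
    """Return the set of (x, y) coordinates where the two images disagree."""
    return {
        (x, y)
        for y, (drow, arow) in enumerate(zip(design, actual))
        for x, (d, a) in enumerate(zip(drow, arow))
        if d != a
    }

# Design comp: a 5x5 all-white screen.
design = [[255] * 5 for _ in range(5)]
# Built product: the same screen, but with a 1 px black border the designer
# never specified (the kind of thing that vanishes against a dark bezel).
actual = [[0 if x in (0, 4) or y in (0, 4) else 255 for x in range(5)]
          for y in range(5)]

border = diff_pixels(design, actual)  # the 16 border pixels differ
```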

Recording Your Findings

It is critical to take good, clear notes. You will find many things worth noting, so you cannot rely on memory. Really, you can't. As you find more issues, your brain will exaggerate or forget previous ones. Write them down as you find them.

My favorite method is a spreadsheet, and my favorite tool by far is Google Spreadsheets, as it can be shared so multiple practitioners can work on one document, or so the team can discuss what to do about the findings. See this example. Yes, real ones are much longer, but it was easier to sanitize a real one by keeping just a few real and generic points, so pretend it's very long.
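Whatever tool you use, the structure is just rows of findings with a handful of columns. A minimal sketch of writing one out as CSV, which any spreadsheet tool can import; these column names are my guess at a reasonable layout, not the layout of the linked example:

```python
# Sketch: log heuristic-evaluation findings to CSV text that a spreadsheet
# tool (Google Spreadsheets included) can import and share.
import csv
import io

FIELDS = ["view", "severity", "issue", "recommendation"]

def findings_csv(findings):
    """Serialize a list of finding dicts to CSV text."""
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=FIELDS)
    writer.writeheader()
    writer.writerows(findings)
    return buf.getvalue()

report = findings_csv([
    {"view": "Login", "severity": "High",
     "issue": "Password field does not mask input",
     "recommendation": "Mask by default, with a show/hide toggle"},
])
```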

The large checklists of hundreds of heuristics require you to assign a value to each point, meaning you say what is good as well as what is bad. I tend not to do this, and only note the bad. Not just because I am mean and only like to talk about bad things. Instead, for two reasons:



