Usability Testing of Email

So I’m trying to figure out the best method for usability testing emails without deploying live A/B tests. I imagine it would be similar to testing any HTML-based page, but I’d like to test inside an email client environment, and possibly even deploy it to a user’s local machine, while still getting mouse tracking and time spent before clicks. The purpose is ultimately to see whether one of our emails performs better than another, but also to gain feedback on each specific design (what they got from the email, what they clicked on, what they didn’t get from the email, etc.).

Does anyone here have any experience usability testing emails, and if so, could you please tell me how you went about it?

Thanks,
Tim M

We do this all the time.

We schedule participants for remote sessions, have them share their screens, and watch them interact with the emails we’ve sent them. Standard questions:

  • “What do you think this email is about?”
  • “What is attracting your attention?”
  • “Have you ever seen products/services like this?”
  • "Do you remember what you did the last time you saw something like this
  • “What do you expect to happen when you click on the thing that’s most interesting?”
  • [after they’ve clicked] “Is this what you were expecting?”

That sorta thing.

30-minute sessions work well.

Jared

Jared M. Spool
User Interface Engineering
510 Turnpike St., Suite 102, North Andover, MA 01845
e: jspool@uie.com p: +1 978 327 5561
http://uie.com Blog: http://uie.com/brainsparks Twitter: @jmspool

Oh, a reply from the great Jared Spool! I’ve seen you speak a couple of times, sir, and I’ve enjoyed it every time. Thanks very much for the response; the questions will help me craft my test script.

Hey, one additional question for you though: when you say you have them share their screen, are you using any third-party software to track mouse-pointer heat maps or time spent, or are you just using a remote connection and manually observing those sorts of things? I’m trying to determine whether it’s even worth it to use something like the usertesting.com platform.

Thanks again!

We use Adobe Connect, because it’s a tool we use for other things. GotoMeeting or Skype Screensharing work well, as does Google Hangouts. They all have their advantages.

Because the sessions are very interactive, tracking the mouse pointer or time spent won’t mean much. We’re talking with them a lot over the phone/audio connection and they are giving us a tour of the email. It’s not anything like normal behavior.

I’m not a big fan of unmoderated testing, so I wouldn’t recommend usertesting.com for anything. (Their moderated tool is ok, but overkill when you have the other things I mentioned above.)

Thank you very much Jared this was a huge help. You are the man.

Tim M

Was just about to write what Jared wrote :wink:

The only thing I’d add to Jared’s response is that we try pretty hard not to test the email by itself.

If it’s a task-related email (registration, password reset, etc.), then we try to test it in the context of that task. You can see some interestingly different behaviour (for example, one thing we’ve seen a few times is folk pulling out their phones to deal with the email that’s just been sent to them as a result of their action on the desktop. Obvious in hindsight, but not something we’d foreseen upfront.)

If it’s something that’s kicking off a task (sales email, periodic reminder, notification, etc.), then we try to set things up so the task can be something like “check your email” rather than “read this particular message”. That again leads to insights (how our mail is prioritised/filtered compared to other mail, etc.) that you won’t get by looking at the mail in isolation.

Thanks for the input @adrianh, that’s good stuff!

I thought I implied it, but I should’ve been more explicit: Adrian is right that if the email has a call to action, you need to have those links do something intelligent in the test.
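
For what it’s worth, here’s one lightweight way to do that, purely as an illustrative sketch (the port and link names below are made up, not anything specific we use): point the CTA links in the test copy of the email at a small local stub server that shows a placeholder landing page and logs which link was clicked and when.

```python
# Illustrative sketch only: a tiny local stub that the CTA links in the test
# copy of the email can point at (e.g. http://localhost:8080/signup-cta).
# It serves a placeholder landing page and logs each click with a timestamp.
import time
from http.server import BaseHTTPRequestHandler, HTTPServer

class CTAStub(BaseHTTPRequestHandler):
    def do_GET(self):
        # Note which link was followed and when, for the session notes.
        print(f"{time.strftime('%H:%M:%S')}  clicked {self.path}")
        self.send_response(200)
        self.send_header("Content-Type", "text/html; charset=utf-8")
        self.end_headers()
        self.wfile.write(b"<h1>Prototype landing page</h1>"
                         b"<p>Swap in a mock-up of the real destination here.</p>")

    def log_message(self, fmt, *args):
        pass  # keep the console quiet apart from the click log above

if __name__ == "__main__":
    HTTPServer(("localhost", 8080), CTAStub).serve_forever()
```

The exact mechanics don’t matter; the point is that when a participant clicks, they land somewhere plausible instead of a dead link, and you have a record of it to review alongside the session.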

Yeah, I mean I would never create a usability test where the CTAs we’re testing don’t do anything :wink:

I think Adrian is going a level deeper by saying you can get some other interesting results if you take it back a step and give them less direction so that you can see how they interact with the email itself in the context of their normal routine.

Thanks guys for the feedback.