Monday, September 2, 2013

Customer Experience - It's not 1950 Anymore

My work and focus have been shifting recently. Although it's always centered on the needs of the "user," I've started thinking more about the customer in general. What is their total experience over the lifetime of a product, digital experience, or service?

This has, in turn, made me an even more discriminating and impatient consumer.

Case in point: I heard about an amazing place in southwest Washington where you can rent an old trailer from the 50s and stay there for a night, a weekend, or a week - similar to a B&B, only cooler! Sign me up! But wait, it's not that easy.

The website has great images and a cute look and feel. Being well used to booking reservations online for everything from Airbnb to a Hilton hotel, it seemed only natural that I could do that here. After searching the entire site, I gave up and called. Ok, so they are old school. Old trailers, no online reservations.

One of the nice things about a phone conversation vs. online reservations is that you find out things you never thought to ask. I found out the trailer I wanted sleeps 6 (unnecessary for us), but they all have kitchens and bathrooms. I put a hold on a trailer and felt good about it. The hold was followed by a few rounds of phone calls (one surreptitious, since I was standing with the person I was trying to surprise with the gift of a weekend) before I finally gave them my credit card. During that conversation, I also decided to change to a slightly less expensive, smaller trailer that fit our needs.

After thinking about it for a few days, I realized that I had never asked whether the new trailer I reserved was available any time other than early October, which seemed far away. I called back (catching someone directly this time) and was told my reservation was still for the larger trailer. After much back and forth, we found an open date for a smaller trailer (2 MORE weeks out than I wanted).

Attempting to be a savvy customer, I asked for a reservation number. "We don't do that," I was told. Really?? Ok, then can you email me a receipt? "We don't do that either." Incredulous, I asked how they can run a business like that in this day and age. The woman assured me that my "card" - whatever that is - was now on the date for that trailer. I pictured a huge wall with plastic laminate pockets for every day of the month and every trailer they have. Hopefully, they have a big office.

It's true that everyone adopts and adapts at their own pace, but at some point, businesses need to grow, change, and keep up with the times. Even without an online reservation system, there should be a simple, easy way to at least guarantee a reservation with a confirmation code. It's just good business. It's not 1950 in the rest of the world.

Lucky for them, they have a bit of a lock on the market, as they are the only place like this I know of, so cancelling the reservation was not really an option. Now I sit back and wait, though you can be sure I'll be calling every few weeks to make sure we have the Spartan Mansion waiting for us on October 18!

Sunday, March 10, 2013

Testing out Online Testing

I recently had the opportunity to test out some of the remote, unmoderated tools now being offered for user testing. Although I am an avid advocate of in-person interviews, I also recognize that setting up and running 1:1 interviews is time-consuming and expensive. We don't always test the things we should because it takes too much time and money. That's where the beauty of unmoderated, remote sessions comes in.

For this experiment, I was able to schedule testing with two different online tools concurrently with one-on-one interviews. This ensured the team didn't lose any time or information during the process, and it also gave me the opportunity to weigh the pros and cons of each approach.

I spent several weeks looking at a number of online tools before landing on two.
Loop11 allows a quantitative stab at user testing. You set up a series of questions and tasks, and after each task you can ask survey-type questions of the users. You can invite up to 1,000 participants to view your website and work through the tasks. You can choose demographic criteria, but the more criteria you choose, the more expensive the test is, and too many criteria make it impossible to recruit. Setting it up is a little tricky; Loop11 could take a lesson from SurveyGizmo and let you move your questions and tasks around more easily. My 200-person test with a relatively broad demographic cost around $800. Each project is $350 if you run it without a yearly package, and then you add recruiting costs on top.

Conversely, the second tool encourages lower numbers. I tested only six individuals, but I successfully recruited six Human Resources representatives, which I wasn't sure would be possible; that particular demographic was not available on Loop11. Recruits run around $35 each, and the test cost no more than a few hundred dollars.

It's important to remember that you have a lot less control over the distribution of demographics when using online tools, but the flip side is that you don't have to spend a week writing a screener and paying for it in both time and money.

If you think about a typical recruit for 1:1 moderated sessions (5-10 individuals at $150-$200 apiece, plus remuneration), a typical study usually runs at least $3000-$4000. Selling the online tools as a cost-saving measure is an easy win. But what about the results?
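To make the comparison concrete, here's a quick back-of-envelope sketch. The $150 incentive per moderated participant is my own assumed figure for illustration; the recruiting fees and the ~200-person, ~$800 Loop11 numbers are the ballpark figures above.

```python
# Back-of-envelope cost comparison using the post's ballpark figures.
# The $150 per-participant incentive is an assumption for illustration.

def moderated_study_cost(participants, recruit_fee, incentive):
    """Total cost of a 1:1 moderated study: recruiting plus incentives."""
    return participants * (recruit_fee + incentive)

# High end: 10 participants, $200 recruiting each, assumed $150 incentive
moderated = moderated_study_cost(10, 200, 150)
print(moderated)  # 3500, within the $3000-$4000 range above

# Loop11: roughly 200 remote participants for about $800 total
per_participant = 800 / 200
print(per_participant)  # about $4 per remote participant
```

Even on the low end, the per-participant gap is dramatic, which is why the cost argument is such an easy sell.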

Both tests were relatively easy to set up. Loop11 lets you work directly in its interface before collecting payment. It was definitely the trickier experience, since I had to enter a starting URL, a success URL, and any questions I wanted to ask on each landing page. I found it difficult to keep track of everything, and the numbering system doesn't differentiate between tasks and questions, so your whole study runs together. (Ideally, each task would have a letter and its associated questions a number, like A1.)

The other tool was frustrating because I couldn't set anything up in its online interface until I paid, and it took weeks to get through our organization's red tape to arrange payment. So I had to set the test protocol up in Microsoft Word and then cut and paste once we were ready.

The results are where you need to pay the most attention, because the output is significantly different. Loop11 delivers much more quantifiable information: you survey/test hundreds of individuals and get a large amount of feedback on your site. We asked questions like whether people thought the site was easy to use or cluttered, and Loop11 graphs the responses for each question. Loop11 also has heat maps, which gave the team great insight into where participants were focusing. Not only do you see the successes, you also begin to see the other places on the page where participants clicked. Loop11 provides click streams as well, but our beta site was not set up in a way that made them valuable. In the future, I would push for better test URLs so the click streams can be tracked; without a live site and discrete URLs for each page, that is difficult. If you want to jump through a few more hoops, you can get audio and video through a collaboration with OpenHallway, but I ran out of time and energy and decided I didn't need video from 200 people. You also get a limited amount of information about each participant. I did find myself wanting a way to export the graphs of feedback data other than through a screen capture; there may be one, but I didn't find it.

The second tool delivers raw video along with demographic information, browser information, and other key statistics for understanding your participants. The videos I got back were no longer than about 12 minutes, and I was able to watch them all and take notes within about 90 minutes. After you take notes in the area provided on the website, you can click a button to export and immediately get a spreadsheet outlining all the demographic data, the answers to the four written, open-ended qualitative questions at the end of the test, and all of the notes you took as a researcher. It's a pretty sweet way to see everything at once, and a dream compared to all the cutting and pasting many of us do after a series of interviews. Because this is a much smaller set of participants, the data is clearly not quantifiable, but in some ways it is a bit richer.

Both of these tools have significant merit when used at the right time in development. Loop11 seems like the tool to prefer when you have very focused questions and want to understand limited, specific behaviors on your site and make sure people are finding what they need. The second tool is probably closer to a real-world, 1:1 moderated session: you get conversational feedback that is valuable a little further forward in the process, when you're still making decisions and trying to see how people think while using your site.

The best thing I found in doing this is that the feedback was corroborated across platforms. The team was afraid that we would get one set of feedback online and another in person. I'm happy to report that both the online users (not typical users of our service) and the 1:1 participants (known users of our service) gave feedback that lined up and pointed in the same direction.

For our organization, I hope to use these tools in the future to include the voice of the customer more consistently and at a lower cost; not at the expense of talking with individuals 1:1, but in addition to that, as touch points during the development process that keep us agile.