In many of my other blog posts I have discussed user research through various methods, from surveys to design feedback.
What I had not done yet was share my experience with performing remote user research. Technically I did, in my presentation ‘Prospective Optimization‘ at the Dutch Web Analytics Conference back in March, but not specifically on this website.
In this blog post I would like to share with you a real-life example of working with UserTesting.com: more precisely, a comparison between Airbnb.com and Wimdu.com.
I like to write my articles based on real-life experiences in the world of conversion rate optimization. Although there are many worthy alternatives out there for remote user testing, such as Loop11, TryMyUI and YouEye, my practical experience with UserTesting.com is the most extensive.
In future articles I will definitely take the time to focus on the alternatives, not only for remote user testing but for all aspects of user research.
Airbnb has certainly established itself as a serious player in the field of travel. By creating a network of hosts who make their homes, or rooms, available for (short) rental to travellers, it undoubtedly tapped into a vast source of revenue still available in the online travel agency (OTA) world.
Is Airbnb unique? Some beg to differ, because it was not the first. The same goes for Wimdu. Is Airbnb an improved idea inspired by the likes of Couchsurfing.org? Is Wimdu another spin-off of a spin-off? My opinion is… who cares. It should be clear that all of these companies are doing well, not only financially but also in filling a need in today’s travel market. So, who are we to criticize?
Well, enough politics for now…
This research was performed with two goals in mind.
This form of user research, recording participants’ screens and audio while they perform a set of tasks, is great because it allows you to do several things.
Basically, where analytics and feedback forms leave off, (remote) user testing picks up. The reverse also holds, because what (remote) user testing lacks is an easy way to quantify the findings.
Nothing is more laborious in the online world than performing and analyzing user research. It is good for the billable hours if you work freelance, but notoriously time consuming nevertheless. We have gotten so lazy with all the great analytics tools on the market these days that we forget that real insights take real time and real effort… a lot of both, to be honest.
The same is the case with remote user research. In several cases I have spent up to 1 hour setting up a test, 5 hours analyzing the results (the average video is 20 minutes long, and analyzing takes me about 1 hour per video) and around 8 to 16 hours compiling and documenting the results: roughly 14 to 22 hours in total. Time and effort ultimately depend on the test in question.
Still, it is quicker than alternative methods of performing user research, such as lab testing.
So how did I go about setting up the test? In this specific test, 5 participants were invited to perform a series of tasks defined around a single scenario. The tasks (some of them default ones from UserTesting.com) were meant to let the participants compare both websites and give their opinion on the experience.
FYI… the participants are pulled from a legion of testers who have signed up at UserTesting.com. They perform the test for you in return for a small fee, which is included in UserTesting.com’s per-test price.
I have conducted 100+ tests and I can happily report that the quality of participants is good. The only barriers now are geographical ones, as the location of the participants is limited to the USA, Canada and the UK. YouEye does support European testing at the moment, in case anyone was wondering.
I formulated the following scenario for the test:
Put yourself in the shoes of someone who wants to compare two sites before booking a Bed & Breakfast accommodation for an upcoming holiday to New York City. You will be visiting two sites to compare their offerings.
Within the scenario, participants were asked to perform the following tasks:
When viewing the recorded test sessions, I make a habit of using the ‘Create Clip’ function whenever I find something that might be useful to discuss, and I have done the same here. Now, I did test 5 participants, but I will only show you the findings of 1 of them, to avoid taking too much of your time and to keep this article as enjoyable as possible.
Anyone interested in discussing the findings is welcome to contact me or to share their thoughts in the comments below. I may make a compilation video of all the sessions and publish it, but for now I will limit the compilation video to just 1 participant, to save some (personal) time.
DISCLAIMER: This list was created by me and is in no way ‘complete’. Some shortcuts were taken for the purpose of publishing this blog post, since I do all of this in my spare time, which, with 3 kids, is not much. I like to consider myself objective and skilled enough to detect possible issues on a website. I agree that with only 5 users tested it is hard (near impossible) to claim statistical significance, but I know that any finding can be a catalyst for improving a website. Since I work for neither Airbnb nor Wimdu, I cannot say what has already been tested and what has not. It could be that certain functionalities were designed to work exactly as the user experienced them (e.g. the search button on the homepages).
TIP: Click on the toggle below the video for a (short) summary of the findings in the Airbnb user research session.
TIP: Click on the toggle below the video for a (short) summary of the findings in the Wimdu user research session.
The preferences were evenly split. Wimdu did get some credit for being easier to book with, but lacked recognition. The latter is most likely (a very non-scientific term) due to the fact that Wimdu is based in Europe (Germany) and Airbnb in the US. Not an excuse, but surely worth considering. I think both websites have their work cut out for them in optimizing the user experience. My advice: just keep testing.
Conducting user research this way is… well, call me a nerd, but it is fun. You are in charge of what you want to test, when, and to a large extent where. The insights gained through this research are powerful because it is the users/visitors themselves who indicate what they like and don’t like. Gut feeling plays only a small part, so the potential positive effect on conversion rate optimization based on user feedback is huge.
That said, there is no contact with the testers to ask additional questions or to steer a test while it is in progress. Some things you need to take for granted: accept the feedback at face value and let it keep your user experience and conversion rate optimization brain cells in motion.
When viewing the tests, I was glad that the importance of trust logos, which I researched earlier, was mentioned on several occasions, but I did miss one critical (at least to me) item. During my own pre-test, so to speak, I noticed that neither website utilizes some very key USPs (unique selling points) at important stages in the booking process.
The one that caught my eye the most, and which was not visible while booking, was the payment process at Wimdu. So, in closing, please consider this…
At Airbnb, you are charged for your stay as soon as the host accepts your booking request.
At Wimdu, you are not charged until 24 hours after arriving at your destination.
Exactly: (remote) user testing! Please share your own experiences with it in the comments below.
Let us help you. Check out our services page to learn more.