
Search Fields — What To Do and What Not To Do

We're big fans of using concept testing to make measurable improvements to our own apps and pages. We're always curious, studying the world's most popular websites to discover what they're getting right and where they're falling down. We recently used Verify to run studies on the world's top 20 consumer electronics websites, as ranked by Alexa. In the studies, we focused on the search fields of the electronics websites.

Millions of dollars can be lost if eCommerce visitors have trouble finding a simple way to search for products they are trying to buy. In order to test how effective various sites' search fields were, we created a Verify click test to record where people clicked after being asked to search for a product. You can try out the Verify test below:

We performed this test for each of the top 20 consumer electronics websites and collected 100 responses per site. We then analyzed the data to see what lessons we could apply to our own work. We'll start with some of the worst practices we found. Here are a few takeaways from low-scoring sites:
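For readers curious how numbers like "50% success rate" and "27 seconds on average" come out of a click test, the aggregation is simple: count the clicks that land inside the search field's bounding box, and average the time-to-click among those hits. Here's an illustrative sketch; the record format and field names are hypothetical, not Verify's actual export:

```python
# Illustrative sketch of aggregating click-test results.
# The data format below is hypothetical, not Verify's actual export.

def summarize(responses, target):
    """responses: list of dicts with 'x', 'y', 'seconds' for each click.
    target: (x_min, y_min, x_max, y_max) bounding box of the search field.
    Returns (success_rate, avg_seconds_among_hits)."""
    x_min, y_min, x_max, y_max = target
    hits = [r for r in responses
            if x_min <= r["x"] <= x_max and y_min <= r["y"] <= y_max]
    success_rate = len(hits) / len(responses)
    avg_seconds = sum(r["seconds"] for r in hits) / len(hits) if hits else None
    return success_rate, avg_seconds

# Example: four clicks, with the search field occupying (300, 20)-(600, 60)
clicks = [
    {"x": 350, "y": 40,  "seconds": 8},   # hit
    {"x": 120, "y": 400, "seconds": 30},  # miss (clicked a category link)
    {"x": 580, "y": 55,  "seconds": 10},  # hit
    {"x": 50,  "y": 30,  "seconds": 25},  # miss (clicked the logo)
]
rate, avg = summarize(clicks, (300, 20, 600, 60))
print(rate, avg)  # 0.5 success rate, 9.0 seconds average among hits
```

Note that the average time is computed over successful clicks only, which is how the per-site timings below should be read.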


TigerDirect — What Not To Do

TigerDirect had an abysmal 50% success rate when people were asked to click on the spot on the page where they would search for a product they want to buy. Even the people who did click on the right spot took an average of 27 seconds to find it. Why is that the case?

A poor content choice is throwing people off when they're trying to find TigerDirect's search field. The directive seems to arbitrarily restrict searching to "deals" and not "products." This confuses people.

Check out the full set of TigerDirect Verify Results →


FutureShop — What Not To Do

Future Shop did slightly better, with 58% of people correctly identifying the spot on the page where they would search for a product they want to buy. However, those who did click on the right spot took an average of 32 seconds to find it. What happened here? Why did it take them that long?

Visually indistinct search actions spread visitors' attention thin and left them unsure which spot to click on.

With no strong focal point for search, users were left hunting for the search field. Visitors' attention gets pulled everywhere else by the density of the screen and the lack of visual hierarchy for search or categories.

Check out the full set of FutureShop Verify Results →


eCost — The Right Way To Do It

eCost nailed it: 79% of our test takers successfully found its search field when prompted. Better yet, they took an average of just 9 seconds to do so. Why did eCost perform so much better?

Obvious contrast and prominent placement helped people find the search field much faster. eCost does a great job of using visual markers such as contrasting colors and breaking the grid to call out their search field.

Check out the full set of eCost Verify Results →

We use Verify to learn from other sites this way, in addition to testing our own sites to catch glaring problems such as poor content choices. As a result, we've improved the comprehension of our own pages and gained more happy customers. We'll be sharing more case studies in upcoming posts.

Run Your Own Verify Test

Dre (ZURB) says

Dmitry, thanks for the post! Interesting note. I just finished "Don't Make Me Think" (thanks for the recommendation) and this exact scenario was one of his notes. It's interesting how well a simple search box in an easy-to-see spot works.


Jon R. (ZURB) says

Great post, thanks Dmitry.

What service/method did you use to collect the 100 responses?


Dmitry (ZURB) says

@Dre - Thanks! Yea - love to find these details!

@Jon - the tool we used is Verify (http://verifyapp.com)


Matthijs (ZURB) says

Interesting findings, thanks for the post. I do have one remark about the test: for me it was not clear from the question what to do. " ...click on where you'd go to search for a product you want to buy". Search for what? Which product? On the screenshot I was presented, I just clicked on one of the product categories, because I thought something like "well, maybe I want to buy a television, let's click on electronics". So, for me it was not obvious I had to look for the search field at all!

Maybe I was not the only one experiencing this. Whether it would make much difference to the main conclusions in the article, maybe not. What do you think?


Dmitry (ZURB) says

@Matthijs - thanks for the shout. Great point! In the instructions we gave the test takers, we mentioned that they would have to pretend they had come to a consumer electronics site to shop for a product they wanted to buy (for example, a camera), and that they would be asked to click where they would perform certain actions on the following screen. That was the introduction. In the sample test above, we did not include this extensive introduction.


Matthijs (ZURB) says

Thanks for your reply, Dmitry. I can imagine that a more elaborate explanation would clear up any confusion (if there were any).

The most interesting thing for me was the fact that suddenly it was me being the test subject, and I had some hesitation about what to do - even if it was just for a few seconds. Normally (being a web designer) I'm just assuming all kinds of things about how users behave. I know I have to do a lot more real testing, and this just reinforces that idea.


Jon R. (ZURB) says

Sorry, I guess I wasn't clear enough. What service did you get your 100 respondents from? Did you use a usability service, or do you just have a large list of friends that you got the 100 responses from?


Visit Verify →

The ZURB Apps blog is a place where we discuss news, customers, tips and tricks about our web applications. For additional information on ZURB, check out the ZURBlog, where we discuss product design, business and strategy.