Blogs about optimization
I can't remember exactly where, but a few weeks ago I read an article / blog post which stuck in my mind. The author stated that an A/B test had shown reduced visit times and a higher bounce rate when the article creation date was displayed at the top of the article rather than at the bottom. So the author kept the "date on bottom" aspect from the original design.
I understand the visitor behavior demonstrated by that test. When I'm looking for technical advice, I absolutely use an article's date as a signal of how trustworthy the information is. Three years ago - especially in web development time - is ancient history.
It seems to me that the blog author learned the wrong lesson from that A/B test. The visitors were saying "We prefer relevant and up-to-date information". Leaving the article's date at the bottom of the article may have hurt visitors' goodwill (aka trust in the brand) towards the site, and that's something that can't be A/B tested. A better solution would have been to add an update to the top of the articles that gives the visitor a current best practice or points them to a new, relevant article.
Last time, I talked about what optimization testing can do for your site. Now, I'll talk about what it can't do.
- It can't generate new traffic for your site. But optimization testing can help you do more with the traffic that you do get.
- It can't get your web site / IT organization / content management system in order. Testing requires making at least one change to the page, and optimization testing can't make a web development team more responsive.
- It can't clarify your web site's marketing objectives. It can tell you how well existing objectives are being met.
Optimization - testing to find out which elements drive desired visitor actions - can do a lot for your web pages.
- One of the most obvious benefits of optimization testing is finding out what your page's conversion rate is. And you can find out when that conversion rate drops or increases.
- A good optimization tool can tell you what segments - specific niches of visitor traffic - exist within your overall traffic.
- Testing can tell you if those expensive Flash pieces are, in fact, more effective than their static counterparts. Wouldn't it be nice to justify those costs, or to know if you could get better results with something cheaper?
- Most importantly, testing will tell you what is most appealing to visitors; what content changes will drive visitors to conversion actions. How can you possibly know what your visitors like without testing your assumptions?
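To make the first benefit concrete: deciding whether a conversion rate has genuinely dropped or increased between two page versions is a standard two-proportion z-test. Here's a minimal sketch - the traffic and conversion numbers are made up for illustration, not from any real test:

```python
import math

def conversion_z_test(conv_a, n_a, conv_b, n_b):
    """Compare two conversion rates with a pooled two-proportion z-test.
    conv_* = number of conversions, n_* = number of visitors."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    # Pooled proportion under the null hypothesis (no real difference)
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    return p_a, p_b, z

# Hypothetical example: 10,000 visitors per variant
p_a, p_b, z = conversion_z_test(200, 10000, 260, 10000)
print(f"A: {p_a:.1%}  B: {p_b:.1%}  z = {z:.2f}")  # |z| > 1.96 means significant at 95%
```

With these made-up numbers, B's 2.6% beats A's 2.0% with |z| above 1.96, so the lift would be statistically significant at the 95% level - which is exactly the kind of answer a good optimization tool gives you automatically.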
Ready to start? Very good. If you've got no money, Google has a free optimization tool. Of course, my company has an optimization tool too.
I've been imagining doing some optimization testing on Woot's new design. This is a bit of an intellectual exercise, and a bit of show-and-tell for designing a web page optimization test.
I'd measure the results of 2 different actions - signing up a new user and purchasing the item. There's a good chance that the test elements would influence these events quite differently.
Above is a screen shot with an overlay of the elements I'd test for optimization. (Click for a full size screen shot) For each of these test elements, I'd test the original versus a new idea:
- I'd try different colors on this Call to Action button. In subsequent tests, I'd test the language of the button.
- I'd like to try a different treatment on this quick product information box. Creative styling (like what I've drawn, only good) that highlights the Call to Action button might prove effective.
- Hiding the Discussions box as a test element will tell us whether visitors are being distracted from converting.
- This smaller headline might prevent people from seeing the rest of the content that would be more convincing.
- Same for the larger headline.
- I'd test hiding this advertisement box. Once its influence on conversion rates is identified, we can look at the ROI of the advertising revenue to see if it's worth keeping the ad.
- Lastly, I'd try moving this more straightforward and detailed description of the product near the top of the page.
All in all, that's 7 different elements to test. If we test only 2 versions of the Call to Action button, we can do it in 8 experiments (the number of unique page versions displayed) using a fractional factorial array. If Woot's traffic volume and conversion rate support 16 experiments, we can test 4 versions of the Call to Action button, as well as more versions of the other test elements, or a few completely new test elements.
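That 8-run array can be built with the textbook 2^(7-4) construction: three base factors take every on/off combination, and the other four columns are their XOR interactions, so all seven columns stay balanced. Here's a sketch - the element names are my shorthand for the seven test elements above, and "A"/"B" simply label the original versus the new idea:

```python
from itertools import product

# 2^(7-4) fractional factorial: 8 runs, 7 two-level factors.
# Columns: a, b, c and their XOR interactions ab, ac, bc, abc.
runs = [(a, b, c, a ^ b, a ^ c, b ^ c, a ^ b ^ c)
        for a, b, c in product([0, 1], repeat=3)]

elements = ["CTA button", "info box", "discussions", "small headline",
            "large headline", "ad box", "description position"]

for i, run in enumerate(runs, 1):
    settings = ", ".join(f"{name}={'B' if v else 'A'}"
                         for name, v in zip(elements, run))
    print(f"page {i}: {settings}")
```

Each element shows its "B" version on exactly 4 of the 8 pages, which is what lets you estimate every element's effect on conversions from only 8 page variations instead of all 2^7 = 128 combinations.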
A recent article about web optimization and the visitor experience got a lot of attention - The $300 Million Button. It's a really exciting read - I recommend you check it out even if you're non-technical.
To summarize, by removing a "Register" button and adding reassuring "you can just buy" language, the e-commerce site increased their conversions by 45%. (Forty-five percent is huge - 300,000,000 dollars is ginormous.)
One lesson I see in these results is that if you test big things, you can see big results. It probably took a ton of work to set up the business rules and back-end support so that visitors could simply buy an item without signing up. That effort paid off. The results of the testing are just another example of how much users despise signing up. Think about it - do you need another password to remember?
(I suspect that the "major e-commerce site" is Barnes and Noble. I received the "you don't have to sign up" experience when I was shopping recently - and it was quite influential. It convinced me to just buy the item I was looking for, instead of hunting around for some easier way.)
Jared Spool, the usability tester and article author, is rather famous within the web design industry. I really appreciate Spool's sharing of his results, and the excitement that this article is bringing to the optimization industry. Both Spool and Luke Wroblewski will be speaking at a conference in Seattle in spring. I hope I get to go.