Pew Pew Laser Blog

Code. Glass art. Games. Baking. Cats. From Seattle, Washington and various sundry satellite locations.

Blogs about optimization

Learn the Right Lesson From Your A/B Test.

6.15.2013

I can't remember exactly where, but a few weeks ago I read an article / blog post which stuck in my mind. The author stated that an A/B test had shown reduced visit times and a higher bounce rate when the article creation date was displayed at the top of the article rather than at the bottom. So the author kept the "date on bottom" aspect from the original design.

I understand the visitor behavior demonstrated by that test. When I'm looking for technical advice, I absolutely use an article's date as a signal of how trustworthy the information is. Three years ago - especially in web development time - is ancient history.

It seems to me that the blog author learned the wrong lesson from that A/B test. The visitors were saying "We prefer relevant and up-to-date information". Leaving the article's date at the bottom of the article may have hurt visitors' goodwill (aka trust in the brand) towards the site, and that's something that can't be A/B tested. A better solution would have been to add an update to the top of the articles which gives the visitor a current best practice or points them to a new, relevant article.

What Testing and Optimization Can't Do.

6.12.2010

Last time, I talked about what optimization testing can do for your site. Now, I'll talk about what it can't do.

What Optimization Testing Can Do.

6.6.2010

Optimization - testing to find out which elements drive desired visitor actions - can do a lot for your web pages.

Ready to start? Very good. If you've got no money, Google has a free optimization tool. Of course, my company has an optimization tool too.

What I'd Test.

2.9.2009

I've been imagining doing some optimization testing on Woot's new design. This is a bit of an intellectual exercise, and a bit of show-and-tell for designing a web page optimization test.

All test elements should run concurrently, and for at least a week. Woot's conversion rate is probably highly dependent on what's for sale that day. If a test were run where the display of the elements was non-random (one element per day, for example), the results would be so heavily influenced by the desirability of the item for sale that the element's actual influence on conversion would be impossible to suss out. The site is highly dynamic, but even the dynamic elements can be tested using style changes and JavaScript.
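
To make the "random and concurrent" part concrete, here's a minimal sketch of how visitors could be bucketed with JavaScript. The test name, cookie, and "discussions" element id are all made up for the example - they're not Woot's actual markup.

```javascript
// Minimal sketch of random, persistent variant assignment.
// Each visitor is bucketed once; the cookie keeps them in the same
// bucket for the whole week of the test.
function getVariant(testName, variantCount) {
  var cookieName = 'ab_' + testName;
  var match = document.cookie.match(new RegExp(cookieName + '=(\\d+)'));
  if (match) {
    return parseInt(match[1], 10);
  }
  var variant = Math.floor(Math.random() * variantCount);
  document.cookie = cookieName + '=' + variant +
    '; path=/; max-age=' + (60 * 60 * 24 * 7);
  return variant;
}

// Example: hide a (hypothetical) discussions box for half of all visitors.
if (getVariant('discussions-box', 2) === 1) {
  var discussions = document.getElementById('discussions');
  if (discussions) {
    discussions.style.display = 'none';
  }
}
```

Because every visitor is bucketed independently and all the variants run side by side for the full week, the desirability of that day's item hits each variant equally instead of drowning out the element's real effect.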

I'd measure the results of 2 different actions - signing up a new user and purchasing the item. There's a good chance that the test elements would influence these events quite differently.

[Screen shot of Woot with an overlay of the optimization test areas]
Above is a screen shot with an overlay of the elements I'd test for optimization. For each of these test elements, I'd test the original versus a new idea:

  1. I'd try different colors on this Call to Action button. In subsequent tests, I'd test the language of the button.
  2. I'd like to try a different treatment on this quick product information box. Creative styling (like what I've drawn, only good) that highlights the Call to Action button might prove effective.
  3. Hiding the Discussions box as a test element will tell us whether visitors are being distracted from converting.
  4. This smaller headline might prevent people from seeing the rest of the content that would be more convincing.
  5. Same for the larger headline.
  6. I'd test hiding this advertisement box. Once its influence on conversion rates is identified, we can look at the advertising revenue's ROI to see if the ad is worth keeping.
  7. Lastly, I'd try moving this more straightforward and detailed description of the product near the top of the page.

All in all, that's 7 different elements to test. If we test only 2 versions of the Call to Action button, we can do it in 8 experiments (the number of unique pages displayed) using a fractional factorial array. If Woot's traffic volume and conversion rate support 16 experiments, we can test 4 versions of the Call to Action button, as well as more versions of the other test elements, or a few completely new test elements.
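
For the curious, here's roughly the 8-experiment array I have in mind - a sketch only, with the element names as shorthand for the seven items above. Three of the elements get a full 2x2x2 factorial, and the other four are assigned to the interaction columns, so every element still shows its original and new version an equal number of times.

```javascript
// Rough sketch of an 8-run fractional factorial (an L8 orthogonal array)
// covering 7 two-level test elements. 0 = original version, 1 = new version.
var elements = ['CTA color', 'product info box', 'hide discussions',
                'small headline', 'large headline', 'hide ad', 'description on top'];

var runs = [];
for (var i = 0; i < 8; i++) {
  var a = (i >> 2) & 1;   // element 1: full factorial on the first three
  var b = (i >> 1) & 1;   // element 2
  var c = i & 1;          // element 3
  // Elements 4-7 ride on the interaction columns, which keeps every column
  // balanced: each element shows each of its versions in exactly 4 pages.
  runs.push([a, b, c, a ^ b, a ^ c, b ^ c, a ^ b ^ c]);
}

// Print the 8 unique pages we'd actually serve.
console.table(runs.map(function (run) {
  var page = {};
  elements.forEach(function (name, col) {
    page[name] = run[col] ? 'new' : 'original';
  });
  return page;
}));
```

The catch with only 8 pages is that the main effects are confounded with interactions between elements, which is part of why the extra traffic for 16 experiments buys so much more room.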

300 Million Reasons to Optimize.

2.2.2009

A recent article about web optimization and the visitor experience got a lot of attention - The $300 Million Button. It's a really exciting read - I recommend you check it out even if you're non-technical.

To summarize: by removing a "Register" button and adding reassuring "you can just buy" language, the e-commerce site increased their conversions by 45%. (Forty-five percent is huge - 300,000,000 dollars is ginormous.)

One lesson I see in these results is that if you test big things, you can see big results. It probably took a ton of work to set up the business rules and back-end support so that visitors could simply buy an item without signing up. This effort paid off. The results of the testing are just another example of users despising the sign-up process. Think about it - do you need another password to remember?

(I suspect that the "major e-commerce site" is Barnes and Noble. I received the "you don't have to sign up" experience when I was shopping recently - and it was quite influential. It convinced me to just buy the item I was looking for, instead of hunting around for some easier way.)

Jared Spool, the usability tester and article author, is rather famous within the web design industry. I really appreciate Spool's sharing of his results, and the excitement that this article is bringing to the optimization industry. Both Spool and Luke Wroblewski will be speaking at a conference in Seattle this spring. I hope I get to go.
