One of our clients was embarking on a total redesign of their online marketing presence – an arduous process with a lot of decisions to make. Design decisions, user experience decisions, branding decisions, and functional decisions, just to name a few. Every decision has a potential impact on the bottom line, so how do you go about making them?
This client had one landing page whose design had been unchanged for a long time. Since it was the only landing page still using the legacy design, they were interested in measuring how it performed against the proposed redesign, which had already been applied to their other landing pages.
What to Do?
We set up an A/B test with the control – the A version – being their existing layout and the test – the B version – redesigned to match their new landing pages. Half of the visitors were sent to the control and half were sent to the test.
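The post doesn’t describe the mechanics of the split, but a 50/50 test like this is usually done by bucketing each visitor deterministically rather than flipping a coin on every page view, so a returning visitor always sees the same version. A minimal sketch in Python (the function name and hashing scheme are illustrative, not the client’s actual setup):

```python
import hashlib

def assign_variant(visitor_id: str) -> str:
    """Bucket a visitor into the control (A) or the test (B).

    Hashing the visitor id instead of choosing randomly keeps the
    assignment stable, so the same visitor always lands in the
    same bucket across repeat visits.
    """
    digest = hashlib.md5(visitor_id.encode("utf-8")).hexdigest()
    return "A" if int(digest, 16) % 2 == 0 else "B"
```

Because the hash is effectively uniform, roughly half of all visitors land in each bucket without any shared state between servers.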
Take a look at the two designs and try to guess what the results were.
You’ve got a 50-50 chance of guessing correctly, so let’s raise the stakes a little bit. Pretend it’s a landing page you are responsible for, and that one of the pages generated twice as many leads as the other (which it did). Now that half of your potential customers are at risk, guessing doesn’t seem like such a good idea.
Well, it turns out the control – the old design – was twice as effective as the redesigned page. The visitors coming to that landing page were more inclined to respond to the messaging, layout, and contact form the client had been using all along. That’s the good news.
The bad news? If the old design works better than the new design, what does that mean for all the landing pages with the new design? Are they all under-performing? Are they turning away leads? Are they throwing away revenue? Surprisingly, the answer isn’t “Yes,” but rather, “Maybe.”
That’s Being Data-Driven
The immediate – and hasty – response in these situations is usually to roll out the winning design everywhere. But these landing pages were all parts of different campaigns targeting different audiences in different channels. Rolling out such drastic changes without testing would be just as reckless as updating the old page without testing. The preliminary test was designed to determine whether there was a difference in performance between the pages, and it did: it established that one of the designs was more effective in a certain context. It didn’t establish that the design was more effective in every context and, more importantly, it didn’t establish why.
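Before acting on a result like this, it’s also worth confirming that the difference is real and not just noise. A standard way to do that is a two-proportion z-test on the conversion counts; the sketch below uses hypothetical figures (the post doesn’t publish the client’s actual numbers), with the control converting at twice the test’s rate:

```python
from math import erf, sqrt

def two_proportion_z(conv_a: int, n_a: int, conv_b: int, n_b: int):
    """Two-sided z-test for a difference between two conversion rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    # Pooled rate under the null hypothesis that both pages convert equally
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    # Two-sided p-value from the standard normal CDF
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Hypothetical numbers: 200 leads from 5,000 control visitors
# vs. 100 leads from 5,000 test visitors.
z, p = two_proportion_z(200, 5000, 100, 5000)
```

With samples that size, a doubled conversion rate yields a p-value far below 0.05 – strong evidence the gap isn’t chance. With only a few dozen visitors per page, the same ratio could easily be noise, which is one more reason not to roll the result out everywhere untested.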
Taking an iterative approach to design, allowing visitors to “vote” by their actions and reactions, and using that data to make sound decisions is what smart web marketing is all about.