You can hire my services

I am Ben Lang, an independent web conversion specialist with over 20 years of experience in IT and digital and 12 years of Conversion Rate Optimization (CRO) know-how. I provide a full analysis of your website's conversion performance and execute tried and tested CRO exercises, through A/B testing, split testing or MVT (multivariate testing), to fix your online conversion issues. Contact me at https://www.benlang.co.uk/ for a day rate, or catch up with me on LinkedIn.

Exclude ineffective campaign traffic from your test audience [TEST 2]

We've recently conducted another multivariate/multivariable test on a key landing page on the website. We were looking to optimise this product page and then use the optimised page design to test against various landing page designs in the future. This test also gave us an indication of what sort of improvements we could expect by testing large changes in page layout within the existing product page template.
As usual we looked to increase throughput of the product page by 5% and improve our visitor-to-application ratio by 3% (this is always the target for each of our tests).
So we ran a multivariate test on one of our product pages, testing the following elements:
· Product description title
· Bullet point heading
· Bullet point wording
· Apply online button copy
The image below shows the winning, optimised page after testing on the left and the default, existing page design on the right.

We ran the test continuously for 34 days. For the last week of the test we had 3 page designs all showing similar improvements in conversion. As the differences between the 3 were minor, it was decided that rather than wait another month or two for one combination to reach 95% significance we would end the product page test prematurely. The winning combination had achieved a 57.42% confidence rate.
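As a rough illustration of what the testing console is doing when it quotes a confidence figure like that, here is a minimal sketch of a one-sided two-proportion z-test. The function names are mine, and the normal approximation is an assumption; your tool's reporting engine may well calculate confidence differently.

```typescript
// Minimal sketch: confidence that variant B converts better than variant A,
// using a one-sided two-proportion z-test (normal approximation).
// Illustration only; a real testing tool may use a different method.

function confidenceBbeatsA(
  visitorsA: number, conversionsA: number,
  visitorsB: number, conversionsB: number
): number {
  const pA = conversionsA / visitorsA;
  const pB = conversionsB / visitorsB;
  // Pooled conversion rate under the null hypothesis of no difference
  const pPool = (conversionsA + conversionsB) / (visitorsA + visitorsB);
  const se = Math.sqrt(pPool * (1 - pPool) * (1 / visitorsA + 1 / visitorsB));
  const z = (pB - pA) / se;
  return normalCdf(z); // e.g. 0.95 means 95% confidence that B beats A
}

// Standard normal CDF via the Abramowitz & Stegun 7.1.26 approximation
function normalCdf(z: number): number {
  const t = 1 / (1 + 0.2316419 * Math.abs(z));
  const d = 0.3989423 * Math.exp((-z * z) / 2);
  const p = d * t * (0.3193815 + t * (-0.3565638 + t * (1.781478 + t * (-1.821256 + t * 1.330274))));
  return z > 0 ? 1 - p : p;
}
```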
It should be noted that campaign activity significantly impacted the results for the default design in this test. The issue was that the majority of the 383k visits from campaign activity closed the browser instantly (we looked at samples, and on one site a large window displayed our product page over the homepage of a mobile phone site, preventing the visitor from seeing the phone site's offers). When the browser is closed quickly (before the test variant content is served) the testing tool records the session as 'Default'. The by-product of this is that conversion for the Default looks worse than it actually is.
Anyway, the winning page design resulted in a 3.56% conversion rate of visitors submitting an application. Due to the issues with campaign activity discussed above, the testing report console showed the default design converting at 0.95%, but after some further analysis (excluding campaign activity) we believe a fairer figure to be around 2.68%. On that basis the winning design is a 32% improvement over the default design.
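For anyone wanting to replicate that "fairer" default figure, the adjustment is just a matter of stripping the instantly bounced campaign sessions out of the default bucket before recalculating the rate. The sketch below shows the arithmetic; the function names are mine and no real session counts from our test are included.

```typescript
// Sketch: recalculating the default conversion rate after removing campaign
// sessions that bounced before any variant content was served.
// No real figures from our test are used here.

function adjustedConversionRate(
  defaultSessions: number,        // sessions recorded against 'Default'
  defaultConversions: number,     // applications submitted by those sessions
  bouncedCampaignSessions: number // campaign sessions that closed instantly
): number {
  const engagedSessions = defaultSessions - bouncedCampaignSessions;
  return defaultConversions / engagedSessions;
}

// Relative uplift of the winning variant over the adjusted default,
// e.g. (3.56 - 2.68) / 2.68 ≈ 0.33, in the region of the ~32% quoted above.
function relativeUplift(variantRate: number, defaultRate: number): number {
  return (variantRate - defaultRate) / defaultRate;
}
```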
So our findings were:
Buttons
A small red button was the best performing in the test and should be used as the default apply button throughout the site.
Key features
The two best-performing body copy designs both highlight headline key features of the product above the full detailed bullet points.
Page headings
Page headings are one of the most important elements in the conversion of a page and should not be changed without testing.
Campaign activity
From now on we will always exclude campaign activity from our tests.

SEO and CRO - a marriage made in hell?


There seems to be a bit of noise surrounding the impact of Conversion Rate Optimization activity (CRO, MVT, A/B testing etc.) upon Search Engine Optimization (SEO). Basically the idea runs that if you're rendering test variant content on a page during a test, and that content doesn't always contain the keywords that promote or sustain your website's search rankings, the result can be a negative effect on your SEO activity.


It seems it depends entirely upon your choice of test tool and experiment implementation methodology. With the exception of SiteSpect, most CRO testing tools employ JavaScript in some way to render the test on the page, which means that search engine robots and crawlers don't get to index the test content; they fall through to your default content. Apparently the good people at Google et al want their indexers to be part of the test.


From our own experience of conducting MVT and A/B testing, which incidentally involves using a tool that employs the JavaScript method of content rendering, we've understandably had a degree of pressure from our SEO colleagues to minimise the impact on their area during the course of a test. So we've arrived at a process that helps everyone, more or less. During the early part of our MVT tests we exclude all our search traffic, both paid and unpaid. We do this by looking for specific URL parameters that identify search traffic during the test page load event. If the visitor is classed as search traffic they fall through the test and are presented with the default (non-test) page content, as are the search engine indexing agents. We monitor the test variant performances and after a week or so (given enough traffic) we cull the variants that perform in an adversely negative way (see the earlier post regarding this culling procedure). Then, when we have a test that in theory is performing in a wholly positive way, with conversions up over the original default, we stop excluding the search traffic and bring it back into the test audience. This way we get more traffic volume to feed the test and SEO benefits from the optimised content.
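To make the exclusion mechanism concrete, here's a minimal sketch of the kind of check that could run during the page load event. The parameter names (gclid, utm_medium) and helper names are assumptions for the example, not our actual implementation or a Maxymiser API; substitute whatever identifies paid and organic search traffic on your own site.

```typescript
// Sketch: decide at page load whether a visitor should join the test audience.
// Parameter names below are illustrative; use whatever tags identify
// paid/organic search traffic on your own site.

function isSearchTraffic(url: URL, referrer: string): boolean {
  const params = url.searchParams;
  const paidSearch = params.has('gclid') || params.get('utm_medium') === 'cpc';
  const organicSearch = /google\.|bing\.|yahoo\./.test(referrer);
  return paidSearch || organicSearch;
}

function shouldEnterTest(excludeSearch: boolean): boolean {
  const search = isSearchTraffic(new URL(window.location.href), document.referrer);
  // During the early culling phase excludeSearch is true, so search visitors
  // (and crawlers) fall through to the default, non-test content.
  // Once the surviving variants are all beating the default, flip it to false.
  return !(excludeSearch && search);
}
```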


I think the concept we should always be focusing on with regards to SEO and CRO is that while testing can impact SEO rankings in the short term, in the long term the site will benefit from the optimised content and an overall uplift in conversion.

UPDATE: 2nd August 2011 ~ Here's a useful statement from Google on the use of Google Optimizer and its effect on SEO rankings.

Cover story - what magazines can teach us

So what do magazines like Heat, OK, Glamour and the National Enquirer have to do with Optimization and multivariate testing?





Well, as far as those glossies and web optimization are concerned, we're being told as marketeers to adopt the same design model these magazines use to get countless people to buy them week after week. These publications obviously have to compete in a fierce market where the cover is the singular thing that can make the difference between a huge circulation and being just another magazine on the shelf. When constructing our sales message or call to action we should endeavour to recreate that strong headline message or cover story that makes a customer buy, purchase, convert. According to industry analysts, 80% of the purchasing decision is based upon what's on the front cover of a magazine. These guys are the experts; they know the value of catching the eye and conveying the message in a single shot, so surely our web pages should aspire to do the same whenever we seek to achieve the biggest bang for our buck?

Now, in terms of magazines and multivariate testing, the National Enquirer, a tabloid magazine famous for running, amongst other things, items on alien abduction and conspiracy theories, has a massive circulation of around 2.5 million readers in the US.
Faced with a downturn in sales in recent times, the publishers decided to enlist a company to undertake an MVT experiment. This company developed several versions of the same magazine cover, distributed them in a limited number of outlets available only in a couple of states, and then monitored which one had the biggest sales. Despite the large cost involved in running the different covers, the Enquirer reported a 20% increase in sales as a direct result of the experiment. I think this illustrates the reach and scope of multivariate testing these days, showing that it's not just confined to the web.

Quick wins for web optimization

If I had to summarise the key points that I have heard again and again from experienced MVT testers, be it web-based or in published literature, I would say the following seem to be touted regularly as quick-win changes you can make on your site to gain conversion uplift. From personal experience some work and some just don't. As you will see, we're yet to put them all into practice:

1. If you have a call to action - make it BIG.

We have an MVT test running currently that has around 8 different variants of apply buttons. Some are big and some are small, in a variety of colours. Currently a large version of a newly designed button is beating the rest of the pack with an 18% uplift in conversion (it's orange, by the way). Optimization experts generally recommend making your button big and red to draw as much attention to it as possible. We had one such variant and it performed so badly that we culled it from the test. At least we've tested it and put that theory to bed.

2. Have more than one call to action on the page.

This should be a no-brainer really, but when you're running a test you need to be able to track people clicking on such buttons during the course of the test. If you have multiple buttons this can be tricky, and from my own experience there are limitations on how many 'actions' you can feasibly track during the testing period.
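As a rough sketch of what tracking several calls to action as one test 'action' might look like, something along these lines works; the data attribute and the recordAction helper are placeholders I've made up for the example, not a real testing-tool API.

```typescript
// Sketch: report a single 'cta_click' action whenever any tagged button is clicked.
// The data attribute and recordAction() are placeholders for whatever your
// testing tool exposes; they are not a real Maxymiser API.

function recordAction(actionName: string, detail: string): void {
  // In practice this would call the testing tool's tracking API.
  console.log(`action: ${actionName}, detail: ${detail}`);
}

document.querySelectorAll<HTMLElement>('[data-test-cta]').forEach((button) => {
  button.addEventListener('click', () => {
    recordAction('cta_click', button.dataset.testCta ?? 'unknown');
  });
});
```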

3. Have a testimonial of your product on the page.

We haven't been able to test this one in an MVT situation yet, but we have tests scheduled that will exhaustively test the theory in the future. We have, however, A/B tested the use of testimonials within the customer journey and there was no real uplift to speak of.

4. Use photos of people on your page.

We had an unofficial embargo on images on our site until recently due to a perceived impact on SEO rankings. This embargo has lifted and we can now start to use images with a vengeance once more. We've got a homepage MVT test scheduled which will pit images against text-based variants, and I'll let you know how that pans out.

5. Simplify your copy.

If you have loads of copy it's daunting, off-putting and boring. Find ways to either reduce your copy or organise it in such a way as to make it look less overwhelming. Try tabbed content, or use expand and collapse containers on a page. The key winning element of our first MVT test was to drastically reduce and simplify the copy on the page by moving some of it to a pop-up window. We've also been aided by recently having to comply with the Plain English Campaign. Initially it's a headache getting your content to comply with their rules, but once you've done it, it can make a world of difference to the way a message scans for the end user.
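For what it's worth, a bare-bones expand/collapse container of the kind we used to tuck the heavier copy away might look like the sketch below; the class and attribute names are made up for the example.

```typescript
// Sketch: toggle visibility of a block of heavy copy behind a "Read more" link.
// Class and attribute names are illustrative only.

document.querySelectorAll<HTMLElement>('.collapsible-toggle').forEach((toggle) => {
  const target = document.getElementById(toggle.dataset.target ?? '');
  if (!target) return;
  target.hidden = true; // collapsed by default, so the page scans quickly
  toggle.addEventListener('click', () => {
    target.hidden = !target.hidden;
    toggle.textContent = target.hidden ? 'Read more' : 'Read less';
  });
});
```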

I'll personally try to continue working these concepts into each of the tests we undertake as we move forward. Where they don't work, I'll hopefully learn from that and carry it forward to the next test. The key thing is to evolve your experiments as you go, bearing in mind that what works for one person may not work for another. Such simple changes can, however, be useful if there's tangible pressure to gain results and an uplift in sales.

Go LARGE


If there's one big lesson we've learned since starting multivariate testing on our website, it's that if you're going to test different content, your alternative content needs to be as radically different as possible from your existing content to get anything out of a test.
We currently have an MVT test running on a product page on the site. It has 4 MaxyBoxes*, each serving around 3-6 alternative variants. The test has been running for around 3 weeks now, and for a couple of reasons I won't go into here, we've had to restart it. However, the alternative content is not too different from the existing content: we basically have text with subtle changes from the original, essentially the same message re-jigged to see if the way we word a benefit or feature of a product produces any uplift in conversion. Looking at the reports for this test, the variants' performances are fairly uniform with no clear winner emerging from the pack, and our mistake of keeping the changes between each variant subtle is clear to see. I think from this test alone we can safely say that subtle changes just don't work in an MVT test. You would need months and months of relatively high traffic volumes to get a result, or what we call statistical significance.
The reason this test ended up with the 'softer', subtler variants is that it was planned back when we thought MVT testing could yield results from small changes that might add up to a bigger result.
So the big message is that if you really want to get results quickly from your tests you have to go large. You have to go with the bolder ideas that represent a shift from the existing content.
*MaxyBoxes are areas of the page, usually divs, that are tagged for serving multivariate test variant content from our testing tool, Maxymiser.
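For readers unfamiliar with the idea of tagged page areas, here's a very rough sketch of the concept, plus the combination arithmetic that makes subtle variants so expensive to test. The markup, the serveVariant helper and the specific variant counts are illustrative assumptions, not Maxymiser's actual API.

```typescript
// Sketch: a tagged page area ("box") and the variants a tool might serve into it.
// Generic illustration of the concept, not Maxymiser's real API.

interface TestBox {
  elementId: string;   // id of the div the variant content is injected into
  variants: string[];  // alternative HTML fragments, including the default
}

function serveVariant(box: TestBox, chosenIndex: number): void {
  const el = document.getElementById(box.elementId);
  if (el) el.innerHTML = box.variants[chosenIndex];
}

// With 4 boxes of, say, 3, 4, 5 and 6 variants, the full factorial is
// 3 * 4 * 5 * 6 = 360 page combinations, which is why subtle variants
// and modest traffic make it so hard to reach significance.
const combinations = [3, 4, 5, 6].reduce((product, n) => product * n, 1);
console.log(`page combinations: ${combinations}`);
```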

Culling your test variants


As an Optimization team, we are new to the whole multivariate testing business. We are more than aware that we don't always do things by the book. So when we undertake a multivariate test we don't always adhere to the basic principles of testing.

The biggest rule we tend to ignore is 'Do not tamper with your test'. The trouble is we always have to have an eye on the bottom line. So we are constantly asking ourselves whether what we're undertaking in terms of testing is impacting our sales in a negative way. Are we actually reducing the conversion rate on the website?
My colleague generally monitors what's going on downstream during a test and looks at the basic application submission rates for our products. If he notices a downturn in conversion rate during the course of a test, we get a bit nervous.

Thankfully our multivariate testing tool, Maxymiser, allows us to look at how individual variants are performing. If after a period (usually around one week into a test in our case) we start to see a downturn, we'll examine closely which variants we can 'cull' from the test.
Once we've identified the under-performers, we downweight* them out of the test entirely. This is beneficial for two reasons:

1. You minimise negative impact on conversion and sales.

2. You reduce the number of page combinations in the test.

The lower the number of page combinations, the shorter your test period. This is great for us because of the second rule of testing that we frequently ignore: 'Allocate enough time for testing'. Basically we run tests for a far shorter period than is recommended.
Most tests, given enough visitor traffic to your site, run anywhere from 4 to 10 weeks, or even longer. We have tended to run tests for 2 to 6 weeks. Our excuse is that there is so much other activity going on on the website at any given time, by other people, that we have a very narrow window in which to test and get a result.

Another key thing when planning your MVT test is knowing how much traffic your site gets and whether you've got enough to run all your page combinations and see an outright winner at the end of your test. So far we've been reasonably lucky in that we've had enough traffic to run the tests for a relatively short period and still achieve a winner.
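As a rough guide to the "have we got enough traffic?" question, the standard two-proportion sample-size estimate is sketched below. The formula is the usual normal approximation and the example inputs are assumptions, not figures from our own tests.

```typescript
// Sketch: visitors needed per variant to detect an uplift in conversion rate,
// using the standard two-proportion sample-size approximation.
// Example inputs are illustrative, not our real traffic figures.

function visitorsPerVariant(
  baselineRate: number,   // e.g. 0.03 for a 3% conversion rate
  relativeUplift: number, // e.g. 0.05 to detect a 5% relative improvement
  zAlpha = 1.96,          // two-sided 95% significance
  zBeta = 0.84            // 80% power
): number {
  const p1 = baselineRate;
  const p2 = baselineRate * (1 + relativeUplift);
  const variance = p1 * (1 - p1) + p2 * (1 - p2);
  const n = ((zAlpha + zBeta) ** 2 * variance) / (p1 - p2) ** 2;
  return Math.ceil(n);
}

// e.g. a 3% baseline and a 5% relative uplift needs a very large audience per variant:
console.log(visitorsPerVariant(0.03, 0.05));
```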

Obviously, ignoring key testing rules and principles is not recommended. But if, like our Optimization team, you're stuck between a rock and a hard place and there's a pressing need to get some kind of testing done, our early experience has shown that you can bend the rules to get some kind of learning or outcome in a short space of time.
* In multivariate testing, each variant is usually allocated a weighting. For example, if you give a variant a 50 weighting in the test console it will be served 50% of the time, while the default content is served the other 50% of the time.
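To make the weighting idea concrete, here's a small sketch of weighted random variant selection, with a culled variant simply downweighted to zero. It's a generic illustration rather than how the Maxymiser console actually does it.

```typescript
// Sketch: pick a variant according to its weighting; a culled variant's
// weight is set to 0 so it is never served again. Generic illustration only.

interface WeightedVariant {
  name: string;
  weight: number; // e.g. 50 means served ~50% of the time relative to the total
}

function pickVariant(variants: WeightedVariant[]): WeightedVariant {
  const eligible = variants.filter((v) => v.weight > 0);
  const total = eligible.reduce((sum, v) => sum + v.weight, 0);
  let roll = Math.random() * total;
  for (const v of eligible) {
    roll -= v.weight;
    if (roll <= 0) return v;
  }
  return eligible[eligible.length - 1]; // fallback for floating-point edge cases
}

// Culling an under-performer is just downweighting it to zero:
const variants: WeightedVariant[] = [
  { name: 'default', weight: 50 },
  { name: 'variantA', weight: 50 },
  { name: 'variantB', weight: 0 }, // culled
];
console.log(pickVariant(variants).name);
```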

Our 1st multivariate test - Killer Questions




We have an application form that is used for various products; in this case we focused on the one used for an online current account.
The first page of this app form is what we call the 'Killer Questions' page. This is where you set the scene for the applicant in terms of the criteria they have to meet to continue with their application and what bits of information they're going to need to have to hand to complete the application process.
Typically we could expect around a 35-40% drop-off rate for this page. This could of course be because people simply don't meet the criteria and depart the application process. Alternatively, it could be the fault of the page design, which was by nature quite weighty in terms of the volume of information that needed to be conveyed to the user at that point. Based upon the hypothesis that the content might be at fault, we decided this was the best candidate for hosting our first multivariate test. Here we would test different content on the killer questions page to see if an uplift in the number of people commencing the application could be achieved. We looked to gain a 5% uplift overall.
The Killer Questions page looked rather like this.
The 2 boxes are 'MaxyBoxes'.
To carry out multivariate testing effectively on your website you're going to need a tool to host and serve the alternative page content you want to test, and additionally a means of reporting the progress and outcome of your test. We opted for Maxymiser to achieve all this. The MaxyBoxes you see there represent the areas of the page that will have variant content served to the visitor.
So basically in this particular test we wanted to see what would happen when we served different security images in the top-right of the page and different page content or layout in the second MaxyBox on the page (the larger box).
We split the traffic 50-50: half were served the default content, the other half a combination of new test variants we'd come up with. Coming up with alternative content is a whole new learning experience in itself, as no one in our Optimization team is a copywriter or creative type, but I'll go into this in more detail in a later post.
After several weeks of multivariate testing we ended up with a winner...

You may notice that this has a different security image to the original and simplified page copy; all the heavy and intimidating legal wording was moved to a pop-up link.
This page design resulted in 8% more people passing through the killer questions page compared to the default design. It also resulted in 5% more people going on to submit the application. When accept rate* was taken into account the new page design resulted in a 20% increase in productivity over the default design.
This test, although it's our first ever, will probably turn out to be our most important. That's because it sits at the very last point in the visitor's journey where we can influence whether they buy into the product or not. It's make or break time when they're on the doorstep of the application form, the point at which they choose to purchase or run away.
* note: You can apply for a current account, but you might not necessarily be accepted as a customer, hence an accept rate.