You can hire my services

I am Ben Lang, an independent web conversion specialist with over 20 years of experience in IT and digital and 12 years of Conversion Rate Optimization (CRO) know-how. I provide a full analysis of your website's conversion performance and the execution of tried and tested CRO exercises, through AB testing, split testing or MVT (multivariate testing), deployed to fix your online conversion issues. Contact me at https://www.benlang.co.uk/ for a day rate, or catch up with me on LinkedIn.

Optimizely Experience London event 9th Oct 2014

I attended this very popular event today, where we were given several case studies from various clients including The Student Room* and Haymarket.

Optimizely looks to be a very capable testing tool, with a leaning towards either a self-managed service or buying in Optimizely Partner consultancy.

Key takeaways for me today were:

1. Optimizing for specific audiences. Split out and target your sources of business, then focus on improving each individually, i.e. spend 20% of the effort to gain 80% of the return.

2. Use mobile wrappers to do proof-of-concept testing before spending big on native app development.

3. The Optimizely native app testing tool is amazing and I want it! An engaging demo of changing the Flappy Bird app on the fly. This was the big one for me today: effortless AB and MVT testing of an iOS native app. To me this tool is what sets Optimizely apart from the pack. A seemingly trivial demo using a popular game showed the power to change a user experience within an app with just a few clicks. This wasn't creating a test before the app was compiled; this was after the app was live.



In summary, a good day to meet others grappling with the same issues as me in Conversion Rate Optimization. Lots of HiPPOs (Highest Paid Person's Opinions) in organisations still making subjective decisions about web design, and an expectation from the business of a successful test every time.

* Love this. The Student Room drag unwitting students off the streets of Brighton, ask them their opinion of their website, and then give them a tenner for their troubles. This kind of 'street' qualitative testing I like.


Increasing average basket value

Back in 2009 I wrote an article on increasing average spend, citing the creation of the Wilkos 97p shop. Many aspects of this blog have witnessed the web evolve, run with some ideas and drop others. At the time the 97p shop was a novelty; now, some five years later, $1 or £1 departments are extremely common across ecommerce websites. They are an easy and powerful tool for driving an uplift in average basket value, and as every seasoned online retailer knows, increasing the average value of the online shopping trolley is absolutely crucial: it can make the difference between a profitable and a loss-making retailer.

These departments are so effective as a revenue driver because they are so flexible: you can deploy them at the front end of the sales funnel or in the checkout process, and both are effective strategies. Adding low-value items to the basket at checkout is the obvious method, since the customer has already broken through any 'objections' or 'resistance factors' to purchase; adding ancillaries there works exactly as it does when checking out in a bricks-and-mortar shop. Shiny, low-value items at the checkout draw us in, and we spend less time considering the purchase decision because the major hurdle has already been cleared by the other items in the trolley.

When deployed up front, as in the case of Viking Direct, the selling message is that they have office supplies even the most modest of wallets can afford, breaking down purchase resistance right at the start of the journey. Either way, if you're in retail, deploying this technique from your optimisation toolkit is highly likely to yield dividends, as the last five years have proven.
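
To put some rough numbers on why these low-value departments move the needle, here's a minimal sketch of the arithmetic, with invented figures purely for illustration:

```typescript
// Worked example (invented figures): effect of a 97p-style add-on on
// average order value (AOV). AOV = total revenue / number of orders.
const orders = 1_000;        // orders in the period
const baseRevenue = 25_000;  // £25 average basket before add-ons
const addOnPrice = 0.97;     // the impulse item at the checkout
const attachRate = 0.30;     // assume 30% of orders take the add-on

const baseAov = baseRevenue / orders;                   // £25.00
const addOnRevenue = orders * attachRate * addOnPrice;  // £291.00
const newAov = (baseRevenue + addOnRevenue) / orders;   // £25.29

console.log(`AOV uplift: ${((newAov / baseAov - 1) * 100).toFixed(2)}%`);
// => roughly a 1.16% uplift with no extra traffic or orders required
```

Even a modest attach rate on a sub-£1 item lifts average basket value with zero additional traffic, which is exactly why the technique is so cheap to test.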



It's interesting to note that Wilkos themselves have abandoned their 97p shop. However, the phenomenon has percolated very much into the world of online movie rentals. The likes of iTunes, Amazon and previously LoveFilm, Blockbuster (now defunct) and currently BlinkBox have all had their 99c or 99p rental sections, although here the act of movie rental is a departure from the rest of retail, as the primary product is the movie rental/sale itself. What it is still doing is breaking down resistance: someone who hasn't rented online before might be willing to try the experience for a modest outlay. Additionally, it helps to keep those already onboarded in the practice of renting. Again, it's a win-win selling and marketing technique.

http://www.blinkbox.com
 

Netflix a/b testing

Last week I came across an item on the Netflix Tech Blog that showed a slide deck and presentation of their plans for a/b testing their new web interface running on Node.js.

It was all very worthy stuff, and I wasn't entirely sure what they were doing or trying to achieve with this testing; the body of the content spent most of its time waxing lyrical about not having to touch their underlying system platform. I guess this is a big deal for Netflix! Anyway, as an avid Netflix user I believe I have witnessed their testing strategy firsthand, but in a rather perplexing fashion. Below is a screengrab of what I'm talking about. Under the documentaries recommendations I frequently see at least two listings of 'The Long Way Down'. These are for the same programme/episode, but one is shown with some weird Instagram effect and the second in monochrome, with a variation in image.

Now this is either a mistake or an entirely new way of a/b testing. I've never seen test variations (if that's what they are) presented side by side before. If this is a legitimate test, is it not a rather crude means of promoting an episode to the end user? Is the end goal to see whether fans of Instagram filters opt for one creative over another? Or is 'The Long Way Down' just such unmissable viewing that it warrants mentioning A LOT? Answers on a postcard. Happy testing : )
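
Purely speculating, but if those tiles really are test creatives, then a de-duplication quirk like the sketch below, which is entirely my own invention and not Netflix's code, would produce exactly this side-by-side effect:

```typescript
// Hypothetical sketch (not Netflix code): if a recommendations row is
// de-duplicated on (titleId, artworkVariant) rather than titleId alone,
// the same title survives once per artwork variant during a test.
interface Recommendation { titleId: string; artworkVariant: string }

function dedupe(recs: Recommendation[]): Recommendation[] {
  const seen = new Set<string>();
  return recs.filter((r) => {
    // Bug or feature: the variant is part of the key, so the same
    // programme with two artwork treatments appears twice.
    const key = `${r.titleId}:${r.artworkVariant}`;
    if (seen.has(key)) return false;
    seen.add(key);
    return true;
  });
}

console.log(dedupe([
  { titleId: 'long-way-down', artworkVariant: 'filtered' },
  { titleId: 'long-way-down', artworkVariant: 'monochrome' },
])); // both rows survive, hence the duplicate tile
```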


What is MVT testing?

In its simplest form, MVT or multivariate testing is where you test alternative experiences of an existing webpage against the current page design or user journey. Unlike AB or split testing, where you test one webpage design against another, in MVT you test a combination of page elements, like alternative page copy, headings, images, buttons and so on, all at once, to see which specific combination of alternative designs works best in driving page visitors or customers towards an end goal of your choosing.
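
To illustrate how the combinations stack up, here's a minimal sketch of a full-factorial MVT matrix; the element names and variants are invented for the example:

```typescript
// Minimal sketch of full-factorial MVT combinations (example elements
// are invented): every heading x image x button pairing becomes one
// test experience, so the counts multiply quickly.
const elements = {
  heading: ['Save today', 'Free delivery'],
  image: ['lifestyle.jpg', 'product.jpg'],
  button: ['Buy now', 'Add to basket', 'Checkout'],
};

// Cartesian product: 2 headings x 2 images x 3 buttons = 12 combinations.
const combinations = Object.entries(elements).reduce<Record<string, string>[]>(
  (acc, [name, variants]) =>
    acc.flatMap((combo) => variants.map((v) => ({ ...combo, [name]: v }))),
  [{}],
);

console.log(combinations.length); // 12
```

Two headings, two images and three buttons already give twelve experiences to serve, which is why MVT is so much hungrier for traffic than a simple AB test.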

All testing, regardless of whether it's an AB or MVT test, requires two things: time and traffic. By that I mean you need to give a test enough time to establish what's called statistical significance, to basically reach a stable conclusion, and you will also require a volume of website traffic to churn through your test experiences. Typically you need less traffic for an AB test, which normally has only a couple of alternative page designs to get through. In an MVT test you will typically have a higher number of test combinations, and will therefore require enough traffic to 'feed' your test.
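
For a rough feel of the traffic involved, here's a back-of-envelope sample-size sketch using the standard two-proportion formula at 95% confidence and 80% power; any decent test calculator will give you a more precise figure:

```typescript
// Back-of-envelope sample size per variation for an AB test, using the
// standard two-proportion formula. Illustrative only.
function sampleSizePerVariation(baseRate: number, upliftRel: number): number {
  const zAlpha = 1.96; // two-sided 95% confidence
  const zBeta = 0.84;  // 80% power
  const p1 = baseRate;
  const p2 = baseRate * (1 + upliftRel);
  const numerator =
    (zAlpha * Math.sqrt(2 * ((p1 + p2) / 2) * (1 - (p1 + p2) / 2)) +
      zBeta * Math.sqrt(p1 * (1 - p1) + p2 * (1 - p2))) ** 2;
  return Math.ceil(numerator / (p2 - p1) ** 2);
}

// e.g. a 3% base conversion rate, hoping to detect a 10% relative uplift:
console.log(sampleSizePerVariation(0.03, 0.10));
// => ~53,000 visitors per variation, before you even add MVT combinations
```

An MVT test splits that same traffic across many more combinations, so each cell takes proportionally longer to reach significance.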

In all test situations you continue to send a proportion of traffic to your existing or default page to benchmark your alternative experiences against. This is where you calculate any uplift in conversion, be that click-through rate or the number of people getting to and passing through your checkout process.

Testing outcomes can be positive or negative. A positive outcome might be that you test a red checkout button versus a blue checkout button and the red button delivers 8% more purchases. A negative outcome might be that the same button delivers a 5% drop in purchases. Either way, you have a valuable learning that tells you what changes to make to your site or informs further testing.
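
Sticking with the red-versus-blue button example, the uplift sum itself is simple; the visitor and purchase counts below are invented for illustration:

```typescript
// Minimal uplift calculation against the control, for the red vs blue
// button example above (counts invented for illustration).
function uplift(controlConv: number, controlN: number,
                variantConv: number, variantN: number): number {
  const controlRate = controlConv / controlN;
  const variantRate = variantConv / variantN;
  return (variantRate / controlRate - 1) * 100;
}

// Blue button (control): 500 purchases from 10,000 visitors (5.0%)
// Red button (variant):  540 purchases from 10,000 visitors (5.4%)
console.log(`${uplift(500, 10_000, 540, 10_000).toFixed(1)}% uplift`); // 8.0%
```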

The Gutenberg rule - revisited

In a previous post about the Gutenberg rule I mentioned that we'd achieved some good AB and MVT testing results using this design principle. The principle works off the theory that humans subconsciously scan a print or web page from top left to bottom right and then loop back up the page. Allowing this principle to govern your page layout seems to have become the norm in web design, placing buttons and other key calls to action in the 'fertile' areas, typically the bottom right.

Increasingly, though, I notice that more and more people are breaking from this practice, whether through test learnings or just aesthetic decisions. An example is Aviva.co.uk. As you can see below, their landing page for car insurance aligns much of its content to the left of the page. Having become comfortable with placing the onward journey point at the bottom right, this just jars. Having said that, it's important to challenge the norms to see what resonates with the customer. I may have to test this layout myself to check we're not missing a trick.

Aviva.co.uk
In the meantime here's a mock-up of an alternative design I would test based on established findings.

aviva.co.uk

Happy testing : )

Page fold - browser size tool revisited

Problem: Google have recently moved their Browser Size Tool into Google Analytics. For me and many other sites this means it simply no longer works, as the 'In-page' reporting showing heatmaps and so on doesn't execute with our lovely CMS. Boo!
Solution: There's a free alternative, whereisthefold.com. Yay!
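
If you'd rather check the fold yourself without any tool, a few lines in the browser console will do it; the selector here is just an example:

```typescript
// Rough fold check, run in the browser console: is a given element
// fully visible in the first screenful? The 'fold' of course varies
// per device and window size.
function isAboveTheFold(selector: string): boolean {
  const el = document.querySelector(selector);
  if (!el) return false;
  const rect = el.getBoundingClientRect();
  // window.innerHeight is the visible viewport height, i.e. the fold,
  // assuming the page is scrolled to the top.
  return rect.bottom <= window.innerHeight;
}

console.log(isAboveTheFold('.call-to-action')); // e.g. your main CTA
```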

This is what it looks like on this blog.


a/b testing tools comparison by popularity

As part of my role I routinely have to compare A/B testing and MVT testing tools, suppliers and solutions. Many different sites list these suppliers, but I've tried to use a few public metrics to determine how popular these tools are with the general public. I call it 'reach ranking': suppliers are ordered 1) by their Alexa.com rankings, 2) then by the average monthly searches for their test platform as determined by Google Trends, and finally 3) by how many clients they list (if any) on their website.
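
For what it's worth, that ordering amounts to a multi-key sort; here's a sketch of the general approach, with two sample rows lifted from the table below:

```typescript
// Sketch of a multi-key 'reach ranking' comparator following the three
// criteria above. Illustrative only: it assumes a larger Alexa volume
// sorts first (as the table below broadly does) and won't reproduce
// every row of the hand-compiled ranking exactly.
interface Vendor {
  name: string;
  alexaVolume: number;      // criterion 1
  monthlySearches: number;  // criterion 2, tie-break
  listedClients: number;    // criterion 3, final tie-break
}

const byReach = (a: Vendor, b: Vendor): number =>
  (b.alexaVolume - a.alexaVolume) ||
  (b.monthlySearches - a.monthlySearches) ||
  (b.listedClients - a.listedClients);

const sample: Vendor[] = [
  { name: 'Maxymiser', alexaVolume: 149_697, monthlySearches: 6_980, listedClients: 78 },
  { name: 'Optimizely', alexaVolume: 239_000, monthlySearches: 58_898, listedClients: 25 },
];

console.log(sample.slice().sort(byReach).map((v) => v.name));
// => [ 'Optimizely', 'Maxymiser' ] — matching ranks 18 and 19 below
```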

It's also interesting to see how many new companies and start-ups there are in the CRO business since I last did this audit back in 2010, and also how many providers are no longer around. It's a fierce and competitive business...
So here's what I've found, as at August 2014. Sources: Alexa.com, Google AdWords and the vendors' own sites.
Updated 13th Oct 2014.
| Reach ranking | Vendor | Alexa volumes YTD | Avg. monthly searches for product | No. of publicly listed clients |
|---|---|---|---|---|
| 1 | SiteSpect | 381,175 | 98,600 | 55 |
| 2 | adlucent | 324,719 | 27,000 | 0 |
| 3 | Google Experiments | 10,806,000 | 802,000 | 0 |
| 4 | Conversion multiplier | 3,987,218 | 73,600 | 0 |
| 5 | Global Maxer | 3,210,851 | 51,000 | 0 |
| 6 | Conductrics | 2,333,087 | 14,700 | 0 |
| 7 | Qubit | 1,506,519 | 25,880 | 131 |
| 8 | Avenso | 1,020,946 | 1,030 | 32 |
| 9 | Clickthroo | 974,000 | 47,819 | 0 |
| 10 | Adobe Test and Target | 945,671 | 78,000 | 0 |
| 11 | Visual Website Optimizer | 908,000 | 13,102 | 0 |
| 12 | Accenture | 860,000 | 13,240 | 0 |
| 13 | Webtrends | 763,000 | 19,600 | 0 |
| 14 | Hi Conversion | 722,004 | 47,000 | 44 |
| 15 | Get Smart Content | 673,173 | 3,740 | 14 |
| 16 | Convert | 672,000 | 126,000 | 0 |
| 17 | Taplytics | 283,016 | 12,789 | 6 |
| 18 | Optimizely | 239,000 | 58,898 | 25 |
| 19 | Maxymiser | 149,697 | 6,980 | 78 |
| 20 | Site Tuners | 121,678 | 13,000 | 72 |
| 21 | Autonomy | 120,271 | 26,000 | 0 |
| 22 | AB Tasty | 93,689 | 27,000 | 31 |
| 23 | Monetate | 36,501 | 21,230 | 39 |
| 24 | Unbounce | 22,230 | 12,230 | 0 |
| 25 | Hubspot | 2,700 | 541 | 0 |
| 26 | Genetify | 0 | 6,830 | 0 |
| 27 | Vanity | 0 | 6,400 | 0 |

Happy testing : )

What's the process for deciding what to test?

I constantly have to remind myself of the process I should go through in identifying what I need to test, how to make the best use of testing resource, and how to get buy-in. To this end I've put together a mildly Mad Men-influenced presentation here:


P.S. The online presentation tool I used was Emaze.

The ethics & morality of A/B testing - #FacebookExperiment

I was recently sent this article from TechCrunch on the 'morality of A/B testing'. In brief, the article criticises Facebook for recently conducting an A/B test of its newsfeed. It manipulated a subset of its users' newsfeeds so that some were shown a higher proportion of negative stories over positive ones, to see if that would impact those same users' posts; i.e. would they post more negative items as a consequence? It sounds like the outcome was yes: people were influenced to be more negative when exposed to negative news.

TechCrunch, however, questions the morality and ethics of conducting such a test in the first place, and argues that users should at least be offered an opt-out, citing that people with emotional issues had been dangerously exposed to this unwitting exercise. OK, so you cannot lump all A/B testing exercises into the same boat as this one. Conventionally, testing has been used to achieve a commercial or service outcome, not an emotional change. Perhaps people are right that Facebook shouldn't take it upon itself to manipulate people's experiences in this way for its own ends.

However, a sense of perspective is needed here. It's Facebook we're talking about, after all. If your personality is so fragile that you need to be shielded from any potential bad experience, negative page content in this instance, should you even be on the internet in the first place? To me this is a veiled stab at big companies' use of our data, which is fair enough in some instances where trust has been abused, but I resent optimisation testing being used in this way, painted as some dark art that should not be deployed without people's full consent. Secondly, it smacks of the nanny state. Why do we think we should cosset and protect adults in this way? I worry that if this starts to go down the whole cookie-directive opt-in/opt-out route we'll damage online testing and experimentation and inhibit a whole industry in the process. If we start telling people that they are part of a test group or experiment online, and that they can opt out of it, you're going to compromise the fundamentals of testing for want of a disproportionate response to an innocuous practice.

UPDATE: 3rd July 2014. Looks like the Facebook study has become a bigger issue now, with the BBC covering it here.

UPDATE: 29th July 2014. This just in from the BBC: OkCupid admit testing their subscribers to see if connections can be made between people not normally seen as a good match.

Google Analytics Intraday reporting

Here's a tip if you, like me, need to get an intraday view of traffic volumes on your website. In the current version of GA go to Reporting | Audience | Engagement and select the current day from the calendar. This report will give you your traffic volumes for the current day. If you need to see realtime volumes you can of course view the Realtime section of GA.

Throwing baby out with the bath water. Website redesign.

I have been working on producing a completely new redesign of our website for more months than I care to remember. As you can imagine with any large company website, there have been many, many stakeholders, many, many opinions and many, many conflicts. Design-wise we have certainly utilised our CRO testing learnings and thankfully applied those findings to our spanky new website. I have personally had to park my day-to-day MVT and split testing in this time to concentrate on delivering this site, shiny and new, to the public. And to be honest it's been a great experience. This is the second time I've worked on such a large redesign project, the first time around as a web developer, the second time on the business side of things. Very exciting, 300-plus people involved, lots of £££. What's not to like?

Well, with all those stakeholders we've had a lot of subjective thinking feeding into content and design. Because I've adopted a more project-delivery role focused on delivering this beast, I have not fought the good CRO fight, for two reasons as I see it. One: I have been sucked into the world of bringing something to market that looks aesthetically good, which gives strong visual impact but may prove under-performing. Two: hitting the reset button on our website means I am suddenly offered many new CRO testing opportunities. I can look to retest every redesigned experience, revisit the established learning, try new concepts, and overturn that recently reinvigorated subjective thinking. Plus, with the majority of redesigns you lose certain features and aspects of design that had proved positive for conversion uplift. So suddenly I get a fresh field of opportunity to prove the case for MVT, AB testing and CRO (again). All hail the periodic website redesign! Happy testing : ) Long live optimisation.

The Optimisation landscape has changed. Have you kept up?

A number of things have changed in the world of MVT testing, AB testing and Conversion Rate Optimization (CRO) generally over the last few years. The question we have to ask of ourselves and the people we work with is whether we have truly kept pace with these changes. I've asked this question a lot recently, and a number of areas have been found wanting.

1. The economic collapse of 2008 changed everything, both in terms of the economy and the web, and in how eCommerce businesses either thrived or died. The reduction in cash flow in internet businesses drove the argument for testing and improving your online experience to squeeze the very last drop out of your traffic, and it also threw greater intensity on the need to prove ROI. If CRO was ever seen as peripheral, this was the time to prove your worth and justify the ongoing investment as a core player in the online channel, not waiting in the wings. Verdict: Done OK. Could do better.

2. Web traffic has radically shifted into different segments. Mobile and tablet devices account for around 30% of web traffic at the moment, and that share continues to rise. As a consequence, traditional fixed-web testing is becoming less and less relevant as each day passes. If you're not actively testing in these new segments now, it may already be too late. Despite some early and ongoing dabbling in these emerging segments, we have failed to make any real inroads in terms of replicating the successes achieved in desktop testing. Verdict: Strong start but faded fast.

3. Are you still looking at the same things to test time and again? Sometimes you need to retest the same concepts and experiences, especially those user journeys core to your sales and service funnel, but sometimes you need a fresh pair of eyes on your business to get new insight and ideas. Verdict: Good but room for improvement.

4. Your choice of testing vendor is still key. The things to look for in a testing partner are those that address the shortcomings mentioned above: someone who enables testing in those emerging segments, who continually helps evidence ROI for your CRO activity to the budget holders, and who is proactive and brings fresh ideas and insight to the table. More importantly now, you need someone who wants to be embedded in your business, adds value day in, day out, and doesn't get complacent. Verdict: TBC.

I firmly believe that MVT and CRO can still offer a huge uptick in web performance for the online player. I guess the biggest danger is falling into patterns of working and thinking that worked well in the past but don't necessarily work as well today. Continual self-assessment and re-evaluation of all the components in your testing programme should still yield success, but you might have to make some tough decisions and changes to bring about renewed success.