You can hire my services

I am Ben Lang, an independent web conversion specialist with over 20 years of experience in IT and digital, and 12 years of Conversion Rate Optimization (CRO) know-how. I provide a full analysis of your website's conversion performance and the execution of tried-and-tested CRO exercises, through A/B testing, split testing or multivariate testing (MVT), deployed to fix your online conversion issues. Contact me at https://www.benlang.co.uk/ for a day rate, or catch up with me on LinkedIn.

Here are a few things I've discovered through a LOT of testing

These are some of the findings I've made through many years of testing. In a bid to retain this valuable knowledge and avoid past mistakes, I'm sharing them with you now. You're welcome : )

Some key testing findings.....
 
1. Place Call-to-Action on bottom right. Always.
2. What are the 3 main features/benefits, your 'unique value proposition'? Now, are they immediately obvious on your page? No? Put them there.
3. Do you have credibility indicators for your website: testimonials, reviews, awards, stats? Use them. Ideally as secondary content. By that I mean placed in content that isn't a lead-up to your main Call To Action, but in content below or to the side of it.
4. If you can drive people to a product page instead of a hub/portfolio page, do it. 
5. Avoid additional links on your page. Generally speaking, these are just leakage points. Customers will use them given the opportunity.
6. Strongly differentiate between products where they are jointly visible on the same page.
7. Is your copy generating any fear? If you have to throw up a negative message relating to your product, such as "this *** if used incorrectly will kill you", consider the tone of voice in which you do it.
8. We’re not selling features, we’re selling solutions to customer problems. Need-led propositions work. Always.
9. White space can be used to make a page look less ominous. Too much, though, and it makes the page look amateurish. Get an eye for balance.
10. Too much of one colour on a page leads to information blindness. Avoid the Sea of Red/black/yellow.
11. Anything that rotates or animates tends to get ignored, pushing focus to your static secondary content. Place your proposition in the static content.
12. Ensure continuity of look & feel in the journey. You’ll increase drop-out if you ask a customer to go through multiple brand look & feel experiences. Continuity and flow.
13. Your existing customers are considerably more likely to buy from you than new customers. Differentiate between the two segments whenever possible. Use a sense of loyalty.
14. Remove choices. Yes, you should always support multichannel, but be aware that if you offer a CTA to offline channels you are generally loosening your online sales funnel.
15. Understand where the visitor came from. If you said something in a PPC ad, make sure it’s reiterated on your page, front and centre. If they came from a Partner/Aggregator they are a ‘hotter’ lead and need reassurance that you’re up to the job of giving them a solution to their ‘need’.
16. People don't read on the Web. Study after study has shown that less content on a landing page leads to higher conversion rates. Ruthlessly edit your text down to simple headlines and short, bulleted lists. Cut out the self-promoting marketing speak that people won't read anyway. You can link to detailed information on supporting pages.
17. Despite what many people say, there IS a page fold. Be aware of it, and if something is crucial to the user journey and the value proposition it SHOULD be above the page fold.
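Tip 15 above can be sketched in code. Here's a minimal (entirely hypothetical) lookup that echoes a PPC campaign's promise front and centre on the landing page; the campaign names and headlines are invented for illustration.

```python
from urllib.parse import parse_qs, urlsplit

# Hypothetical campaign-to-headline map: reiterate the PPC ad's
# promise front and centre on the landing page (tip 15).
HEADLINES = {
    "spring-sale": "20% off everything this spring",
    "free-delivery": "Free delivery on every order",
}
DEFAULT_HEADLINE = "Find the right product for you"

def headline_for(landing_url):
    """Pick the landing-page headline from the utm_campaign parameter."""
    params = parse_qs(urlsplit(landing_url).query)
    campaign = params.get("utm_campaign", [""])[0]
    return HEADLINES.get(campaign, DEFAULT_HEADLINE)

print(headline_for("https://example.com/offers?utm_campaign=spring-sale"))
```

The same idea extends to referrer checks for the Partner/Aggregator case: detect where the visitor came from, then reassure accordingly.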

NEVER SECOND GUESS THE CUSTOMER. TEST MULTIPLE EXPERIENCES & THEORIES. LET THE PUBLIC DECIDE WHAT WORKS.

Happy testing : )

Letting go

The human capacity for subjectivity, or indeed plain sentiment, in web design is fascinating. As optimisation specialists, we testers and CRO experts set little store by either of these two concepts, having sacrificed both a long time ago in favour of asking the general population what web experience works best. And yet you can conduct some quite clinical testing on the value of a specific web design or element, put hard facts and figures behind its actual value in the user journey, and still be told THAT widget or page design is staying because the product team wanted and designed it.

A classic case in point this week in my world of testing. We have a web page where the primary call to action is a savings calculator. Some in-depth analysis showed that visitors who engaged with this widget converted at just 1%. Conversely, visitors who didn't engage with the calculator converted at a staggering 12% by comparison. MVT testing backed this up conclusively. This calculator deserved no place in the sales funnel. Presented with the hard, black-and-white facts, the stakeholder was intent on retaining this ill-conceived widget because it was the baby of person X and cost £££££ to build.

In conclusion, sometimes you can never win out with testing. Sometimes subjectivity and sentiment win despite evidence to the contrary. Sometimes either the client or you needs to learn to let go. Be strong.

Happy testing : )
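A footnote on the calculator story: that kind of segment comparison can be sanity-checked with a quick two-proportion z-test before you even reach for MVT. The visitor counts below are invented for illustration; only the 1% and 12% conversion rates come from the story.

```python
from math import erf, sqrt

def two_proportion_z(conversions_a, visitors_a, conversions_b, visitors_b):
    """Two-tailed z-test for the difference between two conversion rates."""
    p_a = conversions_a / visitors_a
    p_b = conversions_b / visitors_b
    pooled = (conversions_a + conversions_b) / (visitors_a + visitors_b)
    se = sqrt(pooled * (1 - pooled) * (1 / visitors_a + 1 / visitors_b))
    z = (p_b - p_a) / se
    # p-value from the normal CDF (no SciPy needed)
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Hypothetical sample sizes: 1% conversion among calculator users,
# 12% among visitors who ignored it.
z, p = two_proportion_z(20, 2000, 240, 2000)
print(round(z, 1), p < 0.001)
```

With a gap that size, the difference is significant at any sensible sample size, which is exactly why the widget deserved no place in the funnel.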

The vanity of paying for web analytics tools

If it's not obvious, then let me come out and say it for the record: I am a massive fan of Google Analytics.
I love this tool for so many reasons. Here's my top 4:
  • It's always improving, with new features added almost quarterly
  • It has almost everything you could possibly need as a professional web analyst
  • It's easy to implement. Tagging is straightforward and it's hard to get it wrong.
  • It's FREE
These are usually the very reasons I hate alternative solutions: they are lacking in one or all four. You pay stupid $$$ annually for a huge, clunky solution that was probably cutting edge 5 years ago but has languished in the R&D field ever since. Tagging will take forever and will be a hateful, hateful experience requiring enormous and valuable IT resource to implement. You'll eventually get it up and running, but then realise that one of the key metrics is missing from your dashboard. An early-evening call to Seattle will get you talking to the vendor support team, who will make it clear that the sales team shouldn't have told you that THAT metric was a feature of your shiny new analytics package. Etc., etc.

To put some qualification to this, I can speak from experience of using or evaluating some of the key players in this market, who I won't mention or embarrass here. But if you're in the States, it's odds-on your corporation is using, or has used, one of the worst culprits out there.

Which leads me on to my issue with paying for off-the-shelf web analytics.
Right now, the key people in organisations feel a sense of paranoia: if they aren't paying some vendor $$$$ periodically for a high-end solution, of which they will probably only ever use 10% of the capacity, they are somehow doing their digital strategy a disservice. They feel they are somehow not up with the competition. This is nonsense. If you or your analytics team can't get the meat-and-veg data you need for meaningful insight into your online customer experience, you should probably take a long hard look at yourselves and stop blaming your tools : )

Go ahead, pay that £90K per annum for your 24/7 dedicated management contract, with its Implementation Specialist and certified Partner support. Just don't get caught with your pants down when GA launches its latest gizmo, the real-time reporting featurette, and you deleted those GA tags from your site 2 months ago because your company didn't want to be seen using a free solution as your primary web data provider.

Happy testing : )





Maxymiser's Visual Campaign Builder (VCB) test tool

Updated: 19th August 2014.

If you've been using Maxymiser for your MVT testing in the past, you may already appreciate this very competent testing tool. Recently the Maxymiser team have developed a new feature that I've been very keen to get my hands on: their Visual Campaign Builder, aka the VCB.

What is it?

Well, essentially it's a way to build and launch your own tests very quickly using the MaxTest solution (and to leverage the MaxSegment solution too, if you have that) via a WYSIWYG on-page wizard. You navigate to the test page, create a new test campaign, create new variants, drag and drop assets, change copy and content, assign an existing Action for action tracking and then publish it.

I've trialled two tests to date: one where I'm introducing a new banner to a page and another where I've embedded a YouTube video. Reporting is exactly the same as for conventional testing, whereby you log into the Maxymiser report console.

Where is it?
Log into the Maxymiser console and navigate to the Campaigns page; you will see a button for the VCB at the top right...

Once it's enabled for your site, clicking the VCB link should show the option to create or edit an existing campaign...


And how do I use it?

Well, you should of course get the Maxymiser team to talk you through the first couple of set-ups for VCB testing, but here's the easy process in summary...



1. Create Campaign.
2. Give it a name.
3. Choose Audience. Here you can target visitors by device, browser, geography or behaviour, or just choose to test with all visitors.
4. Select the Page(s) you wish to test on by page URL. Cut and paste the URL into the URL box provided and add it to the campaign. At this point you may want to 'Include any query string parameters' or 'Include both HTTP and HTTPS protocols'.
5. The next step is the Content creation step for your test. Here you can add or remove content, or create alternative experiences for your test variants. Tip: this is the tricky part of the whole endeavour, and I strongly recommend you check how your variants look in all the browsers and devices you wish to test against.
6. The next step is to select the Actions. Give the test a primary action from your drop-down. If you don't have an appropriate action to track on your test, you will need to speak to Maxymiser. This can be a stumbling block to getting a test straight out there, but if you've got a suite of test actions you've used before, there's potential to re-purpose those for a VCB test, as long as it doesn't conflict with any other tests you may be running at the same time.
7. Review & Publish. From here you're more or less going through the normal test publishing process, for example adding change comments and going via the publishing console to send your test live. Tip: you may get a warning about new pages being mapped, and you might need to do an empty publish first before you can publish your VCB test.
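The URL-matching options in step 4 can be sketched as follows. This is just my reading of how those two checkboxes behave, not Maxymiser's actual implementation, and the function and parameter names are my own.

```python
from urllib.parse import urlsplit

def page_matches(target_url, visited_url,
                 include_query_params=False,
                 include_both_protocols=False):
    """Decide whether a visited URL falls under a targeted test page.

    include_query_params:   if True, differing query strings still match.
    include_both_protocols: if True, http and https are treated alike.
    """
    t, v = urlsplit(target_url), urlsplit(visited_url)
    if not include_both_protocols and t.scheme != v.scheme:
        return False
    if (t.netloc, t.path) != (v.netloc, v.path):
        return False
    if not include_query_params and t.query != v.query:
        return False
    return True

# A visited URL with tracking parameters still matches when the
# query-string option is ticked.
print(page_matches("https://example.com/quote",
                   "https://example.com/quote?src=email",
                   include_query_params=True))
```

Thinking of the options this way helps explain why tests sometimes fail to fire: a stray query string or an http/https mismatch is usually the culprit.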

That is it in its simplest form.

Good luck and happy testing : )







Click To Call testing

Conscious I haven't posted for a bit. Apologies for my tardiness, just busy with the day job : )

A dead quick summary of our first dedicated smartphone app MVT test using Maxymiser. We were introducing an 0330 phone number on the app for the first time; in the UK this meant it was actually not too expensive to use your mobile phone to contact our call centre. We wanted to get the best possible call to action for the click-to-call button from the outset, so we engaged the testing team and tested several competing button designs. We discovered that a nice green button that closely mimicked the native call button of a smartphone produced the highest click/call volumes, achieving in fact a 16%+ uplift over competing designs. The takeaway from this, I guess, is that familiarity drives positive behaviour.
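For anyone unsure what an uplift figure like that means in practice, here's the arithmetic. The click-through rates below are invented for illustration; only the 16% relative uplift echoes the result above.

```python
def relative_uplift(rate_control, rate_variant):
    """Relative uplift of a variant's rate over the control's rate."""
    return (rate_variant - rate_control) / rate_control

# Hypothetical click-to-call rates for two button designs:
# 5.0% for the control, 5.8% for the native-style green button.
uplift = relative_uplift(0.050, 0.058)
print(f"{uplift:.0%}")
```

Note it's a relative figure: a 16% uplift on a 5% click rate means roughly 5.8% of visitors tapping the button, not 21%.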