A/B Testing

1. Or an easier alternative with extra features?

1.1. Visual Website Optimizer

2. What to test?

2.1. What to test depends on your goals, and every A/B test is unique; however, certain elements are usually tested:

2.1.1. - The call to action’s (i.e. the button’s) wording, size, color and placement
- Headline or product description
- Form length and types of fields
- Layout and style of the website
- Product pricing and promotional offers
- Images on landing and product pages
- Amount of text on the page (short vs. long)

3. Tools for A/B Testing

3.1. Want a free basic tool and don’t mind fiddling with HTML and JavaScript?

3.1.1. Google Website Optimizer

4. Setting Up A/B Testing

4.1. Replace the element to be tested before the page loads. If you are testing a single element on a Web page, say the sign-up button, then you’ll need to create variations of that button (in HTML) in your testing tool. When the test is live, the A/B tool will randomly replace the original button on the page with one of the variations before displaying the page to the visitor.
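To make this concrete, here is a minimal sketch (in TypeScript, usable as browser JavaScript once compiled) of swapping a sign-up button for a variation before the visitor interacts with it. The element id, class names and variation markup are assumptions for illustration, not the API of any particular testing tool.

```typescript
// Hypothetical variation markup; a real tool stores these variations for you.
const variations = [
  '<button class="signup">Sign up free</button>',          // A: original
  '<button class="signup signup--big">Start now</button>', // B: variation
];

document.addEventListener('DOMContentLoaded', () => {
  // "signup-button" is an assumed element id for this sketch.
  const original = document.getElementById('signup-button');
  if (!original) return;

  // Randomly assign this visitor to variation A or B and swap the markup.
  const index = Math.random() < 0.5 ? 0 : 1;
  original.outerHTML = variations[index];

  // A real tool swaps synchronously in the <head> to avoid flicker, and also
  // records which variation was shown (e.g. in a cookie and an analytics call).
});
```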

4.2. Redirect to another page. If you want to A/B test an entire page, say a green theme vs. a red theme, then you’ll need to create and upload a new page on your website. For example, if your home page is http://www.example.com/index.html, then you’ll need to create a variation located at http://www.example.com/index1.html. When the test runs, your tool will redirect some visitors to one of your alternate URLs.
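A redirect (split-URL) test can be sketched in a similar way. The snippet below assumes the two URLs from the example above and sends roughly half of the visitors who land on the control page to the variation; a real tool would also remember the assignment for repeat visits.

```typescript
// URLs taken from the example above; control is the existing home page.
const CONTROL = 'http://www.example.com/index.html';
const VARIATION = 'http://www.example.com/index1.html';

// Only consider visitors who landed on the control page.
if (window.location.href === CONTROL) {
  // Send roughly half of the traffic to the variation page.
  if (Math.random() < 0.5) {
    window.location.replace(VARIATION);
  }
}
```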

5. Pros and Cons

5.1. A/B testing is:
- Fast
- A test of reality, not theory
- Quantifiable
- Accurate

5.2. A/B testing can:
- Hurt website results
- Miss the “why” factor
- Not be predictive
- Be dependent on traffic

6. What is A/B Testing?

6.1. As the name suggests, there are two versions of an element (A and B) and a metric that defines success. A and B undergo testing to find out which version is more successful, and that version is then selected for real-world use.

7. How does it relate to web analytics?

7.1. When A/B testing a website, A is the existing design and B is the new design. Traffic is split between the two versions, and the data is used to measure important metrics such as conversion rate. In the end, the more successful version is selected.
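As a toy illustration of the metric involved, the sketch below computes the conversion rate for two variations from made-up visitor and conversion counts. The numbers are placeholders, not data from any real test.

```typescript
// Conversion rate = conversions / visitors, computed per variation.
interface VariationStats {
  name: string;
  visitors: number;
  conversions: number;
}

function conversionRate(v: VariationStats): number {
  return v.visitors === 0 ? 0 : v.conversions / v.visitors;
}

// Placeholder counts for illustration only.
const a: VariationStats = { name: 'A (existing design)', visitors: 1000, conversions: 50 };
const b: VariationStats = { name: 'B (new design)', visitors: 1000, conversions: 65 };

const winner = conversionRate(b) > conversionRate(a) ? b : a;
console.log(`Winner so far: ${winner.name}`);
// In practice the difference must also be statistically significant
// before a winner is declared (see the Do's and Don'ts below).
```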

8. Do's and Don'ts

8.1. DON’TS
- When doing A/B testing, never wait to test the variation until after you’ve tested the control. Always test both versions simultaneously. If you test one version one week and the second the next, you’re doing it wrong. It’s possible that version B was actually worse but you just happened to have better sales while testing it. Always split traffic between the two versions.
- Don’t conclude too early. There is a concept called “statistical confidence” that determines whether your test results are significant (that is, whether you should take the results seriously). It prevents you from reading too much into the results if you have only a few conversions or visitors for each variation. Most A/B testing tools report statistical confidence, but if you are testing manually, consider accounting for it with an online calculator. (A rough sketch of this check appears after this list.)
- Don’t surprise regular visitors. If you are testing a core part of your website, include only new visitors in the test. You want to avoid shocking regular visitors, especially because the variations may not ultimately be implemented.
- Don’t let your gut feeling overrule test results. The winners in A/B tests are often surprising or unintuitive. On a green-themed website, a stark red button could emerge as the winner. Even if the red button isn’t easy on the eye, don’t reject it outright. Your goal with the test is a better conversion rate, not aesthetics, so don’t reject the results because of your arbitrary judgment.
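The “statistical confidence” idea can be illustrated with a simple two-proportion z-test. This is only a rough sketch of the concept; real A/B testing tools may use different or more sophisticated methods, and the counts below are placeholders.

```typescript
// Two-proportion z-test: how many standard errors apart are the two
// conversion rates? |z| >= 1.96 corresponds to roughly 95% confidence.
function zScore(convA: number, visA: number, convB: number, visB: number): number {
  const pA = convA / visA;
  const pB = convB / visB;
  const pPooled = (convA + convB) / (visA + visB);
  const se = Math.sqrt(pPooled * (1 - pPooled) * (1 / visA + 1 / visB));
  return (pB - pA) / se;
}

// Placeholder counts: 50/1000 conversions for A, 65/1000 for B.
const z = zScore(50, 1000, 65, 1000);
console.log(`z = ${z.toFixed(2)}, significant at 95%: ${Math.abs(z) >= 1.96}`);
```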

8.2. DO’S
- Know how long to run a test before giving up. Giving up too early can cost you because you may have gotten meaningful results had you waited a little longer. Giving up too late isn’t good either, because poorly performing variations could cost you conversions and sales. Use a test-duration calculator to determine exactly how long to run a test before giving up.
- Show repeat visitors the same variations. Your tool should have a mechanism for remembering which variation a visitor has seen. This prevents blunders, such as showing a user a different price or a different promotional offer. (One way to do this is sketched after this list.)
- Make your A/B test consistent across the whole website. If you are testing a sign-up button that appears in multiple locations, then a visitor should see the same variation everywhere. Showing one variation on page 1 and another variation on page 2 will skew the results.
- Do many A/B tests. Let’s face it: chances are, your first A/B test will turn out a lemon. But don’t despair. An A/B test can have only three outcomes: no result, a negative result or a positive result. The key to optimizing conversion rates is to do a ton of A/B tests, so that all the positive results add up to a huge boost to your sales and goals.
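One common way to show repeat visitors the same variation everywhere is to store the assignment in a cookie, as the sketch below does. The cookie name and variation labels are assumptions for illustration, not any specific tool’s mechanism.

```typescript
// Return the visitor's assigned variation, creating and persisting one if
// this is a new visitor. "ab_variation" is a hypothetical cookie name.
function getAssignedVariation(): 'A' | 'B' {
  const match = document.cookie.match(/(?:^|; )ab_variation=([AB])/);
  if (match) {
    return match[1] as 'A' | 'B'; // repeat visitor: reuse the earlier assignment
  }
  const variation: 'A' | 'B' = Math.random() < 0.5 ? 'A' : 'B';
  // Persist for 30 days so the visitor sees the same variation on every page.
  document.cookie = `ab_variation=${variation}; path=/; max-age=${60 * 60 * 24 * 30}`;
  return variation;
}

const variation = getAssignedVariation();
// Use `variation` to decide which button or page to show,
// consistently across the whole website.
```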