A/B Testing Mind Map

1. Pros

1.1. Fast

1.1.1. It takes very little time to create a modified version of an existing web page (a new picture, new copy or some other changed element) and put it up on your site. Results come quickly too: split traffic 50-50 between the existing version and the test version and you can measure the outcome in short order (assuming your web site gets traffic, that is).

1.2. Tests reality, not theory

1.2.1. A/B testing obtains real results from real users doing real things. That means you’re not basing decisions on theory, estimates, forecasts, predictions, your Horoscope or Fairy cards.

1.3. Quantifiable

1.3.1. provides actual numbers that can be compared, sliced and diced to evaluate results. Interaction, conversion, number of abandonments – all those numbers are accessible during and after testing. No guessing required!

1.4. Accurate

1.4.1. A/B testing is 100% accurate ASSUMING you have statistically significant data. Understanding error rate, statistical significance and all those other terms you were supposed to learn in Statistics class is very important.
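To make "statistically significant" concrete, here is a minimal sketch of a two-proportion z-test on conversion counts. The function name and the sample numbers are illustrative assumptions, not data from the text.

```python
# Minimal significance check for an A/B test on conversion rates.
# Uses a pooled two-proportion z-test; the numbers below are made up.
from math import sqrt, erf

def z_test(conv_a, n_a, conv_b, n_b):
    """Return the z-score and two-sided p-value for two conversion counts."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)           # pooled conversion rate
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))  # two-sided
    return z, p_value

z, p = z_test(conv_a=200, n_a=5000, conv_b=260, n_b=5000)
print(f"z = {z:.2f}, p = {p:.4f}")  # p below 0.05 suggests a real difference
```

A p-value under 0.05 is the conventional cutoff, but most A/B testing tools report this for you.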

2. Cons

2.1. Can Hurt Web Site Results

2.1.1. In A/B testing, what you thought was an excellent B variation may go terribly wrong. When that happens (and it will, don’t forget your whole “hair style” incident) you’re going to end up hurting your overall web site results. Be afraid, be very afraid.

2.2. Missing the “Why”

2.2.1. A/B web site testing does not explore the rationale behind the behavioral decisions that web site visitors are making. You’ll see the numbers and results of the test, but you won’t know for sure WHY all those web site visitors picked the item they picked. You may have theories, but you won’t know for sure. Worse, you won’t know why they DIDN’T pick that new and shiny and (what you thought was) totally great B item.

2.3. Not Predictive

2.3.1. It can’t be used to predict future design change impacts. To a certain extent this means that you’re always stuck doing A/B testing. At some point it would be handy to be able to predict if a whole new web page, web site or application will (or won’t) work, based on fairly accurate predictions of use, without the hassle of actually having to create the web page, web site or application then test it.

2.4. Needs Traffic

2.4.1. In order to provide quick, consistent and reliable results, you’re going to need a pretty good amount of traffic to your web page to run an A/B test.

3. What Is It?

3.1. An A/B test consists of two versions of an element (A and B) and a metric that defines success, such as conversion rate, sales or bounce rate. To determine which version is better, we subject both versions to experimentation simultaneously. At the end, we measure which version was more successful and select that version for real-world use.
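The mechanism above can be sketched as a toy simulation: traffic is split 50-50 at random, each version has a true conversion rate (assumed here, and unknown in a real test), and the winner is whichever converts more in the sample.

```python
# Toy simulation of an A/B test with a 50-50 random traffic split.
# The "true" rates are assumptions for the sake of the sketch.
import random

random.seed(42)
true_rate = {"A": 0.04, "B": 0.05}      # hidden in a real experiment
visits = {"A": 0, "B": 0}
conversions = {"A": 0, "B": 0}

for _ in range(20_000):
    version = random.choice(["A", "B"])  # the 50-50 split
    visits[version] += 1
    if random.random() < true_rate[version]:
        conversions[version] += 1

rates = {v: conversions[v] / visits[v] for v in visits}
winner = max(rates, key=rates.get)
print(rates, "winner:", winner)
```

With enough visitors the observed winner matches the version with the higher true rate; with too few, noise can pick the wrong one, which is why significance matters.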

4. What do we test?

4.1. Depends on our goals, such as:

4.1.1. 1. The call to action’s (i.e. the button’s) wording, size, colour and placement 2. Headline or product description 3. Form’s length and types of fields 4. Layout and style of the website 5. Product pricing and promotional offers 6. Images on landing and product pages 7. Amount of text on the page (short vs. long)

5. How to set up A/B Testing

5.1. 2 Ways

5.1.1. Replace the element to be tested before the page loads

5.1.2. Redirect to another page

5.2. Dos and Don'ts

5.2.1. Dos

5.2.1.1. Know how long to run a test before giving up. Giving up too early can cost you, because you may have gotten meaningful results had you waited a little longer. Giving up too late isn’t good either, because poorly performing variations could cost you conversions and sales. Use an online test-duration calculator to determine how long to run a test before giving up.
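What such a calculator does can be sketched with the standard sample-size formula for comparing two proportions. The confidence level (95%), power (80%), baseline rate, lift and daily traffic below are all illustrative assumptions.

```python
# Rough test-duration estimate: visitors needed per variation to detect
# a relative lift over a baseline conversion rate, assuming a two-sided
# 95% confidence level (z = 1.96) and 80% power (z = 0.84).
from math import ceil

def required_sample(p_base, lift, z_alpha=1.96, z_beta=0.84):
    """Visitors needed per variation to detect a relative `lift` over `p_base`."""
    p_var = p_base * (1 + lift)
    variance = p_base * (1 - p_base) + p_var * (1 - p_var)
    return ceil((z_alpha + z_beta) ** 2 * variance / (p_var - p_base) ** 2)

n = required_sample(p_base=0.04, lift=0.20)   # detect a 20% relative lift on 4%
days = ceil(2 * n / 1000)                     # both variations, 1000 visitors/day
print(f"{n} visitors per variation, about {days} days at 1000 visitors/day")
```

Note how small lifts on low baseline rates blow up the required sample, which is exactly why low-traffic sites struggle with A/B testing.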

5.2.1.2. Show repeat visitors the same variations. Your tool should have a mechanism for remembering which variation a visitor has seen. This prevents blunders, such as showing a user a different price or a different promotional offer.
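One common way to remember which variation a visitor has seen, without storing any state, is to hash a stable visitor identifier into a bucket. The identifier source (a cookie value, a user id) and the experiment name below are assumptions for the sketch.

```python
# Deterministic 50-50 bucketing: the same visitor id always maps to the
# same variation, so repeat visitors see a consistent experience.
import hashlib

def assign_variation(visitor_id: str, experiment: str = "homepage-cta") -> str:
    """Deterministically map a visitor to 'A' or 'B'."""
    digest = hashlib.sha256(f"{experiment}:{visitor_id}".encode()).hexdigest()
    return "A" if int(digest, 16) % 2 == 0 else "B"

# The same visitor always lands in the same bucket:
print(assign_variation("visitor-123"), assign_variation("visitor-123"))
```

Keying the hash on the experiment name means each test splits visitors independently, while any single test stays consistent across pages, which also addresses the site-wide consistency point below.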

5.2.1.3. Make your A/B test consistent across the whole website. If you are testing a sign-up button that appears in multiple locations, then a visitor should see the same variation everywhere. Showing one variation on page 1 and another variation on page 2 will skew the results.

5.2.1.4. Do many A/B tests. Let’s face it: chances are, your first A/B test will turn out to be a lemon. But don’t despair. An A/B test can have only three outcomes: no result, a negative result or a positive result. The key to optimizing conversion rates is to do a ton of A/B tests, so that all the positive results add up to a huge boost to your sales and achieved goals.

5.2.2. Don'ts

5.2.2.1. When doing A/B testing, never ever wait to test the variation until after you’ve tested the control. Always test both versions simultaneously. If you test one version one week and the second the next, you’re doing it wrong. It’s possible that version B was actually worse but you just happened to have better sales while testing it. Always split traffic between two versions.

5.2.2.2. Don’t conclude too early. There is a concept called “statistical confidence” that determines whether your test results are significant (that is, whether you should take the results seriously). It prevents you from reading too much into the results if you have only a few conversions or visitors for each variation. Most A/B testing tools report statistical confidence, but if you are testing manually, consider accounting for it with an online calculator.

5.2.2.3. Don’t surprise regular visitors. If you are testing a core part of your website, include only new visitors in the test. You want to avoid shocking regular visitors, especially because the variations may not ultimately be implemented.

5.2.2.4. Don’t let your gut feeling overrule test results. The winners in A/B tests are often surprising or unintuitive. On a green-themed website, a stark red button could emerge as the winner. Even if the red button isn’t easy on the eye, don’t reject it outright. Your goal with the test is a better conversion rate, not aesthetics, so don’t reject the results because of your arbitrary judgment.

5.3. Tools to Test

5.3.1. Google Website Optimizer

5.3.2. A/Bingo and Vanity

5.3.3. Visual Website Optimizer

5.3.4. Unbounce and Performable

5.3.5. Vertster, SiteSpect, Webtrends Optimize and Omniture’s Test&Target