This one A/B testing case study will leave you smarter and wiser (visualwebsiteoptimizer.com)
10 points by paraschopra on Nov 22, 2013 | 6 comments



I would probably click the one without the price (thinking I was getting a price list for free), then get really annoyed to find it was trying to charge me $2.

Increase click rate? ... yes

Increase conversion rate? ... inconclusive

I bet they have the conversion numbers, but they either contradicted the story or were weak results showing that the A/B test wasn't actually useful in this case.


They didn't measure how this affected revenue. They're still in the process of rolling out that test. (I work for VWO)


I didn't click the link; it sounds too much like an advert.

But if I did, I suspect I'd find one weird trick that doctors hate.


Thanks for the feedback. Tells us we need to think up better headlines.


Anytime I see some crazy wording like, "Which of these call-to-action button versions increased clickthroughs by 600%?", my first thought is "Too small of a sample size". Over a large sample size, it's hard to believe that the difference in those two buttons would be anywhere close to 600%. Maybe I missed it - did she say anywhere how many days/impressions were used?


The sample size was 4500+ visitors, distributed between the two versions. In this case the sample size was not small, but the conversion rate of the page was very low.

That low baseline is why a 600% increase was even possible. It's a common situation in real-world A/B testing: many websites have conversion rates hovering in the 0.xx% range, so thoughtful hypotheses almost always produce significant gains, not because they did anything magical, but because they simply communicated the business offer better.
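
For a rough sense of scale, here's a back-of-the-envelope two-proportion z-test with made-up numbers (the article doesn't give these figures): say a ~0.5% baseline rate vs. a ~3.5% variant rate, with ~4500 visitors split evenly.

    # Rough two-proportion z-test with illustrative numbers (not from the article):
    # ~0.5% baseline vs. ~3.5% variant conversion rate, ~4500 visitors split evenly.
    import math

    def two_proportion_z(conv_a, n_a, conv_b, n_b):
        # Pooled standard error for the difference in conversion rates
        p_a, p_b = conv_a / n_a, conv_b / n_b
        p_pool = (conv_a + conv_b) / (n_a + n_b)
        se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
        z = (p_b - p_a) / se
        # Two-sided p-value from the normal CDF
        p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
        return z, p_value

    # 11/2250 conversions (0.49%) vs. 79/2250 (3.5%)
    z, p = two_proportion_z(11, 2250, 79, 2250)
    print(f"z = {z:.2f}, p = {p:.2g}")

Even with a sub-1% baseline, a lift that large is highly significant at this sample size; it's the 10-20% relative improvements that are hard to detect with only a few thousand visitors.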



