For the button, an A/B test of three new word choices—“Learn More,” “Join Us Now,” and “Sign Up Now”—revealed that “Learn More” garnered 18.6 percent more signups per visitor than the default of “Sign Up.” Similarly, a black-and-white photo of the Obama family outperformed the default turquoise image by 13.1 percent. Using both the family image and “Learn More,” signups increased by a thundering 40 percent.
Have you heard of A/B testing? It basically means diverting a small portion of your users to an alternative version of your site and testing for a certain response.
For example, suppose you run a blog covering the latest technology and you offer premium memberships for access to HD videos; that is where your income comes from. You want to see whether a “SUBSCRIBE” button placed at the top right or the top left is more likely to entice users to actually subscribe. You would carefully direct, say, 10% of your users to a page with the button on the right, while the rest stay on the normal page (button on the left). Then you analyse the statistics to determine whether the position of the button actually matters. This example is, of course, greatly simplified.
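To make that concrete, here is a minimal sketch of how the 10% split and the follow-up analysis might look in Python. The user ids, signup counts, and traffic numbers are all hypothetical; real A/B testing platforms do much more, but the core ideas are deterministic bucketing and a significance test on the two conversion rates:

```python
import hashlib
import math

def assign_variant(user_id: str, test_fraction: float = 0.10) -> str:
    """Deterministically bucket a user: ~10% see the new page (variant B).

    Hashing the user id (rather than random.random()) means the same
    user always sees the same page on every visit.
    """
    digest = hashlib.md5(user_id.encode()).hexdigest()
    bucket = (int(digest, 16) % 10_000) / 10_000  # roughly uniform in [0, 1)
    return "B" if bucket < test_fraction else "A"

def two_proportion_z(conv_a: int, n_a: int, conv_b: int, n_b: int):
    """Two-proportion z-test: is B's signup rate significantly
    different from A's, or could the gap be noise?"""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the normal CDF
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Hypothetical outcome: 9,000 control users (button on the left),
# 1,000 test users (button on the right)
z, p = two_proportion_z(conv_a=720, n_a=9000, conv_b=95, n_b=1000)
print(f"z = {z:.2f}, p-value = {p:.4f}")
```

With these made-up numbers the test variant converts at 9.5% versus 8.0% for the control, but the p-value hovers around 0.10, a good reminder that a visible lift on a small test group is not automatically significant.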
Big businesses today probably do more than A/B testing, maybe A/B/C/D/E/F/G testing or more! The primary aim is to grow their user base. More often than not, A/B testing focuses on small elements (like the position of a button), sometimes even minute, barely noticeable ones such as border width. It uses data to analyse the psychological impact of such changes on users, because sometimes we make decisions without even realizing it. And because such changes are often small, the article makes a great point about how this approach might stifle revolutionary revamps in favor of small, data-driven tweaks.
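Extending the split beyond two variants is mostly a bucketing change. A hypothetical sketch, again using a hash of the user id so the assignment is stable across visits, with an even split across seven variants:

```python
from collections import Counter
import hashlib

VARIANTS = ["A", "B", "C", "D", "E", "F", "G"]

def assign(user_id: str) -> str:
    """Hash the user id into one of several variant buckets (even split)."""
    digest = int(hashlib.md5(user_id.encode()).hexdigest(), 16)
    return VARIANTS[digest % len(VARIANTS)]

# Simulated traffic: 70,000 hypothetical users should land
# in roughly equal buckets of ~10,000 each
counts = Counter(assign(f"user-{i}") for i in range(70_000))
print(counts)
```

From there the analysis is the same idea as before: tally conversions per bucket and test whether any variant's rate differs significantly from the control's (with more variants, you also have to correct for making multiple comparisons).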