A/B testing keeps the user at the centre of design

A/B testing is a reliable way to obtain useful and comprehensive data about the functionality of a web service and visitor behaviour. It is an excellent tool for determining the superiority of competing design choices and enables the ongoing development of the web service based on data.

What is A/B testing?

A/B testing refers to testing two or more solutions to determine which one best produces the desired outcome. In practice, this means that multiple versions of a web service or page exist, and each visitor accessing the service is shown one of them.

The success of A/B testing requires that the site has enough traffic to ensure a sufficient sample size. Clear objectives and metrics must also be set for the testing. It is essential that the test target is truly significant for the functionality of the web service. While trivial and cosmetic changes can certainly be A/B tested, the benefits achieved may be minimal.
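As a rough illustration of what "enough traffic" means, the standard two-proportion power calculation gives a ballpark per-variant sample size. This is a generic statistical sketch, not a tool the article describes; the 95% confidence and 80% power z-values are conventional defaults:

```python
import math

def sample_size_per_variant(p_base: float, p_target: float,
                            z_alpha: float = 1.96, z_beta: float = 0.84) -> int:
    """Rough per-variant sample size for detecting a change from
    p_base to p_target at ~95% confidence and ~80% power."""
    variance = p_base * (1 - p_base) + p_target * (1 - p_target)
    effect = (p_target - p_base) ** 2
    return math.ceil((z_alpha + z_beta) ** 2 * variance / effect)

# Detecting a lift from a 10% to a 12% conversion rate needs roughly
# 3.8k visitors per variant; smaller lifts need far more traffic.
print(sample_size_per_variant(0.10, 0.12))
```

Note how the required sample size grows quickly as the expected effect shrinks, which is one reason trivial cosmetic tests rarely pay off.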

“A/B testing is sometimes thought of as tinkering to see if a blue or red button works better. Truly beneficial A/B testing focuses on the language used, the use of space and positioning of user interface elements, and the timely presentation of information,” reflects Crasman's Design Director Teemu Korpilahti.

The implementation of an A/B test typically involves the following steps:

  1. Identify the problem

  2. Form a hypothesis for a solution

  3. Design and implement the variants

  4. Wait, i.e., collect data

  5. Select the winner
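The variant-serving part of the steps above can be sketched in a few lines. This is a hypothetical illustration, not Crasman's implementation: the visitor id is hashed together with an experiment name so a returning user always sees the same version and different experiments split traffic independently:

```python
import hashlib

def assign_variant(visitor_id: str, experiment: str,
                   variants: tuple = ("A", "B")) -> str:
    """Deterministically bucket a visitor into one of the variants."""
    # Hash experiment name + visitor id so assignments are stable per user
    # and uncorrelated across experiments.
    digest = hashlib.sha256(f"{experiment}:{visitor_id}".encode()).hexdigest()
    bucket = int(digest, 16) % len(variants)
    return variants[bucket]

# The same visitor always lands in the same bucket for a given experiment.
print(assign_variant("visitor-123", "checkout-button"))
```

A deterministic hash avoids storing assignment state server-side while still keeping the experience consistent for each user during the data-collection period.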

A/B tests are almost without exception individual cases, and tests developed for one company cannot necessarily be implemented elsewhere as such. However, when talking about consumer e-commerce, there are a few general areas of user experience on which tests are often conducted.

“One common test target is whether it is worthwhile to add a separate sign-in step to the purchase funnel for users: does it positively or negatively affect sales and do we get a larger proportion of users making purchases while signed in? In mobile e-commerce, we are testing various ways to reduce the length of views and the complexity of forms, for example, with one-click checkout type solutions,” says Teemu Korpilahti.

Case Intersport: Store availability as A/B test target

Intersport wanted to find out how they could get more users for their webshop's click and collect functionality. The starting point was the suspicion that the call-to-action phrase “Reserve and pick up” was not understandable from the consumer's perspective.

Crasman implemented an A/B test where an alternative to the “Reserve and pick up” button was created with the more neutral “Store availability”.

The results spoke for themselves: “Store availability” proved significantly more effective during the test period with a 16.4 percent conversion rate (users opening the availability function), whereas “Reserve and pick up” remained at 10.8 percent. The test and the changes made to the service based on it directly led to an increase in the number of real store reservations.
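A difference like this can be checked with a standard two-proportion z-test. The sketch below is illustrative: only the 16.4% and 10.8% rates come from the article, while the traffic volumes are hypothetical round numbers:

```python
import math

def two_proportion_z(conv_a: int, n_a: int, conv_b: int, n_b: int):
    """Two-sided z-test for the difference between two conversion rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    # Two-sided p-value from the standard normal CDF.
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Hypothetical: 2000 visitors per variant at the reported 16.4% vs 10.8% rates.
z, p = two_proportion_z(conv_a=328, n_a=2000, conv_b=216, n_b=2000)
print(z, p)
```

With volumes of this order, the gap between the two buttons is well beyond what chance alone would produce (|z| > 1.96, p < 0.05), which is what "significantly more effective" means in statistical terms.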

“This particular Intersport A/B test is an excellent example of how in the best case a really simple change can produce significant results. If the button text had just been changed blindly, we would not have obtained as reliable results from the experiment,” reflects Korpilahti.

“In the continuous development of web services, more emphasis should generally be placed on data rather than relying solely on assumptions about how users might use the service. A/B testing is an easy way to ensure that the changes made develop the service forward rather than in the opposite direction. Sometimes the test result is that the new version is worse than the old one, but that too would not have been revealed without testing,” says Intersport Finland's Digital Marketing Manager Aki Kaipainen.

User data is a designer's lifeline

Crasman uses A/B testing as a systematic tool for the continuous development of web services and aims to design significant site changes with the help of A/B tests.

The client and designer of a web service can easily fall into a mindset that considers only power users of the services, thus overlooking a vast number of users who seek simplicity or spoon-feeding from user interfaces. In addition to the human factor, varying devices and internet speeds pose their own challenges for design. A change that positively affects the desktop user experience may be negative for mobile users.

In such a situation, it is essential to find the tools that allow the service to speak the users' language and meet their needs.

“Data obtained from user tests is a lifeline for the designer, keeping them on the path of user-centricity even when faced with radically different designs and functionalities from previous ones,” describes Teemu Korpilahti.

Learn more about Intersport's customer story.

Crasman Ltd

10 Jan 2019