I recently tested two different versions of an updated landing page for a site that I manage. During the design process, the usability analyst gathered feedback on several manual wireframe prototypes from people not associated with the site. With this input, she developed a couple of rough mockup designs in PowerPoint and went through another round of user group feedback. Ultimately she narrowed the designs down to two leading options.
The two leading page designs were then developed and placed on the site as the new landing page in a 50/50 A/B split. Our initial thought was that we would compare each design against the baseline conversion rate established by the existing landing page (our control). What we found was that the conversion rate was flat to slightly higher for each design, but that page B in our test had a higher rate of customers failing to select a required radio button.
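To make that kind of comparison concrete, here is a minimal sketch of testing each variant's conversion rate against the control's baseline rate. The baseline rate and the visit/conversion counts are placeholder assumptions for illustration, not figures from the actual test.

```python
# Minimal sketch: exact binomial test of each variant's conversion rate
# against the historical baseline. All numbers are placeholders.
from scipy.stats import binomtest

BASELINE_RATE = 0.032  # assumed conversion rate of the existing landing page (control)

variants = {
    "A": {"visits": 5000, "conversions": 170},
    "B": {"visits": 5000, "conversions": 168},
}

for name, data in variants.items():
    rate = data["conversions"] / data["visits"]
    # Does this variant's conversion count differ from what the baseline rate predicts?
    result = binomtest(data["conversions"], data["visits"], p=BASELINE_RATE)
    print(f"Variant {name}: rate={rate:.2%}, p-value vs. baseline={result.pvalue:.3f}")
```

With numbers like these, both variants come out close to the baseline, which matches the "flat to slightly higher" result described above.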
Here’s a quick comparison of the three-field form design for options A and B. The page had other elements, but the two options differed only in whether the required form fields were stacked horizontally or vertically.
So our conversion rate metric improved slightly, but we also learned that the usability of design A was much better than that of design B: users of design B failed to make a selection for the required radio button question at twice the rate.
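A quick way to check whether that 2x gap is more than noise is a chi-squared test on the skipped/completed counts for each design. The counts below are placeholders, since the raw numbers aren't included here.

```python
# Minimal sketch: chi-squared test on radio-button skip counts for designs A and B.
# Counts are hypothetical, chosen to reflect roughly a 2x skip rate for B.
from scipy.stats import chi2_contingency

#            [skipped radio button, completed it]
observed = [
    [110, 4890],   # design A
    [220, 4780],   # design B
]

chi2, p_value, dof, expected = chi2_contingency(observed)
print(f"chi2={chi2:.2f}, p-value={p_value:.4f}")
```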
Usability is no stranger to A/B tests on eCommerce sites. It’s typical to see it with colors, button styles, form types, and so on. In my case, I was so focused on the conversion rate metric prior to the release that I overlooked the usability comparison between the two designs. The two options were set up perfectly for a usability test because their content was identical; only the layout pattern of the form fields for user input varied.