How to Use Vidalytics Compare View to Determine Surprising Winners in an A/B Split Test

The Test Parameters: A/B Split Test
Type of Page: Video Landing Page
Traffic Source: Cold, Search, Google Display Network

The Analysis & Takeaway

We recently tested two videos side by side. Both had the same voice, lead, story, and product reveal, but there were two main differences:

  • In the original version of the video (A), the voice actress was never on camera. In the second version (B), she appeared on camera in multiple sections.
  • The offer was different. Video A offered the product at a lower cost than video B. On video B, we also created a bundle for new customers that included a brand-new product for free when they purchased the main offer.

So, for this test, the script was the same up to the point where the offer was revealed (in our case, at minute 16:45), but video B had a higher production value, with a model in front of the camera, subtitles, and a green screen.

Considering how much “nicer” it is to have a model on camera, we were expecting higher engagement on video B and, if anything, a difference appearing at the moment of the offer reveal.

But we were really surprised with the actual results.

[Image: Vidalytics Compare View stats for the two videos]

In this case, what was most interesting was that viewers dropped off significantly at the beginning of the second version. By 0:35, audience engagement was at 49%, compared to 68% in the original.

So, we ruled out the offer as the cause, since it wasn’t revealed until much later in the videos. The offer might still have affected results, but the drop-off at the very start showed us that video B’s problem lay elsewhere.
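For the curious, “audience engagement” at a given timestamp is simply the share of plays still watching at that moment. Here’s a minimal Python sketch of that calculation using made-up watch durations; the function name and data are hypothetical, and in practice Vidalytics Compare View computes and charts this for you:

```python
# Hypothetical sketch: "engagement" at time t = fraction of plays still watching.
# The watch durations below are made up for illustration only.

def engagement_at(watch_seconds: list[float], t: float) -> float:
    """Return the fraction of plays that were still watching at time t (seconds)."""
    if not watch_seconds:
        return 0.0
    return sum(1 for w in watch_seconds if w >= t) / len(watch_seconds)

# One entry per play: how many seconds that viewer watched before dropping off.
video_a = [20, 48, 90, 310, 1005, 36, 700, 1100, 52, 33]
video_b = [10, 15, 36, 500, 25, 900, 12, 41, 30, 22]

t = 35  # 0:35, where the two retention curves diverged in our test
print(f"Video A engagement at 0:35: {engagement_at(video_a, t):.0%}")
print(f"Video B engagement at 0:35: {engagement_at(video_b, t):.0%}")
```

Comparing the two variants at the same early timestamp, before the scripts diverge, is what lets you separate a hook problem from an offer problem.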

It also wasn’t the message, as the copy was exactly the same until much further into the video. So it had to be the person in front of the camera.

Since the original video was mostly stock footage, pictures, and on-screen text with some animations, this came as a surprise. We thought a higher production value would increase engagement on the new version, but it obviously didn’t work that way.

After analyzing these results, we might consider these next versions of the video to test against the original:

  • ‘Text on screen’ (like video A) with the new offer from video B.
  • A different actress on camera.

Testing these new changes will let us go deeper into why our higher production value video didn’t work as expected.

Had we not used the Compare View in Vidalytics, we might have kept thinking that the problem was the new offer, never suspecting that our “fancier” video was actually pushing potential customers away.

Vidalytics Compare View was instrumental in discovering what was going on in this case. It helped us call this test faster than we otherwise would have, saving ad spend and letting us make decisions quickly.

Have you used our Compare View yet to help you analyze split test data for your videos? If so, did you find any surprises? Let us know in the comments below.

And if you haven’t tried Vidalytics yet… the time is now!

Don’t waste any more time or money trying to figure out how your videos will do in a head-to-head split test. Use our Compare View and take advantage of its powerful insights that’ll help you cut losers faster and move on to the next experiment.

Ready to Increase Your Conversions?

Get Started for FREE Now