Sigstr's A/B Test campaign type lets marketers measure the performance of two different campaign banners and see which one achieves a higher click rate over time.  It's a great new way to find out how to best optimize your campaign banners and which types of messaging and graphics resonate most with your audience.

How it Works

If you have two banners and aren't sure which one will get more clicks, this campaign type is extremely useful.  Simply select the "A/B Test" campaign type and you will be prompted to upload the two banner images you would like to test.  Sigstr will serve the two campaign banners evenly across your audience and report which banner is the winning variant based on click rate.
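Conceptually, the even split and click-rate comparison work like the sketch below.  This is purely illustrative (the `ABTest` class, its method names, and the round-robin rotation are our own assumptions, not Sigstr's actual implementation):

```python
import itertools

class ABTest:
    """Illustrative model of an A/B banner test: even rotation, click-rate winner."""

    def __init__(self, banner_a, banner_b):
        self.stats = {banner_a: {"views": 0, "clicks": 0},
                      banner_b: {"views": 0, "clicks": 0}}
        # Alternate banners so impressions are split evenly across the audience.
        self._rotation = itertools.cycle([banner_a, banner_b])

    def serve(self):
        banner = next(self._rotation)
        self.stats[banner]["views"] += 1
        return banner

    def record_click(self, banner):
        self.stats[banner]["clicks"] += 1

    def click_rate(self, banner):
        s = self.stats[banner]
        return s["clicks"] / s["views"] if s["views"] else 0.0

    def current_winner(self):
        # A snapshot in time: the variant with the higher click rate right now.
        return max(self.stats, key=self.click_rate)

test = ABTest("Banner A", "Banner B")
for _ in range(100):
    test.serve()                  # 50 impressions of each banner
test.record_click("Banner A")     # 1 click  -> 2% click rate
for _ in range(3):
    test.record_click("Banner B") # 3 clicks -> 6% click rate
print(test.current_winner())      # prints "Banner B"
```

Because the winner is recomputed from running totals, it can flip back and forth early on while each banner has only a handful of clicks.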


A/B Test campaigns are available to Microsoft customers using the Outlook Agent (Version 4.1 or greater) and to Gmail customers using the Sigstr Chrome Extension (Version 7.0.1 or greater).  Customers currently using SigstrOne require no changes at all!

Step 1 - Select the A/B Test campaign type

On the Campaigns page, click the "Create Campaign" button and select the "A/B Test" type.

Step 2 - Assign Employee Groups

Assign the Employee Groups that you want to display this A/B Test campaign to and click "Next" at the bottom.

Step 3 - Upload your two banners

Upload the two campaign banners that you would like to test head-to-head.  You can name each banner individually so you can easily tell them apart.

Step 4 - Verify and Launch!

Verify that your campaign banners look good, schedule when you would like the campaign to run (optional), and launch!

Reviewing Results

After your campaign launches, it typically takes a few days to a week to start seeing results.  A new analytics table and graph will appear on your Campaign Details page showing up-to-date performance for your two campaign banners.  The graph highlights the banner with the higher current click rate in gold and displays a trophy next to the winning variant.  Please note that results change over time; the chart shows a snapshot in time of the current winning variant.

Best Practices

We recommend running A/B Test campaigns for at least 2-4 weeks to gather enough data to make an educated decision on the winning variant.  Over time, click rates typically level off and a winning variant emerges.  Sometimes the click rates are nearly the same, in which case there isn't a clear winner, but you should still use the data to guide future banner designs!
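If you want a rough rule of thumb for "nearly the same," a standard two-proportion z-test can tell you whether a gap in click rates is likely real or just noise.  This helper is our own illustration using only Python's standard library; it is not a Sigstr feature, and the numbers below are made up:

```python
import math

def z_score(clicks_a, views_a, clicks_b, views_b):
    """Two-proportion z-test statistic for comparing two click rates."""
    p_a = clicks_a / views_a
    p_b = clicks_b / views_b
    # Pooled click rate under the "no real difference" assumption.
    p = (clicks_a + clicks_b) / (views_a + views_b)
    se = math.sqrt(p * (1 - p) * (1 / views_a + 1 / views_b))
    return (p_a - p_b) / se

# Hypothetical totals: Banner A at 2% vs. Banner B at 3.5% over 2,000 views each.
z = z_score(clicks_a=40, views_a=2000, clicks_b=70, views_b=2000)
# |z| above ~1.96 suggests a real difference at the 95% confidence level.
print(abs(z) > 1.96)  # prints True
```

With small view counts, |z| usually stays below the threshold, which is exactly why a 2-4 week run helps before declaring a winner.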

Please reach out to your Customer Success Manager if you need help setting up any A/B Test campaigns!
