      A/B Testing: Overview

      When to use it

      A/B Testing allows Salsa campaign managers to test different versions of an email blast to see which one performs best. To do this, follow these simple steps:

      1. Create your email blast as normal.
      2. Copy the original email into two test versions of the email (using the tools described below).
      3. Send the variations out to a small percentage of the total targets for the email blast.
      4. Monitor the open rates or conversion rates of each variation and determine which was the better-performing email (see the sketch after these steps).
      5. Send the content from the better-performing email out to the rest of the targets.
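
      As a back-of-the-envelope illustration of step 4, the comparison is between rates rather than raw counts, since each variation may reach a different number of people. The sketch below is hypothetical - the numbers are invented, not pulled from any Salsa report:

          # Hypothetical A/B test results, for illustration only.
          tests = {
              "Test A": {"sent": 500, "opened": 110},
              "Test B": {"sent": 500, "opened": 95},
          }

          for name, stats in tests.items():
              print(f"{name}: open rate {stats['opened'] / stats['sent']:.1%}")

          # Whichever test has the higher rate supplies the content for the main blast.
          winner = max(tests, key=lambda name: tests[name]["opened"] / tests[name]["sent"])
          print("Winner:", winner)   # -> Winner: Test A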

      How to get there

      Complete your email in the Salsa HQ as normal. At Step 7: Summary, click the button to create a test blast (shown below) before scheduling or sending the email.

      create_test_blast.png

      Please note: First, do not use Percentile targeting and A/B testing together, as this will result in 0 targets receiving emails. Use only one of these functions at a time. Second, if you plan to use different email templates for the main blast and the test blast, make sure that both templates are 'uneditable' type templates.

      Presentation

      Here's a quick slide show that outlines this process (which is documented in more detail below).

      Create Split Blasts

      Click the primary and then the secondary button to create split blasts.

      Create_split_blasts.png

      You've just created two copies of your main blast:

      1. Test A
      2. Test B

      Target Test Blasts

      By default, your targets will be as follows:

      • Test A will be sent to a random 5% of the supporters you targeted for this email blast. You cannot specify which 5% - it's random by design.
      • Test B will be sent to a separate random 5% of the supporters you targeted for this email blast. You cannot specify which 5% - it's random by design.
      • When you're done testing, the "winner" of the test will be sent to the remaining 90% of the supporters you targeted in your initial email blast setup.

      Use either the editable percentage fields or the slider tools to vary these percentages, if desired.
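
      If it helps to see the default split as plain arithmetic, here is a minimal sketch; the list size is invented, so substitute your own blast's target count:

          # Hypothetical number of targeted supporters.
          total_targets = 10_000
          split = {"Test A": 0.05, "Test B": 0.05, "Main blast (winner)": 0.90}

          for name, share in split.items():
              print(f"{name}: about {int(total_targets * share):,} supporters")
          # Test A: about 500 supporters
          # Test B: about 500 supporters
          # Main blast (winner): about 9,000 supporters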

      set_percentages.png

      Editing Test Blasts

      Click the edit link next to a test blast, and the following box will drop down:

      edit_the_blast.png

      Once you've made changes to the test blast, don't forget to save them!

      Sending Test Blasts

      Once you have made edits to all of your test blasts, submit them using the Submit Test Blast buttons. This will launch the test blasts to the specified percentage of recipients and bring up the open rate statistics on this summary page.

      submit_test_blasts.png 

       

      Note: After you send out the test blasts, you need to give the test some time. You're waiting for real humans to receive your email so that you can evaluate their response. Depending upon your supporters and your content, wait at least one hour, but up to several days for best results.

      Evaluate Test Results and Send Main Blast

      After waiting for the appropriate length of time, you'll see test results on the summary tab.

       compare_test_blasts.png

      NOTE: Only open and click rates are available in this view. If you'd like to see conversion rates, check the email summary report.

      If Test A performs better than Test B, replace the main blast content with the content from Test A. To do this:

      • Go to the main blast drop-down labeled Replace Main Content with:
      • Select the version you want to send to the remainder of your targets (usually the best-performing version from your test).
      • Accept the "Are you sure?" confirmation message.
      • Once completed, the drop-down will change back to "Leave as is".

      Your blast is now ready to be sent with the better content. Once you've selected the content for the main blast, schedule and submit the email as you would with any other blast.

      pick_winner.png

      Best Practices for A/B Testing

      Only test one variable at a time! If you change multiple things, you won't know which change resulted in better performance, and your test will be invalid. Some ideas for what to vary in your email blasts:

      • Subject line
      • Dynamic content and merge fields (personalization of content)
      • Salutation
      • From name
      • Call-to-action button color
      • Send time of the email blast

      Comments

      17 comments
      • Dear Jake,

        Is it possible to change the email template in an a/b test?

        Thanks,

        Stefanie

      • Hi Stefanie,

        Yes - When you click the (edit) link next to one of the test blasts, you'll see a new link in the Quick Edit pop-up that says Edit/View full email blast - click that link to open the test blast in the full email blast workflow. The first step will allow you to change the template.

        I've updated the relevant section above to clarify this point. Thanks for the question!

      • I would like to set up an AB test where the email goes out at different times. Is there a way to do that?

      • Hi Betsy,

        Heck yes! Check out the second-to-last picture on this documentation page, under "Sending Test Blasts" - you'll see that each test blast has a 'Reschedule' link. You'll click that to set the different times.

        Then after your date/time is set, click the Reschedule button, and lastly click the Submit Test Blast button to put those test blasts in their scheduled queue. 

         

      • Thanks, I actually had already figured it out, but appreciate the feedback!

      • Is there a way, when A/B testing to send to a queried group of supporters, so that the test goes to a designated group of people? We're having trouble with one of our templates displaying poorly in Yahoo Mail.

      • Hi Matt,

        Unfortunately not. A/B testing randomly splits supporters based on percentiles (example: if your A split test goes to 5% of your list, it will send to people in the first five percentiles, i.e. their supporter keys end in 00-05), so there's no way to manually decide who that 5% should go to (see the sketch at the end of this reply).

        Your best bet in that case would be to separate out a specific sub-section into its own group. Then either suppress that group from your main emails, or send separately just to that group.
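
        Purely to picture that percentile idea, here is a rough sketch - the supporter keys below are made up, and this only approximates how the splits are drawn:

            # Made-up supporter keys, just to illustrate the percentile mechanism described above.
            supporter_keys = [4120057, 4120091, 4120142, 4120203, 4120305]

            # A 5% split drawn from "the first five percentiles" picks up keys ending in 00-05.
            test_a = [key for key in supporter_keys if key % 100 <= 5]
            print(test_a)   # -> [4120203, 4120305]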

      • Is there a way to split the test in two, so it's a random 50-50 split, without a third "main" blast?

      • Hi Jennifer,

        I think there are a few ways to interpret your question, but I understand it as:

        When you create your A/B test, you want Test A to receive 50% and Test B to receive 50% of your targeting selection, which obviates the need for the "Main Blast".

        If this is the case, it seems to me that A/B Testing isn't really the best option here, since you're not running a "test" which then determines what content your "main blast" uses. Instead, I would suggest creating two separate emails and using the Percentile query to segment your list: one 50% group receives the first email and the other 50% group receives the second email, as in the screenshot below. (The segments don't change each time they're used - so assuming no new supporters are added or existing supporters are deleted between sending email 1 and email 2, no one will receive both emails.)




        But I might totally be missing the goal of your split, so please let me know what your objective is and we can keep the conversation going. I'm always intrigued by the cool ways that our users take advantage of Salsa, so let me know what you're aiming for!

      • Jake,

        Thanks for the quick reply! You nailed it, that's exactly what we're aiming to do. We want to split our test in two, so we can get results from our entire list, and then use that knowledge in a subsequent followup mailing. Where can I find instructions on how to set up the percentile query, so that it randomly groups people? (We may want to do a 50/50 split or even a 33/33/33 split.)

        Thanks!

        Jennifer

      • Jennifer,

        Great! When you're setting up your email blast and get to the Targeting tab, follow the steps in this video:

        http://www.screencast.com/t/QjpfWfoTt

        1. Choose the fourth "Condition Type" option, Percentiles.
        2. In the second field, stick with the default "Is In" option.
        3. Decide what percentage of your list you want to send to, then select the corresponding segments. For instance, to get 50% of your list, you would select segments 1-49. (The second email would use segments 50-100.) In the example video, I'm going with 33/33/33, so I select segments 1-32 (by holding the Shift key while clicking on Segment 32) and click the Add button.
        4. Hit the Save & Continue button to commit the query and move along to send your email blast.
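
        If it helps to see what those segments work out to, here is a rough sketch; the segment ranges for the second and third emails are just one way to divide the remaining segments, and the list size is invented:

            # Simplified illustration of a roughly 33/33/33 percentile split - not Salsa's actual query engine.
            segments = {
                "Email 1": range(1, 33),     # segments 1-32
                "Email 2": range(33, 66),    # segments 33-65
                "Email 3": range(66, 101),   # segments 66-100
            }
            list_size = 9_000                # hypothetical list size

            for email, seg_range in segments.items():
                share = len(seg_range) / 100
                print(f"{email}: segments {seg_range.start}-{seg_range.stop - 1}, "
                      f"about {int(list_size * share):,} supporters")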

        I hope that's helpful, but feel free to email training@salsalabs.com to continue the conversation. 

         

      • Hey Jake,

        I don't see conversion tracking on my tests. Is there something I need to do to add it? The only data I'm able to see in tests is opens and clicks. I want to track my ratios rather than raw numbers, but need conversion #s.

        Thanks

        Evan

      • Hi Evan,

         

        The "Actions" column is actually a label for actions to be taken (not a report on conversion data).  Since you're already sent your email tests, there is no further action to be taken for them and the column is blank.  There ARE actions to be taken before the A/B test emails are sent and at that time you'll see an option to delete the emails in that column.

        However, that's not your question. You would like to get conversion numbers on your test emails. The way to get that is the same as for any other email blast - in the email summary report. You can filter the email blast list and pull up the Test A and Test B emails (just as if they were any other blasts) and see the data on their performance.

        To do this:

        • Open the email package
        • Click on Summary Report
        • Filter the Summary Report to pull up the test blasts
        • Review total emails, emails opened, open percentage, emails clicked, click percentage, number unsubscribed, and number of emails failed.

         

         

      • Hi Rebecca,

        Thanks for the response. Unfortunately, it still doesn't get me what I'm looking for -- apologies if I didn't ask my question clearly enough. The report from the summary covers opens and clicks. Nice though those statistics are, what I care about are conversions. I want to know how many of the people who received each test blast (and, for that matter, the main blast) actually completed the action. There's a line for conversions in the table when you view the test plan, as well as on the final report page; however, it is not populating with any data. For example, here's a screenshot of one of the report pages, where it says "no conversions tracked."

        So to be a bit more clear, my question is "What do we need to do to make the system track conversions so we can judge our test by that metric?"

        Thanks!

        e

      • Hi Evan,

        The email specialist from our support team will get back to you about this issue.  Thanks for posting!

      • Hi, 

        Is there any way in the A/B test to test out more than just two messages? Would it be possible to test 4 separate messages using the system?

        Thanks, 

        Dan Ramos, OEA

      • Hi Dan,

        It's not as common, but you can test more than two messages. After you create the test blasts, just keep clicking the button that says Add another A/B split test blast, as shown in this video.
