
It’s hard to predict whether a change you want to make will actually improve your results, especially in email marketing. You just have to try it and see!

One way to do that is through A/B testing, also called split testing. You pick one variable with two different options. You send Option A to half of your list and Option B to the other half, and see which performs better. Alternatively, you can send each option to a portion of your list (say, Option A to 20% and Option B to 20%), then send the winner to the remaining 60%.
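If your email provider doesn’t split the list for you, the 20/20/60 approach above is simple to do by hand. Here’s a minimal sketch (the function name and seed are our own, purely illustrative) that shuffles a subscriber list and carves it into two test groups plus a holdout that later receives the winner:

```python
import random

def split_for_test(subscribers, test_fraction=0.2, seed=42):
    """Randomly assign subscribers to Option A, Option B, and a holdout.

    test_fraction is the share of the list each test group receives;
    the default 0.2 gives the 20/20/60 split described above.
    """
    shuffled = subscribers[:]
    random.Random(seed).shuffle(shuffled)  # seeded so the split is repeatable
    n = int(len(shuffled) * test_fraction)
    group_a = shuffled[:n]
    group_b = shuffled[n:2 * n]
    holdout = shuffled[2 * n:]  # send these folks the winning option later
    return group_a, group_b, holdout
```

With a 1,000-address list, this returns groups of 200, 200, and 600. Shuffling before slicing matters: lists are often sorted by signup date, so taking the first 20% without shuffling would bias the test toward your oldest subscribers.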

For example, MailChimp will let you test different subject lines, “from” names, content, and send times, and will randomly split the list for you. Constant Contact, Campaign Monitor, ActiveCampaign, HubSpot, and Emma also offer some level of A/B testing. It’s a fairly standard feature these days, especially on paid plans, so check with whichever provider you use to send your emails.

Here are some tests you could run:

  • Which day of the week is best? Some nonprofits get great results sending on weekends, for example.
  • What time of day is best? Have you tried evenings, for example?
  • Which words in your subject line seem to produce the best response?
  • What about emojis in the subject line?
  • Should the email come from your nonprofit, a staff person, or some combination of the two in the “from” field?
  • Are people more likely to click on a button, linked text, or an image?
  • Do images help or hurt clicks?
  • What about GIFs in the content?
  • Which wording for your calls to action performs best?
  • Does a plain and simple email template (as if you typed it in Gmail or Outlook) work better than a designed template?
  • Do images with people work better than other kinds of images you use?
  • What seems to be the optimum number of stories or links?
  • Does including stories, testimonials, or quotes help?
  • Should you structure the email as a letter, with a signature and a PS?

Of course, you also have to define what “works best” means in each case. Are you measuring open rates, click-through rates, or something else? Typically, you’d look at opens for things like who the email is from and your subject line and use clicks for content. Sending time could be measured either way.
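In practice, “works best” comes down to comparing the same metric across both groups. As a rough illustration (all the numbers below are hypothetical, not from any real campaign), both rates are just a count divided by the number of emails delivered:

```python
def open_rate(opens, delivered):
    """Unique opens as a share of emails delivered."""
    return opens / delivered

def click_through_rate(clicks, delivered):
    """Unique clicks as a share of emails delivered."""
    return clicks / delivered

# Hypothetical subject-line test: each option went to 200 subscribers.
a_opens = open_rate(opens=52, delivered=200)   # 0.26, i.e. 26%
b_opens = open_rate(opens=44, delivered=200)   # 0.22, i.e. 22%
```

Here Option A’s subject line would be the winner on opens. For a content test, you’d compare `click_through_rate` the same way instead.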

We’d love to hear the results of your split testing. Feel free to share in the comments.

Published On: August 24, 2021 | Categories: Email Marketing, Measuring Communications and Marketing