I Tried ActiveCampaign Split Testing. Here’s What Actually Worked

I’m Kayla, and I run emails for two small shops and a little course on the side. I live in my inbox. So I test a lot. ActiveCampaign’s split testing helped me stop guessing and start sending stuff that people actually click. (If you’d like to see the nuts-and-bolts breakdown, here’s my full ActiveCampaign split-testing deep dive.)

You know what? It felt a bit nerdy at first. But once I saw the numbers, I was hooked.

The short, sweet version

  • It’s easy to test subject lines, send times, from names, and email content.
  • Automation splits helped me see which path sold more.
  • I got real gains fast, but setup can feel fussy the first week.

Now let me show you what I did, with real examples.



1) Subject lines: emoji vs plain

For my winter gear sale (12,486 people), I tested three subject lines:

  • “⛄ Last chance: 24-hour Winter Gear Sale”
  • “Winter gear sale ends tonight”
  • “Sale ends tonight: winter gear”

Winner after 4 hours: the snowman line.

  • Open rate: 38% (emoji) vs 29% and 27% (plain)
  • Click rate: 5.2% (emoji) vs 3.9% and 3.6%
  • Sales (Shopify synced): $6,240 vs $4,010 and $3,650

Why it worked: short, clear, a tiny bit fun. My crowd likes cozy. I do too.

Small note: I also changed the preheader. ActiveCampaign doesn’t test that as a separate field for campaigns, so I just cloned the version and wrote a new preheader. Not hard. Just a little messy.


2) From name test: person vs brand

I ran this on my candle shop list (8,302 people).

  • “Kayla at Meadow Wick”
  • “Meadow Wick News”

Winner: “Kayla at Meadow Wick.”

  • Opens: 33% vs 26%
  • Clicks: 3.4% vs 2.5%

It felt more human. Also, my mom said it looked like a note from me, not a flyer. Moms are usually right.


3) Send time: morning vs night

I sell to two types: teachers and parents. I split on local time.

  • 8:00 AM vs 8:00 PM

For teachers (school supplies list):

  • 8 AM won. 41% opens vs 27%. Clicks 4.9% vs 3.1%.

For parents (home goods list):

  • 8 PM won. 35% opens vs 30%. Clicks 3.6% vs 3.2%.

ActiveCampaign also has Predictive Sending (I’m on the Professional plan). It helped a bit on the parent list, but plain 8 PM still beat it by a hair that week. Funny, right? I still use Predictive Sending when I’m short on time.


4) Button copy: “Get My Code” vs “Shop Now”

Same email, two buttons. I only changed the words.

  • “Get My 10% Code”
  • “Shop Now”

Winner: “Get My 10% Code.”

  • Click rate: 4.8% vs 3.1%
  • Sales: $3,220 vs $2,140

People like getting something. Shocking, I know. I made the button green for both versions. No trick colors.


5) Automation split: SMS nudge vs extra email

In my 6-email welcome series, I added a split after Email 2:

  • Path A: send a short SMS at 6 PM (“Hey, it’s Kayla—your free wick trimmer is still in your cart”)
  • Path B: send a short FAQ email instead

Winner: SMS (I pay per text, so I checked costs).

  • First purchase rate in 7 days: 7.8% (SMS) vs 5.1% (FAQ)
  • Added cost for SMS: $47
  • Extra revenue: about $1,180
  • Worth it? Yep.

Note: setting up SMS needs phone numbers and consent on file. I had that. If you don’t, use a short reminder email with a big button. It still works.
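
If you’re weighing a paid channel like SMS, it’s worth writing the back-of-napkin math down. A tiny sketch using my numbers from above (the helper name is just something I made up, not anything from ActiveCampaign):

```python
def channel_roi(extra_revenue, channel_cost):
    """Return (net gain, dollars back per dollar spent) for a paid channel test."""
    net = extra_revenue - channel_cost
    return net, net / channel_cost

# My SMS split: $1,180 in extra revenue against $47 in text fees.
net, roi = channel_roi(1180, 47)
print(f"Net: ${net}, about ${roi:.0f} back per $1 spent")
# Net: $1133, about $24 back per $1 spent
```

If that last number were closer to $1 or $2, I’d think harder about the extra email instead.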

ActiveCampaign has also been adding automated split testing for automations that can pick the winning path for you, if you’d rather let the robots crunch the numbers.


6) Abandoned cart offer: free ship vs 10% off

I worried about margin, so I split tested inside the automation. I triggered each path with the “Send an email” action; their official split-testing guide walks you through the clicks if you need a refresher.

  • Path A: free shipping code
  • Path B: 10% off code

Winner: free shipping.

  • Recovery rate: 12.4% (free ship) vs 11.2% (10% off)
  • Margin per order was better with free shipping on items under $40.
  • On orders over $100, 10% off did better in revenue. So I kept both and used a rule by cart value. Felt fancy, but it paid off.
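
That cart-value rule is really just one branch. Here’s a minimal sketch; the function name and the flat $100 cutoff are my own simplification of what I set up in the automation:

```python
def pick_cart_offer(cart_total):
    """Route an abandoned cart to the offer that won for its price band."""
    if cart_total >= 100:
        return "10% off"       # 10% off won on revenue for big carts
    return "free shipping"     # free shipping protected margin on small carts

print(pick_cart_offer(35))   # free shipping
print(pick_cart_offer(140))  # 10% off
```

In ActiveCampaign itself this lives in an If/Else step on the cart total, but the logic is the same.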

If you’re curious how straight-up price experiments can play out, here’s a candid write-up on split-testing product prices.


7) Black Friday timing: 3-hour winner test

For my big sale, I sent 20% of the list first, split two subject lines, let it run for 3 hours, then ActiveCampaign pushed the winner to the rest.

  • “Black Friday: 30% off sitewide + free gift”
  • “30% off + free gift (today only)”

Winner: the second one.

  • Opens on sample: 44% vs 39%
  • Clicks on sample: 7.9% vs 6.2%
  • Full send kept the lead and finished strong

Pro tip: set the winner by clicks, not opens. Opens can be unreliable now thanks to privacy features like Apple’s Mail Privacy Protection. The same idea works great on landing pages too; see this ClickFunnels page split-test for inspiration.
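
ActiveCampaign picks the winner on the sample for you, but if you want to sanity-check that a click-rate gap like 7.9% vs 6.2% isn’t just noise, a two-proportion z-test does the job with nothing beyond the standard library. The sample sizes here are hypothetical, since I’m not publishing my exact list split:

```python
import math

def two_proportion_z(clicks_a, n_a, clicks_b, n_b):
    """z-score for the difference between two click rates (pooled standard error)."""
    p_a, p_b = clicks_a / n_a, clicks_b / n_b
    pooled = (clicks_a + clicks_b) / (n_a + n_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    return (p_a - p_b) / se

# Hypothetical 20% test sample split into two halves of 5,000 each.
z = two_proportion_z(395, 5000, 310, 5000)  # 7.9% vs 6.2% click rates
print(f"z = {z:.2f}")  # anything past ~1.96 is significant at the 95% level
```

If z lands under about 1.96, the “winner” could easily be luck, and I’d let the test run longer before pushing it to the full list.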


What bugged me a bit

  • The split block in automations is strong, but unless you opt into the automated winner option it won’t pick a winner and switch the whole flow by itself. I had to check the stats, then change the paths myself.
  • Reports can load slowly if you filter by device, tag, and date at once. I got a coffee. It helped me and the report.
  • Testing preheader alone takes a clone. I wish it had a separate field in the A/B tool.
  • Naming. Please name every test like “BF-2024-Subject-Emoji vs Plain.” I learned that the hard way when I built the same test twice.

Tiny things that helped a lot

  • Keep a control. Change one thing at a time if you can.
  • Let tests run long enough. I try 4–24 hours, depending on list size. I learned that the hard way while running onboarding experiments in Mixpanel.
  • Pick the right winner metric. I use clicks for campaigns, purchases for automations.
  • Use segments. Parents vs teachers, new vs repeat. Different timing, different wins.
  • Save winning parts as blocks. ActiveCampaign makes that easy, and it saves time.

Real results after 6 weeks

  • Average open rate: up from 26% to 33%
  • Average click rate: up from 2.8% to 4.1%
  • Welcome series sales (first 7 days): up 31%
  • Abandoned cart recovery: up from 9% to 12%

Not perfect. But very real.


Should you try it?

If you send more than one email a month, yes. Even one test per send can help. Start simple: subject lines and send time. Then try a split in your welcome or cart flow.

Here’s the thing: split testing won’t fix a weak offer or a dull product shot. But it will help a good message get seen, and a clear button get clicked. And that matters.

I still mess up names now and then. I still ship with typos sometimes. Mistakes are human. The tests just make sure the good stuff gets seen anyway.