A/B testing (also known as split testing) allows you to compare two versions of a post-purchase offer to determine which performs better. When you run an A/B test:
  • Layout A (Control): Your original or baseline offer
  • Layout B (Variant): A modified version with one or more changes
Traffic is automatically split evenly between layouts, and performance is tracked separately so you can identify the higher-performing variation. By testing variations of your offers, you can improve conversion rates, increase revenue, and make data-driven decisions about your upsell strategy.

What you can test

You can test nearly any element of your post-purchase offer, including:
  • Products: Different products or product bundles
  • Discounts: Different discount amounts or types (percentage vs. fixed dollar)
  • Copy: Headlines, descriptions, and call-to-action text
  • Images: Product images or lifestyle photos
  • Urgency elements: Countdown timers with different durations
  • Quantities: Default quantity of 1 vs. 2 (or more)
  • Offer design: Different layouts or visual styles
💡 Tip: For best results, test one element at a time so you can clearly identify what drives performance changes. If you want to test multiple variables simultaneously, use Multivariate Testing instead.

How to set up an A/B test

Step 1: Create or open a funnel

  1. Open the Aftersell app from your Shopify Admin
  2. Navigate to Post-Purchase Funnels in the left sidebar
  3. Either create a new funnel or open an existing funnel you want to test

Step 2: Create your test

  1. In your funnel, click Create Test
  2. Select A/B Test from the options

Step 3: Set up your variations

  1. Layout A (Control): This is your original offer. Configure it with your baseline settings.
  2. Layout B (Variant): This is your test variation. Make the changes you want to test.
For each layout, you can customize:
  • Product selection
  • Discount amount and type
  • Offer copy and messaging
  • Images and visual elements
  • Countdown timer settings
  • Default quantity
  • Any other offer settings

Step 4: Start the test

  1. Review both variations to ensure they’re configured correctly
  2. Click Start Test to begin splitting traffic between the two versions
  3. Ensure your funnel is enabled and published
Your A/B test is now live!

How traffic is split

Traffic is automatically distributed evenly across all variations within a test. There is no way to manually adjust the traffic percentage for each layout. For example:
  • If you have 2 layouts, traffic will split 50% / 50%
  • If you have 3 layouts, traffic will split approximately 33% / 33% / 33%
  • If you have 4 layouts, traffic will split 25% each
All variations within the same test receive equal traffic distribution.
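Conceptually, the even split works like uniform random assignment per impression. Here is a minimal Python sketch of that idea (illustrative only; Aftersell performs this assignment for you automatically):

```python
import random

def assign_layout(layouts):
    """Pick one layout uniformly at random for each impression.

    `layouts` is a hypothetical list of layout names; this is not
    Aftersell's internal API, just an illustration of an even split.
    """
    return random.choice(layouts)

# Over many impressions, each layout receives ~1/N of the traffic.
random.seed(42)  # fixed seed so the example is reproducible
counts = {"A": 0, "B": 0}
for _ in range(10_000):
    counts[assign_layout(["A", "B"])] += 1
# Each count lands near 5,000, i.e. a 50% / 50% split.
```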

Managing your test

Once your A/B test is live, its status will show as In progress. From here, you can pause, edit, reset, delete, or select a winner.

Test statuses

  • Not started: The test has been created but is not active. Traffic is not being split.
    If you see 100% of traffic going to one variation, click Start test.
  • In progress: Traffic is being split evenly across all variations.
  • Paused: Traffic splitting has stopped.

Pause the test

Click Pause to temporarily stop traffic splitting. When a test is paused:
  • Traffic is no longer split between variations
  • Only the first upsell created (Layout A) will be shown
  • You cannot choose which variation displays while paused
To permanently display a different version, you must select a winner or delete the test. When you unpause the test, traffic splitting resumes evenly.

Edit the test

Use the three-dot menu to select Edit test if you need to adjust products, pricing, layout, or messaging. For accurate results, avoid making major changes while a test is running.

Reset analytics

Select Reset analytics to clear current test data and restart tracking from zero. This resets only the A/B test data. Historical lifetime offer data is not deleted.

Delete the test

Select Delete test to permanently remove the A/B test. If deleted:
  • Traffic will no longer be split
  • The offer returns to normal behavior
  • Lifetime offer analytics become visible again

Select a winner

When you are ready to end the test:
  1. Click Select winner
  2. Choose the variation to keep
  3. Confirm
This ends the test and applies the selected variation as the active offer. All customers will see the chosen version moving forward.

How A/B test analytics work

Once your test is running, you can track performance in the Analytics section of your Aftersell dashboard.

Accessing your test results

  1. Go to Analytics in the Aftersell app
  2. Click the Tests tab
  3. Select the Funnel containing your test
  4. Choose the specific Test you want to analyze

Understanding the analytics

Your A/B test analytics include three main sections:

1. Group-level performance

This shows side-by-side performance for each variation (Layout A vs. Layout B):
  • Revenue: Total revenue generated by each variation
  • Revenue per visit: Average revenue per impression for each variation
  • Conversion rate: Percentage of customers who accepted each variation
  • Impressions: Number of times each variation was shown
  • Accepted offers: Number of times each variation was accepted
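These metrics can all be derived from three raw counts per variation. A small sketch, using hypothetical totals (the function and field names are illustrative, not Aftersell's API):

```python
def offer_metrics(impressions, accepted, revenue):
    """Compute the per-variation metrics described above from raw totals."""
    return {
        "conversion_rate": accepted / impressions,    # share of impressions accepted
        "revenue_per_visit": revenue / impressions,   # avg revenue per impression
        "average_upsell_value": revenue / accepted if accepted else 0.0,
    }

# Hypothetical example: Layout B converts better (10% vs. 8%)
# and earns more per visit ($1.75 vs. $1.60).
layout_a = offer_metrics(impressions=1_000, accepted=80, revenue=1_600.0)
layout_b = offer_metrics(impressions=1_000, accepted=100, revenue=1_750.0)
```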

2. Performance by variable

This section shows how each metric performs on average across all variables in your test. This is particularly useful for understanding overall trends.

3. All groups table

A comprehensive table that summarizes and compares all test groups side by side, making it easy to identify the winning variation at a glance.

Key metrics to watch

  • Conversion rate: The most important metric for determining which offer resonates better with customers
  • Revenue per visit: Shows which variation generates more revenue per impression, accounting for both conversion rate and order value
  • Average upsell value: Indicates whether one variation leads to higher-value purchases
💡 Tip: Hover over the ⓘ info icon next to any metric to see exactly how that value is calculated.

Best practices for A/B testing

Run tests long enough

  • Minimum duration: 2-4 weeks
  • Minimum impressions: At least 100-200 impressions per variation (more is better)
Shorter tests may not produce statistically reliable results, especially with low traffic. The analytics dashboard shows how many impressions are needed for statistical viability, so you know when the data is ready to act on.
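To see why low-traffic tests need longer runs, here is a rough per-variation sample-size estimate using the standard normal-approximation formula for comparing two conversion rates (an illustration only, not the dashboard's exact calculation):

```python
from math import ceil, sqrt
from statistics import NormalDist

def impressions_needed(base_rate, lift, alpha=0.05, power=0.8):
    """Rough impressions per variation to detect `lift` over `base_rate`.

    Standard two-proportion sample-size formula (normal approximation).
    """
    p1, p2 = base_rate, base_rate + lift
    z_a = NormalDist().inv_cdf(1 - alpha / 2)   # significance threshold
    z_b = NormalDist().inv_cdf(power)           # desired statistical power
    p_bar = (p1 + p2) / 2
    n = ((z_a * sqrt(2 * p_bar * (1 - p_bar))
          + z_b * sqrt(p1 * (1 - p1) + p2 * (1 - p2))) ** 2) / (p2 - p1) ** 2
    return ceil(n)

# Detecting a lift from 8% to 12% conversion needs roughly 880
# impressions per variation -- weeks of traffic for many stores.
n = impressions_needed(0.08, 0.04)
```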

Test one element at a time

When you change multiple elements simultaneously, you won’t know which change drove the results. Test one variable at a time for clear insights:
  • Good: Test 10% discount vs. 20% discount (one variable)
  • Avoid: Test 10% discount + Product A vs. 20% discount + Product B (two variables)

Look for consistent performance

Don’t make decisions based on a single metric. A winning variation should perform well across multiple metrics:
  • Higher conversion rate
  • Higher revenue per visit
  • Comparable or better average upsell value

Consider statistical significance

Before declaring a winner, ensure your results are statistically significant. Look for:
  • Clear performance differences (not just a 1-2% fluctuation)
  • Consistent trends over time
  • Sufficient sample size (impressions)
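One common way to check significance yourself is a two-proportion z-test on accepted offers vs. impressions. A sketch under that assumption (Aftersell's dashboard may use a different method):

```python
from math import sqrt
from statistics import NormalDist

def two_proportion_z_test(accepted_a, n_a, accepted_b, n_b):
    """Two-sided z-test for a difference in conversion rates.

    Returns the p-value; below 0.05 is a common significance bar.
    """
    p_a, p_b = accepted_a / n_a, accepted_b / n_b
    pooled = (accepted_a + accepted_b) / (n_a + n_b)
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    return 2 * (1 - NormalDist().cdf(abs(z)))

# Hypothetical: 80/1000 vs. 110/1000 accepted offers.
# p is about 0.02, so the difference is unlikely to be chance alone.
p_value = two_proportion_z_test(80, 1000, 110, 1000)
```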

Apply the winning variation

Once you’ve identified a clear winner:
  1. Stop the test
  2. Apply the winning variation to your funnel
  3. Monitor performance to ensure results remain consistent
  4. Consider running a new test to further optimize

A/B test vs. multivariate test

Not sure which testing method to use?
A/B Test:
  • Tests 2 variations
  • Best for testing single changes
  • Faster to reach statistical significance
  • Easier to interpret results
Multivariate Test:
  • Tests multiple combinations of variables
  • Best for testing multiple variables at once
  • Requires more traffic to get reliable results
  • More complex analysis
Use A/B testing when:
  • You want to test a specific change
  • You have moderate traffic levels
  • You want clear, straightforward results
Use Multivariate Testing when:
  • You want to test multiple variables simultaneously
  • You have high traffic levels
  • You want to find the optimal combination of elements

Common A/B test ideas

Need inspiration? Here are proven A/B tests to try:

Product-based tests

  • Same product vs. complementary product: Test whether customers prefer to buy more of what they just purchased or try something new
  • Single product vs. product bundle: Compare a single item offer against a bundle of complementary products
  • AI-powered vs. static product: Test dynamic AI recommendations against manually selected products

Discount tests

  • Discount amount: Test 10% off vs. 20% off vs. 30% off
  • Discount type: Compare percentage discounts (20% off) vs. fixed dollar amounts ($5 off)
  • No discount vs. discount: Test whether a discount is necessary for your audience

Quantity tests

  • Default quantity: Test quantity of 1 vs. quantity of 2 (especially effective for consumables)

Copy and messaging tests

  • Headline variations: Test different value propositions or messaging angles
  • Urgency messaging: Test with vs. without urgency language
  • CTA button text: Test different call-to-action phrases

Design tests

  • Long-form vs. short-form: Test detailed product descriptions against concise offers
  • Image variations: Test different product images or lifestyle photos
  • Timer duration: Test 5-minute vs. 10-minute vs. 15-minute countdown timers
For more detailed strategies and examples, check out our Best Practices guide.

Need help?

If you have questions about setting up or analyzing your A/B tests, chat with our support team using the live chat at the bottom right of the app.