- Layout A (Control): Your original or baseline offer
- Layout B (Variant): A modified version with one or more changes
What you can test
You can test nearly any element of your post-purchase offer, including:
- Products: Different products or product bundles
- Discounts: Different discount amounts or types (percentage vs. fixed dollar)
- Copy: Headlines, descriptions, and call-to-action text
- Images: Product images or lifestyle photos
- Urgency elements: Countdown timers with different durations
- Quantities: Default quantity of 1 vs. 2 (or more)
- Offer design: Different layouts or visual styles
How to set up an A/B test
Step 1: Create or open a funnel
- Open the Aftersell app from your Shopify Admin
- Navigate to Post-Purchase Funnels in the left sidebar
- Either create a new funnel or open an existing funnel you want to test
Step 2: Create your test
- In your funnel, click Create Test
- Select A/B Test from the options

Step 3: Set up your variations
- Layout A (Control): This is your original offer. Configure it with your baseline settings.
- Layout B (Variant): This is your test variation. Make the changes you want to test, which can include:
- Product selection
- Discount amount and type
- Offer copy and messaging
- Images and visual elements
- Countdown timer settings
- Default quantity
- Any other offer settings
Step 4: Start the test
- Review both variations to ensure they’re configured correctly
- Click Start Test to begin splitting traffic between the two versions
- Ensure your funnel is enabled and published
How Traffic Is Split
Traffic is automatically distributed evenly across all variations within a test. There is no way to manually adjust the traffic percentage for each layout. For example:
- If you have 2 layouts, traffic will split 50% / 50%
- If you have 3 layouts, traffic will split approximately 33% / 33% / 33%
- If you have 4 layouts, traffic will split 25% each
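The even split above can be pictured as a uniform random assignment per visitor. The sketch below is a hypothetical illustration of that behavior, not Aftersell's actual implementation:

```python
import random

def assign_layout(layouts):
    """Assign one visitor to a layout, uniformly at random.

    With 2 layouts this yields ~50/50 traffic; with 3, ~33/33/33; and so on.
    """
    return random.choice(layouts)

# Simulate 10,000 visitors across two layouts
counts = {"A": 0, "B": 0}
for _ in range(10_000):
    counts[assign_layout(["A", "B"])] += 1

print(counts)  # each layout receives roughly half the visitors
```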
Managing your test
Once your A/B test is live, its status will show as In progress. From here, you can pause, edit, reset, delete, or select a winner.
Test Statuses
- Not started: The test has been created but is not active. Traffic is not being split. If you see 100% of traffic going to one variation, click Start test.
- In progress: Traffic is being split evenly across all variations.
- Paused: Traffic splitting has stopped.
Pause the Test
Click Pause to temporarily stop traffic splitting. When a test is paused:
- Traffic is no longer split between variations
- Only the first upsell created (Layout A) will be shown
- You cannot choose which variation displays while paused
Edit the Test
Use the three-dot menu to select Edit test if you need to adjust products, pricing, layout, or messaging. For accurate results, avoid making major changes while a test is running.
Reset Analytics
Select Reset analytics to clear current test data and restart tracking from zero. This resets only the A/B test data. Historical lifetime offer data is not deleted.
Delete the Test
Select Delete test to permanently remove the A/B test. If deleted:
- Traffic will no longer be split
- The offer returns to normal behavior
- Lifetime offer analytics become visible again
Select a Winner
When you are ready to end the test:
- Click Select winner
- Choose the variation to keep
- Confirm

How A/B test analytics work
Once your test is running, you can track performance in the Analytics section of your Aftersell dashboard.
Accessing your test results
- Go to Analytics in the Aftersell app
- Click the Tests tab
- Select the Funnel containing your test
- Choose the specific Test you want to analyze

Understanding the analytics
Your A/B test analytics include three main sections:
1. Group-level performance
This shows side-by-side performance for each variation (Layout A vs. Layout B):
- Revenue: Total revenue generated by each variation
- Revenue per visit: Average revenue per impression for each variation
- Conversion rate: Percentage of customers who accepted each variation
- Impressions: Number of times each variation was shown
- Accepted offers: Number of times each variation was accepted
2. Performance by variable
This section shows how each metric performs on average across all variables in your test. This is particularly useful for understanding overall trends.
3. All groups table
A comprehensive table that summarizes and compares all test groups side by side, making it easy to identify the winning variation at a glance.
Key metrics to watch
- Conversion rate: The most important metric for determining which offer resonates better with customers
- Revenue per visit: Shows which variation generates more revenue per impression, accounting for both conversion rate and order value
- Average upsell value: Indicates whether one variation leads to higher-value purchases
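These metrics follow directly from the raw counts in the group-level report. As a rough sketch (the figures below are hypothetical, and the exact formulas Aftersell uses are not documented here):

```python
def conversion_rate(accepted, impressions):
    """Share of impressions that resulted in an accepted offer."""
    return accepted / impressions

def revenue_per_visit(revenue, impressions):
    """Average revenue generated per time the offer was shown."""
    return revenue / impressions

def average_upsell_value(revenue, accepted):
    """Average order value of the accepted offers."""
    return revenue / accepted

# Hypothetical Layout A results: 1,000 impressions, 80 accepts, $1,200 revenue
print(conversion_rate(80, 1_000))       # 0.08 -> 8% conversion rate
print(revenue_per_visit(1_200, 1_000))  # 1.2  -> $1.20 per visit
print(average_upsell_value(1_200, 80))  # 15.0 -> $15 average upsell
```

Note that revenue per visit blends conversion rate and upsell value into one number, which is why it is often the fairest single metric for comparing variations.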
Best practices for A/B testing
Run tests long enough
- Minimum duration: 2-4 weeks
- Minimum impressions: At least 100-200 impressions per variation (more is better)
Shorter tests may not produce statistically reliable results, especially with low traffic.
Test one element at a time
When you change multiple elements simultaneously, you won’t know which change drove the results. Test one variable at a time for clear insights:
- ✅ Good: Test 10% discount vs. 20% discount (one variable)
- ❌ Avoid: Test 10% discount + Product A vs. 20% discount + Product B (two variables)
Look for consistent performance
Don’t make decisions based on a single metric. A winning variation should perform well across multiple metrics:
- Higher conversion rate
- Higher revenue per visit
- Comparable or better average upsell value
Consider statistical significance
Before declaring a winner, ensure your results are statistically significant. Look for:
- Clear performance differences (not just 1-2% variations)
- Consistent trends over time
- Sufficient sample size (impressions)
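A standard way to check whether a conversion-rate difference is real is a two-proportion z-test. This is a general statistical sketch, not a feature of the Aftersell dashboard, and the accept and impression counts below are hypothetical:

```python
from math import erf, sqrt

def two_proportion_z_test(accepts_a, n_a, accepts_b, n_b):
    """Return the two-sided p-value for the difference in conversion rates."""
    p_a, p_b = accepts_a / n_a, accepts_b / n_b
    pooled = (accepts_a + accepts_b) / (n_a + n_b)
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    # Two-sided p-value from the standard normal CDF
    return 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))

# Hypothetical: Layout A converts 80/1,000, Layout B converts 110/1,000
p = two_proportion_z_test(80, 1_000, 110, 1_000)
print(f"p-value: {p:.4f}")  # below 0.05 suggests a real difference
```

A p-value under 0.05 is the conventional threshold, but with small sample sizes even large observed differences can fail to reach it, which is why the impression minimums above matter.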
Apply the winning variation
Once you’ve identified a clear winner:
- Stop the test
- Apply the winning variation to your funnel
- Monitor performance to ensure results remain consistent
- Consider running a new test to further optimize
A/B test vs. multivariate test
Not sure which testing method to use?

| A/B Test | Multivariate Test |
|---|---|
| Tests 2 variations | Tests multiple combinations of variables |
| Best for testing single changes | Best for testing multiple variables at once |
| Faster to reach statistical significance | Requires more traffic to get reliable results |
| Easier to interpret results | More complex analysis |
Choose an A/B test when:
- You want to test a specific change
- You have moderate traffic levels
- You want clear, straightforward results
Choose a multivariate test when:
- You want to test multiple variables simultaneously
- You have high traffic levels
- You want to find the optimal combination of elements
Common A/B test ideas
Need inspiration? Here are proven A/B tests to try:
Product-based tests
- Same product vs. complementary product: Test whether customers prefer to buy more of what they just purchased or try something new
- Single product vs. product bundle: Compare a single item offer against a bundle of complementary products
- AI-powered vs. static product: Test dynamic AI recommendations against manually selected products
Discount tests
- Discount amount: Test 10% off vs. 20% off vs. 30% off
- Discount type: Compare percentage discounts (20% off) vs. fixed dollar amounts ($5 off)
- No discount vs. discount: Test whether a discount is necessary for your audience
Quantity tests
- Default quantity: Test quantity of 1 vs. quantity of 2 (especially effective for consumables)
Copy and messaging tests
- Headline variations: Test different value propositions or messaging angles
- Urgency messaging: Test with vs. without urgency language
- CTA button text: Test different call-to-action phrases
Design tests
- Long-form vs. short-form: Test detailed product descriptions against concise offers
- Image variations: Test different product images or lifestyle photos
- Timer duration: Test 5-minute vs. 10-minute vs. 15-minute countdown timers