This section refers to a deprecated version of the product. The new version is FE&R. To access FE&R, contact your CSM.
📘 To learn more, read the FE&R documentation.

Configuring an A/B Test


📖 Definition

An A/B test is a type of feature that enables you to test the performance of a new version of an element on your website or application. After analyzing the results of your test, you can determine which version performed best according to the KPIs you want to reach (e.g., the conversion rate). You can then apply these changes directly to your website/application.
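
Conceptually, an A/B test comes down to assigning each visitor to a variation and comparing a KPI across variations. The sketch below illustrates that idea only; it is not how AB Tasty allocates traffic, and every name in it is hypothetical.

```typescript
// Conceptual sketch only (not how AB Tasty assigns traffic internally):
// each visitor is deterministically assigned to a variation, and the KPI
// (here, a conversion) is counted per variation so the rates can be compared.

type Variation = "original" | "variation-1";

// Hypothetical assignment: hash the visitor ID into [0, 1) so the same
// visitor always sees the same variation.
function assignVariation(visitorId: string, split = 0.5): Variation {
  let hash = 0;
  for (const char of visitorId) {
    hash = (hash * 31 + char.charCodeAt(0)) % 1_000_000;
  }
  return hash / 1_000_000 < split ? "original" : "variation-1";
}

// Hypothetical per-variation counters used to compute conversion rates.
const results: Record<Variation, { visitors: number; conversions: number }> = {
  original: { visitors: 0, conversions: 0 },
  "variation-1": { visitors: 0, conversions: 0 },
};

function recordVisit(visitorId: string, converted: boolean): void {
  const variation = assignVariation(visitorId);
  results[variation].visitors += 1;
  if (converted) results[variation].conversions += 1;
}
```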

โš™๏ธ Configuration

To configure an A/B Test, apply the following steps:

  1. From the dashboard, click Create a use case.

  2. Select the AB Test template. [Basic information]

  3. Fill in the name of your AB Test and its description.

  4. Choose the primary and secondary KPIs you want to follow.

  5. Click Save and continue. [Targeting]

  6. Define the type of users who will see your feature. For more information, refer to the article about Targeting configuration.

  7. Click Save and continue. [Variations]

  8. Configure your variations by defining their flag's name, type (text, number, boolean, array, or object), and value. A sketch showing how your application can read these flag values is provided after this list.

  9. Click Save and continue. [Allocation]

  10. Define the percentage of traffic you want to assign to each variation. You can choose between manual and dynamic allocation: manual allocation lets you set the percentage assigned to each variation yourself, whereas dynamic allocation automatically diverts new traffic to the best-performing variation based on each variation's results. [Overview]

  11. Click Save and continue.

  12. Check that every step has been configured correctly.

  13. (Optional) Notify your teammates that the A/B Test is ready to go.
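
Once the variations are configured, your application reads the flag value served to each visitor, typically through one of the SDKs or the Decision API, and displays the matching version. Below is a minimal sketch assuming the Flagship JavaScript SDK's start / newVisitor / fetchFlags / getFlag interface; the environment ID, API key, and flag key are placeholders, so check the SDK reference for the exact signatures in your version.

```typescript
// Minimal sketch, assuming the Flagship JS SDK exposes start / newVisitor /
// fetchFlags / getFlag as shown; verify against the SDK reference.
import Flagship from "@flagship.io/js-sdk";

async function resolveButtonLabel(): Promise<string> {
  // <ENV_ID> and <API_KEY> are placeholders from your account settings.
  Flagship.start("<ENV_ID>", "<API_KEY>");

  const visitor = Flagship.newVisitor({
    visitorId: "visitor-123",              // your own visitor identifier
    context: { returning_customer: true }, // attributes used by your targeting
  });

  // Fetch the campaigns and variations assigned to this visitor.
  await visitor.fetchFlags();

  // "checkout_button_label" is a hypothetical flag key; the second argument
  // is the default value served when the visitor is not in the A/B test.
  const flag = visitor.getFlag("checkout_button_label", "Proceed to checkout");
  return flag.getValue();
}
```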

💡 Use case

Let's say that on your e-commerce website, you have noticed that your users have trouble proceeding to checkout once their order is complete. You think this may be due to the color of the button and the wording "Proceed to checkout", which may not be well suited. You have come up with 2 solutions but don't know which one would bring the highest conversion rate. In this case, you can create an A/B Test:

  1. Create an AB Test use case on your Flagship account.

  2. In the basic information step, select the KPI related to a click on the "Proceed to checkout" button. You need to have configured it in your codebase beforehand (see the sketch after this list).

  3. Select the conversion rate as a sub KPI type.

  4. In the targeting step, select "All users".

  5. In the variations step, configure the two variations with two flags of type Text, one for each variation.

  6. In the allocation step, assign 34% of your traffic to the original version and 33% to each of the other 2 variations.

  7. Check the overview of your AB Test.

  8. Save and activate your use case from the dashboard.
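
Step 2 above assumes the click KPI is already wired in your codebase. The sketch below shows one hedged way to do that with the Flagship JavaScript SDK: apply the Text flag to the button, then report the click as an event hit. The flag key, action name, and the HitType/EventCategory builders are assumptions to adapt to your setup; double-check them against the SDK reference.

```typescript
// Hedged sketch: apply the Text flag to the button and report the click as
// an event hit. The HitType/EventCategory names assume the Flagship JS SDK;
// verify them against the SDK reference before relying on this.
import Flagship, { EventCategory, HitType } from "@flagship.io/js-sdk";

async function wireCheckoutButton(button: HTMLButtonElement): Promise<void> {
  Flagship.start("<ENV_ID>", "<API_KEY>");
  const visitor = Flagship.newVisitor({ visitorId: "visitor-123" });
  await visitor.fetchFlags();

  // The Text flag configured in the variations step drives the wording shown
  // to this visitor ("checkout_button_label" is a hypothetical key).
  button.textContent = visitor
    .getFlag("checkout_button_label", "Proceed to checkout")
    .getValue();

  // Report the click so the "Proceed to checkout" KPI is fed into reporting.
  button.addEventListener("click", () => {
    visitor.sendHit({
      type: HitType.EVENT,
      category: EventCategory.ACTION_TRACKING,
      action: "click_proceed_to_checkout", // hypothetical action name
    });
  });
}
```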

Need additional information?

Submit your request at product.feedback@abtasty.com

Always happy to help!

