A/B Testing
Introduction
A/B tests are a fundamental tool for optimizing game features by comparing different variants and analyzing their impact on player behavior and game performance. By running A/B tests, you can make data-driven decisions to enhance monetization, engagement, and overall player satisfaction.
What is an A/B Test?
An A/B test involves dividing users into different groups, each exposed to a distinct variant of a feature. By comparing the performance metrics across these groups, you can determine which variant performs better and make informed decisions based on the results.
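To make the idea of dividing users into groups concrete, here is one common way to assign users to cohorts deterministically, by hashing the user ID together with the experiment name. This is an illustrative sketch only, not how our feature-flag system necessarily works internally:

```python
import hashlib

def assign_cohort(user_id: str, experiment: str, cohorts: list[str]) -> str:
    """Deterministically assign a user to a cohort.

    Hashing user_id together with the experiment name means the same
    user always lands in the same cohort for a given experiment, while
    different experiments split users independently of each other.
    """
    digest = hashlib.sha256(f"{experiment}:{user_id}".encode()).hexdigest()
    return cohorts[int(digest, 16) % len(cohorts)]
```

Because the assignment is a pure function of the user and experiment, no per-user state needs to be stored to keep cohort membership stable across sessions.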
A/B Test Analysis
We analyze A/B test results using internal analytics tools designed to provide a complete picture of performance. Feature flags allow us to place users into controlled test groups, while our systems track monetization, ad performance, engagement, and other core metrics to evaluate each experiment.
Experiment Naming Convention
Use the following pattern for all experiment names:
{game_code}_{platform}_{TestName}
Components
game_code: Three-letter game code (e.g., FNI, SBP, FMT). If the game doesn’t have a three-letter code, use the game name (e.g., GameName).
platform: ios for iOS, and for Android.
TestName: The experiment name. Use PascalCase for multi-word names.
Examples
fmt_ios_NewUI
pie_and_AdventureModeStartXp
fni_and_NewWelcomePack
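The naming pattern can be expressed as a small helper function. This is a sketch for illustration, not part of the SDK; the codes and test names come from the examples above:

```python
def experiment_name(game_code: str, platform: str, test_name: str) -> str:
    """Build an experiment name following {game_code}_{platform}_{TestName}.

    game_code: three-letter game code (lowercased in the final name)
    platform:  "ios" for iOS or "and" for Android
    test_name: PascalCase experiment name
    """
    assert platform in ("ios", "and"), "platform must be 'ios' or 'and'"
    return f"{game_code.lower()}_{platform}_{test_name}"

# experiment_name("FMT", "ios", "NewUI") -> "fmt_ios_NewUI"
```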
Available Functions
Note: If you're using our Lion Remote Configs package, AbCohort tracking can be set up to run automatically when your app starts - no manual coding required.
The AbCohort function assigns users to a test cohort and adds the AbCohort parameter to all subsequent events. This function should be called every session when users log in.
Critical Parameters
Required
experiment_name: The name of the experiment (e.g., abc_and_NewInterTimer).
experiment_cohort: The name of the variant or cohort within the experiment (e.g., aggressive, control, passive).
Implementation
The ab_cohort value is automatically reset at the start of a new session. If your workflow requires explicitly ending an experiment during the same session, you can use ClearAbCohort to remove the parameter immediately.
Example
To illustrate the proper setup and naming convention for an A/B test, consider the following example:
Experiment Setup
An anonymous game (game code ABC) runs an experiment to evaluate the impact of different interstitial ad timings. The experiment is named abc_and_NewInterTimer, with the following variants and configurations:
Variant | inter_between_time | inter_start_time
control | 90 | 90
aggressive | 60 | 60
passive | 120 | 120
Each variant receives different values for the experiment’s inter_between_time and inter_start_time remote config parameters.
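One way to see why the cohort name, rather than an individual config value, identifies the variant: each cohort maps to a full set of remote-config parameters. A minimal sketch using the values from the table above (the lookup function is hypothetical, not part of the SDK):

```python
# Cohort -> remote-config mapping for abc_and_NewInterTimer,
# taken from the experiment table above.
COHORT_CONFIGS = {
    "control":    {"inter_between_time": 90,  "inter_start_time": 90},
    "aggressive": {"inter_between_time": 60,  "inter_start_time": 60},
    "passive":    {"inter_between_time": 120, "inter_start_time": 120},
}

def config_for(cohort: str) -> dict:
    """Return the remote-config values for a cohort (illustrative only)."""
    return COHORT_CONFIGS[cohort]
```

Passing a single value like "60" would lose this grouping: the same value could belong to several parameters or experiments, while the cohort name unambiguously identifies one row of the table.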
✅ Correct Function Usage
Call the AbCohort function based on the user’s allocation to a variant:
LionAnalytics.AbCohort("abc_and_NewInterTimer", "control")
LionAnalytics.AbCohort("abc_and_NewInterTimer", "aggressive")
LionAnalytics.AbCohort("abc_and_NewInterTimer", "passive")
❌ Incorrect Function Usage
Avoid calling the function with individual configuration values:
LionAnalytics.AbCohort("abc_and_NewInterTimer", "60")
LionAnalytics.AbCohort("abc_and_NewInterTimer", "90")
LionAnalytics.AbCohort("abc_and_NewInterTimer", "120")