
Experiments

Evaluate the performance of your app as you implement new features and conduct experiments

There are many cases where you may end up with different feature flags or experiments for different users. For example, you might be:

  • Controlling your rollout by enabling features for a percentage of your users to monitor performance and stability
  • Creating and testing different mobile onboarding experiences concurrently
  • Testing different landing pages for your mobile app
  • Implementing new features with different UI

Through the “Experiments” API, you can keep track of your experiments and their impact on Bug Reports, Crash Reports, and App Performance for each user, and even filter by them. This can help you with:

  • Detecting whether the potential source of any latency or issues in the app was introduced by different experiment variants or new features
  • Gaining visibility into the latencies of your variants across different metrics
  • Filtering by your experimental variants to analyze whether they impact your performance or cause crashes
  • Debugging issues faster by understanding whether the experimental values contributed to an issue

Adding Experiments

To track your feature flags or experiments in the dashboard, use the following method:

Instabug.addExperiments(["exp1"])
[Instabug addExperiments:@[@"exp1"]];
Instabug.addExperiments(List<String> experiments);
Instabug.addExperiments(List<String> experiments)
Instabug.addExperiments(['exp1']);
Instabug.addExperiments(['exp1']);

You can have up to 200 experiments, with no duplicates. Each experiment name can be at most 70 characters long. Experiments are not removed at the end of the session or when logOut is called.
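If you generate experiment names dynamically, you may want to validate them against these limits before sending them. Below is a minimal Swift sketch; addValidatedExperiments is a hypothetical helper, not part of the SDK:

import Instabug

func addValidatedExperiments(_ names: [String]) {
  // Hypothetical helper: drop duplicates and any name longer than
  // the 70-character limit before handing the list to the SDK.
  let valid = Array(Set(names)).filter { $0.count <= 70 }
  Instabug.addExperiments(valid)
}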

Example Usage

Below is an example of where in your code you would use experiments. In this example, you are experimenting with feature logic that controls whether or not a user has a dark mode toggle available.

if darkModeToggleEnabled {
  Instabug.addExperiments(["darkModeToggleAvailable"])
  // Display dark mode toggle
}

Experiments in Performance Monitoring

Once you add the API call to your code, you will be able to view the experiments in the patterns section of Cold App Launch, Screen Loading, and UI Hangs.
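For the experiments to be reflected in a session's performance metrics, it helps to register them as early as possible in the launch path. A minimal Swift sketch, assuming a UIKit app delegate and a hypothetical assignedVariants() function that returns the variants the current user is enrolled in:

import UIKit
import Instabug

func application(_ application: UIApplication,
                 didFinishLaunchingWithOptions launchOptions: [UIApplication.LaunchOptionsKey: Any]?) -> Bool {
  // Start the SDK, then tag the session with this user's variants.
  Instabug.start(withToken: "APP_TOKEN", invocationEvents: [.shake])
  // assignedVariants() is a hypothetical stand-in for your own
  // feature-flag or experimentation service.
  Instabug.addExperiments(assignedVariants())
  return true
}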

You can see the different latencies of your metric in correlation with each experimental variant. For example, in the previous screenshot, users who had guest_mode enabled had a very different Apdex score and p50 and p95 latencies.

You can also isolate your experiment by filtering on a specific experiment value for further analysis, to understand whether it is impacting the latency of Cold App Launch, Screen Loading, or UI Hangs.

If you filter by guest_mode and No Experiments, as shown in the following screenshot, the No Experiments option represents occurrences without any experiments applied. You can also filter by one or more experimental values.

The No Experiments selection will help you spot and compare any differences in performance for each metric.

Experiments in Crash Reporting

Rolling out new features or making modifications to your code can increase the number of errors you see. By analyzing how different experiment variants contribute to your crashes, you can minimize debugging effort and save your team time.

For example, if you just rolled out a new recommendation feature to a subset of your users, you can use the filters to view all the crashes that occurred for users who had this feature enabled.
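For this to work, the experiment needs to be added at the point where the feature is enabled, mirroring the dark mode example above. A short Swift sketch, where isRecommendationsEnabled(for:), currentUser, and showRecommendations() are hypothetical pieces of your own code:

if isRecommendationsEnabled(for: currentUser) {
  // Tag the session so crashes from this cohort can be filtered
  // by the Recommendations_enabled experiment in the dashboard.
  Instabug.addExperiments(["Recommendations_enabled"])
  showRecommendations()
}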

In the screenshot below, we filtered by the experiment Recommendations_enabled to view the relevant crashes.

You can also view the experiment variants attached to each crash report on your dashboard in the patterns section of a crash.

Experiments and Team Ownership

If you have a team that is responsible for a specific feature flag or experiment, you can automatically assign them the relevant issues and forward them to their favorite tool. For more details, see the Team Ownership documentation.

In the screenshot below, we wanted to assign crashes relevant to the experiment Recommendations_enabled to the team responsible for this feature and auto-forward them to their Jira board.

Removing Experiments

If your experiment has concluded or you would simply like to remove it, you can use this method:

Instabug.removeExperiments(["Recommendations_enabled"])
[Instabug removeExperiments:@[@"exp2"]];
Instabug.removeExperiments(List<String> experiments);
Instabug.removeExperiments(List<String> experiments);
Instabug.removeExperiments(['exp1']);
Instabug.removeExperiments(['exp1']);
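If a flag can also be turned off for a user mid-rollout, you can pair the two calls so the session always reflects the current state. A sketch, where recommendationsEnabled is an assumed flag value from your own code:

if recommendationsEnabled {
  Instabug.addExperiments(["Recommendations_enabled"])
} else {
  // The flag is off for this user, so stop tagging their
  // sessions with the experiment.
  Instabug.removeExperiments(["Recommendations_enabled"])
}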

Clearing Experiments

You can use the following method to clear all experiments from your reports:

// Swift
Instabug.clearAllExperiments()

// Objective-C
[Instabug clearAllExperiments];

// Java
Instabug.clearAllExperiments();

// Kotlin
Instabug.clearAllExperiments()

// Dart (Flutter)
Instabug.clearAllExperiments();

// JavaScript (React Native)
Instabug.clearAllExperiments();
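Since experiments persist across sessions and are not removed when logOut is called, one common pattern is to clear them as part of your own sign-out flow. A minimal Swift sketch, where signOut() is a hypothetical function in your app:

func signOut() {
  // Experiments survive logOut, so clear user-specific ones
  // explicitly before switching users.
  Instabug.clearAllExperiments()
  Instabug.logOut()
}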