Frequently Asked Questions

A curated summary of the top questions asked in our Slack community, covering implementation, functionality, and building better products generally.
TROUBLESHOOTING EXPERIMENTS

Why is my experiment only showing overridden values and not running as expected?

If you're only seeing overridden values in the Exposure Stream for your experiment, there could be several reasons for this. Here are some steps you can take to troubleshoot:

1. Check the Initialization Status: Each hook exposes an isLoading flag, and the StatsigProvider supplies a StatsigContext whose initialization status is available as a field. Use one of these to avoid acting on a gate check before the SDK has initialized with the correct user.
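As a sketch of step 1, assuming the legacy statsig-react SDK (where useGate returns an isLoading flag and StatsigContext exposes an initialized field), the guard might look like this; gateIsTrustworthy is a hypothetical helper name, not part of the SDK:

```typescript
// Hypothetical helper capturing the guard from step 1: only trust a
// gate result once the hook has finished loading AND the SDK has
// initialized with the real user.
function gateIsTrustworthy(isLoading: boolean, initialized: boolean): boolean {
  return !isLoading && initialized;
}

// Usage in a component (sketch, assuming the legacy statsig-react SDK):
//
//   const { isLoading, value } = useGate('new_checkout_flow');
//   const { initialized } = useContext(StatsigContext);
//   if (!gateIsTrustworthy(isLoading, initialized)) return null; // wait
//   return value ? <NewCheckout /> : <OldCheckout />;
```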

2. Check the Data Flow: Ensure that the experiment's id_type is set correctly and that the IDs you pass match the format of the IDs logged from your SDKs. You can verify this on the Metrics page of your project.
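For example, if the experiment's id_type is a custom ID, the user object must carry that ID under customIDs in the same format the Metrics page shows. This is a sketch; companyID is a hypothetical custom ID type:

```typescript
// Hypothetical user shape for an experiment whose id_type is a custom
// "companyID". The value must match the format of IDs already logged
// by your SDKs (check the Metrics page to confirm).
const user = {
  userID: 'user-123',         // standard user ID
  customIDs: {
    companyID: 'company-456', // must match the experiment's id_type
  },
};
```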

3. Check the Query History: If your data is still not showing up in the console, check the query history for the user to see which data is being pulled and whether any queries are failing or not executing.

4. Check the Exposure Counts: If you're seeing lower than expected exposure counts, it could be because you initialized with a StatsigUser object that had no userID set and then provided the userID on a subsequent render. This causes the SDK to refetch values, but it logs an exposure for the "empty" userID first. To prevent this, ensure the userID is set on the StatsigUser object before initializing the SDK.
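The failure mode in step 4 can be sketched as follows; hasStableIdentity is a hypothetical helper for deciding when it is safe to initialize, not an SDK function:

```typescript
// Hypothetical guard: only initialize the SDK (or mount StatsigProvider)
// once the user object actually carries a userID. Initializing with an
// empty user and updating it on a later render logs an exposure for the
// "empty" userID first, which skews exposure counts.
type PartialUser = { userID?: string };

function hasStableIdentity(user: PartialUser): boolean {
  return typeof user.userID === 'string' && user.userID.length > 0;
}

// e.g. gate initialization on identity being known:
//   if (hasStableIdentity(user)) { /* initialize / render StatsigProvider */ }
```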

If you've checked all of these and the issue persists, it might be best to reach out for further assistance.

In some cases, users may still qualify for overrides based on the attributes you're sending on the user object. This may be due to caching, or because the user qualifies for other segments that control overrides.

For example, if a user object includes "first_utm_campaign": "34-kaiser-db", that user would qualify for any segment matching that attribute that is being used in the overrides.
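Concretely, a user object like the one below would match a segment condition on that custom attribute (a sketch; the attribute name and value follow the example above, and the segment itself is hypothetical):

```typescript
// This user would qualify for any override segment whose condition
// matches custom.first_utm_campaign == "34-kaiser-db".
const user = {
  userID: 'user-789',
  custom: {
    first_utm_campaign: '34-kaiser-db',
  },
};
```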

It's also important to note that overridden users will see their assigned variant but will be excluded from experiment results. For experiments not in layers, there is an option to include overridden users in results; that option is not available for experiments in layers.

Lastly, consider whether overrides are the right tool here rather than a targeting gate. Overrides are useful for testing the Test variant in a staging environment before starting the experiment in production. If, however, some of your customers have opted out of being experimented on, a targeting gate is likely the more suitable option.
