Daily active users, revenue, clicks on “add to shopping cart” — most metrics we track are ones we want to see go up. Within Statsig, we celebrate these increases when they are statistically significant with a deep, satisfying green color:
But what about when this isn’t the case?
You might want to measure things that are not always better when they go up, like:
the count of crashes in your app
removals of items from a shopping cart
page loading time
For all of these, a significant increase is bad news, and we want everyone using the platform to recognize it as such at a glance.
In Statsig, we now make this easy from the detail page of any metric. Just expand "Details", hover your mouse over "Directionality", and click the edit pencil icon.
From there, simply choose whether an increase or a decrease is desired for this metric, and confirm!
We ran into a case like this at Statsig when we ran an experiment that preloads a page's content while your cursor hovers over its link, on the theory that you are likely to visit it imminently. After collecting data for several weeks, we saw that we significantly reduced page load time (avg_time_to_first_paint)!
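The hover-prefetch idea behind that experiment can be sketched roughly as follows. This is an illustration of the general technique, not Statsig's actual implementation: `prefetch` fires once per URL after the cursor has lingered for `delayMs`, and is cancelled if the cursor leaves sooner (so quick mouse passes don't trigger wasted requests).

```javascript
// Hypothetical hover-prefetch helper (assumed names, not Statsig's code).
// `prefetch` is any function that warms the cache for a URL, e.g. by
// injecting a <link rel="prefetch"> tag or issuing a low-priority fetch.
function createHoverPrefetcher(prefetch, delayMs = 100) {
  const pending = new Map(); // url -> timer for hovers still in flight
  const done = new Set();    // urls we have already prefetched

  return {
    // Wire this to each link's mouseenter event.
    hoverStart(url) {
      if (done.has(url) || pending.has(url)) return;
      pending.set(url, setTimeout(() => {
        pending.delete(url);
        done.add(url);
        prefetch(url);
      }, delayMs));
    },
    // Wire this to the matching mouseleave event.
    hoverEnd(url) {
      const timer = pending.get(url);
      if (timer !== undefined) {
        clearTimeout(timer);
        pending.delete(url);
      }
    },
  };
}
```

The small delay is the key design choice: it separates deliberate hovers (likely navigations, worth prefetching) from the cursor merely passing over a link.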
Yet, this was displayed in a harsh red color, giving the impression that this effect isn’t what we want. At a glance, people might overlook the meaning of the metric here and draw the wrong conclusion.
After setting the directionality of avg_time_to_first_paint to “Decrease,” that effect is now illustrated in the satisfying green color it deserves!
You can use Statsig to measure all kinds of things you care about, through all of the ups and downs of product development!
What will your team measure?