(This release also includes updates to the Experiment Timeline view.)
The "batting average" view lets you see how easy or hard a metric is to move. You can filter to a set of shipped experiments and see how many of them moved a metric by 1% vs. 10%. As with other meta-analysis views, you can filter down to a team, a tag, or even by whether results were statistically significant.
Common ways to use this include:
Sniff-testing a claim that the next experiment will move this metric by 15%.
Establishing reasonable goals based on your past ability to move this metric.
This view now features summary stats (e.g., how many experiments shipped control) so you don't have to sit and manually tally them yourself.
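To make the arithmetic behind this view concrete, here is a minimal sketch (not Statsig's implementation) of computing a batting average from the observed lifts of shipped experiments. The `lifts` data shape and the `significant_only` flag are assumptions for this example:

```python
def batting_average(lifts, threshold, significant_only=False):
    """lifts: list of (metric_lift_pct, was_significant) tuples for shipped
    experiments. Returns the fraction whose absolute lift met the threshold.
    Optionally restrict the pool to statistically significant results."""
    pool = [lift for lift, sig in lifts if sig or not significant_only]
    if not pool:
        return 0.0
    return sum(1 for lift in pool if abs(lift) >= threshold) / len(pool)
```

With a batting average like this in hand, a claimed 15% lift can be sanity-checked against how often past experiments cleared that bar.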
We’ve expanded Metric Drilldown to provide you with more detailed options for analyzing user engagement. Now, you can easily explore how user behavior changes over daily, weekly, and monthly time frames, helping you uncover deeper insights.
Expanded Chart Granularities
To give you a more comprehensive view of your metrics, we’ve added support for Monthly Active Users (MAU) alongside the existing Daily Active Users (DAU) and Weekly Active Users (WAU). This expansion allows you to track user engagement over various time frames, making it easier to identify long-term trends or seasonal patterns.
Improved UX for Granularity Selection
We’ve also refined the UX for selecting DAU, WAU, and MAU, making it more intuitive and efficient to switch between these views. Whether you’re analyzing short-term spikes or long-term user behavior, the updated interface simplifies the process of drilling down into the details that matter most.
Weekly and Monthly Granularities in Drilldown Charts
To offer even more flexibility, we’ve introduced support for weekly and monthly granularities in metric drilldown charts. This allows you to view and compare metrics over extended periods, helping you to detect broader trends and patterns that might not be visible with daily data alone.
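Conceptually, DAU/WAU/MAU and the drilldown granularities all come down to bucketing events by time period and counting unique users per bucket. The sketch below illustrates that idea only; it is not Statsig's implementation, and the Monday week-start convention is an assumption:

```python
from datetime import date, timedelta

def bucket_start(d, granularity):
    """Map a date to the start of its daily/weekly/monthly bucket.
    Weeks start on Monday here; the product's convention may differ."""
    if granularity == "daily":
        return d
    if granularity == "weekly":
        return d - timedelta(days=d.weekday())
    if granularity == "monthly":
        return d.replace(day=1)
    raise ValueError(f"unknown granularity: {granularity}")

def active_users(events, granularity):
    """events: list of (user_id, date) pairs. Returns a mapping of
    bucket start date -> unique active users, i.e. DAU/WAU/MAU counts."""
    buckets = {}
    for user, d in events:
        buckets.setdefault(bucket_start(d, granularity), set()).add(user)
    return {b: len(users) for b, users in sorted(buckets.items())}
```

The same events roll up differently at each granularity, which is why monthly buckets can surface seasonal patterns that daily charts hide.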
We’ve added a new feature that lets you download your entire dashboard as a PDF, making it easier to share your insights with others. Whether you’re preparing for a meeting, sharing results with your team, or simply saving a snapshot of your data, this feature provides a convenient way to package and distribute your dashboard.
With just a few clicks, you can export your dashboard, preserving all charts, metrics, and visualizations in a format that’s easy to share and view across different devices. This feature ensures that everyone stays aligned and informed, even when working offline or outside of the platform.
To download your dashboards, click the "..." button in the top right corner of your dashboard and select "Export as PDF".
We’ve completely revamped the User Journeys experience, delivering a modern and intuitive interface that makes it easier than ever to explore and understand user behavior.
Modernized UX
The User Journeys interface has undergone a significant glow-up. We’ve introduced a cleaner, more contemporary look and feel, making it easier to navigate and interact with the data. The new design not only improves usability but also enhances your overall analysis experience, allowing you to focus more on insights and less on the interface itself.
Where'd they go AND how'd they get there?
Understanding how users interact with your product is critical. Now, with support for journeys ending with an event, you can choose to analyze the paths users take from a specific starting point or the routes they follow to reach a particular destination. This flexibility allows you to pinpoint critical moments in the user experience, whether you’re interested in the lead-up to a key conversion event or the aftermath of an initial user action.
Sticky Hidden Events
To streamline your analysis, we’ve introduced sticky hidden events. When you hide an event to reduce noise in your journey visualizations, this preference is now preserved across your explorations. No more repeatedly decluttering your data—focus on the paths that matter without unnecessary distractions.
We really really like funnels. We think they’re great. Then we thought, what if we made them even better? So we did! Our funnels now offer richer insights, more configuration options, and are easier to understand at a glance.
Table Overhaul
We’ve overhauled the table in the main Conversion view for funnels. It now presents a detailed tabular view of the key metrics at each step of your funnel. This table is also configurable, allowing you to choose between a high-level summary or more detailed insights. You can now see exactly what’s happening at each step, with metrics like conversion rates, drop-offs, total conversions, and more, all in one place.
Conversion Rate vs. Overall Conversions
You now have flexibility in how you view your funnels. By default, the y-axis displays the conversion rate for each step. We’ve added a toggle to switch the y-axis to show the total number of conversions. The conversion rate view is ideal for comparing how different groups perform in terms of conversion rates, while the total conversions view helps you understand the overall number of users progressing through your funnel.
Information Richness
Funnels now include even more insights at a glance. Under each step, you’ll find summaries of the conversion rate, drop-off rate, total number of conversions, number of drop-offs, and median time to convert.
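The per-step summaries above can be derived from the timestamps at which each user completed each step. Here's an illustrative sketch of that computation (the `user_steps` data shape is an assumption for this example, not Statsig's internal model):

```python
from statistics import median

def funnel_step_stats(user_steps, num_steps):
    """user_steps: {user_id: [t1, t2, ...]} — timestamps (in seconds) of the
    funnel steps each user completed, in order. Returns, per step: the total
    conversions, conversion rate from the previous step, number of drop-offs,
    and median time to convert from the prior step."""
    stats = []
    for i in range(num_steps):
        reached = [ts for ts in user_steps.values() if len(ts) > i]
        prev = [ts for ts in user_steps.values() if len(ts) >= i] if i else reached
        times = [ts[i] - ts[i - 1] for ts in reached] if i else []
        stats.append({
            "step": i + 1,
            "total": len(reached),
            "conversion_rate": len(reached) / len(prev) if prev else 0.0,
            "drop_offs": len(prev) - len(reached),
            "median_time_to_convert": median(times) if times else None,
        })
    return stats
```

Switching the y-axis between conversion rate and total conversions corresponds to plotting `conversion_rate` vs. `total` from this breakdown.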
Rename Funnel Steps
You can now rename steps in your funnel, making them more legible or descriptive when the event names aren’t clear.
We’ve introduced key enhancements to Retention charts, designed to give you more actionable insights into user behavior and long-term engagement.
Flexible Retention Definitions
Accurately tracking user retention is crucial for understanding how your product is performing over time. With our new “Return On” and “Return On or After” retention definitions, you can now better align your analysis with specific business goals. For instance, use “Return On” to measure how many users come back on a precise day, or “Return On or After” to understand longer-term retention trends.
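The distinction between the two definitions is easy to state in code. The sketch below is illustrative only (the day-index data shape is an assumption, not Statsig's implementation):

```python
def retained(first_day, activity_days, day_n, on_or_after=False):
    """Day-N retention check for one user. first_day is the day index of
    the user's first activity; activity_days is the set of day indices with
    any activity. 'Return On' requires activity exactly on day N after
    first_day; 'Return On or After' also accepts any later day."""
    target = first_day + day_n
    if on_or_after:
        return any(d >= target for d in activity_days)
    return target in activity_days
```

Averaging this check over a cohort gives the retention rate for day N under either definition.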
Daily, Weekly, Monthly Retention
User engagement varies depending on the nature of your product and user behavior. That’s why we now offer the ability to analyze retention on a daily, weekly, or monthly basis. Whether you need to track daily active users for a fast-paced app or monitor longer-term engagement for a subscription service, these options let you choose the most relevant time frame for your analysis.
Retention Over Time
Retention isn’t static; it evolves as your product and user base grow. With our new Retention Over Time feature, you can visualize how retention rates shift over weeks or months, helping you spot trends, seasonality, or the impact of product changes. This enables you to make data-driven decisions, whether you’re aiming to boost user retention, identify periods of churn, or validate the success of a new feature.
We’re excited to introduce the beta version of Session Analytics, available to a select group of customers, including you. This feature allows you to leverage a special “statsig::session_end” event within Metric Drilldown charts to analyze user sessions in your product.
A session is defined as a period of user activity followed by at least 30 minutes of inactivity. Each session_end event includes a property that records the session duration in seconds. With this data, you can answer key questions such as:
How many daily sessions are occurring?
What is the median (p50) session duration?
How does session duration vary across different browsers?
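The 30-minutes-of-inactivity rule above can be sketched directly. This is an illustration of the stated definition, not Statsig's implementation:

```python
from datetime import datetime, timedelta

SESSION_GAP = timedelta(minutes=30)

def sessionize(timestamps):
    """Group a user's event timestamps into sessions: a new session starts
    whenever the gap since the previous event is at least 30 minutes.
    Returns a list of session durations in seconds (one per session)."""
    durations = []
    start = prev = None
    for ts in sorted(timestamps):
        if prev is not None and ts - prev >= SESSION_GAP:
            durations.append((prev - start).total_seconds())
            start = ts
        elif prev is None:
            start = ts
        prev = ts
    if start is not None:
        durations.append((prev - start).total_seconds())
    return durations
```

Questions like "what is the p50 session duration?" then reduce to taking the median of these per-session durations across users.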
As this is a private beta release, some functionality is still under development, but we’re eager to hear your early feedback. Your insights will help us refine and improve the feature. If you would like to be added, reach out on our Slack.
We're thrilled to announce the launch of Custom Experiment Checklist, a new feature that empowers admins to tailor the experimentation guidelines to their company's specific needs. This feature allows you to replace the default Statsig experiment checklist with your own custom checklist that follows internal best experimentation practices.
Ensure adherence to company-specific best practices
Foster a unified experimentation culture across your organization
Increase the quality and consistency of experiments
Over the past few months, our customers expressed a desire for more flexibility in configuring experiment guidelines within Statsig. We listened, and Custom Experiment Checklist is our response to this valuable feedback. Custom Experiment Checklist is now available for all users. To get started, navigate to your Organizational settings and look for the new "Experiment Checklist" option.
We're excited to see how this feature will enhance your experimentation process. As always, we welcome your feedback and suggestions for further improvements.
We're excited to start rolling out our Product Analytics suite to Statsig Warehouse Native.
You can see the exact step in a 5-step checkout workflow where half of your users are dropping off. You can filter and slice metrics down by any property, instantly. Your Growth teams can spelunk in data and generate hypotheses to try.
All of this - with centralized data governance in your warehouse - a single source of truth, with no data duplication or drift.
This plays well with experimentation - letting you group events or a metric by experiment groups, or even look at a few sample rows of data when investigating an issue.
Metrics Explorer is in broad beta on Statsig WHN and is free for the rest of 2024. It has been Generally Available on Statsig Cloud since March of this year.
Statsig will begin filtering out known bot traffic from all exposures data in the Statsig console. These web-crawling bots can sometimes inflate exposure counts but don’t represent the users you’re most often trying to measure. This should improve the accuracy of your experiments, feature gate analytics, and user tracking. Several Statsig customers have requested this feature and we’re excited it’s finally coming.
For more information about this feature, check out our updated exposures docs page. Bot filtering will be turned on by default for all Statsig projects, but we’re also building an opt-out setting, which will be available in the console.
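In spirit, bot filtering amounts to dropping exposures whose user agent matches a known-bot pattern. The sketch below is purely illustrative: the tiny regex deny-list is an assumption for this example, and real filtering relies on much larger curated lists:

```python
import re

# Illustrative deny-list only; production bot detection uses curated,
# regularly updated lists rather than a single regex.
BOT_UA_PATTERN = re.compile(r"(bot|crawler|spider|slurp)", re.IGNORECASE)

def is_bot(user_agent):
    """True if the user-agent string matches a known-bot pattern."""
    return bool(BOT_UA_PATTERN.search(user_agent or ""))

def filter_exposures(exposures):
    """Drop exposure records whose user agent looks like a web-crawling bot."""
    return [e for e in exposures if not is_bot(e.get("user_agent"))]
```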