In the past, we integrated Statsig with Datadog so that you could send events from Statsig to Datadog and use Datadog's full suite of monitoring services on those events. However, this integration was one-directional: any real-time observation you made in Datadog required manual action back in Statsig.
Introducing the Statsig and Datadog Trigger Integration
Now we've made it possible to leverage Datadog's real-time monitoring to automatically toggle a feature gate on or off in Statsig.
Configuring Datadog and Statsig to monitor events and toggle feature gates is simple:
1. Create a "trigger" on Statsig.
2. Create a Datadog webhook using the trigger URL (see the sketch after these steps).
3. Configure your monitor to notify that webhook.
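To make the webhook step concrete, here's a minimal sketch of registering a Datadog webhook that points at a Statsig trigger URL via Datadog's Webhooks integration API. The webhook name and trigger URL below are placeholders (copy the real trigger URL from the Statsig console), and you should verify the endpoint shape against Datadog's current API docs:

```typescript
// Sketch: register a Datadog webhook pointing at a Statsig trigger URL.
// Assumes Node 18+ (built-in fetch) and Datadog API/app keys in the env.
const DD_API_KEY = process.env.DD_API_KEY!;
const DD_APP_KEY = process.env.DD_APP_KEY!;

async function createStatsigTriggerWebhook(): Promise<void> {
  const res = await fetch(
    "https://api.datadoghq.com/api/v1/integration/webhooks/configuration/webhooks",
    {
      method: "POST",
      headers: {
        "Content-Type": "application/json",
        "DD-API-KEY": DD_API_KEY,
        "DD-APPLICATION-KEY": DD_APP_KEY,
      },
      body: JSON.stringify({
        // Hypothetical name; referenced as @webhook-statsig-disable-gate in monitors.
        name: "statsig-disable-gate",
        // Placeholder: paste the trigger URL from your Statsig console here.
        url: "https://<your-statsig-trigger-url>",
      }),
    }
  );
  if (!res.ok) {
    throw new Error(`Webhook creation failed: ${res.status} ${await res.text()}`);
  }
}

createStatsigTriggerWebhook().catch(console.error);
```

Then, in the monitor's notification message, mention the webhook (e.g. `@webhook-statsig-disable-gate`) so that when the monitor fires, Datadog calls the trigger URL and Statsig toggles the gate.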
Example
Imagine you are rolling out a new feature behind a Statsig feature gate. You can set up a Datadog monitor to detect anomalies in your operational metrics that correlate with changes to this gate.
Instead of having the monitor alert you or your team, you can create a trigger that disables the gate automatically. Now if the monitor fires, by the time you are notified you can rest assured the gate has already been turned off.
With this new integration, Statsig and Datadog customers can take full advantage of the features of both services, with added benefits:
Better monitoring of feature rollouts
Faster response times to metric regressions
Overall, a more reliable service and peace of mind when launching new features.
We hope you enjoy this new functionality! Please get in touch with us with any feedback, and don't hesitate to join the Statsig Slack community!
Bringing you another highly anticipated launch - a new feature set to make it easy for you to manage the lifecycle of your feature gates, including cleanup:
You can now use one of these four statuses to represent the different stages of your feature (updatable on each individual feature gate page):
In Progress: feature in the process of being rolled out and tested
Launched: feature has been rolled out to everyone
Disabled: feature has been rolled back from everyone
Archived: feature is now a permanent part of your codebase (i.e. flag reference has been removed)
New filters on the gates catalog to provide you useful views:
Which gates do you need to make a launch decision for?
Which gates should your team clean up from your codebase?
See all your launched features to celebrate the work your team has done!
Check out our docs for full details! We'll continue to ramp up the rollout throughout the next 1-2 weeks.
Follow-up features coming soon:
Nudges (email, Slack) to clean up feature gates
Mark your gates "permanent" to prevent the nudges above!
Hi everyone, coming at ya with an exciting launch announcement: we've started rolling out Metrics Archival + Deletion!
(Updated) Archiving Metrics: your metric will no longer be computed, but its history will be retained.
(New) Delete Metrics: your metric (and its history) will be removed from Statsig.
We've built a healthy set of checks into this process to make these features safe to use (e.g., a 24-hour grace period, warnings about gate/experiment/metric dependencies, and notifications to impacted entity owners), so you can manage your metrics confidently without fearing unintended consequences. Please visit the docs page to find out more!
Our plan is to ramp the rollout up to 100% by the end of this week; please let us know if you have any feedback as you start using these features!
Christmas came early here at Statsig, with some exciting features coming down the pike. Wishing everyone a happy holiday from snowy Seattle!
Sometimes it's necessary to reset or reallocate an experiment, but you don't want to lose access to the Pulse results that have accrued up to that point. Now, we've made it easy to access historical, pre-reset Pulse results via an Experiment's "History".
To access an old Pulse snapshot, go to "History", find the reset event, then tap "View Pulse Snapshot".
Following a tag will subscribe you to updates on any Experiments, Gates, and (soon) Metrics with that tag throughout your Project. This is an easy way to stay on top of anything happening in Statsig that's relevant to your team or key initiatives.
To follow a tag, go to "Project Settings" → "Tags".
(Coming Soon) We're excited to start rolling out a set of upgrades to our Custom Metric creation capabilities. These updates include:
Ability to edit Custom Metrics - After you've created a Custom Metric, if you need to go back and tweak its setup, you can now do so via the "Setup" tab of the metric detail view.
Ability to combine multiple, filtered events - By popular request, we have added support for building Custom Metrics using multiple, filtered events.
Include future ID types - At Custom Metric creation, you can now opt your new Custom Metric in to automatically include all future ID types you add to your Project.
Now you can check the status of your imports (succeeded, errored, loaded with no data, in progress, etc.) first thing when you log in to Statsig! With the status right on the homepage, you can now see any delays upfront and diagnose issues as early as possible.
Happy Friday, Statsig Community! We have a fun set of launch announcements for y'all this week.... making every last day count as we come up on the last few weeks of 2022!
Today, we're excited to add an explicit Monitoring Metrics section to Feature Gates. This enables gate creators to call out any metrics they want to monitor as part of a feature rollout, and makes it easier for non-creators to know what launch impact to look for.
Note that by default, the Core tag will be auto-added to Monitoring Metrics for all newly created gates.
Historically, we've supported sending a Value and JSON metadata with every logged event, enabling you to break out results by a metric's Value inline within Pulse.
Today, we're expanding the number of dimensions you can configure per event, supporting up to 4 custom dimensions that you can define and send in with events to split your analysis by. To configure custom dimensions for your event, go to the Metrics tab → Events, select the event you want to configure, and tap "Setup". Note that you cannot yet configure multiple dimensions for Custom Metrics.
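For reference, here's roughly what logging an event with a Value and metadata looks like using the statsig-node server SDK (the event name, value, and metadata keys below are hypothetical; the metadata fields are the kind of candidates you might configure as custom dimensions):

```typescript
import Statsig from "statsig-node";

// Initialize with your server secret key (placeholder below).
await Statsig.initialize("secret-xxxx");

Statsig.logEvent(
  { userID: "user-123" },   // the user the event is attributed to
  "add_to_cart",            // hypothetical event name
  19.99,                    // event Value, usable for Value breakdowns in Pulse
  {
    // Hypothetical metadata fields -- candidates for custom dimensions
    plan_tier: "pro",
    platform: "web",
  }
);
```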
Reviewing gate and experiment changes is a core part of the rollout process. Today, we're making reviews even easier by providing a clearer Before/After experience for viewing changes, as well as introducing a new review mode called "Diff View".
To view changes in Diff View, simply toggle the mode selector in the upper right-hand corner of the review unit from "Visual View" to "Diff View". Voila!
Hey everyone, we've just released a new integration for receiving console notifications on Slack.
This is different from the current Slack integration, which just sends audit logs.
To enable it, go to "Account Settings" → the "Notifications" tab.
For more information about the app, see https://statsigcommunity.slack.com/apps/A022AA315JN-statsig.
(FYI, we are working to get the app approved in Slack's App Directory, but this may take some time.)
Happy Monday (and Happy Halloween) Statsig Community! We've got some tricks AND some treats up our sleeve for you today, with an exciting set of new product updates-
You may have noticed a new "Dashboards" tab in the left-hand nav of your Console! Last week, we quietly started rolling out the v1 of our new Dashboards product. Dashboards give you a flexible canvas for the metrics, experiments, and rollouts your team cares most about.
With Dashboards, you can:
Create Custom Time Series - Create line or bar charts of your metrics, including dimension breakdowns for events.
Add Experiment and Rollout Monitoring - Add any Experiments or feature roll-outs that may impact your metrics inline on your Dashboard.
Organize and Label Widgets - Quickly and easily organize your widgets on the drag-and-drop canvas of the Dashboard. Add labels to clearly delineate grouped metrics, as well as caption individual charts to clarify metric definitions.
This is an early v1 foundation for our newest product offering, and something that will continue to evolve. If you have any feedback, we would love to hear it! Don't hesitate to reach out with feature requests or suggestions for improvements.
To make it easier to add relevant folks into the conversation on your Experiments and Gates, we've added the ability to tag team members in Discussions. Tagging team members in a Discussion comment will notify them via email (and soon Slack as well!).
Powerful search capabilities are key to quickly navigating the Statsig Console. Today, we're excited to announce support for the "started", "ended", and "active" search keywords, each accepting either a single date or a date range.
Attached is a table showing how to use these. We've also added explicit filter options next to the search bar that let you filter by Status, Health Check Status, ID Type, Creator, & Tag (all of which are also supported directly inline in Search).
Hey all, just wanted to announce that we have completed work on the Amplitude incoming integration. This will allow you to configure Amplitude to forward events to Statsig.
Statsig Docs: https://docs.statsig.com/integrations/data-connectors/amplitude
Amplitude Docs: https://www.docs.developers.amplitude.com/data/destinations/statsig/
Today, we're continuing to invest in our Stats Engine with the addition of Sequential Testing capabilities. In Sequential Testing, the p-values for each preliminary analysis window are adjusted to compensate for the increased false positive rate associated with peeking. The goal is to enable early decision-making when there's sufficient evidence, while limiting the risk of false positives.
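For intuition, here is one common flavor of sequential adjustment (an O'Brien-Fleming-style boundary, shown purely as an illustration; Statsig's exact adjustment may differ). Inflating the critical value at interim looks widens early confidence intervals, and the boundary relaxes back to the standard one as the experiment approaches its target duration:

```latex
% Illustrative O'Brien-Fleming-style boundary; not necessarily Statsig's exact method.
% Let t = n / N be the information fraction: current sample size over target sample size.
z_{\mathrm{crit}}(t) = \frac{z_{1-\alpha/2}}{\sqrt{t}}
% The interim confidence interval for the effect estimate \hat{\Delta} widens to:
\hat{\Delta} \pm z_{\mathrm{crit}}(t)\,\widehat{\mathrm{SE}}(\hat{\Delta})
% As t -> 1, z_crit(t) -> z_{1-\alpha/2}, recovering the default 95% interval.
```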
To enable Sequential Testing on your experiment, we require setting a target duration (which is used to calculate the adjusted p-values). We provide a handy Power Analysis Calculator within Experiment Setup to enable quick and easy estimation of target duration.
Once a target duration is set, simply toggle on Sequential Testing to start seeing adjusted confidence intervals overlaid on the default 95% confidence intervals within your Pulse results.