diff --git a/content/departments/data-analytics/cody_analytics.md b/content/departments/data-analytics/cody_analytics.md
index 84f4d7d25558..4da711c8ccb3 100644
--- a/content/departments/data-analytics/cody_analytics.md
+++ b/content/departments/data-analytics/cody_analytics.md
@@ -16,9 +16,9 @@ Below is an overview of a few of the key metrics we're using to measure and iter
 
 **Metric: Retention**
 
-- **Definition:** The number of Cody users who were active (based on our [product user definition](#cody-product-daus) 1 and 7 days after installing Cody, respectively.
+- **Definition:** The percentage of users who trigger an active product event (based on our [product user definition](#cody-product-dau)) 1 week after signup. Retention can also be measured at other intervals (such as Day 1, Day 7, or Week 4), but our company-level retention KPI will standardize on Week 1.
 - **Why this metric:** As we continue to ship improvements to Cody, retention will be key to understanding how much value users are getting from the Cody.
-- **Source of truth:** This data is logged by eventlogger, and accessed via [Looker](https://sourcegraph.looker.com/dashboards/476?Server+Endpoint=) (see: “Cody Day 1 Vs Day 7 Retention” chart)
+- **Source of truth:** This data is logged by eventlogger and accessed via [Amplitude](https://app.amplitude.com/analytics/sourcegraph/chart/3pmjrguv).
 
 **Metric: Completion acceptance rate (CAR)**
 
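
As a rough illustration of the Week 1 retention definition introduced above (not part of the patch, and not how the Amplitude chart computes it), here is a minimal Python sketch. The data shapes (`signups`, `events`) and the days 7–13 window are assumptions made for the example; the actual source of truth remains the Amplitude chart linked in the diff.

```python
from datetime import datetime, timedelta

def week1_retention(signups, events):
    """Share of signed-up users with at least one active product event in
    days 7-13 after signup (one assumed reading of "1 week after signup").

    signups: dict mapping user_id -> signup datetime
    events:  iterable of (user_id, event_datetime) active product events
    """
    retained = set()
    for user_id, event_time in events:
        signup_time = signups.get(user_id)
        if signup_time is None:
            continue  # event from a user outside the signup cohort
        offset = event_time - signup_time
        if timedelta(days=7) <= offset < timedelta(days=14):
            retained.add(user_id)
    return len(retained) / len(signups) if signups else 0.0

# Example: one of two signed-up users comes back during week 1 -> 0.5 retention.
signups = {
    "u1": datetime(2023, 9, 1),
    "u2": datetime(2023, 9, 1),
}
events = [
    ("u1", datetime(2023, 9, 2)),  # day 1 activity only, not counted for Week 1
    ("u2", datetime(2023, 9, 9)),  # day 8 activity, counted for Week 1
]
print(week1_retention(signups, events))  # 0.5
```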