diff --git a/website/docs/docs/build/dimensions.md b/website/docs/docs/build/dimensions.md index 975ae4d3160..25a1c729a7a 100644 --- a/website/docs/docs/build/dimensions.md +++ b/website/docs/docs/build/dimensions.md @@ -22,6 +22,7 @@ All dimensions require a `name`, `type`, and can optionally include an `expr` pa | `description` | A clear description of the dimension. | Optional | String | | `expr` | Defines the underlying column or SQL query for a dimension. If no `expr` is specified, MetricFlow will use the column with the same name as the group. You can use the column name itself to input a SQL expression. | Optional | String | | `label` | Defines the display value in downstream tools. Accepts plain text, spaces, and quotes (such as `orders_total` or `"orders_total"`). | Optional | String | +| [`meta`](/reference/resource-configs/meta) | Set metadata for a resource and organize resources. Accepts plain text, spaces, and quotes. | Optional | Dictionary | Refer to the following for the complete specification for dimensions: @@ -37,6 +38,8 @@ dimensions: Refer to the following example to see how dimensions are used in a semantic model: + + ```yaml semantic_models: - name: transactions @@ -59,6 +62,9 @@ semantic_models: type_params: time_granularity: day label: "Date of transaction" # Recommend adding a label to provide more context to users consuming the data + config: + meta: + data_owner: "Finance team" expr: ts - name: is_bulk type: categorical @@ -66,6 +72,40 @@ semantic_models: - name: type type: categorical ``` + + + + +```yaml +semantic_models: + - name: transactions + description: A record for every transaction that takes place. Carts are considered multiple transactions for each SKU. + model: {{ ref('fact_transactions') }} + defaults: + agg_time_dimension: order_date +# --- entities --- + entities: + - name: transaction + type: primary + ... +# --- measures --- + measures: + ... +# --- dimensions --- + dimensions: + - name: order_date + type: time + type_params: + time_granularity: day + label: "Date of transaction" # Recommend adding a label to provide more context to users consuming the data + expr: ts + - name: is_bulk + type: categorical + expr: case when quantity > 10 then true else false end + - name: type + type: categorical +``` + Dimensions are bound to the primary entity of the semantic model they are defined in. For example the dimension `type` is defined in a model that has `transaction` as a primary entity. `type` is scoped to the `transaction` entity, and to reference this dimension you would use the fully qualified dimension name i.e `transaction__type`. @@ -101,12 +141,28 @@ This section further explains the dimension definitions, along with examples. Di Categorical dimensions are used to group metrics by different attributes, features, or characteristics such as product type. They can refer to existing columns in your dbt model or be calculated using a SQL expression with the `expr` parameter. An example of a categorical dimension is `is_bulk_transaction`, which is a group created by applying a case statement to the underlying column `quantity`. This allows users to group or filter the data based on bulk transactions. + + +```yaml +dimensions: + - name: is_bulk_transaction + type: categorical + expr: case when quantity > 10 then true else false end + config: + meta: + usage: "Filter to identify bulk transactions, like where quantity > 10." 
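+  # Hypothetical second dimension, for illustration only: a categorical dimension can also
+  # point directly at an existing column; when the dimension name matches the column name, no expr is needed.
+  - name: payment_method
+    type: categorical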
+``` + + + + ```yaml dimensions: - name: is_bulk_transaction type: categorical expr: case when quantity > 10 then true else false end ``` + ## Time @@ -130,12 +186,17 @@ You can set `is_partition` for time to define specific time spans. Additionally, Use `is_partition: True` to show that a dimension exists over a specific time window. For example, a date-partitioned dimensional table. When you query metrics from different tables, the dbt Semantic Layer uses this parameter to ensure that the correct dimensional values are joined to measures. + + ```yaml dimensions: - name: created_at type: time label: "Date of creation" expr: ts_created # ts_created is the underlying column name from the table + config: + meta: + notes: "Only valid for orders from 2022 onward" is_partition: True type_params: time_granularity: day @@ -156,6 +217,37 @@ measures: expr: 1 agg: sum ``` + + + + +```yaml +dimensions: + - name: created_at + type: time + label: "Date of creation" + expr: ts_created # ts_created is the underlying column name from the table + is_partition: True + type_params: + time_granularity: day + - name: deleted_at + type: time + label: "Date of deletion" + expr: ts_deleted # ts_deleted is the underlying column name from the table + is_partition: True + type_params: + time_granularity: day + +measures: + - name: users_deleted + expr: 1 + agg: sum + agg_time_dimension: deleted_at + - name: users_created + expr: 1 + agg: sum +``` + diff --git a/website/docs/docs/build/entities.md b/website/docs/docs/build/entities.md index e4ed0773c3c..558dfd3aea4 100644 --- a/website/docs/docs/build/entities.md +++ b/website/docs/docs/build/entities.md @@ -95,17 +95,67 @@ Natural keys are columns or combinations of columns in a table that uniquely ide The following is the complete spec for entities: + + +```yaml +semantic_models: + - name: semantic_model_name + ..rest of the semantic model config + entities: + - name: entity_name ## Required + type: Primary, natural, foreign, or unique ## Required + description: A description of the field or role the entity takes in this table ## Optional + expr: The field that denotes that entity (transaction_id). ## Optional + Defaults to name if unspecified. + [config](/reference/resource-properties/config): Specify configurations for entity. ## Optional + [meta](/reference/resource-configs/meta): {} Set metadata for a resource and organize resources. Accepts plain text, spaces, and quotes. ## Optional +``` + + + + +```yaml +semantic_models: + - name: semantic_model_name + ..rest of the semantic model config + entities: + - name: entity_name ## Required + type: Primary, or natural, or foreign, or unique ## Required + description: A description of the field or role the entity takes in this table ## Optional + expr: The field that denotes that entity (transaction_id). ## Optional + Defaults to name if unspecified. +``` + + +Here's an example of how to define entities in a semantic model: + + + ```yaml entities: - - name: transaction ## Required - type: Primary or natural or foreign or unique ## Required + - name: transaction + type: primary + expr: id_transaction + - name: order + type: foreign + expr: id_order + - name: user + type: foreign + expr: substring(id_order from 2) + entities: + - name: transaction + type: description: A description of the field or role the entity takes in this table ## Optional - expr: The field that denotes that entity (transaction_id). ## Optional + expr: The field that denotes that entity (transaction_id). Defaults to name if unspecified. 
+ [config](/reference/resource-properties/config): + [meta](/reference/resource-configs/meta): + data_owner: "Finance team" ``` + + + -Here's an example of how to define entities in a semantic model: - ```yaml entities: - name: transaction @@ -117,11 +167,18 @@ entities: - name: user type: foreign expr: substring(id_order from 2) + entities: + - name: transaction + type: + description: A description of the field or role the entity takes in this table ## Optional + expr: The field that denotes that entity (transaction_id). + Defaults to name if unspecified. ``` + ## Combine columns with a key -If a table doesn't have any key (like a primary key), use _surrogate combination_ to form a key that will help you identify a record by combining two columns. This applies to any [entity type](/docs/build/entities#entity-types). For example, you can combine `date_key` and `brand_code` from the `raw_brand_target_weekly` table to form a _surrogate key_. The following example creates a surrogate key by joining `date_key` and `brand_code` using a pipe (`|`) as a separator. +If a table doesn't have any key (like a primary key), use _surrogate combination_ to form a key that will help you identify a record by combining two columns. This applies to any [entity type](/docs/build/entities#entity-types). For example, you can combine `date_key` and `brand_code` from the `raw_brand_target_weekly` table to form a _surrogate key_. The following example creates a surrogate key by joining `date_key` and `brand_code` using a pipe (`|`) as a separator. ```yaml diff --git a/website/docs/docs/build/measures.md b/website/docs/docs/build/measures.md index d60aa3f7e21..aa66dc86731 100644 --- a/website/docs/docs/build/measures.md +++ b/website/docs/docs/build/measures.md @@ -18,16 +18,41 @@ import MeasuresParameters from '/snippets/_sl-measures-parameters.md'; An example of the complete YAML measures spec is below. The actual configuration of your measures will depend on the aggregation you're using. + + +```yaml +semantic_models: + - name: semantic_model_name + ..rest of the semantic model config + measures: + - name: The name of the measure + description: 'same as always' ## Optional + agg: the aggregation type. + expr: the field + agg_params: 'specific aggregation properties such as a percentile' ## Optional + agg_time_dimension: The time field. Defaults to the default agg time dimension for the semantic model. ## Optional + non_additive_dimension: 'Use these configs when you need non-additive dimensions.' ## Optional + [config](/reference/resource-properties/config): Use the config property to specify configurations for your measure. ## Optional + [meta](/reference/resource-configs/meta): {} Set metadata for a resource and organize resources. Accepts plain text, spaces, and quotes. ## Optional +``` + + + + ```yaml -measures: - - name: The name of the measure - description: 'same as always' ## Optional - agg: the aggregation type. - expr: the field - agg_params: 'specific aggregation properties such as a percentile' ## Optional - agg_time_dimension: The time field. Defaults to the default agg time dimension for the semantic model. ## Optional - non_additive_dimension: 'Use these configs when you need non-additive dimensions.' ## Optional +semantic_models: + - name: semantic_model_name + ..rest of the semantic model config + measures: + - name: The name of the measure + description: 'same as always' ## Optional + agg: the aggregation type. 
+ expr: the field + agg_params: 'specific aggregation properties such as a percentile' ## Optional + agg_time_dimension: The time field. Defaults to the default agg time dimension for the semantic model. ## Optional + non_additive_dimension: 'Use these configs when you need non-additive dimensions.' ## Optional ``` + ### Name @@ -96,6 +121,96 @@ If you use the `dayofweek` function in the `expr` parameter with the legacy Snow ### Model with different aggregations + + +```yaml +semantic_models: + - name: transactions + description: A record of every transaction that takes place. Carts are considered multiple transactions for each SKU. + model: ref('schema.transactions') + defaults: + agg_time_dimension: transaction_date + +# --- entities --- + entities: + - name: transaction_id + type: primary + - name: customer_id + type: foreign + - name: store_id + type: foreign + - name: product_id + type: foreign + +# --- measures --- + measures: + - name: transaction_amount_usd + description: Total USD value of transactions + expr: transaction_amount_usd + agg: sum + config: + meta: + used_in_reporting: true + - name: transaction_amount_usd_avg + description: Average USD value of transactions + expr: transaction_amount_usd + agg: average + - name: transaction_amount_usd_max + description: Maximum USD value of transactions + expr: transaction_amount_usd + agg: max + - name: transaction_amount_usd_min + description: Minimum USD value of transactions + expr: transaction_amount_usd + agg: min + - name: quick_buy_transactions + description: The total transactions bought as quick buy + expr: quick_buy_flag + agg: sum_boolean + - name: distinct_transactions_count + description: Distinct count of transactions + expr: transaction_id + agg: count_distinct + - name: transaction_amount_avg + description: The average value of transactions + expr: transaction_amount_usd + agg: average + - name: transactions_amount_usd_valid # Notice here how we use expr to compute the aggregation based on a condition + description: The total USD value of valid transactions only + expr: CASE WHEN is_valid = True then transaction_amount_usd else 0 end + agg: sum + - name: transactions + description: The average value of transactions. + expr: transaction_amount_usd + agg: average + - name: p99_transaction_value + description: The 99th percentile transaction value + expr: transaction_amount_usd + agg: percentile + agg_params: + percentile: .99 + use_discrete_percentile: False # False calculates the continuous percentile, True calculates the discrete percentile. 
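+          # For context: the discrete option returns an actual observed transaction value,
+          # while the continuous option can interpolate between observed values.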
+ - name: median_transaction_value + description: The median transaction value + expr: transaction_amount_usd + agg: median + +# --- dimensions --- + dimensions: + - name: transaction_date + type: time + expr: date_trunc('day', ts) # expr refers to underlying column ts + type_params: + time_granularity: day + - name: is_bulk_transaction + type: categorical + expr: case when quantity > 10 then true else false end + +``` + + + + ```yaml semantic_models: - name: transactions @@ -177,6 +292,7 @@ semantic_models: expr: case when quantity > 10 then true else false end ``` + ### Non-additive dimensions diff --git a/website/docs/docs/build/packages.md b/website/docs/docs/build/packages.md index 7a2c08d3e70..5ee619989bc 100644 --- a/website/docs/docs/build/packages.md +++ b/website/docs/docs/build/packages.md @@ -165,27 +165,44 @@ dbt Cloud supports private packages from [supported](#prerequisites) Git repos l #### Prerequisites -To use native private packages, you must have one of the following Git providers configured in the **Integrations** section of your **Account settings**: -- [GitHub](/docs/cloud/git/connect-github) -- [Azure DevOps](/docs/cloud/git/connect-azure-devops) -- Support for GitLab is coming soon. - +- To use native private packages, you must have one of the following Git providers configured in the **Integrations** section of your **Account settings**: + - [GitHub](/docs/cloud/git/connect-github) + - [Azure DevOps](/docs/cloud/git/connect-azure-devops) + - Private packages only work within a single Azure DevOps project. If your repositories are in different projects within the same organization, you can't reference them in the `private` key at this time. + - For Azure DevOps, use the `org/repo` path (not the `org_name/project_name/repo_name` path) with the project tier inherited from the integrated source repository. + - Support for GitLab is coming soon. #### Configuration -Use the `private` key in your `packages.yml` or `dependencies.yml` to clone package repos using your existing dbt Cloud Git integration without having to provision an access token or create a dbt Cloud environment variable: +Use the `private` key in your `packages.yml` or `dependencies.yml` to clone package repos using your existing dbt Cloud Git integration without having to provision an access token or create a dbt Cloud environment variable. + ```yaml packages: - - private: dbt-labs/awesome_repo + - private: dbt-labs/awesome_repo # your-org/your-repo path - package: normal packages - - [...] + [...] ``` + + +:::tip Azure DevOps considerations + +- Private packages currently only work if the package repository is in the same Azure DevOps project as the source repo. +- Use the `org/repo` path (not the normal ADO `org_name/project_name/repo_name` path) in the `private` key. +- Repositories in different Azure DevOps projects is currently not supported until a future update. +You can use private packages by specifying `org/repo` in the `private` key: + + + +```yaml +packages: + - private: my-org/my-repo # Works if your ADO source repo and package repo are in the same project +``` +::: You can pin private packages similar to regular dbt packages: diff --git a/website/docs/docs/build/saved-queries.md b/website/docs/docs/build/saved-queries.md index ed56d13dcc9..840b1ebb95c 100644 --- a/website/docs/docs/build/saved-queries.md +++ b/website/docs/docs/build/saved-queries.md @@ -17,8 +17,32 @@ To create a saved query, refer to the following table parameters. 
:::tip Note that we use the double colon (::) to indicate whether a parameter is nested within another parameter. So for example, `query_params::metrics` means the `metrics` parameter is nested under `query_params`. ::: + + + + +| Parameter | Type | Required | Description | +|-------|---------|----------|----------------| +| `name` | String | Required | Name of the saved query object. | +| `description` | String | Required | A description of the saved query. | +| `label` | String | Required | The display name for your saved query. This value will be shown in downstream tools. | +| `config` | String | Optional | Use the [`config`](/reference/resource-properties/config) property to specify configurations for your saved query. Supports `cache`, [`enabled`](/reference/resource-configs/enabled), `export_as`, [`group`](/reference/resource-configs/group), [`meta`](/reference/resource-configs/meta), [`tags`](/reference/resource-configs/tags), and [`schema`](/reference/resource-configs/schema) configurations. | +| `config::cache::enabled` | Object | Optional | An object with a sub-key used to specify if a saved query should populate the [cache](/docs/use-dbt-semantic-layer/sl-cache). Accepts sub-key `true` or `false`. Defaults to `false` | +| `query_params` | Structure | Required | Contains the query parameters. | +| `query_params::metrics` | List or String | Optional | A list of the metrics to be used in the query as specified in the command line interface. | +| `query_params::group_by` | List or String | Optional | A list of the Entities and Dimensions to be used in the query, which include the `Dimension` or `TimeDimension`. | +| `query_params::where` | List or String | Optional | A list of strings that may include the `Dimension` or `TimeDimension` objects. | +| `exports` | List or Structure | Optional | A list of exports to be specified within the exports structure. | +| `exports::name` | String | Required | Name of the export object. | +| `exports::config` | List or Structure | Required | A [`config`](/reference/resource-properties/config) property for any parameters specifying the export. | +| `exports::config::export_as` | String | Required | The type of export to run. Options include table or view currently and cache in the near future. | +| `exports::config::schema` | String | Optional | The [schema](/reference/resource-configs/schema) for creating the table or view. This option cannot be used for caching. | +| `exports::config::alias` | String | Optional | The table [alias](/reference/resource-configs/alias) used to write to the table or view. This option cannot be used for caching. | + + + - + | Parameter | Type | Required | Description | |-------|---------|----------|----------------| @@ -33,15 +57,15 @@ Note that we use the double colon (::) to indicate whether a parameter is nested | `query_params::where` | List or String | Optional | A list of strings that may include the `Dimension` or `TimeDimension` objects. | | `exports` | List or Structure | Optional | A list of exports to be specified within the exports structure. | | `exports::name` | String | Required | Name of the export object. | -| `exports::config` | List or Structure | Required | A config section for any parameters specifying the export. | +| `exports::config` | List or Structure | Required | A [`config`](/reference/resource-properties/config) property for any parameters specifying the export. | | `exports::config::export_as` | String | Required | The type of export to run. 
Options include table or view currently and cache in the near future. | | `exports::config::schema` | String | Optional | The schema for creating the table or view. This option cannot be used for caching. | -| `exports::config::alias` | String | Optional | The table alias used to write to the table or view. This option cannot be used for caching. | +| `exports::config::alias` | String | Optional | The table alias used to write to the table or view. This option cannot be used for caching. | - + | Parameter | Type | Required | Description | |-------|---------|----------|----------------| @@ -54,7 +78,7 @@ Note that we use the double colon (::) to indicate whether a parameter is nested | `query_params::where` | List or String | Optional | Conditions nested with the `query_params`: a list of strings that may include the `Dimension` or `TimeDimension` objects. | | `exports` | List or Structure | Optional | A list of exports to be specified within the exports structure. | | `exports::name` | String | Required | Name of export object, nested within `exports`. | -| `exports::config` | List or Structure | Required | A config section for any parameters specifying the export, nested within `exports`. | +| `exports::config` | List or Structure | Required | A [`config`](/reference/resource-properties/config) property for any parameters specifying the export, nested within `exports`. | | `exports::config::export_as` | String | Required | Specifies the type of export: table, view, or upcoming cache options. Nested within `exports` and `config`. | | `exports::config::schema` | String | Optional | Schema for creating the table or view, not applicable for caching. Nested within `exports` and `config`. | | `exports::config::alias` | String | Optional | Table alias used to write to the table or view. This option can't be used for caching. Nested within `exports` and `config`. | @@ -69,13 +93,12 @@ Use saved queries to define and manage common Semantic Layer queries in YAML, in In your saved query config, you can also leverage [caching](/docs/use-dbt-semantic-layer/sl-cache) with the dbt Cloud job scheduler to cache common queries, speed up performance, and reduce compute costs. - - - + In the following example, you can set the saved query in the `semantic_model.yml` file: + ```yaml saved_queries: @@ -84,7 +107,8 @@ saved_queries: label: Test saved query config: cache: - enabled: true # Or false if you want it disabled by default + [enabled](/reference/resource-configs/enabled): true | false + [tags](/reference/resource-configs/tags): 'my_tag' query_params: metrics: - simple_metric @@ -96,12 +120,44 @@ saved_queries: exports: - name: my_export config: + export_as: table alias: my_export_alias + schema: my_export_schema_name +``` + + + + + +```yaml +saved_queries: + - name: test_saved_query + description: "{{ doc('saved_query_description') }}" + label: Test saved query + config: + cache: + enabled: true # Or false if you want it disabled by default + query_params: + metrics: + - simple_metric + group_by: + - "Dimension('user__ds')" + where: + - "{{ Dimension('user__ds', 'DAY') }} <= now()" + - "{{ Dimension('user__ds', 'DAY') }} >= '2023-01-01'" + exports: + - name: my_export + config: export_as: table + alias: my_export_alias schema: my_export_schema_name ``` + + + + Note, that you can set `export_as` to both the saved query and the exports [config](/reference/resource-properties/config), with the exports config value taking precedence. 
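+
+For illustration, here's a minimal sketch of that precedence (the saved query and export names are hypothetical):
+
+```yaml
+saved_queries:
+  - name: my_saved_query
+    config:
+      export_as: view # saved query-level value, used as the default for its exports
+    query_params:
+      metrics:
+        - simple_metric
+    exports:
+      - name: my_export
+        config:
+          export_as: table # the exports config value takes precedence for this export
+```
+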
If a key isn't set in the exports config, it will inherit the saved query config value. #### Where clause @@ -121,7 +177,6 @@ filter: | filter: | {{ Metric('metric_name', group_by=['entity_name']) }} ``` - @@ -147,8 +202,8 @@ saved_queries: exports: - name: my_export config: - alias: my_export_alias export_as: table + alias: my_export_alias schema: my_export_schema_name ``` @@ -181,11 +236,15 @@ Once you've configured your saved query and set the foundation block, you can no The following is an example of a saved query with an export: + ```yaml saved_queries: - name: order_metrics description: Relevant order metrics + config: + tags: + - order_metrics query_params: metrics: - orders @@ -204,9 +263,40 @@ saved_queries: - name: order_metrics config: export_as: table # Options available: table, view - schema: YOUR_SCHEMA # Optional - defaults to deployment schema - alias: SOME_TABLE_NAME # Optional - defaults to Export name + [alias](/reference/resource-configs/alias): my_export_alias # Optional - defaults to Export name + [schema](/reference/resource-configs/schema): my_export_schema_name # Optional - defaults to deployment schema ``` + + + + +```yaml +saved_queries: + - name: order_metrics + description: Relevant order metrics + query_params: + metrics: + - orders + - large_order + - food_orders + - order_total + group_by: + - Entity('order_id') + - TimeDimension('metric_time', 'day') + - Dimension('customer__customer_name') + - ... # Additional group_by + where: + - "{{TimeDimension('metric_time')}} > current_timestamp - interval '1 week'" + - ... # Additional where clauses + exports: + - name: order_metrics + config: + export_as: table # Options available: table, view + schema: my_export_schema_name # Optional - defaults to deployment schema + alias: my_export_alias # Optional - defaults to Export name +``` + + ## Run exports diff --git a/website/docs/docs/build/simple.md b/website/docs/docs/build/simple.md index 2deb718d780..19dd4bb0086 100644 --- a/website/docs/docs/build/simple.md +++ b/website/docs/docs/build/simple.md @@ -15,6 +15,7 @@ Simple metrics are metrics that directly reference a single measure, without any Note that we use the double colon (::) to indicate whether a parameter is nested within another parameter. So for example, `query_params::metrics` means the `metrics` parameter is nested under `query_params`. ::: + | Parameter | Description | Required | Type | | --------- | ----------- | ---- | ---- | | `name` | The name of the metric. | Required | String | diff --git a/website/docs/docs/cloud/dbt-cloud-ide/ide-user-interface.md b/website/docs/docs/cloud/dbt-cloud-ide/ide-user-interface.md index 36c6cc898dc..72c1fbe7af6 100644 --- a/website/docs/docs/cloud/dbt-cloud-ide/ide-user-interface.md +++ b/website/docs/docs/cloud/dbt-cloud-ide/ide-user-interface.md @@ -64,7 +64,7 @@ The IDE features some delightful tools and layouts to make it easier for you to - - Use the **Prune branches** option to remove local branches that have already been deleted from the remote repository. Selecting this triggers a [pop-up modal](#prune-branches-modal), where you can confirm the deletion of the specific local branches, keeping your branch management tidy. Note that this won't delete the branch you're currently on. Pruning branches isn't available for [managed repositories](/docs/collaborate/git/managed-repository) because they don't have a typical remote setup, which prevents remote branch deletion. 
+ - Use the **Prune branches** option to remove local branches that have already been deleted from the remote repository. Selecting this triggers a [pop-up modal](#prune-branches-modal), where you can confirm the deletion of the specific local branches, keeping your branch management tidy. Note that this won't delete the branch you're currently on. Pruning branches isn't available for [managed repositories](/docs/cloud/git/managed-repository) because they don't have a typical remote setup, which prevents remote branch deletion. ## Additional editing features diff --git a/website/docs/docs/cloud/git/git-configuration-in-dbt-cloud.md b/website/docs/docs/cloud/git/git-configuration-in-dbt-cloud.md index fb8c0186236..57558f7cb5b 100644 --- a/website/docs/docs/cloud/git/git-configuration-in-dbt-cloud.md +++ b/website/docs/docs/cloud/git/git-configuration-in-dbt-cloud.md @@ -1,37 +1,46 @@ --- -title: "Git configuration in dbt Cloud" +title: "Configure Git in dbt Cloud" description: "Learn about the Git providers supported in dbt Cloud" -pagination_next: "docs/cloud/git/import-a-project-by-git-url" -pagination_prev: null +hide_table_of_contents: true +pagination_next: "docs/cloud/git/managed-repository" --- -
+[Version control](/docs/collaborate/git/version-control-basics) — a system that allows you and your teammates to work safely and simultaneously on a single project — is an essential part of the dbt workflow. It enables teams to collaborate effectively and maintain a history of changes to their dbt projects. + +In dbt Cloud, you can configure Git integrations to manage your dbt project code with ease. dbt Cloud offers multiple ways to integrate with your Git provider, catering to diverse team needs and preferences. + +Whether you use a Git integration that natively connects with dbt Cloud or prefer to work with a managed or cloned repository, dbt Cloud supports flexible options to streamline your workflow. + +
+ + -
-
-
-
\ No newline at end of file +
diff --git a/website/docs/docs/cloud/git/import-a-project-by-git-url.md b/website/docs/docs/cloud/git/import-a-project-by-git-url.md index 2b499b39cb7..804b1542e80 100644 --- a/website/docs/docs/cloud/git/import-a-project-by-git-url.md +++ b/website/docs/docs/cloud/git/import-a-project-by-git-url.md @@ -1,8 +1,6 @@ --- -title: "Import a project by git URL" -id: "import-a-project-by-git-url" -pagination_next: "docs/cloud/git/connect-github" -pagination_prev: null +title: "Connect with Git clone" +description: "Learn how to connect to a git repository using a git URL." --- In dbt Cloud, you can import a git repository from any valid git URL that points to a dbt project. There are some important considerations to keep in mind when doing this. @@ -10,7 +8,7 @@ In dbt Cloud, you can import a git repository from any valid git URL that points ## Git protocols You must use the `git@...` or `ssh:..`. version of your git URL, not the `https://...` version. dbt Cloud uses the SSH protocol to clone repositories, so dbt Cloud will be unable to clone repos supplied with the HTTP protocol. -## Managing Deploy Keys +## Managing deploy keys After importing a project by Git URL, dbt Cloud will generate a Deploy Key for your repository. To find the deploy key in dbt Cloud: diff --git a/website/docs/docs/cloud/git/managed-repository.md b/website/docs/docs/cloud/git/managed-repository.md new file mode 100644 index 00000000000..64379611825 --- /dev/null +++ b/website/docs/docs/cloud/git/managed-repository.md @@ -0,0 +1,27 @@ +--- +title: "Connect with managed repository" +id: "managed-repository" +description: "Learn how to set up a project with a managed repository." +pagination_next: "docs/cloud/git/import-a-project-by-git-url" +pagination_prev: "docs/cloud/git/git-configuration-in-dbt-cloud" +--- + +Managed repositories are a great way to trial dbt without needing to create a new repository. If you don't already have a Git repository for your dbt project, you can let dbt Cloud host and manage a repository for you. + +If in the future you choose to host this repository elsewhere, you can export the information from dbt Cloud at any time. Refer to [Move from a managed repository to a self-hosted repository](/faqs/Git/managed-repo) for more information on how to do that. + + +:::info +dbt Labs recommends against using a managed repository in a production environment. You can't use Git features like pull requests, which are part of our recommended version control best practices. +::: + +To set up a project with a managed repository: + +1. From your **Account settings** in dbt Cloud, select the project you want to set up with a managed repository. If the project already has a repository set up, you need to edit the repository settings and disconnect the existing repository. +2. Click **Edit** for the project. +3. Under Repository, click **Configure repository**. +4. Select **Managed**. +5. Enter a name for the repository. For example, "analytics" or "dbt-models." +6. Click **Create**. + + diff --git a/website/docs/docs/collaborate/git/managed-repository.md b/website/docs/docs/collaborate/git/managed-repository.md deleted file mode 100644 index db8e9840ccd..00000000000 --- a/website/docs/docs/collaborate/git/managed-repository.md +++ /dev/null @@ -1,20 +0,0 @@ ---- -title: "Managed repository" -id: "managed-repository" ---- - -If you do not already have a git repository for your dbt project, you can let dbt Cloud manage a repository for you. 
Managed repositories are a great way to trial dbt without needing to create a new repository. - -To set up a project with a managed repository: - -1. From your Account settings in dbt Cloud, select the project you want to set up with a managed repository. If the project already has a repository set up, you need to edit the repository settings and disconnect the existing repository. -2. Click **Edit** for the project. -3. Under Repository, click **Configure repository**. -4. Select **Managed**. -5. Enter a name for the repository. For example, "analytics" or "dbt-models." -6. Click **Create**. - - -dbt Cloud will host and manage this repository for you. If in the future you choose to host this repository elsewhere, you can export the information from dbt Cloud at any time. - -** We do not recommend using a managed repository in a production environment. You will not be able to use git features like pull requests which are part of our recommended version control best practices. diff --git a/website/docs/docs/collaborate/git/pr-template.md b/website/docs/docs/collaborate/git/pr-template.md index b621e31344d..c14b5e9dd24 100644 --- a/website/docs/docs/collaborate/git/pr-template.md +++ b/website/docs/docs/collaborate/git/pr-template.md @@ -14,7 +14,7 @@ The PR Template URL setting will be automatically set for most repositories, dep - If you connect to your repository via in-app integrations with your git provider or the "Git Clone" method via SSH, this URL setting will be auto-populated and editable. - For AWS CodeCommit, this URL setting isn't auto-populated and must be [manually configured](/docs/cloud/git/import-a-project-by-git-url#step-5-configure-pull-request-template-urls-optional). -- If you connect via a dbt Cloud [Managed repository](/docs/collaborate/git/managed-repository), this URL will not be set, and the IDE will prompt users to merge the changes directly into their default branch. +- If you connect via a dbt Cloud [Managed repository](/docs/cloud/git/managed-repository), this URL will not be set, and the IDE will prompt users to merge the changes directly into their default branch. The PR template URL supports two variables that can be used to build a URL string. These variables, `{{source}}` and `{{destination}}` return branch names based on the diff --git a/website/docs/docs/dbt-versions/core-upgrade/08-upgrading-to-v1.7.md b/website/docs/docs/dbt-versions/core-upgrade/08-upgrading-to-v1.7.md index df24b63a2f0..b98a76295cf 100644 --- a/website/docs/docs/dbt-versions/core-upgrade/08-upgrading-to-v1.7.md +++ b/website/docs/docs/dbt-versions/core-upgrade/08-upgrading-to-v1.7.md @@ -66,7 +66,7 @@ dbt Core v1.5 introduced model governance which we're continuing to refine. v1. ### dbt clean -Starting in v1.7, `dbt clean` will only clean paths within the current working directory. The `--no-clean-project-files-only` flag will delete all paths specified in `clean-paths`, even if they're outside the dbt project. +Starting in v1.7, `dbt clean` will only clean paths within the current working directory. The `--no-clean-project-files-only` flag will delete all paths specified in the `clean-targets` section of `dbt_project.yml`, even if they're outside the dbt project. 
Supported flags: - `--clean-project-files-only` (default) diff --git a/website/docs/docs/dbt-versions/release-notes.md b/website/docs/docs/dbt-versions/release-notes.md index 9b2205e46d8..369511aae8e 100644 --- a/website/docs/docs/dbt-versions/release-notes.md +++ b/website/docs/docs/dbt-versions/release-notes.md @@ -20,6 +20,17 @@ Release notes are grouped by month for both multi-tenant and virtual private clo ## December 2024 +- **New**: Saved queries now support [tags](/reference/resource-configs/tags), which allow you to categorize your resources and filter them. Add tags to your [saved queries](/docs/build/saved-queries) in the `semantic_model.yml` file or `dbt_project.yml` file. For example: + + + ```yml + [saved-queries](/docs/build/saved-queries): + jaffle_shop: + customer_order_metrics: + +tags: order_metrics + ``` + +- **New**: [Dimensions](/reference/resource-configs/meta) now support the `meta` config property in [dbt Cloud "Latest" release track](/docs/dbt-versions/cloud-release-tracks) and from dbt Core 1.9. You can add metadata to your dimensions to provide additional context and information about the dimension. Refer to [meta](/reference/resource-configs/meta) for more information. - **New**: [Auto exposures](/docs/collaborate/auto-exposures) are now generally available to dbt Cloud Enterprise plans. Auto-exposures integrate natively with Tableau (Power BI coming soon) and auto-generate downstream lineage in dbt Explorer for a richer experience. - **New**: The dbt Semantic Layer supports Sigma as a [partner integration](/docs/cloud-integrations/avail-sl-integrations), available in Preview. Refer to [Sigma](https://help.sigmacomputing.com/docs/configure-a-dbt-semantic-layer-integration) for more information. - **New**: The dbt Semantic Layer now supports Azure Single-tenant deployments. Refer to [Set up the dbt Semantic Layer](/docs/use-dbt-semantic-layer/setup-sl) for more information on how to get started. @@ -29,8 +40,8 @@ Release notes are grouped by month for both multi-tenant and virtual private clo - **New**: You can now use your [Azure OpenAI key](/docs/cloud/account-integrations?ai-integration=azure#ai-integrations) (available in beta) to use dbt Cloud features like [dbt Copilot](/docs/cloud/dbt-copilot) and [Ask dbt](/docs/cloud-integrations/snowflake-native-app) . Additionally, you can use your own [OpenAI API key](/docs/cloud/account-integrations?ai-integration=openai#ai-integrations) or use [dbt Labs-managed OpenAI](/docs/cloud/account-integrations?ai-integration=dbtlabs#ai-integrations) key. Refer to [AI integrations](/docs/cloud/account-integrations#ai-integrations) for more information. - **New**: The [`hard_deletes`](/reference/resource-configs/hard-deletes) config gives you more control on how to handle deleted rows from the source. Supported options are `ignore` (default), `invalidate` (replaces the legacy `invalidate_hard_deletes=true`), and `new_record`. Note that `new_record` will create a new metadata column in the snapshot table. - ## November 2024 + - **Enhancement**: Data health signals in dbt Explorer are now available for Exposures, providing a quick view of data health while browsing resources. To view trust signal icons, go to dbt Explorer and click **Exposures** under the **Resource** tab. Refer to [Data health signals for resources](/docs/collaborate/data-health-signals) for more info. - **Bug**: Identified and fixed an error with Semantic Layer queries that take longer than 10 minutes to complete. 
- **Fix**: Job environment variable overrides in credentials are now respected for Exports. Previously, they were ignored. @@ -46,6 +57,7 @@ Release notes are grouped by month for both multi-tenant and virtual private clo - Better error messaging for queries that can't be parsed correctly. - **Enhancement**: The dbt Semantic Layer supports creating new credentials for users who don't have permissions to create service tokens. In the **Credentials & service tokens** side panel, the **+Add Service Token** option is unavailable for those users who don't have permission. Instead, the side panel displays a message indicating that the user doesn't have permission to create a service token and should contact their administration. Refer to [Set up dbt Semantic Layer](/docs/use-dbt-semantic-layer/setup-sl) for more details. + ## October 2024 diff --git a/website/docs/docs/use-dbt-semantic-layer/exports.md b/website/docs/docs/use-dbt-semantic-layer/exports.md index 1883212fb66..2ecffc508a2 100644 --- a/website/docs/docs/use-dbt-semantic-layer/exports.md +++ b/website/docs/docs/use-dbt-semantic-layer/exports.md @@ -24,21 +24,22 @@ Essentially, exports are like any other table in your data platform — they ## Benefits of exports -The following section explains the main benefits of using exports, including: -- [DRY representation](#dry-representation) -- [Easier changes](#easier-changes) -- [Caching](#caching) +The following section explains the main benefits of using exports: -#### DRY representation + Currently, creating tables often involves generating tens, hundreds, or even thousands of tables that denormalize data into summary or metric mart tables. The main benefit of exports is creating a "Don't Repeat Yourself (DRY)" representation of the logic to construct each metric, dimension, join, filter, and so on. This allows you to reuse those components for long-term scalability, even if you're replacing manually written SQL models with references to the metrics or dimensions in saved queries. + -#### Easier changes + Exports ensure that changes to metrics and dimensions are made in one place and then cascade to those various destinations seamlessly. This prevents the problem of needing to update a metric across every model that references that same concept. + + + -#### Caching Use exports to pre-populate the cache, so that you're pre-computing what you need to serve users through the dynamic Semantic Layer APIs. + #### Considerations diff --git a/website/docs/faqs/Git/gitignore.md b/website/docs/faqs/Git/gitignore.md index f5892b30b83..1b9013a4473 100644 --- a/website/docs/faqs/Git/gitignore.md +++ b/website/docs/faqs/Git/gitignore.md @@ -7,9 +7,11 @@ id: gitignore A `.gitignore` file specifies which files git should intentionally ignore or 'untrack'. dbt Cloud indicates untracked files in the project file explorer pane by putting the file or folder name in *italics*. -If you encounter issues like problems reverting changes, checking out or creating a new branch, or not being prompted to open a pull request after a commit in the dbt Cloud IDE — this usually indicates a problem with the [.gitignore](https://github.com/dbt-labs/dbt-starter-project/blob/main/.gitignore) file. The file may be missing or lacks the required entries for dbt Cloud to work correctly. 
+If you encounter issues like problems reverting changes, checking out or creating a new branch, or not being prompted to open a pull request after a commit in the dbt Cloud IDE — this usually indicates a problem with the [.gitignore](https://github.com/dbt-labs/dbt-starter-project/blob/main/.gitignore) file. The file may be missing or lacks the required entries for dbt Cloud to work correctly. -### Fix in the dbt Cloud IDE +The following sections describe how to fix the `.gitignore` file in: + + To resolve issues with your `gitignore` file, adding the correct entries won't automatically remove (or 'untrack') files or folders that have already been tracked by git. The updated `gitignore` will only prevent new files or folders from being tracked. So you'll need to first fix the `gitignore` file, then perform some additional git operations to untrack any incorrect files or folders. @@ -51,7 +53,9 @@ For more info on `gitignore` syntax, refer to the [Git docs](https://git-scm.com -### Fix in the git provider + + + Sometimes it's necessary to use the git providers web interface to fix a broken `.gitignore` file. Although the specific steps may vary across providers, the general process remains the same. @@ -121,3 +125,4 @@ dbt_modules/ For more info, refer to this [detailed video](https://www.loom.com/share/9b3b8e2b617f41a8bad76ec7e42dd014) for additional guidance. + diff --git a/website/docs/faqs/Troubleshooting/ide-session-unknown-error.md b/website/docs/faqs/Troubleshooting/ide-session-unknown-error.md index 4165506993c..5909459fbea 100644 --- a/website/docs/faqs/Troubleshooting/ide-session-unknown-error.md +++ b/website/docs/faqs/Troubleshooting/ide-session-unknown-error.md @@ -13,7 +13,7 @@ Your IDE session experienced an unknown error and was terminated. Please contact ``` -You can try to resolve this by adding a repository like a [managed repository](/docs/collaborate/git/managed-repository) or your preferred Git account. To add your Git account, navigate to **Project** > **Repository** and select your repository. +You can try to resolve this by adding a repository like a [managed repository](/docs/cloud/git/managed-repository) or your preferred Git account. To add your Git account, navigate to **Project** > **Repository** and select your repository. If you're still running into this error, please contact the Support team at support@getdbt.com for help. diff --git a/website/docs/guides/dbt-python-snowpark.md b/website/docs/guides/dbt-python-snowpark.md index 091f1006992..5f14222d910 100644 --- a/website/docs/guides/dbt-python-snowpark.md +++ b/website/docs/guides/dbt-python-snowpark.md @@ -262,7 +262,7 @@ We need to obtain our data source by copying our Formula 1 data into Snowflake t ## Configure dbt Cloud -1. We are going to be using [Snowflake Partner Connect](https://docs.snowflake.com/en/user-guide/ecosystem-partner-connect.html) to set up a dbt Cloud account. Using this method will allow you to spin up a fully fledged dbt account with your [Snowflake connection](/docs/cloud/connect-data-platform/connect-snowflake), [managed repository](/docs/collaborate/git/managed-repository), environments, and credentials already established. +1. We are going to be using [Snowflake Partner Connect](https://docs.snowflake.com/en/user-guide/ecosystem-partner-connect.html) to set up a dbt Cloud account. 
Using this method will allow you to spin up a fully fledged dbt account with your [Snowflake connection](/docs/cloud/connect-data-platform/connect-snowflake), [managed repository](/docs/cloud/git/managed-repository), environments, and credentials already established. 2. Navigate out of your worksheet back by selecting **home**. 3. In Snowsight, confirm that you are using the **ACCOUNTADMIN** role. 4. Navigate to the **Data Products** **> Partner Connect**. Find **dbt** either by using the search bar or navigating the **Data Integration**. Select the **dbt** tile. @@ -282,7 +282,7 @@ We need to obtain our data source by copying our Formula 1 data into Snowflake t 9. Select **Complete Registration**. You should now be redirected to your dbt Cloud account, complete with a connection to your Snowflake account, a deployment and a development environment, and a sample job. -10. To help you version control your dbt project, we have connected it to a [managed repository](/docs/collaborate/git/managed-repository), which means that dbt Labs will be hosting your repository for you. This will give you access to a Git workflow without you having to create and host the repository yourself. You will not need to know Git for this workshop; dbt Cloud will help guide you through the workflow. In the future, when you’re developing your own project, [feel free to use your own repository](/docs/cloud/git/connect-github). This will allow you to learn more about features like [Slim CI](/docs/deploy/continuous-integration) builds after this workshop. +10. To help you version control your dbt project, we have connected it to a [managed repository](/docs/cloud/git/managed-repository), which means that dbt Labs will be hosting your repository for you. This will give you access to a Git workflow without you having to create and host the repository yourself. You will not need to know Git for this workshop; dbt Cloud will help guide you through the workflow. In the future, when you’re developing your own project, [feel free to use your own repository](/docs/cloud/git/connect-github). This will allow you to learn more about features like [Slim CI](/docs/deploy/continuous-integration) builds after this workshop. ## Change development schema name navigate the IDE diff --git a/website/docs/guides/sl-snowflake-qs.md b/website/docs/guides/sl-snowflake-qs.md index 79038cd1dfc..d70d074485f 100644 --- a/website/docs/guides/sl-snowflake-qs.md +++ b/website/docs/guides/sl-snowflake-qs.md @@ -262,7 +262,7 @@ There are two ways to connect dbt Cloud to Snowflake. The first option is Partne -Using Partner Connect allows you to create a complete dbt account with your [Snowflake connection](/docs/cloud/connect-data-platform/connect-snowflake), [a managed repository](/docs/collaborate/git/managed-repository), [environments](/docs/build/custom-schemas#managing-environments), and credentials. +Using Partner Connect allows you to create a complete dbt account with your [Snowflake connection](/docs/cloud/connect-data-platform/connect-snowflake), [a managed repository](/docs/cloud/git/managed-repository), [environments](/docs/build/custom-schemas#managing-environments), and credentials. 1. In the Snowflake UI, click on the home icon in the upper left corner. In the left sidebar, select **Data Products**. Then, select **Partner Connect**. Find the dbt tile by scrolling or by searching for dbt in the search bar. Click the tile to connect to dbt. 
@@ -333,7 +333,7 @@ Using Partner Connect allows you to create a complete dbt account with your [Sno ## Set up a dbt Cloud managed repository -If you used Partner Connect, you can skip to [initializing your dbt project](#initialize-your-dbt-project-and-start-developing) as Partner Connect provides you with a [managed repository](/docs/collaborate/git/managed-repository). Otherwise, you will need to create your repository connection. +If you used Partner Connect, you can skip to [initializing your dbt project](#initialize-your-dbt-project-and-start-developing) as Partner Connect provides you with a [managed repository](/docs/cloud/git/managed-repository). Otherwise, you will need to create your repository connection. diff --git a/website/docs/guides/snowflake-qs.md b/website/docs/guides/snowflake-qs.md index f1edd5ffc00..40bdeed1ef2 100644 --- a/website/docs/guides/snowflake-qs.md +++ b/website/docs/guides/snowflake-qs.md @@ -142,7 +142,7 @@ There are two ways to connect dbt Cloud to Snowflake. The first option is Partne -Using Partner Connect allows you to create a complete dbt account with your [Snowflake connection](/docs/cloud/connect-data-platform/connect-snowflake), [a managed repository](/docs/collaborate/git/managed-repository), [environments](/docs/build/custom-schemas#managing-environments), and credentials. +Using Partner Connect allows you to create a complete dbt account with your [Snowflake connection](/docs/cloud/connect-data-platform/connect-snowflake), [a managed repository](/docs/cloud/git/managed-repository), [environments](/docs/build/custom-schemas#managing-environments), and credentials. 1. In the Snowflake UI, click on the home icon in the upper left corner. In the left sidebar, select **Data Products**. Then, select **Partner Connect**. Find the dbt tile by scrolling or by searching for dbt in the search bar. Click the tile to connect to dbt. diff --git a/website/docs/reference/resource-configs/meta.md b/website/docs/reference/resource-configs/meta.md index e1542bdbc82..a7f348d50ba 100644 --- a/website/docs/reference/resource-configs/meta.md +++ b/website/docs/reference/resource-configs/meta.md @@ -16,7 +16,7 @@ hide_table_of_contents: true { label: 'Analyses', value: 'analyses', }, { label: 'Macros', value: 'macros', }, { label: 'Exposures', value: 'exposures', }, - { label: 'Semantic Models', value: 'semantic models', }, + { label: 'Semantic models', value: 'semantic models', }, { label: 'Metrics', value: 'metrics', }, { label: 'Saved queries', value: 'saved queries', }, ] @@ -179,6 +179,27 @@ exposures: +Configure `meta` in the your [semantic models](/docs/build/semantic-models) YAML file or under the `semantic-models` config block in the `dbt_project.yml` file. + + + + + +```yml +semantic_models: + - name: semantic_model_name + config: + meta: {} + +``` + + + + + + +[Dimensions](/docs/build/dimensions), [entities](/docs/build/entities), and [measures](/docs/build/measures) can also have their own `meta` configurations. + ```yml @@ -187,9 +208,25 @@ semantic_models: config: meta: {} + dimensions: + - name: dimension_name + config: + meta: {} + + entities: + - name: entity_name + config: + meta: {} + + measures: + - name: measure_name + config: + meta: {} + ``` + The `meta` config can also be defined under the `semantic-models` config block in `dbt_project.yml`. See [configs and properties](/reference/configs-and-properties) for details. @@ -249,13 +286,11 @@ saved_queries: ``` - - ## Definition -The `meta` field can be used to set metadata for a resource. 
This metadata is compiled into the `manifest.json` file generated by dbt, and is viewable in the auto-generated documentation. +The `meta` field can be used to set metadata for a resource and accepts any key-value pairs. This metadata is compiled into the `manifest.json` file generated by dbt, and is viewable in the auto-generated documentation. Depending on the resource you're configuring, `meta` may be available within the `config` property, and/or as a top-level key. (For backwards compatibility, `meta` is often (but not always) supported as a top-level key, though without the capabilities of config inheritance.) @@ -343,3 +378,107 @@ models: +### Assign meta to semantic model + + +The following example shows how to assign a `meta` value to a [semantic model](/docs/build/semantic-models) in the `semantic_model.yml` file and `dbt_project.yml` file: + + + + +```yaml +semantic_models: + - name: transaction + model: ref('fact_transactions') + description: "Transaction fact table at the transaction level. This table contains one row per transaction and includes the transaction timestamp." + defaults: + agg_time_dimension: transaction_date + config: + meta: + data_owner: "Finance team" + used_in_reporting: true +``` + + + + + +```yaml +semantic-models: + jaffle_shop: + +meta: + used_in_reporting: true +``` + + + +### Assign meta to dimensions, measures, entities + + + +Available in dbt version 1.9 and later. + + + + + + + + +The following example shows how to assign a `meta` value to a [dimension](/docs/build/dimensions), [entity](/docs/build/entities), and [measure](/docs/build/measures) in a semantic model: + + + +```yml +semantic_models: + - name: semantic_model + ... + dimensions: + - name: order_date + type: time + config: + meta: + data_owner: "Finance team" + used_in_reporting: true + entities: + - name: customer_id + type: primary + config: + meta: + description: "Unique identifier for customers" + data_owner: "Sales team" + used_in_reporting: false + measures: + - name: count_of_users + expr: user_id + config: + meta: + used_in_reporting: true +``` + + + + + + +This second example shows how to assign a `data_owner` and additional metadata value to a dimension in the `dbt_project.yml` file using the `+meta` syntax. The similar syntax can be used for entities and measures. + + + +```yml +semantic-models: + jaffle_shop: + ... 
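+    # 'jaffle_shop' is the dbt project name in this illustrative example; replace it with your own project name.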
+ [dimensions](/docs/build/dimensions): + - name: order_date + config: + meta: + data_owner: "Finance team" + used_in_reporting: true +``` + + + + + + diff --git a/website/docs/reference/resource-configs/tags.md b/website/docs/reference/resource-configs/tags.md index f6c46f8a088..c222df8c1ae 100644 --- a/website/docs/reference/resource-configs/tags.md +++ b/website/docs/reference/resource-configs/tags.md @@ -16,59 +16,94 @@ datatype: string | [string] + + ```yml -models: +[models](/reference/model-configs): + [](/reference/resource-configs/resource-path): + +tags: | [] # Supports single strings or list of strings + +[snapshots](/reference/snapshot-configs): [](/reference/resource-configs/resource-path): +tags: | [] -snapshots: +[seeds](/reference/seed-configs): [](/reference/resource-configs/resource-path): +tags: | [] -seeds: +``` + + + + +```yml + +[models](/reference/model-configs): + [](/reference/resource-configs/resource-path): + +tags: | [] # Supports single strings or list of strings + +[snapshots](/reference/snapshot-configs): + [](/reference/resource-configs/resource-path): + +tags: | [] + +[seeds](/reference/seed-configs): + [](/reference/resource-configs/resource-path): + +tags: | [] + +[saved-queries:](/docs/build/saved-queries) [](/reference/resource-configs/resource-path): +tags: | [] ``` + + - + -```yml -version: 2 +The following examples show how to add tags to dbt resources in YAML files. Replace `resource_type` with `models`, `snapshots`, `seeds`, or `saved_queries` as appropriate. + -models: - - name: model_name - config: - tags: | [] + + +The following examples show how to add tags to dbt resources in YAML files. Replace `resource_type` with `models`, `snapshots`, or `seeds` as appropriate. + + + +```yaml +resource_type: + - name: resource_name + config: + tags: | [] # Supports single strings or list of strings + # Optional: Add the following specific properties for models columns: - name: column_name - tags: [] + tags: | [] tests: - : + test-name: config: - tags: | [] + tags: "single-string" # Supports single string + tags: ["string-1", "string-2"] # Supports list of strings ``` - -```jinja - + +```sql {{ config( tags="" | [""] ) }} - ``` + @@ -79,6 +114,7 @@ Apply a tag (or list of tags) to a resource. These tags can be used as part of the [resource selection syntax](/reference/node-selection/syntax), when running the following commands: - `dbt run --select tag:my_tag` +- `dbt build --select tag:my_tag` - `dbt seed --select tag:my_tag` - `dbt snapshot --select tag:my_tag` - `dbt test --select tag:my_tag` (indirectly runs all tests associated with the models that are tagged) @@ -128,14 +164,14 @@ select ... -Then, run part of your project like so: +Run resources with specific tags (or exclude resources with specific tags) using the following commands: -``` +```shell # Run all models tagged "daily" -$ dbt run --select tag:daily + dbt run --select tag:daily # Run all models tagged "daily", except those that are tagged hourly -$ dbt run --select tag:daily --exclude tag:hourly + dbt run --select tag:daily --exclude tag:hourly ``` ### Apply tags to seeds @@ -164,10 +200,68 @@ seeds: +### Apply tags to saved queries + + + +:::tip Upgrade to dbt Core 1.9 + +Applying tags to saved queries is only available in dbt Core versions 1.9 and later. +::: + + + + + +This following example shows how to apply a tag to a saved query in the `dbt_project.yml` file. The saved query is then tagged with `order_metrics`. 
@@ -164,10 +200,68 @@ seeds:
 
+### Apply tags to saved queries
+
+:::tip Upgrade to dbt Core 1.9
+
+Applying tags to saved queries is only available in dbt Core versions 1.9 and later.
+:::
+
+The following example shows how to apply a tag to a saved query in the `dbt_project.yml` file. The saved query is then tagged with `order_metrics`.
+
+```yml
+[saved-queries](/docs/build/saved-queries):
+  jaffle_shop:
+    customer_order_metrics:
+      +tags: order_metrics
+```
+
+Then run resources with a specific tag using the following command:
+
+```shell
+# Run all resources tagged "order_metrics"
+dbt run --select tag:order_metrics
+```
+
+The second example shows how to apply multiple tags to a saved query in the `semantic_model.yml` file. The saved query is then tagged with `order_metrics` and `hourly`.
+
+```yaml
+saved_queries:
+  - name: test_saved_query
+    description: "{{ doc('saved_query_description') }}"
+    label: Test saved query
+    config:
+      tags:
+        - order_metrics
+        - hourly
+```
+
+Run resources with multiple tags using the following command:
+
+```shell
+# Run all resources tagged "order_metrics" and "hourly"
+dbt build --select tag:order_metrics tag:hourly
+```
+
 ## Usage notes
 
 ### Tags are additive
 
-Tags accumulate hierarchically. The above example would result in:
+Tags accumulate hierarchically. The [earlier example](/reference/resource-configs/tags#use-tags-to-run-parts-of-your-project) would result in:
 
 | Model | Tags |
 | -------------------------------- | ------------------------------------- |
@@ -178,7 +272,7 @@
 
 ### Other resource types
 
-Tags can also be applied to sources, exposures, and even _specific columns_ in a resource.
+Tags can also be applied to [sources](/docs/build/sources), [exposures](/docs/build/exposures), and even _specific columns_ in a resource.
 These resources do not yet support the `config` property, so you'll need to specify
 the tags as a top-level key instead.
 
@@ -210,10 +304,11 @@ sources:
 
 
+
 In the example above, the `unique` test would be selected by any of these four tags:
 
 ```bash
-$ dbt test --select tag:top_level
-$ dbt test --select tag:table_level
-$ dbt test --select tag:column_level
-$ dbt test --select tag:test_level
+dbt test --select tag:top_level
+dbt test --select tag:table_level
+dbt test --select tag:column_level
+dbt test --select tag:test_level
 ```
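The `sources` example that these four commands refer to sits outside the hunks shown in this diff. As a rough sketch of the shape such a configuration takes — the source, table, and column names are assumed, and only the four tag names come from the commands above — it would look something like this, with tags set as top-level keys rather than under `config`:

```yaml
version: 2

sources:
  - name: jaffle_shop
    tags: [top_level]
    tables:
      - name: orders
        tags: [table_level]
        columns:
          - name: order_id
            tags: [column_level]
            tests:
              - unique:
                  tags: [test_level]
```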
diff --git a/website/sidebars.js b/website/sidebars.js
index db97f1f25da..3a8f560c297 100644
--- a/website/sidebars.js
+++ b/website/sidebars.js
@@ -142,6 +142,7 @@ const sidebarSettings = {
       },
       items: [
         "docs/cloud/git/git-configuration-in-dbt-cloud",
+        "docs/cloud/git/managed-repository",
         "docs/cloud/git/import-a-project-by-git-url",
         "docs/cloud/git/connect-github",
         "docs/cloud/git/connect-gitlab",
@@ -552,7 +553,6 @@ const sidebarSettings = {
       items: [
         "docs/collaborate/git-version-control",
         "docs/collaborate/git/version-control-basics",
-        "docs/collaborate/git/managed-repository",
         "docs/collaborate/git/pr-template",
         "docs/collaborate/git/merge-conflicts",
       ],
diff --git a/website/snippets/_sl-measures-parameters.md b/website/snippets/_sl-measures-parameters.md
index 8d6b84a71dd..f80f90c3063 100644
--- a/website/snippets/_sl-measures-parameters.md
+++ b/website/snippets/_sl-measures-parameters.md
@@ -1,3 +1,4 @@
+
 | Parameter | Description | Required | Type |
 | --- | --- | --- | --- |
 | [`name`](/docs/build/measures#name) | Provide a name for the measure, which must be unique and can't be repeated across all semantic models in your dbt project. | Required | String |
@@ -9,3 +10,4 @@
 | `agg_time_dimension` | The time field. Defaults to the default agg time dimension for the semantic model. | Optional | String |
 | `label` | String that defines the display value in downstream tools. Accepts plain text, spaces, and quotes (such as `orders_total` or `"orders_total"`). Available in dbt version 1.7 or higher. | Optional | String |
 | `create_metric` | Create a `simple` metric from a measure by setting `create_metric: True`. The `label` and `description` attributes will be automatically propagated to the created metric. Available in dbt version 1.7 or higher. | Optional | Boolean |
+| `config` | Use the [`config`](/reference/resource-properties/config) property to specify configurations for your measure. Supports the [`meta`](/reference/resource-configs/meta) property, nested under `config`. | Optional | Dictionary |
diff --git a/website/snippets/available-git-providers.md b/website/snippets/available-git-providers.md
index 6579d8989bf..5e1200fe2f3 100644
--- a/website/snippets/available-git-providers.md
+++ b/website/snippets/available-git-providers.md
@@ -1,3 +1,3 @@
 When you develop in dbt Cloud, you can leverage [Git](/docs/collaborate/git-version-control) to version control your code.
 
-To connect to a repository, you can either set up a dbt Cloud-hosted [managed repository](/docs/collaborate/git/managed-repository) or directly connect to a [supported git provider](/docs/cloud/git/connect-github). Managed repositories are a great way to trial dbt without needing to create a new repository. In the long run, it's better to connect to a supported git provider to use features like automation and [continuous integration](/docs/deploy/continuous-integration).
\ No newline at end of file
+To connect to a repository, you can either set up a dbt Cloud-hosted [managed repository](/docs/cloud/git/managed-repository) or directly connect to a [supported git provider](/docs/cloud/git/connect-github). Managed repositories are a great way to trial dbt without needing to create a new repository. In the long run, it's better to connect to a supported git provider to use features like automation and [continuous integration](/docs/deploy/continuous-integration).
diff --git a/website/vercel.json b/website/vercel.json
index b68dc053db9..993ff9065bd 100644
--- a/website/vercel.json
+++ b/website/vercel.json
@@ -2,6 +2,11 @@
   "cleanUrls": true,
   "trailingSlash": false,
   "redirects": [
+    {
+      "source": "/docs/collaborate/git/managed-repository",
+      "destination": "/docs/cloud/git/managed-repository",
+      "permanent": true
+    },
     {
       "source": "/faqs/API/rotate-token",
       "destination": "/docs/dbt-cloud-apis/service-tokens#service-token-update",
       "permanent": true
     },
@@ -1372,7 +1377,7 @@
     },
     {
       "source": "/docs/dbt-cloud/cloud-configuring-dbt-cloud/cloud-using-a-managed-repository",
-      "destination": "/docs/collaborate/git/managed-repository",
+      "destination": "/docs/cloud/git/managed-repository",
       "permanent": true
     },
     {