Commit

Merge branch 'current' into mult-unique-keys

runleonarun authored Nov 8, 2024
2 parents 9e9ec1c + 6bc8e2f commit 4206050
Showing 21 changed files with 67,198 additions and 49 deletions.
11 changes: 2 additions & 9 deletions website/dbt-versions.js
@@ -15,7 +15,7 @@
*/
exports.versions = [
{
-version: "1.9.1",
+version: "1.10",
customDisplay: "Cloud (Versionless)",
},
{
@@ -74,12 +74,5 @@ exports.versionedPages = [
* @property {string} firstVersion The first version the category is visible in the sidebar
*/
exports.versionedCategories = [
-{
-category: "Model governance",
-firstVersion: "1.5",
-},
-{
-category: "Build your metrics",
-firstVersion: "1.6",
-},

];
2 changes: 1 addition & 1 deletion website/docs/docs/build/dimensions.md
@@ -67,7 +67,7 @@ semantic_models:
type: categorical
```
-Dimensions are bound to the primary entity of the semantic model they are defined in. For example the dimensoin `type` is defined in a model that has `transaction` as a primary entity. `type` is scoped to the `transaction` entity, and to reference this dimension you would use the fully qualified dimension name i.e `transaction__type`.
+Dimensions are bound to the primary entity of the semantic model they are defined in. For example the dimension `type` is defined in a model that has `transaction` as a primary entity. `type` is scoped to the `transaction` entity, and to reference this dimension you would use the fully qualified dimension name i.e `transaction__type`.

MetricFlow requires that all semantic models have a primary entity. This is to guarantee unique dimension names. If your data source doesn't have a primary entity, you need to assign the entity a name using the `primary_entity` key. It doesn't necessarily have to map to a column in that table and assigning the name doesn't affect query generation. We recommend making these "virtual primary entities" unique across your semantic model. An example of defining a primary entity for a data source that doesn't have a primary entity column is below:
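A minimal sketch of such a definition (the model, entity, and column names here are illustrative, not taken from the diff):

```yaml
semantic_models:
  - name: transactions
    model: ref('fact_transactions')
    # The source table has no primary-key column, so assign a "virtual"
    # primary entity; the name doesn't need to map to a column and
    # doesn't affect query generation
    primary_entity: transaction
    entities:
      - name: user_id
        type: foreign
    dimensions:
      - name: type
        type: categorical
```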

2 changes: 1 addition & 1 deletion website/docs/docs/build/incremental-microbatch.md
@@ -8,7 +8,7 @@ id: "incremental-microbatch"

:::info Microbatch

-The `microbatch` strategy is available in beta for [dbt Cloud Versionless](/docs/dbt-versions/upgrade-dbt-version-in-cloud#versionless) and dbt Core v1.9. We have been developing it behind a flag to prevent unintended interactions with existing custom incremental strategies. To enable this feature, set the environment variable `DBT_EXPERIMENTAL_MICROBATCH` to `True` in your dbt Cloud environments or wherever you're running dbt Core.
+The `microbatch` strategy is available in beta for [dbt Cloud Versionless](/docs/dbt-versions/upgrade-dbt-version-in-cloud#versionless) and dbt Core v1.9. We have been developing it behind a flag to prevent unintended interactions with existing custom incremental strategies. To enable this feature, [set the environment variable](/docs/build/environment-variables#setting-and-overriding-environment-variables) `DBT_EXPERIMENTAL_MICROBATCH` to `True` in your dbt Cloud environments or wherever you're running dbt Core.

Read and participate in the discussion: [dbt-core#10672](https://github.com/dbt-labs/dbt-core/discussions/10672)
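As a sketch, opting in for a local dbt Core session might look like this (the variable name comes from the note above; everything else is illustrative):

```shell
# Opt in to the beta microbatch strategy for this shell session.
# In dbt Cloud, set the same variable in your environment settings instead.
export DBT_EXPERIMENTAL_MICROBATCH=True
```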

4 changes: 2 additions & 2 deletions website/docs/docs/build/incremental-models.md
@@ -212,11 +212,11 @@ Currently, `on_schema_change` only tracks top-level column changes. It does not

### Default behavior

-This is the behavior if `on_schema_change: ignore`, which is set by default, and on older versions of dbt.
+This is the behavior of `on_schema_change: ignore`, which is set by default.

If you add a column to your incremental model, and execute a `dbt run`, this column will _not_ appear in your target table.

-Similarly, if you remove a column from your incremental model, and execute a `dbt run`, this column will _not_ be removed from your target table.
+If you remove a column from your incremental model and execute a `dbt run`, `dbt run` will fail.

Instead, whenever the logic of your incremental changes, execute a full-refresh run of both your incremental model and any downstream models.
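For reference, a hedged sketch of where this setting lives (the model name is illustrative; `ignore` is the default discussed above):

```yaml
version: 2
models:
  - name: my_incremental_model
    config:
      materialized: incremental
      # Options: ignore (default), fail, append_new_columns, sync_all_columns
      on_schema_change: ignore
```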

18 changes: 9 additions & 9 deletions website/docs/docs/build/measures.md
@@ -200,7 +200,7 @@ Parameters under the `non_additive_dimension` will specify dimensions that the m

```yaml
semantic_models:
-- name: subscription_id
+- name: subscriptions
description: A subscription table with one row per date for each active user and their subscription plans.
model: ref('your_schema.subscription_table')
defaults:
@@ -209,7 +209,7 @@
entities:
- name: user_id
type: foreign
-primary_entity: subscription_table
+primary_entity: subscription
dimensions:
- name: subscription_date
@@ -224,21 +224,21 @@
expr: user_id
agg: count_distinct
non_additive_dimension:
-name: metric_time
+name: subscription_date
window_choice: max
- name: mrr
description: Aggregate by summing all users' active subscription plans
expr: subscription_value
agg: sum
non_additive_dimension:
-name: metric_time
+name: subscription_date
window_choice: max
- name: user_mrr
description: Group by user_id to achieve each user's MRR
expr: subscription_value
agg: sum
non_additive_dimension:
-name: metric_time
+name: subscription_date
window_choice: max
window_groupings:
- user_id
@@ -255,15 +255,15 @@ We can query the semi-additive metrics using the following syntax:
For dbt Cloud:

```bash
-dbt sl query --metrics mrr_by_end_of_month --group-by metric_time__month --order metric_time__month
-dbt sl query --metrics mrr_by_end_of_month --group-by metric_time__week --order metric_time__week
+dbt sl query --metrics mrr_by_end_of_month --group-by subscription__subscription_date__month --order subscription__subscription_date__month
+dbt sl query --metrics mrr_by_end_of_month --group-by subscription__subscription_date__week --order subscription__subscription_date__week
```

For dbt Core:

```bash
-mf query --metrics mrr_by_end_of_month --group-by metric_time__month --order metric_time__month
-mf query --metrics mrr_by_end_of_month --group-by metric_time__week --order metric_time__week
+mf query --metrics mrr_by_end_of_month --group-by subscription__subscription_date__month --order subscription__subscription_date__month
+mf query --metrics mrr_by_end_of_month --group-by subscription__subscription_date__week --order subscription__subscription_date__week
```

import SetUpPages from '/snippets/_metrics-dependencies.md';
2 changes: 1 addition & 1 deletion website/docs/docs/build/metricflow-commands.md
@@ -259,7 +259,7 @@ Create a new query with MetricFlow and execute it against your data platform. Th
```bash
dbt sl query --metrics <metric_name> --group-by <dimension_name> # In dbt Cloud
-dbt sl query --saved-query <name> # In dbt Cloud CLI
+dbt sl query --saved-query <name> # In dbt Cloud

mf query --metrics <metric_name> --group-by <dimension_name> # In dbt Core

8 changes: 4 additions & 4 deletions website/docs/docs/build/metricflow-time-spine.md
@@ -150,7 +150,7 @@ final as (
select * from final
where date_day > dateadd(year, -4, current_timestamp())
-and date_hour < dateadd(day, 30, current_timestamp())
+and date_day < dateadd(day, 30, current_timestamp())
```

### Daily (BigQuery)
@@ -180,7 +180,7 @@ select *
from final
-- filter the time spine to a specific range
where date_day > dateadd(year, -4, current_timestamp())
-and date_hour < dateadd(day, 30, current_timestamp())
+and date_day < dateadd(day, 30, current_timestamp())
```

</File>
@@ -265,7 +265,7 @@ final as (
select * from final
where date_day > dateadd(year, -4, current_timestamp())
-and date_hour < dateadd(day, 30, current_timestamp())
+and date_day < dateadd(day, 30, current_timestamp())
```

</File>
@@ -296,7 +296,7 @@ select *
from final
-- filter the time spine to a specific range
where date_day > dateadd(year, -4, current_timestamp())
-and date_hour < dateadd(day, 30, current_timestamp())
+and date_day < dateadd(day, 30, current_timestamp())
```

</File>
61 changes: 57 additions & 4 deletions website/docs/docs/cloud/manage-access/audit-log.md
@@ -62,7 +62,7 @@ The audit log supports various events for different objects in dbt Cloud. You wi
| Auth Provider Changed | auth_provider.Changed | Authentication provider settings changed |
| Credential Login Succeeded | auth.CredentialsLoginSucceeded | User successfully logged in with username and password |
| SSO Login Failed | auth.SsoLoginFailed | User login via SSO failed |
-| SSO Login Succeeded | auth.SsoLoginSucceeded | User successfully logged in via SSO
+| SSO Login Succeeded | auth.SsoLoginSucceeded | User successfully logged in via SSO |

### Environment

@@ -93,7 +93,7 @@ The audit log supports various events for different objects in dbt Cloud. You wi
| ------------- | ----------------------------- | ------------------------------ |
| Group Added | user_group.Added | New Group successfully created |
| Group Changed | user_group.Changed | Group settings changed |
-| Group Removed | user_group.Changed | Group successfully removed |
+| Group Removed | user_group.Removed | Group successfully removed |

### User

@@ -149,12 +149,65 @@ The audit log supports various events for different objects in dbt Cloud. You wi

### Credentials

-| Event Name | Event Type | Description |
-| -------------------------------- | ----------------------------- | -------------------------------- |
+| Event Name | Event Type | Description |
+| -------------------------------- | ----------------------------- | -----------------------|
| Credentials Added to Project | credentials.Added | Project credentials added |
| Credentials Changed in Project | credentials.Changed | Credentials changed in project |
| Credentials Removed from Project | credentials.Removed | Credentials removed from project |


### Git integration

| Event Name | Event Type | Description |
| -------------------------------- | ----------------------------- | -----------------------|
| GitLab Application Changed | gitlab_application.changed | GitLab configuration in dbt Cloud changed |

### Webhooks

| Event Name | Event Type | Description |
| -------------------------------- | ----------------------------- | -----------------------|
| Webhook Subscriptions Added | webhook_subscription.added | New webhook configured in settings |
| Webhook Subscriptions Changed | webhook_subscription.changed | Existing webhook configuration altered |
| Webhook Subscriptions Removed | webhook_subscription.removed | Existing webhook deleted |


### Semantic Layer

| Event Name | Event Type | Description |
| -------------------------------- | ----------------------------- | -----------------------|
| Semantic Layer Config Added | semantic_layer_config.added | Semantic Layer config added |
| Semantic Layer Config Changed | semantic_layer_config.changed | Semantic Layer config (not related to credentials) changed |
| Semantic Layer Config Removed | semantic_layer_config.removed | Semantic Layer config removed |
| Semantic Layer Credentials Added | semantic_layer_credentials.added | Semantic Layer credentials added |
| Semantic Layer Credentials Changed | semantic_layer_credentials.changed | Semantic Layer credentials changed. Does not trigger semantic_layer_config.changed |
| Semantic Layer Credentials Removed | semantic_layer_credentials.removed | Semantic Layer credentials removed |

### Extended attributes

| Event Name | Event Type | Description |
| -------------------------------- | ----------------------------- | -----------------------|
| Extended Attribute Added | extended_attributes.added | Extended attribute added to a project |
| Extended Attribute Changed | extended_attributes.changed | Extended attribute changed or removed |


### Account-scoped personal access token

| Event Name | Event Type | Description |
| -------------------------------- | ----------------------------- | -----------------------|
| Account Scoped Personal Access Token Created | account_scoped_pat.created | An account-scoped PAT was created |
| Account Scoped Personal Access Token Deleted | account_scoped_pat.deleted | An account-scoped PAT was deleted |

### IP restrictions

| Event Name | Event Type | Description |
| -------------------------------- | ----------------------------- | -----------------------|
| IP Restrictions Toggled | ip_restrictions.toggled | IP restrictions feature enabled or disabled |
| IP Restrictions Rule Added | ip_restrictions.rule.added | IP restriction rule created |
| IP Restrictions Rule Changed | ip_restrictions.rule.changed | IP restriction rule edited |
| IP Restrictions Rule Removed | ip_restrictions.rule.removed | IP restriction rule deleted |



## Searching the audit log

You can search the audit log to find a specific event or actor, which is limited to the ones listed in [Events in audit log](#events-in-audit-log). The audit log successfully lists historical events spanning the last 90 days. You can search for an actor or event using the search bar, and then narrow your results using the time window.
28 changes: 26 additions & 2 deletions website/docs/docs/core/connect-data-platform/snowflake-setup.md
@@ -211,7 +211,7 @@ my-snowflake-db:

</File>

-### SSO Authentication
+### SSO authentication

To use SSO authentication for Snowflake, omit a `password` and instead supply an `authenticator` config to your target.
`authenticator` can be one of 'externalbrowser' or a valid Okta URL.
@@ -332,7 +332,7 @@ my-snowflake-db:

</File>

-### SSO Authentication
+### SSO authentication

To use SSO authentication for Snowflake, omit a `password` and instead supply an `authenticator` config to your target.
`authenticator` can be one of 'externalbrowser' or a valid Okta URL.
@@ -421,6 +421,30 @@ my-snowflake-db:

Refer to the [Snowflake docs](https://docs.snowflake.com/en/sql-reference/parameters.html#label-allow-id-token) for info on how to enable this feature in your account.

### OAuth authorization

To learn how to configure OAuth in Snowflake, refer to their [documentation](https://docs.snowflake.com/en/user-guide/oauth-snowflake-overview). Your Snowflake admin needs to generate an [OAuth token](https://community.snowflake.com/s/article/HOW-TO-OAUTH-TOKEN-GENERATION-USING-SNOWFLAKE-CUSTOM-OAUTH) for your configuration to work.

Provide the `OAUTH_REDIRECT_URI` in Snowflake: `http://localhost:PORT_NUMBER`. For example, `http://localhost:8080`.

Once your Snowflake admin has configured OAuth, add the following to your `profiles.yml` file:

```yaml
my-snowflake-db:
target: dev
outputs:
dev:
type: snowflake
account: [account id]
# The following fields are retrieved from the Snowflake configuration
authenticator: oauth
oauth_client_id: [OAuth client id]
oauth_client_secret: [OAuth client secret]
token: [OAuth refresh token]
```

## Configurations

The "base" configs for Snowflake targets are shown below. Note that you should also specify auth-related configs specific to the authentication method you are using as described above.
@@ -25,7 +25,7 @@ For performance use cases, people typically query the historical or latest appli

It’s helpful to understand how long it takes to build models (tables) and tests to execute during a dbt run. Longer model build times result in higher infrastructure costs and fresh data arriving later to stakeholders. Analyses like these can be in observability tools or ad-hoc queries, like in a notebook.

-<Lightbox src="/img/docs/dbt-cloud/discovery-api/model-timing.jpg" width="200%" title="Model timing visualization in dbt Cloud"/>
+<Lightbox src="/img/docs/dbt-cloud/discovery-api/model-timing.png" width="200%" title="Model timing visualization in dbt Cloud"/>

<details>
<summary>Example query with code</summary>
2 changes: 1 addition & 1 deletion website/docs/docs/dbt-cloud-apis/sl-jdbc.md
@@ -519,7 +519,7 @@ select * from {{
semantic_layer.query(metrics=['food_order_amount', 'order_gross_profit'],
group_by=[Dimension('metric_time')],
limit=10,
-order_by=[-'order_gross_profit'])
+order_by=['-order_gross_profit'])
}}
```
2 changes: 1 addition & 1 deletion website/docs/docs/dbt-cloud-environments.md
@@ -40,7 +40,7 @@ To create a new dbt Cloud development environment:

To use the dbt Cloud IDE or dbt Cloud CLI, each developer will need to set up [personal development credentials](/docs/cloud/dbt-cloud-ide/develop-in-the-cloud#get-started-with-the-cloud-ide) to your warehouse connection in their **Profile Settings**. This allows you to set separate target information and maintain individual credentials to connect to your warehouse.

-<Lightbox src="/img/docs/dbt-cloud/refresh-ide/new-environment-fields.png" width="85%" height="100" title="Creating a development environment"/>
+<Lightbox src="/img/docs/dbt-cloud/refresh-ide/new-environment-fields.png" width="85%" height="200" title="Creating a development environment"/>

## Deployment environment

7 changes: 7 additions & 0 deletions website/docs/docs/dbt-versions/release-notes.md
@@ -18,6 +18,13 @@ Release notes are grouped by month for both multi-tenant and virtual private clo

\* The official release date for this new format of release notes is May 15th, 2024. Historical release notes for prior dates may not reflect all available features released earlier this year or their tenancy availability.

## November 2024
- **Fix**: This update improves the [dbt Semantic Layer Tableau integration](/docs/cloud-integrations/semantic-layer/tableau), making query parsing more reliable. Some key fixes include:
- Error messages for unsupported joins between saved queries and ALL tables.
- Improved handling of queries when multiple tables are selected in a data source.
- Fixed a bug when an IN filter contained a lot of values.
- Better error messaging for queries that can't be parsed correctly.

## October 2024
<Expandable alt_header="Coalesce 2024 announcements">
