diff --git a/.github/pull_request_template.md b/.github/pull_request_template.md index 870dadcd183..d2bb72552bd 100644 --- a/.github/pull_request_template.md +++ b/.github/pull_request_template.md @@ -9,6 +9,7 @@ To learn more about the writing conventions used in the dbt Labs docs, see the [ - [ ] I have reviewed the [Content style guide](https://github.com/dbt-labs/docs.getdbt.com/blob/current/contributing/content-style-guide.md) so my content adheres to these guidelines. - [ ] The topic I'm writing about is for specific dbt version(s) and I have versioned it according to the [version a whole page](https://github.com/dbt-labs/docs.getdbt.com/blob/current/contributing/single-sourcing-content.md#adding-a-new-version) and/or [version a block of content](https://github.com/dbt-labs/docs.getdbt.com/blob/current/contributing/single-sourcing-content.md#versioning-blocks-of-content) guidelines. - [ ] I have added checklist item(s) to this list for anything anything that needs to happen before this PR is merged, such as "needs technical review" or "change base branch." +- [ ] The content in this PR requires a dbt release note, so I added one to the [release notes page](https://docs.getdbt.com/docs/dbt-versions/dbt-cloud-release-notes). -Note: The **Defer to staging/production** [toggle](/docs/cloud/about-cloud-develop-defer#defer-in-the-dbt-cloud-ide) button doesn't apply when running Semantic Layer commands in the dbt Cloud IDE. To use defer for Semantic layer commands in the IDE, toggle the button on and manually add the `--defer` flag to the command. This is a temporary workaround and will be available soon. You can install [MetricFlow](https://github.com/dbt-labs/metricflow#getting-started) from [PyPI](https://pypi.org/project/dbt-metricflow/). You need to use `pip` to install MetricFlow on Windows or Linux operating systems: + + 1. Create or activate your virtual environment `python -m venv venv` 2. Run `pip install dbt-metricflow` * You can install MetricFlow using PyPI as an extension of your dbt adapter in the command line. To install the adapter, run `python -m pip install "dbt-metricflow[your_adapter_name]"` and add the adapter name at the end of the command. For example, for a Snowflake adapter run `python -m pip install "dbt-metricflow[snowflake]"` + + + + +1. Create or activate your virtual environment `python -m venv venv` +2. Run `pip install dbt-metricflow` + * You can install MetricFlow using PyPI as an extension of your dbt adapter in the command line. To install the adapter, run `python -m pip install "dbt-metricflow[adapter_package_name]"` and add the adapter name at the end of the command. For example, for a Snowflake adapter run `python -m pip install "dbt-metricflow[dbt-snowflake]"` + + + **Note**, you'll need to manage versioning between dbt Core, your adapter, and MetricFlow. Something to note, MetricFlow `mf` commands return an error if you have a Metafont latex package installed. To run `mf` commands, uninstall the package. diff --git a/website/docs/docs/build/metricflow-time-spine.md b/website/docs/docs/build/metricflow-time-spine.md index e932fb36f53..9932a35839c 100644 --- a/website/docs/docs/build/metricflow-time-spine.md +++ b/website/docs/docs/build/metricflow-time-spine.md @@ -463,9 +463,22 @@ For example, if you use a custom calendar in your organization, such as a fiscal - This is useful for calculating metrics based on a custom calendar, such as fiscal quarters or weeks. 
- Use the `custom_granularities` key to define a non-standard time period for querying data, such as a `retail_month` or `fiscal_week`, instead of standard options like `day`, `month`, or `year`. -- Ensure the the `standard_granularity_column` is a date time type. - This feature provides more control over how time-based metrics are calculated. + + +When working with custom calendars in MetricFlow, it's important to ensure: + +- Consistent data types — Both your dimension column and the time spine column should use the same data type to allow accurate comparisons. Functions like `DATE_TRUNC` don't change the data type of the input in some databases (like Snowflake). Using different data types can lead to mismatches and inaccurate results. + + We recommend using `DATETIME` or `TIMESTAMP` data types for your time dimensions and time spine, as they support all granularities. The `DATE` data type may not support smaller granularities like hours or minutes. + +- Time zones — MetricFlow currently doesn't perform any timezone manipulation. When working with timezone-aware data, inconsistent time zones may lead to unexpected results during aggregations and comparisons. + +For example, if your time spine column is `TIMESTAMP` type and your dimension column is `DATE` type, comparisons between these columns might not work as intended. To fix this, convert your `DATE` column to `TIMESTAMP`, or make sure both columns are the same data type. + + + ### Add custom granularities To add custom granularities, the Semantic Layer supports custom calendar configurations that allow users to query data using non-standard time periods like `fiscal_year` or `retail_month`. You can define these custom granularities (all lowercased) by modifying your model's YAML configuration like this: diff --git a/website/docs/docs/cloud-integrations/configure-auto-exposures.md b/website/docs/docs/cloud-integrations/configure-auto-exposures.md index 4574d69c164..6a3d8a56326 100644 --- a/website/docs/docs/cloud-integrations/configure-auto-exposures.md +++ b/website/docs/docs/cloud-integrations/configure-auto-exposures.md @@ -25,7 +25,6 @@ To access the features, you should meet the following: 3. You have set up a [production](/docs/deploy/deploy-environments#set-as-production-environment) deployment environment for each project you want to explore, with at least one successful job run. 4. You have [admin permissions](/docs/cloud/manage-access/enterprise-permissions) in dbt Cloud to edit project settings or production environment settings. 5. Use Tableau as your BI tool and enable metadata permissions or work with an admin to do so. Compatible with Tableau Cloud or Tableau Server with the Metadata API enabled. -6. Run a production job _after_ saving the Tableau collections. ## Set up in Tableau @@ -63,7 +62,6 @@ To set up [personal access tokens (PATs)](/docs/dbt-cloud-apis/user-tokens#using dbt Cloud automatically imports and syncs any workbook within the selected collections. New additions to the collections will be added to the lineage in dbt Cloud during the next automatic sync (usually once per day). 5. Click **Save**. -6. Run a production job _after_ saving the Tableau collections. dbt Cloud imports everything in the collection(s) and you can continue to view them in Explorer. For more information on how to view and use auto-exposures, refer to [View auto-exposures from dbt Explorer](/docs/collaborate/auto-exposures) page. 
@@ -72,5 +70,5 @@ dbt Cloud imports everything in the collection(s) and you can continue to view t ## Refresh auto-exposures in jobs :::info Coming soon -Soon, you’ll also be able to use auto-exposures trigger refresh of the data used in your Tableau dashboards from within dbt Cloud. Stay tuned for more on this soon! +Soon, you’ll also be able to use auto-exposures to trigger the refresh of the data used in your Tableau dashboards from within dbt Cloud. Stay tuned for more information! ::: diff --git a/website/docs/docs/cloud-integrations/overview.md b/website/docs/docs/cloud-integrations/overview.md index d925e3e52a7..8334632a7f8 100644 --- a/website/docs/docs/cloud-integrations/overview.md +++ b/website/docs/docs/cloud-integrations/overview.md @@ -13,7 +13,7 @@ Many data applications integrate with dbt Cloud, enabling you to leverage the po
diff --git a/website/docs/docs/cloud/about-cloud-develop-defer.md b/website/docs/docs/cloud/about-cloud-develop-defer.md index 3ee5ac71666..fc55edf8a38 100644 --- a/website/docs/docs/cloud/about-cloud-develop-defer.md +++ b/website/docs/docs/cloud/about-cloud-develop-defer.md @@ -40,9 +40,6 @@ To enable defer in the dbt Cloud IDE, toggle the **Defer to production** button For example, if you were to start developing on a new branch with [nothing in your development schema](/reference/node-selection/defer#usage), edit a single model, and run `dbt build -s state:modified` — only the edited model would run. Any `{{ ref() }}` functions will point to the production location of the referenced models. - -Note: The **Defer to staging/production** toggle button doesn't apply when running [dbt Semantic Layer commands](/docs/build/metricflow-commands) in the dbt Cloud IDE. To use defer for Semantic layer commands in the IDE, toggle the button on and manually add the `--defer` flag to the command. This is a temporary workaround and will be available soon. - ### Defer in dbt Cloud CLI diff --git a/website/docs/docs/cloud/about-cloud/architecture.md b/website/docs/docs/cloud/about-cloud/architecture.md index 52614f0cbcd..ecf67b83a96 100644 --- a/website/docs/docs/cloud/about-cloud/architecture.md +++ b/website/docs/docs/cloud/about-cloud/architecture.md @@ -48,7 +48,7 @@ The git repo information is stored on dbt Cloud servers to make it accessible du ### Authentication services -The default settings of dbt Cloud enable local users with credentials stored in dbt Cloud. Still, integrations with various authentication services are offered as an alternative, including [single sign-on services](/docs/cloud/manage-access/sso-overview). Access to features can be granted/restricted by role using [RBAC](/docs/cloud/manage-access/enterprise-permissions). +The default settings of dbt Cloud enable local users with credentials stored in dbt Cloud. Still, integrations with various authentication services are offered as an alternative, including [single sign-on services](/docs/cloud/manage-access/sso-overview). Access to features can be granted/restricted by role using [RBAC](/docs/cloud/manage-access/about-user-access#role-based-access-control-). SSO features are essential because they reduce the number of credentials a user must maintain. Users sign in once and the authentication token is shared among integrated services (such as dbt Cloud). The token expires and must be refreshed at predetermined intervals, requiring the user to go through the authentication process again. If the user is disabled in the SSO provider service, their access to dbt Cloud is disabled, and they cannot override this with local auth credentials. diff --git a/website/docs/docs/cloud/dbt-cloud-ide/develop-in-the-cloud.md b/website/docs/docs/cloud/dbt-cloud-ide/develop-in-the-cloud.md index 398b0cff2a1..c9d2cbbad30 100644 --- a/website/docs/docs/cloud/dbt-cloud-ide/develop-in-the-cloud.md +++ b/website/docs/docs/cloud/dbt-cloud-ide/develop-in-the-cloud.md @@ -53,7 +53,7 @@ To understand how to navigate the IDE and its user interface elements, refer to | Feature | Description | |---|---| | [**Keyboard shortcuts**](/docs/cloud/dbt-cloud-ide/keyboard-shortcuts) | You can access a variety of [commands and actions](/docs/cloud/dbt-cloud-ide/keyboard-shortcuts) in the IDE by choosing the appropriate keyboard shortcut. Use the shortcuts for common tasks like building modified models or resuming builds from the last failure. 
| -| **IDE version control** | The IDE version control section and git button allow you to apply the concept of [version control](/docs/collaborate/git/version-control-basics) to your project directly into the IDE.

- Create or change branches, execute git commands using the git button.
- Commit or revert individual files by right-clicking the edited file
- [Resolve merge conflicts](/docs/collaborate/git/merge-conflicts)
- Link to the repo directly by clicking the branch name
- Edit, format, or lint files and execute dbt commands in your primary protected branch, and commit to a new branch.
- Use Git diff view to view what has been changed in a file before you make a pull request.
- From dbt version 1.6 and higher, use the **Prune branches** [button](/docs/cloud/dbt-cloud-ide/ide-user-interface#prune-branches-modal) to delete local branches that have been deleted from the remote repository, keeping your branch management tidy. | +| **IDE version control** | The IDE version control section and git button allow you to apply the concept of [version control](/docs/collaborate/git/version-control-basics) to your project directly into the IDE.

- Create or change branches and execute git commands using the git button.
- Commit or revert individual files by right-clicking the edited file.
- [Resolve merge conflicts](/docs/collaborate/git/merge-conflicts).
- Link to the repo directly by clicking the branch name.
- Edit, format, or lint files and execute dbt commands in your primary protected branch, and commit to a new branch.
- Use Git diff view to view what has been changed in a file before you make a pull request.
- Use the **Prune branches** [button](/docs/cloud/dbt-cloud-ide/ide-user-interface#prune-branches-modal) (dbt v1.6 and higher) to delete local branches that have been deleted from the remote repository, keeping your branch management tidy.
- Sign your [git commits](/docs/cloud/dbt-cloud-ide/git-commit-signing) to mark them as 'Verified'. | | **Preview and Compile button** | You can [compile or preview](/docs/cloud/dbt-cloud-ide/ide-user-interface#console-section) code, a snippet of dbt code, or one of your dbt models after editing and saving. | | [**dbt Copilot**](/docs/cloud/dbt-copilot) | A powerful AI engine that can generate documentation, tests, and semantic models for your dbt SQL models. Available for dbt Cloud Enterprise plans. | | **Build, test, and run button** | Build, test, and run your project with a button click or by using the Cloud IDE command bar. diff --git a/website/docs/docs/cloud/dbt-cloud-ide/git-commit-signing.md b/website/docs/docs/cloud/dbt-cloud-ide/git-commit-signing.md new file mode 100644 index 00000000000..afaa0751669 --- /dev/null +++ b/website/docs/docs/cloud/dbt-cloud-ide/git-commit-signing.md @@ -0,0 +1,80 @@ +--- +title: "Git commit signing" +description: "Learn how to sign your Git commits when using the IDE for development." +sidebar_label: Git commit signing +--- + +# Git commit signing + +To prevent impersonation and enhance security, you can sign your Git commits before pushing them to your repository. Using your signature, a Git provider can cryptographically verify a commit and mark it as "verified", providing increased confidence about its origin. + +You can configure dbt Cloud to sign your Git commits when using the IDE for development. To set up, enable the feature in dbt Cloud, follow the flow to generate a keypair, and upload the public key to your Git provider to use for signature verification. + + +## Prerequisites + +- GitHub or GitLab is your Git provider. Currently, Azure DevOps is not supported. +- You have a dbt Cloud account on the [Enterprise plan](https://www.getdbt.com/pricing/). + +## Generate GPG keypair in dbt Cloud + +To generate a GPG keypair in dbt Cloud, follow these steps: +1. Go to your **Personal profile** page in dbt Cloud. +2. Navigate to **Signed Commits** section. +3. Enable the **Sign commits originating from this user** toggle. +4. This will generate a GPG keypair. The private key will be used to sign all future Git commits. The public key will be displayed, allowing you to upload it to your Git provider. + + + +## Upload public key to Git provider + +To upload the public key to your Git provider, follow the detailed documentation provided by the supported Git provider: + +- [GitHub instructions](https://docs.github.com/en/authentication/managing-commit-signature-verification/adding-a-gpg-key-to-your-github-account) +- [GitLab instructions](https://docs.gitlab.com/ee/user/project/repository/signed_commits/gpg.html) + +Once you have uploaded the public key to your Git provider, your Git commits will be marked as "Verified" after you push the changes to the repository. + + + +## Considerations + +- The GPG keypair is tied to the user, not a specific account. There is a 1:1 relationship between the user and keypair. The same key will be used for signing commits on any accounts the user is a member of. +- The GPG keypair generated in dbt Cloud is linked to the email address associated with your account at the time of keypair creation. This email identifies the author of signed commits. +- For your Git commits to be marked as "verified", your dbt Cloud email address must be a verified email address with your Git provider. The Git provider (such as, GitHub, GitLab) checks that the commit's signed email matches a verified email in your Git provider account. 
If they don’t match, the commit won't be marked as "verified." +- Keep your dbt Cloud email and Git provider's verified email in sync to avoid verification issues. If you change your dbt Cloud email address: + - Generate a new GPG keypair with the updated email, following the [steps mentioned earlier](/docs/cloud/dbt-cloud-ide/git-commit-signing#generate-gpg-keypair-in-dbt-cloud). + - Add and verify the new email in your Git provider. + + + +## FAQs + + + + + +If you delete your GPG keypair in dbt Cloud, your Git commits will no longer be signed. You can generate a new GPG keypair by following the [steps mentioned earlier](/docs/cloud/dbt-cloud-ide/git-commit-signing#generate-gpg-keypair-in-dbt-cloud). + + + + +GitHub and GitLab support commit signing, while Azure DevOps does not. Commit signing is a [git feature](https://git-scm.com/book/ms/v2/Git-Tools-Signing-Your-Work), and is independent of any specific provider. However, not all providers support the upload of public keys, or the display of verification badges on commits. + + + + + +If your Git Provider does not explicitly support the uploading of public GPG keys, then +commits will still be signed using the private key, but no verification information will +be displayed by the provider. + + + + + +If your Git provider is configured to enforce commit verification, then unsigned commits +will be rejected. To avoid this, ensure that you have followed all previous steps to generate +a keypair, and uploaded the public key to the provider. + + diff --git a/website/docs/docs/cloud/dbt-cloud-ide/ide-user-interface.md b/website/docs/docs/cloud/dbt-cloud-ide/ide-user-interface.md index 8d80483485c..4aec3353544 100644 --- a/website/docs/docs/cloud/dbt-cloud-ide/ide-user-interface.md +++ b/website/docs/docs/cloud/dbt-cloud-ide/ide-user-interface.md @@ -193,7 +193,7 @@ Use menus and modals to interact with IDE and access useful options to help your * Toggling between dark or light mode for a better viewing experience * Restarting the IDE - * Fully recloning your repository to refresh your git state and view status details + * Rollback your repo to remote, to refresh your git state and view status details * Viewing status details, including the IDE Status modal. - + diff --git a/website/docs/docs/cloud/manage-access/about-access.md b/website/docs/docs/cloud/manage-access/about-access.md index 6b02d9eb17b..b970b0d5763 100644 --- a/website/docs/docs/cloud/manage-access/about-access.md +++ b/website/docs/docs/cloud/manage-access/about-access.md @@ -89,7 +89,7 @@ There are three license types in dbt Cloud: - **Developer** — User can be granted _any_ permissions. - **Read-Only** — User has read-only permissions applied to all dbt Cloud resources regardless of the role-based permissions that the user is assigned. -- **IT** — User has [Security Admin](/docs/cloud/manage-access/enterprise-permissions#security-admin) and [Billing Admin](/docs/cloud/manage-access/enterprise-permissions#billing-admin) permissions applied, regardless of the group permissions assigned. +- **IT** — User has Security Admin and Billing Admin [permissions](/docs/cloud/manage-access/enterprise-permissions) applied, regardless of the group permissions assigned. Developer licenses will make up a majority of the users in your environment and have the highest impact on billing, so it's important to monitor how many you have at any given time. 
diff --git a/website/docs/docs/cloud/manage-access/cloud-seats-and-users.md b/website/docs/docs/cloud/manage-access/cloud-seats-and-users.md index da19f30ab4c..66d821b90d0 100644 --- a/website/docs/docs/cloud/manage-access/cloud-seats-and-users.md +++ b/website/docs/docs/cloud/manage-access/cloud-seats-and-users.md @@ -55,7 +55,7 @@ If you're on an Enterprise plan and have the correct [permissions](/docs/cloud/m - To add a user, go to **Account Settings** and select **Users**. - Click the [**Invite Users**](/docs/cloud/manage-access/invite-users) button. - - For fine-grained permission configuration, refer to [Role based access control](/docs/cloud/manage-access/enterprise-permissions). + - For fine-grained permission configuration, refer to [Role based access control](/docs/cloud/manage-access/about-user-access#role-based-access-control-). diff --git a/website/docs/docs/cloud/manage-access/enterprise-permissions.md b/website/docs/docs/cloud/manage-access/enterprise-permissions.md index a1f6d795c23..5a56900d529 100644 --- a/website/docs/docs/cloud/manage-access/enterprise-permissions.md +++ b/website/docs/docs/cloud/manage-access/enterprise-permissions.md @@ -22,22 +22,14 @@ The following roles and permission sets are available for assignment in dbt Clou :::tip Licenses or Permission sets -The user's [license](/docs/cloud/manage-access/seats-and-users) type always overrides their assigned permission set. This means that even if a user belongs to a dbt Cloud group with 'Account Admin' permissions, having a 'Read-Only' license would still prevent them from performing administrative actions on the account. +The user's [license](/docs/cloud/manage-access/about-user-access) type always overrides their assigned permission set. This means that even if a user belongs to a dbt Cloud group with 'Account Admin' permissions, having a 'Read-Only' license would still prevent them from performing administrative actions on the account. ::: -## How to set up RBAC Groups in dbt Cloud +## Additional resources -Role-Based Access Control (RBAC) is helpful for automatically assigning permissions to dbt admins based on their SSO provider group associations. RBAC does not apply to [model groups](/docs/collaborate/govern/model-access#groups). +- [Grant users access](/docs/cloud/manage-access/about-user-access#grant-access) +- [Role-based access control](/docs/cloud/manage-access/about-user-access#role-based-access-control-) +- [Environment-level permissions](/docs/cloud/manage-access/environment-permissions) -1. Click the gear icon to the top right and select **Account Settings**. Click **Groups & Licenses** - - - -2. Select an existing group or create a new group to add RBAC. Name the group (this can be any name you like, but it's recommended to keep it consistent with the SSO groups). If you have configured SSO with SAML 2.0, you may have to use the GroupID instead of the name of the group. -3. Configure the SSO provider groups you want to add RBAC by clicking **Add** in the **SSO** section. These fields are case-sensitive and must match the source group formatting. -4. Configure the permissions for users within those groups by clicking **Add** in the **Access** section of the window. - - -5. When you've completed your configurations, click **Save**. Users will begin to populate the group automatically once they have signed in to dbt Cloud with their SSO credentials. 
diff --git a/website/docs/docs/cloud/manage-access/environment-permissions.md b/website/docs/docs/cloud/manage-access/environment-permissions.md index 44cf2dc9a64..b99da64609c 100644 --- a/website/docs/docs/cloud/manage-access/environment-permissions.md +++ b/website/docs/docs/cloud/manage-access/environment-permissions.md @@ -77,4 +77,4 @@ If the user has the same roles across projects, you can apply environment access ## Related docs --[Environment-level permissions setup](/docs/cloud/manage-access/environment-permissions-setup) +- [Environment-level permissions setup](/docs/cloud/manage-access/environment-permissions-setup) diff --git a/website/docs/docs/cloud/manage-access/invite-users.md b/website/docs/docs/cloud/manage-access/invite-users.md index c82e15fd48f..0922b4dc991 100644 --- a/website/docs/docs/cloud/manage-access/invite-users.md +++ b/website/docs/docs/cloud/manage-access/invite-users.md @@ -17,19 +17,16 @@ You must have proper permissions to invite new users: ## Invite new users -1. In your dbt Cloud account, select the gear menu in the upper right corner and then select **Account Settings**. -2. From the left sidebar, select **Users**. - - - -3. Click on **Invite Users**. +1. In your dbt Cloud account, select your account name in the bottom left corner. Then select **Account settings**. +2. Under **Settings**, select **Users**. +3. Click on **Invite users**. -4. In the **Email Addresses** field, enter the email addresses of the users you would like to invite separated by comma, semicolon, or a new line. +4. In the **Email Addresses** field, enter the email addresses of the users you want to invite separated by a comma, semicolon, or a new line. 5. Select the license type for the batch of users from the **License** dropdown. -6. Select the group(s) you would like the invitees to belong to. -7. Click **Send Invitations**. +6. Select the group(s) you want the invitees to belong to. +7. Click **Send invitations**. - If the list of invitees exceeds the number of licenses your account has available, you will receive a warning when you click **Send Invitations** and the invitations will not be sent. diff --git a/website/docs/docs/cloud/manage-access/licenses-and-groups.md b/website/docs/docs/cloud/manage-access/licenses-and-groups.md deleted file mode 100644 index b91af80f9b3..00000000000 --- a/website/docs/docs/cloud/manage-access/licenses-and-groups.md +++ /dev/null @@ -1,145 +0,0 @@ ---- -title: "Licenses and groups" -id: "licenses-and-groups" ---- - -## Overview - -dbt Cloud administrators can use dbt Cloud's permissioning model to control -user-level access in a dbt Cloud account. This access control comes in two flavors: -License-based and Role-based. - -- **License-based Access Controls:** User are configured with account-wide - license types. These licenses control the specific parts of the dbt Cloud application - that a given user can access. -- **Role-based Access Control (RBAC):** Users are assigned to _groups_ that have - specific permissions on specific projects or the entire account. A user may be - a member of multiple groups, and those groups may have permissions on multiple - projects. - -## License-based access control - -Each user on an account is assigned a license type when the user is first -invited to a given account. This license type may change over time, but a -user can only have one type of license at any given time. - -A user's license type controls the features in dbt Cloud that the user is able -to access. 
dbt Cloud's three license types are: - - **Read-Only** - - **Developer** - - **IT** - -For more information on these license types, see [Seats & Users](/docs/cloud/manage-access/seats-and-users). -At a high-level, Developers may be granted _any_ permissions, whereas Read-Only -users will have read-only permissions applied to all dbt Cloud resources -regardless of the role-based permissions that the user is assigned. IT users will have Security Admin and Billing Admin permissions applied regardless of the role-based permissions that the user is assigned. - -## Role-based access control - -:::info dbt Cloud Enterprise - -Role-based access control is a feature of the dbt Cloud Enterprise plan - -::: - -Role-based access control allows for fine-grained permissioning in the dbt Cloud -application. With role-based access control, users can be assigned varying -permissions to different projects within a dbt Cloud account. For teams on the -Enterprise tier, role-based permissions can be generated dynamically from -configurations in an [Identity Provider](sso-overview). - -Role-based permissions are applied to _groups_ and pertain to _projects_. The -assignable permissions themselves are granted via _permission sets_. - - -### Groups - -A group is a collection of users. Users may belong to multiple groups. Members -of a group inherit any permissions applied to the group itself. - -Users can be added to a dbt Cloud group based on their group memberships in the -configured [Identity Provider](sso-overview) for the account. In this way, dbt -Cloud administrators can manage access to dbt Cloud resources via identity -management software like Microsoft Entra ID (formerly Azure AD), Okta, or GSuite. See _SSO Mappings_ below for -more information. - -You can view the groups in your account or create new groups from the **Team > Groups** -page in your Account Settings. - - - - -### SSO Mappings - -SSO Mappings connect Identity Provider (IdP) group membership to dbt Cloud group -membership. When a user logs into dbt Cloud via a supported identity provider, -their IdP group memberships are synced with dbt Cloud. Upon logging in -successfully, the user's group memberships (and therefore, permissions) are -adjusted accordingly within dbt Cloud automatically. - -:::tip Creating SSO Mappings - -While dbt Cloud supports mapping multiple IdP groups to a single dbt Cloud -group, we recommend using a 1:1 mapping to make administration as simple as -possible. Consider using the same name for your dbt Cloud groups and your IdP -groups. - -::: - - -### Permission Sets - -Permission sets are predefined collections of granular permissions. Permission -sets combine low-level permission grants into high-level roles that can be -assigned to groups. Some examples of existing permission sets are: - - Account Admin - - Git Admin - - Job Admin - - Job Viewer - - ...and more - -For a full list of enterprise permission sets, see [Enterprise Permissions](/docs/cloud/manage-access/enterprise-permissions). -These permission sets are available for assignment to groups and control the ability -for users in these groups to take specific actions in the dbt Cloud application. - -In the following example, the _dbt Cloud Owners_ group is configured with the -**Account Admin** permission set on _All Projects_ and the **Job Admin** permission -set on the _Internal Analytics_ project. - - - - -### Manual assignment - -dbt Cloud administrators can manually assign users to groups independently of -IdP attributes. 
If a dbt Cloud group is configured _without_ any -SSO Mappings, then the group will be _unmanaged_ and dbt Cloud will not adjust -group membership automatically when users log into dbt Cloud via an identity -provider. This behavior may be desirable for teams that have connected an identity -provider, but have not yet configured SSO Mappings between dbt Cloud and the -IdP. - -If an SSO Mapping is added to an _unmanaged_ group, then it will become -_managed_, and dbt Cloud may add or remove users to the group automatically at -sign-in time based on the user's IdP-provided group membership information. - - -## FAQs -- **When are IdP group memberships updated for SSO Mapped groups?** Group memberships - are updated every time a user logs into dbt Cloud via a supported SSO provider. If - you've changed group memberships in your identity provider or dbt Cloud, ask your - users to log back into dbt Cloud for these group memberships to be synchronized. - -- **Can I set up SSO without RBAC?** Yes, see the documentation on - [Manual Assignment](#manual-assignment) above for more information on using - SSO without RBAC. - -- **Can I configure a user's License Type based on IdP Attributes?** Yes, see - the docs on [managing license types](/docs/cloud/manage-access/seats-and-users#managing-license-types) - for more information. diff --git a/website/docs/docs/cloud/manage-access/set-up-sso-google-workspace.md b/website/docs/docs/cloud/manage-access/set-up-sso-google-workspace.md index e4ff998015c..2b2575efc57 100644 --- a/website/docs/docs/cloud/manage-access/set-up-sso-google-workspace.md +++ b/website/docs/docs/cloud/manage-access/set-up-sso-google-workspace.md @@ -117,7 +117,7 @@ If the verification information looks appropriate, then you have completed the c ## Setting up RBAC Now you have completed setting up SSO with GSuite, the next steps will be to set up -[RBAC groups](/docs/cloud/manage-access/enterprise-permissions) to complete your access control configuration. +[RBAC groups](/docs/cloud/manage-access/about-user-access#role-based-access-control-) to complete your access control configuration. ## Troubleshooting diff --git a/website/docs/docs/cloud/manage-access/set-up-sso-okta.md b/website/docs/docs/cloud/manage-access/set-up-sso-okta.md index 53986513ce2..fda32f118ef 100644 --- a/website/docs/docs/cloud/manage-access/set-up-sso-okta.md +++ b/website/docs/docs/cloud/manage-access/set-up-sso-okta.md @@ -190,4 +190,4 @@ configured in the steps above. ## Setting up RBAC Now you have completed setting up SSO with Okta, the next steps will be to set up -[RBAC groups](/docs/cloud/manage-access/enterprise-permissions) to complete your access control configuration. +[RBAC groups](/docs/cloud/manage-access/about-user-access#role-based-access-control-) to complete your access control configuration. diff --git a/website/docs/docs/cloud/migration.md b/website/docs/docs/cloud/migration.md index 76d881e7389..2665b8f6a97 100644 --- a/website/docs/docs/cloud/migration.md +++ b/website/docs/docs/cloud/migration.md @@ -15,6 +15,13 @@ Your account will be automatically migrated on or after its scheduled date. Howe ## Recommended actions +:::info Rescheduling your migration + +If you're on the dbt Cloud Enterprise tier, you can postpone your account migration by up to 45 days. To reschedule your migration, navigate to **Account Settings** → **Migration guide**. + +For help, contact the dbt Support Team at [support@getdbt.com](mailto:support@getdbt.com). 
+::: + We highly recommended you take these actions: - Ensure pending user invitations are accepted or note outstanding invitations. Pending user invitations might be voided during the migration. You can resend user invitations after the migration is complete. diff --git a/website/docs/docs/collaborate/data-tile.md b/website/docs/docs/collaborate/data-tile.md index 70318922d68..1d5b26e26b7 100644 --- a/website/docs/docs/collaborate/data-tile.md +++ b/website/docs/docs/collaborate/data-tile.md @@ -92,7 +92,7 @@ Follow these steps to embed the data health tile in PowerBI: ```html Website = - "" + "" ``` @@ -120,12 +120,34 @@ Follow these steps to embed the data health tile in Tableau: 3. Insert a **Web Page** object. 4. Insert the URL and click **Ok**. - `https://metadata.cloud.getdbt.com/exposure-tile?uniqueId=exposure.snowflake_tpcds_sales_spoke.customer360_test&environmentType=production&environmentId=220370&token=` - + ```html + https://metadata.ACCESS_URL/exposure-tile?uniqueId=exposure.EXPOSURE_NAME&environmentType=production&environmentId=220370&token= + ``` + *Note, replace the placeholders with your actual values.* 5. You should now see the data health tile embedded in your Tableau dashboard. + + + +Follow these steps to embed the data health tile in Sigma: + + + +1. Create a dashboard in Sigma and connect to your database to pull in the data. +2. Ensure you've copied the URL or iFrame snippet available in dbt Explorer's **Data health** section, under the **Embed data health into your dashboard** toggle. +3. Add a new embedded UI element in your Sigma Workbook in the following format: + + ```html + https://metadata.ACCESS_URL/exposure-tile?uniqueId=exposure.EXPOSURE_NAME&environmentType=production&environmentId=ENV_ID_NUMBER&token= + ``` + + *Note, replace the placeholders with your actual values.* +4. You should now see the data health tile embedded in your Sigma dashboard. + + + ## Job-based data health diff --git a/website/docs/docs/collaborate/govern/project-dependencies.md b/website/docs/docs/collaborate/govern/project-dependencies.md index 2e73eee028b..c054d1b27b7 100644 --- a/website/docs/docs/collaborate/govern/project-dependencies.md +++ b/website/docs/docs/collaborate/govern/project-dependencies.md @@ -21,6 +21,7 @@ This year, dbt Labs is introducing an expanded notion of `dependencies` across m - Use a supported version of dbt (v1.6, v1.7, or go versionless with "[Versionless](/docs/dbt-versions/upgrade-dbt-version-in-cloud#versionless)") for both the upstream ("producer") project and the downstream ("consumer") project. - Define models in an upstream ("producer") project that are configured with [`access: public`](/reference/resource-configs/access). You need at least one successful job run after defining their `access`. - Define a deployment environment in the upstream ("producer") project [that is set to be your Production environment](/docs/deploy/deploy-environments#set-as-production-environment), and ensure it has at least one successful job run in that environment. +- If the upstream project has a Staging environment, run a job in that Staging environment to ensure the downstream cross-project ref resolves. - Each project `name` must be unique in your dbt Cloud account. For example, if you have a dbt project (codebase) for the `jaffle_marketing` team, you should not create separate projects for `Jaffle Marketing - Dev` and `Jaffle Marketing - Prod`. That isolation should instead be handled at the environment level. 
- We are adding support for environment-level permissions and data warehouse connections; please contact your dbt Labs account team for beta access. - The `dbt_project.yml` file is case-sensitive, which means the project name must exactly match the name in your `dependencies.yml`. For example, if your project name is `jaffle_marketing`, you should use `jaffle_marketing` (not `JAFFLE_MARKETING`) in all related files. @@ -110,7 +111,10 @@ Read [Why use a staging environment](/docs/deploy/deploy-environments#why-use-a- #### Staging with downstream dependencies -dbt Cloud begins using the Staging environment to resolve cross-project references from downstream projects as soon as it exists in a project without "fail-over" to Production. To avoid causing downtime for downstream developers, you should define and trigger a job before marking the environment as Staging: +dbt Cloud begins using the Staging environment to resolve cross-project references from downstream projects as soon as it exists in a project without "fail-over" to Production. This means that dbt Cloud will consistently use metadata from the Staging environment to resolve references in downstream projects, even if there haven't been any successful runs in the configured Staging environment. + +To avoid causing downtime for downstream developers, you should define and trigger a job before marking the environment as Staging: + 1. Create a new environment, but do NOT mark it as **Staging**. 2. Define a job in that environment. 3. Trigger the job to run, and ensure it completes successfully. diff --git a/website/docs/docs/collaborate/model-query-history.md b/website/docs/docs/collaborate/model-query-history.md index 0f43c9b163f..0180757f980 100644 --- a/website/docs/docs/collaborate/model-query-history.md +++ b/website/docs/docs/collaborate/model-query-history.md @@ -25,7 +25,7 @@ So for example, if `model_super_santi` was queried 10 times in the past week, it To access the features, you should meet the following: -1. You have a dbt Cloud account on the [Enterprise plan](https://www.getdbt.com/pricing/). +1. You have a dbt Cloud account on the [Enterprise plan](https://www.getdbt.com/pricing/). Single-tenant accounts should contact their account representative for setup. 2. You have set up a [production](https://docs.getdbt.com/docs/deploy/deploy-environments#set-as-production-environment) deployment environment for each project you want to explore, with at least one successful job run. 3. You have [admin permissions](/docs/cloud/manage-access/enterprise-permissions) in dbt Cloud to edit project settings or production environment settings. 4. Use Snowflake or BigQuery as your data warehouse and can enable query history permissions or work with an admin to do so. Support for additional data platforms coming soon. diff --git a/website/docs/docs/dbt-cloud-apis/service-tokens.md b/website/docs/docs/dbt-cloud-apis/service-tokens.md index fe8ace5fa34..4c0f6222fe9 100644 --- a/website/docs/docs/dbt-cloud-apis/service-tokens.md +++ b/website/docs/docs/dbt-cloud-apis/service-tokens.md @@ -36,80 +36,37 @@ You can assign service account tokens to any permission set available in dbt Clo ### Team plans using service account tokens -The following permissions can be assigned to a service account token on a Team plan. +The following permissions can be assigned to a service account token on a Team plan. Refer to [Enterprise permissions](/docs/cloud/manage-access/enterprise-permissions) for more information about these roles. -**Account Admin**
-Account Admin service tokens have full `read + write` access to an account, so please use them with caution. A Team plan refers to this permission set as an "Owner role." For more on these permissions, see [Account Admin](/docs/cloud/manage-access/enterprise-permissions#account-admin). - -**Metadata Only**
-Metadata-only service tokens authorize requests to the Discovery API. - -**Semantic Layer Only**
-Semantic Layer-only service tokens authorize requests to the Semantic Layer APIs. - -**Job Admin**
-Job admin service tokens can authorize requests for viewing, editing, and creating environments, triggering runs, and viewing historical runs. - -**Job Runner**
-Job runner service tokens can authorize requests for triggering runs and viewing historical runs. - -**Member**
-Member service tokens can authorize requests for viewing and editing resources, triggering runs, and inviting members to the account. Tokens assigned the Member permission set will have the same permissions as a Member user. For more information about Member users, see "[Self-service Team plan permissions](/docs/cloud/manage-access/self-service-permissions)". - -**Read-only**
-Read-only service tokens can authorize requests for viewing a read-only dashboard, viewing generated documentation, and viewing source freshness reports. This token can access and retrieve account-level information endpoints on the [Admin API](/docs/dbt-cloud-apis/admin-cloud-api) and authorize requests to the [Discovery API](/docs/dbt-cloud-apis/discovery-api). +- Account Admin — Account Admin service tokens have full `read + write` access to an account, so please use them with caution. A Team plan refers to this permission set as an "Owner role." +- Billing Admin +- Job Admin +- Metadata Only +- Member +- Read-only +- Semantic Layer Only ### Enterprise plans using service account tokens -The following permissions can be assigned to a service account token on an Enterprise plan. For more details about these permissions, see "[Enterprise permissions](/docs/cloud/manage-access/enterprise-permissions)." - -**Account Admin**
-Account Admin service tokens have full `read + write` access to an account, so please use them with caution. For more on these permissions, see [Account Admin](/docs/cloud/manage-access/enterprise-permissions#account-admin). - -**Security Admin**
-Security Admin service tokens have certain account-level permissions. For more on these permissions, see [Security Admin](/docs/cloud/manage-access/enterprise-permissions#security-admin). - -**Billing Admin**
-Billing Admin service tokens have certain account-level permissions. For more on these permissions, see [Billing Admin](/docs/cloud/manage-access/enterprise-permissions#billing-admin). - -**Manage marketplace apps**
-Used only for service tokens assigned to marketplace apps (for example, the [Snowflake Native app](/docs/cloud-integrations/snowflake-native-app)). - -**Metadata Only**
-Metadata-only service tokens authorize requests to the Discovery API. - -**Semantic Layer Only**
-Semantic Layer-only service tokens authorize requests to the Semantic Layer APIs. - -**Job Admin**
-Job Admin service tokens can authorize requests for viewing, editing, and creating environments, triggering runs, and viewing historical runs. For more on these permissions, see [Job Admin](/docs/cloud/manage-access/enterprise-permissions#job-admin). - -**Account Viewer**
-Account Viewer service tokens have read-only access to dbt Cloud accounts. For more on these permissions, see [Account Viewer](/docs/cloud/manage-access/enterprise-permissions#account-viewer) on the Enterprise Permissions page. - -**Admin**
-Admin service tokens have unrestricted access to projects in dbt Cloud accounts. You have the option to grant that permission all projects in the account or grant the permission only on specific projects. For more on these permissions, see [Admin Service](/docs/cloud/manage-access/enterprise-permissions#admin-service) on the Enterprise Permissions page. - -**Git Admin**
-Git admin service tokens have all the permissions listed in [Git admin](/docs/cloud/manage-access/enterprise-permissions#git-admin) on the Enterprise Permissions page. - -**Database Admin**
-Database admin service tokens have all the permissions listed in [Database admin](/docs/cloud/manage-access/enterprise-permissions#database-admin) on the Enterprise Permissions page. - -**Team Admin**
-Team admin service tokens have all the permissions listed in [Team admin](/docs/cloud/manage-access/enterprise-permissions#team-admin) on the Enterprise Permissions page. - -**Job Viewer**
-Job viewer admin service tokens have all the permissions listed in [Job viewer](/docs/cloud/manage-access/enterprise-permissions#job-viewer) on the Enterprise Permissions page. - -**Developer**
-Developer service tokens have all the permissions listed in [Developer](/docs/cloud/manage-access/enterprise-permissions#developer) on the Enterprise Permissions page. - -**Analyst**
-Analyst admin service tokens have all the permissions listed in [Analyst](/docs/cloud/manage-access/enterprise-permissions#analyst) on the Enterprise Permissions page. - -**Stakeholder**
-Stakeholder service tokens have all the permissions listed in [Stakeholder](/docs/cloud/manage-access/enterprise-permissions#stakeholder) on the Enterprise Permissions page. +Refer to [Enterprise permissions](/docs/cloud/manage-access/enterprise-permissions) for more information about these roles. + +- Account Admin — Account Admin service tokens have full `read + write` access to an account, so please use them with caution. +- Account Viewer +- Admin +- Analyst +- Billing Admin +- Database Admin +- Developer +- Git Admin +- Job Admin +- Job Runner +- Job Viewer +- Manage marketplace apps +- Metadata Only +- Semantic Layer Only +- Security Admin +- Stakeholder +- Team Admin ## Service token update diff --git a/website/docs/docs/dbt-versions/release-notes.md b/website/docs/docs/dbt-versions/release-notes.md index 662fd0f381a..982d72e8fc0 100644 --- a/website/docs/docs/dbt-versions/release-notes.md +++ b/website/docs/docs/dbt-versions/release-notes.md @@ -20,6 +20,9 @@ Release notes are grouped by month for both multi-tenant and virtual private clo ## October 2024 +- **New**: The dbt Cloud IDE supports signed commits for Git, available for Enterprise plans. You can sign your Git commits when pushing them to the repository to prevent impersonation and enhance security. Supported Git providers are GitHub and GitLab. Refer to [Git commit signing](/docs/cloud/dbt-cloud-ide/git-commit-signing.md) for more information. +- **New:** With dbt Mesh, you can now enable bidirectional dependencies across your projects. Previously, dbt enforced dependencies to only go in one direction. dbt checks for cycles across projects and raises errors if any are detected. For details, refer to [Cycle detection](/docs/collaborate/govern/project-dependencies#cycle-detection). There's also the [Intro to dbt Mesh](/best-practices/how-we-mesh/mesh-1-intro) guide to help you learn more best practices. + Documentation for new features and functionality announced at Coalesce 2024: diff --git a/website/docs/docs/deploy/job-notifications.md b/website/docs/docs/deploy/job-notifications.md index d0df99b4a1d..62c51461ab2 100644 --- a/website/docs/docs/deploy/job-notifications.md +++ b/website/docs/docs/deploy/job-notifications.md @@ -23,7 +23,7 @@ You can receive email alerts about jobs by configuring the dbt Cloud email notif ### Configure email notifications -1. From the gear menu, choose **Notification settings**. +1. Select your profile icon and then click **Notification settings**. 1. By default, dbt Cloud sends notifications to the email address that's in your **User profile** page. If you're an account admin, you can choose a different email address to receive notifications. Select the **Notification email** dropdown and choose another address from the list. The list includes **Internal Users** with access to the account and **External Emails** that have been added. @@ -44,12 +44,12 @@ You can receive email alerts about jobs by configuring the dbt Cloud email notif ### Unsubscribe from email notifications -1. From the gear menu, choose **Notification settings**. +1. Select your profile icon and click on **Notification settings**. 1. On the **Email notifications** page, click **Unsubscribe from all email notifications**. ## Slack notifications -You can receive Slack alerts about jobs by setting up the Slack integration, then configuring the dbt Cloud Slack notification settings. dbt Cloud integrates with Slack via OAuth to ensure secure authentication. 
+You can receive Slack alerts about jobs by setting up the Slack integration and then configuring the dbt Cloud Slack notification settings. dbt Cloud integrates with Slack via OAuth to ensure secure authentication. :::note If there has been a change in user roles or Slack permissions where you no longer have access to edit a configured Slack channel, please [contact support](mailto:support@getdbt.com) for assistance. @@ -62,7 +62,7 @@ If there has been a change in user roles or Slack permissions where you no longe ### Set up the Slack integration -1. From the gear menu, select **Account settings** and then select **Integrations** from the left sidebar. +1. Select **Account settings** and then select **Integrations** from the left sidebar. 1. Locate the **OAuth** section with the Slack application and click **Link**. @@ -76,13 +76,13 @@ If you're logged out or the Slack app/website is closed, you must authenticate b 1. Complete the field defining the Slack workspace you want to integrate with dbt Cloud. -2. Sign in with an existing identity or use email address and password. +2. Sign in with an existing identity or use the email address and password. 3. Once you have authenticated successfully, accept the permissions. ### Configure Slack notifications -1. From the gear menu, choose **Notification settings**. +1. Select your profile icon and then click on **Notification settings**. 1. Select **Slack notifications** in the left sidebar. 1. Select the **Notification channel** you want to receive the job run notifications from the dropdown. @@ -98,5 +98,5 @@ If you're logged out or the Slack app/website is closed, you must authenticate b ### Disable the Slack integration -1. From the gear menu, select **Account settings**. On the **Integrations** page, scroll to the **OAuth** section. +1. Select **Account settings** and on the **Integrations** page, scroll to the **OAuth** section. 1. Click the trash can icon (on the far right of the Slack integration) and click **Unlink**. Channels that you configured will no longer receive Slack notifications. _This is not an account-wide action._ Channels configured by other account admins will continue to receive Slack notifications if they still have active Slack integrations. To migrate ownership of a Slack channel notification configuration, have another account admin edit their configuration. diff --git a/website/docs/docs/deploy/webhooks.md b/website/docs/docs/deploy/webhooks.md index dfb8b7c7406..ffea38b5b84 100644 --- a/website/docs/docs/deploy/webhooks.md +++ b/website/docs/docs/deploy/webhooks.md @@ -29,7 +29,7 @@ You can also check out the free [dbt Fundamentals course](https://learn.getdbt.c ## Prerequisites - You have a dbt Cloud account that is on the [Team or Enterprise plan](https://www.getdbt.com/pricing/). - For `write` access to webhooks: - - **Enterprise plan accounts** — Permission sets are the same for both API service tokens and the dbt Cloud UI. You, or the API service token, must have the [Account Admin](/docs/cloud/manage-access/enterprise-permissions#account-admin), [Admin](/docs/cloud/manage-access/enterprise-permissions#admin), or [Developer](/docs/cloud/manage-access/enterprise-permissions#developer) permission set. + - **Enterprise plan accounts** — Permission sets are the same for both API service tokens and the dbt Cloud UI. You, or the API service token, must have the Account Admin, Admin, or Developer [permission set](/docs/cloud/manage-access/enterprise-permissions). 
- **Team plan accounts** — For the dbt Cloud UI, you need to have a [Developer license](/docs/cloud/manage-access/self-service-permissions). For API service tokens, you must assign the service token to have the [Account Admin or Member](/docs/dbt-cloud-apis/service-tokens#team-plans-using-service-account-tokens) permission set. - You have a multi-tenant or an AWS single-tenant deployment model in dbt Cloud. For more information, refer to [Tenancy](/docs/cloud/about-cloud/tenancy). - Your destination system supports [Authorization headers](#troubleshooting). diff --git a/website/docs/faqs/Git/gitignore.md b/website/docs/faqs/Git/gitignore.md index 16575861289..4386a27d4f2 100644 --- a/website/docs/faqs/Git/gitignore.md +++ b/website/docs/faqs/Git/gitignore.md @@ -80,9 +80,9 @@ dbt_modules/ * `target`, `dbt_modules`, `dbt_packages`, `logs` 7. Commit (save) the deletions to the main branch. 8. Switch to the dbt Cloud IDE, and open the project that you're fixing. -9. Reclone your repo in the IDE by clicking on the three dots next to the **IDE Status** button on the lower right corner of the IDE screen, then select **Reclone Repo**. - * **Note** — Any saved but uncommitted changes will be lost, so make sure you copy any modified code that you want to keep in a temporary location outside of dbt Cloud. -10. Once you reclone the repo, open the `.gitignore` file in the branch you're working in. If the new changes aren't included, you'll need to merge the latest commits from the main branch into your working branch. +9. [Rollback your repo to remote](/docs/collaborate/git/version-control-basics#the-git-button-in-the-cloud-ide) in the IDE by clicking on the three dots next to the **IDE Status** button on the lower right corner of the IDE screen, then select **Rollback to remote**. + * **Note** — Rollback to remote resets your repo back to an earlier clone from your remote. Any saved but uncommitted changes will be lost, so make sure you copy any modified code that you want to keep in a temporary location outside of dbt Cloud. +10. Once you rollback to remote, open the `.gitignore` file in the branch you're working in. If the new changes aren't included, you'll need to merge the latest commits from the main branch into your working branch. 11. Go to the **File Explorer** to verify the `.gitignore` file contains the correct entries and make sure the untracked files/folders in the .gitignore file are in *italics*. 12. Great job 🎉! You've configured the `.gitignore` correctly and can continue with your development! @@ -111,9 +111,9 @@ dbt_modules/ 8. Open a merge request using the git provider web interface. The merge request should attempt to merge the changes into the 'main' branch that all development branches are created from. 9. Follow the necessary procedures to get the branch approved and merged into the 'main' branch. You can delete the branch after the merge is complete. 10. Once the merge is complete, go back to the dbt Cloud IDE, and open the project that you're fixing. -11. Reclone your repo in the IDE by clicking on the three dots next to the **IDE Status** button on the lower right corner of the IDE screen, then select **Reclone Repo**. - * **Note** — Any saved but uncommitted changes will be lost, so make sure you copy any modified code that you want to keep in a temporary location outside of dbt Cloud. -12. Once you reclone the repo, open the `.gitignore` file in the branch you're working in. 
If the new changes aren't included, you'll need to merge the latest commits from the main branch into your working branch. +11. [Rollback your repo to remote](/docs/collaborate/git/version-control-basics#the-git-button-in-the-cloud-ide) in the IDE by clicking on the three dots next to the **IDE Status** button on the lower right corner of the IDE screen, then select **Rollback to remote**. + * **Note** — Rollback to remote resets your repo back to an earlier clone from your remote. Any saved but uncommitted changes will be lost, so make sure you copy any modified code that you want to keep in a temporary location outside of dbt Cloud. +12. Once you rollback to remote, open the `.gitignore` file in the branch you're working in. If the new changes aren't included, you'll need to merge the latest commits from the main branch into your working branch. 13. Go to the **File Explorer** to verify the `.gitignore` file contains the correct entries and make sure the untracked files/folders in the .gitignore file are in *italics*. 14. Great job 🎉! You've configured the `.gitignore` correctly and can continue with your development! diff --git a/website/docs/guides/mesh-qs.md b/website/docs/guides/mesh-qs.md index 0d13d043059..47ece7b29ec 100644 --- a/website/docs/guides/mesh-qs.md +++ b/website/docs/guides/mesh-qs.md @@ -300,6 +300,8 @@ To run your first deployment dbt Cloud job, you will need to create a new dbt Cl 5. After the run is complete, click **Explore** from the upper menu bar. You should now see your lineage, tests, and documentation coming through successfully. +For details on how dbt Cloud uses metadata from the Staging environment to resolve references in downstream projects, check out the section on [Staging with downstream dependencies](/docs/collaborate/govern/project-dependencies#staging-with-downstream-dependencies). + ## Reference a public model in your downstream project In this section, you will set up the downstream project, "Jaffle | Finance", and [cross-project reference](/docs/collaborate/govern/project-dependencies) the `fct_orders` model from the foundational project. Navigate to the **Develop** page to set up our project: diff --git a/website/docs/reference/resource-configs/databricks-configs.md b/website/docs/reference/resource-configs/databricks-configs.md index eea03f2b196..9e1b3282801 100644 --- a/website/docs/reference/resource-configs/databricks-configs.md +++ b/website/docs/reference/resource-configs/databricks-configs.md @@ -35,6 +35,7 @@ When materializing a model as `table`, you may include several optional configs | clustered_by | Each partition in the created table will be split into a fixed number of buckets by the specified columns. | Optional | SQL, Python | `country_code` | | buckets | The number of buckets to create while clustering | Required if `clustered_by` is specified | SQL, Python | `8` | | tblproperties | [Tblproperties](https://docs.databricks.com/en/sql/language-manual/sql-ref-syntax-ddl-tblproperties.html) to be set on the created table | Optional | SQL, Python* | `{'this.is.my.key': 12}` | +| compression | Set the compression algorithm. | Optional | SQL, Python | `zstd` | \* Beginning in 1.7.12, we have added tblproperties to Python models via an alter statement that runs after table creation. We do not yet have a PySpark API to set tblproperties at table creation, so this feature is primarily to allow users to anotate their python-derived tables with tblproperties. 
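As a rough sketch of how the new `compression` config documented in the table above might look in a model file (the model name and upstream `ref` here are hypothetical, and exact behavior may vary by dbt-databricks version):

```sql
-- models/orders_compressed.sql (hypothetical model name, shown for illustration)
{{
  config(
    materialized = 'table',
    compression = 'zstd'   -- optional: sets the compression algorithm, per the table above
  )
}}

select *
from {{ ref('stg_orders') }}   -- hypothetical upstream model
```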
@@ -54,7 +55,8 @@ We do not yet have a PySpark API to set tblproperties at table creation, so this | clustered_by | Each partition in the created table will be split into a fixed number of buckets by the specified columns. | Optional | SQL, Python | `country_code` | | buckets | The number of buckets to create while clustering | Required if `clustered_by` is specified | SQL, Python | `8` | | tblproperties | [Tblproperties](https://docs.databricks.com/en/sql/language-manual/sql-ref-syntax-ddl-tblproperties.html) to be set on the created table | Optional | SQL, Python* | `{'this.is.my.key': 12}` | -| databricks_tags | [Tags](https://docs.databricks.com/en/data-governance/unity-catalog/tags.html) to be set on the created table | Optional | SQL+, Python+ | `{'my_tag': 'my_value'}` | +| databricks_tags | [Tags](https://docs.databricks.com/en/data-governance/unity-catalog/tags.html) to be set on the created table | Optional | SQL+, Python+ | `{'my_tag': 'my_value'}` | +| compression | Set the compression algorithm. | Optional | SQL, Python | `zstd` | \* Beginning in 1.7.12, we have added tblproperties to Python models via an alter statement that runs after table creation. We do not yet have a PySpark API to set tblproperties at table creation, so this feature is primarily to allow users to anotate their python-derived tables with tblproperties. diff --git a/website/docs/reference/resource-configs/snowflake-configs.md b/website/docs/reference/resource-configs/snowflake-configs.md index faf2d0c5e16..abb516d2258 100644 --- a/website/docs/reference/resource-configs/snowflake-configs.md +++ b/website/docs/reference/resource-configs/snowflake-configs.md @@ -299,7 +299,7 @@ Snowflake allows two configuration scenarios for scheduling automatic refreshes: - **Time-based** — Provide a value of the form ` { seconds | minutes | hours | days }`. For example, if the dynamic table needs to be updated every 30 minutes, use `target_lag='30 minutes'`. - **Downstream** — Applicable when the dynamic table is referenced by other dynamic tables. In this scenario, `target_lag='downstream'` allows for refreshes to be controlled at the target, instead of at each layer. -Learn more about `target_lag` in Snowflake's [docs](https://docs.snowflake.com/en/user-guide/dynamic-tables-refresh#understanding-target-lag). +Learn more about `target_lag` in Snowflake's [docs](https://docs.snowflake.com/en/user-guide/dynamic-tables-refresh#understanding-target-lag). Note that Snowflake supports a target lag of 1 minute or longer. diff --git a/website/docs/reference/resource-properties/description.md b/website/docs/reference/resource-properties/description.md index 6f32f75efa4..cf7b2b29a5a 100644 --- a/website/docs/reference/resource-properties/description.md +++ b/website/docs/reference/resource-properties/description.md @@ -13,7 +13,7 @@ description: "This guide explains how to use the description key to add YAML des { label: 'Snapshots', value: 'snapshots', }, { label: 'Analyses', value: 'analyses', }, { label: 'Macros', value: 'macros', }, - { label: 'Singular data tests', value: 'singular_data_tests', }, + { label: 'Data tests', value: 'data_tests', }, ] }> @@ -146,17 +146,17 @@ macros: - + - + ```yml version: 2 data_tests: - - name: singular_data_test_name + - name: data_test_name description: markdown_string ``` @@ -167,13 +167,12 @@ data_tests: - -The `description` property is available for singular data tests beginning in dbt v1.9.
+The `description` property is available for generic and singular data tests beginning in dbt v1.9. - ## Definition diff --git a/website/sidebars.js b/website/sidebars.js index 2178e4cd35a..c8a711815a8 100644 --- a/website/sidebars.js +++ b/website/sidebars.js @@ -288,6 +288,7 @@ const sidebarSettings = { "docs/cloud/dbt-cloud-ide/keyboard-shortcuts", "docs/cloud/dbt-cloud-ide/ide-user-interface", "docs/cloud/dbt-cloud-ide/lint-format", + "docs/cloud/dbt-cloud-ide/git-commit-signing", { type: "category", label: "dbt Copilot", diff --git a/website/snippets/_enterprise-permissions-table.md b/website/snippets/_enterprise-permissions-table.md index d972fc95342..688e8911bf4 100644 --- a/website/snippets/_enterprise-permissions-table.md +++ b/website/snippets/_enterprise-permissions-table.md @@ -43,7 +43,6 @@ Account roles enable you to manage the dbt Cloud account and manage the account | Jobs | W | | W | | R | | Metadata GraphQL API access | R | | R | | R | | Permissions | W | | W | W | R | -| Profile | W | | W | | R | | Projects | W | | W | R | R | | Repositories | W | | W | | R | | Runs | W | | W | | R | @@ -75,15 +74,14 @@ The project roles enable you to work within the projects in various capacities. |Project-level permission | Admin | Analyst | Database admin | Developer | Git Admin | Job admin | Job runner | Job viewer | Metadata

(Discovery API only) | Semantic Layer | Stakeholder | Team admin | Webhook | |--------------------------|:-----:|:-------:|:--------------:|:---------:|:---------:|:---------:|:-----------:|:-----------:|:--------:|:--------------:|:-----------:|:----------:|:-------:| -| Environment credentials (deployment) | W | W | W | W | R | W | | | | | R | R | | +| Environment credentials (deployment) | W | W | W | W | R | W | | | | | R | R | | | Custom env. variables | W | W | W | W | W | W | | R | | | R | W | | -| Data platform configurations | W | W | W | W | R | W | | | | | R | R | | +| Data platform configurations| W | W | W | W | R | W | | | | | R | R | | | Develop
(IDE or dbt Cloud CLI) | W | W | | W | | | | | | | | | | | Environments | W | R | R | R | R | W | | R | | | R | R | | | Jobs | W | R | R | R | R | W | R | R | | | R | R | | | Metadata GraphQL API access | R | R | R | R | R | R | | R | R | | R | R | | -| Permissions (Groups & Licenses) | W | | R | R | R | | | | | | | R | | -| Profile (Credentials) | W | R | | R | R | R | | | | | R | | | +| Permissions (Groups & Licenses) | W | | R | R | R | | | | | | | R | | | | | R | | | | Projects | W | W | W | W | W | R | | R | | | R | W | | | Repositories | W | | R | R | W | | | | | | R | R | | | Runs | W | R | R | R | R | W | W | R | | | R | R | | diff --git a/website/snippets/_mesh-cycle-detection.md b/website/snippets/_mesh-cycle-detection.md index 2a48d0a15bd..2b4b17385ba 100644 --- a/website/snippets/_mesh-cycle-detection.md +++ b/website/snippets/_mesh-cycle-detection.md @@ -1,8 +1,5 @@ -Currently, the default behavior for "project" dependencies enforces that these relationships only go in one direction, meaning that the `jaffle_finance` project could not add a new model that depends, on any public models produced by the `jaffle_marketing` project. dbt will check for cycles across projects and raise errors if any are detected. +You can enable bidirectional dependencies across projects so these relationships can go in either direction, meaning that the `jaffle_finance` project can add a new model that depends on any public models produced by the `jaffle_marketing` project, so long as the new dependency doesn't introduce any node-level cycles. dbt checks for cycles across projects and raises errors if any are detected. -However, many teams may want to be able to share data assets back and forth between teams. _We've added support for enabling bidirectional dependencies across projects, currently in beta_. - -To enable this in your account, set the environment variable `DBT_CLOUD_PROJECT_CYCLES_ALLOWED` to `TRUE` in all your dbt Cloud environments. This allows you to create bidirectional dependencies between projects, so long as the new dependency does not introduce any node-level cycles. When setting up projects that depend on each other, it's important to do so in a stepwise fashion. Each project must run and produce public models before the original producer project can take a dependency on the original consumer project. For example, the order of operations would be as follows for a simple two-project setup: @@ -10,5 +7,3 @@ When setting up projects that depend on each other, it's important to do so in a 2. The `project_b` project adds `project_a` as a dependency. 3. The `project_b` project runs in a deployment environment and produces public models. 4. The `project_a` project adds `project_b` as a dependency. - -If you enable this feature and experience any issues, please reach out to [dbt Cloud support](mailto:support@getdbt.com). diff --git a/website/snippets/_v2-sl-prerequisites.md b/website/snippets/_v2-sl-prerequisites.md index f8108849f4f..68f6e7d10b7 100644 --- a/website/snippets/_v2-sl-prerequisites.md +++ b/website/snippets/_v2-sl-prerequisites.md @@ -1,4 +1,4 @@ -- Have a dbt Cloud Team or Enterprise account. Single-tenant accounts should contact their account representative setup. +- Have a dbt Cloud Team or Enterprise account. Single-tenant accounts should contact their account representative for setup. - Ensure your production and development environments use [dbt version 1.6 or higher](/docs/dbt-versions/upgrade-dbt-version-in-cloud). 
- Use Snowflake, BigQuery, Databricks, or Redshift. - Create a successful run in the environment where you configure the Semantic Layer. diff --git a/website/src/components/hero/index.js b/website/src/components/hero/index.js index e4bef8e234b..073fb518998 100644 --- a/website/src/components/hero/index.js +++ b/website/src/components/hero/index.js @@ -1,7 +1,7 @@ import React from 'react'; import styles from './styles.module.css'; -function Hero({ heading, subheading, showGraphic = false, customStyles = {}, classNames = '', colClassNames = '' }) { +function Hero({ heading, subheading, showGraphic = false, customStyles = {}, classNames = '', colClassNames = '', callToActionsTitle, callToActions = [] }) { return (
{showGraphic && ( @@ -12,6 +12,22 @@ function Hero({ heading, subheading, showGraphic = false, customStyles = {}, cla

{heading}

{subheading}

+ {callToActionsTitle && {callToActionsTitle}} + {callToActions.length > 0 && ( +
+ {callToActions.map((c, index) => ( + c.href ? ( + + {c.title} + + ) : ( + + {c.title} + + ) + ))} +
+ )}
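The stylesheet changes below target the same class hooks, so here's a minimal sketch of the rendering pattern that the new `callToActionsTitle` and `callToActions` props enable. The standalone component wrapper, element choices, and `rel`/`target` handling are illustrative assumptions rather than the exact markup in `hero/index.js`:

```jsx
// Illustrative sketch only -- not the actual Hero implementation.
// Assumes each call to action is shaped like { title, href?, onClick?, newTab? }.
import React from 'react';
import styles from './styles.module.css';

function HeroCallToActions({ callToActionsTitle, callToActions = [] }) {
  return (
    <>
      {callToActionsTitle && (
        <span className={styles.callToActionsTitle}>{callToActionsTitle}</span>
      )}
      {callToActions.length > 0 && (
        <div className={styles.callToActions}>
          {callToActions.map((c, index) =>
            c.href ? (
              // Link-style CTA: navigates to c.href, optionally in a new tab
              <a
                key={index}
                href={c.href}
                className={styles.callToAction}
                target={c.newTab ? '_blank' : '_self'}
                rel="noopener noreferrer"
              >
                {c.title}
              </a>
            ) : (
              // Click-handler CTA: for example, applying a tag filter via c.onClick
              <button
                key={index}
                type="button"
                className={styles.callToAction}
                onClick={c.onClick}
              >
                {c.title}
              </button>
            )
          )}
        </div>
      )}
    </>
  );
}

export default HeroCallToActions;
```

Each CTA object carries a `title` plus either an `href` or an `onClick` handler, which lines up with how `quickstartGuideList/index.js` builds them from `heroCTAs` further down.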
diff --git a/website/src/components/hero/styles.module.css b/website/src/components/hero/styles.module.css index f596b53762a..67d3c8c5d68 100644 --- a/website/src/components/hero/styles.module.css +++ b/website/src/components/hero/styles.module.css @@ -49,3 +49,34 @@ width: 60%; } } + +.callToActionsTitle { + font-weight: bold; + margin-top: 20px; + margin-bottom: 20px; + font-size: 1.25rem; + display: block; +} + +.callToActions { + display: flex; + flex-flow: wrap; + gap: 0.8rem; + justify-content: center; +} + +.callToAction { + outline: #fff solid 1px; + border-radius: 4px; + padding: 0 12px; + color: #fff; + transition: all .2s; + cursor: pointer; +} + +.callToAction:hover, .callToAction:active, .callToAction:focus { + text-decoration: none; + outline: rgb(4, 115, 119) solid 1px; + background-color: rgb(4, 115, 119); + color: #fff; +} diff --git a/website/src/components/quickstartGuideList/index.js b/website/src/components/quickstartGuideList/index.js index 0f4b5764340..2b87ae3e4a1 100644 --- a/website/src/components/quickstartGuideList/index.js +++ b/website/src/components/quickstartGuideList/index.js @@ -61,7 +61,7 @@ function QuickstartList({ quickstartData }) { // Update the URL with the new search parameters history.replace({ search: params.toString() }); -}; + }; // Handle all filters const handleDataFilter = () => { @@ -98,6 +98,30 @@ function QuickstartList({ quickstartData }) { handleDataFilter(); }, [selectedTags, selectedLevel, searchInput]); // Added searchInput to dependency array + // Set the featured guides that will show as CTAs in the hero section + // The value of the tag must match a tag in the frontmatter of the guides in order for the filter to apply after clicking + const heroCTAs = [ + { + title: 'Quickstart guides', + value: 'Quickstart' + }, + { + title: 'Use Jinja to improve your SQL code', + value: 'Jinja' + }, + { + title: 'Orchestration', + value: 'Orchestration' + }, + ]; + + // Function to handle CTA clicks + const handleCallToActionClick = (value) => { + const params = new URLSearchParams(location.search); + params.set('tags', value); + history.replace({ search: params.toString() }); + }; + return ( @@ -111,6 +135,13 @@ function QuickstartList({ quickstartData }) { showGraphic={false} customStyles={{ marginBottom: 0 }} classNames={styles.quickstartHero} + callToActions={heroCTAs.map(guide => ({ + title: guide.title, + href: guide.href, + onClick: () => handleCallToActionClick(guide.value), + newTab: guide.newTab + }))} + callToActionsTitle={'Popular guides'} />
@@ -135,7 +166,7 @@ function QuickstartList({ quickstartData }) {
- ) + ); } export default QuickstartList; diff --git a/website/static/img/docs/cloud-integrations/example-snowflake-native-app-service-token.png b/website/static/img/docs/cloud-integrations/example-snowflake-native-app-service-token.png index 7e4c7ab99da..930182969c2 100644 Binary files a/website/static/img/docs/cloud-integrations/example-snowflake-native-app-service-token.png and b/website/static/img/docs/cloud-integrations/example-snowflake-native-app-service-token.png differ diff --git a/website/static/img/docs/collaborate/dbt-explorer/sigma-example.jpg b/website/static/img/docs/collaborate/dbt-explorer/sigma-example.jpg new file mode 100644 index 00000000000..b1aa4533e08 Binary files /dev/null and b/website/static/img/docs/collaborate/dbt-explorer/sigma-example.jpg differ diff --git a/website/static/img/docs/dbt-cloud/cloud-ide/ide-options-menu-with-save.jpg b/website/static/img/docs/dbt-cloud/cloud-ide/ide-options-menu-with-save.jpg index bd57ee514ee..8a968f684bd 100644 Binary files a/website/static/img/docs/dbt-cloud/cloud-ide/ide-options-menu-with-save.jpg and b/website/static/img/docs/dbt-cloud/cloud-ide/ide-options-menu-with-save.jpg differ diff --git a/website/static/img/docs/dbt-cloud/cloud-ide/restart-ide.jpg b/website/static/img/docs/dbt-cloud/cloud-ide/restart-ide.jpg index 98d71403cdd..031ec19227f 100644 Binary files a/website/static/img/docs/dbt-cloud/cloud-ide/restart-ide.jpg and b/website/static/img/docs/dbt-cloud/cloud-ide/restart-ide.jpg differ diff --git a/website/static/img/docs/dbt-cloud/example-git-signed-commits-setting.png b/website/static/img/docs/dbt-cloud/example-git-signed-commits-setting.png new file mode 100644 index 00000000000..bf3f8169359 Binary files /dev/null and b/website/static/img/docs/dbt-cloud/example-git-signed-commits-setting.png differ diff --git a/website/static/img/docs/dbt-cloud/git-sign-verified.jpg b/website/static/img/docs/dbt-cloud/git-sign-verified.jpg new file mode 100644 index 00000000000..86fbdd58dc9 Binary files /dev/null and b/website/static/img/docs/dbt-cloud/git-sign-verified.jpg differ