Merge branch 'current' into patch-13
mirnawong1 authored Dec 23, 2024
2 parents ce615e2 + 567d744 commit 872b30f
Showing 60 changed files with 225 additions and 92 deletions.
1 change: 0 additions & 1 deletion website/docs/docs/build/environment-variables.md
@@ -105,7 +105,6 @@ dbt Cloud has a number of pre-defined variables built in. Variables are set auto
The following environment variable is set automatically for the dbt Cloud IDE:

- `DBT_CLOUD_GIT_BRANCH` — Provides the development Git branch name in the [dbt Cloud IDE](/docs/cloud/dbt-cloud-ide/develop-in-the-cloud).
- Available in dbt v1.6 and later.
- The variable changes when the branch is changed.
- Doesn't require restarting the IDE after a branch change.
- Currently not available in the [dbt Cloud CLI](/docs/cloud/cloud-cli-installation).
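
For illustration, here's one hedged way to surface this variable in project metadata with dbt's `env_var()` function; the project name and `meta` key below are hypothetical, and the second argument is a fallback for contexts (such as scheduled runs) where the variable isn't set:

```yaml
# dbt_project.yml (sketch) — records which IDE branch built a model.
# "my_project" and the "git_branch" meta key are placeholders, not dbt conventions.
models:
  my_project:
    +meta:
      git_branch: "{{ env_var('DBT_CLOUD_GIT_BRANCH', 'not-set') }}"
```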
2 changes: 1 addition & 1 deletion website/docs/docs/build/packages.md
@@ -161,7 +161,7 @@ Where `name: 'dbt_utils'` specifies the subfolder of `dbt_packages` that's creat

### Native private packages <Lifecycle status='beta'/>

dbt Cloud supports private packages from [supported](#prerequisites) Git repos leveraging an exisiting [configuration](/docs/cloud/git/git-configuration-in-dbt-cloud) in your environment. Previously, you had to configure a [token](#git-token-method) to retrieve packages from your private repos.
dbt Cloud supports private packages from [supported](#prerequisites) Git repos leveraging an existing [configuration](/docs/cloud/git/git-configuration-in-dbt-cloud) in your environment. Previously, you had to configure a [token](#git-token-method) to retrieve packages from your private repos.
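
As a rough sketch of what this can look like in `packages.yml` — assuming the beta `private` key and a placeholder organization/repository name — the package is referenced directly and dbt Cloud reuses the environment's existing Git credentials:

```yaml
# packages.yml (illustrative sketch; "my-org/private-dbt-package" is a placeholder)
# With native private packages, no token needs to be templated into the URL.
packages:
  - private: my-org/private-dbt-package
    revision: "1.0.0"
```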

#### Prerequisites

5 changes: 2 additions & 3 deletions website/docs/docs/build/unit-tests.md
@@ -8,11 +8,10 @@ keywords:
- unit test, unit tests, unit testing, dag
---

:::note
<VersionCallout version="1.8" />


Unit testing functionality is available in [dbt Cloud Release Tracks](/docs/dbt-versions/cloud-release-tracks) or dbt Core v1.8+

:::

Historically, dbt's test coverage was confined to [“data” tests](/docs/build/data-tests), assessing the quality of input data or resulting datasets' structure. However, these tests could only be executed _after_ building a model.
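
A unit test, by contrast, runs against small, fixed inputs that you supply, so the modeling logic can be validated before the model is built. A minimal sketch of the YAML (the model and column names here are hypothetical):

```yaml
# models/schema.yml (sketch) — "dim_customers" and its columns are placeholder names
unit_tests:
  - name: test_customer_status_logic
    model: dim_customers
    given:
      - input: ref('stg_customers')
        rows:
          - {customer_id: 1, lifetime_value: 150}
    expect:
      rows:
        - {customer_id: 1, status: "standard"}
```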

@@ -46,7 +46,7 @@ App users are able to access all information that's available to the API service
## Procurement
The dbt Snowflake Native App is available on the [Snowflake Marketplace](https://app.snowflake.com/marketplace/listing/GZTYZSRT2R3). Purchasing it includes access to the Native App and a dbt Cloud account that's on the Enterprise plan. Existing dbt Cloud Enterprise customers can also access it. If interested, contact your Enterprise account manager.

If you're interested, please [contact us](matilto:[email protected]) for more information.
If you're interested, please [contact us](mailto:[email protected]) for more information.

## Support
If you have any questions about the dbt Snowflake Native App, you may [contact our Support team](mailto:[email protected]) for help. Please provide information about your installation of the Native App, including your dbt Cloud account ID and Snowflake account identifier.
@@ -6,7 +6,7 @@ description: "Import and auto-generate exposures from dashboards and understand
image: /img/docs/cloud-integrations/auto-exposures/explorer-lineage2.jpg
---

# Configure auto-exposures <Lifecycle status="preview,enterprise" />
# Configure auto-exposures <Lifecycle status="enterprise" />

As a data team, it’s critical that you have context into the downstream use cases and users of your data products. [Auto-exposures](/docs/collaborate/auto-exposures) integrates natively with Tableau and [auto-generates downstream lineage](/docs/collaborate/auto-exposures#view-auto-exposures-in-dbt-explorer) in dbt Explorer for a richer experience.

2 changes: 1 addition & 1 deletion website/docs/docs/cloud-integrations/overview.md
@@ -13,7 +13,7 @@ Many data applications integrate with dbt Cloud, enabling you to leverage the po
<div className="grid--3-col">

<Card
title="Configure auto-exposures (preview)"
title="Configure auto-exposures"
body="Import and auto-generate exposures from dashboards to understand how models are used in downstream tools for a richer downstream lineage."
link="/docs/cloud-integrations/configure-auto-exposures"
icon="dbt-bit"/>
2 changes: 1 addition & 1 deletion website/docs/docs/cloud/git/setup-azure.md
@@ -155,7 +155,7 @@ The service user's permissions will also power which repositories a team can sel

While it's common to enforce multi-factor authentication (MFA) for normal user accounts, service user authentication must not require an extra factor. If you enable a second factor for the service user, it can interrupt production runs and cause repository clones to fail. For the OAuth access token to work, the best practice is to remove any additional burden of proof of identity from service users.

As a result, MFA must be explicity disabled in the Office 365 or Microsoft Entra ID administration panel for the service user. Just having it "un-connected" will not be sufficient, as dbt Cloud will be prompted to set up MFA instead of allowing the credentials to be used as intended.
As a result, MFA must be explicitly disabled in the Office 365 or Microsoft Entra ID administration panel for the service user. Just having it "un-connected" will not be sufficient, as dbt Cloud will be prompted to set up MFA instead of allowing the credentials to be used as intended.


**To disable MFA for a single user using the Office 365 Administration console:**
2 changes: 1 addition & 1 deletion website/docs/docs/cloud/manage-access/external-oauth.md
@@ -144,7 +144,7 @@ Adjust the other settings as needed to meet your organization's configurations i
1. Navigate back to the dbt Cloud **Account settings** —> **Integrations** page you were on at the beginning. It’s time to start filling out all of the fields.
1. `Integration name`: Give the integration a descriptive name that includes identifying information about the Okta environment so future users won’t have to guess where it belongs.
2. `Client ID` and `Client secrets`: Retrieve these from the Okta application page.
<Lightbox src="/img/docs/dbt-cloud/gather-clientid-secret.png" width="60%" title="TThe client ID and secret highlighted in the Okta app" />
<Lightbox src="/img/docs/dbt-cloud/gather-clientid-secret.png" width="60%" title="The client ID and secret highlighted in the Okta app" />
3. Authorize URL and Token URL: Found in the metadata URI.
<Lightbox src="/img/docs/dbt-cloud/gather-authorization-token-endpoints.png" width="60%" title="The authorize and token URLs highlighted in the metadata URI" />

2 changes: 1 addition & 1 deletion website/docs/docs/cloud/manage-access/invite-users.md
@@ -66,7 +66,7 @@ Once the user completes this process, their email and user information will popu

* Is there a limit to the number of users I can invite? _Your ability to invite users is limited to the number of licenses you have available._
* Why are users clicking the invitation link and getting an `Invalid Invitation Code` error? _We have seen scenarios where embedded secure link technology (such as enterprise Outlook's [Safe Links](https://learn.microsoft.com/en-us/microsoft-365/security/office-365-security/safe-links-about?view=o365-worldwide) feature) can result in errors when clicking on the email link. Be sure to include the `getdbt.com` URL in the allowlists for these services._
* Can I have a mixure of users with SSO and username/password authentication? _Once SSO is enabled, you will no longer be able to add local users. If you have contractors or similar contingent workers, we recommend you add them to your SSO service._
* Can I have a mixture of users with SSO and username/password authentication? _Once SSO is enabled, you will no longer be able to add local users. If you have contractors or similar contingent workers, we recommend you add them to your SSO service._
* What happens if I need to resend the invitation? _From the Users page, click on the invite record, and you will be presented with the option to resend the invitation._
* What can I do if I entered an email address incorrectly? _From the Users page, click on the invite record, and you will be presented with the option to revoke it. Once revoked, generate a new invitation to the correct email address._

2 changes: 1 addition & 1 deletion website/docs/docs/cloud/manage-access/mfa.md
@@ -58,7 +58,7 @@ Choose the next steps based on your preferred enrollment selection:

2. Follow the instructions in the modal window and click **Use security key**.

<Lightbox src="/img/docs/dbt-cloud/create-security-key.png" title="Example of the Seciruty Key activation window." />
<Lightbox src="/img/docs/dbt-cloud/create-security-key.png" title="Example of the Security Key activation window." />

3. To begin the process, scan the QR code or insert your USB key and touch it to activate it. Follow the on-screen prompts.

2 changes: 1 addition & 1 deletion website/docs/docs/collaborate/auto-exposures.md
@@ -7,7 +7,7 @@ pagination_next: "docs/collaborate/data-tile"
image: /img/docs/cloud-integrations/auto-exposures/explorer-lineage.jpg
---

# Auto-exposures <Lifecycle status="preview,enterprise" />
# Auto-exposures <Lifecycle status="enterprise" />

As a data team, it’s critical that you have context into the downstream use cases and users of your data products. Auto-exposures integrate natively with Tableau (Power BI coming soon) and auto-generate downstream lineage in dbt Explorer for a richer experience.

2 changes: 1 addition & 1 deletion website/docs/docs/collaborate/data-tile.md
@@ -63,7 +63,7 @@ Follow these steps to set up your data health tile:
6. Navigate back to dbt Explorer and select an exposure.
7. Below the **Data health** section, expand the toggle for instructions on how to embed the exposure tile (if you're an account admin with develop permissions).
8. In the expanded toggle, you'll see a text field where you can paste your **Metadata Only token**.
<Lightbox src="/img/docs/collaborate/dbt-explorer/data-tile-example.jpg" width="85%" title="Expand the toggle to embded data health tile into your dashboard." />
<Lightbox src="/img/docs/collaborate/dbt-explorer/data-tile-example.jpg" width="85%" title="Expand the toggle to embed data health tile into your dashboard." />

9. Once you’ve pasted your token, you can select either **URL** or **iFrame** depending on which you need to add to your dashboard.

2 changes: 1 addition & 1 deletion website/docs/docs/collaborate/explore-multiple-projects.md
@@ -27,7 +27,7 @@ When viewing a downstream (child) project that imports and refs public models fr
- Clicking on a model opens a side panel containing general information about the model, such as the specific dbt Cloud project that produces that model, description, package, and more.
- Double-clicking on a model from another project opens the resource-level lineage graph of the parent project, if you have the permissions to do so.

<Lightbox src="/img/docs/collaborate/dbt-explorer/cross-project-child.png" width="100%" height="100" title="View a downstream (child) project that importants and refs public models from the upstream (parent) project."/>
<Lightbox src="/img/docs/collaborate/dbt-explorer/cross-project-child.png" width="100%" height="100" title="View a downstream (child) project that imports and refs public models from the upstream (parent) project."/>

## Explore the project-level lineage graph

2 changes: 1 addition & 1 deletion website/docs/docs/collaborate/govern/model-versions.md
@@ -14,7 +14,7 @@ This functionality is new in v1.5 — if you have thoughts, participate in [the

</VersionBlock>

import VersionsCallout from '/snippets/_version-callout.md';
import VersionsCallout from '/snippets/_model-version-callout.md';

<VersionsCallout />

@@ -55,7 +55,7 @@ Microsoft made several changes related to connection encryption. Read more about
### Authentication methods

This adapter is based on the adapter for Microsoft SQL Server.
Therefor, the same authentication methods are supported.
Therefore, the same authentication methods are supported.

The configuration is the same except for one major difference:
instead of specifying `type: sqlserver`, you specify `type: synapse`.
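
A hedged sketch of what that looks like in `profiles.yml` — every value below is a placeholder, and the remaining fields follow the SQL Server adapter's options:

```yaml
# profiles.yml (sketch) — placeholder names; only the `type` differs from a sqlserver profile
my_synapse_profile:
  target: dev
  outputs:
    dev:
      type: synapse                              # instead of sqlserver
      driver: 'ODBC Driver 17 for SQL Server'
      server: my-workspace.sql.azuresynapse.net  # placeholder hostname
      port: 1433
      database: my_database
      schema: dbt_dev
      user: my_username
      password: my_password
```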
22 changes: 22 additions & 0 deletions website/docs/docs/core/connect-data-platform/bigquery-setup.md
@@ -388,6 +388,28 @@ my-profile:
      execution_project: buck-stops-here-456
```

### Quota project

By default, dbt will use the `quota_project_id` set within the credentials of the account you are using to authenticate to BigQuery.

Optionally, you may specify `quota_project` to bill for query execution instead of the default quota project configured for the account in your environment.

This can sometimes be required when impersonating service accounts that do not have the BigQuery API enabled within the project in which they are defined. Without overriding the quota project, dbt will fail to connect.

If you choose to set a quota project, the account you use to authenticate must have the `Service Usage Consumer` role on that project.

```yaml
my-profile:
  target: dev
  outputs:
    dev:
      type: bigquery
      method: oauth
      project: abc-123
      dataset: my_dataset
      quota_project: my-bq-quota-project
```

### Running Python models on Dataproc

import BigQueryDataproc from '/snippets/_bigquery-dataproc.md';
16 changes: 6 additions & 10 deletions website/docs/docs/core/connect-data-platform/dremio-setup.md
@@ -3,14 +3,14 @@ title: "Dremio setup"
description: "Read this guide to learn about the Dremio warehouse setup in dbt."
meta:
maintained_by: Dremio
authors: 'Dremio (formerly Fabrice Etanchaud)'
authors: 'Dremio'
github_repo: 'dremio/dbt-dremio'
pypi_package: 'dbt-dremio'
min_core_version: 'v1.2.0'
min_core_version: 'v1.8.0'
cloud_support: Not Supported
min_supported_version: 'Dremio 22.0'
slack_channel_name: 'n/a'
slack_channel_link: 'https://www.getdbt.com/community'
slack_channel_name: 'db-dremio'
slack_channel_link: 'https://getdbt.slack.com/archives/C049G61TKBK'
platform_name: 'Dremio'
config_page: '/reference/resource-configs/no-configs'
---
@@ -36,10 +36,6 @@ Before connecting from project to Dremio Cloud, follow these prerequisite steps:

* Ensure that you are using version 22.0 or later.
* Ensure that Python 3.9.x or later is installed on the system that you are running dbt on.
* Enable these support keys in your Dremio cluster:
* `dremio.iceberg.enabled`
* `dremio.iceberg.ctas.enabled`
* `dremio.execution.support_unlimited_splits`

See <a target="_blank" rel="noopener noreferrer" href="https://docs.dremio.com/software/advanced-administration/support-settings/#support-keys">Support Keys</a> in the Dremio documentation for the steps.
* If you want to use TLS to secure the connection between dbt and Dremio Software, configure full wire encryption in your Dremio cluster. For instructions, see <a target="_blank" rel="noopener noreferrer" href="https://docs.dremio.com/software/deployment/wire-encryption-config/">Configuring Wire Encryption</a>.
@@ -84,7 +80,7 @@ For descriptions of the configurations in these profiles, see [Configurations](#
[project name]:
  outputs:
    dev:
      cloud_host: https://api.dremio.cloud
      cloud_host: api.dremio.cloud
      cloud_project_id: [project ID]
      object_storage_source: [name]
      object_storage_path: [path]
@@ -161,7 +157,7 @@ For descriptions of the configurations in these profiles, see [Configurations](#

| Configuration | Required? | Default Value | Description |
| --- | --- | --- | --- |
| `cloud_host` | Yes | `https://api.dremio.cloud` | US Control Plane: `https://api.dremio.cloud`<br></br>EU Control Plane: `https://api.eu.dremio.cloud` |
| `cloud_host` | Yes | `api.dremio.cloud` | US Control Plane: `api.dremio.cloud`<br></br>EU Control Plane: `api.eu.dremio.cloud` |
| `user` | Yes | None | Email address used as a username in Dremio Cloud |
| `pat` | Yes | None | The personal access token to use for authentication. See [Personal Access Tokens](https://docs.dremio.com/cloud/security/authentication/personal-access-token/) for instructions about obtaining a token. |
| `cloud_project_id` | Yes | None | The ID of the Sonar project in which to run transformations. |
@@ -65,7 +65,7 @@ your_profile_name:
| type | The specific adapter to use | Required | `ibmdb2` |
| schema | Specify the schema (database) to build models into | Required | `analytics` |
| database | Specify the database you want to connect to | Required | `testdb` |
| host | Hostname or IP-adress | Required | `localhost` |
| host | Hostname or IP-address | Required | `localhost` |
| port | The port to use | Optional | `50000` |
| protocol | Protocol to use | Optional | `TCPIP` |
| username | The username to use to connect to the server | Required | `my-username` |
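
A hedged sketch of a `profiles.yml` entry built from the fields above — all values are placeholders taken from the table's examples, and any configurations not shown in this excerpt (such as credentials beyond the username) are omitted:

```yaml
# profiles.yml (sketch) — values mirror the example column of the table above
your_profile_name:
  target: dev
  outputs:
    dev:
      type: ibmdb2
      schema: analytics
      database: testdb
      host: localhost
      port: 50000
      protocol: TCPIP
      username: my-username
```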
@@ -83,7 +83,7 @@ _Parameters:_
| Syntax | Description |
| --------- |---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| `MODEL_TYPE` | Type of the model your want to train. There are two options: <br/> - `classifier`: A model to predict classes/labels or categories such as spam detection<br/>- `regressor`: A model to predict continious outcomes such as CLV prediction. |
| `MODEL_TYPE` | Type of the model your want to train. There are two options: <br/> - `classifier`: A model to predict classes/labels or categories such as spam detection<br/>- `regressor`: A model to predict continuous outcomes such as CLV prediction. |
| `FEATURES` | Input column names as a list to train your AutoML model. |
| `TARGET` | Target column that you want to predict. |
