Commit

Merge branch 'current' into mirnawong1-patch-12
mirnawong1 authored Jan 30, 2024
2 parents db58080 + 6ffbfa2 commit 10b2fa5
Showing 23 changed files with 235 additions and 79 deletions.
2 changes: 1 addition & 1 deletion website/docs/docs/build/metrics-overview.md
@@ -20,7 +20,7 @@ The keys for metrics definitions are:
| `config` | Provide the specific configurations for your metric. | Optional |
| `label` | The display name for your metric. This value will be shown in downstream tools. | Required |
| `filter` | You can optionally add a filter string to any metric type, applying filters to dimensions, entities, or time dimensions during metric computation. Consider it as your WHERE clause. | Optional |
| `meta` | Additional metadata you want to add to your metric. | Optional |


Here's a complete example of the metrics spec configuration:
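
The full example is collapsed here; as a rough sketch, a metric definition using the keys above might look like the following (the metric name, measure, and filter values are illustrative):

```yaml
metrics:
  - name: order_total
    description: "Sum of the order total."
    label: Order total        # display name shown in downstream tools
    type: simple
    type_params:
      measure: order_total
    # filter acts like a WHERE clause during metric computation
    filter: |
      {{ Dimension('order__status') }} = 'completed'
    config:
      enabled: true
    meta:
      owner: "Finance team"
```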

14 changes: 7 additions & 7 deletions website/docs/docs/cloud/cloud-cli-installation.md
@@ -1,6 +1,6 @@
---
title: Install dbt Cloud CLI
sidebar_label: "Install dbt Cloud CLI"
sidebar_label: "Installation"
id: cloud-cli-installation
description: "Instructions for installing and configuring dbt Cloud CLI"
pagination_next: "docs/cloud/configure-cloud-cli"
@@ -75,7 +75,7 @@ Before you begin, make sure you have [Homebrew installed](http://brew.sh/) in yo
4. Clone your repository to your local computer using `git clone`. For example, to clone a GitHub repo using HTTPS format, run `git clone https://github.com/YOUR-USERNAME/YOUR-REPOSITORY`.
5. After cloning your repo, [configure](/docs/cloud/configure-cloud-cli) the dbt Cloud CLI for your dbt Cloud project. This lets you run dbt commands like `dbt compile` to compile your project and validate models and tests. You can also add, edit, and synchronize files with your repo.
5. After cloning your repo, [configure](/docs/cloud/configure-cloud-cli) the dbt Cloud CLI for your dbt Cloud project. This lets you run dbt commands like [`dbt environment show`](/reference/commands/dbt-environment) to view your dbt Cloud configuration or `dbt compile` to compile your project and validate models and tests. You can also add, edit, and synchronize files with your repo.
</TabItem>
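
Steps 4 and 5 above, end to end, look roughly like this (the repository URL is a placeholder):

```bash
# Clone your dbt project repository
git clone https://github.com/YOUR-USERNAME/YOUR-REPOSITORY
cd YOUR-REPOSITORY

# After configuring the dbt Cloud CLI for this project:
dbt environment show   # view your dbt Cloud configuration
dbt compile            # compile the project and validate models and tests
```
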
@@ -106,7 +106,7 @@ Note that if you are using VS Code, you must restart it to pick up modified envi
4. Clone your repository to your local computer using `git clone`. For example, to clone a GitHub repo using HTTPS format, run `git clone https://github.com/YOUR-USERNAME/YOUR-REPOSITORY`.
5. After cloning your repo, [configure](/docs/cloud/configure-cloud-cli) the dbt Cloud CLI for your dbt Cloud project. This lets you run dbt commands like `dbt compile` to compile your project and validate models and tests. You can also add, edit, and synchronize files with your repo.
5. After cloning your repo, [configure](/docs/cloud/configure-cloud-cli) the dbt Cloud CLI for your dbt Cloud project. This lets you run dbt commands like [`dbt environment show`](/reference/commands/dbt-environment) to view your dbt Cloud configuration or `dbt compile` to compile your project and validate models and tests. You can also add, edit, and synchronize files with your repo.
</TabItem>
@@ -140,7 +140,7 @@ Advanced users can configure multiple projects to use the same Cloud CLI executa
4. Clone your repository to your local computer using `git clone`. For example, to clone a GitHub repo using HTTPS format, run `git clone https://github.com/YOUR-USERNAME/YOUR-REPOSITORY`.
5. After cloning your repo, [configure](/docs/cloud/configure-cloud-cli) the dbt Cloud CLI for your dbt Cloud project. This lets you run dbt commands like `dbt compile` to compile your project and validate models and tests. You can also add, edit, and synchronize files with your repo.
5. After cloning your repo, [configure](/docs/cloud/configure-cloud-cli) the dbt Cloud CLI for your dbt Cloud project. This lets you run dbt commands like [`dbt environment show`](/reference/commands/dbt-environment) to view your dbt Cloud configuration or `dbt compile` to compile your project and validate models and tests. You can also add, edit, and synchronize files with your repo.
</TabItem>
@@ -205,7 +205,7 @@ We recommend using virtual environments (venv) to namespace `cloud-cli`.
4. Clone your repository to your local computer using `git clone`. For example, to clone a GitHub repo using HTTPS format, run `git clone https://github.com/YOUR-USERNAME/YOUR-REPOSITORY`.
5. After cloning your repo, [configure](/docs/cloud/configure-cloud-cli) the dbt Cloud CLI for your dbt Cloud project. This lets you run dbt commands like `dbt compile` to compile your project and validate models and tests. You can also add, edit, and synchronize files with your repo.
5. After cloning your repo, [configure](/docs/cloud/configure-cloud-cli) the dbt Cloud CLI for your dbt Cloud project. This lets you run dbt commands like [`dbt environment show`](/reference/commands/dbt-environment) to view your dbt Cloud configuration or `dbt compile` to compile your project and validate models and tests. You can also add, edit, and synchronize files with your repo.
</TabItem>
@@ -253,12 +253,12 @@ To update:
Visual Studio (VS) Code extensions enhance command-line tools with extra functionality. The dbt Cloud CLI is fully compatible with dbt Core; however, it doesn't support some dbt Core APIs required by certain tools, such as VS Code extensions.
You can use extensions like [dbt-power-user](https://www.dbt-power-user.com/) with the dbt Cloud CLI by following these steps:
You can use extensions like [dbt-power-user](https://marketplace.visualstudio.com/items?itemName=innoverio.vscode-dbt-power-user) with the dbt Cloud CLI by following these steps:
- [Install](/docs/cloud/cloud-cli-installation?install=brew) it using Homebrew along with dbt Core.
- [Create an alias](#faqs) to run the dbt Cloud CLI as `dbt-cloud`.
This setup allows dbt-power-user to continue to work with dbt Core in the background, alongside the dbt Cloud CLI.
This setup allows dbt-power-user to continue to work with dbt Core in the background, alongside the dbt Cloud CLI. For more, check the dbt Power User [documentation](https://docs.myaltimate.com/).
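
As an illustration, such an alias might look like the following; the executable path is an assumption and depends on where the dbt Cloud CLI binary ends up on your machine:

```bash
# Point `dbt-cloud` at the dbt Cloud CLI executable (path is an assumption)
alias dbt-cloud="$HOME/.dbt-cloud/dbt"

dbt compile          # dbt Core, which dbt-power-user continues to use
dbt-cloud compile    # dbt Cloud CLI, invoked through the alias
```
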
## FAQs
10 changes: 6 additions & 4 deletions website/docs/docs/cloud/configure-cloud-cli.md
@@ -1,8 +1,8 @@
---
title: Configure dbt Cloud CLI
title: Configure and use the dbt Cloud CLI
id: configure-cloud-cli
description: "Instructions on how to configure the dbt Cloud CLI"
sidebar_label: "Configure dbt Cloud CLI"
sidebar_label: "Configuration and usage"
pagination_next: null
---

@@ -75,7 +75,7 @@ Once you install the dbt Cloud CLI, you need to configure it to connect to a dbt

- To find your project ID, select **Develop** in the dbt Cloud navigation menu. You can use the URL to find the project ID. For example, in `https://cloud.getdbt.com/develop/26228/projects/123456`, the project ID is `123456`.

6. You can now [use the dbt Cloud CLI](#use-the-dbt-cloud-cli) and run [dbt commands](/reference/dbt-commands) like `dbt compile`. With your repo recloned, you can add, edit, and sync files with your repo.
6. You should now be able to [use the dbt Cloud CLI](#use-the-dbt-cloud-cli) and run [dbt commands](/reference/dbt-commands) like [`dbt environment show`](/reference/commands/dbt-environment) to view your dbt Cloud configuration details or `dbt compile` to compile models in your dbt project.

With your repo recloned, you can add, edit, and sync files with your repo.

### Set environment variables

@@ -89,7 +91,7 @@ To set environment variables in the dbt Cloud CLI for your dbt project:

## Use the dbt Cloud CLI

- The dbt Cloud CLI uses the same set of [dbt commands](/reference/dbt-commands) and [MetricFlow commands](/docs/build/metricflow-commands) as dbt Core to execute the commands you provide.
- The dbt Cloud CLI uses the same set of [dbt commands](/reference/dbt-commands) and [MetricFlow commands](/docs/build/metricflow-commands) as dbt Core to execute the commands you provide. For example, use the [`dbt environment`](/reference/commands/dbt-environment) command to view your dbt Cloud configuration details.
- It allows you to automatically defer build artifacts to your Cloud project's production environment.
- It also supports [project dependencies](/docs/collaborate/govern/project-dependencies), which allows you to depend on another project using the metadata service in dbt Cloud.
- Project dependencies instantly connect to and reference (or `ref`) public models defined in other projects. You don't need to execute or analyze these upstream models yourself. Instead, you treat them as an API that returns a dataset.
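
As a sketch of a cross-project `ref` in the downstream ("consumer") project (the project, model, and column names are placeholders):

```sql
-- models/finance_summary.sql in the consumer project.
-- The two-argument ref resolves a public model from the upstream project.
select
    order_month,
    total_revenue
from {{ ref('jaffle_finance', 'monthly_revenue') }}
```
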
16 changes: 4 additions & 12 deletions website/docs/docs/cloud/migration.md
@@ -23,18 +23,10 @@ If your account is scheduled for migration, you will see a banner indicating you
1. **IP addresses** &mdash; dbt Cloud will be using new IPs to access your warehouse after the migration. Make sure to allow inbound traffic from these IPs in your firewall and include them in any database grants. All six of the IPs below should be added to allowlists.
* Old IPs: `52.45.144.63`, `54.81.134.249`, `52.22.161.231`
* New IPs: `52.3.77.232`, `3.214.191.130`, `34.233.79.135`
2. **APIs and integrations** &mdash; Each dbt Cloud account will be allocated a static access URL like: `aa000.us1.dbt.com`. You should begin migrating your API access and partner integrations to use the new static subdomain as soon as possible. You can find your access URL on:
* Any page where you generate or manage API tokens.
* The **Account Settings** > **Account page**.

:::important Multiple account access
Be careful, each account that you have access to will have a different, dedicated [access URL](https://next.docs.getdbt.com/docs/cloud/about-cloud/access-regions-ip-addresses#accessing-your-account).
:::

3. **IDE sessions** &mdash; Any uncommitted changes in the IDE might be lost during the migration process. dbt Labs _strongly_ encourages you to commit all changes in the IDE before your scheduled migration time.
4. **User invitations** &mdash; Any pending user invitations will be invalidated during the migration. You can resend the invitations once the migration is complete.
5. **Git integrations** &mdash; Native integrations with [GitLab](/docs/cloud/git/connect-gitlab#for-the-dbt-cloud-enterprise-tier) and [Azure DevOps](/docs/cloud/git/connect-azure-devops) will need to be manually updated. dbt Labs will not be migrating any accounts using these integrations at this time. If you're using one of these integrations and your account is scheduled for migration, please contact support and we will delay your migration.
6. **SSO integrations** &mdash; Integrations with SSO identity providers (IdPs) will need to be manually updated. dbt Labs will not be migrating any accounts using SSO at this time. If you're using one of these integrations and your account is scheduled for migration, please contact support and we will delay your migration.
2. **IDE sessions** &mdash; Any uncommitted changes in the IDE might be lost during the migration process. dbt Labs _strongly_ encourages you to commit all changes in the IDE before your scheduled migration time.
3. **User invitations** &mdash; Any pending user invitations will be invalidated during the migration. You can resend the invitations once the migration is complete.
4. **Git integrations** &mdash; Native integrations with [GitLab](/docs/cloud/git/connect-gitlab#for-the-dbt-cloud-enterprise-tier) and [Azure DevOps](/docs/cloud/git/connect-azure-devops) will need to be manually updated. dbt Labs will not be migrating any accounts using these integrations at this time. If you're using one of these integrations and your account is scheduled for migration, please contact support and we will delay your migration.
5. **SSO integrations** &mdash; Integrations with SSO identity providers (IdPs) will need to be manually updated. dbt Labs will not be migrating any accounts using SSO at this time. If you're using one of these integrations and your account is scheduled for migration, please contact support and we will delay your migration.

## Post-migration

@@ -33,6 +33,7 @@ Refer to the [FAQs](#faqs) for more info.

In order to add project dependencies and resolve cross-project `ref`, you must:
- Use dbt v1.6 or higher for **both** the upstream ("producer") project and the downstream ("consumer") project.
- Define models in an upstream ("producer") project that are configured with [`access: public`](/reference/resource-configs/access)
- Have a deployment environment in the upstream ("producer") project [that is set to be your production environment](/docs/deploy/deploy-environments#set-as-production-environment)
- Have a successful run of the upstream ("producer") project
- Have a multi-tenant or single-tenant [dbt Cloud Enterprise](https://www.getdbt.com/pricing) account (Azure ST is not supported but coming soon)
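
As a sketch of the setup this enables, the downstream ("consumer") project declares the upstream project in its `dependencies.yml` and then references that project's public models with a two-argument `ref`, such as `{{ ref('jaffle_finance', 'monthly_revenue') }}` (project and model names are placeholders):

```yaml
# dependencies.yml in the downstream ("consumer") project
projects:
  - name: jaffle_finance   # matches the upstream project's name in its dbt_project.yml
```
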
@@ -32,7 +32,7 @@ This release includes significant new features, and rework to `dbt-core`'s CLI a

Setting `log-path` and `target-path` in `dbt_project.yml` has been deprecated for consistency with other invocation-specific runtime configs ([dbt-core#6882](https://github.com/dbt-labs/dbt-core/issues/6882)). We recommend setting via env var or CLI flag instead.
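
For example, either of the following sets those paths at invocation time (a sketch; the paths are illustrative, and the flag and environment-variable names should be checked against `dbt --help` for your version):

```bash
# Via CLI flags
dbt --log-path logs/run_logs --target-path target/prod run

# Via environment variables
DBT_LOG_PATH=logs/run_logs DBT_TARGET_PATH=target/prod dbt run
```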

The `dbt list` command will now include `INFO` level logs by default. Previously, the `list` command (and _only_ the `list` command) had `WARN`-level stdout logging, to support piping its results to [`jq`](https://stedolan.github.io/jq/manual/), a file, or another process. To achieve that goal, you can use either of the following parameters:
The `dbt list` command will now include `INFO` level logs by default. Previously, the `list` command (and _only_ the `list` command) had `WARN`-level stdout logging, to support piping its results to [`jq`](https://jqlang.github.io/jq/manual/), a file, or another process. To achieve that goal, you can use either of the following parameters:
- `dbt --log-level warn list` (recommended; equivalent to previous default)
- `dbt --quiet list` (suppresses all logging less than ERROR level, except for "printed" messages and `list` output)
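
For instance, to keep piping `list` results into `jq` (a sketch; `--output json` is assumed to be available for the `list` command in your version):

```bash
# Suppress INFO-level logs so stdout carries only the list output,
# then print one resource name per line.
dbt --quiet list --output json | jq -r '.name'
```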

2 changes: 1 addition & 1 deletion website/docs/guides/create-new-materializations.md
@@ -13,7 +13,7 @@ recently_updated: true

## Introduction

The model <Term id="materialization">materializations</Term> you're familiar with, `table`, `view`, and `incremental` are implemented as macros in a package that's distributed along with dbt. You can check out the [source code for these materializations](https://github.com/dbt-labs/dbt-core/tree/main/core/dbt/adapters/include/global_project/macros/materializations). If you need to create your own materializations, reading these files is a good place to start. Continue reading below for a deep-dive into dbt materializations.
The model <Term id="materialization">materializations</Term> you're familiar with, `table`, `view`, and `incremental` are implemented as macros in a package that's distributed along with dbt. You can check out the [source code for these materializations](https://github.com/dbt-labs/dbt-adapters/tree/60005a0a2bd33b61cb65a591bc1604b1b3fd25d5/dbt/include/global_project/macros/materializations). If you need to create your own materializations, reading these files is a good place to start. Continue reading below for a deep-dive into dbt materializations.

:::caution

6 changes: 3 additions & 3 deletions website/docs/guides/debug-schema-names.md
@@ -14,7 +14,7 @@ recently_updated: true

## Introduction

If a model uses the [`schema` config](/reference/resource-properties/schema) but builds under an unexpected schema, here are some steps for debugging the issue. The full explanation on custom schemas can be found [here](/docs/build/custom-schemas).
If a model uses the [`schema` config](/reference/resource-properties/schema) but builds under an unexpected schema, here are some steps for debugging the issue. The full explanation of custom schemas can be found [here](/docs/build/custom-schemas).


You can also follow along via this video:
@@ -25,7 +25,7 @@ You can also follow along via this video:
Do a file search to check if you have a macro named `generate_schema_name` in the `macros` directory of your project.

### You do not have a macro named `generate_schema_name` in your project
This means that you are using dbt's default implementation of the macro, as defined [here](https://github.com/dbt-labs/dbt-core/blob/main/core/dbt/include/global_project/macros/get_custom_name/get_custom_schema.sql#L47C1-L60)
This means that you are using dbt's default implementation of the macro, as defined [here](https://github.com/dbt-labs/dbt-adapters/blob/60005a0a2bd33b61cb65a591bc1604b1b3fd25d5/dbt/include/global_project/macros/get_custom_name/get_custom_schema.sql)

```sql
{% macro generate_schema_name(custom_schema_name, node) -%}
@@ -53,7 +53,7 @@ If your `generate_schema_name` macro looks like so:
{{ generate_schema_name_for_env(custom_schema_name, node) }}
{%- endmacro %}
```
Your project is switching out the `generate_schema_name` macro for another macro, `generate_schema_name_for_env`. Similar to the above example, this is a macro which is defined in dbt's global project, [here](https://github.com/dbt-labs/dbt-core/blob/main/core/dbt/include/global_project/macros/get_custom_name/get_custom_schema.sql#L47-L60).
Your project is switching out the `generate_schema_name` macro for another macro, `generate_schema_name_for_env`. Similar to the above example, this is a macro which is defined in dbt's global project, [here](https://github.com/dbt-labs/dbt-adapters/blob/main/dbt/include/global_project/macros/get_custom_name/get_custom_schema.sql).
```sql
{% macro generate_schema_name_for_env(custom_schema_name, node) -%}
