# Preview launch of the dbt Snowflake Native App (#5604)
## What are you changing in this pull request and why?

Docs for the preview launch of the Snowflake Native App for dbt Cloud

- Sidebar changes: added a new "dbt Cloud integrations" section and moved the existing "SL integrations" files under it.
- New landing page for "dbt Cloud integrations", starting with two tiles: the dbt Snowflake Native App and "Available SL integrations".
- New About/overview page (conceptual content) for the Native App.
- New Set up page (how-to content) for the Native App.
- Release note about the preview.
- See the checklist below for clean-up/maintenance tasks.

## Checklist
- [x] Review the [Content style
guide](https://github.com/dbt-labs/docs.getdbt.com/blob/current/contributing/content-style-guide.md)
so my content adheres to these guidelines.
- [x] For [docs
versioning](https://github.com/dbt-labs/docs.getdbt.com/blob/current/contributing/single-sourcing-content.md#about-versioning),
review how to [version a whole
page](https://github.com/dbt-labs/docs.getdbt.com/blob/current/contributing/single-sourcing-content.md#adding-a-new-version)
and [version a block of
content](https://github.com/dbt-labs/docs.getdbt.com/blob/current/contributing/single-sourcing-content.md#versioning-blocks-of-content).
- [x] Needs review from PM

Adding pages:
- [x] Add page in `website/sidebars.js` (see the sketch after this list)
- [x] Provide a unique filename for new pages
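
For context, here's a rough sketch of the shape of the new `sidebars.js` category &mdash; the label, doc IDs, and placement are approximations based on the new file paths in this PR, not the exact committed change:

```js
// website/sidebars.js — illustrative fragment only; real labels and ordering may differ
{
  type: "category",
  label: "dbt Cloud integrations",
  link: { type: "doc", id: "docs/cloud-integrations/overview" },
  items: [
    "docs/cloud-integrations/snowflake-native-app",
    "docs/cloud-integrations/set-up-snowflake-native-app",
    "docs/cloud-integrations/avail-sl-integrations",
  ],
},
```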

Moving pages:
- [x] Move page in `website/sidebars.js`
- [x] Add a redirect for moved pages in `website/vercel.json` (see the sketch after this list)
- [x] Search for and update all URLs with the new page location
- [x] Run link testing locally with `npm run build` to catch and update any links that point to moved pages
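
For the moved SL integrations page, the `vercel.json` redirect looks roughly like the sketch below. The source and destination match the URL change visible in the diffs; marking it `permanent` is an assumption:

```json
{
  "redirects": [
    {
      "source": "/docs/use-dbt-semantic-layer/avail-sl-integrations",
      "destination": "/docs/cloud-integrations/avail-sl-integrations",
      "permanent": true
    }
  ]
}
```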
nghi-ly authored Jun 4, 2024
2 parents d9abfbe + 6ca3997 commit 897cdec
Showing 30 changed files with 350 additions and 86 deletions.
@@ -74,7 +74,7 @@ Now that we’ve done the pipeline work to set up our metrics for the semantic l
## Our Finance, Operations and GTM teams are all looking at the same metrics 😊
- To query the Semantic Layer, you have two paths: you can query metrics directly through the Semantic Layer APIs or use one of our [first-class integrations](https://docs.getdbt.com/docs/use-dbt-semantic-layer/avail-sl-integrations). Our analytics team and product teams are big Hex users, while our operations and finance teams live and breathe Google Sheets, so it’s important for us to have the same metric definitions available in both tools.
+ To query the Semantic Layer, you have two paths: you can query metrics directly through the Semantic Layer APIs or use one of our [first-class integrations](https://docs.getdbt.com/docs/cloud-integrations/avail-sl-integrations). Our analytics team and product teams are big Hex users, while our operations and finance teams live and breathe Google Sheets, so it’s important for us to have the same metric definitions available in both tools.
The leg work of building our pipeline and defining metrics is all done, which makes last-mile consumption much easier. First, we set up a launch dashboard in Hex as the source of truth for semantic layer product metrics. This tool is used by cross-functional partners like marketing, sales, and the executive team to easily check product and usage metrics like total semantic layer queries, or weekly active semantic layer users. To set up our Hex connection, we simply enter a few details from our dbt Cloud environment and then we can work with metrics directly in Hex notebooks. We can use the JDBC interface, or use Hex’s GUI metric builder to build reports. We run all our WBRs off this dashboard, which allows us to spot trends in consumption and react quickly to changes in our business.
2 changes: 1 addition & 1 deletion website/blog/2024-05-02-semantic-layer-llm.md
@@ -293,6 +293,6 @@ grant usage on function submit_sl_request(string) to role public;

## Wrapping Up

- Building this application has been an absolute blast for multiple reasons. First, we’ve been able to use it internally within the SA org to demonstrate how the semantic layer works. It provides yet another [integration](https://docs.getdbt.com/docs/use-dbt-semantic-layer/avail-sl-integrations) point that further drives home the fundamental value prop of using the Semantic Layer. Secondly, and more importantly, it has served as an example to those customers thinking about (or being pushed to think about) how they can best utilize these technologies to further their goals. Finally, I’ve been able to be heads down, hands on keyboard learning about all of these interesting technologies and stepping back into the role of builder is something I will never turn down!
+ Building this application has been an absolute blast for multiple reasons. First, we’ve been able to use it internally within the SA org to demonstrate how the semantic layer works. It provides yet another [integration](https://docs.getdbt.com/docs/cloud-integrations/avail-sl-integrations) point that further drives home the fundamental value prop of using the Semantic Layer. Secondly, and more importantly, it has served as an example to those customers thinking about (or being pushed to think about) how they can best utilize these technologies to further their goals. Finally, I’ve been able to be heads down, hands on keyboard learning about all of these interesting technologies and stepping back into the role of builder is something I will never turn down!

Finally, to see the entire code, from Snowflake to Streamlit, check out the repo [here](https://github.com/dpguthrie/dbt-sl-cortex-streamlit-blog/tree/main?tab=readme-ov-file).
2 changes: 1 addition & 1 deletion website/docs/docs/build/about-metricflow.md
@@ -17,7 +17,7 @@ Before you start, consider the following guidelines:
- Define metrics in YAML and query them using these [new metric specifications](https://github.com/dbt-labs/dbt-core/discussions/7456).
- You must be on [dbt version](/docs/dbt-versions/upgrade-dbt-version-in-cloud) 1.6 or higher to use MetricFlow.
- Use MetricFlow with Snowflake, BigQuery, Databricks, Postgres (dbt Core only), or Redshift.
- - Discover insights and query your metrics using the [dbt Semantic Layer](/docs/use-dbt-semantic-layer/dbt-sl) and its diverse range of [available integrations](/docs/use-dbt-semantic-layer/avail-sl-integrations).
+ - Discover insights and query your metrics using the [dbt Semantic Layer](/docs/use-dbt-semantic-layer/dbt-sl) and its diverse range of [available integrations](/docs/cloud-integrations/avail-sl-integrations).

## MetricFlow

2 changes: 1 addition & 1 deletion website/docs/docs/build/build-metrics-intro.md
@@ -58,7 +58,7 @@ MetricFlow allows you to:
<Card
title="Available integrations"
body="Discover the diverse range of partners that seamlessly integrate with the powerful dbt Semantic Layer, allowing you to query and unlock valuable insights from your data ecosystem."
- link="/docs/use-dbt-semantic-layer/avail-sl-integrations"
+ link="/docs/cloud-integrations/avail-sl-integrations"
icon="dbt-bit"/>

</div> <br />
47 changes: 47 additions & 0 deletions website/docs/docs/cloud-integrations/about-snowflake-native-app.md
@@ -0,0 +1,47 @@
---
title: "About the dbt Snowflake Native App"
id: "snowflake-native-app"
description: "An overview of the dbt Snowflake Native App for dbt Cloud accounts"
pagination_prev: null
pagination_next: "docs/cloud-integrations/set-up-snowflake-native-app"
---

# About the dbt Snowflake Native App <Lifecycle status='preview' />

The dbt Snowflake Native App &mdash; powered by the Snowflake Native App Framework and Snowpark Container Services &mdash; extends your dbt Cloud experience into the Snowflake user interface. You'll be able to access these three experiences with your Snowflake login:

- **dbt Explorer** &mdash; An embedded version of [dbt Explorer](/docs/collaborate/explore-projects)
- **Ask dbt** &mdash; A dbt-assisted chatbot, powered by [dbt Semantic Layer](/docs/use-dbt-semantic-layer/dbt-sl), OpenAI, and Snowflake Cortex
- **Orchestration observability** &mdash; A view into the [job run history](/docs/deploy/run-visibility) and the ability to trigger Snowflake tasks with [deploy jobs](/docs/deploy/deploy-jobs)

These experiences enable you to extend what's been built with dbt Cloud to users who have traditionally worked downstream from the dbt project, such as BI analysts and technical stakeholders.

For installation instructions, refer to [Set up the dbt Snowflake Native App](/docs/cloud-integrations/set-up-snowflake-native-app).

## Architecture

There are three tools connected to the operation of the dbt Snowflake Native App:

| Tool | Description |
|------------------------------------|-------------|
| Consumer’s Snowflake account | The location where the Native App is installed, powered by Snowpark Container Services. <br /><br /> The Native App makes calls to the dbt Cloud APIs and Datadog APIs (for logging) using [Snowflake's external network access](https://docs.snowflake.com/en/developer-guide/external-network-access/external-network-access-overview). <br /><br />To power the **Ask dbt** chatbot, the dbt Semantic Layer accesses the Cortex LLM to execute queries and generate text based on the prompt. This is configured when the user sets up the Semantic Layer environment. |
| dbt product Snowflake account | The location where the Native App application package is hosted and then distributed into the consumer account. <br /><br />The consumer's event table is shared to this account for application monitoring and logging. |
| Consumer’s dbt Cloud account | The Native App interacts with the dbt Cloud APIs for metadata and for processing Semantic Layer queries to power the Native App experiences. <br /> <br /> The dbt Cloud account also calls the consumer's Snowflake account to utilize the warehouse for executing dbt queries for orchestration and the Cortex LLM (Arctic) to power the **Ask dbt** chatbot. |

The following diagram illustrates the architecture:

<Lightbox src="/img/docs/cloud-integrations/architecture-dbt-snowflake-native-app.png" title="Architecture of dbt Cloud and Snowflake integration"/>


## Access
You can log in to the dbt Snowflake Native App using your regular Snowflake login authentication method. During this [Preview](/docs/dbt-versions/product-lifecycles#dbt-cloud), you do not need dbt Cloud credentials (a dbt Cloud seat) to access the application, but this is subject to change.

App users are able to access all information that's available to the API service token.

## Procurement
The dbt Snowflake Native App is available on the [Snowflake Marketplace](https://app.snowflake.com/marketplace/listing/GZTYZSRT2R3). With the purchase of the listing, users will have access to the Native App and a dbt Cloud account that's on the Enterprise plan.

If you're interested, please [contact us](mailto:[email protected]) for more information.

## Support
If you have any questions about the dbt Snowflake Native App, you may [contact our Support team](mailto:[email protected]) for help. Please provide information about your installation of the Native App, including your dbt Cloud account ID and Snowflake account identifier.
27 changes: 27 additions & 0 deletions website/docs/docs/cloud-integrations/overview.md
@@ -0,0 +1,27 @@
---
title: "About dbt Cloud integrations"
sidebar_label: "About dbt Cloud integrations"
pagination_prev: null
pagination_next: "docs/cloud-integrations/snowflake-native-app"
---

Many data applications integrate with dbt Cloud, enabling you to leverage the power of dbt for a variety of use cases and workflows.


## Integrations with dbt

<div className="grid--2-col">

<Card
title="dbt Snowflake Native App (preview)"
link="/docs/cloud-integrations/snowflake-native-app"
body="Learn about the dbt Snowflake Native App and how you can access key dbt Cloud features within the Snowflake platform."
icon="snowflake"/>

<Card
title="Semantic layer integrations"
body="Review a wide range of partners you can integrate and query with the dbt Semantic Layer."
link="/docs/cloud-integrations/avail-sl-integrations"
icon="dbt-bit"/>

</div>