When deploying a new version of your infrastructure, Terraform diffs the current state
against what you have specified in your infrastructure-as-code.
The current state is tracked in a JSON document,
which can be stored in any of a number of locations (including local files).

This project stores remote state using the S3 backend.

Different applications or environments can be isolated from each other by using
different S3 buckets for holding their state.
We reuse a Terraform configuration (terraform/s3-remote-state) for setting up the S3 backend.


Note

The S3 remote state configuration is not a proper module because it contains
a provider block. Different deployments of the configuration are controlled
by giving it different tfvars files, and capturing the outputs for use in
a tfbackend file.


Here is an example set of commands for bootstrapping a new S3 backend for a deployment.
Suppose the deployment is a QA environment of our Snowflake project:

cd terraform/snowflake/environments/qa  # Go to the new environment directory
mkdir remote-state  # Create a remote-state directory
cd remote-state
ln -s ../../../s3-remote-state/main.tf main.tf  # symlink the s3 configuration
terraform init  # initialize the remote state backend
terraform apply -var="owner=dse" -var="environment=qa" -var="project=snowflake"  # Create the infrastructure
terraform output > ../dse-snowflake-qa.tfbackend  # Pipe the outputs to a .tfbackend

cd ..
terraform init -backend-config=./dse-snowflake-qa.tfbackend  # Configure the deployment with the new backend.

Terraform deployments include a lockfile with hashes of installed provider packages.
Because we have mixed development environments (i.e., Macs locally, Linux in CI),
it is helpful to include both Mac and Linux builds of Terraform providers in the lockfile.
This needs to be done every time package versions are updated:

terraform init -upgrade  # Upgrade versions
terraform providers lock -platform=linux_amd64 -platform=darwin_amd64  # include Mac and Linux binaries

Develop in your branch. Try to keep commits to single ideas. A branch for a good pull request can tell a story.

When your branch is ready for review, push it to GitHub:

git push <remote-name> <branch-name>


From the GitHub UI, open a pull request for merging your code into main.

Request one or more reviewers.

Go through one or several rounds of review, making changes to your branch as necessary. A healthy review is a conversation, and it's normal to have disagreements.

When the reviewer is happy, they can approve and merge the pull request!

In general, the author of a PR should not approve and merge their own pull request.

As the author, it's important to remember that you are the subject matter expert for a PR.
The reviewer will likely not know anything about the path you took to a particular solution,
what approaches did not work, and what tradeoffs you encountered.
It's your job to communicate that context for reviewers to help them review your code.
This can include comments in the GitHub UI, comments in the code base, and even self-reviews.

Making use of code linters and formatters helps to establish a consistent style for a project
and removes a whole class of common errors and disagreements.
Even if one can take issue with specific conventions or rules,
having them used consistently within a team pays big dividends over time.

Many of our projects use pre-commit to enforce the linter and formatter conventions.
To set up your pre-commit environment locally (this requires a Python development environment), run:

pre-commit install


The next time you make a commit, the pre-commit hooks will run on the contents of your commit
(the first time may be a bit slow as there is some additional setup).

Merge commits in a PR can make review more difficult,
as contents from unrelated work can appear in the code diff.
Sometimes they are necessary for particularly large or long-running branches,
but for most work you should try to avoid them.
The following guidelines can help:


Usually branch from the latest main

Keep feature branches small and focused on a single problem. It is often helpful for both authors and reviewers to have larger efforts broken up into smaller tasks.

In some circumstances, a git rebase can help keep a feature branch easy to review and reason about.

If your pull request adds any new features or changes any workflows for users
of the project, you should include documentation.
Otherwise, the hard work you did to implement a feature may go unnoticed/unused!
What this looks like will vary from project to project.

New functionality ideally should have automated tests.
As with documentation, these tests will look different depending upon the project needs.
A good test will:


Be separated from production environments

If fixing a bug, actually verify that the issue is fixed.

Guard against regressions

Not take so long as to be annoying to run.

Not rely on internal implementation details of a project (i.e. use public contracts)


One nice strategy for bugfixes is to write a failing test before making a fix,
then verify that the fix makes the test pass.
It is surprisingly common for tests to accidentally not cover the behavior they are intended to.
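
For example, in a dbt project a bugfix can be paired with a singular test that selects the rows violating the expected behavior (the model name and expectation below are illustrative); the test should fail before your fix and pass after it:

-- tests/assert_no_negative_budgets.sql (illustrative singular test)
-- dbt fails the test if this query returns any rows
select primary_code, budget_year_dollars
from {{ ref("int_state_entities__budgets") }}
where budget_year_dollars < 0
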
As above, reviewers should have empathy for the author of a pull request.
As a reviewer, you may not be aware of the constraints and tradeoffs that the author encountered.

Some general tips for conducting productive reviews:


If there is something you might have done differently, come in with a constructive attitude and try to understand why the author took their approach.

Keep your reviews timely (ideally provide feedback within 24 hours)

Try to avoid letting a PR review stretch on too long. A branch with many review cycles stretching for weeks is demoralizing to code authors.

Remember that perfect is the enemy of the good. A PR that makes an improvement or is a concrete step forward can be merged without having to solve everything. It's perfectly reasonable to open up issues to capture follow-up work from a PR.

Before merging a pull request, maintainers should make every effort to ensure that CI passes.
Often this will require looking into the logs of a failed run to see what went wrong,
and alerting the pull request author.
Ideally, no pull request should be merged if there are CI failures,
as broken CI in main can easily mask problems with other PRs,
and a consistently broken CI can be demoralizing for maintainers.

However, in practice, there are occasionally flaky tests,
broken upstream dependencies, and failures that are otherwise obviously not related to the PR at hand.
If that is the case, a reviewer may merge a PR with failing tests,
but they should be prepared to follow up with any failures that result from such an unsafe operation.

Note: these conventions and recommendations are partially drawn from maintainer guidelines for
JupyterHub and
Dask.

GitHub Codespaces allow you to spin up an ephemeral development environment in VS Code
which includes a git repository, configurations, and pre-installed libraries.
It provides an easy way for developers to get started working in a repository,
especially if they are uncomfortable setting up a local development environment.

Go to the "Code" dropdown from the main repository page,
select the three dot dropdown, and select "New with options..."
This will allow more configuration than the default codespace.


In the codespace configuration form, you will have an option to add "Recommended Secrets".
This is where you can add your personal Snowflake credentials to your codespace,
allowing for development against our Snowflake warehouse, including using dbt.
You should only add credentials for accounts that are protected by multi-factor authentication (MFA).


After you have added your secrets, click "Create Codespace".
Building it may take a few minutes,
but then you should be redirected to a VS Code environment in your browser.

Once your codespace is created, you should be able to launch it
without re-creating it every time by using the "Code" dropdown,
going to "Open in...", and selecting "Open in browser".

Once you have created and configured a codespace,
you have access to a relatively full-featured VS Code-based development environment.

When you launch a new codespace, it can take a couple of minutes for all of the extensions to install. In particular, this means that the Python environment may not be fully set-up when you land in VS Code. We recommend closing existing terminal sessions and starting a new one once the extensions have finished installing.

The first time you make a commit, the pre-commit hooks will be installed. This may take a few minutes. Subsequent commits will take less time.

If the pre-commit hooks fail when making a commit, VS Code will give you the opportunity to open the git logs to view the errors. If you are unable to fix the errors for whatever reason, you can always make a new commit from the command line with git commit --no-verify.

The Data Services and Engineering team maintains a derived dataset from the
Microsoft US Building Footprints
and Global ML Building Footprints datasets.
The two datasets are broadly similar, but the latter has global coverage and is more frequently updated.

We take the original datasets, and join them with US Census TIGER data to make them more useful
for demographic and social science research. The additional census-derived fields include:


State

County

Tract

Block Group

Block

Place

If a footprint intersects more than one of the above,
we assign it to the one with the greater intersection,
so each footprint should only appear once in the dataset.
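
The core of that assignment, sketched below with simplified table and column names (the real models join several TIGER layers), keeps the geography with the largest intersection area for each footprint:

-- Simplified sketch of the greatest-intersection assignment (names are illustrative)
with joined as (
    select
        footprints.footprint_id,
        blocks.block_geoid,
        st_area(st_intersection(footprints.geometry, blocks.geometry)) as intersection_area
    from footprints
    inner join blocks
        on st_intersects(footprints.geometry, blocks.geometry)
)

select
    footprint_id,
    -- keep only the block with the largest overlap for each footprint
    max_by(block_geoid, intersection_area) as block_geoid
from joined
group by footprint_id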


Note

Despite the names, these derived datasets are scoped to California only.

The data are stored as files in AWS S3.
We distribute them in both GeoParquet
and zipped Shapefile formats.

GeoParquet is usually a superior format for doing data analytics as it is:


An open format, based on the industry-standard Parquet format.

Efficiently compressed

Cloud-native

Built on a columnar data layout optimized for analytical workloads.


However, GeoParquet is also somewhat newer, and not supported by all tooling yet,
so the zipped Shapefiles may be better suited for some workflows (especially Esri ones).

Fennis Reed at the California Department of Finance Demographics Research Unit has created an ArcGIS Pro
toolbox
for downloading individual footprint files,
which can be downloaded here.

The following tables contain public links to the datasets partitioned by county.
The HTTPS URLs can be used to directly download files using a web browser,
while the S3 URLs are more appropriate for scripts like the examples above.

In most settings, what is considered acceptable performance is relative to business needs and constraints. It's not atypical to deem performance acceptable as long as there are no scheduling conflicts and models can run within a timeframe dictated by how frequently they need to run. In other words, if you need to run models every hour, then the entire job cannot take longer than an hour to run. In general, compute costs are usually not high enough to justify optimizing the underlying queries, but they may be high enough to justify optimizing run frequency or data size.

Although compute time is relatively cheap, for larger datasets that need frequent refreshes it is sometimes possible to save enough in compute costs to be worth the time spent optimizing. In Snowflake you can easily monitor costs in the Admin/Usage section of the Snowflake UI, where you can see credits used by warehouse and role.
Snowflake also provides several tables with meta information that can be used to derive exact costs for each query - an approach to this, with a ready-to-use query, can be found in the Select.dev blog post "Calculating cost per query in Snowflake"

Typically, unless model performance is obviously very poor, you are better off adjusting the frequency of runs (end users almost always overstate their desire for data freshness) or reducing data set size, either by limiting what you provide or by using incremental models.

In other words, very often the questions you should be asking are not in the category of SQL performance tuning but rather "do we need this data to be this fresh?" and "do we need all this data?".

Often performance issues show up in scheduling. If you are running jobs once a day, it is extremely unlikely you will run into any scheduling conflicts. However, if a much higher frequency is required, it's possible for jobs to take longer than the time between runs. In this case a common first approach is to break up model runs so that things that don't need to run as frequently can run separately from models that require more frequent updating. A typical way of doing this is to use dbt run --select or dbt tags to select models in groups.
This is not to say that performance tuning of individual queries is never worth it, but that the big macro gains come more from running models less frequently and/or with less data, e.g. using filtering or incremental models.

It is extremely important to balance time spent optimizing model performance against compute costs and other concerns. If it takes you a day to optimize a model to run only a few seconds faster and save a few pennies per run, it's not likely worth the effort. Similarly, the use of incremental materialization can certainly reduce build time, but it introduces complexity and requires a degree of monitoring to ensure integrity. See also Materialization Matters below.

With every dbt run or build, several artifacts are generated in the target/ directory, including the run_results.json file. This includes detailed information on run execution, and many people parse this file to create dashboards that report on dbt performance and help with optimization and cost monitoring. There is an important caveat here: simply knowing how long a model took to run is important for uncovering which models might need optimization, but it cannot tell you anything about why they are performing poorly.

dbt Cloud has a nicer interface for finding which models in a project are running longest. Visit the Deploy > Runs section of dbt Cloud. You'll see a full list of jobs and how long each one took. To drill down to the model timing level, click on a run name. You can expand the "Invoke dbt build" section under "Run Summary" to get a detailed summary of your run as well as timing for each model and test. There is also a "Debug logs" section for even more detail, including the exact queries run and an option to download the logs for easier viewing. Of course this is also where you go to find model and test errors and warnings!


For a quick visual reference of which models take up the most time in a run, click on the "Model Timing" tab. If you hover over a model you will be shown the specific timing.

Snowflake has quite a lot of performance data readily available through its information_schema.QUERY_HISTORY() table function and several views in the Account Usage schema. This is great not only for finding expensive queries regardless of source, but also for all sorts of analytics on Snowflake usage, such as credit consumption.

The Query History gives you real-time data while the Account Usage views are delayed, so Query History is great for analyzing your own queries in development and for current query performance in production.

Example Query: Get top time-consuming queries for the dbt Cloud production loads

SELECT
    query_text, query_type, database_name, schema_name,
    user_name, total_elapsed_time
FROM
    -- query_history() is a table function
    table(information_schema.query_history())
WHERE user_name = 'DBT_CLOUD_SVC_USER_PRD'
ORDER BY total_elapsed_time DESC
LIMIT 20


As you might have guessed, this also lets you search for a model on query text, so you can find specific dbt models or classes of models:
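
For example, a minimal sketch of such a search (substitute the model name you are investigating in the ILIKE filter):

SELECT query_text, user_name, total_elapsed_time
FROM
    table(information_schema.query_history())
WHERE user_name = 'DBT_CLOUD_SVC_USER_PRD'
    AND query_text ILIKE '%int_state_entities__budgets%'
ORDER BY total_elapsed_time DESC
LIMIT 20
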
The Account Usage schema (snowflake.account_usage) has multiple views that are of interest for monitoring not just query performance and credit usage but also warehouse and database usage and more. This data is delayed by 45 minutes but has a much longer history.

Example Query: Find the queries with highest total execution time this month for the dbt Cloud production loads.
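
A minimal sketch of such a query, summing elapsed time by query text over the current month, might look like:

SELECT
    query_text,
    count(*) AS n_runs,
    sum(total_elapsed_time) AS total_elapsed_time
FROM snowflake.account_usage.query_history
WHERE user_name = 'DBT_CLOUD_SVC_USER_PRD'
    AND start_time >= date_trunc('month', current_date)
GROUP BY query_text
ORDER BY total_elapsed_time DESC
LIMIT 20
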
Now that you've identified which models might need optimization, it's time to figure out how to get them to run faster. These options are roughly in order of bang-for-buck in most situations.

It's common for end-users to say they want the freshest data (who doesn't?) but in practice require a much lower frequency of refreshing. To gain an understanding of the real-world needs, it's helpful to see the frequency with which end-users actually view reporting and to consider the time scales involved. If someone only cares about monthly results, for example, you can in theory have a 30-day frequency for model runs.
It's also quite common for parts of the data to be relatively static and only need occasional refreshes, whereas other parts of the data might change much more often.
An easy way to break up model runs is by using dbt tags, as sketched below.
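
A minimal sketch of tagging a model (the model and tag names here are illustrative); tagged models can then be selected with a command like dbt build --select tag:monthly:

-- models/marts/example_monthly_summary.sql (illustrative model)
{{ config(materialized="table", tags=["monthly"]) }}

select *
from {{ ref("stg_example__source") }}
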
For a good comparison of materialization options and their trade-offs, see the Materialization Best Practices section of the dbt docs.

Views: These are a trade-off between build performance and read/reporting performance. In cases where you are using a BI tool, you should almost always use table materializations unless data storage size is an issue or refresh frequency is so high that cost or scheduling conflicts become a problem. In cases where performance at time of reporting is not an issue (say, you are generating an aggregated report on a monthly basis), views can be a great way to cut run time. Another case where views can be a good option is with staging data of relatively small size, where your queries are relatively light-weight and you want to ensure fresh data without having to configure separate runs for those models.

Incremental Models:
For very large data sets, it can be essential to use incremental models. For this to work, you need some means of filtering records from the source table, typically using a timestamp. You then add a conditional block to your model to only select new records unless you're doing a full refresh, as in the sketch below. It's worth noting that incremental models can be tricky to get right, and you will often want to implement some additional data integrity testing to ensure data is fresh and complete. For a more detailed discussion of incremental models, see Incremental models in-depth.
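
A minimal sketch of such a model (the source, column, and key names are illustrative):

-- models/staging/stg_example__events.sql (illustrative incremental model)
{{ config(materialized="incremental", unique_key="event_id") }}

select event_id, updated_at, payload
from {{ ref("base_example__events") }}

{% if is_incremental() %}
    -- On incremental runs, only pull records newer than what is already in the target table
    where updated_at > (select max(updated_at) from {{ this }})
{% endif %}
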
An example in our current projects is the CalHR Ecos model stg_CertEligibles. This query takes over three minutes to run and no wonder - it generates 858 million rows! This is clearly a case where we should ask whether we need all of that data or whether it can be filtered in some way, and if we do need it all, then we should consider using an incremental materialization.

A great many books have been written on this subject! The good news is that most of the tools we use provide excellent resources for analyzing query performance.

Because models are often created using a CREATE TABLE ... SELECT statement, you need to separate read from write performance to understand whether the issue is that your original query is slow or that you are simply moving a lot of data and it takes time. The chances are good that if you are moving a lot of data you are also querying a lot of data, and in fact both read and write may be very time consuming, but this is not a given -- if you are doing lots of joins on big data sets along with aggregations that output a small number of rows, then your model performance is probably read-bound. If this is the case, the first question you should ask is whether you can break up that model into smaller chunks using staging and intermediate models.

A good way to get a sense of read vs write performance is to do one or more of:
1. Simply know the number of rows generated by the model (for some databases, dbt will report this in its output). If you are creating tables with millions of rows, you should probably consider an incremental model or reassess whether you can filter and narrow your data somehow.
2. Use your database's query profiler, if available, to separate out what part of the execution is taking the most time. In Snowflake, for example, you can use the query profile to easily determine whether a query is read-bound or write-bound, and also determine where exactly other performance issues may lie. A CREATE TABLE with a simple select, for example, will show that the majority of time is spent in the CreateTableAsSelect node and only a fraction of the time in the Result node.
Be careful if you are comparing queries across runs - most databases use caching and this will of course affect your results (see Caching Notes below).
3. Switch the materialization to view. Typically a view will take a fraction of the time to generate, and if that's the case you know your model is slow in writes.
4. Run the query separately in the database without the CREATE TABLE part. When you do this you can typically also inspect the execution plan.

You can easily pull up the query profile for any query that has been run in Snowflake either from a worksheet or from the query history page. This includes queries run from dbt Cloud! This profile is essential in understanding the elements of your query that are most costly in terms of time, and which might be improved through optimization. Refer to the Analyzing Queries Using Query Profile page in the Snowflake Documentation for complete information including common problems and their solutions.
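
If you prefer to pull similar operator-level statistics with SQL rather than the UI, Snowflake also exposes a GET_QUERY_OPERATOR_STATS table function; a minimal sketch profiling the most recent query in your session is:

-- Profile the most recent query in this session
select *
from table(get_query_operator_stats(last_query_id()));
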
Most databases use some type of caching, which needs to be turned off in order to properly test performance. Snowflake uses both a results cache and a disk cache, but only the former can be turned off with a session variable:

alter session set use_cached_result = false;

See this in-depth discussion for more details: Deep Dive on Snowflake Caching
A general workaround (other than shutting down and restarting the warehouse) is to use slightly different queries which do the same operations and return the same number of rows.

Models in a data warehouse do not follow the same naming conventions as raw cloud resources,
as their most frequent consumers are analytics engineers and data analysts.

The following conventions are used where appropriate:

Dimension tables are prefixed with dim_.

Fact tables are prefixed with fct_.

Staging tables are prefixed with stg_.

Intermediate tables are prefixed with int_.

We may adopt additional conventions for denoting aggregations, column data types, etc. in the future.
If during the course of a project's model development we determine that simpler human-readable names
work better for our partners or downstream consumers, we may drop the above prefixing conventions.

dbt's default method for generating custom schema names
works well for a single-database setup:


It allows development work to occur in a separate schema from production models.

It allows analytics engineers to develop side-by-side without stepping on each other's toes.


A downside of the default is that production models all get a prefix,
which may not be an ideal naming convention for end-users.

Because our architecture separates development and production databases,
and has strict permissions protecting the RAW database,
there is less danger of breaking production models.
So we use our own custom schema name logic, modified from the
approach of the GitLab Data Team.

In production, each schema is just the custom schema name without any prefix.
In non-production environments the default is used, where analytics engineers
get the custom schema name prefixed with their target schema name (i.e. dbt_username_schemaname),
and CI runs get the custom schema name prefixed with a CI job name.

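A minimal sketch of a generate_schema_name macro implementing this behavior (the production target name "prd" is an assumption, and the project's actual macro may differ):

-- macros/generate_schema_name.sql (sketch; the actual implementation may differ)
{% macro generate_schema_name(custom_schema_name, node) -%}
    {%- if custom_schema_name is none -%}
        {{ target.schema }}
    {%- elif target.name == "prd" -%}
        {#- In production, use the custom schema name with no prefix -#}
        {{ custom_schema_name | trim }}
    {%- else -%}
        {#- In dev and CI, prefix with the target schema (e.g. dbt_username or a CI job name) -#}
        {{ target.schema }}_{{ custom_schema_name | trim }}
    {%- endif -%}
{%- endmacro %}
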
This approach may be reevaluated as the project matures.

Our Snowflake architecture allows for reasonably safe SELECTing from the production RAW database while developing models.
While this could be expensive for large tables,
it also allows for faster and more reliable model development.

To develop against production RAW data, first you need someone with the USERADMIN role to grant rights to the TRANSFORMER_DEV role
(this need only be done once, and can be revoked later):
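
A minimal sketch of such a grant (the RAW_PRD_READ role name is an assumption; the actual role or privilege names may differ):

use role useradmin;
-- Illustrative role name: grant read access on the production RAW database
grant role RAW_PRD_READ to role TRANSFORMER_DEV;
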
This grant is not managed via terraform in order to keep the configurations of
different environments as logically separate as possible. We may revisit this
decision should the manual grant cause problems.


You can then run dbt locally and specify the RAW database manually:

DBT_RAW_DB=RAW_PRD dbt run
+ max_by("release", coalesce(_tmp_sjoin_intersection, 1.0)) as "release",
+ -- max_by returns null if all the values in a group are null. So if we have a left
+ -- join, we need to guard against nulls with a coalesce to return the single value
+ max_by("capture_dates_range", coalesce(_tmp_sjoin_intersection, 1.0)) as "capture_dates_range",
+ -- max_by returns null if all the values in a group are null. So if we have a left
+ -- join, we need to guard against nulls with a coalesce to return the single value
+ max_by("county_fips", coalesce(_tmp_sjoin_intersection, 1.0)) as "county_fips",
+ -- max_by returns null if all the values in a group are null. So if we have a left
+ -- join, we need to guard against nulls with a coalesce to return the single value
+ max_by("tract", coalesce(_tmp_sjoin_intersection, 1.0)) as "tract",
+ -- max_by returns null if all the values in a group are null. So if we have a left
+ -- join, we need to guard against nulls with a coalesce to return the single value
+ max_by("block", coalesce(_tmp_sjoin_intersection, 1.0)) as "block",
+ -- max_by returns null if all the values in a group are null. So if we have a left
+ -- join, we need to guard against nulls with a coalesce to return the single value
+ max_by("block_geoid", coalesce(_tmp_sjoin_intersection, 1.0)) as "block_geoid",
+ -- max_by returns null if all the values in a group are null. So if we have a left
+ -- join, we need to guard against nulls with a coalesce to return the single value
+ max_by("place_fips", coalesce(_tmp_sjoin_intersection, 1.0)) as "place_fips",
+ -- max_by returns null if all the values in a group are null. So if we have a left
+ -- join, we need to guard against nulls with a coalesce to return the single value
+ max_by("place_ns", coalesce(_tmp_sjoin_intersection, 1.0)) as "place_ns",
+ -- max_by returns null if all the values in a group are null. So if we have a left
+ -- join, we need to guard against nulls with a coalesce to return the single value
+ max_by("place_geoid", coalesce(_tmp_sjoin_intersection, 1.0)) as "place_geoid",
+ -- max_by returns null if all the values in a group are null. So if we have a left
+ -- join, we need to guard against nulls with a coalesce to return the single value
+ max_by("place_name", coalesce(_tmp_sjoin_intersection, 1.0)) as "place_name",
+ -- max_by returns null if all the values in a group are null. So if we have a left
+ -- join, we need to guard against nulls with a coalesce to return the single value
+ max_by("class_fips_code", coalesce(_tmp_sjoin_intersection, 1.0)) as "class_fips_code",
+ -- max_by returns null if all the values in a group are null. So if we have a left
+ -- join, we need to guard against nulls with a coalesce to return the single value
+ max_by("class_fips", coalesce(_tmp_sjoin_intersection, 1.0)) as "class_fips"
+ from p_joined
+ group by _tmp_sjoin_id
+)
+
+select * from p_deduplicated
+),
+
+footprints_with_blocks_and_places_final as (
+ select
+ *,
+ st_area("geometry") as "area_sqm"
+ from footprints_with_blocks_and_places
+)
+
+select * from footprints_with_blocks_and_places_final
\ No newline at end of file
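
The compiled CTEs above are the expansion of two calls to the project's spatial_join_with_deduplication macro (plus map_class_fips for the class_fips description). The uncompiled model, as recorded in the raw_code of the manifest.json added later in this diff, invokes the macro like this for the blocks join:

footprints_with_blocks as (
    {{ spatial_join_with_deduplication(
        "footprints",
        "blocks",
        ['"release"', '"capture_dates_range"'],
        ['"county_fips"', '"tract"', '"block"', '"block_geoid"'],
        left_geom='"geometry"',
        right_geom='"geometry"',
        kind="inner",
        prefix="b",
    ) }}
),

The places join is the same call with the block columns added to the pass-through list, kind="left", and prefix="p", which is why the p_* CTEs above guard against null intersection areas.
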
diff --git a/dbt_docs_snowflake/compiled/dse_analytics/models/marts/state_entities/_state_entities__models.yml/not_null_dim_state_entities__agencies_agency_code.sql b/dbt_docs_snowflake/compiled/dse_analytics/models/marts/state_entities/_state_entities__models.yml/not_null_dim_state_entities__agencies_agency_code.sql
new file mode 100644
index 00000000..f1d8d2c3
--- /dev/null
+++ b/dbt_docs_snowflake/compiled/dse_analytics/models/marts/state_entities/_state_entities__models.yml/not_null_dim_state_entities__agencies_agency_code.sql
@@ -0,0 +1,11 @@
+
+
+
+
+
+
+select agency_code
+from ANALYTICS_DEV.ci_should_not_create_this_schema_state_entities.dim_state_entities__agencies
+where agency_code is null
+
+
diff --git a/dbt_docs_snowflake/compiled/dse_analytics/models/marts/state_entities/_state_entities__models.yml/not_null_dim_state_entities__agencies_name.sql b/dbt_docs_snowflake/compiled/dse_analytics/models/marts/state_entities/_state_entities__models.yml/not_null_dim_state_entities__agencies_name.sql
new file mode 100644
index 00000000..fe3625d2
--- /dev/null
+++ b/dbt_docs_snowflake/compiled/dse_analytics/models/marts/state_entities/_state_entities__models.yml/not_null_dim_state_entities__agencies_name.sql
@@ -0,0 +1,11 @@
+
+
+
+
+
+
+select name
+from ANALYTICS_DEV.ci_should_not_create_this_schema_state_entities.dim_state_entities__agencies
+where name is null
+
+
diff --git a/dbt_docs_snowflake/compiled/dse_analytics/models/marts/state_entities/_state_entities__models.yml/unique_dim_state_entities__agencies_agency_code.sql b/dbt_docs_snowflake/compiled/dse_analytics/models/marts/state_entities/_state_entities__models.yml/unique_dim_state_entities__agencies_agency_code.sql
new file mode 100644
index 00000000..27c366d8
--- /dev/null
+++ b/dbt_docs_snowflake/compiled/dse_analytics/models/marts/state_entities/_state_entities__models.yml/unique_dim_state_entities__agencies_agency_code.sql
@@ -0,0 +1,14 @@
+
+
+
+
+select
+ agency_code as unique_field,
+ count(*) as n_records
+
+from ANALYTICS_DEV.ci_should_not_create_this_schema_state_entities.dim_state_entities__agencies
+where agency_code is not null
+group by agency_code
+having count(*) > 1
+
+
diff --git a/dbt_docs_snowflake/compiled/dse_analytics/models/marts/state_entities/_state_entities__models.yml/unique_dim_state_entities__agencies_name.sql b/dbt_docs_snowflake/compiled/dse_analytics/models/marts/state_entities/_state_entities__models.yml/unique_dim_state_entities__agencies_name.sql
new file mode 100644
index 00000000..6f9ce42a
--- /dev/null
+++ b/dbt_docs_snowflake/compiled/dse_analytics/models/marts/state_entities/_state_entities__models.yml/unique_dim_state_entities__agencies_name.sql
@@ -0,0 +1,14 @@
+
+
+
+
+select
+ name as unique_field,
+ count(*) as n_records
+
+from ANALYTICS_DEV.ci_should_not_create_this_schema_state_entities.dim_state_entities__agencies
+where name is not null
+group by name
+having count(*) > 1
+
+
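
The four test files above are compiled instances of dbt's built-in not_null and unique generic tests, declared on the name and agency_code columns of dim_state_entities__agencies in _state_entities__models.yml. Ignoring the store-failures configuration branches, the templates that produce this SQL look roughly like the following (a simplified sketch, not the verbatim dbt-core macros):

{% test not_null(model, column_name) %}
select {{ column_name }}
from {{ model }}
where {{ column_name }} is null
{% endtest %}

{% test unique(model, column_name) %}
select
    {{ column_name }} as unique_field,
    count(*) as n_records
from {{ model }}
where {{ column_name }} is not null
group by {{ column_name }}
having count(*) > 1
{% endtest %}

dbt substitutes the column name and resolves {{ model }} to the relation seen above; a test passes when the rendered query returns zero rows.
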
diff --git a/dbt_docs_snowflake/compiled/dse_analytics/models/marts/state_entities/dim_state_entities__agencies.sql b/dbt_docs_snowflake/compiled/dse_analytics/models/marts/state_entities/dim_state_entities__agencies.sql
new file mode 100644
index 00000000..644c34d3
--- /dev/null
+++ b/dbt_docs_snowflake/compiled/dse_analytics/models/marts/state_entities/dim_state_entities__agencies.sql
@@ -0,0 +1,13 @@
+
+
+with
+agencies as (
+ select
+ name,
+ agency_code
+ from TRANSFORM_DEV.ci_should_not_create_this_schema_state_entities.int_state_entities__active
+ where subagency_code is null and l1 is null
+)
+
+select *
+from agencies
\ No newline at end of file
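
The blank lines at the top of this compiled file are where the model's {{ config(...) }} block was rendered away. The uncompiled model, as captured in the raw_code field of the manifest.json added later in this diff, is:

{{ config(materialized="table") }}

with
agencies as (
    select
        name,
        agency_code
    from {{ ref("int_state_entities__active") }}
    where subagency_code is null and l1 is null
)

select *
from agencies

The {{ ref("int_state_entities__active") }} call is what resolves to the fully qualified TRANSFORM_DEV relation in the compiled version.
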
diff --git a/dbt_docs_snowflake/compiled/dse_analytics/models/staging/department_of_finance/_department_of_finance__models.yml/dbt_utils_source_unique_combin_cfd0d0475e06134eaf6d360efc1db571.sql b/dbt_docs_snowflake/compiled/dse_analytics/models/staging/department_of_finance/_department_of_finance__models.yml/dbt_utils_source_unique_combin_cfd0d0475e06134eaf6d360efc1db571.sql
new file mode 100644
index 00000000..c9c2bb02
--- /dev/null
+++ b/dbt_docs_snowflake/compiled/dse_analytics/models/staging/department_of_finance/_department_of_finance__models.yml/dbt_utils_source_unique_combin_cfd0d0475e06134eaf6d360efc1db571.sql
@@ -0,0 +1,20 @@
+
+
+
+
+
+
+with validation_errors as (
+
+ select
+ A, B, L1, L2, L3
+ from RAW_DEV.state_entities.base_entities
+ group by A, B, L1, L2, L3
+ having count(*) > 1
+
+)
+
+select *
+from validation_errors
+
+
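
This file is the compiled form of dbt_utils.unique_combination_of_columns applied to the A, B, L1, L2, L3 columns of the base_entities source; the test's node name appears in the graph_summary.json added later in this diff. Reduced to what is visible in the compiled output, the macro behind it works roughly like this (a sketch, not the verbatim dbt_utils code):

{% test unique_combination_of_columns(model, combination_of_columns) %}
{%- set columns_csv = combination_of_columns | join(', ') -%}

with validation_errors as (

    select
        {{ columns_csv }}
    from {{ model }}
    group by {{ columns_csv }}
    having count(*) > 1

)

select *
from validation_errors
{% endtest %}
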
diff --git a/dbt_docs_snowflake/compiled/dse_analytics/models/staging/department_of_finance/_department_of_finance__models.yml/not_null_stg_department_of_finance__entities_primary_code.sql b/dbt_docs_snowflake/compiled/dse_analytics/models/staging/department_of_finance/_department_of_finance__models.yml/not_null_stg_department_of_finance__entities_primary_code.sql
new file mode 100644
index 00000000..54dd905c
--- /dev/null
+++ b/dbt_docs_snowflake/compiled/dse_analytics/models/staging/department_of_finance/_department_of_finance__models.yml/not_null_stg_department_of_finance__entities_primary_code.sql
@@ -0,0 +1,11 @@
+
+
+
+
+
+
+select primary_code
+from TRANSFORM_DEV.ci_should_not_create_this_schema_department_of_finance.stg_department_of_finance__entities
+where primary_code is null
+
+
diff --git a/dbt_docs_snowflake/compiled/dse_analytics/models/staging/department_of_finance/_department_of_finance__models.yml/not_null_stg_ebudget__budgets_primary_code.sql b/dbt_docs_snowflake/compiled/dse_analytics/models/staging/department_of_finance/_department_of_finance__models.yml/not_null_stg_ebudget__budgets_primary_code.sql
new file mode 100644
index 00000000..a51119b3
--- /dev/null
+++ b/dbt_docs_snowflake/compiled/dse_analytics/models/staging/department_of_finance/_department_of_finance__models.yml/not_null_stg_ebudget__budgets_primary_code.sql
@@ -0,0 +1,11 @@
+
+
+
+
+
+
+select primary_code
+from TRANSFORM_DEV.ci_should_not_create_this_schema_department_of_finance.stg_ebudget__budgets
+where primary_code is null
+
+
diff --git a/dbt_docs_snowflake/compiled/dse_analytics/models/staging/department_of_finance/_department_of_finance__models.yml/source_not_null_state_entities_1e716cc516fd8536dae58edde10efeed.sql b/dbt_docs_snowflake/compiled/dse_analytics/models/staging/department_of_finance/_department_of_finance__models.yml/source_not_null_state_entities_1e716cc516fd8536dae58edde10efeed.sql
new file mode 100644
index 00000000..5bd0f5f7
--- /dev/null
+++ b/dbt_docs_snowflake/compiled/dse_analytics/models/staging/department_of_finance/_department_of_finance__models.yml/source_not_null_state_entities_1e716cc516fd8536dae58edde10efeed.sql
@@ -0,0 +1,11 @@
+
+
+
+
+
+
+select "program_code"
+from RAW_DEV.state_entities.ebudget_program_budgets
+where "program_code" is null
+
+
diff --git a/dbt_docs_snowflake/compiled/dse_analytics/models/staging/department_of_finance/_department_of_finance__models.yml/source_not_null_state_entities_9d354781a7b07309ffe80cdbe131987d.sql b/dbt_docs_snowflake/compiled/dse_analytics/models/staging/department_of_finance/_department_of_finance__models.yml/source_not_null_state_entities_9d354781a7b07309ffe80cdbe131987d.sql
new file mode 100644
index 00000000..f0ccee73
--- /dev/null
+++ b/dbt_docs_snowflake/compiled/dse_analytics/models/staging/department_of_finance/_department_of_finance__models.yml/source_not_null_state_entities_9d354781a7b07309ffe80cdbe131987d.sql
@@ -0,0 +1,11 @@
+
+
+
+
+
+
+select "org_cd"
+from RAW_DEV.state_entities.ebudget_agency_and_department_budgets
+where "org_cd" is null
+
+
diff --git a/dbt_docs_snowflake/compiled/dse_analytics/models/staging/department_of_finance/_department_of_finance__models.yml/source_not_null_state_entities_b79497e81eb4a43013fc09604ea6799d.sql b/dbt_docs_snowflake/compiled/dse_analytics/models/staging/department_of_finance/_department_of_finance__models.yml/source_not_null_state_entities_b79497e81eb4a43013fc09604ea6799d.sql
new file mode 100644
index 00000000..8edb38d0
--- /dev/null
+++ b/dbt_docs_snowflake/compiled/dse_analytics/models/staging/department_of_finance/_department_of_finance__models.yml/source_not_null_state_entities_b79497e81eb4a43013fc09604ea6799d.sql
@@ -0,0 +1,11 @@
+
+
+
+
+
+
+select "web_agency_cd"
+from RAW_DEV.state_entities.ebudget_agency_and_department_budgets
+where "web_agency_cd" is null
+
+
diff --git a/dbt_docs_snowflake/compiled/dse_analytics/models/staging/department_of_finance/_department_of_finance__models.yml/source_not_null_state_entities_base_entities__A_.sql b/dbt_docs_snowflake/compiled/dse_analytics/models/staging/department_of_finance/_department_of_finance__models.yml/source_not_null_state_entities_base_entities__A_.sql
new file mode 100644
index 00000000..e191e3d3
--- /dev/null
+++ b/dbt_docs_snowflake/compiled/dse_analytics/models/staging/department_of_finance/_department_of_finance__models.yml/source_not_null_state_entities_base_entities__A_.sql
@@ -0,0 +1,11 @@
+
+
+
+
+
+
+select "A"
+from RAW_DEV.state_entities.base_entities
+where "A" is null
+
+
diff --git a/dbt_docs_snowflake/compiled/dse_analytics/models/staging/department_of_finance/_department_of_finance__models.yml/source_not_null_state_entities_base_entities__name_.sql b/dbt_docs_snowflake/compiled/dse_analytics/models/staging/department_of_finance/_department_of_finance__models.yml/source_not_null_state_entities_base_entities__name_.sql
new file mode 100644
index 00000000..ce5c0da9
--- /dev/null
+++ b/dbt_docs_snowflake/compiled/dse_analytics/models/staging/department_of_finance/_department_of_finance__models.yml/source_not_null_state_entities_base_entities__name_.sql
@@ -0,0 +1,11 @@
+
+
+
+
+
+
+select "name"
+from RAW_DEV.state_entities.base_entities
+where "name" is null
+
+
diff --git a/dbt_docs_snowflake/compiled/dse_analytics/models/staging/department_of_finance/_department_of_finance__models.yml/source_unique_state_entities_base_entities__L3_.sql b/dbt_docs_snowflake/compiled/dse_analytics/models/staging/department_of_finance/_department_of_finance__models.yml/source_unique_state_entities_base_entities__L3_.sql
new file mode 100644
index 00000000..ed840c4b
--- /dev/null
+++ b/dbt_docs_snowflake/compiled/dse_analytics/models/staging/department_of_finance/_department_of_finance__models.yml/source_unique_state_entities_base_entities__L3_.sql
@@ -0,0 +1,14 @@
+
+
+
+
+select
+ "L3" as unique_field,
+ count(*) as n_records
+
+from RAW_DEV.state_entities.base_entities
+where "L3" is not null
+group by "L3"
+having count(*) > 1
+
+
diff --git a/dbt_docs_snowflake/compiled/dse_analytics/models/staging/department_of_finance/_department_of_finance__models.yml/source_unique_state_entities_e_40b6c3eeef1a8f43330c0939302d30ff.sql b/dbt_docs_snowflake/compiled/dse_analytics/models/staging/department_of_finance/_department_of_finance__models.yml/source_unique_state_entities_e_40b6c3eeef1a8f43330c0939302d30ff.sql
new file mode 100644
index 00000000..63619621
--- /dev/null
+++ b/dbt_docs_snowflake/compiled/dse_analytics/models/staging/department_of_finance/_department_of_finance__models.yml/source_unique_state_entities_e_40b6c3eeef1a8f43330c0939302d30ff.sql
@@ -0,0 +1,14 @@
+
+
+
+
+select
+ "org_cd" as unique_field,
+ count(*) as n_records
+
+from RAW_DEV.state_entities.ebudget_agency_and_department_budgets
+where "org_cd" is not null
+group by "org_cd"
+having count(*) > 1
+
+
diff --git a/dbt_docs_snowflake/compiled/dse_analytics/models/staging/department_of_finance/_department_of_finance__models.yml/source_unique_state_entities_e_d76d60ebd57bc020a2453cc723e3be04.sql b/dbt_docs_snowflake/compiled/dse_analytics/models/staging/department_of_finance/_department_of_finance__models.yml/source_unique_state_entities_e_d76d60ebd57bc020a2453cc723e3be04.sql
new file mode 100644
index 00000000..b549add4
--- /dev/null
+++ b/dbt_docs_snowflake/compiled/dse_analytics/models/staging/department_of_finance/_department_of_finance__models.yml/source_unique_state_entities_e_d76d60ebd57bc020a2453cc723e3be04.sql
@@ -0,0 +1,14 @@
+
+
+
+
+select
+ "web_agency_cd" as unique_field,
+ count(*) as n_records
+
+from RAW_DEV.state_entities.ebudget_agency_and_department_budgets
+where "web_agency_cd" is not null
+group by "web_agency_cd"
+having count(*) > 1
+
+
diff --git a/dbt_docs_snowflake/compiled/dse_analytics/models/staging/department_of_finance/stg_department_of_finance__entities.sql b/dbt_docs_snowflake/compiled/dse_analytics/models/staging/department_of_finance/stg_department_of_finance__entities.sql
new file mode 100644
index 00000000..cb916b53
--- /dev/null
+++ b/dbt_docs_snowflake/compiled/dse_analytics/models/staging/department_of_finance/stg_department_of_finance__entities.sql
@@ -0,0 +1,57 @@
+
+
+
+
+
+
+with
+base_entities as (select * from RAW_DEV.state_entities.base_entities),
+
+invalid_subagencies as (
+ select *
+ from base_entities
+ where contains("name", 'no subagency') and contains("name", 'do not use')
+),
+
+entities as (
+ select
+ -- Extract the first portion of the entity as the name. The other
+ -- two (optional) groups match parentheticals and things like
+ -- "-- DO NOT USE" or " -- DOF USE ONLY"
+ PUBLIC.extract_name("name") as name,
+ coalesce(l3, l2, l1, b, a) as primary_code,
+ a as agency_code,
+ case
+ when b in (select b from invalid_subagencies) then null else b
+ end as subagency_code,
+ l1,
+ l2,
+ l3,
+ regexp_substr("name", '\\((.+?)\\)') as parenthetical,
+ contains(lower("name"), 'do not use') as do_not_use,
+ contains(lower("name"), 'abolished') as abolished,
+ regexp_substr("name", '[A-Z/]+ USE ONLY') as restricted_use,
+ "name" as name_raw
+ from base_entities
+),
+
+entities_with_extras as (
+ select
+ *,
+ PUBLIC.reorder_name_for_alphabetization(name) as name_alpha,
+ case
+ when coalesce(l3, l2, l1, subagency_code) is null
+ then 'agency'
+ when coalesce(l3, l2, l1) is null
+ then 'subagency'
+ when coalesce(l3, l2) is null
+ then 'L1'
+ when l3 is null
+ then 'L2'
+ else 'L3'
+ end as ucm_level
+ from entities
+)
+
+select *
+from entities_with_extras
\ No newline at end of file
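
In the uncompiled model, the fully qualified RAW_DEV relations above are dbt source() references, and PUBLIC.extract_name / PUBLIC.reorder_name_for_alphabetization appear to be pre-existing user-defined functions rather than dbt macros. A sketch of the first CTE before compilation, consistent with the state_entities source declared in the manifest but not necessarily the literal project code:

with
base_entities as (
    select * from {{ source('state_entities', 'base_entities') }}
),

The source() resolution is what produces the RAW_DEV.state_entities.base_entities relation seen in the compiled SQL.
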
diff --git a/dbt_docs_snowflake/compiled/dse_analytics/models/staging/department_of_finance/stg_ebudget__budgets.sql b/dbt_docs_snowflake/compiled/dse_analytics/models/staging/department_of_finance/stg_ebudget__budgets.sql
new file mode 100644
index 00000000..8fc0d270
--- /dev/null
+++ b/dbt_docs_snowflake/compiled/dse_analytics/models/staging/department_of_finance/stg_ebudget__budgets.sql
@@ -0,0 +1,16 @@
+with
+agencies_and_departments as (
+ select *
+ from RAW_DEV.state_entities.ebudget_agency_and_department_budgets
+),
+
+ebudget_budgets as (
+ select
+ "web_agency_cd" as primary_code,
+ "legal_titl" as name,
+ "all_budget_year_dols" as budget_year_dollars
+ from agencies_and_departments
+)
+
+select *
+from ebudget_budgets
\ No newline at end of file
diff --git a/dbt_docs_snowflake/graph.gpickle b/dbt_docs_snowflake/graph.gpickle
new file mode 100644
index 00000000..6f8ff42c
Binary files /dev/null and b/dbt_docs_snowflake/graph.gpickle differ
diff --git a/dbt_docs_snowflake/graph_summary.json b/dbt_docs_snowflake/graph_summary.json
new file mode 100644
index 00000000..ebb45bb5
--- /dev/null
+++ b/dbt_docs_snowflake/graph_summary.json
@@ -0,0 +1 @@
+{"_invocation_id": "73752f3e-f0c3-4ad7-a412-541df042f42a", "linked": {"0": {"name": "source.dse_analytics.building_footprints.us_building_footprints", "type": "source", "succ": [10]}, "1": {"name": "source.dse_analytics.building_footprints.global_ml_building_footprints", "type": "source", "succ": [9]}, "2": {"name": "source.dse_analytics.tiger_2022.blocks", "type": "source", "succ": [9, 10]}, "3": {"name": "source.dse_analytics.tiger_2022.places", "type": "source", "succ": [9, 10]}, "4": {"name": "source.dse_analytics.state_entities.base_entities", "type": "source", "succ": [11, 25, 26, 27, 28]}, "5": {"name": "source.dse_analytics.state_entities.ebudget_agency_and_department_budgets", "type": "source", "succ": [12, 29, 30, 31, 32]}, "6": {"name": "source.dse_analytics.state_entities.ebudget_program_budgets", "type": "source", "succ": [33]}, "7": {"name": "model.dse_analytics.dim_state_entities__agencies", "type": "model", "succ": [15, 16, 17, 18]}, "8": {"name": "model.dse_analytics.int_state_entities__active", "type": "model", "succ": [7, 13, 21, 22]}, "9": {"name": "model.dse_analytics.geo_reference__global_ml_building_footprints_with_tiger", "type": "model"}, "10": {"name": "model.dse_analytics.geo_reference__us_building_footprints_with_tiger", "type": "model"}, "11": {"name": "model.dse_analytics.stg_department_of_finance__entities", "type": "model", "succ": [8, 14, 19]}, "12": {"name": "model.dse_analytics.stg_ebudget__budgets", "type": "model", "succ": [13, 20]}, "13": {"name": "model.dse_analytics.int_state_entities__budgets", "type": "model", "succ": [24]}, "14": {"name": "model.dse_analytics.int_state_entities__technical", "type": "model", "succ": [23]}, "15": {"name": "test.dse_analytics.unique_dim_state_entities__agencies_name.44718adf21", "type": "test"}, "16": {"name": "test.dse_analytics.not_null_dim_state_entities__agencies_name.d3e367a95b", "type": "test"}, "17": {"name": "test.dse_analytics.unique_dim_state_entities__agencies_agency_code.efd290c291", "type": "test"}, "18": {"name": "test.dse_analytics.not_null_dim_state_entities__agencies_agency_code.8ad3a79d8e", "type": "test"}, "19": {"name": "test.dse_analytics.not_null_stg_department_of_finance__entities_primary_code.ab13df4014", "type": "test"}, "20": {"name": "test.dse_analytics.not_null_stg_ebudget__budgets_primary_code.11fe170121", "type": "test"}, "21": {"name": "test.dse_analytics.not_null_int_state_entities__active_primary_code.498a9cc863", "type": "test"}, "22": {"name": "test.dse_analytics.unique_int_state_entities__active_primary_code.cb6e0784fe", "type": "test"}, "23": {"name": "test.dse_analytics.not_null_int_state_entities__technical_primary_code.92bdfb0772", "type": "test"}, "24": {"name": "test.dse_analytics.not_null_int_state_entities__budgets_primary_code.2d2e132a3f", "type": "test"}, "25": {"name": "test.dse_analytics.dbt_utils_source_unique_combination_of_columns_state_entities_base_entities_A__B__L1__L2__L3.3301323f63", "type": "test"}, "26": {"name": "test.dse_analytics.source_not_null_state_entities_base_entities__A_.3b0e8bceb2", "type": "test"}, "27": {"name": "test.dse_analytics.source_unique_state_entities_base_entities__L3_.28ccbe9ad4", "type": "test"}, "28": {"name": "test.dse_analytics.source_not_null_state_entities_base_entities__name_.175bb24173", "type": "test"}, "29": {"name": "test.dse_analytics.source_not_null_state_entities_ebudget_agency_and_department_budgets__org_cd_.6650ae0ce8", "type": "test"}, "30": {"name": 
"test.dse_analytics.source_unique_state_entities_ebudget_agency_and_department_budgets__org_cd_.f2687093d8", "type": "test"}, "31": {"name": "test.dse_analytics.source_not_null_state_entities_ebudget_agency_and_department_budgets__web_agency_cd_.1100593dca", "type": "test"}, "32": {"name": "test.dse_analytics.source_unique_state_entities_ebudget_agency_and_department_budgets__web_agency_cd_.926c843b75", "type": "test"}, "33": {"name": "test.dse_analytics.source_not_null_state_entities_ebudget_program_budgets__program_code_.29940cbe43", "type": "test"}}}
\ No newline at end of file
diff --git a/dbt_docs_snowflake/index.html b/dbt_docs_snowflake/index.html
new file mode 100644
index 00000000..ac81278c
--- /dev/null
+++ b/dbt_docs_snowflake/index.html
@@ -0,0 +1,102 @@
+  dbt Docs
diff --git a/dbt_docs_snowflake/manifest.json b/dbt_docs_snowflake/manifest.json
new file mode 100644
index 00000000..a708f9af
--- /dev/null
+++ b/dbt_docs_snowflake/manifest.json
@@ -0,0 +1 @@
+{"metadata": {"dbt_schema_version": "https://schemas.getdbt.com/dbt/manifest/v10.json", "dbt_version": "1.6.0", "generated_at": "2023-12-07T18:21:06.697792Z", "invocation_id": "73752f3e-f0c3-4ad7-a412-541df042f42a", "env": {}, "project_name": "dse_analytics", "project_id": "2e3f03ee3eb6abe1cc4ac52f9a54cbe1", "user_id": null, "send_anonymous_usage_stats": false, "adapter_type": "snowflake"}, "nodes": {"model.dse_analytics.dim_state_entities__agencies": {"database": "ANALYTICS_DEV", "schema": "ci_should_not_create_this_schema_state_entities", "name": "dim_state_entities__agencies", "resource_type": "model", "package_name": "dse_analytics", "path": "marts/state_entities/dim_state_entities__agencies.sql", "original_file_path": "models/marts/state_entities/dim_state_entities__agencies.sql", "unique_id": "model.dse_analytics.dim_state_entities__agencies", "fqn": ["dse_analytics", "marts", "state_entities", "dim_state_entities__agencies"], "alias": "dim_state_entities__agencies", "checksum": {"name": "sha256", "checksum": "dccd79ce6f003f3f2ef316946efdd19fe8de6fb32b9fb738c4cf0583b9ad8e7d"}, "config": {"enabled": true, "alias": null, "schema": "state_entities", "database": "ANALYTICS_DEV", "tags": [], "meta": {}, "group": null, "materialized": "table", "incremental_strategy": null, "persist_docs": {}, "quoting": {}, "column_types": {}, "full_refresh": null, "unique_key": null, "on_schema_change": "ignore", "on_configuration_change": "apply", "grants": {}, "packages": [], "docs": {"show": true, "node_color": null}, "contract": {"enforced": false}, "post-hook": [], "pre-hook": []}, "tags": [], "description": "Agency-level state entities.", "columns": {"name": {"name": "name", "description": "The name of the state agency", "meta": {}, "data_type": null, "constraints": [], "quote": null, "tags": []}, "agency_code": {"name": "agency_code", "description": "The numeric code of the state agency", "meta": {}, "data_type": null, "constraints": [], "quote": null, "tags": []}}, "meta": {}, "group": null, "docs": {"show": true, "node_color": null}, "patch_path": "dse_analytics://models/marts/state_entities/_state_entities__models.yml", "build_path": null, "deferred": false, "unrendered_config": {"materialized": "table", "database": "{{ env_var('DBT_ANALYTICS_DB', 'ANALYTICS_DEV') }}", "schema": "state_entities"}, "created_at": 1701973267.8124297, "relation_name": "ANALYTICS_DEV.ci_should_not_create_this_schema_state_entities.dim_state_entities__agencies", "raw_code": "{{ config(materialized=\"table\") }}\n\nwith\nagencies as (\n select\n name,\n agency_code\n from {{ ref(\"int_state_entities__active\") }}\n where subagency_code is null and l1 is null\n)\n\nselect *\nfrom agencies", "language": "sql", "refs": [{"name": "int_state_entities__active", "package": null, "version": null}], "sources": [], "metrics": [], "depends_on": {"macros": [], "nodes": ["model.dse_analytics.int_state_entities__active"]}, "compiled_path": "target/compiled/dse_analytics/models/marts/state_entities/dim_state_entities__agencies.sql", "compiled": true, "compiled_code": "\n\nwith\nagencies as (\n select\n name,\n agency_code\n from TRANSFORM_DEV.ci_should_not_create_this_schema_state_entities.int_state_entities__active\n where subagency_code is null and l1 is null\n)\n\nselect *\nfrom agencies", "extra_ctes_injected": true, "extra_ctes": [], "contract": {"enforced": false, "checksum": null}, "access": "protected", "constraints": [], "version": null, "latest_version": null, "deprecation_date": null}, 
"model.dse_analytics.geo_reference__global_ml_building_footprints_with_tiger": {"database": "ANALYTICS_DEV", "schema": "ci_should_not_create_this_schema", "name": "geo_reference__global_ml_building_footprints_with_tiger", "resource_type": "model", "package_name": "dse_analytics", "path": "marts/geo_reference/geo_reference__global_ml_building_footprints_with_tiger.sql", "original_file_path": "models/marts/geo_reference/geo_reference__global_ml_building_footprints_with_tiger.sql", "unique_id": "model.dse_analytics.geo_reference__global_ml_building_footprints_with_tiger", "fqn": ["dse_analytics", "marts", "geo_reference", "geo_reference__global_ml_building_footprints_with_tiger"], "alias": "geo_reference__global_ml_building_footprints_with_tiger", "checksum": {"name": "sha256", "checksum": "161612d788a87c00df7b7729d2fa77a3045d190e93989605fb1bbc20875e1d85"}, "config": {"enabled": true, "alias": null, "schema": null, "database": "ANALYTICS_DEV", "tags": [], "meta": {}, "group": null, "materialized": "table", "incremental_strategy": null, "persist_docs": {}, "quoting": {}, "column_types": {}, "full_refresh": null, "unique_key": null, "on_schema_change": "ignore", "on_configuration_change": "apply", "grants": {}, "packages": [], "docs": {"show": true, "node_color": null}, "contract": {"enforced": false}, "post-hook": [], "pre-hook": []}, "tags": [], "description": "This data table is a join of the TIGER data for blocks, tracts, counties, and\nplaces with the Microsoft Global ML Building Footprints data for the state of CA.\n", "columns": {"height": {"name": "height", "description": "The height of the building (negative indicates unknown height)", "meta": {}, "data_type": null, "constraints": [], "quote": null, "tags": []}, "county_fips": {"name": "county_fips", "description": "2020 Census county FIPS code", "meta": {}, "data_type": null, "constraints": [], "quote": null, "tags": []}, "tract": {"name": "tract", "description": "2020 Census tract code", "meta": {}, "data_type": null, "constraints": [], "quote": null, "tags": []}, "block": {"name": "block", "description": "2020 Census tabulation block number", "meta": {}, "data_type": null, "constraints": [], "quote": null, "tags": []}, "block_geoid": {"name": "block_geoid", "description": "Census block identifier; a concatenation of 2020 Census state FIPS code, 2020 Census county FIPS code, 2020 Census tract code, and 2020 Census block number\n", "meta": {}, "data_type": null, "constraints": [], "quote": null, "tags": []}, "place_fips": {"name": "place_fips", "description": "Current place FIPS code", "meta": {}, "data_type": null, "constraints": [], "quote": null, "tags": []}, "place_ns": {"name": "place_ns", "description": "Current place GNIS code", "meta": {}, "data_type": null, "constraints": [], "quote": null, "tags": []}, "place_geoid": {"name": "place_geoid", "description": "Place identifier; a concatenation of the current state FIPS code and place FIPS code\n", "meta": {}, "data_type": null, "constraints": [], "quote": null, "tags": []}, "place_name": {"name": "place_name", "description": "Current name and the translated legal/statistical area description for place\n", "meta": {}, "data_type": null, "constraints": [], "quote": null, "tags": []}, "class_fips_code": {"name": "class_fips_code", "description": "Current FIPS class code", "meta": {}, "data_type": null, "constraints": [], "quote": null, "tags": []}, "class_fips": {"name": "class_fips", "description": "Current FIPS class definition", "meta": {}, "data_type": null, "constraints": [], 
"quote": null, "tags": []}, "geometry": {"name": "geometry", "description": "The footprint geometry", "meta": {}, "data_type": null, "constraints": [], "quote": null, "tags": []}, "area_sqm": {"name": "area_sqm", "description": "The area of the footprint in square meters", "meta": {}, "data_type": null, "constraints": [], "quote": null, "tags": []}}, "meta": {}, "group": null, "docs": {"show": true, "node_color": null}, "patch_path": "dse_analytics://models/marts/geo_reference/_geo_reference__models.yml", "build_path": null, "deferred": false, "unrendered_config": {"materialized": "table", "database": "{{ env_var('DBT_ANALYTICS_DB', 'ANALYTICS_DEV') }}"}, "created_at": 1701973267.8352768, "relation_name": "ANALYTICS_DEV.ci_should_not_create_this_schema.geo_reference__global_ml_building_footprints_with_tiger", "raw_code": "with footprints as (\n select\n \"height\",\n \"geometry\"\n from {{ source('building_footprints', 'global_ml_building_footprints') }}\n),\n\nblocks_source as (\n select *\n from {{ source('tiger_2022', 'blocks') }}\n),\n\nplaces_source as (\n select * from {{ source('tiger_2022', 'places') }}\n),\n\nblocks as (\n select\n \"COUNTYFP20\" as \"county_fips\",\n \"TRACTCE20\" as \"tract\",\n \"BLOCKCE20\" as \"block\",\n \"GEOID20\" as \"block_geoid\",\n \"geometry\"\n from blocks_source\n),\n\nplaces as (\n select\n \"PLACEFP\" as \"place_fips\",\n \"PLACENS\" as \"place_ns\",\n \"GEOID\" as \"place_geoid\",\n \"NAME\" as \"place_name\",\n \"CLASSFP\" as \"class_fips_code\",\n {{ map_class_fips(\"CLASSFP\") }} as \"class_fips\",\n \"geometry\"\n from places_source\n),\n\nfootprints_with_blocks as (\n {{ spatial_join_with_deduplication(\n \"footprints\",\n \"blocks\",\n ['\"height\"'],\n ['\"county_fips\"', '\"tract\"', '\"block\"', '\"block_geoid\"'],\n left_geom='\"geometry\"',\n right_geom='\"geometry\"',\n kind=\"inner\",\n prefix=\"b\",\n ) }}\n),\n\nfootprints_with_blocks_and_places as (\n {{ spatial_join_with_deduplication(\n \"footprints_with_blocks\",\n \"places\",\n ['\"height\"', '\"county_fips\"', '\"tract\"', '\"block\"', '\"block_geoid\"'],\n ['\"place_fips\"', '\"place_ns\"', '\"place_geoid\"', '\"place_name\"', '\"class_fips_code\"', '\"class_fips\"'],\n left_geom='\"geometry\"',\n right_geom='\"geometry\"',\n kind=\"left\",\n prefix=\"p\",\n ) }}\n),\n\nfootprints_with_blocks_and_places_final as (\n select\n *,\n st_area(\"geometry\") as \"area_sqm\"\n from footprints_with_blocks_and_places\n)\n\nselect * from footprints_with_blocks_and_places_final", "language": "sql", "refs": [], "sources": [["building_footprints", "global_ml_building_footprints"], ["tiger_2022", "blocks"], ["tiger_2022", "places"]], "metrics": [], "depends_on": {"macros": ["macro.dse_analytics.map_class_fips", "macro.dse_analytics.spatial_join_with_deduplication"], "nodes": ["source.dse_analytics.building_footprints.global_ml_building_footprints", "source.dse_analytics.tiger_2022.blocks", "source.dse_analytics.tiger_2022.places"]}, "compiled_path": "target/compiled/dse_analytics/models/marts/geo_reference/geo_reference__global_ml_building_footprints_with_tiger.sql", "compiled": true, "compiled_code": "with footprints as (\n select\n \"height\",\n \"geometry\"\n from RAW_DEV.building_footprints.global_ml_building_footprints\n),\n\nblocks_source as (\n select *\n from RAW_DEV.tiger_2022.blocks\n),\n\nplaces_source as (\n select * from RAW_DEV.tiger_2022.places\n),\n\nblocks as (\n select\n \"COUNTYFP20\" as \"county_fips\",\n \"TRACTCE20\" as \"tract\",\n \"BLOCKCE20\" as \"block\",\n 
\"GEOID20\" as \"block_geoid\",\n \"geometry\"\n from blocks_source\n),\n\nplaces as (\n select\n \"PLACEFP\" as \"place_fips\",\n \"PLACENS\" as \"place_ns\",\n \"GEOID\" as \"place_geoid\",\n \"NAME\" as \"place_name\",\n \"CLASSFP\" as \"class_fips_code\",\n \n\ncase\n when \"CLASSFP\" = 'M2'\n then 'A military or other defense installation entirely within a place'\n when \"CLASSFP\" = 'C1'\n then 'An active incorporated place that does not serve as a county subdivision equivalent'\n when \"CLASSFP\" = 'U1'\n then 'A census designated place with an official federally recognized name'\n when \"CLASSFP\" = 'U2'\n then 'A census designated place without an official federally recognized name'\n end as \"class_fips\",\n \"geometry\"\n from places_source\n),\n\nfootprints_with_blocks as (\n \n\nwith b_left_model_with_id as (\n select\n /* Generate a temporary ID for footprints. We will need this to group/partition\n by unique footprints further down. We could use a UUID, but integers are\n cheaper to generate and compare. */\n *, seq4() as _tmp_sjoin_id\n from footprints\n),\n\nb_joined as (\n select\n b_left_model_with_id.\"height\",\n blocks.\"county_fips\",\n blocks.\"tract\",\n blocks.\"block\",\n blocks.\"block_geoid\",\n b_left_model_with_id.\"geometry\",\n /* We don't actually need the intersection for every geometry, only for the\n ones that intersect more than one. However, in order to establish which\n ones intersect more than one, we need a windowed COUNT partitioned by\n _tmp_sjoin_id. This is an expensive operation, as it likely triggers a shuffle\n (even though it should already be sorted by _tmp_id). In testing we've found\n that it's cheaper to just do the intersection for all the geometries. */\n st_area(\n st_intersection(b_left_model_with_id.\"geometry\", blocks.\"geometry\")\n ) as _tmp_sjoin_intersection,\n b_left_model_with_id._tmp_sjoin_id\n from b_left_model_with_id\n inner join blocks\n on st_intersects(b_left_model_with_id.\"geometry\", blocks.\"geometry\")\n),\n\nb_deduplicated as (\n select\n -- Snowflake doesn't support geometries in max_by. It should, but it doesn't.\n -- Fortunately, we know that the geometries are identical when partitioned\n -- by _tmp_sjoin_id, so we can just choose any_value.\n any_value(\"geometry\") as \"geometry\",\n -- max_by returns null if all the values in a group are null. So if we have a left\n -- join, we need to guard against nulls with a coalesce to return the single value\n max_by(\"height\", coalesce(_tmp_sjoin_intersection, 1.0)) as \"height\",\n -- max_by returns null if all the values in a group are null. So if we have a left\n -- join, we need to guard against nulls with a coalesce to return the single value\n max_by(\"county_fips\", coalesce(_tmp_sjoin_intersection, 1.0)) as \"county_fips\",\n -- max_by returns null if all the values in a group are null. So if we have a left\n -- join, we need to guard against nulls with a coalesce to return the single value\n max_by(\"tract\", coalesce(_tmp_sjoin_intersection, 1.0)) as \"tract\",\n -- max_by returns null if all the values in a group are null. So if we have a left\n -- join, we need to guard against nulls with a coalesce to return the single value\n max_by(\"block\", coalesce(_tmp_sjoin_intersection, 1.0)) as \"block\",\n -- max_by returns null if all the values in a group are null. 
So if we have a left\n -- join, we need to guard against nulls with a coalesce to return the single value\n max_by(\"block_geoid\", coalesce(_tmp_sjoin_intersection, 1.0)) as \"block_geoid\"\n from b_joined\n group by _tmp_sjoin_id\n)\n\nselect * from b_deduplicated\n),\n\nfootprints_with_blocks_and_places as (\n \n\nwith p_left_model_with_id as (\n select\n /* Generate a temporary ID for footprints. We will need this to group/partition\n by unique footprints further down. We could use a UUID, but integers are\n cheaper to generate and compare. */\n *, seq4() as _tmp_sjoin_id\n from footprints_with_blocks\n),\n\np_joined as (\n select\n p_left_model_with_id.\"height\",\n p_left_model_with_id.\"county_fips\",\n p_left_model_with_id.\"tract\",\n p_left_model_with_id.\"block\",\n p_left_model_with_id.\"block_geoid\",\n places.\"place_fips\",\n places.\"place_ns\",\n places.\"place_geoid\",\n places.\"place_name\",\n places.\"class_fips_code\",\n places.\"class_fips\",\n p_left_model_with_id.\"geometry\",\n /* We don't actually need the intersection for every geometry, only for the\n ones that intersect more than one. However, in order to establish which\n ones intersect more than one, we need a windowed COUNT partitioned by\n _tmp_sjoin_id. This is an expensive operation, as it likely triggers a shuffle\n (even though it should already be sorted by _tmp_id). In testing we've found\n that it's cheaper to just do the intersection for all the geometries. */\n st_area(\n st_intersection(p_left_model_with_id.\"geometry\", places.\"geometry\")\n ) as _tmp_sjoin_intersection,\n p_left_model_with_id._tmp_sjoin_id\n from p_left_model_with_id\n left join places\n on st_intersects(p_left_model_with_id.\"geometry\", places.\"geometry\")\n),\n\np_deduplicated as (\n select\n -- Snowflake doesn't support geometries in max_by. It should, but it doesn't.\n -- Fortunately, we know that the geometries are identical when partitioned\n -- by _tmp_sjoin_id, so we can just choose any_value.\n any_value(\"geometry\") as \"geometry\",\n -- max_by returns null if all the values in a group are null. So if we have a left\n -- join, we need to guard against nulls with a coalesce to return the single value\n max_by(\"height\", coalesce(_tmp_sjoin_intersection, 1.0)) as \"height\",\n -- max_by returns null if all the values in a group are null. So if we have a left\n -- join, we need to guard against nulls with a coalesce to return the single value\n max_by(\"county_fips\", coalesce(_tmp_sjoin_intersection, 1.0)) as \"county_fips\",\n -- max_by returns null if all the values in a group are null. So if we have a left\n -- join, we need to guard against nulls with a coalesce to return the single value\n max_by(\"tract\", coalesce(_tmp_sjoin_intersection, 1.0)) as \"tract\",\n -- max_by returns null if all the values in a group are null. So if we have a left\n -- join, we need to guard against nulls with a coalesce to return the single value\n max_by(\"block\", coalesce(_tmp_sjoin_intersection, 1.0)) as \"block\",\n -- max_by returns null if all the values in a group are null. So if we have a left\n -- join, we need to guard against nulls with a coalesce to return the single value\n max_by(\"block_geoid\", coalesce(_tmp_sjoin_intersection, 1.0)) as \"block_geoid\",\n -- max_by returns null if all the values in a group are null. 
So if we have a left\n -- join, we need to guard against nulls with a coalesce to return the single value\n max_by(\"place_fips\", coalesce(_tmp_sjoin_intersection, 1.0)) as \"place_fips\",\n -- max_by returns null if all the values in a group are null. So if we have a left\n -- join, we need to guard against nulls with a coalesce to return the single value\n max_by(\"place_ns\", coalesce(_tmp_sjoin_intersection, 1.0)) as \"place_ns\",\n -- max_by returns null if all the values in a group are null. So if we have a left\n -- join, we need to guard against nulls with a coalesce to return the single value\n max_by(\"place_geoid\", coalesce(_tmp_sjoin_intersection, 1.0)) as \"place_geoid\",\n -- max_by returns null if all the values in a group are null. So if we have a left\n -- join, we need to guard against nulls with a coalesce to return the single value\n max_by(\"place_name\", coalesce(_tmp_sjoin_intersection, 1.0)) as \"place_name\",\n -- max_by returns null if all the values in a group are null. So if we have a left\n -- join, we need to guard against nulls with a coalesce to return the single value\n max_by(\"class_fips_code\", coalesce(_tmp_sjoin_intersection, 1.0)) as \"class_fips_code\",\n -- max_by returns null if all the values in a group are null. So if we have a left\n -- join, we need to guard against nulls with a coalesce to return the single value\n max_by(\"class_fips\", coalesce(_tmp_sjoin_intersection, 1.0)) as \"class_fips\"\n from p_joined\n group by _tmp_sjoin_id\n)\n\nselect * from p_deduplicated\n),\n\nfootprints_with_blocks_and_places_final as (\n select\n *,\n st_area(\"geometry\") as \"area_sqm\"\n from footprints_with_blocks_and_places\n)\n\nselect * from footprints_with_blocks_and_places_final", "extra_ctes_injected": true, "extra_ctes": [], "contract": {"enforced": false, "checksum": null}, "access": "protected", "constraints": [], "version": null, "latest_version": null, "deprecation_date": null}, "model.dse_analytics.geo_reference__us_building_footprints_with_tiger": {"database": "ANALYTICS_DEV", "schema": "ci_should_not_create_this_schema", "name": "geo_reference__us_building_footprints_with_tiger", "resource_type": "model", "package_name": "dse_analytics", "path": "marts/geo_reference/geo_reference__us_building_footprints_with_tiger.sql", "original_file_path": "models/marts/geo_reference/geo_reference__us_building_footprints_with_tiger.sql", "unique_id": "model.dse_analytics.geo_reference__us_building_footprints_with_tiger", "fqn": ["dse_analytics", "marts", "geo_reference", "geo_reference__us_building_footprints_with_tiger"], "alias": "geo_reference__us_building_footprints_with_tiger", "checksum": {"name": "sha256", "checksum": "586fa37dc82e3259c0989706c69352c88285d0212f844f6574c601dc6c5b77fe"}, "config": {"enabled": true, "alias": null, "schema": null, "database": "ANALYTICS_DEV", "tags": [], "meta": {}, "group": null, "materialized": "table", "incremental_strategy": null, "persist_docs": {}, "quoting": {}, "column_types": {}, "full_refresh": null, "unique_key": null, "on_schema_change": "ignore", "on_configuration_change": "apply", "grants": {}, "packages": [], "docs": {"show": true, "node_color": null}, "contract": {"enforced": false}, "post-hook": [], "pre-hook": []}, "tags": [], "description": "This data table is a join of the TIGER data for blocks, tracts, counties, and\nplaces with the Microsoft US Building Footprints data for the state of CA.\n", "columns": {"release": {"name": "release", "description": "The version of the data", "meta": {}, 
"data_type": null, "constraints": [], "quote": null, "tags": []}, "capture_dates_range": {"name": "capture_dates_range", "description": "Each building footprint has a capture date tag from 2019-2020", "meta": {}, "data_type": null, "constraints": [], "quote": null, "tags": []}, "county_fips": {"name": "county_fips", "description": "2020 Census county FIPS code", "meta": {}, "data_type": null, "constraints": [], "quote": null, "tags": []}, "tract": {"name": "tract", "description": "2020 Census tract code", "meta": {}, "data_type": null, "constraints": [], "quote": null, "tags": []}, "block": {"name": "block", "description": "2020 Census tabulation block number", "meta": {}, "data_type": null, "constraints": [], "quote": null, "tags": []}, "block_geoid": {"name": "block_geoid", "description": "Census block identifier; a concatenation of 2020 Census state FIPS code, 2020 Census county FIPS code, 2020 Census tract code, and 2020 Census block number\n", "meta": {}, "data_type": null, "constraints": [], "quote": null, "tags": []}, "place_fips": {"name": "place_fips", "description": "Current place FIPS code", "meta": {}, "data_type": null, "constraints": [], "quote": null, "tags": []}, "place_ns": {"name": "place_ns", "description": "Current place GNIS code", "meta": {}, "data_type": null, "constraints": [], "quote": null, "tags": []}, "place_geoid": {"name": "place_geoid", "description": "Place identifier; a concatenation of the current state FIPS code and place FIPS code\n", "meta": {}, "data_type": null, "constraints": [], "quote": null, "tags": []}, "place_name": {"name": "place_name", "description": "Current name and the translated legal/statistical area description for place\n", "meta": {}, "data_type": null, "constraints": [], "quote": null, "tags": []}, "class_fips_code": {"name": "class_fips_code", "description": "Current FIPS class code", "meta": {}, "data_type": null, "constraints": [], "quote": null, "tags": []}, "class_fips": {"name": "class_fips", "description": "Current FIPS class definition", "meta": {}, "data_type": null, "constraints": [], "quote": null, "tags": []}, "geometry": {"name": "geometry", "description": "The footprint geometry", "meta": {}, "data_type": null, "constraints": [], "quote": null, "tags": []}, "area_sqm": {"name": "area_sqm", "description": "The area of the footprint in square meters", "meta": {}, "data_type": null, "constraints": [], "quote": null, "tags": []}}, "meta": {}, "group": null, "docs": {"show": true, "node_color": null}, "patch_path": "dse_analytics://models/marts/geo_reference/_geo_reference__models.yml", "build_path": null, "deferred": false, "unrendered_config": {"materialized": "table", "database": "{{ env_var('DBT_ANALYTICS_DB', 'ANALYTICS_DEV') }}"}, "created_at": 1701973267.8339899, "relation_name": "ANALYTICS_DEV.ci_should_not_create_this_schema.geo_reference__us_building_footprints_with_tiger", "raw_code": "with footprints as (\n select\n \"release\",\n \"capture_dates_range\",\n \"geometry\"\n from {{ source('building_footprints', 'us_building_footprints') }}\n),\n\nblocks_source as (\n select *\n from {{ source('tiger_2022', 'blocks') }}\n),\n\nplaces_source as (\n select * from {{ source('tiger_2022', 'places') }}\n),\n\nblocks as (\n select\n \"COUNTYFP20\" as \"county_fips\",\n \"TRACTCE20\" as \"tract\",\n \"BLOCKCE20\" as \"block\",\n \"GEOID20\" as \"block_geoid\",\n \"geometry\"\n from blocks_source\n),\n\nplaces as (\n select\n \"PLACEFP\" as \"place_fips\",\n \"PLACENS\" as \"place_ns\",\n \"GEOID\" as \"place_geoid\",\n 
\"NAME\" as \"place_name\",\n \"CLASSFP\" as \"class_fips_code\",\n {{ map_class_fips(\"CLASSFP\") }} as \"class_fips\",\n \"geometry\"\n from places_source\n),\n\nfootprints_with_blocks as (\n {{ spatial_join_with_deduplication(\n \"footprints\",\n \"blocks\",\n ['\"release\"', '\"capture_dates_range\"'],\n ['\"county_fips\"', '\"tract\"', '\"block\"', '\"block_geoid\"'],\n left_geom='\"geometry\"',\n right_geom='\"geometry\"',\n kind=\"inner\",\n prefix=\"b\",\n ) }}\n),\n\nfootprints_with_blocks_and_places as (\n {{ spatial_join_with_deduplication(\n \"footprints_with_blocks\",\n \"places\",\n ['\"release\"', '\"capture_dates_range\"', '\"county_fips\"', '\"tract\"', '\"block\"', '\"block_geoid\"'],\n ['\"place_fips\"', '\"place_ns\"', '\"place_geoid\"', '\"place_name\"', '\"class_fips_code\"', '\"class_fips\"'],\n left_geom='\"geometry\"',\n right_geom='\"geometry\"',\n kind=\"left\",\n prefix=\"p\",\n ) }}\n),\n\nfootprints_with_blocks_and_places_final as (\n select\n *,\n st_area(\"geometry\") as \"area_sqm\"\n from footprints_with_blocks_and_places\n)\n\nselect * from footprints_with_blocks_and_places_final", "language": "sql", "refs": [], "sources": [["building_footprints", "us_building_footprints"], ["tiger_2022", "blocks"], ["tiger_2022", "places"]], "metrics": [], "depends_on": {"macros": ["macro.dse_analytics.map_class_fips", "macro.dse_analytics.spatial_join_with_deduplication"], "nodes": ["source.dse_analytics.building_footprints.us_building_footprints", "source.dse_analytics.tiger_2022.blocks", "source.dse_analytics.tiger_2022.places"]}, "compiled_path": "target/compiled/dse_analytics/models/marts/geo_reference/geo_reference__us_building_footprints_with_tiger.sql", "compiled": true, "compiled_code": "with footprints as (\n select\n \"release\",\n \"capture_dates_range\",\n \"geometry\"\n from RAW_DEV.building_footprints.us_building_footprints\n),\n\nblocks_source as (\n select *\n from RAW_DEV.tiger_2022.blocks\n),\n\nplaces_source as (\n select * from RAW_DEV.tiger_2022.places\n),\n\nblocks as (\n select\n \"COUNTYFP20\" as \"county_fips\",\n \"TRACTCE20\" as \"tract\",\n \"BLOCKCE20\" as \"block\",\n \"GEOID20\" as \"block_geoid\",\n \"geometry\"\n from blocks_source\n),\n\nplaces as (\n select\n \"PLACEFP\" as \"place_fips\",\n \"PLACENS\" as \"place_ns\",\n \"GEOID\" as \"place_geoid\",\n \"NAME\" as \"place_name\",\n \"CLASSFP\" as \"class_fips_code\",\n \n\ncase\n when \"CLASSFP\" = 'M2'\n then 'A military or other defense installation entirely within a place'\n when \"CLASSFP\" = 'C1'\n then 'An active incorporated place that does not serve as a county subdivision equivalent'\n when \"CLASSFP\" = 'U1'\n then 'A census designated place with an official federally recognized name'\n when \"CLASSFP\" = 'U2'\n then 'A census designated place without an official federally recognized name'\n end as \"class_fips\",\n \"geometry\"\n from places_source\n),\n\nfootprints_with_blocks as (\n \n\nwith b_left_model_with_id as (\n select\n /* Generate a temporary ID for footprints. We will need this to group/partition\n by unique footprints further down. We could use a UUID, but integers are\n cheaper to generate and compare. 
*/\n *, seq4() as _tmp_sjoin_id\n from footprints\n),\n\nb_joined as (\n select\n b_left_model_with_id.\"release\",\n b_left_model_with_id.\"capture_dates_range\",\n blocks.\"county_fips\",\n blocks.\"tract\",\n blocks.\"block\",\n blocks.\"block_geoid\",\n b_left_model_with_id.\"geometry\",\n /* We don't actually need the intersection for every geometry, only for the\n ones that intersect more than one. However, in order to establish which\n ones intersect more than one, we need a windowed COUNT partitioned by\n _tmp_sjoin_id. This is an expensive operation, as it likely triggers a shuffle\n (even though it should already be sorted by _tmp_id). In testing we've found\n that it's cheaper to just do the intersection for all the geometries. */\n st_area(\n st_intersection(b_left_model_with_id.\"geometry\", blocks.\"geometry\")\n ) as _tmp_sjoin_intersection,\n b_left_model_with_id._tmp_sjoin_id\n from b_left_model_with_id\n inner join blocks\n on st_intersects(b_left_model_with_id.\"geometry\", blocks.\"geometry\")\n),\n\nb_deduplicated as (\n select\n -- Snowflake doesn't support geometries in max_by. It should, but it doesn't.\n -- Fortunately, we know that the geometries are identical when partitioned\n -- by _tmp_sjoin_id, so we can just choose any_value.\n any_value(\"geometry\") as \"geometry\",\n -- max_by returns null if all the values in a group are null. So if we have a left\n -- join, we need to guard against nulls with a coalesce to return the single value\n max_by(\"release\", coalesce(_tmp_sjoin_intersection, 1.0)) as \"release\",\n -- max_by returns null if all the values in a group are null. So if we have a left\n -- join, we need to guard against nulls with a coalesce to return the single value\n max_by(\"capture_dates_range\", coalesce(_tmp_sjoin_intersection, 1.0)) as \"capture_dates_range\",\n -- max_by returns null if all the values in a group are null. So if we have a left\n -- join, we need to guard against nulls with a coalesce to return the single value\n max_by(\"county_fips\", coalesce(_tmp_sjoin_intersection, 1.0)) as \"county_fips\",\n -- max_by returns null if all the values in a group are null. So if we have a left\n -- join, we need to guard against nulls with a coalesce to return the single value\n max_by(\"tract\", coalesce(_tmp_sjoin_intersection, 1.0)) as \"tract\",\n -- max_by returns null if all the values in a group are null. So if we have a left\n -- join, we need to guard against nulls with a coalesce to return the single value\n max_by(\"block\", coalesce(_tmp_sjoin_intersection, 1.0)) as \"block\",\n -- max_by returns null if all the values in a group are null. So if we have a left\n -- join, we need to guard against nulls with a coalesce to return the single value\n max_by(\"block_geoid\", coalesce(_tmp_sjoin_intersection, 1.0)) as \"block_geoid\"\n from b_joined\n group by _tmp_sjoin_id\n)\n\nselect * from b_deduplicated\n),\n\nfootprints_with_blocks_and_places as (\n \n\nwith p_left_model_with_id as (\n select\n /* Generate a temporary ID for footprints. We will need this to group/partition\n by unique footprints further down. We could use a UUID, but integers are\n cheaper to generate and compare. 
*/\n *, seq4() as _tmp_sjoin_id\n from footprints_with_blocks\n),\n\np_joined as (\n select\n p_left_model_with_id.\"release\",\n p_left_model_with_id.\"capture_dates_range\",\n p_left_model_with_id.\"county_fips\",\n p_left_model_with_id.\"tract\",\n p_left_model_with_id.\"block\",\n p_left_model_with_id.\"block_geoid\",\n places.\"place_fips\",\n places.\"place_ns\",\n places.\"place_geoid\",\n places.\"place_name\",\n places.\"class_fips_code\",\n places.\"class_fips\",\n p_left_model_with_id.\"geometry\",\n /* We don't actually need the intersection for every geometry, only for the\n ones that intersect more than one. However, in order to establish which\n ones intersect more than one, we need a windowed COUNT partitioned by\n _tmp_sjoin_id. This is an expensive operation, as it likely triggers a shuffle\n (even though it should already be sorted by _tmp_id). In testing we've found\n that it's cheaper to just do the intersection for all the geometries. */\n st_area(\n st_intersection(p_left_model_with_id.\"geometry\", places.\"geometry\")\n ) as _tmp_sjoin_intersection,\n p_left_model_with_id._tmp_sjoin_id\n from p_left_model_with_id\n left join places\n on st_intersects(p_left_model_with_id.\"geometry\", places.\"geometry\")\n),\n\np_deduplicated as (\n select\n -- Snowflake doesn't support geometries in max_by. It should, but it doesn't.\n -- Fortunately, we know that the geometries are identical when partitioned\n -- by _tmp_sjoin_id, so we can just choose any_value.\n any_value(\"geometry\") as \"geometry\",\n -- max_by returns null if all the values in a group are null. So if we have a left\n -- join, we need to guard against nulls with a coalesce to return the single value\n max_by(\"release\", coalesce(_tmp_sjoin_intersection, 1.0)) as \"release\",\n -- max_by returns null if all the values in a group are null. So if we have a left\n -- join, we need to guard against nulls with a coalesce to return the single value\n max_by(\"capture_dates_range\", coalesce(_tmp_sjoin_intersection, 1.0)) as \"capture_dates_range\",\n -- max_by returns null if all the values in a group are null. So if we have a left\n -- join, we need to guard against nulls with a coalesce to return the single value\n max_by(\"county_fips\", coalesce(_tmp_sjoin_intersection, 1.0)) as \"county_fips\",\n -- max_by returns null if all the values in a group are null. So if we have a left\n -- join, we need to guard against nulls with a coalesce to return the single value\n max_by(\"tract\", coalesce(_tmp_sjoin_intersection, 1.0)) as \"tract\",\n -- max_by returns null if all the values in a group are null. So if we have a left\n -- join, we need to guard against nulls with a coalesce to return the single value\n max_by(\"block\", coalesce(_tmp_sjoin_intersection, 1.0)) as \"block\",\n -- max_by returns null if all the values in a group are null. So if we have a left\n -- join, we need to guard against nulls with a coalesce to return the single value\n max_by(\"block_geoid\", coalesce(_tmp_sjoin_intersection, 1.0)) as \"block_geoid\",\n -- max_by returns null if all the values in a group are null. So if we have a left\n -- join, we need to guard against nulls with a coalesce to return the single value\n max_by(\"place_fips\", coalesce(_tmp_sjoin_intersection, 1.0)) as \"place_fips\",\n -- max_by returns null if all the values in a group are null. 
So if we have a left\n -- join, we need to guard against nulls with a coalesce to return the single value\n max_by(\"place_ns\", coalesce(_tmp_sjoin_intersection, 1.0)) as \"place_ns\",\n -- max_by returns null if all the values in a group are null. So if we have a left\n -- join, we need to guard against nulls with a coalesce to return the single value\n max_by(\"place_geoid\", coalesce(_tmp_sjoin_intersection, 1.0)) as \"place_geoid\",\n -- max_by returns null if all the values in a group are null. So if we have a left\n -- join, we need to guard against nulls with a coalesce to return the single value\n max_by(\"place_name\", coalesce(_tmp_sjoin_intersection, 1.0)) as \"place_name\",\n -- max_by returns null if all the values in a group are null. So if we have a left\n -- join, we need to guard against nulls with a coalesce to return the single value\n max_by(\"class_fips_code\", coalesce(_tmp_sjoin_intersection, 1.0)) as \"class_fips_code\",\n -- max_by returns null if all the values in a group are null. So if we have a left\n -- join, we need to guard against nulls with a coalesce to return the single value\n max_by(\"class_fips\", coalesce(_tmp_sjoin_intersection, 1.0)) as \"class_fips\"\n from p_joined\n group by _tmp_sjoin_id\n)\n\nselect * from p_deduplicated\n),\n\nfootprints_with_blocks_and_places_final as (\n select\n *,\n st_area(\"geometry\") as \"area_sqm\"\n from footprints_with_blocks_and_places\n)\n\nselect * from footprints_with_blocks_and_places_final", "extra_ctes_injected": true, "extra_ctes": [], "contract": {"enforced": false, "checksum": null}, "access": "protected", "constraints": [], "version": null, "latest_version": null, "deprecation_date": null}, "model.dse_analytics.stg_department_of_finance__entities": {"database": "TRANSFORM_DEV", "schema": "ci_should_not_create_this_schema_department_of_finance", "name": "stg_department_of_finance__entities", "resource_type": "model", "package_name": "dse_analytics", "path": "staging/department_of_finance/stg_department_of_finance__entities.sql", "original_file_path": "models/staging/department_of_finance/stg_department_of_finance__entities.sql", "unique_id": "model.dse_analytics.stg_department_of_finance__entities", "fqn": ["dse_analytics", "staging", "department_of_finance", "stg_department_of_finance__entities"], "alias": "stg_department_of_finance__entities", "checksum": {"name": "sha256", "checksum": "bb3d4a6736c5b02840f8cf22888cae85d66cede925e1efa8626ab6aa2105250c"}, "config": {"enabled": true, "alias": null, "schema": "department_of_finance", "database": "TRANSFORM_DEV", "tags": [], "meta": {}, "group": null, "materialized": "table", "incremental_strategy": null, "persist_docs": {}, "quoting": {}, "column_types": {}, "full_refresh": null, "unique_key": null, "on_schema_change": "ignore", "on_configuration_change": "apply", "grants": {}, "packages": [], "docs": {"show": true, "node_color": null}, "contract": {"enforced": false}, "sql_header": "\n\n-- Warning! 
The SQL header is rendered separately from the rest of the template,\n-- so we redefine the udf_schema in this block:\n-- https://github.com/dbt-labs/dbt-core/issues/2793\n\n\ncreate or replace temp function\n PUBLIC.reorder_name_for_alphabetization(name string)\nreturns string\nlanguage javascript\nas\n $$\n // Replace fancy quotes with normal ones.\n const name = NAME.replace(\"\u2019\", \"'\");\n\n // Skip some exceptions\n const skip = [\"Governor's Office\"];\n if (skip.includes(name)) {\n return name;\n }\n\n // Annoying exceptions\n if (name.includes(\"Milton Marks\") && name.includes(\"Little Hoover\")) {\n return \"Little Hoover Commission\";\n }\n\n // Basic organizational types by which we don't want to organize.\n const patterns = [\n \"Office of the Secretary (?:for|of)?\",\n \"Commission (?:on|for)?\",\n \"Board of Governors (?:for|of)?\",\n \"Board (?:of|on|for)?\",\n \"Agency (?:on|for)?\",\n \"(?:Department|Dept\\\\.) of\",\n \"Commission (?:on|for)?\",\n \"Committee (?:on|for)?\",\n \"Bureau of\",\n \"Council on\",\n \"Policy Council on\",\n \"Institute of\",\n \"Office (?:for|of)?\",\n \"Secretary (?:for|of)?\",\n \"\", // Empty pattern to catch the prefixes below.\n ].map(\n // Lots of entities also start with throat clearing like \"California this\"\n // or \"State that\", which we also want to skip. Some also include a definite\n // article after the organizational unit.\n (p) =>\n \"(?:California\\\\s+)?(?:Governor's\\\\s+)?(?:State\\\\s+|St\\\\.\\\\s+)?(?:Intergovernmental\\\\s+)?\" +\n p +\n \"(?:\\\\s+the)?\"\n );\n\n const all_patterns = `(${patterns.join(\"|\")})`;\n const re = RegExp(`^${all_patterns}\\\\s*(.+)$`); // \\s* because some of the above eat spaces.\n const match = name.match(re);\n // Empty prefixes are matched, so skip if we don't get a full match.\n if (match && match[1] && match[2]) {\n return `${match[2].trim()}, ${match[1].trim()}`;\n } else {\n return name;\n }\n$$\n;\n\ncreate or replace temp function PUBLIC.extract_name(name string)\nreturns string\nlanguage javascript\nas $$\n const match = NAME.match(/^(.+?)(?:(?:\\s*\\(.*\\)\\s*|\\s*[-\u2013]+\\s*[A-Z/ ]+)*)$/);\n if (match && match[1]) {\n return match[1];\n }\n return NAME;\n$$\n;", "post-hook": [], "pre-hook": []}, "tags": [], "description": "Cleaned list of state entities per department of finance.\n", "columns": {"name": {"name": "name", "description": "Name of the entity", "meta": {}, "data_type": null, "constraints": [], "quote": null, "tags": []}, "primary_code": {"name": "primary_code", "description": "The most specific non-null entity code", "meta": {}, "data_type": null, "constraints": [], "quote": null, "tags": []}, "agency_code": {"name": "agency_code", "description": "Agency code for entity", "meta": {}, "data_type": null, "constraints": [], "quote": null, "tags": []}, "subagency_code": {"name": "subagency_code", "description": "Subagency code for entity", "meta": {}, "data_type": null, "constraints": [], "quote": null, "tags": []}, "L1": {"name": "L1", "description": "Level beneath subagency", "meta": {}, "data_type": null, "constraints": [], "quote": null, "tags": []}, "L2": {"name": "L2", "description": "Level beneath L1", "meta": {}, "data_type": null, "constraints": [], "quote": null, "tags": []}, "L3": {"name": "L3", "description": "Level beneath L2", "meta": {}, "data_type": null, "constraints": [], "quote": null, "tags": []}, "parenthetical": {"name": "parenthetical", "description": "Any text extracted from a paranthetical statement in the original text\n", "meta": 
{}, "data_type": null, "constraints": [], "quote": null, "tags": []}, "do_not_use": {"name": "do_not_use", "description": "Whether any entity features \"DO NOT USE\" in the description", "meta": {}, "data_type": null, "constraints": [], "quote": null, "tags": []}, "abolished": {"name": "abolished", "description": "Whether the entity features \"abolished\" in the description", "meta": {}, "data_type": null, "constraints": [], "quote": null, "tags": []}, "restricted_use": {"name": "restricted_use", "description": "Whether the entity contains a directive like \"DOF USE ONLY\" or \"SCO USE ONLY\"\nin the description.\n", "meta": {}, "data_type": null, "constraints": [], "quote": null, "tags": []}, "name_raw": {"name": "name_raw", "description": "The original name, as well as any parentheticals or directives for the entity.\n", "meta": {}, "data_type": null, "constraints": [], "quote": null, "tags": []}, "name_alpha": {"name": "name_alpha", "description": "The name with things like \"Office of\" moved to the end,\nsuitable for alphabetization.\n", "meta": {}, "data_type": null, "constraints": [], "quote": null, "tags": []}, "ucm_level": {"name": "ucm_level", "description": "The level in the hierarchy of the Uniform Control Manual\n(agency, subagency, L1, L2, or L3)\n", "meta": {}, "data_type": null, "constraints": [], "quote": null, "tags": []}}, "meta": {}, "group": null, "docs": {"show": true, "node_color": null}, "patch_path": "dse_analytics://models/staging/department_of_finance/_department_of_finance__models.yml", "build_path": null, "deferred": false, "unrendered_config": {"database": "{{ env_var('DBT_TRANSFORM_DB', 'TRANSFORM_DEV') }}", "schema": "department_of_finance", "materialized": "table", "sql_header": "\n\n-- Warning! The SQL header is rendered separately from the rest of the template,\n-- so we redefine the udf_schema in this block:\n-- https://github.com/dbt-labs/dbt-core/issues/2793\n\n\ncreate or replace temp function\n PUBLIC.reorder_name_for_alphabetization(name string)\nreturns string\nlanguage javascript\nas\n $$\n // Replace fancy quotes with normal ones.\n const name = NAME.replace(\"\u2019\", \"'\");\n\n // Skip some exceptions\n const skip = [\"Governor's Office\"];\n if (skip.includes(name)) {\n return name;\n }\n\n // Annoying exceptions\n if (name.includes(\"Milton Marks\") && name.includes(\"Little Hoover\")) {\n return \"Little Hoover Commission\";\n }\n\n // Basic organizational types by which we don't want to organize.\n const patterns = [\n \"Office of the Secretary (?:for|of)?\",\n \"Commission (?:on|for)?\",\n \"Board of Governors (?:for|of)?\",\n \"Board (?:of|on|for)?\",\n \"Agency (?:on|for)?\",\n \"(?:Department|Dept\\\\.) of\",\n \"Commission (?:on|for)?\",\n \"Committee (?:on|for)?\",\n \"Bureau of\",\n \"Council on\",\n \"Policy Council on\",\n \"Institute of\",\n \"Office (?:for|of)?\",\n \"Secretary (?:for|of)?\",\n \"\", // Empty pattern to catch the prefixes below.\n ].map(\n // Lots of entities also start with throat clearing like \"California this\"\n // or \"State that\", which we also want to skip. 
Some also include a definite\n // article after the organizational unit.\n (p) =>\n \"(?:California\\\\s+)?(?:Governor's\\\\s+)?(?:State\\\\s+|St\\\\.\\\\s+)?(?:Intergovernmental\\\\s+)?\" +\n p +\n \"(?:\\\\s+the)?\"\n );\n\n const all_patterns = `(${patterns.join(\"|\")})`;\n const re = RegExp(`^${all_patterns}\\\\s*(.+)$`); // \\s* because some of the above eat spaces.\n const match = name.match(re);\n // Empty prefixes are matched, so skip if we don't get a full match.\n if (match && match[1] && match[2]) {\n return `${match[2].trim()}, ${match[1].trim()}`;\n } else {\n return name;\n }\n$$\n;\n\ncreate or replace temp function PUBLIC.extract_name(name string)\nreturns string\nlanguage javascript\nas $$\n const match = NAME.match(/^(.+?)(?:(?:\\s*\\(.*\\)\\s*|\\s*[-\u2013]+\\s*[A-Z/ ]+)*)$/);\n if (match && match[1]) {\n return match[1];\n }\n return NAME;\n$$\n;"}, "created_at": 1701973267.851232, "relation_name": "TRANSFORM_DEV.ci_should_not_create_this_schema_department_of_finance.stg_department_of_finance__entities", "raw_code": "{{ config(materialized=\"table\") }}\n\n{% set udf_schema = \"PUBLIC\" %}\n\n{% call set_sql_header(config) %}\n\n-- Warning! The SQL header is rendered separately from the rest of the template,\n-- so we redefine the udf_schema in this block:\n-- https://github.com/dbt-labs/dbt-core/issues/2793\n{% set udf_schema = \"PUBLIC\" %}\n\ncreate or replace temp function\n {{ udf_schema }}.reorder_name_for_alphabetization(name string)\nreturns string\nlanguage javascript\nas\n $$\n // Replace fancy quotes with normal ones.\n const name = NAME.replace(\"\u2019\", \"'\");\n\n // Skip some exceptions\n const skip = [\"Governor's Office\"];\n if (skip.includes(name)) {\n return name;\n }\n\n // Annoying exceptions\n if (name.includes(\"Milton Marks\") && name.includes(\"Little Hoover\")) {\n return \"Little Hoover Commission\";\n }\n\n // Basic organizational types by which we don't want to organize.\n const patterns = [\n \"Office of the Secretary (?:for|of)?\",\n \"Commission (?:on|for)?\",\n \"Board of Governors (?:for|of)?\",\n \"Board (?:of|on|for)?\",\n \"Agency (?:on|for)?\",\n \"(?:Department|Dept\\\\.) of\",\n \"Commission (?:on|for)?\",\n \"Committee (?:on|for)?\",\n \"Bureau of\",\n \"Council on\",\n \"Policy Council on\",\n \"Institute of\",\n \"Office (?:for|of)?\",\n \"Secretary (?:for|of)?\",\n \"\", // Empty pattern to catch the prefixes below.\n ].map(\n // Lots of entities also start with throat clearing like \"California this\"\n // or \"State that\", which we also want to skip. 
Some also include a definite\n // article after the organizational unit.\n (p) =>\n \"(?:California\\\\s+)?(?:Governor's\\\\s+)?(?:State\\\\s+|St\\\\.\\\\s+)?(?:Intergovernmental\\\\s+)?\" +\n p +\n \"(?:\\\\s+the)?\"\n );\n\n const all_patterns = `(${patterns.join(\"|\")})`;\n const re = RegExp(`^${all_patterns}\\\\s*(.+)$`); // \\s* because some of the above eat spaces.\n const match = name.match(re);\n // Empty prefixes are matched, so skip if we don't get a full match.\n if (match && match[1] && match[2]) {\n return `${match[2].trim()}, ${match[1].trim()}`;\n } else {\n return name;\n }\n$$\n;\n\ncreate or replace temp function {{ udf_schema }}.extract_name(name string)\nreturns string\nlanguage javascript\nas $$\n const match = NAME.match(/^(.+?)(?:(?:\\s*\\(.*\\)\\s*|\\s*[-\u2013]+\\s*[A-Z/ ]+)*)$/);\n if (match && match[1]) {\n return match[1];\n }\n return NAME;\n$$\n;\n{%- endcall %}\n\nwith\nbase_entities as (select * from {{ source(\"state_entities\", \"base_entities\") }}),\n\ninvalid_subagencies as (\n select *\n from base_entities\n where contains(\"name\", 'no subagency') and contains(\"name\", 'do not use')\n),\n\nentities as (\n select\n -- Extract the first portion of the entity as the name. The other\n -- two (optional) groups match parentheticals and things like\n -- \"-- DO NOT USE\" or \" -- DOF USE ONLY\"\n {{ udf_schema }}.extract_name(\"name\") as name,\n coalesce(l3, l2, l1, b, a) as primary_code,\n a as agency_code,\n case\n when b in (select b from invalid_subagencies) then null else b\n end as subagency_code,\n l1,\n l2,\n l3,\n regexp_substr(\"name\", '\\\\((.+?)\\\\)') as parenthetical,\n contains(lower(\"name\"), 'do not use') as do_not_use,\n contains(lower(\"name\"), 'abolished') as abolished,\n regexp_substr(\"name\", '[A-Z/]+ USE ONLY') as restricted_use,\n \"name\" as name_raw\n from base_entities\n),\n\nentities_with_extras as (\n select\n *,\n {{ udf_schema }}.reorder_name_for_alphabetization(name) as name_alpha,\n case\n when coalesce(l3, l2, l1, subagency_code) is null\n then 'agency'\n when coalesce(l3, l2, l1) is null\n then 'subagency'\n when coalesce(l3, l2) is null\n then 'L1'\n when l3 is null\n then 'L2'\n else 'L3'\n end as ucm_level\n from entities\n)\n\nselect *\nfrom entities_with_extras", "language": "sql", "refs": [], "sources": [["state_entities", "base_entities"]], "metrics": [], "depends_on": {"macros": ["macro.dbt.set_sql_header"], "nodes": ["source.dse_analytics.state_entities.base_entities"]}, "compiled_path": "target/compiled/dse_analytics/models/staging/department_of_finance/stg_department_of_finance__entities.sql", "compiled": true, "compiled_code": "\n\n\n\n\n\nwith\nbase_entities as (select * from RAW_DEV.state_entities.base_entities),\n\ninvalid_subagencies as (\n select *\n from base_entities\n where contains(\"name\", 'no subagency') and contains(\"name\", 'do not use')\n),\n\nentities as (\n select\n -- Extract the first portion of the entity as the name. 
The other\n -- two (optional) groups match parentheticals and things like\n -- \"-- DO NOT USE\" or \" -- DOF USE ONLY\"\n PUBLIC.extract_name(\"name\") as name,\n coalesce(l3, l2, l1, b, a) as primary_code,\n a as agency_code,\n case\n when b in (select b from invalid_subagencies) then null else b\n end as subagency_code,\n l1,\n l2,\n l3,\n regexp_substr(\"name\", '\\\\((.+?)\\\\)') as parenthetical,\n contains(lower(\"name\"), 'do not use') as do_not_use,\n contains(lower(\"name\"), 'abolished') as abolished,\n regexp_substr(\"name\", '[A-Z/]+ USE ONLY') as restricted_use,\n \"name\" as name_raw\n from base_entities\n),\n\nentities_with_extras as (\n select\n *,\n PUBLIC.reorder_name_for_alphabetization(name) as name_alpha,\n case\n when coalesce(l3, l2, l1, subagency_code) is null\n then 'agency'\n when coalesce(l3, l2, l1) is null\n then 'subagency'\n when coalesce(l3, l2) is null\n then 'L1'\n when l3 is null\n then 'L2'\n else 'L3'\n end as ucm_level\n from entities\n)\n\nselect *\nfrom entities_with_extras", "extra_ctes_injected": true, "extra_ctes": [], "contract": {"enforced": false, "checksum": null}, "access": "protected", "constraints": [], "version": null, "latest_version": null, "deprecation_date": null}, "model.dse_analytics.stg_ebudget__budgets": {"database": "TRANSFORM_DEV", "schema": "ci_should_not_create_this_schema_department_of_finance", "name": "stg_ebudget__budgets", "resource_type": "model", "package_name": "dse_analytics", "path": "staging/department_of_finance/stg_ebudget__budgets.sql", "original_file_path": "models/staging/department_of_finance/stg_ebudget__budgets.sql", "unique_id": "model.dse_analytics.stg_ebudget__budgets", "fqn": ["dse_analytics", "staging", "department_of_finance", "stg_ebudget__budgets"], "alias": "stg_ebudget__budgets", "checksum": {"name": "sha256", "checksum": "6bd9117d5f907a339e5df21d55dfefb575ebb17f2246ff15c0c0b8300d147920"}, "config": {"enabled": true, "alias": null, "schema": "department_of_finance", "database": "TRANSFORM_DEV", "tags": [], "meta": {}, "group": null, "materialized": "view", "incremental_strategy": null, "persist_docs": {}, "quoting": {}, "column_types": {}, "full_refresh": null, "unique_key": null, "on_schema_change": "ignore", "on_configuration_change": "apply", "grants": {}, "packages": [], "docs": {"show": true, "node_color": null}, "contract": {"enforced": false}, "post-hook": [], "pre-hook": []}, "tags": [], "description": "Budget information for all state entities from ebudget.ca.gov", "columns": {"primary_code": {"name": "primary_code", "description": "Four digit business unit code", "meta": {}, "data_type": null, "constraints": [], "quote": null, "tags": []}, "name": {"name": "name", "description": "The name of the state entity", "meta": {}, "data_type": null, "constraints": [], "quote": null, "tags": []}, "budget_year_dollars": {"name": "budget_year_dollars", "description": "The budget of the entity for the current budget year", "meta": {}, "data_type": null, "constraints": [], "quote": null, "tags": []}}, "meta": {}, "group": null, "docs": {"show": true, "node_color": null}, "patch_path": "dse_analytics://models/staging/department_of_finance/_department_of_finance__models.yml", "build_path": null, "deferred": false, "unrendered_config": {"database": "{{ env_var('DBT_TRANSFORM_DB', 'TRANSFORM_DEV') }}", "schema": "department_of_finance"}, "created_at": 1701973267.852215, "relation_name": "TRANSFORM_DEV.ci_should_not_create_this_schema_department_of_finance.stg_ebudget__budgets", "raw_code": 
"with\nagencies_and_departments as (\n select *\n from {{ source('state_entities', 'ebudget_agency_and_department_budgets') }}\n),\n\nebudget_budgets as (\n select\n \"web_agency_cd\" as primary_code,\n \"legal_titl\" as name,\n \"all_budget_year_dols\" as budget_year_dollars\n from agencies_and_departments\n)\n\nselect *\nfrom ebudget_budgets", "language": "sql", "refs": [], "sources": [["state_entities", "ebudget_agency_and_department_budgets"]], "metrics": [], "depends_on": {"macros": [], "nodes": ["source.dse_analytics.state_entities.ebudget_agency_and_department_budgets"]}, "compiled_path": "target/compiled/dse_analytics/models/staging/department_of_finance/stg_ebudget__budgets.sql", "compiled": true, "compiled_code": "with\nagencies_and_departments as (\n select *\n from RAW_DEV.state_entities.ebudget_agency_and_department_budgets\n),\n\nebudget_budgets as (\n select\n \"web_agency_cd\" as primary_code,\n \"legal_titl\" as name,\n \"all_budget_year_dols\" as budget_year_dollars\n from agencies_and_departments\n)\n\nselect *\nfrom ebudget_budgets", "extra_ctes_injected": true, "extra_ctes": [], "contract": {"enforced": false, "checksum": null}, "access": "protected", "constraints": [], "version": null, "latest_version": null, "deprecation_date": null}, "model.dse_analytics.int_state_entities__budgets": {"database": "TRANSFORM_DEV", "schema": "ci_should_not_create_this_schema_state_entities", "name": "int_state_entities__budgets", "resource_type": "model", "package_name": "dse_analytics", "path": "intermediate/state_entities/int_state_entities__budgets.sql", "original_file_path": "models/intermediate/state_entities/int_state_entities__budgets.sql", "unique_id": "model.dse_analytics.int_state_entities__budgets", "fqn": ["dse_analytics", "intermediate", "state_entities", "int_state_entities__budgets"], "alias": "int_state_entities__budgets", "checksum": {"name": "sha256", "checksum": "4e47d090b79d649eb716caecc4b1539204d7e55d1b43b85cee3a339bc976451c"}, "config": {"enabled": true, "alias": null, "schema": "state_entities", "database": "TRANSFORM_DEV", "tags": [], "meta": {}, "group": null, "materialized": "view", "incremental_strategy": null, "persist_docs": {}, "quoting": {}, "column_types": {}, "full_refresh": null, "unique_key": null, "on_schema_change": "ignore", "on_configuration_change": "apply", "grants": {}, "packages": [], "docs": {"show": true, "node_color": null}, "contract": {"enforced": false}, "post-hook": [], "pre-hook": []}, "tags": [], "description": "Fiscal year budgets for state entities", "columns": {"primary_code": {"name": "primary_code", "description": "Four digit business unit code for entity.", "meta": {}, "data_type": null, "constraints": [], "quote": null, "tags": []}, "name": {"name": "name", "description": "Entity name", "meta": {}, "data_type": null, "constraints": [], "quote": null, "tags": []}, "ucm_level": {"name": "ucm_level", "description": "The level in the hierarchy of the Uniform Control Manual\n(agency, subagency, L1, L2, or L3)\n", "meta": {}, "data_type": null, "constraints": [], "quote": null, "tags": []}, "name_alpha": {"name": "name_alpha", "description": "Variant of name for easier alphabetization", "meta": {}, "data_type": null, "constraints": [], "quote": null, "tags": []}, "budget_year_dollars": {"name": "budget_year_dollars", "description": "Budget for current fiscal year.", "meta": {}, "data_type": null, "constraints": [], "quote": null, "tags": []}}, "meta": {}, "group": null, "docs": {"show": true, "node_color": null}, "patch_path": 
"dse_analytics://models/intermediate/state_entities/_int_state_entities__models.yml", "build_path": null, "deferred": false, "unrendered_config": {"database": "{{ env_var('DBT_TRANSFORM_DB', 'TRANSFORM_DEV') }}", "schema": "state_entities", "materialized": "view"}, "created_at": 1701973267.8719447, "relation_name": "TRANSFORM_DEV.ci_should_not_create_this_schema_state_entities.int_state_entities__budgets", "raw_code": "{{ config(materialized=\"view\") }}\n\nwith\nactive_entities as (select * from {{ ref(\"int_state_entities__active\") }}),\n\nbudgets as (select * from {{ ref(\"stg_ebudget__budgets\") }}),\n\nactive_agencies_and_departments as (\n -- only select at deparment level or higher\n select * from active_entities where coalesce(l2, l3) is null\n),\n\nactive_entity_budgets as (\n select\n active_agencies_and_departments.primary_code,\n active_agencies_and_departments.ucm_level,\n active_agencies_and_departments.name,\n active_agencies_and_departments.name_alpha,\n budgets.name as budget_name,\n budgets.budget_year_dollars\n from active_agencies_and_departments\n left join\n budgets\n on active_agencies_and_departments.primary_code = budgets.primary_code\n)\n\nselect *\nfrom active_entity_budgets\norder by primary_code asc", "language": "sql", "refs": [{"name": "int_state_entities__active", "package": null, "version": null}, {"name": "stg_ebudget__budgets", "package": null, "version": null}], "sources": [], "metrics": [], "depends_on": {"macros": [], "nodes": ["model.dse_analytics.int_state_entities__active", "model.dse_analytics.stg_ebudget__budgets"]}, "compiled_path": "target/compiled/dse_analytics/models/intermediate/state_entities/int_state_entities__budgets.sql", "compiled": true, "compiled_code": "\n\nwith\nactive_entities as (select * from TRANSFORM_DEV.ci_should_not_create_this_schema_state_entities.int_state_entities__active),\n\nbudgets as (select * from TRANSFORM_DEV.ci_should_not_create_this_schema_department_of_finance.stg_ebudget__budgets),\n\nactive_agencies_and_departments as (\n -- only select at deparment level or higher\n select * from active_entities where coalesce(l2, l3) is null\n),\n\nactive_entity_budgets as (\n select\n active_agencies_and_departments.primary_code,\n active_agencies_and_departments.ucm_level,\n active_agencies_and_departments.name,\n active_agencies_and_departments.name_alpha,\n budgets.name as budget_name,\n budgets.budget_year_dollars\n from active_agencies_and_departments\n left join\n budgets\n on active_agencies_and_departments.primary_code = budgets.primary_code\n)\n\nselect *\nfrom active_entity_budgets\norder by primary_code asc", "extra_ctes_injected": true, "extra_ctes": [], "contract": {"enforced": false, "checksum": null}, "access": "protected", "constraints": [], "version": null, "latest_version": null, "deprecation_date": null}, "model.dse_analytics.int_state_entities__active": {"database": "TRANSFORM_DEV", "schema": "ci_should_not_create_this_schema_state_entities", "name": "int_state_entities__active", "resource_type": "model", "package_name": "dse_analytics", "path": "intermediate/state_entities/int_state_entities__active.sql", "original_file_path": "models/intermediate/state_entities/int_state_entities__active.sql", "unique_id": "model.dse_analytics.int_state_entities__active", "fqn": ["dse_analytics", "intermediate", "state_entities", "int_state_entities__active"], "alias": "int_state_entities__active", "checksum": {"name": "sha256", "checksum": "a5020da59a9a0706b7d8404f6b43691d94f5a0af14a68c92b17a8ee85a845e80"}, "config": 
{"enabled": true, "alias": null, "schema": "state_entities", "database": "TRANSFORM_DEV", "tags": [], "meta": {}, "group": null, "materialized": "view", "incremental_strategy": null, "persist_docs": {}, "quoting": {}, "column_types": {}, "full_refresh": null, "unique_key": null, "on_schema_change": "ignore", "on_configuration_change": "apply", "grants": {}, "packages": [], "docs": {"show": true, "node_color": null}, "contract": {"enforced": false}, "post-hook": [], "pre-hook": []}, "tags": [], "description": "Active state entities from the Department of Finance list.\nEntities which are flagged as \"DO NOT USE\", \"abolished\", or\nare technical entities (e.g. \"DOF USE ONLY\") are filtered out.\n", "columns": {"name": {"name": "name", "description": "Name of the entity", "meta": {}, "data_type": null, "constraints": [], "quote": null, "tags": []}, "primary_code": {"name": "primary_code", "description": "The most specific non-null entity code", "meta": {}, "data_type": null, "constraints": [], "quote": null, "tags": []}, "agency_code": {"name": "agency_code", "description": "Agency code for entity", "meta": {}, "data_type": null, "constraints": [], "quote": null, "tags": []}, "subagency_code": {"name": "subagency_code", "description": "Subagency code for entity", "meta": {}, "data_type": null, "constraints": [], "quote": null, "tags": []}, "L1": {"name": "L1", "description": "Level beneath subagency", "meta": {}, "data_type": null, "constraints": [], "quote": null, "tags": []}, "L2": {"name": "L2", "description": "Level beneath L1", "meta": {}, "data_type": null, "constraints": [], "quote": null, "tags": []}, "L3": {"name": "L3", "description": "Level beneath L2", "meta": {}, "data_type": null, "constraints": [], "quote": null, "tags": []}, "parenthetical": {"name": "parenthetical", "description": "Any text extracted from a paranthetical statement in the original text\n", "meta": {}, "data_type": null, "constraints": [], "quote": null, "tags": []}, "do_not_use": {"name": "do_not_use", "description": "Whether any entity features \"DO NOT USE\" in the description", "meta": {}, "data_type": null, "constraints": [], "quote": null, "tags": []}, "abolished": {"name": "abolished", "description": "Whether the entity features \"abolished\" in the description", "meta": {}, "data_type": null, "constraints": [], "quote": null, "tags": []}, "restricted_use": {"name": "restricted_use", "description": "Whether the entity contains a directive like \"DOF USE ONLY\" or \"SCO USE ONLY \"\nin the description.\n", "meta": {}, "data_type": null, "constraints": [], "quote": null, "tags": []}, "name_raw": {"name": "name_raw", "description": "The original name, as well as any parentheticals or directives for the entity.\n", "meta": {}, "data_type": null, "constraints": [], "quote": null, "tags": []}, "name_alpha": {"name": "name_alpha", "description": "The name with things like \"Office of\" moved to the end,\nsuitable for alphabetization.\n", "meta": {}, "data_type": null, "constraints": [], "quote": null, "tags": []}, "ucm_level": {"name": "ucm_level", "description": "The level in the hierarchy of the Uniform Control Manual\n(agency, subagency, L1, L2, or L3)\n", "meta": {}, "data_type": null, "constraints": [], "quote": null, "tags": []}}, "meta": {}, "group": null, "docs": {"show": true, "node_color": null}, "patch_path": "dse_analytics://models/intermediate/state_entities/_int_state_entities__models.yml", "build_path": null, "deferred": false, "unrendered_config": {"database": "{{ env_var('DBT_TRANSFORM_DB', 
'TRANSFORM_DEV') }}", "schema": "state_entities", "materialized": "view"}, "created_at": 1701973267.8697274, "relation_name": "TRANSFORM_DEV.ci_should_not_create_this_schema_state_entities.int_state_entities__active", "raw_code": "{{ config(materialized=\"view\") }}\n\nwith\nactive_entities as (\n select *\n from {{ ref(\"stg_department_of_finance__entities\") }}\n where\n do_not_use = false\n and abolished = false\n and restricted_use is null\n and cast(primary_code as int) < 9000\n and not regexp_like(lower(name_raw), 'moved to|renum\\.? to')\n)\n\nselect *\nfrom active_entities", "language": "sql", "refs": [{"name": "stg_department_of_finance__entities", "package": null, "version": null}], "sources": [], "metrics": [], "depends_on": {"macros": [], "nodes": ["model.dse_analytics.stg_department_of_finance__entities"]}, "compiled_path": "target/compiled/dse_analytics/models/intermediate/state_entities/int_state_entities__active.sql", "compiled": true, "compiled_code": "\n\nwith\nactive_entities as (\n select *\n from TRANSFORM_DEV.ci_should_not_create_this_schema_department_of_finance.stg_department_of_finance__entities\n where\n do_not_use = false\n and abolished = false\n and restricted_use is null\n and cast(primary_code as int) < 9000\n and not regexp_like(lower(name_raw), 'moved to|renum\\.? to')\n)\n\nselect *\nfrom active_entities", "extra_ctes_injected": true, "extra_ctes": [], "contract": {"enforced": false, "checksum": null}, "access": "protected", "constraints": [], "version": null, "latest_version": null, "deprecation_date": null}, "model.dse_analytics.int_state_entities__technical": {"database": "TRANSFORM_DEV", "schema": "ci_should_not_create_this_schema_state_entities", "name": "int_state_entities__technical", "resource_type": "model", "package_name": "dse_analytics", "path": "intermediate/state_entities/int_state_entities__technical.sql", "original_file_path": "models/intermediate/state_entities/int_state_entities__technical.sql", "unique_id": "model.dse_analytics.int_state_entities__technical", "fqn": ["dse_analytics", "intermediate", "state_entities", "int_state_entities__technical"], "alias": "int_state_entities__technical", "checksum": {"name": "sha256", "checksum": "7cb56d8248796a920e7c7d8586f39636276507a7a93d6944fb3bd13d0e7ddadc"}, "config": {"enabled": true, "alias": null, "schema": "state_entities", "database": "TRANSFORM_DEV", "tags": [], "meta": {}, "group": null, "materialized": "view", "incremental_strategy": null, "persist_docs": {}, "quoting": {}, "column_types": {}, "full_refresh": null, "unique_key": null, "on_schema_change": "ignore", "on_configuration_change": "apply", "grants": {}, "packages": [], "docs": {"show": true, "node_color": null}, "contract": {"enforced": false}, "post-hook": [], "pre-hook": []}, "tags": [], "description": "Acitve technical entities from the Department of Finance list.\n", "columns": {"name": {"name": "name", "description": "Name of the entity", "meta": {}, "data_type": null, "constraints": [], "quote": null, "tags": []}, "primary_code": {"name": "primary_code", "description": "The most specific non-null entity code", "meta": {}, "data_type": null, "constraints": [], "quote": null, "tags": []}, "agency_code": {"name": "agency_code", "description": "Agency code for entity", "meta": {}, "data_type": null, "constraints": [], "quote": null, "tags": []}, "subagency_code": {"name": "subagency_code", "description": "Subagency code for entity", "meta": {}, "data_type": null, "constraints": [], "quote": null, "tags": []}, "L1": {"name": 
"L1", "description": "Level beneath subagency", "meta": {}, "data_type": null, "constraints": [], "quote": null, "tags": []}, "L2": {"name": "L2", "description": "Level beneath L1", "meta": {}, "data_type": null, "constraints": [], "quote": null, "tags": []}, "L3": {"name": "L3", "description": "Level beneath L2", "meta": {}, "data_type": null, "constraints": [], "quote": null, "tags": []}, "parenthetical": {"name": "parenthetical", "description": "Any text extracted from a paranthetical statement in the original text\n", "meta": {}, "data_type": null, "constraints": [], "quote": null, "tags": []}, "do_not_use": {"name": "do_not_use", "description": "Whether any entity features \"DO NOT USE\" in the description", "meta": {}, "data_type": null, "constraints": [], "quote": null, "tags": []}, "abolished": {"name": "abolished", "description": "Whether the entity features \"abolished\" in the description", "meta": {}, "data_type": null, "constraints": [], "quote": null, "tags": []}, "restricted_use": {"name": "restricted_use", "description": "Whether the entity contains a directive like \"DOF USE ONLY\" or \"SCO USE ONLY \"\nin the description.\n", "meta": {}, "data_type": null, "constraints": [], "quote": null, "tags": []}, "name_raw": {"name": "name_raw", "description": "The original name, as well as any parentheticals or directives for the entity.\n", "meta": {}, "data_type": null, "constraints": [], "quote": null, "tags": []}, "name_alpha": {"name": "name_alpha", "description": "The name with things like \"Office of\" moved to the end,\nsuitable for alphabetization.\n", "meta": {}, "data_type": null, "constraints": [], "quote": null, "tags": []}, "ucm_level": {"name": "ucm_level", "description": "The level in the hierarchy of the Uniform Control Manual\n(agency, subagency, L1, L2, or L3)\n", "meta": {}, "data_type": null, "constraints": [], "quote": null, "tags": []}}, "meta": {}, "group": null, "docs": {"show": true, "node_color": null}, "patch_path": "dse_analytics://models/intermediate/state_entities/_int_state_entities__models.yml", "build_path": null, "deferred": false, "unrendered_config": {"database": "{{ env_var('DBT_TRANSFORM_DB', 'TRANSFORM_DEV') }}", "schema": "state_entities", "materialized": "view"}, "created_at": 1701973267.870794, "relation_name": "TRANSFORM_DEV.ci_should_not_create_this_schema_state_entities.int_state_entities__technical", "raw_code": "{{ config(materialized=\"view\") }}\n\nwith\ntechnical_entities as (\n select *\n from {{ ref(\"stg_department_of_finance__entities\") }}\n where\n (do_not_use = false and abolished = false)\n and (restricted_use is not null or cast(primary_code as int) >= 9000)\n)\n\nselect *\nfrom technical_entities", "language": "sql", "refs": [{"name": "stg_department_of_finance__entities", "package": null, "version": null}], "sources": [], "metrics": [], "depends_on": {"macros": [], "nodes": ["model.dse_analytics.stg_department_of_finance__entities"]}, "compiled_path": "target/compiled/dse_analytics/models/intermediate/state_entities/int_state_entities__technical.sql", "compiled": true, "compiled_code": "\n\nwith\ntechnical_entities as (\n select *\n from TRANSFORM_DEV.ci_should_not_create_this_schema_department_of_finance.stg_department_of_finance__entities\n where\n (do_not_use = false and abolished = false)\n and (restricted_use is not null or cast(primary_code as int) >= 9000)\n)\n\nselect *\nfrom technical_entities", "extra_ctes_injected": true, "extra_ctes": [], "contract": {"enforced": false, "checksum": null}, "access": 
"protected", "constraints": [], "version": null, "latest_version": null, "deprecation_date": null}, "test.dse_analytics.unique_dim_state_entities__agencies_name.44718adf21": {"test_metadata": {"name": "unique", "kwargs": {"column_name": "name", "model": "{{ get_where_subquery(ref('dim_state_entities__agencies')) }}"}, "namespace": null}, "database": "ANALYTICS_DEV", "schema": "ci_should_not_create_this_schema_dbt_test__audit", "name": "unique_dim_state_entities__agencies_name", "resource_type": "test", "package_name": "dse_analytics", "path": "unique_dim_state_entities__agencies_name.sql", "original_file_path": "models/marts/state_entities/_state_entities__models.yml", "unique_id": "test.dse_analytics.unique_dim_state_entities__agencies_name.44718adf21", "fqn": ["dse_analytics", "marts", "state_entities", "unique_dim_state_entities__agencies_name"], "alias": "unique_dim_state_entities__agencies_name", "checksum": {"name": "none", "checksum": ""}, "config": {"enabled": true, "alias": null, "schema": "dbt_test__audit", "database": null, "tags": [], "meta": {}, "group": null, "materialized": "test", "severity": "ERROR", "store_failures": null, "where": null, "limit": null, "fail_calc": "count(*)", "warn_if": "!= 0", "error_if": "!= 0"}, "tags": [], "description": "", "columns": {}, "meta": {}, "group": null, "docs": {"show": true, "node_color": null}, "patch_path": null, "build_path": null, "deferred": false, "unrendered_config": {}, "created_at": 1701973267.8222969, "relation_name": null, "raw_code": "{{ test_unique(**_dbt_generic_test_kwargs) }}", "language": "sql", "refs": [{"name": "dim_state_entities__agencies", "package": null, "version": null}], "sources": [], "metrics": [], "depends_on": {"macros": ["macro.dbt.test_unique", "macro.dbt.get_where_subquery"], "nodes": ["model.dse_analytics.dim_state_entities__agencies"]}, "compiled_path": "target/compiled/dse_analytics/models/marts/state_entities/_state_entities__models.yml/unique_dim_state_entities__agencies_name.sql", "compiled": true, "compiled_code": "\n \n \n\nselect\n name as unique_field,\n count(*) as n_records\n\nfrom ANALYTICS_DEV.ci_should_not_create_this_schema_state_entities.dim_state_entities__agencies\nwhere name is not null\ngroup by name\nhaving count(*) > 1\n\n\n", "extra_ctes_injected": true, "extra_ctes": [], "contract": {"enforced": false, "checksum": null}, "column_name": "name", "file_key_name": "models.dim_state_entities__agencies", "attached_node": "model.dse_analytics.dim_state_entities__agencies"}, "test.dse_analytics.not_null_dim_state_entities__agencies_name.d3e367a95b": {"test_metadata": {"name": "not_null", "kwargs": {"column_name": "name", "model": "{{ get_where_subquery(ref('dim_state_entities__agencies')) }}"}, "namespace": null}, "database": "ANALYTICS_DEV", "schema": "ci_should_not_create_this_schema_dbt_test__audit", "name": "not_null_dim_state_entities__agencies_name", "resource_type": "test", "package_name": "dse_analytics", "path": "not_null_dim_state_entities__agencies_name.sql", "original_file_path": "models/marts/state_entities/_state_entities__models.yml", "unique_id": "test.dse_analytics.not_null_dim_state_entities__agencies_name.d3e367a95b", "fqn": ["dse_analytics", "marts", "state_entities", "not_null_dim_state_entities__agencies_name"], "alias": "not_null_dim_state_entities__agencies_name", "checksum": {"name": "none", "checksum": ""}, "config": {"enabled": true, "alias": null, "schema": "dbt_test__audit", "database": null, "tags": [], "meta": {}, "group": null, "materialized": "test", 
"severity": "ERROR", "store_failures": null, "where": null, "limit": null, "fail_calc": "count(*)", "warn_if": "!= 0", "error_if": "!= 0"}, "tags": [], "description": "", "columns": {}, "meta": {}, "group": null, "docs": {"show": true, "node_color": null}, "patch_path": null, "build_path": null, "deferred": false, "unrendered_config": {}, "created_at": 1701973267.8240716, "relation_name": null, "raw_code": "{{ test_not_null(**_dbt_generic_test_kwargs) }}", "language": "sql", "refs": [{"name": "dim_state_entities__agencies", "package": null, "version": null}], "sources": [], "metrics": [], "depends_on": {"macros": ["macro.dbt.test_not_null", "macro.dbt.get_where_subquery"], "nodes": ["model.dse_analytics.dim_state_entities__agencies"]}, "compiled_path": "target/compiled/dse_analytics/models/marts/state_entities/_state_entities__models.yml/not_null_dim_state_entities__agencies_name.sql", "compiled": true, "compiled_code": "\n \n \n\n\n\nselect name\nfrom ANALYTICS_DEV.ci_should_not_create_this_schema_state_entities.dim_state_entities__agencies\nwhere name is null\n\n\n", "extra_ctes_injected": true, "extra_ctes": [], "contract": {"enforced": false, "checksum": null}, "column_name": "name", "file_key_name": "models.dim_state_entities__agencies", "attached_node": "model.dse_analytics.dim_state_entities__agencies"}, "test.dse_analytics.unique_dim_state_entities__agencies_agency_code.efd290c291": {"test_metadata": {"name": "unique", "kwargs": {"column_name": "agency_code", "model": "{{ get_where_subquery(ref('dim_state_entities__agencies')) }}"}, "namespace": null}, "database": "ANALYTICS_DEV", "schema": "ci_should_not_create_this_schema_dbt_test__audit", "name": "unique_dim_state_entities__agencies_agency_code", "resource_type": "test", "package_name": "dse_analytics", "path": "unique_dim_state_entities__agencies_agency_code.sql", "original_file_path": "models/marts/state_entities/_state_entities__models.yml", "unique_id": "test.dse_analytics.unique_dim_state_entities__agencies_agency_code.efd290c291", "fqn": ["dse_analytics", "marts", "state_entities", "unique_dim_state_entities__agencies_agency_code"], "alias": "unique_dim_state_entities__agencies_agency_code", "checksum": {"name": "none", "checksum": ""}, "config": {"enabled": true, "alias": null, "schema": "dbt_test__audit", "database": null, "tags": [], "meta": {}, "group": null, "materialized": "test", "severity": "ERROR", "store_failures": null, "where": null, "limit": null, "fail_calc": "count(*)", "warn_if": "!= 0", "error_if": "!= 0"}, "tags": [], "description": "", "columns": {}, "meta": {}, "group": null, "docs": {"show": true, "node_color": null}, "patch_path": null, "build_path": null, "deferred": false, "unrendered_config": {}, "created_at": 1701973267.8257194, "relation_name": null, "raw_code": "{{ test_unique(**_dbt_generic_test_kwargs) }}", "language": "sql", "refs": [{"name": "dim_state_entities__agencies", "package": null, "version": null}], "sources": [], "metrics": [], "depends_on": {"macros": ["macro.dbt.test_unique", "macro.dbt.get_where_subquery"], "nodes": ["model.dse_analytics.dim_state_entities__agencies"]}, "compiled_path": "target/compiled/dse_analytics/models/marts/state_entities/_state_entities__models.yml/unique_dim_state_entities__agencies_agency_code.sql", "compiled": true, "compiled_code": "\n \n \n\nselect\n agency_code as unique_field,\n count(*) as n_records\n\nfrom ANALYTICS_DEV.ci_should_not_create_this_schema_state_entities.dim_state_entities__agencies\nwhere agency_code is not null\ngroup by 
agency_code\nhaving count(*) > 1\n\n\n", "extra_ctes_injected": true, "extra_ctes": [], "contract": {"enforced": false, "checksum": null}, "column_name": "agency_code", "file_key_name": "models.dim_state_entities__agencies", "attached_node": "model.dse_analytics.dim_state_entities__agencies"}, "test.dse_analytics.not_null_dim_state_entities__agencies_agency_code.8ad3a79d8e": {"test_metadata": {"name": "not_null", "kwargs": {"column_name": "agency_code", "model": "{{ get_where_subquery(ref('dim_state_entities__agencies')) }}"}, "namespace": null}, "database": "ANALYTICS_DEV", "schema": "ci_should_not_create_this_schema_dbt_test__audit", "name": "not_null_dim_state_entities__agencies_agency_code", "resource_type": "test", "package_name": "dse_analytics", "path": "not_null_dim_state_entities__agencies_agency_code.sql", "original_file_path": "models/marts/state_entities/_state_entities__models.yml", "unique_id": "test.dse_analytics.not_null_dim_state_entities__agencies_agency_code.8ad3a79d8e", "fqn": ["dse_analytics", "marts", "state_entities", "not_null_dim_state_entities__agencies_agency_code"], "alias": "not_null_dim_state_entities__agencies_agency_code", "checksum": {"name": "none", "checksum": ""}, "config": {"enabled": true, "alias": null, "schema": "dbt_test__audit", "database": null, "tags": [], "meta": {}, "group": null, "materialized": "test", "severity": "ERROR", "store_failures": null, "where": null, "limit": null, "fail_calc": "count(*)", "warn_if": "!= 0", "error_if": "!= 0"}, "tags": [], "description": "", "columns": {}, "meta": {}, "group": null, "docs": {"show": true, "node_color": null}, "patch_path": null, "build_path": null, "deferred": false, "unrendered_config": {}, "created_at": 1701973267.8273656, "relation_name": null, "raw_code": "{{ test_not_null(**_dbt_generic_test_kwargs) }}", "language": "sql", "refs": [{"name": "dim_state_entities__agencies", "package": null, "version": null}], "sources": [], "metrics": [], "depends_on": {"macros": ["macro.dbt.test_not_null", "macro.dbt.get_where_subquery"], "nodes": ["model.dse_analytics.dim_state_entities__agencies"]}, "compiled_path": "target/compiled/dse_analytics/models/marts/state_entities/_state_entities__models.yml/not_null_dim_state_entities__agencies_agency_code.sql", "compiled": true, "compiled_code": "\n \n \n\n\n\nselect agency_code\nfrom ANALYTICS_DEV.ci_should_not_create_this_schema_state_entities.dim_state_entities__agencies\nwhere agency_code is null\n\n\n", "extra_ctes_injected": true, "extra_ctes": [], "contract": {"enforced": false, "checksum": null}, "column_name": "agency_code", "file_key_name": "models.dim_state_entities__agencies", "attached_node": "model.dse_analytics.dim_state_entities__agencies"}, "test.dse_analytics.not_null_stg_department_of_finance__entities_primary_code.ab13df4014": {"test_metadata": {"name": "not_null", "kwargs": {"column_name": "primary_code", "model": "{{ get_where_subquery(ref('stg_department_of_finance__entities')) }}"}, "namespace": null}, "database": "ANALYTICS_DEV", "schema": "ci_should_not_create_this_schema_dbt_test__audit", "name": "not_null_stg_department_of_finance__entities_primary_code", "resource_type": "test", "package_name": "dse_analytics", "path": "not_null_stg_department_of_finance__entities_primary_code.sql", "original_file_path": "models/staging/department_of_finance/_department_of_finance__models.yml", "unique_id": "test.dse_analytics.not_null_stg_department_of_finance__entities_primary_code.ab13df4014", "fqn": ["dse_analytics", "staging", 
"department_of_finance", "not_null_stg_department_of_finance__entities_primary_code"], "alias": "not_null_stg_department_of_finance__entities_primary_code", "checksum": {"name": "none", "checksum": ""}, "config": {"enabled": true, "alias": null, "schema": "dbt_test__audit", "database": null, "tags": [], "meta": {}, "group": null, "materialized": "test", "severity": "ERROR", "store_failures": null, "where": null, "limit": null, "fail_calc": "count(*)", "warn_if": "!= 0", "error_if": "!= 0"}, "tags": [], "description": "", "columns": {}, "meta": {}, "group": null, "docs": {"show": true, "node_color": null}, "patch_path": null, "build_path": null, "deferred": false, "unrendered_config": {}, "created_at": 1701973267.8531604, "relation_name": null, "raw_code": "{{ test_not_null(**_dbt_generic_test_kwargs) }}", "language": "sql", "refs": [{"name": "stg_department_of_finance__entities", "package": null, "version": null}], "sources": [], "metrics": [], "depends_on": {"macros": ["macro.dbt.test_not_null", "macro.dbt.get_where_subquery"], "nodes": ["model.dse_analytics.stg_department_of_finance__entities"]}, "compiled_path": "target/compiled/dse_analytics/models/staging/department_of_finance/_department_of_finance__models.yml/not_null_stg_department_of_finance__entities_primary_code.sql", "compiled": true, "compiled_code": "\n \n \n\n\n\nselect primary_code\nfrom TRANSFORM_DEV.ci_should_not_create_this_schema_department_of_finance.stg_department_of_finance__entities\nwhere primary_code is null\n\n\n", "extra_ctes_injected": true, "extra_ctes": [], "contract": {"enforced": false, "checksum": null}, "column_name": "primary_code", "file_key_name": "models.stg_department_of_finance__entities", "attached_node": "model.dse_analytics.stg_department_of_finance__entities"}, "test.dse_analytics.not_null_stg_ebudget__budgets_primary_code.11fe170121": {"test_metadata": {"name": "not_null", "kwargs": {"column_name": "primary_code", "model": "{{ get_where_subquery(ref('stg_ebudget__budgets')) }}"}, "namespace": null}, "database": "ANALYTICS_DEV", "schema": "ci_should_not_create_this_schema_dbt_test__audit", "name": "not_null_stg_ebudget__budgets_primary_code", "resource_type": "test", "package_name": "dse_analytics", "path": "not_null_stg_ebudget__budgets_primary_code.sql", "original_file_path": "models/staging/department_of_finance/_department_of_finance__models.yml", "unique_id": "test.dse_analytics.not_null_stg_ebudget__budgets_primary_code.11fe170121", "fqn": ["dse_analytics", "staging", "department_of_finance", "not_null_stg_ebudget__budgets_primary_code"], "alias": "not_null_stg_ebudget__budgets_primary_code", "checksum": {"name": "none", "checksum": ""}, "config": {"enabled": true, "alias": null, "schema": "dbt_test__audit", "database": null, "tags": [], "meta": {}, "group": null, "materialized": "test", "severity": "ERROR", "store_failures": null, "where": null, "limit": null, "fail_calc": "count(*)", "warn_if": "!= 0", "error_if": "!= 0"}, "tags": [], "description": "", "columns": {}, "meta": {}, "group": null, "docs": {"show": true, "node_color": null}, "patch_path": null, "build_path": null, "deferred": false, "unrendered_config": {}, "created_at": 1701973267.8548646, "relation_name": null, "raw_code": "{{ test_not_null(**_dbt_generic_test_kwargs) }}", "language": "sql", "refs": [{"name": "stg_ebudget__budgets", "package": null, "version": null}], "sources": [], "metrics": [], "depends_on": {"macros": ["macro.dbt.test_not_null", "macro.dbt.get_where_subquery"], "nodes": 
["model.dse_analytics.stg_ebudget__budgets"]}, "compiled_path": "target/compiled/dse_analytics/models/staging/department_of_finance/_department_of_finance__models.yml/not_null_stg_ebudget__budgets_primary_code.sql", "compiled": true, "compiled_code": "\n \n \n\n\n\nselect primary_code\nfrom TRANSFORM_DEV.ci_should_not_create_this_schema_department_of_finance.stg_ebudget__budgets\nwhere primary_code is null\n\n\n", "extra_ctes_injected": true, "extra_ctes": [], "contract": {"enforced": false, "checksum": null}, "column_name": "primary_code", "file_key_name": "models.stg_ebudget__budgets", "attached_node": "model.dse_analytics.stg_ebudget__budgets"}, "test.dse_analytics.not_null_int_state_entities__active_primary_code.498a9cc863": {"test_metadata": {"name": "not_null", "kwargs": {"column_name": "primary_code", "model": "{{ get_where_subquery(ref('int_state_entities__active')) }}"}, "namespace": null}, "database": "ANALYTICS_DEV", "schema": "ci_should_not_create_this_schema_dbt_test__audit", "name": "not_null_int_state_entities__active_primary_code", "resource_type": "test", "package_name": "dse_analytics", "path": "not_null_int_state_entities__active_primary_code.sql", "original_file_path": "models/intermediate/state_entities/_int_state_entities__models.yml", "unique_id": "test.dse_analytics.not_null_int_state_entities__active_primary_code.498a9cc863", "fqn": ["dse_analytics", "intermediate", "state_entities", "not_null_int_state_entities__active_primary_code"], "alias": "not_null_int_state_entities__active_primary_code", "checksum": {"name": "none", "checksum": ""}, "config": {"enabled": true, "alias": null, "schema": "dbt_test__audit", "database": null, "tags": [], "meta": {}, "group": null, "materialized": "test", "severity": "ERROR", "store_failures": null, "where": null, "limit": null, "fail_calc": "count(*)", "warn_if": "!= 0", "error_if": "!= 0"}, "tags": [], "description": "", "columns": {}, "meta": {}, "group": null, "docs": {"show": true, "node_color": null}, "patch_path": null, "build_path": null, "deferred": false, "unrendered_config": {}, "created_at": 1701973267.8728776, "relation_name": null, "raw_code": "{{ test_not_null(**_dbt_generic_test_kwargs) }}", "language": "sql", "refs": [{"name": "int_state_entities__active", "package": null, "version": null}], "sources": [], "metrics": [], "depends_on": {"macros": ["macro.dbt.test_not_null", "macro.dbt.get_where_subquery"], "nodes": ["model.dse_analytics.int_state_entities__active"]}, "compiled_path": "target/compiled/dse_analytics/models/intermediate/state_entities/_int_state_entities__models.yml/not_null_int_state_entities__active_primary_code.sql", "compiled": true, "compiled_code": "\n \n \n\n\n\nselect primary_code\nfrom TRANSFORM_DEV.ci_should_not_create_this_schema_state_entities.int_state_entities__active\nwhere primary_code is null\n\n\n", "extra_ctes_injected": true, "extra_ctes": [], "contract": {"enforced": false, "checksum": null}, "column_name": "primary_code", "file_key_name": "models.int_state_entities__active", "attached_node": "model.dse_analytics.int_state_entities__active"}, "test.dse_analytics.unique_int_state_entities__active_primary_code.cb6e0784fe": {"test_metadata": {"name": "unique", "kwargs": {"column_name": "primary_code", "model": "{{ get_where_subquery(ref('int_state_entities__active')) }}"}, "namespace": null}, "database": "ANALYTICS_DEV", "schema": "ci_should_not_create_this_schema_dbt_test__audit", "name": "unique_int_state_entities__active_primary_code", "resource_type": "test", "package_name": 
"dse_analytics", "path": "unique_int_state_entities__active_primary_code.sql", "original_file_path": "models/intermediate/state_entities/_int_state_entities__models.yml", "unique_id": "test.dse_analytics.unique_int_state_entities__active_primary_code.cb6e0784fe", "fqn": ["dse_analytics", "intermediate", "state_entities", "unique_int_state_entities__active_primary_code"], "alias": "unique_int_state_entities__active_primary_code", "checksum": {"name": "none", "checksum": ""}, "config": {"enabled": true, "alias": null, "schema": "dbt_test__audit", "database": null, "tags": [], "meta": {}, "group": null, "materialized": "test", "severity": "ERROR", "store_failures": null, "where": null, "limit": null, "fail_calc": "count(*)", "warn_if": "!= 0", "error_if": "!= 0"}, "tags": [], "description": "", "columns": {}, "meta": {}, "group": null, "docs": {"show": true, "node_color": null}, "patch_path": null, "build_path": null, "deferred": false, "unrendered_config": {}, "created_at": 1701973267.8745892, "relation_name": null, "raw_code": "{{ test_unique(**_dbt_generic_test_kwargs) }}", "language": "sql", "refs": [{"name": "int_state_entities__active", "package": null, "version": null}], "sources": [], "metrics": [], "depends_on": {"macros": ["macro.dbt.test_unique", "macro.dbt.get_where_subquery"], "nodes": ["model.dse_analytics.int_state_entities__active"]}, "compiled_path": "target/compiled/dse_analytics/models/intermediate/state_entities/_int_state_entities__models.yml/unique_int_state_entities__active_primary_code.sql", "compiled": true, "compiled_code": "\n \n \n\nselect\n primary_code as unique_field,\n count(*) as n_records\n\nfrom TRANSFORM_DEV.ci_should_not_create_this_schema_state_entities.int_state_entities__active\nwhere primary_code is not null\ngroup by primary_code\nhaving count(*) > 1\n\n\n", "extra_ctes_injected": true, "extra_ctes": [], "contract": {"enforced": false, "checksum": null}, "column_name": "primary_code", "file_key_name": "models.int_state_entities__active", "attached_node": "model.dse_analytics.int_state_entities__active"}, "test.dse_analytics.not_null_int_state_entities__technical_primary_code.92bdfb0772": {"test_metadata": {"name": "not_null", "kwargs": {"column_name": "primary_code", "model": "{{ get_where_subquery(ref('int_state_entities__technical')) }}"}, "namespace": null}, "database": "ANALYTICS_DEV", "schema": "ci_should_not_create_this_schema_dbt_test__audit", "name": "not_null_int_state_entities__technical_primary_code", "resource_type": "test", "package_name": "dse_analytics", "path": "not_null_int_state_entities__technical_primary_code.sql", "original_file_path": "models/intermediate/state_entities/_int_state_entities__models.yml", "unique_id": "test.dse_analytics.not_null_int_state_entities__technical_primary_code.92bdfb0772", "fqn": ["dse_analytics", "intermediate", "state_entities", "not_null_int_state_entities__technical_primary_code"], "alias": "not_null_int_state_entities__technical_primary_code", "checksum": {"name": "none", "checksum": ""}, "config": {"enabled": true, "alias": null, "schema": "dbt_test__audit", "database": null, "tags": [], "meta": {}, "group": null, "materialized": "test", "severity": "ERROR", "store_failures": null, "where": null, "limit": null, "fail_calc": "count(*)", "warn_if": "!= 0", "error_if": "!= 0"}, "tags": [], "description": "", "columns": {}, "meta": {}, "group": null, "docs": {"show": true, "node_color": null}, "patch_path": null, "build_path": null, "deferred": false, "unrendered_config": {}, "created_at": 
1701973267.876234, "relation_name": null, "raw_code": "{{ test_not_null(**_dbt_generic_test_kwargs) }}", "language": "sql", "refs": [{"name": "int_state_entities__technical", "package": null, "version": null}], "sources": [], "metrics": [], "depends_on": {"macros": ["macro.dbt.test_not_null", "macro.dbt.get_where_subquery"], "nodes": ["model.dse_analytics.int_state_entities__technical"]}, "compiled_path": "target/compiled/dse_analytics/models/intermediate/state_entities/_int_state_entities__models.yml/not_null_int_state_entities__technical_primary_code.sql", "compiled": true, "compiled_code": "\n \n \n\n\n\nselect primary_code\nfrom TRANSFORM_DEV.ci_should_not_create_this_schema_state_entities.int_state_entities__technical\nwhere primary_code is null\n\n\n", "extra_ctes_injected": true, "extra_ctes": [], "contract": {"enforced": false, "checksum": null}, "column_name": "primary_code", "file_key_name": "models.int_state_entities__technical", "attached_node": "model.dse_analytics.int_state_entities__technical"}, "test.dse_analytics.not_null_int_state_entities__budgets_primary_code.2d2e132a3f": {"test_metadata": {"name": "not_null", "kwargs": {"column_name": "primary_code", "model": "{{ get_where_subquery(ref('int_state_entities__budgets')) }}"}, "namespace": null}, "database": "ANALYTICS_DEV", "schema": "ci_should_not_create_this_schema_dbt_test__audit", "name": "not_null_int_state_entities__budgets_primary_code", "resource_type": "test", "package_name": "dse_analytics", "path": "not_null_int_state_entities__budgets_primary_code.sql", "original_file_path": "models/intermediate/state_entities/_int_state_entities__models.yml", "unique_id": "test.dse_analytics.not_null_int_state_entities__budgets_primary_code.2d2e132a3f", "fqn": ["dse_analytics", "intermediate", "state_entities", "not_null_int_state_entities__budgets_primary_code"], "alias": "not_null_int_state_entities__budgets_primary_code", "checksum": {"name": "none", "checksum": ""}, "config": {"enabled": true, "alias": null, "schema": "dbt_test__audit", "database": null, "tags": [], "meta": {}, "group": null, "materialized": "test", "severity": "ERROR", "store_failures": null, "where": null, "limit": null, "fail_calc": "count(*)", "warn_if": "!= 0", "error_if": "!= 0"}, "tags": [], "description": "", "columns": {}, "meta": {}, "group": null, "docs": {"show": true, "node_color": null}, "patch_path": null, "build_path": null, "deferred": false, "unrendered_config": {}, "created_at": 1701973267.8780456, "relation_name": null, "raw_code": "{{ test_not_null(**_dbt_generic_test_kwargs) }}", "language": "sql", "refs": [{"name": "int_state_entities__budgets", "package": null, "version": null}], "sources": [], "metrics": [], "depends_on": {"macros": ["macro.dbt.test_not_null", "macro.dbt.get_where_subquery"], "nodes": ["model.dse_analytics.int_state_entities__budgets"]}, "compiled_path": "target/compiled/dse_analytics/models/intermediate/state_entities/_int_state_entities__models.yml/not_null_int_state_entities__budgets_primary_code.sql", "compiled": true, "compiled_code": "\n \n \n\n\n\nselect primary_code\nfrom TRANSFORM_DEV.ci_should_not_create_this_schema_state_entities.int_state_entities__budgets\nwhere primary_code is null\n\n\n", "extra_ctes_injected": true, "extra_ctes": [], "contract": {"enforced": false, "checksum": null}, "column_name": "primary_code", "file_key_name": "models.int_state_entities__budgets", "attached_node": "model.dse_analytics.int_state_entities__budgets"}, 
"test.dse_analytics.dbt_utils_source_unique_combination_of_columns_state_entities_base_entities_A__B__L1__L2__L3.3301323f63": {"test_metadata": {"name": "unique_combination_of_columns", "kwargs": {"combination_of_columns": ["A", "B", "L1", "L2", "L3"], "model": "{{ get_where_subquery(source('state_entities', 'base_entities')) }}"}, "namespace": "dbt_utils"}, "database": "ANALYTICS_DEV", "schema": "ci_should_not_create_this_schema_dbt_test__audit", "name": "dbt_utils_source_unique_combination_of_columns_state_entities_base_entities_A__B__L1__L2__L3", "resource_type": "test", "package_name": "dse_analytics", "path": "dbt_utils_source_unique_combin_cfd0d0475e06134eaf6d360efc1db571.sql", "original_file_path": "models/staging/department_of_finance/_department_of_finance__models.yml", "unique_id": "test.dse_analytics.dbt_utils_source_unique_combination_of_columns_state_entities_base_entities_A__B__L1__L2__L3.3301323f63", "fqn": ["dse_analytics", "staging", "department_of_finance", "dbt_utils_source_unique_combination_of_columns_state_entities_base_entities_A__B__L1__L2__L3"], "alias": "dbt_utils_source_unique_combin_cfd0d0475e06134eaf6d360efc1db571", "checksum": {"name": "none", "checksum": ""}, "config": {"enabled": true, "alias": "dbt_utils_source_unique_combin_cfd0d0475e06134eaf6d360efc1db571", "schema": "dbt_test__audit", "database": null, "tags": [], "meta": {}, "group": null, "materialized": "test", "severity": "ERROR", "store_failures": null, "where": null, "limit": null, "fail_calc": "count(*)", "warn_if": "!= 0", "error_if": "!= 0"}, "tags": [], "description": "", "columns": {}, "meta": {}, "group": null, "docs": {"show": true, "node_color": null}, "patch_path": null, "build_path": null, "deferred": false, "unrendered_config": {"alias": "dbt_utils_source_unique_combin_cfd0d0475e06134eaf6d360efc1db571"}, "created_at": 1701973267.8925402, "relation_name": null, "raw_code": "{{ dbt_utils.test_unique_combination_of_columns(**_dbt_generic_test_kwargs) }}{{ config(alias=\"dbt_utils_source_unique_combin_cfd0d0475e06134eaf6d360efc1db571\") }}", "language": "sql", "refs": [], "sources": [["state_entities", "base_entities"]], "metrics": [], "depends_on": {"macros": ["macro.dbt_utils.test_unique_combination_of_columns", "macro.dbt.get_where_subquery"], "nodes": ["source.dse_analytics.state_entities.base_entities"]}, "compiled_path": "target/compiled/dse_analytics/models/staging/department_of_finance/_department_of_finance__models.yml/dbt_utils_source_unique_combin_cfd0d0475e06134eaf6d360efc1db571.sql", "compiled": true, "compiled_code": "\n\n\n\n\n\nwith validation_errors as (\n\n select\n A, B, L1, L2, L3\n from RAW_DEV.state_entities.base_entities\n group by A, B, L1, L2, L3\n having count(*) > 1\n\n)\n\nselect *\nfrom validation_errors\n\n\n", "extra_ctes_injected": true, "extra_ctes": [], "contract": {"enforced": false, "checksum": null}, "column_name": null, "file_key_name": "sources.state_entities", "attached_node": null}, "test.dse_analytics.source_not_null_state_entities_base_entities__A_.3b0e8bceb2": {"test_metadata": {"name": "not_null", "kwargs": {"column_name": "\"A\"", "model": "{{ get_where_subquery(source('state_entities', 'base_entities')) }}"}, "namespace": null}, "database": "ANALYTICS_DEV", "schema": "ci_should_not_create_this_schema_dbt_test__audit", "name": "source_not_null_state_entities_base_entities__A_", "resource_type": "test", "package_name": "dse_analytics", "path": "source_not_null_state_entities_base_entities__A_.sql", "original_file_path": 
"models/staging/department_of_finance/_department_of_finance__models.yml", "unique_id": "test.dse_analytics.source_not_null_state_entities_base_entities__A_.3b0e8bceb2", "fqn": ["dse_analytics", "staging", "department_of_finance", "source_not_null_state_entities_base_entities__A_"], "alias": "source_not_null_state_entities_base_entities__A_", "checksum": {"name": "none", "checksum": ""}, "config": {"enabled": true, "alias": null, "schema": "dbt_test__audit", "database": null, "tags": [], "meta": {}, "group": null, "materialized": "test", "severity": "ERROR", "store_failures": null, "where": null, "limit": null, "fail_calc": "count(*)", "warn_if": "!= 0", "error_if": "!= 0"}, "tags": [], "description": "", "columns": {}, "meta": {}, "group": null, "docs": {"show": true, "node_color": null}, "patch_path": null, "build_path": null, "deferred": false, "unrendered_config": {}, "created_at": 1701973267.905643, "relation_name": null, "raw_code": "{{ test_not_null(**_dbt_generic_test_kwargs) }}", "language": "sql", "refs": [], "sources": [["state_entities", "base_entities"]], "metrics": [], "depends_on": {"macros": ["macro.dbt.test_not_null", "macro.dbt.get_where_subquery"], "nodes": ["source.dse_analytics.state_entities.base_entities"]}, "compiled_path": "target/compiled/dse_analytics/models/staging/department_of_finance/_department_of_finance__models.yml/source_not_null_state_entities_base_entities__A_.sql", "compiled": true, "compiled_code": "\n \n \n\n\n\nselect \"A\"\nfrom RAW_DEV.state_entities.base_entities\nwhere \"A\" is null\n\n\n", "extra_ctes_injected": true, "extra_ctes": [], "contract": {"enforced": false, "checksum": null}, "column_name": "\"A\"", "file_key_name": "sources.state_entities", "attached_node": null}, "test.dse_analytics.source_unique_state_entities_base_entities__L3_.28ccbe9ad4": {"test_metadata": {"name": "unique", "kwargs": {"column_name": "\"L3\"", "model": "{{ get_where_subquery(source('state_entities', 'base_entities')) }}"}, "namespace": null}, "database": "ANALYTICS_DEV", "schema": "ci_should_not_create_this_schema_dbt_test__audit", "name": "source_unique_state_entities_base_entities__L3_", "resource_type": "test", "package_name": "dse_analytics", "path": "source_unique_state_entities_base_entities__L3_.sql", "original_file_path": "models/staging/department_of_finance/_department_of_finance__models.yml", "unique_id": "test.dse_analytics.source_unique_state_entities_base_entities__L3_.28ccbe9ad4", "fqn": ["dse_analytics", "staging", "department_of_finance", "source_unique_state_entities_base_entities__L3_"], "alias": "source_unique_state_entities_base_entities__L3_", "checksum": {"name": "none", "checksum": ""}, "config": {"enabled": true, "alias": null, "schema": "dbt_test__audit", "database": null, "tags": [], "meta": {}, "group": null, "materialized": "test", "severity": "ERROR", "store_failures": null, "where": null, "limit": null, "fail_calc": "count(*)", "warn_if": "!= 0", "error_if": "!= 0"}, "tags": [], "description": "", "columns": {}, "meta": {}, "group": null, "docs": {"show": true, "node_color": null}, "patch_path": null, "build_path": null, "deferred": false, "unrendered_config": {}, "created_at": 1701973267.9073033, "relation_name": null, "raw_code": "{{ test_unique(**_dbt_generic_test_kwargs) }}", "language": "sql", "refs": [], "sources": [["state_entities", "base_entities"]], "metrics": [], "depends_on": {"macros": ["macro.dbt.test_unique", "macro.dbt.get_where_subquery"], "nodes": ["source.dse_analytics.state_entities.base_entities"]}, 
"compiled_path": "target/compiled/dse_analytics/models/staging/department_of_finance/_department_of_finance__models.yml/source_unique_state_entities_base_entities__L3_.sql", "compiled": true, "compiled_code": "\n \n \n\nselect\n \"L3\" as unique_field,\n count(*) as n_records\n\nfrom RAW_DEV.state_entities.base_entities\nwhere \"L3\" is not null\ngroup by \"L3\"\nhaving count(*) > 1\n\n\n", "extra_ctes_injected": true, "extra_ctes": [], "contract": {"enforced": false, "checksum": null}, "column_name": "\"L3\"", "file_key_name": "sources.state_entities", "attached_node": null}, "test.dse_analytics.source_not_null_state_entities_base_entities__name_.175bb24173": {"test_metadata": {"name": "not_null", "kwargs": {"column_name": "\"name\"", "model": "{{ get_where_subquery(source('state_entities', 'base_entities')) }}"}, "namespace": null}, "database": "ANALYTICS_DEV", "schema": "ci_should_not_create_this_schema_dbt_test__audit", "name": "source_not_null_state_entities_base_entities__name_", "resource_type": "test", "package_name": "dse_analytics", "path": "source_not_null_state_entities_base_entities__name_.sql", "original_file_path": "models/staging/department_of_finance/_department_of_finance__models.yml", "unique_id": "test.dse_analytics.source_not_null_state_entities_base_entities__name_.175bb24173", "fqn": ["dse_analytics", "staging", "department_of_finance", "source_not_null_state_entities_base_entities__name_"], "alias": "source_not_null_state_entities_base_entities__name_", "checksum": {"name": "none", "checksum": ""}, "config": {"enabled": true, "alias": null, "schema": "dbt_test__audit", "database": null, "tags": [], "meta": {}, "group": null, "materialized": "test", "severity": "ERROR", "store_failures": null, "where": null, "limit": null, "fail_calc": "count(*)", "warn_if": "!= 0", "error_if": "!= 0"}, "tags": [], "description": "", "columns": {}, "meta": {}, "group": null, "docs": {"show": true, "node_color": null}, "patch_path": null, "build_path": null, "deferred": false, "unrendered_config": {}, "created_at": 1701973267.9089704, "relation_name": null, "raw_code": "{{ test_not_null(**_dbt_generic_test_kwargs) }}", "language": "sql", "refs": [], "sources": [["state_entities", "base_entities"]], "metrics": [], "depends_on": {"macros": ["macro.dbt.test_not_null", "macro.dbt.get_where_subquery"], "nodes": ["source.dse_analytics.state_entities.base_entities"]}, "compiled_path": "target/compiled/dse_analytics/models/staging/department_of_finance/_department_of_finance__models.yml/source_not_null_state_entities_base_entities__name_.sql", "compiled": true, "compiled_code": "\n \n \n\n\n\nselect \"name\"\nfrom RAW_DEV.state_entities.base_entities\nwhere \"name\" is null\n\n\n", "extra_ctes_injected": true, "extra_ctes": [], "contract": {"enforced": false, "checksum": null}, "column_name": "\"name\"", "file_key_name": "sources.state_entities", "attached_node": null}, "test.dse_analytics.source_not_null_state_entities_ebudget_agency_and_department_budgets__org_cd_.6650ae0ce8": {"test_metadata": {"name": "not_null", "kwargs": {"column_name": "\"org_cd\"", "model": "{{ get_where_subquery(source('state_entities', 'ebudget_agency_and_department_budgets')) }}"}, "namespace": null}, "database": "ANALYTICS_DEV", "schema": "ci_should_not_create_this_schema_dbt_test__audit", "name": "source_not_null_state_entities_ebudget_agency_and_department_budgets__org_cd_", "resource_type": "test", "package_name": "dse_analytics", "path": "source_not_null_state_entities_9d354781a7b07309ffe80cdbe131987d.sql", 
"original_file_path": "models/staging/department_of_finance/_department_of_finance__models.yml", "unique_id": "test.dse_analytics.source_not_null_state_entities_ebudget_agency_and_department_budgets__org_cd_.6650ae0ce8", "fqn": ["dse_analytics", "staging", "department_of_finance", "source_not_null_state_entities_ebudget_agency_and_department_budgets__org_cd_"], "alias": "source_not_null_state_entities_9d354781a7b07309ffe80cdbe131987d", "checksum": {"name": "none", "checksum": ""}, "config": {"enabled": true, "alias": "source_not_null_state_entities_9d354781a7b07309ffe80cdbe131987d", "schema": "dbt_test__audit", "database": null, "tags": [], "meta": {}, "group": null, "materialized": "test", "severity": "ERROR", "store_failures": null, "where": null, "limit": null, "fail_calc": "count(*)", "warn_if": "!= 0", "error_if": "!= 0"}, "tags": [], "description": "", "columns": {}, "meta": {}, "group": null, "docs": {"show": true, "node_color": null}, "patch_path": null, "build_path": null, "deferred": false, "unrendered_config": {"alias": "source_not_null_state_entities_9d354781a7b07309ffe80cdbe131987d"}, "created_at": 1701973267.911053, "relation_name": null, "raw_code": "{{ test_not_null(**_dbt_generic_test_kwargs) }}{{ config(alias=\"source_not_null_state_entities_9d354781a7b07309ffe80cdbe131987d\") }}", "language": "sql", "refs": [], "sources": [["state_entities", "ebudget_agency_and_department_budgets"]], "metrics": [], "depends_on": {"macros": ["macro.dbt.test_not_null", "macro.dbt.get_where_subquery"], "nodes": ["source.dse_analytics.state_entities.ebudget_agency_and_department_budgets"]}, "compiled_path": "target/compiled/dse_analytics/models/staging/department_of_finance/_department_of_finance__models.yml/source_not_null_state_entities_9d354781a7b07309ffe80cdbe131987d.sql", "compiled": true, "compiled_code": "\n \n \n\n\n\nselect \"org_cd\"\nfrom RAW_DEV.state_entities.ebudget_agency_and_department_budgets\nwhere \"org_cd\" is null\n\n\n", "extra_ctes_injected": true, "extra_ctes": [], "contract": {"enforced": false, "checksum": null}, "column_name": "\"org_cd\"", "file_key_name": "sources.state_entities", "attached_node": null}, "test.dse_analytics.source_unique_state_entities_ebudget_agency_and_department_budgets__org_cd_.f2687093d8": {"test_metadata": {"name": "unique", "kwargs": {"column_name": "\"org_cd\"", "model": "{{ get_where_subquery(source('state_entities', 'ebudget_agency_and_department_budgets')) }}"}, "namespace": null}, "database": "ANALYTICS_DEV", "schema": "ci_should_not_create_this_schema_dbt_test__audit", "name": "source_unique_state_entities_ebudget_agency_and_department_budgets__org_cd_", "resource_type": "test", "package_name": "dse_analytics", "path": "source_unique_state_entities_e_40b6c3eeef1a8f43330c0939302d30ff.sql", "original_file_path": "models/staging/department_of_finance/_department_of_finance__models.yml", "unique_id": "test.dse_analytics.source_unique_state_entities_ebudget_agency_and_department_budgets__org_cd_.f2687093d8", "fqn": ["dse_analytics", "staging", "department_of_finance", "source_unique_state_entities_ebudget_agency_and_department_budgets__org_cd_"], "alias": "source_unique_state_entities_e_40b6c3eeef1a8f43330c0939302d30ff", "checksum": {"name": "none", "checksum": ""}, "config": {"enabled": true, "alias": "source_unique_state_entities_e_40b6c3eeef1a8f43330c0939302d30ff", "schema": "dbt_test__audit", "database": null, "tags": [], "meta": {}, "group": null, "materialized": "test", "severity": "ERROR", "store_failures": null, "where": null, 
"limit": null, "fail_calc": "count(*)", "warn_if": "!= 0", "error_if": "!= 0"}, "tags": [], "description": "", "columns": {}, "meta": {}, "group": null, "docs": {"show": true, "node_color": null}, "patch_path": null, "build_path": null, "deferred": false, "unrendered_config": {"alias": "source_unique_state_entities_e_40b6c3eeef1a8f43330c0939302d30ff"}, "created_at": 1701973267.9127102, "relation_name": null, "raw_code": "{{ test_unique(**_dbt_generic_test_kwargs) }}{{ config(alias=\"source_unique_state_entities_e_40b6c3eeef1a8f43330c0939302d30ff\") }}", "language": "sql", "refs": [], "sources": [["state_entities", "ebudget_agency_and_department_budgets"]], "metrics": [], "depends_on": {"macros": ["macro.dbt.test_unique", "macro.dbt.get_where_subquery"], "nodes": ["source.dse_analytics.state_entities.ebudget_agency_and_department_budgets"]}, "compiled_path": "target/compiled/dse_analytics/models/staging/department_of_finance/_department_of_finance__models.yml/source_unique_state_entities_e_40b6c3eeef1a8f43330c0939302d30ff.sql", "compiled": true, "compiled_code": "\n \n \n\nselect\n \"org_cd\" as unique_field,\n count(*) as n_records\n\nfrom RAW_DEV.state_entities.ebudget_agency_and_department_budgets\nwhere \"org_cd\" is not null\ngroup by \"org_cd\"\nhaving count(*) > 1\n\n\n", "extra_ctes_injected": true, "extra_ctes": [], "contract": {"enforced": false, "checksum": null}, "column_name": "\"org_cd\"", "file_key_name": "sources.state_entities", "attached_node": null}, "test.dse_analytics.source_not_null_state_entities_ebudget_agency_and_department_budgets__web_agency_cd_.1100593dca": {"test_metadata": {"name": "not_null", "kwargs": {"column_name": "\"web_agency_cd\"", "model": "{{ get_where_subquery(source('state_entities', 'ebudget_agency_and_department_budgets')) }}"}, "namespace": null}, "database": "ANALYTICS_DEV", "schema": "ci_should_not_create_this_schema_dbt_test__audit", "name": "source_not_null_state_entities_ebudget_agency_and_department_budgets__web_agency_cd_", "resource_type": "test", "package_name": "dse_analytics", "path": "source_not_null_state_entities_b79497e81eb4a43013fc09604ea6799d.sql", "original_file_path": "models/staging/department_of_finance/_department_of_finance__models.yml", "unique_id": "test.dse_analytics.source_not_null_state_entities_ebudget_agency_and_department_budgets__web_agency_cd_.1100593dca", "fqn": ["dse_analytics", "staging", "department_of_finance", "source_not_null_state_entities_ebudget_agency_and_department_budgets__web_agency_cd_"], "alias": "source_not_null_state_entities_b79497e81eb4a43013fc09604ea6799d", "checksum": {"name": "none", "checksum": ""}, "config": {"enabled": true, "alias": "source_not_null_state_entities_b79497e81eb4a43013fc09604ea6799d", "schema": "dbt_test__audit", "database": null, "tags": [], "meta": {}, "group": null, "materialized": "test", "severity": "ERROR", "store_failures": null, "where": null, "limit": null, "fail_calc": "count(*)", "warn_if": "!= 0", "error_if": "!= 0"}, "tags": [], "description": "", "columns": {}, "meta": {}, "group": null, "docs": {"show": true, "node_color": null}, "patch_path": null, "build_path": null, "deferred": false, "unrendered_config": {"alias": "source_not_null_state_entities_b79497e81eb4a43013fc09604ea6799d"}, "created_at": 1701973267.9143248, "relation_name": null, "raw_code": "{{ test_not_null(**_dbt_generic_test_kwargs) }}{{ config(alias=\"source_not_null_state_entities_b79497e81eb4a43013fc09604ea6799d\") }}", "language": "sql", "refs": [], "sources": [["state_entities", 
"ebudget_agency_and_department_budgets"]], "metrics": [], "depends_on": {"macros": ["macro.dbt.test_not_null", "macro.dbt.get_where_subquery"], "nodes": ["source.dse_analytics.state_entities.ebudget_agency_and_department_budgets"]}, "compiled_path": "target/compiled/dse_analytics/models/staging/department_of_finance/_department_of_finance__models.yml/source_not_null_state_entities_b79497e81eb4a43013fc09604ea6799d.sql", "compiled": true, "compiled_code": "\n \n \n\n\n\nselect \"web_agency_cd\"\nfrom RAW_DEV.state_entities.ebudget_agency_and_department_budgets\nwhere \"web_agency_cd\" is null\n\n\n", "extra_ctes_injected": true, "extra_ctes": [], "contract": {"enforced": false, "checksum": null}, "column_name": "\"web_agency_cd\"", "file_key_name": "sources.state_entities", "attached_node": null}, "test.dse_analytics.source_unique_state_entities_ebudget_agency_and_department_budgets__web_agency_cd_.926c843b75": {"test_metadata": {"name": "unique", "kwargs": {"column_name": "\"web_agency_cd\"", "model": "{{ get_where_subquery(source('state_entities', 'ebudget_agency_and_department_budgets')) }}"}, "namespace": null}, "database": "ANALYTICS_DEV", "schema": "ci_should_not_create_this_schema_dbt_test__audit", "name": "source_unique_state_entities_ebudget_agency_and_department_budgets__web_agency_cd_", "resource_type": "test", "package_name": "dse_analytics", "path": "source_unique_state_entities_e_d76d60ebd57bc020a2453cc723e3be04.sql", "original_file_path": "models/staging/department_of_finance/_department_of_finance__models.yml", "unique_id": "test.dse_analytics.source_unique_state_entities_ebudget_agency_and_department_budgets__web_agency_cd_.926c843b75", "fqn": ["dse_analytics", "staging", "department_of_finance", "source_unique_state_entities_ebudget_agency_and_department_budgets__web_agency_cd_"], "alias": "source_unique_state_entities_e_d76d60ebd57bc020a2453cc723e3be04", "checksum": {"name": "none", "checksum": ""}, "config": {"enabled": true, "alias": "source_unique_state_entities_e_d76d60ebd57bc020a2453cc723e3be04", "schema": "dbt_test__audit", "database": null, "tags": [], "meta": {}, "group": null, "materialized": "test", "severity": "ERROR", "store_failures": null, "where": null, "limit": null, "fail_calc": "count(*)", "warn_if": "!= 0", "error_if": "!= 0"}, "tags": [], "description": "", "columns": {}, "meta": {}, "group": null, "docs": {"show": true, "node_color": null}, "patch_path": null, "build_path": null, "deferred": false, "unrendered_config": {"alias": "source_unique_state_entities_e_d76d60ebd57bc020a2453cc723e3be04"}, "created_at": 1701973267.9161546, "relation_name": null, "raw_code": "{{ test_unique(**_dbt_generic_test_kwargs) }}{{ config(alias=\"source_unique_state_entities_e_d76d60ebd57bc020a2453cc723e3be04\") }}", "language": "sql", "refs": [], "sources": [["state_entities", "ebudget_agency_and_department_budgets"]], "metrics": [], "depends_on": {"macros": ["macro.dbt.test_unique", "macro.dbt.get_where_subquery"], "nodes": ["source.dse_analytics.state_entities.ebudget_agency_and_department_budgets"]}, "compiled_path": "target/compiled/dse_analytics/models/staging/department_of_finance/_department_of_finance__models.yml/source_unique_state_entities_e_d76d60ebd57bc020a2453cc723e3be04.sql", "compiled": true, "compiled_code": "\n \n \n\nselect\n \"web_agency_cd\" as unique_field,\n count(*) as n_records\n\nfrom RAW_DEV.state_entities.ebudget_agency_and_department_budgets\nwhere \"web_agency_cd\" is not null\ngroup by \"web_agency_cd\"\nhaving count(*) > 1\n\n\n", 
"extra_ctes_injected": true, "extra_ctes": [], "contract": {"enforced": false, "checksum": null}, "column_name": "\"web_agency_cd\"", "file_key_name": "sources.state_entities", "attached_node": null}, "test.dse_analytics.source_not_null_state_entities_ebudget_program_budgets__program_code_.29940cbe43": {"test_metadata": {"name": "not_null", "kwargs": {"column_name": "\"program_code\"", "model": "{{ get_where_subquery(source('state_entities', 'ebudget_program_budgets')) }}"}, "namespace": null}, "database": "ANALYTICS_DEV", "schema": "ci_should_not_create_this_schema_dbt_test__audit", "name": "source_not_null_state_entities_ebudget_program_budgets__program_code_", "resource_type": "test", "package_name": "dse_analytics", "path": "source_not_null_state_entities_1e716cc516fd8536dae58edde10efeed.sql", "original_file_path": "models/staging/department_of_finance/_department_of_finance__models.yml", "unique_id": "test.dse_analytics.source_not_null_state_entities_ebudget_program_budgets__program_code_.29940cbe43", "fqn": ["dse_analytics", "staging", "department_of_finance", "source_not_null_state_entities_ebudget_program_budgets__program_code_"], "alias": "source_not_null_state_entities_1e716cc516fd8536dae58edde10efeed", "checksum": {"name": "none", "checksum": ""}, "config": {"enabled": true, "alias": "source_not_null_state_entities_1e716cc516fd8536dae58edde10efeed", "schema": "dbt_test__audit", "database": null, "tags": [], "meta": {}, "group": null, "materialized": "test", "severity": "ERROR", "store_failures": null, "where": null, "limit": null, "fail_calc": "count(*)", "warn_if": "!= 0", "error_if": "!= 0"}, "tags": [], "description": "", "columns": {}, "meta": {}, "group": null, "docs": {"show": true, "node_color": null}, "patch_path": null, "build_path": null, "deferred": false, "unrendered_config": {"alias": "source_not_null_state_entities_1e716cc516fd8536dae58edde10efeed"}, "created_at": 1701973267.9180367, "relation_name": null, "raw_code": "{{ test_not_null(**_dbt_generic_test_kwargs) }}{{ config(alias=\"source_not_null_state_entities_1e716cc516fd8536dae58edde10efeed\") }}", "language": "sql", "refs": [], "sources": [["state_entities", "ebudget_program_budgets"]], "metrics": [], "depends_on": {"macros": ["macro.dbt.test_not_null", "macro.dbt.get_where_subquery"], "nodes": ["source.dse_analytics.state_entities.ebudget_program_budgets"]}, "compiled_path": "target/compiled/dse_analytics/models/staging/department_of_finance/_department_of_finance__models.yml/source_not_null_state_entities_1e716cc516fd8536dae58edde10efeed.sql", "compiled": true, "compiled_code": "\n \n \n\n\n\nselect \"program_code\"\nfrom RAW_DEV.state_entities.ebudget_program_budgets\nwhere \"program_code\" is null\n\n\n", "extra_ctes_injected": true, "extra_ctes": [], "contract": {"enforced": false, "checksum": null}, "column_name": "\"program_code\"", "file_key_name": "sources.state_entities", "attached_node": null}}, "sources": {"source.dse_analytics.building_footprints.us_building_footprints": {"database": "RAW_DEV", "schema": "building_footprints", "name": "us_building_footprints", "resource_type": "source", "package_name": "dse_analytics", "path": "models/marts/geo_reference/_geo_reference__models.yml", "original_file_path": "models/marts/geo_reference/_geo_reference__models.yml", "unique_id": "source.dse_analytics.building_footprints.us_building_footprints", "fqn": ["dse_analytics", "marts", "geo_reference", "building_footprints", "us_building_footprints"], "source_name": "building_footprints", "source_description": 
"", "loader": "", "identifier": "us_building_footprints", "quoting": {"database": null, "schema": null, "identifier": null, "column": null}, "loaded_at_field": null, "freshness": {"warn_after": {"count": null, "period": null}, "error_after": {"count": null, "period": null}, "filter": null}, "external": null, "description": "[Microsoft US Building Footprints](https://github.com/Microsoft/USBuildingFootprints) dataset for California.", "columns": {}, "meta": {}, "source_meta": {}, "tags": [], "config": {"enabled": true}, "patch_path": null, "unrendered_config": {}, "relation_name": "RAW_DEV.building_footprints.us_building_footprints", "created_at": 1701973267.8860989}, "source.dse_analytics.building_footprints.global_ml_building_footprints": {"database": "RAW_DEV", "schema": "building_footprints", "name": "global_ml_building_footprints", "resource_type": "source", "package_name": "dse_analytics", "path": "models/marts/geo_reference/_geo_reference__models.yml", "original_file_path": "models/marts/geo_reference/_geo_reference__models.yml", "unique_id": "source.dse_analytics.building_footprints.global_ml_building_footprints", "fqn": ["dse_analytics", "marts", "geo_reference", "building_footprints", "global_ml_building_footprints"], "source_name": "building_footprints", "source_description": "", "loader": "", "identifier": "global_ml_building_footprints", "quoting": {"database": null, "schema": null, "identifier": null, "column": null}, "loaded_at_field": null, "freshness": {"warn_after": {"count": null, "period": null}, "error_after": {"count": null, "period": null}, "filter": null}, "external": null, "description": "[Microsoft Global ML Building Footprints](https://github.com/microsoft/GlobalMLBuildingFootprints) dataset for California. This contains some null geometries,as well as geometries that fall somewhat outside of California", "columns": {}, "meta": {}, "source_meta": {}, "tags": [], "config": {"enabled": true}, "patch_path": null, "unrendered_config": {}, "relation_name": "RAW_DEV.building_footprints.global_ml_building_footprints", "created_at": 1701973267.8863254}, "source.dse_analytics.tiger_2022.blocks": {"database": "RAW_DEV", "schema": "tiger_2022", "name": "blocks", "resource_type": "source", "package_name": "dse_analytics", "path": "models/marts/geo_reference/_geo_reference__models.yml", "original_file_path": "models/marts/geo_reference/_geo_reference__models.yml", "unique_id": "source.dse_analytics.tiger_2022.blocks", "fqn": ["dse_analytics", "marts", "geo_reference", "tiger_2022", "blocks"], "source_name": "tiger_2022", "source_description": "", "loader": "", "identifier": "blocks", "quoting": {"database": null, "schema": null, "identifier": null, "column": null}, "loaded_at_field": null, "freshness": {"warn_after": {"count": null, "period": null}, "error_after": {"count": null, "period": null}, "filter": null}, "external": null, "description": "", "columns": {}, "meta": {}, "source_meta": {}, "tags": [], "config": {"enabled": true}, "patch_path": null, "unrendered_config": {}, "relation_name": "RAW_DEV.tiger_2022.blocks", "created_at": 1701973267.8865128}, "source.dse_analytics.tiger_2022.places": {"database": "RAW_DEV", "schema": "tiger_2022", "name": "places", "resource_type": "source", "package_name": "dse_analytics", "path": "models/marts/geo_reference/_geo_reference__models.yml", "original_file_path": "models/marts/geo_reference/_geo_reference__models.yml", "unique_id": "source.dse_analytics.tiger_2022.places", "fqn": ["dse_analytics", "marts", "geo_reference", 
"tiger_2022", "places"], "source_name": "tiger_2022", "source_description": "", "loader": "", "identifier": "places", "quoting": {"database": null, "schema": null, "identifier": null, "column": null}, "loaded_at_field": null, "freshness": {"warn_after": {"count": null, "period": null}, "error_after": {"count": null, "period": null}, "filter": null}, "external": null, "description": "", "columns": {}, "meta": {}, "source_meta": {}, "tags": [], "config": {"enabled": true}, "patch_path": null, "unrendered_config": {}, "relation_name": "RAW_DEV.tiger_2022.places", "created_at": 1701973267.8866916}, "source.dse_analytics.state_entities.base_entities": {"database": "RAW_DEV", "schema": "state_entities", "name": "base_entities", "resource_type": "source", "package_name": "dse_analytics", "path": "models/staging/department_of_finance/_department_of_finance__models.yml", "original_file_path": "models/staging/department_of_finance/_department_of_finance__models.yml", "unique_id": "source.dse_analytics.state_entities.base_entities", "fqn": ["dse_analytics", "staging", "department_of_finance", "state_entities", "base_entities"], "source_name": "state_entities", "source_description": "Dataset for state entities data modeling and reporting", "loader": "", "identifier": "base_entities", "quoting": {"database": null, "schema": null, "identifier": null, "column": null}, "loaded_at_field": null, "freshness": {"warn_after": {"count": null, "period": null}, "error_after": {"count": null, "period": null}, "filter": null}, "external": null, "description": "Starting hierarchical entities list from Department of Finance.", "columns": {"A": {"name": "A", "description": "Agency code for entity", "meta": {}, "data_type": null, "constraints": [], "quote": true, "tags": []}, "B": {"name": "B", "description": "Subagency code for entity", "meta": {}, "data_type": null, "constraints": [], "quote": null, "tags": []}, "L1": {"name": "L1", "description": "Level beneath subagency", "meta": {}, "data_type": null, "constraints": [], "quote": null, "tags": []}, "L2": {"name": "L2", "description": "Level beneath L1", "meta": {}, "data_type": null, "constraints": [], "quote": null, "tags": []}, "L3": {"name": "L3", "description": "Level beneath L2", "meta": {}, "data_type": null, "constraints": [], "quote": true, "tags": []}, "name": {"name": "name", "description": "Name of entity", "meta": {}, "data_type": null, "constraints": [], "quote": true, "tags": []}}, "meta": {}, "source_meta": {}, "tags": [], "config": {"enabled": true}, "patch_path": null, "unrendered_config": {}, "relation_name": "RAW_DEV.state_entities.base_entities", "created_at": 1701973267.9101448}, "source.dse_analytics.state_entities.ebudget_agency_and_department_budgets": {"database": "RAW_DEV", "schema": "state_entities", "name": "ebudget_agency_and_department_budgets", "resource_type": "source", "package_name": "dse_analytics", "path": "models/staging/department_of_finance/_department_of_finance__models.yml", "original_file_path": "models/staging/department_of_finance/_department_of_finance__models.yml", "unique_id": "source.dse_analytics.state_entities.ebudget_agency_and_department_budgets", "fqn": ["dse_analytics", "staging", "department_of_finance", "state_entities", "ebudget_agency_and_department_budgets"], "source_name": "state_entities", "source_description": "Dataset for state entities data modeling and reporting", "loader": "", "identifier": "ebudget_agency_and_department_budgets", "quoting": {"database": null, "schema": null, "identifier": null, 
"column": null}, "loaded_at_field": null, "freshness": {"warn_after": {"count": null, "period": null}, "error_after": {"count": null, "period": null}, "filter": null}, "external": null, "description": "Budget information at the agency and department levels from ebudget.ca.gov\n", "columns": {"org_cd": {"name": "org_cd", "description": "Four digit business unit code", "meta": {}, "data_type": null, "constraints": [], "quote": true, "tags": []}, "web_agency_cd": {"name": "web_agency_cd", "description": "Four digit business unit code (same as org_cd, it seems)", "meta": {}, "data_type": null, "constraints": [], "quote": true, "tags": []}, "org_id": {"name": "org_id", "description": "ID for the entity within the ebudget system (DO NOT USE)", "meta": {}, "data_type": null, "constraints": [], "quote": null, "tags": []}, "legal_titl": {"name": "legal_titl", "description": "Name of agency or department", "meta": {}, "data_type": null, "constraints": [], "quote": null, "tags": []}, "state_budget_year_dols": {"name": "state_budget_year_dols", "description": "Budget for the fiscal year, in dollars", "meta": {}, "data_type": null, "constraints": [], "quote": null, "tags": []}, "all_budget_year_dols": {"name": "all_budget_year_dols", "description": "TODO", "meta": {}, "data_type": null, "constraints": [], "quote": null, "tags": []}, "budget_year_pers": {"name": "budget_year_pers", "description": "Headcount for the budget year (not authoritative)", "meta": {}, "data_type": null, "constraints": [], "quote": null, "tags": []}, "general_fund_total": {"name": "general_fund_total", "description": "TODO", "meta": {}, "data_type": null, "constraints": [], "quote": null, "tags": []}, "special_fund_total": {"name": "special_fund_total", "description": "TODO", "meta": {}, "data_type": null, "constraints": [], "quote": null, "tags": []}, "bond_fund_total": {"name": "bond_fund_total", "description": "TODO", "meta": {}, "data_type": null, "constraints": [], "quote": null, "tags": []}, "capital_outlay_total": {"name": "capital_outlay_total", "description": "TODO", "meta": {}, "data_type": null, "constraints": [], "quote": null, "tags": []}, "spr_include_co_fig": {"name": "spr_include_co_fig", "description": "Unknown", "meta": {}, "data_type": null, "constraints": [], "quote": null, "tags": []}, "display_on_web_fig": {"name": "display_on_web_fig", "description": "Unused", "meta": {}, "data_type": null, "constraints": [], "quote": null, "tags": []}, "state_grand_total": {"name": "state_grand_total", "description": "Total budget for the state, not this row (DO NOT USE)", "meta": {}, "data_type": null, "constraints": [], "quote": null, "tags": []}}, "meta": {}, "source_meta": {}, "tags": [], "config": {"enabled": true}, "patch_path": null, "unrendered_config": {}, "relation_name": "RAW_DEV.state_entities.ebudget_agency_and_department_budgets", "created_at": 1701973267.9171546}, "source.dse_analytics.state_entities.ebudget_program_budgets": {"database": "RAW_DEV", "schema": "state_entities", "name": "ebudget_program_budgets", "resource_type": "source", "package_name": "dse_analytics", "path": "models/staging/department_of_finance/_department_of_finance__models.yml", "original_file_path": "models/staging/department_of_finance/_department_of_finance__models.yml", "unique_id": "source.dse_analytics.state_entities.ebudget_program_budgets", "fqn": ["dse_analytics", "staging", "department_of_finance", "state_entities", "ebudget_program_budgets"], "source_name": "state_entities", "source_description": "Dataset for state entities 
data modeling and reporting", "loader": "", "identifier": "ebudget_program_budgets", "quoting": {"database": null, "schema": null, "identifier": null, "column": null}, "loaded_at_field": null, "freshness": {"warn_after": {"count": null, "period": null}, "error_after": {"count": null, "period": null}, "filter": null}, "external": null, "description": "Budget information at the program level from ebudget.ca.gov", "columns": {"program_id": {"name": "program_id", "description": "Unknown (DO NOT USE)", "meta": {}, "data_type": null, "constraints": [], "quote": null, "tags": []}, "last_upd_date": {"name": "last_upd_date", "description": "Unknown (DO NOT USE)", "meta": {}, "data_type": null, "constraints": [], "quote": null, "tags": []}, "last_upd_user": {"name": "last_upd_user", "description": "Unknown (DO NOT USE)", "meta": {}, "data_type": null, "constraints": [], "quote": null, "tags": []}, "py_dols": {"name": "py_dols", "description": "Unknown", "meta": {}, "data_type": null, "constraints": [], "quote": null, "tags": []}, "cy_dols": {"name": "cy_dols", "description": "Unknown", "meta": {}, "data_type": null, "constraints": [], "quote": null, "tags": []}, "by_dols": {"name": "by_dols", "description": "Budget for the fiscal year, in dollars", "meta": {}, "data_type": null, "constraints": [], "quote": null, "tags": []}, "program_code": {"name": "program_code", "description": "Four digit business unit code", "meta": {}, "data_type": null, "constraints": [], "quote": true, "tags": []}, "line_type": {"name": "line_type", "description": "Unknown", "meta": {}, "data_type": null, "constraints": [], "quote": null, "tags": []}, "indent_nbr": {"name": "indent_nbr", "description": "Unknown", "meta": {}, "data_type": null, "constraints": [], "quote": null, "tags": []}, "program_titl": {"name": "program_titl", "description": "Name of the state entity", "meta": {}, "data_type": null, "constraints": [], "quote": null, "tags": []}, "org_id": {"name": "org_id", "description": "ID for the parent entity within the ebudget system (DO NOT USE)", "meta": {}, "data_type": null, "constraints": [], "quote": null, "tags": []}, "py_pers_yrs": {"name": "py_pers_yrs", "description": "Unknown", "meta": {}, "data_type": null, "constraints": [], "quote": null, "tags": []}, "cy_pers_yrs": {"name": "cy_pers_yrs", "description": "Unknown", "meta": {}, "data_type": null, "constraints": [], "quote": null, "tags": []}, "by_pers_yrs": {"name": "by_pers_yrs", "description": "Personnel for the fiscal year, in full-time-person-years", "meta": {}, "data_type": null, "constraints": [], "quote": null, "tags": []}, "spr_include_co_fig": {"name": "spr_include_co_fig", "description": "Unknown", "meta": {}, "data_type": null, "constraints": [], "quote": null, "tags": []}}, "meta": {}, "source_meta": {}, "tags": [], "config": {"enabled": true}, "patch_path": null, "unrendered_config": {}, "relation_name": "RAW_DEV.state_entities.ebudget_program_budgets", "created_at": 1701973267.918991}}, "macros": {"macro.dse_analytics.spatial_join_with_deduplication": {"name": "spatial_join_with_deduplication", "resource_type": "macro", "package_name": "dse_analytics", "path": "macros/spatial_join_with_deduplication.sql", "original_file_path": "macros/spatial_join_with_deduplication.sql", "unique_id": "macro.dse_analytics.spatial_join_with_deduplication", "macro_sql": "{% macro spatial_join_with_deduplication(left_model, right_model, left_cols, right_cols, left_geom=\"geometry\", right_geom=\"geometry\", op=\"st_intersects\", kind=\"left\", prefix=\"\") 
%}\n\nwith {{ prefix }}_left_model_with_id as (\n select\n /* Generate a temporary ID for footprints. We will need this to group/partition\n by unique footprints further down. We could use a UUID, but integers are\n cheaper to generate and compare. */\n *, seq4() as _tmp_sjoin_id\n from {{ left_model }}\n),\n\n{{ prefix }}_joined as (\n select\n {% for lcol in left_cols -%}\n {{ prefix }}_left_model_with_id.{{ lcol }},\n {% endfor -%}\n {% for rcol in right_cols -%}\n {{ right_model }}.{{ rcol }},\n {% endfor -%}\n {{ prefix }}_left_model_with_id.{{ left_geom }},\n /* We don't actually need the intersection for every geometry, only for the\n ones that intersect more than one. However, in order to establish which\n ones intersect more than one, we need a windowed COUNT partitioned by\n _tmp_sjoin_id. This is an expensive operation, as it likely triggers a shuffle\n (even though it should already be sorted by _tmp_id). In testing we've found\n that it's cheaper to just do the intersection for all the geometries. */\n st_area(\n st_intersection({{ prefix }}_left_model_with_id.{{ left_geom }}, {{ right_model }}.{{ right_geom }})\n ) as _tmp_sjoin_intersection,\n {{ prefix }}_left_model_with_id._tmp_sjoin_id\n from {{ prefix }}_left_model_with_id\n {{ kind }} join {{ right_model }}\n on {{ op }}({{ prefix }}_left_model_with_id.{{ left_geom }}, {{ right_model }}.{{ right_geom }})\n),\n\n{{ prefix }}_deduplicated as (\n select\n -- Snowflake doesn't support geometries in max_by. It should, but it doesn't.\n -- Fortunately, we know that the geometries are identical when partitioned\n -- by _tmp_sjoin_id, so we can just choose any_value.\n any_value({{ left_geom }}) as {{ left_geom }},\n {% for lcol in left_cols -%}\n -- max_by returns null if all the values in a group are null. So if we have a left\n -- join, we need to guard against nulls with a coalesce to return the single value\n max_by({{ lcol }}, coalesce(_tmp_sjoin_intersection, 1.0)) as {{ lcol }},\n {% endfor -%}\n {% for rcol in right_cols -%}\n -- max_by returns null if all the values in a group are null. So if we have a left\n -- join, we need to guard against nulls with a coalesce to return the single value\n max_by({{ rcol }}, coalesce(_tmp_sjoin_intersection, 1.0)) as {{ rcol }}{{ \",\" if not loop.last }}\n {% endfor -%}\n from {{ prefix }}_joined\n group by _tmp_sjoin_id\n)\n\nselect * from {{ prefix }}_deduplicated\n{%- endmacro -%}", "depends_on": {"macros": []}, "description": "Macro to perform a spatial join between two relations with deduplication of the\ngeometries in the left table. For all left geometries that satisfy the predicate\nfor more than one geometry in the right table, we compute their intersection and\nthen choose the left geometry with the greatest intersection.\n", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": "dse_analytics://macros/_macros.yml", "arguments": [{"name": "left_model", "type": "string", "description": "The left model to join. Can be a relation or CTE."}, {"name": "right_model", "type": "string", "description": "The right model to join. 
Can be a relation or CTE."}, {"name": "left_cols", "type": "list of strings", "description": "List columns to keep from the left table\n(excluding the geometry column, which is always retained)\n"}, {"name": "right_cols", "type": "list of strings", "description": "List of columns to keep from the right table\n(excluding the geometry column, which is never retained).\nCannot share any names with left_cols\n"}, {"name": "left_geom", "type": "string", "description": "The name of the left geometry column, defaults to \"geometry\""}, {"name": "right_geom", "type": "string", "description": "The name of the right geometry column, defaults to \"geometry\""}, {"name": "op", "type": null, "description": "The spatial predicate function to choose,\ndefaults to \"st_intersects\"\n"}, {"name": "kind", "type": "string", "description": "The kind of join, either \"left\" or \"inner\". Defaults to \"left\""}, {"name": "prefix", "type": "string", "description": "An optional prefix to give to temporary CTEs to improve legibility and\navoid name collisions."}], "created_at": 1701973267.885392, "supported_languages": null}, "macro.dse_analytics.generate_schema_name": {"name": "generate_schema_name", "resource_type": "macro", "package_name": "dse_analytics", "path": "macros/get_custom_schema.sql", "original_file_path": "macros/get_custom_schema.sql", "unique_id": "macro.dse_analytics.generate_schema_name", "macro_sql": "{% macro generate_schema_name(custom_schema_name, node) -%}\n\n{#\n Definitions:\n - custom_schema_name: schema provided via dbt_project.yml or model config\n - target.name: name of the target (dev for local development, prod for production, etc.)\n - target.schema: schema provided by the target defined in profiles.yml\n\n Rather than write to a schema prefixed with target.schema, we instead just write\n to the actual schema name, and get safety by separating dev and prod databases.\n If we start to experience analytics engineers stepping on each others toes in\n dev, we may want to restore prefixes there (while maintaining a prefix-free\n lifestyle in prod).\n #}\n {%- if custom_schema_name is none -%} {{ target.schema.lower() | trim }}\n\n{%- elif target.name == 'prd' -%} {{ custom_schema_name.lower() | trim }}\n\n{%- else -%} {{ target.schema.lower() | trim }}_{{ custom_schema_name | trim }}\n\n {%- endif -%}\n\n{%- endmacro %}", "depends_on": {"macros": []}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973266.734302, "supported_languages": null}, "macro.dse_analytics.map_class_fips": {"name": "map_class_fips", "resource_type": "macro", "package_name": "dse_analytics", "path": "macros/map_class_fp.sql", "original_file_path": "macros/map_class_fp.sql", "unique_id": "macro.dse_analytics.map_class_fips", "macro_sql": "{% macro map_class_fips(class_fips, k, v) -%}\n\n{#\n Class Codes source: https://www.census.gov/library/reference/code-lists/class-codes.html\n#}\n\n{% set class_fips_dict = {\n \"M2\" : \"A military or other defense installation entirely within a place\",\n \"C1\" : \"An active incorporated place that does not serve as a county subdivision equivalent\",\n \"U1\" : \"A census designated place with an official federally recognized name\",\n \"U2\" : \"A census designated place without an official federally recognized name\"\n} -%}\n\ncase\n {% for k, v in class_fips_dict.items() -%}\n when \"{{ class_fips }}\" = '{{ k }}'\n then '{{ v }}'\n {% endfor -%}\nend\n\n{%- endmacro %}", "depends_on": 
{"macros": []}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973266.7354248, "supported_languages": null}, "macro.dbt_snowflake.snowflake__get_catalog": {"name": "snowflake__get_catalog", "resource_type": "macro", "package_name": "dbt_snowflake", "path": "macros/catalog.sql", "original_file_path": "macros/catalog.sql", "unique_id": "macro.dbt_snowflake.snowflake__get_catalog", "macro_sql": "{% macro snowflake__get_catalog(information_schema, schemas) -%}\n {% set query %}\n with tables as (\n\n select\n table_catalog as \"table_database\",\n table_schema as \"table_schema\",\n table_name as \"table_name\",\n table_type as \"table_type\",\n comment as \"table_comment\",\n\n -- note: this is the _role_ that owns the table\n table_owner as \"table_owner\",\n\n 'Clustering Key' as \"stats:clustering_key:label\",\n clustering_key as \"stats:clustering_key:value\",\n 'The key used to cluster this table' as \"stats:clustering_key:description\",\n (clustering_key is not null) as \"stats:clustering_key:include\",\n\n 'Row Count' as \"stats:row_count:label\",\n row_count as \"stats:row_count:value\",\n 'An approximate count of rows in this table' as \"stats:row_count:description\",\n (row_count is not null) as \"stats:row_count:include\",\n\n 'Approximate Size' as \"stats:bytes:label\",\n bytes as \"stats:bytes:value\",\n 'Approximate size of the table as reported by Snowflake' as \"stats:bytes:description\",\n (bytes is not null) as \"stats:bytes:include\",\n\n 'Last Modified' as \"stats:last_modified:label\",\n to_varchar(convert_timezone('UTC', last_altered), 'yyyy-mm-dd HH24:MI'||'UTC') as \"stats:last_modified:value\",\n 'The timestamp for last update/change' as \"stats:last_modified:description\",\n (last_altered is not null and table_type='BASE TABLE') as \"stats:last_modified:include\"\n\n from {{ information_schema }}.tables\n where (\n {%- for schema in schemas -%}\n upper(\"table_schema\") = upper('{{ schema }}'){%- if not loop.last %} or {% endif -%}\n {%- endfor -%}\n )\n\n ),\n\n columns as (\n\n select\n table_catalog as \"table_database\",\n table_schema as \"table_schema\",\n table_name as \"table_name\",\n\n column_name as \"column_name\",\n ordinal_position as \"column_index\",\n data_type as \"column_type\",\n comment as \"column_comment\"\n\n from {{ information_schema }}.columns\n where (\n {%- for schema in schemas -%}\n upper(\"table_schema\") = upper('{{ schema }}'){%- if not loop.last %} or {% endif -%}\n {%- endfor -%}\n )\n )\n\n select *\n from tables\n join columns using (\"table_database\", \"table_schema\", \"table_name\")\n order by \"column_index\"\n {%- endset -%}\n\n {{ return(run_query(query)) }}\n\n{%- endmacro %}", "depends_on": {"macros": ["macro.dbt.run_query"]}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973266.737503, "supported_languages": null}, "macro.dbt_snowflake.snowflake__create_table_as": {"name": "snowflake__create_table_as", "resource_type": "macro", "package_name": "dbt_snowflake", "path": "macros/adapters.sql", "original_file_path": "macros/adapters.sql", "unique_id": "macro.dbt_snowflake.snowflake__create_table_as", "macro_sql": "{% macro snowflake__create_table_as(temporary, relation, compiled_code, language='sql') -%}\n {%- if language == 'sql' -%}\n {%- set transient = config.get('transient', default=true) -%}\n {%- set cluster_by_keys = config.get('cluster_by', 
default=none) -%}\n {%- set enable_automatic_clustering = config.get('automatic_clustering', default=false) -%}\n {%- set copy_grants = config.get('copy_grants', default=false) -%}\n\n {%- if cluster_by_keys is not none and cluster_by_keys is string -%}\n {%- set cluster_by_keys = [cluster_by_keys] -%}\n {%- endif -%}\n {%- if cluster_by_keys is not none -%}\n {%- set cluster_by_string = cluster_by_keys|join(\", \")-%}\n {% else %}\n {%- set cluster_by_string = none -%}\n {%- endif -%}\n {%- set sql_header = config.get('sql_header', none) -%}\n\n {{ sql_header if sql_header is not none }}\n\n create or replace {% if temporary -%}\n temporary\n {%- elif transient -%}\n transient\n {%- endif %} table {{ relation }}\n {%- set contract_config = config.get('contract') -%}\n {%- if contract_config.enforced -%}\n {{ get_assert_columns_equivalent(sql) }}\n {{ get_table_columns_and_constraints() }}\n {% set compiled_code = get_select_subquery(compiled_code) %}\n {% endif %}\n {% if copy_grants and not temporary -%} copy grants {%- endif %} as\n (\n {%- if cluster_by_string is not none -%}\n select * from (\n {{ compiled_code }}\n ) order by ({{ cluster_by_string }})\n {%- else -%}\n {{ compiled_code }}\n {%- endif %}\n );\n {% if cluster_by_string is not none and not temporary -%}\n alter table {{relation}} cluster by ({{cluster_by_string}});\n {%- endif -%}\n {% if enable_automatic_clustering and cluster_by_string is not none and not temporary -%}\n alter table {{relation}} resume recluster;\n {%- endif -%}\n\n {%- elif language == 'python' -%}\n {{ py_write_table(compiled_code=compiled_code, target_relation=relation, temporary=temporary) }}\n {%- else -%}\n {% do exceptions.raise_compiler_error(\"snowflake__create_table_as macro didn't get supported language, it got %s\" % language) %}\n {%- endif -%}\n\n{% endmacro %}", "depends_on": {"macros": ["macro.dbt.get_assert_columns_equivalent", "macro.dbt.get_table_columns_and_constraints", "macro.dbt.get_select_subquery", "macro.dbt_snowflake.py_write_table"]}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973266.760661, "supported_languages": null}, "macro.dbt_snowflake.get_column_comment_sql": {"name": "get_column_comment_sql", "resource_type": "macro", "package_name": "dbt_snowflake", "path": "macros/adapters.sql", "original_file_path": "macros/adapters.sql", "unique_id": "macro.dbt_snowflake.get_column_comment_sql", "macro_sql": "{% macro get_column_comment_sql(column_name, column_dict) -%}\n {% if (column_name|upper in column_dict) -%}\n {% set matched_column = column_name|upper -%}\n {% elif (column_name|lower in column_dict) -%}\n {% set matched_column = column_name|lower -%}\n {% elif (column_name in column_dict) -%}\n {% set matched_column = column_name -%}\n {% else -%}\n {% set matched_column = None -%}\n {% endif -%}\n {% if matched_column -%}\n {{ adapter.quote(column_name) }} COMMENT $${{ column_dict[matched_column]['description'] | replace('$', '[$]') }}$$\n {%- else -%}\n {{ adapter.quote(column_name) }} COMMENT $$$$\n {%- endif -%}\n{% endmacro %}", "depends_on": {"macros": []}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973266.7621, "supported_languages": null}, "macro.dbt_snowflake.get_persist_docs_column_list": {"name": "get_persist_docs_column_list", "resource_type": "macro", "package_name": "dbt_snowflake", "path": "macros/adapters.sql", "original_file_path": 
"macros/adapters.sql", "unique_id": "macro.dbt_snowflake.get_persist_docs_column_list", "macro_sql": "{% macro get_persist_docs_column_list(model_columns, query_columns) %}\n(\n {% for column_name in query_columns %}\n {{ get_column_comment_sql(column_name, model_columns) }}\n {{- \", \" if not loop.last else \"\" }}\n {% endfor %}\n)\n{% endmacro %}", "depends_on": {"macros": ["macro.dbt_snowflake.get_column_comment_sql"]}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973266.7626705, "supported_languages": null}, "macro.dbt_snowflake.snowflake__create_view_as_with_temp_flag": {"name": "snowflake__create_view_as_with_temp_flag", "resource_type": "macro", "package_name": "dbt_snowflake", "path": "macros/adapters.sql", "original_file_path": "macros/adapters.sql", "unique_id": "macro.dbt_snowflake.snowflake__create_view_as_with_temp_flag", "macro_sql": "{% macro snowflake__create_view_as_with_temp_flag(relation, sql, is_temporary=False) -%}\n {%- set secure = config.get('secure', default=false) -%}\n {%- set copy_grants = config.get('copy_grants', default=false) -%}\n {%- set sql_header = config.get('sql_header', none) -%}\n\n {{ sql_header if sql_header is not none }}\n create or replace {% if secure -%}\n secure\n {%- endif %} {% if is_temporary -%}\n temporary\n {%- endif %} view {{ relation }}\n {% if config.persist_column_docs() -%}\n {% set model_columns = model.columns %}\n {% set query_columns = get_columns_in_query(sql) %}\n {{ get_persist_docs_column_list(model_columns, query_columns) }}\n\n {%- endif %}\n {%- set contract_config = config.get('contract') -%}\n {%- if contract_config.enforced -%}\n {{ get_assert_columns_equivalent(sql) }}\n {%- endif %}\n {% if copy_grants -%} copy grants {%- endif %} as (\n {{ sql }}\n );\n{% endmacro %}", "depends_on": {"macros": ["macro.dbt.get_columns_in_query", "macro.dbt_snowflake.get_persist_docs_column_list", "macro.dbt.get_assert_columns_equivalent"]}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973266.7645836, "supported_languages": null}, "macro.dbt_snowflake.snowflake__create_view_as": {"name": "snowflake__create_view_as", "resource_type": "macro", "package_name": "dbt_snowflake", "path": "macros/adapters.sql", "original_file_path": "macros/adapters.sql", "unique_id": "macro.dbt_snowflake.snowflake__create_view_as", "macro_sql": "{% macro snowflake__create_view_as(relation, sql) -%}\n {{ snowflake__create_view_as_with_temp_flag(relation, sql) }}\n{% endmacro %}", "depends_on": {"macros": ["macro.dbt_snowflake.snowflake__create_view_as_with_temp_flag"]}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973266.7649322, "supported_languages": null}, "macro.dbt_snowflake.snowflake__get_columns_in_relation": {"name": "snowflake__get_columns_in_relation", "resource_type": "macro", "package_name": "dbt_snowflake", "path": "macros/adapters.sql", "original_file_path": "macros/adapters.sql", "unique_id": "macro.dbt_snowflake.snowflake__get_columns_in_relation", "macro_sql": "{% macro snowflake__get_columns_in_relation(relation) -%}\n {%- set sql -%}\n describe table {{ relation }}\n {%- endset -%}\n {%- set result = run_query(sql) -%}\n\n {% set maximum = 10000 %}\n {% if (result | length) >= maximum %}\n {% set msg %}\n Too many columns in relation {{ relation }}! 
dbt can only get\n information about relations with fewer than {{ maximum }} columns.\n {% endset %}\n {% do exceptions.raise_compiler_error(msg) %}\n {% endif %}\n\n {% set columns = [] %}\n {% for row in result %}\n {% do columns.append(api.Column.from_description(row['name'], row['type'])) %}\n {% endfor %}\n {% do return(columns) %}\n{% endmacro %}", "depends_on": {"macros": ["macro.dbt.run_query"]}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973266.7664557, "supported_languages": null}, "macro.dbt_snowflake.snowflake__list_schemas": {"name": "snowflake__list_schemas", "resource_type": "macro", "package_name": "dbt_snowflake", "path": "macros/adapters.sql", "original_file_path": "macros/adapters.sql", "unique_id": "macro.dbt_snowflake.snowflake__list_schemas", "macro_sql": "{% macro snowflake__list_schemas(database) -%}\n {# 10k limit from here: https://docs.snowflake.net/manuals/sql-reference/sql/show-schemas.html#usage-notes #}\n {% set maximum = 10000 %}\n {% set sql -%}\n show terse schemas in database {{ database }}\n limit {{ maximum }}\n {%- endset %}\n {% set result = run_query(sql) %}\n {% if (result | length) >= maximum %}\n {% set msg %}\n Too many schemas in database {{ database }}! dbt can only get\n information about databases with fewer than {{ maximum }} schemas.\n {% endset %}\n {% do exceptions.raise_compiler_error(msg) %}\n {% endif %}\n {{ return(result) }}\n{% endmacro %}", "depends_on": {"macros": ["macro.dbt.run_query"]}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973266.7675548, "supported_languages": null}, "macro.dbt_snowflake.snowflake__get_paginated_relations_array": {"name": "snowflake__get_paginated_relations_array", "resource_type": "macro", "package_name": "dbt_snowflake", "path": "macros/adapters.sql", "original_file_path": "macros/adapters.sql", "unique_id": "macro.dbt_snowflake.snowflake__get_paginated_relations_array", "macro_sql": "{% macro snowflake__get_paginated_relations_array(max_iter, max_results_per_iter, max_total_results, schema_relation, watermark) %}\n\n {% set paginated_relations = [] %}\n\n {% for _ in range(0, max_iter) %}\n\n {%- set paginated_sql -%}\n show terse objects in {{ schema_relation }} limit {{ max_results_per_iter }} from '{{ watermark.table_name }}'\n {%- endset -%}\n\n {%- set paginated_result = run_query(paginated_sql) %}\n {%- set paginated_n = (paginated_result | length) -%}\n\n {#\n terminating condition: if there are 0 records in the result we reached\n the end exactly on the previous iteration\n #}\n {%- if paginated_n == 0 -%}\n {%- break -%}\n {%- endif -%}\n\n {#\n terminating condition: At some point the user needs to be reasonable with how\n many objects are contained in their schemas. Since there was already\n one iteration before attempting pagination, loop.index == max_iter means\n the limit has been surpassed.\n #}\n\n {%- if loop.index == max_iter -%}\n {%- set msg -%}\n dbt will list a maximum of {{ max_total_results }} objects in schema {{ schema_relation }}.\n Your schema exceeds this limit. 
Please contact support@getdbt.com for troubleshooting tips,\n or review and reduce the number of objects contained.\n {%- endset -%}\n\n {% do exceptions.raise_compiler_error(msg) %}\n {%- endif -%}\n\n {%- do paginated_relations.append(paginated_result) -%}\n {% set watermark.table_name = paginated_result.columns[1].values()[-1] %}\n\n {#\n terminating condition: paginated_n < max_results_per_iter means we reached the end\n #}\n {%- if paginated_n < max_results_per_iter -%}\n {%- break -%}\n {%- endif -%}\n {%- endfor -%}\n\n {{ return(paginated_relations) }}\n\n{% endmacro %}", "depends_on": {"macros": ["macro.dbt.run_query"]}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973266.7695997, "supported_languages": null}, "macro.dbt_snowflake.snowflake__list_relations_without_caching": {"name": "snowflake__list_relations_without_caching", "resource_type": "macro", "package_name": "dbt_snowflake", "path": "macros/adapters.sql", "original_file_path": "macros/adapters.sql", "unique_id": "macro.dbt_snowflake.snowflake__list_relations_without_caching", "macro_sql": "{% macro snowflake__list_relations_without_caching(schema_relation, max_iter=10, max_results_per_iter=10000) %}\n\n {%- set max_total_results = max_results_per_iter * max_iter -%}\n\n {%- set sql -%}\n show terse objects in {{ schema_relation }} limit {{ max_results_per_iter }}\n {%- endset -%}\n\n {%- set result = run_query(sql) -%}\n\n {%- set n = (result | length) -%}\n {%- set watermark = namespace(table_name=result.columns[1].values()[-1]) -%}\n {%- set paginated = namespace(result=[]) -%}\n\n {% if n >= max_results_per_iter %}\n\n {% set paginated.result = snowflake__get_paginated_relations_array(\n max_iter,\n max_results_per_iter,\n max_total_results,\n schema_relation,\n watermark\n )\n %}\n\n {% endif %}\n\n {%- set all_results_array = [result] + paginated.result -%}\n {%- set result = result.merge(all_results_array) -%}\n {%- do return(result) -%}\n\n{% endmacro %}", "depends_on": {"macros": ["macro.dbt.run_query", "macro.dbt_snowflake.snowflake__get_paginated_relations_array"]}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973266.7714348, "supported_languages": null}, "macro.dbt_snowflake.snowflake__check_schema_exists": {"name": "snowflake__check_schema_exists", "resource_type": "macro", "package_name": "dbt_snowflake", "path": "macros/adapters.sql", "original_file_path": "macros/adapters.sql", "unique_id": "macro.dbt_snowflake.snowflake__check_schema_exists", "macro_sql": "{% macro snowflake__check_schema_exists(information_schema, schema) -%}\n {% call statement('check_schema_exists', fetch_result=True) -%}\n select count(*)\n from {{ information_schema }}.schemata\n where upper(schema_name) = upper('{{ schema }}')\n and upper(catalog_name) = upper('{{ information_schema.database }}')\n {%- endcall %}\n {{ return(load_result('check_schema_exists').table) }}\n{%- endmacro %}", "depends_on": {"macros": ["macro.dbt.statement"]}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973266.772086, "supported_languages": null}, "macro.dbt_snowflake.snowflake__rename_relation": {"name": "snowflake__rename_relation", "resource_type": "macro", "package_name": "dbt_snowflake", "path": "macros/adapters.sql", "original_file_path": "macros/adapters.sql", "unique_id": 
"macro.dbt_snowflake.snowflake__rename_relation", "macro_sql": "{% macro snowflake__rename_relation(from_relation, to_relation) -%}\n {% call statement('rename_relation') -%}\n alter table {{ from_relation }} rename to {{ to_relation }}\n {%- endcall %}\n{% endmacro %}", "depends_on": {"macros": ["macro.dbt.statement"]}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973266.7724845, "supported_languages": null}, "macro.dbt_snowflake.snowflake__alter_column_type": {"name": "snowflake__alter_column_type", "resource_type": "macro", "package_name": "dbt_snowflake", "path": "macros/adapters.sql", "original_file_path": "macros/adapters.sql", "unique_id": "macro.dbt_snowflake.snowflake__alter_column_type", "macro_sql": "{% macro snowflake__alter_column_type(relation, column_name, new_column_type) -%}\n {% call statement('alter_column_type') %}\n alter table {{ relation }} alter {{ adapter.quote(column_name) }} set data type {{ new_column_type }};\n {% endcall %}\n{% endmacro %}", "depends_on": {"macros": ["macro.dbt.statement"]}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973266.7730079, "supported_languages": null}, "macro.dbt_snowflake.snowflake__alter_relation_comment": {"name": "snowflake__alter_relation_comment", "resource_type": "macro", "package_name": "dbt_snowflake", "path": "macros/adapters.sql", "original_file_path": "macros/adapters.sql", "unique_id": "macro.dbt_snowflake.snowflake__alter_relation_comment", "macro_sql": "{% macro snowflake__alter_relation_comment(relation, relation_comment) -%}\n comment on {{ relation.type }} {{ relation }} IS $${{ relation_comment | replace('$', '[$]') }}$$;\n{% endmacro %}", "depends_on": {"macros": []}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973266.7734144, "supported_languages": null}, "macro.dbt_snowflake.snowflake__alter_column_comment": {"name": "snowflake__alter_column_comment", "resource_type": "macro", "package_name": "dbt_snowflake", "path": "macros/adapters.sql", "original_file_path": "macros/adapters.sql", "unique_id": "macro.dbt_snowflake.snowflake__alter_column_comment", "macro_sql": "{% macro snowflake__alter_column_comment(relation, column_dict) -%}\n {% set existing_columns = adapter.get_columns_in_relation(relation) | map(attribute=\"name\") | list %}\n alter {{ relation.type }} {{ relation }} alter\n {% for column_name in existing_columns if (column_name in existing_columns) or (column_name|lower in existing_columns) %}\n {{ get_column_comment_sql(column_name, column_dict) }} {{- ',' if not loop.last else ';' }}\n {% endfor %}\n{% endmacro %}", "depends_on": {"macros": ["macro.dbt_snowflake.get_column_comment_sql"]}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973266.7744172, "supported_languages": null}, "macro.dbt_snowflake.get_current_query_tag": {"name": "get_current_query_tag", "resource_type": "macro", "package_name": "dbt_snowflake", "path": "macros/adapters.sql", "original_file_path": "macros/adapters.sql", "unique_id": "macro.dbt_snowflake.get_current_query_tag", "macro_sql": "{% macro get_current_query_tag() -%}\n {{ return(run_query(\"show parameters like 'query_tag' in session\").rows[0]['value']) }}\n{% endmacro %}", "depends_on": {"macros": ["macro.dbt.run_query"]}, 
"description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973266.7747827, "supported_languages": null}, "macro.dbt_snowflake.set_query_tag": {"name": "set_query_tag", "resource_type": "macro", "package_name": "dbt_snowflake", "path": "macros/adapters.sql", "original_file_path": "macros/adapters.sql", "unique_id": "macro.dbt_snowflake.set_query_tag", "macro_sql": "{% macro set_query_tag() -%}\n {{ return(adapter.dispatch('set_query_tag', 'dbt')()) }}\n{% endmacro %}", "depends_on": {"macros": ["macro.dbt_snowflake.snowflake__set_query_tag"]}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973266.775114, "supported_languages": null}, "macro.dbt_snowflake.snowflake__set_query_tag": {"name": "snowflake__set_query_tag", "resource_type": "macro", "package_name": "dbt_snowflake", "path": "macros/adapters.sql", "original_file_path": "macros/adapters.sql", "unique_id": "macro.dbt_snowflake.snowflake__set_query_tag", "macro_sql": "{% macro snowflake__set_query_tag() -%}\n {% set new_query_tag = config.get('query_tag') %}\n {% if new_query_tag %}\n {% set original_query_tag = get_current_query_tag() %}\n {{ log(\"Setting query_tag to '\" ~ new_query_tag ~ \"'. Will reset to '\" ~ original_query_tag ~ \"' after materialization.\") }}\n {% do run_query(\"alter session set query_tag = '{}'\".format(new_query_tag)) %}\n {{ return(original_query_tag)}}\n {% endif %}\n {{ return(none)}}\n{% endmacro %}", "depends_on": {"macros": ["macro.dbt_snowflake.get_current_query_tag", "macro.dbt.run_query"]}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973266.7760801, "supported_languages": null}, "macro.dbt_snowflake.unset_query_tag": {"name": "unset_query_tag", "resource_type": "macro", "package_name": "dbt_snowflake", "path": "macros/adapters.sql", "original_file_path": "macros/adapters.sql", "unique_id": "macro.dbt_snowflake.unset_query_tag", "macro_sql": "{% macro unset_query_tag(original_query_tag) -%}\n {{ return(adapter.dispatch('unset_query_tag', 'dbt')(original_query_tag)) }}\n{% endmacro %}", "depends_on": {"macros": ["macro.dbt_snowflake.snowflake__unset_query_tag"]}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973266.7764618, "supported_languages": null}, "macro.dbt_snowflake.snowflake__unset_query_tag": {"name": "snowflake__unset_query_tag", "resource_type": "macro", "package_name": "dbt_snowflake", "path": "macros/adapters.sql", "original_file_path": "macros/adapters.sql", "unique_id": "macro.dbt_snowflake.snowflake__unset_query_tag", "macro_sql": "{% macro snowflake__unset_query_tag(original_query_tag) -%}\n {% set new_query_tag = config.get('query_tag') %}\n {% if new_query_tag %}\n {% if original_query_tag %}\n {{ log(\"Resetting query_tag to '\" ~ original_query_tag ~ \"'.\") }}\n {% do run_query(\"alter session set query_tag = '{}'\".format(original_query_tag)) %}\n {% else %}\n {{ log(\"No original query_tag, unsetting parameter.\") }}\n {% do run_query(\"alter session unset query_tag\") %}\n {% endif %}\n {% endif %}\n{% endmacro %}", "depends_on": {"macros": ["macro.dbt.run_query"]}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973266.7774763, "supported_languages": null}, 
"macro.dbt_snowflake.snowflake__alter_relation_add_remove_columns": {"name": "snowflake__alter_relation_add_remove_columns", "resource_type": "macro", "package_name": "dbt_snowflake", "path": "macros/adapters.sql", "original_file_path": "macros/adapters.sql", "unique_id": "macro.dbt_snowflake.snowflake__alter_relation_add_remove_columns", "macro_sql": "{% macro snowflake__alter_relation_add_remove_columns(relation, add_columns, remove_columns) %}\n\n {% if add_columns %}\n\n {% set sql -%}\n alter {{ relation.type }} {{ relation }} add column\n {% for column in add_columns %}\n {{ column.name }} {{ column.data_type }}{{ ',' if not loop.last }}\n {% endfor %}\n {%- endset -%}\n\n {% do run_query(sql) %}\n\n {% endif %}\n\n {% if remove_columns %}\n\n {% set sql -%}\n alter {{ relation.type }} {{ relation }} drop column\n {% for column in remove_columns %}\n {{ column.name }}{{ ',' if not loop.last }}\n {% endfor %}\n {%- endset -%}\n\n {% do run_query(sql) %}\n\n {% endif %}\n\n{% endmacro %}", "depends_on": {"macros": ["macro.dbt.run_query"]}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973266.7791288, "supported_languages": null}, "macro.dbt_snowflake.snowflake_dml_explicit_transaction": {"name": "snowflake_dml_explicit_transaction", "resource_type": "macro", "package_name": "dbt_snowflake", "path": "macros/adapters.sql", "original_file_path": "macros/adapters.sql", "unique_id": "macro.dbt_snowflake.snowflake_dml_explicit_transaction", "macro_sql": "{% macro snowflake_dml_explicit_transaction(dml) %}\n {#\n Use this macro to wrap all INSERT, MERGE, UPDATE, DELETE, and TRUNCATE\n statements before passing them into run_query(), or calling in the 'main' statement\n of a materialization\n #}\n {% set dml_transaction -%}\n begin;\n {{ dml }};\n commit;\n {%- endset %}\n\n {% do return(dml_transaction) %}\n\n{% endmacro %}", "depends_on": {"macros": []}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973266.7795672, "supported_languages": null}, "macro.dbt_snowflake.snowflake__truncate_relation": {"name": "snowflake__truncate_relation", "resource_type": "macro", "package_name": "dbt_snowflake", "path": "macros/adapters.sql", "original_file_path": "macros/adapters.sql", "unique_id": "macro.dbt_snowflake.snowflake__truncate_relation", "macro_sql": "{% macro snowflake__truncate_relation(relation) -%}\n {% set truncate_dml %}\n truncate table {{ relation }}\n {% endset %}\n {% call statement('truncate_relation') -%}\n {{ snowflake_dml_explicit_transaction(truncate_dml) }}\n {%- endcall %}\n{% endmacro %}", "depends_on": {"macros": ["macro.dbt.statement", "macro.dbt_snowflake.snowflake_dml_explicit_transaction"]}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973266.7800734, "supported_languages": null}, "macro.dbt_snowflake.snowflake__drop_relation": {"name": "snowflake__drop_relation", "resource_type": "macro", "package_name": "dbt_snowflake", "path": "macros/adapters.sql", "original_file_path": "macros/adapters.sql", "unique_id": "macro.dbt_snowflake.snowflake__drop_relation", "macro_sql": "{% macro snowflake__drop_relation(relation) -%}\n {%- if relation.is_dynamic_table -%}\n {% call statement('drop_relation', auto_begin=False) -%}\n drop dynamic table if exists {{ relation }}\n {%- endcall %}\n {%- else -%}\n {{- 
default__drop_relation(relation) -}}\n {%- endif -%}\n{% endmacro %}", "depends_on": {"macros": ["macro.dbt.statement", "macro.dbt.default__drop_relation"]}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973266.7806501, "supported_languages": null}, "macro.dbt_snowflake.snowflake__copy_grants": {"name": "snowflake__copy_grants", "resource_type": "macro", "package_name": "dbt_snowflake", "path": "macros/apply_grants.sql", "original_file_path": "macros/apply_grants.sql", "unique_id": "macro.dbt_snowflake.snowflake__copy_grants", "macro_sql": "{% macro snowflake__copy_grants() %}\n {% set copy_grants = config.get('copy_grants', False) %}\n {{ return(copy_grants) }}\n{% endmacro %}", "depends_on": {"macros": []}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973266.781205, "supported_languages": null}, "macro.dbt_snowflake.snowflake__support_multiple_grantees_per_dcl_statement": {"name": "snowflake__support_multiple_grantees_per_dcl_statement", "resource_type": "macro", "package_name": "dbt_snowflake", "path": "macros/apply_grants.sql", "original_file_path": "macros/apply_grants.sql", "unique_id": "macro.dbt_snowflake.snowflake__support_multiple_grantees_per_dcl_statement", "macro_sql": "\n\n{%- macro snowflake__support_multiple_grantees_per_dcl_statement() -%}\n {{ return(False) }}\n{%- endmacro -%}", "depends_on": {"macros": []}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973266.7814355, "supported_languages": null}, "macro.dbt_snowflake.snowflake__load_csv_rows": {"name": "snowflake__load_csv_rows", "resource_type": "macro", "package_name": "dbt_snowflake", "path": "macros/materializations/seed.sql", "original_file_path": "macros/materializations/seed.sql", "unique_id": "macro.dbt_snowflake.snowflake__load_csv_rows", "macro_sql": "{% macro snowflake__load_csv_rows(model, agate_table) %}\n {% set batch_size = get_batch_size() %}\n {% set cols_sql = get_seed_column_quoted_csv(model, agate_table.column_names) %}\n {% set bindings = [] %}\n\n {% set statements = [] %}\n\n {% for chunk in agate_table.rows | batch(batch_size) %}\n {% set bindings = [] %}\n\n {% for row in chunk %}\n {% do bindings.extend(row) %}\n {% endfor %}\n\n {% set sql %}\n insert into {{ this.render() }} ({{ cols_sql }}) values\n {% for row in chunk -%}\n ({%- for column in agate_table.column_names -%}\n %s\n {%- if not loop.last%},{%- endif %}\n {%- endfor -%})\n {%- if not loop.last%},{%- endif %}\n {%- endfor %}\n {% endset %}\n\n {% do adapter.add_query('BEGIN', auto_begin=False) %}\n {% do adapter.add_query(sql, bindings=bindings, abridge_sql_log=True) %}\n {% do adapter.add_query('COMMIT', auto_begin=False) %}\n\n {% if loop.index0 == 0 %}\n {% do statements.append(sql) %}\n {% endif %}\n {% endfor %}\n\n {# Return SQL so we can render it out into the compiled files #}\n {{ return(statements[0]) }}\n{% endmacro %}", "depends_on": {"macros": ["macro.dbt.get_batch_size", "macro.dbt.get_seed_column_quoted_csv"]}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973266.785286, "supported_languages": null}, "macro.dbt_snowflake.materialization_seed_snowflake": {"name": "materialization_seed_snowflake", "resource_type": "macro", "package_name": "dbt_snowflake", "path": 
"macros/materializations/seed.sql", "original_file_path": "macros/materializations/seed.sql", "unique_id": "macro.dbt_snowflake.materialization_seed_snowflake", "macro_sql": "{% materialization seed, adapter='snowflake' %}\n {% set original_query_tag = set_query_tag() %}\n\n {% set relations = materialization_seed_default() %}\n\n {% do unset_query_tag(original_query_tag) %}\n\n {{ return(relations) }}\n{% endmaterialization %}", "depends_on": {"macros": ["macro.dbt_snowflake.set_query_tag", "macro.dbt.materialization_seed_default", "macro.dbt_snowflake.unset_query_tag"]}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973266.785899, "supported_languages": ["sql"]}, "macro.dbt_snowflake.materialization_view_snowflake": {"name": "materialization_view_snowflake", "resource_type": "macro", "package_name": "dbt_snowflake", "path": "macros/materializations/view.sql", "original_file_path": "macros/materializations/view.sql", "unique_id": "macro.dbt_snowflake.materialization_view_snowflake", "macro_sql": "{% materialization view, adapter='snowflake' -%}\n\n {% set original_query_tag = set_query_tag() %}\n {% set to_return = create_or_replace_view() %}\n\n {% set target_relation = this.incorporate(type='view') %}\n\n {% do persist_docs(target_relation, model, for_columns=false) %}\n\n {% do unset_query_tag(original_query_tag) %}\n\n {% do return(to_return) %}\n\n{%- endmaterialization %}", "depends_on": {"macros": ["macro.dbt_snowflake.set_query_tag", "macro.dbt.create_or_replace_view", "macro.dbt.persist_docs", "macro.dbt_snowflake.unset_query_tag"]}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973266.7870145, "supported_languages": ["sql"]}, "macro.dbt_snowflake.materialization_test_snowflake": {"name": "materialization_test_snowflake", "resource_type": "macro", "package_name": "dbt_snowflake", "path": "macros/materializations/test.sql", "original_file_path": "macros/materializations/test.sql", "unique_id": "macro.dbt_snowflake.materialization_test_snowflake", "macro_sql": "{%- materialization test, adapter='snowflake' -%}\n\n {% set original_query_tag = set_query_tag() %}\n {% set relations = materialization_test_default() %}\n {% do unset_query_tag(original_query_tag) %}\n {{ return(relations) }}\n\n{%- endmaterialization -%}", "depends_on": {"macros": ["macro.dbt_snowflake.set_query_tag", "macro.dbt.materialization_test_default", "macro.dbt_snowflake.unset_query_tag"]}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973266.7877257, "supported_languages": ["sql"]}, "macro.dbt_snowflake.dbt_snowflake_get_tmp_relation_type": {"name": "dbt_snowflake_get_tmp_relation_type", "resource_type": "macro", "package_name": "dbt_snowflake", "path": "macros/materializations/incremental.sql", "original_file_path": "macros/materializations/incremental.sql", "unique_id": "macro.dbt_snowflake.dbt_snowflake_get_tmp_relation_type", "macro_sql": "{% macro dbt_snowflake_get_tmp_relation_type(strategy, unique_key, language) %}\n{%- set tmp_relation_type = config.get('tmp_relation_type') -%}\n /* {#\n High-level principles:\n If we are running multiple statements (DELETE + INSERT),\n and we want to guarantee identical inputs to both statements,\n then we must first save the model query results as a temporary table\n (which presumably comes with a performance cost).\n If 
we are running a single statement (MERGE or INSERT alone),\n we _may_ save the model query definition as a view instead,\n for (presumably) faster overall incremental processing.\n\n Low-level specifics:\n If an invalid option is specified, then we will raise an\n excpetion with corresponding message.\n\n Languages other than SQL (like Python) will use a temporary table.\n With the default strategy of merge, the user may choose between a temporary\n table and view (defaulting to view).\n\n The append strategy can use a view because it will run a single INSERT statement.\n\n When unique_key is none, the delete+insert strategy can use a view beacuse a\n single INSERT statement is run with no DELETES as part of the statement.\n Otherwise, play it safe by using a temporary table.\n #} */\n\n {% if language == \"python\" and tmp_relation_type is not none %}\n {% do exceptions.raise_compiler_error(\n \"Python models currently only support 'table' for tmp_relation_type but \"\n ~ tmp_relation_type ~ \" was specified.\"\n ) %}\n {% endif %}\n\n {% if strategy == \"delete+insert\" and tmp_relation_type is not none and tmp_relation_type != \"table\" and unique_key is not none %}\n {% do exceptions.raise_compiler_error(\n \"In order to maintain consistent results when `unique_key` is not none,\n the `delete+insert` strategy only supports `table` for `tmp_relation_type` but \"\n ~ tmp_relation_type ~ \" was specified.\"\n )\n %}\n {% endif %}\n\n {% if language != \"sql\" %}\n {{ return(\"table\") }}\n {% elif tmp_relation_type == \"table\" %}\n {{ return(\"table\") }}\n {% elif tmp_relation_type == \"view\" %}\n {{ return(\"view\") }}\n {% elif strategy in (\"default\", \"merge\", \"append\") %}\n {{ return(\"view\") }}\n {% elif strategy == \"delete+insert\" and unique_key is none %}\n {{ return(\"view\") }}\n {% else %}\n {{ return(\"table\") }}\n {% endif %}\n{% endmacro %}", "depends_on": {"macros": []}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973266.7921903, "supported_languages": null}, "macro.dbt_snowflake.materialization_incremental_snowflake": {"name": "materialization_incremental_snowflake", "resource_type": "macro", "package_name": "dbt_snowflake", "path": "macros/materializations/incremental.sql", "original_file_path": "macros/materializations/incremental.sql", "unique_id": "macro.dbt_snowflake.materialization_incremental_snowflake", "macro_sql": "{% materialization incremental, adapter='snowflake', supported_languages=['sql', 'python'] -%}\n\n {% set original_query_tag = set_query_tag() %}\n\n {#-- Set vars --#}\n {%- set full_refresh_mode = (should_full_refresh()) -%}\n {%- set language = model['language'] -%}\n {% set target_relation = this %}\n {% set existing_relation = load_relation(this) %}\n\n {#-- The temp relation will be a view (faster) or temp table, depending on upsert/merge strategy --#}\n {%- set unique_key = config.get('unique_key') -%}\n {% set incremental_strategy = config.get('incremental_strategy') or 'default' %}\n {% set tmp_relation_type = dbt_snowflake_get_tmp_relation_type(incremental_strategy, unique_key, language) %}\n {% set tmp_relation = make_temp_relation(this).incorporate(type=tmp_relation_type) %}\n\n {% set grant_config = config.get('grants') %}\n\n {% set on_schema_change = incremental_validate_on_schema_change(config.get('on_schema_change'), default='ignore') %}\n\n {{ run_hooks(pre_hooks) }}\n\n {% if existing_relation is none %}\n {%- call statement('main', 
language=language) -%}\n {{ create_table_as(False, target_relation, compiled_code, language) }}\n {%- endcall -%}\n\n {% elif existing_relation.is_view %}\n {#-- Can't overwrite a view with a table - we must drop --#}\n {{ log(\"Dropping relation \" ~ target_relation ~ \" because it is a view and this model is a table.\") }}\n {% do adapter.drop_relation(existing_relation) %}\n {%- call statement('main', language=language) -%}\n {{ create_table_as(False, target_relation, compiled_code, language) }}\n {%- endcall -%}\n {% elif full_refresh_mode %}\n {%- call statement('main', language=language) -%}\n {{ create_table_as(False, target_relation, compiled_code, language) }}\n {%- endcall -%}\n\n {% else %}\n {#-- Create the temp relation, either as a view or as a temp table --#}\n {% if tmp_relation_type == 'view' %}\n {%- call statement('create_tmp_relation') -%}\n {{ snowflake__create_view_as_with_temp_flag(tmp_relation, compiled_code, True) }}\n {%- endcall -%}\n {% else %}\n {%- call statement('create_tmp_relation', language=language) -%}\n {{ create_table_as(True, tmp_relation, compiled_code, language) }}\n {%- endcall -%}\n {% endif %}\n\n {% do adapter.expand_target_column_types(\n from_relation=tmp_relation,\n to_relation=target_relation) %}\n {#-- Process schema changes. Returns dict of changes if successful. Use source columns for upserting/merging --#}\n {% set dest_columns = process_schema_changes(on_schema_change, tmp_relation, existing_relation) %}\n {% if not dest_columns %}\n {% set dest_columns = adapter.get_columns_in_relation(existing_relation) %}\n {% endif %}\n\n {#-- Get the incremental_strategy, the macro to use for the strategy, and build the sql --#}\n {% set incremental_predicates = config.get('predicates', none) or config.get('incremental_predicates', none) %}\n {% set strategy_sql_macro_func = adapter.get_incremental_strategy_macro(context, incremental_strategy) %}\n {% set strategy_arg_dict = ({'target_relation': target_relation, 'temp_relation': tmp_relation, 'unique_key': unique_key, 'dest_columns': dest_columns, 'incremental_predicates': incremental_predicates }) %}\n\n {%- call statement('main') -%}\n {{ strategy_sql_macro_func(strategy_arg_dict) }}\n {%- endcall -%}\n {% endif %}\n\n {% do drop_relation_if_exists(tmp_relation) %}\n\n {{ run_hooks(post_hooks) }}\n\n {% set target_relation = target_relation.incorporate(type='table') %}\n\n {% set should_revoke =\n should_revoke(existing_relation.is_table, full_refresh_mode) %}\n {% do apply_grants(target_relation, grant_config, should_revoke=should_revoke) %}\n\n {% do persist_docs(target_relation, model) %}\n\n {% do unset_query_tag(original_query_tag) %}\n\n {{ return({'relations': [target_relation]}) }}\n\n{%- endmaterialization %}", "depends_on": {"macros": ["macro.dbt_snowflake.set_query_tag", "macro.dbt.should_full_refresh", "macro.dbt.load_relation", "macro.dbt_snowflake.dbt_snowflake_get_tmp_relation_type", "macro.dbt.make_temp_relation", "macro.dbt.incremental_validate_on_schema_change", "macro.dbt.run_hooks", "macro.dbt.statement", "macro.dbt.create_table_as", "macro.dbt_snowflake.snowflake__create_view_as_with_temp_flag", "macro.dbt.process_schema_changes", "macro.dbt.drop_relation_if_exists", "macro.dbt.should_revoke", "macro.dbt.apply_grants", "macro.dbt.persist_docs", "macro.dbt_snowflake.unset_query_tag"]}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973266.7989929, "supported_languages": ["sql", "python"]}, 
"macro.dbt_snowflake.snowflake__get_incremental_default_sql": {"name": "snowflake__get_incremental_default_sql", "resource_type": "macro", "package_name": "dbt_snowflake", "path": "macros/materializations/incremental.sql", "original_file_path": "macros/materializations/incremental.sql", "unique_id": "macro.dbt_snowflake.snowflake__get_incremental_default_sql", "macro_sql": "{% macro snowflake__get_incremental_default_sql(arg_dict) %}\n {{ return(get_incremental_merge_sql(arg_dict)) }}\n{% endmacro %}", "depends_on": {"macros": ["macro.dbt.get_incremental_merge_sql"]}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973266.7993402, "supported_languages": null}, "macro.dbt_snowflake.materialization_snapshot_snowflake": {"name": "materialization_snapshot_snowflake", "resource_type": "macro", "package_name": "dbt_snowflake", "path": "macros/materializations/snapshot.sql", "original_file_path": "macros/materializations/snapshot.sql", "unique_id": "macro.dbt_snowflake.materialization_snapshot_snowflake", "macro_sql": "{% materialization snapshot, adapter='snowflake' %}\n {% set original_query_tag = set_query_tag() %}\n {% set relations = materialization_snapshot_default() %}\n\n {% do unset_query_tag(original_query_tag) %}\n\n {{ return(relations) }}\n{% endmaterialization %}", "depends_on": {"macros": ["macro.dbt_snowflake.set_query_tag", "macro.dbt.materialization_snapshot_default", "macro.dbt_snowflake.unset_query_tag"]}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973266.800061, "supported_languages": ["sql"]}, "macro.dbt_snowflake.snowflake__get_merge_sql": {"name": "snowflake__get_merge_sql", "resource_type": "macro", "package_name": "dbt_snowflake", "path": "macros/materializations/merge.sql", "original_file_path": "macros/materializations/merge.sql", "unique_id": "macro.dbt_snowflake.snowflake__get_merge_sql", "macro_sql": "{% macro snowflake__get_merge_sql(target, source_sql, unique_key, dest_columns, incremental_predicates) -%}\n\n {#\n Workaround for Snowflake not being happy with a merge on a constant-false predicate.\n When no unique_key is provided, this macro will do a regular insert. 
If a unique_key\n is provided, then this macro will do a proper merge instead.\n #}\n\n {%- set dest_cols_csv = get_quoted_csv(dest_columns | map(attribute='name')) -%}\n {%- set sql_header = config.get('sql_header', none) -%}\n\n {%- set dml -%}\n {%- if unique_key is none -%}\n\n {{ sql_header if sql_header is not none }}\n\n insert into {{ target }} ({{ dest_cols_csv }})\n (\n select {{ dest_cols_csv }}\n from {{ source_sql }}\n )\n\n {%- else -%}\n\n {{ default__get_merge_sql(target, source_sql, unique_key, dest_columns, incremental_predicates) }}\n\n {%- endif -%}\n {%- endset -%}\n\n {% do return(snowflake_dml_explicit_transaction(dml)) %}\n\n{% endmacro %}", "depends_on": {"macros": ["macro.dbt.get_quoted_csv", "macro.dbt.default__get_merge_sql", "macro.dbt_snowflake.snowflake_dml_explicit_transaction"]}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973266.802301, "supported_languages": null}, "macro.dbt_snowflake.snowflake__get_delete_insert_merge_sql": {"name": "snowflake__get_delete_insert_merge_sql", "resource_type": "macro", "package_name": "dbt_snowflake", "path": "macros/materializations/merge.sql", "original_file_path": "macros/materializations/merge.sql", "unique_id": "macro.dbt_snowflake.snowflake__get_delete_insert_merge_sql", "macro_sql": "{% macro snowflake__get_delete_insert_merge_sql(target, source, unique_key, dest_columns, incremental_predicates) %}\n {% set dml = default__get_delete_insert_merge_sql(target, source, unique_key, dest_columns, incremental_predicates) %}\n {% do return(snowflake_dml_explicit_transaction(dml)) %}\n{% endmacro %}", "depends_on": {"macros": ["macro.dbt.default__get_delete_insert_merge_sql", "macro.dbt_snowflake.snowflake_dml_explicit_transaction"]}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973266.803046, "supported_languages": null}, "macro.dbt_snowflake.snowflake__snapshot_merge_sql": {"name": "snowflake__snapshot_merge_sql", "resource_type": "macro", "package_name": "dbt_snowflake", "path": "macros/materializations/merge.sql", "original_file_path": "macros/materializations/merge.sql", "unique_id": "macro.dbt_snowflake.snowflake__snapshot_merge_sql", "macro_sql": "{% macro snowflake__snapshot_merge_sql(target, source, insert_cols) %}\n {% set dml = default__snapshot_merge_sql(target, source, insert_cols) %}\n {% do return(snowflake_dml_explicit_transaction(dml)) %}\n{% endmacro %}", "depends_on": {"macros": ["macro.dbt.default__snapshot_merge_sql", "macro.dbt_snowflake.snowflake_dml_explicit_transaction"]}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973266.8035896, "supported_languages": null}, "macro.dbt_snowflake.snowflake__can_clone_table": {"name": "snowflake__can_clone_table", "resource_type": "macro", "package_name": "dbt_snowflake", "path": "macros/materializations/clone.sql", "original_file_path": "macros/materializations/clone.sql", "unique_id": "macro.dbt_snowflake.snowflake__can_clone_table", "macro_sql": "{% macro snowflake__can_clone_table() %}\n {{ return(True) }}\n{% endmacro %}", "depends_on": {"macros": []}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973266.8040438, "supported_languages": null}, "macro.dbt_snowflake.snowflake__create_or_replace_clone": {"name": 
"snowflake__create_or_replace_clone", "resource_type": "macro", "package_name": "dbt_snowflake", "path": "macros/materializations/clone.sql", "original_file_path": "macros/materializations/clone.sql", "unique_id": "macro.dbt_snowflake.snowflake__create_or_replace_clone", "macro_sql": "{% macro snowflake__create_or_replace_clone(this_relation, defer_relation) %}\n create or replace\n {{ \"transient\" if config.get(\"transient\", true) }}\n table {{ this_relation }}\n clone {{ defer_relation }}\n {{ \"copy grants\" if config.get(\"copy_grants\", false) }}\n{% endmacro %}", "depends_on": {"macros": []}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973266.80465, "supported_languages": null}, "macro.dbt_snowflake.materialization_table_snowflake": {"name": "materialization_table_snowflake", "resource_type": "macro", "package_name": "dbt_snowflake", "path": "macros/materializations/table.sql", "original_file_path": "macros/materializations/table.sql", "unique_id": "macro.dbt_snowflake.materialization_table_snowflake", "macro_sql": "{% materialization table, adapter='snowflake', supported_languages=['sql', 'python']%}\n\n {% set original_query_tag = set_query_tag() %}\n\n {%- set identifier = model['alias'] -%}\n {%- set language = model['language'] -%}\n\n {% set grant_config = config.get('grants') %}\n\n {%- set old_relation = adapter.get_relation(database=database, schema=schema, identifier=identifier) -%}\n {%- set target_relation = api.Relation.create(identifier=identifier,\n schema=schema,\n database=database, type='table') -%}\n\n {{ run_hooks(pre_hooks) }}\n\n {#-- Drop the relation if it was a view to \"convert\" it in a table. This may lead to\n -- downtime, but it should be a relatively infrequent occurrence #}\n {% if old_relation is not none and not old_relation.is_table %}\n {{ log(\"Dropping relation \" ~ old_relation ~ \" because it is of type \" ~ old_relation.type) }}\n {{ drop_relation_if_exists(old_relation) }}\n {% endif %}\n\n {% call statement('main', language=language) -%}\n {{ create_table_as(False, target_relation, compiled_code, language) }}\n {%- endcall %}\n\n {{ run_hooks(post_hooks) }}\n\n {% set should_revoke = should_revoke(old_relation, full_refresh_mode=True) %}\n {% do apply_grants(target_relation, grant_config, should_revoke=should_revoke) %}\n\n {% do persist_docs(target_relation, model) %}\n\n {% do unset_query_tag(original_query_tag) %}\n\n {{ return({'relations': [target_relation]}) }}\n\n{% endmaterialization %}", "depends_on": {"macros": ["macro.dbt_snowflake.set_query_tag", "macro.dbt.run_hooks", "macro.dbt.drop_relation_if_exists", "macro.dbt.statement", "macro.dbt.create_table_as", "macro.dbt.should_revoke", "macro.dbt.apply_grants", "macro.dbt.persist_docs", "macro.dbt_snowflake.unset_query_tag"]}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973266.8091693, "supported_languages": ["sql", "python"]}, "macro.dbt_snowflake.py_write_table": {"name": "py_write_table", "resource_type": "macro", "package_name": "dbt_snowflake", "path": "macros/materializations/table.sql", "original_file_path": "macros/materializations/table.sql", "unique_id": "macro.dbt_snowflake.py_write_table", "macro_sql": "{% macro py_write_table(compiled_code, target_relation, temporary=False) %}\n{{ compiled_code }}\ndef materialize(session, df, target_relation):\n # make sure pandas exists\n import importlib.util\n 
package_name = 'pandas'\n if importlib.util.find_spec(package_name):\n import pandas\n if isinstance(df, pandas.core.frame.DataFrame):\n session.use_database(target_relation.database)\n session.use_schema(target_relation.schema)\n # session.write_pandas does not have overwrite function\n df = session.createDataFrame(df)\n {% set target_relation_name = resolve_model_name(target_relation) %}\n df.write.mode(\"overwrite\").save_as_table('{{ target_relation_name }}', create_temp_table={{temporary}})\n\ndef main(session):\n dbt = dbtObj(session.table)\n df = model(dbt, session)\n materialize(session, df, dbt.this)\n return \"OK\"\n{% endmacro %}", "depends_on": {"macros": ["macro.dbt.resolve_model_name"]}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973266.8097699, "supported_languages": null}, "macro.dbt_snowflake.py_script_comment": {"name": "py_script_comment", "resource_type": "macro", "package_name": "dbt_snowflake", "path": "macros/materializations/table.sql", "original_file_path": "macros/materializations/table.sql", "unique_id": "macro.dbt_snowflake.py_script_comment", "macro_sql": "{% macro py_script_comment()%}\n# To run this in snowsight, you need to select entry point to be main\n# And you may have to modify the return type to text to get the result back\n# def main(session):\n# dbt = dbtObj(session.table)\n# df = model(dbt, session)\n# return df.collect()\n\n# to run this in local notebook, you need to create a session following examples https://github.com/Snowflake-Labs/sfguide-getting-started-snowpark-python\n# then you can do the following to run model\n# dbt = dbtObj(session.table)\n# df = model(dbt, session)\n{%endmacro%}", "depends_on": {"macros": []}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973266.8099756, "supported_languages": null}, "macro.dbt_snowflake.snowflake__get_alter_dynamic_table_as_sql": {"name": "snowflake__get_alter_dynamic_table_as_sql", "resource_type": "macro", "package_name": "dbt_snowflake", "path": "macros/materializations/dynamic_table/ddl.sql", "original_file_path": "macros/materializations/dynamic_table/ddl.sql", "unique_id": "macro.dbt_snowflake.snowflake__get_alter_dynamic_table_as_sql", "macro_sql": "{% macro snowflake__get_alter_dynamic_table_as_sql(\n target_relation,\n configuration_changes,\n sql,\n existing_relation,\n backup_relation,\n intermediate_relation\n) -%}\n {{- log('Applying ALTER to: ' ~ target_relation) -}}\n\n {% if configuration_changes.requires_full_refresh %}\n {{- snowflake__get_replace_dynamic_table_as_sql(target_relation, sql, existing_relation, backup_relation, intermediate_relation) -}}\n\n {% else %}\n\n {%- set target_lag = configuration_changes.target_lag -%}\n {%- if target_lag -%}{{- log('Applying UPDATE TARGET_LAG to: ' ~ existing_relation) -}}{%- endif -%}\n {%- set warehouse = configuration_changes.warehouse -%}\n {%- if warehouse -%}{{- log('Applying UPDATE WAREHOUSE to: ' ~ existing_relation) -}}{%- endif -%}\n\n alter dynamic table {{ existing_relation }} set\n {% if target_lag %}target_lag = '{{ target_lag.context }}'{% endif %}\n {% if warehouse %}warehouse = {{ warehouse.context }}{% endif %}\n\n {%- endif -%}\n\n{%- endmacro %}", "depends_on": {"macros": ["macro.dbt_snowflake.snowflake__get_replace_dynamic_table_as_sql"]}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], 
"created_at": 1701973266.8147752, "supported_languages": null}, "macro.dbt_snowflake.snowflake__get_create_dynamic_table_as_sql": {"name": "snowflake__get_create_dynamic_table_as_sql", "resource_type": "macro", "package_name": "dbt_snowflake", "path": "macros/materializations/dynamic_table/ddl.sql", "original_file_path": "macros/materializations/dynamic_table/ddl.sql", "unique_id": "macro.dbt_snowflake.snowflake__get_create_dynamic_table_as_sql", "macro_sql": "{% macro snowflake__get_create_dynamic_table_as_sql(relation, sql) -%}\n {{- log('Applying CREATE to: ' ~ relation) -}}\n\n create or replace dynamic table {{ relation }}\n target_lag = '{{ config.get(\"target_lag\") }}'\n warehouse = {{ config.get(\"snowflake_warehouse\") }}\n as (\n {{ sql }}\n )\n ;\n {{ snowflake__refresh_dynamic_table(relation) }}\n\n{%- endmacro %}", "depends_on": {"macros": ["macro.dbt_snowflake.snowflake__refresh_dynamic_table"]}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973266.815444, "supported_languages": null}, "macro.dbt_snowflake.snowflake__describe_dynamic_table": {"name": "snowflake__describe_dynamic_table", "resource_type": "macro", "package_name": "dbt_snowflake", "path": "macros/materializations/dynamic_table/ddl.sql", "original_file_path": "macros/materializations/dynamic_table/ddl.sql", "unique_id": "macro.dbt_snowflake.snowflake__describe_dynamic_table", "macro_sql": "{% macro snowflake__describe_dynamic_table(relation) %}\n {%- set _dynamic_table_sql -%}\n show dynamic tables\n like '{{ relation.identifier }}'\n in schema {{ relation.database }}.{{ relation.schema }}\n ;\n select\n \"name\",\n \"schema_name\",\n \"database_name\",\n \"text\",\n \"target_lag\",\n \"warehouse\"\n from table(result_scan(last_query_id()))\n {%- endset %}\n {% set _dynamic_table = run_query(_dynamic_table_sql) %}\n\n {% do return({'dynamic_table': _dynamic_table}) %}\n{% endmacro %}", "depends_on": {"macros": ["macro.dbt.run_query"]}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973266.8161726, "supported_languages": null}, "macro.dbt_snowflake.snowflake__get_replace_dynamic_table_as_sql": {"name": "snowflake__get_replace_dynamic_table_as_sql", "resource_type": "macro", "package_name": "dbt_snowflake", "path": "macros/materializations/dynamic_table/ddl.sql", "original_file_path": "macros/materializations/dynamic_table/ddl.sql", "unique_id": "macro.dbt_snowflake.snowflake__get_replace_dynamic_table_as_sql", "macro_sql": "{% macro snowflake__get_replace_dynamic_table_as_sql(target_relation, sql, existing_relation, backup_relation, intermediate_relation) -%}\n {{- log('Applying REPLACE to: ' ~ target_relation) -}}\n {{ snowflake__get_drop_dynamic_table_sql(existing_relation) }};\n {{ snowflake__get_create_dynamic_table_as_sql(target_relation, sql) }}\n{%- endmacro %}", "depends_on": {"macros": ["macro.dbt_snowflake.snowflake__get_drop_dynamic_table_sql", "macro.dbt_snowflake.snowflake__get_create_dynamic_table_as_sql"]}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973266.8167129, "supported_languages": null}, "macro.dbt_snowflake.snowflake__refresh_dynamic_table": {"name": "snowflake__refresh_dynamic_table", "resource_type": "macro", "package_name": "dbt_snowflake", "path": "macros/materializations/dynamic_table/ddl.sql", "original_file_path": 
"macros/materializations/dynamic_table/ddl.sql", "unique_id": "macro.dbt_snowflake.snowflake__refresh_dynamic_table", "macro_sql": "{% macro snowflake__refresh_dynamic_table(relation) -%}\n {{- log('Applying REFRESH to: ' ~ relation) -}}\n\n alter dynamic table {{ relation }} refresh\n{%- endmacro %}", "depends_on": {"macros": []}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973266.8170342, "supported_languages": null}, "macro.dbt_snowflake.snowflake__get_dynamic_table_configuration_changes": {"name": "snowflake__get_dynamic_table_configuration_changes", "resource_type": "macro", "package_name": "dbt_snowflake", "path": "macros/materializations/dynamic_table/ddl.sql", "original_file_path": "macros/materializations/dynamic_table/ddl.sql", "unique_id": "macro.dbt_snowflake.snowflake__get_dynamic_table_configuration_changes", "macro_sql": "{% macro snowflake__get_dynamic_table_configuration_changes(existing_relation, new_config) -%}\n {% set _existing_dynamic_table = snowflake__describe_dynamic_table(existing_relation) %}\n {% set _configuration_changes = existing_relation.dynamic_table_config_changeset(_existing_dynamic_table, new_config) %}\n {% do return(_configuration_changes) %}\n{%- endmacro %}", "depends_on": {"macros": ["macro.dbt_snowflake.snowflake__describe_dynamic_table"]}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973266.8175895, "supported_languages": null}, "macro.dbt_snowflake.snowflake__get_drop_dynamic_table_sql": {"name": "snowflake__get_drop_dynamic_table_sql", "resource_type": "macro", "package_name": "dbt_snowflake", "path": "macros/materializations/dynamic_table/ddl.sql", "original_file_path": "macros/materializations/dynamic_table/ddl.sql", "unique_id": "macro.dbt_snowflake.snowflake__get_drop_dynamic_table_sql", "macro_sql": "{% macro snowflake__get_drop_dynamic_table_sql(relation) %}\n drop dynamic table if exists {{ relation }}\n{% endmacro %}", "depends_on": {"macros": []}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973266.8178144, "supported_languages": null}, "macro.dbt_snowflake.materialization_dynamic_table_snowflake": {"name": "materialization_dynamic_table_snowflake", "resource_type": "macro", "package_name": "dbt_snowflake", "path": "macros/materializations/dynamic_table/materialization.sql", "original_file_path": "macros/materializations/dynamic_table/materialization.sql", "unique_id": "macro.dbt_snowflake.materialization_dynamic_table_snowflake", "macro_sql": "{% materialization dynamic_table, adapter='snowflake' %}\n\n {% set original_query_tag = set_query_tag() %}\n\n {% set existing_relation = load_cached_relation(this) %}\n {% set target_relation = this.incorporate(type=this.DynamicTable) %}\n {% set intermediate_relation = make_intermediate_relation(target_relation) %}\n {% set backup_relation_type = target_relation.DynamicTable if existing_relation is none else existing_relation.type %}\n {% set backup_relation = make_backup_relation(target_relation, backup_relation_type) %}\n\n {{ dynamic_table_setup(backup_relation, intermediate_relation, pre_hooks) }}\n\n {% set build_sql = dynamic_table_get_build_sql(existing_relation, target_relation, backup_relation, intermediate_relation) %}\n\n {% if build_sql == '' %}\n {{ dynamic_table_execute_no_op(target_relation) }}\n {% else %}\n {{ 
dynamic_table_execute_build_sql(build_sql, existing_relation, target_relation, post_hooks) }}\n {% endif %}\n\n {{ dynamic_table_teardown(backup_relation, intermediate_relation, post_hooks) }}\n\n {% do unset_query_tag(original_query_tag) %}\n\n {{ return({'relations': [target_relation]}) }}\n\n{% endmaterialization %}", "depends_on": {"macros": ["macro.dbt_snowflake.set_query_tag", "macro.dbt.load_cached_relation", "macro.dbt.make_intermediate_relation", "macro.dbt.make_backup_relation", "macro.dbt_snowflake.dynamic_table_setup", "macro.dbt_snowflake.dynamic_table_get_build_sql", "macro.dbt_snowflake.dynamic_table_execute_no_op", "macro.dbt_snowflake.dynamic_table_execute_build_sql", "macro.dbt_snowflake.dynamic_table_teardown", "macro.dbt_snowflake.unset_query_tag"]}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973266.8258152, "supported_languages": ["sql"]}, "macro.dbt_snowflake.dynamic_table_setup": {"name": "dynamic_table_setup", "resource_type": "macro", "package_name": "dbt_snowflake", "path": "macros/materializations/dynamic_table/materialization.sql", "original_file_path": "macros/materializations/dynamic_table/materialization.sql", "unique_id": "macro.dbt_snowflake.dynamic_table_setup", "macro_sql": "{% macro dynamic_table_setup(backup_relation, intermediate_relation, pre_hooks) %}\n\n -- backup_relation and intermediate_relation should not already exist in the database\n -- it's possible these exist because of a previous run that exited unexpectedly\n {% set preexisting_backup_relation = load_cached_relation(backup_relation) %}\n {% set preexisting_intermediate_relation = load_cached_relation(intermediate_relation) %}\n\n -- drop the temp relations if they exist already in the database\n {{ snowflake__get_drop_dynamic_table_sql(preexisting_backup_relation) }}\n {{ snowflake__get_drop_dynamic_table_sql(preexisting_intermediate_relation) }}\n\n {{ run_hooks(pre_hooks) }}\n\n{% endmacro %}", "depends_on": {"macros": ["macro.dbt.load_cached_relation", "macro.dbt_snowflake.snowflake__get_drop_dynamic_table_sql", "macro.dbt.run_hooks"]}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973266.826597, "supported_languages": null}, "macro.dbt_snowflake.dynamic_table_teardown": {"name": "dynamic_table_teardown", "resource_type": "macro", "package_name": "dbt_snowflake", "path": "macros/materializations/dynamic_table/materialization.sql", "original_file_path": "macros/materializations/dynamic_table/materialization.sql", "unique_id": "macro.dbt_snowflake.dynamic_table_teardown", "macro_sql": "{% macro dynamic_table_teardown(backup_relation, intermediate_relation, post_hooks) %}\n\n -- drop the temp relations if they exist to leave the database clean for the next run\n {{ snowflake__get_drop_dynamic_table_sql(backup_relation) }}\n {{ snowflake__get_drop_dynamic_table_sql(intermediate_relation) }}\n\n {{ run_hooks(post_hooks) }}\n\n{% endmacro %}", "depends_on": {"macros": ["macro.dbt_snowflake.snowflake__get_drop_dynamic_table_sql", "macro.dbt.run_hooks"]}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973266.827054, "supported_languages": null}, "macro.dbt_snowflake.dynamic_table_get_build_sql": {"name": "dynamic_table_get_build_sql", "resource_type": "macro", "package_name": "dbt_snowflake", "path": 
"macros/materializations/dynamic_table/materialization.sql", "original_file_path": "macros/materializations/dynamic_table/materialization.sql", "unique_id": "macro.dbt_snowflake.dynamic_table_get_build_sql", "macro_sql": "{% macro dynamic_table_get_build_sql(existing_relation, target_relation, backup_relation, intermediate_relation) %}\n\n {% set full_refresh_mode = should_full_refresh() %}\n\n -- determine the scenario we're in: create, full_refresh, alter, refresh data\n {% if existing_relation is none %}\n {% set build_sql = snowflake__get_create_dynamic_table_as_sql(target_relation, sql) %}\n {% elif full_refresh_mode or not existing_relation.is_dynamic_table %}\n {% set build_sql = snowflake__get_replace_dynamic_table_as_sql(target_relation, sql, existing_relation, backup_relation, intermediate_relation) %}\n {% else %}\n\n -- get config options\n {% set on_configuration_change = config.get('on_configuration_change') %}\n {% set configuration_changes = snowflake__get_dynamic_table_configuration_changes(existing_relation, config) %}\n\n {% if configuration_changes is none %}\n {% set build_sql = '' %}\n {{ exceptions.warn(\"No configuration changes were identified on: `\" ~ target_relation ~ \"`. Continuing.\") }}\n\n {% elif on_configuration_change == 'apply' %}\n {% set build_sql = snowflake__get_alter_dynamic_table_as_sql(target_relation, configuration_changes, sql, existing_relation, backup_relation, intermediate_relation) %}\n {% elif on_configuration_change == 'continue' %}\n {% set build_sql = '' %}\n {{ exceptions.warn(\"Configuration changes were identified and `on_configuration_change` was set to `continue` for `\" ~ target_relation ~ \"`\") }}\n {% elif on_configuration_change == 'fail' %}\n {{ exceptions.raise_fail_fast_error(\"Configuration changes were identified and `on_configuration_change` was set to `fail` for `\" ~ target_relation ~ \"`\") }}\n\n {% else %}\n -- this only happens if the user provides a value other than `apply`, 'continue', 'fail'\n {{ exceptions.raise_compiler_error(\"Unexpected configuration scenario: `\" ~ on_configuration_change ~ \"`\") }}\n\n {% endif %}\n\n {% endif %}\n\n {% do return(build_sql) %}\n\n{% endmacro %}", "depends_on": {"macros": ["macro.dbt.should_full_refresh", "macro.dbt_snowflake.snowflake__get_create_dynamic_table_as_sql", "macro.dbt_snowflake.snowflake__get_replace_dynamic_table_as_sql", "macro.dbt_snowflake.snowflake__get_dynamic_table_configuration_changes", "macro.dbt_snowflake.snowflake__get_alter_dynamic_table_as_sql"]}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973266.8298852, "supported_languages": null}, "macro.dbt_snowflake.dynamic_table_execute_no_op": {"name": "dynamic_table_execute_no_op", "resource_type": "macro", "package_name": "dbt_snowflake", "path": "macros/materializations/dynamic_table/materialization.sql", "original_file_path": "macros/materializations/dynamic_table/materialization.sql", "unique_id": "macro.dbt_snowflake.dynamic_table_execute_no_op", "macro_sql": "{% macro dynamic_table_execute_no_op(target_relation) %}\n {% do store_raw_result(\n name=\"main\",\n message=\"skip \" ~ target_relation,\n code=\"skip\",\n rows_affected=\"-1\"\n ) %}\n{% endmacro %}", "depends_on": {"macros": []}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973266.8303523, "supported_languages": null}, 
"macro.dbt_snowflake.dynamic_table_execute_build_sql": {"name": "dynamic_table_execute_build_sql", "resource_type": "macro", "package_name": "dbt_snowflake", "path": "macros/materializations/dynamic_table/materialization.sql", "original_file_path": "macros/materializations/dynamic_table/materialization.sql", "unique_id": "macro.dbt_snowflake.dynamic_table_execute_build_sql", "macro_sql": "{% macro dynamic_table_execute_build_sql(build_sql, existing_relation, target_relation, post_hooks) %}\n\n {% set grant_config = config.get('grants') %}\n\n {% call statement(name=\"main\") %}\n {{ build_sql }}\n {% endcall %}\n\n {% set should_revoke = should_revoke(existing_relation, full_refresh_mode=True) %}\n {% do apply_grants(target_relation, grant_config, should_revoke=should_revoke) %}\n\n {% do persist_docs(target_relation, model) %}\n\n{% endmacro %}", "depends_on": {"macros": ["macro.dbt.statement", "macro.dbt.should_revoke", "macro.dbt.apply_grants", "macro.dbt.persist_docs"]}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973266.831357, "supported_languages": null}, "macro.dbt_snowflake.snowflake__current_timestamp": {"name": "snowflake__current_timestamp", "resource_type": "macro", "package_name": "dbt_snowflake", "path": "macros/utils/timestamps.sql", "original_file_path": "macros/utils/timestamps.sql", "unique_id": "macro.dbt_snowflake.snowflake__current_timestamp", "macro_sql": "{% macro snowflake__current_timestamp() -%}\n convert_timezone('UTC', current_timestamp())\n{%- endmacro %}", "depends_on": {"macros": []}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973266.8318796, "supported_languages": null}, "macro.dbt_snowflake.snowflake__snapshot_string_as_time": {"name": "snowflake__snapshot_string_as_time", "resource_type": "macro", "package_name": "dbt_snowflake", "path": "macros/utils/timestamps.sql", "original_file_path": "macros/utils/timestamps.sql", "unique_id": "macro.dbt_snowflake.snowflake__snapshot_string_as_time", "macro_sql": "{% macro snowflake__snapshot_string_as_time(timestamp) -%}\n {%- set result = \"to_timestamp_ntz('\" ~ timestamp ~ \"')\" -%}\n {{ return(result) }}\n{%- endmacro %}", "depends_on": {"macros": []}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973266.8322427, "supported_languages": null}, "macro.dbt_snowflake.snowflake__snapshot_get_time": {"name": "snowflake__snapshot_get_time", "resource_type": "macro", "package_name": "dbt_snowflake", "path": "macros/utils/timestamps.sql", "original_file_path": "macros/utils/timestamps.sql", "unique_id": "macro.dbt_snowflake.snowflake__snapshot_get_time", "macro_sql": "{% macro snowflake__snapshot_get_time() -%}\n to_timestamp_ntz({{ current_timestamp() }})\n{%- endmacro %}", "depends_on": {"macros": ["macro.dbt.current_timestamp"]}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973266.8324609, "supported_languages": null}, "macro.dbt_snowflake.snowflake__current_timestamp_backcompat": {"name": "snowflake__current_timestamp_backcompat", "resource_type": "macro", "package_name": "dbt_snowflake", "path": "macros/utils/timestamps.sql", "original_file_path": "macros/utils/timestamps.sql", "unique_id": "macro.dbt_snowflake.snowflake__current_timestamp_backcompat", "macro_sql": "{% macro 
snowflake__current_timestamp_backcompat() %}\n current_timestamp::{{ type_timestamp() }}\n{% endmacro %}", "depends_on": {"macros": ["macro.dbt.type_timestamp"]}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973266.8327022, "supported_languages": null}, "macro.dbt_snowflake.snowflake__current_timestamp_in_utc_backcompat": {"name": "snowflake__current_timestamp_in_utc_backcompat", "resource_type": "macro", "package_name": "dbt_snowflake", "path": "macros/utils/timestamps.sql", "original_file_path": "macros/utils/timestamps.sql", "unique_id": "macro.dbt_snowflake.snowflake__current_timestamp_in_utc_backcompat", "macro_sql": "{% macro snowflake__current_timestamp_in_utc_backcompat() %}\n convert_timezone('UTC', {{ snowflake__current_timestamp_backcompat() }})::{{ type_timestamp() }}\n{% endmacro %}", "depends_on": {"macros": ["macro.dbt_snowflake.snowflake__current_timestamp_backcompat", "macro.dbt.type_timestamp"]}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973266.832989, "supported_languages": null}, "macro.dbt_snowflake.snowflake__escape_single_quotes": {"name": "snowflake__escape_single_quotes", "resource_type": "macro", "package_name": "dbt_snowflake", "path": "macros/utils/escape_single_quotes.sql", "original_file_path": "macros/utils/escape_single_quotes.sql", "unique_id": "macro.dbt_snowflake.snowflake__escape_single_quotes", "macro_sql": "{% macro snowflake__escape_single_quotes(expression) -%}\n{{ expression | replace(\"'\", \"\\\\'\") }}\n{%- endmacro %}", "depends_on": {"macros": []}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973266.833364, "supported_languages": null}, "macro.dbt_snowflake.snowflake__array_construct": {"name": "snowflake__array_construct", "resource_type": "macro", "package_name": "dbt_snowflake", "path": "macros/utils/array_construct.sql", "original_file_path": "macros/utils/array_construct.sql", "unique_id": "macro.dbt_snowflake.snowflake__array_construct", "macro_sql": "{% macro snowflake__array_construct(inputs, data_type) -%}\n array_construct( {{ inputs|join(' , ') }} )\n{%- endmacro %}", "depends_on": {"macros": []}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973266.8337266, "supported_languages": null}, "macro.dbt_snowflake.snowflake__bool_or": {"name": "snowflake__bool_or", "resource_type": "macro", "package_name": "dbt_snowflake", "path": "macros/utils/bool_or.sql", "original_file_path": "macros/utils/bool_or.sql", "unique_id": "macro.dbt_snowflake.snowflake__bool_or", "macro_sql": "{% macro snowflake__bool_or(expression) -%}\n\n boolor_agg({{ expression }})\n\n{%- endmacro %}", "depends_on": {"macros": []}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973266.8339996, "supported_languages": null}, "macro.dbt_snowflake.snowflake__right": {"name": "snowflake__right", "resource_type": "macro", "package_name": "dbt_snowflake", "path": "macros/utils/right.sql", "original_file_path": "macros/utils/right.sql", "unique_id": "macro.dbt_snowflake.snowflake__right", "macro_sql": "{% macro snowflake__right(string_text, length_expression) %}\n\n case when {{ length_expression }} = 0\n then ''\n else\n right(\n {{ string_text }},\n {{ 
length_expression }}\n )\n end\n\n{%- endmacro -%}", "depends_on": {"macros": []}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973266.834435, "supported_languages": null}, "macro.dbt_snowflake.snowflake__safe_cast": {"name": "snowflake__safe_cast", "resource_type": "macro", "package_name": "dbt_snowflake", "path": "macros/utils/safe_cast.sql", "original_file_path": "macros/utils/safe_cast.sql", "unique_id": "macro.dbt_snowflake.snowflake__safe_cast", "macro_sql": "{% macro snowflake__safe_cast(field, type) %}\n try_cast({{field}} as {{type}})\n{% endmacro %}", "depends_on": {"macros": []}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973266.834767, "supported_languages": null}, "macro.dbt.resolve_model_name": {"name": "resolve_model_name", "resource_type": "macro", "package_name": "dbt", "path": "macros/python_model/python.sql", "original_file_path": "macros/python_model/python.sql", "unique_id": "macro.dbt.resolve_model_name", "macro_sql": "{% macro resolve_model_name(input_model_name) %}\n {{ return(adapter.dispatch('resolve_model_name', 'dbt')(input_model_name)) }}\n{% endmacro %}", "depends_on": {"macros": ["macro.dbt.default__resolve_model_name"]}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973266.8372433, "supported_languages": null}, "macro.dbt.default__resolve_model_name": {"name": "default__resolve_model_name", "resource_type": "macro", "package_name": "dbt", "path": "macros/python_model/python.sql", "original_file_path": "macros/python_model/python.sql", "unique_id": "macro.dbt.default__resolve_model_name", "macro_sql": "\n\n{%- macro default__resolve_model_name(input_model_name) -%}\n {{ input_model_name | string | replace('\"', '\\\"') }}\n{%- endmacro -%}\n\n", "depends_on": {"macros": []}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973266.8375568, "supported_languages": null}, "macro.dbt.build_ref_function": {"name": "build_ref_function", "resource_type": "macro", "package_name": "dbt", "path": "macros/python_model/python.sql", "original_file_path": "macros/python_model/python.sql", "unique_id": "macro.dbt.build_ref_function", "macro_sql": "{% macro build_ref_function(model) %}\n\n {%- set ref_dict = {} -%}\n {%- for _ref in model.refs -%}\n {% set _ref_args = [_ref.get('package'), _ref['name']] if _ref.get('package') else [_ref['name'],] %}\n {%- set resolved = ref(*_ref_args, v=_ref.get('version')) -%}\n {%- if _ref.get('version') -%}\n {% do _ref_args.extend([\"v\" ~ _ref['version']]) %}\n {%- endif -%}\n {%- do ref_dict.update({_ref_args | join('.'): resolve_model_name(resolved)}) -%}\n {%- endfor -%}\n\ndef ref(*args, **kwargs):\n refs = {{ ref_dict | tojson }}\n key = '.'.join(args)\n version = kwargs.get(\"v\") or kwargs.get(\"version\")\n if version:\n key += f\".v{version}\"\n dbt_load_df_function = kwargs.get(\"dbt_load_df_function\")\n return dbt_load_df_function(refs[key])\n\n{% endmacro %}", "depends_on": {"macros": ["macro.dbt.resolve_model_name"]}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973266.839299, "supported_languages": null}, "macro.dbt.build_source_function": {"name": "build_source_function", "resource_type": "macro", 
"package_name": "dbt", "path": "macros/python_model/python.sql", "original_file_path": "macros/python_model/python.sql", "unique_id": "macro.dbt.build_source_function", "macro_sql": "{% macro build_source_function(model) %}\n\n {%- set source_dict = {} -%}\n {%- for _source in model.sources -%}\n {%- set resolved = source(*_source) -%}\n {%- do source_dict.update({_source | join('.'): resolve_model_name(resolved)}) -%}\n {%- endfor -%}\n\ndef source(*args, dbt_load_df_function):\n sources = {{ source_dict | tojson }}\n key = '.'.join(args)\n return dbt_load_df_function(sources[key])\n\n{% endmacro %}", "depends_on": {"macros": ["macro.dbt.resolve_model_name"]}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973266.8401208, "supported_languages": null}, "macro.dbt.build_config_dict": {"name": "build_config_dict", "resource_type": "macro", "package_name": "dbt", "path": "macros/python_model/python.sql", "original_file_path": "macros/python_model/python.sql", "unique_id": "macro.dbt.build_config_dict", "macro_sql": "{% macro build_config_dict(model) %}\n {%- set config_dict = {} -%}\n {% set config_dbt_used = zip(model.config.config_keys_used, model.config.config_keys_defaults) | list %}\n {%- for key, default in config_dbt_used -%}\n {# weird type testing with enum, would be much easier to write this logic in Python! #}\n {%- if key == \"language\" -%}\n {%- set value = \"python\" -%}\n {%- endif -%}\n {%- set value = model.config.get(key, default) -%}\n {%- do config_dict.update({key: value}) -%}\n {%- endfor -%}\nconfig_dict = {{ config_dict }}\n{% endmacro %}", "depends_on": {"macros": []}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973266.84131, "supported_languages": null}, "macro.dbt.py_script_postfix": {"name": "py_script_postfix", "resource_type": "macro", "package_name": "dbt", "path": "macros/python_model/python.sql", "original_file_path": "macros/python_model/python.sql", "unique_id": "macro.dbt.py_script_postfix", "macro_sql": "{% macro py_script_postfix(model) %}\n# This part is user provided model code\n# you will need to copy the next section to run the code\n# COMMAND ----------\n# this part is dbt logic for get ref work, do not modify\n\n{{ build_ref_function(model ) }}\n{{ build_source_function(model ) }}\n{{ build_config_dict(model) }}\n\nclass config:\n def __init__(self, *args, **kwargs):\n pass\n\n @staticmethod\n def get(key, default=None):\n return config_dict.get(key, default)\n\nclass this:\n \"\"\"dbt.this() or dbt.this.identifier\"\"\"\n database = \"{{ this.database }}\"\n schema = \"{{ this.schema }}\"\n identifier = \"{{ this.identifier }}\"\n {% set this_relation_name = resolve_model_name(this) %}\n def __repr__(self):\n return '{{ this_relation_name }}'\n\n\nclass dbtObj:\n def __init__(self, load_df_function) -> None:\n self.source = lambda *args: source(*args, dbt_load_df_function=load_df_function)\n self.ref = lambda *args, **kwargs: ref(*args, **kwargs, dbt_load_df_function=load_df_function)\n self.config = config\n self.this = this()\n self.is_incremental = {{ is_incremental() }}\n\n# COMMAND ----------\n{{py_script_comment()}}\n{% endmacro %}", "depends_on": {"macros": ["macro.dbt.build_ref_function", "macro.dbt.build_source_function", "macro.dbt.build_config_dict", "macro.dbt.resolve_model_name", "macro.dbt.is_incremental", "macro.dbt.py_script_comment"]}, "description": "", "meta": 
{}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973266.8422546, "supported_languages": null}, "macro.dbt.py_script_comment": {"name": "py_script_comment", "resource_type": "macro", "package_name": "dbt", "path": "macros/python_model/python.sql", "original_file_path": "macros/python_model/python.sql", "unique_id": "macro.dbt.py_script_comment", "macro_sql": "{%macro py_script_comment()%}\n{%endmacro%}", "depends_on": {"macros": []}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973266.8424113, "supported_languages": null}, "macro.dbt.statement": {"name": "statement", "resource_type": "macro", "package_name": "dbt", "path": "macros/etc/statement.sql", "original_file_path": "macros/etc/statement.sql", "unique_id": "macro.dbt.statement", "macro_sql": "\n{%- macro statement(name=None, fetch_result=False, auto_begin=True, language='sql') -%}\n {%- if execute: -%}\n {%- set compiled_code = caller() -%}\n\n {%- if name == 'main' -%}\n {{ log('Writing runtime {} for node \"{}\"'.format(language, model['unique_id'])) }}\n {{ write(compiled_code) }}\n {%- endif -%}\n {%- if language == 'sql'-%}\n {%- set res, table = adapter.execute(compiled_code, auto_begin=auto_begin, fetch=fetch_result) -%}\n {%- elif language == 'python' -%}\n {%- set res = submit_python_job(model, compiled_code) -%}\n {#-- TODO: What should table be for python models? --#}\n {%- set table = None -%}\n {%- else -%}\n {% do exceptions.raise_compiler_error(\"statement macro didn't get supported language\") %}\n {%- endif -%}\n\n {%- if name is not none -%}\n {{ store_result(name, response=res, agate_table=table) }}\n {%- endif -%}\n\n {%- endif -%}\n{%- endmacro %}", "depends_on": {"macros": []}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973266.8451707, "supported_languages": null}, "macro.dbt.noop_statement": {"name": "noop_statement", "resource_type": "macro", "package_name": "dbt", "path": "macros/etc/statement.sql", "original_file_path": "macros/etc/statement.sql", "unique_id": "macro.dbt.noop_statement", "macro_sql": "{% macro noop_statement(name=None, message=None, code=None, rows_affected=None, res=None) -%}\n {%- set sql = caller() -%}\n\n {%- if name == 'main' -%}\n {{ log('Writing runtime SQL for node \"{}\"'.format(model['unique_id'])) }}\n {{ write(sql) }}\n {%- endif -%}\n\n {%- if name is not none -%}\n {{ store_raw_result(name, message=message, code=code, rows_affected=rows_affected, agate_table=res) }}\n {%- endif -%}\n\n{%- endmacro %}", "depends_on": {"macros": []}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973266.8463259, "supported_languages": null}, "macro.dbt.run_query": {"name": "run_query", "resource_type": "macro", "package_name": "dbt", "path": "macros/etc/statement.sql", "original_file_path": "macros/etc/statement.sql", "unique_id": "macro.dbt.run_query", "macro_sql": "{% macro run_query(sql) %}\n {% call statement(\"run_query_statement\", fetch_result=true, auto_begin=false) %}\n {{ sql }}\n {% endcall %}\n\n {% do return(load_result(\"run_query_statement\").table) %}\n{% endmacro %}", "depends_on": {"macros": ["macro.dbt.statement"]}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973266.84692, 
"supported_languages": null}, "macro.dbt.convert_datetime": {"name": "convert_datetime", "resource_type": "macro", "package_name": "dbt", "path": "macros/etc/datetime.sql", "original_file_path": "macros/etc/datetime.sql", "unique_id": "macro.dbt.convert_datetime", "macro_sql": "{% macro convert_datetime(date_str, date_fmt) %}\n\n {% set error_msg -%}\n The provided partition date '{{ date_str }}' does not match the expected format '{{ date_fmt }}'\n {%- endset %}\n\n {% set res = try_or_compiler_error(error_msg, modules.datetime.datetime.strptime, date_str.strip(), date_fmt) %}\n {{ return(res) }}\n\n{% endmacro %}", "depends_on": {"macros": []}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973266.849794, "supported_languages": null}, "macro.dbt.dates_in_range": {"name": "dates_in_range", "resource_type": "macro", "package_name": "dbt", "path": "macros/etc/datetime.sql", "original_file_path": "macros/etc/datetime.sql", "unique_id": "macro.dbt.dates_in_range", "macro_sql": "{% macro dates_in_range(start_date_str, end_date_str=none, in_fmt=\"%Y%m%d\", out_fmt=\"%Y%m%d\") %}\n {% set end_date_str = start_date_str if end_date_str is none else end_date_str %}\n\n {% set start_date = convert_datetime(start_date_str, in_fmt) %}\n {% set end_date = convert_datetime(end_date_str, in_fmt) %}\n\n {% set day_count = (end_date - start_date).days %}\n {% if day_count < 0 %}\n {% set msg -%}\n Partiton start date is after the end date ({{ start_date }}, {{ end_date }})\n {%- endset %}\n\n {{ exceptions.raise_compiler_error(msg, model) }}\n {% endif %}\n\n {% set date_list = [] %}\n {% for i in range(0, day_count + 1) %}\n {% set the_date = (modules.datetime.timedelta(days=i) + start_date) %}\n {% if not out_fmt %}\n {% set _ = date_list.append(the_date) %}\n {% else %}\n {% set _ = date_list.append(the_date.strftime(out_fmt)) %}\n {% endif %}\n {% endfor %}\n\n {{ return(date_list) }}\n{% endmacro %}", "depends_on": {"macros": ["macro.dbt.convert_datetime"]}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973266.8521795, "supported_languages": null}, "macro.dbt.partition_range": {"name": "partition_range", "resource_type": "macro", "package_name": "dbt", "path": "macros/etc/datetime.sql", "original_file_path": "macros/etc/datetime.sql", "unique_id": "macro.dbt.partition_range", "macro_sql": "{% macro partition_range(raw_partition_date, date_fmt='%Y%m%d') %}\n {% set partition_range = (raw_partition_date | string).split(\",\") %}\n\n {% if (partition_range | length) == 1 %}\n {% set start_date = partition_range[0] %}\n {% set end_date = none %}\n {% elif (partition_range | length) == 2 %}\n {% set start_date = partition_range[0] %}\n {% set end_date = partition_range[1] %}\n {% else %}\n {{ exceptions.raise_compiler_error(\"Invalid partition time. Expected format: {Start Date}[,{End Date}]. 
Got: \" ~ raw_partition_date) }}\n {% endif %}\n\n {{ return(dates_in_range(start_date, end_date, in_fmt=date_fmt)) }}\n{% endmacro %}", "depends_on": {"macros": ["macro.dbt.dates_in_range"]}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973266.8537173, "supported_languages": null}, "macro.dbt.py_current_timestring": {"name": "py_current_timestring", "resource_type": "macro", "package_name": "dbt", "path": "macros/etc/datetime.sql", "original_file_path": "macros/etc/datetime.sql", "unique_id": "macro.dbt.py_current_timestring", "macro_sql": "{% macro py_current_timestring() %}\n {% set dt = modules.datetime.datetime.now() %}\n {% do return(dt.strftime(\"%Y%m%d%H%M%S%f\")) %}\n{% endmacro %}", "depends_on": {"macros": []}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973266.8541856, "supported_languages": null}, "macro.dbt.get_columns_in_relation": {"name": "get_columns_in_relation", "resource_type": "macro", "package_name": "dbt", "path": "macros/adapters/columns.sql", "original_file_path": "macros/adapters/columns.sql", "unique_id": "macro.dbt.get_columns_in_relation", "macro_sql": "{% macro get_columns_in_relation(relation) -%}\n {{ return(adapter.dispatch('get_columns_in_relation', 'dbt')(relation)) }}\n{% endmacro %}", "depends_on": {"macros": ["macro.dbt_snowflake.snowflake__get_columns_in_relation"]}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973266.857628, "supported_languages": null}, "macro.dbt.default__get_columns_in_relation": {"name": "default__get_columns_in_relation", "resource_type": "macro", "package_name": "dbt", "path": "macros/adapters/columns.sql", "original_file_path": "macros/adapters/columns.sql", "unique_id": "macro.dbt.default__get_columns_in_relation", "macro_sql": "{% macro default__get_columns_in_relation(relation) -%}\n {{ exceptions.raise_not_implemented(\n 'get_columns_in_relation macro not implemented for adapter '+adapter.type()) }}\n{% endmacro %}", "depends_on": {"macros": []}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973266.8579628, "supported_languages": null}, "macro.dbt.sql_convert_columns_in_relation": {"name": "sql_convert_columns_in_relation", "resource_type": "macro", "package_name": "dbt", "path": "macros/adapters/columns.sql", "original_file_path": "macros/adapters/columns.sql", "unique_id": "macro.dbt.sql_convert_columns_in_relation", "macro_sql": "{% macro sql_convert_columns_in_relation(table) -%}\n {% set columns = [] %}\n {% for row in table %}\n {% do columns.append(api.Column(*row)) %}\n {% endfor %}\n {{ return(columns) }}\n{% endmacro %}", "depends_on": {"macros": []}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973266.858594, "supported_languages": null}, "macro.dbt.get_empty_subquery_sql": {"name": "get_empty_subquery_sql", "resource_type": "macro", "package_name": "dbt", "path": "macros/adapters/columns.sql", "original_file_path": "macros/adapters/columns.sql", "unique_id": "macro.dbt.get_empty_subquery_sql", "macro_sql": "{% macro get_empty_subquery_sql(select_sql, select_sql_header=none) -%}\n {{ return(adapter.dispatch('get_empty_subquery_sql', 'dbt')(select_sql, select_sql_header)) }}\n{% endmacro %}", 
"depends_on": {"macros": ["macro.dbt.default__get_empty_subquery_sql"]}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973266.859031, "supported_languages": null}, "macro.dbt.default__get_empty_subquery_sql": {"name": "default__get_empty_subquery_sql", "resource_type": "macro", "package_name": "dbt", "path": "macros/adapters/columns.sql", "original_file_path": "macros/adapters/columns.sql", "unique_id": "macro.dbt.default__get_empty_subquery_sql", "macro_sql": "{% macro default__get_empty_subquery_sql(select_sql, select_sql_header=none) %}\n {%- if select_sql_header is not none -%}\n {{ select_sql_header }}\n {%- endif -%}\n select * from (\n {{ select_sql }}\n ) as __dbt_sbq\n where false\n limit 0\n{% endmacro %}", "depends_on": {"macros": []}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973266.8594441, "supported_languages": null}, "macro.dbt.get_empty_schema_sql": {"name": "get_empty_schema_sql", "resource_type": "macro", "package_name": "dbt", "path": "macros/adapters/columns.sql", "original_file_path": "macros/adapters/columns.sql", "unique_id": "macro.dbt.get_empty_schema_sql", "macro_sql": "{% macro get_empty_schema_sql(columns) -%}\n {{ return(adapter.dispatch('get_empty_schema_sql', 'dbt')(columns)) }}\n{% endmacro %}", "depends_on": {"macros": ["macro.dbt.default__get_empty_schema_sql"]}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973266.8598125, "supported_languages": null}, "macro.dbt.default__get_empty_schema_sql": {"name": "default__get_empty_schema_sql", "resource_type": "macro", "package_name": "dbt", "path": "macros/adapters/columns.sql", "original_file_path": "macros/adapters/columns.sql", "unique_id": "macro.dbt.default__get_empty_schema_sql", "macro_sql": "{% macro default__get_empty_schema_sql(columns) %}\n {%- set col_err = [] -%}\n select\n {% for i in columns %}\n {%- set col = columns[i] -%}\n {%- if col['data_type'] is not defined -%}\n {{ col_err.append(col['name']) }}\n {%- endif -%}\n {% set col_name = adapter.quote(col['name']) if col.get('quote') else col['name'] %}\n cast(null as {{ col['data_type'] }}) as {{ col_name }}{{ \", \" if not loop.last }}\n {%- endfor -%}\n {%- if (col_err | length) > 0 -%}\n {{ exceptions.column_type_missing(column_names=col_err) }}\n {%- endif -%}\n{% endmacro %}", "depends_on": {"macros": []}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973266.8614879, "supported_languages": null}, "macro.dbt.get_column_schema_from_query": {"name": "get_column_schema_from_query", "resource_type": "macro", "package_name": "dbt", "path": "macros/adapters/columns.sql", "original_file_path": "macros/adapters/columns.sql", "unique_id": "macro.dbt.get_column_schema_from_query", "macro_sql": "{% macro get_column_schema_from_query(select_sql, select_sql_header=none) -%}\n {% set columns = [] %}\n {# -- Using an 'empty subquery' here to get the same schema as the given select_sql statement, without necessitating a data scan.#}\n {% set sql = get_empty_subquery_sql(select_sql, select_sql_header) %}\n {% set column_schema = adapter.get_column_schema_from_query(sql) %}\n {{ return(column_schema) }}\n{% endmacro %}", "depends_on": {"macros": ["macro.dbt.get_empty_subquery_sql"]}, "description": "", "meta": {}, "docs": 
{"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973266.8622155, "supported_languages": null}, "macro.dbt.get_columns_in_query": {"name": "get_columns_in_query", "resource_type": "macro", "package_name": "dbt", "path": "macros/adapters/columns.sql", "original_file_path": "macros/adapters/columns.sql", "unique_id": "macro.dbt.get_columns_in_query", "macro_sql": "{% macro get_columns_in_query(select_sql) -%}\n {{ return(adapter.dispatch('get_columns_in_query', 'dbt')(select_sql)) }}\n{% endmacro %}", "depends_on": {"macros": ["macro.dbt.default__get_columns_in_query"]}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973266.8625867, "supported_languages": null}, "macro.dbt.default__get_columns_in_query": {"name": "default__get_columns_in_query", "resource_type": "macro", "package_name": "dbt", "path": "macros/adapters/columns.sql", "original_file_path": "macros/adapters/columns.sql", "unique_id": "macro.dbt.default__get_columns_in_query", "macro_sql": "{% macro default__get_columns_in_query(select_sql) %}\n {% call statement('get_columns_in_query', fetch_result=True, auto_begin=False) -%}\n {{ get_empty_subquery_sql(select_sql) }}\n {% endcall %}\n {{ return(load_result('get_columns_in_query').table.columns | map(attribute='name') | list) }}\n{% endmacro %}", "depends_on": {"macros": ["macro.dbt.statement", "macro.dbt.get_empty_subquery_sql"]}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973266.8632777, "supported_languages": null}, "macro.dbt.alter_column_type": {"name": "alter_column_type", "resource_type": "macro", "package_name": "dbt", "path": "macros/adapters/columns.sql", "original_file_path": "macros/adapters/columns.sql", "unique_id": "macro.dbt.alter_column_type", "macro_sql": "{% macro alter_column_type(relation, column_name, new_column_type) -%}\n {{ return(adapter.dispatch('alter_column_type', 'dbt')(relation, column_name, new_column_type)) }}\n{% endmacro %}", "depends_on": {"macros": ["macro.dbt_snowflake.snowflake__alter_column_type"]}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973266.8637304, "supported_languages": null}, "macro.dbt.default__alter_column_type": {"name": "default__alter_column_type", "resource_type": "macro", "package_name": "dbt", "path": "macros/adapters/columns.sql", "original_file_path": "macros/adapters/columns.sql", "unique_id": "macro.dbt.default__alter_column_type", "macro_sql": "{% macro default__alter_column_type(relation, column_name, new_column_type) -%}\n {#\n 1. Create a new column (w/ temp name and correct type)\n 2. Copy data over to it\n 3. Drop the existing column (cascade!)\n 4. 
Rename the new column to existing column\n #}\n {%- set tmp_column = column_name + \"__dbt_alter\" -%}\n\n {% call statement('alter_column_type') %}\n alter table {{ relation }} add column {{ adapter.quote(tmp_column) }} {{ new_column_type }};\n update {{ relation }} set {{ adapter.quote(tmp_column) }} = {{ adapter.quote(column_name) }};\n alter table {{ relation }} drop column {{ adapter.quote(column_name) }} cascade;\n alter table {{ relation }} rename column {{ adapter.quote(tmp_column) }} to {{ adapter.quote(column_name) }}\n {% endcall %}\n\n{% endmacro %}", "depends_on": {"macros": ["macro.dbt.statement"]}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973266.864945, "supported_languages": null}, "macro.dbt.alter_relation_add_remove_columns": {"name": "alter_relation_add_remove_columns", "resource_type": "macro", "package_name": "dbt", "path": "macros/adapters/columns.sql", "original_file_path": "macros/adapters/columns.sql", "unique_id": "macro.dbt.alter_relation_add_remove_columns", "macro_sql": "{% macro alter_relation_add_remove_columns(relation, add_columns = none, remove_columns = none) -%}\n {{ return(adapter.dispatch('alter_relation_add_remove_columns', 'dbt')(relation, add_columns, remove_columns)) }}\n{% endmacro %}", "depends_on": {"macros": ["macro.dbt_snowflake.snowflake__alter_relation_add_remove_columns"]}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973266.8654623, "supported_languages": null}, "macro.dbt.default__alter_relation_add_remove_columns": {"name": "default__alter_relation_add_remove_columns", "resource_type": "macro", "package_name": "dbt", "path": "macros/adapters/columns.sql", "original_file_path": "macros/adapters/columns.sql", "unique_id": "macro.dbt.default__alter_relation_add_remove_columns", "macro_sql": "{% macro default__alter_relation_add_remove_columns(relation, add_columns, remove_columns) %}\n\n {% if add_columns is none %}\n {% set add_columns = [] %}\n {% endif %}\n {% if remove_columns is none %}\n {% set remove_columns = [] %}\n {% endif %}\n\n {% set sql -%}\n\n alter {{ relation.type }} {{ relation }}\n\n {% for column in add_columns %}\n add column {{ column.name }} {{ column.data_type }}{{ ',' if not loop.last }}\n {% endfor %}{{ ',' if add_columns and remove_columns }}\n\n {% for column in remove_columns %}\n drop column {{ column.name }}{{ ',' if not loop.last }}\n {% endfor %}\n\n {%- endset -%}\n\n {% do run_query(sql) %}\n\n{% endmacro %}", "depends_on": {"macros": ["macro.dbt.run_query"]}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973266.8670418, "supported_languages": null}, "macro.dbt.drop_relation": {"name": "drop_relation", "resource_type": "macro", "package_name": "dbt", "path": "macros/adapters/drop_relation.sql", "original_file_path": "macros/adapters/drop_relation.sql", "unique_id": "macro.dbt.drop_relation", "macro_sql": "{% macro drop_relation(relation) -%}\n {{ return(adapter.dispatch('drop_relation', 'dbt')(relation)) }}\n{% endmacro %}", "depends_on": {"macros": ["macro.dbt_snowflake.snowflake__drop_relation"]}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973266.8685424, "supported_languages": null}, "macro.dbt.default__drop_relation": {"name": "default__drop_relation", 
"resource_type": "macro", "package_name": "dbt", "path": "macros/adapters/drop_relation.sql", "original_file_path": "macros/adapters/drop_relation.sql", "unique_id": "macro.dbt.default__drop_relation", "macro_sql": "{% macro default__drop_relation(relation) -%}\n {% call statement('drop_relation', auto_begin=False) -%}\n {%- if relation.is_table -%}\n {{- drop_table(relation) -}}\n {%- elif relation.is_view -%}\n {{- drop_view(relation) -}}\n {%- elif relation.is_materialized_view -%}\n {{- drop_materialized_view(relation) -}}\n {%- else -%}\n drop {{ relation.type }} if exists {{ relation }} cascade\n {%- endif -%}\n {%- endcall %}\n{% endmacro %}", "depends_on": {"macros": ["macro.dbt.statement", "macro.dbt.drop_table", "macro.dbt.drop_view", "macro.dbt.drop_materialized_view"]}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973266.8694792, "supported_languages": null}, "macro.dbt.drop_table": {"name": "drop_table", "resource_type": "macro", "package_name": "dbt", "path": "macros/adapters/drop_relation.sql", "original_file_path": "macros/adapters/drop_relation.sql", "unique_id": "macro.dbt.drop_table", "macro_sql": "{% macro drop_table(relation) -%}\n {{ return(adapter.dispatch('drop_table', 'dbt')(relation)) }}\n{%- endmacro %}", "depends_on": {"macros": ["macro.dbt.default__drop_table"]}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973266.8698502, "supported_languages": null}, "macro.dbt.default__drop_table": {"name": "default__drop_table", "resource_type": "macro", "package_name": "dbt", "path": "macros/adapters/drop_relation.sql", "original_file_path": "macros/adapters/drop_relation.sql", "unique_id": "macro.dbt.default__drop_table", "macro_sql": "{% macro default__drop_table(relation) -%}\n drop table if exists {{ relation }} cascade\n{%- endmacro %}", "depends_on": {"macros": []}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973266.8700695, "supported_languages": null}, "macro.dbt.drop_view": {"name": "drop_view", "resource_type": "macro", "package_name": "dbt", "path": "macros/adapters/drop_relation.sql", "original_file_path": "macros/adapters/drop_relation.sql", "unique_id": "macro.dbt.drop_view", "macro_sql": "{% macro drop_view(relation) -%}\n {{ return(adapter.dispatch('drop_view', 'dbt')(relation)) }}\n{%- endmacro %}", "depends_on": {"macros": ["macro.dbt.default__drop_view"]}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973266.870432, "supported_languages": null}, "macro.dbt.default__drop_view": {"name": "default__drop_view", "resource_type": "macro", "package_name": "dbt", "path": "macros/adapters/drop_relation.sql", "original_file_path": "macros/adapters/drop_relation.sql", "unique_id": "macro.dbt.default__drop_view", "macro_sql": "{% macro default__drop_view(relation) -%}\n drop view if exists {{ relation }} cascade\n{%- endmacro %}", "depends_on": {"macros": []}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973266.8706546, "supported_languages": null}, "macro.dbt.drop_materialized_view": {"name": "drop_materialized_view", "resource_type": "macro", "package_name": "dbt", "path": "macros/adapters/drop_relation.sql", "original_file_path": 
"macros/adapters/drop_relation.sql", "unique_id": "macro.dbt.drop_materialized_view", "macro_sql": "{% macro drop_materialized_view(relation) -%}\n {{ return(adapter.dispatch('drop_materialized_view', 'dbt')(relation)) }}\n{%- endmacro %}", "depends_on": {"macros": ["macro.dbt.default__drop_materialized_view"]}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973266.8710067, "supported_languages": null}, "macro.dbt.default__drop_materialized_view": {"name": "default__drop_materialized_view", "resource_type": "macro", "package_name": "dbt", "path": "macros/adapters/drop_relation.sql", "original_file_path": "macros/adapters/drop_relation.sql", "unique_id": "macro.dbt.default__drop_materialized_view", "macro_sql": "{% macro default__drop_materialized_view(relation) -%}\n drop materialized view if exists {{ relation }} cascade\n{%- endmacro %}", "depends_on": {"macros": []}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973266.8712208, "supported_languages": null}, "macro.dbt.current_timestamp": {"name": "current_timestamp", "resource_type": "macro", "package_name": "dbt", "path": "macros/adapters/timestamps.sql", "original_file_path": "macros/adapters/timestamps.sql", "unique_id": "macro.dbt.current_timestamp", "macro_sql": "{%- macro current_timestamp() -%}\n {{ adapter.dispatch('current_timestamp', 'dbt')() }}\n{%- endmacro -%}\n\n", "depends_on": {"macros": ["macro.dbt_snowflake.snowflake__current_timestamp"]}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973266.8720825, "supported_languages": null}, "macro.dbt.default__current_timestamp": {"name": "default__current_timestamp", "resource_type": "macro", "package_name": "dbt", "path": "macros/adapters/timestamps.sql", "original_file_path": "macros/adapters/timestamps.sql", "unique_id": "macro.dbt.default__current_timestamp", "macro_sql": "{% macro default__current_timestamp() -%}\n {{ exceptions.raise_not_implemented(\n 'current_timestamp macro not implemented for adapter ' + adapter.type()) }}\n{%- endmacro %}", "depends_on": {"macros": []}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973266.872397, "supported_languages": null}, "macro.dbt.snapshot_get_time": {"name": "snapshot_get_time", "resource_type": "macro", "package_name": "dbt", "path": "macros/adapters/timestamps.sql", "original_file_path": "macros/adapters/timestamps.sql", "unique_id": "macro.dbt.snapshot_get_time", "macro_sql": "\n\n{%- macro snapshot_get_time() -%}\n {{ adapter.dispatch('snapshot_get_time', 'dbt')() }}\n{%- endmacro -%}\n\n", "depends_on": {"macros": ["macro.dbt_snowflake.snowflake__snapshot_get_time"]}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973266.872706, "supported_languages": null}, "macro.dbt.default__snapshot_get_time": {"name": "default__snapshot_get_time", "resource_type": "macro", "package_name": "dbt", "path": "macros/adapters/timestamps.sql", "original_file_path": "macros/adapters/timestamps.sql", "unique_id": "macro.dbt.default__snapshot_get_time", "macro_sql": "{% macro default__snapshot_get_time() %}\n {{ current_timestamp() }}\n{% endmacro %}", "depends_on": {"macros": ["macro.dbt.current_timestamp"]}, "description": "", 
"meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973266.872926, "supported_languages": null}, "macro.dbt.current_timestamp_backcompat": {"name": "current_timestamp_backcompat", "resource_type": "macro", "package_name": "dbt", "path": "macros/adapters/timestamps.sql", "original_file_path": "macros/adapters/timestamps.sql", "unique_id": "macro.dbt.current_timestamp_backcompat", "macro_sql": "{% macro current_timestamp_backcompat() %}\n {{ return(adapter.dispatch('current_timestamp_backcompat', 'dbt')()) }}\n{% endmacro %}", "depends_on": {"macros": ["macro.dbt_snowflake.snowflake__current_timestamp_backcompat"]}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973266.8732874, "supported_languages": null}, "macro.dbt.default__current_timestamp_backcompat": {"name": "default__current_timestamp_backcompat", "resource_type": "macro", "package_name": "dbt", "path": "macros/adapters/timestamps.sql", "original_file_path": "macros/adapters/timestamps.sql", "unique_id": "macro.dbt.default__current_timestamp_backcompat", "macro_sql": "{% macro default__current_timestamp_backcompat() %}\n current_timestamp::timestamp\n{% endmacro %}", "depends_on": {"macros": []}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973266.8734481, "supported_languages": null}, "macro.dbt.current_timestamp_in_utc_backcompat": {"name": "current_timestamp_in_utc_backcompat", "resource_type": "macro", "package_name": "dbt", "path": "macros/adapters/timestamps.sql", "original_file_path": "macros/adapters/timestamps.sql", "unique_id": "macro.dbt.current_timestamp_in_utc_backcompat", "macro_sql": "{% macro current_timestamp_in_utc_backcompat() %}\n {{ return(adapter.dispatch('current_timestamp_in_utc_backcompat', 'dbt')()) }}\n{% endmacro %}", "depends_on": {"macros": ["macro.dbt_snowflake.snowflake__current_timestamp_in_utc_backcompat"]}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973266.8737872, "supported_languages": null}, "macro.dbt.default__current_timestamp_in_utc_backcompat": {"name": "default__current_timestamp_in_utc_backcompat", "resource_type": "macro", "package_name": "dbt", "path": "macros/adapters/timestamps.sql", "original_file_path": "macros/adapters/timestamps.sql", "unique_id": "macro.dbt.default__current_timestamp_in_utc_backcompat", "macro_sql": "{% macro default__current_timestamp_in_utc_backcompat() %}\n {{ return(adapter.dispatch('current_timestamp_backcompat', 'dbt')()) }}\n{% endmacro %}", "depends_on": {"macros": ["macro.dbt.current_timestamp_backcompat", "macro.dbt_snowflake.snowflake__current_timestamp_backcompat"]}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973266.8742428, "supported_languages": null}, "macro.dbt.collect_freshness": {"name": "collect_freshness", "resource_type": "macro", "package_name": "dbt", "path": "macros/adapters/freshness.sql", "original_file_path": "macros/adapters/freshness.sql", "unique_id": "macro.dbt.collect_freshness", "macro_sql": "{% macro collect_freshness(source, loaded_at_field, filter) %}\n {{ return(adapter.dispatch('collect_freshness', 'dbt')(source, loaded_at_field, filter))}}\n{% endmacro %}", "depends_on": {"macros": ["macro.dbt.default__collect_freshness"]}, 
"description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973266.8750105, "supported_languages": null}, "macro.dbt.default__collect_freshness": {"name": "default__collect_freshness", "resource_type": "macro", "package_name": "dbt", "path": "macros/adapters/freshness.sql", "original_file_path": "macros/adapters/freshness.sql", "unique_id": "macro.dbt.default__collect_freshness", "macro_sql": "{% macro default__collect_freshness(source, loaded_at_field, filter) %}\n {% call statement('collect_freshness', fetch_result=True, auto_begin=False) -%}\n select\n max({{ loaded_at_field }}) as max_loaded_at,\n {{ current_timestamp() }} as snapshotted_at\n from {{ source }}\n {% if filter %}\n where {{ filter }}\n {% endif %}\n {% endcall %}\n {{ return(load_result('collect_freshness')) }}\n{% endmacro %}", "depends_on": {"macros": ["macro.dbt.statement", "macro.dbt.current_timestamp"]}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973266.8758419, "supported_languages": null}, "macro.dbt.validate_sql": {"name": "validate_sql", "resource_type": "macro", "package_name": "dbt", "path": "macros/adapters/validate_sql.sql", "original_file_path": "macros/adapters/validate_sql.sql", "unique_id": "macro.dbt.validate_sql", "macro_sql": "{% macro validate_sql(sql) -%}\n {{ return(adapter.dispatch('validate_sql', 'dbt')(sql)) }}\n{% endmacro %}", "depends_on": {"macros": ["macro.dbt.default__validate_sql"]}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973266.8763785, "supported_languages": null}, "macro.dbt.default__validate_sql": {"name": "default__validate_sql", "resource_type": "macro", "package_name": "dbt", "path": "macros/adapters/validate_sql.sql", "original_file_path": "macros/adapters/validate_sql.sql", "unique_id": "macro.dbt.default__validate_sql", "macro_sql": "{% macro default__validate_sql(sql) -%}\n {% call statement('validate_sql') -%}\n explain {{ sql }}\n {% endcall %}\n {{ return(load_result('validate_sql')) }}\n{% endmacro %}", "depends_on": {"macros": ["macro.dbt.statement"]}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973266.8768578, "supported_languages": null}, "macro.dbt.create_schema": {"name": "create_schema", "resource_type": "macro", "package_name": "dbt", "path": "macros/adapters/schema.sql", "original_file_path": "macros/adapters/schema.sql", "unique_id": "macro.dbt.create_schema", "macro_sql": "{% macro create_schema(relation) -%}\n {{ adapter.dispatch('create_schema', 'dbt')(relation) }}\n{% endmacro %}", "depends_on": {"macros": ["macro.dbt.default__create_schema"]}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973266.8775434, "supported_languages": null}, "macro.dbt.default__create_schema": {"name": "default__create_schema", "resource_type": "macro", "package_name": "dbt", "path": "macros/adapters/schema.sql", "original_file_path": "macros/adapters/schema.sql", "unique_id": "macro.dbt.default__create_schema", "macro_sql": "{% macro default__create_schema(relation) -%}\n {%- call statement('create_schema') -%}\n create schema if not exists {{ relation.without_identifier() }}\n {% endcall %}\n{% endmacro %}", "depends_on": {"macros": ["macro.dbt.statement"]}, "description": 
"", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973266.8779247, "supported_languages": null}, "macro.dbt.drop_schema": {"name": "drop_schema", "resource_type": "macro", "package_name": "dbt", "path": "macros/adapters/schema.sql", "original_file_path": "macros/adapters/schema.sql", "unique_id": "macro.dbt.drop_schema", "macro_sql": "{% macro drop_schema(relation) -%}\n {{ adapter.dispatch('drop_schema', 'dbt')(relation) }}\n{% endmacro %}", "depends_on": {"macros": ["macro.dbt.default__drop_schema"]}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973266.8782535, "supported_languages": null}, "macro.dbt.default__drop_schema": {"name": "default__drop_schema", "resource_type": "macro", "package_name": "dbt", "path": "macros/adapters/schema.sql", "original_file_path": "macros/adapters/schema.sql", "unique_id": "macro.dbt.default__drop_schema", "macro_sql": "{% macro default__drop_schema(relation) -%}\n {%- call statement('drop_schema') -%}\n drop schema if exists {{ relation.without_identifier() }} cascade\n {% endcall %}\n{% endmacro %}", "depends_on": {"macros": ["macro.dbt.statement"]}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973266.8786259, "supported_languages": null}, "macro.dbt.make_intermediate_relation": {"name": "make_intermediate_relation", "resource_type": "macro", "package_name": "dbt", "path": "macros/adapters/relation.sql", "original_file_path": "macros/adapters/relation.sql", "unique_id": "macro.dbt.make_intermediate_relation", "macro_sql": "{% macro make_intermediate_relation(base_relation, suffix='__dbt_tmp') %}\n {{ return(adapter.dispatch('make_intermediate_relation', 'dbt')(base_relation, suffix)) }}\n{% endmacro %}", "depends_on": {"macros": ["macro.dbt.default__make_intermediate_relation"]}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973266.8827796, "supported_languages": null}, "macro.dbt.default__make_intermediate_relation": {"name": "default__make_intermediate_relation", "resource_type": "macro", "package_name": "dbt", "path": "macros/adapters/relation.sql", "original_file_path": "macros/adapters/relation.sql", "unique_id": "macro.dbt.default__make_intermediate_relation", "macro_sql": "{% macro default__make_intermediate_relation(base_relation, suffix) %}\n {{ return(default__make_temp_relation(base_relation, suffix)) }}\n{% endmacro %}", "depends_on": {"macros": ["macro.dbt.default__make_temp_relation"]}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973266.8831184, "supported_languages": null}, "macro.dbt.make_temp_relation": {"name": "make_temp_relation", "resource_type": "macro", "package_name": "dbt", "path": "macros/adapters/relation.sql", "original_file_path": "macros/adapters/relation.sql", "unique_id": "macro.dbt.make_temp_relation", "macro_sql": "{% macro make_temp_relation(base_relation, suffix='__dbt_tmp') %}\n {{ return(adapter.dispatch('make_temp_relation', 'dbt')(base_relation, suffix)) }}\n{% endmacro %}", "depends_on": {"macros": ["macro.dbt.default__make_temp_relation"]}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973266.8835523, "supported_languages": null}, 
"macro.dbt.default__make_temp_relation": {"name": "default__make_temp_relation", "resource_type": "macro", "package_name": "dbt", "path": "macros/adapters/relation.sql", "original_file_path": "macros/adapters/relation.sql", "unique_id": "macro.dbt.default__make_temp_relation", "macro_sql": "{% macro default__make_temp_relation(base_relation, suffix) %}\n {%- set temp_identifier = base_relation.identifier ~ suffix -%}\n {%- set temp_relation = base_relation.incorporate(\n path={\"identifier\": temp_identifier}) -%}\n\n {{ return(temp_relation) }}\n{% endmacro %}", "depends_on": {"macros": []}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973266.8841305, "supported_languages": null}, "macro.dbt.make_backup_relation": {"name": "make_backup_relation", "resource_type": "macro", "package_name": "dbt", "path": "macros/adapters/relation.sql", "original_file_path": "macros/adapters/relation.sql", "unique_id": "macro.dbt.make_backup_relation", "macro_sql": "{% macro make_backup_relation(base_relation, backup_relation_type, suffix='__dbt_backup') %}\n {{ return(adapter.dispatch('make_backup_relation', 'dbt')(base_relation, backup_relation_type, suffix)) }}\n{% endmacro %}", "depends_on": {"macros": ["macro.dbt.default__make_backup_relation"]}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973266.8846295, "supported_languages": null}, "macro.dbt.default__make_backup_relation": {"name": "default__make_backup_relation", "resource_type": "macro", "package_name": "dbt", "path": "macros/adapters/relation.sql", "original_file_path": "macros/adapters/relation.sql", "unique_id": "macro.dbt.default__make_backup_relation", "macro_sql": "{% macro default__make_backup_relation(base_relation, backup_relation_type, suffix) %}\n {%- set backup_identifier = base_relation.identifier ~ suffix -%}\n {%- set backup_relation = base_relation.incorporate(\n path={\"identifier\": backup_identifier},\n type=backup_relation_type\n ) -%}\n {{ return(backup_relation) }}\n{% endmacro %}", "depends_on": {"macros": []}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973266.8853598, "supported_languages": null}, "macro.dbt.truncate_relation": {"name": "truncate_relation", "resource_type": "macro", "package_name": "dbt", "path": "macros/adapters/relation.sql", "original_file_path": "macros/adapters/relation.sql", "unique_id": "macro.dbt.truncate_relation", "macro_sql": "{% macro truncate_relation(relation) -%}\n {{ return(adapter.dispatch('truncate_relation', 'dbt')(relation)) }}\n{% endmacro %}", "depends_on": {"macros": ["macro.dbt_snowflake.snowflake__truncate_relation"]}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973266.8857334, "supported_languages": null}, "macro.dbt.default__truncate_relation": {"name": "default__truncate_relation", "resource_type": "macro", "package_name": "dbt", "path": "macros/adapters/relation.sql", "original_file_path": "macros/adapters/relation.sql", "unique_id": "macro.dbt.default__truncate_relation", "macro_sql": "{% macro default__truncate_relation(relation) -%}\n {% call statement('truncate_relation') -%}\n truncate table {{ relation }}\n {%- endcall %}\n{% endmacro %}", "depends_on": {"macros": ["macro.dbt.statement"]}, "description": "", "meta": {}, "docs": 
{"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973266.8860667, "supported_languages": null}, "macro.dbt.rename_relation": {"name": "rename_relation", "resource_type": "macro", "package_name": "dbt", "path": "macros/adapters/relation.sql", "original_file_path": "macros/adapters/relation.sql", "unique_id": "macro.dbt.rename_relation", "macro_sql": "{% macro rename_relation(from_relation, to_relation) -%}\n {{ return(adapter.dispatch('rename_relation', 'dbt')(from_relation, to_relation)) }}\n{% endmacro %}", "depends_on": {"macros": ["macro.dbt_snowflake.snowflake__rename_relation"]}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973266.8864796, "supported_languages": null}, "macro.dbt.default__rename_relation": {"name": "default__rename_relation", "resource_type": "macro", "package_name": "dbt", "path": "macros/adapters/relation.sql", "original_file_path": "macros/adapters/relation.sql", "unique_id": "macro.dbt.default__rename_relation", "macro_sql": "{% macro default__rename_relation(from_relation, to_relation) -%}\n {% set target_name = adapter.quote_as_configured(to_relation.identifier, 'identifier') %}\n {% call statement('rename_relation') -%}\n alter table {{ from_relation }} rename to {{ target_name }}\n {%- endcall %}\n{% endmacro %}", "depends_on": {"macros": ["macro.dbt.statement"]}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973266.8871539, "supported_languages": null}, "macro.dbt.get_or_create_relation": {"name": "get_or_create_relation", "resource_type": "macro", "package_name": "dbt", "path": "macros/adapters/relation.sql", "original_file_path": "macros/adapters/relation.sql", "unique_id": "macro.dbt.get_or_create_relation", "macro_sql": "{% macro get_or_create_relation(database, schema, identifier, type) -%}\n {{ return(adapter.dispatch('get_or_create_relation', 'dbt')(database, schema, identifier, type)) }}\n{% endmacro %}", "depends_on": {"macros": ["macro.dbt.default__get_or_create_relation"]}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973266.8876488, "supported_languages": null}, "macro.dbt.default__get_or_create_relation": {"name": "default__get_or_create_relation", "resource_type": "macro", "package_name": "dbt", "path": "macros/adapters/relation.sql", "original_file_path": "macros/adapters/relation.sql", "unique_id": "macro.dbt.default__get_or_create_relation", "macro_sql": "{% macro default__get_or_create_relation(database, schema, identifier, type) %}\n {%- set target_relation = adapter.get_relation(database=database, schema=schema, identifier=identifier) %}\n\n {% if target_relation %}\n {% do return([true, target_relation]) %}\n {% endif %}\n\n {%- set new_relation = api.Relation.create(\n database=database,\n schema=schema,\n identifier=identifier,\n type=type\n ) -%}\n {% do return([false, new_relation]) %}\n{% endmacro %}", "depends_on": {"macros": []}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973266.8887649, "supported_languages": null}, "macro.dbt.load_cached_relation": {"name": "load_cached_relation", "resource_type": "macro", "package_name": "dbt", "path": "macros/adapters/relation.sql", "original_file_path": "macros/adapters/relation.sql", "unique_id": 
"macro.dbt.load_cached_relation", "macro_sql": "{% macro load_cached_relation(relation) %}\n {% do return(adapter.get_relation(\n database=relation.database,\n schema=relation.schema,\n identifier=relation.identifier\n )) -%}\n{% endmacro %}", "depends_on": {"macros": []}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973266.889229, "supported_languages": null}, "macro.dbt.load_relation": {"name": "load_relation", "resource_type": "macro", "package_name": "dbt", "path": "macros/adapters/relation.sql", "original_file_path": "macros/adapters/relation.sql", "unique_id": "macro.dbt.load_relation", "macro_sql": "{% macro load_relation(relation) %}\n {{ return(load_cached_relation(relation)) }}\n{% endmacro %}", "depends_on": {"macros": ["macro.dbt.load_cached_relation"]}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973266.8895187, "supported_languages": null}, "macro.dbt.drop_relation_if_exists": {"name": "drop_relation_if_exists", "resource_type": "macro", "package_name": "dbt", "path": "macros/adapters/relation.sql", "original_file_path": "macros/adapters/relation.sql", "unique_id": "macro.dbt.drop_relation_if_exists", "macro_sql": "{% macro drop_relation_if_exists(relation) %}\n {% if relation is not none %}\n {{ adapter.drop_relation(relation) }}\n {% endif %}\n{% endmacro %}", "depends_on": {"macros": []}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973266.8899121, "supported_languages": null}, "macro.dbt.alter_column_comment": {"name": "alter_column_comment", "resource_type": "macro", "package_name": "dbt", "path": "macros/adapters/persist_docs.sql", "original_file_path": "macros/adapters/persist_docs.sql", "unique_id": "macro.dbt.alter_column_comment", "macro_sql": "{% macro alter_column_comment(relation, column_dict) -%}\n {{ return(adapter.dispatch('alter_column_comment', 'dbt')(relation, column_dict)) }}\n{% endmacro %}", "depends_on": {"macros": ["macro.dbt_snowflake.snowflake__alter_column_comment"]}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973266.8910716, "supported_languages": null}, "macro.dbt.default__alter_column_comment": {"name": "default__alter_column_comment", "resource_type": "macro", "package_name": "dbt", "path": "macros/adapters/persist_docs.sql", "original_file_path": "macros/adapters/persist_docs.sql", "unique_id": "macro.dbt.default__alter_column_comment", "macro_sql": "{% macro default__alter_column_comment(relation, column_dict) -%}\n {{ exceptions.raise_not_implemented(\n 'alter_column_comment macro not implemented for adapter '+adapter.type()) }}\n{% endmacro %}", "depends_on": {"macros": []}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973266.8914137, "supported_languages": null}, "macro.dbt.alter_relation_comment": {"name": "alter_relation_comment", "resource_type": "macro", "package_name": "dbt", "path": "macros/adapters/persist_docs.sql", "original_file_path": "macros/adapters/persist_docs.sql", "unique_id": "macro.dbt.alter_relation_comment", "macro_sql": "{% macro alter_relation_comment(relation, relation_comment) -%}\n {{ return(adapter.dispatch('alter_relation_comment', 'dbt')(relation, relation_comment)) }}\n{% endmacro %}", 
"depends_on": {"macros": ["macro.dbt_snowflake.snowflake__alter_relation_comment"]}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973266.8918228, "supported_languages": null}, "macro.dbt.default__alter_relation_comment": {"name": "default__alter_relation_comment", "resource_type": "macro", "package_name": "dbt", "path": "macros/adapters/persist_docs.sql", "original_file_path": "macros/adapters/persist_docs.sql", "unique_id": "macro.dbt.default__alter_relation_comment", "macro_sql": "{% macro default__alter_relation_comment(relation, relation_comment) -%}\n {{ exceptions.raise_not_implemented(\n 'alter_relation_comment macro not implemented for adapter '+adapter.type()) }}\n{% endmacro %}", "depends_on": {"macros": []}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973266.8921635, "supported_languages": null}, "macro.dbt.persist_docs": {"name": "persist_docs", "resource_type": "macro", "package_name": "dbt", "path": "macros/adapters/persist_docs.sql", "original_file_path": "macros/adapters/persist_docs.sql", "unique_id": "macro.dbt.persist_docs", "macro_sql": "{% macro persist_docs(relation, model, for_relation=true, for_columns=true) -%}\n {{ return(adapter.dispatch('persist_docs', 'dbt')(relation, model, for_relation, for_columns)) }}\n{% endmacro %}", "depends_on": {"macros": ["macro.dbt.default__persist_docs"]}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973266.892715, "supported_languages": null}, "macro.dbt.default__persist_docs": {"name": "default__persist_docs", "resource_type": "macro", "package_name": "dbt", "path": "macros/adapters/persist_docs.sql", "original_file_path": "macros/adapters/persist_docs.sql", "unique_id": "macro.dbt.default__persist_docs", "macro_sql": "{% macro default__persist_docs(relation, model, for_relation, for_columns) -%}\n {% if for_relation and config.persist_relation_docs() and model.description %}\n {% do run_query(alter_relation_comment(relation, model.description)) %}\n {% endif %}\n\n {% if for_columns and config.persist_column_docs() and model.columns %}\n {% do run_query(alter_column_comment(relation, model.columns)) %}\n {% endif %}\n{% endmacro %}", "depends_on": {"macros": ["macro.dbt.run_query", "macro.dbt.alter_relation_comment", "macro.dbt.alter_column_comment"]}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973266.8936973, "supported_languages": null}, "macro.dbt.get_catalog": {"name": "get_catalog", "resource_type": "macro", "package_name": "dbt", "path": "macros/adapters/metadata.sql", "original_file_path": "macros/adapters/metadata.sql", "unique_id": "macro.dbt.get_catalog", "macro_sql": "{% macro get_catalog(information_schema, schemas) -%}\n {{ return(adapter.dispatch('get_catalog', 'dbt')(information_schema, schemas)) }}\n{%- endmacro %}", "depends_on": {"macros": ["macro.dbt_snowflake.snowflake__get_catalog"]}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973266.8959506, "supported_languages": null}, "macro.dbt.default__get_catalog": {"name": "default__get_catalog", "resource_type": "macro", "package_name": "dbt", "path": "macros/adapters/metadata.sql", "original_file_path": "macros/adapters/metadata.sql", 
"unique_id": "macro.dbt.default__get_catalog", "macro_sql": "{% macro default__get_catalog(information_schema, schemas) -%}\n\n {% set typename = adapter.type() %}\n {% set msg -%}\n get_catalog not implemented for {{ typename }}\n {%- endset %}\n\n {{ exceptions.raise_compiler_error(msg) }}\n{% endmacro %}", "depends_on": {"macros": []}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973266.8964796, "supported_languages": null}, "macro.dbt.information_schema_name": {"name": "information_schema_name", "resource_type": "macro", "package_name": "dbt", "path": "macros/adapters/metadata.sql", "original_file_path": "macros/adapters/metadata.sql", "unique_id": "macro.dbt.information_schema_name", "macro_sql": "{% macro information_schema_name(database) %}\n {{ return(adapter.dispatch('information_schema_name', 'dbt')(database)) }}\n{% endmacro %}", "depends_on": {"macros": ["macro.dbt.default__information_schema_name"]}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973266.896896, "supported_languages": null}, "macro.dbt.default__information_schema_name": {"name": "default__information_schema_name", "resource_type": "macro", "package_name": "dbt", "path": "macros/adapters/metadata.sql", "original_file_path": "macros/adapters/metadata.sql", "unique_id": "macro.dbt.default__information_schema_name", "macro_sql": "{% macro default__information_schema_name(database) -%}\n {%- if database -%}\n {{ database }}.INFORMATION_SCHEMA\n {%- else -%}\n INFORMATION_SCHEMA\n {%- endif -%}\n{%- endmacro %}", "depends_on": {"macros": []}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973266.8972282, "supported_languages": null}, "macro.dbt.list_schemas": {"name": "list_schemas", "resource_type": "macro", "package_name": "dbt", "path": "macros/adapters/metadata.sql", "original_file_path": "macros/adapters/metadata.sql", "unique_id": "macro.dbt.list_schemas", "macro_sql": "{% macro list_schemas(database) -%}\n {{ return(adapter.dispatch('list_schemas', 'dbt')(database)) }}\n{% endmacro %}", "depends_on": {"macros": ["macro.dbt_snowflake.snowflake__list_schemas"]}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973266.8976018, "supported_languages": null}, "macro.dbt.default__list_schemas": {"name": "default__list_schemas", "resource_type": "macro", "package_name": "dbt", "path": "macros/adapters/metadata.sql", "original_file_path": "macros/adapters/metadata.sql", "unique_id": "macro.dbt.default__list_schemas", "macro_sql": "{% macro default__list_schemas(database) -%}\n {% set sql %}\n select distinct schema_name\n from {{ information_schema_name(database) }}.SCHEMATA\n where catalog_name ilike '{{ database }}'\n {% endset %}\n {{ return(run_query(sql)) }}\n{% endmacro %}", "depends_on": {"macros": ["macro.dbt.information_schema_name", "macro.dbt.run_query"]}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973266.8980982, "supported_languages": null}, "macro.dbt.check_schema_exists": {"name": "check_schema_exists", "resource_type": "macro", "package_name": "dbt", "path": "macros/adapters/metadata.sql", "original_file_path": "macros/adapters/metadata.sql", "unique_id": "macro.dbt.check_schema_exists", 
"macro_sql": "{% macro check_schema_exists(information_schema, schema) -%}\n {{ return(adapter.dispatch('check_schema_exists', 'dbt')(information_schema, schema)) }}\n{% endmacro %}", "depends_on": {"macros": ["macro.dbt_snowflake.snowflake__check_schema_exists"]}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973266.8985262, "supported_languages": null}, "macro.dbt.default__check_schema_exists": {"name": "default__check_schema_exists", "resource_type": "macro", "package_name": "dbt", "path": "macros/adapters/metadata.sql", "original_file_path": "macros/adapters/metadata.sql", "unique_id": "macro.dbt.default__check_schema_exists", "macro_sql": "{% macro default__check_schema_exists(information_schema, schema) -%}\n {% set sql -%}\n select count(*)\n from {{ information_schema.replace(information_schema_view='SCHEMATA') }}\n where catalog_name='{{ information_schema.database }}'\n and schema_name='{{ schema }}'\n {%- endset %}\n {{ return(run_query(sql)) }}\n{% endmacro %}", "depends_on": {"macros": ["macro.dbt.replace", "macro.dbt.run_query"]}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973266.8991394, "supported_languages": null}, "macro.dbt.list_relations_without_caching": {"name": "list_relations_without_caching", "resource_type": "macro", "package_name": "dbt", "path": "macros/adapters/metadata.sql", "original_file_path": "macros/adapters/metadata.sql", "unique_id": "macro.dbt.list_relations_without_caching", "macro_sql": "{% macro list_relations_without_caching(schema_relation) %}\n {{ return(adapter.dispatch('list_relations_without_caching', 'dbt')(schema_relation)) }}\n{% endmacro %}", "depends_on": {"macros": ["macro.dbt_snowflake.snowflake__list_relations_without_caching"]}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973266.8996327, "supported_languages": null}, "macro.dbt.default__list_relations_without_caching": {"name": "default__list_relations_without_caching", "resource_type": "macro", "package_name": "dbt", "path": "macros/adapters/metadata.sql", "original_file_path": "macros/adapters/metadata.sql", "unique_id": "macro.dbt.default__list_relations_without_caching", "macro_sql": "{% macro default__list_relations_without_caching(schema_relation) %}\n {{ exceptions.raise_not_implemented(\n 'list_relations_without_caching macro not implemented for adapter '+adapter.type()) }}\n{% endmacro %}", "depends_on": {"macros": []}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973266.8999634, "supported_languages": null}, "macro.dbt.get_create_index_sql": {"name": "get_create_index_sql", "resource_type": "macro", "package_name": "dbt", "path": "macros/adapters/indexes.sql", "original_file_path": "macros/adapters/indexes.sql", "unique_id": "macro.dbt.get_create_index_sql", "macro_sql": "{% macro get_create_index_sql(relation, index_dict) -%}\n {{ return(adapter.dispatch('get_create_index_sql', 'dbt')(relation, index_dict)) }}\n{% endmacro %}", "depends_on": {"macros": ["macro.dbt.default__get_create_index_sql"]}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973266.9014046, "supported_languages": null}, "macro.dbt.default__get_create_index_sql": {"name": 
"default__get_create_index_sql", "resource_type": "macro", "package_name": "dbt", "path": "macros/adapters/indexes.sql", "original_file_path": "macros/adapters/indexes.sql", "unique_id": "macro.dbt.default__get_create_index_sql", "macro_sql": "{% macro default__get_create_index_sql(relation, index_dict) -%}\n {% do return(None) %}\n{% endmacro %}", "depends_on": {"macros": []}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973266.9016905, "supported_languages": null}, "macro.dbt.create_indexes": {"name": "create_indexes", "resource_type": "macro", "package_name": "dbt", "path": "macros/adapters/indexes.sql", "original_file_path": "macros/adapters/indexes.sql", "unique_id": "macro.dbt.create_indexes", "macro_sql": "{% macro create_indexes(relation) -%}\n {{ adapter.dispatch('create_indexes', 'dbt')(relation) }}\n{%- endmacro %}", "depends_on": {"macros": ["macro.dbt.default__create_indexes"]}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973266.9020138, "supported_languages": null}, "macro.dbt.default__create_indexes": {"name": "default__create_indexes", "resource_type": "macro", "package_name": "dbt", "path": "macros/adapters/indexes.sql", "original_file_path": "macros/adapters/indexes.sql", "unique_id": "macro.dbt.default__create_indexes", "macro_sql": "{% macro default__create_indexes(relation) -%}\n {%- set _indexes = config.get('indexes', default=[]) -%}\n\n {% for _index_dict in _indexes %}\n {% set create_index_sql = get_create_index_sql(relation, _index_dict) %}\n {% if create_index_sql %}\n {% do run_query(create_index_sql) %}\n {% endif %}\n {% endfor %}\n{% endmacro %}", "depends_on": {"macros": ["macro.dbt.get_create_index_sql", "macro.dbt.run_query"]}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973266.9028373, "supported_languages": null}, "macro.dbt.get_drop_index_sql": {"name": "get_drop_index_sql", "resource_type": "macro", "package_name": "dbt", "path": "macros/adapters/indexes.sql", "original_file_path": "macros/adapters/indexes.sql", "unique_id": "macro.dbt.get_drop_index_sql", "macro_sql": "{% macro get_drop_index_sql(relation, index_name) -%}\n {{ adapter.dispatch('get_drop_index_sql', 'dbt')(relation, index_name) }}\n{%- endmacro %}", "depends_on": {"macros": ["macro.dbt.default__get_drop_index_sql"]}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973266.9032085, "supported_languages": null}, "macro.dbt.default__get_drop_index_sql": {"name": "default__get_drop_index_sql", "resource_type": "macro", "package_name": "dbt", "path": "macros/adapters/indexes.sql", "original_file_path": "macros/adapters/indexes.sql", "unique_id": "macro.dbt.default__get_drop_index_sql", "macro_sql": "{% macro default__get_drop_index_sql(relation, index_name) -%}\n {{ exceptions.raise_compiler_error(\"`get_drop_index_sql has not been implemented for this adapter.\") }}\n{%- endmacro %}", "depends_on": {"macros": []}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973266.9034855, "supported_languages": null}, "macro.dbt.get_show_indexes_sql": {"name": "get_show_indexes_sql", "resource_type": "macro", "package_name": "dbt", "path": "macros/adapters/indexes.sql", 
"original_file_path": "macros/adapters/indexes.sql", "unique_id": "macro.dbt.get_show_indexes_sql", "macro_sql": "{% macro get_show_indexes_sql(relation) -%}\n {{ adapter.dispatch('get_show_indexes_sql', 'dbt')(relation) }}\n{%- endmacro %}", "depends_on": {"macros": ["macro.dbt.default__get_show_indexes_sql"]}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973266.9038048, "supported_languages": null}, "macro.dbt.default__get_show_indexes_sql": {"name": "default__get_show_indexes_sql", "resource_type": "macro", "package_name": "dbt", "path": "macros/adapters/indexes.sql", "original_file_path": "macros/adapters/indexes.sql", "unique_id": "macro.dbt.default__get_show_indexes_sql", "macro_sql": "{% macro default__get_show_indexes_sql(relation) -%}\n {{ exceptions.raise_compiler_error(\"`get_show_indexes_sql has not been implemented for this adapter.\") }}\n{%- endmacro %}", "depends_on": {"macros": []}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973266.9040606, "supported_languages": null}, "macro.dbt.copy_grants": {"name": "copy_grants", "resource_type": "macro", "package_name": "dbt", "path": "macros/adapters/apply_grants.sql", "original_file_path": "macros/adapters/apply_grants.sql", "unique_id": "macro.dbt.copy_grants", "macro_sql": "{% macro copy_grants() %}\n {{ return(adapter.dispatch('copy_grants', 'dbt')()) }}\n{% endmacro %}", "depends_on": {"macros": ["macro.dbt_snowflake.snowflake__copy_grants"]}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973266.906782, "supported_languages": null}, "macro.dbt.default__copy_grants": {"name": "default__copy_grants", "resource_type": "macro", "package_name": "dbt", "path": "macros/adapters/apply_grants.sql", "original_file_path": "macros/adapters/apply_grants.sql", "unique_id": "macro.dbt.default__copy_grants", "macro_sql": "{% macro default__copy_grants() %}\n {{ return(True) }}\n{% endmacro %}", "depends_on": {"macros": []}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973266.9070253, "supported_languages": null}, "macro.dbt.support_multiple_grantees_per_dcl_statement": {"name": "support_multiple_grantees_per_dcl_statement", "resource_type": "macro", "package_name": "dbt", "path": "macros/adapters/apply_grants.sql", "original_file_path": "macros/adapters/apply_grants.sql", "unique_id": "macro.dbt.support_multiple_grantees_per_dcl_statement", "macro_sql": "{% macro support_multiple_grantees_per_dcl_statement() %}\n {{ return(adapter.dispatch('support_multiple_grantees_per_dcl_statement', 'dbt')()) }}\n{% endmacro %}", "depends_on": {"macros": ["macro.dbt_snowflake.snowflake__support_multiple_grantees_per_dcl_statement"]}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973266.9073675, "supported_languages": null}, "macro.dbt.default__support_multiple_grantees_per_dcl_statement": {"name": "default__support_multiple_grantees_per_dcl_statement", "resource_type": "macro", "package_name": "dbt", "path": "macros/adapters/apply_grants.sql", "original_file_path": "macros/adapters/apply_grants.sql", "unique_id": "macro.dbt.default__support_multiple_grantees_per_dcl_statement", "macro_sql": "\n\n{%- macro 
default__support_multiple_grantees_per_dcl_statement() -%}\n {{ return(True) }}\n{%- endmacro -%}\n\n\n", "depends_on": {"macros": []}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973266.9076033, "supported_languages": null}, "macro.dbt.should_revoke": {"name": "should_revoke", "resource_type": "macro", "package_name": "dbt", "path": "macros/adapters/apply_grants.sql", "original_file_path": "macros/adapters/apply_grants.sql", "unique_id": "macro.dbt.should_revoke", "macro_sql": "{% macro should_revoke(existing_relation, full_refresh_mode=True) %}\n\n {% if not existing_relation %}\n {#-- The table doesn't already exist, so no grants to copy over --#}\n {{ return(False) }}\n {% elif full_refresh_mode %}\n {#-- The object is being REPLACED -- whether grants are copied over depends on the value of user config --#}\n {{ return(copy_grants()) }}\n {% else %}\n {#-- The table is being merged/upserted/inserted -- grants will be carried over --#}\n {{ return(True) }}\n {% endif %}\n\n{% endmacro %}", "depends_on": {"macros": ["macro.dbt.copy_grants"]}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973266.9082973, "supported_languages": null}, "macro.dbt.get_show_grant_sql": {"name": "get_show_grant_sql", "resource_type": "macro", "package_name": "dbt", "path": "macros/adapters/apply_grants.sql", "original_file_path": "macros/adapters/apply_grants.sql", "unique_id": "macro.dbt.get_show_grant_sql", "macro_sql": "{% macro get_show_grant_sql(relation) %}\n {{ return(adapter.dispatch(\"get_show_grant_sql\", \"dbt\")(relation)) }}\n{% endmacro %}", "depends_on": {"macros": ["macro.dbt.default__get_show_grant_sql"]}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973266.9086998, "supported_languages": null}, "macro.dbt.default__get_show_grant_sql": {"name": "default__get_show_grant_sql", "resource_type": "macro", "package_name": "dbt", "path": "macros/adapters/apply_grants.sql", "original_file_path": "macros/adapters/apply_grants.sql", "unique_id": "macro.dbt.default__get_show_grant_sql", "macro_sql": "{% macro default__get_show_grant_sql(relation) %}\n show grants on {{ relation }}\n{% endmacro %}", "depends_on": {"macros": []}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973266.9089239, "supported_languages": null}, "macro.dbt.get_grant_sql": {"name": "get_grant_sql", "resource_type": "macro", "package_name": "dbt", "path": "macros/adapters/apply_grants.sql", "original_file_path": "macros/adapters/apply_grants.sql", "unique_id": "macro.dbt.get_grant_sql", "macro_sql": "{% macro get_grant_sql(relation, privilege, grantees) %}\n {{ return(adapter.dispatch('get_grant_sql', 'dbt')(relation, privilege, grantees)) }}\n{% endmacro %}", "depends_on": {"macros": ["macro.dbt.default__get_grant_sql"]}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973266.9093847, "supported_languages": null}, "macro.dbt.default__get_grant_sql": {"name": "default__get_grant_sql", "resource_type": "macro", "package_name": "dbt", "path": "macros/adapters/apply_grants.sql", "original_file_path": "macros/adapters/apply_grants.sql", "unique_id": "macro.dbt.default__get_grant_sql", "macro_sql": "\n\n{%- macro 
default__get_grant_sql(relation, privilege, grantees) -%}\n grant {{ privilege }} on {{ relation }} to {{ grantees | join(', ') }}\n{%- endmacro -%}\n\n\n", "depends_on": {"macros": []}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973266.909781, "supported_languages": null}, "macro.dbt.get_revoke_sql": {"name": "get_revoke_sql", "resource_type": "macro", "package_name": "dbt", "path": "macros/adapters/apply_grants.sql", "original_file_path": "macros/adapters/apply_grants.sql", "unique_id": "macro.dbt.get_revoke_sql", "macro_sql": "{% macro get_revoke_sql(relation, privilege, grantees) %}\n {{ return(adapter.dispatch('get_revoke_sql', 'dbt')(relation, privilege, grantees)) }}\n{% endmacro %}", "depends_on": {"macros": ["macro.dbt.default__get_revoke_sql"]}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973266.9106095, "supported_languages": null}, "macro.dbt.default__get_revoke_sql": {"name": "default__get_revoke_sql", "resource_type": "macro", "package_name": "dbt", "path": "macros/adapters/apply_grants.sql", "original_file_path": "macros/adapters/apply_grants.sql", "unique_id": "macro.dbt.default__get_revoke_sql", "macro_sql": "\n\n{%- macro default__get_revoke_sql(relation, privilege, grantees) -%}\n revoke {{ privilege }} on {{ relation }} from {{ grantees | join(', ') }}\n{%- endmacro -%}\n\n\n", "depends_on": {"macros": []}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973266.9109967, "supported_languages": null}, "macro.dbt.get_dcl_statement_list": {"name": "get_dcl_statement_list", "resource_type": "macro", "package_name": "dbt", "path": "macros/adapters/apply_grants.sql", "original_file_path": "macros/adapters/apply_grants.sql", "unique_id": "macro.dbt.get_dcl_statement_list", "macro_sql": "{% macro get_dcl_statement_list(relation, grant_config, get_dcl_macro) %}\n {{ return(adapter.dispatch('get_dcl_statement_list', 'dbt')(relation, grant_config, get_dcl_macro)) }}\n{% endmacro %}", "depends_on": {"macros": ["macro.dbt.default__get_dcl_statement_list"]}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973266.9114535, "supported_languages": null}, "macro.dbt.default__get_dcl_statement_list": {"name": "default__get_dcl_statement_list", "resource_type": "macro", "package_name": "dbt", "path": "macros/adapters/apply_grants.sql", "original_file_path": "macros/adapters/apply_grants.sql", "unique_id": "macro.dbt.default__get_dcl_statement_list", "macro_sql": "\n\n{%- macro default__get_dcl_statement_list(relation, grant_config, get_dcl_macro) -%}\n {#\n -- Unpack grant_config into specific privileges and the set of users who need them granted/revoked.\n -- Depending on whether this database supports multiple grantees per statement, pass in the list of\n -- all grantees per privilege, or (if not) template one statement per privilege-grantee pair.\n -- `get_dcl_macro` will be either `get_grant_sql` or `get_revoke_sql`\n #}\n {%- set dcl_statements = [] -%}\n {%- for privilege, grantees in grant_config.items() %}\n {%- if support_multiple_grantees_per_dcl_statement() and grantees -%}\n {%- set dcl = get_dcl_macro(relation, privilege, grantees) -%}\n {%- do dcl_statements.append(dcl) -%}\n {%- else -%}\n {%- for grantee in grantees -%}\n {% set dcl = 
get_dcl_macro(relation, privilege, [grantee]) %}\n {%- do dcl_statements.append(dcl) -%}\n {% endfor -%}\n {%- endif -%}\n {%- endfor -%}\n {{ return(dcl_statements) }}\n{%- endmacro %}", "depends_on": {"macros": ["macro.dbt.support_multiple_grantees_per_dcl_statement"]}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973266.9128199, "supported_languages": null}, "macro.dbt.call_dcl_statements": {"name": "call_dcl_statements", "resource_type": "macro", "package_name": "dbt", "path": "macros/adapters/apply_grants.sql", "original_file_path": "macros/adapters/apply_grants.sql", "unique_id": "macro.dbt.call_dcl_statements", "macro_sql": "{% macro call_dcl_statements(dcl_statement_list) %}\n {{ return(adapter.dispatch(\"call_dcl_statements\", \"dbt\")(dcl_statement_list)) }}\n{% endmacro %}", "depends_on": {"macros": ["macro.dbt.default__call_dcl_statements"]}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973266.9132051, "supported_languages": null}, "macro.dbt.default__call_dcl_statements": {"name": "default__call_dcl_statements", "resource_type": "macro", "package_name": "dbt", "path": "macros/adapters/apply_grants.sql", "original_file_path": "macros/adapters/apply_grants.sql", "unique_id": "macro.dbt.default__call_dcl_statements", "macro_sql": "{% macro default__call_dcl_statements(dcl_statement_list) %}\n {#\n -- By default, supply all grant + revoke statements in a single semicolon-separated block,\n -- so that they're all processed together.\n\n -- Some databases do not support this. Those adapters will need to override this macro\n -- to run each statement individually.\n #}\n {% call statement('grants') %}\n {% for dcl_statement in dcl_statement_list %}\n {{ dcl_statement }};\n {% endfor %}\n {% endcall %}\n{% endmacro %}", "depends_on": {"macros": ["macro.dbt.statement"]}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973266.913712, "supported_languages": null}, "macro.dbt.apply_grants": {"name": "apply_grants", "resource_type": "macro", "package_name": "dbt", "path": "macros/adapters/apply_grants.sql", "original_file_path": "macros/adapters/apply_grants.sql", "unique_id": "macro.dbt.apply_grants", "macro_sql": "{% macro apply_grants(relation, grant_config, should_revoke) %}\n {{ return(adapter.dispatch(\"apply_grants\", \"dbt\")(relation, grant_config, should_revoke)) }}\n{% endmacro %}", "depends_on": {"macros": ["macro.dbt.default__apply_grants"]}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973266.9141734, "supported_languages": null}, "macro.dbt.default__apply_grants": {"name": "default__apply_grants", "resource_type": "macro", "package_name": "dbt", "path": "macros/adapters/apply_grants.sql", "original_file_path": "macros/adapters/apply_grants.sql", "unique_id": "macro.dbt.default__apply_grants", "macro_sql": "{% macro default__apply_grants(relation, grant_config, should_revoke=True) %}\n {#-- If grant_config is {} or None, this is a no-op --#}\n {% if grant_config %}\n {% if should_revoke %}\n {#-- We think previous grants may have carried over --#}\n {#-- Show current grants and calculate diffs --#}\n {% set current_grants_table = run_query(get_show_grant_sql(relation)) %}\n {% set current_grants_dict = 
adapter.standardize_grants_dict(current_grants_table) %}\n {% set needs_granting = diff_of_two_dicts(grant_config, current_grants_dict) %}\n {% set needs_revoking = diff_of_two_dicts(current_grants_dict, grant_config) %}\n {% if not (needs_granting or needs_revoking) %}\n {{ log('On ' ~ relation ~': All grants are in place, no revocation or granting needed.')}}\n {% endif %}\n {% else %}\n {#-- We don't think there's any chance of previous grants having carried over. --#}\n {#-- Jump straight to granting what the user has configured. --#}\n {% set needs_revoking = {} %}\n {% set needs_granting = grant_config %}\n {% endif %}\n {% if needs_granting or needs_revoking %}\n {% set revoke_statement_list = get_dcl_statement_list(relation, needs_revoking, get_revoke_sql) %}\n {% set grant_statement_list = get_dcl_statement_list(relation, needs_granting, get_grant_sql) %}\n {% set dcl_statement_list = revoke_statement_list + grant_statement_list %}\n {% if dcl_statement_list %}\n {{ call_dcl_statements(dcl_statement_list) }}\n {% endif %}\n {% endif %}\n {% endif %}\n{% endmacro %}", "depends_on": {"macros": ["macro.dbt.run_query", "macro.dbt.get_show_grant_sql", "macro.dbt.get_dcl_statement_list", "macro.dbt.call_dcl_statements"]}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973266.9164412, "supported_languages": null}, "macro.dbt.default__test_not_null": {"name": "default__test_not_null", "resource_type": "macro", "package_name": "dbt", "path": "macros/generic_test_sql/not_null.sql", "original_file_path": "macros/generic_test_sql/not_null.sql", "unique_id": "macro.dbt.default__test_not_null", "macro_sql": "{% macro default__test_not_null(model, column_name) %}\n\n{% set column_list = '*' if should_store_failures() else column_name %}\n\nselect {{ column_list }}\nfrom {{ model }}\nwhere {{ column_name }} is null\n\n{% endmacro %}", "depends_on": {"macros": ["macro.dbt.should_store_failures"]}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973266.9170747, "supported_languages": null}, "macro.dbt.default__test_unique": {"name": "default__test_unique", "resource_type": "macro", "package_name": "dbt", "path": "macros/generic_test_sql/unique.sql", "original_file_path": "macros/generic_test_sql/unique.sql", "unique_id": "macro.dbt.default__test_unique", "macro_sql": "{% macro default__test_unique(model, column_name) %}\n\nselect\n {{ column_name }} as unique_field,\n count(*) as n_records\n\nfrom {{ model }}\nwhere {{ column_name }} is not null\ngroup by {{ column_name }}\nhaving count(*) > 1\n\n{% endmacro %}", "depends_on": {"macros": []}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973266.9175532, "supported_languages": null}, "macro.dbt.default__test_accepted_values": {"name": "default__test_accepted_values", "resource_type": "macro", "package_name": "dbt", "path": "macros/generic_test_sql/accepted_values.sql", "original_file_path": "macros/generic_test_sql/accepted_values.sql", "unique_id": "macro.dbt.default__test_accepted_values", "macro_sql": "{% macro default__test_accepted_values(model, column_name, values, quote=True) %}\n\nwith all_values as (\n\n select\n {{ column_name }} as value_field,\n count(*) as n_records\n\n from {{ model }}\n group by {{ column_name }}\n\n)\n\nselect *\nfrom all_values\nwhere value_field not in (\n {% for value 
in values -%}\n {% if quote -%}\n '{{ value }}'\n {%- else -%}\n {{ value }}\n {%- endif -%}\n {%- if not loop.last -%},{%- endif %}\n {%- endfor %}\n)\n\n{% endmacro %}", "depends_on": {"macros": []}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973266.9186184, "supported_languages": null}, "macro.dbt.default__test_relationships": {"name": "default__test_relationships", "resource_type": "macro", "package_name": "dbt", "path": "macros/generic_test_sql/relationships.sql", "original_file_path": "macros/generic_test_sql/relationships.sql", "unique_id": "macro.dbt.default__test_relationships", "macro_sql": "{% macro default__test_relationships(model, column_name, to, field) %}\n\nwith child as (\n select {{ column_name }} as from_field\n from {{ model }}\n where {{ column_name }} is not null\n),\n\nparent as (\n select {{ field }} as to_field\n from {{ to }}\n)\n\nselect\n from_field\n\nfrom child\nleft join parent\n on child.from_field = parent.to_field\n\nwhere parent.to_field is null\n\n{% endmacro %}", "depends_on": {"macros": []}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973266.9192479, "supported_languages": null}, "macro.dbt.set_sql_header": {"name": "set_sql_header", "resource_type": "macro", "package_name": "dbt", "path": "macros/materializations/configs.sql", "original_file_path": "macros/materializations/configs.sql", "unique_id": "macro.dbt.set_sql_header", "macro_sql": "{% macro set_sql_header(config) -%}\n {{ config.set('sql_header', caller()) }}\n{%- endmacro %}", "depends_on": {"macros": []}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973266.9199152, "supported_languages": null}, "macro.dbt.should_full_refresh": {"name": "should_full_refresh", "resource_type": "macro", "package_name": "dbt", "path": "macros/materializations/configs.sql", "original_file_path": "macros/materializations/configs.sql", "unique_id": "macro.dbt.should_full_refresh", "macro_sql": "{% macro should_full_refresh() %}\n {% set config_full_refresh = config.get('full_refresh') %}\n {% if config_full_refresh is none %}\n {% set config_full_refresh = flags.FULL_REFRESH %}\n {% endif %}\n {% do return(config_full_refresh) %}\n{% endmacro %}", "depends_on": {"macros": []}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973266.9205437, "supported_languages": null}, "macro.dbt.should_store_failures": {"name": "should_store_failures", "resource_type": "macro", "package_name": "dbt", "path": "macros/materializations/configs.sql", "original_file_path": "macros/materializations/configs.sql", "unique_id": "macro.dbt.should_store_failures", "macro_sql": "{% macro should_store_failures() %}\n {% set config_store_failures = config.get('store_failures') %}\n {% if config_store_failures is none %}\n {% set config_store_failures = flags.STORE_FAILURES %}\n {% endif %}\n {% do return(config_store_failures) %}\n{% endmacro %}", "depends_on": {"macros": []}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973266.9211905, "supported_languages": null}, "macro.dbt.run_hooks": {"name": "run_hooks", "resource_type": "macro", "package_name": "dbt", "path": "macros/materializations/hooks.sql", "original_file_path": 
"macros/materializations/hooks.sql", "unique_id": "macro.dbt.run_hooks", "macro_sql": "{% macro run_hooks(hooks, inside_transaction=True) %}\n {% for hook in hooks | selectattr('transaction', 'equalto', inside_transaction) %}\n {% if not inside_transaction and loop.first %}\n {% call statement(auto_begin=inside_transaction) %}\n commit;\n {% endcall %}\n {% endif %}\n {% set rendered = render(hook.get('sql')) | trim %}\n {% if (rendered | length) > 0 %}\n {% call statement(auto_begin=inside_transaction) %}\n {{ rendered }}\n {% endcall %}\n {% endif %}\n {% endfor %}\n{% endmacro %}", "depends_on": {"macros": ["macro.dbt.statement"]}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973266.9231923, "supported_languages": null}, "macro.dbt.make_hook_config": {"name": "make_hook_config", "resource_type": "macro", "package_name": "dbt", "path": "macros/materializations/hooks.sql", "original_file_path": "macros/materializations/hooks.sql", "unique_id": "macro.dbt.make_hook_config", "macro_sql": "{% macro make_hook_config(sql, inside_transaction) %}\n {{ tojson({\"sql\": sql, \"transaction\": inside_transaction}) }}\n{% endmacro %}", "depends_on": {"macros": []}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973266.9235904, "supported_languages": null}, "macro.dbt.before_begin": {"name": "before_begin", "resource_type": "macro", "package_name": "dbt", "path": "macros/materializations/hooks.sql", "original_file_path": "macros/materializations/hooks.sql", "unique_id": "macro.dbt.before_begin", "macro_sql": "{% macro before_begin(sql) %}\n {{ make_hook_config(sql, inside_transaction=False) }}\n{% endmacro %}", "depends_on": {"macros": ["macro.dbt.make_hook_config"]}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973266.9239008, "supported_languages": null}, "macro.dbt.in_transaction": {"name": "in_transaction", "resource_type": "macro", "package_name": "dbt", "path": "macros/materializations/hooks.sql", "original_file_path": "macros/materializations/hooks.sql", "unique_id": "macro.dbt.in_transaction", "macro_sql": "{% macro in_transaction(sql) %}\n {{ make_hook_config(sql, inside_transaction=True) }}\n{% endmacro %}", "depends_on": {"macros": ["macro.dbt.make_hook_config"]}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973266.924194, "supported_languages": null}, "macro.dbt.after_commit": {"name": "after_commit", "resource_type": "macro", "package_name": "dbt", "path": "macros/materializations/hooks.sql", "original_file_path": "macros/materializations/hooks.sql", "unique_id": "macro.dbt.after_commit", "macro_sql": "{% macro after_commit(sql) %}\n {{ make_hook_config(sql, inside_transaction=False) }}\n{% endmacro %}", "depends_on": {"macros": ["macro.dbt.make_hook_config"]}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973266.9244874, "supported_languages": null}, "macro.dbt.create_or_replace_view": {"name": "create_or_replace_view", "resource_type": "macro", "package_name": "dbt", "path": "macros/materializations/models/view/create_or_replace_view.sql", "original_file_path": "macros/materializations/models/view/create_or_replace_view.sql", "unique_id": 
"macro.dbt.create_or_replace_view", "macro_sql": "{% macro create_or_replace_view() %}\n {%- set identifier = model['alias'] -%}\n\n {%- set old_relation = adapter.get_relation(database=database, schema=schema, identifier=identifier) -%}\n {%- set exists_as_view = (old_relation is not none and old_relation.is_view) -%}\n\n {%- set target_relation = api.Relation.create(\n identifier=identifier, schema=schema, database=database,\n type='view') -%}\n {% set grant_config = config.get('grants') %}\n\n {{ run_hooks(pre_hooks) }}\n\n -- If there's a table with the same name and we weren't told to full refresh,\n -- that's an error. If we were told to full refresh, drop it. This behavior differs\n -- for Snowflake and BigQuery, so multiple dispatch is used.\n {%- if old_relation is not none and old_relation.is_table -%}\n {{ handle_existing_table(should_full_refresh(), old_relation) }}\n {%- endif -%}\n\n -- build model\n {% call statement('main') -%}\n {{ get_create_view_as_sql(target_relation, sql) }}\n {%- endcall %}\n\n {% set should_revoke = should_revoke(exists_as_view, full_refresh_mode=True) %}\n {% do apply_grants(target_relation, grant_config, should_revoke=should_revoke) %}\n\n {{ run_hooks(post_hooks) }}\n\n {{ return({'relations': [target_relation]}) }}\n\n{% endmacro %}", "depends_on": {"macros": ["macro.dbt.run_hooks", "macro.dbt.handle_existing_table", "macro.dbt.should_full_refresh", "macro.dbt.statement", "macro.dbt.get_create_view_as_sql", "macro.dbt.should_revoke", "macro.dbt.apply_grants"]}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973266.9274657, "supported_languages": null}, "macro.dbt.materialization_view_default": {"name": "materialization_view_default", "resource_type": "macro", "package_name": "dbt", "path": "macros/materializations/models/view/view.sql", "original_file_path": "macros/materializations/models/view/view.sql", "unique_id": "macro.dbt.materialization_view_default", "macro_sql": "{%- materialization view, default -%}\n\n {%- set existing_relation = load_cached_relation(this) -%}\n {%- set target_relation = this.incorporate(type='view') -%}\n {%- set intermediate_relation = make_intermediate_relation(target_relation) -%}\n\n -- the intermediate_relation should not already exist in the database; get_relation\n -- will return None in that case. Otherwise, we get a relation that we can drop\n -- later, before we try to use this name for the current operation\n {%- set preexisting_intermediate_relation = load_cached_relation(intermediate_relation) -%}\n /*\n This relation (probably) doesn't exist yet. If it does exist, it's a leftover from\n a previous run, and we're going to try to drop it immediately. At the end of this\n materialization, we're going to rename the \"existing_relation\" to this identifier,\n and then we're going to drop it. In order to make sure we run the correct one of:\n - drop view ...\n - drop table ...\n\n We need to set the type of this relation to be the type of the existing_relation, if it exists,\n or else \"view\" as a sane default if it does not. Note that if the existing_relation does not\n exist, then there is nothing to move out of the way and subsequentally drop. 
In that case,\n this relation will be effectively unused.\n */\n {%- set backup_relation_type = 'view' if existing_relation is none else existing_relation.type -%}\n {%- set backup_relation = make_backup_relation(target_relation, backup_relation_type) -%}\n -- as above, the backup_relation should not already exist\n {%- set preexisting_backup_relation = load_cached_relation(backup_relation) -%}\n -- grab current tables grants config for comparision later on\n {% set grant_config = config.get('grants') %}\n\n {{ run_hooks(pre_hooks, inside_transaction=False) }}\n\n -- drop the temp relations if they exist already in the database\n {{ drop_relation_if_exists(preexisting_intermediate_relation) }}\n {{ drop_relation_if_exists(preexisting_backup_relation) }}\n\n -- `BEGIN` happens here:\n {{ run_hooks(pre_hooks, inside_transaction=True) }}\n\n -- build model\n {% call statement('main') -%}\n {{ get_create_view_as_sql(intermediate_relation, sql) }}\n {%- endcall %}\n\n -- cleanup\n -- move the existing view out of the way\n {% if existing_relation is not none %}\n {{ adapter.rename_relation(existing_relation, backup_relation) }}\n {% endif %}\n {{ adapter.rename_relation(intermediate_relation, target_relation) }}\n\n {% set should_revoke = should_revoke(existing_relation, full_refresh_mode=True) %}\n {% do apply_grants(target_relation, grant_config, should_revoke=should_revoke) %}\n\n {% do persist_docs(target_relation, model) %}\n\n {{ run_hooks(post_hooks, inside_transaction=True) }}\n\n {{ adapter.commit() }}\n\n {{ drop_relation_if_exists(backup_relation) }}\n\n {{ run_hooks(post_hooks, inside_transaction=False) }}\n\n {{ return({'relations': [target_relation]}) }}\n\n{%- endmaterialization -%}", "depends_on": {"macros": ["macro.dbt.load_cached_relation", "macro.dbt.make_intermediate_relation", "macro.dbt.make_backup_relation", "macro.dbt.run_hooks", "macro.dbt.drop_relation_if_exists", "macro.dbt.statement", "macro.dbt.get_create_view_as_sql", "macro.dbt.should_revoke", "macro.dbt.apply_grants", "macro.dbt.persist_docs"]}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973266.9321566, "supported_languages": ["sql"]}, "macro.dbt.get_create_view_as_sql": {"name": "get_create_view_as_sql", "resource_type": "macro", "package_name": "dbt", "path": "macros/materializations/models/view/create_view_as.sql", "original_file_path": "macros/materializations/models/view/create_view_as.sql", "unique_id": "macro.dbt.get_create_view_as_sql", "macro_sql": "{% macro get_create_view_as_sql(relation, sql) -%}\n {{ adapter.dispatch('get_create_view_as_sql', 'dbt')(relation, sql) }}\n{%- endmacro %}", "depends_on": {"macros": ["macro.dbt.default__get_create_view_as_sql"]}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973266.9329839, "supported_languages": null}, "macro.dbt.default__get_create_view_as_sql": {"name": "default__get_create_view_as_sql", "resource_type": "macro", "package_name": "dbt", "path": "macros/materializations/models/view/create_view_as.sql", "original_file_path": "macros/materializations/models/view/create_view_as.sql", "unique_id": "macro.dbt.default__get_create_view_as_sql", "macro_sql": "{% macro default__get_create_view_as_sql(relation, sql) -%}\n {{ return(create_view_as(relation, sql)) }}\n{% endmacro %}", "depends_on": {"macros": ["macro.dbt.create_view_as"]}, "description": "", "meta": {}, "docs": {"show": true, 
"node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973266.933322, "supported_languages": null}, "macro.dbt.create_view_as": {"name": "create_view_as", "resource_type": "macro", "package_name": "dbt", "path": "macros/materializations/models/view/create_view_as.sql", "original_file_path": "macros/materializations/models/view/create_view_as.sql", "unique_id": "macro.dbt.create_view_as", "macro_sql": "{% macro create_view_as(relation, sql) -%}\n {{ adapter.dispatch('create_view_as', 'dbt')(relation, sql) }}\n{%- endmacro %}", "depends_on": {"macros": ["macro.dbt_snowflake.snowflake__create_view_as"]}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973266.9336905, "supported_languages": null}, "macro.dbt.default__create_view_as": {"name": "default__create_view_as", "resource_type": "macro", "package_name": "dbt", "path": "macros/materializations/models/view/create_view_as.sql", "original_file_path": "macros/materializations/models/view/create_view_as.sql", "unique_id": "macro.dbt.default__create_view_as", "macro_sql": "{% macro default__create_view_as(relation, sql) -%}\n {%- set sql_header = config.get('sql_header', none) -%}\n\n {{ sql_header if sql_header is not none }}\n create view {{ relation }}\n {% set contract_config = config.get('contract') %}\n {% if contract_config.enforced %}\n {{ get_assert_columns_equivalent(sql) }}\n {%- endif %}\n as (\n {{ sql }}\n );\n{%- endmacro %}", "depends_on": {"macros": ["macro.dbt.get_assert_columns_equivalent"]}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973266.9345357, "supported_languages": null}, "macro.dbt.handle_existing_table": {"name": "handle_existing_table", "resource_type": "macro", "package_name": "dbt", "path": "macros/materializations/models/view/helpers.sql", "original_file_path": "macros/materializations/models/view/helpers.sql", "unique_id": "macro.dbt.handle_existing_table", "macro_sql": "{% macro handle_existing_table(full_refresh, old_relation) %}\n {{ adapter.dispatch('handle_existing_table', 'dbt')(full_refresh, old_relation) }}\n{% endmacro %}", "depends_on": {"macros": ["macro.dbt.default__handle_existing_table"]}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973266.9350874, "supported_languages": null}, "macro.dbt.default__handle_existing_table": {"name": "default__handle_existing_table", "resource_type": "macro", "package_name": "dbt", "path": "macros/materializations/models/view/helpers.sql", "original_file_path": "macros/materializations/models/view/helpers.sql", "unique_id": "macro.dbt.default__handle_existing_table", "macro_sql": "{% macro default__handle_existing_table(full_refresh, old_relation) %}\n {{ log(\"Dropping relation \" ~ old_relation ~ \" because it is of type \" ~ old_relation.type) }}\n {{ adapter.drop_relation(old_relation) }}\n{% endmacro %}", "depends_on": {"macros": []}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973266.9355285, "supported_languages": null}, "macro.dbt.create_or_replace_clone": {"name": "create_or_replace_clone", "resource_type": "macro", "package_name": "dbt", "path": "macros/materializations/models/clone/create_or_replace_clone.sql", "original_file_path": "macros/materializations/models/clone/create_or_replace_clone.sql", 
"unique_id": "macro.dbt.create_or_replace_clone", "macro_sql": "{% macro create_or_replace_clone(this_relation, defer_relation) %}\n {{ return(adapter.dispatch('create_or_replace_clone', 'dbt')(this_relation, defer_relation)) }}\n{% endmacro %}", "depends_on": {"macros": ["macro.dbt_snowflake.snowflake__create_or_replace_clone"]}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973266.9360857, "supported_languages": null}, "macro.dbt.default__create_or_replace_clone": {"name": "default__create_or_replace_clone", "resource_type": "macro", "package_name": "dbt", "path": "macros/materializations/models/clone/create_or_replace_clone.sql", "original_file_path": "macros/materializations/models/clone/create_or_replace_clone.sql", "unique_id": "macro.dbt.default__create_or_replace_clone", "macro_sql": "{% macro default__create_or_replace_clone(this_relation, defer_relation) %}\n create or replace table {{ this_relation }} clone {{ defer_relation }}\n{% endmacro %}", "depends_on": {"macros": []}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973266.9363585, "supported_languages": null}, "macro.dbt.materialization_clone_default": {"name": "materialization_clone_default", "resource_type": "macro", "package_name": "dbt", "path": "macros/materializations/models/clone/clone.sql", "original_file_path": "macros/materializations/models/clone/clone.sql", "unique_id": "macro.dbt.materialization_clone_default", "macro_sql": "{%- materialization clone, default -%}\n\n {%- set relations = {'relations': []} -%}\n\n {%- if not defer_relation -%}\n -- nothing to do\n {{ log(\"No relation found in state manifest for \" ~ model.unique_id, info=True) }}\n {{ return(relations) }}\n {%- endif -%}\n\n {%- set existing_relation = load_cached_relation(this) -%}\n\n {%- if existing_relation and not flags.FULL_REFRESH -%}\n -- noop!\n {{ log(\"Relation \" ~ existing_relation ~ \" already exists\", info=True) }}\n {{ return(relations) }}\n {%- endif -%}\n\n {%- set other_existing_relation = load_cached_relation(defer_relation) -%}\n\n -- If this is a database that can do zero-copy cloning of tables, and the other relation is a table, then this will be a table\n -- Otherwise, this will be a view\n\n {% set can_clone_table = can_clone_table() %}\n\n {%- if other_existing_relation and other_existing_relation.type == 'table' and can_clone_table -%}\n\n {%- set target_relation = this.incorporate(type='table') -%}\n {% if existing_relation is not none and not existing_relation.is_table %}\n {{ log(\"Dropping relation \" ~ existing_relation ~ \" because it is of type \" ~ existing_relation.type) }}\n {{ drop_relation_if_exists(existing_relation) }}\n {% endif %}\n\n -- as a general rule, data platforms that can clone tables can also do atomic 'create or replace'\n {% call statement('main') %}\n {{ create_or_replace_clone(target_relation, defer_relation) }}\n {% endcall %}\n\n {% set should_revoke = should_revoke(existing_relation, full_refresh_mode=True) %}\n {% do apply_grants(target_relation, grant_config, should_revoke=should_revoke) %}\n {% do persist_docs(target_relation, model) %}\n\n {{ return({'relations': [target_relation]}) }}\n\n {%- else -%}\n\n {%- set target_relation = this.incorporate(type='view') -%}\n\n -- reuse the view materialization\n -- TODO: support actual dispatch for materialization macros\n -- Tracking ticket: 
https://github.com/dbt-labs/dbt-core/issues/7799\n {% set search_name = \"materialization_view_\" ~ adapter.type() %}\n {% if not search_name in context %}\n {% set search_name = \"materialization_view_default\" %}\n {% endif %}\n {% set materialization_macro = context[search_name] %}\n {% set relations = materialization_macro() %}\n {{ return(relations) }}\n\n {%- endif -%}\n\n{%- endmaterialization -%}", "depends_on": {"macros": ["macro.dbt.load_cached_relation", "macro.dbt.can_clone_table", "macro.dbt.drop_relation_if_exists", "macro.dbt.statement", "macro.dbt.create_or_replace_clone", "macro.dbt.should_revoke", "macro.dbt.apply_grants", "macro.dbt.persist_docs"]}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973266.9420948, "supported_languages": ["sql"]}, "macro.dbt.can_clone_table": {"name": "can_clone_table", "resource_type": "macro", "package_name": "dbt", "path": "macros/materializations/models/clone/can_clone_table.sql", "original_file_path": "macros/materializations/models/clone/can_clone_table.sql", "unique_id": "macro.dbt.can_clone_table", "macro_sql": "{% macro can_clone_table() %}\n {{ return(adapter.dispatch('can_clone_table', 'dbt')()) }}\n{% endmacro %}", "depends_on": {"macros": ["macro.dbt_snowflake.snowflake__can_clone_table"]}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973266.942592, "supported_languages": null}, "macro.dbt.default__can_clone_table": {"name": "default__can_clone_table", "resource_type": "macro", "package_name": "dbt", "path": "macros/materializations/models/clone/can_clone_table.sql", "original_file_path": "macros/materializations/models/clone/can_clone_table.sql", "unique_id": "macro.dbt.default__can_clone_table", "macro_sql": "{% macro default__can_clone_table() %}\n {{ return(False) }}\n{% endmacro %}", "depends_on": {"macros": []}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973266.9428346, "supported_languages": null}, "macro.dbt.incremental_validate_on_schema_change": {"name": "incremental_validate_on_schema_change", "resource_type": "macro", "package_name": "dbt", "path": "macros/materializations/models/incremental/on_schema_change.sql", "original_file_path": "macros/materializations/models/incremental/on_schema_change.sql", "unique_id": "macro.dbt.incremental_validate_on_schema_change", "macro_sql": "{% macro incremental_validate_on_schema_change(on_schema_change, default='ignore') %}\n\n {% if on_schema_change not in ['sync_all_columns', 'append_new_columns', 'fail', 'ignore'] %}\n\n {% set log_message = 'Invalid value for on_schema_change (%s) specified. Setting default value of %s.' 
% (on_schema_change, default) %}\n {% do log(log_message) %}\n\n {{ return(default) }}\n\n {% else %}\n\n {{ return(on_schema_change) }}\n\n {% endif %}\n\n{% endmacro %}", "depends_on": {"macros": []}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973266.9507258, "supported_languages": null}, "macro.dbt.check_for_schema_changes": {"name": "check_for_schema_changes", "resource_type": "macro", "package_name": "dbt", "path": "macros/materializations/models/incremental/on_schema_change.sql", "original_file_path": "macros/materializations/models/incremental/on_schema_change.sql", "unique_id": "macro.dbt.check_for_schema_changes", "macro_sql": "{% macro check_for_schema_changes(source_relation, target_relation) %}\n\n {% set schema_changed = False %}\n\n {%- set source_columns = adapter.get_columns_in_relation(source_relation) -%}\n {%- set target_columns = adapter.get_columns_in_relation(target_relation) -%}\n {%- set source_not_in_target = diff_columns(source_columns, target_columns) -%}\n {%- set target_not_in_source = diff_columns(target_columns, source_columns) -%}\n\n {% set new_target_types = diff_column_data_types(source_columns, target_columns) %}\n\n {% if source_not_in_target != [] %}\n {% set schema_changed = True %}\n {% elif target_not_in_source != [] or new_target_types != [] %}\n {% set schema_changed = True %}\n {% elif new_target_types != [] %}\n {% set schema_changed = True %}\n {% endif %}\n\n {% set changes_dict = {\n 'schema_changed': schema_changed,\n 'source_not_in_target': source_not_in_target,\n 'target_not_in_source': target_not_in_source,\n 'source_columns': source_columns,\n 'target_columns': target_columns,\n 'new_target_types': new_target_types\n } %}\n\n {% set msg %}\n In {{ target_relation }}:\n Schema changed: {{ schema_changed }}\n Source columns not in target: {{ source_not_in_target }}\n Target columns not in source: {{ target_not_in_source }}\n New column types: {{ new_target_types }}\n {% endset %}\n\n {% do log(msg) %}\n\n {{ return(changes_dict) }}\n\n{% endmacro %}", "depends_on": {"macros": ["macro.dbt.diff_columns", "macro.dbt.diff_column_data_types"]}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973266.9531844, "supported_languages": null}, "macro.dbt.sync_column_schemas": {"name": "sync_column_schemas", "resource_type": "macro", "package_name": "dbt", "path": "macros/materializations/models/incremental/on_schema_change.sql", "original_file_path": "macros/materializations/models/incremental/on_schema_change.sql", "unique_id": "macro.dbt.sync_column_schemas", "macro_sql": "{% macro sync_column_schemas(on_schema_change, target_relation, schema_changes_dict) %}\n\n {%- set add_to_target_arr = schema_changes_dict['source_not_in_target'] -%}\n\n {%- if on_schema_change == 'append_new_columns'-%}\n {%- if add_to_target_arr | length > 0 -%}\n {%- do alter_relation_add_remove_columns(target_relation, add_to_target_arr, none) -%}\n {%- endif -%}\n\n {% elif on_schema_change == 'sync_all_columns' %}\n {%- set remove_from_target_arr = schema_changes_dict['target_not_in_source'] -%}\n {%- set new_target_types = schema_changes_dict['new_target_types'] -%}\n\n {% if add_to_target_arr | length > 0 or remove_from_target_arr | length > 0 %}\n {%- do alter_relation_add_remove_columns(target_relation, add_to_target_arr, remove_from_target_arr) -%}\n {% endif %}\n\n {% if new_target_types != [] %}\n {% 
for ntt in new_target_types %}\n {% set column_name = ntt['column_name'] %}\n {% set new_type = ntt['new_type'] %}\n {% do alter_column_type(target_relation, column_name, new_type) %}\n {% endfor %}\n {% endif %}\n\n {% endif %}\n\n {% set schema_change_message %}\n In {{ target_relation }}:\n Schema change approach: {{ on_schema_change }}\n Columns added: {{ add_to_target_arr }}\n Columns removed: {{ remove_from_target_arr }}\n Data types changed: {{ new_target_types }}\n {% endset %}\n\n {% do log(schema_change_message) %}\n\n{% endmacro %}", "depends_on": {"macros": ["macro.dbt.alter_relation_add_remove_columns", "macro.dbt.alter_column_type"]}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973266.9556718, "supported_languages": null}, "macro.dbt.process_schema_changes": {"name": "process_schema_changes", "resource_type": "macro", "package_name": "dbt", "path": "macros/materializations/models/incremental/on_schema_change.sql", "original_file_path": "macros/materializations/models/incremental/on_schema_change.sql", "unique_id": "macro.dbt.process_schema_changes", "macro_sql": "{% macro process_schema_changes(on_schema_change, source_relation, target_relation) %}\n\n {% if on_schema_change == 'ignore' %}\n\n {{ return({}) }}\n\n {% else %}\n\n {% set schema_changes_dict = check_for_schema_changes(source_relation, target_relation) %}\n\n {% if schema_changes_dict['schema_changed'] %}\n\n {% if on_schema_change == 'fail' %}\n\n {% set fail_msg %}\n The source and target schemas on this incremental model are out of sync!\n They can be reconciled in several ways:\n - set the `on_schema_change` config to either append_new_columns or sync_all_columns, depending on your situation.\n - Re-run the incremental model with `full_refresh: True` to update the target schema.\n - update the schema manually and re-run the process.\n\n Additional troubleshooting context:\n Source columns not in target: {{ schema_changes_dict['source_not_in_target'] }}\n Target columns not in source: {{ schema_changes_dict['target_not_in_source'] }}\n New column types: {{ schema_changes_dict['new_target_types'] }}\n {% endset %}\n\n {% do exceptions.raise_compiler_error(fail_msg) %}\n\n {# -- unless we ignore, run the sync operation per the config #}\n {% else %}\n\n {% do sync_column_schemas(on_schema_change, target_relation, schema_changes_dict) %}\n\n {% endif %}\n\n {% endif %}\n\n {{ return(schema_changes_dict['source_columns']) }}\n\n {% endif %}\n\n{% endmacro %}", "depends_on": {"macros": ["macro.dbt.check_for_schema_changes", "macro.dbt.sync_column_schemas"]}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973266.957388, "supported_languages": null}, "macro.dbt.materialization_incremental_default": {"name": "materialization_incremental_default", "resource_type": "macro", "package_name": "dbt", "path": "macros/materializations/models/incremental/incremental.sql", "original_file_path": "macros/materializations/models/incremental/incremental.sql", "unique_id": "macro.dbt.materialization_incremental_default", "macro_sql": "{% materialization incremental, default -%}\n\n -- relations\n {%- set existing_relation = load_cached_relation(this) -%}\n {%- set target_relation = this.incorporate(type='table') -%}\n {%- set temp_relation = make_temp_relation(target_relation)-%}\n {%- set intermediate_relation = make_intermediate_relation(target_relation)-%}\n {%- 
set backup_relation_type = 'table' if existing_relation is none else existing_relation.type -%}\n {%- set backup_relation = make_backup_relation(target_relation, backup_relation_type) -%}\n\n -- configs\n {%- set unique_key = config.get('unique_key') -%}\n {%- set full_refresh_mode = (should_full_refresh() or existing_relation.is_view) -%}\n {%- set on_schema_change = incremental_validate_on_schema_change(config.get('on_schema_change'), default='ignore') -%}\n\n -- the temp_ and backup_ relations should not already exist in the database; get_relation\n -- will return None in that case. Otherwise, we get a relation that we can drop\n -- later, before we try to use this name for the current operation. This has to happen before\n -- BEGIN, in a separate transaction\n {%- set preexisting_intermediate_relation = load_cached_relation(intermediate_relation)-%}\n {%- set preexisting_backup_relation = load_cached_relation(backup_relation) -%}\n -- grab current tables grants config for comparision later on\n {% set grant_config = config.get('grants') %}\n {{ drop_relation_if_exists(preexisting_intermediate_relation) }}\n {{ drop_relation_if_exists(preexisting_backup_relation) }}\n\n {{ run_hooks(pre_hooks, inside_transaction=False) }}\n\n -- `BEGIN` happens here:\n {{ run_hooks(pre_hooks, inside_transaction=True) }}\n\n {% set to_drop = [] %}\n\n {% if existing_relation is none %}\n {% set build_sql = get_create_table_as_sql(False, target_relation, sql) %}\n {% elif full_refresh_mode %}\n {% set build_sql = get_create_table_as_sql(False, intermediate_relation, sql) %}\n {% set need_swap = true %}\n {% else %}\n {% do run_query(get_create_table_as_sql(True, temp_relation, sql)) %}\n {% do adapter.expand_target_column_types(\n from_relation=temp_relation,\n to_relation=target_relation) %}\n {#-- Process schema changes. Returns dict of changes if successful. 
Use source columns for upserting/merging --#}\n {% set dest_columns = process_schema_changes(on_schema_change, temp_relation, existing_relation) %}\n {% if not dest_columns %}\n {% set dest_columns = adapter.get_columns_in_relation(existing_relation) %}\n {% endif %}\n\n {#-- Get the incremental_strategy, the macro to use for the strategy, and build the sql --#}\n {% set incremental_strategy = config.get('incremental_strategy') or 'default' %}\n {% set incremental_predicates = config.get('predicates', none) or config.get('incremental_predicates', none) %}\n {% set strategy_sql_macro_func = adapter.get_incremental_strategy_macro(context, incremental_strategy) %}\n {% set strategy_arg_dict = ({'target_relation': target_relation, 'temp_relation': temp_relation, 'unique_key': unique_key, 'dest_columns': dest_columns, 'incremental_predicates': incremental_predicates }) %}\n {% set build_sql = strategy_sql_macro_func(strategy_arg_dict) %}\n\n {% endif %}\n\n {% call statement(\"main\") %}\n {{ build_sql }}\n {% endcall %}\n\n {% if need_swap %}\n {% do adapter.rename_relation(target_relation, backup_relation) %}\n {% do adapter.rename_relation(intermediate_relation, target_relation) %}\n {% do to_drop.append(backup_relation) %}\n {% endif %}\n\n {% set should_revoke = should_revoke(existing_relation, full_refresh_mode) %}\n {% do apply_grants(target_relation, grant_config, should_revoke=should_revoke) %}\n\n {% do persist_docs(target_relation, model) %}\n\n {% if existing_relation is none or existing_relation.is_view or should_full_refresh() %}\n {% do create_indexes(target_relation) %}\n {% endif %}\n\n {{ run_hooks(post_hooks, inside_transaction=True) }}\n\n -- `COMMIT` happens here\n {% do adapter.commit() %}\n\n {% for rel in to_drop %}\n {% do adapter.drop_relation(rel) %}\n {% endfor %}\n\n {{ run_hooks(post_hooks, inside_transaction=False) }}\n\n {{ return({'relations': [target_relation]}) }}\n\n{%- endmaterialization %}", "depends_on": {"macros": ["macro.dbt.load_cached_relation", "macro.dbt.make_temp_relation", "macro.dbt.make_intermediate_relation", "macro.dbt.make_backup_relation", "macro.dbt.should_full_refresh", "macro.dbt.incremental_validate_on_schema_change", "macro.dbt.drop_relation_if_exists", "macro.dbt.run_hooks", "macro.dbt.get_create_table_as_sql", "macro.dbt.run_query", "macro.dbt.process_schema_changes", "macro.dbt.statement", "macro.dbt.should_revoke", "macro.dbt.apply_grants", "macro.dbt.persist_docs", "macro.dbt.create_indexes"]}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973266.9665484, "supported_languages": ["sql"]}, "macro.dbt.get_merge_sql": {"name": "get_merge_sql", "resource_type": "macro", "package_name": "dbt", "path": "macros/materializations/models/incremental/merge.sql", "original_file_path": "macros/materializations/models/incremental/merge.sql", "unique_id": "macro.dbt.get_merge_sql", "macro_sql": "{% macro get_merge_sql(target, source, unique_key, dest_columns, incremental_predicates=none) -%}\n -- back compat for old kwarg name\n {% set incremental_predicates = kwargs.get('predicates', incremental_predicates) %}\n {{ adapter.dispatch('get_merge_sql', 'dbt')(target, source, unique_key, dest_columns, incremental_predicates) }}\n{%- endmacro %}", "depends_on": {"macros": ["macro.dbt_snowflake.snowflake__get_merge_sql"]}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973266.9750023, 
"supported_languages": null}, "macro.dbt.default__get_merge_sql": {"name": "default__get_merge_sql", "resource_type": "macro", "package_name": "dbt", "path": "macros/materializations/models/incremental/merge.sql", "original_file_path": "macros/materializations/models/incremental/merge.sql", "unique_id": "macro.dbt.default__get_merge_sql", "macro_sql": "{% macro default__get_merge_sql(target, source, unique_key, dest_columns, incremental_predicates=none) -%}\n {%- set predicates = [] if incremental_predicates is none else [] + incremental_predicates -%}\n {%- set dest_cols_csv = get_quoted_csv(dest_columns | map(attribute=\"name\")) -%}\n {%- set merge_update_columns = config.get('merge_update_columns') -%}\n {%- set merge_exclude_columns = config.get('merge_exclude_columns') -%}\n {%- set update_columns = get_merge_update_columns(merge_update_columns, merge_exclude_columns, dest_columns) -%}\n {%- set sql_header = config.get('sql_header', none) -%}\n\n {% if unique_key %}\n {% if unique_key is sequence and unique_key is not mapping and unique_key is not string %}\n {% for key in unique_key %}\n {% set this_key_match %}\n DBT_INTERNAL_SOURCE.{{ key }} = DBT_INTERNAL_DEST.{{ key }}\n {% endset %}\n {% do predicates.append(this_key_match) %}\n {% endfor %}\n {% else %}\n {% set unique_key_match %}\n DBT_INTERNAL_SOURCE.{{ unique_key }} = DBT_INTERNAL_DEST.{{ unique_key }}\n {% endset %}\n {% do predicates.append(unique_key_match) %}\n {% endif %}\n {% else %}\n {% do predicates.append('FALSE') %}\n {% endif %}\n\n {{ sql_header if sql_header is not none }}\n\n merge into {{ target }} as DBT_INTERNAL_DEST\n using {{ source }} as DBT_INTERNAL_SOURCE\n on {{\"(\" ~ predicates | join(\") and (\") ~ \")\"}}\n\n {% if unique_key %}\n when matched then update set\n {% for column_name in update_columns -%}\n {{ column_name }} = DBT_INTERNAL_SOURCE.{{ column_name }}\n {%- if not loop.last %}, {%- endif %}\n {%- endfor %}\n {% endif %}\n\n when not matched then insert\n ({{ dest_cols_csv }})\n values\n ({{ dest_cols_csv }})\n\n{% endmacro %}", "depends_on": {"macros": ["macro.dbt.get_quoted_csv", "macro.dbt.get_merge_update_columns"]}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973266.9782264, "supported_languages": null}, "macro.dbt.get_delete_insert_merge_sql": {"name": "get_delete_insert_merge_sql", "resource_type": "macro", "package_name": "dbt", "path": "macros/materializations/models/incremental/merge.sql", "original_file_path": "macros/materializations/models/incremental/merge.sql", "unique_id": "macro.dbt.get_delete_insert_merge_sql", "macro_sql": "{% macro get_delete_insert_merge_sql(target, source, unique_key, dest_columns, incremental_predicates) -%}\n {{ adapter.dispatch('get_delete_insert_merge_sql', 'dbt')(target, source, unique_key, dest_columns, incremental_predicates) }}\n{%- endmacro %}", "depends_on": {"macros": ["macro.dbt_snowflake.snowflake__get_delete_insert_merge_sql"]}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973266.9787498, "supported_languages": null}, "macro.dbt.default__get_delete_insert_merge_sql": {"name": "default__get_delete_insert_merge_sql", "resource_type": "macro", "package_name": "dbt", "path": "macros/materializations/models/incremental/merge.sql", "original_file_path": "macros/materializations/models/incremental/merge.sql", "unique_id": "macro.dbt.default__get_delete_insert_merge_sql", 
"macro_sql": "{% macro default__get_delete_insert_merge_sql(target, source, unique_key, dest_columns, incremental_predicates) -%}\n\n {%- set dest_cols_csv = get_quoted_csv(dest_columns | map(attribute=\"name\")) -%}\n\n {% if unique_key %}\n {% if unique_key is sequence and unique_key is not string %}\n delete from {{target }}\n using {{ source }}\n where (\n {% for key in unique_key %}\n {{ source }}.{{ key }} = {{ target }}.{{ key }}\n {{ \"and \" if not loop.last}}\n {% endfor %}\n {% if incremental_predicates %}\n {% for predicate in incremental_predicates %}\n and {{ predicate }}\n {% endfor %}\n {% endif %}\n );\n {% else %}\n delete from {{ target }}\n where (\n {{ unique_key }}) in (\n select ({{ unique_key }})\n from {{ source }}\n )\n {%- if incremental_predicates %}\n {% for predicate in incremental_predicates %}\n and {{ predicate }}\n {% endfor %}\n {%- endif -%};\n\n {% endif %}\n {% endif %}\n\n insert into {{ target }} ({{ dest_cols_csv }})\n (\n select {{ dest_cols_csv }}\n from {{ source }}\n )\n\n{%- endmacro %}", "depends_on": {"macros": ["macro.dbt.get_quoted_csv"]}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973266.9807804, "supported_languages": null}, "macro.dbt.get_insert_overwrite_merge_sql": {"name": "get_insert_overwrite_merge_sql", "resource_type": "macro", "package_name": "dbt", "path": "macros/materializations/models/incremental/merge.sql", "original_file_path": "macros/materializations/models/incremental/merge.sql", "unique_id": "macro.dbt.get_insert_overwrite_merge_sql", "macro_sql": "{% macro get_insert_overwrite_merge_sql(target, source, dest_columns, predicates, include_sql_header=false) -%}\n {{ adapter.dispatch('get_insert_overwrite_merge_sql', 'dbt')(target, source, dest_columns, predicates, include_sql_header) }}\n{%- endmacro %}", "depends_on": {"macros": ["macro.dbt.default__get_insert_overwrite_merge_sql"]}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973266.9813128, "supported_languages": null}, "macro.dbt.default__get_insert_overwrite_merge_sql": {"name": "default__get_insert_overwrite_merge_sql", "resource_type": "macro", "package_name": "dbt", "path": "macros/materializations/models/incremental/merge.sql", "original_file_path": "macros/materializations/models/incremental/merge.sql", "unique_id": "macro.dbt.default__get_insert_overwrite_merge_sql", "macro_sql": "{% macro default__get_insert_overwrite_merge_sql(target, source, dest_columns, predicates, include_sql_header) -%}\n {#-- The only time include_sql_header is True: --#}\n {#-- BigQuery + insert_overwrite strategy + \"static\" partitions config --#}\n {#-- We should consider including the sql header at the materialization level instead --#}\n\n {%- set predicates = [] if predicates is none else [] + predicates -%}\n {%- set dest_cols_csv = get_quoted_csv(dest_columns | map(attribute=\"name\")) -%}\n {%- set sql_header = config.get('sql_header', none) -%}\n\n {{ sql_header if sql_header is not none and include_sql_header }}\n\n merge into {{ target }} as DBT_INTERNAL_DEST\n using {{ source }} as DBT_INTERNAL_SOURCE\n on FALSE\n\n when not matched by source\n {% if predicates %} and {{ predicates | join(' and ') }} {% endif %}\n then delete\n\n when not matched then insert\n ({{ dest_cols_csv }})\n values\n ({{ dest_cols_csv }})\n\n{% endmacro %}", "depends_on": {"macros": ["macro.dbt.get_quoted_csv"]}, "description": 
"", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973266.9825716, "supported_languages": null}, "macro.dbt.is_incremental": {"name": "is_incremental", "resource_type": "macro", "package_name": "dbt", "path": "macros/materializations/models/incremental/is_incremental.sql", "original_file_path": "macros/materializations/models/incremental/is_incremental.sql", "unique_id": "macro.dbt.is_incremental", "macro_sql": "{% macro is_incremental() %}\n {#-- do not run introspective queries in parsing #}\n {% if not execute %}\n {{ return(False) }}\n {% else %}\n {% set relation = adapter.get_relation(this.database, this.schema, this.table) %}\n {{ return(relation is not none\n and relation.type == 'table'\n and model.config.materialized == 'incremental'\n and not should_full_refresh()) }}\n {% endif %}\n{% endmacro %}", "depends_on": {"macros": ["macro.dbt.should_full_refresh"]}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973266.9837406, "supported_languages": null}, "macro.dbt.get_quoted_csv": {"name": "get_quoted_csv", "resource_type": "macro", "package_name": "dbt", "path": "macros/materializations/models/incremental/column_helpers.sql", "original_file_path": "macros/materializations/models/incremental/column_helpers.sql", "unique_id": "macro.dbt.get_quoted_csv", "macro_sql": "{% macro get_quoted_csv(column_names) %}\n\n {% set quoted = [] %}\n {% for col in column_names -%}\n {%- do quoted.append(adapter.quote(col)) -%}\n {%- endfor %}\n\n {%- set dest_cols_csv = quoted | join(', ') -%}\n {{ return(dest_cols_csv) }}\n\n{% endmacro %}", "depends_on": {"macros": []}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973266.9863975, "supported_languages": null}, "macro.dbt.diff_columns": {"name": "diff_columns", "resource_type": "macro", "package_name": "dbt", "path": "macros/materializations/models/incremental/column_helpers.sql", "original_file_path": "macros/materializations/models/incremental/column_helpers.sql", "unique_id": "macro.dbt.diff_columns", "macro_sql": "{% macro diff_columns(source_columns, target_columns) %}\n\n {% set result = [] %}\n {% set source_names = source_columns | map(attribute = 'column') | list %}\n {% set target_names = target_columns | map(attribute = 'column') | list %}\n\n {# --check whether the name attribute exists in the target - this does not perform a data type check #}\n {% for sc in source_columns %}\n {% if sc.name not in target_names %}\n {{ result.append(sc) }}\n {% endif %}\n {% endfor %}\n\n {{ return(result) }}\n\n{% endmacro %}", "depends_on": {"macros": []}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973266.9875062, "supported_languages": null}, "macro.dbt.diff_column_data_types": {"name": "diff_column_data_types", "resource_type": "macro", "package_name": "dbt", "path": "macros/materializations/models/incremental/column_helpers.sql", "original_file_path": "macros/materializations/models/incremental/column_helpers.sql", "unique_id": "macro.dbt.diff_column_data_types", "macro_sql": "{% macro diff_column_data_types(source_columns, target_columns) %}\n\n {% set result = [] %}\n {% for sc in source_columns %}\n {% set tc = target_columns | selectattr(\"name\", \"equalto\", sc.name) | list | first %}\n {% if tc %}\n {% if sc.data_type != 
tc.data_type and not sc.can_expand_to(other_column=tc) %}\n {{ result.append( { 'column_name': tc.name, 'new_type': sc.data_type } ) }}\n {% endif %}\n {% endif %}\n {% endfor %}\n\n {{ return(result) }}\n\n{% endmacro %}", "depends_on": {"macros": []}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973266.9888432, "supported_languages": null}, "macro.dbt.get_merge_update_columns": {"name": "get_merge_update_columns", "resource_type": "macro", "package_name": "dbt", "path": "macros/materializations/models/incremental/column_helpers.sql", "original_file_path": "macros/materializations/models/incremental/column_helpers.sql", "unique_id": "macro.dbt.get_merge_update_columns", "macro_sql": "{% macro get_merge_update_columns(merge_update_columns, merge_exclude_columns, dest_columns) %}\n {{ return(adapter.dispatch('get_merge_update_columns', 'dbt')(merge_update_columns, merge_exclude_columns, dest_columns)) }}\n{% endmacro %}", "depends_on": {"macros": ["macro.dbt.default__get_merge_update_columns"]}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973266.9893274, "supported_languages": null}, "macro.dbt.default__get_merge_update_columns": {"name": "default__get_merge_update_columns", "resource_type": "macro", "package_name": "dbt", "path": "macros/materializations/models/incremental/column_helpers.sql", "original_file_path": "macros/materializations/models/incremental/column_helpers.sql", "unique_id": "macro.dbt.default__get_merge_update_columns", "macro_sql": "{% macro default__get_merge_update_columns(merge_update_columns, merge_exclude_columns, dest_columns) %}\n {%- set default_cols = dest_columns | map(attribute=\"quoted\") | list -%}\n\n {%- if merge_update_columns and merge_exclude_columns -%}\n {{ exceptions.raise_compiler_error(\n 'Model cannot specify merge_update_columns and merge_exclude_columns. 
Please update model to use only one config'\n )}}\n {%- elif merge_update_columns -%}\n {%- set update_columns = merge_update_columns -%}\n {%- elif merge_exclude_columns -%}\n {%- set update_columns = [] -%}\n {%- for column in dest_columns -%}\n {% if column.column | lower not in merge_exclude_columns | map(\"lower\") | list %}\n {%- do update_columns.append(column.quoted) -%}\n {% endif %}\n {%- endfor -%}\n {%- else -%}\n {%- set update_columns = default_cols -%}\n {%- endif -%}\n\n {{ return(update_columns) }}\n\n{% endmacro %}", "depends_on": {"macros": []}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973266.9907713, "supported_languages": null}, "macro.dbt.get_incremental_append_sql": {"name": "get_incremental_append_sql", "resource_type": "macro", "package_name": "dbt", "path": "macros/materializations/models/incremental/strategies.sql", "original_file_path": "macros/materializations/models/incremental/strategies.sql", "unique_id": "macro.dbt.get_incremental_append_sql", "macro_sql": "{% macro get_incremental_append_sql(arg_dict) %}\n\n {{ return(adapter.dispatch('get_incremental_append_sql', 'dbt')(arg_dict)) }}\n\n{% endmacro %}", "depends_on": {"macros": ["macro.dbt.default__get_incremental_append_sql"]}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973266.9922652, "supported_languages": null}, "macro.dbt.default__get_incremental_append_sql": {"name": "default__get_incremental_append_sql", "resource_type": "macro", "package_name": "dbt", "path": "macros/materializations/models/incremental/strategies.sql", "original_file_path": "macros/materializations/models/incremental/strategies.sql", "unique_id": "macro.dbt.default__get_incremental_append_sql", "macro_sql": "{% macro default__get_incremental_append_sql(arg_dict) %}\n\n {% do return(get_insert_into_sql(arg_dict[\"target_relation\"], arg_dict[\"temp_relation\"], arg_dict[\"dest_columns\"])) %}\n\n{% endmacro %}", "depends_on": {"macros": ["macro.dbt.get_insert_into_sql"]}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973266.9927611, "supported_languages": null}, "macro.dbt.get_incremental_delete_insert_sql": {"name": "get_incremental_delete_insert_sql", "resource_type": "macro", "package_name": "dbt", "path": "macros/materializations/models/incremental/strategies.sql", "original_file_path": "macros/materializations/models/incremental/strategies.sql", "unique_id": "macro.dbt.get_incremental_delete_insert_sql", "macro_sql": "{% macro get_incremental_delete_insert_sql(arg_dict) %}\n\n {{ return(adapter.dispatch('get_incremental_delete_insert_sql', 'dbt')(arg_dict)) }}\n\n{% endmacro %}", "depends_on": {"macros": ["macro.dbt.default__get_incremental_delete_insert_sql"]}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973266.9931474, "supported_languages": null}, "macro.dbt.default__get_incremental_delete_insert_sql": {"name": "default__get_incremental_delete_insert_sql", "resource_type": "macro", "package_name": "dbt", "path": "macros/materializations/models/incremental/strategies.sql", "original_file_path": "macros/materializations/models/incremental/strategies.sql", "unique_id": "macro.dbt.default__get_incremental_delete_insert_sql", "macro_sql": "{% macro 
default__get_incremental_delete_insert_sql(arg_dict) %}\n\n {% do return(get_delete_insert_merge_sql(arg_dict[\"target_relation\"], arg_dict[\"temp_relation\"], arg_dict[\"unique_key\"], arg_dict[\"dest_columns\"], arg_dict[\"incremental_predicates\"])) %}\n\n{% endmacro %}", "depends_on": {"macros": ["macro.dbt.get_delete_insert_merge_sql"]}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973266.993738, "supported_languages": null}, "macro.dbt.get_incremental_merge_sql": {"name": "get_incremental_merge_sql", "resource_type": "macro", "package_name": "dbt", "path": "macros/materializations/models/incremental/strategies.sql", "original_file_path": "macros/materializations/models/incremental/strategies.sql", "unique_id": "macro.dbt.get_incremental_merge_sql", "macro_sql": "{% macro get_incremental_merge_sql(arg_dict) %}\n\n {{ return(adapter.dispatch('get_incremental_merge_sql', 'dbt')(arg_dict)) }}\n\n{% endmacro %}", "depends_on": {"macros": ["macro.dbt.default__get_incremental_merge_sql"]}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973266.9941084, "supported_languages": null}, "macro.dbt.default__get_incremental_merge_sql": {"name": "default__get_incremental_merge_sql", "resource_type": "macro", "package_name": "dbt", "path": "macros/materializations/models/incremental/strategies.sql", "original_file_path": "macros/materializations/models/incremental/strategies.sql", "unique_id": "macro.dbt.default__get_incremental_merge_sql", "macro_sql": "{% macro default__get_incremental_merge_sql(arg_dict) %}\n\n {% do return(get_merge_sql(arg_dict[\"target_relation\"], arg_dict[\"temp_relation\"], arg_dict[\"unique_key\"], arg_dict[\"dest_columns\"], arg_dict[\"incremental_predicates\"])) %}\n\n{% endmacro %}", "depends_on": {"macros": ["macro.dbt.get_merge_sql"]}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973266.9947996, "supported_languages": null}, "macro.dbt.get_incremental_insert_overwrite_sql": {"name": "get_incremental_insert_overwrite_sql", "resource_type": "macro", "package_name": "dbt", "path": "macros/materializations/models/incremental/strategies.sql", "original_file_path": "macros/materializations/models/incremental/strategies.sql", "unique_id": "macro.dbt.get_incremental_insert_overwrite_sql", "macro_sql": "{% macro get_incremental_insert_overwrite_sql(arg_dict) %}\n\n {{ return(adapter.dispatch('get_incremental_insert_overwrite_sql', 'dbt')(arg_dict)) }}\n\n{% endmacro %}", "depends_on": {"macros": ["macro.dbt.default__get_incremental_insert_overwrite_sql"]}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973266.9951732, "supported_languages": null}, "macro.dbt.default__get_incremental_insert_overwrite_sql": {"name": "default__get_incremental_insert_overwrite_sql", "resource_type": "macro", "package_name": "dbt", "path": "macros/materializations/models/incremental/strategies.sql", "original_file_path": "macros/materializations/models/incremental/strategies.sql", "unique_id": "macro.dbt.default__get_incremental_insert_overwrite_sql", "macro_sql": "{% macro default__get_incremental_insert_overwrite_sql(arg_dict) %}\n\n {% do return(get_insert_overwrite_merge_sql(arg_dict[\"target_relation\"], arg_dict[\"temp_relation\"], arg_dict[\"dest_columns\"], 
arg_dict[\"incremental_predicates\"])) %}\n\n{% endmacro %}", "depends_on": {"macros": ["macro.dbt.get_insert_overwrite_merge_sql"]}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973266.9956975, "supported_languages": null}, "macro.dbt.get_incremental_default_sql": {"name": "get_incremental_default_sql", "resource_type": "macro", "package_name": "dbt", "path": "macros/materializations/models/incremental/strategies.sql", "original_file_path": "macros/materializations/models/incremental/strategies.sql", "unique_id": "macro.dbt.get_incremental_default_sql", "macro_sql": "{% macro get_incremental_default_sql(arg_dict) %}\n\n {{ return(adapter.dispatch('get_incremental_default_sql', 'dbt')(arg_dict)) }}\n\n{% endmacro %}", "depends_on": {"macros": ["macro.dbt_snowflake.snowflake__get_incremental_default_sql"]}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973266.9960692, "supported_languages": null}, "macro.dbt.default__get_incremental_default_sql": {"name": "default__get_incremental_default_sql", "resource_type": "macro", "package_name": "dbt", "path": "macros/materializations/models/incremental/strategies.sql", "original_file_path": "macros/materializations/models/incremental/strategies.sql", "unique_id": "macro.dbt.default__get_incremental_default_sql", "macro_sql": "{% macro default__get_incremental_default_sql(arg_dict) %}\n\n {% do return(get_incremental_append_sql(arg_dict)) %}\n\n{% endmacro %}", "depends_on": {"macros": ["macro.dbt.get_incremental_append_sql"]}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973266.9963746, "supported_languages": null}, "macro.dbt.get_insert_into_sql": {"name": "get_insert_into_sql", "resource_type": "macro", "package_name": "dbt", "path": "macros/materializations/models/incremental/strategies.sql", "original_file_path": "macros/materializations/models/incremental/strategies.sql", "unique_id": "macro.dbt.get_insert_into_sql", "macro_sql": "{% macro get_insert_into_sql(target_relation, temp_relation, dest_columns) %}\n\n {%- set dest_cols_csv = get_quoted_csv(dest_columns | map(attribute=\"name\")) -%}\n\n insert into {{ target_relation }} ({{ dest_cols_csv }})\n (\n select {{ dest_cols_csv }}\n from {{ temp_relation }}\n )\n\n{% endmacro %}", "depends_on": {"macros": ["macro.dbt.get_quoted_csv"]}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973266.9969482, "supported_languages": null}, "macro.dbt.get_create_materialized_view_as_sql": {"name": "get_create_materialized_view_as_sql", "resource_type": "macro", "package_name": "dbt", "path": "macros/materializations/models/materialized_view/create_materialized_view.sql", "original_file_path": "macros/materializations/models/materialized_view/create_materialized_view.sql", "unique_id": "macro.dbt.get_create_materialized_view_as_sql", "macro_sql": "{% macro get_create_materialized_view_as_sql(relation, sql) -%}\n {{- log('Applying CREATE to: ' ~ relation) -}}\n {{- adapter.dispatch('get_create_materialized_view_as_sql', 'dbt')(relation, sql) -}}\n{%- endmacro %}", "depends_on": {"macros": ["macro.dbt.default__get_create_materialized_view_as_sql"]}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 
1701973266.9976027, "supported_languages": null}, "macro.dbt.default__get_create_materialized_view_as_sql": {"name": "default__get_create_materialized_view_as_sql", "resource_type": "macro", "package_name": "dbt", "path": "macros/materializations/models/materialized_view/create_materialized_view.sql", "original_file_path": "macros/materializations/models/materialized_view/create_materialized_view.sql", "unique_id": "macro.dbt.default__get_create_materialized_view_as_sql", "macro_sql": "{% macro default__get_create_materialized_view_as_sql(relation, sql) -%}\n {{ exceptions.raise_compiler_error(\"Materialized views have not been implemented for this adapter.\") }}\n{% endmacro %}", "depends_on": {"macros": []}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973266.9978874, "supported_languages": null}, "macro.dbt.get_replace_materialized_view_as_sql": {"name": "get_replace_materialized_view_as_sql", "resource_type": "macro", "package_name": "dbt", "path": "macros/materializations/models/materialized_view/replace_materialized_view.sql", "original_file_path": "macros/materializations/models/materialized_view/replace_materialized_view.sql", "unique_id": "macro.dbt.get_replace_materialized_view_as_sql", "macro_sql": "{% macro get_replace_materialized_view_as_sql(relation, sql, existing_relation, backup_relation, intermediate_relation) %}\n {{- log('Applying REPLACE to: ' ~ relation) -}}\n {{- adapter.dispatch('get_replace_materialized_view_as_sql', 'dbt')(relation, sql, existing_relation, backup_relation, intermediate_relation) -}}\n{% endmacro %}", "depends_on": {"macros": ["macro.dbt.default__get_replace_materialized_view_as_sql"]}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973266.998673, "supported_languages": null}, "macro.dbt.default__get_replace_materialized_view_as_sql": {"name": "default__get_replace_materialized_view_as_sql", "resource_type": "macro", "package_name": "dbt", "path": "macros/materializations/models/materialized_view/replace_materialized_view.sql", "original_file_path": "macros/materializations/models/materialized_view/replace_materialized_view.sql", "unique_id": "macro.dbt.default__get_replace_materialized_view_as_sql", "macro_sql": "{% macro default__get_replace_materialized_view_as_sql(relation, sql, existing_relation, backup_relation, intermediate_relation) %}\n {{ exceptions.raise_compiler_error(\"Materialized views have not been implemented for this adapter.\") }}\n{% endmacro %}", "depends_on": {"macros": []}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973266.9990222, "supported_languages": null}, "macro.dbt.materialization_materialized_view_default": {"name": "materialization_materialized_view_default", "resource_type": "macro", "package_name": "dbt", "path": "macros/materializations/models/materialized_view/materialized_view.sql", "original_file_path": "macros/materializations/models/materialized_view/materialized_view.sql", "unique_id": "macro.dbt.materialization_materialized_view_default", "macro_sql": "{% materialization materialized_view, default %}\n {% set existing_relation = load_cached_relation(this) %}\n {% set target_relation = this.incorporate(type=this.MaterializedView) %}\n {% set intermediate_relation = make_intermediate_relation(target_relation) %}\n {% set backup_relation_type = 
target_relation.MaterializedView if existing_relation is none else existing_relation.type %}\n {% set backup_relation = make_backup_relation(target_relation, backup_relation_type) %}\n\n {{ materialized_view_setup(backup_relation, intermediate_relation, pre_hooks) }}\n\n {% set build_sql = materialized_view_get_build_sql(existing_relation, target_relation, backup_relation, intermediate_relation) %}\n\n {% if build_sql == '' %}\n {{ materialized_view_execute_no_op(target_relation) }}\n {% else %}\n {{ materialized_view_execute_build_sql(build_sql, existing_relation, target_relation, post_hooks) }}\n {% endif %}\n\n {{ materialized_view_teardown(backup_relation, intermediate_relation, post_hooks) }}\n\n {{ return({'relations': [target_relation]}) }}\n\n{% endmaterialization %}", "depends_on": {"macros": ["macro.dbt.load_cached_relation", "macro.dbt.make_intermediate_relation", "macro.dbt.make_backup_relation", "macro.dbt.materialized_view_setup", "macro.dbt.materialized_view_get_build_sql", "macro.dbt.materialized_view_execute_no_op", "macro.dbt.materialized_view_execute_build_sql", "macro.dbt.materialized_view_teardown"]}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973267.0064049, "supported_languages": ["sql"]}, "macro.dbt.materialized_view_setup": {"name": "materialized_view_setup", "resource_type": "macro", "package_name": "dbt", "path": "macros/materializations/models/materialized_view/materialized_view.sql", "original_file_path": "macros/materializations/models/materialized_view/materialized_view.sql", "unique_id": "macro.dbt.materialized_view_setup", "macro_sql": "{% macro materialized_view_setup(backup_relation, intermediate_relation, pre_hooks) %}\n\n -- backup_relation and intermediate_relation should not already exist in the database\n -- it's possible these exist because of a previous run that exited unexpectedly\n {% set preexisting_backup_relation = load_cached_relation(backup_relation) %}\n {% set preexisting_intermediate_relation = load_cached_relation(intermediate_relation) %}\n\n -- drop the temp relations if they exist already in the database\n {{ drop_relation_if_exists(preexisting_backup_relation) }}\n {{ drop_relation_if_exists(preexisting_intermediate_relation) }}\n\n {{ run_hooks(pre_hooks, inside_transaction=False) }}\n\n{% endmacro %}", "depends_on": {"macros": ["macro.dbt.load_cached_relation", "macro.dbt.drop_relation_if_exists", "macro.dbt.run_hooks"]}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973267.0071733, "supported_languages": null}, "macro.dbt.materialized_view_teardown": {"name": "materialized_view_teardown", "resource_type": "macro", "package_name": "dbt", "path": "macros/materializations/models/materialized_view/materialized_view.sql", "original_file_path": "macros/materializations/models/materialized_view/materialized_view.sql", "unique_id": "macro.dbt.materialized_view_teardown", "macro_sql": "{% macro materialized_view_teardown(backup_relation, intermediate_relation, post_hooks) %}\n\n -- drop the temp relations if they exist to leave the database clean for the next run\n {{ drop_relation_if_exists(backup_relation) }}\n {{ drop_relation_if_exists(intermediate_relation) }}\n\n {{ run_hooks(post_hooks, inside_transaction=False) }}\n\n{% endmacro %}", "depends_on": {"macros": ["macro.dbt.drop_relation_if_exists", "macro.dbt.run_hooks"]}, "description": "", "meta": {}, "docs": 
{"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973267.00767, "supported_languages": null}, "macro.dbt.materialized_view_get_build_sql": {"name": "materialized_view_get_build_sql", "resource_type": "macro", "package_name": "dbt", "path": "macros/materializations/models/materialized_view/materialized_view.sql", "original_file_path": "macros/materializations/models/materialized_view/materialized_view.sql", "unique_id": "macro.dbt.materialized_view_get_build_sql", "macro_sql": "{% macro materialized_view_get_build_sql(existing_relation, target_relation, backup_relation, intermediate_relation) %}\n\n {% set full_refresh_mode = should_full_refresh() %}\n\n -- determine the scenario we're in: create, full_refresh, alter, refresh data\n {% if existing_relation is none %}\n {% set build_sql = get_create_materialized_view_as_sql(target_relation, sql) %}\n {% elif full_refresh_mode or not existing_relation.is_materialized_view %}\n {% set build_sql = get_replace_materialized_view_as_sql(target_relation, sql, existing_relation, backup_relation, intermediate_relation) %}\n {% else %}\n\n -- get config options\n {% set on_configuration_change = config.get('on_configuration_change') %}\n {% set configuration_changes = get_materialized_view_configuration_changes(existing_relation, config) %}\n\n {% if configuration_changes is none %}\n {% set build_sql = refresh_materialized_view(target_relation) %}\n\n {% elif on_configuration_change == 'apply' %}\n {% set build_sql = get_alter_materialized_view_as_sql(target_relation, configuration_changes, sql, existing_relation, backup_relation, intermediate_relation) %}\n {% elif on_configuration_change == 'continue' %}\n {% set build_sql = '' %}\n {{ exceptions.warn(\"Configuration changes were identified and `on_configuration_change` was set to `continue` for `\" ~ target_relation ~ \"`\") }}\n {% elif on_configuration_change == 'fail' %}\n {{ exceptions.raise_fail_fast_error(\"Configuration changes were identified and `on_configuration_change` was set to `fail` for `\" ~ target_relation ~ \"`\") }}\n\n {% else %}\n -- this only happens if the user provides a value other than `apply`, 'skip', 'fail'\n {{ exceptions.raise_compiler_error(\"Unexpected configuration scenario\") }}\n\n {% endif %}\n\n {% endif %}\n\n {% do return(build_sql) %}\n\n{% endmacro %}", "depends_on": {"macros": ["macro.dbt.should_full_refresh", "macro.dbt.get_create_materialized_view_as_sql", "macro.dbt.get_replace_materialized_view_as_sql", "macro.dbt.get_materialized_view_configuration_changes", "macro.dbt.refresh_materialized_view", "macro.dbt.get_alter_materialized_view_as_sql"]}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973267.0104067, "supported_languages": null}, "macro.dbt.materialized_view_execute_no_op": {"name": "materialized_view_execute_no_op", "resource_type": "macro", "package_name": "dbt", "path": "macros/materializations/models/materialized_view/materialized_view.sql", "original_file_path": "macros/materializations/models/materialized_view/materialized_view.sql", "unique_id": "macro.dbt.materialized_view_execute_no_op", "macro_sql": "{% macro materialized_view_execute_no_op(target_relation) %}\n {% do store_raw_result(\n name=\"main\",\n message=\"skip \" ~ target_relation,\n code=\"skip\",\n rows_affected=\"-1\"\n ) %}\n{% endmacro %}", "depends_on": {"macros": []}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, 
"patch_path": null, "arguments": [], "created_at": 1701973267.0108802, "supported_languages": null}, "macro.dbt.materialized_view_execute_build_sql": {"name": "materialized_view_execute_build_sql", "resource_type": "macro", "package_name": "dbt", "path": "macros/materializations/models/materialized_view/materialized_view.sql", "original_file_path": "macros/materializations/models/materialized_view/materialized_view.sql", "unique_id": "macro.dbt.materialized_view_execute_build_sql", "macro_sql": "{% macro materialized_view_execute_build_sql(build_sql, existing_relation, target_relation, post_hooks) %}\n\n -- `BEGIN` happens here:\n {{ run_hooks(pre_hooks, inside_transaction=True) }}\n\n {% set grant_config = config.get('grants') %}\n\n {% call statement(name=\"main\") %}\n {{ build_sql }}\n {% endcall %}\n\n {% set should_revoke = should_revoke(existing_relation, full_refresh_mode=True) %}\n {% do apply_grants(target_relation, grant_config, should_revoke=should_revoke) %}\n\n {% do persist_docs(target_relation, model) %}\n\n {{ run_hooks(post_hooks, inside_transaction=True) }}\n\n {{ adapter.commit() }}\n\n{% endmacro %}", "depends_on": {"macros": ["macro.dbt.run_hooks", "macro.dbt.statement", "macro.dbt.should_revoke", "macro.dbt.apply_grants", "macro.dbt.persist_docs"]}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973267.0122023, "supported_languages": null}, "macro.dbt.get_alter_materialized_view_as_sql": {"name": "get_alter_materialized_view_as_sql", "resource_type": "macro", "package_name": "dbt", "path": "macros/materializations/models/materialized_view/alter_materialized_view.sql", "original_file_path": "macros/materializations/models/materialized_view/alter_materialized_view.sql", "unique_id": "macro.dbt.get_alter_materialized_view_as_sql", "macro_sql": "{% macro get_alter_materialized_view_as_sql(\n relation,\n configuration_changes,\n sql,\n existing_relation,\n backup_relation,\n intermediate_relation\n) %}\n {{- log('Applying ALTER to: ' ~ relation) -}}\n {{- adapter.dispatch('get_alter_materialized_view_as_sql', 'dbt')(\n relation,\n configuration_changes,\n sql,\n existing_relation,\n backup_relation,\n intermediate_relation\n ) -}}\n{% endmacro %}", "depends_on": {"macros": ["macro.dbt.default__get_alter_materialized_view_as_sql"]}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973267.013288, "supported_languages": null}, "macro.dbt.default__get_alter_materialized_view_as_sql": {"name": "default__get_alter_materialized_view_as_sql", "resource_type": "macro", "package_name": "dbt", "path": "macros/materializations/models/materialized_view/alter_materialized_view.sql", "original_file_path": "macros/materializations/models/materialized_view/alter_materialized_view.sql", "unique_id": "macro.dbt.default__get_alter_materialized_view_as_sql", "macro_sql": "{% macro default__get_alter_materialized_view_as_sql(\n relation,\n configuration_changes,\n sql,\n existing_relation,\n backup_relation,\n intermediate_relation\n) %}\n {{ exceptions.raise_compiler_error(\"Materialized views have not been implemented for this adapter.\") }}\n{% endmacro %}", "depends_on": {"macros": []}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973267.0136778, "supported_languages": null}, "macro.dbt.refresh_materialized_view": {"name": 
"refresh_materialized_view", "resource_type": "macro", "package_name": "dbt", "path": "macros/materializations/models/materialized_view/refresh_materialized_view.sql", "original_file_path": "macros/materializations/models/materialized_view/refresh_materialized_view.sql", "unique_id": "macro.dbt.refresh_materialized_view", "macro_sql": "{% macro refresh_materialized_view(relation) %}\n {{- log('Applying REFRESH to: ' ~ relation) -}}\n {{- adapter.dispatch('refresh_materialized_view', 'dbt')(relation) -}}\n{% endmacro %}", "depends_on": {"macros": ["macro.dbt.default__refresh_materialized_view"]}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973267.0142782, "supported_languages": null}, "macro.dbt.default__refresh_materialized_view": {"name": "default__refresh_materialized_view", "resource_type": "macro", "package_name": "dbt", "path": "macros/materializations/models/materialized_view/refresh_materialized_view.sql", "original_file_path": "macros/materializations/models/materialized_view/refresh_materialized_view.sql", "unique_id": "macro.dbt.default__refresh_materialized_view", "macro_sql": "{% macro default__refresh_materialized_view(relation) %}\n {{ exceptions.raise_compiler_error(\"Materialized views have not been implemented for this adapter.\") }}\n{% endmacro %}", "depends_on": {"macros": []}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973267.0145543, "supported_languages": null}, "macro.dbt.get_materialized_view_configuration_changes": {"name": "get_materialized_view_configuration_changes", "resource_type": "macro", "package_name": "dbt", "path": "macros/materializations/models/materialized_view/get_materialized_view_configuration_changes.sql", "original_file_path": "macros/materializations/models/materialized_view/get_materialized_view_configuration_changes.sql", "unique_id": "macro.dbt.get_materialized_view_configuration_changes", "macro_sql": "{% macro get_materialized_view_configuration_changes(existing_relation, new_config) %}\n /* {#\n It's recommended that configuration changes be formatted as follows:\n {\"\": [{\"action\": \"\", \"context\": ...}]}\n\n For example:\n {\n \"indexes\": [\n {\"action\": \"drop\", \"context\": \"index_abc\"},\n {\"action\": \"create\", \"context\": {\"columns\": [\"column_1\", \"column_2\"], \"type\": \"hash\", \"unique\": True}},\n ],\n }\n\n Either way, `get_materialized_view_configuration_changes` needs to align with `get_alter_materialized_view_as_sql`.\n #} */\n {{- log('Determining configuration changes on: ' ~ existing_relation) -}}\n {%- do return(adapter.dispatch('get_materialized_view_configuration_changes', 'dbt')(existing_relation, new_config)) -%}\n{% endmacro %}", "depends_on": {"macros": ["macro.dbt.default__get_materialized_view_configuration_changes"]}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973267.0153337, "supported_languages": null}, "macro.dbt.default__get_materialized_view_configuration_changes": {"name": "default__get_materialized_view_configuration_changes", "resource_type": "macro", "package_name": "dbt", "path": "macros/materializations/models/materialized_view/get_materialized_view_configuration_changes.sql", "original_file_path": "macros/materializations/models/materialized_view/get_materialized_view_configuration_changes.sql", "unique_id": 
"macro.dbt.default__get_materialized_view_configuration_changes", "macro_sql": "{% macro default__get_materialized_view_configuration_changes(existing_relation, new_config) %}\n {{ exceptions.raise_compiler_error(\"Materialized views have not been implemented for this adapter.\") }}\n{% endmacro %}", "depends_on": {"macros": []}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973267.01563, "supported_languages": null}, "macro.dbt.get_create_table_as_sql": {"name": "get_create_table_as_sql", "resource_type": "macro", "package_name": "dbt", "path": "macros/materializations/models/table/create_table_as.sql", "original_file_path": "macros/materializations/models/table/create_table_as.sql", "unique_id": "macro.dbt.get_create_table_as_sql", "macro_sql": "{% macro get_create_table_as_sql(temporary, relation, sql) -%}\n {{ adapter.dispatch('get_create_table_as_sql', 'dbt')(temporary, relation, sql) }}\n{%- endmacro %}", "depends_on": {"macros": ["macro.dbt.default__get_create_table_as_sql"]}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973267.017129, "supported_languages": null}, "macro.dbt.default__get_create_table_as_sql": {"name": "default__get_create_table_as_sql", "resource_type": "macro", "package_name": "dbt", "path": "macros/materializations/models/table/create_table_as.sql", "original_file_path": "macros/materializations/models/table/create_table_as.sql", "unique_id": "macro.dbt.default__get_create_table_as_sql", "macro_sql": "{% macro default__get_create_table_as_sql(temporary, relation, sql) -%}\n {{ return(create_table_as(temporary, relation, sql)) }}\n{% endmacro %}", "depends_on": {"macros": ["macro.dbt.create_table_as"]}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973267.017502, "supported_languages": null}, "macro.dbt.create_table_as": {"name": "create_table_as", "resource_type": "macro", "package_name": "dbt", "path": "macros/materializations/models/table/create_table_as.sql", "original_file_path": "macros/materializations/models/table/create_table_as.sql", "unique_id": "macro.dbt.create_table_as", "macro_sql": "{% macro create_table_as(temporary, relation, compiled_code, language='sql') -%}\n {# backward compatibility for create_table_as that does not support language #}\n {% if language == \"sql\" %}\n {{ adapter.dispatch('create_table_as', 'dbt')(temporary, relation, compiled_code)}}\n {% else %}\n {{ adapter.dispatch('create_table_as', 'dbt')(temporary, relation, compiled_code, language) }}\n {% endif %}\n\n{%- endmacro %}", "depends_on": {"macros": ["macro.dbt_snowflake.snowflake__create_table_as"]}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973267.0183704, "supported_languages": null}, "macro.dbt.default__create_table_as": {"name": "default__create_table_as", "resource_type": "macro", "package_name": "dbt", "path": "macros/materializations/models/table/create_table_as.sql", "original_file_path": "macros/materializations/models/table/create_table_as.sql", "unique_id": "macro.dbt.default__create_table_as", "macro_sql": "{% macro default__create_table_as(temporary, relation, sql) -%}\n {%- set sql_header = config.get('sql_header', none) -%}\n\n {{ sql_header if sql_header is not none }}\n\n create {% if temporary: -%}temporary{%- endif %} 
table\n {{ relation.include(database=(not temporary), schema=(not temporary)) }}\n {% set contract_config = config.get('contract') %}\n {% if contract_config.enforced %}\n {{ get_assert_columns_equivalent(sql) }}\n {{ get_table_columns_and_constraints() }}\n {%- set sql = get_select_subquery(sql) %}\n {% endif %}\n as (\n {{ sql }}\n );\n{%- endmacro %}", "depends_on": {"macros": ["macro.dbt.get_assert_columns_equivalent", "macro.dbt.get_table_columns_and_constraints", "macro.dbt.get_select_subquery"]}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973267.019701, "supported_languages": null}, "macro.dbt.default__get_column_names": {"name": "default__get_column_names", "resource_type": "macro", "package_name": "dbt", "path": "macros/materializations/models/table/create_table_as.sql", "original_file_path": "macros/materializations/models/table/create_table_as.sql", "unique_id": "macro.dbt.default__get_column_names", "macro_sql": "{% macro default__get_column_names() %}\n {#- loop through user_provided_columns to get column names -#}\n {%- set user_provided_columns = model['columns'] -%}\n {%- for i in user_provided_columns %}\n {%- set col = user_provided_columns[i] -%}\n {%- set col_name = adapter.quote(col['name']) if col.get('quote') else col['name'] -%}\n {{ col_name }}{{ \", \" if not loop.last }}\n {%- endfor -%}\n{% endmacro %}", "depends_on": {"macros": []}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973267.0206509, "supported_languages": null}, "macro.dbt.get_select_subquery": {"name": "get_select_subquery", "resource_type": "macro", "package_name": "dbt", "path": "macros/materializations/models/table/create_table_as.sql", "original_file_path": "macros/materializations/models/table/create_table_as.sql", "unique_id": "macro.dbt.get_select_subquery", "macro_sql": "{% macro get_select_subquery(sql) %}\n {{ return(adapter.dispatch('get_select_subquery', 'dbt')(sql)) }}\n{% endmacro %}", "depends_on": {"macros": ["macro.dbt.default__get_select_subquery"]}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973267.0210364, "supported_languages": null}, "macro.dbt.default__get_select_subquery": {"name": "default__get_select_subquery", "resource_type": "macro", "package_name": "dbt", "path": "macros/materializations/models/table/create_table_as.sql", "original_file_path": "macros/materializations/models/table/create_table_as.sql", "unique_id": "macro.dbt.default__get_select_subquery", "macro_sql": "{% macro default__get_select_subquery(sql) %}\n select {{ adapter.dispatch('get_column_names', 'dbt')() }}\n from (\n {{ sql }}\n ) as model_subq\n{%- endmacro %}", "depends_on": {"macros": ["macro.dbt.default__get_column_names"]}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973267.0213957, "supported_languages": null}, "macro.dbt.get_table_columns_and_constraints": {"name": "get_table_columns_and_constraints", "resource_type": "macro", "package_name": "dbt", "path": "macros/materializations/models/table/columns_spec_ddl.sql", "original_file_path": "macros/materializations/models/table/columns_spec_ddl.sql", "unique_id": "macro.dbt.get_table_columns_and_constraints", "macro_sql": "{%- macro get_table_columns_and_constraints() -%}\n {{ 
adapter.dispatch('get_table_columns_and_constraints', 'dbt')() }}\n{%- endmacro -%}\n\n", "depends_on": {"macros": ["macro.dbt.default__get_table_columns_and_constraints"]}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973267.0231307, "supported_languages": null}, "macro.dbt.default__get_table_columns_and_constraints": {"name": "default__get_table_columns_and_constraints", "resource_type": "macro", "package_name": "dbt", "path": "macros/materializations/models/table/columns_spec_ddl.sql", "original_file_path": "macros/materializations/models/table/columns_spec_ddl.sql", "unique_id": "macro.dbt.default__get_table_columns_and_constraints", "macro_sql": "{% macro default__get_table_columns_and_constraints() -%}\n {{ return(table_columns_and_constraints()) }}\n{%- endmacro %}", "depends_on": {"macros": ["macro.dbt.table_columns_and_constraints"]}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973267.023381, "supported_languages": null}, "macro.dbt.table_columns_and_constraints": {"name": "table_columns_and_constraints", "resource_type": "macro", "package_name": "dbt", "path": "macros/materializations/models/table/columns_spec_ddl.sql", "original_file_path": "macros/materializations/models/table/columns_spec_ddl.sql", "unique_id": "macro.dbt.table_columns_and_constraints", "macro_sql": "{% macro table_columns_and_constraints() %}\n {# loop through user_provided_columns to create DDL with data types and constraints #}\n {%- set raw_column_constraints = adapter.render_raw_columns_constraints(raw_columns=model['columns']) -%}\n {%- set raw_model_constraints = adapter.render_raw_model_constraints(raw_constraints=model['constraints']) -%}\n (\n {% for c in raw_column_constraints -%}\n {{ c }}{{ \",\" if not loop.last or raw_model_constraints }}\n {% endfor %}\n {% for c in raw_model_constraints -%}\n {{ c }}{{ \",\" if not loop.last }}\n {% endfor -%}\n )\n{% endmacro %}", "depends_on": {"macros": []}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973267.0244544, "supported_languages": null}, "macro.dbt.get_assert_columns_equivalent": {"name": "get_assert_columns_equivalent", "resource_type": "macro", "package_name": "dbt", "path": "macros/materializations/models/table/columns_spec_ddl.sql", "original_file_path": "macros/materializations/models/table/columns_spec_ddl.sql", "unique_id": "macro.dbt.get_assert_columns_equivalent", "macro_sql": "\n\n{%- macro get_assert_columns_equivalent(sql) -%}\n {{ adapter.dispatch('get_assert_columns_equivalent', 'dbt')(sql) }}\n{%- endmacro -%}\n\n", "depends_on": {"macros": ["macro.dbt.default__get_assert_columns_equivalent"]}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973267.0249388, "supported_languages": null}, "macro.dbt.default__get_assert_columns_equivalent": {"name": "default__get_assert_columns_equivalent", "resource_type": "macro", "package_name": "dbt", "path": "macros/materializations/models/table/columns_spec_ddl.sql", "original_file_path": "macros/materializations/models/table/columns_spec_ddl.sql", "unique_id": "macro.dbt.default__get_assert_columns_equivalent", "macro_sql": "{% macro default__get_assert_columns_equivalent(sql) -%}\n {{ return(assert_columns_equivalent(sql)) }}\n{%- endmacro %}", "depends_on": 
{"macros": ["macro.dbt.assert_columns_equivalent"]}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973267.0252223, "supported_languages": null}, "macro.dbt.assert_columns_equivalent": {"name": "assert_columns_equivalent", "resource_type": "macro", "package_name": "dbt", "path": "macros/materializations/models/table/columns_spec_ddl.sql", "original_file_path": "macros/materializations/models/table/columns_spec_ddl.sql", "unique_id": "macro.dbt.assert_columns_equivalent", "macro_sql": "{% macro assert_columns_equivalent(sql) %}\n\n {#-- First ensure the user has defined 'columns' in yaml specification --#}\n {%- set user_defined_columns = model['columns'] -%}\n {%- if not user_defined_columns -%}\n {{ exceptions.raise_contract_error([], []) }}\n {%- endif -%}\n\n {#-- Obtain the column schema provided by sql file. #}\n {%- set sql_file_provided_columns = get_column_schema_from_query(sql, config.get('sql_header', none)) -%}\n {#--Obtain the column schema provided by the schema file by generating an 'empty schema' query from the model's columns. #}\n {%- set schema_file_provided_columns = get_column_schema_from_query(get_empty_schema_sql(user_defined_columns)) -%}\n\n {#-- create dictionaries with name and formatted data type and strings for exception #}\n {%- set sql_columns = format_columns(sql_file_provided_columns) -%}\n {%- set yaml_columns = format_columns(schema_file_provided_columns) -%}\n\n {%- if sql_columns|length != yaml_columns|length -%}\n {%- do exceptions.raise_contract_error(yaml_columns, sql_columns) -%}\n {%- endif -%}\n\n {%- for sql_col in sql_columns -%}\n {%- set yaml_col = [] -%}\n {%- for this_col in yaml_columns -%}\n {%- if this_col['name'] == sql_col['name'] -%}\n {%- do yaml_col.append(this_col) -%}\n {%- break -%}\n {%- endif -%}\n {%- endfor -%}\n {%- if not yaml_col -%}\n {#-- Column with name not found in yaml #}\n {%- do exceptions.raise_contract_error(yaml_columns, sql_columns) -%}\n {%- endif -%}\n {%- if sql_col['formatted'] != yaml_col[0]['formatted'] -%}\n {#-- Column data types don't match #}\n {%- do exceptions.raise_contract_error(yaml_columns, sql_columns) -%}\n {%- endif -%}\n {%- endfor -%}\n\n{% endmacro %}", "depends_on": {"macros": ["macro.dbt.get_column_schema_from_query", "macro.dbt.get_empty_schema_sql", "macro.dbt.format_columns"]}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973267.0277421, "supported_languages": null}, "macro.dbt.format_columns": {"name": "format_columns", "resource_type": "macro", "package_name": "dbt", "path": "macros/materializations/models/table/columns_spec_ddl.sql", "original_file_path": "macros/materializations/models/table/columns_spec_ddl.sql", "unique_id": "macro.dbt.format_columns", "macro_sql": "{% macro format_columns(columns) %}\n {% set formatted_columns = [] %}\n {% for column in columns %}\n {%- set formatted_column = adapter.dispatch('format_column', 'dbt')(column) -%}\n {%- do formatted_columns.append(formatted_column) -%}\n {% endfor %}\n {{ return(formatted_columns) }}\n{% endmacro %}", "depends_on": {"macros": ["macro.dbt.default__format_column"]}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973267.0285099, "supported_languages": null}, "macro.dbt.default__format_column": {"name": "default__format_column", "resource_type": "macro", "package_name": 
"dbt", "path": "macros/materializations/models/table/columns_spec_ddl.sql", "original_file_path": "macros/materializations/models/table/columns_spec_ddl.sql", "unique_id": "macro.dbt.default__format_column", "macro_sql": "{% macro default__format_column(column) -%}\n {% set data_type = column.dtype %}\n {% set formatted = column.column.lower() ~ \" \" ~ data_type %}\n {{ return({'name': column.name, 'data_type': data_type, 'formatted': formatted}) }}\n{%- endmacro -%}", "depends_on": {"macros": []}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973267.029217, "supported_languages": null}, "macro.dbt.materialization_table_default": {"name": "materialization_table_default", "resource_type": "macro", "package_name": "dbt", "path": "macros/materializations/models/table/table.sql", "original_file_path": "macros/materializations/models/table/table.sql", "unique_id": "macro.dbt.materialization_table_default", "macro_sql": "{% materialization table, default %}\n\n {%- set existing_relation = load_cached_relation(this) -%}\n {%- set target_relation = this.incorporate(type='table') %}\n {%- set intermediate_relation = make_intermediate_relation(target_relation) -%}\n -- the intermediate_relation should not already exist in the database; get_relation\n -- will return None in that case. Otherwise, we get a relation that we can drop\n -- later, before we try to use this name for the current operation\n {%- set preexisting_intermediate_relation = load_cached_relation(intermediate_relation) -%}\n /*\n See ../view/view.sql for more information about this relation.\n */\n {%- set backup_relation_type = 'table' if existing_relation is none else existing_relation.type -%}\n {%- set backup_relation = make_backup_relation(target_relation, backup_relation_type) -%}\n -- as above, the backup_relation should not already exist\n {%- set preexisting_backup_relation = load_cached_relation(backup_relation) -%}\n -- grab current tables grants config for comparision later on\n {% set grant_config = config.get('grants') %}\n\n -- drop the temp relations if they exist already in the database\n {{ drop_relation_if_exists(preexisting_intermediate_relation) }}\n {{ drop_relation_if_exists(preexisting_backup_relation) }}\n\n {{ run_hooks(pre_hooks, inside_transaction=False) }}\n\n -- `BEGIN` happens here:\n {{ run_hooks(pre_hooks, inside_transaction=True) }}\n\n -- build model\n {% call statement('main') -%}\n {{ get_create_table_as_sql(False, intermediate_relation, sql) }}\n {%- endcall %}\n\n -- cleanup\n {% if existing_relation is not none %}\n {{ adapter.rename_relation(existing_relation, backup_relation) }}\n {% endif %}\n\n {{ adapter.rename_relation(intermediate_relation, target_relation) }}\n\n {% do create_indexes(target_relation) %}\n\n {{ run_hooks(post_hooks, inside_transaction=True) }}\n\n {% set should_revoke = should_revoke(existing_relation, full_refresh_mode=True) %}\n {% do apply_grants(target_relation, grant_config, should_revoke=should_revoke) %}\n\n {% do persist_docs(target_relation, model) %}\n\n -- `COMMIT` happens here\n {{ adapter.commit() }}\n\n -- finally, drop the existing/backup relation after the commit\n {{ drop_relation_if_exists(backup_relation) }}\n\n {{ run_hooks(post_hooks, inside_transaction=False) }}\n\n {{ return({'relations': [target_relation]}) }}\n{% endmaterialization %}", "depends_on": {"macros": ["macro.dbt.load_cached_relation", "macro.dbt.make_intermediate_relation", 
"macro.dbt.make_backup_relation", "macro.dbt.drop_relation_if_exists", "macro.dbt.run_hooks", "macro.dbt.statement", "macro.dbt.get_create_table_as_sql", "macro.dbt.create_indexes", "macro.dbt.should_revoke", "macro.dbt.apply_grants", "macro.dbt.persist_docs"]}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973267.0339084, "supported_languages": ["sql"]}, "macro.dbt.get_where_subquery": {"name": "get_where_subquery", "resource_type": "macro", "package_name": "dbt", "path": "macros/materializations/tests/where_subquery.sql", "original_file_path": "macros/materializations/tests/where_subquery.sql", "unique_id": "macro.dbt.get_where_subquery", "macro_sql": "{% macro get_where_subquery(relation) -%}\n {% do return(adapter.dispatch('get_where_subquery', 'dbt')(relation)) %}\n{%- endmacro %}", "depends_on": {"macros": ["macro.dbt.default__get_where_subquery"]}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973267.0346172, "supported_languages": null}, "macro.dbt.default__get_where_subquery": {"name": "default__get_where_subquery", "resource_type": "macro", "package_name": "dbt", "path": "macros/materializations/tests/where_subquery.sql", "original_file_path": "macros/materializations/tests/where_subquery.sql", "unique_id": "macro.dbt.default__get_where_subquery", "macro_sql": "{% macro default__get_where_subquery(relation) -%}\n {% set where = config.get('where', '') %}\n {% if where %}\n {%- set filtered -%}\n (select * from {{ relation }} where {{ where }}) dbt_subquery\n {%- endset -%}\n {% do return(filtered) %}\n {%- else -%}\n {% do return(relation) %}\n {%- endif -%}\n{%- endmacro %}", "depends_on": {"macros": []}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973267.035395, "supported_languages": null}, "macro.dbt.materialization_test_default": {"name": "materialization_test_default", "resource_type": "macro", "package_name": "dbt", "path": "macros/materializations/tests/test.sql", "original_file_path": "macros/materializations/tests/test.sql", "unique_id": "macro.dbt.materialization_test_default", "macro_sql": "{%- materialization test, default -%}\n\n {% set relations = [] %}\n\n {% if should_store_failures() %}\n\n {% set identifier = model['alias'] %}\n {% set old_relation = adapter.get_relation(database=database, schema=schema, identifier=identifier) %}\n {% set target_relation = api.Relation.create(\n identifier=identifier, schema=schema, database=database, type='table') -%} %}\n\n {% if old_relation %}\n {% do adapter.drop_relation(old_relation) %}\n {% endif %}\n\n {% call statement(auto_begin=True) %}\n {{ create_table_as(False, target_relation, sql) }}\n {% endcall %}\n\n {% do relations.append(target_relation) %}\n\n {% set main_sql %}\n select *\n from {{ target_relation }}\n {% endset %}\n\n {{ adapter.commit() }}\n\n {% else %}\n\n {% set main_sql = sql %}\n\n {% endif %}\n\n {% set limit = config.get('limit') %}\n {% set fail_calc = config.get('fail_calc') %}\n {% set warn_if = config.get('warn_if') %}\n {% set error_if = config.get('error_if') %}\n\n {% call statement('main', fetch_result=True) -%}\n\n {{ get_test_sql(main_sql, fail_calc, warn_if, error_if, limit)}}\n\n {%- endcall %}\n\n {{ return({'relations': relations}) }}\n\n{%- endmaterialization -%}", "depends_on": {"macros": ["macro.dbt.should_store_failures", 
"macro.dbt.statement", "macro.dbt.create_table_as", "macro.dbt.get_test_sql"]}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973267.0392334, "supported_languages": ["sql"]}, "macro.dbt.get_test_sql": {"name": "get_test_sql", "resource_type": "macro", "package_name": "dbt", "path": "macros/materializations/tests/helpers.sql", "original_file_path": "macros/materializations/tests/helpers.sql", "unique_id": "macro.dbt.get_test_sql", "macro_sql": "{% macro get_test_sql(main_sql, fail_calc, warn_if, error_if, limit) -%}\n {{ adapter.dispatch('get_test_sql', 'dbt')(main_sql, fail_calc, warn_if, error_if, limit) }}\n{%- endmacro %}", "depends_on": {"macros": ["macro.dbt.default__get_test_sql"]}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973267.040054, "supported_languages": null}, "macro.dbt.default__get_test_sql": {"name": "default__get_test_sql", "resource_type": "macro", "package_name": "dbt", "path": "macros/materializations/tests/helpers.sql", "original_file_path": "macros/materializations/tests/helpers.sql", "unique_id": "macro.dbt.default__get_test_sql", "macro_sql": "{% macro default__get_test_sql(main_sql, fail_calc, warn_if, error_if, limit) -%}\n select\n {{ fail_calc }} as failures,\n {{ fail_calc }} {{ warn_if }} as should_warn,\n {{ fail_calc }} {{ error_if }} as should_error\n from (\n {{ main_sql }}\n {{ \"limit \" ~ limit if limit != none }}\n ) dbt_internal_test\n{%- endmacro %}", "depends_on": {"macros": []}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973267.0407078, "supported_languages": null}, "macro.dbt.materialization_seed_default": {"name": "materialization_seed_default", "resource_type": "macro", "package_name": "dbt", "path": "macros/materializations/seeds/seed.sql", "original_file_path": "macros/materializations/seeds/seed.sql", "unique_id": "macro.dbt.materialization_seed_default", "macro_sql": "{% materialization seed, default %}\n\n {%- set identifier = model['alias'] -%}\n {%- set full_refresh_mode = (should_full_refresh()) -%}\n\n {%- set old_relation = adapter.get_relation(database=database, schema=schema, identifier=identifier) -%}\n\n {%- set exists_as_table = (old_relation is not none and old_relation.is_table) -%}\n {%- set exists_as_view = (old_relation is not none and old_relation.is_view) -%}\n\n {%- set grant_config = config.get('grants') -%}\n {%- set agate_table = load_agate_table() -%}\n -- grab current tables grants config for comparison later on\n\n {%- do store_result('agate_table', response='OK', agate_table=agate_table) -%}\n\n {{ run_hooks(pre_hooks, inside_transaction=False) }}\n\n -- `BEGIN` happens here:\n {{ run_hooks(pre_hooks, inside_transaction=True) }}\n\n -- build model\n {% set create_table_sql = \"\" %}\n {% if exists_as_view %}\n {{ exceptions.raise_compiler_error(\"Cannot seed to '{}', it is a view\".format(old_relation)) }}\n {% elif exists_as_table %}\n {% set create_table_sql = reset_csv_table(model, full_refresh_mode, old_relation, agate_table) %}\n {% else %}\n {% set create_table_sql = create_csv_table(model, agate_table) %}\n {% endif %}\n\n {% set code = 'CREATE' if full_refresh_mode else 'INSERT' %}\n {% set rows_affected = (agate_table.rows | length) %}\n {% set sql = load_csv_rows(model, agate_table) %}\n\n {% call noop_statement('main', code ~ ' ' ~ rows_affected, 
code, rows_affected) %}\n {{ get_csv_sql(create_table_sql, sql) }};\n {% endcall %}\n\n {% set target_relation = this.incorporate(type='table') %}\n\n {% set should_revoke = should_revoke(old_relation, full_refresh_mode) %}\n {% do apply_grants(target_relation, grant_config, should_revoke=should_revoke) %}\n\n {% do persist_docs(target_relation, model) %}\n\n {% if full_refresh_mode or not exists_as_table %}\n {% do create_indexes(target_relation) %}\n {% endif %}\n\n {{ run_hooks(post_hooks, inside_transaction=True) }}\n\n -- `COMMIT` happens here\n {{ adapter.commit() }}\n\n {{ run_hooks(post_hooks, inside_transaction=False) }}\n\n {{ return({'relations': [target_relation]}) }}\n\n{% endmaterialization %}", "depends_on": {"macros": ["macro.dbt.should_full_refresh", "macro.dbt.run_hooks", "macro.dbt.reset_csv_table", "macro.dbt.create_csv_table", "macro.dbt.load_csv_rows", "macro.dbt.noop_statement", "macro.dbt.get_csv_sql", "macro.dbt.should_revoke", "macro.dbt.apply_grants", "macro.dbt.persist_docs", "macro.dbt.create_indexes"]}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973267.0465631, "supported_languages": ["sql"]}, "macro.dbt.create_csv_table": {"name": "create_csv_table", "resource_type": "macro", "package_name": "dbt", "path": "macros/materializations/seeds/helpers.sql", "original_file_path": "macros/materializations/seeds/helpers.sql", "unique_id": "macro.dbt.create_csv_table", "macro_sql": "{% macro create_csv_table(model, agate_table) -%}\n {{ adapter.dispatch('create_csv_table', 'dbt')(model, agate_table) }}\n{%- endmacro %}", "depends_on": {"macros": ["macro.dbt.default__create_csv_table"]}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973267.05364, "supported_languages": null}, "macro.dbt.default__create_csv_table": {"name": "default__create_csv_table", "resource_type": "macro", "package_name": "dbt", "path": "macros/materializations/seeds/helpers.sql", "original_file_path": "macros/materializations/seeds/helpers.sql", "unique_id": "macro.dbt.default__create_csv_table", "macro_sql": "{% macro default__create_csv_table(model, agate_table) %}\n {%- set column_override = model['config'].get('column_types', {}) -%}\n {%- set quote_seed_column = model['config'].get('quote_columns', None) -%}\n\n {% set sql %}\n create table {{ this.render() }} (\n {%- for col_name in agate_table.column_names -%}\n {%- set inferred_type = adapter.convert_type(agate_table, loop.index0) -%}\n {%- set type = column_override.get(col_name, inferred_type) -%}\n {%- set column_name = (col_name | string) -%}\n {{ adapter.quote_seed_column(column_name, quote_seed_column) }} {{ type }} {%- if not loop.last -%}, {%- endif -%}\n {%- endfor -%}\n )\n {% endset %}\n\n {% call statement('_') -%}\n {{ sql }}\n {%- endcall %}\n\n {{ return(sql) }}\n{% endmacro %}", "depends_on": {"macros": ["macro.dbt.statement"]}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973267.0554252, "supported_languages": null}, "macro.dbt.reset_csv_table": {"name": "reset_csv_table", "resource_type": "macro", "package_name": "dbt", "path": "macros/materializations/seeds/helpers.sql", "original_file_path": "macros/materializations/seeds/helpers.sql", "unique_id": "macro.dbt.reset_csv_table", "macro_sql": "{% macro reset_csv_table(model, full_refresh, old_relation, agate_table) -%}\n {{ 
adapter.dispatch('reset_csv_table', 'dbt')(model, full_refresh, old_relation, agate_table) }}\n{%- endmacro %}", "depends_on": {"macros": ["macro.dbt.default__reset_csv_table"]}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973267.0558958, "supported_languages": null}, "macro.dbt.default__reset_csv_table": {"name": "default__reset_csv_table", "resource_type": "macro", "package_name": "dbt", "path": "macros/materializations/seeds/helpers.sql", "original_file_path": "macros/materializations/seeds/helpers.sql", "unique_id": "macro.dbt.default__reset_csv_table", "macro_sql": "{% macro default__reset_csv_table(model, full_refresh, old_relation, agate_table) %}\n {% set sql = \"\" %}\n {% if full_refresh %}\n {{ adapter.drop_relation(old_relation) }}\n {% set sql = create_csv_table(model, agate_table) %}\n {% else %}\n {{ adapter.truncate_relation(old_relation) }}\n {% set sql = \"truncate table \" ~ old_relation %}\n {% endif %}\n\n {{ return(sql) }}\n{% endmacro %}", "depends_on": {"macros": ["macro.dbt.create_csv_table"]}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973267.0568838, "supported_languages": null}, "macro.dbt.get_csv_sql": {"name": "get_csv_sql", "resource_type": "macro", "package_name": "dbt", "path": "macros/materializations/seeds/helpers.sql", "original_file_path": "macros/materializations/seeds/helpers.sql", "unique_id": "macro.dbt.get_csv_sql", "macro_sql": "{% macro get_csv_sql(create_or_truncate_sql, insert_sql) %}\n {{ adapter.dispatch('get_csv_sql', 'dbt')(create_or_truncate_sql, insert_sql) }}\n{% endmacro %}", "depends_on": {"macros": ["macro.dbt.default__get_csv_sql"]}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973267.0572646, "supported_languages": null}, "macro.dbt.default__get_csv_sql": {"name": "default__get_csv_sql", "resource_type": "macro", "package_name": "dbt", "path": "macros/materializations/seeds/helpers.sql", "original_file_path": "macros/materializations/seeds/helpers.sql", "unique_id": "macro.dbt.default__get_csv_sql", "macro_sql": "{% macro default__get_csv_sql(create_or_truncate_sql, insert_sql) %}\n {{ create_or_truncate_sql }};\n -- dbt seed --\n {{ insert_sql }}\n{% endmacro %}", "depends_on": {"macros": []}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973267.0575316, "supported_languages": null}, "macro.dbt.get_binding_char": {"name": "get_binding_char", "resource_type": "macro", "package_name": "dbt", "path": "macros/materializations/seeds/helpers.sql", "original_file_path": "macros/materializations/seeds/helpers.sql", "unique_id": "macro.dbt.get_binding_char", "macro_sql": "{% macro get_binding_char() -%}\n {{ adapter.dispatch('get_binding_char', 'dbt')() }}\n{%- endmacro %}", "depends_on": {"macros": ["macro.dbt.default__get_binding_char"]}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973267.0578134, "supported_languages": null}, "macro.dbt.default__get_binding_char": {"name": "default__get_binding_char", "resource_type": "macro", "package_name": "dbt", "path": "macros/materializations/seeds/helpers.sql", "original_file_path": "macros/materializations/seeds/helpers.sql", "unique_id": "macro.dbt.default__get_binding_char", 
"macro_sql": "{% macro default__get_binding_char() %}\n {{ return('%s') }}\n{% endmacro %}", "depends_on": {"macros": []}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973267.058047, "supported_languages": null}, "macro.dbt.get_batch_size": {"name": "get_batch_size", "resource_type": "macro", "package_name": "dbt", "path": "macros/materializations/seeds/helpers.sql", "original_file_path": "macros/materializations/seeds/helpers.sql", "unique_id": "macro.dbt.get_batch_size", "macro_sql": "{% macro get_batch_size() -%}\n {{ return(adapter.dispatch('get_batch_size', 'dbt')()) }}\n{%- endmacro %}", "depends_on": {"macros": ["macro.dbt.default__get_batch_size"]}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973267.0583627, "supported_languages": null}, "macro.dbt.default__get_batch_size": {"name": "default__get_batch_size", "resource_type": "macro", "package_name": "dbt", "path": "macros/materializations/seeds/helpers.sql", "original_file_path": "macros/materializations/seeds/helpers.sql", "unique_id": "macro.dbt.default__get_batch_size", "macro_sql": "{% macro default__get_batch_size() %}\n {{ return(10000) }}\n{% endmacro %}", "depends_on": {"macros": []}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973267.058599, "supported_languages": null}, "macro.dbt.get_seed_column_quoted_csv": {"name": "get_seed_column_quoted_csv", "resource_type": "macro", "package_name": "dbt", "path": "macros/materializations/seeds/helpers.sql", "original_file_path": "macros/materializations/seeds/helpers.sql", "unique_id": "macro.dbt.get_seed_column_quoted_csv", "macro_sql": "{% macro get_seed_column_quoted_csv(model, column_names) %}\n {%- set quote_seed_column = model['config'].get('quote_columns', None) -%}\n {% set quoted = [] %}\n {% for col in column_names -%}\n {%- do quoted.append(adapter.quote_seed_column(col, quote_seed_column)) -%}\n {%- endfor %}\n\n {%- set dest_cols_csv = quoted | join(', ') -%}\n {{ return(dest_cols_csv) }}\n{% endmacro %}", "depends_on": {"macros": []}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973267.059546, "supported_languages": null}, "macro.dbt.load_csv_rows": {"name": "load_csv_rows", "resource_type": "macro", "package_name": "dbt", "path": "macros/materializations/seeds/helpers.sql", "original_file_path": "macros/materializations/seeds/helpers.sql", "unique_id": "macro.dbt.load_csv_rows", "macro_sql": "{% macro load_csv_rows(model, agate_table) -%}\n {{ adapter.dispatch('load_csv_rows', 'dbt')(model, agate_table) }}\n{%- endmacro %}", "depends_on": {"macros": ["macro.dbt_snowflake.snowflake__load_csv_rows"]}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973267.0599165, "supported_languages": null}, "macro.dbt.default__load_csv_rows": {"name": "default__load_csv_rows", "resource_type": "macro", "package_name": "dbt", "path": "macros/materializations/seeds/helpers.sql", "original_file_path": "macros/materializations/seeds/helpers.sql", "unique_id": "macro.dbt.default__load_csv_rows", "macro_sql": "{% macro default__load_csv_rows(model, agate_table) %}\n\n {% set batch_size = get_batch_size() %}\n\n {% set cols_sql = get_seed_column_quoted_csv(model, 
agate_table.column_names) %}\n {% set bindings = [] %}\n\n {% set statements = [] %}\n\n {% for chunk in agate_table.rows | batch(batch_size) %}\n {% set bindings = [] %}\n\n {% for row in chunk %}\n {% do bindings.extend(row) %}\n {% endfor %}\n\n {% set sql %}\n insert into {{ this.render() }} ({{ cols_sql }}) values\n {% for row in chunk -%}\n ({%- for column in agate_table.column_names -%}\n {{ get_binding_char() }}\n {%- if not loop.last%},{%- endif %}\n {%- endfor -%})\n {%- if not loop.last%},{%- endif %}\n {%- endfor %}\n {% endset %}\n\n {% do adapter.add_query(sql, bindings=bindings, abridge_sql_log=True) %}\n\n {% if loop.index0 == 0 %}\n {% do statements.append(sql) %}\n {% endif %}\n {% endfor %}\n\n {# Return SQL so we can render it out into the compiled files #}\n {{ return(statements[0]) }}\n{% endmacro %}", "depends_on": {"macros": ["macro.dbt.get_batch_size", "macro.dbt.get_seed_column_quoted_csv", "macro.dbt.get_binding_char"]}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973267.0625072, "supported_languages": null}, "macro.dbt.snapshot_merge_sql": {"name": "snapshot_merge_sql", "resource_type": "macro", "package_name": "dbt", "path": "macros/materializations/snapshots/snapshot_merge.sql", "original_file_path": "macros/materializations/snapshots/snapshot_merge.sql", "unique_id": "macro.dbt.snapshot_merge_sql", "macro_sql": "{% macro snapshot_merge_sql(target, source, insert_cols) -%}\n {{ adapter.dispatch('snapshot_merge_sql', 'dbt')(target, source, insert_cols) }}\n{%- endmacro %}", "depends_on": {"macros": ["macro.dbt_snowflake.snowflake__snapshot_merge_sql"]}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973267.0632966, "supported_languages": null}, "macro.dbt.default__snapshot_merge_sql": {"name": "default__snapshot_merge_sql", "resource_type": "macro", "package_name": "dbt", "path": "macros/materializations/snapshots/snapshot_merge.sql", "original_file_path": "macros/materializations/snapshots/snapshot_merge.sql", "unique_id": "macro.dbt.default__snapshot_merge_sql", "macro_sql": "{% macro default__snapshot_merge_sql(target, source, insert_cols) -%}\n {%- set insert_cols_csv = insert_cols | join(', ') -%}\n\n merge into {{ target }} as DBT_INTERNAL_DEST\n using {{ source }} as DBT_INTERNAL_SOURCE\n on DBT_INTERNAL_SOURCE.dbt_scd_id = DBT_INTERNAL_DEST.dbt_scd_id\n\n when matched\n and DBT_INTERNAL_DEST.dbt_valid_to is null\n and DBT_INTERNAL_SOURCE.dbt_change_type in ('update', 'delete')\n then update\n set dbt_valid_to = DBT_INTERNAL_SOURCE.dbt_valid_to\n\n when not matched\n and DBT_INTERNAL_SOURCE.dbt_change_type = 'insert'\n then insert ({{ insert_cols_csv }})\n values ({{ insert_cols_csv }})\n\n{% endmacro %}", "depends_on": {"macros": []}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973267.0638359, "supported_languages": null}, "macro.dbt.create_columns": {"name": "create_columns", "resource_type": "macro", "package_name": "dbt", "path": "macros/materializations/snapshots/helpers.sql", "original_file_path": "macros/materializations/snapshots/helpers.sql", "unique_id": "macro.dbt.create_columns", "macro_sql": "{% macro create_columns(relation, columns) %}\n {{ adapter.dispatch('create_columns', 'dbt')(relation, columns) }}\n{% endmacro %}", "depends_on": {"macros": ["macro.dbt.default__create_columns"]}, 
"description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973267.069507, "supported_languages": null}, "macro.dbt.default__create_columns": {"name": "default__create_columns", "resource_type": "macro", "package_name": "dbt", "path": "macros/materializations/snapshots/helpers.sql", "original_file_path": "macros/materializations/snapshots/helpers.sql", "unique_id": "macro.dbt.default__create_columns", "macro_sql": "{% macro default__create_columns(relation, columns) %}\n {% for column in columns %}\n {% call statement() %}\n alter table {{ relation }} add column \"{{ column.name }}\" {{ column.data_type }};\n {% endcall %}\n {% endfor %}\n{% endmacro %}", "depends_on": {"macros": ["macro.dbt.statement"]}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973267.0700912, "supported_languages": null}, "macro.dbt.post_snapshot": {"name": "post_snapshot", "resource_type": "macro", "package_name": "dbt", "path": "macros/materializations/snapshots/helpers.sql", "original_file_path": "macros/materializations/snapshots/helpers.sql", "unique_id": "macro.dbt.post_snapshot", "macro_sql": "{% macro post_snapshot(staging_relation) %}\n {{ adapter.dispatch('post_snapshot', 'dbt')(staging_relation) }}\n{% endmacro %}", "depends_on": {"macros": ["macro.dbt.default__post_snapshot"]}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973267.0704298, "supported_languages": null}, "macro.dbt.default__post_snapshot": {"name": "default__post_snapshot", "resource_type": "macro", "package_name": "dbt", "path": "macros/materializations/snapshots/helpers.sql", "original_file_path": "macros/materializations/snapshots/helpers.sql", "unique_id": "macro.dbt.default__post_snapshot", "macro_sql": "{% macro default__post_snapshot(staging_relation) %}\n {# no-op #}\n{% endmacro %}", "depends_on": {"macros": []}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973267.0706232, "supported_languages": null}, "macro.dbt.get_true_sql": {"name": "get_true_sql", "resource_type": "macro", "package_name": "dbt", "path": "macros/materializations/snapshots/helpers.sql", "original_file_path": "macros/materializations/snapshots/helpers.sql", "unique_id": "macro.dbt.get_true_sql", "macro_sql": "{% macro get_true_sql() %}\n {{ adapter.dispatch('get_true_sql', 'dbt')() }}\n{% endmacro %}", "depends_on": {"macros": ["macro.dbt.default__get_true_sql"]}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973267.0709233, "supported_languages": null}, "macro.dbt.default__get_true_sql": {"name": "default__get_true_sql", "resource_type": "macro", "package_name": "dbt", "path": "macros/materializations/snapshots/helpers.sql", "original_file_path": "macros/materializations/snapshots/helpers.sql", "unique_id": "macro.dbt.default__get_true_sql", "macro_sql": "{% macro default__get_true_sql() %}\n {{ return('TRUE') }}\n{% endmacro %}", "depends_on": {"macros": []}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973267.0711703, "supported_languages": null}, "macro.dbt.snapshot_staging_table": {"name": "snapshot_staging_table", "resource_type": "macro", "package_name": "dbt", "path": 
"macros/materializations/snapshots/helpers.sql", "original_file_path": "macros/materializations/snapshots/helpers.sql", "unique_id": "macro.dbt.snapshot_staging_table", "macro_sql": "{% macro snapshot_staging_table(strategy, source_sql, target_relation) -%}\n {{ adapter.dispatch('snapshot_staging_table', 'dbt')(strategy, source_sql, target_relation) }}\n{% endmacro %}", "depends_on": {"macros": ["macro.dbt.default__snapshot_staging_table"]}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973267.071581, "supported_languages": null}, "macro.dbt.default__snapshot_staging_table": {"name": "default__snapshot_staging_table", "resource_type": "macro", "package_name": "dbt", "path": "macros/materializations/snapshots/helpers.sql", "original_file_path": "macros/materializations/snapshots/helpers.sql", "unique_id": "macro.dbt.default__snapshot_staging_table", "macro_sql": "{% macro default__snapshot_staging_table(strategy, source_sql, target_relation) -%}\n\n with snapshot_query as (\n\n {{ source_sql }}\n\n ),\n\n snapshotted_data as (\n\n select *,\n {{ strategy.unique_key }} as dbt_unique_key\n\n from {{ target_relation }}\n where dbt_valid_to is null\n\n ),\n\n insertions_source_data as (\n\n select\n *,\n {{ strategy.unique_key }} as dbt_unique_key,\n {{ strategy.updated_at }} as dbt_updated_at,\n {{ strategy.updated_at }} as dbt_valid_from,\n nullif({{ strategy.updated_at }}, {{ strategy.updated_at }}) as dbt_valid_to,\n {{ strategy.scd_id }} as dbt_scd_id\n\n from snapshot_query\n ),\n\n updates_source_data as (\n\n select\n *,\n {{ strategy.unique_key }} as dbt_unique_key,\n {{ strategy.updated_at }} as dbt_updated_at,\n {{ strategy.updated_at }} as dbt_valid_from,\n {{ strategy.updated_at }} as dbt_valid_to\n\n from snapshot_query\n ),\n\n {%- if strategy.invalidate_hard_deletes %}\n\n deletes_source_data as (\n\n select\n *,\n {{ strategy.unique_key }} as dbt_unique_key\n from snapshot_query\n ),\n {% endif %}\n\n insertions as (\n\n select\n 'insert' as dbt_change_type,\n source_data.*\n\n from insertions_source_data as source_data\n left outer join snapshotted_data on snapshotted_data.dbt_unique_key = source_data.dbt_unique_key\n where snapshotted_data.dbt_unique_key is null\n or (\n snapshotted_data.dbt_unique_key is not null\n and (\n {{ strategy.row_changed }}\n )\n )\n\n ),\n\n updates as (\n\n select\n 'update' as dbt_change_type,\n source_data.*,\n snapshotted_data.dbt_scd_id\n\n from updates_source_data as source_data\n join snapshotted_data on snapshotted_data.dbt_unique_key = source_data.dbt_unique_key\n where (\n {{ strategy.row_changed }}\n )\n )\n\n {%- if strategy.invalidate_hard_deletes -%}\n ,\n\n deletes as (\n\n select\n 'delete' as dbt_change_type,\n source_data.*,\n {{ snapshot_get_time() }} as dbt_valid_from,\n {{ snapshot_get_time() }} as dbt_updated_at,\n {{ snapshot_get_time() }} as dbt_valid_to,\n snapshotted_data.dbt_scd_id\n\n from snapshotted_data\n left join deletes_source_data as source_data on snapshotted_data.dbt_unique_key = source_data.dbt_unique_key\n where source_data.dbt_unique_key is null\n )\n {%- endif %}\n\n select * from insertions\n union all\n select * from updates\n {%- if strategy.invalidate_hard_deletes %}\n union all\n select * from deletes\n {%- endif %}\n\n{%- endmacro %}", "depends_on": {"macros": ["macro.dbt.snapshot_get_time"]}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 
1701973267.073316, "supported_languages": null}, "macro.dbt.build_snapshot_table": {"name": "build_snapshot_table", "resource_type": "macro", "package_name": "dbt", "path": "macros/materializations/snapshots/helpers.sql", "original_file_path": "macros/materializations/snapshots/helpers.sql", "unique_id": "macro.dbt.build_snapshot_table", "macro_sql": "{% macro build_snapshot_table(strategy, sql) -%}\n {{ adapter.dispatch('build_snapshot_table', 'dbt')(strategy, sql) }}\n{% endmacro %}", "depends_on": {"macros": ["macro.dbt.default__build_snapshot_table"]}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973267.073698, "supported_languages": null}, "macro.dbt.default__build_snapshot_table": {"name": "default__build_snapshot_table", "resource_type": "macro", "package_name": "dbt", "path": "macros/materializations/snapshots/helpers.sql", "original_file_path": "macros/materializations/snapshots/helpers.sql", "unique_id": "macro.dbt.default__build_snapshot_table", "macro_sql": "{% macro default__build_snapshot_table(strategy, sql) %}\n\n select *,\n {{ strategy.scd_id }} as dbt_scd_id,\n {{ strategy.updated_at }} as dbt_updated_at,\n {{ strategy.updated_at }} as dbt_valid_from,\n nullif({{ strategy.updated_at }}, {{ strategy.updated_at }}) as dbt_valid_to\n from (\n {{ sql }}\n ) sbq\n\n{% endmacro %}", "depends_on": {"macros": []}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973267.0742087, "supported_languages": null}, "macro.dbt.build_snapshot_staging_table": {"name": "build_snapshot_staging_table", "resource_type": "macro", "package_name": "dbt", "path": "macros/materializations/snapshots/helpers.sql", "original_file_path": "macros/materializations/snapshots/helpers.sql", "unique_id": "macro.dbt.build_snapshot_staging_table", "macro_sql": "{% macro build_snapshot_staging_table(strategy, sql, target_relation) %}\n {% set temp_relation = make_temp_relation(target_relation) %}\n\n {% set select = snapshot_staging_table(strategy, sql, target_relation) %}\n\n {% call statement('build_snapshot_staging_relation') %}\n {{ create_table_as(True, temp_relation, select) }}\n {% endcall %}\n\n {% do return(temp_relation) %}\n{% endmacro %}", "depends_on": {"macros": ["macro.dbt.make_temp_relation", "macro.dbt.snapshot_staging_table", "macro.dbt.statement", "macro.dbt.create_table_as"]}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973267.0750532, "supported_languages": null}, "macro.dbt.materialization_snapshot_default": {"name": "materialization_snapshot_default", "resource_type": "macro", "package_name": "dbt", "path": "macros/materializations/snapshots/snapshot.sql", "original_file_path": "macros/materializations/snapshots/snapshot.sql", "unique_id": "macro.dbt.materialization_snapshot_default", "macro_sql": "{% materialization snapshot, default %}\n {%- set config = model['config'] -%}\n\n {%- set target_table = model.get('alias', model.get('name')) -%}\n\n {%- set strategy_name = config.get('strategy') -%}\n {%- set unique_key = config.get('unique_key') %}\n -- grab current tables grants config for comparision later on\n {%- set grant_config = config.get('grants') -%}\n\n {% set target_relation_exists, target_relation = get_or_create_relation(\n database=model.database,\n schema=model.schema,\n identifier=target_table,\n type='table') -%}\n\n {%- if 
not target_relation.is_table -%}\n {% do exceptions.relation_wrong_type(target_relation, 'table') %}\n {%- endif -%}\n\n\n {{ run_hooks(pre_hooks, inside_transaction=False) }}\n\n {{ run_hooks(pre_hooks, inside_transaction=True) }}\n\n {% set strategy_macro = strategy_dispatch(strategy_name) %}\n {% set strategy = strategy_macro(model, \"snapshotted_data\", \"source_data\", config, target_relation_exists) %}\n\n {% if not target_relation_exists %}\n\n {% set build_sql = build_snapshot_table(strategy, model['compiled_code']) %}\n {% set final_sql = create_table_as(False, target_relation, build_sql) %}\n\n {% else %}\n\n {{ adapter.valid_snapshot_target(target_relation) }}\n\n {% set staging_table = build_snapshot_staging_table(strategy, sql, target_relation) %}\n\n -- this may no-op if the database does not require column expansion\n {% do adapter.expand_target_column_types(from_relation=staging_table,\n to_relation=target_relation) %}\n\n {% set missing_columns = adapter.get_missing_columns(staging_table, target_relation)\n | rejectattr('name', 'equalto', 'dbt_change_type')\n | rejectattr('name', 'equalto', 'DBT_CHANGE_TYPE')\n | rejectattr('name', 'equalto', 'dbt_unique_key')\n | rejectattr('name', 'equalto', 'DBT_UNIQUE_KEY')\n | list %}\n\n {% do create_columns(target_relation, missing_columns) %}\n\n {% set source_columns = adapter.get_columns_in_relation(staging_table)\n | rejectattr('name', 'equalto', 'dbt_change_type')\n | rejectattr('name', 'equalto', 'DBT_CHANGE_TYPE')\n | rejectattr('name', 'equalto', 'dbt_unique_key')\n | rejectattr('name', 'equalto', 'DBT_UNIQUE_KEY')\n | list %}\n\n {% set quoted_source_columns = [] %}\n {% for column in source_columns %}\n {% do quoted_source_columns.append(adapter.quote(column.name)) %}\n {% endfor %}\n\n {% set final_sql = snapshot_merge_sql(\n target = target_relation,\n source = staging_table,\n insert_cols = quoted_source_columns\n )\n %}\n\n {% endif %}\n\n {% call statement('main') %}\n {{ final_sql }}\n {% endcall %}\n\n {% set should_revoke = should_revoke(target_relation_exists, full_refresh_mode=False) %}\n {% do apply_grants(target_relation, grant_config, should_revoke=should_revoke) %}\n\n {% do persist_docs(target_relation, model) %}\n\n {% if not target_relation_exists %}\n {% do create_indexes(target_relation) %}\n {% endif %}\n\n {{ run_hooks(post_hooks, inside_transaction=True) }}\n\n {{ adapter.commit() }}\n\n {% if staging_table is defined %}\n {% do post_snapshot(staging_table) %}\n {% endif %}\n\n {{ run_hooks(post_hooks, inside_transaction=False) }}\n\n {{ return({'relations': [target_relation]}) }}\n\n{% endmaterialization %}", "depends_on": {"macros": ["macro.dbt.get_or_create_relation", "macro.dbt.run_hooks", "macro.dbt.strategy_dispatch", "macro.dbt.build_snapshot_table", "macro.dbt.create_table_as", "macro.dbt.build_snapshot_staging_table", "macro.dbt.create_columns", "macro.dbt.snapshot_merge_sql", "macro.dbt.statement", "macro.dbt.should_revoke", "macro.dbt.apply_grants", "macro.dbt.persist_docs", "macro.dbt.create_indexes", "macro.dbt.post_snapshot"]}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973267.085442, "supported_languages": ["sql"]}, "macro.dbt.strategy_dispatch": {"name": "strategy_dispatch", "resource_type": "macro", "package_name": "dbt", "path": "macros/materializations/snapshots/strategies.sql", "original_file_path": "macros/materializations/snapshots/strategies.sql", "unique_id": "macro.dbt.strategy_dispatch", 
"macro_sql": "{% macro strategy_dispatch(name) -%}\n{% set original_name = name %}\n {% if '.' in name %}\n {% set package_name, name = name.split(\".\", 1) %}\n {% else %}\n {% set package_name = none %}\n {% endif %}\n\n {% if package_name is none %}\n {% set package_context = context %}\n {% elif package_name in context %}\n {% set package_context = context[package_name] %}\n {% else %}\n {% set error_msg %}\n Could not find package '{{package_name}}', called with '{{original_name}}'\n {% endset %}\n {{ exceptions.raise_compiler_error(error_msg | trim) }}\n {% endif %}\n\n {%- set search_name = 'snapshot_' ~ name ~ '_strategy' -%}\n\n {% if search_name not in package_context %}\n {% set error_msg %}\n The specified strategy macro '{{name}}' was not found in package '{{ package_name }}'\n {% endset %}\n {{ exceptions.raise_compiler_error(error_msg | trim) }}\n {% endif %}\n {{ return(package_context[search_name]) }}\n{%- endmacro %}", "depends_on": {"macros": []}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973267.0912397, "supported_languages": null}, "macro.dbt.snapshot_hash_arguments": {"name": "snapshot_hash_arguments", "resource_type": "macro", "package_name": "dbt", "path": "macros/materializations/snapshots/strategies.sql", "original_file_path": "macros/materializations/snapshots/strategies.sql", "unique_id": "macro.dbt.snapshot_hash_arguments", "macro_sql": "{% macro snapshot_hash_arguments(args) -%}\n {{ adapter.dispatch('snapshot_hash_arguments', 'dbt')(args) }}\n{%- endmacro %}", "depends_on": {"macros": ["macro.dbt.default__snapshot_hash_arguments"]}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973267.0915878, "supported_languages": null}, "macro.dbt.default__snapshot_hash_arguments": {"name": "default__snapshot_hash_arguments", "resource_type": "macro", "package_name": "dbt", "path": "macros/materializations/snapshots/strategies.sql", "original_file_path": "macros/materializations/snapshots/strategies.sql", "unique_id": "macro.dbt.default__snapshot_hash_arguments", "macro_sql": "{% macro default__snapshot_hash_arguments(args) -%}\n md5({%- for arg in args -%}\n coalesce(cast({{ arg }} as varchar ), '')\n {% if not loop.last %} || '|' || {% endif %}\n {%- endfor -%})\n{%- endmacro %}", "depends_on": {"macros": []}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973267.0920436, "supported_languages": null}, "macro.dbt.snapshot_timestamp_strategy": {"name": "snapshot_timestamp_strategy", "resource_type": "macro", "package_name": "dbt", "path": "macros/materializations/snapshots/strategies.sql", "original_file_path": "macros/materializations/snapshots/strategies.sql", "unique_id": "macro.dbt.snapshot_timestamp_strategy", "macro_sql": "{% macro snapshot_timestamp_strategy(node, snapshotted_rel, current_rel, config, target_exists) %}\n {% set primary_key = config['unique_key'] %}\n {% set updated_at = config['updated_at'] %}\n {% set invalidate_hard_deletes = config.get('invalidate_hard_deletes', false) %}\n\n {#/*\n The snapshot relation might not have an {{ updated_at }} value if the\n snapshot strategy is changed from `check` to `timestamp`. 
1701973267.1569493, "supported_languages": null}, "macro.dbt_utils.default__test_not_constant": {"name": "default__test_not_constant", "resource_type": "macro", "package_name": "dbt_utils", "path": "macros/generic_tests/not_constant.sql", "original_file_path": "macros/generic_tests/not_constant.sql", "unique_id": "macro.dbt_utils.default__test_not_constant", "macro_sql": "{% macro default__test_not_constant(model, column_name, group_by_columns) %}\n\n{% if group_by_columns|length() > 0 %}\n {% set select_gb_cols = group_by_columns|join(' ,') + ', ' %}\n {% set groupby_gb_cols = 'group by ' + group_by_columns|join(',') %}\n{% endif %}\n\n\nselect\n {# In TSQL, subquery aggregate columns need aliases #}\n {# thus: a filler col name, 'filler_column' #}\n {{select_gb_cols}}\n count(distinct {{ column_name }}) as filler_column\n\nfrom {{ model }}\n\n {{groupby_gb_cols}}\n\nhaving count(distinct {{ column_name }}) = 1\n\n\n{% endmacro %}", "depends_on": {"macros": []}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973267.1580985, "supported_languages": null}, "macro.dbt_utils.test_not_null_proportion": {"name": "test_not_null_proportion", "resource_type": "macro", "package_name": "dbt_utils", "path": "macros/generic_tests/not_null_proportion.sql", "original_file_path": "macros/generic_tests/not_null_proportion.sql", "unique_id": "macro.dbt_utils.test_not_null_proportion", "macro_sql": "{% macro test_not_null_proportion(model, group_by_columns = []) %}\n {{ return(adapter.dispatch('test_not_null_proportion', 'dbt_utils')(model, group_by_columns, **kwargs)) }}\n{% endmacro %}", "depends_on": {"macros": ["macro.dbt_utils.default__test_not_null_proportion"]}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973267.15921, "supported_languages": null}, "macro.dbt_utils.default__test_not_null_proportion": {"name": "default__test_not_null_proportion", "resource_type": "macro", "package_name": "dbt_utils", "path": "macros/generic_tests/not_null_proportion.sql", "original_file_path": "macros/generic_tests/not_null_proportion.sql", "unique_id": "macro.dbt_utils.default__test_not_null_proportion", "macro_sql": "{% macro default__test_not_null_proportion(model, group_by_columns) %}\n\n{% set column_name = kwargs.get('column_name', kwargs.get('arg')) %}\n{% set at_least = kwargs.get('at_least', kwargs.get('arg')) %}\n{% set at_most = kwargs.get('at_most', kwargs.get('arg', 1)) %}\n\n{% if group_by_columns|length() > 0 %}\n {% set select_gb_cols = group_by_columns|join(' ,') + ', ' %}\n {% set groupby_gb_cols = 'group by ' + group_by_columns|join(',') %}\n{% endif %}\n\nwith validation as (\n select\n {{select_gb_cols}}\n sum(case when {{ column_name }} is null then 0 else 1 end) / cast(count(*) as numeric) as not_null_proportion\n from {{ model }}\n {{groupby_gb_cols}}\n),\nvalidation_errors as (\n select\n {{select_gb_cols}}\n not_null_proportion\n from validation\n where not_null_proportion < {{ at_least }} or not_null_proportion > {{ at_most }}\n)\nselect\n *\nfrom validation_errors\n\n{% endmacro %}", "depends_on": {"macros": []}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973267.160901, "supported_languages": null}, "macro.dbt_utils.test_at_least_one": {"name": "test_at_least_one", "resource_type": "macro", "package_name": "dbt_utils", "path": 
"macros/generic_tests/at_least_one.sql", "original_file_path": "macros/generic_tests/at_least_one.sql", "unique_id": "macro.dbt_utils.test_at_least_one", "macro_sql": "{% test at_least_one(model, column_name, group_by_columns = []) %}\n {{ return(adapter.dispatch('test_at_least_one', 'dbt_utils')(model, column_name, group_by_columns)) }}\n{% endtest %}", "depends_on": {"macros": ["macro.dbt_utils.default__test_at_least_one"]}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973267.1617494, "supported_languages": null}, "macro.dbt_utils.default__test_at_least_one": {"name": "default__test_at_least_one", "resource_type": "macro", "package_name": "dbt_utils", "path": "macros/generic_tests/at_least_one.sql", "original_file_path": "macros/generic_tests/at_least_one.sql", "unique_id": "macro.dbt_utils.default__test_at_least_one", "macro_sql": "{% macro default__test_at_least_one(model, column_name, group_by_columns) %}\n\n{% if group_by_columns|length() > 0 %}\n {% set select_gb_cols = group_by_columns|join(' ,') + ', ' %}\n {% set groupby_gb_cols = 'group by ' + group_by_columns|join(',') %}\n{% endif %}\n\nselect *\nfrom (\n select\n {# In TSQL, subquery aggregate columns need aliases #}\n {# thus: a filler col name, 'filler_column' #}\n {{select_gb_cols}}\n count({{ column_name }}) as filler_column\n\n from {{ model }}\n\n {{groupby_gb_cols}}\n\n having count({{ column_name }}) = 0\n\n) validation_errors\n\n{% endmacro %}", "depends_on": {"macros": []}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973267.162693, "supported_languages": null}, "macro.dbt_utils.test_mutually_exclusive_ranges": {"name": "test_mutually_exclusive_ranges", "resource_type": "macro", "package_name": "dbt_utils", "path": "macros/generic_tests/mutually_exclusive_ranges.sql", "original_file_path": "macros/generic_tests/mutually_exclusive_ranges.sql", "unique_id": "macro.dbt_utils.test_mutually_exclusive_ranges", "macro_sql": "{% test mutually_exclusive_ranges(model, lower_bound_column, upper_bound_column, partition_by=None, gaps='allowed', zero_length_range_allowed=False) %}\n {{ return(adapter.dispatch('test_mutually_exclusive_ranges', 'dbt_utils')(model, lower_bound_column, upper_bound_column, partition_by, gaps, zero_length_range_allowed)) }}\n{% endtest %}", "depends_on": {"macros": ["macro.dbt_utils.default__test_mutually_exclusive_ranges"]}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973267.1675816, "supported_languages": null}, "macro.dbt_utils.default__test_mutually_exclusive_ranges": {"name": "default__test_mutually_exclusive_ranges", "resource_type": "macro", "package_name": "dbt_utils", "path": "macros/generic_tests/mutually_exclusive_ranges.sql", "original_file_path": "macros/generic_tests/mutually_exclusive_ranges.sql", "unique_id": "macro.dbt_utils.default__test_mutually_exclusive_ranges", "macro_sql": "{% macro default__test_mutually_exclusive_ranges(model, lower_bound_column, upper_bound_column, partition_by=None, gaps='allowed', zero_length_range_allowed=False) %}\n{% if gaps == 'not_allowed' %}\n {% set allow_gaps_operator='=' %}\n {% set allow_gaps_operator_in_words='equal_to' %}\n{% elif gaps == 'allowed' %}\n {% set allow_gaps_operator='<=' %}\n {% set allow_gaps_operator_in_words='less_than_or_equal_to' %}\n{% elif gaps == 'required' %}\n {% set 
allow_gaps_operator='<' %}\n {% set allow_gaps_operator_in_words='less_than' %}\n{% else %}\n {{ exceptions.raise_compiler_error(\n \"`gaps` argument for mutually_exclusive_ranges test must be one of ['not_allowed', 'allowed', 'required'] Got: '\" ~ gaps ~\"'.'\"\n ) }}\n{% endif %}\n{% if not zero_length_range_allowed %}\n {% set allow_zero_length_operator='<' %}\n {% set allow_zero_length_operator_in_words='less_than' %}\n{% elif zero_length_range_allowed %}\n {% set allow_zero_length_operator='<=' %}\n {% set allow_zero_length_operator_in_words='less_than_or_equal_to' %}\n{% else %}\n {{ exceptions.raise_compiler_error(\n \"`zero_length_range_allowed` argument for mutually_exclusive_ranges test must be one of [true, false] Got: '\" ~ zero_length_range_allowed ~\"'.'\"\n ) }}\n{% endif %}\n\n{% set partition_clause=\"partition by \" ~ partition_by if partition_by else '' %}\n\nwith window_functions as (\n\n select\n {% if partition_by %}\n {{ partition_by }} as partition_by_col,\n {% endif %}\n {{ lower_bound_column }} as lower_bound,\n {{ upper_bound_column }} as upper_bound,\n\n lead({{ lower_bound_column }}) over (\n {{ partition_clause }}\n order by {{ lower_bound_column }}, {{ upper_bound_column }}\n ) as next_lower_bound,\n\n row_number() over (\n {{ partition_clause }}\n order by {{ lower_bound_column }} desc, {{ upper_bound_column }} desc\n ) = 1 as is_last_record\n\n from {{ model }}\n\n),\n\ncalc as (\n -- We want to return records where one of our assumptions fails, so we'll use\n -- the `not` function with `and` statements so we can write our assumptions more cleanly\n select\n *,\n\n -- For each record: lower_bound should be < upper_bound.\n -- Coalesce it to return an error on the null case (implicit assumption\n -- these columns are not_null)\n coalesce(\n lower_bound {{ allow_zero_length_operator }} upper_bound,\n false\n ) as lower_bound_{{ allow_zero_length_operator_in_words }}_upper_bound,\n\n -- For each record: upper_bound {{ allow_gaps_operator }} the next lower_bound.\n -- Coalesce it to handle null cases for the last record.\n coalesce(\n upper_bound {{ allow_gaps_operator }} next_lower_bound,\n is_last_record,\n false\n ) as upper_bound_{{ allow_gaps_operator_in_words }}_next_lower_bound\n\n from window_functions\n\n),\n\nvalidation_errors as (\n\n select\n *\n from calc\n\n where not(\n -- THE FOLLOWING SHOULD BE TRUE --\n lower_bound_{{ allow_zero_length_operator_in_words }}_upper_bound\n and upper_bound_{{ allow_gaps_operator_in_words }}_next_lower_bound\n )\n)\n\nselect * from validation_errors\n{% endmacro %}", "depends_on": {"macros": []}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973267.1707463, "supported_languages": null}, "macro.dbt_utils.test_recency": {"name": "test_recency", "resource_type": "macro", "package_name": "dbt_utils", "path": "macros/generic_tests/recency.sql", "original_file_path": "macros/generic_tests/recency.sql", "unique_id": "macro.dbt_utils.test_recency", "macro_sql": "{% test recency(model, field, datepart, interval, ignore_time_component=False, group_by_columns = []) %}\n {{ return(adapter.dispatch('test_recency', 'dbt_utils')(model, field, datepart, interval, ignore_time_component, group_by_columns)) }}\n{% endtest %}", "depends_on": {"macros": ["macro.dbt_utils.default__test_recency"]}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973267.1720266, 
"supported_languages": null}, "macro.dbt_utils.default__test_recency": {"name": "default__test_recency", "resource_type": "macro", "package_name": "dbt_utils", "path": "macros/generic_tests/recency.sql", "original_file_path": "macros/generic_tests/recency.sql", "unique_id": "macro.dbt_utils.default__test_recency", "macro_sql": "{% macro default__test_recency(model, field, datepart, interval, ignore_time_component, group_by_columns) %}\n\n{% set threshold = 'cast(' ~ dbt.dateadd(datepart, interval * -1, dbt.current_timestamp()) ~ ' as ' ~ ('date' if ignore_time_component else dbt.type_timestamp()) ~ ')' %}\n\n{% if group_by_columns|length() > 0 %}\n {% set select_gb_cols = group_by_columns|join(' ,') + ', ' %}\n {% set groupby_gb_cols = 'group by ' + group_by_columns|join(',') %}\n{% endif %}\n\n\nwith recency as (\n\n select \n\n {{ select_gb_cols }}\n {% if ignore_time_component %}\n cast(max({{ field }}) as date) as most_recent\n {%- else %}\n max({{ field }}) as most_recent\n {%- endif %}\n\n from {{ model }}\n\n {{ groupby_gb_cols }}\n\n)\n\nselect\n\n {{ select_gb_cols }}\n most_recent,\n {{ threshold }} as threshold\n\nfrom recency\nwhere most_recent < {{ threshold }}\n\n{% endmacro %}", "depends_on": {"macros": ["macro.dbt.dateadd", "macro.dbt.current_timestamp", "macro.dbt.type_timestamp"]}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973267.1737134, "supported_languages": null}, "macro.dbt_utils.test_fewer_rows_than": {"name": "test_fewer_rows_than", "resource_type": "macro", "package_name": "dbt_utils", "path": "macros/generic_tests/fewer_rows_than.sql", "original_file_path": "macros/generic_tests/fewer_rows_than.sql", "unique_id": "macro.dbt_utils.test_fewer_rows_than", "macro_sql": "{% test fewer_rows_than(model, compare_model, group_by_columns = []) %}\n {{ return(adapter.dispatch('test_fewer_rows_than', 'dbt_utils')(model, compare_model, group_by_columns)) }}\n{% endtest %}", "depends_on": {"macros": ["macro.dbt_utils.default__test_fewer_rows_than"]}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973267.1754024, "supported_languages": null}, "macro.dbt_utils.default__test_fewer_rows_than": {"name": "default__test_fewer_rows_than", "resource_type": "macro", "package_name": "dbt_utils", "path": "macros/generic_tests/fewer_rows_than.sql", "original_file_path": "macros/generic_tests/fewer_rows_than.sql", "unique_id": "macro.dbt_utils.default__test_fewer_rows_than", "macro_sql": "{% macro default__test_fewer_rows_than(model, compare_model, group_by_columns) %}\n\n{{ config(fail_calc = 'sum(coalesce(row_count_delta, 0))') }}\n\n{% if group_by_columns|length() > 0 %}\n {% set select_gb_cols = group_by_columns|join(' ,') + ', ' %}\n {% set join_gb_cols %}\n {% for c in group_by_columns %}\n and a.{{c}} = b.{{c}}\n {% endfor %}\n {% endset %}\n {% set groupby_gb_cols = 'group by ' + group_by_columns|join(',') %}\n{% endif %}\n\n{#-- We must add a fake join key in case additional grouping variables are not provided --#}\n{#-- Redshift does not allow for dynamically created join conditions (e.g. full join on 1 = 1 --#}\n{#-- The same logic is used in equal_rowcount. 
In case of changes, maintain consistent logic --#}\n{% set group_by_columns = ['id_dbtutils_test_fewer_rows_than'] + group_by_columns %}\n{% set groupby_gb_cols = 'group by ' + group_by_columns|join(',') %}\n\n\nwith a as (\n\n select \n {{select_gb_cols}}\n 1 as id_dbtutils_test_fewer_rows_than,\n count(*) as count_our_model \n from {{ model }}\n {{ groupby_gb_cols }}\n\n),\nb as (\n\n select \n {{select_gb_cols}}\n 1 as id_dbtutils_test_fewer_rows_than,\n count(*) as count_comparison_model \n from {{ compare_model }}\n {{ groupby_gb_cols }}\n\n),\ncounts as (\n\n select\n\n {% for c in group_by_columns -%}\n a.{{c}} as {{c}}_a,\n b.{{c}} as {{c}}_b,\n {% endfor %}\n\n count_our_model,\n count_comparison_model\n from a\n full join b on \n a.id_dbtutils_test_fewer_rows_than = b.id_dbtutils_test_fewer_rows_than\n {{ join_gb_cols }}\n\n),\nfinal as (\n\n select *,\n case\n -- fail the test if we have more rows than the reference model and return the row count delta\n when count_our_model > count_comparison_model then (count_our_model - count_comparison_model)\n -- fail the test if they are the same number\n when count_our_model = count_comparison_model then 1\n -- pass the test if the delta is positive (i.e. return the number 0)\n else 0\n end as row_count_delta\n from counts\n\n)\n\nselect * from final\n\n{% endmacro %}", "depends_on": {"macros": []}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973267.1774251, "supported_languages": null}, "macro.dbt_utils.test_expression_is_true": {"name": "test_expression_is_true", "resource_type": "macro", "package_name": "dbt_utils", "path": "macros/generic_tests/expression_is_true.sql", "original_file_path": "macros/generic_tests/expression_is_true.sql", "unique_id": "macro.dbt_utils.test_expression_is_true", "macro_sql": "{% test expression_is_true(model, expression, column_name=None) %}\n {{ return(adapter.dispatch('test_expression_is_true', 'dbt_utils')(model, expression, column_name)) }}\n{% endtest %}", "depends_on": {"macros": ["macro.dbt_utils.default__test_expression_is_true"]}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973267.178191, "supported_languages": null}, "macro.dbt_utils.default__test_expression_is_true": {"name": "default__test_expression_is_true", "resource_type": "macro", "package_name": "dbt_utils", "path": "macros/generic_tests/expression_is_true.sql", "original_file_path": "macros/generic_tests/expression_is_true.sql", "unique_id": "macro.dbt_utils.default__test_expression_is_true", "macro_sql": "{% macro default__test_expression_is_true(model, expression, column_name) %}\n\n{% set column_list = '*' if should_store_failures() else \"1\" %}\n\nselect\n {{ column_list }}\nfrom {{ model }}\n{% if column_name is none %}\nwhere not({{ expression }})\n{%- else %}\nwhere not({{ column_name }} {{ expression }})\n{%- endif %}\n\n{% endmacro %}", "depends_on": {"macros": ["macro.dbt.should_store_failures"]}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973267.1789234, "supported_languages": null}, "macro.dbt_utils.get_url_parameter": {"name": "get_url_parameter", "resource_type": "macro", "package_name": "dbt_utils", "path": "macros/web/get_url_parameter.sql", "original_file_path": "macros/web/get_url_parameter.sql", "unique_id": "macro.dbt_utils.get_url_parameter", "macro_sql": "{% 
macro get_url_parameter(field, url_parameter) -%}\n {{ return(adapter.dispatch('get_url_parameter', 'dbt_utils')(field, url_parameter)) }}\n{% endmacro %}", "depends_on": {"macros": ["macro.dbt_utils.default__get_url_parameter"]}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973267.1796854, "supported_languages": null}, "macro.dbt_utils.default__get_url_parameter": {"name": "default__get_url_parameter", "resource_type": "macro", "package_name": "dbt_utils", "path": "macros/web/get_url_parameter.sql", "original_file_path": "macros/web/get_url_parameter.sql", "unique_id": "macro.dbt_utils.default__get_url_parameter", "macro_sql": "{% macro default__get_url_parameter(field, url_parameter) -%}\n\n{%- set formatted_url_parameter = \"'\" + url_parameter + \"='\" -%}\n\n{%- set split = dbt.split_part(dbt.split_part(field, formatted_url_parameter, 2), \"'&'\", 1) -%}\n\nnullif({{ split }},'')\n\n{%- endmacro %}", "depends_on": {"macros": ["macro.dbt.split_part"]}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973267.1803515, "supported_languages": null}, "macro.dbt_utils.get_url_path": {"name": "get_url_path", "resource_type": "macro", "package_name": "dbt_utils", "path": "macros/web/get_url_path.sql", "original_file_path": "macros/web/get_url_path.sql", "unique_id": "macro.dbt_utils.get_url_path", "macro_sql": "{% macro get_url_path(field) -%}\n {{ return(adapter.dispatch('get_url_path', 'dbt_utils')(field)) }}\n{% endmacro %}", "depends_on": {"macros": ["macro.dbt_utils.default__get_url_path"]}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973267.1811907, "supported_languages": null}, "macro.dbt_utils.default__get_url_path": {"name": "default__get_url_path", "resource_type": "macro", "package_name": "dbt_utils", "path": "macros/web/get_url_path.sql", "original_file_path": "macros/web/get_url_path.sql", "unique_id": "macro.dbt_utils.default__get_url_path", "macro_sql": "{% macro default__get_url_path(field) -%}\n\n {%- set stripped_url =\n dbt.replace(\n dbt.replace(field, \"'http://'\", \"''\"), \"'https://'\", \"''\")\n -%}\n\n {%- set first_slash_pos -%}\n coalesce(\n nullif({{ dbt.position(\"'/'\", stripped_url) }}, 0),\n {{ dbt.position(\"'?'\", stripped_url) }} - 1\n )\n {%- endset -%}\n\n {%- set parsed_path =\n dbt.split_part(\n dbt.right(\n stripped_url,\n dbt.length(stripped_url) ~ \"-\" ~ first_slash_pos\n ),\n \"'?'\", 1\n )\n -%}\n\n {{ dbt.safe_cast(\n parsed_path,\n dbt.type_string()\n )}}\n\n{%- endmacro %}", "depends_on": {"macros": ["macro.dbt.replace", "macro.dbt.position", "macro.dbt.split_part", "macro.dbt.right", "macro.dbt.length", "macro.dbt.safe_cast", "macro.dbt.type_string"]}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973267.1826138, "supported_languages": null}, "macro.dbt_utils.get_url_host": {"name": "get_url_host", "resource_type": "macro", "package_name": "dbt_utils", "path": "macros/web/get_url_host.sql", "original_file_path": "macros/web/get_url_host.sql", "unique_id": "macro.dbt_utils.get_url_host", "macro_sql": "{% macro get_url_host(field) -%}\n {{ return(adapter.dispatch('get_url_host', 'dbt_utils')(field)) }}\n{% endmacro %}", "depends_on": {"macros": ["macro.dbt_utils.default__get_url_host"]}, "description": "", "meta": 
{}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973267.1832542, "supported_languages": null}, "macro.dbt_utils.default__get_url_host": {"name": "default__get_url_host", "resource_type": "macro", "package_name": "dbt_utils", "path": "macros/web/get_url_host.sql", "original_file_path": "macros/web/get_url_host.sql", "unique_id": "macro.dbt_utils.default__get_url_host", "macro_sql": "{% macro default__get_url_host(field) -%}\n\n{%- set parsed =\n dbt.split_part(\n dbt.split_part(\n dbt.replace(\n dbt.replace(\n dbt.replace(field, \"'android-app://'\", \"''\"\n ), \"'http://'\", \"''\"\n ), \"'https://'\", \"''\"\n ), \"'/'\", 1\n ), \"'?'\", 1\n )\n\n-%}\n\n\n {{ dbt.safe_cast(\n parsed,\n dbt.type_string()\n )}}\n\n{%- endmacro %}", "depends_on": {"macros": ["macro.dbt.split_part", "macro.dbt.replace", "macro.dbt.safe_cast", "macro.dbt.type_string"]}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973267.1841743, "supported_languages": null}, "macro.dbt_utils.nullcheck": {"name": "nullcheck", "resource_type": "macro", "package_name": "dbt_utils", "path": "macros/sql/nullcheck.sql", "original_file_path": "macros/sql/nullcheck.sql", "unique_id": "macro.dbt_utils.nullcheck", "macro_sql": "{% macro nullcheck(cols) %}\n {{ return(adapter.dispatch('nullcheck', 'dbt_utils')(cols)) }}\n{% endmacro %}", "depends_on": {"macros": ["macro.dbt_utils.default__nullcheck"]}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973267.1848714, "supported_languages": null}, "macro.dbt_utils.default__nullcheck": {"name": "default__nullcheck", "resource_type": "macro", "package_name": "dbt_utils", "path": "macros/sql/nullcheck.sql", "original_file_path": "macros/sql/nullcheck.sql", "unique_id": "macro.dbt_utils.default__nullcheck", "macro_sql": "{% macro default__nullcheck(cols) %}\n{%- for col in cols %}\n\n {% if col.is_string() -%}\n\n nullif({{col.name}},'') as {{col.name}}\n\n {%- else -%}\n\n {{col.name}}\n\n {%- endif -%}\n\n{%- if not loop.last -%} , {%- endif -%}\n\n{%- endfor -%}\n{% endmacro %}", "depends_on": {"macros": []}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973267.1855648, "supported_languages": null}, "macro.dbt_utils.get_relations_by_pattern": {"name": "get_relations_by_pattern", "resource_type": "macro", "package_name": "dbt_utils", "path": "macros/sql/get_relations_by_pattern.sql", "original_file_path": "macros/sql/get_relations_by_pattern.sql", "unique_id": "macro.dbt_utils.get_relations_by_pattern", "macro_sql": "{% macro get_relations_by_pattern(schema_pattern, table_pattern, exclude='', database=target.database) %}\n {{ return(adapter.dispatch('get_relations_by_pattern', 'dbt_utils')(schema_pattern, table_pattern, exclude, database)) }}\n{% endmacro %}", "depends_on": {"macros": ["macro.dbt_utils.default__get_relations_by_pattern"]}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973267.1867008, "supported_languages": null}, "macro.dbt_utils.default__get_relations_by_pattern": {"name": "default__get_relations_by_pattern", "resource_type": "macro", "package_name": "dbt_utils", "path": "macros/sql/get_relations_by_pattern.sql", "original_file_path": "macros/sql/get_relations_by_pattern.sql", "unique_id": 
"macro.dbt_utils.default__get_relations_by_pattern", "macro_sql": "{% macro default__get_relations_by_pattern(schema_pattern, table_pattern, exclude='', database=target.database) %}\n\n {%- call statement('get_tables', fetch_result=True) %}\n\n {{ dbt_utils.get_tables_by_pattern_sql(schema_pattern, table_pattern, exclude, database) }}\n\n {%- endcall -%}\n\n {%- set table_list = load_result('get_tables') -%}\n\n {%- if table_list and table_list['table'] -%}\n {%- set tbl_relations = [] -%}\n {%- for row in table_list['table'] -%}\n {%- set tbl_relation = api.Relation.create(\n database=database,\n schema=row.table_schema,\n identifier=row.table_name,\n type=row.table_type\n ) -%}\n {%- do tbl_relations.append(tbl_relation) -%}\n {%- endfor -%}\n\n {{ return(tbl_relations) }}\n {%- else -%}\n {{ return([]) }}\n {%- endif -%}\n\n{% endmacro %}", "depends_on": {"macros": ["macro.dbt.statement", "macro.dbt_utils.get_tables_by_pattern_sql"]}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973267.1884189, "supported_languages": null}, "macro.dbt_utils.generate_surrogate_key": {"name": "generate_surrogate_key", "resource_type": "macro", "package_name": "dbt_utils", "path": "macros/sql/generate_surrogate_key.sql", "original_file_path": "macros/sql/generate_surrogate_key.sql", "unique_id": "macro.dbt_utils.generate_surrogate_key", "macro_sql": "{%- macro generate_surrogate_key(field_list) -%}\n {{ return(adapter.dispatch('generate_surrogate_key', 'dbt_utils')(field_list)) }}\n{% endmacro %}", "depends_on": {"macros": ["macro.dbt_utils.default__generate_surrogate_key"]}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973267.1893106, "supported_languages": null}, "macro.dbt_utils.default__generate_surrogate_key": {"name": "default__generate_surrogate_key", "resource_type": "macro", "package_name": "dbt_utils", "path": "macros/sql/generate_surrogate_key.sql", "original_file_path": "macros/sql/generate_surrogate_key.sql", "unique_id": "macro.dbt_utils.default__generate_surrogate_key", "macro_sql": "\n\n{%- macro default__generate_surrogate_key(field_list) -%}\n\n{% if var('surrogate_key_treat_nulls_as_empty_strings', False) %}\n {% set default_null_value = \"\" %}\n{% else %}\n {% set default_null_value = '_dbt_utils_surrogate_key_null_'%}\n{% endif %}\n\n{%- set fields = [] -%}\n\n{%- for field in field_list -%}\n\n {%- do fields.append(\n \"coalesce(cast(\" ~ field ~ \" as \" ~ dbt.type_string() ~ \"), '\" ~ default_null_value ~\"')\"\n ) -%}\n\n {%- if not loop.last %}\n {%- do fields.append(\"'-'\") -%}\n {%- endif -%}\n\n{%- endfor -%}\n\n{{ dbt.hash(dbt.concat(fields)) }}\n\n{%- endmacro -%}", "depends_on": {"macros": ["macro.dbt.type_string", "macro.dbt.hash", "macro.dbt.concat"]}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973267.1906471, "supported_languages": null}, "macro.dbt_utils.get_single_value": {"name": "get_single_value", "resource_type": "macro", "package_name": "dbt_utils", "path": "macros/sql/get_single_value.sql", "original_file_path": "macros/sql/get_single_value.sql", "unique_id": "macro.dbt_utils.get_single_value", "macro_sql": "{% macro get_single_value(query, default=none) %}\n {{ return(adapter.dispatch('get_single_value', 'dbt_utils')(query, default)) }}\n{% endmacro %}", "depends_on": {"macros": 
["macro.dbt_utils.default__get_single_value"]}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973267.1915765, "supported_languages": null}, "macro.dbt_utils.default__get_single_value": {"name": "default__get_single_value", "resource_type": "macro", "package_name": "dbt_utils", "path": "macros/sql/get_single_value.sql", "original_file_path": "macros/sql/get_single_value.sql", "unique_id": "macro.dbt_utils.default__get_single_value", "macro_sql": "{% macro default__get_single_value(query, default) %}\n\n{# This macro returns the (0, 0) record in a query, i.e. the first row of the first column #}\n\n {%- call statement('get_query_result', fetch_result=True, auto_begin=false) -%}\n\n {{ query }}\n\n {%- endcall -%}\n\n {%- if execute -%}\n\n {% set r = load_result('get_query_result').table.columns[0].values() %}\n {% if r | length == 0 %}\n {% do print('Query `' ~ query ~ '` returned no rows. Using the default value: ' ~ default) %}\n {% set sql_result = default %}\n {% else %}\n {% set sql_result = r[0] %}\n {% endif %}\n \n {%- else -%}\n \n {% set sql_result = default %}\n \n {%- endif -%}\n\n {% do return(sql_result) %}\n\n{% endmacro %}", "depends_on": {"macros": ["macro.dbt.statement"]}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973267.1930912, "supported_languages": null}, "macro.dbt_utils.safe_divide": {"name": "safe_divide", "resource_type": "macro", "package_name": "dbt_utils", "path": "macros/sql/safe_divide.sql", "original_file_path": "macros/sql/safe_divide.sql", "unique_id": "macro.dbt_utils.safe_divide", "macro_sql": "{% macro safe_divide(numerator, denominator) -%}\n {{ return(adapter.dispatch('safe_divide', 'dbt_utils')(numerator, denominator)) }}\n{%- endmacro %}", "depends_on": {"macros": ["macro.dbt_utils.default__safe_divide"]}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973267.1936524, "supported_languages": null}, "macro.dbt_utils.default__safe_divide": {"name": "default__safe_divide", "resource_type": "macro", "package_name": "dbt_utils", "path": "macros/sql/safe_divide.sql", "original_file_path": "macros/sql/safe_divide.sql", "unique_id": "macro.dbt_utils.default__safe_divide", "macro_sql": "{% macro default__safe_divide(numerator, denominator) %}\n ( {{ numerator }} ) / nullif( ( {{ denominator }} ), 0)\n{% endmacro %}", "depends_on": {"macros": []}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973267.1939337, "supported_languages": null}, "macro.dbt_utils.group_by": {"name": "group_by", "resource_type": "macro", "package_name": "dbt_utils", "path": "macros/sql/groupby.sql", "original_file_path": "macros/sql/groupby.sql", "unique_id": "macro.dbt_utils.group_by", "macro_sql": "{%- macro group_by(n) -%}\n {{ return(adapter.dispatch('group_by', 'dbt_utils')(n)) }}\n{% endmacro %}", "depends_on": {"macros": ["macro.dbt_utils.default__group_by"]}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973267.194469, "supported_languages": null}, "macro.dbt_utils.default__group_by": {"name": "default__group_by", "resource_type": "macro", "package_name": "dbt_utils", "path": "macros/sql/groupby.sql", "original_file_path": "macros/sql/groupby.sql", "unique_id": 
"macro.dbt_utils.default__group_by", "macro_sql": "\n\n{%- macro default__group_by(n) -%}\n\n group by {% for i in range(1, n + 1) -%}\n {{ i }}{{ ',' if not loop.last }} \n {%- endfor -%}\n\n{%- endmacro -%}", "depends_on": {"macros": []}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973267.194974, "supported_languages": null}, "macro.dbt_utils.get_intervals_between": {"name": "get_intervals_between", "resource_type": "macro", "package_name": "dbt_utils", "path": "macros/sql/date_spine.sql", "original_file_path": "macros/sql/date_spine.sql", "unique_id": "macro.dbt_utils.get_intervals_between", "macro_sql": "{% macro get_intervals_between(start_date, end_date, datepart) -%}\n {{ return(adapter.dispatch('get_intervals_between', 'dbt_utils')(start_date, end_date, datepart)) }}\n{%- endmacro %}", "depends_on": {"macros": ["macro.dbt_utils.default__get_intervals_between"]}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973267.1961384, "supported_languages": null}, "macro.dbt_utils.default__get_intervals_between": {"name": "default__get_intervals_between", "resource_type": "macro", "package_name": "dbt_utils", "path": "macros/sql/date_spine.sql", "original_file_path": "macros/sql/date_spine.sql", "unique_id": "macro.dbt_utils.default__get_intervals_between", "macro_sql": "{% macro default__get_intervals_between(start_date, end_date, datepart) -%}\n {%- call statement('get_intervals_between', fetch_result=True) %}\n\n select {{ dbt.datediff(start_date, end_date, datepart) }}\n\n {%- endcall -%}\n\n {%- set value_list = load_result('get_intervals_between') -%}\n\n {%- if value_list and value_list['data'] -%}\n {%- set values = value_list['data'] | map(attribute=0) | list %}\n {{ return(values[0]) }}\n {%- else -%}\n {{ return(1) }}\n {%- endif -%}\n\n{%- endmacro %}", "depends_on": {"macros": ["macro.dbt.statement", "macro.dbt.datediff"]}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973267.197479, "supported_languages": null}, "macro.dbt_utils.date_spine": {"name": "date_spine", "resource_type": "macro", "package_name": "dbt_utils", "path": "macros/sql/date_spine.sql", "original_file_path": "macros/sql/date_spine.sql", "unique_id": "macro.dbt_utils.date_spine", "macro_sql": "{% macro date_spine(datepart, start_date, end_date) %}\n {{ return(adapter.dispatch('date_spine', 'dbt_utils')(datepart, start_date, end_date)) }}\n{%- endmacro %}", "depends_on": {"macros": ["macro.dbt_utils.default__date_spine"]}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973267.1979504, "supported_languages": null}, "macro.dbt_utils.default__date_spine": {"name": "default__date_spine", "resource_type": "macro", "package_name": "dbt_utils", "path": "macros/sql/date_spine.sql", "original_file_path": "macros/sql/date_spine.sql", "unique_id": "macro.dbt_utils.default__date_spine", "macro_sql": "{% macro default__date_spine(datepart, start_date, end_date) %}\n\n\n{# call as follows:\n\ndate_spine(\n \"day\",\n \"to_date('01/01/2016', 'mm/dd/yyyy')\",\n \"dbt.dateadd(week, 1, current_date)\"\n) #}\n\n\nwith rawdata as (\n\n {{dbt_utils.generate_series(\n dbt_utils.get_intervals_between(start_date, end_date, datepart)\n )}}\n\n),\n\nall_periods as (\n\n select (\n {{\n dbt.dateadd(\n 
datepart,\n \"row_number() over (order by 1) - 1\",\n start_date\n )\n }}\n ) as date_{{datepart}}\n from rawdata\n\n),\n\nfiltered as (\n\n select *\n from all_periods\n where date_{{datepart}} <= {{ end_date }}\n\n)\n\nselect * from filtered\n\n{% endmacro %}", "depends_on": {"macros": ["macro.dbt_utils.generate_series", "macro.dbt_utils.get_intervals_between", "macro.dbt.dateadd"]}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973267.1986644, "supported_languages": null}, "macro.dbt_utils.get_table_types_sql": {"name": "get_table_types_sql", "resource_type": "macro", "package_name": "dbt_utils", "path": "macros/sql/get_table_types_sql.sql", "original_file_path": "macros/sql/get_table_types_sql.sql", "unique_id": "macro.dbt_utils.get_table_types_sql", "macro_sql": "{%- macro get_table_types_sql() -%}\n {{ return(adapter.dispatch('get_table_types_sql', 'dbt_utils')()) }}\n{%- endmacro -%}\n\n", "depends_on": {"macros": ["macro.dbt_utils.default__get_table_types_sql"]}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973267.1993337, "supported_languages": null}, "macro.dbt_utils.default__get_table_types_sql": {"name": "default__get_table_types_sql", "resource_type": "macro", "package_name": "dbt_utils", "path": "macros/sql/get_table_types_sql.sql", "original_file_path": "macros/sql/get_table_types_sql.sql", "unique_id": "macro.dbt_utils.default__get_table_types_sql", "macro_sql": "{% macro default__get_table_types_sql() %}\n case table_type\n when 'BASE TABLE' then 'table'\n when 'EXTERNAL TABLE' then 'external'\n when 'MATERIALIZED VIEW' then 'materializedview'\n else lower(table_type)\n end as {{ adapter.quote('table_type') }}\n{% endmacro %}", "depends_on": {"macros": []}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973267.1996117, "supported_languages": null}, "macro.dbt_utils.postgres__get_table_types_sql": {"name": "postgres__get_table_types_sql", "resource_type": "macro", "package_name": "dbt_utils", "path": "macros/sql/get_table_types_sql.sql", "original_file_path": "macros/sql/get_table_types_sql.sql", "unique_id": "macro.dbt_utils.postgres__get_table_types_sql", "macro_sql": "{% macro postgres__get_table_types_sql() %}\n case table_type\n when 'BASE TABLE' then 'table'\n when 'FOREIGN' then 'external'\n when 'MATERIALIZED VIEW' then 'materializedview'\n else lower(table_type)\n end as {{ adapter.quote('table_type') }}\n{% endmacro %}", "depends_on": {"macros": []}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973267.1998994, "supported_languages": null}, "macro.dbt_utils.deduplicate": {"name": "deduplicate", "resource_type": "macro", "package_name": "dbt_utils", "path": "macros/sql/deduplicate.sql", "original_file_path": "macros/sql/deduplicate.sql", "unique_id": "macro.dbt_utils.deduplicate", "macro_sql": "{%- macro deduplicate(relation, partition_by, order_by) -%}\n {{ return(adapter.dispatch('deduplicate', 'dbt_utils')(relation, partition_by, order_by)) }}\n{% endmacro %}", "depends_on": {"macros": ["macro.dbt_utils.snowflake__deduplicate"]}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973267.2011452, "supported_languages": null}, 
"macro.dbt_utils.default__deduplicate": {"name": "default__deduplicate", "resource_type": "macro", "package_name": "dbt_utils", "path": "macros/sql/deduplicate.sql", "original_file_path": "macros/sql/deduplicate.sql", "unique_id": "macro.dbt_utils.default__deduplicate", "macro_sql": "\n\n{%- macro default__deduplicate(relation, partition_by, order_by) -%}\n\n with row_numbered as (\n select\n _inner.*,\n row_number() over (\n partition by {{ partition_by }}\n order by {{ order_by }}\n ) as rn\n from {{ relation }} as _inner\n )\n\n select\n distinct data.*\n from {{ relation }} as data\n {#\n -- Not all DBs will support natural joins but the ones that do include:\n -- Oracle, MySQL, SQLite, Redshift, Teradata, Materialize, Databricks\n -- Apache Spark, SingleStore, Vertica\n -- Those that do not appear to support natural joins include:\n -- SQLServer, Trino, Presto, Rockset, Athena\n #}\n natural join row_numbered\n where row_numbered.rn = 1\n\n{%- endmacro -%}\n\n", "depends_on": {"macros": []}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973267.2015805, "supported_languages": null}, "macro.dbt_utils.redshift__deduplicate": {"name": "redshift__deduplicate", "resource_type": "macro", "package_name": "dbt_utils", "path": "macros/sql/deduplicate.sql", "original_file_path": "macros/sql/deduplicate.sql", "unique_id": "macro.dbt_utils.redshift__deduplicate", "macro_sql": "{% macro redshift__deduplicate(relation, partition_by, order_by) -%}\n\n {{ return(dbt_utils.default__deduplicate(relation, partition_by, order_by=order_by)) }}\n\n{% endmacro %}", "depends_on": {"macros": ["macro.dbt_utils.default__deduplicate"]}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973267.2019827, "supported_languages": null}, "macro.dbt_utils.postgres__deduplicate": {"name": "postgres__deduplicate", "resource_type": "macro", "package_name": "dbt_utils", "path": "macros/sql/deduplicate.sql", "original_file_path": "macros/sql/deduplicate.sql", "unique_id": "macro.dbt_utils.postgres__deduplicate", "macro_sql": "\n{%- macro postgres__deduplicate(relation, partition_by, order_by) -%}\n\n select\n distinct on ({{ partition_by }}) *\n from {{ relation }}\n order by {{ partition_by }}{{ ',' ~ order_by }}\n\n{%- endmacro -%}\n\n", "depends_on": {"macros": []}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973267.202375, "supported_languages": null}, "macro.dbt_utils.snowflake__deduplicate": {"name": "snowflake__deduplicate", "resource_type": "macro", "package_name": "dbt_utils", "path": "macros/sql/deduplicate.sql", "original_file_path": "macros/sql/deduplicate.sql", "unique_id": "macro.dbt_utils.snowflake__deduplicate", "macro_sql": "\n{%- macro snowflake__deduplicate(relation, partition_by, order_by) -%}\n\n select *\n from {{ relation }}\n qualify\n row_number() over (\n partition by {{ partition_by }}\n order by {{ order_by }}\n ) = 1\n\n{%- endmacro -%}\n\n", "depends_on": {"macros": []}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973267.202713, "supported_languages": null}, "macro.dbt_utils.bigquery__deduplicate": {"name": "bigquery__deduplicate", "resource_type": "macro", "package_name": "dbt_utils", "path": "macros/sql/deduplicate.sql", "original_file_path": 
"macros/sql/deduplicate.sql", "unique_id": "macro.dbt_utils.bigquery__deduplicate", "macro_sql": "\n{%- macro bigquery__deduplicate(relation, partition_by, order_by) -%}\n\n select unique.*\n from (\n select\n array_agg (\n original\n order by {{ order_by }}\n limit 1\n )[offset(0)] unique\n from {{ relation }} original\n group by {{ partition_by }}\n )\n\n{%- endmacro -%}", "depends_on": {"macros": []}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973267.2030628, "supported_languages": null}, "macro.dbt_utils.degrees_to_radians": {"name": "degrees_to_radians", "resource_type": "macro", "package_name": "dbt_utils", "path": "macros/sql/haversine_distance.sql", "original_file_path": "macros/sql/haversine_distance.sql", "unique_id": "macro.dbt_utils.degrees_to_radians", "macro_sql": "{% macro degrees_to_radians(degrees) -%}\n acos(-1) * {{degrees}} / 180\n{%- endmacro %}", "depends_on": {"macros": []}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973267.2047243, "supported_languages": null}, "macro.dbt_utils.haversine_distance": {"name": "haversine_distance", "resource_type": "macro", "package_name": "dbt_utils", "path": "macros/sql/haversine_distance.sql", "original_file_path": "macros/sql/haversine_distance.sql", "unique_id": "macro.dbt_utils.haversine_distance", "macro_sql": "{% macro haversine_distance(lat1, lon1, lat2, lon2, unit='mi') -%}\n {{ return(adapter.dispatch('haversine_distance', 'dbt_utils')(lat1,lon1,lat2,lon2,unit)) }}\n{% endmacro %}", "depends_on": {"macros": ["macro.dbt_utils.default__haversine_distance"]}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973267.205282, "supported_languages": null}, "macro.dbt_utils.default__haversine_distance": {"name": "default__haversine_distance", "resource_type": "macro", "package_name": "dbt_utils", "path": "macros/sql/haversine_distance.sql", "original_file_path": "macros/sql/haversine_distance.sql", "unique_id": "macro.dbt_utils.default__haversine_distance", "macro_sql": "{% macro default__haversine_distance(lat1, lon1, lat2, lon2, unit='mi') -%}\n{%- if unit == 'mi' %}\n {% set conversion_rate = 1 %}\n{% elif unit == 'km' %}\n {% set conversion_rate = 1.60934 %}\n{% else %}\n {{ exceptions.raise_compiler_error(\"unit input must be one of 'mi' or 'km'. 
Got \" ~ unit) }}\n{% endif %}\n\n 2 * 3961 * asin(sqrt(power((sin(radians(({{ lat2 }} - {{ lat1 }}) / 2))), 2) +\n cos(radians({{lat1}})) * cos(radians({{lat2}})) *\n power((sin(radians(({{ lon2 }} - {{ lon1 }}) / 2))), 2))) * {{ conversion_rate }}\n\n{%- endmacro %}", "depends_on": {"macros": []}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973267.2064037, "supported_languages": null}, "macro.dbt_utils.bigquery__haversine_distance": {"name": "bigquery__haversine_distance", "resource_type": "macro", "package_name": "dbt_utils", "path": "macros/sql/haversine_distance.sql", "original_file_path": "macros/sql/haversine_distance.sql", "unique_id": "macro.dbt_utils.bigquery__haversine_distance", "macro_sql": "{% macro bigquery__haversine_distance(lat1, lon1, lat2, lon2, unit='mi') -%}\n{% set radians_lat1 = dbt_utils.degrees_to_radians(lat1) %}\n{% set radians_lat2 = dbt_utils.degrees_to_radians(lat2) %}\n{% set radians_lon1 = dbt_utils.degrees_to_radians(lon1) %}\n{% set radians_lon2 = dbt_utils.degrees_to_radians(lon2) %}\n{%- if unit == 'mi' %}\n {% set conversion_rate = 1 %}\n{% elif unit == 'km' %}\n {% set conversion_rate = 1.60934 %}\n{% else %}\n {{ exceptions.raise_compiler_error(\"unit input must be one of 'mi' or 'km'. Got \" ~ unit) }}\n{% endif %}\n 2 * 3961 * asin(sqrt(power(sin(({{ radians_lat2 }} - {{ radians_lat1 }}) / 2), 2) +\n cos({{ radians_lat1 }}) * cos({{ radians_lat2 }}) *\n power(sin(({{ radians_lon2 }} - {{ radians_lon1 }}) / 2), 2))) * {{ conversion_rate }}\n\n{%- endmacro %}", "depends_on": {"macros": ["macro.dbt_utils.degrees_to_radians"]}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973267.208154, "supported_languages": null}, "macro.dbt_utils.pivot": {"name": "pivot", "resource_type": "macro", "package_name": "dbt_utils", "path": "macros/sql/pivot.sql", "original_file_path": "macros/sql/pivot.sql", "unique_id": "macro.dbt_utils.pivot", "macro_sql": "{% macro pivot(column,\n values,\n alias=True,\n agg='sum',\n cmp='=',\n prefix='',\n suffix='',\n then_value=1,\n else_value=0,\n quote_identifiers=True,\n distinct=False) %}\n {{ return(adapter.dispatch('pivot', 'dbt_utils')(column, values, alias, agg, cmp, prefix, suffix, then_value, else_value, quote_identifiers, distinct)) }}\n{% endmacro %}", "depends_on": {"macros": ["macro.dbt_utils.default__pivot"]}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973267.2099767, "supported_languages": null}, "macro.dbt_utils.default__pivot": {"name": "default__pivot", "resource_type": "macro", "package_name": "dbt_utils", "path": "macros/sql/pivot.sql", "original_file_path": "macros/sql/pivot.sql", "unique_id": "macro.dbt_utils.default__pivot", "macro_sql": "{% macro default__pivot(column,\n values,\n alias=True,\n agg='sum',\n cmp='=',\n prefix='',\n suffix='',\n then_value=1,\n else_value=0,\n quote_identifiers=True,\n distinct=False) %}\n {% for value in values %}\n {{ agg }}(\n {% if distinct %} distinct {% endif %}\n case\n when {{ column }} {{ cmp }} '{{ dbt.escape_single_quotes(value) }}'\n then {{ then_value }}\n else {{ else_value }}\n end\n )\n {% if alias %}\n {% if quote_identifiers %}\n as {{ adapter.quote(prefix ~ value ~ suffix) }}\n {% else %}\n as {{ dbt_utils.slugify(prefix ~ value ~ suffix) }}\n {% endif %}\n {% endif %}\n {% if not loop.last %},{% endif 
%}\n {% endfor %}\n{% endmacro %}", "depends_on": {"macros": ["macro.dbt.escape_single_quotes", "macro.dbt_utils.slugify"]}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973267.2116761, "supported_languages": null}, "macro.dbt_utils.nullcheck_table": {"name": "nullcheck_table", "resource_type": "macro", "package_name": "dbt_utils", "path": "macros/sql/nullcheck_table.sql", "original_file_path": "macros/sql/nullcheck_table.sql", "unique_id": "macro.dbt_utils.nullcheck_table", "macro_sql": "{% macro nullcheck_table(relation) %}\n {{ return(adapter.dispatch('nullcheck_table', 'dbt_utils')(relation)) }}\n{% endmacro %}", "depends_on": {"macros": ["macro.dbt_utils.default__nullcheck_table"]}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973267.2122703, "supported_languages": null}, "macro.dbt_utils.default__nullcheck_table": {"name": "default__nullcheck_table", "resource_type": "macro", "package_name": "dbt_utils", "path": "macros/sql/nullcheck_table.sql", "original_file_path": "macros/sql/nullcheck_table.sql", "unique_id": "macro.dbt_utils.default__nullcheck_table", "macro_sql": "{% macro default__nullcheck_table(relation) %}\n\n {%- do dbt_utils._is_relation(relation, 'nullcheck_table') -%}\n {%- do dbt_utils._is_ephemeral(relation, 'nullcheck_table') -%}\n {% set cols = adapter.get_columns_in_relation(relation) %}\n\n select {{ dbt_utils.nullcheck(cols) }}\n from {{relation}}\n\n{% endmacro %}", "depends_on": {"macros": ["macro.dbt_utils._is_relation", "macro.dbt_utils._is_ephemeral", "macro.dbt_utils.nullcheck"]}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973267.2129874, "supported_languages": null}, "macro.dbt_utils.width_bucket": {"name": "width_bucket", "resource_type": "macro", "package_name": "dbt_utils", "path": "macros/sql/width_bucket.sql", "original_file_path": "macros/sql/width_bucket.sql", "unique_id": "macro.dbt_utils.width_bucket", "macro_sql": "{% macro width_bucket(expr, min_value, max_value, num_buckets) %}\n {{ return(adapter.dispatch('width_bucket', 'dbt_utils') (expr, min_value, max_value, num_buckets)) }}\n{% endmacro %}", "depends_on": {"macros": ["macro.dbt_utils.snowflake__width_bucket"]}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973267.215146, "supported_languages": null}, "macro.dbt_utils.default__width_bucket": {"name": "default__width_bucket", "resource_type": "macro", "package_name": "dbt_utils", "path": "macros/sql/width_bucket.sql", "original_file_path": "macros/sql/width_bucket.sql", "unique_id": "macro.dbt_utils.default__width_bucket", "macro_sql": "{% macro default__width_bucket(expr, min_value, max_value, num_buckets) -%}\n\n {% set bin_size -%}\n (( {{ max_value }} - {{ min_value }} ) / {{ num_buckets }} )\n {%- endset %}\n (\n -- to break ties when the amount is eaxtly at the bucket egde\n case\n when\n mod(\n {{ dbt.safe_cast(expr, dbt.type_numeric() ) }},\n {{ dbt.safe_cast(bin_size, dbt.type_numeric() ) }}\n ) = 0\n then 1\n else 0\n end\n ) +\n -- Anything over max_value goes the N+1 bucket\n least(\n ceil(\n ({{ expr }} - {{ min_value }})/{{ bin_size }}\n ),\n {{ num_buckets }} + 1\n )\n{%- endmacro %}", "depends_on": {"macros": ["macro.dbt.safe_cast", "macro.dbt.type_numeric"]}, "description": "", "meta": {}, 
"docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973267.2160685, "supported_languages": null}, "macro.dbt_utils.redshift__width_bucket": {"name": "redshift__width_bucket", "resource_type": "macro", "package_name": "dbt_utils", "path": "macros/sql/width_bucket.sql", "original_file_path": "macros/sql/width_bucket.sql", "unique_id": "macro.dbt_utils.redshift__width_bucket", "macro_sql": "{% macro redshift__width_bucket(expr, min_value, max_value, num_buckets) -%}\n\n {% set bin_size -%}\n (( {{ max_value }} - {{ min_value }} ) / {{ num_buckets }} )\n {%- endset %}\n (\n -- to break ties when the amount is exactly at the bucket edge\n case\n when\n {{ dbt.safe_cast(expr, dbt.type_numeric() ) }} %\n {{ dbt.safe_cast(bin_size, dbt.type_numeric() ) }}\n = 0\n then 1\n else 0\n end\n ) +\n -- Anything over max_value goes the N+1 bucket\n least(\n ceil(\n ({{ expr }} - {{ min_value }})/{{ bin_size }}\n ),\n {{ num_buckets }} + 1\n )\n{%- endmacro %}", "depends_on": {"macros": ["macro.dbt.safe_cast", "macro.dbt.type_numeric"]}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973267.2170522, "supported_languages": null}, "macro.dbt_utils.snowflake__width_bucket": {"name": "snowflake__width_bucket", "resource_type": "macro", "package_name": "dbt_utils", "path": "macros/sql/width_bucket.sql", "original_file_path": "macros/sql/width_bucket.sql", "unique_id": "macro.dbt_utils.snowflake__width_bucket", "macro_sql": "{% macro snowflake__width_bucket(expr, min_value, max_value, num_buckets) %}\n width_bucket({{ expr }}, {{ min_value }}, {{ max_value }}, {{ num_buckets }} )\n{% endmacro %}", "depends_on": {"macros": []}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973267.2174652, "supported_languages": null}, "macro.dbt_utils.get_tables_by_prefix_sql": {"name": "get_tables_by_prefix_sql", "resource_type": "macro", "package_name": "dbt_utils", "path": "macros/sql/get_tables_by_prefix_sql.sql", "original_file_path": "macros/sql/get_tables_by_prefix_sql.sql", "unique_id": "macro.dbt_utils.get_tables_by_prefix_sql", "macro_sql": "{% macro get_tables_by_prefix_sql(schema, prefix, exclude='', database=target.database) %}\n {{ return(adapter.dispatch('get_tables_by_prefix_sql', 'dbt_utils')(schema, prefix, exclude, database)) }}\n{% endmacro %}", "depends_on": {"macros": ["macro.dbt_utils.default__get_tables_by_prefix_sql"]}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973267.218219, "supported_languages": null}, "macro.dbt_utils.default__get_tables_by_prefix_sql": {"name": "default__get_tables_by_prefix_sql", "resource_type": "macro", "package_name": "dbt_utils", "path": "macros/sql/get_tables_by_prefix_sql.sql", "original_file_path": "macros/sql/get_tables_by_prefix_sql.sql", "unique_id": "macro.dbt_utils.default__get_tables_by_prefix_sql", "macro_sql": "{% macro default__get_tables_by_prefix_sql(schema, prefix, exclude='', database=target.database) %}\n\n {{ dbt_utils.get_tables_by_pattern_sql(\n schema_pattern = schema,\n table_pattern = prefix ~ '%',\n exclude = exclude,\n database = database\n ) }}\n \n{% endmacro %}", "depends_on": {"macros": ["macro.dbt_utils.get_tables_by_pattern_sql"]}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], 
"created_at": 1701973267.2187784, "supported_languages": null}, "macro.dbt_utils.union_relations": {"name": "union_relations", "resource_type": "macro", "package_name": "dbt_utils", "path": "macros/sql/union.sql", "original_file_path": "macros/sql/union.sql", "unique_id": "macro.dbt_utils.union_relations", "macro_sql": "{%- macro union_relations(relations, column_override=none, include=[], exclude=[], source_column_name='_dbt_source_relation', where=none) -%}\n {{ return(adapter.dispatch('union_relations', 'dbt_utils')(relations, column_override, include, exclude, source_column_name, where)) }}\n{% endmacro %}", "depends_on": {"macros": ["macro.dbt_utils.default__union_relations"]}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973267.2234068, "supported_languages": null}, "macro.dbt_utils.default__union_relations": {"name": "default__union_relations", "resource_type": "macro", "package_name": "dbt_utils", "path": "macros/sql/union.sql", "original_file_path": "macros/sql/union.sql", "unique_id": "macro.dbt_utils.default__union_relations", "macro_sql": "\n\n{%- macro default__union_relations(relations, column_override=none, include=[], exclude=[], source_column_name='_dbt_source_relation', where=none) -%}\n\n {%- if exclude and include -%}\n {{ exceptions.raise_compiler_error(\"Both an exclude and include list were provided to the `union` macro. Only one is allowed\") }}\n {%- endif -%}\n\n {#-- Prevent querying of db in parsing mode. This works because this macro does not create any new refs. -#}\n {%- if not execute %}\n {{ return('') }}\n {% endif -%}\n\n {%- set column_override = column_override if column_override is not none else {} -%}\n\n {%- set relation_columns = {} -%}\n {%- set column_superset = {} -%}\n {%- set all_excludes = [] -%}\n {%- set all_includes = [] -%}\n\n {%- if exclude -%}\n {%- for exc in exclude -%}\n {%- do all_excludes.append(exc | lower) -%}\n {%- endfor -%}\n {%- endif -%}\n\n {%- if include -%}\n {%- for inc in include -%}\n {%- do all_includes.append(inc | lower) -%}\n {%- endfor -%}\n {%- endif -%}\n\n {%- for relation in relations -%}\n\n {%- do relation_columns.update({relation: []}) -%}\n\n {%- do dbt_utils._is_relation(relation, 'union_relations') -%}\n {%- do dbt_utils._is_ephemeral(relation, 'union_relations') -%}\n {%- set cols = adapter.get_columns_in_relation(relation) -%}\n {%- for col in cols -%}\n\n {#- If an exclude list was provided and the column is in the list, do nothing -#}\n {%- if exclude and col.column | lower in all_excludes -%}\n\n {#- If an include list was provided and the column is not in the list, do nothing -#}\n {%- elif include and col.column | lower not in all_includes -%}\n\n {#- Otherwise add the column to the column superset -#}\n {%- else -%}\n\n {#- update the list of columns in this relation -#}\n {%- do relation_columns[relation].append(col.column) -%}\n\n {%- if col.column in column_superset -%}\n\n {%- set stored = column_superset[col.column] -%}\n {%- if col.is_string() and stored.is_string() and col.string_size() > stored.string_size() -%}\n\n {%- do column_superset.update({col.column: col}) -%}\n\n {%- endif %}\n\n {%- else -%}\n\n {%- do column_superset.update({col.column: col}) -%}\n\n {%- endif -%}\n\n {%- endif -%}\n\n {%- endfor -%}\n {%- endfor -%}\n\n {%- set ordered_column_names = column_superset.keys() -%}\n {%- set dbt_command = flags.WHICH -%}\n\n\n {% if dbt_command in ['run', 'build'] %}\n {% if (include | length > 0 
or exclude | length > 0) and not column_superset.keys() %}\n {%- set relations_string -%}\n {%- for relation in relations -%}\n {{ relation.name }}\n {%- if not loop.last %}, {% endif -%}\n {%- endfor -%}\n {%- endset -%}\n\n {%- set error_message -%}\n There were no columns found to union for relations {{ relations_string }}\n {%- endset -%}\n\n {{ exceptions.raise_compiler_error(error_message) }}\n {%- endif -%}\n {%- endif -%}\n\n {%- for relation in relations %}\n\n (\n select\n\n {%- if source_column_name is not none %}\n cast({{ dbt.string_literal(relation) }} as {{ dbt.type_string() }}) as {{ source_column_name }},\n {%- endif %}\n\n {% for col_name in ordered_column_names -%}\n\n {%- set col = column_superset[col_name] %}\n {%- set col_type = column_override.get(col.column, col.data_type) %}\n {%- set col_name = adapter.quote(col_name) if col_name in relation_columns[relation] else 'null' %}\n cast({{ col_name }} as {{ col_type }}) as {{ col.quoted }} {% if not loop.last %},{% endif -%}\n\n {%- endfor %}\n\n from {{ relation }}\n\n {% if where -%}\n where {{ where }}\n {%- endif %}\n )\n\n {% if not loop.last -%}\n union all\n {% endif -%}\n\n {%- endfor -%}\n\n{%- endmacro -%}", "depends_on": {"macros": ["macro.dbt_utils._is_relation", "macro.dbt_utils._is_ephemeral", "macro.dbt.string_literal", "macro.dbt.type_string"]}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973267.23043, "supported_languages": null}, "macro.dbt_utils.get_tables_by_pattern_sql": {"name": "get_tables_by_pattern_sql", "resource_type": "macro", "package_name": "dbt_utils", "path": "macros/sql/get_tables_by_pattern_sql.sql", "original_file_path": "macros/sql/get_tables_by_pattern_sql.sql", "unique_id": "macro.dbt_utils.get_tables_by_pattern_sql", "macro_sql": "{% macro get_tables_by_pattern_sql(schema_pattern, table_pattern, exclude='', database=target.database) %}\n {{ return(adapter.dispatch('get_tables_by_pattern_sql', 'dbt_utils')\n (schema_pattern, table_pattern, exclude, database)) }}\n{% endmacro %}", "depends_on": {"macros": ["macro.dbt_utils.default__get_tables_by_pattern_sql"]}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973267.23303, "supported_languages": null}, "macro.dbt_utils.default__get_tables_by_pattern_sql": {"name": "default__get_tables_by_pattern_sql", "resource_type": "macro", "package_name": "dbt_utils", "path": "macros/sql/get_tables_by_pattern_sql.sql", "original_file_path": "macros/sql/get_tables_by_pattern_sql.sql", "unique_id": "macro.dbt_utils.default__get_tables_by_pattern_sql", "macro_sql": "{% macro default__get_tables_by_pattern_sql(schema_pattern, table_pattern, exclude='', database=target.database) %}\n\n select distinct\n table_schema as {{ adapter.quote('table_schema') }},\n table_name as {{ adapter.quote('table_name') }},\n {{ dbt_utils.get_table_types_sql() }}\n from {{ database }}.information_schema.tables\n where table_schema ilike '{{ schema_pattern }}'\n and table_name ilike '{{ table_pattern }}'\n and table_name not ilike '{{ exclude }}'\n\n{% endmacro %}", "depends_on": {"macros": ["macro.dbt_utils.get_table_types_sql"]}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973267.2337718, "supported_languages": null}, "macro.dbt_utils.bigquery__get_tables_by_pattern_sql": {"name": "bigquery__get_tables_by_pattern_sql", 
"resource_type": "macro", "package_name": "dbt_utils", "path": "macros/sql/get_tables_by_pattern_sql.sql", "original_file_path": "macros/sql/get_tables_by_pattern_sql.sql", "unique_id": "macro.dbt_utils.bigquery__get_tables_by_pattern_sql", "macro_sql": "{% macro bigquery__get_tables_by_pattern_sql(schema_pattern, table_pattern, exclude='', database=target.database) %}\n\n {% if '%' in schema_pattern %}\n {% set schemata=dbt_utils._bigquery__get_matching_schemata(schema_pattern, database) %}\n {% else %}\n {% set schemata=[schema_pattern] %}\n {% endif %}\n\n {% set sql %}\n {% for schema in schemata %}\n select distinct\n table_schema,\n table_name,\n {{ dbt_utils.get_table_types_sql() }}\n\n from {{ adapter.quote(database) }}.{{ schema }}.INFORMATION_SCHEMA.TABLES\n where lower(table_name) like lower ('{{ table_pattern }}')\n and lower(table_name) not like lower ('{{ exclude }}')\n\n {% if not loop.last %} union all {% endif %}\n\n {% endfor %}\n {% endset %}\n\n {{ return(sql) }}\n\n{% endmacro %}", "depends_on": {"macros": ["macro.dbt_utils._bigquery__get_matching_schemata", "macro.dbt_utils.get_table_types_sql"]}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973267.2352278, "supported_languages": null}, "macro.dbt_utils._bigquery__get_matching_schemata": {"name": "_bigquery__get_matching_schemata", "resource_type": "macro", "package_name": "dbt_utils", "path": "macros/sql/get_tables_by_pattern_sql.sql", "original_file_path": "macros/sql/get_tables_by_pattern_sql.sql", "unique_id": "macro.dbt_utils._bigquery__get_matching_schemata", "macro_sql": "{% macro _bigquery__get_matching_schemata(schema_pattern, database) %}\n {% if execute %}\n\n {% set sql %}\n select schema_name from {{ adapter.quote(database) }}.INFORMATION_SCHEMA.SCHEMATA\n where lower(schema_name) like lower('{{ schema_pattern }}')\n {% endset %}\n\n {% set results=run_query(sql) %}\n\n {% set schemata=results.columns['schema_name'].values() %}\n\n {{ return(schemata) }}\n\n {% else %}\n\n {{ return([]) }}\n\n {% endif %}\n\n\n{% endmacro %}", "depends_on": {"macros": ["macro.dbt.run_query"]}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973267.2362452, "supported_languages": null}, "macro.dbt_utils.unpivot": {"name": "unpivot", "resource_type": "macro", "package_name": "dbt_utils", "path": "macros/sql/unpivot.sql", "original_file_path": "macros/sql/unpivot.sql", "unique_id": "macro.dbt_utils.unpivot", "macro_sql": "{% macro unpivot(relation=none, cast_to='varchar', exclude=none, remove=none, field_name='field_name', value_name='value') -%}\n {{ return(adapter.dispatch('unpivot', 'dbt_utils')(relation, cast_to, exclude, remove, field_name, value_name)) }}\n{% endmacro %}", "depends_on": {"macros": ["macro.dbt_utils.default__unpivot"]}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973267.238397, "supported_languages": null}, "macro.dbt_utils.default__unpivot": {"name": "default__unpivot", "resource_type": "macro", "package_name": "dbt_utils", "path": "macros/sql/unpivot.sql", "original_file_path": "macros/sql/unpivot.sql", "unique_id": "macro.dbt_utils.default__unpivot", "macro_sql": "{% macro default__unpivot(relation=none, cast_to='varchar', exclude=none, remove=none, field_name='field_name', value_name='value') -%}\n\n {% if not relation %}\n {{ 
exceptions.raise_compiler_error(\"Error: argument `relation` is required for `unpivot` macro.\") }}\n {% endif %}\n\n {%- set exclude = exclude if exclude is not none else [] %}\n {%- set remove = remove if remove is not none else [] %}\n\n {%- set include_cols = [] %}\n\n {%- set table_columns = {} %}\n\n {%- do table_columns.update({relation: []}) %}\n\n {%- do dbt_utils._is_relation(relation, 'unpivot') -%}\n {%- do dbt_utils._is_ephemeral(relation, 'unpivot') -%}\n {%- set cols = adapter.get_columns_in_relation(relation) %}\n\n {%- for col in cols -%}\n {%- if col.column.lower() not in remove|map('lower') and col.column.lower() not in exclude|map('lower') -%}\n {% do include_cols.append(col) %}\n {%- endif %}\n {%- endfor %}\n\n\n {%- for col in include_cols -%}\n select\n {%- for exclude_col in exclude %}\n {{ exclude_col }},\n {%- endfor %}\n\n cast('{{ col.column }}' as {{ dbt.type_string() }}) as {{ field_name }},\n cast( {% if col.data_type == 'boolean' %}\n {{ dbt.cast_bool_to_text(col.column) }}\n {% else %}\n {{ col.column }}\n {% endif %}\n as {{ cast_to }}) as {{ value_name }}\n\n from {{ relation }}\n\n {% if not loop.last -%}\n union all\n {% endif -%}\n {%- endfor -%}\n\n{%- endmacro %}", "depends_on": {"macros": ["macro.dbt_utils._is_relation", "macro.dbt_utils._is_ephemeral", "macro.dbt.type_string", "macro.dbt.cast_bool_to_text"]}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973267.2415748, "supported_languages": null}, "macro.dbt_utils.get_query_results_as_dict": {"name": "get_query_results_as_dict", "resource_type": "macro", "package_name": "dbt_utils", "path": "macros/sql/get_query_results_as_dict.sql", "original_file_path": "macros/sql/get_query_results_as_dict.sql", "unique_id": "macro.dbt_utils.get_query_results_as_dict", "macro_sql": "{% macro get_query_results_as_dict(query) %}\n {{ return(adapter.dispatch('get_query_results_as_dict', 'dbt_utils')(query)) }}\n{% endmacro %}", "depends_on": {"macros": ["macro.dbt_utils.default__get_query_results_as_dict"]}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973267.2423058, "supported_languages": null}, "macro.dbt_utils.default__get_query_results_as_dict": {"name": "default__get_query_results_as_dict", "resource_type": "macro", "package_name": "dbt_utils", "path": "macros/sql/get_query_results_as_dict.sql", "original_file_path": "macros/sql/get_query_results_as_dict.sql", "unique_id": "macro.dbt_utils.default__get_query_results_as_dict", "macro_sql": "{% macro default__get_query_results_as_dict(query) %}\n\n{# This macro returns a dictionary of the form {column_name: (tuple_of_results)} #}\n\n {%- call statement('get_query_results', fetch_result=True,auto_begin=false) -%}\n\n {{ query }}\n\n {%- endcall -%}\n\n {% set sql_results={} %}\n\n {%- if execute -%}\n {% set sql_results_table = load_result('get_query_results').table.columns %}\n {% for column_name, column in sql_results_table.items() %}\n {% do sql_results.update({column_name: column.values()}) %}\n {% endfor %}\n {%- endif -%}\n\n {{ return(sql_results) }}\n\n{% endmacro %}", "depends_on": {"macros": ["macro.dbt.statement"]}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973267.2434993, "supported_languages": null}, "macro.dbt_utils.get_relations_by_prefix": {"name": "get_relations_by_prefix", "resource_type": 
"macro", "package_name": "dbt_utils", "path": "macros/sql/get_relations_by_prefix.sql", "original_file_path": "macros/sql/get_relations_by_prefix.sql", "unique_id": "macro.dbt_utils.get_relations_by_prefix", "macro_sql": "{% macro get_relations_by_prefix(schema, prefix, exclude='', database=target.database) %}\n {{ return(adapter.dispatch('get_relations_by_prefix', 'dbt_utils')(schema, prefix, exclude, database)) }}\n{% endmacro %}", "depends_on": {"macros": ["macro.dbt_utils.default__get_relations_by_prefix"]}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973267.2446804, "supported_languages": null}, "macro.dbt_utils.default__get_relations_by_prefix": {"name": "default__get_relations_by_prefix", "resource_type": "macro", "package_name": "dbt_utils", "path": "macros/sql/get_relations_by_prefix.sql", "original_file_path": "macros/sql/get_relations_by_prefix.sql", "unique_id": "macro.dbt_utils.default__get_relations_by_prefix", "macro_sql": "{% macro default__get_relations_by_prefix(schema, prefix, exclude='', database=target.database) %}\n\n {%- call statement('get_tables', fetch_result=True) %}\n\n {{ dbt_utils.get_tables_by_prefix_sql(schema, prefix, exclude, database) }}\n\n {%- endcall -%}\n\n {%- set table_list = load_result('get_tables') -%}\n\n {%- if table_list and table_list['table'] -%}\n {%- set tbl_relations = [] -%}\n {%- for row in table_list['table'] -%}\n {%- set tbl_relation = api.Relation.create(\n database=database,\n schema=row.table_schema,\n identifier=row.table_name,\n type=row.table_type\n ) -%}\n {%- do tbl_relations.append(tbl_relation) -%}\n {%- endfor -%}\n\n {{ return(tbl_relations) }}\n {%- else -%}\n {{ return([]) }}\n {%- endif -%}\n\n{% endmacro %}", "depends_on": {"macros": ["macro.dbt.statement", "macro.dbt_utils.get_tables_by_prefix_sql"]}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973267.2463844, "supported_languages": null}, "macro.dbt_utils.get_column_values": {"name": "get_column_values", "resource_type": "macro", "package_name": "dbt_utils", "path": "macros/sql/get_column_values.sql", "original_file_path": "macros/sql/get_column_values.sql", "unique_id": "macro.dbt_utils.get_column_values", "macro_sql": "{% macro get_column_values(table, column, order_by='count(*) desc', max_records=none, default=none, where=none) -%}\n {{ return(adapter.dispatch('get_column_values', 'dbt_utils')(table, column, order_by, max_records, default, where)) }}\n{% endmacro %}", "depends_on": {"macros": ["macro.dbt_utils.default__get_column_values"]}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973267.2482176, "supported_languages": null}, "macro.dbt_utils.default__get_column_values": {"name": "default__get_column_values", "resource_type": "macro", "package_name": "dbt_utils", "path": "macros/sql/get_column_values.sql", "original_file_path": "macros/sql/get_column_values.sql", "unique_id": "macro.dbt_utils.default__get_column_values", "macro_sql": "{% macro default__get_column_values(table, column, order_by='count(*) desc', max_records=none, default=none, where=none) -%}\n {#-- Prevent querying of db in parsing mode. This works because this macro does not create any new refs. 
#}\n {%- if not execute -%}\n {% set default = [] if not default %}\n {{ return(default) }}\n {% endif %}\n\n {%- do dbt_utils._is_ephemeral(table, 'get_column_values') -%}\n\n {# Not all relations are tables. Renaming for internal clarity without breaking functionality for anyone using named arguments #}\n {# TODO: Change the method signature in a future 0.x.0 release #}\n {%- set target_relation = table -%}\n\n {# adapter.load_relation is a convenience wrapper to avoid building a Relation when we already have one #}\n {% set relation_exists = (load_relation(target_relation)) is not none %}\n\n {%- call statement('get_column_values', fetch_result=true) %}\n\n {%- if not relation_exists and default is none -%}\n\n {{ exceptions.raise_compiler_error(\"In get_column_values(): relation \" ~ target_relation ~ \" does not exist and no default value was provided.\") }}\n\n {%- elif not relation_exists and default is not none -%}\n\n {{ log(\"Relation \" ~ target_relation ~ \" does not exist. Returning the default value: \" ~ default) }}\n\n {{ return(default) }}\n\n {%- else -%}\n\n\n select\n {{ column }} as value\n\n from {{ target_relation }}\n\n {% if where is not none %}\n where {{ where }}\n {% endif %}\n\n group by {{ column }}\n order by {{ order_by }}\n\n {% if max_records is not none %}\n limit {{ max_records }}\n {% endif %}\n\n {% endif %}\n\n {%- endcall -%}\n\n {%- set value_list = load_result('get_column_values') -%}\n\n {%- if value_list and value_list['data'] -%}\n {%- set values = value_list['data'] | map(attribute=0) | list %}\n {{ return(values) }}\n {%- else -%}\n {{ return(default) }}\n {%- endif -%}\n\n{%- endmacro %}", "depends_on": {"macros": ["macro.dbt_utils._is_ephemeral", "macro.dbt.load_relation", "macro.dbt.statement"]}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973267.2514021, "supported_languages": null}, "macro.dbt_utils.get_filtered_columns_in_relation": {"name": "get_filtered_columns_in_relation", "resource_type": "macro", "package_name": "dbt_utils", "path": "macros/sql/get_filtered_columns_in_relation.sql", "original_file_path": "macros/sql/get_filtered_columns_in_relation.sql", "unique_id": "macro.dbt_utils.get_filtered_columns_in_relation", "macro_sql": "{% macro get_filtered_columns_in_relation(from, except=[]) -%}\n {{ return(adapter.dispatch('get_filtered_columns_in_relation', 'dbt_utils')(from, except)) }}\n{% endmacro %}", "depends_on": {"macros": ["macro.dbt_utils.default__get_filtered_columns_in_relation"]}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973267.2522702, "supported_languages": null}, "macro.dbt_utils.default__get_filtered_columns_in_relation": {"name": "default__get_filtered_columns_in_relation", "resource_type": "macro", "package_name": "dbt_utils", "path": "macros/sql/get_filtered_columns_in_relation.sql", "original_file_path": "macros/sql/get_filtered_columns_in_relation.sql", "unique_id": "macro.dbt_utils.default__get_filtered_columns_in_relation", "macro_sql": "{% macro default__get_filtered_columns_in_relation(from, except=[]) -%}\n {%- do dbt_utils._is_relation(from, 'get_filtered_columns_in_relation') -%}\n {%- do dbt_utils._is_ephemeral(from, 'get_filtered_columns_in_relation') -%}\n\n {# -- Prevent querying of db in parsing mode. This works because this macro does not create any new refs. 
#}\n {%- if not execute -%}\n {{ return('') }}\n {% endif %}\n\n {%- set include_cols = [] %}\n {%- set cols = adapter.get_columns_in_relation(from) -%}\n {%- set except = except | map(\"lower\") | list %}\n {%- for col in cols -%}\n {%- if col.column|lower not in except -%}\n {% do include_cols.append(col.column) %}\n {%- endif %}\n {%- endfor %}\n\n {{ return(include_cols) }}\n\n{%- endmacro %}", "depends_on": {"macros": ["macro.dbt_utils._is_relation", "macro.dbt_utils._is_ephemeral"]}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973267.2537475, "supported_languages": null}, "macro.dbt_utils.surrogate_key": {"name": "surrogate_key", "resource_type": "macro", "package_name": "dbt_utils", "path": "macros/sql/surrogate_key.sql", "original_file_path": "macros/sql/surrogate_key.sql", "unique_id": "macro.dbt_utils.surrogate_key", "macro_sql": "{%- macro surrogate_key(field_list) -%}\n {% set frustrating_jinja_feature = varargs %}\n {{ return(adapter.dispatch('surrogate_key', 'dbt_utils')(field_list, *varargs)) }}\n{% endmacro %}", "depends_on": {"macros": ["macro.dbt_utils.default__surrogate_key"]}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973267.2544816, "supported_languages": null}, "macro.dbt_utils.default__surrogate_key": {"name": "default__surrogate_key", "resource_type": "macro", "package_name": "dbt_utils", "path": "macros/sql/surrogate_key.sql", "original_file_path": "macros/sql/surrogate_key.sql", "unique_id": "macro.dbt_utils.default__surrogate_key", "macro_sql": "\n\n{%- macro default__surrogate_key(field_list) -%}\n\n{%- set error_message = '\nWarning: `dbt_utils.surrogate_key` has been replaced by \\\n`dbt_utils.generate_surrogate_key`. The new macro treats null values \\\ndifferently to empty strings. To restore the behaviour of the original \\\nmacro, add a global variable in dbt_project.yml called \\\n`surrogate_key_treat_nulls_as_empty_strings` to your \\\ndbt_project.yml file with a value of True. \\\nThe {}.{} model triggered this warning. 
\\\n'.format(model.package_name, model.name) -%}\n\n{%- do exceptions.raise_compiler_error(error_message) -%}\n\n{%- endmacro -%}", "depends_on": {"macros": []}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973267.254953, "supported_languages": null}, "macro.dbt_utils.get_powers_of_two": {"name": "get_powers_of_two", "resource_type": "macro", "package_name": "dbt_utils", "path": "macros/sql/generate_series.sql", "original_file_path": "macros/sql/generate_series.sql", "unique_id": "macro.dbt_utils.get_powers_of_two", "macro_sql": "{% macro get_powers_of_two(upper_bound) %}\n {{ return(adapter.dispatch('get_powers_of_two', 'dbt_utils')(upper_bound)) }}\n{% endmacro %}", "depends_on": {"macros": ["macro.dbt_utils.default__get_powers_of_two"]}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973267.256316, "supported_languages": null}, "macro.dbt_utils.default__get_powers_of_two": {"name": "default__get_powers_of_two", "resource_type": "macro", "package_name": "dbt_utils", "path": "macros/sql/generate_series.sql", "original_file_path": "macros/sql/generate_series.sql", "unique_id": "macro.dbt_utils.default__get_powers_of_two", "macro_sql": "{% macro default__get_powers_of_two(upper_bound) %}\n\n {% if upper_bound <= 0 %}\n {{ exceptions.raise_compiler_error(\"upper bound must be positive\") }}\n {% endif %}\n\n {% for _ in range(1, 100) %}\n {% if upper_bound <= 2 ** loop.index %}{{ return(loop.index) }}{% endif %}\n {% endfor %}\n\n{% endmacro %}", "depends_on": {"macros": []}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973267.2571776, "supported_languages": null}, "macro.dbt_utils.generate_series": {"name": "generate_series", "resource_type": "macro", "package_name": "dbt_utils", "path": "macros/sql/generate_series.sql", "original_file_path": "macros/sql/generate_series.sql", "unique_id": "macro.dbt_utils.generate_series", "macro_sql": "{% macro generate_series(upper_bound) %}\n {{ return(adapter.dispatch('generate_series', 'dbt_utils')(upper_bound)) }}\n{% endmacro %}", "depends_on": {"macros": ["macro.dbt_utils.default__generate_series"]}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973267.2575583, "supported_languages": null}, "macro.dbt_utils.default__generate_series": {"name": "default__generate_series", "resource_type": "macro", "package_name": "dbt_utils", "path": "macros/sql/generate_series.sql", "original_file_path": "macros/sql/generate_series.sql", "unique_id": "macro.dbt_utils.default__generate_series", "macro_sql": "{% macro default__generate_series(upper_bound) %}\n\n {% set n = dbt_utils.get_powers_of_two(upper_bound) %}\n\n with p as (\n select 0 as generated_number union all select 1\n ), unioned as (\n\n select\n\n {% for i in range(n) %}\n p{{i}}.generated_number * power(2, {{i}})\n {% if not loop.last %} + {% endif %}\n {% endfor %}\n + 1\n as generated_number\n\n from\n\n {% for i in range(n) %}\n p as p{{i}}\n {% if not loop.last %} cross join {% endif %}\n {% endfor %}\n\n )\n\n select *\n from unioned\n where generated_number <= {{upper_bound}}\n order by generated_number\n\n{% endmacro %}", "depends_on": {"macros": ["macro.dbt_utils.get_powers_of_two"]}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, 
"patch_path": null, "arguments": [], "created_at": 1701973267.2586148, "supported_languages": null}, "macro.dbt_utils.safe_add": {"name": "safe_add", "resource_type": "macro", "package_name": "dbt_utils", "path": "macros/sql/safe_add.sql", "original_file_path": "macros/sql/safe_add.sql", "unique_id": "macro.dbt_utils.safe_add", "macro_sql": "{%- macro safe_add(field_list) -%}\n {{ return(adapter.dispatch('safe_add', 'dbt_utils')(field_list)) }}\n{% endmacro %}", "depends_on": {"macros": ["macro.dbt_utils.default__safe_add"]}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973267.259329, "supported_languages": null}, "macro.dbt_utils.default__safe_add": {"name": "default__safe_add", "resource_type": "macro", "package_name": "dbt_utils", "path": "macros/sql/safe_add.sql", "original_file_path": "macros/sql/safe_add.sql", "unique_id": "macro.dbt_utils.default__safe_add", "macro_sql": "\n\n{%- macro default__safe_add(field_list) -%}\n\n{%- if field_list is not iterable or field_list is string or field_list is mapping -%}\n\n{%- set error_message = '\nWarning: the `safe_add` macro now takes a single list argument instead of \\\nstring arguments. The {}.{} model triggered this warning. \\\n'.format(model.package_name, model.name) -%}\n\n{%- do exceptions.warn(error_message) -%}\n\n{%- endif -%}\n\n{% set fields = [] %}\n\n{%- for field in field_list -%}\n\n {% do fields.append(\"coalesce(\" ~ field ~ \", 0)\") %}\n\n{%- endfor -%}\n\n{{ fields|join(' +\\n ') }}\n\n{%- endmacro -%}", "depends_on": {"macros": []}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973267.2604249, "supported_languages": null}, "macro.dbt_utils.star": {"name": "star", "resource_type": "macro", "package_name": "dbt_utils", "path": "macros/sql/star.sql", "original_file_path": "macros/sql/star.sql", "unique_id": "macro.dbt_utils.star", "macro_sql": "{% macro star(from, relation_alias=False, except=[], prefix='', suffix='', quote_identifiers=True) -%}\r\n {{ return(adapter.dispatch('star', 'dbt_utils')(from, relation_alias, except, prefix, suffix, quote_identifiers)) }}\r\n{% endmacro %}", "depends_on": {"macros": ["macro.dbt_utils.default__star"]}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973267.262424, "supported_languages": null}, "macro.dbt_utils.default__star": {"name": "default__star", "resource_type": "macro", "package_name": "dbt_utils", "path": "macros/sql/star.sql", "original_file_path": "macros/sql/star.sql", "unique_id": "macro.dbt_utils.default__star", "macro_sql": "{% macro default__star(from, relation_alias=False, except=[], prefix='', suffix='', quote_identifiers=True) -%}\r\n {%- do dbt_utils._is_relation(from, 'star') -%}\r\n {%- do dbt_utils._is_ephemeral(from, 'star') -%}\r\n\r\n {#-- Prevent querying of db in parsing mode. This works because this macro does not create any new refs. #}\r\n {%- if not execute -%}\r\n {% do return('*') %}\r\n {%- endif -%}\r\n\r\n {% set cols = dbt_utils.get_filtered_columns_in_relation(from, except) %}\r\n\r\n {%- if cols|length <= 0 -%}\r\n {% if flags.WHICH == 'compile' %}\r\n {% set response %}\r\n*\r\n/* No columns were returned. Maybe the relation doesn't exist yet \r\nor all columns were excluded. This star is only output during \r\ndbt compile, and exists to keep SQLFluff happy. 
*/\r\n {% endset %}\r\n {% do return(response) %}\r\n {% else %}\r\n {% do return(\"/* no columns returned from star() macro */\") %}\r\n {% endif %}\r\n {%- else -%}\r\n {%- for col in cols %}\r\n {%- if relation_alias %}{{ relation_alias }}.{% else %}{%- endif -%}\r\n {%- if quote_identifiers -%}\r\n {{ adapter.quote(col)|trim }} {%- if prefix!='' or suffix!='' %} as {{ adapter.quote(prefix ~ col ~ suffix)|trim }} {%- endif -%}\r\n {%- else -%}\r\n {{ col|trim }} {%- if prefix!='' or suffix!='' %} as {{ (prefix ~ col ~ suffix)|trim }} {%- endif -%}\r\n {% endif %}\r\n {%- if not loop.last %},{{ '\\n ' }}{%- endif -%}\r\n {%- endfor -%}\r\n {% endif %}\r\n{%- endmacro %}", "depends_on": {"macros": ["macro.dbt_utils._is_relation", "macro.dbt_utils._is_ephemeral", "macro.dbt_utils.get_filtered_columns_in_relation"]}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973267.2654011, "supported_languages": null}, "macro.dbt_utils.log_info": {"name": "log_info", "resource_type": "macro", "package_name": "dbt_utils", "path": "macros/jinja_helpers/log_info.sql", "original_file_path": "macros/jinja_helpers/log_info.sql", "unique_id": "macro.dbt_utils.log_info", "macro_sql": "{% macro log_info(message) %}\n {{ return(adapter.dispatch('log_info', 'dbt_utils')(message)) }}\n{% endmacro %}", "depends_on": {"macros": ["macro.dbt_utils.default__log_info"]}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973267.2659225, "supported_languages": null}, "macro.dbt_utils.default__log_info": {"name": "default__log_info", "resource_type": "macro", "package_name": "dbt_utils", "path": "macros/jinja_helpers/log_info.sql", "original_file_path": "macros/jinja_helpers/log_info.sql", "unique_id": "macro.dbt_utils.default__log_info", "macro_sql": "{% macro default__log_info(message) %}\n {{ log(dbt_utils.pretty_log_format(message), info=True) }}\n{% endmacro %}", "depends_on": {"macros": ["macro.dbt_utils.pretty_log_format"]}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973267.2662716, "supported_languages": null}, "macro.dbt_utils._is_ephemeral": {"name": "_is_ephemeral", "resource_type": "macro", "package_name": "dbt_utils", "path": "macros/jinja_helpers/_is_ephemeral.sql", "original_file_path": "macros/jinja_helpers/_is_ephemeral.sql", "unique_id": "macro.dbt_utils._is_ephemeral", "macro_sql": "{% macro _is_ephemeral(obj, macro) %}\n {%- if obj.is_cte -%}\n {% set ephemeral_prefix = api.Relation.add_ephemeral_prefix('') %}\n {% if obj.name.startswith(ephemeral_prefix) %}\n {% set model_name = obj.name[(ephemeral_prefix|length):] %}\n {% else %}\n {% set model_name = obj.name %}\n {%- endif -%}\n {% set error_message %}\nThe `{{ macro }}` macro cannot be used with ephemeral models, as it relies on the information schema.\n\n`{{ model_name }}` is an ephemeral model. 
Consider making it a view or table instead.\n {% endset %}\n {%- do exceptions.raise_compiler_error(error_message) -%}\n {%- endif -%}\n{% endmacro %}", "depends_on": {"macros": []}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973267.2678218, "supported_languages": null}, "macro.dbt_utils._is_relation": {"name": "_is_relation", "resource_type": "macro", "package_name": "dbt_utils", "path": "macros/jinja_helpers/_is_relation.sql", "original_file_path": "macros/jinja_helpers/_is_relation.sql", "unique_id": "macro.dbt_utils._is_relation", "macro_sql": "{% macro _is_relation(obj, macro) %}\n {%- if not (obj is mapping and obj.get('metadata', {}).get('type', '').endswith('Relation')) -%}\n {%- do exceptions.raise_compiler_error(\"Macro \" ~ macro ~ \" expected a Relation but received the value: \" ~ obj) -%}\n {%- endif -%}\n{% endmacro %}", "depends_on": {"macros": []}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973267.268689, "supported_languages": null}, "macro.dbt_utils.slugify": {"name": "slugify", "resource_type": "macro", "package_name": "dbt_utils", "path": "macros/jinja_helpers/slugify.sql", "original_file_path": "macros/jinja_helpers/slugify.sql", "unique_id": "macro.dbt_utils.slugify", "macro_sql": "{% macro slugify(string) %}\n\n{#- Lower case the string -#}\n{% set string = string | lower %}\n{#- Replace spaces and dashes with underscores -#}\n{% set string = modules.re.sub('[ -]+', '_', string) %}\n{#- Only take letters, numbers, and underscores -#}\n{% set string = modules.re.sub('[^a-z0-9_]+', '', string) %}\n{#- Prepends \"_\" if string begins with a number -#}\n{% set string = modules.re.sub('^[0-9]', '_' + string[0], string) %}\n\n{{ return(string) }}\n\n{% endmacro %}", "depends_on": {"macros": []}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973267.269886, "supported_languages": null}, "macro.dbt_utils.pretty_time": {"name": "pretty_time", "resource_type": "macro", "package_name": "dbt_utils", "path": "macros/jinja_helpers/pretty_time.sql", "original_file_path": "macros/jinja_helpers/pretty_time.sql", "unique_id": "macro.dbt_utils.pretty_time", "macro_sql": "{% macro pretty_time(format='%H:%M:%S') %}\n {{ return(adapter.dispatch('pretty_time', 'dbt_utils')(format)) }}\n{% endmacro %}", "depends_on": {"macros": ["macro.dbt_utils.default__pretty_time"]}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973267.270421, "supported_languages": null}, "macro.dbt_utils.default__pretty_time": {"name": "default__pretty_time", "resource_type": "macro", "package_name": "dbt_utils", "path": "macros/jinja_helpers/pretty_time.sql", "original_file_path": "macros/jinja_helpers/pretty_time.sql", "unique_id": "macro.dbt_utils.default__pretty_time", "macro_sql": "{% macro default__pretty_time(format='%H:%M:%S') %}\n {{ return(modules.datetime.datetime.now().strftime(format)) }}\n{% endmacro %}", "depends_on": {"macros": []}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973267.2708175, "supported_languages": null}, "macro.dbt_utils.pretty_log_format": {"name": "pretty_log_format", "resource_type": "macro", "package_name": "dbt_utils", "path": 
"macros/jinja_helpers/pretty_log_format.sql", "original_file_path": "macros/jinja_helpers/pretty_log_format.sql", "unique_id": "macro.dbt_utils.pretty_log_format", "macro_sql": "{% macro pretty_log_format(message) %}\n {{ return(adapter.dispatch('pretty_log_format', 'dbt_utils')(message)) }}\n{% endmacro %}", "depends_on": {"macros": ["macro.dbt_utils.default__pretty_log_format"]}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973267.2713175, "supported_languages": null}, "macro.dbt_utils.default__pretty_log_format": {"name": "default__pretty_log_format", "resource_type": "macro", "package_name": "dbt_utils", "path": "macros/jinja_helpers/pretty_log_format.sql", "original_file_path": "macros/jinja_helpers/pretty_log_format.sql", "unique_id": "macro.dbt_utils.default__pretty_log_format", "macro_sql": "{% macro default__pretty_log_format(message) %}\n {{ return( dbt_utils.pretty_time() ~ ' + ' ~ message) }}\n{% endmacro %}", "depends_on": {"macros": ["macro.dbt_utils.pretty_time"]}, "description": "", "meta": {}, "docs": {"show": true, "node_color": null}, "patch_path": null, "arguments": [], "created_at": 1701973267.271656, "supported_languages": null}}, "docs": {"doc.dse_analytics.__overview__": {"name": "__overview__", "resource_type": "doc", "package_name": "dse_analytics", "path": "overview.md", "original_file_path": "models/overview.md", "unique_id": "doc.dse_analytics.__overview__", "block_contents": "# CalData dbt Documentation\n\nWelcome to the CalData Data Services and Engineering `dbt` Snowflake docs.\nTo go back to the top-level docs, follow [this link](../)\n\n## Navigation\n\nYou can use the `Project` and `Database` navigation tabs on the left side of the window to explore the models in your project.\n\n### Project Tab\n\nThe Project tab mirrors the directory structure of your dbt project.\nIn this tab, you can see all of the models defined in your dbt project, as well as models imported from dbt packages.\n\n### Database Tab\n\nThe Database tab also exposes your models, but in a format that looks more like a database explorer.\nThis view shows relations (tables and views) grouped into database schemas.\nNote that ephemeral models are not shown in this interface, as they do not exist in the database.\n\n## Graph Exploration\n\nYou can click the blue icon on the bottom-right corner of the page to view the lineage graph of your models.\n\nOn model pages, you'll see the immediate parents and children of the model you're exploring.\nBy clicking the Expand button at the top-right of this lineage pane,\nyou'll be able to see all of the models that are used to build, or are built from,\nthe model you're exploring.\n\nOnce expanded, you'll be able to use the `--select` and `--exclude` model selection syntax to filter the models in the graph.\nFor more information on model selection, check out the [dbt docs](https://docs.getdbt.com/reference/node-selection/syntax).\n\nNote that you can also right-click on models to interactively filter and explore the graph."}, "doc.dbt.__overview__": {"name": "__overview__", "resource_type": "doc", "package_name": "dbt", "path": "overview.md", "original_file_path": "docs/overview.md", "unique_id": "doc.dbt.__overview__", "block_contents": "### Welcome!\n\nWelcome to the auto-generated documentation for your dbt project!\n\n### Navigation\n\nYou can use the `Project` and `Database` navigation tabs on the left side of the window to explore the models\nin your 
project.\n\n#### Project Tab\nThe `Project` tab mirrors the directory structure of your dbt project. In this tab, you can see all of the\nmodels defined in your dbt project, as well as models imported from dbt packages.\n\n#### Database Tab\nThe `Database` tab also exposes your models, but in a format that looks more like a database explorer. This view\nshows relations (tables and views) grouped into database schemas. Note that ephemeral models are _not_ shown\nin this interface, as they do not exist in the database.\n\n### Graph Exploration\nYou can click the blue icon on the bottom-right corner of the page to view the lineage graph of your models.\n\nOn model pages, you'll see the immediate parents and children of the model you're exploring. By clicking the `Expand`\nbutton at the top-right of this lineage pane, you'll be able to see all of the models that are used to build,\nor are built from, the model you're exploring.\n\nOnce expanded, you'll be able to use the `--select` and `--exclude` model selection syntax to filter the\nmodels in the graph. For more information on model selection, check out the [dbt docs](https://docs.getdbt.com/docs/model-selection-syntax).\n\nNote that you can also right-click on models to interactively filter and explore the graph.\n\n---\n\n### More information\n\n- [What is dbt](https://docs.getdbt.com/docs/introduction)?\n- Read the [dbt viewpoint](https://docs.getdbt.com/docs/viewpoint)\n- [Installation](https://docs.getdbt.com/docs/installation)\n- Join the [dbt Community](https://www.getdbt.com/community/) for questions and discussion"}}, "exposures": {}, "metrics": {}, "groups": {}, "selectors": {}, "disabled": {}, "parent_map": {"model.dse_analytics.dim_state_entities__agencies": ["model.dse_analytics.int_state_entities__active"], "model.dse_analytics.geo_reference__global_ml_building_footprints_with_tiger": ["source.dse_analytics.building_footprints.global_ml_building_footprints", "source.dse_analytics.tiger_2022.blocks", "source.dse_analytics.tiger_2022.places"], "model.dse_analytics.geo_reference__us_building_footprints_with_tiger": ["source.dse_analytics.building_footprints.us_building_footprints", "source.dse_analytics.tiger_2022.blocks", "source.dse_analytics.tiger_2022.places"], "model.dse_analytics.stg_department_of_finance__entities": ["source.dse_analytics.state_entities.base_entities"], "model.dse_analytics.stg_ebudget__budgets": ["source.dse_analytics.state_entities.ebudget_agency_and_department_budgets"], "model.dse_analytics.int_state_entities__budgets": ["model.dse_analytics.int_state_entities__active", "model.dse_analytics.stg_ebudget__budgets"], "model.dse_analytics.int_state_entities__active": ["model.dse_analytics.stg_department_of_finance__entities"], "model.dse_analytics.int_state_entities__technical": ["model.dse_analytics.stg_department_of_finance__entities"], "test.dse_analytics.unique_dim_state_entities__agencies_name.44718adf21": ["model.dse_analytics.dim_state_entities__agencies"], "test.dse_analytics.not_null_dim_state_entities__agencies_name.d3e367a95b": ["model.dse_analytics.dim_state_entities__agencies"], "test.dse_analytics.unique_dim_state_entities__agencies_agency_code.efd290c291": ["model.dse_analytics.dim_state_entities__agencies"], "test.dse_analytics.not_null_dim_state_entities__agencies_agency_code.8ad3a79d8e": ["model.dse_analytics.dim_state_entities__agencies"], "test.dse_analytics.not_null_stg_department_of_finance__entities_primary_code.ab13df4014": ["model.dse_analytics.stg_department_of_finance__entities"], 
"test.dse_analytics.not_null_stg_ebudget__budgets_primary_code.11fe170121": ["model.dse_analytics.stg_ebudget__budgets"], "test.dse_analytics.not_null_int_state_entities__active_primary_code.498a9cc863": ["model.dse_analytics.int_state_entities__active"], "test.dse_analytics.unique_int_state_entities__active_primary_code.cb6e0784fe": ["model.dse_analytics.int_state_entities__active"], "test.dse_analytics.not_null_int_state_entities__technical_primary_code.92bdfb0772": ["model.dse_analytics.int_state_entities__technical"], "test.dse_analytics.not_null_int_state_entities__budgets_primary_code.2d2e132a3f": ["model.dse_analytics.int_state_entities__budgets"], "test.dse_analytics.dbt_utils_source_unique_combination_of_columns_state_entities_base_entities_A__B__L1__L2__L3.3301323f63": ["source.dse_analytics.state_entities.base_entities"], "test.dse_analytics.source_not_null_state_entities_base_entities__A_.3b0e8bceb2": ["source.dse_analytics.state_entities.base_entities"], "test.dse_analytics.source_unique_state_entities_base_entities__L3_.28ccbe9ad4": ["source.dse_analytics.state_entities.base_entities"], "test.dse_analytics.source_not_null_state_entities_base_entities__name_.175bb24173": ["source.dse_analytics.state_entities.base_entities"], "test.dse_analytics.source_not_null_state_entities_ebudget_agency_and_department_budgets__org_cd_.6650ae0ce8": ["source.dse_analytics.state_entities.ebudget_agency_and_department_budgets"], "test.dse_analytics.source_unique_state_entities_ebudget_agency_and_department_budgets__org_cd_.f2687093d8": ["source.dse_analytics.state_entities.ebudget_agency_and_department_budgets"], "test.dse_analytics.source_not_null_state_entities_ebudget_agency_and_department_budgets__web_agency_cd_.1100593dca": ["source.dse_analytics.state_entities.ebudget_agency_and_department_budgets"], "test.dse_analytics.source_unique_state_entities_ebudget_agency_and_department_budgets__web_agency_cd_.926c843b75": ["source.dse_analytics.state_entities.ebudget_agency_and_department_budgets"], "test.dse_analytics.source_not_null_state_entities_ebudget_program_budgets__program_code_.29940cbe43": ["source.dse_analytics.state_entities.ebudget_program_budgets"], "source.dse_analytics.building_footprints.us_building_footprints": [], "source.dse_analytics.building_footprints.global_ml_building_footprints": [], "source.dse_analytics.tiger_2022.blocks": [], "source.dse_analytics.tiger_2022.places": [], "source.dse_analytics.state_entities.base_entities": [], "source.dse_analytics.state_entities.ebudget_agency_and_department_budgets": [], "source.dse_analytics.state_entities.ebudget_program_budgets": []}, "child_map": {"model.dse_analytics.dim_state_entities__agencies": ["test.dse_analytics.not_null_dim_state_entities__agencies_agency_code.8ad3a79d8e", "test.dse_analytics.not_null_dim_state_entities__agencies_name.d3e367a95b", "test.dse_analytics.unique_dim_state_entities__agencies_agency_code.efd290c291", "test.dse_analytics.unique_dim_state_entities__agencies_name.44718adf21"], "model.dse_analytics.geo_reference__global_ml_building_footprints_with_tiger": [], "model.dse_analytics.geo_reference__us_building_footprints_with_tiger": [], "model.dse_analytics.stg_department_of_finance__entities": ["model.dse_analytics.int_state_entities__active", "model.dse_analytics.int_state_entities__technical", "test.dse_analytics.not_null_stg_department_of_finance__entities_primary_code.ab13df4014"], "model.dse_analytics.stg_ebudget__budgets": ["model.dse_analytics.int_state_entities__budgets", 
"test.dse_analytics.not_null_stg_ebudget__budgets_primary_code.11fe170121"], "model.dse_analytics.int_state_entities__budgets": ["test.dse_analytics.not_null_int_state_entities__budgets_primary_code.2d2e132a3f"], "model.dse_analytics.int_state_entities__active": ["model.dse_analytics.dim_state_entities__agencies", "model.dse_analytics.int_state_entities__budgets", "test.dse_analytics.not_null_int_state_entities__active_primary_code.498a9cc863", "test.dse_analytics.unique_int_state_entities__active_primary_code.cb6e0784fe"], "model.dse_analytics.int_state_entities__technical": ["test.dse_analytics.not_null_int_state_entities__technical_primary_code.92bdfb0772"], "test.dse_analytics.unique_dim_state_entities__agencies_name.44718adf21": [], "test.dse_analytics.not_null_dim_state_entities__agencies_name.d3e367a95b": [], "test.dse_analytics.unique_dim_state_entities__agencies_agency_code.efd290c291": [], "test.dse_analytics.not_null_dim_state_entities__agencies_agency_code.8ad3a79d8e": [], "test.dse_analytics.not_null_stg_department_of_finance__entities_primary_code.ab13df4014": [], "test.dse_analytics.not_null_stg_ebudget__budgets_primary_code.11fe170121": [], "test.dse_analytics.not_null_int_state_entities__active_primary_code.498a9cc863": [], "test.dse_analytics.unique_int_state_entities__active_primary_code.cb6e0784fe": [], "test.dse_analytics.not_null_int_state_entities__technical_primary_code.92bdfb0772": [], "test.dse_analytics.not_null_int_state_entities__budgets_primary_code.2d2e132a3f": [], "test.dse_analytics.dbt_utils_source_unique_combination_of_columns_state_entities_base_entities_A__B__L1__L2__L3.3301323f63": [], "test.dse_analytics.source_not_null_state_entities_base_entities__A_.3b0e8bceb2": [], "test.dse_analytics.source_unique_state_entities_base_entities__L3_.28ccbe9ad4": [], "test.dse_analytics.source_not_null_state_entities_base_entities__name_.175bb24173": [], "test.dse_analytics.source_not_null_state_entities_ebudget_agency_and_department_budgets__org_cd_.6650ae0ce8": [], "test.dse_analytics.source_unique_state_entities_ebudget_agency_and_department_budgets__org_cd_.f2687093d8": [], "test.dse_analytics.source_not_null_state_entities_ebudget_agency_and_department_budgets__web_agency_cd_.1100593dca": [], "test.dse_analytics.source_unique_state_entities_ebudget_agency_and_department_budgets__web_agency_cd_.926c843b75": [], "test.dse_analytics.source_not_null_state_entities_ebudget_program_budgets__program_code_.29940cbe43": [], "source.dse_analytics.building_footprints.us_building_footprints": ["model.dse_analytics.geo_reference__us_building_footprints_with_tiger"], "source.dse_analytics.building_footprints.global_ml_building_footprints": ["model.dse_analytics.geo_reference__global_ml_building_footprints_with_tiger"], "source.dse_analytics.tiger_2022.blocks": ["model.dse_analytics.geo_reference__global_ml_building_footprints_with_tiger", "model.dse_analytics.geo_reference__us_building_footprints_with_tiger"], "source.dse_analytics.tiger_2022.places": ["model.dse_analytics.geo_reference__global_ml_building_footprints_with_tiger", "model.dse_analytics.geo_reference__us_building_footprints_with_tiger"], "source.dse_analytics.state_entities.base_entities": ["model.dse_analytics.stg_department_of_finance__entities", "test.dse_analytics.dbt_utils_source_unique_combination_of_columns_state_entities_base_entities_A__B__L1__L2__L3.3301323f63", "test.dse_analytics.source_not_null_state_entities_base_entities__A_.3b0e8bceb2", 
"test.dse_analytics.source_not_null_state_entities_base_entities__name_.175bb24173", "test.dse_analytics.source_unique_state_entities_base_entities__L3_.28ccbe9ad4"], "source.dse_analytics.state_entities.ebudget_agency_and_department_budgets": ["model.dse_analytics.stg_ebudget__budgets", "test.dse_analytics.source_not_null_state_entities_ebudget_agency_and_department_budgets__org_cd_.6650ae0ce8", "test.dse_analytics.source_not_null_state_entities_ebudget_agency_and_department_budgets__web_agency_cd_.1100593dca", "test.dse_analytics.source_unique_state_entities_ebudget_agency_and_department_budgets__org_cd_.f2687093d8", "test.dse_analytics.source_unique_state_entities_ebudget_agency_and_department_budgets__web_agency_cd_.926c843b75"], "source.dse_analytics.state_entities.ebudget_program_budgets": ["test.dse_analytics.source_not_null_state_entities_ebudget_program_budgets__program_code_.29940cbe43"]}, "group_map": {}, "semantic_models": {}}
\ No newline at end of file
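Note

The dbt_docs_snowflake files added in this diff (manifest.json above, plus partial_parse.msgpack and run_results.json below) are build artifacts produced by dbt, not hand-written content. As a minimal sketch of how they are regenerated (assuming a standard dbt project layout; the directory name and the copy destination below are hypothetical, inferred only from the paths in this diff), one would rebuild them from the dbt project and copy them into place rather than edit them directly:

cd transform                     # hypothetical dbt project directory; use the actual project path
dbt run                          # writes target/run_results.json and refreshes the target/partial_parse.msgpack parse cache
dbt docs generate                # writes target/manifest.json, target/catalog.json, and target/index.html
cp target/manifest.json target/run_results.json target/partial_parse.msgpack ../dbt_docs_snowflake/   # assumed destination, based on the paths in this diff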
diff --git a/dbt_docs_snowflake/partial_parse.msgpack b/dbt_docs_snowflake/partial_parse.msgpack
new file mode 100644
index 00000000..c53f2756
Binary files /dev/null and b/dbt_docs_snowflake/partial_parse.msgpack differ
diff --git a/dbt_docs_snowflake/run_results.json b/dbt_docs_snowflake/run_results.json
new file mode 100644
index 00000000..c8cf66f4
--- /dev/null
+++ b/dbt_docs_snowflake/run_results.json
@@ -0,0 +1 @@
+{"metadata": {"dbt_schema_version": "https://schemas.getdbt.com/dbt/run-results/v4.json", "dbt_version": "1.6.0", "generated_at": "2023-12-07T18:21:10.571093Z", "invocation_id": "73752f3e-f0c3-4ad7-a412-541df042f42a", "env": {}}, "results": [{"status": "success", "timing": [{"name": "compile", "started_at": "2023-12-07T18:21:10.269310Z", "completed_at": "2023-12-07T18:21:10.300027Z"}, {"name": "execute", "started_at": "2023-12-07T18:21:10.303142Z", "completed_at": "2023-12-07T18:21:10.303159Z"}], "thread_id": "Thread-1 (worker)", "execution_time": 0.04291248321533203, "adapter_response": {}, "message": null, "failures": null, "unique_id": "model.dse_analytics.geo_reference__global_ml_building_footprints_with_tiger"}, {"status": "success", "timing": [{"name": "compile", "started_at": "2023-12-07T18:21:10.294313Z", "completed_at": "2023-12-07T18:21:10.301828Z"}, {"name": "execute", "started_at": "2023-12-07T18:21:10.307486Z", "completed_at": "2023-12-07T18:21:10.307496Z"}], "thread_id": "Thread-3 (worker)", "execution_time": 0.04242420196533203, "adapter_response": {}, "message": null, "failures": null, "unique_id": "model.dse_analytics.stg_department_of_finance__entities"}, {"status": "success", "timing": [{"name": "compile", "started_at": "2023-12-07T18:21:10.290521Z", "completed_at": "2023-12-07T18:21:10.302504Z"}, {"name": "execute", "started_at": "2023-12-07T18:21:10.308159Z", "completed_at": "2023-12-07T18:21:10.308166Z"}], "thread_id": "Thread-4 (worker)", "execution_time": 0.045111894607543945, "adapter_response": {}, "message": null, "failures": null, "unique_id": "model.dse_analytics.stg_ebudget__budgets"}, {"status": "success", "timing": [{"name": "compile", "started_at": "2023-12-07T18:21:10.276423Z", "completed_at": "2023-12-07T18:21:10.303810Z"}, {"name": "execute", "started_at": "2023-12-07T18:21:10.309834Z", "completed_at": "2023-12-07T18:21:10.309842Z"}], "thread_id": "Thread-2 (worker)", "execution_time": 0.049308061599731445, "adapter_response": {}, "message": null, "failures": null, "unique_id": "model.dse_analytics.geo_reference__us_building_footprints_with_tiger"}, {"status": "success", "timing": [{"name": "compile", "started_at": "2023-12-07T18:21:10.320425Z", "completed_at": "2023-12-07T18:21:10.338132Z"}, {"name": "execute", "started_at": "2023-12-07T18:21:10.351372Z", "completed_at": "2023-12-07T18:21:10.351385Z"}], "thread_id": "Thread-1 (worker)", "execution_time": 0.041553497314453125, "adapter_response": {}, "message": null, "failures": null, "unique_id": "test.dse_analytics.dbt_utils_source_unique_combination_of_columns_state_entities_base_entities_A__B__L1__L2__L3.3301323f63"}, {"status": "success", "timing": [{"name": "compile", "started_at": "2023-12-07T18:21:10.328891Z", "completed_at": "2023-12-07T18:21:10.350601Z"}, {"name": "execute", "started_at": "2023-12-07T18:21:10.355314Z", "completed_at": "2023-12-07T18:21:10.355322Z"}], "thread_id": "Thread-3 (worker)", "execution_time": 0.03978133201599121, "adapter_response": {}, "message": null, "failures": null, "unique_id": "test.dse_analytics.source_not_null_state_entities_base_entities__A_.3b0e8bceb2"}, {"status": "success", "timing": [{"name": "compile", "started_at": "2023-12-07T18:21:10.338805Z", "completed_at": "2023-12-07T18:21:10.352802Z"}, {"name": "execute", "started_at": "2023-12-07T18:21:10.359304Z", "completed_at": "2023-12-07T18:21:10.359312Z"}], "thread_id": "Thread-4 (worker)", "execution_time": 0.04326176643371582, "adapter_response": {}, "message": null, "failures": null, "unique_id": 
"test.dse_analytics.source_not_null_state_entities_base_entities__name_.175bb24173"}, {"status": "success", "timing": [{"name": "compile", "started_at": "2023-12-07T18:21:10.343608Z", "completed_at": "2023-12-07T18:21:10.353541Z"}, {"name": "execute", "started_at": "2023-12-07T18:21:10.360064Z", "completed_at": "2023-12-07T18:21:10.360072Z"}], "thread_id": "Thread-2 (worker)", "execution_time": 0.03807210922241211, "adapter_response": {}, "message": null, "failures": null, "unique_id": "test.dse_analytics.source_not_null_state_entities_ebudget_agency_and_department_budgets__org_cd_.6650ae0ce8"}, {"status": "success", "timing": [{"name": "compile", "started_at": "2023-12-07T18:21:10.368048Z", "completed_at": "2023-12-07T18:21:10.380692Z"}, {"name": "execute", "started_at": "2023-12-07T18:21:10.395908Z", "completed_at": "2023-12-07T18:21:10.395924Z"}], "thread_id": "Thread-1 (worker)", "execution_time": 0.037397146224975586, "adapter_response": {}, "message": null, "failures": null, "unique_id": "test.dse_analytics.source_not_null_state_entities_ebudget_agency_and_department_budgets__web_agency_cd_.1100593dca"}, {"status": "success", "timing": [{"name": "compile", "started_at": "2023-12-07T18:21:10.376020Z", "completed_at": "2023-12-07T18:21:10.388833Z"}, {"name": "execute", "started_at": "2023-12-07T18:21:10.400358Z", "completed_at": "2023-12-07T18:21:10.400368Z"}], "thread_id": "Thread-3 (worker)", "execution_time": 0.0380864143371582, "adapter_response": {}, "message": null, "failures": null, "unique_id": "test.dse_analytics.source_not_null_state_entities_ebudget_program_budgets__program_code_.29940cbe43"}, {"status": "success", "timing": [{"name": "compile", "started_at": "2023-12-07T18:21:10.391448Z", "completed_at": "2023-12-07T18:21:10.399004Z"}, {"name": "execute", "started_at": "2023-12-07T18:21:10.405399Z", "completed_at": "2023-12-07T18:21:10.405408Z"}], "thread_id": "Thread-2 (worker)", "execution_time": 0.033904075622558594, "adapter_response": {}, "message": null, "failures": null, "unique_id": "test.dse_analytics.source_unique_state_entities_ebudget_agency_and_department_budgets__org_cd_.f2687093d8"}, {"status": "success", "timing": [{"name": "compile", "started_at": "2023-12-07T18:21:10.383694Z", "completed_at": "2023-12-07T18:21:10.399695Z"}, {"name": "execute", "started_at": "2023-12-07T18:21:10.406124Z", "completed_at": "2023-12-07T18:21:10.406134Z"}], "thread_id": "Thread-4 (worker)", "execution_time": 0.03625774383544922, "adapter_response": {}, "message": null, "failures": null, "unique_id": "test.dse_analytics.source_unique_state_entities_base_entities__L3_.28ccbe9ad4"}, {"status": "success", "timing": [{"name": "compile", "started_at": "2023-12-07T18:21:10.411904Z", "completed_at": "2023-12-07T18:21:10.420197Z"}, {"name": "execute", "started_at": "2023-12-07T18:21:10.431866Z", "completed_at": "2023-12-07T18:21:10.431880Z"}], "thread_id": "Thread-1 (worker)", "execution_time": 0.03448295593261719, "adapter_response": {}, "message": null, "failures": null, "unique_id": "test.dse_analytics.source_unique_state_entities_ebudget_agency_and_department_budgets__web_agency_cd_.926c843b75"}, {"status": "success", "timing": [{"name": "compile", "started_at": "2023-12-07T18:21:10.421749Z", "completed_at": "2023-12-07T18:21:10.437169Z"}, {"name": "execute", "started_at": "2023-12-07T18:21:10.443002Z", "completed_at": "2023-12-07T18:21:10.443011Z"}], "thread_id": "Thread-3 (worker)", "execution_time": 0.03521323204040527, "adapter_response": {}, "message": null, "failures": null, 
"unique_id": "model.dse_analytics.int_state_entities__active"}, {"status": "success", "timing": [{"name": "compile", "started_at": "2023-12-07T18:21:10.428503Z", "completed_at": "2023-12-07T18:21:10.438016Z"}, {"name": "execute", "started_at": "2023-12-07T18:21:10.444404Z", "completed_at": "2023-12-07T18:21:10.444412Z"}], "thread_id": "Thread-2 (worker)", "execution_time": 0.028679847717285156, "adapter_response": {}, "message": null, "failures": null, "unique_id": "model.dse_analytics.int_state_entities__technical"}, {"status": "success", "timing": [{"name": "compile", "started_at": "2023-12-07T18:21:10.432789Z", "completed_at": "2023-12-07T18:21:10.440721Z"}, {"name": "execute", "started_at": "2023-12-07T18:21:10.447097Z", "completed_at": "2023-12-07T18:21:10.447106Z"}], "thread_id": "Thread-4 (worker)", "execution_time": 0.02930283546447754, "adapter_response": {}, "message": null, "failures": null, "unique_id": "test.dse_analytics.not_null_stg_department_of_finance__entities_primary_code.ab13df4014"}, {"status": "success", "timing": [{"name": "compile", "started_at": "2023-12-07T18:21:10.452422Z", "completed_at": "2023-12-07T18:21:10.460961Z"}, {"name": "execute", "started_at": "2023-12-07T18:21:10.471786Z", "completed_at": "2023-12-07T18:21:10.471798Z"}], "thread_id": "Thread-1 (worker)", "execution_time": 0.03220653533935547, "adapter_response": {}, "message": null, "failures": null, "unique_id": "test.dse_analytics.not_null_stg_ebudget__budgets_primary_code.11fe170121"}, {"status": "success", "timing": [{"name": "compile", "started_at": "2023-12-07T18:21:10.464887Z", "completed_at": "2023-12-07T18:21:10.478239Z"}, {"name": "execute", "started_at": "2023-12-07T18:21:10.484419Z", "completed_at": "2023-12-07T18:21:10.484427Z"}], "thread_id": "Thread-3 (worker)", "execution_time": 0.028622865676879883, "adapter_response": {}, "message": null, "failures": null, "unique_id": "model.dse_analytics.dim_state_entities__agencies"}, {"status": "success", "timing": [{"name": "compile", "started_at": "2023-12-07T18:21:10.468103Z", "completed_at": "2023-12-07T18:21:10.479079Z"}, {"name": "execute", "started_at": "2023-12-07T18:21:10.485162Z", "completed_at": "2023-12-07T18:21:10.485170Z"}], "thread_id": "Thread-2 (worker)", "execution_time": 0.028517723083496094, "adapter_response": {}, "message": null, "failures": null, "unique_id": "model.dse_analytics.int_state_entities__budgets"}, {"status": "success", "timing": [{"name": "compile", "started_at": "2023-12-07T18:21:10.472636Z", "completed_at": "2023-12-07T18:21:10.480511Z"}, {"name": "execute", "started_at": "2023-12-07T18:21:10.486433Z", "completed_at": "2023-12-07T18:21:10.486441Z"}], "thread_id": "Thread-4 (worker)", "execution_time": 0.03450274467468262, "adapter_response": {}, "message": null, "failures": null, "unique_id": "test.dse_analytics.not_null_int_state_entities__active_primary_code.498a9cc863"}, {"status": "success", "timing": [{"name": "compile", "started_at": "2023-12-07T18:21:10.489431Z", "completed_at": "2023-12-07T18:21:10.499260Z"}, {"name": "execute", "started_at": "2023-12-07T18:21:10.505921Z", "completed_at": "2023-12-07T18:21:10.505930Z"}], "thread_id": "Thread-1 (worker)", "execution_time": 0.02988719940185547, "adapter_response": {}, "message": null, "failures": null, "unique_id": "test.dse_analytics.unique_int_state_entities__active_primary_code.cb6e0784fe"}, {"status": "success", "timing": [{"name": "compile", "started_at": "2023-12-07T18:21:10.507184Z", "completed_at": "2023-12-07T18:21:10.522330Z"}, {"name": 
"execute", "started_at": "2023-12-07T18:21:10.526646Z", "completed_at": "2023-12-07T18:21:10.526657Z"}], "thread_id": "Thread-3 (worker)", "execution_time": 0.028851985931396484, "adapter_response": {}, "message": null, "failures": null, "unique_id": "test.dse_analytics.not_null_int_state_entities__technical_primary_code.92bdfb0772"}, {"status": "success", "timing": [{"name": "compile", "started_at": "2023-12-07T18:21:10.513608Z", "completed_at": "2023-12-07T18:21:10.527314Z"}, {"name": "execute", "started_at": "2023-12-07T18:21:10.537227Z", "completed_at": "2023-12-07T18:21:10.537238Z"}], "thread_id": "Thread-2 (worker)", "execution_time": 0.03738903999328613, "adapter_response": {}, "message": null, "failures": null, "unique_id": "test.dse_analytics.not_null_dim_state_entities__agencies_agency_code.8ad3a79d8e"}, {"status": "success", "timing": [{"name": "compile", "started_at": "2023-12-07T18:21:10.518273Z", "completed_at": "2023-12-07T18:21:10.528061Z"}, {"name": "execute", "started_at": "2023-12-07T18:21:10.537946Z", "completed_at": "2023-12-07T18:21:10.537956Z"}], "thread_id": "Thread-4 (worker)", "execution_time": 0.03692150115966797, "adapter_response": {}, "message": null, "failures": null, "unique_id": "test.dse_analytics.not_null_dim_state_entities__agencies_name.d3e367a95b"}, {"status": "success", "timing": [{"name": "compile", "started_at": "2023-12-07T18:21:10.531686Z", "completed_at": "2023-12-07T18:21:10.539948Z"}, {"name": "execute", "started_at": "2023-12-07T18:21:10.549625Z", "completed_at": "2023-12-07T18:21:10.549637Z"}], "thread_id": "Thread-1 (worker)", "execution_time": 0.026700735092163086, "adapter_response": {}, "message": null, "failures": null, "unique_id": "test.dse_analytics.unique_dim_state_entities__agencies_agency_code.efd290c291"}, {"status": "success", "timing": [{"name": "compile", "started_at": "2023-12-07T18:21:10.545350Z", "completed_at": "2023-12-07T18:21:10.553145Z"}, {"name": "execute", "started_at": "2023-12-07T18:21:10.561520Z", "completed_at": "2023-12-07T18:21:10.561532Z"}], "thread_id": "Thread-3 (worker)", "execution_time": 0.023451566696166992, "adapter_response": {}, "message": null, "failures": null, "unique_id": "test.dse_analytics.unique_dim_state_entities__agencies_name.44718adf21"}, {"status": "success", "timing": [{"name": "compile", "started_at": "2023-12-07T18:21:10.555677Z", "completed_at": "2023-12-07T18:21:10.564117Z"}, {"name": "execute", "started_at": "2023-12-07T18:21:10.565443Z", "completed_at": "2023-12-07T18:21:10.565450Z"}], "thread_id": "Thread-2 (worker)", "execution_time": 0.015454769134521484, "adapter_response": {}, "message": null, "failures": null, "unique_id": "test.dse_analytics.not_null_int_state_entities__budgets_primary_code.2d2e132a3f"}], "elapsed_time": 2.5980443954467773, "args": {"send_anonymous_usage_stats": false, "partial_parse": true, "enable_legacy_logger": false, "empty_catalog": false, "log_level": "info", "use_colors": true, "log_level_file": "debug", "write_json": true, "quiet": false, "printer_width": 80, "populate_cache": true, "select": [], "version_check": true, "warn_error": true, "which": "generate", "vars": {}, "warn_error_options": {"include": [], "exclude": []}, "invocation_command": "dbt docs generate --project-dir=transform", "log_file_max_bytes": 10485760, "project_dir": "transform", "print": true, "strict_mode": false, "indirect_selection": "eager", "introspect": true, "static_parser": true, "exclude": [], "profiles_dir": "ci", "log_format": "default", "log_path": "transform/logs", 
"defer": false, "favor_state": false, "cache_selected_only": false, "compile": true, "log_format_file": "debug", "macro_debugging": false, "use_colors_file": true}}
\ No newline at end of file
diff --git a/dbt_docs_snowflake/semantic_manifest.json b/dbt_docs_snowflake/semantic_manifest.json
new file mode 100644
index 00000000..b1bc37d7
--- /dev/null
+++ b/dbt_docs_snowflake/semantic_manifest.json
@@ -0,0 +1 @@
+{"semantic_models": [], "metrics": [], "project_configuration": {"time_spine_table_configurations": [], "metadata": null, "dsi_package_version": {"major_version": "0", "minor_version": "2", "patch_version": "0"}}}
\ No newline at end of file
diff --git a/images/codespace-secrets.png b/images/codespace-secrets.png
new file mode 100644
index 00000000..53f09ee1
Binary files /dev/null and b/images/codespace-secrets.png differ
diff --git a/images/column-pruning.png b/images/column-pruning.png
new file mode 100644
index 00000000..47e28bcc
Binary files /dev/null and b/images/column-pruning.png differ
diff --git a/images/columnar-storage.png b/images/columnar-storage.png
new file mode 100644
index 00000000..6ecf5961
Binary files /dev/null and b/images/columnar-storage.png differ
diff --git a/images/create-new-codespace.png b/images/create-new-codespace.png
new file mode 100644
index 00000000..3320311b
Binary files /dev/null and b/images/create-new-codespace.png differ
diff --git a/images/dbt_model_timing.png b/images/dbt_model_timing.png
new file mode 100644
index 00000000..c62c527e
Binary files /dev/null and b/images/dbt_model_timing.png differ
diff --git a/images/dbt_run_summary.png b/images/dbt_run_summary.png
new file mode 100644
index 00000000..d79a17d8
Binary files /dev/null and b/images/dbt_run_summary.png differ
diff --git a/images/initial-query.png b/images/initial-query.png
new file mode 100644
index 00000000..fdbe3bba
Binary files /dev/null and b/images/initial-query.png differ
diff --git a/images/launch-codespace.png b/images/launch-codespace.png
new file mode 100644
index 00000000..5074189b
Binary files /dev/null and b/images/launch-codespace.png differ
diff --git a/images/odi-circle_logomark-blue.png b/images/odi-circle_logomark-blue.png
new file mode 100644
index 00000000..5e594697
Binary files /dev/null and b/images/odi-circle_logomark-blue.png differ
diff --git a/images/odi-square_logomark-blue.svg b/images/odi-square_logomark-blue.svg
new file mode 100644
index 00000000..deb7ad3f
--- /dev/null
+++ b/images/odi-square_logomark-blue.svg
@@ -0,0 +1,4 @@
+
diff --git a/images/partition-pruning.png b/images/partition-pruning.png
new file mode 100644
index 00000000..29947c14
Binary files /dev/null and b/images/partition-pruning.png differ
diff --git a/index.html b/index.html
new file mode 100644
index 00000000..bbf149b4
--- /dev/null
+++ b/index.html
@@ -0,0 +1,672 @@
+
+ CalData Data Services and Engineering Infrastructure
+
CalData Data Services and Engineering Infrastructure
+
This is the technical documentation for CalData's
+Data Services and Engineering (DSE) projects.
+It consists of processes, conventions, instructions, and architecture diagrams.
Cloud data warehouses (CDWs) are databases hosted in the cloud,
+typically optimized for analytical queries like aggregations and window functions,
+rather than the transactional queries that might support a traditional application.
+Examples of popular cloud data warehouses include
+Google BigQuery,
+Amazon Redshift,
+and Snowflake.
+
Cloud data warehouses typically have a few advantages over traditional transactional databases for analytical workflows, including:
+
+
They are usually managed services, meaning you don't have to provision and maintain servers.
+
They can scale to truly massive data.
+
+
By having a solid understanding of how cloud data warehouses work,
+you can construct fast, efficient queries and avoid surprise costs.
With most on-premises transactional warehouses, costs scale with the number of server instances you buy and run.
+These servers are usually always on and power various applications with high availability.
+In a traditional transactional warehouse, both compute power and storage are associated with the same logical machine.
+
Cloud data warehouses typically have a different pricing model:
+they decouple storage and compute and charge based on your query usage.
+Google BigQuery charges based on the amount of data your queries scan.
+Snowflake charges based on the amount of compute resources needed to execute your queries.
+There are also costs associated with data storage, but those are usually small compared to compute.
+Though these two models are slightly different, they both lead to a similar take-home lesson:
+by being careful with how data are laid out and accessed,
+you can significantly reduce both execution time and cost for your cloud data warehouses.
Most cloud data warehouses use columnar storage for their data.
+This means that data for each column of a table are stored sequentially in object storage
+(this is in contrast to transactional databases which usually store each row, or record, sequentially in storage).
+This BigQuery blog post goes into a bit more detail.
+
+
There are a number of consequences of using columnar storage:
+
+
You can read in columns separately from each other. So if your query only needs to look at one column of a several-hundred column table, it can do that without incurring the cost of loading and processing all of the other columns.
+
Because the values in a column are located near each other in device storage, it is much faster to read them all at once for analytical queries like aggregations or window functions. In row-based storage, there is much more jumping around to different parts of memory.
+
Having values of the same data type stored sequentially allows for much more efficient serialization and compression of the data at rest.
+
+
In addition to columnar storage,
+cloud data warehouses also usually divide tables row-wise into chunks called partitions.
+Different warehouses choose different sizing strategies for partitions,
+but they are typically from a few to a few hundred megabytes.
+Having separate logical partitions in a table allows the compute resources to process the partitions independently of each other in parallel.
+This massively parallel processing capability is a large part of what makes cloud data warehouses scalable.
+When designing your tables, you can often set partitioning strategies or clustering keys for the table.
+This tells the cloud data warehouse to store rows with similar values for those keys within the same partitions.
+A well-partitioned table can enable queries to only read from the partitions that it needs, and ignore the rest.
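+
As a concrete (hypothetical) illustration, in BigQuery SQL a date-partitioned and clustered table can be declared at creation time. The dataset, table, and column names below are assumptions for the sketch, not a real table:
+
CREATE TABLE analytics.events (
  event_name STRING,
  event_date DATE,
  device STRUCT<web_info STRUCT<browser STRING>>
)
PARTITION BY event_date    -- rows are grouped into partitions by date
CLUSTER BY event_name;     -- rows within a partition are co-located by event_name
+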
With the above understanding of how cloud data warehouses store and process data,
+we can write down a set of recommendations for how to construct efficient queries for large tables stored within them:
+
+
Only SELECT the columns you need. Columnar storage allows you to ignore the columns you don't need, and avoid the cost of reading them in. SELECT * can get expensive!
+
If the table has a natural ordering, consider setting a partitioning or clustering key. For example, if the data in the table consists of events with an associated timestamp, you might want to partition according to that timestamp. Then events with similar times would be stored near each other in the same or adjacent partitions, and queries selecting for a particular date range would have to scan fewer partitions.
+
If the table has a partitioning or clustering key already set, try to filter based on that in your queries. This can greatly reduce the amount of data you need to scan.
+
Filter early in complex queries, rather than at the end. If you have complex, multi-stage queries, filtering down to the subset of interest at the outset can avoid the need to process unnecessary data and then throw it away later in the query.
+
+
+
Note
+
For people coming from transactional databases,
+the considerations about partitioning and clustering may seem reminiscent of indexes.
+Cloud data warehouses usually don't have traditional indexes,
+but partitioning and clustering keys fill approximately the same role,
+tailored to the distributed compute model.
A central feature of cloud data warehouses is that storage is separate from compute,
+and data can be processed in parallel by distributed compute resources.
+The less communication that needs to happen between these distributed compute resources,
+the faster they can work.
+For this reason, most cloud data warehouses do not support primary keys,
+foreign keys, or other constraints.
+
For example: if we have a foreign key constraint set on a table and insert a new record,
+we would have to scan every single row of the parent table to see if the referenced row exists and is unique.
+If the table is large and partitioned, this could mean spinning up a large amount of compute resources,
+just to insert a single row.
+So rather than supporting constraints with horrible performance characteristics,
+cloud data warehouses just don't enforce them.
+This can be surprising, since the warehouses often still accept the syntax for constraints for SQL-standard compatibility
+(see the Snowflake docs on constraints).
+
+
Note
+
One exception to the above is NOT NULL constraints,
+which can be enforced cheaply since they don't require information from other tables or partitions.
This exercise is intended to be done live with collaborators.
+It should read fine on its own, but will be more impactful in a live lab setting! We'll be querying Google Analytics 4 data stored in BigQuery. The dataset in question consists of (at the time of this writing) about six months of user event data collected from websites under the ca.gov domain. It has over 500 million events in about 400 gigabytes of storage. The table is partitioned by event date, so all events on the same day get put in the same partition.
Suppose we want to analyze the breakdown of the different web browsers used to access state sites so we can understand which browsers are the highest priority to support. We expect this to be a moving target as different browsers become more or less popular, so we'll try to restrict our analysis to the month of January, 2023.
+Fortunately, the dataset has a timestamp column, so we can try to filter based on that column:
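+
The query itself isn't reproduced in this page, so here is a minimal sketch of the idea, assuming a GA4-style event_timestamp column (microseconds since the Unix epoch) and a hypothetical table name:
+
SELECT *
FROM `analytics.ga4_events`  -- hypothetical table name
WHERE TIMESTAMP_MICROS(event_timestamp)
  BETWEEN TIMESTAMP('2023-01-01') AND TIMESTAMP('2023-02-01');
+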
Yikes! This query scans the whole 400 GB dataset. Based on Google's approximately $5/TB charge, this costs about $2, and if it were a query we were running many times a day, it could easily start costing thousands of dollars per year.
You'll note that we are doing a SELECT * query, but if we're interested in browser usage, we really only need that column. So let's just SELECT that column:
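+
A sketch of the narrower query, assuming the browser name lives in a nested device.web_info.browser column as in the standard GA4 export schema:
+
SELECT device.web_info.browser
FROM `analytics.ga4_events`
WHERE TIMESTAMP_MICROS(event_timestamp)
  BETWEEN TIMESTAMP('2023-01-01') AND TIMESTAMP('2023-02-01');
+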
By just selecting the column we wanted, we avoided loading a lot of unnecessary data, and now we are only scanning ~4 GB of data, reducing the charge by 99%!
In the above query we are filtering based on the event_timestamp field. However, the dataset actually has two different time-like fields, and it is partitioned based on the other one! The query planner is not smart enough to know that both fields contain similar information, and it is therefore not able to infer that we don't need to scan every partition to get the data within the January time window. Let's fix that by re-working the query to use the partitioned-by DATE field:
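+
Assuming the partitioning column is a DATE field named event_date (a hypothetical name), the reworked query might look like:
+
SELECT device.web_info.browser
FROM `analytics.ga4_events`
WHERE event_date BETWEEN '2023-01-01' AND '2023-01-31';  -- filter on the partition column
+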
By using the field by which the table is partitioned in our filter, we reduced the data scanned by another factor of ~5 (as discussed above, this is analogous to using an index).
Many CalData projects use dbt
+for transforming and modeling data within our cloud data warehouses.
+dbt has become extremely popular over the last several years,
+popularizing the practice and position of "analytics engineering".
+It has a number of features that make it valuable for data stacks:
+
+
It works well with version control
+
It encourages modular, reusable SQL code
+
It makes it easier to track data lineage as it flows through your data warehouse
+
It has a large, active community with which you can share tips and techniques
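+
As a small illustration of the modular SQL style dbt encourages, one model can reference another with ref(), which is also how dbt builds its lineage graph; the model and column names below are hypothetical:
+
-- models/dim_entities.sql
SELECT
    entity_id,
    entity_name
FROM {{ ref('stg_entities') }}  -- ref() wires this model to its upstream dependency
WHERE is_active
+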
Git and GitHub occupy central positions in modern software development practices.
+
Git is software, installed locally on your machine, that enables source code management (SCM).
+Code is organized into folder-like structures called git repositories,
+which enable you to track the history of code,
+safely develop features in side branches,
+and collaborate with others in a distributed fashion.
+
GitHub is a web platform for hosting git repositories.
+It integrates tightly with local git development workflows,
+and includes additional features like a code review user interface,
+issue tracking, project boards,
+continuous integration/continuous delivery,
+and social networking.
+There are a number of web platforms which have similar features and goals as GitHub
+(including Bitbucket and GitLab), but GitHub is the most commonly used.
In addition to the fundamentals of git,
+it's also helpful to know how to use the GitHub web platform for development.
+GitHub hosts an excellent set of interactive tutorials
+for learning to use its various features, including:
On the CalData Data Services and Engineering team we make heavy use of git and GitHub for our projects,
+and have our own set of guidelines and best practices for code review.
This glossary is a reference for commonly used acronyms, terms, and tools associated with the modern data stack and data and analytics engineering practices.
Modern data stack - a cloud-first suite of software tools that enable data teams to connect to, process, store, transform, and visualize data.
+
+
+
ETL vs ELT – ETL (Extract, Transform and Load) and ELT (Extract, Load and Transform) are data integration methods that determine whether data is preprocessed before landing in storage or transformed after being stored.
+
Both methods have the same three operations:
+
+
Extraction: Pulling data from its original source system (e.g. connecting to data from a SaaS platform like Google Analytics)
+
Transformation: Changing the data’s structure so it can be integrated into the target data system (e.g. changing geospatial data from a JSON structure to a Parquet format)
+
Loading: Dumping data into a storage system (e.g. AWS S3 bucket or GCS)
+
+
Advantages of ELT over ETL:
+
+
More flexibility, as ETL is traditionally intended for relational, structured data. Cloud-based data warehouses enable ELT for structured and unstructured data
+
Greater accessibility, as ETL is generally supported, maintained, and governed by organizations’ IT departments. ELT allows for easier access and use by employees
+
Scalability, as ETL can be prohibitively resource-intensive for some businesses. ELT solutions are generally cloud-based SaaS, available to a broader range of businesses
+
Faster load times, as ETL typically takes longer because it uses a staging area and system. With ELT, there is only one load into the destination system
+
Faster transformation times, as ETL is typically slower and dependent on the size of the data set(s). ELT transformation is not dependent on data size
+
Less time required for data maintenance, as data may need to be re-sourced and re-loaded if the transformation is found to be inadequate for the data’s intended purposes. With ELT, the original data is intact and already loaded from disk
Columnar Database vs Relational Database - a columnar database stores data by columns making it suitable for analytical query processing whereas a relational database stores data by rows making it optimized for transactional applications
+
Advantages of Columnar over Relational databases:
+
+
Reduces the amount of data that needs to be loaded
+
Improves query performance by returning relevant data faster (instead of going row by row, multiple fields can be skipped)
+
+
+
+
Cloud Data Warehouse - a database stored as a managed service in a public cloud optimized for scalable analytics.
+
Tools like Excel, Tableau, or PowerBI are limited in how much data can be brought into the dashboard. Cloud data warehouses, however, can handle petabyte-scale data without too much fuss. Now, PowerBI or Tableau can also pass off data processing to a cloud data warehouse, but then major data processing jobs get hidden in a dashboard panel, which can produce unexpected spend, poor code reusability, and brittle dashboards.
+
+
+
Analytics engineering - applies software engineering practices, like version control and continuous integration/deployment, to analytical workflows. Analytics engineers are often thought of as a hybrid between data engineers and data analysts. Most analytics engineers spend their time transforming, modeling, testing, deploying, and documenting data. Data modeling – applying business logic to data to represent commonly known truths across an organization (for example, what data defines an order) – is the core of their workload and enables analysts and other data consumers to answer their own questions.
+
Originally coined two years ago by Michael Kaminsky, the term came from the ground up, when data people experienced a shift in their jobs: they went from handling data engineer/scientist/analyst tasks to spending most of their time fixing, cleaning, and transforming data. And so, they (mainly members of the dbt community) created a term to describe this middle-seat role: the Analytics Engineer.
+
dbt comes in as a SQL-first transformation layer built for modern data warehousing and ingestion tools that centralizes data models, tests, and documentation.
Agile development - is an iterative approach to software development (and project management) that helps teams ship code faster and with fewer bugs.
+
+
Sprints - a time-boxed period (usually 2 weeks) when a team works to complete a set amount of work. Some sprints have themes; for example, if a new tool was procured, an entire sprint may be dedicated to setup and onboarding.
Our approach is adapted from this blog post.
+The goals of establishing a naming convention are:
+1. Prevent name collisions between similar resources (especially in cases where names are required to be unique).
+2. Allow developers to identify at a glance what a particular resource is and who owns it.
+3. Allow for easier sorting and filtering of resources.
+Names are assembled as {owner}-{project}-{env}-[{region}]-[{description}]-[{suffix}], where {...} indicates a component in the name, and [{...}] indicates that it is optional or conditionally required.
+
| Component   | Description                                                        | Required | Constraints      |
|-------------|--------------------------------------------------------------------|----------|------------------|
| owner       | Owner of the resource                                              | ✔        | len 3-6          |
| project     | Project name                                                       | ✔        | len 4-10, a-z0-9 |
| env         | Environment type, e.g. dev, prd, stg                               | ✔        | len 3, a-z, enum |
| region      | Region (if applicable)                                             | ✗        | enum             |
| description | Additional description (if needed)                                 | ✗        | len 1-20, a-z0-9 |
| suffix      | Random suffix (only use if there are multiple identical resources) | ✗        | len 3, a-z0-9    |
+
Owner:
+This is a required field.
+For most of our projects, it will be dse (for Data Services and Engineering),
+though it could be other things for projects that we will be handing off to clients upon completion.
+
Project:
+A short project name. This is a required field. For general DSE infrastructure, use infra.
+
Environment:
+The deployment environment. This is a required field.
+Generally prd (production), stg (staging), or dev (development).
+
Region:
+If the resource exists in a particular region (e.g. us-west-1), this should be included.
+
Description:
+There may be multiple resources that are identical with respect to the above parameters,
+but have a different purpose.
+In that case, append a description to the name to describe that purpose.
+For instance, we might have multiple subnets, some of which are public and some of which are private. Or we could have multiple buckets for storing different kinds of data within the same project.
+
Suffix:
+If all of the above are identical (including the description),
+include a random suffix.
+This can be accomplished with the terraform random_id resource.
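+
A minimal sketch of using random_id to build such a suffix; the resource names and bucket name below are illustrative assumptions, not taken from this project's configuration:
+
resource "random_id" "bucket_suffix" {
  byte_length = 2  # yields a short hex suffix
}

resource "aws_s3_bucket" "scratch" {
  bucket = "dse-infra-dev-us-west-1-scratch-${random_id.bucket_suffix.hex}"
}
+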
GCP Project: We have a GCP project for managing web/product analytics collaborations with the CalInnovate side of ODI. GCP projects do not exist in a region, so a production project could be called dse-product-analytics-prd.
+
MWAA Environment: An Apache Airflow environment in AWS does exist in a region, and supports general DSE infrastructure. So a development deployment of the environment could be dse-infra-dev-us-west-2.
+
Scratch bucket: We might have several S3 buckets supporting different aspects of a project. For instance, one bucket could be used for scratch work, and another could be used for source data. The scratch bucket could then be named dse-infra-dev-us-west-1-scratch.
Cloud resources can be tagged with user-specified key-value pairs
+which allow for resource and cost tracking within a given cloud account.
+
Our tagging convention is that information which is available in the resource name should also be available in tags as a specific key-value pair:
+
+
| Key         | Value         | Required |
|-------------|---------------|----------|
| Owner       | {owner}       | ✔        |
| Project     | {project}     | ✔        |
| Environment | {env}         | ✔        |
| Description | {description} | ✗        |
+
Note that the {region} and {suffix} components are not included.
+This is because the region information is typically available elsewhere in the API/Console,
+and the suffix information is not semantically meaningful.
Not all resources can follow the above convention exactly.
+For instance, some resource names may not allow hyphens, or may have length limits.
+In those cases, we should try to adhere to the conventions as closely as possible
+(e.g., by substituting underscores for hyphens)
+and document the exception here.
Data warehouse schemas (or datasets in BigQuery) are often user/analyst-facing,
+and have different considerations.
+They usually cannot have hyphens in them, so words should be separated with underscores "_".
+Furthermore, analysts don't need to know details of regions or deployments, so region and env are dropped, and the naming convention becomes:
+
{owner}_{project}_[{description}]
+
+
If a project is owned by the Data Services and Engineering team,
+the owner component may be omitted, and the schema name is simply
+
{project}_[{description}]
+
+Note that Snowflake normalizes all object names to upper case.
+This is the opposite of how PostgreSQL normalizes object names (sigh).
+Most of the time this doesn't matter, but it occasionally requires thought if you have a mixed-case object name. If you are naming new database tables or schemas, mixed-case identifiers should be avoided.
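+
For example, a small (hypothetical) illustration of the case-normalization behavior:
+
CREATE TABLE my_schema.my_table (id INT);     -- stored as MY_SCHEMA.MY_TABLE
SELECT * FROM MY_SCHEMA.MY_TABLE;             -- works: unquoted names are upper-cased
CREATE TABLE my_schema."MixedCase" (id INT);  -- a quoted identifier keeps its case
SELECT * FROM my_schema.MixedCase;            -- fails: resolves to MIXEDCASE, which doesn't exist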
+
The names of tables loaded by Fivetran are typically set by either Fivetran or the names of the tables in the source systems.
+As such, we don't have much control over them, and they won't adhere to any particular naming conventions.
+
Fivetran connector names cannot contain hyphens, and should follow this pattern:
+The schemas into which a Fivetran connector writes should be named the same as the connector
+(which is why the connector name has some seemingly redundant information).
+
If a project is owned by the Data Services and Engineering team,
+the owner component may be omitted, and the schema name is simply
+
The DSE team regularly creates new Snowflake accounts in our Snowflake org.
+We do this instead of putting all of our data projects into our main account for a few reasons:
+
+
At the end of a project, we often want to transfer account ownership to our partners.
+ Having it separated from the start helps that process.
+
We frequently want to add our project champion or IT partners to our account as admins.
+ This is safer if project accounts are separate.
+
We often want to have accounts in a specific cloud and region for compliance or data transfer reasons.
+
Different projects may require different approaches to account-level operations like OAuth/SAML.
+
+
Here we document the steps to creating a new Snowflake account from scratch.
In order to create a new account, you will need access to the orgadmin role.
+If you have accountadmin in the primary Snowflake account, you can grant it to yourself:
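+
A sketch of the grant using standard Snowflake SQL (substitute your own username):
+
USE ROLE accountadmin;
GRANT ROLE orgadmin TO USER <your-username>;
USE ROLE orgadmin;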
We typically create our Snowflake architecture using Terraform.
+Terraform state is stored in S3 buckets within our AWS account,
+so you will need read/write access to those buckets.
You can install Terraform using whatever approach makes sense for your system,
+including using brew or conda.
+
Here is a sample for installing the dependencies using conda:
+
conda create -n infra python=3.10  # create an environment named 'infra'
+conda activate infra  # activate the environment
+conda install -c conda-forge terraform tflint  # install terraform and tflint
+
Under the "Admin" side panel, go to "Accounts" and click the "+ Account" button:
+
Select the cloud and region appropriate to the project. The region should be in the United States.
+
Select "Business Critical" for the Snowflake Edition.
+
You will be prompted to create an initial user with ACCOUNTADMIN privileges. This should be you.
+ You will be prompted to create a password for your user. Create one using your password manager,
+ but know that it will ask you to change your password upon first log-in.
+
Save the Account Locator and Account URL for your new account.
+
+
+
Log into your new account. You should be prompted to change your password. Save the updated password in your password manager.
Certain Snowflake clients don't properly cache MFA tokens,
+which means that using them can generate dozens or hundreds of MFA requests on your phone.
+At best this makes the tools unusable, and at worst it can lock your Snowflake account.
+One example of such a tool is (as of this writing) the Snowflake Terraform Provider.
+
The recommended workaround for this is to add a key pair to your account for use with those tools.
+
+
Follow the instructions given here
+ to generate a key pair and add the public key to your account.
+ Keep the key pair in a secure place on your device.
+ This gist
+ contains the bash commands from the instructions,
+ and can be helpful for quickly creating a new encrypted key pair.
+ Usage of the script looks like:
+
+ You can use pbcopy < _your_public_key_file_name_.pub to copy the contents of your public key.
+ Be sure to remove the -----BEGIN PUBLIC KEY----- and -----END PUBLIC KEY----- portions
+ when adding your key to your Snowflake user.
+
In your local .bash_profile or an .env file, add environment variables for
+ SNOWFLAKE_ACCOUNT, SNOWFLAKE_USER, SNOWFLAKE_PRIVATE_KEY_PATH,
+ and (if applicable) SNOWFLAKE_PRIVATE_KEY_PASSPHRASE.
By default, Snowflake logs out user sessions after four hours of inactivity.
+ODI's information security policies prefer that we log out after one hour of inactivity for most accounts,
+and after fifteen minutes of inactivity for particularly sensitive accounts.
+
+
Note
+
It's possible we will do this using Terraform in the future,
+but at the time of this writing the Snowflake Terraform provider does not support session policies.
+
+
After the Snowflake account is created, run the following script in a worksheet
+to set the appropriate session policy:
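+
The script isn't reproduced here; a minimal sketch using Snowflake's session policy DDL, with the one-hour idle timeout described above and hypothetical database and policy names, might look like:
+
USE ROLE accountadmin;
CREATE DATABASE IF NOT EXISTS policies;
CREATE SESSION POLICY policies.public.account_session_policy
  SESSION_IDLE_TIMEOUT_MINS = 60
  SESSION_UI_IDLE_TIMEOUT_MINS = 60;
ALTER ACCOUNT SET SESSION POLICY policies.public.account_session_policy;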
Create a new git repository from the CalData Infrastructure Template
+following the instructions here.
+
Once you have created the repository, push it to a remote repository in GitHub.
+There are some GitHub actions that will fail because the repository is not yet
+configured to work with the new Snowflake account.
We will create two separate deployments of the project infrastructure,
+one for development, and one for production.
+In some places we will refer to project name and owner as <project> and <owner>, respectively,
+following our naming conventions.
+You should substitute the appropriate names there.
Ensure that your environment has environment variables set for
+ SNOWFLAKE_ACCOUNT, SNOWFLAKE_USER, SNOWFLAKE_PRIVATE_KEY_PATH, and SNOWFLAKE_PRIVATE_KEY_PASSPHRASE.
+ Make sure you don't have any other SNOWFLAKE_* variables set,
+ as they can interfere with authentication.
+
In the new git repository, create a directory to hold the development Terraform configuration:
+
mkdir -p terraform/environments/dev/
+
+ The location of this directory is by convention, and subject to change.
+
Copy the terraform configuration from
+ here
+ to your dev directory.
+
In the "elt" module of main.tf, change the source parameter to point to
+ "github.com/cagov/data-infrastructure.git//terraform/snowflake/modules/elt?ref=<ref>"
+ where <ref> is the short hash of the most recent commit in the data-infrastructure repository.
+
In the dev directory, create a new backend configuration file called <owner>-<project>-dev.tfbackend.
+ The file will point to the S3 bucket in which we are storing terraform state. Populate the backend
+ configuration file with the following (making sure to substitute values for <owner> and <project>):
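+
+ The exact values come from the outputs of the remote-state configuration described above; a sketch of what an S3 tfbackend file typically contains (the bucket name and region here are assumptions) is:
+
+ bucket = "<owner>-<project>-dev-terraform-state"
+ key    = "terraform.tfstate"
+ region = "us-west-2"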
+
In the dev directory, create a terraform variables file called terraform.tfvars,
+ and populate the "elt" module variables. These variables may expand in the future,
+ but at the moment they are just the new Snowflake account locator and the environment
+ (in this case "DEV"):
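+
+ A sketch of the file; the variable names are assumptions, so check the "elt" module's variables.tf for the real ones:
+
+ snowflake_account = "<new-account-locator>"
+ environment       = "DEV"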
+
Add your new main.tf, terraform.tfvars, <owner>-<project>-dev.tfbackend,
+ and terraform lock file to the git repository. Do not add the .terraform/ directory.
Ensure that your local environment has environment variables set for SNOWFLAKE_ACCOUNT,
+ SNOWFLAKE_USER, SNOWFLAKE_PRIVATE_KEY_PATH, and SNOWFLAKE_PRIVATE_KEY_PASSPHRASE,
+ and that they are set to your new account, rather than any other accounts.
+
Run terraform plan to see the plan for the resources that will be created.
+ Inspect the plan to see that everything looks correct.
+
Run terraform apply to deploy the configuration. This will actually create the infrastructure!
+
+
Configure and deploy the production configuration
+
Re-run all of the steps above, but in a new directory terraform/environments/prd.
+Wherever there is a dev (or DEV), replace it with prd (or PRD).
ODI IT requires that systems log to our Microsoft Sentinel instance
+for compliance with security monitoring policies.
+The terraform configuration deployed above creates a service account for Sentinel
+which needs to be integrated.
+
+
Create a password for the Sentinel service account.
+ In other contexts we prefer key pairs for service accounts, but the Sentinel
+ integration requires password authentication. In a Snowflake worksheet run:
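+
+ A sketch of the statement, assuming the service user created by the Terraform configuration is named SENTINEL_SVC_USER (a hypothetical name):
+
+ USE ROLE securityadmin;
+ ALTER USER sentinel_svc_user SET PASSWORD = '<generated-password>' MUST_CHANGE_PASSWORD = FALSE;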
+
Store the Sentinel service account authentication information in our shared
+ 1Password vault.
+ Make sure to provide enough information to disambiguate it from others stored in the vault,
+ including:
+
+
The account locator
+
The account name (distinct from the account locator)
+
The service account name
+
The public key
+
The private key
+
+
+
+
Create an IT Help Desk ticket to add the new account to our Sentinel instance.
+ Share the 1Password item with the IT-Ops staff member who is implementing the ticket.
+ If you've included all of the above information in the vault item,
+ it should be all they need.
+
+
Within fifteen minutes or so of implementation it should be clear whether the integration is working.
+ IT-Ops should be able to see logs ingesting, and Snowflake account admins should see queries
+ from the Sentinel service user.
Set up key pairs for the two GitHub actions service accounts
+(GITHUB_ACTIONS_SVC_USER_DEV and GITHUB_ACTIONS_SVC_USER_PRD).
+This follows a similar procedure to what you did for your personal key pair,
+though the project template currently does not assume an encrypted key pair.
+This bash script
+is a helpful shortcut for generating the key pair:
+
bash generate_key.sh <key-name>
+
+
Once you have created and set the key pairs, add them to the DSE 1Password shared vault.
+Make sure to provide enough information to disambiguate the key pair from others stored in the vault,
+including:
+
+
The account locator
+
The account name (distinct from the account locator)
You need to configure secrets in GitHub actions
+in order for the service accounts to be able to connect to your Snowflake account.
+From the repository page, go to "Settings", then to "Secrets and variables", then to "Actions".
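+
The secrets can also be set with the GitHub CLI; this is illustrative only, and the secret names below are assumptions: the authoritative list is whatever the template's GitHub Actions workflows reference.
+
gh secret set SNOWFLAKE_ACCOUNT --body "<account-locator>"
gh secret set SNOWFLAKE_USER_DEV --body "GITHUB_ACTIONS_SVC_USER_DEV"
gh secret set SNOWFLAKE_PRIVATE_KEY_DEV < github_actions_svc_user_dev_key.p8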
Upon completion of a project (or if you just went through the above for testing purposes)
+there are a few steps needed to tear down the infrastructure.
+
+
If the GitHub repository is to be handed off to a client, transfer ownership of it to them.
+ Otherwise, delete or archive the GitHub repository.
+ If archiving, delete the GitHub actions secrets.
+
Open a Help Desk ticket with IT-Ops to remove Sentinel logging for the Snowflake account.
+
If the Snowflake account is to be handed off to a client, transfer ownership of it to them.
+ Otherwise, drop the account.
+
+
\ No newline at end of file
diff --git a/search/search_index.json b/search/search_index.json
new file mode 100644
index 00000000..5bf983ee
--- /dev/null
+++ b/search/search_index.json
@@ -0,0 +1 @@
+{"config":{"lang":["en"],"separator":"[\\s\\-]+","pipeline":["stopWordFilter"]},"docs":[{"location":"","title":"CalData Data Services and Engineering Infrastructure","text":"
This is the technical documentation for CalData's Data Services and Engineering (DSE) projects. It consists of processes, conventions, instructions, and architecture diagrams.
When deploying a new version of your infrastrucutre, Terraform diffs the current state against what you have specified in your infrastructure-as-code. The current state is tracked in a JSON document, which can be stored in any of a number of locations (including local files).
This project stores remote state using the S3 backend.
Different applications or environments can be isolated from each other by using different S3 buckets for holding their state. We reuse a terraform configuration (terraform/s3-remote-state) for setting up the S3 backend
Note
The S3 remote state configuration is not a proper module because it contains a provider block. Different deployments of the configuration are controlled by giving it different tfvars files, and capturing the outputs for use in a tfbackend file.
Here is an example set of commands for bootstrapping a new S3 backend for a deployment. Suppose the deployment is a QA environment of our Snowflake project:
cd terraform/snowflake/environments/qa # Go to the new environment directory\nmkdir remote-state # Create a remote-state directory\ncd remote-state\nln -s ../../../s3-remote-state/main.tf main.tf # symlink the s3 configuration\nterraform init # initialize the remote state backend\nterraform apply -var=\"owner=dse\" -var=\"environment=qa\" -var=\"project=snowflake\" # Create the infrastructure\nterraform output > ../dse-snowflake-qa.tfbackend # Pipe the outputs to a .tfbackend\n\ncd ..\nterraform init -backend-config=./dse-snowflake-qa.tfbackend # Configure the deployment with the new backend.\n
Terraform deployments include a lockfile with hashes of installed packages. Because we have mixed development environments (i.e., Macs locally, Linux in CI), it is helpful to include both Mac and Linux builds of terraform packages in the lockfile. This needs to be done every time package versions are updated:
terraform init -upgrade # Upgrade versions\nterraform providers lock -platform=linux_amd64 -platform=darwin_amd64 # include Mac and Linux binaries\n
"},{"location":"cloud-infrastructure/#requirements","title":"Requirements","text":"Name Version terraform >= 1.0 aws 4.56.0 random 3.4.3"},{"location":"cloud-infrastructure/#providers","title":"Providers","text":"Name Version aws 4.56.0 random 3.4.3"},{"location":"cloud-infrastructure/#modules","title":"Modules","text":"
No modules.
"},{"location":"cloud-infrastructure/#resources","title":"Resources","text":"Name Type aws_batch_compute_environment.default resource aws_batch_job_definition.default resource aws_batch_job_queue.default resource aws_ecr_repository.default resource aws_eip.this resource aws_iam_group.aae resource aws_iam_group_membership.aae resource aws_iam_group_policy_attachment.aae_dsa_project resource aws_iam_group_policy_attachment.aae_list_all_my_buckets resource aws_iam_group_policy_attachment.aae_self_manage_creentials resource aws_iam_policy.access_snowflake_loader resource aws_iam_policy.batch_submit_policy resource aws_iam_policy.default_ecr_policy resource aws_iam_policy.mwaa resource aws_iam_policy.s3_dsa_project_policy resource aws_iam_policy.s3_list_all_my_buckets resource aws_iam_policy.s3_scratch_policy resource aws_iam_policy.self_manage_credentials resource aws_iam_role.aws_batch_service_role resource aws_iam_role.batch_job_role resource aws_iam_role.ecs_task_execution_role resource aws_iam_role.mwaa resource aws_iam_role_policy_attachment.aws_batch_service_role resource aws_iam_role_policy_attachment.ecs_task_execution_access_snowflake_loader resource aws_iam_role_policy_attachment.ecs_task_execution_role_policy resource aws_iam_role_policy_attachment.mwaa_batch_submit_role resource aws_iam_role_policy_attachment.mwaa_execution_role resource aws_iam_role_policy_attachment.s3_scratch_policy_role_attachment resource aws_iam_user.arman resource aws_iam_user.cd_bot resource aws_iam_user.esa resource aws_iam_user.kim resource aws_iam_user.monica resource aws_iam_user.rocio resource aws_iam_user_policy_attachment.batch_cd_bot_policy_attachment resource aws_iam_user_policy_attachment.ecr_cd_bot_policy_attachment resource aws_internet_gateway.this resource aws_mwaa_environment.this resource aws_nat_gateway.this resource aws_route_table.private resource aws_route_table.public resource aws_route_table_association.private resource aws_route_table_association.public resource aws_s3_bucket.dof_demographics_public resource aws_s3_bucket.dsa_project resource aws_s3_bucket.mwaa resource aws_s3_bucket.scratch resource aws_s3_bucket_policy.dof_demographics_public_read_access resource aws_s3_bucket_public_access_block.dof_demographics_public resource aws_s3_bucket_public_access_block.mwaa resource aws_s3_bucket_versioning.dof_demographics_public resource aws_s3_bucket_versioning.dsa_project resource aws_s3_bucket_versioning.mwaa resource aws_security_group.batch resource aws_security_group.mwaa resource aws_subnet.private resource aws_subnet.public resource aws_vpc.this resource random_id.private_subnet resource random_id.public_subnet resource aws_availability_zones.available data source aws_caller_identity.current data source aws_iam_policy_document.access_snowflake_loader data source aws_iam_policy_document.assume data source aws_iam_policy_document.assume_role_policy data source aws_iam_policy_document.aws_batch_service_policy data source aws_iam_policy_document.batch_submit_policy_document data source aws_iam_policy_document.default_ecr_policy_document data source aws_iam_policy_document.dof_demographics_public_read_access data source aws_iam_policy_document.mwaa data source aws_iam_policy_document.s3_dsa_project_policy_document data source aws_iam_policy_document.s3_list_all_my_buckets data source aws_iam_policy_document.s3_scratch_policy_document data source aws_iam_policy_document.self_manage_credentials data source"},{"location":"cloud-infrastructure/#inputs","title":"Inputs","text":"Name 
Description Type Default Required environment Deployment environment of the resource string\"dev\" no owner Owner of the resource string\"dse\" no project Name of the project the resource is serving string\"infra\" no region Region for AWS resources string\"us-west-2\" no snowflake_loader_secret ARN for SecretsManager login info to Snowflake with loader role stringnull no"},{"location":"cloud-infrastructure/#outputs","title":"Outputs","text":"Name Description state Resources from terraform-state"},{"location":"code-review/","title":"Conventions for Code Review and GitHub","text":"
This page documents the Data Services and Engineering Team's practices around GitHub-based development and code review.
"},{"location":"code-review/#creating-a-pull-request","title":"Creating a Pull Request","text":"
The process for GitHub-based development is:
Create a new branch from main:
git switch -c <branch-name>\n
Develop in your branch. Try to keep commits to single ideas. A branch for a good pull request can tell a story.
When your branch is ready for review, push it to GitHub:
git push <remote-name> <branch-name>\n
From the GitHub UI, open a pull request for merging your code into main.
Request one or more reviewers.
Go through one or several rounds of review, making changes to your branch as necessary. A healthy review is a conversation, and it's normal to have disagreements.
When the reviewer is happy, they can approve and merge the pull request!
In general, the author of a PR should not approve and merge their own pull request.
Delete your feature branch, it's in main now.
"},{"location":"code-review/#considerations-when-authoring-a-pull-request","title":"Considerations when authoring a pull request","text":""},{"location":"code-review/#have-empathy","title":"Have Empathy","text":"
We recommend reading \"Have empathy on pull requests\" and \"How about code reviews?\" as nice references for how to be empathetic as the opener of a pull request.
In particular, it's important to remember that you are the subject matter expert for a PR. The reviewer will likely not know anything about the path you took to a particular solution, what approaches did not work, and what tradeoffs you encountered. It's your job to communicate that context for reviewers to help them review your code. This can include comments in the GitHub UI, comments in the code base, and even self-reviews.
"},{"location":"code-review/#use-linters-and-formatters","title":"Use linters and formatters","text":"
Making use of code linters and formatters helps to establish a consistent style for a project and removes a whole whole class of common errors and disagreements. Even if one can take issue with specific conventions or rules, having them used consistently within a team pays big dividends over time.
Many of our projects use pre-commit to enforce the linter and formatter conventions. To set up your pre-commit environment locally (requires a Python development environment, run
pre-commit install\n
The next time you make a commit, the pre-commit hooks will run on the contents of your commit (the first time may be a bit slow as there is some additional setup).
"},{"location":"code-review/#try-to-avoid-excessive-merge-commits","title":"Try to avoid excessive merge commits","text":"
More merge commits in a PR can make review more difficult, as contents from unrelated work can appear in the code diff. Sometimes they are necessary for particularly large or long-running branches, but for most work you should try to avoid them. The following guidelines can help:
Usually branch from the latest main
Keep feature branches small and focused on a single problem. It is often helpful for both authors and reviewers to have larger efforts broken up into smaller tasks.
In some circumstances, a git rebase can help keep a feature branch easy to review and reason about.
If your pull request adds any new features or changes any workflows for users of the project, you should include documentation. Otherwise, the hard work you did to implement a feature may go unnoticed/unused! What this looks like will vary from project to project, but might include:
New functionality ideally should have automated tests. As with documentation, these tests will look different depending upon the project needs. A good test will:
Be separated from production environments
If fixing a bug, should actually fix the issue.
Guard against regressions
Not take so long as to be annoying to run.
Not rely on internal implementation details of a project (i.e. use public contracts)
One nice strategy for bugfixes is to write a failing test before making a fix, then verifying that the fix makes the test pass. It is surprisingly common for tests to accidentally not cover the behavior they are intended to.
"},{"location":"code-review/#reviewing-a-pull-request","title":"Reviewing a pull request","text":""},{"location":"code-review/#have-empathy_1","title":"Have Empathy","text":"
As above, reviewers should have empathy for the author of a pull request. You as a reviewer are unaware of the constraints and tradeoffs that an author might have encountered.
Some general tips for conducting productive reviews:
If there is something you might have done differently, come in with a constructive attitude and try to understand why the author took their approach.
Keep your reviews timely (ideally provide feedback within 24 hours)
Try to avoid letting a PR review stretch on too long. A branch with many review cycles stretching for weeks is demoralizing to code authors.
Remember that perfect is the enemy of the good. A PR that makes an improvement or is a concrete step forward can be merged without having to solve everything. It's perfectly reasonable to open up issues to capture follow-up work from a PR.
"},{"location":"code-review/#ci-should-pass","title":"CI should pass","text":"
Before merging a pull request, maintainers should make every effort to ensure that CI passes. Often this will require looking into the logs of a failed run to see what went wrong, and alerting the pull request author. Ideally, no pull request should be merged if there are CI failures, as broken CI in main can easily mask problems with other PRs, and a consistently broken CI can be demoralizing for maintainers.
However, in practice, there are occasionally flaky tests, broken upstream dependencies, and failures that are otherwise obviously not related to the PR at hand. If that is the case, a reviewer may merge a PR with failing tests, but they should be prepared to follow up with any failures that result from such an unsafe operation.
Note: these conventions and recommendations are partially drawn from maintainer guidelines for JupyterHub and Dask.
"},{"location":"codespaces/","title":"Developing using Codespaces","text":"
GitHub Codespaces allow you to spin up an ephemeral development environment in VS Code which includes a git repository, configurations, and pre-installed libraries. It provides an easy way for developers to get started working in a repository, especially if they are not yet comfortable setting up a local development environment.
"},{"location":"codespaces/#creating-a-codespace","title":"Creating a Codespace","text":"
Go to the \"Code\" dropdown from the main repository page, select the three dot dropdown, and select \"New with options...\" This will allow more configuration than the default codespace.
In the codespace configuration form, you will have an option to add \"Recommended Secrets\". This is where you can add your personal Snowflake credentials to your codespace, allowing for development against our Snowflake warehouse, including using dbt. You should only add credentials for accounts that are protected by multi-factor authentication (MFA).
After you have added your secrets, click \"Create Codespace\". Building it may take a few minutes, but then you should be redirected to a VS Code environment in your browser.
"},{"location":"codespaces/#launching-an-existing-codespace","title":"Launching an existing Codespace","text":"
Once your codespace is created, you should be able to launch it without re-creating it every time by using the "Code" dropdown, going to "Open in...", and selecting "Open in browser".
"},{"location":"codespaces/#using-a-codespace","title":"Using a Codespace","text":"
Once you have created and configured a codespace, you have access to a relatively full-featured VS Code-based development environment. This includes:
When you launch a new codespace, it can take a couple of minutes for all of the extensions to install. In particular, this means that the Python environment may not be fully set up when you land in VS Code. We recommend closing existing terminal sessions and starting a new one once the extensions have finished installing.
The first time you make a commit, the pre-commit hooks will be installed. This may take a few minutes. Subsequent commits will take less time.
If the pre-commit hooks fail when making a commit, it will give you the opportunity to open the git logs to view the errors. If you are unable to fix the errors for whatever reason, you can always make a new commit from the command line with the --no-verify flag.
"},{"location":"dbt-performance/","title":"dbt Performance Evaluation and Tuning","text":""},{"location":"dbt-performance/#considerations-when-does-performance-matter","title":"Considerations: When Does Performance Matter?","text":"
In most settings, what counts as acceptable performance is relative to business needs and constraints. It is common to deem performance acceptable as long as there are no scheduling conflicts and models can run within the timeframe dictated by how frequently they need to run. In other words, if you need to run models every hour, then the entire job cannot take longer than an hour to run. In general, compute costs are rarely high enough to make optimizing the underlying queries worthwhile on their own, but they may be high enough to make optimizing run frequency or data size worthwhile.
Although compute time is relatively cheap, for larger datasets that need to be refreshed frequently it is sometimes possible to save enough cost through performance tuning to be worth the time spent optimizing. In Snowflake you can easily monitor costs in the Admin/Usage section of the Snowflake UI, where you can see credits used by warehouse and role. Snowflake also provides several tables with meta information that can be used to derive exact costs for each query - an approach to this, with a ready-to-use query, can be found in the Select.dev blog post "Calculating cost per query in Snowflake".
Typically, unless model performance is obviously very poor, you are better off adjusting the frequency of runs (end users almost always overstate their desire for data freshness) or reducing data set size, either by limiting what you provide or by using incremental models.
In other words, very often the questions you should be asking are not in the category of SQL performance tuning but rather \"do we need this data to be this fresh?\" and \"do we need all this data?\".
Often performance issues show up in scheduling. If you are running jobs once a day it is extremely unlikely you will run into any scheduling conflicts. However, if a much higher frequency is required, it's possible for jobs to take longer than the time between runs. In this case a common first approach is to break up model runs so that things that don't need to run as frequently can run separately from models that require more frequent updating. A typical way of doing this is to use either dbt run --select or dbt tags to select models in groups. This is not to say performance tuning of individual queries is never worth it, but the big macro gains come more from running models less frequently and/or with less data, e.g. using filtering or incremental models.
It is extremely important to balance time spent optimizing model performance against compute costs and other concerns. If it takes you a day to optimize a model to run only a few seconds faster and save a few pennies per run, it's likely not worth the effort. Similarly, the use of incremental materialization can certainly reduce build time but introduces complexity and requires a degree of monitoring to ensure integrity. See also Materialization Matters below.
"},{"location":"dbt-performance/#analyzing-performance-in-dbt","title":"Analyzing Performance in dbt","text":"
With every dbt run or build several artifacts are generated in the target/ directory, including the run_results.json file. This includes detailed information on run execution and many people parse this to create dashboards to report on dbt performance and help with optimization and cost monitoring. There is an important caveat here: simply knowing how long a model took to run is important to uncover which models might need optimization, but cannot tell you anything about why they are performing poorly.
"},{"location":"dbt-performance/#getting-model-timing-local-development","title":"Getting model timing: Local Development","text":"
Every time you run a model, dbt outputs timing information which you can easily use to identify non-performant models. The output will look like:
14:16:39.438935 [info ] [Thread-4 ]: 136 of 160 OK created sql table model dbt_aerishan.JobControl_Published_CertEligActions [SUCCESS 1 in 43.00s]
This is extremely useful during development in order to understand potential problems with your model's performance.
Getting model timing: dbt Cloud
dbt Cloud has a nicer interface for finding which models in a project are running longest. Visit the Deploy > Runs section of dbt Cloud. You'll see a full list of jobs and how long each one took. To drill down to the model timing level, click on a run name. You can expand the "Invoke dbt build" section under "Run Summary" to get a detailed summary of your run as well as timing for each model and test. There is also a "Debug logs" section for even more detail, including the exact queries run and an option to download the logs for easier viewing. Of course this is also where you go to find model and test errors and warnings!
For a quick visual reference of which models take up the most time in a run, click on the "Model Timing" tab. If you hover over a model you will be shown the specific timing.
"},{"location":"dbt-performance/#getting-model-timing-snowflake","title":"Getting model timing: Snowflake","text":"
Snowflake has quite a lot of performance data readily available through its information_schema.QUERY_HISTORY() table function and several views in the Account Usage schema. This is great not only for finding expensive queries regardless of source, but also for all sorts of analytics on Snowflake usage, such as credit consumption.
The Query History gives you real time data while the Account Usage is delayed. So Query History is great for analyzing your own queries in development and for current query performance in production.
Example Query: Get top time-consuming queries for the dbt Cloud production loads
SELECT
    query_text, query_type, database_name, schema_name,
    user_name, total_elapsed_time
FROM
    -- query_history() is a table function
    table(information_schema.query_history())
WHERE user_name = 'DBT_CLOUD_SVC_USER_PRD'
ORDER BY total_elapsed_time DESC
LIMIT 20
As you might have guessed, this also lets you search for a model by query text, so you can find specific dbt models or classes of models:
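For example, here is a minimal sketch of such a search; the model name pattern in the filter is an illustrative assumption, not a value taken from this project:

SELECT
    query_text, user_name, start_time, total_elapsed_time
FROM
    table(information_schema.query_history())
WHERE query_text ILIKE '%stg_certeligibles%'  -- hypothetical model name to search for
ORDER BY start_time DESC
LIMIT 20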
The Account Usage schema (snowflake.account_usage) has multiple views that are of interest for monitoring not just query performance and credit usage but warehouse and database usage and more. This data is delayed 45 minutes but has a much longer history.
Example Query: Find the queries with highest total execution time this month for the dbt cloud production loads.
SELECT query_text,
    SUM(execution_time) / 60000 AS total_exec_time_mins,
    SUM(credits_used_cloud_services) AS total_credits_used
FROM snowflake.account_usage.query_history
WHERE
    start_time >= date_trunc(month, current_date)
    AND user_name = 'DBT_CLOUD_SVC_USER_PRD'
GROUP BY 1
ORDER BY total_exec_time_mins DESC
LIMIT 20
Example Query: Get Credits used by Warehouse this month
select warehouse_name,
    sum(credits_used) as total_credits_used
from warehouse_metering_history
where start_time >= date_trunc(month, current_date)
group by 1
order by 2 desc;
"},{"location":"dbt-performance/#solutions-how-to-tackle-dbt-performance-issues","title":"Solutions: How to Tackle dbt Performance Issues","text":"
Now that you've identified which models might need optimization, it's time to figure out how to get them to run faster. These options are roughly in order of bang-for-buck in most situations.
"},{"location":"dbt-performance/#1-job-level-adjust-frequency-and-break-up-runs","title":"1. Job Level: Adjust Frequency and Break Up Runs","text":"
It's common for end-users to say they want the freshest data (who doesn't?) but in practice require a much lower frequency of refreshing. To gain an understanding of real-world needs it's helpful to see how frequently end-users actually view reporting and to consider the time scales involved. If someone only cares about monthly results, for example, you can in theory have a 30 day frequency for model runs. It's also quite common for parts of the data to be relatively static and only need occasional refreshes, whereas other parts of the data might change much more often. An easy way to break up model runs is by using dbt tags, as in the sketch below.
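A model can be tagged directly in its SQL file via its config block; the tag and model names here are illustrative, not taken from this project:

-- models/marts/my_daily_model.sql (hypothetical)
{{ config(tags=['daily']) }}

select *
from {{ ref('stg_some_source') }}  -- hypothetical upstream model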
In this case you can run certain models with dbt run --select tag:daily. Of course this works in dbt Cloud as well!
For more information refer to the dbt tags documentation.
"},{"location":"dbt-performance/#2-model-level-materialization-matters","title":"2. Model Level: Materialization Matters","text":"
For a good comparison of materialization options and their trade-offs see the Materialization Best Practices section of the dbt docs.
Views: Are a trade-off between build performance and read/reporting performance. In cases where you are using a BI tool, you should almost always use table materializations unless data storage size is an issue or refresh frequency is so high that cost or scheduling conflicts become a problem. In cases where performance at time of reporting is not an issue (say, you are generating an aggregated report on a monthly basis) then views can be a great way to cut run time. Another case where views can be a good option is with staging data of relatively small size, where your queries are relatively light-weight and you want to ensure fresh data without having to configure separate runs for those models.
Incremental Models: For very large data sets, it can be essential to use incremental models. For this to work, you need some means of filtering records from the source table, typically using a timestamp. You then add a conditional block into your model to only select new records unless you're doing a full refresh. It's worth noting that incremental models can be tricky to get right and you will often want to implement some additional data integrity testing to ensure data is fresh and complete. For a more detailed discussion of Incremental Models, see Incremental models in-depth.
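As a rough sketch of that conditional block (the model, source, and column names here are hypothetical, not from this project):

-- models/staging/stg_events.sql (hypothetical)
{{ config(materialized='incremental', unique_key='id') }}

select *
from {{ source('some_system', 'events') }}  -- hypothetical source

{% if is_incremental() %}
  -- {{ this }} refers to the table built by previous runs of this model
  where updated_at > (select max(updated_at) from {{ this }})
{% endif %}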
An example in our current projects is the CalHR Ecos model stg_CertEligibles. This query takes over three minutes to run and no wonder - it generates 858 million rows! This is clearly a case where we should ask whether we need all of that data or whether it can be filtered in some way, and if we really do need it all, then we should consider using an incremental materialization.
A great many books have been written on this subject! The good news is that most of the tools we use provide excellent resources for analyzing query performance.
"},{"location":"dbt-performance/#write-or-read","title":"Write or Read?","text":"
Because models are often created using a CREATE TABLE ... SELECT statement, you need to separate read from write performance to understand whether your original query is slow or whether you are simply moving a lot of data and that takes time. Chances are good that if you are moving a lot of data you are also querying a lot of data, and in fact both read and write may be very time consuming, but this is not a given -- if you are doing lots of joins on big data sets along with aggregations that output a small number of rows, then your model is probably read-bound. If this is the case, the first question to ask is whether you can break that model up into smaller chunks using staging and intermediate models.
A good way to get a sense of read vs write performance is to do one or more of the following:
1. Simply know the number of rows generated by the model (for some databases dbt will include this in the output above). If you are creating tables with millions of rows you should probably consider an incremental model, or reassess whether you can filter and narrow your data somehow.
2. Use your database's query profiler, if available, to separate out which part of the execution is taking the most time. In Snowflake, for example, you can use the query profile to easily determine whether a query is read-bound or write-bound, and also determine where exactly other performance issues may lie. A CREATE TABLE with a simple select, for example, will show that the majority of time is spent in the CreateTableAsSelect node and only a fraction of the time in the Result node. Be careful if you are comparing queries across runs - most databases use caching and this will of course affect your results (see Caching Notes below).
3. Switch the materialization to view. Typically a view will take a fraction of the time to generate, and if that's the case you know your model is slow on writes.
4. Run the query separately in the database without the CREATE TABLE part. When you do this you can typically assess the execution plan.
You can easily pull up the query profile for any query that has been run in Snowflake either from a worksheet or from the query history page. This includes queries run from dbt Cloud! This profile is essential in understanding the elements of your query that are most costly in terms of time, and which might be improved through optimization. Refer to the Analyzing Queries Using Query Profile page in the Snowflake Documentation for complete information including common problems and their solutions.
BigQuery offers similar query execution profiling in the Google Cloud Console. See Query plan and timeline as well as BigQuery's Introduction to optimizing query performance.
Caching Notes
Most databases use some type of caching which needs to be turned off in order to properly test performance. Snowflake uses both a results cache and a disk cache, but only one can be turned off with a session variable:
alter session set use_cached_result = false;
See this in-depth discussion for more details: Deep Dive on Snowflake Caching. A general workaround (other than shutting down and restarting the warehouse) is to use slightly different result sets which do the same operations and return the same number of rows.
Local Development Tips
Use Tags or dbt run --select to limit what you are building
Use --select state:modified+ result:error+ to limit runs
You can also include limits on data when working with a development target, e.g. {% if target.name == 'dev' %} LIMIT 500 {% endif %}, as in the sketch below.
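A minimal sketch of that pattern in a model (the model and source names are hypothetical):

-- models/staging/stg_large_table.sql (hypothetical)
select *
from {{ source('some_system', 'large_table') }}

{% if target.name == 'dev' %}
-- keep dev builds fast and cheap by limiting rows only for the dev target
limit 500
{% endif %}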
"},{"location":"dbt/","title":"dbt on the Data Services and Engineering team","text":""},{"location":"dbt/#naming-conventions","title":"Naming conventions","text":"
Models in a data warehouse do not follow the same naming conventions as raw cloud resources, as their most frequent consumers are analytics engineers and data analysts.
The following conventions are used where appropriate:
Dimension tables are prefixed with dim_.
Fact tables are prefixed with fct_.
Staging tables are prefixed with stg_.
Intermediate tables are prefixed with int_.
We may adopt additional conventions for denoting aggregations, column data types, etc. in the future. If during the course of a project's model development we determine that simpler human-readable names work better for our partners or downstream consumers, we may drop the above prefixing conventions.
dbt's default method for generating custom schema names works well for a single-database setup:
It allows development work to occur in a separate schema from production models.
It allows analytics engineers to develop side-by-side without stepping on each other's toes.
A downside of the default is that production models all get a prefix, which may not be an ideal naming convention for end-users.
Because our architecture separates development and production databases, and has strict permissions protecting the RAW database, there is less danger of breaking production models. So we use our own custom schema name generation, modified from the approach of the GitLab Data Team.
In production, each schema is just the custom schema name without any prefix. In non-production environments the default is used, where analytics engineers get the custom schema name prefixed with their target schema name (i.e. dbt_username_schemaname), and CI runs get the custom schema name prefixed with a CI job name.
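A minimal sketch of what such a generate_schema_name macro can look like (illustrative, not necessarily the exact macro used in this project):

-- macros/generate_schema_name.sql (illustrative)
{% macro generate_schema_name(custom_schema_name, node) -%}
    {%- set default_schema = target.schema -%}
    {%- if custom_schema_name is none -%}
        {{ default_schema }}
    {%- elif target.name == 'prd' -%}
        {# in production, use the custom schema name with no prefix #}
        {{ custom_schema_name | trim }}
    {%- else -%}
        {# in dev and CI, prefix the custom schema name with the target schema #}
        {{ default_schema }}_{{ custom_schema_name | trim }}
    {%- endif -%}
{%- endmacro %}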
This approach may be reevaluated as the project matures.
"},{"location":"dbt/#developing-against-production-data","title":"Developing against production data","text":"
Our Snowflake architecture allows for reasonably safe SELECTing from the production RAW database while developing models. While this could be expensive for large tables, it also allows for faster and more reliable model development.
To develop against production RAW data, first you need someone with the USERADMIN role to grant rights to the TRANSFORMER_DEV role (this need only be done once, and can be revoked later):
USE ROLE USERADMIN;
GRANT ROLE RAW_PRD_READ TO ROLE TRANSFORMER_DEV;
Note
This grant is not managed via terraform in order to keep the configurations of different environments as logically separate as possible. We may revisit this decision should the manual grant cause problems.
You can then run dbt locally and specify the RAW database manually.
Our approach is adapted from this blog post. The goals of establishing a naming convention are:
1. Prevent name collisions between similar resources (especially in cases where names are required to be unique).
2. Allow developers to identify at a glance what a particular resource is and who owns it.
3. Structured naming allows for easier sorting and filtering of resources.
Where {...} indicates a component in the name, and [{...}] indicates that it is optional or conditionally required.

| Component | Description | Required | Constraints |
| --- | --- | --- | --- |
| owner | Owner of the resource | ✔ | len 3-6 |
| project | Project name | ✔ | len 4-10, a-z0-9 |
| env | Environment type, e.g. dev, prd, stg | ✔ | len 3, a-z, enum |
| region | Region (if applicable) | ✗ | enum |
| description | Additional description (if needed) | ✗ | len 1-20, a-z0-9 |
| suffix | Random suffix (only use if there are multiple identical resources) | ✗ | len 3, a-z0-9 |
Owner: This is a required field. For most of our projects, it will be dse (for Data Services and Engineering), though it could be other things for projects that we will be handing off to clients upon completion.
Project: A short project name. This is a required field. For general DSE infrastructure, use infra.
Environment: The deployment environment. This is a required field. Generally prd (production), stg (staging), or dev (development).
Region: If the resource exists in a particular region (e.g. us-west-1), this should be included.
Description: There may be multiple resources that are identical with respect to the above parameters, but have a different purpose. In that case, append a description to the name to describe that purpose. For instance, we might have multiple subnets, some of which are public and some of which are private. Or we could have multiple buckets for storing different kinds data within the same project.
Suffix: If all of the above are identical (including description), include a random suffix. This can be accomplished with the terraform random_id resource.
GCP Project: We have a GCP project for managing web/product analytics collaborations with the CalInnovate side of ODI. GCP projects do not exist in a region, so a production project could be called dse-product-analytics-prd.
MWAA Environment: An Apache Airflow environment in AWS does exist in a region, and supports general DSE infrastructure. So a development deployment of the environment could be dse-infra-dev-us-west-2.
Scratch bucket: We might have several S3 buckets supporting different aspects of a project. For instance, one bucket could be used for scratch work, and another could be used for source data. The scratch bucket could then be named dse-infra-dev-us-west-1-scratch.
Note that the {region} and {suffix} components are not included. This is because the region information is typically available elsewhere in the API/Console, and the suffix information is not semantically meaningful.
Not all resources can follow the above convention exactly. For instance, some resource names may not allow hyphens, or may have length limits. In those cases, we should try to adhere to the conventions as closely as possible (e.g., by substituting underscores for hyphens) and document the exception here.
"},{"location":"naming-conventions/#cloud-data-warehouse-schemas","title":"Cloud data warehouse schemas","text":"
Data warehouse schemas (or datasets in BigQuery) are often user/analyst-facing, and have different considerations. They usually cannot have hyphens in them, so words should be separated with underscores "_". Furthermore, analysts needn't know details of regions or deployments, so region and env are dropped, and the naming convention becomes:
{owner}_{project}_[{description}]
If a project is owned by the Data Services and Engineering team, the owner component may be omitted, and the schema name is simply
{project}_[{description}]
Note that Snowflake normalizes all object names to upper case. This is opposite to how PostgreSQL normalizes object names (sigh). Most of the time this doesn't matter, but occasionally requires thought if you have a mixed-case object name. If you are naming new database tables or schemas, mixed-case identifiers should be avoided.
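A quick illustration of this behavior (the schema and table names here are hypothetical):

-- Unquoted identifiers fold to upper case, so these refer to the same table
create table my_schema.building_footprints (id int);
select * from MY_SCHEMA.Building_Footprints;   -- works: resolves to BUILDING_FOOTPRINTS

-- Quoted mixed-case identifiers are stored case-sensitively and must always be quoted
create table my_schema."BuildingFootprints" (id int);
select * from my_schema.buildingfootprints;    -- fails: resolves to BUILDINGFOOTPRINTS
select * from my_schema."BuildingFootprints";  -- works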
dbt
The names of tables loaded by Fivetran are typically set either by Fivetran or by the names of the tables in the source systems. As such, we don't have much control over them, and they won't adhere to any particular naming conventions.
Fivetran connector names cannot contain hyphens, and should follow this pattern:
The schemas into which a fivetran connector is writing should be named the same as the connector (which is why the connector name has some seemingly redundant information).
If a project is owned by the Data Services and Engineering team, the owner component may be omitted, and the schema name is simply
The DSE team regularly creates new Snowflake accounts in our Snowflake org. We do this instead of putting all of our data projects into our main account for a few reasons:
At the end of a project, we often want to transfer account ownership to our partners. Having it separated from the start helps that process.
We frequently want to add our project champion or IT partners to our account as admins. This is safer if project accounts are separate.
We often want to have accounts in a specific cloud and region for compliance or data transfer reasons.
Different projects may require different approaches to account-level operations like OAuth/SAML.
Here we document the steps to creating a new Snowflake account from scratch.
"},{"location":"new-project-setup/#prerequisites","title":"Prerequisites","text":""},{"location":"new-project-setup/#obtain-permissions-in-snowflake","title":"Obtain permissions in Snowflake","text":"
In order to create a new account, you will need access to the orgadmin role. If you have accountadmin in the primary Snowflake account, you can grant it to yourself:
USE ROLE accountadmin;
GRANT ROLE orgadmin TO USER <YOUR-USER>;
If you later want to revoke the orgadmin role from your user or any other, you can do so with:
USE ROLE accountadmin;
REVOKE ROLE orgadmin FROM USER <YOUR-USER>;
"},{"location":"new-project-setup/#get-access-to-aws","title":"Get access to AWS","text":"
We typically create our Snowflake architecture using Terraform. Terraform state is stored in S3 buckets within our AWS account, so you will need read/write access to those buckets.
Ask a DSE AWS admin to give you access to these buckets, and configure your AWS credentials.
You can install Terraform using whatever approach makes sense for your system, including using brew or conda.
Here is a sample for installing the dependencies using conda:
conda create -n infra python=3.10  # create an environment named 'infra'
conda activate infra  # activate the environment
conda install -c conda-forge terraform tflint  # Install terraform and tflint
"},{"location":"new-project-setup/#snowflake-account-setup","title":"Snowflake account setup","text":""},{"location":"new-project-setup/#create-the-account","title":"Create the account","text":"
Assume the ORGADMIN role
Under the \"Admin\" side panel, go to \"Accounts\" and click the \"+ Account\" button:
Select the cloud and region appropriate to the project. The region should be in the United States.
Select \"Business Critical\" for the Snowflake Edition.
You will be prompted to create an initial user with ACCOUNTADMIN privileges. This should be you. You will be prompted to create a password for your user. Create one using your password manager, but know that it will ask you to change your password upon first log-in.
Save the Account Locator and Account URL for your new account.
Log into your new account. You should be prompted to change your password. Save the updated password in your password manager.
"},{"location":"new-project-setup/#enable-multi-factor-authentication-for-your-user","title":"Enable multi-factor authentication for your user","text":"
Ensure the Duo Mobile app is installed on your phone.
In the upper-left corner of the Snowsight UI, click on your username, and select "Profile".
At the bottom of the dialog, select "Enroll" to enable multi-factor authentication.
Follow the instructions to link the new account with your Duo app.
"},{"location":"new-project-setup/#set-up-key-pair-authentication","title":"Set up key pair authentication","text":"
Certain Snowflake clients don't properly cache MFA tokens, which means that using them can generate dozens or hundreds of MFA requests on your phone. At best this makes the tools unusable, and at worst it can lock your Snowflake account. One example of such a tool is (as of this writing) the Snowflake Terraform Provider.
The recommended workaround for this is to add a key pair to your account for use with those tools.
Follow the instructions given here to generate a key pair and add the public key to your account. Keep the key pair in a secure place on your device. This gist contains the bash commands from the instructions, and can be helpful for quickly creating a new encrypted key pair. Usage of the script looks like:
You can use pbcopy < _your_public_key_file_name_.pub to copy the contents of your public key. Be sure to remove the -----BEGIN PUBLIC KEY----- and -----END PUBLIC KEY----- portions when adding your key to your Snowflake user.
In your local .bash_profile or an .env file, add environment variables for SNOWFLAKE_ACCOUNT, SNOWFLAKE_USER, SNOWFLAKE_PRIVATE_KEY_PATH, and (if applicable) SNOWFLAKE_PRIVATE_KEY_PASSPHRASE.
"},{"location":"new-project-setup/#apply-a-session-policy","title":"Apply a session policy","text":"
By default, Snowflake logs out user sessions after four hours of inactivity. ODI's information security policies prefer that we log out after one hour of inactivity for most accounts, and after fifteen minutes of inactivity for particularly sensitive accounts.
Note
It's possible we will do this using Terraform in the future, but at the time of this writing the Snowflake Terraform provider does not support session policies.
After the Snowflake account is created, run the following script in a worksheet to set the appropriate session policy:
use role sysadmin;
create database if not exists policies;
create session policy if not exists policies.public.account_session_policy
  session_idle_timeout_mins = 60
  session_ui_idle_timeout_mins = 60
;
use role accountadmin;
-- alter account unset session policy; -- unset any previously existing session policy
alter account set session policy policies.public.account_session_policy;
Create a new git repository from the CalData Infrastructure Template following the instructions here.
Once you have created the repository, push it to a remote repository in GitHub. There are some GitHub actions that will fail because the repository is not yet configured to work with the new Snowflake account.
"},{"location":"new-project-setup/#deploy-project-infrastructure-using-terraform","title":"Deploy project infrastructure using Terraform","text":"
We will create two separate deployments of the project infrastructure, one for development, and one for production. In some places we will refer to project name and owner as <project> and <owner>, respectively, following our naming conventions. You should substitute the appropriate names there.
"},{"location":"new-project-setup/#create-the-dev-configuration","title":"Create the dev configuration","text":"
Ensure that your environment has environment variables set for SNOWFLAKE_ACCOUNT, SNOWFLAKE_USER, SNOWFLAKE_PRIVATE_KEY_PATH, and SNOWFLAKE_PRIVATE_KEY_PASSPHRASE. Make sure you don't have any other SNOWFLAKE_* variables set, as they can interfere with authentication.
In the new git repository, create a directory to hold the development Terraform configuration:
mkdir -p terraform/environments/dev/
The location of this directory is by convention, and subject to change.
Copy the terraform configuration from here to your dev directory.
In the \"elt\" module of main.tf, change the source parameter to point to \"github.com/cagov/data-infrastructure.git//terraform/snowflake/modules/elt?ref=<ref>\" where <ref> is the short hash of the most recent commit in the data-infrastructure repository.
In the dev directory, create a new backend configuration file called <owner>-<project>-dev.tfbackend. The file will point to the S3 bucket in which we are storing terraform state. Populate the backend configuration file with the following (making sure to substitute values for <owner> and <project>):
In the dev directory, create a terraform variables file called terraform.tfvars, and populate the "elt" module variables. These variables may expand in the future, but at the moment they are just the new Snowflake account locator and the environment (in this case "DEV"):
Add your new main.tf, terraform.tfvars, <owner>-<project>-dev.tfbackend, and terraform lock file to the git repository. Do not add the .terraform/ directory.
"},{"location":"new-project-setup/#deploy-the-dev-configuration","title":"Deploy the dev configuration","text":"
Ensure that your local environment has environment variables set for SNOWFLAKE_ACCOUNT, SNOWFLAKE_USER, SNOWFLAKE_PRIVATE_KEY_PATH, and SNOWFLAKE_PRIVATE_KEY_PASSPHRASE, and that they are set to your new account, rather than any other accounts.
Run terraform plan to see the plan for the resources that will be created. Inspect the plan to see that everything looks correct.
Run terraform apply to deploy the configuration. This will actually create the infrastructure!
"},{"location":"new-project-setup/#configure-and-deploy-the-production-configuration","title":"Configure and deploy the production configuration","text":"
Re-run all of the steps above, but in a new directory terraform/environments/prd. Everywhere where there is a dev (or DEV), replace it with a prd (or PRD).
"},{"location":"new-project-setup/#set-up-sentinel-logging","title":"Set up Sentinel logging","text":"
ODI IT requires that systems log to our Microsoft Sentinel instance for compliance with security monitoring policies. The terraform configuration deployed above creates a service account for Sentinel which needs to be integrated.
Create a password for the Sentinel service account. In other contexts we prefer key pairs for service accounts, but the Sentinel integration requires password authentication. In a Snowflake worksheet run:
use role securityadmin;
alter user sentinel_svc_user_prd set password = '<new-password>';
Store the Sentinel service account authentication information in our shared 1Password vault. Make sure to provide enough information to disambiguate it from others stored in the vault, including:
The account locator
The account name (distinct from the account locator)
The service account name
The password
Create an IT Help Desk ticket to add the new account to our Sentinel instance. Share the 1Password item with the IT-Ops staff member who is implementing the ticket. If you've included all of the above information in the vault item, it should be all they need.
Within fifteen minutes or so of implementation it should be clear whether the integration is working. IT-Ops should be able to see logs ingesting, and Snowflake account admins should see queries from the Sentinel service user.
"},{"location":"new-project-setup/#set-up-ci-in-github","title":"Set up CI in GitHub","text":"
The projects generated from our infrastructure template need read access to the Snowflake account in order to do two things from GitHub actions:
Verify that dbt models in branches compile and pass linter checks
Generate dbt docs upon merge to main.
The terraform configurations deployed above create two service accounts for GitHub actions, a production one for docs and a dev one for CI checks.
"},{"location":"new-project-setup/#add-key-pairs-to-the-github-service-accounts","title":"Add key pairs to the GitHub service accounts","text":"
Set up key pairs for the two GitHub actions service accounts (GITHUB_ACTIONS_SVC_USER_DEV and GITHUB_ACTIONS_SVC_USER_PRD). This follows a similar procedure to what you did for your personal key pair, though the project template currently does not assume an encrypted key pair. This bash script is a helpful shortcut for generating the key pair:
bash generate_key.sh <key-name>
Once you have created and set the key pairs, add them to the DSE 1Password shared vault. Make sure to provide enough information to disambiguate the key pair from others stored in the vault, including:
The account locator
The account name (distinct from the account locator)
The service account name
The public key
The private key
"},{"location":"new-project-setup/#set-up-github-actions-secrets","title":"Set up GitHub actions secrets","text":"
You need to configure secrets in GitHub actions in order for the service accounts to be able to connect to your Snowflake account. From the repository page, go to "Settings", then to "Secrets and variables", then to "Actions".
Add the following repository secrets:
| Variable | Value |
| --- | --- |
| SNOWFLAKE_ACCOUNT | new account locator |
| SNOWFLAKE_USER_DEV | GITHUB_ACTIONS_SVC_USER_DEV |
| SNOWFLAKE_USER_PRD | GITHUB_ACTIONS_SVC_USER_PRD |
| SNOWFLAKE_PRIVATE_KEY_DEV | dev service account private key |
| SNOWFLAKE_PRIVATE_KEY_PRD | prd service account private key |

Enable GitHub pages for the repository
The repository must have GitHub pages enabled in order for it to deploy and be viewable.
From the repository page, go to \"Settings\", then to \"Pages\".
Under \"GitHub Pages visibility\" select \"Private\" (unless the project is public!).
Under \"Build and deployment\" select \"Deploy from a branch\" and choose \"gh-pages\" as your branch.
"},{"location":"new-project-setup/#tearing-down-a-project","title":"Tearing down a project","text":"
Upon completion of a project (or if you just went through the above for testing purposes) there are a few steps needed to tear down the infrastructure.
If the GitHub repository is to be handed off to a client, transfer ownership of it to them. Otherwise, delete or archive the GitHub repository. If archiving, delete the GitHub actions secrets.
Open a Help Desk ticket with IT-Ops to remove Sentinel logging for the Snowflake account.
If the Snowflake account is to be handed off to a client, transfer ownership of it to them. Otherwise, drop the account.
This document describes security conventions for CalData's Data Services and Engineering team, especially as it relates to cloud and SaaS services.
"},{"location":"security/#cloud-security-and-iam","title":"Cloud Security and IAM","text":"
The major public clouds (AWS, GCP, Azure) all have a service for Identity and Access Management (IAM). This allows us to manage which users or services are able to perform actions on which resources. In general, IAM is described by:
Users (or principals) - some person or workflow which uses IAM to access cloud resources. Users can be assigned to groups.
Permissions - an ability to perform some action on a resource or collection of resources.
Groups - Rather than assigning permissions directly to users, it is considered good practice to instead create user groups with appropriate permissions, then add users to the group. This makes it easier to add and remove users while maintaining separate user personas.
Policies - a group of related permissions for performing a job, which can be assigned to a role or user.
Role - a group of policies for performing a workflow. Roles are similar to users, but do not have a user identity associated with them. Instead, they can be assumed by users or services to perform the relevant workflow.
Most of the work of IAM is managing users, permissions, groups, policies, and roles to perform tasks in a secure way.
"},{"location":"security/#principle-of-least-privilege","title":"Principle of Least-Privilege","text":"
In general, users and roles should be assigned permissions according to the Principle of Least Privilege, which states that they should have sufficient privileges to perform legitimate work, and no more. This reduces security risks should a particular user or role become compromised.
Both AWS and GCP have functionality for analyzing the usage history of principals, and can flag permissions that they have but are not using. This can be a nice way to reduce the risk surface area of a project.
Service accounts are special IAM principals which act like users, and are intended to perform actions to support a particular workflow. For instance, a system might have a "deploy" service account in CD which is responsible for pushing code changes to production on merge.
Some good practices around the use of service accounts (largely drawn from here):
Service accounts often have greater permissions than human users, so user permissions to impersonate these accounts should be monitored!
Don't use service accounts during development (unless testing the service account permissions). Instead, use your own credentials in a safe development environment.
Create single-purpose service accounts, tied to a particular application or process. Different applications have different security needs, and being able to edit or decommission accounts separately from each other is a good idea.
Regularly rotate access keys for long-term service accounts.
"},{"location":"security/#production-and-development-environments","title":"Production and Development Environments","text":"
Production environments should be treated with greater care than development ones. In the testing and development of a service, roles and policies are often crafted which do not follow the principle of least privilege (i.e., they have too many permissions).
When productionizing a service or application, make sure to review the relevant roles and service accounts to ensure they only have the necessary policies, and that unauthorized users don't have permission to assume those roles.
GCP's IAM documentation is a good read, and goes into much greater detail than this document on how to craft and maintain IAM roles.
Default GCP service accounts often have more permissions than are strictly needed for their intended operation. For example, they might have read/write access to all GCS buckets in a project, when their application only requires access to one.
Often a third-party software-as-a-service (SaaS) provider will require service accounts to access resources within a cloud account. For example, dbt requires fairly expansive permissions within your cloud data warehouse to create, transform, and drop data.
Specific IAM roles needed for a SaaS product are usually documented in their setup guides. These should be periodically reviewed by CalData and ODI IT-Ops staff to ensure they are still required.
Fivetran's security docs which link to a deeper dive white paper are a good place to go to understand their standards and policies for connecting, replicating, and loading data from all of our data sources.
Within the Users & Permissions section of our Fivetran account there are three sub-sections for: Users, Roles, and Teams.
Fivetran also provides detailed docs on Role-Based Access Controls (RBAC) which covers the relationship among users, roles, and teams when defining RBAC policies.
This diagram from their docs gives an at-a-glance view of how RBAC could be configured across multiple teams, destinations, and connectors. Since we aren't a massive team, we may not have as much delegation, but this gives you a sense of what's possible.
See diagram in context of docs.
Note: With the recent creation of organizations in Fivetran, client project security policies get simplified, as we can isolate them from our other projects by creating a separate account. This is also better from a handoff perspective.

The Roles page defines all default role types and how many users are associated with each type. There is also the ability to create custom roles via the + Add Role button.
Currently, we have two teams:
- IT
- Data Services and Engineering (DSE)
Our IT team's role is Account Billing and User Access. This is a custom role that provides billing and user management access with view-only access for the remaining account-level features; it provides no destination or connector access.
The DSE team both manages CalData projects and onboards clients into Fivetran, and so its members have Account Administrator roles.
"},{"location":"security/#iam-through-infrastructure-as-code","title":"IAM through infrastructure-as-code","text":"
These are instructions for individual contributors to set up the repository locally. For instructions on how to develop using GitHub Codespaces, see here.
"},{"location":"setup/#install-dependencies","title":"Install dependencies","text":""},{"location":"setup/#1-set-up-a-python-virtual-environment","title":"1. Set up a Python virtual environment","text":"
Much of the software in this project is written in Python. It is usually worthwhile to install Python packages into a virtual environment, which allows them to be isolated from those in other projects which might have different version constraints.
One popular solution for managing Python environments is Anaconda/Miniconda. Another option is to use pyenv. Pyenv is lighter weight, but is Python-only, whereas conda allows you to install packages from other language ecosystems.
Here are instructions for setting up a Python environment using Miniconda:
Follow the installation instructions for installing Miniconda.
Create a new environment called infra:
conda create -n infra -c conda-forge python=3.10 poetry
The following prompt will appear: "The following NEW packages will be INSTALLED:". You'll have the option to accept or reject by typing y or n; type y. Then activate the infra environment:
Any time the dependencies change, you can re-run the above command to update them.
"},{"location":"setup/#3-install-go-dependencies","title":"3. Install go dependencies","text":"
We use Terraform to manage infrastructure. Dependencies for Terraform (mostly in the go ecosystem) can be installed via a number of different package managers.
If you are running Mac OS, you can install these dependencies with Homebrew. First, install Homebrew.
In order to use Snowflake (as well as the terraform validators for the Snowflake configuration) you should set some default environment variables locally. This will depend on your operating system and shell. For Linux and Mac OS systems, as well as users of Windows Subsystem for Linux (WSL), these are often set in ~/.zshrc, ~/.bashrc, or ~/.bash_profile.
If you use zsh or bash, open your shell configuration file, and add the following lines:
This will enable you to perform transformation activities, which are needed for dbt. Open a new terminal and verify that the environment variables are set.
This will enable you to perform loading activities, which are needed for Airflow or Fivetran. Again, open a new terminal and verify that the environment variables are set.
"},{"location":"setup/#configure-aws-and-gcp-optional","title":"Configure AWS and GCP (optional)","text":""},{"location":"setup/#aws","title":"AWS","text":"
In order to create and manage AWS resources programmatically, you need to create access keys and configure your local setup to use them:
Install the AWS command-line interface.
Go to the AWS IAM console and create an access key for yourself.
In a terminal, enter aws configure, and add the access key ID and secret access key when prompted. We use us-west-2 as our default region.
The connection information for our data warehouses will, in general, live outside of this repository. This is because connection information is both user-specific and usually sensitive, so it should not be checked into version control. In order to run this project locally, you will need to provide this information in a YAML file located (by default) in ~/.dbt/profiles.yml.
Instructions for writing a profiles.yml are documented here, as well as specific instructions for Snowflake.
You can verify that your profiles.yml is configured properly by running
The target name (dev) in the above example can be anything. However, we treat targets named prd differently in generating custom dbt schema names (see here). We recommend naming your local development target dev, and only include a prd target in your profiles under rare circumstances.
You can include profiles for several databases in the same profiles.yml (as well as targets for production), allowing you to develop in several projects using the same computer.
"},{"location":"setup/#example-vs-code-setup","title":"Example VS Code setup","text":"
This project can be developed entirely using dbt Cloud. That said, many people prefer to use more featureful editors, and the code quality checks that are set up here are easier to run locally. By equipping a text editor like VS Code with an appropriate set of extensions and configurations we can largely replicate the dbt Cloud experience locally. Here is one possible configuration for VS Code:
Install some useful extensions (this list is advisory, and non-exhaustive):
dbt Power User (query previews, compilation, and auto-completion)
Python (Microsoft's bundle of Python linters and formatters)
sqlfluff (SQL linter)
Configure the VS Code Python extension to use your virtual environment by choosing Python: Select Interpreter from the command palette and selecting your virtual environment from the options.
Associate .sql files with the jinja-sql language by going to Code -> Preferences -> Settings -> Files: Associations, per these instructions.
Test that the vscode-dbt-power-user extension is working by opening one of the project model .sql files and pressing the "▶" icon in the upper right corner. You should have a query results pane open that shows a preview of the data.
This project uses pre-commit to lint, format, and generally enforce code quality. These checks are run on every commit, as well as in CI.
To set up your pre-commit environment locally run
pre-commit install
The next time you make a commit, the pre-commit hooks will run on the contents of your commit (the first time may be a bit slow as there is some additional setup).
You can verify that the pre-commit hooks are working properly by running
pre-commit run --all-files
to test every file in the repository against the checks.
Some of the checks lint our dbt models and Terraform configurations, so having the terraform dependencies installed and the dbt project configured is a requirement to run them, even if you don't intend to use those packages.
The setup of our account is adapted from the approach described in this dbt blog post, which we summarize here.
Note
We have development and production environments, which we denote with _DEV and _PRD suffixes on Snowflake objects. For that reason, some of the names here are not exactly what exists in our deployment, but are given in the un-suffixed form for clarity.
RAW_{env}: This holds raw data loaded from tools like Fivetran or Airflow. It is strictly permissioned, and only loader tools should have the ability to load or change data.
TRANSFORM_{env}: This holds intermediate results, including staging data, joined datasets, and aggregations. It is the primary database where development/analytics engineering happens.
ANALYTICS_{env}: This holds analysis/BI-ready datasets. This is the "marts" database.
There are warehouse groups for processing data in the databases, corresponding to the primary purposes of the above databases. They are available in a few different sizes, depending upon the needs of the data processing job: X-small (denoted by XS), X-large (denoted by XL), and 4X-large (denoted by 4XL). Most jobs on small data should use the relevant X-small warehouse.
LOADING_{size}_{env}: These warehouses are for loading data into RAW.
TRANSFORMING_{size}_{env}: This warehouse is for transforming data in TRANSFORM and ANALYTICS.
REPORTING_{size}_{env}: These warehouses are for BI tools and other end-users of the data.
LOADER_{env}: This role is for tooling like Fivetran or Airflow to load raw data in to the RAW database.
TRANSFORMER_{env}: This is the analytics engineer/dbt role, for transforming raw data into something analysis-ready. It has read/write/control access to both TRANSFORM and ANALYTICS, and read access to RAW.
REPORTER_{env}: This role has read access to ANALYTICS, and is intended for BI tools and other end-users of the data.
READER_{env}: This role has read access to all three databases, and is intended for CI service accounts to generate documentation.
"},{"location":"snowflake/#access-roles-vs-functional-roles","title":"Access Roles vs Functional Roles","text":"
We create a two layer role hierarchy according to Snowflake's guidelines:
Access Roles are roles giving a specific access type (read, write, or control) to a specific database object, e.g., \"read access on RAW\".
Functional Roles represent specific user personae like \"developer\" or \"analyst\" or \"administrator\". Functional roles are built by being granted a set of Access Roles.
There is no technical difference between access roles and functional roles in Snowflake. The difference lies in the semantics and hierarchy that we impose upon them.
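As a sketch of how this hierarchy is typically wired up in SQL (the role, database, and user names here follow the conventions above but are illustrative, not a copy of our Terraform configuration):

-- Access roles bundle one access type on one database (illustrative names)
grant usage on database TRANSFORM_DEV to role TRANSFORM_DEV_READ;
grant select on all tables in database TRANSFORM_DEV to role TRANSFORM_DEV_READ;

-- Functional roles are then composed out of access roles
grant role RAW_DEV_READ        to role TRANSFORMER_DEV;
grant role TRANSFORM_DEV_READ  to role TRANSFORMER_DEV;
grant role TRANSFORM_DEV_WRITE to role TRANSFORMER_DEV;

-- Finally, functional roles are granted to users (or to other roles)
grant role TRANSFORMER_DEV to user SOME_ANALYTICS_ENGINEER;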
Our security policies and norms for Snowflake are following the best practices laid out in this article, these overview docs, and conversations had with our Snowflake representatives.
"},{"location":"snowflake/#use-federated-single-sign-on-sso-and-system-for-cross-domain-identity-management-scim-for-human-users","title":"Use Federated Single Sign-On (SSO) and System for Cross-domain Identity Management (SCIM) for human users","text":"
Most State departments will have a federated identity provider for SSO and SCIM. At the Office of Data and Innovation, we use Okta. Many State departments use Active Directory.
Most human users should have their account lifecycle managed through SCIM, and should log in via SSO.
Using SCIM with Snowflake requires creating an authorization token for the account. This token should be stored in DSE's shared 1Password vault, and needs to be manually rotated every six months.
"},{"location":"snowflake/#enable-multi-factor-authentication-mfa-for-users","title":"Enable multi-factor authentication (MFA) for users","text":"
Users, especially those with elevated permissions, should have multi-factor authentication enabled for their accounts. In some cases, this may be provided by their SSO identity provider, and in some cases this may use the built-in Snowflake MFA using Duo.
"},{"location":"snowflake/#use-auto-sign-out-for-snowflake-sessions","title":"Use auto-sign-out for Snowflake sessions","text":"
Ensure that CLIENT_SESSION_KEEP_ALIVE is set to FALSE in the account. This means that unattended browser windows will automatically sign out after a set amount of time (defaulting to one hour).
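For example, assuming a sufficiently privileged role, the account-level parameter can be checked and set with something like:

-- Check the current value of the parameter for the account
show parameters like 'CLIENT_SESSION_KEEP_ALIVE' in account;

-- Ensure unattended sessions time out (illustrative; requires an appropriately privileged role)
alter account set CLIENT_SESSION_KEEP_ALIVE = FALSE;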
"},{"location":"snowflake/#follow-the-principle-of-least-privilege","title":"Follow the principle of least-privilege","text":"
In general, users and roles should be assigned permissions according to the Principle of Least Privilege, which states that they should have sufficient privileges to perform legitimate work, and no more. This reduces security risks should a particular user or role become compromised.
"},{"location":"snowflake/#create-service-accounts-using-terraform","title":"Create service accounts using Terraform","text":"
Service accounts aren't associated with a human user. Instead, they are created by an account administrator for the purposes of allowing another service to perform some action.
We currently use service accounts for:
Fivetran loading raw data
Airflow loading raw data
dbt Cloud for transforming data
GitHub actions generating docs
These service accounts are created using Terraform and assigned roles according to the principle of least-privilege. They use key pair authentication, which is more secure than password-based authentication as no sensitive data are exchanged. Private keys for service accounts should be stored in CalData's 1Password vault.
The following are steps for creating a new service account with key pair authentication:
Create a new key pair in accordance with these docs. Most of the time, you should create a key pair with encryption enabled for the private key.
Add the private key to the CalData 1Password vault, along with the intended service account user name and passphrase (if applicable)
Create a new user in the Snowflake Terraform configuration (users.tf) and assign it the appropriate functional role. Once the user is created, add its public key in the Snowflake UI:
ALTER USER <USERNAME> SET RSA_PUBLIC_KEY='MII...';
Note that we need to remove the header and trailer (i.e., the -----BEGIN PUBLIC KEY----- and -----END PUBLIC KEY----- lines) as well as any line breaks in order for Snowflake to accept the public key as valid.
Add the private key for the user to whatever system needs to access Snowflake.
Service accounts should not be shared across different applications, so if one becomes compromised, the damage is more isolated.
"},{"location":"snowflake/#regularly-review-users-with-elevated-privileges","title":"Regularly review users with elevated privileges","text":"
Users with access to elevated privileges (especially the ACCOUNTADMIN, SECURITYADMIN, and SYSADMIN roles) should be regularly reviewed by account administrators.
Documentation for this project is built using mkdocs with the material theme and hosted using GitHub Pages. The documentation source files are in the docs/ directory and are authored using markdown.
To write documentation for this project, make sure that the build tools are installed. In a Python environment and in the data-infrastructure repo, you should be able to start a local server for the docs by running:
mkdocs serve
Then open a web browser to http://localhost:8000 to view the built docs. Any edits you make to the markdown sources should be automatically picked up, and the page should automatically rebuild and refresh.
Deployment of the docs for this repository is done automatically upon merging to main using the docs GitHub Action.
Built documentation is pushed to the gh-pages branch of the repository, and can be viewed by navigating to https://cagov.github.io/data-infrastructure.
The Data Services and Engineering team maintains a derived dataset from the Microsoft US Building Footprints and Global ML Building Footprints datasets. The two datasets are broadly similar, but the latter has global coverage and is more frequently updated.
We take the original datasets, and join them with US Census TIGER data to make them more useful for demographic and social science research. The additional census-derived fields include:
State
County
Tract
Block Group
Block
Place
If a footprint intersects more than one of the above, we assign it to the one with the greater intersection, so each footprint should only appear once in the dataset.
Note
Despite the names, these derived datasets are scoped to California only.
The data are stored as files in AWS S3. We distribute them in both GeoParquet and zipped Shapefile formats.
GeoParquet is usually a superior format for doing data analytics as it is:
An open format, based on the industry-standard Parquet format.
Efficiently compressed
Cloud-native
Columnar, with a data layout optimized for analytical workloads.
However, GeoParquet is also somewhat newer, and not supported by all tooling yet, so the zipped Shapefiles may be better suited for some workflows (especially Esri ones).
"},{"location":"data/footprints/#usage-with-geopandas","title":"Usage with GeoPandas","text":"
GeoPandas is an extension to the Python Pandas library enabling analysis of geospatial vector data.
Examples of reading the files using GeoPandas:
import os
import geopandas

# Ensure S3 requests are anonymous; there is no need for AWS credentials here.
os.environ["AWS_NO_SIGN_REQUEST"] = "YES"

# Read GeoParquet using S3 URL
gdf = geopandas.read_parquet(
    "s3://dof-demographics-dev-us-west-2-public/"
    "global_ml_building_footprints/parquet/county_fips_003.parquet"
)

# Read Shapefile using HTTPS URL
gdf = geopandas.read_file(
    "https://dof-demographics-dev-us-west-2-public.s3.us-west-2.amazonaws.com/"
    "global_ml_building_footprints/shp/county_fips_003.zip"
)
"},{"location":"data/footprints/#usage-with-arcgis-pro-toolbox","title":"Usage with ArcGIS Pro toolbox:","text":"
Fennis Reed at the California Department of Finance Demographics Research Unit has created an ArcGIS Pro toolbox for downloading individual footprint files, which can be downloaded here.
The following tables contain public links to the datasets, partitioned by county. The HTTPS URLs can be used to directly download files using a web browser, while the S3 URLs are more appropriate for scripts like the examples above.
Cloud data warehouses (CDWs) are databases which are hosted in the cloud, and are typically optimized around analytical queries like aggregations and window functions, rather than the typical transactional queries that might support a traditional application. Examples of popular cloud data warehouses include Google BigQuery, Amazon Redshift, and Snowflake.
Cloud data warehouses typically have a few advantages over traditional transactional databases for analytical workflows, including:
They are usually managed services, meaning you don't have to provision and maintain servers.
They can scale to truly massive data.
By having a solid understanding of how cloud data warehouses work, you can construct fast, efficient queries and avoid surprise costs.
With most on-premise transactional warehouses, costs scale with the number of server instances you buy and run. These servers usually are always-on and power various applications with high availability. In a traditional transactional warehouse both compute power and storage are associated with the same logical machine.
Cloud data warehouses typically have a different pricing model: they decouple storage and compute and charge based on your query usage. Google BigQuery charges based on the amount of data your queries scan. Snowflake charges based on the amount of compute resources needed to execute your queries. There are also costs associated with data storage, but those are usually small compared to compute. Though these two models are slightly different, they both lead to a similar take-home lesson: by being careful with how data are laid out and accessed, you can significantly reduce both execution time and cost for your cloud data warehouses.
Most cloud data warehouses use columnar storage for their data. This means that data for each column of a table are stored sequentially in object storage (this is in contrast to transactional databases which usually store each row, or record, sequentially in storage). This BigQuery blog post goes into a bit more detail.
There are a number of consequences of using columnar storage:
You can read in columns separately from each other. So if your query only needs to look at one column of a several-hundred column table, it can do that without incurring the cost of loading and processing all of the other columns.
Because the values in a column are located near each other in device storage, it is much faster to read them all at once for analytical queries like aggregations or window functions. In row-based storage, there is much more jumping around to different parts of memory.
Having values of the same data type stored sequentially allows for much more efficient serialization and compression of the data at rest.
In addition to columnar storage, cloud data warehouses also usually divide tables row-wise into chunks called partitions. Different warehouses choose different sizing strategies for partitions, but they are typically from a few to a few hundred megabytes. Having separate logical partitions in a table allows the compute resources to process the partitions independently of each other in parallel. This massively parallel processing capability is a large part of what makes cloud data warehouses scalable. When designing your tables, you can often set partitioning strategies or clustering keys for the table. This tells the cloud data warehouse to store rows with similar values for those keys within the same partitions. A well-partitioned table can enable queries to only read from the partitions that it needs, and ignore the rest.
"},{"location":"learning/cloud-data-warehouses/#constructing-queries-for-cloud-data-warehouses","title":"Constructing queries for cloud data warehouses","text":"
With the above understanding of how cloud data warehouses store and process data, we can write down a set of recommendations for how to construct efficient queries for large tables stored within them:
Only SELECT the columns you need. Columnar storage allows you to ignore the columns you don't need, and avoid the cost of reading them in. SELECT * can get expensive!
If the table has a natural ordering, consider setting a partitioning or clustering key. For example, if the data in the table consists of events with an associated timestamp, you might want to partition according to that timestamp. Then events with similar times would be stored near each other in the same or adjacent partitions, and queries selecting for a particular date range would have to scan fewer partitions.
If the table has a partitioning or clustering key already set, try to filter based on that in your queries. This can greatly reduce the amount of data you need to scan.
Filter early in complex queries, rather than at the end. If you have complex, multi-stage queries, filtering down to the subset of interest at the outset can avoid the need to process unnecessary data and then throw it away later in the query.
Note
For people coming from transactional databases, the considerations about partitioning and clustering may seem reminiscent of indexes. Cloud data warehouses usually don't have traditional indexes, but partitioning and clustering keys fill approximately the same role, tailored to the distributed compute model.
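To make the partitioning and clustering advice above concrete, here is a hedged sketch using a hypothetical events table (BigQuery-flavored syntax; Snowflake instead uses a CLUSTER BY clause on the table, and the table and column names here are assumptions):
CREATE TABLE analytics.events (
    event_date DATE,
    event_timestamp TIMESTAMP,
    device_browser STRING
)
PARTITION BY event_date      -- rows for the same day land in the same partition
CLUSTER BY device_browser;   -- rows with similar values are stored near each other

-- A query filtering on the partitioning column only scans the matching partitions
SELECT device_browser
FROM analytics.events
WHERE event_date BETWEEN '2023-01-01' AND '2023-01-31';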
"},{"location":"learning/cloud-data-warehouses/#primary-keys-and-constraints","title":"Primary keys and constraints","text":"
A central feature of cloud data warehouses is that storage is separate from compute, and data can be processed in parallel by distributed compute resources. The less communication that needs to happen between these distributed compute resources, the faster they can work. For this reason, most cloud data warehouses do not support primary keys, foreign keys, or other constraints.
For example: if we have a foreign key constraint set on a table and insert a new record, we would have to scan every single row of the parent table to see if the referenced row exists and is unique. If the table is large and partitioned, this could mean spinning up a large amount of compute resources just to insert a single row. So rather than supporting constraints with such poor performance characteristics, cloud data warehouses simply don't enforce them. This can be surprising to some people, since these warehouses often still accept the syntax for constraints for SQL-standard compatibility (see the Snowflake docs on constraints).
Note
One exception to the above is NOT NULL constraints, which can be done cheaply since they don't require information from other tables or partitions to be enforced.
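As a hedged illustration of this behavior in Snowflake (the table and values are hypothetical; this reflects Snowflake's documented behavior of accepting but not enforcing key constraints):
CREATE TABLE orders (
    order_id INTEGER PRIMARY KEY,   -- accepted for compatibility, but not enforced
    customer_id INTEGER NOT NULL    -- NOT NULL is enforced
);

INSERT INTO orders VALUES (1, 100), (1, 200);  -- succeeds, despite the duplicate "primary key"
INSERT INTO orders VALUES (2, NULL);           -- fails, because NOT NULL is enforced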
This exercise is intended to be done live with collaborators. It should read fine, but will be more impactful if we set up a lab setting! We'll be querying Google Analytics 4 data stored in BigQuery. The dataset in question consists of (at the time of this writing) about six months of user event data collected from websites under the ca.gov domain. It has over 500 million events in about 400 gigabytes of storage. The table is partitioned by event date, so all events on the same day get put in the same partition.
Suppose we want to analyze the breakdown of the different web browsers used to access state sites so we can understand which browsers are the highest priority to support. We expect this to be a moving target as different browsers become more or less popular, so we'll try to restrict our analysis to the month of January, 2023. Fortunately, the dataset has a timestamp column, so we can try to filter based on that column:
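A hedged sketch of such a query (the table and column names follow the partition-pruned example further below, and we assume event_timestamp is stored as a TIMESTAMP; the exact query may differ):
SELECT *
FROM `analytics_staging.base_ga4__events`
WHERE event_timestamp >= TIMESTAMP('2023-01-01')
  AND event_timestamp < TIMESTAMP('2023-02-01')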
Yikes! This query scans the whole 400 GB dataset. Based on Google's approximately $5/TB charge, this costs about $2, and if it were a query we were running many times a day, it could easily start costing thousands of dollars per year.
"},{"location":"learning/cloud-data-warehouses/#take-advantage-of-column-pruning","title":"Take advantage of column pruning","text":"
You'll note that we are doing a SELECT * query, but if we're interested in browser usage, we really only need that column. So let's just SELECT that column:
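Again as a hedged sketch (same assumptions as above), the query becomes:
SELECT device_browser
FROM `analytics_staging.base_ga4__events`
WHERE event_timestamp >= TIMESTAMP('2023-01-01')
  AND event_timestamp < TIMESTAMP('2023-02-01')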
By just selecting the column we wanted, we avoided loading a lot of unnecessary data, and now we are only scanning ~4 GB of data, reducing the charge by 99%!
"},{"location":"learning/cloud-data-warehouses/#take-advantage-of-partition-pruning","title":"Take advantage of partition pruning","text":"
In the above query we are filtering based on the event_timestamp field. However, the dataset actually has two different time-like fields, and it is partitioned based on the other one! The query planner is not smart enough to know that both fields contain similar information, and it is therefore not able to infer that we don't need to scan every partition to get the data within the January time window. Let's fix that by re-working the query to use the partitioned-by DATE field:
SELECT device_browser FROM `analytics_staging.base_ga4__events`
WHERE event_date_dt >= '2023-01-01'
AND event_date_dt <= '2023-01-31'
By using the field by which the table is partitioned in our filter, we reduced the data scanned by another factor of ~5 (as discussed above, this is analogous to using an index).
Many CalData projects use dbt for transforming and modeling data within our cloud data warehouses. dbt has become extremely popular over the last several years, popularizing the practice and position of "analytics engineering". It has a number of features that make it valuable for data stacks:
It works well with version control
It encourages modular, reusable SQL code
It makes it easier to track data lineage as it flows through your data warehouse
It has a large, active community with which you can share tips and techniques
dbt provides a series of free courses for learning how to use the project:
dbt Fundamentals
Jinja, Macros, and Packages
Advanced Materializations
Refactoring SQL for Modularity
"},{"location":"learning/git/","title":"Git and GitHub","text":""},{"location":"learning/git/#what-are-git-and-github","title":"What are git and GitHub?","text":"
Git and GitHub occupy central positions in modern software development practices.
Git is software that you install locally on your machine that enables source code management (SCM). Code is organized into folder-like structures called git repositories, which enables you to track the history of code, safely develop features in side branches, and collaborate with others in a distributed fashion.
GitHub is a web platform for hosting git repositories. It integrates tightly with local git development workflows, and includes additional features like a code review user interface, issue tracking, project boards, continuous integration/continuous delivery, and social networking. There are a number of web platforms which have similar features and goals as GitHub (including Bitbucket and GitLab), but GitHub is the most commonly used.
In addition to the fundamentals of git, it's also helpful to know how to use the GitHub web platform for development. GitHub hosts an excellent set of interactive tutorials for learning to use its various features, including:
An introduction to GitHub
How to use Markdown for issues and READMEs
How to review pull requests
How to automate tasks and use CI/CD with GitHub actions
"},{"location":"learning/git/#gitgithub-at-caldata","title":"Git+GitHub at CalData","text":"
On the CalData Data Services and Engineering team we make heavy use of git and GitHub for our projects, and have our own set of guidelines and best practices for code review.
"},{"location":"learning/glossary/","title":"Modern Data Stack Glossary","text":"
This glossary is a reference for commonly used acronyms, terms, and tools associated with the modern data stack and data and analytics engineering practices.
Modern data stack - a cloud-first suite of software tools that enable data teams to connect to, process, store, transform, and visualize data.
ETL vs ELT – ETL (Extract, Transform and Load) and ELT (Extract, Load and Transform) are data integration methods that determine whether data is preprocessed before landing in storage or transformed after being stored.
Both methods have the same three operations:
Extraction: Pulling data from its original source system (e.g. connecting to data from a SaaS platform like Google Analytics)
Transformation: Changing the data's structure so it can be integrated into the target data system (e.g., changing geospatial data from a JSON structure to a Parquet format)
Loading: Dumping data into a storage system (e.g. AWS S3 bucket or GCS)
Advantages of ELT over ETL:
More flexibility, as ETL is traditionally intended for relational, structured data. Cloud-based data warehouses enable ELT for structured and unstructured data
Greater accessibility, as ETL is generally supported, maintained, and governed by organizations' IT departments. ELT allows for easier access and use by employees
Scalability, as ETL can be prohibitively resource-intensive for some businesses. ELT solutions are generally cloud-based SaaS, available to a broader range of businesses
Faster load times, as ETL typically takes longer as it uses a staging area and system. With ELT, there is only one load to the destination system
Faster transformation times, as ETL is typically slower and dependent on the size of the data set(s). ELT transformation is not dependent on data size
Less time required for data maintenance, as data may need to be re-sourced and re-loaded if the transformation is found to be inadequate for the data\u2019s intended purposes. With ELT, the original data is intact and already loaded from disk
Sources: Fivetran, Snowflake
Columnar Database vs Relational Database - a columnar database stores data by columns making it suitable for analytical query processing whereas a relational database stores data by rows making it optimized for transactional applications
Advantages of Columnar over Relational databases:
Reduces amount of data needed to be loaded
Improves query performance by returning relevant data faster (instead of going row by row, multiple fields can be skipped)
Cloud Data Warehouse - a database stored as a managed service in a public cloud optimized for scalable analytics.
Tools like Excel, Tableau, or PowerBI are limited in how much data can be brought into the dashboard. Cloud data warehouses, however, can handle petabyte scale data without too much fuss. Now, PowerBI or Tableau can also pass off data processing to a cloud data warehouse, but then major data processing jobs get hidden in a dashboard panel, which can produce unexpected spends, poor code reusability, and brittle dashboards.
Analytics engineering - applies software engineering practices to analytical workflows, like version control and continuous integration/deployment. Analytics engineers are often thought of as a hybrid between data engineers and data analysts. Most analytics engineers spend their time transforming, modeling, testing, deploying, and documenting data. Data modeling – applying business logic to data to represent commonly known truths across an organization (for example, what data defines an order) – is the core of their workload, and it enables analysts and other data consumers to answer their own questions.
Originally coined two years ago by Michael Kaminsky, the term came from the ground up, when data people experienced a shift in their jobs: they went from handling data engineer/scientist/analyst tasks to spending most of their time fixing, cleaning, and transforming data. And so they (mainly members of the dbt community) created a term to describe this middle-seat role: the Analytics Engineer.
dbt comes in as a SQL-first transformation layer built for modern data warehousing and ingestion tools that centralizes data models, tests, and documentation.
Sources: Castor, dbt
Agile development - is an iterative approach to software development (and project management) that helps teams ship code faster and with fewer bugs.
Sprints - a time-boxed period (usually 2 weeks) when a team works to complete a set amount of work. Some sprints have themes like if a new tool was procured an entire sprint may be dedicated to setup and onboarding.
Additional reading: Atlassian: What is Agile?, Adobe: Project Sprints
Version control - enables teams to collaborate and streamline code development to resolve conflicts and create a centralized location for code.
Source: Gitlab: What is Version Control
CI/CD - Continuous integration and continuous delivery/deployment are automated processes to deploy code (e.g., whenever you merge to main).
Continuous integration (CI) automatically builds, tests, and integrates code changes within a shared repository
Continuous delivery (CD) automatically delivers code changes to production environments for human approval
OR Continuous deployment (CD) automatically delivers and deploys code changes directly, circumventing human approval
The major public clouds (AWS, GCP, Azure) all have a service for Identity and Access Management (IAM).
+This allows us to manage which users or services are able to perform actions on which resources.
+In general, IAM is described by:
+
+
Users (or principals) - some person or workflow which uses IAM to access cloud resources. Users can be assigned to groups.
+
Permissions - an ability to perform some action on a resource or collection of resources.
+
Groups - Rather than assigning permissions directly to users, it is considered good practice to instead create user groups with appropriate permissions, then add users to the group. This makes it easier to add and remove users while maintaining separate user personas.
+
Policies - a group of related permissions for performing a job, which can be assigned to a role or user.
+
Role - a group of policies for performing a workflow. Roles are similar to users, but do not have a user identity associated with them. Instead, they can be assumed by users or services to perform the relevant workflow.
+
+
Most of the work of IAM is managing users, permissions, groups, policies, and roles to perform tasks in a secure way.
In general, users and roles should be assigned permissions according to the
+Principle of Least Privilege,
+which states that they should have sufficient privileges to perform
+legitimate work, and no more. This reduces security risks should a particular
+user or role become compromised.
+
Both AWS and GCP have functionality for analyzing the usage history of principals,
+and can flag permissions that they have but are not using.
+This can be a nice way to reduce the risk surface area of a project.
Service accounts are special IAM principals which act like users,
+and are intended to perform actions to support a particular workflow.
+For instance, a system might have a "deploy" service account in CD which is
+responsible for pushing code changes to production on merge.
+
Some good practices around the use of service accounts
+(largely drawn from here):
+
+
Service accounts often have greater permissions than human users,
+ so user permissions to impersonate these accounts should be monitored!
+
Don't use service accounts during development (unless testing the service account permissions).
+ Instead, use your own credentials in a safe development environment.
+
Create single-purpose service accounts, tied to a particular application or process.
+ Different applications have different security needs,
+ and being able to edit or decommission accounts separately from each other is a good idea.
+
Regularly rotate access keys for long-term service accounts.
Production environments should be treated with greater care than development ones.
+In the testing and development of a service, roles and policies are often crafted
+which do not follow the principle of least privilege (i.e., they have too many permissions).
+
When productionizing a service or application, make sure to review the relevant
+roles and service accounts to ensure they only have the necessary policies,
+and that unauthorized users don't have permission to assume those roles.
GCP's IAM documentation is a good read,
+and goes into much greater detail than this document on how to craft and maintain IAM roles.
+
Default GCP service accounts often have more permissions than are strictly needed
+for their intended operation. For example, they might have read/write access to all
+GCS buckets in a project, when their application only requires access to one.
Often a third-party software-as-a-service (SaaS) provider will require service accounts
+to access resources within a cloud account.
+For example, dbt requires fairly expansive permissions
+within your cloud data warehouse to create, transform, and drop data.
+
Specific IAM roles needed for a SaaS product are usually documented in their setup guides.
+These should be periodically reviewed by CalData and ODI IT-Ops staff to ensure they are still required.
Fivetran's security docs, which link to a deeper-dive white paper, are a good place to go to understand their standards and policies for connecting, replicating, and loading data from all of our data sources.
+
Within the Users & Permissions section of our Fivetran account there are three sub-sections for: Users, Roles, and Teams.
This diagram from their docs gives an at-a-glance view of how RBAC could be configured across multiple teams, destinations, and connectors. Since we aren't a massive team, we may not have as much delegation, but this gives you a sense of what's possible.
Note: With the recent creation of organizations in Fivetran, client project security policies are simplified, as we can isolate them from our other projects by creating a separate account. This is also better from a handoff perspective.
+The Roles page defines all default role types and how many users are associated with each type. There is also the ability to create custom roles via the + Add Role button.
+
Currently, we have two teams:
+- IT
+- Data Services and Engineering (DSE)
+
Our IT team's role is Account Billing and User Access. This is a custom role that provides billing and user management access with view-only access for the remaining account-level features; it provides no destination or connector access.
+
The DSE team both manages CalData projects and onboards clients into Fivetran, and so its members have Account Administrator roles.
These are instructions for individual contributors to set up the repository locally.
+For instructions on how to develop using GitHub Codespaces, see here.
Much of the software in this project is written in Python.
+It is usually worthwhile to install Python packages into a virtual environment,
+which allows them to be isolated from those in other projects which might have different version constraints.
+
One popular solution for managing Python environments is Anaconda/Miniconda.
+Another option is to use pyenv.
+Pyenv is lighter weight, but is Python-only, whereas conda allows you to install packages from other language ecosystems.
+
Here are instructions for setting up a Python environment using Miniconda:
+
+
Follow the installation instructions for installing Miniconda.
The following prompt will appear: "The following NEW packages will be INSTALLED: ". You'll have the option to accept or reject by typing y or n. Type y.
+Activate the infra environment:
+
We use Terraform to manage infrastructure.
+Dependencies for Terraform (mostly in the go ecosystem)
+can be installed via a number of different package managers.
+
If you are running Mac OS, you can install these dependencies with Homebrew.
+First, install Homebrew
In order to use Snowflake (as well as the terraform validators for the Snowflake configuration)
+you should set some default local environment variables in your environment.
+This will depend on your operating system and shell. For Linux and Mac OS systems,
+as well as users of Windows subsystem for Linux (WSL) it's often set in
+~/.zshrc, ~/.bashrc, or ~/.bash_profile.
+
If you use zsh or bash, open your shell configuration file, and add the following lines:
This will enable you to perform transformation activities, which are needed for dbt.
+Open a new terminal and verify that the environment variables are set.
This will enable you to perform loading activities, which are needed for Airflow or Fivetran.
+Again, open a new terminal and verify that the environment variables are set.
The connection information for our data warehouses will,
+in general, live outside of this repository.
+This is because connection information is both user-specific and usually sensitive,
+and so should not be checked into version control.
+In order to run this project locally, you will need to provide this information
+in a YAML file located (by default) in ~/.dbt/profiles.yml.
+
Instructions for writing a profiles.yml are documented
+here,
+as well as specific instructions for
+Snowflake.
+
You can verify that your profiles.yml is configured properly by running dbt debug.
The target name (dev) in your profiles.yml can be anything.
+However, we treat targets named prd differently in generating
+custom dbt schema names (see here).
+We recommend naming your local development target dev, and only
+include a prd target in your profiles under rare circumstances.
You can include profiles for several databases in the same profiles.yml,
+(as well as targets for production), allowing you to develop in several projects
+using the same computer.
This project can be developed entirely using dbt Cloud.
+That said, many people prefer to use more featureful editors,
+and the code quality checks that are set up here are easier to run locally.
+By equipping a text editor like VS Code with an appropriate set of extensions and configurations
+we can largely replicate the dbt Cloud experience locally.
+Here is one possible configuration for VS Code:
+
+
Install some useful extensions (this list is advisory, and non-exhaustive):
+
dbt Power User (query previews, compilation, and auto-completion)
+
Python (Microsoft's bundle of Python linters and formatters)
+
sqlfluff (SQL linter)
+
+
+
Configure the VS Code Python extension to use your virtual environment by choosing Python: Select Interpreter from the command palette and selecting your virtual environment from the options.
+
Associate .sql files with the jinja-sql language by going to Code -> Preferences -> Settings -> Files: Associations, per these instructions.
+
Test that the vscode-dbt-power-user extension is working by opening one of the project model .sql files and pressing the "▶" icon in the upper right corner. A query results pane should open showing a preview of the data.
This project uses pre-commit to lint, format,
+and generally enforce code quality. These checks are run on every commit,
+as well as in CI.
+
To set up your pre-commit environment locally run
+
pre-commit install
+
+
The next time you make a commit, the pre-commit hooks will run on the contents of your commit
+(the first time may be a bit slow as there is some additional setup).
+
You can verify that the pre-commit hooks are working properly by running
+
pre-commit run --all-files
+
+to test every file in the repository against the checks.
+
Some of the checks lint our dbt models and Terraform configurations,
+so having the terraform dependencies installed and the dbt project configured
+is a requirement to run them, even if you don't intend to use those packages.
+
Snowflake
The setup of our account is adapted from the approach described in
+this dbt blog post,
+which we summarize here.
+
+
Note
+
We have development and production environments, which we denote with _DEV
+and _PRD suffixes on Snowflake objects. For that reason, some of the names here
+are not exactly what exists in our deployment, but are given in the un-suffixed
+form for clarity.
+
+
flowchart LR
+ Airflow((Airflow))
+ Fivetran((Fivetran))
+ subgraph RAW
+ direction LR
+ A[(SCHEMA A)]
+    B[(SCHEMA B)]
+    C[(SCHEMA C)]
+ end
+ DBT1((dbt))
+ subgraph TRANSFORM
+ direction LR
+ D[(SCHEMA A)]
+    E[(SCHEMA B)]
+    F[(SCHEMA C)]
+ end
+ DBT2((dbt))
+ subgraph ANALYTICS
+ direction LR
+ G[(SCHEMA A)]
+    H[(SCHEMA B)]
+    I[(SCHEMA C)]
+ end
+ PowerBI
+ Tableau
+ Python
+ R
+
+ Airflow -- LOADER --> RAW
+ Fivetran -- LOADER --> RAW
+ RAW -- TRANSFORMER --> DBT1
+ DBT1 -- TRANSFORMER --> TRANSFORM
+ TRANSFORM -- TRANSFORMER --> DBT2
+ DBT2 -- TRANSFORMER --> ANALYTICS
+ ANALYTICS -- REPORTER --> Tableau
+ ANALYTICS -- REPORTER --> Python
+ ANALYTICS -- REPORTER --> R
+ ANALYTICS -- REPORTER --> PowerBI
RAW_{env}: This holds raw data loaded from tools like Fivetran or Airflow. It is strictly permissioned, and only loader tools should have the ability to load or change data.
+
TRANSFORM_{env}: This holds intermediate results, including staging data, joined datasets, and aggregations. It is the primary database where development/analytics engineering happens.
+
ANALYTICS_{env}: This holds analysis/BI-ready datasets. This is the "marts" database.
There are warehouse groups for processing data in the databases,
+corresponding to the primary purposes of the above databases.
+They are available in a few different sizes, depending upon the needs of the data processing job:
+X-Small (denoted XS), X-Large (denoted XL), and 4X-Large (denoted 4XL).
+Most jobs on small data should use the relevant X-small warehouse.
+
+
LOADING_{size}_{env}: This warehouse is for loading data into RAW.
+
TRANSFORMING_{size}_{env}: This warehouse is for transforming data in TRANSFORM and ANALYTICS.
+
REPORTING_{size}_{env}: This warehouse is the role for BI tools and other end-users of the data.
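In practice these warehouses are managed with Terraform, but as a hedged SQL sketch (the suspend/resume settings here are assumptions, not necessarily our actual configuration), one warehouse in this scheme might be created like:
CREATE WAREHOUSE IF NOT EXISTS TRANSFORMING_XS_DEV
    WAREHOUSE_SIZE = 'XSMALL'
    AUTO_SUSPEND = 60          -- suspend after 60 seconds of inactivity
    AUTO_RESUME = TRUE
    INITIALLY_SUSPENDED = TRUE;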
LOADER_{env}: This role is for tooling like Fivetran or Airflow to load raw data in to the RAW database.
+
TRANSFORMER_{env}: This is the analytics engineer/dbt role, for transforming raw data into something analysis-ready. It has read/write/control access to both TRANSFORM and ANALYTICS, and read access to RAW.
+
REPORTER_{env}: This role has read access to ANALYTICS, and is intended for BI tools and other end-users of the data.
+
READER_{env}: This role has read access to all three databases, and is intended for CI service accounts to generate documentation.
We create a two layer role hierarchy according to Snowflake's
+guidelines:
+
+
Access Roles are roles giving a specific access type (read, write, or control) to a specific database object, e.g., "read access on RAW".
+
Functional Roles represent specific user personae like "developer" or "analyst" or "administrator". Functional roles are built by being granted a set of Access Roles.
+
+
There is no technical difference between access roles and functional roles in Snowflake. The difference lies in the semantics and hierarchy that we impose upon them.
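As a hedged SQL sketch of the two layers (role names are illustrative; our real roles are managed in Terraform):
-- Access role: a specific access type on a specific database object
CREATE ROLE IF NOT EXISTS ANALYTICS_READ;
GRANT USAGE ON DATABASE ANALYTICS TO ROLE ANALYTICS_READ;
GRANT USAGE ON ALL SCHEMAS IN DATABASE ANALYTICS TO ROLE ANALYTICS_READ;
GRANT SELECT ON ALL TABLES IN DATABASE ANALYTICS TO ROLE ANALYTICS_READ;

-- Functional role: a user persona assembled from access roles
CREATE ROLE IF NOT EXISTS REPORTER;
GRANT ROLE ANALYTICS_READ TO ROLE REPORTER;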
Our security policies and norms for Snowflake follow the best practices laid out in
+this article,
+these overview docs,
+and conversations with our Snowflake representatives; they are summarized in the security guidance earlier in this document.
+