[Feature] Adding new schema dynamically #88
Comments
@advolut-team thanks for opening this issue! One quick adjustment: I am going to change the label on this issue.

Your request is something we have heard from other customers in the past (for example, see this closed issue), and something we would be interested in adding to the union schema feature. We had explored creating dynamic macros to query the information schema, but we continually ran into issues across warehouses, as well as with dbt interpreting Jinja macros literally (as you highlighted), which resulted in us halting that exploration.

That being said, we have not explored the environment variable you are proposing, and this could be an interesting solution. 🤔 I'm a bit apprehensive about adding an environment variable to the package code, as this is not something we have done in the past, and we would need to make sure it wouldn't somehow conflict with users' dbt Core or Cloud environments.

While I investigate that, would you be open to exploring a dbt source override, changing the schema to your proposed update, and sharing whether this works for your scenario? Additionally, you can create a fork, apply your changes, and install your fork to test out the functionality. This will help us understand the viability of the proposed solution.
@fivetran-joemarkiewicz Thanks for the fantastic suggestion! I added the following source override in my parent project:
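The override shared in that comment did not survive extraction. A minimal sketch of what such a dbt source override might look like, assuming the package defines a source named `shopify` and reusing the `DBT_SCHEMA_NAME` environment variable proposed earlier in the issue (the file name and fallback value here are assumptions; `overrides` and `env_var` are standard dbt features):

```yml
# models/src_shopify_override.yml (hypothetical file name)
version: 2

sources:
  - name: shopify                # must match the source name defined in shopify_source
    overrides: shopify_source    # tells dbt to override the package's source definition
    # resolve the target schema at runtime, falling back to "shopify" if unset
    schema: "{{ env_var('DBT_SCHEMA_NAME', 'shopify') }}"
```

With this in the parent project, the package's `source('shopify', ...)` references resolve against the schema supplied at runtime instead of a hard-coded value.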
Hey, I can confirm that the source override fix works. I am now able to add a new schema dynamically in Redshift when creating a connector with the Fivetran Connect Card. Happy for you to close this ticket! :) Thank you very much.
That's great to hear, and thanks for letting us know that this solution worked for you. I'm going to keep this ticket open in case others using the package want something similar to be a built-in feature of the package. If others come across this and would like to see this functionality, please comment and let us know. Thanks again @advolut-team for opening this issue and collaborating with us on a solution that worked for you 😄
Is there an existing issue for this?
Describe the issue
Hi Fivetran team,
First off, thank you for the amazing work in maintaining this repo - it has been very useful :)
Problem:
I'd like to run a dbt transformation for more than 50 Shopify shops using the same `dbt_shopify` repo, which uses this repo as a "source" package. I understand that this feature is already available, but every time a new Shopify store signs up via the Fivetran Connect Card flow, I'll need to append to the `shopify_union_schemas` variable, which is already a long list. So, I'd like to read the `shopify_schema` dynamically whenever a new Shopify store signs up to my website.

Attempted Workaround:
In the `dbt_shopify` repo, I added an additional environment variable `DBT_SCHEMA_NAME=shopify_store_name`. Then, in `dbt_project.yml`, I modified `shopify_schema` to read the environment variable `DBT_SCHEMA_NAME`, like below:

Request:
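The configuration referenced above did not survive extraction. A sketch of reading the environment variable for the schema var in `dbt_project.yml` (the variable name `DBT_SCHEMA_NAME` comes from the issue; the fallback value is an assumption — `env_var` is one of the few Jinja functions dbt allows inside `dbt_project.yml`):

```yml
# dbt_project.yml (excerpt)
vars:
  # resolve the Shopify schema at runtime from the environment,
  # falling back to "shopify" if DBT_SCHEMA_NAME is unset
  shopify_schema: "{{ env_var('DBT_SCHEMA_NAME', 'shopify') }}"
```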
Ideal: Modify `dbt_packages/shopify_source/models/src_shopify.yml` in your code. I'm happy to send a PR.

I'm also keen for any recommendations if you think my approach can be improved.
Relevant error log or model output
Expected behavior
Expected end result in Redshift:
Assuming the Shopify store is `env_var2`, this repo should generate `env_var2_stg_shopify`, and the `dbt_shopify` repo will generate `env_var2_shopify`. The `env_var2_final` schema is the "final" serving layer that holds the derived tables from both repos. There will be 4 schemas in total.

dbt Project configurations
Package versions
What database are you using dbt with?
Redshift
dbt Version
Additional Context
No response
Are you willing to open a PR to help address this issue?