-
Hey, all, I have another question that one of my co-workers would like me to ask you.

As background: we have an existing process that uses schemachange, but we'd like to stick with our declarative model (e.g., a table creation script that contains all of a table's columns, as opposed to a creation script accompanied by several ALTER scripts that must be read in order to understand the table's current definition). We're essentially taking our repo to be the source of truth regarding the state of the d/b, as opposed to a golden d/b instance. Our plan is to use schemachange to run our repo into empty "scratch" database(s), then use snowddl-convert followed by 'snowddl plan' to compare the scratch d/bs to our QA d/bs and generate a massive ALTER script, which would then be fed to schemachange to get QA (and downstream environments) updated (roughly sketched below).

Now the question: is anybody using SnowDDL (or any other tool) to generate plans (ALTER scripts) by directly comparing SQL source code to a target database, or is it all YAML-to-d/b or d/b-to-d/b comparisons? Thanks.
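To make the proposed pipeline concrete, here is roughly the orchestration we're picturing. This is only a sketch: the schemachange and SnowDDL flags are placeholders from memory and may not match current versions, connection arguments are omitted, and the folder, database, and script names are made up.

```python
import subprocess

# Placeholders -- real folder names, database names, and connection settings will differ.
REPO_SQL_FOLDER = "./tables"           # declarative CREATE TABLE scripts in our repo
SCRATCH_DB = "SCRATCH_QA"              # empty database we rebuild from the repo
CONVERTED_CONFIG = "./snowddl_config"  # YAML produced by snowddl-convert

def run(cmd):
    """Run one CLI step, echo it, and stop the pipeline on failure."""
    print(">", " ".join(cmd))
    return subprocess.run(cmd, check=True, capture_output=True, text=True)

# 1. schemachange builds the scratch database from the repo
#    (flag names vary by schemachange version -- treat these as placeholders).
run(["schemachange", "-f", REPO_SQL_FOLDER, "-d", SCRATCH_DB])

# 2. snowddl-convert reverse-engineers the scratch database into YAML config.
run(["snowddl-convert", "-c", CONVERTED_CONFIG])

# 3. snowddl plan compares that config against QA and prints the DDL it would run;
#    we capture that output instead of applying it.
plan = run(["snowddl", "-c", CONVERTED_CONFIG, "plan"])

# 4. The captured plan becomes a versioned change script that schemachange
#    later applies to QA and the downstream environments (name is made up).
with open("V0001__qa_sync.sql", "w") as fh:
    fh.write(plan.stdout)
```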
-
Interestingly enough, the very first implementation of this tool was based on SQL configs. We parsed SQL to extract the desired object state and compared it with the actual object state in the Snowflake account. But we had to drop this idea relatively quickly and switched entirely to YAML. Unfortunately, SQL in Snowflake is not reliable as a config source, for a few reasons.
Essentially, building an up-to-date SQL parser is not possible without direct access to Snowflake internals. I highly suggest taking the time to switch your SQL configs to YAML, Terraform, or anything else that is well structured; it pays off in the long run. SQL configs might be fine initially, as long as you have a small number of objects and/or avoid advanced features, but when things get more complex, it will bite you. Also, as a bonus, a well-structured format can be used outside of Snowflake (e.g., to generate dashboards or for monitoring), and it can even help you migrate the whole schema to another RDBMS automatically.
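To illustrate the "usable outside of Snowflake" point, here is a tiny sketch. The YAML keys below are hypothetical, not SnowDDL's exact config schema, and a real migration would also map the column types, but the point stands: structured data can be reused with a dictionary walk instead of a SQL parse.

```python
import yaml  # pip install pyyaml

# A hypothetical declarative table definition. The keys below are illustrative
# only -- they are NOT SnowDDL's exact config schema.
TABLE_YAML = """
columns:
  id: NUMBER(38,0)
  customer_name: VARCHAR(255)
  created_at: TIMESTAMP_NTZ
primary_key: [id]
comment: Customers loaded from the billing system
"""

table = yaml.safe_load(TABLE_YAML)

# Because the definition is structured data rather than SQL text, reusing it
# outside Snowflake is a dictionary walk, not a SQL parse. For example, emit a
# CREATE TABLE for another RDBMS (a real migration would also map the types).
column_lines = [f"  {name} {dtype}" for name, dtype in table["columns"].items()]
column_lines.append(f"  PRIMARY KEY ({', '.join(table['primary_key'])})")
print("CREATE TABLE customers (\n" + ",\n".join(column_lines) + "\n);")
```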
-
Btw, please note: I use snowddl-convert as a good starting point when migrating larger accounts, but it is not meant to be used regularly or in a semi-automatic mode; it would take too much effort to support conversion of everything. Also, SnowDDL is a bit more restrictive. For example, you may have lower-cased table and column names with schemachange, but SnowDDL does not allow that.