[Bug] dbt-snowflake raises an error when caching fewer than the maximum number of objects in a schema
#1234
Labels
bug
Is this a new bug in dbt-snowflake?
Current Behavior
I'm seeing this error when running dbt-snowflake on a schema with 12,500 objects:

I can confirm that there are fewer than 100,000 objects in this schema.
Expected Behavior
I should be able to run dbt-snowflake on a schema with 12,500 objects, and the cache should reflect all 12,500 objects with no error.

Steps To Reproduce
dbt run on the full project

Relevant log output
No response
Environment
Additional Context
Initial analysis suggests this is the result of the 2024_07 Snowflake bundle; in particular, it looks like part of this change. We use the object name as a watermark when paginating. The ordering of the returned results is no longer deterministic, meaning that we could (and very likely do) get duplicate relations and miss relations, because we're picking a random starting point in a randomized list. In the past, the returned recordset was ordered, which made the pagination deterministic. We need to find a new watermark method.
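A minimal sketch of the failure mode described above, assuming name-based watermark pagination over a listing whose order is no longer deterministic. Everything here (the schema contents, fetch_page, paginate) is hypothetical illustration, not dbt-snowflake internals:

```python
import random

# Hypothetical schema of 250 object names; 100 objects fetched per page.
SCHEMA = [f"obj_{i:05d}" for i in range(250)]
PAGE_SIZE = 100


def fetch_page(watermark, ordered):
    """Return up to PAGE_SIZE names strictly greater than `watermark`.

    When `ordered` is False, the source hands back rows in a random
    order, standing in for the nondeterministic ordering introduced by
    the bundle change.
    """
    rows = [n for n in SCHEMA if watermark is None or n > watermark]
    if not ordered:
        rows = random.sample(rows, len(rows))  # randomized listing
    return rows[:PAGE_SIZE]


def paginate(ordered):
    seen, watermark = [], None
    while True:
        page = fetch_page(watermark, ordered)
        if not page:
            break
        seen.extend(page)
        watermark = page[-1]  # object name used as the watermark
    return seen


# Deterministic (ordered) source: every object is seen exactly once.
assert paginate(ordered=True) == SCHEMA

# Randomized source: the last name on a page is an arbitrary pick, so
# names below it that weren't on that page are never fetched (missed
# relations), while names above it can be returned again (duplicates).
random.seed(0)
assert len(set(paginate(ordered=False))) < len(SCHEMA)
```

The sketch also suggests the shape of a fix: either re-impose a deterministic sort before taking the watermark, or switch to a watermark that doesn't depend on result ordering.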