"Too many queries or query is too complex" issue #1689
I'm also experiencing this on 0.16.0 with dbt-bigquery 1.8.2. I tried setting the query_max_size and insert_rows_method parameters, as my impression was that these were added to alleviate this issue, but I am still getting the same error messages as before.
Same for me!
Uploaded log file: dbt.log
Hi @jmainwaring and all, apologies for the delayed response here. This configuration (mentioned in one of the threads) should probably solve this issue:
Some extended context about this problem: BigQuery has two types of limits that affect insert queries — a maximum query size, and a maximum query complexity (e.g. too many subqueries). Thanks,
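The suggested configuration itself did not survive the export of this thread. As an illustrative sketch only — the var names are taken from later comments in this thread, while the values are hypothetical and not the maintainer's exact numbers — in `dbt_project.yml`:

```yml
vars:
  # Hypothetical values -- tune for your project
  insert_rows_method: chunk       # insert artifacts in chunks rather than one large query
  dbt_artifacts_chunk_size: 5000  # rows per chunk
```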
Hi @haritamar, the log I posted above is the output from elementary actively failing with the configuration you mention. Unless I have incorrectly set these parameters, they have failed to fix this issue.
@cwmuller @belle-crisp Let us know if the config @haritamar suggested doesn't solve this 🙏🏻
I have the same experience as @cwmuller
Hi @cwmuller @belle-crisp, @cwmuller another setting that might help is to tweak the (I saw it was suggested in one of the Slack threads, so not sure if you tried that as well). Thanks,
Also, if you could share the full list of vars you set, that would be great 🙏
Uploaded a new log running with the following profiles config:
Will get back to you on vars.
Hey @cwmuller, e.g.:
(I removed one zero from the value.) It would be great if you can check whether this configuration makes any difference.
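The exact numbers were lost from this comment. As an illustration of the "one zero removed" suggestion — assuming the variable in question is `query_max_size` and a hypothetical starting value of 1000000 — the tweak would look like:

```yml
vars:
  query_max_size: 100000  # hypothetical: one zero removed from 1000000
```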
Hi @haritamar, indeed! I moved the config from profiles.yml to dbt_project.yml and it worked. Thanks for the assistance!
That's great to hear @cwmuller!
Hi, we've added the suggested changes to the dbt_project.yml:
But we still see this message on our longer runs (Done. PASS=13380 WARN=48 ERROR=4 SKIP=1305 TOTAL=14737):
Do you have any further suggestions to help with this?
Hi @nickozilla ! Itamar |
Hi @haritamar, the log events I shared are from our log aggregation tool, so I don't have the dbt.log file accessible currently. I'll see what else I can provide. Our elementary version is 0.16.0. Nick
Hello @haritamar, with the elementary limit `dbt_artifacts_chunk_size: 10000` I am getting -
We had to reduce the
@cwmuller what were your other vars set to? And did you also experience this error?
@nickozilla
We are hoping to close this issue with the proposed solution, but please let me know if you are still experiencing this issue @benoitgoujon @wei-wei29 @chenxilll @kylemcleland @neethu2409 @belle-crisp |
Adding
to the dbt_project.yaml worked for me. I'm using Elementary 0.16.1 and dbt 1.8.8 (with dbt-bigquery 1.8.3)
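The vars block in the comment above was stripped during export; a hedged sketch of the kind of `dbt_project.yaml` entry being described, with a purely illustrative value:

```yml
vars:
  dbt_artifacts_chunk_size: 5000  # illustrative value only
```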
Thanks! That is helpful to know
Ran into the same problem with elementary==0.16.1, dbt==1.8.3 and BigQuery as the warehouse. Resolved with the following config:
Hi everyone! After a few iterations here, and thanks to a contributor (thanks @Niteeshkanungo!), we realized that the "chunk" method had several bugs in it. We would love it if people here can try the new version and comment on how it goes (ideally try it first without any configuration vars, and fall back to the lower limits posted above only if that doesn't work). Thanks,
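For readers curious what the "chunk" insert method amounts to, here is a minimal Python sketch of the general technique (this is not Elementary's actual code, and the function name is hypothetical): rows are split into fixed-size chunks so that each generated INSERT statement stays well under BigQuery's query-size and complexity limits.

```python
def chunk_rows(rows, chunk_size):
    """Yield successive chunks of at most chunk_size rows each."""
    for i in range(0, len(rows), chunk_size):
        yield rows[i:i + chunk_size]

# Each chunk would be rendered into its own small INSERT statement.
rows = [{"id": n} for n in range(25)]
sizes = [len(chunk) for chunk in chunk_rows(rows, 10)]
print(sizes)  # → [10, 10, 5]
```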
Seems to have fixed things for us, thanks for investigating. Can you share the PR with the bugfix, out of curiosity? EDIT: I can also confirm this works for us without any variables set for
Small update: after some further tests without the query_max_size or dbt_artifacts_chunk_size variables set, I have found that this issue does still occur. We've settled on a stable config of:
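The stable config itself was lost in the export; given the two variables named in this comment, it presumably looked something like the following (the values here are placeholders, not the commenter's actual numbers):

```yml
vars:
  query_max_size: 100000          # placeholder value
  dbt_artifacts_chunk_size: 5000  # placeholder value
```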
@nickozilla - thanks for the update, this is helpful. Regarding the bugfix PR - here it is. Technically it was done by adding support for the
Hello! As seen in this thread, many of us continue to face the following error when run metadata is added in BigQuery:
Resources exceeded during query execution: Not enough resources for query planning - too many subqueries or query is too complex.
I see in the thread that the issue appears to have been resolved, but I and others later in the thread are still encountering this error after upgrading to version 0.16.0. This and this are two other examples of users who appear to be running into the same thing. If you have any suggestions for config changes we need to make, or would be up for discussing a potential fix, that would be greatly appreciated, as there are some wonderful features in the new version that we hope not to miss out on. Thank you!
Environment:
Elementary dbt package version: 1.6.0
dbt version you're using: 1.7.4
Data warehouse: BigQuery