Not sure if we want to insert the data as a whole; it may run out of memory if the data is too large.

-----update-----

When inserting row by row, the time per insert increased dramatically after ~10k rows.

When inserting by chunk with chunk_size = 100, the time per chunk also increased dramatically (from ~3 seconds to 50+ seconds) after ~100 chunks (10k rows) — see the sketch after the timings below:
```
Iteration 100 took 2.7026 seconds
Iteration 101 took 2.7528 seconds
Iteration 102 took 2.8886 seconds
Iteration 103 took 2.7334 seconds
Iteration 104 took 2.5021 seconds
Iteration 105 took 2.6534 seconds
Iteration 106 took 3.8154 seconds
Iteration 107 took 3.4111 seconds
Iteration 108 took 4.0579 seconds
Iteration 109 took 49.1745 seconds
Iteration 110 took 109.2089 seconds
Iteration 111 took 74.9319 seconds
```
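For reference, a minimal sketch of the chunked-insert timing loop described above, assuming the backend's `insert` method and a pre-existing target table. The connection parameters, table name, and DataFrame are placeholders, not the original code:

```python
import time

import pandas as pd
import ibis

# Placeholder connection details; adjust for your Trino cluster.
con = ibis.trino.connect(
    user="user",
    host="localhost",
    port=8080,
    database="memory",
    schema="default",
)

# Synthetic stand-in for the real dataset (the original was much larger).
df = pd.DataFrame({"a": range(100_000), "b": range(100_000)})

chunk_size = 100
for i, start in enumerate(range(0, len(df), chunk_size)):
    chunk = df.iloc[start : start + chunk_size]
    t0 = time.perf_counter()
    con.insert("target_table", chunk)  # insert one chunk into the existing table
    print(f"Iteration {i} took {time.perf_counter() - t0:.4f} seconds")
```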
What happened?
Code to reproduce the error:
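The original snippet was not captured in this report; what follows is a hypothetical reconstruction based on the description, assuming `ibis.memtable` and `create_table` against Trino. All names and connection details are placeholders:

```python
import pandas as pd
import ibis

# Placeholder connection details.
con = ibis.trino.connect(
    user="user",
    host="localhost",
    port=8080,
    database="memory",
    schema="default",
)

# A memtable large enough to exceed the Trino query memory limit.
big = pd.DataFrame({"a": range(1_000_000), "b": ["x"] * 1_000_000})
t = ibis.memtable(big)

# Materializing the memtable pushes all of the data through a single
# query, which is where MEMORY_LIMIT_EXCEEDED is raised for large inputs.
con.create_table("repro_table", t)
```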
It throws an exception because of MEMORY_LIMIT_EXCEEDED. It is related to `_in_memory_table_exists`; I saw we recently changed its implementation in #10067. Smaller data runs OK.

I guess this could be the reason for the CI failures in #9908 and #9744.
What version of ibis are you using?
9.5.0
What backend(s) are you using, if any?
Trino
Relevant log output
Code of Conduct