
Unable to allocate 161. GiB for array with panicked failed #668

Open
heyuqi1970 opened this issue Aug 1, 2024 · 2 comments
Labels
bug Something isn't working

Comments


heyuqi1970 commented Aug 1, 2024

What language are you using?

Python 3.8.10

What version are you using?

0.32

What database are you using?

oracle

What dataframe are you using?

Arrow

Can you describe your bug?

What are the steps to reproduce the behavior?

Database setup if the error only happens on specific data or data type
Example query / code
select * from table

What is the error?

numpy.core._exceptions.ArrayMemoryError: Unable to allocate 161. GiB for array with shape (102, 211638219) and data type object
thread '' panicked at 'Python API call failed'
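The reported size is consistent with the array shape: a NumPy object-dtype array stores one 8-byte pointer per element on a 64-bit platform, so a rough arithmetic check (the shape is from the error message above) gives:

```python
# Rough check of the requested allocation: an object-dtype NumPy array
# holds one 8-byte pointer per element on a 64-bit platform.
rows, cols = 211_638_219, 102
pointer_size = 8  # bytes per object pointer

total_bytes = rows * cols * pointer_size
gib = total_bytes / 2**30
print(f"{gib:.1f} GiB")  # close to the ~161 GiB in the error message
```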

@heyuqi1970 heyuqi1970 added the bug Something isn't working label Aug 1, 2024
@heyuqi1970
Author

My system has only 4 GB of memory.


rudyryk commented Sep 16, 2024

ConnectorX loads all of the data returned by the query, so if the table contains ~160 GB of data it tries to allocate that much memory.

It looks like there is no chunked-download capability, so what we can do is limit the result set in the query itself and download by ranges ourselves, e.g.:

select * from table where id > 1000000 and id < 2000000
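The range approach above can be sketched in Python. This is a minimal, hypothetical helper, not part of ConnectorX: it assumes the table has a numeric `id` column, and the table/column names and the commented-out `cx.read_sql` call are placeholders for your own driver call.

```python
# Minimal sketch of downloading by ID ranges, one chunk at a time.
# Assumes a numeric `id` column; names here are illustrative placeholders.

def range_queries(table, lo, hi, chunk_size, id_col="id"):
    """Yield half-open range queries covering ids in [lo, hi)."""
    for start in range(lo, hi, chunk_size):
        end = min(start + chunk_size, hi)
        yield (f"SELECT * FROM {table} "
               f"WHERE {id_col} >= {start} AND {id_col} < {end}")

# Example: split ids 0..3_000_000 into 1M-row chunks.
queries = list(range_queries("my_table", 0, 3_000_000, 1_000_000))
for q in queries:
    print(q)
    # chunk = cx.read_sql(conn, q, return_type="arrow")
    # ... process or write out each chunk before fetching the next
```

As far as I know, ConnectorX's `partition_on`/`partition_num` options only parallelize the download; the full result still ends up in memory at once, so they would not help here.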
