Memory footprint optimization #18
Oof! Even after allocating 8GB, I can't build the database. At the end, when the progress bar shows:
I get:
And in
I don't know yet how much memory I will need, but our private cloud system is fairly limited on memory. We need to make sure our system admins are OK with this, or we need to optimize. Or, if the really large memory footprint is only a problem when initializing the database, we can pre-seed that file after generating it on a personal machine.
Woo! I succeeded with 12GB of memory and 2GB of swap. Heavy swapping was going on during the write :) I think I'd suggest 16GB to avoid swapping to disk.
The database update process also requires around 12GB. EDIT: The database update process fails (OOM killer 🔪) with 12GB. 😭
It peaks around 13.1 GB when creating the pickle database file. 16GB is cutting it close, but should last a couple of years at least. I haven't done the math.
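For reference, a peak figure like that 13.1 GB can be read off `ru_maxrss` from the standard library. A minimal sketch (Unix-only; the small pickled array below is just a stand-in for the real database build, not this project's code):

```python
import pickle
import resource

import numpy as np

def peak_rss_gb() -> float:
    # ru_maxrss is reported in kilobytes on Linux (bytes on macOS).
    return resource.getrusage(resource.RUSAGE_SELF).ru_maxrss / 1e6

# Stand-in workload; the real build allocates a far larger array.
data = np.random.rand(1_000, 1_000)
with open("db.pickle", "wb") as f:
    pickle.dump(data, f, protocol=pickle.HIGHEST_PROTOCOL)

print(f"peak RSS: {peak_rss_gb():.2f} GB")
```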
16GB isn't enough to run
Currently, a >6GB numpy array is allocated, and if you don't have enough free memory, you're in trouble ;)
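If it helps, one standard way around a single huge allocation is to back the array with a file via `np.memmap` and fill it chunk by chunk, so only the pages currently being written need to be resident. Just a sketch under assumptions: the size, file name, and fill logic below are illustrative, not this project's actual build step:

```python
import numpy as np

# Illustrative size only; the real array is reported as >6GB.
N = 1_000_000
CHUNK = 100_000

# mode="w+" creates the backing file; the OS flushes dirty pages to
# disk, so the whole array never has to sit in RAM at once.
arr = np.memmap("database.dat", dtype=np.float64, mode="w+", shape=(N,))

for start in range(0, N, CHUNK):
    stop = min(start + CHUNK, N)
    # Placeholder fill; the real build would compute each chunk here.
    arr[start:stop] = np.arange(start, stop, dtype=np.float64)

arr.flush()
del arr  # close the memmap

# Readers can map the file back without loading it all into memory.
ro = np.memmap("database.dat", dtype=np.float64, mode="r", shape=(N,))
print(ro[:5])
```

The trade-off is write throughput and disk space for the backing file, but it would let the build run within a much smaller RSS budget.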