
MemoryError with extremely large .obj files — any partial/streaming option in meshio? #1504

Open
Unclaim3d opened this issue Jan 2, 2025 · 0 comments


Hi everyone,

I’m working with very large .obj files (up to 10–20 GB) and trying to import them with meshio.read("myfile.obj") in Python. Even after disabling UVs/normals and reducing the file size somewhat (from 20 GB to 10 GB), I still get a MemoryError because everything is loaded into RAM at once.

Is there any built-in functionality in meshio to partially (chunk-wise) read .obj files or stream them, rather than loading the entire mesh into memory at once? Or is that out of scope for this library?

If not, do you have any suggestions or recommended workarounds for handling extremely large OBJ files — e.g., any approach to reduce memory usage or break the file into smaller pieces while still using meshio?
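For context, the kind of workaround I've been experimenting with is a plain line-by-line pass over the OBJ that keeps only `v` and `f` records and writes the faces out in smaller, self-contained OBJ chunks that could then be read one at a time. This is just a rough sketch of the idea, not anything from meshio: the function name `split_obj`, the `faces_per_chunk` value, and the output naming are all placeholders I made up, it ignores UVs/normals/materials and negative (relative) indices, and it still holds the vertex lines in RAM as text, so it's only a partial answer to the memory problem.

```python
# Hypothetical sketch: split a huge OBJ into smaller, standalone OBJ files by
# streaming it line by line. Only "v" and "f" records are kept (no UVs,
# normals, or materials); negative/relative indices are not handled.
# All names here (split_obj, faces_per_chunk, the output pattern) are made up.

def split_obj(path, out_pattern="chunk_{:04d}.obj", faces_per_chunk=500_000):
    vertices = []      # all vertex lines ("v x y z\n"); still kept in memory as text
    chunk_faces = []   # faces collected for the current chunk, as tuples of vertex indices
    chunk_id = 0

    def flush():
        nonlocal chunk_id, chunk_faces
        if not chunk_faces:
            return
        # Keep only the vertices this chunk actually references and remap the indices
        used = sorted({idx for face in chunk_faces for idx in face})
        remap = {old: new for new, old in enumerate(used, start=1)}
        with open(out_pattern.format(chunk_id), "w") as out:
            for old in used:
                out.write(vertices[old - 1])  # OBJ indices are 1-based
            for face in chunk_faces:
                out.write("f " + " ".join(str(remap[i]) for i in face) + "\n")
        chunk_id += 1
        chunk_faces = []

    with open(path) as f:
        for line in f:
            if line.startswith("v "):
                vertices.append(line)
            elif line.startswith("f "):
                # Keep only the vertex index of each "v/vt/vn" token
                idxs = tuple(int(tok.split("/")[0]) for tok in line.split()[1:])
                chunk_faces.append(idxs)
                if len(chunk_faces) >= faces_per_chunk:
                    flush()
    flush()


if __name__ == "__main__":
    split_obj("myfile.obj")
```

The idea would be that each resulting chunk is a valid OBJ on its own, so it could be loaded with a normal `meshio.read(...)` call and processed piece by piece. But if meshio already has (or plans) a proper streaming/partial reader, I'd much rather use that.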

Thanks in advance for your help!
