I’m working with very large .obj files (up to 10–20 GB) and trying to import them with meshio.read("myfile.obj") in Python. Even after disabling UVs/normals and reducing the file size somewhat (from 20 GB to 10 GB), I still get a MemoryError because everything is loaded into RAM at once.
Is there any built-in functionality in meshio to partially (chunk-wise) read .obj files or stream them, rather than loading the entire mesh into memory at once? Or is that out of scope for this library?
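As far as I know, meshio does not expose a streaming reader. But since OBJ is a plain line-based text format, you can iterate over it yourself with constant memory before deciding what to keep. Here is a minimal, hedged sketch (not a meshio API — `iter_obj` is a name I made up) that yields vertices and faces one line at a time; the demo file at the bottom is just for illustration:

```python
import os
import tempfile

def iter_obj(path):
    """Stream an OBJ file, yielding ('v', (x, y, z)) or ('f', (i, j, ...)) per line.

    Only plain 'v' and 'f' records are handled; vn/vt lines are skipped.
    Negative (relative) face indices are not supported in this sketch.
    """
    with open(path) as fh:
        for line in fh:
            parts = line.split()
            if not parts:
                continue
            if parts[0] == "v":
                yield "v", tuple(float(p) for p in parts[1:4])
            elif parts[0] == "f":
                # face tokens may be "i", "i/t", or "i/t/n"; keep the vertex index only
                yield "f", tuple(int(p.split("/")[0]) for p in parts[1:])

# tiny demo file, stand-in for the multi-GB mesh
demo = os.path.join(tempfile.mkdtemp(), "demo.obj")
with open(demo, "w") as fh:
    fh.write("v 0 0 0\nv 1 0 0\nv 0 1 0\nf 1 2 3\n")

n_v = n_f = 0
for kind, data in iter_obj(demo):
    if kind == "v":
        n_v += 1
    else:
        n_f += 1
print(n_v, n_f)  # → 3 1
```

Peak memory here is one line of text, so a 20 GB file is no problem; the trade-off is that you get raw arrays, not a `meshio.Mesh` object, until you assemble one yourself from the pieces you actually need.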
If not, do you have any suggestions or recommended workarounds for handling extremely large OBJ files — e.g., any approach to reduce memory usage or break the file into smaller pieces while still using meshio?
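One workaround along those lines: split the file into smaller OBJ pieces that `meshio.read` can handle one at a time. The sketch below (assumptions: `split_obj`, `faces_per_piece`, and the output file names are all mine, and relative/negative face indices and vn/vt data are ignored) makes two streaming passes — pass 1 spills the vertices into a disk-backed NumPy memmap so they never sit in RAM, pass 2 streams the faces in batches and writes each batch as a self-contained piece with remapped 1-based indices:

```python
import os
import tempfile
import numpy as np

def split_obj(path, out_prefix, faces_per_piece=2):
    # pass 1: count vertices, then stream them into a disk-backed memmap
    with open(path) as fh:
        n_v = sum(1 for line in fh if line.startswith("v "))
    verts = np.memmap(out_prefix + "_verts.f64", dtype="f8",
                      mode="w+", shape=(n_v, 3))
    i = 0
    with open(path) as fh:
        for line in fh:
            if line.startswith("v "):
                verts[i] = [float(t) for t in line.split()[1:4]]
                i += 1
    verts.flush()

    # pass 2: stream faces, flushing a piece every `faces_per_piece` faces
    piece, batch = 0, []

    def write_piece():
        nonlocal piece
        used = sorted({idx for f in batch for idx in f})
        remap = {old: new + 1 for new, old in enumerate(used)}  # OBJ is 1-based
        with open(f"{out_prefix}_{piece}.obj", "w") as out:
            for old in used:
                out.write("v %g %g %g\n" % tuple(verts[old - 1]))
            for f in batch:
                out.write("f " + " ".join(str(remap[idx]) for idx in f) + "\n")
        piece += 1

    with open(path) as fh:
        for line in fh:
            if line.startswith("f "):
                batch.append([int(t.split("/")[0]) for t in line.split()[1:]])
                if len(batch) == faces_per_piece:
                    write_piece()
                    batch = []
    if batch:
        write_piece()
    return piece

# demo: 4 vertices, 3 faces, 2 faces per piece → 2 piece files
d = tempfile.mkdtemp()
src = os.path.join(d, "big.obj")
with open(src, "w") as fh:
    fh.write("v 0 0 0\nv 1 0 0\nv 0 1 0\nv 1 1 0\n"
             "f 1 2 3\nf 2 4 3\nf 1 3 4\n")
pieces = split_obj(src, os.path.join(d, "piece"))
print(pieces)  # → 2
```

Each `piece_*.obj` is a valid standalone OBJ, so you can feed them to `meshio.read` individually; in practice you would raise `faces_per_piece` to something in the millions so each piece fits comfortably in RAM. Note the pieces duplicate shared boundary vertices, so stitching them back into one mesh requires deduplication.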
Thanks in advance for your help!