Split not working when submodels folder is bind-mounted on a Docker volume #1762
Comments
Thanks for the report, but it looks like you didn't include a copy of your dataset for us to reproduce this issue? Please make sure to follow our issue guidelines 🙏 p.s. I'm just an automated script, not a human being.
Is this perchance on a Windows machine or a disk type that doesn't support symlinks?
NTFS does support symlinks, though I am not sure if we call out to mklink to generate them 🤔 All bets are off on NFS/SMB drives I think, though.
Docker is running in an Ubuntu 24.04 LTS Hyper-V VM. Extract from /etc/fstab:
/dev/disk/by-uuid/4d8b4b00-9ed0-4570-8422-9dd5848ddc48 /var/lib/docker ext4 defaults 1 2
The first one works, the second one does not.
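If it helps to narrow this down, one quick way to check whether the bind-mounted path itself accepts symlinks is a throwaway container over the same host path. This is only an illustrative sketch: the host path is taken from the compose description below, while the service name and the mount point inside the container are made up.

```yaml
# Hypothetical one-off check: can symlinks be created on the bind-mounted host path?
# Only /data/Example/submodels comes from this report; everything else is illustrative.
services:
  symlink-check:
    image: alpine
    volumes:
      - /data/Example/submodels:/mnt/check
    command: sh -c "ln -s /etc/hostname /mnt/check/_linktest && echo 'symlinks OK' && rm /mnt/check/_linktest"
```

Running it with `docker compose run --rm symlink-check` should print "symlinks OK" if the mount supports symlink creation.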
How did you install ODM? (Docker, installer, natively, ...)?
Docker - Image: opendronemap/odm:latest
What is the problem?
I have a large data set of about 3000 images.
Our server does not have enough RAM, so I wanted to split it into submodels of 1000 images each using --split.
According to the log, the split step reports that it will split the dataset into 0 submodels.
I start the Docker container via a Docker Compose file (a sketch of the relevant parts is shown below).
If I remove the last volume “- /data/Example/submodels:/code/submodels”, the process works.
So the problem only appears when the submodels folder is bind-mounted to a volume.
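For reference, a minimal sketch of the kind of Compose setup described above. Only the image name and the submodels volume line are taken from this report; the project layout, the other volume lines, and the command are assumptions.

```yaml
# Sketch only: the image and the submodels bind mount are from the report,
# the remaining paths and the command are assumed for illustration.
services:
  odm:
    image: opendronemap/odm:latest
    volumes:
      - /data/Example/images:/code/images                  # assumed input images location
      - /data/Example/odm_orthophoto:/code/odm_orthophoto  # assumed output location
      - /data/Example/submodels:/code/submodels            # removing this line makes the run work
    command: ["--split", "1000"]                           # --split is from the report; value/format assumed
```

The split stage populates /code/submodels with one project directory per submodel (presumably including the symlink creation discussed above), so the filesystem backing that bind mount is exactly what gets exercised here.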
What should be the expected behavior? If this is a feature request, please describe in detail the changes you think should be made to the code, citing files and lines where changes should be made, if possible.
Creating submodels in a bind-mounted Docker volume should work.
How can we reproduce this? What steps did you do to trigger the problem? If this is an issue with processing a dataset, YOU MUST include a copy of your dataset AND task output log, uploaded on Google Drive or Dropbox (otherwise we cannot reproduce this).
It should be possible to reproduce this with any dataset.
Complete logfile: logfile.log