Extremely large size npz files during SuperPoint export_predictions
Hi developers,
I'm trying to train HF-Net. During distillation, the exported .npz files exceeded 100 GB within 20 minutes, and the process stopped because there was no space left on my device. My dataset is about 5 GB, consisting of 50,000 images from the Berkeley DeepDrive dataset.
The command that I'm using is the following:
Is this expected, or am I doing something wrong?
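For context, here is a rough back-of-the-envelope estimate of the export size (a sketch with hypothetical numbers: 1280x720 BDD frames, SuperPoint's dense descriptor map at 1/8 resolution with 256 channels, a full-resolution score map, and float32 .npz storage; the actual keys and shapes depend on the export config):

```python
import numpy as np

# Hypothetical shapes: BDD frames are 1280x720; SuperPoint emits a dense
# descriptor map at 1/8 resolution with 256 channels plus a full-resolution
# score map. Both are assumed to be stored as float32 in the .npz files.
h, w = 720, 1280
desc_elems = (h // 8) * (w // 8) * 256   # local_descriptor_map
score_elems = h * w                      # dense_scores
bytes_per_image = (desc_elems + score_elems) * np.float32().itemsize

n_images = 50_000
print(f"~{bytes_per_image / 1e6:.1f} MB per image")          # ~18.4 MB
print(f"~{bytes_per_image * n_images / 1e9:.0f} GB total")   # ~922 GB
```

If those assumptions are anywhere close, hitting 100 GB after 20 minutes would be on track for roughly a terabyte of uncompressed dense predictions, which is why I'm asking whether this is the intended behavior.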