transformers aren't generated #224
After following the advice in issue #210, the evaluation and prediction are correct. But I then tried to retrain the model from the first issue by overwriting the previous model, as explained in #160. The training runs fine, except that it finishes with the same error as previously shown.
How can I correct this? Best, Chris
Hi @Christophe-pere, and apologies for taking this long to answer (I was unwatched from this repo for some reason). The loading issue is some pickling problem connected to running on multi-GPU (not 100% sure, but likely). You can fix it by overriding how the model gets loaded in here:

```python
def fit(self, datagen, validation_datagen=None, meta_valid=None):
    self._initialize_model_weights()
    self.model = nn.DataParallel(self.model)
```

I hope this helps!
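For context on why multi-GPU checkpoints break on load: a model wrapped in `nn.DataParallel` saves its `state_dict` with every key prefixed by `module.`, so a bare (unwrapped) model refuses to load it. A minimal sketch of a key-normalizing helper (hypothetical name `strip_module_prefix`, not part of this repo) that makes such a checkpoint loadable either way:

```python
def strip_module_prefix(state_dict, prefix="module."):
    """Remove the DataParallel 'module.' prefix from state_dict keys.

    A checkpoint saved from an nn.DataParallel-wrapped model stores
    parameters under keys like 'module.conv1.weight', while a bare
    model expects 'conv1.weight'. Returns a new dict with the prefix
    stripped; keys without the prefix are passed through unchanged.
    """
    return {
        (key[len(prefix):] if key.startswith(prefix) else key): value
        for key, value in state_dict.items()
    }
```

Assuming a standard PyTorch checkpoint, something like `model.load_state_dict(strip_module_prefix(torch.load(path)))` would then work regardless of whether the run that produced it used one GPU or several.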
Hi there,
When I launch the command:

```
python3 main.py train --pipeline_name unet_weighted
```

I have this issue:
I tried the code on only one epoch to see how it works.
The code is running on a VM Ubuntu 18.04 with 2 GPUs.
How can I correct the code so that the transformer layers are generated?
Best,
Chris