Hi, I'm trying to use your code to reproduce the experiments in your paper. I use the given command line and the pretrained float model, but it reports a size-mismatch error when loading the checkpoint:
=> loading checkpoint '/home/wenjing/model-best.pth.tar'
Traceback (most recent call last):
File "main.py", line 685, in
main()
File "main.py", line 201, in main
load_model(model, checkpoint)
File "main.py", line 121, in load_model
model.load_state_dict(new_state_dict, strict=False) #strict false in case the loaded doesn't have all variables like running mean
File "/home/kewenjing/anaconda3/envs/pytorch-0.4.1-py3/lib/python3.6/site-packages/torch/nn/modules/module.py", line 719, in load_state_dict
self.__class__.__name__, "\n\t".join(error_msgs)))
RuntimeError: Error(s) in loading state_dict for ResNet_imagenet:
size mismatch for layer1.0.conv2.weight: copying a param of torch.Size([64, 2, 3, 3]) from checkpoint, where the shape is torch.Size([64, 64, 3, 3]) in current model.
size mismatch for layer1.1.conv2.weight: copying a param of torch.Size([64, 2, 3, 3]) from checkpoint, where the shape is torch.Size([64, 64, 3, 3]) in current model.
size mismatch for layer1.2.conv2.weight: copying a param of torch.Size([64, 2, 3, 3]) from checkpoint, where the shape is torch.Size([64, 64, 3, 3]) in current model.
size mismatch for layer2.0.conv2.weight: copying a param of torch.Size([128, 4, 3, 3]) from checkpoint, where the shape is torch.Size([128, 128, 3, 3]) in current model.
size mismatch for layer2.1.conv2.weight: copying a param of torch.Size([128, 4, 3, 3]) from checkpoint, where the shape is torch.Size([128, 128, 3, 3]) in current model.
size mismatch for layer2.2.conv2.weight: copying a param of torch.Size([128, 4, 3, 3]) from checkpoint, where the shape is torch.Size([128, 128, 3, 3]) in current model.
size mismatch for layer2.3.conv2.weight: copying a param of torch.Size([128, 4, 3, 3]) from checkpoint, where the shape is torch.Size([128, 128, 3, 3]) in current model.
size mismatch for layer3.0.conv2.weight: copying a param of torch.Size([256, 8, 3, 3]) from checkpoint, where the shape is torch.Size([256, 256, 3, 3]) in current model.
size mismatch for layer3.1.conv2.weight: copying a param of torch.Size([256, 8, 3, 3]) from checkpoint, where the shape is torch.Size([256, 256, 3, 3]) in current model.
size mismatch for layer3.2.conv2.weight: copying a param of torch.Size([256, 8, 3, 3]) from checkpoint, where the shape is torch.Size([256, 256, 3, 3]) in current model.
size mismatch for layer3.3.conv2.weight: copying a param of torch.Size([256, 8, 3, 3]) from checkpoint, where the shape is torch.Size([256, 256, 3, 3]) in current model.
size mismatch for layer3.4.conv2.weight: copying a param of torch.Size([256, 8, 3, 3]) from checkpoint, where the shape is torch.Size([256, 256, 3, 3]) in current model.
size mismatch for layer3.5.conv2.weight: copying a param of torch.Size([256, 8, 3, 3]) from checkpoint, where the shape is torch.Size([256, 256, 3, 3]) in current model.
size mismatch for layer4.0.conv2.weight: copying a param of torch.Size([512, 16, 3, 3]) from checkpoint, where the shape is torch.Size([512, 512, 3, 3]) in current model.
size mismatch for layer4.1.conv2.weight: copying a param of torch.Size([512, 16, 3, 3]) from checkpoint, where the shape is torch.Size([512, 512, 3, 3]) in current model.
size mismatch for layer4.2.conv2.weight: copying a param of torch.Size([512, 16, 3, 3]) from checkpoint, where the shape is torch.Size([512, 512, 3, 3]) in current model.
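For context on the mismatch: every conv2 weight in the checkpoint has shape [C, C/32, 3, 3] (e.g. [64, 2, 3, 3]), which suggests grouped convolutions with groups=32, whereas the model built from the given command line uses dense convolutions ([64, 64, 3, 3]). A minimal sketch for confirming this by inspecting the checkpoint directly (the path is the one from the traceback above; the 'state_dict' key is an assumption about how the checkpoint dict is laid out, not something confirmed from the repo):

```python
import torch

# Load the checkpoint on CPU and pull out the weight dict.
ckpt = torch.load('/home/wenjing/model-best.pth.tar', map_location='cpu')
state_dict = ckpt.get('state_dict', ckpt) if isinstance(ckpt, dict) else ckpt

# Print the shape of every conv2 weight stored in the checkpoint.
for name, param in state_dict.items():
    if 'conv2.weight' in name:
        print(name, tuple(param.shape))

# Per the traceback, this prints e.g. layer1.0.conv2.weight (64, 2, 3, 3):
# in_channels/groups == 2 for a 64-channel layer, i.e. groups=32, while the
# freshly built model expects a dense (64, 64, 3, 3) conv. The architecture
# flags passed to main.py likely need to match the grouped configuration the
# checkpoint was trained with.
```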