test code only works for original label map of 21 elements #9

Open
UcefMountacer opened this issue Mar 26, 2022 · 1 comment

Comments

@UcefMountacer

Hi,

create_mobilenetv2_ssd_lite fails to load the pretrained checkpoint when I use a label map with 11 elements instead of the original 21:

---------------------------------------------------------------------------
RuntimeError                              Traceback (most recent call last)
[<ipython-input-18-eff74db7755d>](https://localhost:8080/#) in <module>()
     14 net = create_mobilenetv2_ssd_lite(11, is_test=1)
     15 
---> 16 net.load(model_path)
     17 
     18 predictor = create_mobilenetv2_ssd_lite_predictor(net, candidate_size=200, nms_method="soft")

1 frames
[/usr/local/lib/python3.7/dist-packages/torch/nn/modules/module.py](https://localhost:8080/#) in load_state_dict(self, state_dict, strict)
   1481         if len(error_msgs) > 0:
   1482             raise RuntimeError('Error(s) in loading state_dict for {}:\n\t{}'.format(
-> 1483                                self.__class__.__name__, "\n\t".join(error_msgs)))
   1484         return _IncompatibleKeys(missing_keys, unexpected_keys)
   1485 

RuntimeError: Error(s) in loading state_dict for SSD:
	size mismatch for classification_headers.0.3.weight: copying a param with shape torch.Size([126, 576, 1, 1]) from checkpoint, the shape in current model is torch.Size([66, 576, 1, 1]).
	size mismatch for classification_headers.0.3.bias: copying a param with shape torch.Size([126]) from checkpoint, the shape in current model is torch.Size([66]).
	size mismatch for classification_headers.1.3.weight: copying a param with shape torch.Size([126, 1280, 1, 1]) from checkpoint, the shape in current model is torch.Size([66, 1280, 1, 1]).
	size mismatch for classification_headers.1.3.bias: copying a param with shape torch.Size([126]) from checkpoint, the shape in current model is torch.Size([66]).
	size mismatch for classification_headers.2.3.weight: copying a param with shape torch.Size([126, 512, 1, 1]) from checkpoint, the shape in current model is torch.Size([66, 512, 1, 1]).
	size mismatch for classification_headers.2.3.bias: copying a param with shape torch.Size([126]) from checkpoint, the shape in current model is torch.Size([66]).
	size mismatch for classification_headers.3.3.weight: copying a param with shape torch.Size([126, 256, 1, 1]) from checkpoint, the shape in current model is torch.Size([66, 256, 1, 1]).
	size mismatch for classification_headers.3.3.bias: copying a param with shape torch.Size([126]) from checkpoint, the shape in current model is torch.Size([66]).
	size mismatch for classification_headers.4.3.weight: copying a param with shape torch.Size([126, 256, 1, 1]) from checkpoint, the shape in current model is torch.Size([66, 256, 1, 1]).
	size mismatch for classification_headers.4.3.bias: copying a param with shape torch.Size([126]) from checkpoint, the shape in current model is torch.Size([66]).
	size mismatch for classification_headers.5.weight: copying a param with shape torch.Size([126, 64, 1, 1]) from checkpoint, the shape in current model is torch.Size([66, 64, 1, 1]).
	size mismatch for classification_headers.5.bias: copying a param with shape torch.Size([126]) from checkpoint, the shape in current model is torch.Size([66]).

Do you have an idea how to correct this? Thanks.
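
For reference, the checkpoint was trained against the original 21-class label map, so each classification header has 6 anchors × 21 classes = 126 output channels, while a model built with 11 classes expects 6 × 11 = 66. load_state_dict raises on these shape mismatches even with strict=False, so one common workaround (plain PyTorch, not specific to this repo) is to copy only the tensors whose shapes match and fine-tune the classification heads on the new label map. A minimal sketch, assuming model_path points at a plain state_dict checkpoint:

import torch

net = create_mobilenetv2_ssd_lite(11, is_test=1)

# Load the 21-class checkpoint and keep only tensors whose shapes match the
# 11-class model (backbone, regression heads, etc.); the mismatched
# classification-head tensors are skipped and must be retrained.
checkpoint = torch.load(model_path, map_location="cpu")
model_state = net.state_dict()
compatible = {k: v for k, v in checkpoint.items()
              if k in model_state and v.shape == model_state[k].shape}
model_state.update(compatible)
net.load_state_dict(model_state)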

@malatang20001210

[screenshot of a code snippet; not preserved]
If you want to show two classes, you can try something like this.
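
The code in the screenshot is lost, so the following is only a guess at the kind of change it showed (hypothetical names throughout): keep the network at the checkpoint's 21 classes and filter the detections you display down to the classes you care about. This assumes the predictor exposes predict(image, top_k, prob_threshold) returning boxes, labels and probs, as in qfgaohao/pytorch-ssd, and that class_names holds the original 21-entry label map.

wanted = {"person", "car"}  # hypothetical: the two classes you want to show

boxes, labels, probs = predictor.predict(image, 10, 0.4)
for box, label, prob in zip(boxes, labels, probs):
    name = class_names[int(label)]
    if name in wanted:
        print(name, float(prob), box.tolist())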
