Hello, @liamsun2019!
I've run into the same problem recently. I am using the detectron2 tooling to convert CenterMask to ONNX, but that would require rewriting detectron2's underlying export code (the existing code is written for Mask R-CNN), which is difficult to implement.
Were you able to convert the model to ONNX with your method?
Hi Pals,
I tried to convert the trained .pth model to ONNX format, but encountered several issues. The code segment is as follows:
import torch

# Trainer and cfg come from the CenterMask/detectron2 training setup
model = Trainer.build_model(cfg)
state = torch.load(cfg.MODEL.WEIGHTS, map_location=lambda storage, loc: storage)
model.load_state_dict(state['model'])
model.eval()
model.cuda()
dummy_input = torch.randn(1, 3, 448, 448).to("cuda")
torch.onnx.export(model, dummy_input, "model.onnx", verbose=True, input_names=['image'], output_names=['pred'])
The config file used is centermask_lite_V_19_slim_dw_eSE_FPN_ms_4x.yaml.
a. The input to the network, whose meta-architecture is an R-CNN, is a list of dictionaries containing the input image data and some other attributes. There seems to be no way to pass such an input when exporting to ONNX. To work around this, I added a member function in rcnn.py (detectron2/modeling/meta_arch/rcnn.py) that accepts a plain tensor input.
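For reference, an alternative to editing rcnn.py is a thin wrapper module around the model. The sketch below is hypothetical (not the actual change I made); it assumes detectron2's GeneralizedRCNN API, whose inference() method takes a list of dicts with an "image" tensor:

```python
import torch


class ExportWrapper(torch.nn.Module):
    """Hypothetical wrapper: turns a plain image tensor into the
    list-of-dicts input that detectron2's GeneralizedRCNN expects,
    so torch.onnx.export can trace the model from a single tensor."""

    def __init__(self, model):
        super().__init__()
        self.model = model

    def forward(self, image):
        # image: (1, 3, H, W) -> detectron2-style batched inputs
        h, w = image.shape[-2], image.shape[-1]
        inputs = [{"image": image[0], "height": h, "width": w}]
        # Skip the non-traceable postprocessing step
        return self.model.inference(inputs, do_postprocess=False)
```

With such a wrapper, the export call would become torch.onnx.export(ExportWrapper(model), dummy_input, "model.onnx", ...).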
b. With the above change in place, the following error is raised during conversion:
RuntimeError: Failed to export an ONNX attribute 'onnx::Sub', since it's not constant, please try to make things (e.g., kernel size) static if possible
I cannot find any clues to this error by searching. Can anyone kindly help? Thanks.
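For what it's worth, this class of export error usually means some operator attribute that the exporter requires to be constant was instead a traced tensor, e.g. a kernel or padding size computed from the input at runtime. A hypothetical before/after sketch (not taken from the CenterMask code) of the pattern the error message hints at:

```python
import torch


def pad_amount_dynamic(x: torch.Tensor) -> torch.Tensor:
    # Problematic pattern: the result is a tensor that depends on the
    # input, so an exporter needing a constant attribute sees a traced
    # Sub node instead of a fixed value.
    return (torch.tensor(x.shape[-1], dtype=torch.int64) - 1) // 2


def pad_amount_static(size: int = 448) -> int:
    # Fix: a plain Python int, known at export time, is baked into the
    # exported graph as a constant.
    return (size - 1) // 2
```

In other words, the suggestion "make things static" means replacing shape-dependent arithmetic with fixed Python ints wherever the input size is known ahead of export (here, 448x448).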