trtexec --onnx=/home/andolab/mmdet_env/onnx_output/end2end.onnx --saveEngine=/home/andolab/mmdet_env/engine.trt --plugins=/usr/lib/aarch64-linux-gnu/libnvinfer_plugin.so --fp16
Using the command above, I am trying to convert a Mask R-CNN (ResNet-101) model trained and run for inference with MMDetection into a TensorRT engine to speed it up, but I get the error below.
The error says that a plugin cannot be found.
The TensorRT plugin library was built and placed at the following paths (a quick way to check what it actually registers is sketched right after the list):
/usr/lib/aarch64-linux-gnu/libnvinfer_plugin.so
/usr/lib/aarch64-linux-gnu/libnvinfer_plugin.so.10
/usr/lib/aarch64-linux-gnu/libnvinfer_plugin.so.10.3.0
/usr/lib/lib/libnvinfer_plugin.so
/usr/lib/lib/libnvinfer_plugin.so.10
/usr/lib/lib/libnvinfer_plugin.so.10.3.0
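For reference, here is a minimal diagnostic sketch, not a confirmed fix, of how one could list the plugin creators a given .so actually registers with TensorRT, assuming the `tensorrt` Python bindings are importable (they ship with JetPack). The library path is simply the one above; for the ONNX parse to succeed, something would need to register `MMCVMultiLevelRoiAlign` under the `mmdeploy` namespace.

```python
import ctypes
import tensorrt as trt

# Force-load the plugin library the trtexec command points at
# (assumption: this is the library that should provide the custom op).
ctypes.CDLL("/usr/lib/aarch64-linux-gnu/libnvinfer_plugin.so")

logger = trt.Logger(trt.Logger.INFO)
trt.init_libnvinfer_plugins(logger, "")  # register built-in TRT plugins

# List every plugin creator TensorRT currently knows about; the op from
# the error would need to appear here as MMCVMultiLevelRoiAlign with
# namespace "mmdeploy".
registry = trt.get_plugin_registry()
for creator in registry.plugin_creator_list:
    print(creator.name, creator.plugin_version, creator.plugin_namespace)
```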
The development environment is as follows:
Device: Jetson Orin NX
OS: Ubuntu 22.04
Python Version: 3.10
CUDA Version: 12.6
cuDNN Version: 9.6
TensorRT Version: 10.3.0.30
MMDeploy Version: 1.3.1
MMDetection Version: 3.3.0
PyTorch Version: 2.3.0
I cannot convert the model.onnx exported to ONNX from the Mask R-CNN model trained with MMDetection into a TensorRT engine. I would appreciate advice on how to resolve this error, or on any other way to convert the model to a TensorRT engine. The error log follows:
[01/31/2025-13:22:51] [E] [TRT] ModelImporter.cpp:948: While parsing node number 1047 [MMCVMultiLevelRoiAlign -> "/mask_roi_extractor/MMCVMultiLevelRoiAlign_output_0"]:
[01/31/2025-13:22:51] [E] [TRT] ModelImporter.cpp:950: --- Begin node ---
input: "/Reshape_61_output_0"
input: "/neck/fpn_convs.0/conv/Conv_output_0"
input: "/neck/fpn_convs.1/conv/Conv_output_0"
input: "/neck/fpn_convs.2/conv/Conv_output_0"
input: "/neck/fpn_convs.3/conv/Conv_output_0"
output: "/mask_roi_extractor/MMCVMultiLevelRoiAlign_output_0"
name: "/mask_roi_extractor/MMCVMultiLevelRoiAlign"
op_type: "MMCVMultiLevelRoiAlign"
attribute {
name: "aligned"
i: 1
type: INT
}
attribute {
name: "featmap_strides"
floats: 4
floats: 8
floats: 16
floats: 32
type: FLOATS
}
attribute {
name: "finest_scale"
i: 56
type: INT
}
attribute {
name: "output_height"
i: 14
type: INT
}
attribute {
name: "output_width"
i: 14
type: INT
}
attribute {
name: "pool_mode"
i: 1
type: INT
}
attribute {
name: "roi_scale_factor"
f: 1
type: FLOAT
}
attribute {
name: "sampling_ratio"
i: 0
type: INT
}
domain: "mmdeploy"
[01/31/2025-13:22:51] [E] [TRT] ModelImporter.cpp:951: --- End node ---
[01/31/2025-13:22:51] [E] [TRT] ModelImporter.cpp:953: ERROR: onnxOpCheckers.cpp:780 In function checkFallbackPluginImporter:
[6] creator && "Plugin not found, are the plugin name, version, and namespace correct?"
[01/31/2025-13:22:51] [E] Failed to parse onnx file
[01/31/2025-13:22:51] [I] Finished parsing network model. Parse time: 0.347841
[01/31/2025-13:22:51] [E] Parsing model failed
[01/31/2025-13:22:51] [E] Failed to create engine from model or file.
[01/31/2025-13:22:51] [E] Engine set up failed
&&&& FAILED TensorRT.trtexec [TensorRT v100300] # trtexec --onnx=/home/andolab/mmdet_env/onnx_output/end2end.onnx --saveEngine=/home/andolab/mmdet_env/engine.trt --plugins=/usr/lib/aarch64-linux-gnu/libnvinfer_plugin.so --fp16