JetPack 4.5 / Triton 2.10 - DeepStream 5.1 on Jetson Xavier NX #16
This repo tracks only the FasterRCNN and CenterFace models for the blog https://developer.nvidia.com/blog/deploying-models-from-tensorflow-model-zoo-using-deepstream-and-triton-inference-server/. Please create a thread on the DeepStream forum (https://forums.developer.nvidia.com/c/accelerated-computing/intelligent-video-analytics/deepstream-sdk/15) listing the hardware (Jetson Xavier) and software (container vs. .deb install of 5.1, JetPack version, model) you're using. At a glance, the log you've provided doesn't seem to have anything to do with ONNX. When creating the forum thread, provide a .tar with the model (if it's small enough; otherwise a cloud-hosted link) and your config file, along with the errors you've observed.
I just ran one of the samples in the DeepStream 5.1 container, not my own model. Those samples worked fine with Triton 2.5, but when I tried JetPack 4.5 (it has Triton 2.10 inside), all the samples failed with the same error above.
I see. It looks like the model is a TF1 model. So either the config file, which is most probably at or near $DEEPSTREAM_DIR/samples/trtis_model_repo/<model_name>/config.pbtxt, differs from the Triton 2.5 one and is incorrect, or Triton 2.10 is unable to infer the input/output dimensions/layer names while 2.5 was able to. I'm on x86, so I can't repro this on a Jetson, but I can try running the 5.1 container and this model on x86 to see if I get something similar.
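If the config is the culprit, one workaround is to declare the inputs/outputs explicitly so Triton doesn't have to infer them. A minimal sketch of such a config.pbtxt is below; the model name, tensor names, data types, and dims are placeholders, not the actual values from the DeepStream sample:

```
# Sketch of an explicit config.pbtxt for a TF SavedModel.
# All names and dimensions here are hypothetical placeholders.
name: "my_tf1_model"
platform: "tensorflow_savedmodel"
max_batch_size: 1
input [
  {
    name: "image_tensor"        # placeholder input tensor name
    data_type: TYPE_UINT8
    dims: [ 600, 1024, 3 ]
  }
]
output [
  {
    name: "detection_boxes"     # placeholder output tensor name
    data_type: TYPE_FP32
    dims: [ 100, 4 ]
  }
]
```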
This is the DeepStream image I used: nvcr.io/nvidia/deepstream-l4t:5.1-21.02-samples (DeepStream 5.1 and Triton 2.5). Could you please try it? Thank you so much.
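For reference, the usual way to start that container on a Jetson looks something like the following (the display-forwarding flags are an assumption about the setup and are only needed if the sample pipelines render to a screen):

```bash
# Run the DeepStream 5.1 l4t samples container on a Jetson.
# --runtime nvidia exposes the GPU inside the container.
sudo docker run -it --rm --net=host --runtime nvidia \
    -e DISPLAY=$DISPLAY \
    -v /tmp/.X11-unix/:/tmp/.X11-unix \
    nvcr.io/nvidia/deepstream-l4t:5.1-21.02-samples
```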
Triton 2.10 supports ONNX, but I still got an error when loading the model.
Release link: https://github.com/triton-inference-server/server/releases
The error is: `Input or output layers are empty`
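One quick sanity check for that error is to confirm the ONNX file actually declares its input/output tensors. A sketch using the onnx Python package ("model.onnx" is a placeholder path for whichever model fails to load):

```python
# Sketch: list the inputs/outputs declared in an ONNX graph.
# "model.onnx" is a placeholder path for the model being loaded.
import onnx

model = onnx.load("model.onnx")

# Some exporters also list initializers (weights) under graph.input,
# so filter them out to see only the true runtime inputs.
init_names = {init.name for init in model.graph.initializer}
inputs = [i.name for i in model.graph.input if i.name not in init_names]
outputs = [o.name for o in model.graph.output]

print("inputs:", inputs)
print("outputs:", outputs)
```

If both lists print empty, the export itself is broken and no Triton version will be able to infer the layer names.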