Good:
It is accurate.
Bad:
The inference time is more than 80 ms, which is too slow for real-time usage.
To make it work in real time, the image has to be resized to less than 200x200, which reduces accuracy.
So the only way to make it usable is to make inference faster.
Have you tried TensorRT, TVM, or PyTorch serving in C++?
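For context, this is the kind of C++ serving path I have in mind: trace the model to TorchScript and load it from libtorch so inference runs without Python overhead. The resnet50 placeholder and the 224x224 input size below are assumptions for illustration, not this repository's actual model or resolution.

```python
import torch
import torchvision

# Placeholder network; substitute the actual model from this repository.
model = torchvision.models.resnet50()
model.eval()

# Input size is an assumption; use the resolution the model expects.
example_input = torch.randn(1, 3, 224, 224)

# Trace to TorchScript so the model can be served from C++ with libtorch
# (torch::jit::load), avoiding Python overhead at inference time.
with torch.no_grad():
    traced = torch.jit.trace(model, example_input)
traced.save("model_traced.pt")
```

The saved `model_traced.pt` can then be loaded in a C++ process with `torch::jit::load("model_traced.pt")`, or fed into TensorRT / torch-tensorrt for further speedup.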