Is batch inference supported? #860
Unanswered
DonBraulio asked this question in Q&A
Replies: 0 comments
Hey guys, is there a way to leverage batch inference on the GPU?
From what I can tell from this line, it isn't supported yet. Can you confirm that? And if so, is it something you're planning to add at some point?
In my experience, batching can speed up predictions by a large margin (depending on the GPU), so I'm very interested in this feature since I'm running on videos. Maybe fast inference isn't this library's priority (I'd totally understand that); in that case, I could study the issue a bit more and try to submit a PR adding support. I'm not sure of the effort required, though, so I'd appreciate any comments on that as well.
Anyway, I just want to know the status of this and how much value you see in it. Any comment is appreciated.
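To illustrate what I mean by batch inference: instead of calling the model once per video frame, frames are stacked into batches so the GPU processes many at once. The sketch below is generic and not based on this library's API; `predict_batch` is a hypothetical stand-in for a model's forward pass (a real model would run on the GPU), using only NumPy so it is self-contained.

```python
import numpy as np

def predict_batch(frames: np.ndarray) -> np.ndarray:
    """Hypothetical stand-in for a model's batched forward pass.
    A real model (e.g. a PyTorch module) would run this on the GPU;
    here we just compute one score per frame for illustration."""
    return frames.mean(axis=(1, 2, 3))

def batched_inference(frames: np.ndarray, batch_size: int = 8) -> np.ndarray:
    """Group video frames into batches of `batch_size` and run the model
    once per batch, instead of once per frame."""
    outputs = []
    for start in range(0, len(frames), batch_size):
        batch = frames[start:start + batch_size]  # shape (B, H, W, C)
        outputs.append(predict_batch(batch))
    return np.concatenate(outputs)

# 20 dummy RGB frames of a video, 64x64 pixels each.
video = np.random.rand(20, 64, 64, 3).astype(np.float32)
preds = batched_inference(video, batch_size=8)
print(preds.shape)  # one prediction per frame: (20,)
```

The speedup comes from the GPU amortizing kernel-launch and memory-transfer overhead across the whole batch, which is why the gain depends so much on the hardware and batch size.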