Can I do multiple concurrent inference tasks when the application just loads one model? #2725
Unanswered · MikeBai523 asked this question in Q&A · Replies: 0
Hi,

In my app, I want to load a single model with `init_recognizer()`, then run `inference_recognizer()` from multiple threads to process mp4 files concurrently. Is this possible, or put another way, is it supported by default?

Thanks in advance.
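One common pattern for this kind of setup, whatever the answer for MMAction2 specifically turns out to be, is to share one model object across a thread pool and serialize the forward passes with a lock, since a PyTorch model is not guaranteed to be thread-safe for concurrent calls. The sketch below is a minimal illustration of that pattern only: `fake_inference` and the `model` object are hypothetical stand-ins for `init_recognizer()` / `inference_recognizer()`, not the real MMAction2 API.

```python
from concurrent.futures import ThreadPoolExecutor
from threading import Lock

# Hypothetical stand-ins: in a real app, `model` would come from
# init_recognizer(config, checkpoint) and `fake_inference` would be
# inference_recognizer(model, video_path).
model = object()

def fake_inference(model, video_path):
    return f"result for {video_path}"

# One lock guards the shared model; if the model is known to be
# thread-safe (or each thread gets its own copy), the lock can go.
model_lock = Lock()

def infer_one(video_path):
    with model_lock:
        return fake_inference(model, video_path)

videos = ["a.mp4", "b.mp4", "c.mp4"]
with ThreadPoolExecutor(max_workers=4) as pool:
    results = list(pool.map(infer_one, videos))

print(results)
```

Note that with the lock held for every call, the GPU work itself is effectively sequential; the threads mainly overlap video decoding and I/O. For true parallel inference, loading one model per worker (or batching inputs into a single call) is the usual alternative.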