Replies: 2 comments 7 replies
-
First, changing the `rec_batch_num` parameter is not recommended; if it is too large, it can hurt accuracy. Second, it is recommended to resize the image to a smaller size before passing it to the model. Third, a large part of the inference time depends on the amount of text in the image: if the text is dense, it will naturally take longer. You can try …
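To act on the resize tip above, one common approach is to cap the longest side of the image before running OCR. A minimal, dependency-free sketch of the dimension calculation (the 960 px cap and the helper name are assumptions for illustration, not RapidOCR defaults):

```python
def cap_longest_side(width: int, height: int, max_side: int = 960) -> tuple:
    """Return (new_width, new_height) so the longest side is at most
    `max_side`, preserving aspect ratio. The 960 cap is an assumed
    starting point -- tune it against your own accuracy requirements."""
    longest = max(width, height)
    if longest <= max_side:
        return width, height  # already small enough; no resize needed
    scale = max_side / longest
    return round(width * scale), round(height * scale)
```

The resulting dimensions can then be fed to whatever resize routine you already use (e.g. OpenCV or Pillow) before calling the OCR engine.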
-
Thanks @SWHL, I tried using …
-
Hi, I'm deploying a small project and wanted some general tips on how I can reduce inference time while ensuring the outputs stay accurate. I'm currently deploying my API in a CPU container instance. I convert my images to grayscale before passing them to the model and am using the default parameters at initialization. Some of the images are taking quite some time, so I'm looking for general tips on how to improve this (for example, should I increase `rec_batch_num` from its default of 6?). Please note that I'm using the RapidOCR onnxruntime library for this. Thanks!