Could you please add batch inference support? It would really help speed up large image processing and improve GPU utilization.
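For context, here is a minimal sketch of what batch inference means here: instead of calling the model once per image, images are grouped into fixed-size batches so per-call overhead is amortized and the GPU stays busier. The `model` callable and helper below are hypothetical illustrations, not the library's actual API.

```python
def batched_inference(images, model, batch_size=8):
    """Split `images` into fixed-size batches and run `model` once per batch.

    `model` is a hypothetical callable that takes a list of images and
    returns one prediction per image; batching amortizes per-call overhead
    (and, on a GPU, lets the framework process a batch in one forward pass).
    """
    predictions = []
    for start in range(0, len(images), batch_size):
        batch = images[start:start + batch_size]
        predictions.extend(model(batch))
    return predictions

# Dummy "model" for demonstration: predicts the pixel sum of each image.
def dummy_model(batch):
    return [sum(sum(row) for row in image) for image in batch]

# Ten tiny 4x4 single-channel "images" filled with a constant value.
images = [[[i] * 4 for _ in range(4)] for i in range(10)]
preds = batched_inference(images, dummy_model, batch_size=4)  # 3 model calls instead of 10
```

With `batch_size=4`, the ten images are processed in three model calls rather than ten, which is where the speedup comes from on real accelerators.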
Answered by fcakyon, Aug 23, 2025
There is an existing PR regarding this:
@leyestd There is no clear ETA yet, but we plan to first introduce optimizations for postprocessing, then work on multi-worker inference (which is practically similar to batch inference). cc: @onuralpszr
Answer selected by fcakyon