GPU is not used during prediction? #908
Replies: 6 comments 1 reply
-
Can you send your nvidia-smi output? How much system RAM do you have? Have you studied the console output? I'm asking because a fallback from GPU to CPU, and the reason for it, is often described in the output. Do you use get_sliced_prediction or (batch) predict? (https://github.com/obss/sahi/blob/main/demo/inference_for_yolov8.ipynb)
-
Hi @jokober I used get_sliced_prediction (not batch). My PC has 62 GiB of total RAM. What does it mean to study the output? Currently SAHI only prints 'Performing prediction on 216 number of slices.' during processing; how can I get more info? A snapshot immediately after the Python script was killed: (screenshot) A snapshot of nvidia-smi while the script was running: (screenshot)
-
Btw I found the setting merge_buffer_length = 100, which can avoid the OOM. But the prediction is still very slow.
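For context on why memory grows here: with a large image, sliced prediction accumulates per-slice detections before merging them, and merge_buffer_length caps how many are held before an early merge. The helper below is a rough, illustrative slice-count estimate (my own sketch, not SAHI's exact slicing algorithm), assuming 1024x1024 slices with 0.2 overlap on the poster's 8192x5460 image:

```python
import math


def slice_grid(image_w: int, image_h: int,
               slice_w: int = 1024, slice_h: int = 1024,
               overlap: float = 0.2) -> tuple[int, int]:
    """Rough count of slices per axis: each slice advances by
    slice_size * (1 - overlap). Illustrative only; SAHI's internal
    slicing may differ at the image borders."""
    step_w = int(slice_w * (1 - overlap))
    step_h = int(slice_h * (1 - overlap))
    cols = math.ceil(max(image_w - slice_w, 0) / step_w) + 1
    rows = math.ceil(max(image_h - slice_h, 0) / step_h) + 1
    return cols, rows


cols, rows = slice_grid(8192, 5460)
print(cols, rows, cols * rows)  # -> 10 7 70 under these assumed parameters
```

Higher overlap ratios or smaller slices multiply this count quickly, which is why both runtime and peak memory blow up on large images.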
-
Using batch prediction (the predict() function) will likely improve speed, as it will utilize your GPU more.
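A minimal sketch of what that suggestion looks like, assuming the sahi.predict.predict entry point; the weights path, source directory, and slice sizes are placeholders, and exact keyword support may vary by SAHI version:

```python
# Configuration for SAHI's batch predict() over a directory of images.
# All paths below are placeholders, not the poster's actual values.
predict_kwargs = dict(
    model_type="yolov8",
    model_path="yolov8n-seg.pt",      # placeholder weights path
    model_device="cuda:0",            # request GPU inference explicitly
    model_confidence_threshold=0.4,
    source="images/",                 # placeholder directory of images
    slice_height=1024,
    slice_width=1024,
    overlap_height_ratio=0.2,
    overlap_width_ratio=0.2,
)

if __name__ == "__main__":
    # Import inside the guard so the config above can be inspected
    # without SAHI installed.
    from sahi.predict import predict

    predict(**predict_kwargs)
```

Batching slices through the model keeps the GPU fed instead of running one slice at a time, which is where the per-image loop loses most of its throughput.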
-
@patcharees hi, I encountered the same problem as you did. Did you manage to use the GPU?
-
@pkcktkksh98 sorry for the late reply. Unfortunately I still cannot manage to use the GPU. You? I tried batch prediction (the predict() function) as suggested above, but it did not help. I monitored the GPU usage during prediction; it was under 10% the whole time.
-
Hi,
I am using SAHI with YOLOv8 segmentation (#882). I created Yolov8DetectionModel with GPU as:
But when I run my prediction script, I notice from nvidia-smi that GPU utilization is very low (< 10%).
My problem is that when I use a high overlap, the number of slices is high, and my script, which loops through all images to predict one image at a time, is very slow. In addition, it is killed by out-of-memory after it predicts a few images.
My images are large: 8192 x 5460.
Any suggestions to improve speed and to avoid OOM?
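The model-creation code above was attached as an image and is not shown here. For readers landing on this thread, a hedged sketch of a GPU setup with SAHI's AutoDetectionModel factory plus get_sliced_prediction, combining the merge_buffer_length workaround mentioned later in the thread; weights path, input image, and slice sizes are placeholders:

```python
# Sketch of explicit GPU model setup for SAHI + YOLOv8 segmentation.
# All paths and numbers are illustrative placeholders.
model_config = dict(
    model_type="yolov8",
    model_path="yolov8n-seg.pt",      # placeholder weights
    confidence_threshold=0.4,
    device="cuda:0",                  # pin the model to the GPU
)

if __name__ == "__main__":
    # Imports guarded so the config can be read without SAHI installed.
    from sahi import AutoDetectionModel
    from sahi.predict import get_sliced_prediction

    detection_model = AutoDetectionModel.from_pretrained(**model_config)
    result = get_sliced_prediction(
        "large_image.jpg",            # placeholder 8192x5460 input
        detection_model,
        slice_height=1024,
        slice_width=1024,
        overlap_height_ratio=0.2,
        overlap_width_ratio=0.2,
        merge_buffer_length=100,      # merge early to bound peak memory
    )
    print(len(result.object_prediction_list))
```

If GPU utilization still stays under 10% with device="cuda:0" set, the bottleneck is likely the one-slice-at-a-time loop rather than the device placement, which is what the batch predict() suggestion in this thread addresses.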