
Out of Memory Issue in Semantic Segmentation #11

Open
code4indo opened this issue Apr 17, 2023 · 2 comments

Comments

@code4indo

Why do I constantly encounter out-of-memory errors when running semantic segmentation, even though I have two GPUs with 15 GB each? Is it possible to distribute the model workload across the GPUs in parallel?

@FingerRec
Collaborator

SAM itself is not heavy, but Semantic Segment Anything loads four large models, which consumes a lot of memory. For now, simply set --semantic_segment_device to 'CPU' to run. We are working on making this model lightweight.
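
As a sketch of the multi-GPU question above: spreading the pipeline's models over the available devices is possible in plain PyTorch. The snippet below is not Semantic Segment Anything's actual code; `DummyModel`, `pick_device`, and the device assignments are hypothetical stand-ins, assuming each model is an ordinary `nn.Module`.

```python
import torch
import torch.nn as nn

# Hypothetical stand-in for one of the four large semantic models;
# the real pipeline loads much heavier networks.
class DummyModel(nn.Module):
    def __init__(self):
        super().__init__()
        self.proj = nn.Linear(256, 256)

    def forward(self, x):
        return self.proj(x)

def pick_device(preferred: str) -> torch.device:
    """Use the requested CUDA device if it exists, else fall back to CPU."""
    if preferred.startswith("cuda"):
        idx = int(preferred.split(":")[1]) if ":" in preferred else 0
        if torch.cuda.is_available() and idx < torch.cuda.device_count():
            return torch.device(preferred)
    return torch.device("cpu")

# Spread the heavy models over both 15 GB GPUs, offloading one to CPU.
# Inputs must be moved to each model's own device before the forward pass.
devices = [pick_device(d) for d in ("cuda:0", "cuda:0", "cuda:1", "cpu")]
models = [DummyModel().to(dev) for dev in devices]

x = torch.randn(1, 256)
outputs = [m(x.to(dev)) for m, dev in zip(models, devices)]
```

This is per-model placement (model parallelism at the coarsest grain), so outputs produced on different devices must be moved to a common device before they are combined.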

@FingerRec
Collaborator

FingerRec commented Apr 17, 2023

Hi, we have implemented a light version.

It can run on an 8 GB GPU in less than 20 seconds.
