
Memory leak DNSMOS functional interface #2924

Open

augmented-albert opened this issue Jan 30, 2025 · 3 comments
Labels
bug / fix (Something isn't working), help wanted (Extra attention is needed)

Comments

@augmented-albert

augmented-albert commented Jan 30, 2025

🐛 Bug

CUDA memory is not released (a memory leak) when using the DNSMOS functional interface with onnxruntime-gpu to compute batched metrics on various standard speech enhancement test sets.

To Reproduce

import torch
from torchmetrics.functional.audio import deep_noise_suppression_mean_opinion_score as DNSMOS

# batch of 8 variable-length waveforms (3-10 s at 16 kHz), zero-padded to a common length
enhanced = torch.nn.utils.rnn.pad_sequence(
    [torch.randn(torch.randint(48000, 160000, (1,)).item()) for _ in range(8)],
    batch_first=True,
).to("cuda" if torch.cuda.is_available() else "cpu")

dnsmos_scores_1 = DNSMOS(preds=enhanced, fs=16_000, personalized=False)
dnsmos_scores_2 = DNSMOS(preds=enhanced, fs=16_000, personalized=False)
dnsmos_scores_3 = DNSMOS(preds=enhanced, fs=16_000, personalized=False)

Expected behavior

CUDA memory is expected to be cleared after use. Upon further inspection, an onnxruntime session is started but never terminated, which is what causes the leak.
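
One way to observe this (a minimal sketch, not part of the original report) is to watch device-wide GPU memory across repeated calls; torch.cuda.mem_get_info reports free/total memory for the whole device, so it also captures allocations made by onnxruntime outside PyTorch's caching allocator:

import torch
from torchmetrics.functional.audio import deep_noise_suppression_mean_opinion_score as DNSMOS

# dummy batch of 8 ten-second 16 kHz waveforms on the GPU
enhanced = torch.randn(8, 160_000, device="cuda")

for i in range(3):
    DNSMOS(preds=enhanced, fs=16_000, personalized=False)
    free, total = torch.cuda.mem_get_info()  # device-wide, includes onnxruntime allocations
    print(f"after call {i}: {(total - free) / 2**20:.0f} MiB in use on the GPU")

# If the onnxruntime session were torn down after each call, the in-use figure would
# drop back between calls; with the reported leak it does not.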

Environment

  • torchmetrics==1.6.1
  • torch==2.5.1
  • torchaudio==2.5.1
  • onnxruntime-gpu==1.20.1
  • Python version: 3.10.x
  • OS: Ubuntu
augmented-albert added the bug / fix and help wanted labels on Jan 30, 2025

Hi! Thanks for your contribution, great first issue!

@Borda
Member

Borda commented Feb 4, 2025

Hi @augmented-albert, could you please add which torchmetrics (TM) version you are using?

@augmented-albert
Author

> Hi @augmented-albert, could you please add which torchmetrics (TM) version you are using?

torchmetrics==1.6.1
