Description
I noticed different behavior in decollate_batch between numpy and torch scalars: the function allows decollating batches containing torch scalars but not numpy scalars. This becomes an issue when trying to decollate images that carry numpy scalar metadata (for example, images loaded with LoadImaged).
To Reproduce
import numpy as np
import torch
from monai.data import decollate_batch
decollated = decollate_batch({"value": np.array(1)})
# TypeError: iteration over a 0-d array
# But this works
decollated = decollate_batch({"value": torch.tensor(1)})
In practice, I encountered this issue with scalars stored inside MetaTensor metadata:
from monai.data import MetaTensor
ct = MetaTensor(torch.randn(2, 1, 64, 64, 64), meta={"value": np.array(1)})
decollated = decollate_batch({"ct": ct})
# TypeError: iteration over a 0-d array
# But this works
ct = MetaTensor(torch.randn(2, 1, 64, 64, 64), meta={"ok": torch.tensor(1)})
decollated = decollate_batch({"ct": ct})
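One possible workaround until this is fixed (a sketch, assuming the metadata scalars are safe to convert) is to cast 0-d numpy metadata entries to torch scalars before decollating, since the torch path works:
import numpy as np
import torch
from monai.data import MetaTensor, decollate_batch
ct = MetaTensor(torch.randn(2, 1, 64, 64, 64), meta={"value": np.array(1)})
# Replace 0-d numpy metadata entries with torch scalars, which decollate fine.
for k, v in list(ct.meta.items()):
    if isinstance(v, np.ndarray) and v.ndim == 0:
        ct.meta[k] = torch.as_tensor(v)
decollated = decollate_batch({"ct": ct})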
Expected behavior
I would expect numpy and torch scalars to result in the same behavior, i.e. the batch should be decollatable in both cases.
Why this bug?
Because the special case for numpy arrays is not handled in decollate_batch (while the torch case is), the code fails here:
Line 550 in c3a317d
Numpy arrays are reported as "iterable" even when they are scalar values (with value.ndim == 0), so the call breaks when it actually tries to iterate over the 0-d array.
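To illustrate the root cause outside of MONAI (plain numpy/torch only): a 0-d numpy array looks iterable to an isinstance check but raises the exact TypeError above when iteration is attempted.
from collections.abc import Iterable
import numpy as np
import torch
scalar = np.array(1)
print(scalar.ndim)                   # 0
print(isinstance(scalar, Iterable))  # True: ndarray defines __iter__
try:
    iter(scalar)                     # but a 0-d array cannot actually be iterated
except TypeError as err:
    print(err)                       # "iteration over a 0-d array"
# torch scalar tensors are also 0-d and non-iterable, but decollate_batch
# special-cases them before trying to iterate.
print(torch.tensor(1).ndim)          # 0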
Solution / Fix proposition
In the code, only the torch case is handled:
Lines 631 to 632 in c3a317d
We could do the same for the numpy case (see #8470).
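For illustration only, a minimal sketch of the kind of guard the numpy case could get, mirroring the existing torch scalar handling. This is not the actual MONAI code and the helper name is hypothetical; it only shows the idea of returning 0-d values unchanged instead of iterating over them:
import numpy as np
import torch
def _decollate_scalar_guard(value):
    # Hypothetical helper: mirror the existing torch 0-d special case for numpy.
    if isinstance(value, torch.Tensor) and value.ndim == 0:
        return value  # torch scalar: already handled in decollate_batch
    if isinstance(value, np.ndarray) and value.ndim == 0:
        return value  # proposed: treat numpy scalars the same way
    return None  # fall through to the normal iteration path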
Environment
Python version: 3.11
Numpy version: 2.2.6
Pytorch version: 2.6.0+cu124
MONAI flags: HAS_EXT = False, USE_COMPILED = False, USE_META_DICT = False
MONAI rev id: c3a317d2bcb486199f40bda0d722a41e3869712a
No optional dependencies.