Patch size cannot be changed to different size in UNETR2D (disc. #3746) #4461
SakaltrasNikolaos started this conversation in General
Replies: 2 comments · 2 replies
-
Hi @heyufan1995 , could you please share some best practices for this question? Thanks in advance.
-
Hi @SakaltrasNikolaos:
Hello everybody,
first of all I would like to thank you all for the amazing work and the overall support you provide. Keep up the good work!
I am testing the UNETR2D from discussion #3746. I ran the code as provided and it works smoothly, but when I try different patch sizes it raises an error, and I do not yet know why. It always happens in the last concatenation step: the dimensions do not match.
Patch sizes 32 and 64 both fail with the same error:
```
File "monai/networks/blocks/unetr_block.py", line 83, in UnetrUpBlock.forward
     80 def forward(self, inp, skip):
     81     # number of channels for skip should equals to out_channels
     82     out = self.transp_conv(inp)
---> 83     out = torch.cat((out, skip), dim=1)
     84     out = self.conv_block(out)
     85     return out
```
- Patch 64x64: `RuntimeError: Sizes of tensors must match except in dimension 2. Got 256 and 64 (The offending index is 0)`
- Patch 32x32: `RuntimeError: Sizes of tensors must match except in dimension 2. Got 256 and 128 (The offending index is 0)`
Is the code fixed to work only with this patch size of 16x16, or am I missing something?
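Tracing the spatial sizes by hand seems to reproduce the numbers in the errors: the ViT token grid is `img_size / patch_size` per side, and the decoder applies a fixed chain of four x2 transposed convolutions, so the final upsampled map only lines up with the raw-input skip when the patch size is 16. A rough sketch of that arithmetic (my own illustration, not MONAI code; `final_decoder_size` is a made-up helper):

```python
def final_decoder_size(img_size: int, patch_size: int) -> int:
    """Spatial side length after the decoder, assuming four fixed x2 upsampling stages."""
    token_grid = img_size // patch_size  # ViT tokens per side
    return token_grid * 2 ** 4           # four x2 transposed convolutions

for patch in (16, 32, 64):
    up = final_decoder_size(256, patch)
    skip = 256  # the first skip connection comes from the raw 256x256 input
    status = "ok" if up == skip else "mismatch"
    print(f"patch {patch}: upsampled {up} vs skip {skip} -> {status}")
```

For patch 32 this gives 128 and for patch 64 it gives 64, which are exactly the sizes in the two RuntimeErrors above.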
I include the code again here, in case there is something I am missing:
```python
# %%
from typing import Tuple, Union

import torch
import torch.nn as nn

from monai.networks.blocks import UnetrBasicBlock, UnetrPrUpBlock, UnetrUpBlock
from monai.networks.blocks.dynunet_block import UnetOutBlock
from monai.networks.nets import ViT


# %%
class UNETR2D(nn.Module):
    """
    UNETR based on: "Hatamizadeh et al.,
    UNETR: Transformers for 3D Medical Image Segmentation https://arxiv.org/abs/2103.10504"
    """

    # (__init__ and forward as posted in discussion #3746, omitted here)


# %%
model = UNETR2D(
    in_channels=1,  # 1 channel (grayscale); use 3 for R, G, B input
    out_channels=1,
    img_size=(256, 256),
    feature_size=16,
    hidden_size=768,
    mlp_dim=3072,
    num_heads=12,
    pos_embed="perceptron",
    norm_name="instance",
    res_block=True,
    dropout_rate=0.0,
).cuda()

# %%
print(model)

# %%
x = torch.rand((1, 1, 256, 256)).cuda()
y = model(x)
```
Thank you in advance for your time and assistance. I apologize if this topic has already been discussed elsewhere and I could not find it.
Regards,
Nikolaos