
RuntimeError when "brats_mri_segmentation_v0.2.1" from monaibundle is used. #1051

Open
PranayBolloju opened this issue Oct 7, 2022 · 61 comments


@PranayBolloju

Describe the bug
MONAI Label server is giving the following error when "brats_mri_segmentation_v0.2.1" is used for brain tumor segmentation.

RuntimeError: Given groups=1, weight of size [16, 4, 3, 3, 3], expected input[1, 240, 240, 240, 160] to have 4 channels, but got 240 channels instead

To Reproduce
Steps to reproduce the behavior:

  1. pip install monailabel
  2. monailabel apps --download --name monaibundle --output apps
  3. monailabel datasets --download --name Task01_BrainTumour --output datasets
  4. monailabel start_server --app apps/monaibundle --studies datasets/Task01_BrainTumour/imagesTr --conf models brats_mri_segmentation_v0.2.1
  5. Run the model in 3D Slicer with any image from the dataset.

Expected behavior
Segmentation should be displayed in 3D Slicer.

Screenshots
[two screenshots of the error attached]

Environment

Ensuring you use the relevant python executable, please paste the output of:

python -c 'import monai; monai.config.print_debug_info()'

================================
Printing MONAI config...

MONAI version: 1.0.0
Numpy version: 1.22.4
Pytorch version: 1.12.1+cpu
MONAI flags: HAS_EXT = False, USE_COMPILED = False, USE_META_DICT = False
MONAI rev id: 170093375ce29267e45681fcec09dfa856e1d7e7
MONAI file: C:\Users\Admin\AppData\Local\Programs\Python\Python39\lib\site-packages\monai\__init__.py

Optional dependencies:
Pytorch Ignite version: 0.4.10
Nibabel version: 4.0.2
scikit-image version: 0.19.3
Pillow version: 9.2.0
Tensorboard version: 2.10.0
gdown version: 4.5.1
TorchVision version: 0.13.1+cpu
tqdm version: 4.64.0
lmdb version: 1.3.0
psutil version: 5.9.1
pandas version: 1.4.3
einops version: 0.4.1
transformers version: NOT INSTALLED or UNKNOWN VERSION.
mlflow version: NOT INSTALLED or UNKNOWN VERSION.
pynrrd version: 0.4.3

@tangy5
Collaborator

tangy5 commented Oct 7, 2022

Hi @PranayBolloju,

For the BRATS bundle, each input contains a 4-channel volume. The brats_mri_segmentation_v0.2.1 bundle needs a pre-processing step for BRATS data later than 2018.
For the data you downloaded from Task01, the four MRI modalities are already stacked in one NIfTI file, but the channel dimension is last, e.g., (240, 240, 160, 4): the 4 sits at index 3 of the input data. A solution is to preprocess the data to be compatible with the bundle input: transpose the image to (4, 240, 240, 160).
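If it helps, a minimal sketch of that transpose (the shape is the one from this issue; the array below is a stand-in for what nibabel's get_fdata() would return):

```python
# Sketch: move the channel axis of a Task01_BrainTumour volume from
# last to first, so the bundle sees (4, H, W, D) instead of (H, W, D, 4).
import numpy as np

vol = np.zeros((240, 240, 160, 4), dtype=np.float32)  # stand-in for nibabel.load(path).get_fdata()
vol_channel_first = np.moveaxis(vol, -1, 0)
print(vol_channel_first.shape)  # (4, 240, 240, 160)
```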

Thanks for reporting this. We should add a note in the bundle README or on the MONAI Label side to remind users about pre-processing BRATS data. Hope this helps solve your problem.

@PranayBolloju
Author

Hi @tangy5 ,

Thanks for the response. Can you suggest a way to preprocess the data, i.e., transpose the images?

@diazandr3s
Collaborator

diazandr3s commented Oct 8, 2022

@tangy5, does the input to the bundle brats_mri_segmentation_v0.2.1 need to be channel-first?

Do the transforms AsChannelFirstd or AsChannelLastd help?

Perhaps we only need to add this argument when loading the images: https://github.com/Project-MONAI/MONAI/blob/dev/monai/transforms/io/dictionary.py#L128

Here is where this can be added: https://github.com/Project-MONAI/model-zoo/blob/dev/models/brats_mri_segmentation/configs/inference.json#L37 as well as in training: https://github.com/Project-MONAI/model-zoo/blob/dev/models/brats_mri_segmentation/configs/train.json#L59
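For reference, the LoadImaged entry in inference.json would then look roughly like this (a sketch; surrounding entries are omitted and the exact key set may differ by bundle version):

```json
{
    "_target_": "LoadImaged",
    "keys": "image",
    "ensure_channel_first": true
}
```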

@diazandr3s
Collaborator

Hi @PranayBolloju,

I have tried this model myself and got the same error.

I've also changed the LoadImage args and managed to get a prediction. I think the quality of the model can be easily improved. Please watch this video:

multi-modality-orientation.mp4

One thing you could do is first update both the inference and train files (add ensure_channel_first arg) and then re-train the model using the Task01_BrainTumour dataset.

Please follow these steps: #1055 (comment)

BTW, there is another unsolved issue regarding multimodality/multiparametric images in Slicer. When a NIfTI file has more than one modality, Slicer reads only one.

NIfTI can be messy and that's why I make Slicer not consider the orientation. Ugly solution :/

MONAI Label does support multiparametric, but Slicer can't read multiple images when loaded in a single NIfTI image. More of this here: #729 (comment)

@tangy5
Collaborator

tangy5 commented Oct 9, 2022

Thanks @diazandr3s, I hit the same issue loading multimodality data in Slicer. Your solution looks good; we might need to add a note to the monaibundle README on using the BRATS bundle, covering both the images and a reminder about how Slicer loads multi-channel images.

@PranayBolloju
Author

Hi @diazandr3s

Thanks for the video. I have tried the suggestions and got the prediction. The segmentation looks fine in 3D but nothing comes up in the other slice views.
[screenshot]

@diazandr3s
Collaborator

Thanks for the update, @PranayBolloju

As you can see from the video (minute ~1:11), I proposed an ugly solution (discard orientation) for MONAI Label to load the multimodality images in Slicer.

I was wondering whether all modalities are absolutely needed for your use case. Otherwise, I'd suggest working with a single modality, as that avoids this change on the Slicer module side.

Let us know

@PranayBolloju
Author

Hi @diazandr3s ,

If brain hemorrhage or tumor segmentation can be done with equal accuracy whether single-modality or multimodality images are used, then I suppose we don't need to use multimodality images.

@diazandr3s
Collaborator

Hi @PranayBolloju,

Brain Hemorrhage and Tumor segmentation are two different tasks and they use different image modalities. AFAIK, for brain hemorrhage segmentation you employ CT images while for brain tumor segmentation MR images are more commonly used.

@PranayBolloju
Author

Hi @diazandr3s ,

Thanks for the insights. Is there any model available separately for brain hemorrhage segmentation, or can we use the same model used for tumor segmentation?

@diazandr3s
Collaborator

Although no brain hemorrhage segmentation model (using CT images) is available in MONAI Label, it shouldn't be difficult for you to create one from a public dataset like this one: https://instance.grand-challenge.org/

You may find this useful as well: #1055 (comment)

Regarding a brain tumor segmentation model (using MR images), you could use the same Task01_BrainTumour dataset but with a single modality.

Hope this helps,

@PranayBolloju
Author

Hi @diazandr3s

Thanks a lot for this information. The dataset you linked says it is a forbidden dataset. Is there a way to get a dataset, perhaps with label (hemorrhage) segmentations?

@diazandr3s
Collaborator

That's strange.
Have you registered for the challenge? If yes and it still doesn't work, have you contacted any of the organizers? https://instance.grand-challenge.org/Organizers/

@PranayBolloju
Author

Hi @diazandr3s
I have registered for the challenge, and they ask you to sign an agreement and send it by email. I have done that too but have not received any reply from them.

@diazandr3s
Collaborator

Hi @PranayBolloju,

I'd suggest you try another dataset like this one: https://www.kaggle.com/c/rsna-intracranial-hemorrhage-detection
Hopefully, the organizers reply soon,

@PranayBolloju
Author

Hi @diazandr3s ,

Many thanks for the suggestions,

I have seen that dataset too, but it does not contain 3D images and it does not have annotations either. We would have to annotate hemorrhages ourselves, which might lead to wrong labeling. I was hoping to find a dataset already annotated by experts, like the Task01_BrainTumour dataset or the INSTANCE 2022 dataset.

In case I don't find any pre-annotated dataset, as a last resort I will attempt to label the segmentations using 3D Slicer. I have a couple of questions in this context.

  • How do I give DICOM images as input to MONAI Label when working on a local machine (not a DICOM web server)?
  • What model can we use to do 2D segmentation in MONAI Label for brain images?

@diazandr3s
Collaborator

Hi @PranayBolloju,

Regarding this:

I have seen that dataset too, but it does not contain 3D images and it does not have annotations either. We would have to annotate hemorrhages ourselves, which might lead to wrong labeling. I was hoping to find a dataset already annotated by experts, like the Task01_BrainTumour dataset or the INSTANCE 2022 dataset.

I fully understand. I hope the challenge organizers reply soon. That will facilitate things a lot.
I was wondering whether you have access to expert manpower that can help create these labels. Can I ask what use case you have in mind once you get the trained model?

  • How do I give DICOM images as input to MONAI Label when working on a local machine (not a DICOM web server)?

Currently, MONAI Label does not support DICOM images in a local folder. There are two options here: 1) convert the images to NRRD or NIfTI format and then work from a local folder, or 2) use a DICOM web server.

  • What model can we use to do 2D segmentation in MONAI Label for brain images?

MONAI Label has examples for 2D segmentation, such as the endoscopy and pathology apps. The question is which viewer you want to integrate MONAI Label with.

You could also modify the radiology app to work on 2D as well. Please see discussion: #829

Hope this helps,

@PranayBolloju
Author

Hi @diazandr3s ,

Thank you for all the suggestions, it really helped.

I was following your suggestion and converted some images from NIFTI to DICOM using plastimatch.
The command I have used is:
plastimatch convert --patient-id patient1 --input BRATS_001.nii.gz --output-dicom BRATS_001

I have added 'ensure_channel_first' arg in inference.json in monaibundle\brats_mri_segmentation_v0.2.1\configs.

Then I have started the monailabel server using this command:
monailabel start_server -a apps\monaibundle -s <URL to Google DICOM Web server> -c models brats_segmentation_v0.2.1

I was able to see the images stored in the Google DICOM web server in 3D Slicer, but when I tried to run inference I got the following error.

[screenshot]

The same model was doing tumor segmentation perfectly when using local images, i.e. NIfTI.

@diazandr3s
Collaborator

Hi @PranayBolloju,

Thanks for the update.

Did you make sure the DICOM images are multiparametric? I mean, does the input have the 4 modalities needed for the pretrained model?

I believe this is why you're getting this error.

Hope this helps,

@PranayBolloju
Author

Hi @diazandr3s ,

Thanks for the reply

I think the images converted to DICOM do not have the 4 modalities. I have tried 2 ways to convert the images.

  • By using the export-to-DICOM option in 3D Slicer.
  • plastimatch convert --patient-id patient1 --input BRATS_001.nii.gz --output-dicom BRATS_001

Is there a way to preserve the modality when converted to DICOM?

@diazandr3s
Collaborator

BRATS and Task01_BrainTumour are highly preprocessed datasets. They are skull-stripped and the modalities are co-registered. It is not easy to find a similar dataset with these characteristics.

I'm not sure about this, but I think you can't save all modalities in a single DICOM file.

@diazandr3s
Collaborator


@wyli do you know if this is possible? Can we store 4 modalities in a single DICOM file?

@PranayBolloju
Author

Hi @diazandr3s

Thanks for the clarification.

I went ahead and trained a model with the converted images (i.e. images converted to a single modality). The following are the changes I made in the config files before training the model.

  • Changed the "in_channels" value from 4 to 1 in brats_mri_segmentation_v0.3.0/configs/train.json, inference.json and metadata.json.
  • Added "ensure_channel_first": true

The model was trained successfully for 300 epochs, with an average Dice score of around 81. But when I tried inference, only one of the labels was being segmented.

[screenshot]

Is there anything I have missed here?

@diazandr3s
Collaborator

Hi @PranayBolloju,

Thanks for the update. It's good to see these results.

Does this happen to all test cases? Which modality did you use here?

Bear in mind that the tumor core (necrotic area) and edema (whole tumor) are visible on the other modalities (T1 + Contrast, T2, etc). That's mainly the reason for using different modalities.

@PranayBolloju
Author

PranayBolloju commented Oct 26, 2022

Hi @diazandr3s
Yes, it is happening in all test cases.

These are a couple of images used for this model.
BRATS_001.nii.gz
BRATS_002.nii.gz

And these are the labels.
BRATS_001.nii.gz
BRATS_002.nii.gz

And a similar thing happens while doing inference with the pretrained model from monaibundle, i.e. brats_mri_segmentation_v0.2.1, on the "Task01_BrainTumour" dataset. All three labels can be seen in the Segment Editor but only one label is visible in the mask.
[screenshot]

@diazandr3s
Collaborator

Thanks for clarifying this, @PranayBolloju.

It seems this issue comes from the post-processing transforms:

Please change this argument (https://github.com/Project-MONAI/model-zoo/blob/dev/models/brats_mri_segmentation/configs/inference.json#L76) to softmax=true and this (https://github.com/Project-MONAI/model-zoo/blob/dev/models/brats_mri_segmentation/configs/inference.json#L90) to argmax=true

They should work like this: https://github.com/Project-MONAI/MONAILabel/blob/main/sample-apps/radiology/lib/infers/deepedit.py#L118-L119
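Applied to the bundle config, those two post-transform entries in inference.json would end up reading roughly like this (a sketch; neighbouring transforms are omitted and other keys may differ by bundle version):

```json
{
    "_target_": "Activationsd",
    "keys": "pred",
    "softmax": true
},
{
    "_target_": "AsDiscreted",
    "keys": "pred",
    "argmax": true
}
```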

It seems the network is outputting 3 channels but only one is being shown in Slicer.

Please let me know how that goes.

@PranayBolloju
Author

Hi @diazandr3s

I have changed the suggested lines.
[screenshot]

And the segmentation looks like this.
[screenshot]

@diazandr3s
Collaborator

Hi @PranayBolloju,

As I mentioned before, this bundle was designed to output three channels, one per label. 3D Slicer only takes the first one.

I've checked the training process and it seems it was designed to work like that - sigmoid per channel and to have one-hot representation of the output.

See the training transforms: https://github.com/Project-MONAI/model-zoo/blob/dev/models/brats_mri_segmentation/configs/train.json#L153-L160

I initially thought the previous changes could solve the issue, but as the model wasn't trained using the softmax activation function, you get the result you're showing. @tangy5, can you please confirm this?

A solution for this is to keep the transforms as is and add another post transform that merges all three channels before this one: https://github.com/Project-MONAI/model-zoo/blob/dev/models/brats_mri_segmentation/configs/inference.json#L92
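As a rough illustration of that merging step (the channel order and label values here are assumptions for the sketch, not the bundle's exact convention):

```python
# Sketch: collapse a 3-channel, thresholded BRATS-style prediction
# (one binary mask per tumor sub-region) into a single label map that
# Slicer can display. Broader regions are painted first so the more
# specific ones overwrite them.
import numpy as np

pred = np.zeros((3, 4, 4, 4), dtype=np.uint8)  # stand-in for the network output
pred[1] = 1          # channel 1: "whole tumor" everywhere (toy data)
pred[0, :2] = 1      # channel 0: "tumor core" in part of the volume

label_map = np.zeros(pred.shape[1:], dtype=np.uint8)
for channel, label in [(1, 2), (0, 1), (2, 3)]:  # paint order: WT, then TC, then ET
    label_map[pred[channel] == 1] = label

print(np.unique(label_map))  # [1 2]
```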

Another solution is to use the deepedit or segmentation model. Here are the instructions: https://www.youtube.com/watch?v=3HTh2dqZqew&list=PLtoSVSQ2XzyD4lc-lAacFBzOdv5Ou-9IA&index=3

Hope this helps,

@PranayBolloju
Author

Hi @diazandr3s
Thanks for the clear explanation.
I have tried to train a DeepEdit model. These are the changes I made in the config file.

self.labels = {
    "NCR": 1,
    "ED": 2,
    "ET": 3,
    "background": 0,
}

These are example files that I used for training.
Volume
BRATS_001.nii.gz
Label
BRATS_001.nii.gz
The total number of images with labels in the labels/final directory was 60, and the model was trained for 50 epochs.
This is the segmentation.
[screenshot]

Only one of the labels was being segmented.

Do you think the model can be improved with more epochs and more images? Or should I change the network being used, or the network definition?

@diazandr3s
Collaborator

diazandr3s commented Oct 27, 2022

Hi @PranayBolloju,

I'd suggest the following:

Deepedit uses the whole image for training and inference, while the segmentation model uses patches.

Sorry, I'd totally forgotten that I developed a model for BRATS. Please use this radiology app: https://github.com/Project-MONAI/MONAILabel/tree/bratsSegmentation/sample-apps/radiology

There you have the BRATS algo. Just uncomment these lines and comment out the others: https://github.com/Project-MONAI/MONAILabel/blob/bratsSegmentation/sample-apps/radiology/lib/configs/segmentation_brats.py#L33

You could download that radiology app and train the model.

Let me know how that goes,

@SachidanandAlle
Collaborator

Where have you downloaded the app? There are 4 different apps (sample apps). Check whether the dir 'monaibundle' exists relative to where you are running the command.

@SachidanandAlle
Collaborator

Also note: bundles work well on Linux, as they sometimes have bash scripts, especially for training. However, you can still run inference on Windows using a bundle via MONAI Label.

@ulphypro

@SachidanandAlle Thank you very much.

I added "ensure_channel_first": true in inference.json and train.json, but I got the error 'Failed to run inference in MONAI Label Server'.

What should I do?

@SachidanandAlle
Collaborator

Start with the simple spleen one. The brain MRI input has 4 channels, and possibly the model was trained on 3, or vice versa.

@SachidanandAlle
Collaborator

Also, you need to check the error on the server side. There will be a descriptive log for each of those steps; that should give a fair amount of information about what's happening and why it's happening.

@ulphypro

@SachidanandAlle OK. I will try.

@ulphypro

Dear all members,

I'm working on auto segmentation with brats_mri_segmentation_v0.2.1 in 3D Slicer.

To start the server, I used the command 'monailabel start_server --app apps/monaibundle --studies datasets/Task01_BrainTumour/imagesTr --conf models brats_mri_segmentation_v0.2.1'.

I added '"ensure_channel_first": true' in the "preprocessing" part of inference.json of the monaibundle.

But it gives the error 'Failed to run inference in MONAI Label Server'. Is there a solution?

train.json also needs to be edited, but I don't know where to add the code.

[screenshot: brats_error]

Please let me know the solution.

The detailed error is as follows.

[3D-Slicer error]

This will close current scene. Please make sure you have saved your current work.
Are you sure to continue?
Current Selection Options Section: infer
Current Selection Options Name: brats_mri_segmentation_v0.2.1
Invalidate:: models => brats_mri_segmentation_v0.2.1 => device => [‘cuda’] => <class ‘list’>
{‘id’: ‘BRATS_424’, ‘weight’: 1668702326, ‘path’: ‘C:\Users\AA\AppData\Local\MONAILabel\MONAILabel\datasets\Task01_BrainTumour\imagesTr\BRATS_424.nii.gz’, ‘ts’: 1657860597, ‘name’: ‘BRATS_424.nii.gz’}
Check if file exists/shared locally: C:\Users\AA\AppData\Local\MONAILabel\MONAILabel\datasets\Task01_BrainTumour\imagesTr\BRATS_424.nii.gz => True
Original label not found …
Current Selection Options Section: infer
Current Selection Options Name: brats_mri_segmentation_v0.2.1
Invalidate:: models => brats_mri_segmentation_v0.2.1 => device => [‘cuda’] => <class ‘list’>
Failed to run inference in MONAI Label Server
Time consumed by segmentation: 7.4
Time consumed by next_sample: 7.9
[Server error]
PS C:\Users\AA\AppData\Local\MONAILabel\MONAILabel> monailabel start_server --app apps/monaibundle --studies datasets/Task01_BrainTumour/imagesTr --conf models brats_mri_segmentation_v0.2.1
Using PYTHONPATH=C:\Users\AA\AppData\Local\MONAILabel\MONAILabel;
“”
2022-11-18 01:24:32,250 - USING:: version = False
2022-11-18 01:24:32,250 - USING:: app = C:\Users\AA\AppData\Local\MONAILabel\MONAILabel\apps\monaibundle
2022-11-18 01:24:32,251 - USING:: studies = C:\Users\AA\AppData\Local\MONAILabel\MONAILabel\datasets\Task01_BrainTumour\imagesTr
2022-11-18 01:24:32,251 - USING:: verbose = INFO
2022-11-18 01:24:32,252 - USING:: conf = [[‘models’, ‘brats_mri_segmentation_v0.2.1’]]
2022-11-18 01:24:32,252 - USING:: host = 0.0.0.0
2022-11-18 01:24:32,252 - USING:: port = 8000
2022-11-18 01:24:32,252 - USING:: uvicorn_app = monailabel.app:app
2022-11-18 01:24:32,253 - USING:: ssl_keyfile = None
2022-11-18 01:24:32,253 - USING:: ssl_certfile = None
2022-11-18 01:24:32,253 - USING:: ssl_keyfile_password = None
2022-11-18 01:24:32,254 - USING:: ssl_ca_certs = None
2022-11-18 01:24:32,254 - USING:: workers = None
2022-11-18 01:24:32,254 - USING:: limit_concurrency = None
2022-11-18 01:24:32,254 - USING:: access_log = False
2022-11-18 01:24:32,255 - USING:: log_config = None
2022-11-18 01:24:32,255 - USING:: dryrun = False
2022-11-18 01:24:32,255 - USING:: action = start_server
2022-11-18 01:24:32,256 - ENV SETTINGS:: MONAI_LABEL_API_STR =
2022-11-18 01:24:32,256 - ENV SETTINGS:: MONAI_LABEL_PROJECT_NAME = MONAILabel
2022-11-18 01:24:32,256 - ENV SETTINGS:: MONAI_LABEL_APP_DIR =
2022-11-18 01:24:32,256 - ENV SETTINGS:: MONAI_LABEL_STUDIES =
2022-11-18 01:24:32,257 - ENV SETTINGS:: MONAI_LABEL_AUTH_ENABLE = False
2022-11-18 01:24:32,257 - ENV SETTINGS:: MONAI_LABEL_AUTH_DB =
2022-11-18 01:24:32,257 - ENV SETTINGS:: MONAI_LABEL_APP_CONF = ‘{}’
2022-11-18 01:24:32,257 - ENV SETTINGS:: MONAI_LABEL_TASKS_TRAIN = True
2022-11-18 01:24:32,258 - ENV SETTINGS:: MONAI_LABEL_TASKS_STRATEGY = True
2022-11-18 01:24:32,258 - ENV SETTINGS:: MONAI_LABEL_TASKS_SCORING = True
2022-11-18 01:24:32,258 - ENV SETTINGS:: MONAI_LABEL_TASKS_BATCH_INFER = True
2022-11-18 01:24:32,258 - ENV SETTINGS:: MONAI_LABEL_DATASTORE =
2022-11-18 01:24:32,259 - ENV SETTINGS:: MONAI_LABEL_DATASTORE_URL =
2022-11-18 01:24:32,259 - ENV SETTINGS:: MONAI_LABEL_DATASTORE_USERNAME =
2022-11-18 01:24:32,259 - ENV SETTINGS:: MONAI_LABEL_DATASTORE_PASSWORD =
2022-11-18 01:24:32,259 - ENV SETTINGS:: MONAI_LABEL_DATASTORE_API_KEY =
2022-11-18 01:24:32,260 - ENV SETTINGS:: MONAI_LABEL_DATASTORE_CACHE_PATH =
2022-11-18 01:24:32,260 - ENV SETTINGS:: MONAI_LABEL_DATASTORE_PROJECT =
2022-11-18 01:24:32,260 - ENV SETTINGS:: MONAI_LABEL_DATASTORE_ASSET_PATH =
2022-11-18 01:24:32,260 - ENV SETTINGS:: MONAI_LABEL_DATASTORE_DSA_ANNOTATION_GROUPS =
2022-11-18 01:24:32,261 - ENV SETTINGS:: MONAI_LABEL_DICOMWEB_USERNAME =
2022-11-18 01:24:32,261 - ENV SETTINGS:: MONAI_LABEL_DICOMWEB_PASSWORD =
2022-11-18 01:24:32,261 - ENV SETTINGS:: MONAI_LABEL_DICOMWEB_CACHE_PATH =
2022-11-18 01:24:32,261 - ENV SETTINGS:: MONAI_LABEL_QIDO_PREFIX = None
2022-11-18 01:24:32,262 - ENV SETTINGS:: MONAI_LABEL_WADO_PREFIX = None
2022-11-18 01:24:32,262 - ENV SETTINGS:: MONAI_LABEL_STOW_PREFIX = None
2022-11-18 01:24:32,262 - ENV SETTINGS:: MONAI_LABEL_DICOMWEB_FETCH_BY_FRAME = False
2022-11-18 01:24:32,262 - ENV SETTINGS:: MONAI_LABEL_DICOMWEB_CONVERT_TO_NIFTI = True
2022-11-18 01:24:32,263 - ENV SETTINGS:: MONAI_LABEL_DICOMWEB_SEARCH_FILTER = ‘{“Modality”: “CT”}’
2022-11-18 01:24:32,263 - ENV SETTINGS:: MONAI_LABEL_DICOMWEB_CACHE_EXPIRY = 180
2022-11-18 01:24:32,263 - ENV SETTINGS:: MONAI_LABEL_DICOMWEB_PROXY_TIMEOUT = 30.0
2022-11-18 01:24:32,263 - ENV SETTINGS:: MONAI_LABEL_DICOMWEB_READ_TIMEOUT = 5.0
2022-11-18 01:24:32,264 - ENV SETTINGS:: MONAI_LABEL_DATASTORE_AUTO_RELOAD = True
2022-11-18 01:24:32,264 - ENV SETTINGS:: MONAI_LABEL_DATASTORE_READ_ONLY = False
2022-11-18 01:24:32,264 - ENV SETTINGS:: MONAI_LABEL_DATASTORE_FILE_EXT = '["*.nii.gz", "*.nii", "*.nrrd", "*.jpg", "*.png", "*.tif", "*.svs", "*.xml"]'
2022-11-18 01:24:32,264 - ENV SETTINGS:: MONAI_LABEL_SERVER_PORT = 8000
2022-11-18 01:24:32,265 - ENV SETTINGS:: MONAI_LABEL_CORS_ORIGINS = ‘’
2022-11-18 01:24:32,265 - ENV SETTINGS:: MONAI_LABEL_SESSIONS = True
2022-11-18 01:24:32,265 - ENV SETTINGS:: MONAI_LABEL_SESSION_PATH =
2022-11-18 01:24:32,265 - ENV SETTINGS:: MONAI_LABEL_SESSION_EXPIRY = 3600
2022-11-18 01:24:32,266 - ENV SETTINGS:: MONAI_LABEL_INFER_CONCURRENCY = -1
2022-11-18 01:24:32,266 - ENV SETTINGS:: MONAI_LABEL_INFER_TIMEOUT = 600
2022-11-18 01:24:32,266 - ENV SETTINGS:: MONAI_LABEL_AUTO_UPDATE_SCORING = True
2022-11-18 01:24:32,266 -
Allow Origins: ['*']
[2022-11-18 01:24:32,995] [18644] [MainThread] [INFO] (uvicorn.error:75) - Started server process [18644]
[2022-11-18 01:24:32,996] [18644] [MainThread] [INFO] (uvicorn.error:45) - Waiting for application startup.
[2022-11-18 01:24:32,997] [18644] [MainThread] [INFO] (monailabel.interfaces.utils.app:38) - Initializing App from: C:\Users\AA\AppData\Local\MONAILabel\MONAILabel\apps\monaibundle; studies: C:\Users\AA\AppData\Local\MONAILabel\MONAILabel\datasets\Task01_BrainTumour\imagesTr; conf: {‘models’: ‘brats_mri_segmentation_v0.2.1’}
[2022-11-18 01:24:33,039] [18644] [MainThread] [INFO] (monailabel.utils.others.class_utils:37) - Subclass for MONAILabelApp Found: <class ‘main.MyApp’>
[2022-11-18 01:24:33,835] [18644] [MainThread] [INFO] (monailabel.utils.others.generic:305) - +++ Adding Bundle from Local: brats_mri_segmentation_v0.2.1 => C:\Users\AA\AppData\Local\MONAILabel\MONAILabel\apps\monaibundle\model\brats_mri_segmentation_v0.2.1
[2022-11-18 01:24:33,836] [18644] [MainThread] [INFO] (monailabel.utils.others.generic:317) - +++ Using Bundle Models: [‘brats_mri_segmentation_v0.2.1’]
[2022-11-18 01:24:33,837] [18644] [MainThread] [INFO] (monailabel.interfaces.app:129) - Init Datastore for: C:\Users\AA\AppData\Local\MONAILabel\MONAILabel\datasets\Task01_BrainTumour\imagesTr
[2022-11-18 01:24:33,838] [18644] [MainThread] [INFO] (monailabel.datastore.local:129) - Auto Reload: True; Extensions: ['*.nii.gz', '*.nii', '*.nrrd', '*.jpg', '*.png', '*.tif', '*.svs', '*.xml']
[2022-11-18 01:24:33,976] [18644] [MainThread] [INFO] (monailabel.datastore.local:576) - Invalidate count: 0
[2022-11-18 01:24:33,976] [18644] [MainThread] [INFO] (monailabel.datastore.local:150) - Start observing external modifications on datastore (AUTO RELOAD)
[2022-11-18 01:24:34,037] [18644] [MainThread] [INFO] (main:63) - +++ Adding Inferer:: brats_mri_segmentation_v0.2.1 => <monailabel.tasks.infer.bundle.BundleInferTask object at 0x000001E2F49BF250>
[2022-11-18 01:24:34,038] [18644] [MainThread] [INFO] (main:77) - +++ Adding Trainer:: brats_mri_segmentation_v0.2.1 => <monailabel.tasks.train.bundle.BundleTrainTask object at 0x000001E2F4BBBA60>
[2022-11-18 01:24:34,039] [18644] [MainThread] [INFO] (main:87) - Active Learning Strategies:: [‘random’, ‘first’]
[2022-11-18 01:24:34,039] [18644] [MainThread] [INFO] (monailabel.utils.sessions:51) - Session Path: C:\Users\AA.cache\monailabel\sessions
[2022-11-18 01:24:34,039] [18644] [MainThread] [INFO] (monailabel.utils.sessions:52) - Session Expiry (max): 3600
[2022-11-18 01:24:34,040] [18644] [MainThread] [INFO] (monailabel.interfaces.app:468) - App Init - completed
[2022-11-18 01:24:34,040] [timeloop] [INFO] Starting Timeloop…
[2022-11-18 01:24:34,040] [18644] [MainThread] [INFO] (timeloop:60) - Starting Timeloop…
[2022-11-18 01:24:34,041] [timeloop] [INFO] Registered job <function MONAILabelApp.on_init_complete.<locals>.run_scheduler at 0x000001E2F4BBECA0>
[2022-11-18 01:24:34,041] [18644] [MainThread] [INFO] (timeloop:42) - Registered job <function MONAILabelApp.on_init_complete.<locals>.run_scheduler at 0x000001E2F4BBECA0>
[2022-11-18 01:24:34,042] [timeloop] [INFO] Timeloop now started. Jobs will run based on the interval set
[2022-11-18 01:24:34,042] [18644] [MainThread] [INFO] (timeloop:63) - Timeloop now started. Jobs will run based on the interval set
[2022-11-18 01:24:34,042] [18644] [MainThread] [INFO] (uvicorn.error:59) - Application startup complete.
[2022-11-18 01:24:34,043] [18644] [MainThread] [INFO] (uvicorn.error:206) - Uvicorn running on http://0.0.0.0:8000/ (Press CTRL+C to quit)
[2022-11-18 01:25:26,734] [18644] [MainThread] [INFO] (monailabel.endpoints.activelearning:43) - Active Learning Request: {‘strategy’: ‘random’, ‘client_id’: ‘user-xyz’}
[2022-11-18 01:25:26,786] [18644] [MainThread] [INFO] (monailabel.tasks.activelearning.random:47) - Random: Selected Image: BRATS_424; Weight: 1668702326
[2022-11-18 01:25:26,800] [18644] [MainThread] [INFO] (monailabel.endpoints.activelearning:59) - Next sample: {‘id’: ‘BRATS_424’, ‘weight’: 1668702326, ‘path’: ‘C:\Users\AA\AppData\Local\MONAILabel\MONAILabel\datasets\Task01_BrainTumour\imagesTr\BRATS_424.nii.gz’, ‘ts’: 1657860597, ‘name’: ‘BRATS_424.nii.gz’}
[2022-11-18 01:25:27,160] [18644] [MainThread] [INFO] (monailabel.endpoints.infer:160) - Infer Request: {‘model’: ‘brats_mri_segmentation_v0.2.1’, ‘image’: ‘BRATS_424’, ‘device’: ‘cuda’, ‘result_extension’: ‘.nrrd’, ‘result_dtype’: ‘uint8’, ‘client_id’: ‘user-xyz’}
[2022-11-18 01:25:27,161] [18644] [MainThread] [INFO] (monailabel.tasks.infer.basic_infer:276) - Infer Request (final): {‘device’: ‘cuda’, ‘model’: ‘brats_mri_segmentation_v0.2.1’, ‘image’: ‘C:\Users\AA\AppData\Local\MONAILabel\MONAILabel\datasets\Task01_BrainTumour\imagesTr\BRATS_424.nii.gz’, ‘result_extension’: ‘.nrrd’, ‘result_dtype’: ‘uint8’, ‘client_id’: ‘user-xyz’, ‘description’: ‘A pre-trained model for volumetric (3D) segmentation of brain tumor subregions from multimodal MRIs based on BraTS 2018 data’}
[2022-11-18 01:25:27,164] [18644] [MainThread] [INFO] (monailabel.interfaces.utils.transform:76) - PRE - Run Transform(s)
[2022-11-18 01:25:27,165] [18644] [MainThread] [INFO] (monailabel.interfaces.utils.transform:77) - PRE - Input Keys: ['device', 'model', 'image', 'result_extension', 'result_dtype', 'client_id', 'description', 'image_path']
[2022-11-18 01:25:27,774] [18644] [MainThread] [INFO] (monailabel.interfaces.utils.transform:122) - PRE - Transform (LoadImageTensord): Time: 0.6082; image: (4, 240, 240, 155)(torch.float32)
[2022-11-18 01:25:28,090] [18644] [MainThread] [INFO] (monailabel.interfaces.utils.transform:122) - PRE - Transform (NormalizeIntensityd): Time: 0.3166; image: (4, 240, 240, 155)(torch.float32)
[2022-11-18 01:25:28,091] [18644] [MainThread] [INFO] (monailabel.tasks.infer.basic_infer:464) - Inferer:: cuda => SlidingWindowInferer => {'roi_size': [240, 240, 160], 'sw_batch_size': 1, 'overlap': 0.5, 'mode': constant, 'sigma_scale': 0.125, 'padding_mode': constant, 'cval': 0.0, 'sw_device': None, 'device': None, 'progress': False, 'cpu_thresh': None, 'roi_weight_map': None}
[2022-11-18 01:25:28,092] [18644] [MainThread] [INFO] (monailabel.tasks.infer.basic_infer:413) - Infer model path: C:\Users\AA\AppData\Local\MONAILabel\MONAILabel\apps\monaibundle\model\brats_mri_segmentation_v0.2.1\models\model.pt
[2022-11-18 01:25:31,082] [18644] [MainThread] [ERROR] (uvicorn.error:369) - Exception in ASGI application
Traceback (most recent call last):
File "C:\Users\AA\AppData\Local\Programs\Python\Python39\lib\site-packages\uvicorn\protocols\http\h11_impl.py", line 366, in run_asgi
result = await app(self.scope, self.receive, self.send)
File "C:\Users\AA\AppData\Local\Programs\Python\Python39\lib\site-packages\uvicorn\middleware\proxy_headers.py", line 75, in __call__
return await self.app(scope, receive, send)
File "C:\Users\AA\AppData\Local\Programs\Python\Python39\lib\site-packages\fastapi\applications.py", line 269, in __call__
await super().__call__(scope, receive, send)
File "C:\Users\AA\AppData\Local\Programs\Python\Python39\lib\site-packages\starlette\applications.py", line 124, in __call__
await self.middleware_stack(scope, receive, send)
File "C:\Users\AA\AppData\Local\Programs\Python\Python39\lib\site-packages\starlette\middleware\errors.py", line 184, in __call__
raise exc
File "C:\Users\AA\AppData\Local\Programs\Python\Python39\lib\site-packages\starlette\middleware\errors.py", line 162, in __call__
await self.app(scope, receive, _send)
File "C:\Users\AA\AppData\Local\Programs\Python\Python39\lib\site-packages\starlette\middleware\cors.py", line 84, in __call__
await self.app(scope, receive, send)
File "C:\Users\AA\AppData\Local\Programs\Python\Python39\lib\site-packages\starlette\exceptions.py", line 93, in __call__
raise exc
File "C:\Users\AA\AppData\Local\Programs\Python\Python39\lib\site-packages\starlette\exceptions.py", line 82, in __call__
await self.app(scope, receive, sender)
File "C:\Users\AA\AppData\Local\Programs\Python\Python39\lib\site-packages\fastapi\middleware\asyncexitstack.py", line 21, in __call__
raise e
File "C:\Users\AA\AppData\Local\Programs\Python\Python39\lib\site-packages\fastapi\middleware\asyncexitstack.py", line 18, in __call__
await self.app(scope, receive, send)
File "C:\Users\AA\AppData\Local\Programs\Python\Python39\lib\site-packages\starlette\routing.py", line 670, in __call__
await route.handle(scope, receive, send)
File "C:\Users\AA\AppData\Local\Programs\Python\Python39\lib\site-packages\starlette\routing.py", line 266, in handle
await self.app(scope, receive, send)
File "C:\Users\AA\AppData\Local\Programs\Python\Python39\lib\site-packages\starlette\routing.py", line 65, in app
response = await func(request)
File "C:\Users\AA\AppData\Local\Programs\Python\Python39\lib\site-packages\fastapi\routing.py", line 227, in app
raw_response = await run_endpoint_function(
File "C:\Users\AA\AppData\Local\Programs\Python\Python39\lib\site-packages\fastapi\routing.py", line 160, in run_endpoint_function
return await dependant.call(**values)
File "C:\Users\AA\AppData\Local\MONAILabel\MONAILabel\monailabel\endpoints\infer.py", line 179, in api_run_inference
return run_inference(background_tasks, model, image, session_id, params, file, label, output)
File "C:\Users\AA\AppData\Local\MONAILabel\MONAILabel\monailabel\endpoints\infer.py", line 161, in run_inference
result = instance.infer(request)
File "C:\Users\AA\AppData\Local\MONAILabel\MONAILabel\monailabel\interfaces\app.py", line 300, in infer
result_file_name, result_json = task(request)
File "C:\Users\AA\AppData\Local\MONAILabel\MONAILabel\monailabel\tasks\infer\basic_infer.py", line 300, in __call__
data = self.run_inferer(data, device=device)
File "C:\Users\AA\AppData\Local\MONAILabel\MONAILabel\monailabel\tasks\infer\basic_infer.py", line 480, in run_inferer
outputs_d = decollate_batch(outputs)
File "C:\Users\AA\AppData\Local\Programs\Python\Python39\lib\site-packages\monai\data\utils.py", line 587, in decollate_batch
for t, m in zip(out_list, decollate_batch(batch.meta)):
File "C:\Users\AA\AppData\Local\Programs\Python\Python39\lib\site-packages\monai\data\utils.py", line 599, in decollate_batch
b, non_iterable, deco = _non_zipping_check(batch, detach, pad, fill_value)
File "C:\Users\AA\AppData\Local\Programs\Python\Python39\lib\site-packages\monai\data\utils.py", line 500, in _non_zipping_check
_deco = {key: decollate_batch(batch_data[key], detach, pad=pad, fill_value=fill_value) for key in batch_data}
File "C:\Users\AA\AppData\Local\Programs\Python\Python39\lib\site-packages\monai\data\utils.py", line 500, in <dictcomp>
_deco = {key: decollate_batch(batch_data[key], detach, pad=pad, fill_value=fill_value) for key in batch_data}
File "C:\Users\AA\AppData\Local\Programs\Python\Python39\lib\site-packages\monai\data\utils.py", line 599, in decollate_batch
b, non_iterable, deco = _non_zipping_check(batch, detach, pad, fill_value)
File "C:\Users\AA\AppData\Local\Programs\Python\Python39\lib\site-packages\monai\data\utils.py", line 502, in _non_zipping_check
_deco = [decollate_batch(b, detach, pad=pad, fill_value=fill_value) for b in batch_data]
TypeError: iteration over a 0-d array
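For anyone hitting the same traceback: the final TypeError is plain NumPy behavior, not a MONAI-specific failure. A 0-dimensional array (here, most likely a scalar metadata value collated into the batch) cannot be iterated. It can be reproduced in isolation, without MONAI:

```python
import numpy as np

# A 0-d array holds a single scalar and has no axis to iterate over,
# so looping over it raises the same TypeError seen in the traceback.
scalar_meta = np.array(1668702326)  # stand-in for a collated scalar meta value

try:
    for _ in scalar_meta:
        pass
except TypeError as exc:
    print(exc)  # iteration over a 0-d array
```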

@diazandr3s
Copy link
Collaborator

Dear all members

I'm working on auto-segmentation with brats_mri_segmentation_v0.2.1 in 3D Slicer.

I start the server with the command 'monailabel start_server --app apps/monaibundle --studies datasets/Task01_BrainTumour/imagesTr --conf models brats_mri_segmentation_v0.2.1'.

I added '"ensure_channel_first": true' to the "preprocessing" section of the bundle's inference.json.

But it fails with the error 'Failed to run Inference in MONAI Label Server'. Is there a solution?

train.json probably also needs the same edit, but I don't know where to add it.

brats_error

Please let me know a solution.

The detailed error is as follows.

[3D-Slicer error]

This will close current scene. Please make sure you have saved your current work. Are you sure to continue?
Current Selection Options Section: infer
Current Selection Options Name: brats_mri_segmentation_v0.2.1
Invalidate:: models => brats_mri_segmentation_v0.2.1 => device => ['cuda'] => <class 'list'>
{'id': 'BRATS_424', 'weight': 1668702326, 'path': 'C:\Users\AA\AppData\Local\MONAILabel\MONAILabel\datasets\Task01_BrainTumour\imagesTr\BRATS_424.nii.gz', 'ts': 1657860597, 'name': 'BRATS_424.nii.gz'}
Check if file exists/shared locally: C:\Users\AA\AppData\Local\MONAILabel\MONAILabel\datasets\Task01_BrainTumour\imagesTr\BRATS_424.nii.gz => True
Original label not found ...
Current Selection Options Section: infer
Current Selection Options Name: brats_mri_segmentation_v0.2.1
Invalidate:: models => brats_mri_segmentation_v0.2.1 => device => ['cuda'] => <class 'list'>
Failed to run inference in MONAI Label Server
Time consumed by segmentation: 7.4
Time consumed by next_sample: 7.9

[Server error]

PS C:\Users\AA\AppData\Local\MONAILabel\MONAILabel> monailabel start_server --app apps/monaibundle --studies datasets/Task01_BrainTumour/imagesTr --conf models brats_mri_segmentation_v0.2.1

(Startup settings followed by the same inference log and "TypeError: iteration over a 0-d array" traceback already posted above.)

Hi @ulphypro,

As mentioned before, having 4 modalities in a single nifti file does not make much sense: #1051 (comment)

I'd recommend the same as @SachidanandAlle: #1051 (comment)

Unfortunately, the monaibundle for brats (brats_mri_segmentation_v0.2.1) needs more work to properly manage the 4 modalities and be used in Slicer. It currently works in MONAI Core only.

Hope that makes sense,
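For context on the channel mismatch in the original error: Decathlon Task01 files store the four modalities along the last axis of a single 4D NIfTI, while the network expects channel-first input. MONAI's EnsureChannelFirst transform fixes the layout using the image metadata; in plain NumPy terms (axis order assumed from the Decathlon files) it amounts to a move-axis:

```python
import numpy as np

# Decathlon Task01_BrainTumour volumes are stored channel-last: (H, W, D, C).
img = np.zeros((240, 240, 155, 4), dtype=np.float32)

# The network wants channel-first (C, H, W, D); as a plain array
# operation, that is a move of the last axis to the front:
chan_first = np.moveaxis(img, -1, 0)
print(chan_first.shape)  # (4, 240, 240, 155)
```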

@ulphypro
Copy link

ulphypro commented Nov 19, 2022

Dear @diazandr3s

Thank you for answering my question.

Then, should I change in_channels from 4 to 1 and out_channels from 3 to 1, and run with only one target in configs/inference.json, as follows?
"transforms" --> "target": "Activationsd",
"keys": "pred",
"sigmoid": true

Should configs/train.json be edited the same way?
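For what it's worth, the single-modality variant described above would touch two spots in configs/inference.json: the network definition and the post-processing activation. This is only a rough sketch; the exact keys and the SegResNet arguments should be checked against the bundle's actual file:

```json
"network_def": {
    "_target_": "SegResNet",
    "in_channels": 1,
    "out_channels": 1
},
"postprocessing": {
    "_target_": "Compose",
    "transforms": [
        {
            "_target_": "Activationsd",
            "keys": "pred",
            "sigmoid": true
        }
    ]
}
```

A matching change would be needed wherever configs/train.json defines the same network.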

@diazandr3s
Copy link
Collaborator

diazandr3s commented Nov 22, 2022

Dear @diazandr3s

Thank you for answering my question.

Then, should I change in_channels from 4 to 1 and out_channels from 3 to 1, and run with only one target in configs/inference.json, as follows? "transforms" --> "target": "Activationsd", "keys": "pred", "sigmoid": true

Should configs/train.json be edited the same way?

Hi @ulphypro,

The issue isn't only the code; it's also the dataset. Each file should contain a single modality, not the 4 it currently has.

If you want to use Slicer, you have to separate the 4 modalities or use the original BraTS 2021 dataset, which ships with the 4 modalities as separate files.

Once you have the separated files, I'd recommend using the segmentation app in MONAI Label radiology app: https://github.com/Project-MONAI/MONAILabel/tree/main/sample-apps/radiology

The same discussion is happening here: Project-MONAI/model-zoo#239

Hope that makes sense.
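To split a Decathlon-style 4-modality NIfTI into single-modality files, something like the nibabel-based sketch below could work. The modality order along the 4th axis and all file paths are assumptions to verify against your data:

```python
import numpy as np

# Assumed modality order along the 4th axis of Task01_BrainTumour files.
MODALITIES = ("flair", "t1", "t1ce", "t2")

def split_modalities(img4d: np.ndarray) -> dict:
    """Map modality name -> 3D volume, given an (H, W, D, 4) array."""
    if img4d.ndim != 4 or img4d.shape[-1] != len(MODALITIES):
        raise ValueError(f"expected (H, W, D, {len(MODALITIES)}), got {img4d.shape}")
    return {name: img4d[..., i] for i, name in enumerate(MODALITIES)}

def split_file(path: str) -> None:
    """Write one NIfTI per modality next to the input file (requires nibabel)."""
    import nibabel as nib  # imported lazily so split_modalities stays NumPy-only

    nii = nib.load(path)
    for name, vol in split_modalities(np.asanyarray(nii.dataobj)).items():
        nib.save(nib.Nifti1Image(vol, nii.affine), path.replace(".nii.gz", f"_{name}.nii.gz"))

if __name__ == "__main__":
    demo = np.zeros((4, 4, 4, 4), dtype=np.float32)  # tiny stand-in volume
    print(sorted(split_modalities(demo)))  # ['flair', 't1', 't1ce', 't2']
```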

@ulphypro
Copy link

ulphypro commented Nov 25, 2022

@diazandr3s

Thank you for answering my question.

I downloaded the BraTS 2021 dataset as you mentioned.

Should I run apps/radiology with the BraTS 2021 dataset?

After starting the MONAI Label server with 'monailabel start_server --app apps/radiology --studies datasets/Task01_BrainTumour/imagesTr --conf models segmentation' in Windows PowerShell, I can't run it from 3D Slicer.

This is because it doesn't offer a segmentation model for brain tumors.

The person who posted Project-MONAI/model-zoo#239 is also me.

@diazandr3s
Copy link
Collaborator

Hi @ulphypro,

It seems you've downloaded Task01 from the Medical Segmentation Decathlon. That dataset is composed of files that contain all four modalities in a single NIfTI file. This is precisely the issue: #1051 (comment)

I'd recommend downloading the original BraTS dataset, which has the NIfTI files separated - please check here: https://www.med.upenn.edu/cbica/brats2021/

Then you could start training a model from scratch as recommended here: https://www.youtube.com/watch?v=3HTh2dqZqew&list=PLtoSVSQ2XzyD4lc-lAacFBzOdv5Ou-9IA&index=4

I hope that helps,
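As a practical follow-up: once the per-modality files are extracted, MONAI Label's --studies folder should hold one single-modality image per subject. A sketch of assembling such a folder (all paths hypothetical, demonstrated here on a stub tree):

```python
from pathlib import Path
import shutil

def collect_flair(src_root: str, dst: str) -> list:
    """Copy one modality (FLAIR) per case into a flat studies folder."""
    out = Path(dst)
    out.mkdir(parents=True, exist_ok=True)
    copied = []
    for f in sorted(Path(src_root).glob("BraTS2021_*/*_flair.nii.gz")):
        shutil.copy(f, out / f.name)
        copied.append(f.name)
    return copied

if __name__ == "__main__":
    # Stub tree standing in for the extracted BraTS2021_Training_Data folder:
    stub = Path("BraTS2021_Training_Data/BraTS2021_00000")
    stub.mkdir(parents=True, exist_ok=True)
    (stub / "BraTS2021_00000_flair.nii.gz").touch()
    print(collect_flair("BraTS2021_Training_Data", "brats2021_flair"))
    # ['BraTS2021_00000_flair.nii.gz']
```

The resulting flat folder can then be passed as --studies to the radiology app.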

@ulphypro
Copy link

ulphypro commented Nov 29, 2022

Dear @diazandr3s

Thank you for answering my question.

I ran MONAI Label from 3D Slicer as you suggested.

I proceeded as follows.

Please see the process below.

  1. I downloaded the BraTS2021 dataset.
    (Kaggle link: https://www.kaggle.com/datasets/dschettler8845/brats-2021-task1)
    [included dataset]
    -BraTS2021_00495.tar
    ---BraTS2021_00495_flair.nii.gz
    ---BraTS2021_00495_seg.nii.gz
    ---BraTS2021_00495_t1.nii.gz
    ---BraTS2021_00495_t1ce.nii.gz
    ---BraTS2021_00495_t2.nii.gz

-BraTS2021_00621.tar
---BraTS2021_00621_flair.nii.gz
---BraTS2021_00621_seg.nii.gz
---BraTS2021_00621_t1.nii.gz
---BraTS2021_00621_t1ce.nii.gz
---BraTS2021_00621_t2.nii.gz

-BraTS2021_Training_Data.tar
--BraTS2021_00000
----BraTS2021_00000_flair.nii.gz
----BraTS2021_00000_seg.nii.gz
----BraTS2021_00000_t1.nii.gz
----BraTS2021_00000_t1ce.nii.gz
----BraTS2021_00000_t2.nii.gz
--BraTS2021_00001
----BraTS2021_00001_flair.nii.gz
----BraTS2021_00001_seg.nii.gz
----BraTS2021_00001_t1.nii.gz
----BraTS2021_00001_t1ce.nii.gz
----BraTS2021_00001_t2.nii.gz
...
...
...
...
...
...
...
--BraTS2021_01666
----BraTS2021_01666_flair.nii.gz
----BraTS2021_01666_seg.nii.gz
----BraTS2021_01666_t1.nii.gz
----BraTS2021_01666_t1ce.nii.gz
----BraTS2021_01666_t2.nii.gz

  2. I edited 'apps/radiology/lib/configs/segmentation_brats.py' as shown in the figure below.
    segmentation_brats_code_edit

  3. I started the server from Windows PowerShell:

  • monailabel start_server --app apps/radiology --studies datasets/BraTS2021_Training_Data/BraTS2021_00000 --conf models segmentation_brats
  4. After running the MONAI Label module in 3D Slicer, I pressed the 'refresh' button in the MONAI Label server options and then the 'next sample' button.
    -> but it raises the following error
    image
    [2022-11-28 19:42:42,465] [29288] [MainThread] [ERROR] (uvicorn.error:369) - Exception in ASGI application
    Traceback (most recent call last):
    File "C:\Users\AA\AppData\Local\Programs\Python\Python39\lib\site-packages\uvicorn\protocols\http\h11_impl.py", line 366, in run_asgi
    result = await app(self.scope, self.receive, self.send)
    File "C:\Users\AA\AppData\Local\Programs\Python\Python39\lib\site-packages\uvicorn\middleware\proxy_headers.py", line 75, in __call__
    return await self.app(scope, receive, send)
    File "C:\Users\AA\AppData\Local\Programs\Python\Python39\lib\site-packages\fastapi\applications.py", line 269, in __call__
    await super().__call__(scope, receive, send)
    File "C:\Users\AA\AppData\Local\Programs\Python\Python39\lib\site-packages\starlette\applications.py", line 124, in __call__
    await self.middleware_stack(scope, receive, send)
    File "C:\Users\AA\AppData\Local\Programs\Python\Python39\lib\site-packages\starlette\middleware\errors.py", line 184, in __call__
    raise exc
    File "C:\Users\AA\AppData\Local\Programs\Python\Python39\lib\site-packages\starlette\middleware\errors.py", line 162, in __call__
    await self.app(scope, receive, _send)
    File "C:\Users\AA\AppData\Local\Programs\Python\Python39\lib\site-packages\starlette\middleware\cors.py", line 84, in __call__
    await self.app(scope, receive, send)
    File "C:\Users\AA\AppData\Local\Programs\Python\Python39\lib\site-packages\starlette\exceptions.py", line 93, in __call__
    raise exc
    File "C:\Users\AA\AppData\Local\Programs\Python\Python39\lib\site-packages\starlette\exceptions.py", line 82, in __call__
    await self.app(scope, receive, sender)
    File "C:\Users\AA\AppData\Local\Programs\Python\Python39\lib\site-packages\fastapi\middleware\asyncexitstack.py", line 21, in __call__
    raise e
    File "C:\Users\AA\AppData\Local\Programs\Python\Python39\lib\site-packages\fastapi\middleware\asyncexitstack.py", line 18, in __call__
    await self.app(scope, receive, send)
    File "C:\Users\AA\AppData\Local\Programs\Python\Python39\lib\site-packages\starlette\routing.py", line 670, in __call__
    await route.handle(scope, receive, send)
    File "C:\Users\AA\AppData\Local\Programs\Python\Python39\lib\site-packages\starlette\routing.py", line 266, in handle
    await self.app(scope, receive, send)
    File "C:\Users\AA\AppData\Local\Programs\Python\Python39\lib\site-packages\starlette\routing.py", line 65, in app
    response = await func(request)
    File "C:\Users\AA\AppData\Local\Programs\Python\Python39\lib\site-packages\fastapi\routing.py", line 227, in app
    raw_response = await run_endpoint_function(
    File "C:\Users\AA\AppData\Local\Programs\Python\Python39\lib\site-packages\fastapi\routing.py", line 160, in run_endpoint_function
    return await dependant.call(**values)
    File "C:\Users\AA\AppData\Local\Programs\Python\Python39\lib\site-packages\monailabel\endpoints\infer.py", line 179, in api_run_inference
    return run_inference(background_tasks, model, image, session_id, params, file, label, output)
    File "C:\Users\AA\AppData\Local\Programs\Python\Python39\lib\site-packages\monailabel\endpoints\infer.py", line 161, in run_inference
    result = instance.infer(request)
    File "C:\Users\AA\AppData\Local\Programs\Python\Python39\lib\site-packages\monailabel\interfaces\app.py", line 299, in infer
    result_file_name, result_json = task(request)
    File "C:\Users\AA\AppData\Local\Programs\Python\Python39\lib\site-packages\monailabel\tasks\infer\basic_infer.py", line 271, in __call__
    data = self.run_inferer(data, device=device)
    File "C:\Users\AA\AppData\Local\Programs\Python\Python39\lib\site-packages\monailabel\tasks\infer\basic_infer.py", line 436, in run_inferer
    outputs = inferer(inputs, network)
    File "C:\Users\AA\AppData\Local\Programs\Python\Python39\lib\site-packages\monai\inferers\inferer.py", line 202, in __call__
    return sliding_window_inference(
    File "C:\Users\AA\AppData\Local\Programs\Python\Python39\lib\site-packages\monai\inferers\utils.py", line 180, in sliding_window_inference
    seg_prob_out = predictor(window_data, *args, **kwargs) # batched patch segmentation
    File "C:\Users\AA\AppData\Local\Programs\Python\Python39\lib\site-packages\torch\nn\modules\module.py", line 1130, in _call_impl
    return forward_call(*input, **kwargs)
    File "C:\Users\AA\AppData\Local\Programs\Python\Python39\lib\site-packages\monai\networks\nets\unet.py", line 311, in forward
    x = self.model(x)
    File "C:\Users\AA\AppData\Local\Programs\Python\Python39\lib\site-packages\torch\nn\modules\module.py", line 1130, in _call_impl
    return forward_call(*input, **kwargs)
    File "C:\Users\AA\AppData\Local\Programs\Python\Python39\lib\site-packages\torch\nn\modules\container.py", line 139, in forward
    input = module(input)
    File "C:\Users\AA\AppData\Local\Programs\Python\Python39\lib\site-packages\torch\nn\modules\module.py", line 1130, in _call_impl
    return forward_call(*input, **kwargs)
    File "C:\Users\AA\AppData\Local\Programs\Python\Python39\lib\site-packages\monai\networks\blocks\convolutions.py", line 314, in forward
    res: torch.Tensor = self.residual(x) # create the additive residual from x
    File "C:\Users\AA\AppData\Local\Programs\Python\Python39\lib\site-packages\torch\nn\modules\module.py", line 1130, in _call_impl
    return forward_call(*input, **kwargs)
    File "C:\Users\AA\AppData\Local\Programs\Python\Python39\lib\site-packages\torch\nn\modules\conv.py", line 607, in forward
    return self._conv_forward(input, self.weight, self.bias)
    File "C:\Users\AA\AppData\Local\Programs\Python\Python39\lib\site-packages\torch\nn\modules\conv.py", line 602, in _conv_forward
    return F.conv3d(
    RuntimeError: Given groups=1, weight of size [16, 4, 3, 3, 3], expected input[1, 1, 128, 128, 128] to have 4 channels, but got 1 channels instead

Question 1.
I ran the MONAI Label module following the YouTube tutorial, using the radiology app and the BraTS2021 dataset you pointed me to.
What should I do to resolve the error shown in '4.' above?
Please let me know how to fix it.

Question 2.
How many times do I need to repeat training before I can run 'Auto Segmentation'?

@diazandr3s
Collaborator

Hi @ulphypro,

A couple of things here:

  • Files with this ending _seg.nii.gz are the segmentation ground truth. To run MONAI Label, they shouldn't be in the same folder as the images.
  • Set self.number_intensity_ch = 1 -- this makes the model use a single modality
  • Remove all files within the folder apps/radiology/model
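The channel mismatch in the first traceback comes from feeding a single-modality image to a network trained on four. A minimal NumPy sketch of the input layout the bundle's UNet expects (shapes are illustrative, not the bundle's actual preprocessing):

```python
import numpy as np

# Hypothetical 3-D volumes standing in for the four BraTS modalities.
flair = np.zeros((128, 128, 128), dtype=np.float32)
t1 = np.zeros((128, 128, 128), dtype=np.float32)
t1ce = np.zeros((128, 128, 128), dtype=np.float32)
t2 = np.zeros((128, 128, 128), dtype=np.float32)

# The bundle's first conv has weight of size [16, 4, 3, 3, 3]: the second
# dimension (4) is the number of input channels, one per modality. Stacking
# the four volumes along a new leading axis produces that 4-channel layout;
# passing a single modality gives 1 channel and raises the RuntimeError above.
x = np.stack([flair, t1, t1ce, t2], axis=0)
print(x.shape)  # (4, 128, 128, 128)
```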

Hope this helps,

@ulphypro

ulphypro commented Dec 1, 2022

Dear @diazandr3s
Thank you for informing solution tip.

Your Answer) Files with this ending _seg.nii.gz are the segmentation ground truth. To run MONAI Label, they shouldn't be in the same folder as the images.
----> As you advised, I did not put the seg.nii.gz files in the dataset/BraTS2021 patient folders, as shown in the figure below.
image

-----> I edited segmentation_brats.py:
image

-------> I deleted all files in the apps/radiology/model folder:
image

Then I started the server and opened 3D Slicer again. The procedure was as follows.

1. monailabel start_server --app apps/radiology --studies datasets/BraTS2021_Training_Data/BraTS2021_00002 --conf models segmentation_brats

image

2. <Run the MONAI Label module in 3D Slicer> I pressed the 'MONAI Label server' button to connect to the server. It ran without error, but when I pressed the 'Next Sample' button, a segmentation covering everything, including regions outside the brain, was shown.

seg_nii_gz_file_result

Please let me know how to avoid seeing the green box in the 3D view of 3D Slicer when I press the 'Next Sample' button in the MONAI Label module.

@diazandr3s
Collaborator

Hi @ulphypro,

After getting the next sample, did you press the run auto-segmentation button? How long did you train the model for?

The green region looks like a prediction from a model that hasn't been trained.

BTW, the images in your main folder are different modalities of the same patient (FLAIR, T1, T1ce and T2). Are you sure you want that? You'd be training a model to recognise tumours across multiple modalities at the same time. I'd pick a single modality (FLAIR, T1ce or T2), not all of them, and use more cases/patients.

Hope this helps,

@ulphypro

ulphypro commented Dec 2, 2022

Dear @diazandr3s

I'll answer associated with your mention.

Right now I'm using one of the sample files (flair, t1, t2 or t1ce), but later I will probably use only the t2.nii.gz file for each patient.

In that case, should I put all the other t2.nii.gz files (e.g. a1_t2.nii.gz, a2_t2.nii.gz, a3_t2.nii.gz, etc.) in one folder?

I just pressed the 'Next Sample' button and the green box appeared in the 3D view. I did nothing except press 'Next Sample', and I have not trained any model.

First, I want nothing (no green box) to be shown in the 3D view when I press the 'Next Sample' button in the MONAI Label module in 3D Slicer.

Second, I then want to train a model to extract the brain tumour.
How many training repetitions are needed before 'Auto Segmentation' can extract the tumour?

Third, when I press the 'Run' button under 'Auto Segmentation', the whole tumour region should then be segmented.

That's all.

Please answer the three points above.

@diazandr3s
Collaborator

Hi @ulphypro,

Right now I'm using one of the sample files (flair, t1, t2 or t1ce), but later I will probably use only the t2.nii.gz file for each patient.

Then you should train the model on T2 only.

In that case, should I put all the other t2.nii.gz files (e.g. a1_t2.nii.gz, a2_t2.nii.gz, a3_t2.nii.gz, etc.) in one folder?

Yes, put all T2 images from all patients in the same folder.
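Collecting the T2 images into one folder can be scripted. A sketch on a synthetic stand-in for the BraTS layout (one folder per patient; all paths here are hypothetical, adapt the glob to your own tree), deliberately leaving the _seg.nii.gz ground truth out of the studies folder:

```python
import shutil
import tempfile
from pathlib import Path

# Build a fake patient tree for demonstration purposes only.
root = Path(tempfile.mkdtemp())
for pid in ("BraTS2021_00000", "BraTS2021_00002"):
    d = root / "BraTS2021_Training_Data" / pid
    d.mkdir(parents=True)
    for mod in ("flair", "t1", "t1ce", "t2", "seg"):
        (d / f"{pid}_{mod}.nii.gz").touch()

# Copy every T2 image into one folder to use as --studies; the
# *_seg.nii.gz ground-truth files do not match the pattern.
dest = root / "t2_only"
dest.mkdir()
for f in (root / "BraTS2021_Training_Data").rglob("*_t2.nii.gz"):
    shutil.copy(f, dest / f.name)

print(sorted(p.name for p in dest.iterdir()))
# ['BraTS2021_00000_t2.nii.gz', 'BraTS2021_00002_t2.nii.gz']
```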

I just pressed the 'Next Sample' button and the green box appeared in the 3D view. I did nothing except press 'Next Sample', and I have not trained any model.

That is strange; if you click Next Sample, you should only see the image. Please make sure this folder is empty: datasets/BraTS2021_Training_Data/BraTS2021_00002/labels/original

How many repetitions should the training be to extract brain tumor as 'Auto Segmentation'?

It is difficult to say. I'd suggest you train for some epochs (~100) and see how the model performs.

@bilalcodehub
Contributor

bilalcodehub commented Feb 12, 2023

MONAI Label for BraTS segmentation is giving a different error: Invalid Model(s) are provided: ['brats_mri_segmentation_v0.3.8'], either not in model zoo or not supported with MONAI Label

The server log is below:

root@turing:/opt/monai# monailabel start_server --app apps/monaibundle --studies datasets/Task01_BrainTumour/imagesTr --conf models brats_mri_segmentation_v0.3.8
Using PYTHONPATH=/opt:

2023-02-12 11:28:28,363 - USING:: version = False
2023-02-12 11:28:28,364 - USING:: app = /opt/monai/apps/monaibundle
2023-02-12 11:28:28,364 - USING:: studies = /opt/monai/datasets/Task01_BrainTumour/imagesTr
2023-02-12 11:28:28,364 - USING:: verbose = INFO
2023-02-12 11:28:28,364 - USING:: conf = [['models', 'brats_mri_segmentation_v0.3.8']]
2023-02-12 11:28:28,364 - USING:: host = 0.0.0.0
2023-02-12 11:28:28,364 - USING:: port = 8000
2023-02-12 11:28:28,364 - USING:: uvicorn_app = monailabel.app:app
2023-02-12 11:28:28,364 - USING:: ssl_keyfile = None
2023-02-12 11:28:28,364 - USING:: ssl_certfile = None
2023-02-12 11:28:28,364 - USING:: ssl_keyfile_password = None
2023-02-12 11:28:28,364 - USING:: ssl_ca_certs = None
2023-02-12 11:28:28,364 - USING:: workers = None
2023-02-12 11:28:28,364 - USING:: limit_concurrency = None
2023-02-12 11:28:28,364 - USING:: access_log = False
2023-02-12 11:28:28,364 - USING:: log_config = None
2023-02-12 11:28:28,364 - USING:: dryrun = False
2023-02-12 11:28:28,364 - USING:: action = start_server
2023-02-12 11:28:28,364 - ENV SETTINGS:: MONAI_LABEL_API_STR = 
2023-02-12 11:28:28,364 - ENV SETTINGS:: MONAI_LABEL_PROJECT_NAME = MONAILabel
2023-02-12 11:28:28,364 - ENV SETTINGS:: MONAI_LABEL_APP_DIR = 
2023-02-12 11:28:28,364 - ENV SETTINGS:: MONAI_LABEL_STUDIES = 
2023-02-12 11:28:28,364 - ENV SETTINGS:: MONAI_LABEL_AUTH_ENABLE = False
2023-02-12 11:28:28,364 - ENV SETTINGS:: MONAI_LABEL_AUTH_DB = 
2023-02-12 11:28:28,364 - ENV SETTINGS:: MONAI_LABEL_APP_CONF = '{}'
2023-02-12 11:28:28,364 - ENV SETTINGS:: MONAI_LABEL_TASKS_TRAIN = True
2023-02-12 11:28:28,364 - ENV SETTINGS:: MONAI_LABEL_TASKS_STRATEGY = True
2023-02-12 11:28:28,364 - ENV SETTINGS:: MONAI_LABEL_TASKS_SCORING = True
2023-02-12 11:28:28,364 - ENV SETTINGS:: MONAI_LABEL_TASKS_BATCH_INFER = True
2023-02-12 11:28:28,365 - ENV SETTINGS:: MONAI_LABEL_DATASTORE = cvat
2023-02-12 11:28:28,365 - ENV SETTINGS:: MONAI_LABEL_DATASTORE_URL = http://127.0.0.1:8080
2023-02-12 11:28:28,365 - ENV SETTINGS:: MONAI_LABEL_DATASTORE_USERNAME =
2023-02-12 11:28:28,365 - ENV SETTINGS:: MONAI_LABEL_DATASTORE_PASSWORD = 
2023-02-12 11:28:28,365 - ENV SETTINGS:: MONAI_LABEL_DATASTORE_API_KEY = 
2023-02-12 11:28:28,365 - ENV SETTINGS:: MONAI_LABEL_DATASTORE_CACHE_PATH = 
2023-02-12 11:28:28,365 - ENV SETTINGS:: MONAI_LABEL_DATASTORE_PROJECT = 
2023-02-12 11:28:28,365 - ENV SETTINGS:: MONAI_LABEL_DATASTORE_ASSET_PATH = 
2023-02-12 11:28:28,365 - ENV SETTINGS:: MONAI_LABEL_DATASTORE_DSA_ANNOTATION_GROUPS = 
2023-02-12 11:28:28,365 - ENV SETTINGS:: MONAI_LABEL_DICOMWEB_USERNAME = 
2023-02-12 11:28:28,365 - ENV SETTINGS:: MONAI_LABEL_DICOMWEB_PASSWORD = 
2023-02-12 11:28:28,365 - ENV SETTINGS:: MONAI_LABEL_DICOMWEB_CACHE_PATH = 
2023-02-12 11:28:28,365 - ENV SETTINGS:: MONAI_LABEL_QIDO_PREFIX = None
2023-02-12 11:28:28,365 - ENV SETTINGS:: MONAI_LABEL_WADO_PREFIX = None
2023-02-12 11:28:28,365 - ENV SETTINGS:: MONAI_LABEL_STOW_PREFIX = None
2023-02-12 11:28:28,365 - ENV SETTINGS:: MONAI_LABEL_DICOMWEB_FETCH_BY_FRAME = False
2023-02-12 11:28:28,365 - ENV SETTINGS:: MONAI_LABEL_DICOMWEB_CONVERT_TO_NIFTI = True
2023-02-12 11:28:28,365 - ENV SETTINGS:: MONAI_LABEL_DICOMWEB_SEARCH_FILTER = '{"Modality": "CT"}'
2023-02-12 11:28:28,365 - ENV SETTINGS:: MONAI_LABEL_DICOMWEB_CACHE_EXPIRY = 180
2023-02-12 11:28:28,365 - ENV SETTINGS:: MONAI_LABEL_DICOMWEB_PROXY_TIMEOUT = 30.0
2023-02-12 11:28:28,365 - ENV SETTINGS:: MONAI_LABEL_DICOMWEB_READ_TIMEOUT = 5.0
2023-02-12 11:28:28,365 - ENV SETTINGS:: MONAI_LABEL_DATASTORE_AUTO_RELOAD = True
2023-02-12 11:28:28,365 - ENV SETTINGS:: MONAI_LABEL_DATASTORE_READ_ONLY = False
2023-02-12 11:28:28,365 - ENV SETTINGS:: MONAI_LABEL_DATASTORE_FILE_EXT = '["*.nii.gz", "*.nii", "*.nrrd", "*.jpg", "*.png", "*.tif", "*.svs", "*.xml"]'
2023-02-12 11:28:28,365 - ENV SETTINGS:: MONAI_LABEL_SERVER_PORT = 8000
2023-02-12 11:28:28,365 - ENV SETTINGS:: MONAI_LABEL_CORS_ORIGINS = '[]'
2023-02-12 11:28:28,365 - ENV SETTINGS:: MONAI_LABEL_SESSIONS = True
2023-02-12 11:28:28,365 - ENV SETTINGS:: MONAI_LABEL_SESSION_PATH = 
2023-02-12 11:28:28,365 - ENV SETTINGS:: MONAI_LABEL_SESSION_EXPIRY = 3600
2023-02-12 11:28:28,365 - ENV SETTINGS:: MONAI_LABEL_INFER_CONCURRENCY = -1
2023-02-12 11:28:28,365 - ENV SETTINGS:: MONAI_LABEL_INFER_TIMEOUT = 600
2023-02-12 11:28:28,365 - ENV SETTINGS:: MONAI_LABEL_TRACKING_ENABLED = True
2023-02-12 11:28:28,365 - ENV SETTINGS:: MONAI_LABEL_TRACKING_URI = 
2023-02-12 11:28:28,365 - ENV SETTINGS:: MONAI_ZOO_SOURCE = github
2023-02-12 11:28:28,365 - ENV SETTINGS:: MONAI_ZOO_REPO = Project-MONAI/model-zoo/hosting_storage_v1
2023-02-12 11:28:28,365 - ENV SETTINGS:: MONAI_ZOO_AUTH_TOKEN =
2023-02-12 11:28:28,365 - ENV SETTINGS:: MONAI_LABEL_AUTO_UPDATE_SCORING = True
2023-02-12 11:28:28,365 - 
Allow Origins: ['*']
[2023-02-12 11:28:28,749] [3699] [MainThread] [INFO] (uvicorn.error:75) - Started server process [3699]
[2023-02-12 11:28:28,750] [3699] [MainThread] [INFO] (uvicorn.error:45) - Waiting for application startup.
[2023-02-12 11:28:28,750] [3699] [MainThread] [INFO] (monailabel.interfaces.utils.app:38) - Initializing App from: /opt/monai/apps/monaibundle; studies: /opt/monai/datasets/Task01_BrainTumour/imagesTr; conf: {'models': 'brats_mri_segmentation_v0.3.8'}
[2023-02-12 11:28:28,793] [3699] [MainThread] [INFO] (monailabel.utils.others.class_utils:37) - Subclass for MONAILabelApp Found: <class 'main.MyApp'>

---------------------------------------------------------------------------------------
Invalid Model(s) are provided: ['brats_mri_segmentation_v0.3.8'], either not in model zoo or not supported with MONAI Label
Following are the available models.  You can pass comma (,) separated names to pass multiple
Available bundle with latest tags:
   -c models
        lung_nodule_ct_detection 
        pancreas_ct_dints_segmentation 
        prostate_mri_anatomy 
        renalStructures_UNEST_segmentation 
        spleen_ct_segmentation 
        spleen_deepedit_annotation 
        swin_unetr_btcv_segmentation 
        wholeBrainSeg_Large_UNEST_segmentation
Or provide valid local bundle directories
---------------------------------------------------------------------------------------

[2023-02-12 11:28:32,170] [3699] [MainThread] [ERROR] (uvicorn.error:119) - Traceback (most recent call last):
  File "/opt/conda/lib/python3.8/site-packages/starlette/routing.py", line 635, in lifespan
    async with self.lifespan_context(app):
  File "/opt/conda/lib/python3.8/site-packages/starlette/routing.py", line 530, in __aenter__
    await self._router.startup()
  File "/opt/conda/lib/python3.8/site-packages/starlette/routing.py", line 612, in startup
    await handler()
  File "/opt/conda/lib/python3.8/site-packages/monailabel/app.py", line 106, in startup_event
    instance = app_instance()
  File "/opt/conda/lib/python3.8/site-packages/monailabel/interfaces/utils/app.py", line 51, in app_instance
    app = c(app_dir=app_dir, studies=studies, conf=conf)
  File "/opt/monai/apps/monaibundle/main.py", line 36, in __init__
    self.models = get_bundle_models(app_dir, conf)
  File "/opt/conda/lib/python3.8/site-packages/monailabel/utils/others/generic.py", line 296, in get_bundle_models
    exit(-1)
  File "/opt/conda/lib/python3.8/_sitebuiltins.py", line 26, in __call__
    raise SystemExit(code)
SystemExit: -1

[2023-02-12 11:28:32,170] [3699] [MainThread] [ERROR] (uvicorn.error:56) - Application startup failed. Exiting.
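Besides the zoo names listed in the error box, the message also accepts a valid local bundle directory. A rough sketch of what "valid" means here, under the assumption that a MONAI bundle directory carries a configs/metadata.json file (the demo directory below is hypothetical):

```python
import json
import tempfile
from pathlib import Path

def looks_like_bundle(path: Path) -> bool:
    """Heuristic check: MONAI bundles ship a configs/metadata.json."""
    return (path / "configs" / "metadata.json").is_file()

# Hypothetical local bundle layout, created just for this demonstration.
bundle = Path(tempfile.mkdtemp()) / "brats_mri_segmentation"
(bundle / "configs").mkdir(parents=True)
(bundle / "configs" / "metadata.json").write_text(json.dumps({"name": "demo"}))

print(looks_like_bundle(bundle))  # True
```

If such a directory exists, its path can be passed to --conf models instead of a zoo name.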

@EdenSehatAI

Hi,

I'm up to the training phase but am always getting this error on the 9th epoch. Why is this the case?

Val 9/50 221/250 , dice_tc: 0.7409106 , dice_wt: 0.837617 , dice_et: 0.78997755 , time 4.82s
Val 9/50 222/250 , dice_tc: 0.74006385 , dice_wt: 0.8375687 , dice_et: 0.7895737 , time 4.82s

RuntimeError Traceback (most recent call last)
in <cell line: 11>()
9 loss_epochs,
10 trains_epoch,
---> 11 ) = trainer(
12 model=model,
13 train_loader=train_loader,

5 frames
/usr/local/lib/python3.10/dist-packages/torch/_utils.py in reraise(self)
692 # instantiate since we don't know how to
693 raise RuntimeError(msg) from None
--> 694 raise exception
695
696

RuntimeError: Caught RuntimeError in DataLoader worker process 1.
Original Traceback (most recent call last):
File "/usr/local/lib/python3.10/dist-packages/nibabel/loadsave.py", line 90, in load
stat_result = os.stat(filename)
FileNotFoundError: [Errno 2] No such file or directory: '/content/drive/MyDrive/EdenSehat/BraTS2021_Training_Data/TrainingData/TrainingData/BraTS2021_00390/BraTS2021_00390_flair.nii.gz'

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
File "/usr/local/lib/python3.10/dist-packages/monai/transforms/transform.py", line 141, in apply_transform
return _apply_transform(transform, data, unpack_items, lazy, overrides, log_stats)
File "/usr/local/lib/python3.10/dist-packages/monai/transforms/transform.py", line 98, in _apply_transform
return transform(data, lazy=lazy) if isinstance(transform, LazyTrait) else transform(data)
File "/usr/local/lib/python3.10/dist-packages/monai/transforms/io/dictionary.py", line 162, in __call__
data = self.loader(d[key], reader)
File "/usr/local/lib/python3.10/dist-packages/monai/transforms/io/array.py", line 255, in __call__
img = reader.read(filename)
File "/usr/local/lib/python3.10/dist-packages/monai/data/image_reader.py", line 908, in read
img = nib.load(name, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/nibabel/loadsave.py", line 92, in load
raise FileNotFoundError(f"No such file or no access: '{filename}'")
FileNotFoundError: No such file or no access: '/content/drive/MyDrive/EdenSehat/BraTS2021_Training_Data/TrainingData/TrainingData/BraTS2021_00390/BraTS2021_00390_flair.nii.gz'

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
File "/usr/local/lib/python3.10/dist-packages/monai/transforms/transform.py", line 141, in apply_transform
return _apply_transform(transform, data, unpack_items, lazy, overrides, log_stats)
File "/usr/local/lib/python3.10/dist-packages/monai/transforms/transform.py", line 98, in _apply_transform
return transform(data, lazy=lazy) if isinstance(transform, LazyTrait) else transform(data)
File "/usr/local/lib/python3.10/dist-packages/monai/transforms/compose.py", line 335, in __call__
result = execute_compose(
File "/usr/local/lib/python3.10/dist-packages/monai/transforms/compose.py", line 111, in execute_compose
data = apply_transform(
File "/usr/local/lib/python3.10/dist-packages/monai/transforms/transform.py", line 171, in apply_transform
raise RuntimeError(f"applying transform {transform}") from e
RuntimeError: applying transform <monai.transforms.io.dictionary.LoadImaged object at 0x7c01153aa680>

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
File "/usr/local/lib/python3.10/dist-packages/torch/utils/data/_utils/worker.py", line 308, in _worker_loop
data = fetcher.fetch(index)
File "/usr/local/lib/python3.10/dist-packages/torch/utils/data/_utils/fetch.py", line 51, in fetch
data = [self.dataset[idx] for idx in possibly_batched_index]
File "/usr/local/lib/python3.10/dist-packages/torch/utils/data/_utils/fetch.py", line 51, in <listcomp>
data = [self.dataset[idx] for idx in possibly_batched_index]
File "/usr/local/lib/python3.10/dist-packages/monai/data/dataset.py", line 112, in __getitem__
return self._transform(index)
File "/usr/local/lib/python3.10/dist-packages/monai/data/dataset.py", line 98, in _transform
return apply_transform(self.transform, data_i) if self.transform is not None else data_i
File "/usr/local/lib/python3.10/dist-packages/monai/transforms/transform.py", line 171, in apply_transform
raise RuntimeError(f"applying transform {transform}") from e
RuntimeError: applying transform <monai.transforms.compose.Compose object at 0x7c01153abee0>

@diazandr3s
Collaborator

HI @EdenSehatAI,

From the logs, I see the FLAIR sequence is missing for this patient: BraTS2021_00390_flair

FileNotFoundError: [Errno 2] No such file or directory: '/content/drive/MyDrive/EdenSehat/BraTS2021_Training_Data/TrainingData/TrainingData/BraTS2021_00390/BraTS2021_00390_flair.nii.gz'

Can you make sure this file is in the folder?
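A quick way to catch this kind of problem before training is to scan the dataset for patients with missing modality files. A sketch on a synthetic layout mirroring the BraTS naming convention (paths and patient IDs here are illustrative):

```python
import tempfile
from pathlib import Path

MODALITIES = ("flair", "t1", "t1ce", "t2")

def missing_modalities(patient_dir: Path) -> list:
    """Return the modality suffixes whose .nii.gz file is absent."""
    pid = patient_dir.name
    return [m for m in MODALITIES
            if not (patient_dir / f"{pid}_{m}.nii.gz").is_file()]

# Synthetic tree: one complete patient, one lacking its FLAIR file
# (mirroring the BraTS2021_00390 case above).
root = Path(tempfile.mkdtemp())
present = {"BraTS2021_00001": MODALITIES,
           "BraTS2021_00390": ("t1", "t1ce", "t2")}
for pid, mods in present.items():
    d = root / pid
    d.mkdir()
    for m in mods:
        (d / f"{pid}_{m}.nii.gz").touch()

incomplete = {d.name: missing_modalities(d)
              for d in sorted(root.iterdir()) if missing_modalities(d)}
print(incomplete)  # {'BraTS2021_00390': ['flair']}
```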

@EdenSehatAI

image

@diazandr3s
Collaborator

Have you checked that the file is in the downloaded folder? That's what the error is about.

@TrushalGulhane

Screenshot (4989)

The segmentation looks fine in the 3D view but not in the other slice views. Is it possible to see it in the other slices?

@diazandr3s
Collaborator

Hi @TrushalGulhane,

This may be the typical orientation problem. I'm assuming you are using the BRATS files that have the 4 modalities merged into a single NIfTI. Is that correct?

In the Slicer MONAI Auto3DSeg module we also demonstrated a way of using the BRATS models: https://github.com/lassoan/SlicerMONAIAuto3DSeg

Please give it a try.
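For the merged files mentioned above, where the four modalities live in one 4-D NIfTI (as in Task01_BrainTumour), separating them back into per-modality 3-D volumes is one way to inspect each sequence on its own. A NumPy sketch with illustrative shapes, assuming the modality axis is last as nibabel typically loads these images:

```python
import numpy as np

# Stand-in for a merged Task01_BrainTumour volume:
# (H, W, D, modality) with the four sequences on the last axis.
merged = np.zeros((240, 240, 155, 4), dtype=np.float32)

# Move modalities to the front and unpack one 3-D volume per sequence.
flair, t1, t1ce, t2 = np.moveaxis(merged, -1, 0)
print(flair.shape)  # (240, 240, 155)
```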

@TrushalGulhane

Thank you @diazandr3s

Is it a dataset issue? Is there an available dataset that works with the BraTS MRI segmentation model?
Please provide a dataset link if one is available.

@diazandr3s
Collaborator

Thank you @diazandr3s

Is it a dataset issue? Is there an available dataset that works with the BraTS MRI segmentation model? Please provide a dataset link if one is available.

Hi @TrushalGulhane,

I am not aware of another dataset similar to BRATS.

As I'm sure you know, the BRATS dataset is a processed dataset composed of four co-registered/aligned MR sequences.

BTW, have you tried the MONAI Auto3DSeg module in Slicer? There you can also find the BRATS models for the different tumour types: https://github.com/lassoan/SlicerMONAIAuto3DSeg/

Just download the latest Slicer and then install Auto3DSeg via the Extension Manager.

Hope this helps,
