
Client needs to be able to send data for annotation to remote server (MONAILabel server) #154

Closed · che85 opened this issue Jun 29, 2021 · 23 comments · Labels: enhancement (New feature or request)

che85 commented Jun 29, 2021

Is your feature request related to a problem? Please describe.
MONAILabel V1 only supports server-side hosting of data.

For clinical use, we want our research assistants to be able to use MONAILabel from individual computers. A workstation with a dedicated GPU was assigned to be the remote server, receiving client requests to annotate data and returning the annotations to the client.

Describe the solution you'd like
Clients should be able to send images for automatic annotation to the MONAILabel server.

Additional context
JolleyLab/DeepHeart#1
JolleyLab/DeepHeart#2

lassoan (Collaborator) commented Jun 29, 2021

It would be great if this interface were exactly the same as the AIAA server interface. That would allow the same clients (Slicer, OHIF, MITK, ...) to use either a MONAILabel or an AIAA server.

SachidanandAlle (Collaborator) commented Jun 29, 2021

There is little difference between AIAA and MONAILabel, but MONAILabel supports APIs to upload your own image.

In your client, you can follow these steps to create similar behavior:

  1. Upload the image and get an id for further use; the /datastore API is helpful here.
  2. Run inference, passing the image id from the step above.
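A minimal sketch of these two steps with Python's requests library (the server address, the model name, the field names, and the exact endpoint shapes below are assumptions; check your server's interactive /docs page for the real signatures):

```python
import requests

SERVER = "http://127.0.0.1:8000"  # assumed MONAILabel server address

# 1. Upload an image and get an id for further use (/datastore API).
with open("case01.nii.gz", "rb") as f:
    r = requests.put(f"{SERVER}/datastore/", params={"image": "case01"},
                     files={"file": ("case01.nii.gz", f)})
r.raise_for_status()
image_id = "case01"  # id chosen at upload time

# 2. Run inference, passing the image id from the step above.
r = requests.post(f"{SERVER}/infer/segmentation", params={"image": image_id})
r.raise_for_status()
with open("result.nii.gz", "wb") as out:
    out.write(r.content)  # response layout may differ by version (raw image vs. multipart)
```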

The current MONAILabel plugin in Slicer fetches images based on active-learning strategies; that's one reason it is tagged as an active-learning solution, and one important difference from AIAA, which is an infer-only solution.

SachidanandAlle (Collaborator) commented:

Let me add a quick option to push an image to the server from the client box. This should unblock remote users and let them run inference tasks on their own samples.

SachidanandAlle self-assigned this Jun 30, 2021
SachidanandAlle added the enhancement (New feature or request) label Jun 30, 2021
che85 (Author) commented Jun 30, 2021

The ideal workflow for our use case would be:

  1. user loads image
  2. user creates annulus contour curve
  3. user creates landmarks
  4. user initiates automatic segmentation process
    a. MONAILabel-client pre-processes data (normalization, orientation, resampling)
    b. MONAILabel-client uploads image + annulus contour (currently as a label map)
    c. MONAILabel-server runs inference and returns segmentation (multi-label)
  5. user can refine segmentation

che85 (Author) commented Jun 30, 2021

How is additional information from the user transferred to the MONAILabel server? Are just the coordinates of background/foreground landmarks sent via the REST API?

SachidanandAlle (Collaborator) commented:

> How is additional information from the user transferred to the MONAILabel server? Are just the coordinates of background/foreground landmarks sent via the REST API?

You can pass anything in params. DeepGrow sends foreground/background points and uses them during the pre-transform to prepare the input for the model/network.
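For instance, such a request could look like the following sketch (the model name deepgrow_3d, the point format, and the endpoint shape are assumptions based on the description above):

```python
import json
import requests

# Foreground/background clicks passed as voxel coordinates in `params`;
# the server-side pre-transform turns them into guidance for the network.
params = {
    "foreground": [[66, 180, 105]],  # clicks inside the structure
    "background": [[66, 180, 145]],  # clicks outside the structure
}
r = requests.post(
    "http://127.0.0.1:8000/infer/deepgrow_3d",
    params={"image": "case01"},
    files={"params": (None, json.dumps(params))},
)
r.raise_for_status()
```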

lassoan (Collaborator) commented Jun 30, 2021

@SachidanandAlle #156 looks awesome, thank you for implementing it so quickly! @che85 could you test it?

For the leaflet segmentation the inputs are: an image, a closed curve, and a fiducial list with a few named points. @SachidanandAlle how do you think these could be added to the MONAILabel module GUI? A few options:

  1. Command-line interface GUI generation

Slicer supports generating a user interface for command-line executables from an XML file that lists all input and output parameters of an algorithm. The same mechanism is used in MITK, MeVisLab, MedInria, Gimias, and a number of other applications.

See SEM interface XML files in each subfolder here:
https://github.com/Slicer/Slicer/tree/master/Modules/CLI

For example, this description file:
https://github.com/Slicer/Slicer/blob/master/Modules/CLI/ACPCTransform/ACPCTransform.xml

Generates this GUI: [screenshot of the generated ACPCTransform module GUI]

Slicer also writes the selected data objects to files and passes simple parameters (Booleans, floats, etc.) in a command-line argument list. This could be adapted to send parameters via the network.

  2. Qt Designer .ui file

The MONAILabel app could send a Qt Designer .ui file that Slicer can display in a GUI section. We could implement automatic sending of GUI content to the server (e.g., if the GUI element is a slider then we would send a number; if the GUI element is a node selector then we would write that node to file and send the file); see the sketch after this list.

  3. Python code

The client could send a Python class to the MONAILabel Slicer module, containing a few methods, such as one for setting up GUI widgets and one for sending the data the user specified in the widgets to the server. Running Python code received via the network is a somewhat higher security risk than interpreting XML descriptor files, so this option may not be desirable in the long term.
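A rough sketch of what option 2's automatic GUI-to-request mapping could look like in Slicer's Python (the widget classes checked, the qMRMLNodeComboBox test, and every name below are assumptions for illustration, not an existing API):

```python
import qt
import slicer

def collect_request(ui_root, temp_dir):
    # Map each named widget in a loaded .ui panel to a request parameter.
    params, files = {}, {}
    for w in ui_root.findChildren(qt.QWidget):
        name = w.objectName
        if not name:
            continue
        if isinstance(w, (qt.QSlider, qt.QSpinBox, qt.QDoubleSpinBox)):
            params[name] = w.value              # slider/spinbox -> number
        elif isinstance(w, qt.QCheckBox):
            params[name] = bool(w.checked)      # checkbox -> boolean
        elif w.className() == "qMRMLNodeComboBox" and w.currentNode():
            path = f"{temp_dir}/{name}.nii.gz"  # node selector -> file
            slicer.util.saveNode(w.currentNode(), path)
            files[name] = path
    return params, files  # params go into the request JSON, files get uploaded
```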

SachidanandAlle (Collaborator) commented:

@che85
For the ideal use case you mentioned above, you can now push a volume to the server so that you can run an infer task (pre/inference/post transforms).

I believe you guys are experts in Slicer, so you should be able to quickly add the additional interactions you are looking for to the MONAILabel plugin, which are specific to the interaction model you are building (using an annulus contour curve and landmarks).

For a quick implementation (for Project Week), I suggest you clone the main branch and add your interaction logic there...

Once you have the required contour curve and landmarks, you can send them in params (JSON/dict) and access them in your pre-transforms. If you need to send binary data, you can make use of the label field and access it in your pre-transforms.

For example, Scribbles is one such case, where researchers from KCL added extra interactions as part of their use case:
https://github.com/Project-MONAI/MONAILabel/tree/scribbles/sample-apps/segmentation_spleen_postproc
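Concretely, sending the contour and the landmarks could look like this sketch (the model name deepheart, the landmark field names, and the label-map filename are placeholders for whatever your pre-transforms expect):

```python
import json
import requests

# Structured values (landmarks) travel in `params`; binary data (the annulus
# contour saved as a label map) travels in the `label` file field.
params = {"landmarks": {"A": [12.5, -30.1, 44.0], "P": [15.2, -28.7, 40.3]}}
with open("annulus_contour.nii.gz", "rb") as label:
    r = requests.post(
        "http://127.0.0.1:8000/infer/deepheart",
        params={"image": "case01"},
        files={
            "params": (None, json.dumps(params)),
            "label": ("annulus_contour.nii.gz", label),
        },
    )
r.raise_for_status()
```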

@lassoan we can discuss how to make it flexible for others to add/inject new interactions into the MONAILabel plugin. This is where I need your help.

Maybe we can discuss the ideas you suggested, their priority, and the importance of adding Segment Editor effect-style extensions for the MONAILabel plugin, or how to import that set of standard interactions into the plugin.

SachidanandAlle (Collaborator) commented:

> How is additional information from the user transferred to the MONAILabel server? Are just the coordinates of background/foreground landmarks sent via the REST API?

In the base version we support only auto-segmentation and DeepGrow models. The first one doesn't need any input, and the second one needs foreground/background clicks from the user.

lassoan (Collaborator) commented Jun 30, 2021

@SachidanandAlle these are very good suggestions. I would be very happy to discuss them. Would you be available to talk today or Thursday afternoon (Eastern time) on the Slicer Project Week Discord?

che85 (Author) commented Jun 30, 2021

I would be interested in joining that call as well.

aihsani (Contributor) commented Jun 30, 2021

Same here.

diazandr3s (Collaborator) commented Jun 30, 2021

Same here. Is it too late to set up a meeting for today?

SachidanandAlle (Collaborator) commented:

Can we set up a meeting then... @diazandr3s?

diazandr3s (Collaborator) commented:

> Can we set up a meeting then... @diazandr3s?

What about today at 19:00 BST? If that works for everyone, I'll send the Zoom link for the meeting.

lassoan (Collaborator) commented Jun 30, 2021

19:00 BST is OK for both Sachi and me.
For Sachi it would be better to talk right now. @diazandr3s @che85 are you available now?

diazandr3s (Collaborator) commented:

Yes, happy to chat now as well.

lassoan (Collaborator) commented Jun 30, 2021

OK, let's meet at the ai-assisted-annotation voice channel on Discord. Hopefully @che85 can make it, too.

che85 (Author) commented Jun 30, 2021

I made some good progress, but I'm getting an error on the server side:

Traceback (most recent call last):
  File "/home/herzc/.conda/envs/monailabel/lib/python3.7/site-packages/uvicorn/protocols/http/h11_impl.py", line 396, in run_asgi
    result = await app(self.scope, self.receive, self.send)
  File "/home/herzc/.conda/envs/monailabel/lib/python3.7/site-packages/uvicorn/middleware/proxy_headers.py", line 45, in __call__
    return await self.app(scope, receive, send)
  File "/home/herzc/.conda/envs/monailabel/lib/python3.7/site-packages/fastapi/applications.py", line 199, in __call__
    await super().__call__(scope, receive, send)
  File "/home/herzc/.conda/envs/monailabel/lib/python3.7/site-packages/starlette/applications.py", line 111, in __call__
    await self.middleware_stack(scope, receive, send)
  File "/home/herzc/.conda/envs/monailabel/lib/python3.7/site-packages/starlette/middleware/errors.py", line 181, in __call__
    raise exc from None
  File "/home/herzc/.conda/envs/monailabel/lib/python3.7/site-packages/starlette/middleware/errors.py", line 159, in __call__
    await self.app(scope, receive, _send)
  File "/home/herzc/.conda/envs/monailabel/lib/python3.7/site-packages/starlette/exceptions.py", line 82, in __call__
    raise exc from None
  File "/home/herzc/.conda/envs/monailabel/lib/python3.7/site-packages/starlette/exceptions.py", line 71, in __call__
    await self.app(scope, receive, sender)
  File "/home/herzc/.conda/envs/monailabel/lib/python3.7/site-packages/starlette/routing.py", line 566, in __call__
    await route.handle(scope, receive, send)
  File "/home/herzc/.conda/envs/monailabel/lib/python3.7/site-packages/starlette/routing.py", line 227, in handle
    await self.app(scope, receive, send)
  File "/home/herzc/.conda/envs/monailabel/lib/python3.7/site-packages/starlette/routing.py", line 41, in app
    response = await func(request)
  File "/home/herzc/.conda/envs/monailabel/lib/python3.7/site-packages/fastapi/routing.py", line 202, in app
    dependant=dependant, values=values, is_coroutine=is_coroutine
  File "/home/herzc/.conda/envs/monailabel/lib/python3.7/site-packages/fastapi/routing.py", line 148, in run_endpoint_function
    return await dependant.call(**values)
  File "/home/herzc/sources/MONAILabel/monailabel/endpoints/infer.py", line 129, in run_inference
    result = instance.infer(request)
  File "/home/herzc/sources/MONAILabel/monailabel/interfaces/app.py", line 124, in infer
    result_file_name, result_json = task(request)
  File "/home/herzc/sources/MONAILabel/monailabel/interfaces/tasks/infer.py", line 210, in __call__
    result_file_name, result_json = self.writer(data)
  File "/home/herzc/sources/MONAILabel/monailabel/interfaces/tasks/infer.py", line 337, in writer
    return writer(data)
  File "/home/herzc/sources/MONAILabel/monailabel/utils/others/writer.py", line 59, in __call__
    result_image = itk.image_from_array(image_np)
  File "/home/herzc/.conda/envs/monailabel/lib/python3.7/site-packages/itkExtras.py", line 292, in GetImageFromArray
    return _GetImageFromArray(arr, "GetImageFromArray", is_vector)
  File "/home/herzc/.conda/envs/monailabel/lib/python3.7/site-packages/itkExtras.py", line 272, in _GetImageFromArray
    ImageType = itk.Image[PixelType, Dimension]
  File "/home/herzc/.conda/envs/monailabel/lib/python3.7/site-packages/itkTemplate.py", line 342, in __getitem__
    raise TemplateTypeError(self, tuple(cleanParameters))
itkTemplate.TemplateTypeError: itk.Image is not wrapped for input type `itk.F, int`.
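For reference, this TemplateTypeError means ITK has no wrapped itk.Image type for the array's (pixel type, dimension) combination, e.g. a float array with an extra channel axis. A possible workaround (an assumption about this particular failure, not a confirmed fix) is to squeeze and cast the array before conversion:

```python
import itk
import numpy as np

# Stand-in for `image_np` from writer.py above: a float result with a
# singleton channel axis, a combination ITK typically has not wrapped.
image_np = np.zeros((1, 64, 64, 64), dtype=np.float32)

# Drop the singleton axis and cast to a wrapped pixel type (uint8 is
# enough for label maps) before handing the array to ITK.
result_image = itk.image_from_array(np.squeeze(image_np).astype(np.uint8))
print(result_image.GetLargestPossibleRegion().GetSize())
```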

che85 (Author) commented Jun 30, 2021

Okay, it seems like I fixed that. It's already looking pretty good, but data is only written to the first segment of the returned segmentation. It seems like only the background label (the non-valve region) was returned.

[screenshot: segmentation result with only the first segment filled]

che85 (Author) commented Jun 30, 2021

yay!

[screenshot: successful segmentation result]

mattjolley commented:

@SachidanandAlle @diazandr3s @che85 this is really awesome.

Thank you!

SachidanandAlle (Collaborator) commented:

For now, I believe we can close this issue. Feel free to reopen if needed.
