Back to Projects List
- Andrey Fedorov (Brigham and Women’s Hospital, USA)
- Marco Nolden (German Cancer Research Center, Germany)
- Hans Meine (Fraunhofer MEVIS, Germany)
- Klaus Kades (German Cancer Research Center, Germany)
Kaapana is a Kubernetes-based open-source toolkit for platform provisioning in the field of medical data analysis. Kaapana leverages a number of open-source tools relevant to the NA-MIC community (specifically, the OHIF Viewer, MITK, and the nnU-Net segmentation tools) and relies on DICOM for managing images, image-derived data, and metadata.
In this project, current, prospective, and aspiring users of Kaapana will have the opportunity to work with the developers of the platform: to get help with deploying and using it, and to discuss potential problems and directions for future development and collaboration.
- Deploy the latest version of the platform locally and on GCP.
- Discuss specific topics of interest.
- Document results of discussion, share any code developed in the process.
- Deploy Kaapana on Andrey's Linux laptop.
- Deploy Kaapana on a GCP VM.
- Establish shared GCP project for collaboration.
- Discuss specific topics of interest as summarized below, and document the main points of the discussion (below, "I" refers to Andrey Fedorov).
Improved Slicer integration : we have already added a Slicer app to Kaapana following the example of MITK (see https://github.com/fedorov/kaapana/tree/0.1.2-november-slicer). However, communication into and out of the app is quite clunky. Specifically, we have not figured out how to select cases from the dashboard and open them directly in Slicer. We would also like to have a workflow that writes DICOM segmentations etc. back into the DICOM server. Related to Integration of Desktop Apps.
Integration with GCP Healthcare DICOM stores : right now we use dcm4chee as the DICOM server. This is problematic when deploying Kaapana in the cloud, since 1) it wastes resources: our data is already in storage buckets, yet we must replicate those files on an attached disk (and attached storage is very expensive) and then import them into dcm4chee, which is very slow and does not work for all types of DICOM objects (SRs are rejected); 2) I am not sure dcm4chee is scalable. We can very easily set up a DICOM store under GCP Healthcare, which is cheaper, faster, highly scalable, and accessible through the standard DICOMweb interface with authentication. It would be extremely helpful to be able to use that GCP DICOM store in place of dcm4chee. Related to Connecting/Using Kaapana to Google Cloud/Google Health/Google FHIR.
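As a minimal sketch of what "standard DICOMweb interface with authentication" looks like against a GCP Healthcare DICOM store: the code below builds the documented DICOMweb base path and a QIDO-RS study search URL. The project, location, dataset, and store names are placeholders, and the commented-out request part assumes the `google-auth` library is installed with application-default credentials available.

```python
# Sketch: addressing a GCP Healthcare API DICOM store via DICOMweb (QIDO-RS).
# Project/location/dataset/store names are hypothetical placeholders.

def dicomweb_base_url(project, location, dataset, store):
    """Build the DICOMweb base path for a GCP Healthcare API DICOM store."""
    return (
        "https://healthcare.googleapis.com/v1/"
        f"projects/{project}/locations/{location}/"
        f"datasets/{dataset}/dicomStores/{store}/dicomWeb"
    )

def qido_studies_url(base, modality=None):
    """QIDO-RS study search URL, optionally filtered by ModalitiesInStudy."""
    url = base + "/studies"
    if modality:
        url += f"?ModalitiesInStudy={modality}"
    return url

base = dicomweb_base_url("my-project", "us-central1", "my-dataset", "my-store")
print(qido_studies_url(base, modality="SEG"))

# The actual search request would go through an authorized session, e.g.:
#   import google.auth
#   from google.auth.transport.requests import AuthorizedSession
#   creds, _ = google.auth.default()
#   session = AuthorizedSession(creds)
#   studies = session.get(qido_studies_url(base)).json()
```

Since this is plain DICOMweb, any client that can attach an OAuth bearer token could in principle be pointed at such a store, which is the attraction over a self-hosted dcm4chee.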
Integration with IDC : all of the IDC data is available from public GCP buckets, and egress is free. All you need is the Google Cloud SDK (https://cloud.google.com/sdk) installed; to search the data, one also needs a GCP project and credentials. Maybe we can discuss this. Related to Data and model exchange across different sources.
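As a rough sketch of the search step mentioned above: IDC publishes a BigQuery index of its DICOM metadata, which can be queried (with a GCP project and credentials) to obtain the GCS URLs of matching files. The table name `bigquery-public-data.idc_current.dicom_all`, the column names, and the example collection ID are assumptions based on IDC documentation, not something confirmed in this project.

```python
# Sketch: composing a query against the (assumed) IDC BigQuery metadata index
# to list GCS file URLs for one collection/modality.

def idc_series_query(collection_id, modality):
    """SQL to list GCS URLs of files in one IDC collection, one modality."""
    return (
        "SELECT DISTINCT gcs_url\n"
        "FROM `bigquery-public-data.idc_current.dicom_all`\n"
        f"WHERE collection_id = '{collection_id}' AND Modality = '{modality}'"
    )

# Hypothetical example collection ID:
query = idc_series_query("qin_prostate_repeatability", "MR")
print(query)

# With credentials in place, the query would run via google.cloud.bigquery,
# and the resulting files download with the SDK, e.g.:
#   gsutil cp <gcs_url> ./data/
```

The free-egress buckets mean the download step itself needs nothing beyond `gsutil`; only the metadata search requires a billed GCP project.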
Integration of new analysis tools into Kaapana : we have been developing use cases that utilize publicly available AI tools, starting from DICOM images and producing DICOM output; see some here: https://app.modelhub.ai/. It would be good to go over the process of adding one of those to Kaapana as an experiment, so I can understand the process. We could also use the prostate cancer segmentation model from the MONAI zoo that we are going to investigate in this project: https://github.com/NA-MIC/ProjectWeek/pull/486/files#diff-1b4e320dd5db1df87192959dee521ff75d94129c1b97ede523d6b740271191b7R3. Related to Data and model exchange across different sources. Related questions:
- how to debug failures? e.g., see this as an example
Running Kaapana on Google Kubernetes Engine : while using GCP, we have been following an extremely naive and inefficient approach to deploying Kaapana: we allocate a fixed Linux VM and install on it as if it were an on-prem server. As I understand it, to fully leverage the power of k8s, it would make a lot more sense to use Google Kubernetes Engine. My knowledge of k8s and microk8s is very close to zero, so maybe this is something that is trivial. Maybe we could experiment with this together. We can even set up a shared GCP project where I can add you, so you can experiment directly. Related to Connecting/Using Kaapana to Google Cloud/Google Health/Google FHIR.
DICOM Dashboard Setup : having a dashboard that summarizes a data collection in a meaningful way is a recurring theme, also outside of Kaapana. We would like to investigate to what degree the requirements of common use cases (such as AI annotation, cohort definition, AI model training, and automatic quality assurance) are already met, and if they are not, how extensible the existing dashboard is. Furthermore, it would be interesting to assess whether such a dashboard can be shared with other projects (IDC, Grand Challenge), and whether that really makes sense in practice. Related to Fast viewing and tagging of DICOM Images.
Maintenance of a Kaapana instance : discuss the process of checking for security vulnerabilities, notifying the developers of identified vulnerabilities, and communicating the need to update to users; look into whether the scanning features available in GCP could be helpful.
- Andrey is working on setting up Kaapana on the Linux laptop he plans to bring along.
- Andrey is setting up a GCP project to share with the Kaapana developers for experimentation.
- Hans has access to some(?) Kaapana installation at MEVIS (from the RACOON project).