This repository contains code for replicating the results of the paper "U-Net: Convolutional Networks for Biomedical Image Segmentation" ([arXiv:1505.04597](https://arxiv.org/abs/1505.04597)).
Biomedical image segmentation plays a crucial role in various medical applications, such as disease diagnosis and treatment planning. The U-Net architecture is a widely recognized deep learning model for biomedical image segmentation. This repository provides the code necessary to replicate the results and explore the U-Net architecture for biomedical image segmentation.
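To make the architecture concrete, below is a minimal, single-level PyTorch sketch of U-Net's encoder/decoder structure with a skip connection. This is an illustration only, not the code in this repository: the class name and channel sizes are placeholders, and padded convolutions are used for simplicity, whereas the original paper uses unpadded convolutions and crops the skip connections.

```python
import torch
import torch.nn as nn

def double_conv(in_ch, out_ch):
    # Two 3x3 convolutions, each followed by ReLU -- the repeated
    # building block on both the contracting and expansive paths.
    return nn.Sequential(
        nn.Conv2d(in_ch, out_ch, kernel_size=3, padding=1),
        nn.ReLU(inplace=True),
        nn.Conv2d(out_ch, out_ch, kernel_size=3, padding=1),
        nn.ReLU(inplace=True),
    )

class TinyUNet(nn.Module):
    """A one-level U-Net: one downsampling step, one upsampling step,
    and a skip connection concatenating encoder and decoder features."""
    def __init__(self, in_ch=1, num_classes=2):
        super().__init__()
        self.enc = double_conv(in_ch, 64)
        self.down = nn.MaxPool2d(2)
        self.bottleneck = double_conv(64, 128)
        self.up = nn.ConvTranspose2d(128, 64, kernel_size=2, stride=2)
        self.dec = double_conv(128, 64)   # 128 = 64 (skip) + 64 (upsampled)
        self.head = nn.Conv2d(64, num_classes, kernel_size=1)

    def forward(self, x):
        skip = self.enc(x)
        x = self.bottleneck(self.down(skip))
        x = self.up(x)
        x = self.dec(torch.cat([skip, x], dim=1))  # skip connection
        return self.head(x)
```

The full U-Net repeats this pattern over several resolution levels, doubling the channel count at each downsampling step.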
You can install the required dependencies using pip:

```bash
pip install -r requirements.txt
```
- Training
  - Refer to the `Usage.ipynb` notebook for a step-by-step guide on how to train the model; a minimal training sketch follows this list.
  - You can customize training configurations, such as batch size, learning rate, and more, within the notebook.
- Experimentation: If you want to experiment with different configurations, you can modify the settings directly in the notebook during training.
- Evaluation
  - Use the `Usage.ipynb` notebook to visualize and evaluate the results; an evaluation sketch follows the training sketch below.
  - You can customize evaluation options within the notebook.
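The training configuration mentioned above might translate into a loop like the following. This is a hedged sketch, not the notebook's actual code: it assumes the `TinyUNet` toy model sketched earlier is in scope, and the random tensors, batch size, learning rate, and epoch count are illustrative placeholders for the real dataset and hyperparameters.

```python
import torch
from torch.utils.data import DataLoader, TensorDataset

# Placeholder data: replace with the real images and segmentation masks.
images = torch.randn(16, 1, 64, 64)
masks = torch.randint(0, 2, (16, 64, 64))
loader = DataLoader(TensorDataset(images, masks), batch_size=4, shuffle=True)

model = TinyUNet(in_ch=1, num_classes=2)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)  # illustrative LR
criterion = torch.nn.CrossEntropyLoss()  # per-pixel classification loss

for epoch in range(5):
    for x, y in loader:
        optimizer.zero_grad()
        logits = model(x)            # (N, num_classes, H, W)
        loss = criterion(logits, y)  # y: (N, H, W) integer class labels
        loss.backward()
        optimizer.step()
    print(f"epoch {epoch}: loss={loss.item():.4f}")
```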
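For evaluation, a common metric for segmentation is the Dice coefficient; a minimal sketch, continuing from the training sketch above and assuming binary masks:

```python
import torch

def dice_coefficient(pred, target, eps=1e-7):
    # pred, target: binary tensors of shape (N, H, W).
    # Dice = 2|A ∩ B| / (|A| + |B|); eps guards against empty masks.
    pred = pred.float().flatten(1)
    target = target.float().flatten(1)
    intersection = (pred * target).sum(dim=1)
    return ((2 * intersection + eps) /
            (pred.sum(dim=1) + target.sum(dim=1) + eps)).mean()

model.eval()
with torch.no_grad():
    logits = model(images)         # model, images, masks from the sketch above
    preds = logits.argmax(dim=1)   # (N, H, W) predicted class labels
    print(f"Dice: {dice_coefficient(preds, masks):.3f}")
```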
By following the instructions in the `Usage.ipynb` notebook, you can easily replicate the experiments, visualize the results, and conduct further experiments with different settings.
For more detailed instructions and examples, please refer to the notebook itself.
If you use this code or replicate the results, please consider citing the original paper:
```bibtex
@inproceedings{ronneberger2015unet,
  title={U-Net: Convolutional Networks for Biomedical Image Segmentation},
  author={Ronneberger, Olaf and Fischer, Philipp and Brox, Thomas},
  booktitle={Medical Image Computing and Computer-Assisted Intervention (MICCAI)},
  year={2015}
}
```
We extend our thanks to the authors of the original paper, "U-Net: Convolutional Networks for Biomedical Image Segmentation," for their valuable research and inspiration.
- Implementing original U-Net from scratch using PyTorch
- PyTorch Image Segmentation Tutorial with U-NET: everything from scratch baby
This project is licensed under the MIT License - see the LICENSE file for details.