
Commit

Update README with vSHARP
georgeyiasemis committed Mar 25, 2024
1 parent 68eebda commit 131dbc7
Showing 1 changed file with 18 additions and 15 deletions.
README.rst: 33 changes (18 additions, 15 deletions)
@@ -25,8 +25,11 @@
DIRECT: Deep Image REConstruction Toolkit
=========================================

-``DIRECT`` is a Python, end-to-end pipeline for solving Inverse Problems emerging in Imaging Processing. It is built with PyTorch and stores state-of-the-art Deep Learning imaging inverse problem solvers such as denoising, dealiasing and reconstruction. By defining a base forward linear or non-linear operator, ``DIRECT`` can be used for training models for recovering images such as MRIs from partially observed or noisy input data.
-``DIRECT`` stores inverse problem solvers such as the Learned Primal Dual algorithm, Recurrent Inference Machine and Recurrent Variational Network, which were part of the winning solution in Facebook & NYUs FastMRI challenge in 2019 and the Calgary-Campinas MRI reconstruction challenge at MIDL 2020. For a full list of the baselines currently implemented in DIRECT see `here <#baselines-and-trained-models>`_.
+``DIRECT`` is a Python, end-to-end pipeline for solving Inverse Problems emerging in Image Processing.
+It is built with PyTorch and stores state-of-the-art Deep Learning solvers for imaging inverse problems such as denoising, dealiasing and reconstruction.
+By defining a base forward linear or non-linear operator, ``DIRECT`` can be used for training models for recovering images such as MRIs from partially observed or noisy input data.
+``DIRECT`` stores inverse problem solvers such as vSHARP, the Learned Primal Dual algorithm, the Recurrent Inference Machine and the Recurrent Variational Network, which were part of the winning solutions in Facebook & NYU's fastMRI challenge in 2019, the Calgary-Campinas MRI reconstruction challenge at MIDL 2020 and the CMRxRecon challenge in 2023.
+For a full list of the baselines currently implemented in DIRECT see `here <#baselines-and-trained-models>`_.

.. raw:: html

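As context for the description above, the sketch below shows what a base forward operator looks like for accelerated MRI: a Fourier transform followed by a sampling mask. It is a minimal, hypothetical example in plain PyTorch, not DIRECT's actual operator API; a reconstruction model is then trained to invert this mapping from partially observed k-space.

.. code-block:: python

    # Hypothetical illustration of a forward operator for accelerated MRI.
    # This is NOT DIRECT's operator API, only the measurement model y = M F(x).
    import torch


    def forward_operator(image: torch.Tensor, mask: torch.Tensor) -> torch.Tensor:
        """Fourier transform the image, then apply the k-space sampling mask."""
        kspace = torch.fft.fft2(image, norm="ortho")
        return mask * kspace


    def adjoint_operator(masked_kspace: torch.Tensor, mask: torch.Tensor) -> torch.Tensor:
        """Back-project masked k-space to the image domain (adjoint of the above)."""
        return torch.fft.ifft2(mask * masked_kspace, norm="ortho")


    # Toy data: a model would learn to recover `image` from `masked_kspace` and `mask`.
    image = torch.randn(1, 320, 320, dtype=torch.complex64)
    mask = (torch.rand(1, 1, 320) < 0.25).to(torch.complex64)  # hypothetical 1D line mask
    masked_kspace = forward_operator(image, mask)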
@@ -49,7 +52,7 @@ In the `projects <https://github.com/NKI-AI/direct/tree/main/projects>`_ folder
Baselines and trained models
----------------------------

-We provide a set of baseline results and trained models in the `DIRECT Model Zoo <https://docs.aiforoncology.nl/direct/model_zoo.html>`_. Baselines and trained models include the `Recurrent Variational Network (RecurrentVarNet) <https://arxiv.org/abs/2111.09639>`_, the `Recurrent Inference Machine (RIM) <https://www.sciencedirect.com/science/article/abs/pii/S1361841518306078>`_, the `End-to-end Variational Network (VarNet) <https://arxiv.org/pdf/2004.06688.pdf>`_, the `Learned Primal Dual Network (LDPNet) <https://arxiv.org/abs/1707.06474>`_, the `X-Primal Dual Network (XPDNet) <https://arxiv.org/abs/2010.07290>`_, the `KIKI-Net <https://pubmed.ncbi.nlm.nih.gov/29624729/>`_, the `U-Net <https://arxiv.org/abs/1811.08839>`_, the `Joint-ICNet <https://openaccess.thecvf.com/content/CVPR2021/papers/Jun_Joint_Deep_Model-Based_MR_Image_and_Coil_Sensitivity_Reconstruction_Network_CVPR_2021_paper.pdf>`_, and the `AIRS Medical fastmri model (MultiDomainNet) <https://arxiv.org/pdf/2012.06318.pdf>`_.
+We provide a set of baseline results and trained models in the `DIRECT Model Zoo <https://docs.aiforoncology.nl/direct/model_zoo.html>`_. Baselines and trained models include the `vSHARP <https://arxiv.org/abs/2309.09954>`_, the `Recurrent Variational Network (RecurrentVarNet) <https://arxiv.org/abs/2111.09639>`_, the `Recurrent Inference Machine (RIM) <https://www.sciencedirect.com/science/article/abs/pii/S1361841518306078>`_, the `End-to-end Variational Network (VarNet) <https://arxiv.org/pdf/2004.06688.pdf>`_, the `Learned Primal Dual Network (LPDNet) <https://arxiv.org/abs/1707.06474>`_, the `X-Primal Dual Network (XPDNet) <https://arxiv.org/abs/2010.07290>`_, the `KIKI-Net <https://pubmed.ncbi.nlm.nih.gov/29624729/>`_, the `U-Net <https://arxiv.org/abs/1811.08839>`_, the `Joint-ICNet <https://openaccess.thecvf.com/content/CVPR2021/papers/Jun_Joint_Deep_Model-Based_MR_Image_and_Coil_Sensitivity_Reconstruction_Network_CVPR_2021_paper.pdf>`_, and the `AIRS Medical fastMRI model (MultiDomainNet) <https://arxiv.org/pdf/2012.06318.pdf>`_.

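Most of the baselines listed above follow the same unrolled, physics-guided pattern: they alternate a data-consistency step built from the forward operator with a learned refinement step, and differ mainly in how those two steps are parameterized (e.g. recurrent units, primal-dual splitting, or half-quadratic variable splitting in vSHARP). The snippet below is a deliberately simplified, hypothetical illustration of that pattern; it is not the implementation of any DIRECT model.

.. code-block:: python

    # Hypothetical toy unrolled solver -- NOT vSHARP, LPDNet or any other DIRECT baseline.
    import torch
    from torch import nn


    class ToyUnrolledSolver(nn.Module):
        def __init__(self, num_steps: int = 8):
            super().__init__()
            self.num_steps = num_steps
            self.step_size = nn.Parameter(torch.tensor(0.1))
            # One small refinement CNN per unrolled step (real/imaginary channels).
            self.refine = nn.ModuleList(
                [nn.Conv2d(2, 2, kernel_size=3, padding=1) for _ in range(num_steps)]
            )

        def forward(self, masked_kspace: torch.Tensor, mask: torch.Tensor) -> torch.Tensor:
            x = torch.fft.ifft2(masked_kspace, norm="ortho")  # zero-filled initial guess
            for k in range(self.num_steps):
                # Data consistency: gradient of ||M F(x) - y||^2 mapped back to image space.
                residual = mask * torch.fft.fft2(x, norm="ortho") - masked_kspace
                x = x - self.step_size * torch.fft.ifft2(residual, norm="ortho")
                # Learned refinement on a 2-channel (real/imaginary) view of x.
                x_ri = torch.view_as_real(x).permute(0, 3, 1, 2)
                x_ri = x_ri + self.refine[k](x_ri)
                x = torch.view_as_complex(x_ri.permute(0, 2, 3, 1).contiguous())
            return x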
License and usage
-----------------
@@ -63,15 +66,15 @@ If you use DIRECT in your own research, or want to refer to baseline results pub

.. code-block:: BibTeX
-    @misc{DIRECTTOOLKIT,
-      doi = {10.21105/joss.04278},
-      url = {https://doi.org/10.21105/joss.04278},
-      year = {2022},
-      publisher = {The Open Journal},
-      volume = {7},
-      number = {73},
-      pages = {4278},
-      author = {George Yiasemis and Nikita Moriakov and Dimitrios Karkalousos and Matthan Caan and Jonas Teuwen},
-      title = {DIRECT: Deep Image REConstruction Toolkit},
-      journal = {Journal of Open Source Software}
-    }
+    @article{DIRECTTOOLKIT,
+      doi = {10.21105/joss.04278},
+      url = {https://doi.org/10.21105/joss.04278},
+      year = {2022},
+      publisher = {The Open Journal},
+      volume = {7},
+      number = {73},
+      pages = {4278},
+      author = {George Yiasemis and Nikita Moriakov and Dimitrios Karkalousos and Matthan Caan and Jonas Teuwen},
+      title = {DIRECT: Deep Image REConstruction Toolkit},
+      journal = {Journal of Open Source Software}
+    }
