Fix FUGUE bug (#769)
* Add FUGUEvsm2ANTSwarp from niworkflows.

* Add maternal_brain_project test.

* Add test to CircleCI config.

* Run isort.

* Add Forrest Gump test.

* Update expected outputs.

* Drop boilerplate option.

* Add hmc-model to multishell test.

* Fix typo.

* Update test_cli.py

* Trigger new tests.

* Add omp-nthreads.

* Update test_cli.py

* Update test_cli.py

* Add optional outputs for pyafq_recon_external_trk.

* Try using more reduced data.

* Set nthreads to 1.

* Update test_cli.py

* Set resource class to xlarge.

* Update outputs

* [FIX] Fix the mif2fib workflow (#778)

* Update pyafq_recon_full_outputs.txt

* Update pyafq_recon_full_optional_outputs.txt

---------

Co-authored-by: Matt Cieslak <[email protected]>
tsalo and mattcieslak authored Jul 12, 2024
1 parent db15b6d commit dfd0ab2
Showing 13 changed files with 524 additions and 98 deletions.
74 changes: 74 additions & 0 deletions .circleci/config.yml
@@ -348,6 +348,56 @@ jobs:
      - store_artifacts:
          path: /src/qsiprep/.circleci/out/drbuddi_rpe/

  maternal_brain_project:
    resource_class: xlarge
    environment:
      CIRCLE_CPUS: 4
    <<: *dockersetup
    steps:
      - checkout
      - run: *runinstall
      - run:
          name: Run QSIPrep on multi-shell dataset with GRE field maps
          no_output_timeout: 3h
          command: |
            pytest -rP -o log_cli=true -m "maternal_brain_project" --cov-config=/src/qsiprep/pyproject.toml --cov-append --cov-report term-missing --cov=qsiprep --data_dir=/src/qsiprep/.circleci/data --output_dir=/src/qsiprep/.circleci/out --working_dir=/src/qsiprep/.circleci/work qsiprep
            mkdir /src/coverage
            mv /src/qsiprep/.coverage /src/coverage/.coverage.maternal_brain_project
            # remove nifti files before uploading artifacts
            find /src/qsiprep/.circleci/out/ -name "*.nii.gz" -type f -delete
            find /src/qsiprep/.circleci/out/ -name "*.fib.gz" -type f -delete
      - persist_to_workspace:
          root: /src/coverage/
          paths:
            - .coverage.maternal_brain_project
      - store_artifacts:
          path: /src/qsiprep/.circleci/out/maternal_brain_project/

  forrest_gump:
    resource_class: xlarge
    environment:
      CIRCLE_CPUS: 4
    <<: *dockersetup
    steps:
      - checkout
      - run: *runinstall
      - run:
          name: Run QSIPrep on single-shell dataset with GRE field maps
          no_output_timeout: 3h
          command: |
            pytest -rP -o log_cli=true -m "forrest_gump" --cov-config=/src/qsiprep/pyproject.toml --cov-append --cov-report term-missing --cov=qsiprep --data_dir=/src/qsiprep/.circleci/data --output_dir=/src/qsiprep/.circleci/out --working_dir=/src/qsiprep/.circleci/work qsiprep
            mkdir /src/coverage
            mv /src/qsiprep/.coverage /src/coverage/.coverage.forrest_gump
            # remove nifti files before uploading artifacts
            find /src/qsiprep/.circleci/out/ -name "*.nii.gz" -type f -delete
            find /src/qsiprep/.circleci/out/ -name "*.fib.gz" -type f -delete
      - persist_to_workspace:
          root: /src/coverage/
          paths:
            - .coverage.forrest_gump
      - store_artifacts:
          path: /src/qsiprep/.circleci/out/forrest_gump/

  IntramodalTemplate:
    resource_class: large
    <<: *dockersetup
@@ -855,6 +905,26 @@ workflows:
            tags:
              only: /.*/

      - maternal_brain_project:
          requires:
            - build
          filters:
            branches:
              ignore:
                - /recon\/.*/
            tags:
              only: /.*/

      - forrest_gump:
          requires:
            - build
          filters:
            branches:
              ignore:
                - /recon\/.*/
            tags:
              only: /.*/

      - IntramodalTemplate:
          requires:
            - build
@@ -961,6 +1031,8 @@
            - DRBUDDI_SHORELine_EPI
            - DRBUDDI_eddy_rpe_series
            - DRBUDDI_TENSORLine_EPI
            - maternal_brain_project
            - forrest_gump
            - IntramodalTemplate
            - MultiT1w
            - Recon_3Tissue_Singleshell_ACT
@@ -991,6 +1063,8 @@
            - DRBUDDI_SHORELine_EPI
            - DRBUDDI_eddy_rpe_series
            - DRBUDDI_TENSORLine_EPI
            - maternal_brain_project
            - forrest_gump
            - IntramodalTemplate
            - MultiT1w
            - Recon_3Tissue_Singleshell_ACT
2 changes: 2 additions & 0 deletions pyproject.toml
@@ -191,6 +191,8 @@ markers = [
"mrtrix3_recon: test 20",
"tortoise_recon: test 21",
"multi_t1w: test 22",
"maternal_brain_project: multi-shell with GRE field map",
"forrest_gump: single-shell with GRE field map",
]
env = [
"RUNNING_PYTEST = 1",
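
These markers are what the new CircleCI jobs select via pytest -m. A minimal sketch of how a marked test might be declared (the test body below is a placeholder, not the actual code in qsiprep/tests/test_cli.py):

import pytest


@pytest.mark.maternal_brain_project
def test_maternal_brain_project():
    """Placeholder showing how the marker is attached.

    The real integration test in test_cli.py runs the full QSIPrep workflow
    on the multi-shell GRE-fieldmap dataset.
    """
    # CI selects this test with: pytest -m "maternal_brain_project" ...
    assert True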
196 changes: 193 additions & 3 deletions qsiprep/interfaces/niworkflows.py
@@ -8,8 +8,11 @@
"""

from __future__ import absolute_import, division, print_function, unicode_literals

from mimetypes import guess_type

import matplotlib.pyplot as plt
import nibabel as nb
import numpy as np
@@ -18,8 +21,15 @@
from nipype import logging
from nipype.interfaces import ants
from nipype.interfaces.ants import Registration
from nipype.interfaces.base import traits
from nipype.interfaces.base import (
    BaseInterfaceInputSpec,
    File,
    SimpleInterface,
    TraitedSpec,
    traits,
)
from nipype.interfaces.mixins import reporting
from nipype.utils.filemanip import fname_presuffix
from niworkflows.interfaces.norm import (
    SpatialNormalization,
    _SpatialNormalizationInputSpec,
@@ -112,7 +122,9 @@ def plot(self, figure=None):
            grid_id += 1

        plot_sliceqc(
            self.qc_data["slice_scores"].T, self.qc_data["slice_counts"], subplot=grid[-1]
            self.qc_data["slice_scores"].T,
            self.qc_data["slice_counts"],
            subplot=grid[-1],
        )
        return figure

@@ -231,7 +243,6 @@ def confoundplot(
    cutoff=None,
    ylims=None,
):

    # Define TR and number of frames
    notr = False
    if tr is None:
@@ -439,3 +450,182 @@ def _post_run_hook(self, runtime):
        )

        return super(RobustMNINormalizationRPT, self)._post_run_hook(runtime)


class FUGUEvsm2ANTSwarpInputSpec(BaseInterfaceInputSpec):
    in_file = File(exists=True, mandatory=True, desc="input displacements field map")
    pe_dir = traits.Enum("i", "i-", "j", "j-", "k", "k-", desc="phase-encoding axis")


class FUGUEvsm2ANTSwarpOutputSpec(TraitedSpec):
    out_file = File(desc="the output warp field")


class FUGUEvsm2ANTSwarp(SimpleInterface):
    """Convert a voxel-shift-map to ants warp."""

    input_spec = FUGUEvsm2ANTSwarpInputSpec
    output_spec = FUGUEvsm2ANTSwarpOutputSpec

    def _run_interface(self, runtime):
        nii = nb.load(self.inputs.in_file)

        phaseEncDim = {"i": 0, "j": 1, "k": 2}[self.inputs.pe_dir[0]]

        if len(self.inputs.pe_dir) == 2:
            phaseEncSign = 1.0
        else:
            phaseEncSign = -1.0

        # Fix header
        hdr = nii.header.copy()
        hdr.set_data_dtype(np.dtype("<f4"))
        hdr.set_intent("vector", (), "")

        # Get data, convert to mm
        data = nii.get_fdata()

        aff = np.diag([1.0, 1.0, -1.0])
        if np.linalg.det(aff) < 0 and phaseEncDim != 0:
            # Reverse direction since ITK is LPS
            aff *= -1.0

        aff = aff.dot(nii.affine[:3, :3])

        data *= phaseEncSign * nii.header.get_zooms()[phaseEncDim]

        # Add missing dimensions
        zeros = np.zeros_like(data)
        field = [zeros, zeros]
        field.insert(phaseEncDim, data)
        field = np.stack(field, -1)
        # Add empty axis
        field = field[:, :, :, np.newaxis, :]

        # Write out
        self._results["out_file"] = fname_presuffix(
            self.inputs.in_file, suffix="_antswarp", newpath=runtime.cwd
        )
        nb.Nifti1Image(field.astype(np.dtype("<f4")), nii.affine, hdr).to_filename(
            self._results["out_file"]
        )

        return runtime


def _mat2itk(args):
    from nipype.interfaces.c3 import C3dAffineTool
    from nipype.utils.filemanip import fname_presuffix

    in_file, in_ref, in_src, index, newpath = args
    # Generate a temporal file name
    out_file = fname_presuffix(in_file, suffix="_itk-%05d.txt" % index, newpath=newpath)

    # Run c3d_affine_tool
    C3dAffineTool(
        transform_file=in_file,
        reference_file=in_ref,
        source_file=in_src,
        fsl2ras=True,
        itk_transform=out_file,
        resource_monitor=False,
    ).run()
    transform = "#Transform %d\n" % index
    with open(out_file) as itkfh:
        transform += "".join(itkfh.readlines()[2:])

    return (index, transform)


def _applytfms(args):
    """
    Applies ANTs' antsApplyTransforms to the input image.
    All inputs are zipped in one tuple to make it digestible by
    multiprocessing's map
    """
    import nibabel as nb
    from nipype.utils.filemanip import fname_presuffix
    from niworkflows.interfaces.fixes import FixHeaderApplyTransforms as ApplyTransforms

    in_file, in_xform, ifargs, index, newpath = args
    out_file = fname_presuffix(
        in_file, suffix="_xform-%05d" % index, newpath=newpath, use_ext=True
    )

    copy_dtype = ifargs.pop("copy_dtype", False)
    xfm = ApplyTransforms(
        input_image=in_file, transforms=in_xform, output_image=out_file, **ifargs
    )
    xfm.terminal_output = "allatonce"
    xfm.resource_monitor = False
    runtime = xfm.run().runtime

    if copy_dtype:
        nii = nb.load(out_file)
        in_dtype = nb.load(in_file).get_data_dtype()

        # Overwrite only iff dtypes don't match
        if in_dtype != nii.get_data_dtype():
            nii.set_data_dtype(in_dtype)
            nii.to_filename(out_file)

    return (out_file, runtime.cmdline)


def _arrange_xfms(transforms, num_files, tmp_folder):
    """
    Convenience method to arrange the list of transforms that should be applied
    to each input file
    """
    base_xform = ["#Insight Transform File V1.0", "#Transform 0"]
    # Initialize the transforms matrix
    xfms_T = []
    for i, tf_file in enumerate(transforms):
        # If it is a deformation field, copy to the tfs_matrix directly
        if guess_type(tf_file)[0] != "text/plain":
            xfms_T.append([tf_file] * num_files)
            continue

        with open(tf_file) as tf_fh:
            tfdata = tf_fh.read().strip()

        # If it is not an ITK transform file, copy to the tfs_matrix directly
        if not tfdata.startswith("#Insight Transform File"):
            xfms_T.append([tf_file] * num_files)
            continue

        # Count number of transforms in ITK transform file
        nxforms = tfdata.count("#Transform")

        # Remove first line
        tfdata = tfdata.split("\n")[1:]

        # If it is a ITK transform file with only 1 xform, copy to the tfs_matrix directly
        if nxforms == 1:
            xfms_T.append([tf_file] * num_files)
            continue

        if nxforms != num_files:
            raise RuntimeError(
                "Number of transforms (%d) found in the ITK file does not match"
                " the number of input image files (%d)." % (nxforms, num_files)
            )

        # At this point splitting transforms will be necessary, generate a base name
        out_base = fname_presuffix(
            tf_file, suffix="_pos-%03d_xfm-{:05d}" % i, newpath=tmp_folder.name
        ).format
        # Split combined ITK transforms file
        split_xfms = []
        for xform_i in range(nxforms):
            # Find start token to extract
            startidx = tfdata.index("#Transform %d" % xform_i)
            next_xform = base_xform + tfdata[startidx + 1 : startidx + 4] + [""]
            xfm_file = out_base(xform_i)
            with open(xfm_file, "w") as out_xfm:
                out_xfm.write("\n".join(next_xform))
            split_xfms.append(xfm_file)
        xfms_T.append(split_xfms)

    # Transpose back (only Python 3)
    return list(map(list, zip(*xfms_T)))
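
For reference, the FUGUEvsm2ANTSwarp class added above is a standard nipype SimpleInterface, so a hypothetical invocation would look roughly like the following (the input path and phase-encoding direction are placeholders):

from qsiprep.interfaces.niworkflows import FUGUEvsm2ANTSwarp

# Placeholder inputs: a voxel-shift map written by FSL FUGUE and the
# phase-encoding direction of the distorted EPI series.
vsm2warp = FUGUEvsm2ANTSwarp()
vsm2warp.inputs.in_file = "sub-01_dir-AP_vsm.nii.gz"
vsm2warp.inputs.pe_dir = "j-"

result = vsm2warp.run()
print(result.outputs.out_file)  # ITK/ANTs-style displacement field, suffixed "_antswarp"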
8 changes: 8 additions & 0 deletions qsiprep/tests/data/forrest_gump_filter.json
@@ -0,0 +1,8 @@
{
    "t1w": {
        "reconstruction": "autobox"
    },
    "t2w": {
        "reconstruction": "autobox"
    }
}
35 changes: 35 additions & 0 deletions qsiprep/tests/data/forrest_gump_outputs.txt
@@ -0,0 +1,35 @@
qsiprep
qsiprep/dataset_description.json
qsiprep/dwiqc.json
qsiprep/logs
qsiprep/logs/CITATION.bib
qsiprep/logs/CITATION.html
qsiprep/logs/CITATION.md
qsiprep/logs/CITATION.tex
qsiprep/sub-01
qsiprep/sub-01.html
qsiprep/sub-01/anat
qsiprep/sub-01/anat/sub-01_desc-aseg_dseg.nii.gz
qsiprep/sub-01/anat/sub-01_desc-brain_mask.nii.gz
qsiprep/sub-01/anat/sub-01_desc-preproc_T1w.nii.gz
qsiprep/sub-01/anat/sub-01_dseg.nii.gz
qsiprep/sub-01/anat/sub-01_from-MNI152NLin2009cAsym_to-T1w_mode-image_xfm.h5
qsiprep/sub-01/anat/sub-01_from-T1wACPC_to-T1wNative_mode-image_xfm.mat
qsiprep/sub-01/anat/sub-01_from-T1wNative_to-T1wACPC_mode-image_xfm.mat
qsiprep/sub-01/anat/sub-01_from-T1w_to-MNI152NLin2009cAsym_mode-image_xfm.h5
qsiprep/sub-01/ses-forrestgump
qsiprep/sub-01/ses-forrestgump/anat
qsiprep/sub-01/ses-forrestgump/anat/sub-01_ses-forrestgump_rec-autobox_from-orig_to-T1w_mode-image_xfm.txt
qsiprep/sub-01/ses-forrestgump/dwi
qsiprep/sub-01/ses-forrestgump/dwi/sub-01_ses-forrestgump_confounds.tsv
qsiprep/sub-01/ses-forrestgump/dwi/sub-01_ses-forrestgump_desc-ImageQC_dwi.csv
qsiprep/sub-01/ses-forrestgump/dwi/sub-01_ses-forrestgump_desc-SliceQC_dwi.json
qsiprep/sub-01/ses-forrestgump/dwi/sub-01_ses-forrestgump_dwiqc.json
qsiprep/sub-01/ses-forrestgump/dwi/sub-01_ses-forrestgump_space-T1w_desc-brain_mask.nii.gz
qsiprep/sub-01/ses-forrestgump/dwi/sub-01_ses-forrestgump_space-T1w_desc-eddy_cnr.nii.gz
qsiprep/sub-01/ses-forrestgump/dwi/sub-01_ses-forrestgump_space-T1w_desc-preproc_dwi.b
qsiprep/sub-01/ses-forrestgump/dwi/sub-01_ses-forrestgump_space-T1w_desc-preproc_dwi.bval
qsiprep/sub-01/ses-forrestgump/dwi/sub-01_ses-forrestgump_space-T1w_desc-preproc_dwi.bvec
qsiprep/sub-01/ses-forrestgump/dwi/sub-01_ses-forrestgump_space-T1w_desc-preproc_dwi.nii.gz
qsiprep/sub-01/ses-forrestgump/dwi/sub-01_ses-forrestgump_space-T1w_desc-preproc_dwi.txt
qsiprep/sub-01/ses-forrestgump/dwi/sub-01_ses-forrestgump_space-T1w_dwiref.nii.gz
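
The *_outputs.txt listings hold one expected path per line and are compared against what a test run actually produced. A hedged sketch of that kind of comparison (the helper below is illustrative, not the project's actual test code):

from pathlib import Path


def missing_outputs(expected_file, out_root):
    """Return expected paths (relative to out_root) that a run did not produce."""
    expected = {
        ln.strip() for ln in Path(expected_file).read_text().splitlines() if ln.strip()
    }
    produced = {str(p.relative_to(out_root)) for p in Path(out_root).rglob("*")}
    return expected - produced


# Example (paths are placeholders):
# assert not missing_outputs("forrest_gump_outputs.txt", "/src/qsiprep/.circleci/out")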
5 changes: 5 additions & 0 deletions qsiprep/tests/data/maternal_brain_project_filter.json
@@ -0,0 +1,5 @@
{
    "t1w": {
        "reconstruction": "autobox"
    }
}
