Merge branch 'main' into method
Remi-Gau authored Dec 14, 2023
2 parents d2a4fad + 321705e commit 92f408d
Showing 19 changed files with 235 additions and 118 deletions.
14 changes: 14 additions & 0 deletions .github/workflows/run_precommit.yml
@@ -0,0 +1,14 @@
name: pre-commit

on:
pull_request:
push:
branches: [main]

jobs:
pre-commit:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v4
- uses: actions/setup-python@v5
- uses: pre-commit/[email protected]
17 changes: 4 additions & 13 deletions .github/workflows/test.yml
@@ -43,6 +43,9 @@ jobs:
- uses: actions/checkout@v4
with:
fetch-depth: 0
- uses: actions/setup-python@v5
- name: install tox
run: pip install tox
- uses: actions/cache@v3
id: cache
env:
@@ -54,19 +57,7 @@ jobs:
- if: ${{ steps.cache.outputs.cache-hit != 'true' }}
name: Download fmriprep derivative of ds000017
id: download
run: |
mkdir -p /home/runner/work/giga_connectome/giga_connectome/giga_connectome/data/test_data
cd /home/runner/work/giga_connectome/giga_connectome/giga_connectome/data/test_data
wget --retry-connrefused \
--waitretry=5 \
--read-timeout=20 \
--timeout=15 \
-t 0 \
-q \
-O ds000017.tar.gz \
"https://zenodo.org/record/8091903/files/ds000017-fmriprep22.0.1-downsampled-nosurface.tar.gz?download=1"
tar -xzf ds000017.tar.gz
rm ds000017.tar.gz
run: tox -e test_data

build:
runs-on: ubuntu-latest
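The removed inline `wget` step is now wrapped by `tox -e test_data` (the tox environment itself is defined in `tox.ini`, which is not shown in this diff). As a rough sketch, the equivalent logic in Python, with the Zenodo URL and target directory taken from the removed step:

```python
# Sketch of the download the removed step performed; not the actual tox
# environment. Fetch the downsampled ds000017 fMRIPrep derivative from
# Zenodo and unpack it into the package's test data directory.
import tarfile
import urllib.request
from pathlib import Path

URL = (
    "https://zenodo.org/record/8091903/files/"
    "ds000017-fmriprep22.0.1-downsampled-nosurface.tar.gz?download=1"
)
target = Path("giga_connectome/data/test_data")
target.mkdir(parents=True, exist_ok=True)

archive = target / "ds000017.tar.gz"
urllib.request.urlretrieve(URL, str(archive))  # no retries, unlike `wget -t 0`
with tarfile.open(archive, "r:gz") as tar:
    tar.extractall(path=target)
archive.unlink()  # mirror the `rm ds000017.tar.gz` cleanup
```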
1 change: 1 addition & 0 deletions .gitignore
@@ -1,6 +1,7 @@
*/_version.py
*/data/test_data/
output/
work/

# Byte-compiled / optimized / DLL files
__pycache__/
4 changes: 2 additions & 2 deletions CITATION.cff
@@ -22,8 +22,8 @@ authors:
email: [email protected]
- family-names: Dessain
given-names: Quentin
- family-names: Natasha
- family-names: Natasha
given-names: Clarke


license: MIT
2 changes: 1 addition & 1 deletion docs/source/conf.py
@@ -25,6 +25,7 @@
"sphinx.ext.autodoc",
"sphinx.ext.autosummary",
"sphinx.ext.napoleon",
"sphinxarg.ext",
]

templates_path = ["_templates"]
@@ -35,7 +36,6 @@
# https://www.sphinx-doc.org/en/master/usage/configuration.html#options-for-html-output

html_theme = "sphinx_rtd_theme"
html_static_path = ["_static"]

# -- Options for myst_parser -------------------------------------------------
myst_enable_extensions = ["colon_fence"]
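For context, the extension list in `docs/source/conf.py` after this change looks like the snippet below; `sphinxarg.ext` is provided by the `sphinx-argparse` package, which is presumably part of the docs dependencies, and it enables the `.. argparse::` directive used in `docs/source/usage.md` further down.

```python
# docs/source/conf.py after this diff (for reference).
extensions = [
    "sphinx.ext.autodoc",
    "sphinx.ext.autosummary",
    "sphinx.ext.napoleon",
    "sphinxarg.ext",  # from the sphinx-argparse package
]
```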
14 changes: 11 additions & 3 deletions docs/source/contributing.md
@@ -27,6 +27,14 @@ pip install -e .[dev]
pre-commit install
```

5. Install the data required for testing from Zenodo.

This can be done using tox by running:

```bash
tox -e test_data
```

## Contributing to code

This is a very generic workflow.
@@ -51,7 +59,7 @@ git checkout -b your_branch
5. Run the tests locally; you can run specific tests to speed up the process:

```bash
pytest -v giga_connectome/tests/tests/test_connectome.py::test_calculate_intranetwork_correlation
pytest -v giga_connectome/tests/test_connectome.py::test_calculate_intranetwork_correlation
```

6. Push your changes to your online fork. If this is the first commit, you might want to set up remote tracking:
@@ -75,7 +83,7 @@ The workflow is the same as code contributions, with some minor differences.
1. Install the `[doc]` dependencies.

```bash
pip install -e .[doc]
pip install -e '.[doc]'
```

2. After making changes, build the docs locally:
@@ -104,7 +112,7 @@ This tells the development team that your pull request is a "work-in-progress",

Once your PR is ready, a member of the development team will review your changes to confirm that they can be merged into the main codebase.

## Making an release
## Making a release

Currently this project is not pushed to PyPI.
We simply tag versions on the repository so users can reference a specific version when installing.
50 changes: 10 additions & 40 deletions docs/source/usage.md
@@ -1,48 +1,17 @@
# Usage Notes

```bash
usage: giga_connectome [-h] [-v] [--participant_label PARTICIPANT_LABEL [PARTICIPANT_LABEL ...]] [-w WORK_DIR] [--atlas ATLAS]
[--denoise-strategy DENOISE_STRATEGY] [--standardize {zscore,psc}] [--smoothing_fwhm SMOOTHING_FWHM] [--reindex-bids]
[--bids-filter-file BIDS_FILTER_FILE]
bids_dir output_dir {participant,group}

Generate connectome based on denoising strategy for fmriprep processed dataset.

positional arguments:
bids_dir The directory with the input dataset (e.g. fMRIPrep derivative)formatted according to the BIDS standard.
output_dir The directory where the output files should be stored.
{participant,group} Level of the analysis that will be performed. Only group level is allowed as we need to generate a dataset inclusive brain mask.

optional arguments:
-h, --help show this help message and exit
-v, --version show program's version number and exit
--participant_label PARTICIPANT_LABEL [PARTICIPANT_LABEL ...]
The label(s) of the participant(s) that should be analyzed. The label corresponds to sub-<participant_label> from the BIDS spec (so it does not include 'sub-'). If this parameter is not provided all subjects should be analyzed. Multiple participants can be specified with a space separated list.
-w WORK_DIR, --work-dir WORK_DIR
Path where intermediate results should be stored.
--atlas ATLAS The choice of atlas for time series extraction. Default atlas choices are: 'Schaefer20187Networks, 'MIST', 'DiFuMo'. User can pass a path to a json file containing configuration for their own choice of atlas. The default is 'MIST'.
--denoise-strategy DENOISE_STRATEGY
The choice of post-processing for denoising. The default choices are: 'simple', 'simple+gsr', 'scrubbing.2', 'scrubbing.2+gsr', 'scrubbing.5', 'scrubbing.5+gsr', 'acompcor50', 'icaaroma'. User can pass a path to a json file containing configuration for their own choice of denoising strategy. The defaultis 'simple'.
--standardize {zscore,psc}
The choice of signal standardization. The choices are z score or percent signal change (psc). The default is 'zscore'.
--smoothing_fwhm SMOOTHING_FWHM
Size of the full-width at half maximum in millimeters of the spatial smoothing to apply to the signal. The default is 5.0.
--reindex-bids Reindex BIDS data set, even if layout has already been created.
--bids-filter-file BIDS_FILTER_FILE
A JSON file describing custom BIDS input filters using PyBIDS.We use the same format as described in fMRIPrep documentation: https://fmriprep.org/en/latest/faq.html#how-do-i-select-only-certain-files-to-be-input-to-fmriprepHowever, the query filed should always be 'bold'
## Command line interface

```{eval-rst}
.. argparse::
:prog: giga_connectome
:module: giga_connectome.run
:func: global_parser
```

When performing `participant` level analysis, the output is an HDF5 file for each participant that was passed to `--participant_label`, or for all subjects under `bids_dir`.
The output file name is: `sub-<participant_id>_atlas-<atlas_name>_desc-<denoising_strategy>.h5`
When performing `group` level analysis, the output is a single HDF5 file covering the participants that were passed to `--participant_label`, or all subjects under `bids_dir`.
The output file name is: `atlas-<atlas_name>_desc-<denoising_strategy>.h5`
The file will contain time series and connectomes of each subject, as well as group average connectomes.
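A quick way to inspect what landed in an output file is to walk it with `h5py`. This is only a sketch: the internal group and dataset names are not documented in this diff, so nothing about the layout is assumed beyond the file name pattern above.

```python
# Sketch: print every group/dataset in a group-level output file.
import h5py

with h5py.File("atlas-MIST_desc-simple.h5", "r") as f:
    f.visititems(lambda name, obj: print(name, obj))
```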
## Writing configuration files

All preset can be found in `giga_connectome/data`
All presets can be found in [`giga_connectome/data`](https://github.com/SIMEXP/giga_connectome/tree/main/giga_connectome/data).

### Denoising strategy

@@ -62,13 +31,14 @@ In a `json` file, define the customised strategy in the following format:
}
```

See examples in `giga_connectome/data/denoise_strategy`.
See examples in [`giga_connectome/data/denoise_strategy`](https://github.com/SIMEXP/giga_connectome/tree/main/giga_connectome/data/denoise_strategy).
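The preset names (`simple`, `scrubbing.*`, `acompcor50`, `icaaroma`) mirror the denoising strategies of nilearn's fMRIPrep confounds interface, so a strategy file plausibly maps onto a call like the sketch below. The exact mapping is defined by the JSON presets, not by this snippet, and the file name is hypothetical.

```python
# Sketch, assuming the presets wrap nilearn's confound-loading interface.
from nilearn.interfaces.fmriprep import load_confounds_strategy

confounds, sample_mask = load_confounds_strategy(
    "sub-1_task-rest_space-MNI152NLin2009cAsym_desc-preproc_bold.nii.gz",
    denoise_strategy="simple",  # the '+gsr' variants add global signal
)
```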

### Atlas

The atlas files should be organised according to the [TemplateFlow](https://www.templateflow.org/python-client/0.7.1/naming.html) convention.

A minimal setup should look like this:

```
my_atlas/
└──tpl-MNI152NLin2009cAsym/ # template directory of a valid template name
@@ -96,4 +66,4 @@ In a `json` file, define the customised atlas. We will use the atlas above as an
}
```

See examples in `giga_connectome/data/atlas`.
See examples in [`giga_connectome/data`](https://github.com/SIMEXP/giga_connectome/tree/main/giga_connectome/data).
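Judging from the `load_atlas_setting` signature visible in the `atlas.py` diff below, presets and custom JSON configurations go through the same entry point. A minimal sketch (the JSON path is hypothetical):

```python
# Sketch based on the signature in giga_connectome/atlas.py below.
from giga_connectome.atlas import load_atlas_setting

preset = load_atlas_setting("MIST")                   # built-in preset
custom = load_atlas_setting("path/to/my_atlas.json")  # user-defined atlas
```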
8 changes: 6 additions & 2 deletions giga_connectome/atlas.py
@@ -9,6 +9,10 @@
from nibabel import Nifti1Image
from pkg_resources import resource_filename

from giga_connectome.logger import gc_logger

gc_log = gc_logger()


PRESET_ATLAS = ["DiFuMo", "MIST", "Schaefer20187Networks"]

@@ -42,7 +46,7 @@ def load_atlas_setting(atlas: Union[str, Path, dict]):
Path to the atlas files.
"""
atlas_config = _check_altas_config(atlas)
print(atlas_config)
gc_log.info(atlas_config)

# load template flow
templateflow_dir = atlas_config.get("templateflow_dir")
@@ -104,7 +108,7 @@ def resample_atlas_collection(
List of pathlib.Path
Paths to atlases sampled to group level grey matter mask.
"""
print("Resample atlas to group grey matter mask.")
gc_log.info("Resample atlas to group grey matter mask.")
resampled_atlases = []
for desc in tqdm(atlas_config["file_paths"]):
parcellation = atlas_config["file_paths"][desc]
20 changes: 20 additions & 0 deletions giga_connectome/logger.py
@@ -0,0 +1,20 @@
"""General logger for the cohort_creator package."""
from __future__ import annotations

import logging

from rich.logging import RichHandler


def gc_logger(log_level: str = "INFO") -> logging.Logger:
# FORMAT = '\n%(asctime)s - %(name)s - %(levelname)s\n\t%(message)s\n'
FORMAT = "%(message)s"

logging.basicConfig(
level=log_level,
format=FORMAT,
datefmt="[%X]",
handlers=[RichHandler()],
)

return logging.getLogger("giga_connectome")
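Elsewhere in this commit, `atlas.py` and `mask.py` adopt the logger with the pattern below. Calling `gc_logger()` from several modules is harmless because `logging.basicConfig` is a no-op once handlers are configured.

```python
# Usage pattern adopted by atlas.py and mask.py in this commit: one
# module-level logger instead of scattered print() calls.
from giga_connectome.logger import gc_logger

gc_log = gc_logger()  # or gc_logger("DEBUG") for more detail

gc_log.info("Resample atlas to group grey matter mask.")
gc_log.debug("Only emitted when the level is DEBUG.")
```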
64 changes: 29 additions & 35 deletions giga_connectome/mask.py
@@ -19,6 +19,10 @@

from giga_connectome.atlas import resample_atlas_collection

from giga_connectome.logger import gc_logger

gc_log = gc_logger()


def generate_gm_mask_atlas(
working_dir: Path,
@@ -63,7 +67,6 @@ def generate_group_mask(
template: str = "MNI152NLin2009cAsym",
templateflow_dir: Optional[Path] = None,
n_iter: int = 2,
verbose: int = 1,
) -> Nifti1Image:
"""
Generate a group EPI grey matter mask, and overlaid with a MNI grey
@@ -88,9 +91,6 @@
Number of repetitions of dilation and erosion steps performed in
scipy.ndimage.binary_closing function.
verbose :
Level of verbosity.
Keyword Arguments
-----------------
Used to filter the current
@@ -102,12 +102,10 @@
nibabel.nifti1.Nifti1Image
EPI (grey matter) mask for the current group of subjects.
"""
if verbose > 1:
print(f"Found {len(imgs)} masks")
if exclude := _check_mask_affine(imgs, verbose):
gc_log.debug(f"Found {len(imgs)} masks")
if exclude := _check_mask_affine(imgs):
imgs, __annotations__ = _get_consistent_masks(imgs, exclude)
if verbose > 1:
print(f"Remaining: {len(imgs)} masks")
gc_log.debug(f"Remaining: {len(imgs)} masks")

# templateflow environment setting to get around network issue
if templateflow_dir and templateflow_dir.exists():
@@ -129,7 +127,7 @@
memory=None,
verbose=0,
)
print(
gc_log.info(
f"Group EPI mask affine:\n{group_epi_mask.affine}"
f"\nshape: {group_epi_mask.shape}"
)
@@ -204,7 +202,7 @@ def _get_consistent_masks(


def _check_mask_affine(
mask_imgs: List[Union[Path, str, Nifti1Image]], verbose: int = 1
mask_imgs: List[Union[Path, str, Nifti1Image]]
) -> Union[list, None]:
"""Given a list of input mask images, show the most common affine matrix
and subjects with different values.
@@ -215,9 +213,6 @@
See :ref:`extracting_data`.
3D or 4D EPI image with same affine.
verbose :
Level of verbosity.
Returns
-------
@@ -244,12 +239,11 @@
common_affine = max(
set(header_info["affine"]), key=header_info["affine"].count
)
if verbose > 0:
print(
f"We found {len(set(header_info['affine']))} unique affine "
f"matrices. The most common one is "
f"{key_to_header[common_affine]}"
)
gc_log.info(
f"We found {len(set(header_info['affine']))} unique affine "
f"matrices. The most common one is "
f"{key_to_header[common_affine]}"
)
odd_balls = set(header_info["affine"]) - {common_affine}
if not odd_balls:
return None
@@ -259,18 +253,16 @@
ob_index = [
i for i, aff in enumerate(header_info["affine"]) if aff == ob
]
if verbose > 1:
print(
"The following subjects has a different affine matrix "
f"({key_to_header[ob]}) comparing to the most common value: "
f"{mask_imgs[ob_index]}."
)
exclude += ob_index
if verbose > 0:
print(
f"{len(exclude)} out of {len(mask_imgs)} has "
"different affine matrix. Ignore when creating group mask."
gc_log.debug(
"The following subjects has a different affine matrix "
f"({key_to_header[ob]}) comparing to the most common value: "
f"{mask_imgs[ob_index]}."
)
exclude += ob_index
gc_log.info(
f"{len(exclude)} out of {len(mask_imgs)} has "
"different affine matrix. Ignore when creating group mask."
)
return sorted(exclude)


@@ -284,7 +276,9 @@ def _check_pregenerated_masks(template, working_dir, atlas):
if not group_mask.exists():
group_mask = None
else:
print(f"Found pregenerated group level grey matter mask: {group_mask}")
gc_log.info(
f"Found pregenerated group level grey matter mask: {group_mask}"
)

# atlas
resampled_atlases = []
@@ -301,8 +295,8 @@
if not all(all_exist):
resampled_atlases = None
else:
print(
f"Found resampled atlases: {resampled_atlases}. Skipping group "
"level mask generation step."
gc_log.info(
f"Found resampled atlases:\n{[str(x) for x in resampled_atlases]}."
"\nSkipping group level mask generation step."
)
return group_mask, resampled_atlases