HTCondor DAGMan application for the construction of a spatio-temporal brain atlas
from cross-sectional brain MR images using deformable registration between all pairs of images.
The subject-to-atlas deformations can be iteratively refined. Alternatively, the initial
pairwise registrations can be skipped, and only an initial global average image computed.

A more recent iterative atlas construction is implemented by the `construct-atlas`
[MIRTK](https://mirtk.github.io) command.

**This workflow has been tested with MIRTK master branch revision c18e1ac, but the latest HEAD commit should work as well.**


Initial Setup
=============

Clone this repository into a `workflow` subdirectory next to the directories
containing the individual MR images and corresponding segmentation label maps.
For example, run the following commands to create a new directory for the
construction of a new brain atlas.
```shell
git clone git@github.com:MIRTK/BAtCh.git workflow
cd workflow
```

**Update:** The location of the input files can be specified in the configuration file, so there is no need to copy or link them.


Configuration Files
===================

The atlas construction workflow is configured mainly by the following files:

- `etc/config/default.sh`: A shell script containing global variables used by `bin/gen-workflow`.
- `etc/config/custom.sh`: Optional shell script to override global default variables.
- `etc/config/ages.csv`: A comma- or space-separated CSV file with image IDs and corresponding ages (see the example below).
- `etc/config/subjects.lst`: An optional subject list containing only the IDs of those images
from which the spatio-temporal atlas should be created.
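
A minimal sketch of what `etc/config/ages.csv` could contain is given below; the image IDs are hypothetical, and ages are given in weeks of gestational age (GA):

```shell
# Create an example ages.csv with hypothetical image IDs and ages in weeks GA
printf '%s\n' \
  'subject-001,39.5' \
  'subject-002,41.0' \
  'subject-003,43.2' > etc/config/ages.csv
```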

Additionally, some of the workflows require a reference image for the initial
global normalization of the images, which also defines the coordinate system of the
generated atlas images. A neonatal reference brain image downloaded from
[brain-development.org](http://biomedic.doc.ic.ac.uk/brain-development/index.php?n=Main.Neonatal2)
can be found in the `etc/reference` directory.

**See the comments in [etc/config/default.sh](etc/config/default.sh) for what parameters and file paths can be set.**
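
For instance, a minimal `etc/config/custom.sh` might override a few of these settings as sketched below; the variable names are hypothetical, so check the comments in `etc/config/default.sh` for the actual names:

```shell
# etc/config/custom.sh -- example overrides (hypothetical variable names)
imgdir="$HOME/atlas/images"   # directory with the individual MR images
lbldir="$HOME/atlas/labels"   # directory with the segmentation label maps
sigma=1                       # temporal kernel width in weeks
```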


Temporal Regression Kernels
===========================

The atlas construction workflow produces a spatial anatomical atlas and
tissue/structure probability maps for each time point for which a temporal kernel
is found in the `kernel` directory specified in the configuration file.

The kernels used for the neonatal atlas are based on a Gaussian function with
mean corresponding to the desired atlas time point (gestational age, GA) and a
constant standard deviation (default 1 week). A variable kernel width is
possible by generating kernels with varying standard deviation for different
atlas time points. An input "kernel" is simply a comma or tab separated CSV
file, e.g., named `t$age.tsv`, where the first column contains the ID of
the images from which the atlas at the respective time point is created and the
second column their respective kernel weight. The provided `bin/gen-kernels` script
can be used to create such CSV files using a Gaussian kernel function. It should
be noted, however, that the kernels can be generated with any tool, including MATLAB.
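
As an illustration of this file format, the following sketch computes unnormalized Gaussian weights for a single atlas time point directly from `ages.csv`; the output file name follows the `t$age.tsv` convention, but any subject selection or weight normalization performed by `bin/gen-kernels` is not reproduced here:

```shell
# Write t40.tsv: image ID and Gaussian weight (mean 40 weeks GA, sigma = 1 week)
# for every image listed in ages.csv as "id,age" or "id age" pairs.
awk -v mu=40 -v sigma=1 '{
  w = exp(-(($2 - mu)^2) / (2 * sigma^2))
  printf "%s\t%g\n", $1, w
}' FS='[ ,\t]+' etc/config/ages.csv > t40.tsv
```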

For example, the kernels for the dHCP atlas built from 275 images for the
age range 36 to 44 weeks GA, with a temporal resolution of 1 week and a constant kernel
width, can be generated by setting `sigma=1` (the default set in `etc/config/default.sh`)
in the respective [configuration file](etc/config/dhcp-v2.4/dHCP275/constant-sigma.sh)
and then running the command

```shell
bin/gen-kernels -c etc/config/dhcp-v2.4/dHCP275/constant-sigma.sh -range 36 44
```
Generate Workflow DAG
=====================

Given the `ages.csv` (and `subjects.lst`) as well as the temporal regression kernels
generated in the previous step, execute the `bin/gen-workflow` script to generate the
HTCondor and DAGMan files which specify the separate jobs to be executed by
HTCondor and describe the directed acyclic graph (DAG) of the workflow
(i.e., job dependencies). The script will also copy the MIRTK commands used by the workflow
into the configured `bindir` to ensure these are not modified while the workflow
is being executed. The generated DAG files, parameter files, and job descriptions
can be found in the configured `dagdir`.

```shell
bin/gen-workflow -v
```
Workflow Execution
==================

The workflow can be executed by submitting the generated `main.dag` file to HTCondor
using `condor_submit_dag`.
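
For example, with `dagdir` set to the configured DAG directory:

```shell
condor_submit_dag "$dagdir/main.dag"
```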

The long running DAGMan job needs to have a valid authentication method to
submit new jobs and monitor running jobs. The current Imperial College London
Department of Computing (DoC) HTCondor installation uses Kerberos v5
authentication. The user running the DAGMan job must periodically renew
their Kerberos ticket granting ticket (TGT). This can be done by executing
the `bin/run-dagman` script instead of calling `condor_submit_dag` directly:

```shell
bin/run-dagman "$dagdir/main.dag"
```

This script will replace the *condor_dagman* executable usually submitted to
HTCondor by a Bash script named `lib/tools/run-dagman`, which runs *condor_dagman* as a
background job and periodically reinitializes the Kerberos ticket cache using `kinit`.
To be able to do so without the user's account password, it requires a user-generated
`krb5.keytab` file.
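
Such a keytab can be created with the MIT Kerberos `ktutil` tool, for example as sketched below; the principal, realm, and output path are placeholders that depend on the local Kerberos setup:

```shell
# Create a keytab in an interactive ktutil session (placeholder principal):
#   addent -password -p user@REALM -k 1 -e aes256-cts
#   wkt /path/to/krb5.keytab
#   quit
ktutil
# Verify that a ticket can be obtained without entering a password:
kinit -kt /path/to/krb5.keytab user@REALM
```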

Alternatively, a cron job that is independent of this atlas creation workflow can be set
up to periodically obtain a new Kerberos ticket. Instructions are available to
BioMedIA members [here](http://biomedic.doc.ic.ac.uk/index.php?n=Internal.KerberosTickets).

A Python script for either serial execution or submission of batch jobs to SLURM
instead of HTCondor is included as well. To run the atlas construction on a SLURM
