Merge pull request #58 from scilus/feat/scilus_runners
Feat/scilus_runners
AlexVCaron authored Feb 8, 2024
2 parents 4ae564d + e5659de commit cdbdf54
Showing 9 changed files with 142 additions and 25 deletions.
4 changes: 3 additions & 1 deletion .devcontainer/devcontainer.json
@@ -3,6 +3,7 @@
"build": { "dockerfile": "Dockerfile" },
"forwardPorts": [3000],
"onCreateCommand": "bash .devcontainer/setup_container.sh",
"postStartCommand": "git config --global --add safe.directory ${containerWorkspaceFolder}",
"features": {
"ghcr.io/devcontainers/features/git:1": {},
"ghcr.io/devcontainers/features/git-lfs:1": {},
@@ -56,7 +57,8 @@
},
"extensions": [
"AlexVCaron.nf-scil-extensions",
"ms-python.autopep8"
"ms-python.autopep8",
"ms-python.vscode-pylance"
]
}
},
8 changes: 8 additions & 0 deletions .devcontainer/setup_container.sh
@@ -1,5 +1,13 @@
#!/usr/bin/env bash

NODE_MAJOR=18

poetry install --no-root
echo "export PROFILE=docker" >> ~/.bashrc

curl -fsSL https://deb.nodesource.com/setup_${NODE_MAJOR}.x | bash - &&\
apt-get install -y nodejs

npm install --save-dev --save-exact prettier

echo "function prettier() { npm exec prettier $@; }" >> ~/.bashrc
2 changes: 1 addition & 1 deletion .github/workflows/code_linting.yml
@@ -11,7 +11,7 @@ jobs:
contains(github.event.comment.html_url, '/pull/') &&
contains(github.event.comment.body, '@nf-scil-bot fix linting') &&
github.repository == 'scilus/nf-scil'
runs-on: ubuntu-latest
runs-on: scilus-nf-scil-runners
steps:
# Use the @nf-scil-bot token to check out so we can push later
- uses: actions/checkout@v4
16 changes: 8 additions & 8 deletions .github/workflows/test.yml
@@ -2,7 +2,7 @@ name: Modules Tests
on:
push:
branches: [main]
pull_request:
pull_request_target:
branches: [main]
merge_group:
types: [checks_requested]
@@ -20,7 +20,7 @@ env:

jobs:
pre-commit:
runs-on: ubuntu-latest
runs-on: scilus-nf-scil-runners
steps:
- uses: actions/checkout@v3
- uses: actions/setup-python@v3
@@ -32,7 +32,7 @@ jobs:
extra_args: ""

prettier:
runs-on: ubuntu-latest
runs-on: scilus-nf-scil-runners
steps:
- name: Check out repository
uses: actions/checkout@v4
@@ -49,7 +49,7 @@ jobs:
run: prettier --check .

editorconfig:
runs-on: ubuntu-latest
runs-on: scilus-nf-scil-runners
steps:
- uses: actions/checkout@v4

@@ -65,7 +65,7 @@

pytest-changes:
name: pytest-changes
runs-on: ubuntu-latest
runs-on: scilus-nf-scil-runners
outputs:
# Expose matched filters as job 'modules' output variable
modules: ${{ steps.filter.outputs.changes }}
@@ -82,7 +82,7 @@ jobs:
token: ""

nf-core-lint:
runs-on: ubuntu-latest
runs-on: scilus-nf-scil-runners
name: nf-core-lint
needs: [pytest-changes]
if: ${{ (needs.pytest-changes.outputs.modules != '[]') }}
@@ -134,7 +134,7 @@ jobs:
if: ${{ startsWith(matrix.tags, 'subworkflows/') }}

pytest:
runs-on: ubuntu-latest
runs-on: scilus-nf-scil-runners
name: pytest
needs: [pytest-changes]
if: needs.pytest-changes.outputs.modules != '[]'
@@ -212,7 +212,7 @@ jobs:
!${{ github.workspace }}/.singularity
confirm-pass:
runs-on: ubuntu-latest
runs-on: scilus-nf-scil-runners
needs: [prettier, editorconfig, pytest-changes, nf-core-lint, pytest]
if: always()
steps:
2 changes: 2 additions & 0 deletions .gitignore
@@ -7,3 +7,5 @@
.DS_Store

*.code-workspace

node_modules/
29 changes: 26 additions & 3 deletions README.md
@@ -73,6 +73,7 @@ nf-core modules \
- Java Runtime ≥ 11, ≤ 17
- On Ubuntu, install `openjdk-jre-<version>` packages
- Nextflow &geq; 21.04.3
- Node &geq; 14 and Prettier (see [below](#installing-prettier))

> [!IMPORTANT]
> Nextflow might not detect the right `Java virtual machine` by default, more so if
@@ -91,7 +92,7 @@ nf-core modules \
The project uses _poetry_ to manage python dependencies. To install it using pipx,
run the following commands :

```
```bash
pip install pipx
pipx ensurepath
pipx install poetry
@@ -107,7 +108,7 @@ pipx install poetry
Once done, install the project with :

```
```bash
poetry install
```

@@ -118,7 +119,7 @@ poetry install
The project scripts and dependencies can be accessed using :

```
```bash
poetry shell
```

@@ -184,3 +185,25 @@ nf-core modules \
```

The tool can be omitted to run tests for all modules in a category.


# Installing Prettier

To install **Prettier** for the project, you need `node` and `npm` (version 14 or later) installed on your system. On Ubuntu, you can do it using snap :

```bash
sudo snap install node --classic
```

However, if you cannot install snap, or are on another OS, refer to the [official documentation](https://nodejs.org/en/download/package-manager/) for the installation procedure.

The *Development Container* for this project currently uses the following procedure, where `${NODE_MAJOR}` is at least 14 for Prettier :

```bash
curl -fsSL https://deb.nodesource.com/setup_${NODE_MAJOR}.x | bash - &&\
apt-get install -y nodejs

npm install --save-dev --save-exact prettier

echo "function prettier() { npm exec prettier $@; }" >> ~/.bashrc
```
66 changes: 54 additions & 12 deletions docs/MODULE.md
@@ -4,7 +4,7 @@

First verify you are located at the root of this repository (not in `modules`), then run the following interactive command :

```
```bash
nf-core modules create
```

@@ -19,7 +19,7 @@ to the following to ensure configuration abides with `nf-scil` :

Alternatively, you can use the following command to supply nearly all information :

```
```bash
nf-core modules create \
--author @scilus \
--label process_single \
@@ -79,7 +79,9 @@ already follow all guidelines. You will find related files in :
In the script section, before the script definition (in `""" """`), unpack the
optional argument into a `usable variable`. For an optional input `input1`, add :

def optional_input1 = input1 ? "<unpack input1>" : "<default if input1 unusable>"
```groovy
def optional_input1 = input1 ? "<unpack input1>" : "<default if input1 unusable>"
```

The variable `optional_input1` is the one to use in the script.
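
For illustration only, here is a minimal sketch of how such an unpacked optional input could be used inside a process — the process name, tool call and file names below are hypothetical, not taken from an actual nf-scil module :

```groovy
process EXAMPLE_DENOISE {
    input:
    tuple val(meta), path(image)
    path(mask) // optional input, passed as [] when absent

    script:
    // Unpack the optional input into a usable variable.
    def optional_mask = mask ? "--mask $mask" : ""
    """
    example_denoise_tool $image $optional_mask --output ${meta.id}__denoised.nii.gz
    """
}
```

When no `mask` is supplied, the ternary falls back to an empty string and the flag is simply omitted from the command line.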

@@ -142,6 +144,15 @@ don't need to specify them all. At least define the `keywords`, describe the pro
> the module ! If you use scripts from `scilpy`, here you describe scilpy. If using
> `ANTs`, describe ANTs. Etcetera.
Once done, commit your module and push the changes. Then, to look at the documentation it creates for your module, run :
```bash
nf-core modules \
--git-remote <your repository> \
--branch <your branch unless main branch> \
list <category/name>
```
### Editing `./tests/modules/nf-scil/<category>/<tool>/main.nf` :
The module's test suite is a collection of workflows containing isolated test cases. You
@@ -183,7 +194,7 @@ so output files that gets generated are checksum correctly.
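
As a purely illustrative sketch — using the same `<category>` / `<tool>` placeholder convention as the rest of this guide — one such isolated test case can be structured like this :

```groovy
#!/usr/bin/env nextflow

nextflow.enable.dsl = 2

// Placeholder module include ; replace TOOL, <category> and <tool> with the real names.
include { TOOL } from '../../../../../modules/nf-scil/<category>/<tool>/main'

workflow test_tool {
    input = [
        [ id:'test', single_end:false ], // meta map
        params.test_data[<category>][<tool>][<input_name>]
    ]

    TOOL ( input )
}
```

Each such workflow exercises the module with one specific combination of inputs.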
Run :
```
```bash
nf-core modules create-test-yml \
--run-tests \
--force \
@@ -196,6 +207,19 @@ smoothly, look at the test metadata file produced : `tests/modules/nf-scil/<cate
and validate that ALL outputs produced by test cases have been caught. Their `md5sum` is
critical to ensure future executions of your test produce valid outputs.
## Lint your code
Before submitting to *nf-scil*, once you've committed and pushed everything, the code needs to be correctly linted, or the checks won't pass. This is done by running `prettier` on your new module, through the *nf-core* command line :
```bash
nf-core module \
--git-remote <your repository> \
--branch <your branch unless main branch> \
lint <category>/<tool>
```
You'll probably get a bunch of *whitespace* and *indentation* errors, but also image errors, bad *nextflow* syntax and more. You need to fix all `errors` and as many of the `warnings` as possible.
## Last safety test
You're mostly done ! If every test passes, your module is ready ! Still, you have not tested
@@ -214,7 +238,7 @@ testing one.
Run the following command to try installing the module :
```
```bash
nf-core module \
--git-remote https://github.com/scilus/nf-scil.git \
--branch <branch> \
@@ -319,7 +343,7 @@ for the `dictionary key` : `params.test_data[<category>][<tool>][<input_name>]`.
Thus, a new binding in `tests/config/test_data.config` should resemble the following
```
```groovy
params {
test_data {
...
@@ -346,7 +370,7 @@ You then use `params.test_data[<category>][<tool>][<input_name>]` in your test c
attach the data to the test case, since the `params.test_data` collection is loaded
automatically. To do so, in a test workflow, define an `input` object :
```
```groovy
input = [
[ id:'test', single_end:false ], // meta map
params.test_data[<category>][<tool>][<input_name1>],
@@ -365,7 +389,7 @@ and use it as input to the processes to test.
The Scilpy Fetcher is a tool that allows you to download datasets from the Scilpy test data
repository. To use it, first include the _fetcher workflow_ in your test's `main.nf` :
```
```groovy
include { LOAD_TEST_DATA } from '../../../../../subworkflows/nf-scil/load_test_data/main'
```
@@ -376,11 +400,22 @@ The workflow has two inputs :
- A name for the temporary directory where the data will be put.
The directories where the archives contents are unpacked are accessed using the output
parameter of the workflow `LOAD_TEST_DATA.out.test_data_directory`. To create the test
input from it, use the `.map` operator :
To call it, use the following syntax :
```groovy
archives = Channel.from( [ "<archive1>", "<archive2>", ... ] )
LOAD_TEST_DATA( archives, "<directory>" )
```
>[!IMPORTANT]
>This will download the `archives` and unpack them under the `directory`
>specified, using the archive's names as `sub-directories` to unpack to.
The archives contents are accessed using the output parameter of the workflow
`LOAD_TEST_DATA.out.test_data_directory`. To create the test input from it for
a given `PROCESS` to test, use the `.map` operator :
```groovy
input = LOAD_TEST_DATA.out.test_data_directory
.map{ test_data_directory -> [
[ id:'test', single_end:false ], // meta map
@@ -390,7 +425,14 @@ input = LOAD_TEST_DATA.out.test_data_directory
] }
```
Then feed it to the process to test :
```groovy
PROCESS( input )
```
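
Putting the pieces together, a complete (hypothetical) test case built on the fetcher could look like the following sketch — `PROCESS`, the archive name and the input file name are placeholders :

```groovy
workflow test_process {
    // Download and unpack the test archive into a temporary directory.
    archives = Channel.from( [ "<archive1>" ] )
    LOAD_TEST_DATA( archives, "test_data" )

    // Build the test input from the unpacked directory.
    input = LOAD_TEST_DATA.out.test_data_directory
        .map{ test_data_directory -> [
            [ id:'test', single_end:false ], // meta map
            file("${test_data_directory}/<input_file1>")
        ] }

    PROCESS( input )
}
```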
> [!NOTE]
> The subworkflow must be called individually in each test workflow, even if they download
> the same archives, since there is no mechanism to pass data channels to them from the
> outside.
> outside, or share cache between them.
35 changes: 35 additions & 0 deletions package-lock.json

Some generated files are not rendered by default.

5 changes: 5 additions & 0 deletions package.json
@@ -0,0 +1,5 @@
{
"devDependencies": {
"prettier": "3.2.5"
}
}
