Feat CI Pipeline #4

Merged
merged 50 commits on Oct 11, 2024
Commits
633176e
feat: add ci pipeline [experimental]
fabioseel Oct 7, 2024
6208c69
fix: fixed pycairo version in singularity def file
fabioseel Oct 8, 2024
d4840d8
perf: reduce size of singularity container
fabioseel Oct 8, 2024
b57fbc1
fix: replace container build with pull
fabioseel Oct 8, 2024
50940c7
fix(?): move singularity image to github / ghcr.io
fabioseel Oct 8, 2024
1e2c6fa
fix(?): add auto login credentials to workflow
fabioseel Oct 8, 2024
c5e133d
fix(?): try fixing the auto login
fabioseel Oct 8, 2024
0fe8982
fix(?): typo in package name for singularity container in workflow
fabioseel Oct 8, 2024
e592feb
fix(?): pylint to all files instead of just changed (for now)
fabioseel Oct 9, 2024
f7f9b49
chore: try bumping checkout action version to v4
fabioseel Oct 9, 2024
550e0a1
fix: update example cfgs
fabioseel Oct 9, 2024
45ae155
fix: make pylint not fail the pipeline (WIP)
fabioseel Oct 9, 2024
2180265
feat: add workflow to check if cfgs can be read
fabioseel Oct 9, 2024
f2870ff
fix: paths in config scan workflow
fabioseel Oct 9, 2024
2da3900
fix: make config scan run without cuda / gpu
fabioseel Oct 9, 2024
84cc1fa
feat: add workflow to test whether singularity container can be build
fabioseel Oct 9, 2024
361702a
feat: container build with automatical upload to ghcr
fabioseel Oct 9, 2024
5c53f58
fix: name of created sif file for scan
fabioseel Oct 9, 2024
e9d52ec
fix/refactor: build and push now should run, nicify multiline comment…
fabioseel Oct 10, 2024
5d59850
fix(?): reusable workflow
fabioseel Oct 10, 2024
8d848c2
fix(?):automatic referencing of sif file in reusable job
fabioseel Oct 10, 2024
453d8e3
feat: run build action once a month or when def file has changed
fabioseel Oct 10, 2024
3352e36
refactor: adjust when code check and scan are run
fabioseel Oct 10, 2024
45b1928
feat: run pylint on changed py files (compared to master)
fabioseel Oct 10, 2024
b26d636
refactor: reusable action for initial setup
fabioseel Oct 10, 2024
1d65f0e
fix: pylint complains in compile scenario of doom_creator
fabioseel Oct 10, 2024
9522e10
fix: checkout must be in every top level workflow
fabioseel Oct 10, 2024
19ecf45
fix(?): add checkout also to setup workflow
fabioseel Oct 10, 2024
00dddfd
fix(?): reusable workflow
fabioseel Oct 10, 2024
2840e46
fix(?): local workflows can not be called in steps
fabioseel Oct 10, 2024
70229c3
fix: env variables seem poorly supported, removed that
fabioseel Oct 10, 2024
40a09a8
feat: add cache, make code check more verbose
fabioseel Oct 10, 2024
77f3121
fix(?): code check comparison to master
fabioseel Oct 10, 2024
e8fd44d
fix: screw reusable actions, hopefully it runs now
fabioseel Oct 10, 2024
ff58590
fix: changed sif file path
fabioseel Oct 10, 2024
17db57e
exp: show state of directory before / after checkout
fabioseel Oct 10, 2024
298af6d
fix(?): checkout first before pulling container
fabioseel Oct 10, 2024
41ed433
fix(?): path for cache
fabioseel Oct 10, 2024
4278a6a
fix: step order/cache order in config_scan
fabioseel Oct 10, 2024
c2fd3cb
fix: enable pylint instead of listing again
fabioseel Oct 10, 2024
c7a8a69
fix: add pylintrc adding the repository to python path
fabioseel Oct 10, 2024
00b5586
feat: restore code check only if py files changed
fabioseel Oct 10, 2024
f06ecc5
fix: make code check run on pull requests to master
fabioseel Oct 10, 2024
7770a4b
fix: make cache hit branch independent
fabioseel Oct 10, 2024
6fbface
revert: cross-branch caching only possible in child branches
fabioseel Oct 11, 2024
426d0d1
fix: doom_creator/templates.py linting errors
fabioseel Oct 11, 2024
3b7d091
fix: restore keys in action needed for cross branch access?
fabioseel Oct 11, 2024
b7c698c
fix: pylint suggestions for vizdoom template
fabioseel Oct 11, 2024
2cd2bed
Merge pull request #5 from berenslab/feat-ci-pipeline-cache-test
fabioseel Oct 11, 2024
5f8ac43
feat: make build singularity container run on pull requests where def…
fabioseel Oct 11, 2024
48 changes: 48 additions & 0 deletions .github/workflows/code_check.yml
@@ -0,0 +1,48 @@
name: Code Checking

on:
push:
paths:
- '**/*.py'
pull_request:
branches:
- master

env:
singularity_image: oras://ghcr.io/berenslab/retinal-rl:singularity-image-latest
sif_file: retinal-rl_singularity-image-latest.sif

jobs:
check:
runs-on: ubuntu-latest
steps:
- name: Checkout code
uses: actions/checkout@v4
with:
fetch-depth: 0
- name: Fetch all branches
run: git fetch --all

- name: Setup Apptainer
uses: eWaterCycle/setup-apptainer@v2
with:
apptainer-version: 1.3.0
- name: Cache Singularity Image
id: cache-singularity
uses: actions/cache@v3
with:
path: ${{ env.sif_file }}
key: ${{ runner.os }}-singularity-${{ hashFiles('resources/retinal-rl.def') }}
restore-keys: |
${{ runner.os }}-singularity-${{ hashFiles('resources/retinal-rl.def') }}
${{ runner.os }}-singularity-
- name: Pull Singularity container
if: steps.cache-singularity.outputs.cache-hit != 'true'
run: |
singularity registry login --username ${{ github.actor }} --password ${{ secrets.GITHUB_TOKEN }} oras://ghcr.io
singularity pull ${{ env.sif_file }} ${{ env.singularity_image }}

- name: Run Pylint
run: |
singularity exec ${{ env.sif_file }} \
pylint $(git diff --name-only origin/master...HEAD -- '*.py')
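The `Run Pylint` step lints only files changed relative to `origin/master`, using git's three-dot range: `master...HEAD` diffs against the merge base with master, not master's tip, so only the branch's own changes are listed. A minimal, self-contained sketch of that behavior (scratch repo; requires git ≥ 2.28 for `init -b`):

```shell
# Demonstrates the three-dot diff used by the workflow's pylint step:
# only files changed on the feature branch are listed.
set -e
dir=$(mktemp -d)
cd "$dir"
git init -q -b master
git config user.email ci@example.com
git config user.name ci
echo x > base.py
git add . && git commit -qm base
git checkout -qb feature
echo y > feature.py
git add . && git commit -qm feat
git diff --name-only master...HEAD -- '*.py'   # prints: feature.py
```

In CI the resulting list is fed to pylint inside the container; note that if no `.py` files changed the list is empty and pylint exits with a usage error, which is presumably why later commits gate this workflow on changed Python files.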
36 changes: 36 additions & 0 deletions .github/workflows/config_scan.yml
@@ -0,0 +1,36 @@
name: Scan Configs
on: [pull_request,workflow_dispatch]

env:
singularity_image: oras://ghcr.io/berenslab/retinal-rl:singularity-image-latest
sif_file: retinal-rl_singularity-image-latest.sif

jobs:
scan:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v4
- name: Setup Apptainer
uses: eWaterCycle/setup-apptainer@v2
with:
apptainer-version: 1.3.0
- name: Cache Singularity Image
id: cache-singularity
uses: actions/cache@v3
with:
path: ${{ env.sif_file }}
key: ${{ runner.os }}-singularity-${{ hashFiles('resources/retinal-rl.def') }}
restore-keys: |
${{ runner.os }}-singularity-${{ hashFiles('resources/retinal-rl.def') }}
${{ runner.os }}-singularity-
- name: Pull Singularity container
if: steps.cache-singularity.outputs.cache-hit != 'true'
run: |
singularity registry login --username ${{ github.actor }} --password ${{ secrets.GITHUB_TOKEN }} oras://ghcr.io
singularity pull ${{ env.sif_file }} ${{ env.singularity_image }}

- name: Scan classification config
run: |
cp -r resources/config_templates/* config/
singularity exec ${{ env.sif_file }} \
python main.py -m +experiment=cifar10-class-recon command=scan system.device=cpu
35 changes: 35 additions & 0 deletions .github/workflows/container_build.yml
@@ -0,0 +1,35 @@
name: Build Singularity Container

on:
schedule:
- cron: '0 2 1 * *'
push:
paths:
- 'resources/retinal-rl.def'
pull_request:
branches:
- master
paths:
- 'resources/retinal-rl.def'

jobs:
singularity-build:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v4
- uses: eWaterCycle/setup-apptainer@v2
with:
apptainer-version: 1.3.0

- name: Build Singularity container
run: apptainer build retinal-rl.sif resources/retinal-rl.def

- name: Scan classification config / ensure minimal functionality
run: |
cp -r resources/config_templates/* config/
singularity exec retinal-rl.sif python main.py -m +experiment=cifar10-class-recon command=scan system.device=cpu

- name: Push to ghcr.io
run: |
singularity registry login --username ${{ github.actor }} --password ${{ secrets.GITHUB_TOKEN }} oras://ghcr.io
singularity push retinal-rl.sif oras://ghcr.io/berenslab/retinal-rl:singularity-image-latest
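The build job above rebuilds the image from `resources/retinal-rl.def`, smoke-tests it with the config scan, and pushes it to ghcr.io. The actual def file is not part of this diff; purely as a hedged illustration of the shape such a definition file takes (the base image and installed packages here are hypothetical):

```
Bootstrap: docker
From: python:3.10-slim

%post
    # Hypothetical dependency set -- the real definition lives in
    # resources/retinal-rl.def, which this diff does not show.
    pip install --no-cache-dir pylint

%runscript
    exec python "$@"
```

Because the workflow's cache key hashes this def file, any edit to it produces a new key and forces a rebuild and re-push on the next run.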
2 changes: 2 additions & 0 deletions .pylintrc
@@ -0,0 +1,2 @@
[MASTER]
init-hook='import sys; sys.path.append(".")'
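The two-line `.pylintrc` adds the repository root to `sys.path` before linting, so first-party imports such as `doom_creator.util.templates` resolve when pylint analyzes a file from a subdirectory. A small sketch of what the `init-hook` does — pylint simply executes this string at startup:

```python
import sys

# The exact string from .pylintrc; pylint exec()s it before linting.
INIT_HOOK = 'import sys; sys.path.append(".")'

exec(INIT_HOOK)  # after this, "." (the repo root) is importable
print("." in sys.path)  # prints: True
```

This only works as intended when pylint is invoked from the repository root, which is how the CI job runs it.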
@@ -1,3 +1,15 @@
"""Scenario Compiler for Retinal-RL

This module provides a utility to construct scenarios for the retinal-rl project.
It merges YAML files from a specified directory into a scenario specification and
subsequently compiles the scenario defined by those YAML files.

Usage:
python -m exec.compile_scenario [options] [yaml_files...]

Example:
python -m exec.compile_scenario gathering apples"""

import argparse
import os
import sys
@@ -11,6 +23,13 @@


def make_parser():
"""
Create and configure an argument parser for the scenario compiler.
Check the parser's description and argument help strings for further information.

Returns:
argparse.ArgumentParser: Configured argument parser for the scenario compiler.
"""
# Initialize parser
Directories()
parser = argparse.ArgumentParser(
@@ -22,7 +41,7 @@ def make_parser():
running the first time one should use the --preload flag to download the
necessary resources into the --out_dir ('{Directories().CACHE_DIR}').
""",
epilog="Example: python -m exec.compile-scenario gathering apples",
epilog="Example: python -m exec.compile_scenario gathering apples",
)
# Positional argument for scenario yaml files (required, can be multiple)
parser.add_argument(
@@ -44,7 +63,7 @@ def make_parser():
parser.add_argument(
"--dataset_dir",
default=None,
help="source directory of a dataset (for preloading), if you already downloaded it somewhere",
help="source directory of a dataset (for preloading), if already downloaded somewhere",
)
parser.add_argument(
"--resource_dir",
@@ -72,6 +91,15 @@ def make_parser():


def main():
"""
Main function to parse arguments and execute scenario compilation tasks.

The function supports various modes of operation including preloading resources,
listing available YAML files, and creating scenarios. It also handles error
checking and warns the user if no actions are specified.

For more detailed documentation, check the parser.
"""
# Parse args
argv = sys.argv[1:]
parser = make_parser()
@@ -92,7 +120,7 @@ def main():
print(f"Listing contents of {dirs.SCENARIO_YAML_DIR}:")
for flnm in os.listdir(dirs.SCENARIO_YAML_DIR):
print(flnm)
print(f"If you want to load from a different folder, change this to")
print("If you want to load from a different folder, change this to")
if do_make:
cfg, needed_types = check_preload(cfg, args.test)
any_dataset = False
16 changes: 11 additions & 5 deletions doom_creator/util/_templates/vizdoom.py
@@ -1,5 +1,13 @@
def config(scenario_name):
return """\
"""
This module contains a dummy for a vizdoom config.
"""


def config(scenario_name: str):
"""Returns a config for a vizdoom game,
referencing {scenario_name}.zip as the scenario."""

return f"""\
doom_scenario_path = {scenario_name}.zip

living_reward = 0.0
@@ -32,6 +40,4 @@ def config(scenario_name):
available_game_variables = {{ HEALTH }}

mode = PLAYER
""".format(
scenario_name=scenario_name
)
"""
7 changes: 7 additions & 0 deletions doom_creator/util/templates.py
@@ -1 +1,8 @@
"""Templates for doom scenario creation

This module provides the templates for acs scripts, decorate
definitions and the overall config (vizdoom)
"""
from doom_creator.util._templates import acs, decorate, vizdoom

__all__ = ['acs', 'decorate', 'vizdoom']
@@ -24,8 +24,8 @@ circuits:
_target_: retinal_rl.models.circuits.convolutional.ConvolutionalEncoder
num_layers: 2
num_channels: [16,32] # Two layers with 16 and 32 channels
kernel_size: ${kernel_size}
stride: ${stride}
kernel_size: 8
stride: 2
act_name: ${activation}
layer_names: ["bipolar", "retinal_ganglion"] # Names inspired by retinal cell types

@@ -34,8 +34,8 @@
_target_: retinal_rl.models.circuits.convolutional.ConvolutionalEncoder
num_layers: 1
num_channels: 64
kernel_size: ${kernel_size}
stride: ${stride}
kernel_size: 5
stride: 1
act_name: ${activation}
layer_names: ["lgn"] # Lateral Geniculate Nucleus

@@ -44,27 +44,27 @@
_target_: retinal_rl.models.circuits.convolutional.ConvolutionalEncoder
num_layers: 1
num_channels: 64
kernel_size: ${kernel_size}
stride: ${stride}
kernel_size: 8
stride: 2
act_name: ${activation}
layer_names: ["v1"] # Primary Visual Cortex

# Prefrontal Cortex: high-level cognitive processing
pfc:
_target_: retinal_rl.models.circuits.fully_connected.FullyConnectedEncoder
output_shape:
- ${latent_size} # Size of the latent representation
- 128 # Size of the latent representation
hidden_units:
- ${hidden_units} # Number of hidden units
- 64 # Number of hidden units
act_name: ${activation}

# Decoder: for reconstructing the input from the latent representation
decoder:
_target_: retinal_rl.models.circuits.convolutional.ConvolutionalDecoder
num_layers: 3
num_channels: [32,16,3] # For a symmetric encoder, this should be the reverse of the num_channels in the CNN layers up to the point of decoding (in this case, the thalamus)
kernel_size: ${kernel_size}
stride: ${stride}
kernel_size: [5,8,8]
stride: [1,2,2]
act_name: ${activation}

# Classifier: for categorizing the input into classes
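The brain-config edit replaces the shared `${kernel_size}`/`${stride}` interpolations with hand-picked per-layer values: a single top-level scalar cannot express a decoder that needs kernel sizes `[5,8,8]` and strides `[1,2,2]`, which is presumably why the values were hardcoded. A toy resolver illustrating the `${var}` substitution the old config relied on (a pure-Python sketch, not Hydra/OmegaConf's actual implementation):

```python
import re

def resolve(value, variables):
    """Substitute a whole-string ${name} reference with its value."""
    if not isinstance(value, str):
        return value
    match = re.fullmatch(r"\$\{(\w+)\}", value)
    return variables[match.group(1)] if match else value

top_level = {"kernel_size": 8, "stride": 2}

# Old config: every circuit shared one interpolated value.
assert resolve("${kernel_size}", top_level) == 8

# New config: per-layer literals for the decoder, which interpolation
# of a single shared scalar could not express.
decoder = {"kernel_size": [5, 8, 8], "stride": [1, 2, 2]}
assert resolve(decoder["kernel_size"], top_level) == [5, 8, 8]
print("ok")
```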
@@ -11,11 +11,9 @@ imageset:
image_rescale_range: [1, 5]
noise_transforms:
- _target_: retinal_rl.datasets.transforms.ShotNoiseTransform
lambda_range: [0.8, 1.2]
lambda_range: [0.5, 1.5]
- _target_: retinal_rl.datasets.transforms.ContrastTransform
contrast_range: [0.5, 1.5]
contrast_range: [0.01, 1.2]
apply_normalization: true
# normalization_mean: [0.4914, 0.4822, 0.4465]
# normalization_std: [0.2023, 0.1994, 0.2010]
fixed_transformation: false
multiplier: 1
29 changes: 13 additions & 16 deletions resources/config_templates/user/experiment/cifar10-class-recon.yaml
@@ -1,25 +1,22 @@
# This is the main entry point for users to specify their config parameters, and
# should be freely copied and edited.

# Defaults for the various subconfigs. Can be overridden from the commandline
# with e.g. experiment/brain=new_brain, where new_brain.yaml lives in the brain
# subdirectory
# @package _global_
defaults:
- _self_
- sweep: kernel-size
- dataset: cifar10-large
- brain: retinal-classifier
- optimizer: class-recon
- override /dataset: cifar10-decontrast
- override /brain: shallow-autoencoder
- override /optimizer: recon-weight

# This *must* match the experiment file name
name: cifar10-class-recon
# This is the main entry point for control of a retinal-rl experiment. Variables
# created here will be top-level, and defaults can be set for the various parts
# of an experiment (NB: do not add comments above the defaults list or it will
# break the config system.)
framework: classification

# This is a free list of parameters that can be interpolated by the subconfigs
# in sweep, dataset, brain, and optimizer. A major use for this is interpolating
# values in the subconfigs, and then looping over them in a sweep.
latent_size: 128
hidden_units: 64
activation: "elu"
kernel_size: 8
stride: 2
activation_sparsity: 0.0001
weight_decay: 0.0001
sparse_objective: retinal_rl.models.objective.L1Sparsity
recon_weight_retina: 1
recon_weight_thalamus: 0.99
46 changes: 0 additions & 46 deletions resources/config_templates/user/optimizer/class-recon.yaml

This file was deleted.
