```shell
pip install mpyl
mpyl --help
```
It is recommended to run this before running any other commands:

```shell
mpyl health
```

This will validate the configuration and check whether all required tools are installed.
Find out which projects need to be built:

```shell
mpyl build status
```
Run a build:

```shell
mpyl build run
```
Create a pull request:

```shell
gh pr create --draft
```

If you use MPyL in a GitHub action, a build will be triggered automatically and the results will be reported there. If you use Jenkins as your CI tool, you can trigger a build on your pull request with:

```shell
mpyl build jenkins
```
``` .. include:: tests/cli/test_resources/main_help_text.txt ```
Top-level command options are passed on to subcommands and need to be specified before the subcommand. In `mpyl projects --filter <name> list`, the `--filter` option applies to all `projects` commands, like `list` or `lint`.
Projects
``` .. include:: tests/cli/test_resources/projects_help_text.txt ```

Repo

``` .. include:: tests/cli/test_resources/repo_help_text.txt ```

Build

``` .. include:: tests/cli/test_resources/build_help_text.txt ```

MPyL can be configured through a file that adheres to the `mpyl_config.yml` schema.
Which configuration fields need to be set depends on your use case. The error messages that you encounter while using the CLI may guide you through the process. Note that the included `mpyl_config.example.yml` is just an example.

Secrets can be injected through environment variable substitution via the pyaml-env library. Note that values for which the environment variable is not set will be absent from the resulting configuration dictionary.
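To illustrate that substitution behavior, here is a minimal stdlib sketch (not pyaml-env's actual implementation) that resolves `${VAR}` placeholders from the environment and yields `None` for unset variables, mirroring how such values end up absent in the parsed configuration:

```python
import os
import re

ENV_PATTERN = re.compile(r"\$\{(\w+)\}")

def substitute_env(value: str):
    """Resolve ${VAR} placeholders from the environment.

    Returns None when a referenced variable is unset, so the caller
    can drop the key from the resulting configuration dictionary.
    """
    names = ENV_PATTERN.findall(value)
    if not names:
        return value
    if any(name not in os.environ for name in names):
        return None
    return ENV_PATTERN.sub(lambda m: os.environ[m.group(1)], value)

os.environ["GIT_CREDENTIALS_USR"] = "ci-bot"
print(substitute_env("${GIT_CREDENTIALS_USR}"))  # -> ci-bot
print(substitute_env("${SOME_UNSET_SECRET}"))    # -> None
```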
Example config
```yaml .. include:: mpyl_config.example.yml ```

Check the schema for `run_properties.yml`, which contains detailed documentation and can be used to enable on-the-fly validation and auto-completion in your IDE.
MPyL can be configured to use an arbitrary set of build stages. Typical CI/CD stages are `build`, `test` or `deploy`. See `mpyl.steps` for the steps that come bundled and for how to define and register your own.
Example stage configuration
```yaml .. include:: mpyl_stages.schema.yml ```

Usability of the CLI is greatly enhanced by autocompletion. To enable autocompletion, depending on your terminal, do the following:
Add this to `~/.bashrc`:

```shell
eval "$(_MPYL_COMPLETE=bash_source mpyl)"
```

Add this to `~/.zshrc`:

```shell
eval "$(_MPYL_COMPLETE=zsh_source mpyl)"
```

Add this to `~/.config/fish/completions/foo-bar.fish`:

```shell
eval (env _MPYL_COMPLETE=fish_source mpyl)
```
Go to: Preferences | Languages & Frameworks | Schemas and DTDs | JSON Schema Mappings
- Add new schema
- Add matching schema file from latest release:
- */deployment/project.yml -> https://vandebron.github.io/mpyl/schema/project.schema.yml
- mpyl_config.example.yml -> https://vandebron.github.io/mpyl/schema/mpyl_config.schema.yml
- run_properties.yml -> https://vandebron.github.io/mpyl/schema/run_properties.schema.yml
- Select version: JSON Schema Version 7
- Add YAML files corresponding to the schema, or add a file pattern. (For instance, adding the file pattern `project.yml` to `project.schema.yml` enables autocompletion in any `project.yml`.)
All CI/CD related files reside in a `./deployment` subfolder, relative to the project source code folder. A typical deployment folder may contain the following files:
```
├── Dockerfile-mpl
├── project.yml
└── docker-compose-test.yml
```
The `project.yml` defines which steps need to be executed during the CI/CD process.

```yaml
name: batterypackApi
stages:
  build: Sbt Build
  test: Sbt Test
  deploy: Kubernetes Deploy
```
- `name` is a required parameter.
- `stages` are optional parameters. Stages that are left undefined will be skipped. Depending on the type of project you want to build, you need to specify an appropriate action to be performed in each stage. For example, `Sbt Build` can be used for Scala projects, and `Docker Build` can be used for front-end projects.
- `kubernetes` is a required parameter if the `deploy` stage is set to `Kubernetes Deploy`.
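The rules above can be expressed as a small, hypothetical validation sketch (not MPyL's actual schema validation), which may help clarify how the parameters interact:

```python
# Hypothetical sketch of the project.yml rules described above:
# 'name' is required, stages are optional, and 'kubernetes' is required
# when the deploy stage is 'Kubernetes Deploy'.
def validate_project(project: dict) -> list:
    errors = []
    if "name" not in project:
        errors.append("'name' is a required parameter")
    stages = project.get("stages", {})
    if stages.get("deploy") == "Kubernetes Deploy" and "kubernetes" not in project:
        errors.append("'kubernetes' is required when deploy is 'Kubernetes Deploy'")
    return errors

# A project with a Kubernetes deploy stage but no 'kubernetes' section fails:
print(validate_project({"name": "batterypackApi",
                        "stages": {"deploy": "Kubernetes Deploy"}}))
```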
The schema for `project.yml` contains detailed documentation and can be used to enable on-the-fly validation and auto-completion in your IDE.
MPyL is not a task runner, nor is it a tool to define and run CI/CD flows. It does, however, provide building blocks that can easily be plugged into any existing CI/CD platform.
GitHub Actions are a natural fit for MPyL. To build a pull request, you can use the following workflow:
```yaml
name: Build pull request
on:
  push:
    branches-ignore: [ 'main' ]

jobs:
  Build_PR:
    name: Build and deploy the pull request
    runs-on: ubuntu-latest
    steps:
      - uses: actions/setup-python@v4
        with:
          python-version: '3.9'
      - name: Install MPyL
        run: pip install 'mpyl==<latest_version>'
      - name: Initialize repo
        run: mpyl repo init --branch ${{ github.ref }} --url https://${{ env.GITHUB_TOKEN }}@github.com/${{ github.repository }}.git --pristine
      - name: Print execution plan
        run: mpyl build status
      - name: Build run
        run: mpyl build run
```
The `--pristine` flag in the `mpyl repo init` command will clone the repository into the current empty workspace, using:

```shell
git clone --shallow-exclude main --single-branch --branch <branch_name> https://github.com/<org>/<repo>.git
```

This results in a shallow clone of the repository, containing only the files that are relevant for the current pull request.
Although Dagster's primary focus is data processing and lineage, it can be used as a runner for MPyL. It provides a nice UI to inspect the flow and logs, and it supports concurrent execution of steps in a natural way. These features make it a convenient runner for local development and debugging.
Dagster flow runner
```python .. include:: mpyl-dagster-example.py ```

It can be started from the command line with `dagit --workspace workspace.yml`.
Docker image layers built in previous runs can be used as a cache for subsequent runs. An external cache source can be configured in `mpyl_config.yml` as follows:
```yaml
docker:
  registry:
    cache:
      cacheFromRegistry: true
      custom:
        to: 'type=gha,mode=max'
        from: 'type=gha'
```
The `to` and `from` fields map to the `--cache-to` and `--cache-from` buildx arguments.
The Docker cache can be used in both the `mpyl.steps.build.dockerbuild` and `mpyl.steps.test.dockertest` steps.
MPyL's artifact metadata is stored in hidden `.mpyl` folders next to `project.yml`. These folders are used to cache information about (intermediate) build results. A typical `.mpyl` folder has a file for each executed stage. The `BUILD.yml` file contains the metadata for the build step. For example:
```yaml
message: Pushed ghcr.io/samtheisens/nodeservice:pr-6
produced_artifact: !Artifact
  artifact_type: !ArtifactType DOCKER_IMAGE-1
  revision: b6c87b70c3c16174bdacac6c7dd4ef71b4bb0047
  producing_step: After Docker Build
  spec: !DockerImageSpec
    image: ghcr.io/samtheisens/nodeservice:pr-6
```
These files speed up subsequent runs by preventing steps from being executed when their inputs have not changed.
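The skip-when-unchanged idea can be illustrated with a small, hypothetical sketch (not MPyL's actual implementation): record a digest of a step's inputs and skip the step when the digest has not changed since the previous run:

```python
import hashlib
import json
import tempfile
from pathlib import Path

def step_can_be_skipped(cache_file: Path, inputs: dict) -> bool:
    """Return True when the recorded input digest matches the current one.

    Illustrative only: MPyL stores richer metadata per stage, but the
    principle is the same -- unchanged inputs mean the step is skipped.
    """
    digest = hashlib.sha256(
        json.dumps(inputs, sort_keys=True).encode()
    ).hexdigest()
    if cache_file.exists() and cache_file.read_text() == digest:
        return True
    cache_file.write_text(digest)  # remember inputs for the next run
    return False

cache = Path(tempfile.mkdtemp()) / "BUILD.digest"
print(step_can_be_skipped(cache, {"dockerfile": "FROM python:3.9"}))   # False: first run
print(step_can_be_skipped(cache, {"dockerfile": "FROM python:3.9"}))   # True: unchanged
print(step_can_be_skipped(cache, {"dockerfile": "FROM python:3.11"}))  # False: changed
```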
🧹 These `.mpyl` folders can be safely deleted to force a full rebuild via:

```shell
mpyl build clean
```

To preserve intermediate build results between runs, you can use the `mpyl build artifacts push` command at the end of a run. This will push the `.mpyl` folders to the remote repository configured in `mpyl_config.yml`:
```yaml
vcs:
  cachingRepository:
    mainBranch: 'main'
    remote:
      url: 'https://github.com/acme/artifact-repo.git'
      userName: !ENV ${GIT_CREDENTIALS_USR}
      password: !ENV ${GIT_CREDENTIALS_PSW}
      email: "[email protected]"
```
To pull the previously pushed artifacts, use `mpyl build artifacts pull` at the beginning of your run.
MPyL comes with built-in reporters for GitHub, Jira and Slack. See `mpyl.reporting.targets` for how to configure them and for instructions on how to create your own reporter.

See `mpyl.steps`.
If the output of your build step is a Docker image, you can use the `mpyl.steps.build.docker_after_build` step to make sure the resulting image is tagged, pushed to the registry, and made available as an artifact for later (deploy) steps.
MPyL can parse JUnit test results for reporting purposes. Your test step needs to produce a `mpyl.steps.models.ArtifactType.JUNIT_TESTS` artifact. See `mpyl.steps.test.echo` for an example of how such an artifact can be created.
If your project includes "integration tests" that require a Docker container to run during the test stage, you can define these containers in a file named `docker-compose-test.yml`. For example, to test your database schema upgrades with a real Postgres database:

Example `docker-compose-test.yml`

```yaml .. include:: tests/projects/service/deployment/docker-compose-test.yml ```

Note: make sure to define a reliable `healthcheck` to prevent your tests from being run before the database is fully up and running.
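As an illustration, a `healthcheck` for a Postgres service could look like the fragment below. The image tag, password and timings are assumptions; adjust them to your setup:

```yaml
services:
  postgres:
    image: postgres:15  # assumed tag; use the version you test against
    environment:
      POSTGRES_PASSWORD: test
    healthcheck:
      # pg_isready only succeeds once the server accepts connections,
      # so dependent tests wait until the database is actually usable.
      test: [ "CMD-SHELL", "pg_isready -U postgres" ]
      interval: 2s
      timeout: 2s
      retries: 15
```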