Developers who use Kabanero pipelines often need to extend them to perform tasks that the out-of-the-box Kabanero pipelines do not cover. These tasks may include code coverage, or integrating third-party applications such as Pact Broker, SonarQube, or Artifactory to fulfill software requirements. Currently, there are few methods for managing and version-controlling your Kabanero pipelines, and the goal of this repository is to help you get going.
You will learn how to package and host your pipelines in different environments, such as Git or Artifactory, and how to use these pipelines to automate updating the Kabanero custom resource to point at the host where your pipelines live.
- Extend, Build & Deploy Kabanero Pipelines
- Overview
- Pre-requisites
- How to use artifactory-package-release-update pipeline
- How to use clone-storefront-ms-push-repos-to-org pipeline
- How to use git-package-release-update pipeline
- Create tekton webhook
This repository includes two directories: experimental (pipelines that are not production-ready and are considered proof of concept) and stable (pipelines that are production-ready). It also includes the following pipelines:
Stable | Description |
---|---|
artifactory-package-release-update | Compresses custom pipelines, uploads the compressed pipelines to Artifactory, and updates the Kabanero Custom Resource |
clone-storefront-ms-push-repos-to-org | Given a GitHub org name, clones the storefront microservices and deploys them to that GitHub org |
git-package-release-update | Compresses custom pipelines, creates a GitHub release, uploads the compressed pipelines to the release, and updates the Kabanero Custom Resource |
storefront-springboot | Contains healthcheck, sonar-scan, and pact-broker tasks for Spring Boot applications |
Incubator | Description |
---|---|
cloud-foundry | Deploys an application to Cloud Foundry in a given namespace using the Cloud Foundry CLI |
deploy-app-ibm-cloud | Deploys an application to Cloud Foundry in a given namespace using the IBM Cloud CLI |
- Install the following CLIs on your laptop/workstation:
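  The exact tooling can vary by environment; as an assumption, the steps below use at least the OpenShift CLI (`oc`), `git`, and the Tekton CLI (`tkn`). A quick way to confirm they are installed:

  ```bash
  # Check that the assumed CLIs are available
  oc version --client
  git --version
  tkn version
  ```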
- Fork the devops-pipelines repository
- Deploy Artifactory on your OpenShift cluster
- Generate an API key.
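  API keys are typically generated from the Artifactory UI; as a sketch, the same can usually be done through Artifactory's REST API (the host, username, and password below are placeholders):

  ```bash
  # Create an API key for the authenticated Artifactory user
  curl -u <username>:<password> -X POST \
    https://<your-artifactory-host>/artifactory/api/security/apiKey
  ```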
- Update the Artifactory config map artifactory-config.yaml with your `artifactory_key`. Once done, run the following commands:

  ```bash
  oc project kabanero
  cd ./configmaps
  oc apply -f artifactory-config.yaml
  ```
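  A minimal sketch of what `artifactory-config.yaml` could contain; the `artifactory_key` entry is the one this step updates, while the config map name and the `artifactory_url` field are assumptions for illustration:

  ```yaml
  # Hypothetical shape of the Artifactory config map; align it with the file in ./configmaps
  apiVersion: v1
  kind: ConfigMap
  metadata:
    name: artifactory-config
    namespace: kabanero
  data:
    artifactory_key: <your-artifactory-api-key>
    artifactory_url: https://<your-artifactory-host>/artifactory  # assumed field
  ```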
- Go to the pipelines directory and make any modifications you want to any of the pipelines, or include your own.
- Create your pipeline by running the following commands:

  ```bash
  cd pipelines/experimental
  oc apply --recursive --filename pipelines/experimental/artifactory-package-release-update/
  ```
- Go to the Tekton dashboard and verify that the artifactory-package-release-update-pl pipeline has been added; a CLI check is sketched below.
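  If you prefer the CLI, the same check can be done with the Tekton CLI (`tkn`), assuming it is installed:

  ```bash
  # List the pipelines and tasks in the kabanero namespace
  tkn pipeline list -n kabanero
  tkn task list -n kabanero
  ```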
- Go to the section Create tekton webhook to create your webhook.
- Go to your forked repository and make a change; your Tekton dashboard should create a new pipeline run as shown below, where the `git-source` is defined as the pipeline resource with key `url` and value set to your GitHub repo URL (a sketch of such a resource follows).
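  A sketch of how such a `git-source` resource could be declared, using the Tekton v1alpha1 PipelineResource type; the resource name, namespace, and URL are placeholders you would normally enter through the dashboard:

  ```yaml
  apiVersion: tekton.dev/v1alpha1
  kind: PipelineResource
  metadata:
    name: git-source
    namespace: kabanero
  spec:
    type: git
    params:
      - name: url
        value: https://github.com/<your-user>/devops-pipelines  # your forked repo URL
  ```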
The end result should look like the following:
You can use a pipeline to automate the process of extending, packaging and releasing your pipelines via a Git Release. The process is very similar to the section above.
- Fork this repository devops-pipelines
- Add your custom pipelines or modify an existing one.

  If you inspect `./pipelines/`, you can create a new folder for each new pipeline you have and follow a structure similar to the one below:

  ```
  ./devops-pipelines/pipelines
  ├── experimental
  │   ├── README.md
  │   ├── abc
  │   │   ├── bindings
  │   │   │   ├── abc-pl-pullrequest-binding.yaml
  │   │   │   └── abc-pl-push-binding.yaml
  │   │   ├── configmaps
  │   │   │   └── abc-pl-configmap.yaml
  │   │   ├── pipelines
  │   │   │   └── abc-pl.yaml
  │   │   ├── secrets
  │   │   │   └── abc-pl-secret.yaml
  │   │   ├── tasks
  │   │   │   └── abc-task.yaml
  │   │   └── template
  │   │       └── abc-pl-template.yaml
  │   └── manifest.yaml
  ├── stable
  │   ├── README.md
  │   ├── cde
  │   │   ├── bindings
  │   │   │   ├── cde-pl-pullrequest-binding.yaml
  │   │   │   └── cde-pl-push-binding.yaml
  │   │   ├── configmaps
  │   │   │   └── cde-pl-configmap.yaml
  │   │   ├── pipelines
  │   │   │   └── cde-pl.yaml
  │   │   ├── secrets
  │   │   │   └── cde-pl-secret.yaml
  │   │   ├── tasks
  │   │   │   └── cde-task.yaml
  │   │   └── template
  │   │       └── cde-pl-template.yaml
  │   └── manifest.yaml
  ```

  Pipelines in experimental do not get built.
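  Purely as an illustration (not taken from this repository's pipelines), a minimal Tekton task such as the hypothetical `tasks/abc-task.yaml` above could look like this:

  ```yaml
  apiVersion: tekton.dev/v1beta1
  kind: Task
  metadata:
    name: abc-task
    namespace: kabanero
  spec:
    steps:
      - name: say-hello
        image: registry.access.redhat.com/ubi8/ubi-minimal
        script: |
          #!/bin/sh
          echo "Running the abc task"
  ```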
- Now drag and drop your pipelines and tasks into any of these folders.
- You must update the `configmap` and `secret` we provide for you. But first, create another repository, such as devops-server; in this devops-server repo you will host your pipelines as Git releases. Do not forget to create a README.md file.

  Navigate to `pipelines/stable/git-package-release-update/configmaps` and update `pipeline-server-configmap.yaml`:

  ```yaml
  apiVersion: v1
  kind: ConfigMap
  metadata:
    name: pipeline-server-configmap
    namespace: kabanero
  data:
    repo_org: your-github-username-or-org
    repo_name: your-github-repo-where-you-will-host-pipelines
    image_registry_publish: 'false'
    kabanero_pipeline_id: pipeline-manager
  ```
  Update the secret in `pipelines/stable/git-package-release-update/secrets/`:

  ```yaml
  apiVersion: v1
  kind: Secret
  metadata:
    name: pipeline-server-git
    namespace: kabanero
  type: kubernetes.io/basic-auth
  data:
    password: your-git-token-encoded
    username: your-git-username-encoded
  ```
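  Because the secret uses a `data` section (not `stringData`), both values must be base64-encoded. For example, assuming a shell with `base64` available:

  ```bash
  # Encode your Git username and token before pasting them into the Secret
  echo -n 'your-git-username' | base64
  echo -n 'your-git-token' | base64
  ```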
  Now run the following command so that the `kabanero-pipeline` service account is able to retrieve resources:

  ```bash
  oc adm policy add-cluster-role-to-user view system:serviceaccount:kabanero:kabanero-pipeline
  ```
- Create a webhook for the devops-pipelines repository you created in step 1.
- Deploy your pipeline, tasks, event bindings, and trigger templates by running the following commands in the devops-pipelines repo you created in step 1:

  ```bash
  oc apply --recursive --filename pipelines/stable/git-package-release-update
  git add .
  git commit -m "adding new pipelines..."
  git push
  ```
Your output should be the following:
If you go to the `devops-server` repo you created in step 2, you should see a new release with your zip files as shown below:
Now inspect your Kabanero Custom Resource to ensure your `default-kabanero-pipelines.tar.gz` got added to the `pipelines` key-value pair:
```yaml
$ oc get kabanero -o yaml
...
stacks:
  pipelines:
  - https:
      url: https://github.com/ibm-garage-ref-storefront/pipelines-server/releases/download/1.0/default-kabanero-pipelines.tar.gz
    id: pipeline-manager
    sha256: 8fe10018016e5059640b1a790afe2d6a1ff6c4f54bf3e7e4fa3fc0f82bb2207d
```
The pipelines that you added to the devops-pipelines repository should now be visible on the Tekton dashboard as shown below:
Now you can reuse these pipelines across your organization! And if your cluster ever goes down, you have a backup of your pipelines.
You need to create an access token on the Tekton dashboard or CLI in the kabanero namespace. Earlier you created a GitHub token on the GitHub dashboard; retrieve that token or generate another one and paste it below.
Webhook Settings:
- Name: devops-demo-kabanero-pipelines
- Repository URL: your forked repo URL goes here
- Access Token: the token you generated previously

Target Pipeline Settings:
- Namespace: kabanero
- Pipeline: choose artifactory-package-release-update-pl or git-package-release-update-pl
- Service Account: Pipeline
- Docker Registry: us.icr.io/project-name or docker.hub.io/projectname
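Once the webhook fires, you can also follow the resulting runs from the command line, assuming the Tekton CLI (`tkn`) is installed:

```bash
# List recent pipeline runs and stream the logs of the latest one
tkn pipelinerun list -n kabanero
tkn pipelinerun logs --last -f -n kabanero
```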