Conversation
dleard commented Mar 19, 2020 (edited)
- Adds modified code from https://github.com/puckel/docker-airflow to the docker folder
- Changes all logger handlers to 'console' (see the sketch after this list)
- Follows / adapts the instructions from https://github.com/apache/airflow/blob/1e3cdddcd87be3c0f11b43efea11cdbddaff4470/docs/howto/write-logs.rst, specifically the section 'Writing logs to Azure Blob Storage'
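For context, a minimal sketch of what routing all logger handlers to 'console' looks like in an Airflow logging config. The 'console' handler entry matches the one added in this PR; the formatter string and the specific loggers listed are assumptions based on Airflow's default logging config, not a copy of the file in this change.

# Sketch only: every Airflow logger writes to a single stdout handler.
# The formatter string and logger names below follow Airflow's defaults
# and may differ slightly from the config added in this PR.
LOGGING_CONFIG = {
    'version': 1,
    'disable_existing_loggers': False,
    'formatters': {
        'airflow.task': {
            'format': '[%(asctime)s] {%(filename)s:%(lineno)d} %(levelname)s - %(message)s',
        },
    },
    'handlers': {
        'console': {
            'class': 'logging.StreamHandler',
            'formatter': 'airflow.task',
            'stream': 'ext://sys.stdout',
        },
    },
    'loggers': {
        'airflow.processor': {'handlers': ['console'], 'level': 'INFO', 'propagate': False},
        'airflow.task': {'handlers': ['console'], 'level': 'INFO', 'propagate': False},
    },
    'root': {'handlers': ['console'], 'level': 'INFO'},
}

Airflow is pointed at a dict like this via the logging_config_class option in airflow.cfg.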
notes about semver might be worth a team discussion; only minor things otherwise :)
extraVolumeMounts:
  - name: cas-airflow-home
-   mountPath: /usr/local/airflow
+   mountPath: /usr/local/airflow/dags
👍
cas-airflow/values.yaml (Outdated)
@@ -2,10 +2,10 @@ airflow:
 airflow:
   image:
     repository: docker-registry.default.svc:5000/wksv3k-tools/cas-airflow
-    tag: 0.0.1
+    tag: '0.1.0-pre.1.0'
While this does fit the grammar of a semver tag, maybe we could use build identifiers instead of pre-release identifiers? See item 10 of the spec.
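Purely illustrative (not from this PR): the same base version written with pre-release identifiers versus with build metadata appended, per items 9 and 10 of the semver spec. The variable names and tag values are hypothetical.

# Illustration only, hypothetical tags: pre-release identifiers come after '-'
# and lower the version's precedence; build identifiers come after '+' and are
# ignored when ordering versions.
PRE_RELEASE_TAG = '0.1.0-pre.1'
BUILD_METADATA_TAG = '0.1.0+build.1'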
@@ -1,9 +1,9 @@
 dependencies:
 - name: postgresql
   repository: file://../postgresql
-  version: 0.13.1-p3
+  version: 0.13.1-p4
Semver thoughts would apply here too; this is becoming more important in the context of our team's release management strategy.
docker/cas-airflow/Dockerfile (Outdated)
# Never prompt the user for choices on installation/configuration of packages
ENV DEBIAN_FRONTEND noninteractive
ENV TERM linux

LABEL io.openshift.s2i.assemble-user="1000"
USER root
RUN set -ex \
    && apt-get update -yqq \
    && apt-get upgrade -yqq \
    && apt-get install python-lxml -yqq \
    && pip install lxml

COPY root/usr/local/airflow/airflow.cfg ${AIRFLOW_USER_HOME}/airflow.cfg
COPY root/usr/local/airflow/config/__init__.py ${AIRFLOW_USER_HOME}/__init__.py
COPY root/usr/local/airflow/config ${AIRFLOW_USER_HOME}/config
looks great!
@@ -64,7 +64,7 @@ persistence:
   ## set, choosing the default provisioner. (gp2 on AWS, standard on
   ## GKE, AWS & OpenStack)
   ##
-  storageClass: netapp-block-standard
+  storageClass: netapp-file-standard
Let's try changing this back to netapp-block-standard and see if it's still blowing up; we really should use block storage for the db if we can get it to provision properly.
still exploding
'console': {
    'class': 'logging.StreamHandler',
    'formatter': 'airflow.task',
    'stream': 'ext://sys.stdout'
nice 👍
The changes to values.yaml in cas-airflow appear to copy everything to $HOME/dags, so we end up with all the contents of $HOME in the dags folder, and then the dags themselves in $HOME/dags/dags.
Yes, that sounds expected; we'll need to update the volume itself to move the dags up a folder level (and remove the home folder files from the volume).
It appears that sending task logs directly to the console handler doesn't work; they need some extra processing. Attempting to use the elasticsearch handler to stream to stdout instead. See GoogleCloudPlatform/airflow-operator#72.
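A rough sketch of the direction being attempted here, assuming an Airflow 1.10.4+ ElasticsearchTaskHandler that accepts write_stdout / json_format / json_fields options; the folder path and template strings are placeholders based on Airflow defaults, not values from this repo.

# Sketch only: replace the 'task' handler with the Elasticsearch task handler
# configured to emit JSON log lines to stdout rather than writing local files.
# Option names (write_stdout, json_format, json_fields) are assumed to exist
# in the pinned Airflow version; paths and templates are placeholders.
ES_TASK_HANDLER = {
    'task': {
        'class': 'airflow.utils.log.es_task_handler.ElasticsearchTaskHandler',
        'formatter': 'airflow.task',
        'base_log_folder': '/usr/local/airflow/logs',
        'filename_template': '{{ ti.dag_id }}/{{ ti.task_id }}/{{ ts }}/{{ try_number }}.log',
        'log_id_template': '{dag_id}-{task_id}-{execution_date}-{try_number}',
        'end_of_log_mark': 'end_of_log',
        'write_stdout': True,
        'json_format': True,
        'json_fields': 'asctime, filename, lineno, levelname, message',
        'host': 'localhost:9200',
    },
}

This dict would be merged into the 'handlers' section of the logging config so that the 'airflow.task' logger streams structured log lines to stdout for the platform log collector to pick up.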