This repository contains example DAGs showing features released in Apache Airflow 2.9.
Aside from core Apache Airflow, this project uses:
- The Astro CLI to run Airflow locally (version 1.25.0).
- The Amazon Airflow provider with the `s3fs` extra installed.
- The Google Airflow provider.
- The Microsoft Azure Airflow provider.
- The Common IO Airflow provider.
- The Slack Airflow provider.
- The Snowflake Airflow provider.
- The `transformers` package.
- The `torch` package.
For pinned versions of the provider packages, see the `requirements.txt` file.
Note: You can find new Airflow 2.9 features in the DAG code by searching for `# NEW in Airflow 2.9:`.
This section explains how to run this repository with Airflow.
Note: Some DAGs in this repository require additional connections or tools. You can define these connections in the Airflow UI under Admin > Connections or by using the `.env` file with the format shown in `.env.example`.
The `load_to_snowflake` DAG requires some additional setup in Snowflake; see the DAG docstring for more information.
DAGs with the tag `toy` work without any additional connections or tools.
See the Manage Connections in Apache Airflow guide for further instructions on Airflow connections.
Download the Astro CLI to run Airflow locally in Docker. `astro` is the only package you will need to install.
- Run `git clone https://github.com/astronomer/2-9-example-dags.git` on your computer to create a local clone of this repository.
- Install the Astro CLI by following the steps in the Astro CLI documentation. Docker Desktop/Docker Engine is a prerequisite, but you don't need in-depth Docker knowledge to run Airflow with the Astro CLI.
- Run `astro dev start` in your cloned repository.
- After your Astro project has started, view the Airflow UI at `localhost:8080`.
The following sections list the DAGs sorted by the feature they showcase. You can filter DAGs in the Airflow UI by their tags.
The DAGs in the data-engineering-use-case folder showcase a data engineering use case using AWS and Snowflake with several Airflow 2.9 features implemented throughout the DAGs.
- `create_ingestion_dags`: a script that dynamically creates 3 DAGs based on the `include/ingestion_source_config.json` file.
- `load_to_snowflake`: DAG that loads data from S3 to Snowflake. Uses conditional dataset scheduling.
- `analyze_customer_feedback`: DAG that runs sentiment analysis on customer feedback data. Uses named dynamic task map indexes (see the sketch after this list).
- `prepare_earnings_report`: DAG that creates a report on earnings data. Uses a DatasetOrTimeSchedule.
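The named dynamic task map indexes mentioned above use the `map_index_template` parameter added in Airflow 2.9. As a minimal sketch, not taken from this repository (the DAG id, feedback strings, and index template name below are made up), a mapped task can push a readable name for each map index into its context:

```python
"""Minimal sketch of named dynamic task map indexes (Airflow 2.9).
All names in this file are illustrative, not part of this repository."""
from pendulum import datetime

from airflow.decorators import dag, task
from airflow.operators.python import get_current_context


@dag(start_date=datetime(2024, 4, 1), schedule=None, catchup=False)
def named_map_index_sketch():
    @task
    def get_feedback():
        # in the real use case this data would come from object storage
        return ["Great product!", "Too slow for my taste.", "Works as advertised."]

    # map_index_template is rendered after each mapped task instance runs,
    # so the task can push a human-readable name into its own context
    @task(map_index_template="{{ custom_map_index }}")
    def analyze(feedback: str):
        context = get_current_context()
        context["custom_map_index"] = f"Feedback: {feedback[:20]}"
        return len(feedback)  # placeholder for real sentiment analysis

    analyze.expand(feedback=get_feedback())


named_map_index_sketch()
```

With this template, the Mapped Tasks tab shows a descriptive name per mapped instance instead of a bare integer index.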
The DAGs in the toy_conditional_dataset_scheduling folder show new ways to use dataset scheduling without needing any additional connections or tools.
- `upstream1`, `upstream2`, `upstream3`, and `upstream4`: helper DAGs that update the datasets `dataset1`, `dataset2`, `dataset3`, and `dataset4` respectively.
- `downstream1_on_any`: DAG that runs when any of the upstream datasets are updated.
- `downstream2_one_in_each_group`: DAG that runs when one dataset from each group is updated.
- `downstream3_dataset_and_time_schedule`: DAG that runs on a dataset schedule and a time schedule (see the sketch after this list).
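These schedules rely on two Airflow 2.9 additions: Dataset objects can be combined with `&` and `|`, and `DatasetOrTimeSchedule` combines a dataset condition with a time-based timetable. The following is a minimal sketch of both, using illustrative dataset URIs and a daily cron string rather than the exact values from the DAG code:

```python
"""Minimal sketch of conditional dataset scheduling and DatasetOrTimeSchedule
(Airflow 2.9). Dataset URIs and the cron expression are illustrative only."""
from pendulum import datetime

from airflow.datasets import Dataset
from airflow.decorators import dag, task
from airflow.timetables.datasets import DatasetOrTimeSchedule
from airflow.timetables.trigger import CronTriggerTimetable

dataset1 = Dataset("dataset1")
dataset2 = Dataset("dataset2")
dataset3 = Dataset("dataset3")
dataset4 = Dataset("dataset4")


# runs once at least one dataset from each group has received an update
@dag(
    start_date=datetime(2024, 4, 1),
    schedule=((dataset1 | dataset2) & (dataset3 | dataset4)),
    catchup=False,
)
def one_in_each_group_sketch():
    @task
    def report():
        print("One dataset in each group was updated.")

    report()


# runs on a daily cron schedule *or* as soon as any of the datasets is updated
@dag(
    start_date=datetime(2024, 4, 1),
    schedule=DatasetOrTimeSchedule(
        timetable=CronTriggerTimetable("0 0 * * *", timezone="UTC"),
        datasets=(dataset1 | dataset2 | dataset3 | dataset4),
    ),
    catchup=False,
)
def dataset_or_time_sketch():
    @task
    def report():
        print("Triggered by time or by a dataset update.")

    report()


one_in_each_group_sketch()
dataset_or_time_sketch()
```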
- `complex_dag_structure_rainbow`: DAG to see UI features with a complex structure. Uses custom operators from `include/rainbow_operators/rainbow_operators.py`.
- `toy_auto_pause`: DAG that shows how to use the `max_consecutive_failed_dag_runs` parameter (see the first sketch after this list).
- `toy_custom_names_dynamic_tasks_taskflow`: DAG that shows how to use custom names for dynamic task map indexes with `@task`.
- `toy_custom_names_dynamic_tasks_traditional_operators`: DAG that shows how to use custom names for dynamic task map indexes with traditional operators.
- `toy_custom_operator_push_multiple_xcom`: DAG that shows how to push multiple XComs from any operator. Uses a custom operator from `include/toy_helpers/custom_operators.py`.
- `toy_upstream_obj_storage_dataset`: upstream DAG that shows how to use ObjectStoragePath with a Dataset object (see the second sketch after this list).
- `toy_downstream_obj_storage_dataset`: downstream DAG that shows how to use ObjectStoragePath with a Dataset object.
- `toy_on_skipped_callback`: DAG that shows how to use the `on_skipped_callback` parameter. Uses a Slack connection for the callback.
- `toy_task_duration_page`: DAG created to showcase the task duration page.
- `toy_taskflow_bash`: DAG that shows the `@task.bash` decorator.
- `toy_xcom_big_v_small`: DAG that pushes a big and a small object to XCom, to be used with an Object Storage custom XCom backend using a size threshold (see `.env.example` for configuration).
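Several of the parameters above are easiest to understand in code. The following is a minimal sketch, not the repository's DAG code, combining `@task.bash`, `max_consecutive_failed_dag_runs`, and `on_skipped_callback`; the skip callback here just prints instead of posting to Slack, so no connection is needed:

```python
"""Minimal sketch of three Airflow 2.9 features shown by the toy DAGs above:
@task.bash, max_consecutive_failed_dag_runs, and on_skipped_callback."""
from pendulum import datetime

from airflow.decorators import dag, task
from airflow.exceptions import AirflowSkipException


def notify_skip(context):
    # the real toy_on_skipped_callback DAG uses a Slack connection instead
    print(f"Task {context['ti'].task_id} was skipped.")


@dag(
    start_date=datetime(2024, 4, 1),
    schedule=None,
    catchup=False,
    # automatically pause the DAG after 5 consecutive failed runs (new in 2.9)
    max_consecutive_failed_dag_runs=5,
)
def toy_features_sketch():
    # @task.bash turns the returned string into a bash command (new in 2.9)
    @task.bash
    def say_hello():
        return "echo hello from bash"

    # on_skipped_callback fires when the task is skipped (new in 2.9)
    @task(on_skipped_callback=notify_skip)
    def sometimes_skipped():
        raise AirflowSkipException("Nothing to do today.")

    say_hello() >> sometimes_skipped()


toy_features_sketch()
```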
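For the upstream/downstream object storage pair, this second sketch shows the pattern under simplifying assumptions: it uses a local `file://` path so it runs without any cloud connection, and the path and DAG ids are illustrative rather than taken from the repository.

```python
"""Minimal sketch of pairing ObjectStoragePath with a Dataset: the upstream
DAG writes a file and updates the dataset, the downstream DAG is scheduled
on that dataset and reads the file. Path and DAG ids are illustrative."""
from pendulum import datetime

from airflow.datasets import Dataset
from airflow.decorators import dag, task
from airflow.io.path import ObjectStoragePath

URI = "file:///tmp/toy_object_storage_sketch.txt"
my_dataset = Dataset(URI)


@dag(start_date=datetime(2024, 4, 1), schedule=None, catchup=False)
def upstream_obj_storage_sketch():
    # writing the file and updating the dataset via the outlet
    @task(outlets=[my_dataset])
    def write_file():
        path = ObjectStoragePath(URI)
        with path.open("wb") as f:
            f.write(b"hello object storage")

    write_file()


@dag(start_date=datetime(2024, 4, 1), schedule=[my_dataset], catchup=False)
def downstream_obj_storage_sketch():
    # runs whenever the upstream DAG updates the dataset
    @task
    def read_file():
        path = ObjectStoragePath(URI)
        with path.open("rb") as f:
            print(f.read())

    read_file()


upstream_obj_storage_sketch()
downstream_obj_storage_sketch()
```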
- Datasets guide.
- Dynamic Task Mapping guide.
- Options for custom XCom backends in Airflow guide: Coming soon!
- Set up a custom XCom backend using Object Storage tutorial: Coming soon!
- Object Storage Basic tutorial.
- Object Storage OSS Docs tutorial.
- Object Storage OSS Docs guide.
- Airflow Config reference.
This repository contains the following files and folders:
- `.astro`: files necessary for Astro CLI commands.
- `dags`: all DAGs in your Airflow environment. Files in this folder will be parsed by the Airflow scheduler when looking for DAGs to add to your environment. You can add your own DAG files to this folder.
- `include`: supporting files that will be included in the Airflow environment. Among other files, contains the code for the listener plugin in `include/listeners.py`.
- `plugins`: folder to place Airflow plugins. Contains a listener plugin.
- `tests`: folder to place pytests running on DAGs in the Airflow instance. Contains default tests.
- `.astro-registry.yaml`: file to configure DAGs being uploaded to the Astronomer registry. Can be ignored for local development.
- `.dockerignore`: list of files to ignore for Docker.
- `.env.example`: example environment variables for the DAGs in this repository. Copy this file to `.env` and replace the values with your own credentials.
- `.gitignore`: list of files to ignore for git.
- `Dockerfile`: the Dockerfile using the Astro CLI. Sets environment variables to change Airflow webserver settings.
- `packages.txt`: system-level packages to be installed in the Airflow environment upon building of the Docker image. Empty.
- `README.md`: this Readme.
- `requirements.txt`: Python packages installed for use by DAGs upon building of the Docker image.