# INSTALL / BUILD instructions for Apache Airflow
This is a generic installation method that requires a number of dependencies to be installed.
Depending on your system you might need different prerequisites, but the following
systems/prerequisites are known to work:
Linux (Debian Buster and Linux Mint Tricia):
sudo apt install build-essential python3.6-dev python3.7-dev python-dev openssl \
                 sqlite3 libsqlite3-dev default-libmysqlclient-dev libmysqld-dev postgresql
MacOS (Mojave/Catalina):
brew install sqlite mysql postgresql
# [required] fetch the tarball, untar the source and move into the directory that was untarred.
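# For example (a sketch only; the file name below is a placeholder, substitute
# the actual release tarball you downloaded from the Apache mirrors):
tar -xzf apache-airflow-X.Y.Z-source.tar.gz
cd apache-airflow-X.Y.Z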
# [optional] run Apache RAT (release audit tool) to validate license headers
# RAT docs here: https://creadur.apache.org/rat/. Requires Java and Apache RAT.
java -jar apache-rat.jar -E ./.rat-excludes -d .
# [optional] Airflow pulls in quite a lot of dependencies in order
# to connect to other services. You might want to test or run Airflow
# from a virtual env to make sure those dependencies are separated
# from your system-wide versions.
python3 -m venv PATH_TO_YOUR_VENV
source PATH_TO_YOUR_VENV/bin/activate
# [required] build and install using pip (preferred)
pip install .
# or directly via setuptools
python setup.py install
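# [optional] as a quick sanity check that the installation succeeded,
# print the installed version
airflow version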
# You can also install the recommended versions of the dependencies by using
# requirements-python<PYTHON_MAJOR_MINOR_VERSION>.txt as a constraint file. This is needed in case
# you have problems installing the current requirements from PyPI.
# There are different requirements for different python versions. For example:
pip install . --constraint requirements/requirements-python3.7.txt
# You can also install Airflow with extras specified; an example follows the list.
# The list of available extras:
# START EXTRAS HERE
all_dbs, amazon, apache.atlas, apache_beam, apache.cassandra, apache.druid, apache.hdfs,
apache.hive, apache.pinot, apache.webhdfs, async, atlas, aws, azure, cassandra, celery, cgroups,
cloudant, cncf.kubernetes, dask, databricks, datadog, devel, devel_hadoop, doc, docker, druid,
elasticsearch, exasol, facebook, gcp, gcp_api, github_enterprise, google, google_auth, grpc,
hashicorp, hdfs, hive, jdbc, jira, kerberos, kubernetes, ldap, microsoft.azure, microsoft.mssql,
microsoft.winrm, mongo, mssql, mysql, odbc, oracle, pagerduty, papermill, password, pinot, postgres,
presto, qds, rabbitmq, redis, salesforce, samba, segment, sendgrid, sentry, singularity, slack,
snowflake, spark, ssh, statsd, tableau, vertica, virtualenv, webhdfs, winrm, yandexcloud, all,
devel_ci
# END EXTRAS HERE
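# For example, to install Airflow together with the postgres and google extras
# listed above (quoting prevents the shell from interpreting the brackets):
pip install ".[postgres,google]"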
# For installing Airflow in development environments, see CONTRIBUTING.rst