Fokko closed pull request #3853: [WIP] Remove Tox from the pipeline
URL: https://github.com/apache/incubator-airflow/pull/3853
This is a PR merged from a forked repository.
As GitHub hides the original diff on merge, it is displayed below for
the sake of provenance:
diff --git a/.gitignore b/.gitignore
index d005437890..36d3308c0c 100644
--- a/.gitignore
+++ b/.gitignore
@@ -56,7 +56,6 @@ pip-delete-this-directory.txt
# Unit test / coverage reports
htmlcov/
-.tox/
.coverage
.coverage.*
.cache
diff --git a/.travis.yml b/.travis.yml
index 5bd750453a..0f3656468a 100644
--- a/.travis.yml
+++ b/.travis.yml
@@ -25,15 +25,15 @@ env:
- SLUGIFY_USES_TEXT_UNIDECODE=yes
- TRAVIS_CACHE=$HOME/.travis_cache/
matrix:
- - TOX_ENV=flake8
- - TOX_ENV=py27-backend_mysql
- - TOX_ENV=py27-backend_sqlite
- - TOX_ENV=py27-backend_postgres
- - TOX_ENV=py35-backend_mysql PYTHON_VERSION=3
- - TOX_ENV=py35-backend_sqlite PYTHON_VERSION=3
- - TOX_ENV=py35-backend_postgres PYTHON_VERSION=3
- - TOX_ENV=py27-backend_postgres KUBERNETES_VERSION=v1.9.0
- - TOX_ENV=py35-backend_postgres KUBERNETES_VERSION=v1.10.0 PYTHON_VERSION=3
+ - PYTHON=python3 TASK=flake8
+ - PYTHON=python2 AIRFLOW__CORE__SQL_ALCHEMY_CONN=mysql://root@mysql/airflow
+ - PYTHON=python2 AIRFLOW__CORE__SQL_ALCHEMY_CONN=sqlite:////home/airflow/airflow.db AIRFLOW__CORE__EXECUTOR=SequentialExecutor
+ - PYTHON=python2 AIRFLOW__CORE__SQL_ALCHEMY_CONN=postgresql+psycopg2://postgres:airflow@postgres/airflow
+ - PYTHON=python3 AIRFLOW__CORE__SQL_ALCHEMY_CONN=mysql://root@mysql/airflow
+ - PYTHON=python3 AIRFLOW__CORE__SQL_ALCHEMY_CONN=sqlite:////home/airflow/airflow.db AIRFLOW__CORE__EXECUTOR=SequentialExecutor
+ - PYTHON=python3 AIRFLOW__CORE__SQL_ALCHEMY_CONN=postgresql+psycopg2://postgres:airflow@postgres/airflow
+ - PYTHON=python2 AIRFLOW__CORE__SQL_ALCHEMY_CONN=postgresql+psycopg2://postgres:airflow@postgres/airflow KUBERNETES_VERSION=v1.9.0
+ - PYTHON=python3 AIRFLOW__CORE__SQL_ALCHEMY_CONN=postgresql+psycopg2://postgres:airflow@postgres/airflow KUBERNETES_VERSION=v1.10.0
cache:
directories:
- $HOME/.wheelhouse/
@@ -49,6 +49,8 @@ install:
- curl -L https://github.com/docker/compose/releases/download/${DOCKER_COMPOSE_VERSION}/docker-compose-`uname -s`-`uname -m` > docker-compose
- chmod +x docker-compose
- sudo mv docker-compose /usr/local/bin
- - pip install --upgrade pip
script:
- - docker-compose --log-level ERROR -f scripts/ci/docker-compose.yml run airflow-testing /app/scripts/ci/run-ci.sh
+ - docker-compose --log-level ERROR -f scripts/ci/docker-compose.yml run --rm airflow-testing /app/scripts/ci/run-ci.sh
+after_success:
+ - sudo chown -R travis.travis .
+ - codecov
diff --git a/CONTRIBUTING.md b/CONTRIBUTING.md
index 152d5d9aab..0f9ec590e3 100644
--- a/CONTRIBUTING.md
+++ b/CONTRIBUTING.md
@@ -142,34 +142,37 @@ There are three ways to setup an Apache Airflow development environment.
Start a docker container through Compose for development to avoid installing
the packages directly on your system. The following will give you a shell
inside a container, run all required service containers (MySQL, PostgresSQL,
krb5 and so on) and install all the dependencies:
```bash
+ export PYTHON=python3
+ export AIRFLOW__CORE__SQL_ALCHEMY_CONN=mysql://root@mysql/airflow
+
docker-compose -f scripts/ci/docker-compose.yml run airflow-testing bash
- # From the container
- pip install -e .[devel]
- # Run all the tests with python and mysql through tox
- tox -e py35-backend_mysql
+
+ # Run all the tests with python and mysql
+ scripts/ci/run-ci.sh
```
### Running unit tests
-To run tests locally, once your unit test environment is setup (directly on your
-system or through our Docker setup) you should be able to simply run
-``./run_unit_tests.sh`` at will.
+We're moving from a venv based test environment, to a Docker based environment.
+Docker allows us to set up the same environment up on different systems, for
+example having the same environment on your local machine and the CI. Since
+Airflow by its nature relies on a lot of external dependencies, it also start
+up these dependencies using docker-compose.
+
+To run tests locally, once your unit test environment is setup (directly on your
+system or through our Docker setup) you should be able to simply run
+`docker-compose -f scripts/ci/docker-compose.yml run airflowci/incubator-airflow-ci /app/scripts/ci/run-ci.sh` at will.
For example, in order to just execute the "core" unit tests, run the following:
-```
+```bash
./run_unit_tests.sh tests.core:CoreTest -s --logging-level=DEBUG
```
or a single test method:
-```
+```bash
+docker-compose -f scripts/ci/docker-compose.yml run airflow-testing /app/scripts/ci/run-ci.sh tests.core:CoreTest.test_check_operators
./run_unit_tests.sh tests.core:CoreTest.test_check_operators -s --logging-level=DEBUG
```
To run the whole test suite with Docker Compose, do:
-```
+```bash
# Install Docker Compose first, then this will run the tests
docker-compose -f scripts/ci/docker-compose.yml run airflow-testing /app/scripts/ci/run-ci.sh
```
diff --git a/scripts/ci/1-setup-env.sh b/scripts/ci/1-setup-env.sh
deleted file mode 100755
index 0a976b35f0..0000000000
--- a/scripts/ci/1-setup-env.sh
+++ /dev/null
@@ -1,33 +0,0 @@
-#!/usr/bin/env bash
-
-#
-# Licensed to the Apache Software Foundation (ASF) under one
-# or more contributor license agreements. See the NOTICE file
-# distributed with this work for additional information
-# regarding copyright ownership. The ASF licenses this file
-# to you under the Apache License, Version 2.0 (the
-# "License"); you may not use this file except in compliance
-# with the License. You may obtain a copy of the License at
-#
-# http://www.apache.org/licenses/LICENSE-2.0
-#
-# Unless required by applicable law or agreed to in writing,
-# software distributed under the License is distributed on an
-# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
-# KIND, either express or implied. See the License for the
-# specific language governing permissions and limitations
-# under the License.
-
-set -exuo pipefail
-
-# Start MiniCluster
-java -cp "/tmp/minicluster-1.1-SNAPSHOT/*" com.ing.minicluster.MiniCluster > /dev/null &
-
-# Set up ssh keys
-echo 'yes' | ssh-keygen -t rsa -C [email protected] -P '' -f ~/.ssh/id_rsa
-cat ~/.ssh/id_rsa.pub >> ~/.ssh/authorized_keys
-ln -s ~/.ssh/authorized_keys ~/.ssh/authorized_keys2
-chmod 600 ~/.ssh/*
-
-# SSH Service
-sudo service ssh restart
diff --git a/scripts/ci/2-setup-kdc.sh b/scripts/ci/2-setup-kdc.sh
deleted file mode 100755
index c5cd381c2f..0000000000
--- a/scripts/ci/2-setup-kdc.sh
+++ /dev/null
@@ -1,38 +0,0 @@
-#!/usr/bin/env bash
-
-#
-# Licensed to the Apache Software Foundation (ASF) under one
-# or more contributor license agreements. See the NOTICE file
-# distributed with this work for additional information
-# regarding copyright ownership. The ASF licenses this file
-# to you under the Apache License, Version 2.0 (the
-# "License"); you may not use this file except in compliance
-# with the License. You may obtain a copy of the License at
-#
-# http://www.apache.org/licenses/LICENSE-2.0
-#
-# Unless required by applicable law or agreed to in writing,
-# software distributed under the License is distributed on an
-# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
-# KIND, either express or implied. See the License for the
-# specific language governing permissions and limitations
-# under the License.
-
-set -exuo pipefail
-
-DIRNAME=$(cd "$(dirname "$0")"; pwd)
-
-FQDN=`hostname`
-ADMIN="admin"
-PASS="airflow"
-KRB5_KTNAME=/etc/airflow.keytab
-
-cat /etc/hosts
-echo "hostname: ${FQDN}"
-
-sudo cp $DIRNAME/krb5/krb5.conf /etc/krb5.conf
-
-echo -e "${PASS}\n${PASS}" | sudo kadmin -p ${ADMIN}/admin -w ${PASS} -q "addprinc -randkey airflow/${FQDN}"
-sudo kadmin -p ${ADMIN}/admin -w ${PASS} -q "ktadd -k ${KRB5_KTNAME} airflow"
-sudo kadmin -p ${ADMIN}/admin -w ${PASS} -q "ktadd -k ${KRB5_KTNAME} airflow/${FQDN}"
-sudo chmod 0644 ${KRB5_KTNAME}
diff --git a/scripts/ci/3-setup-databases.sh b/scripts/ci/3-setup-databases.sh
deleted file mode 100755
index 2a5cb682e0..0000000000
--- a/scripts/ci/3-setup-databases.sh
+++ /dev/null
@@ -1,23 +0,0 @@
-#!/usr/bin/env bash
-# Licensed to the Apache Software Foundation (ASF) under one *
-# or more contributor license agreements. See the NOTICE file *
-# distributed with this work for additional information *
-# regarding copyright ownership. The ASF licenses this file *
-# to you under the Apache License, Version 2.0 (the *
-# "License"); you may not use this file except in compliance *
-# with the License. You may obtain a copy of the License at *
-# *
-# http://www.apache.org/licenses/LICENSE-2.0 *
-# *
-# Unless required by applicable law or agreed to in writing, *
-# software distributed under the License is distributed on an *
-# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY *
-# KIND, either express or implied. See the License for the *
-# specific language governing permissions and limitations *
-# under the License. *
-
-set -exuo pipefail
-
-MYSQL_HOST=mysql
-
-mysql -h ${MYSQL_HOST} -u root -e 'drop database if exists airflow; create database airflow'
diff --git a/scripts/ci/4-load-data.sh b/scripts/ci/4-load-data.sh
deleted file mode 100755
index 7935482be0..0000000000
--- a/scripts/ci/4-load-data.sh
+++ /dev/null
@@ -1,29 +0,0 @@
-#!/usr/bin/env bash
-# Licensed to the Apache Software Foundation (ASF) under one *
-# or more contributor license agreements. See the NOTICE file *
-# distributed with this work for additional information *
-# regarding copyright ownership. The ASF licenses this file *
-# to you under the Apache License, Version 2.0 (the *
-# "License"); you may not use this file except in compliance *
-# with the License. You may obtain a copy of the License at *
-# *
-# http://www.apache.org/licenses/LICENSE-2.0 *
-# *
-# Unless required by applicable law or agreed to in writing, *
-# software distributed under the License is distributed on an *
-# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY *
-# KIND, either express or implied. See the License for the *
-# specific language governing permissions and limitations *
-# under the License. *
-
-set -exuo pipefail
-
-DIR=$( cd "$( dirname "${BASH_SOURCE[0]}" )" && pwd )
-DATA_DIR="${DIR}/data"
-DATA_FILE="${DATA_DIR}/baby_names.csv"
-DATABASE=airflow_ci
-HOST=mysql
-
-mysqladmin -h ${HOST} -u root create ${DATABASE}
-mysql -h ${HOST} -u root < ${DATA_DIR}/mysql_schema.sql
-mysqlimport --local -h ${HOST} -u root --fields-optionally-enclosed-by="\"" --fields-terminated-by=, --ignore-lines=1 ${DATABASE} ${DATA_FILE}
diff --git a/scripts/ci/5-run-tests.sh b/scripts/ci/5-run-tests.sh
deleted file mode 100755
index 9a4e06f5e2..0000000000
--- a/scripts/ci/5-run-tests.sh
+++ /dev/null
@@ -1,97 +0,0 @@
-#!/usr/bin/env bash
-
-#
-# Licensed to the Apache Software Foundation (ASF) under one
-# or more contributor license agreements. See the NOTICE file
-# distributed with this work for additional information
-# regarding copyright ownership. The ASF licenses this file
-# to you under the Apache License, Version 2.0 (the
-# "License"); you may not use this file except in compliance
-# with the License. You may obtain a copy of the License at
-#
-# http://www.apache.org/licenses/LICENSE-2.0
-#
-# Unless required by applicable law or agreed to in writing,
-# software distributed under the License is distributed on an
-# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
-# KIND, either express or implied. See the License for the
-# specific language governing permissions and limitations
-# under the License.
-
-set -o verbose
-
-if [ -z "$HADOOP_HOME" ]; then
- echo "HADOOP_HOME not set - abort" >&2
- exit 1
-fi
-
-echo "Using ${HADOOP_DISTRO} distribution of Hadoop from ${HADOOP_HOME}"
-
-pwd
-
-echo "Using travis airflow.cfg"
-DIR="$( cd "$( dirname "${BASH_SOURCE[0]}" )" && pwd )"
-cp -f ${DIR}/airflow_travis.cfg ~/unittests.cfg
-
-ROOTDIR="$(dirname $(dirname $DIR))"
-export AIRFLOW__CORE__DAGS_FOLDER="$ROOTDIR/tests/dags"
-
-# add test/contrib to PYTHONPATH
-export PYTHONPATH=${PYTHONPATH:-$ROOTDIR/tests/test_utils}
-
-echo Backend: $AIRFLOW__CORE__SQL_ALCHEMY_CONN
-
-# environment
-export AIRFLOW_HOME=${AIRFLOW_HOME:=~}
-export AIRFLOW__CORE__UNIT_TEST_MODE=True
-
-# configuration test
-export AIRFLOW__TESTSECTION__TESTKEY=testvalue
-
-# use Airflow 2.0-style imports
-export AIRFLOW_USE_NEW_IMPORTS=1
-
-# any argument received is overriding the default nose execution arguments:
-nose_args=$@
-
-# Generate the `airflow` executable if needed
-which airflow > /dev/null || python setup.py develop
-
-# For impersonation tests on Travis, make airflow accessible to other users via the global PATH
-# (which contains /usr/local/bin)
-sudo ln -sf "${VIRTUAL_ENV}/bin/airflow" /usr/local/bin/
-
-echo "Initializing the DB"
-yes | airflow initdb
-yes | airflow resetdb
-
-if [ -z "$nose_args" ]; then
- nose_args="--with-coverage \
- --cover-erase \
- --cover-html \
- --cover-package=airflow \
- --cover-html-dir=airflow/www/static/coverage \
- --with-ignore-docstrings \
- --rednose \
- --with-timer \
- -v \
- --logging-level=DEBUG"
-fi
-
-# kdc init happens in setup_kdc.sh
-kinit -kt ${KRB5_KTNAME} airflow
-
-# For impersonation tests running on SQLite on Travis, make the database world readable so other
-# users can update it
-AIRFLOW_DB="$HOME/airflow.db"
-
-if [ -f "${AIRFLOW_DB}" ]; then
- chmod a+rw "${AIRFLOW_DB}"
- chmod g+rwx "${AIRFLOW_HOME}"
-fi
-
-echo "Starting the unit tests with the following nose arguments: "$nose_args
-nosetests $nose_args
-
-# To run individual tests:
-# nosetests tests.core:CoreTest.test_scheduler_job
diff --git a/scripts/ci/docker-compose.yml b/scripts/ci/docker-compose.yml
index 4accf119f6..ef023c9709 100644
--- a/scripts/ci/docker-compose.yml
+++ b/scripts/ci/docker-compose.yml
@@ -62,12 +62,11 @@ services:
domainname: example.com
airflow-testing:
- image: airflowci/incubator-airflow-ci:latest
+ image: fokkodriesprong/incubator-airflow-ci
init: true
environment:
- USER=airflow
- SLUGIFY_USES_TEXT_UNIDECODE=yes
- - TOX_ENV
- PYTHON_VERSION
- CI
- TRAVIS
@@ -80,6 +79,24 @@ services:
- TRAVIS_REPO_SLUG
- TRAVIS_OS_NAME
- TRAVIS_TAG
+
+ - PYTHON
+ - AIRFLOW__CORE__EXECUTOR
+ - AIRFLOW__CORE__SQL_ALCHEMY_CONN
+
+ - HADOOP_DISTRO=cdh
+ - HADOOP_HOME=/tmp/hadoop-cdh
+ - HADOOP_OPTS=-D/tmp/krb5.conf
+ - HIVE_HOME=/tmp/hive
+ - MINICLUSTER_HOME=/tmp/minicluster-1.1-SNAPSHOT
+ - KRB5_CONFIG=/etc/krb5.conf
+ - KRB5_KTNAME=/etc/airflow.keytab
+ - AIRFLOW_HOME=/home/airflow/
+ - AIRFLOW_USE_NEW_IMPORTS=1
+ - AIRFLOW_ROOT=/app/
+ - AIRFLOW__CORE__UNIT_TEST_MODE=True
+ - AIRFLOW__TESTSECTION__TESTKEY=testvalue
+ - AIRFLOW__CORE__DAGS_FOLDER=/app/tests/dags
depends_on:
- postgres
- mysql
@@ -90,6 +107,6 @@ services:
- krb5-kdc-server
volumes:
- ../../:/app
- - ~/.cache/pip:/home/airflow/.cache/pip
+ - ~/.cache/pip:/home/airflow/.cache/pip/
- ~/.wheelhouse/:/home/airflow/.wheelhouse/
working_dir: /app
diff --git a/scripts/ci/run-ci.sh b/scripts/ci/run-ci.sh
index f2815bbd95..1199c7b081 100755
--- a/scripts/ci/run-ci.sh
+++ b/scripts/ci/run-ci.sh
@@ -18,39 +18,136 @@
# specific language governing permissions and limitations
# under the License.
-set -x
+set -xe
+
+sudo chmod -R g+w /home/airflow/
+
+#####################################################################
+### Configure environment
DIRNAME=$(cd "$(dirname "$0")"; pwd)
AIRFLOW_ROOT="$DIRNAME/../.."
-# Fix file permissions
-sudo chown -R airflow.airflow . $HOME/.wheelhouse/ $HOME/.cache/pip
+export PATH=$PATH:/home/airflow/.local/bin
-if [[ $PYTHON_VERSION == '3' ]]; then
- PIP=pip3
-else
- PIP=pip
+# Check if variables are set, otherwise set some defaults
+cd ${AIRFLOW_ROOT} && ${PYTHON} -m pip --version
+cp -f ${DIRNAME}/airflow_travis.cfg ${AIRFLOW_HOME}/unittests.cfg
+
+echo Python version: ${PYTHON}
+echo Backend: ${AIRFLOW__CORE__SQL_ALCHEMY_CONN}
+
+#####################################################################
+### Start MiniCluster
+java -cp "/tmp/minicluster-1.1-SNAPSHOT/*" com.ing.minicluster.MiniCluster > /tmp/minicluster.txt &
+
+# Set up ssh keys
+echo 'yes' | ssh-keygen -t rsa -C [email protected] -P '' -f ~/.ssh/id_rsa
+cat ~/.ssh/id_rsa.pub >> ~/.ssh/authorized_keys
+#ln -s ~/.ssh/authorized_keys ~/.ssh/authorized_keys2
+chmod 600 ~/.ssh/*
+
+# SSH Service
+sudo service ssh restart
+
+
+#####################################################################
+### Configure Kerberos
+
+FQDN=`hostname`
+ADMIN="admin"
+PASS="airflow"
+KRB5_KTNAME=/etc/airflow.keytab
+
+cat /etc/hosts
+echo "hostname: ${FQDN}"
+
+sudo cp $DIRNAME/krb5/krb-conf/client/krb5.conf /etc/krb5.conf
+
+echo -e "${PASS}\n${PASS}" | sudo kadmin -p ${ADMIN}/admin -w ${PASS} -q "addprinc -randkey airflow/${FQDN}"
+sudo kadmin -p ${ADMIN}/admin -w ${PASS} -q "ktadd -k ${KRB5_KTNAME} airflow"
+sudo kadmin -p ${ADMIN}/admin -w ${PASS} -q "ktadd -k ${KRB5_KTNAME} airflow/${FQDN}"
+sudo chmod 0644 ${KRB5_KTNAME}
+
+kinit -kt ${KRB5_KTNAME} airflow
+
+#####################################################################
+### Prepare MySQL Database
+
+MYSQL_HOST=mysql
+mysql -h ${MYSQL_HOST} -u root -e 'DROP DATABASE IF EXISTS airflow; CREATE DATABASE airflow'
+
+#####################################################################
+### Insert some data into MySQL
+
+DIR=$( cd "$( dirname "${BASH_SOURCE[0]}" )" && pwd )
+DATA_DIR="${DIR}/data"
+DATA_FILE="${DATA_DIR}/baby_names.csv"
+DATABASE=airflow_ci
+HOST=mysql
+
+mysqladmin -h ${HOST} -u root create ${DATABASE} || echo "Database already exists"
+mysql -h ${HOST} -u root < ${DATA_DIR}/mysql_schema.sql
+mysqlimport --local -h ${HOST} -u root --fields-optionally-enclosed-by="\"" --fields-terminated-by=, --ignore-lines=1 ${DATABASE} ${DATA_FILE}
+
+#####################################################################
+### Prepare Python
+
+# Generate the `airflow` executable
+${PYTHON} setup.py develop -q
+
+${PYTHON} -m pip wheel --progress-bar off -w /home/airflow/.wheelhouse -f /home/airflow/.wheelhouse -e .[devel_ci]
+${PYTHON} -m pip install --progress-bar off --find-links=/home/airflow/.wheelhouse --no-index -e .[devel_ci]
+
+echo "Initializing the DB"
+yes | airflow initdb
+yes | airflow resetdb
+
+# add test/contrib to PYTHONPATH
+export PYTHONPATH=${PYTHONPATH:-$AIRFLOW_ROOT/tests/test_utils}
+
+# For impersonation tests running on SQLite on Travis, make the database world readable so other
+# users can update it
+AIRFLOW_DB="$HOME/airflow.db"
+
+if [ -f "${AIRFLOW_DB}" ]; then
+ chmod a+rw "${AIRFLOW_DB}"
+ chmod g+rwx "${AIRFLOW_HOME}"
fi
-sudo $PIP install --upgrade pip
-sudo $PIP install tox
+# any argument received is overriding the default nose execution arguments:
+nose_args=$@
+
+if [ -z "$nose_args" ]; then
+ nose_args="--with-coverage \
+ --cover-erase \
+ --cover-html \
+ --cover-package=airflow \
+ --cover-html-dir=airflow/www/static/coverage \
+ --with-ignore-docstrings \
+ --rednose \
+ --with-timer \
+ -v \
+ --logging-level=DEBUG"
+fi
-cd $AIRFLOW_ROOT && $PIP --version && tox --version
if [ -z "$KUBERNETES_VERSION" ];
then
- tox -e $TOX_ENV
+ nosetests ${nose_args}
else
KUBERNETES_VERSION=${KUBERNETES_VERSION}
$DIRNAME/kubernetes/setup_kubernetes.sh && \
- tox -e $TOX_ENV -- tests.contrib.minikube \
- --with-coverage \
- --cover-erase \
- --cover-html \
- --cover-package=airflow \
- --cover-html-dir=airflow/www/static/coverage \
- --with-ignore-docstrings \
- --rednose \
- --with-timer \
- -v \
- --logging-level=DEBUG
+ nosetests -- tests.contrib.minikube \
+ --with-coverage \
+ --cover-erase \
+ --cover-html \
+ --cover-package=airflow \
+ --cover-html-dir=airflow/www/static/coverage \
+ --with-ignore-docstrings \
+ --rednose \
+ --with-timer \
+ -v \
+ --logging-level=DEBUG
fi
+
+scripts/ci/6-check-license.sh
diff --git a/setup.py b/setup.py
index aecc218170..49807b13fe 100644
--- a/setup.py
+++ b/setup.py
@@ -54,25 +54,6 @@ def verify_gpl_dependency():
"version set AIRFLOW_GPL_UNIDECODE")
-class Tox(TestCommand):
- user_options = [('tox-args=', None, "Arguments to pass to tox")]
-
- def initialize_options(self):
- TestCommand.initialize_options(self)
- self.tox_args = ''
-
- def finalize_options(self):
- TestCommand.finalize_options(self)
- self.test_args = []
- self.test_suite = True
-
- def run_tests(self):
- # import here, cause outside the eggs aren't loaded
- import tox
- errno = tox.cmdline(args=self.tox_args.split())
- sys.exit(errno)
-
-
class CleanCommand(Command):
"""Custom clean command to tidy up the project root."""
user_options = []
@@ -325,7 +306,7 @@ def do_setup():
'tabulate>=0.7.5, <=0.8.2',
'tenacity==4.8.0',
'thrift>=0.9.2',
- 'tzlocal>=1.4',
+ 'tzlocal>=1.4, <2.0.0.0',
'unicodecsv>=0.14.1',
'werkzeug>=0.14.1, <0.15.0',
'zope.deprecation>=4.0, <5.0',
@@ -406,7 +387,6 @@ def do_setup():
download_url=(
'https://dist.apache.org/repos/dist/release/incubator/airflow/' +
version),
cmdclass={
- 'test': Tox,
'extra_clean': CleanCommand,
'compile_assets': CompileAssets
},
diff --git a/tox.ini b/tox.ini
deleted file mode 100644
index c4b74a1e55..0000000000
--- a/tox.ini
+++ /dev/null
@@ -1,75 +0,0 @@
-#
-# Licensed to the Apache Software Foundation (ASF) under one
-# or more contributor license agreements. See the NOTICE file
-# distributed with this work for additional information
-# regarding copyright ownership. The ASF licenses this file
-# to you under the Apache License, Version 2.0 (the
-# "License"); you may not use this file except in compliance
-# with the License. You may obtain a copy of the License at
-#
-# http://www.apache.org/licenses/LICENSE-2.0
-#
-# Unless required by applicable law or agreed to in writing,
-# software distributed under the License is distributed on an
-# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
-# KIND, either express or implied. See the License for the
-# specific language governing permissions and limitations
-# under the License.
-
-[tox]
-envlist = flake8,{py27,py35}-backend_{mysql,sqlite,postgres}
-skipsdist = True
-
-[global]
-wheel_dir = {homedir}/.wheelhouse
-find_links =
- {homedir}/.wheelhouse
- {homedir}/.pip-cache
-
-[flake8]
-max-line-length = 90
-ignore = E731,W503
-
-[testenv]
-deps =
- wheel
- codecov
-
-basepython =
- py27: python2.7
- py35: python3.5
-
-setenv =
- HADOOP_DISTRO=cdh
- HADOOP_HOME=/tmp/hadoop-cdh
- HADOOP_OPTS=-D/tmp/krb5.conf
- HIVE_HOME=/tmp/hive
- MINICLUSTER_HOME=/tmp/minicluster-1.1-SNAPSHOT
- KRB5_CONFIG=/etc/krb5.conf
- KRB5_KTNAME=/etc/airflow.keytab
- backend_mysql: AIRFLOW__CORE__SQL_ALCHEMY_CONN=mysql://root@mysql/airflow
- backend_postgres: AIRFLOW__CORE__SQL_ALCHEMY_CONN=postgresql+psycopg2://postgres:airflow@postgres/airflow
- backend_sqlite: AIRFLOW__CORE__SQL_ALCHEMY_CONN=sqlite:///{homedir}/airflow.db
- backend_sqlite: AIRFLOW__CORE__EXECUTOR=SequentialExecutor
-
-passenv = *
-
-commands =
- pip wheel --progress-bar off -w {homedir}/.wheelhouse -f {homedir}/.wheelhouse -e .[devel_ci]
- pip install --progress-bar off --find-links={homedir}/.wheelhouse --no-index -e .[devel_ci]
- {toxinidir}/scripts/ci/1-setup-env.sh
- {toxinidir}/scripts/ci/2-setup-kdc.sh
- {toxinidir}/scripts/ci/3-setup-databases.sh
- {toxinidir}/scripts/ci/4-load-data.sh
- {toxinidir}/scripts/ci/5-run-tests.sh []
- {toxinidir}/scripts/ci/6-check-license.sh
- codecov -e TOXENV
-
-[testenv:flake8]
-basepython = python3
-
-deps =
- flake8
-
-commands =
- {toxinidir}/scripts/ci/flake8-diff.sh
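For reviewers who want to try the new Tox-free flow locally, the commands this diff adds to `.travis.yml` and `CONTRIBUTING.md` can be combined into a short walkthrough. This is a sketch, not part of the PR: the backend URL shown is just one of the Travis matrix values above, and the docker-compose invocations are left commented because they require the CI service containers to be available.

```shell
# Pick an interpreter and a backend from the Travis matrix above
# (any of the AIRFLOW__CORE__SQL_ALCHEMY_CONN values works).
export PYTHON=python3
export AIRFLOW__CORE__SQL_ALCHEMY_CONN=postgresql+psycopg2://postgres:airflow@postgres/airflow

# Full suite -- this replaces the old `tox -e py35-backend_postgres`:
# docker-compose -f scripts/ci/docker-compose.yml run --rm airflow-testing /app/scripts/ci/run-ci.sh

# Single test method -- extra arguments are passed straight through to nosetests:
# docker-compose -f scripts/ci/docker-compose.yml run --rm airflow-testing \
#     /app/scripts/ci/run-ci.sh tests.core:CoreTest.test_check_operators
```

Note that `--rm` (added by this PR to the Travis script) cleans up the test container after each run, which matters when iterating locally.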
----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
For queries about this service, please contact Infrastructure at:
[email protected]
With regards,
Apache Git Services