potiuk closed pull request #4062: Proof of Concept for CI Integration tests with GCP for Airflow
URL: https://github.com/apache/incubator-airflow/pull/4062
 
 
   

This is a PR merged from a forked repository.
As GitHub hides the original diff on merge, it is displayed below for the sake of provenance:


diff --git a/.gcloudignore b/.gcloudignore
new file mode 100644
index 0000000000..481bd2698a
--- /dev/null
+++ b/.gcloudignore
@@ -0,0 +1,17 @@
+# This file specifies files that are *not* uploaded to Google Cloud Platform
+# using gcloud. It follows the same syntax as .gitignore, with the addition of
+# "#!include" directives (which insert the entries of the given .gitignore-style
+# file at that point).
+#
+# For more information, run:
+#   $ gcloud topic gcloudignore
+#
+.gcloudignore
+# If you would like to upload your .git directory, .gitignore file or files
+# from your .gitignore file, remove the corresponding line
+# below:
+.git
+.gitignore
+
+node_modules
+#!include:.gitignore
diff --git a/.gitignore b/.gitignore
index d005437890..15e7c8c35a 100644
--- a/.gitignore
+++ b/.gitignore
@@ -14,7 +14,9 @@ unittests.db
 airflow/git_version
 airflow/www/static/coverage/
 logs/
-airflow-webserver.pid
+airflow-webserver.*
+airflow-scheduler.*
+airflow-webserver-monitor.*
 
 # Byte-compiled / optimized / DLL files
 __pycache__/
diff --git a/.travis.yml b/.travis.yml
index dd493363ab..ca8f20a3da 100644
--- a/.travis.yml
+++ b/.travis.yml
@@ -25,6 +25,8 @@ env:
     - SLUGIFY_USES_TEXT_UNIDECODE=yes
     - TRAVIS_CACHE=$HOME/.travis_cache/
   matrix:
+    - TOX_ENV=integration
+    - TOX_ENV=docs
     - TOX_ENV=flake8
     - TOX_ENV=py27-backend_mysql-env_docker
     - TOX_ENV=py27-backend_sqlite-env_docker
@@ -34,7 +36,6 @@ env:
     - TOX_ENV=py35-backend_postgres-env_docker PYTHON_VERSION=3
     - TOX_ENV=py27-backend_postgres-env_kubernetes KUBERNETES_VERSION=v1.9.0
    - TOX_ENV=py35-backend_postgres-env_kubernetes KUBERNETES_VERSION=v1.10.0 PYTHON_VERSION=3
-
 cache:
   directories:
     - $HOME/.wheelhouse/
diff --git a/CONTRIBUTING.md b/CONTRIBUTING.md
index f114c66585..a41929c1ed 100644
--- a/CONTRIBUTING.md
+++ b/CONTRIBUTING.md
@@ -15,6 +15,7 @@ little bit helps, and credit will always be given.
   * [Development and Testing](#development-and-testing)
      - [Setting up a development environment](#setting-up-a-development-environment)
       - [Running unit tests](#running-unit-tests)
+      - [Running integration tests](#running-integration-tests)
   * [Pull requests guidelines](#pull-request-guidelines)
   * [Changing the Metadata Database](#changing-the-metadata-database)
 
@@ -193,6 +194,32 @@ See also the list of test classes and methods in `tests/core.py`.
 
 Feel free to customize based on the extras available in [setup.py](./setup.py)
 
+### Running integration tests
+
+To run DAGs as integration tests locally, directly from the CLI, once your development
+environment is set up (directly on your system or through a Docker setup), you can simply run `./run_int_tests.sh`.
+
+It accepts 2 parameters:
+
+`-v / --vars`: a comma-separated list of key-value pairs (`[KEY1=VALUE1,KEY2=VALUE2,...]`) -
+these are Airflow variables, through which you can inject the necessary config values
+to the tested DAGs.
+
+`-d / --dags`: a path expression specifying which DAGs to run, e.g.
+`$AIRFLOW_HOME/incubator-airflow/airflow/contrib/example_dags/example_gcf*` will run all
+DAGs from `example_dags` with names beginning with `example_gcf`.
+
+Full example running tests for Google Cloud Functions operators:
+``` 
+./run_int_tests.sh --vars=[PROJECT_ID=<gcp_project_id>,LOCATION=<gcp_region>,
+SOURCE_REPOSITORY="https://source.developers.google
+.com/projects/<gcp_project_id>/repos/<your_repo>/moveable-aliases/master",
+ENTRYPOINT=helloWorld] 
--dags=$AIRFLOW_HOME/incubator-airflow/airflow/contrib/example_dags/example_gcf*
+```
+
+
 ## Pull Request Guidelines
 
 Before you submit a pull request from your forked repo, check that it
diff --git a/dags/test_dag.py b/dags/test_dag.py
deleted file mode 100644
index a133dd5a12..0000000000
--- a/dags/test_dag.py
+++ /dev/null
@@ -1,43 +0,0 @@
-# -*- coding: utf-8 -*-
-#
-# Licensed to the Apache Software Foundation (ASF) under one
-# or more contributor license agreements.  See the NOTICE file
-# distributed with this work for additional information
-# regarding copyright ownership.  The ASF licenses this file
-# to you under the Apache License, Version 2.0 (the
-# "License"); you may not use this file except in compliance
-# with the License.  You may obtain a copy of the License at
-#
-#   http://www.apache.org/licenses/LICENSE-2.0
-#
-# Unless required by applicable law or agreed to in writing,
-# software distributed under the License is distributed on an
-# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
-# KIND, either express or implied.  See the License for the
-# specific language governing permissions and limitations
-# under the License.
-
-from airflow import utils
-from airflow import DAG
-from airflow.operators.dummy_operator import DummyOperator
-from datetime import datetime, timedelta
-
-now = datetime.now()
-now_to_the_hour = (
-    now - timedelta(0, 0, 0, 0, 0, 3)
-).replace(minute=0, second=0, microsecond=0)
-START_DATE = now_to_the_hour
-DAG_NAME = 'test_dag_v1'
-
-default_args = {
-    'owner': 'airflow',
-    'depends_on_past': True,
-    'start_date': utils.dates.days_ago(2)
-}
-dag = DAG(DAG_NAME, schedule_interval='*/10 * * * *', default_args=default_args)
-
-run_this_1 = DummyOperator(task_id='run_this_1', dag=dag)
-run_this_2 = DummyOperator(task_id='run_this_2', dag=dag)
-run_this_2.set_upstream(run_this_1)
-run_this_3 = DummyOperator(task_id='run_this_3', dag=dag)
-run_this_3.set_upstream(run_this_2)
diff --git a/run_int_tests.sh b/run_int_tests.sh
new file mode 100755
index 0000000000..75043167f5
--- /dev/null
+++ b/run_int_tests.sh
@@ -0,0 +1,176 @@
+#!/usr/bin/env bash
+
+#
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+
+DIR="$( cd "$( dirname "${BASH_SOURCE[0]}" )" && pwd )"
+ROOTDIR="$(dirname $(dirname ${DIR}))"
+
+export AIRFLOW__CORE__DAGS_FOLDER="/tmp/dags"
+export AIRFLOW_HOME=${AIRFLOW_HOME:-/home/airflow}
+
+# Generate the `airflow` executable if needed
+which airflow > /dev/null || python setup.py develop
+
+# Cleaning up after potential previous run
+echo Removing all dags from the dags folder
+rm -vf ${AIRFLOW__CORE__DAGS_FOLDER}/*
+mkdir -pv ${AIRFLOW__CORE__DAGS_FOLDER}/
+
+echo "Killing scheduler and webserver daemons (just in case)"
+
+function kill_process_and_wait() {
+    export PID_FILE_NAME=$1
+    PID=$(cat /app/${PID_FILE_NAME}.pid 2>/dev/null)
+    while [ -f /app/${PID_FILE_NAME}.pid ]; do
+        kill ${PID} 2>/dev/null
+        echo "Sleeping until ${PID_FILE_NAME} gets killed"
+        sleep 1
+        if ps -p ${PID} > /dev/null
+        then
+           echo "Process ${PID} is running"
+        else
+           rm -f /app/${PID_FILE_NAME}.pid
+           break
+        fi
+    done
+}
+
+
+kill_process_and_wait airflow-scheduler
+kill_process_and_wait airflow-webserver-monitor
+kill_process_and_wait airflow-webserver
+
+for i in "$@"
+do
+case ${i} in
+    -v=*|--vars=*)
+    INT_TEST_VARS="${i#*=}"
+    shift # past argument=value
+    ;;
+    -d=*|--dags=*)
+    INT_TEST_DAGS="${i#*=}"
+    shift # past argument=value
+    ;;
+    *)
+          # unknown option
+    ;;
+esac
+done
+echo "VARIABLES  = ${INT_TEST_VARS}"
+echo "DAGS       = ${INT_TEST_DAGS}"
+if [[ -n $1 ]]; then
+    echo "Last line of file specified as non-opt/last argument:"
+    tail -1 $1
+fi
+
+# Remove square brackets if they exist
+TEMP=${INT_TEST_VARS//[}
+SANITIZED_VARIABLES=${TEMP//]}
+echo ""
+echo "========= AIRFLOW VARIABLES =========="
+echo ${SANITIZED_VARIABLES}
+echo ""
+
+
+IFS=',' read -ra ENVS <<< "${SANITIZED_VARIABLES}"
+for item in "${ENVS[@]}"; do
+    IFS='=' read -ra ENV <<< "$item"
+    airflow variables -s "${ENV[0]}" "${ENV[1]}"
+    echo "Set Airflow variable:"" ${ENV[0]}"" ${ENV[1]}"
+done
+
+INT_TEST_DAGS=${INT_TEST_DAGS:-${AIRFLOW_HOME}/incubator-airflow/airflow/contrib/example_dags/*.py}
+INT_TEST_VARS=${INT_TEST_VARS:-"[PROJECT_ID=project,LOCATION=europe-west1,SOURCE_REPOSITORY=https://example.com,ENTRYPOINT=helloWorld]"}
+
+echo "Running test DAGs from: ${AIRFLOW__CORE__DAGS_FOLDER}"
+cp -v ${INT_TEST_DAGS} ${AIRFLOW__CORE__DAGS_FOLDER}/
+
+airflow initdb
+airflow webserver --daemon
+airflow scheduler --daemon
+
+sleep 5
+
+function get_dag_state() {
+    tmp=$(airflow dag_state $1 $(date -d "1 day ago" '+%m-%dT00:00:00+00:00'))
+    result=$(echo "$tmp" | tail -1)
+    echo ${result}
+}
+
+results=()
+while read -r name ; do
+    echo "Unpausing $name"
+    airflow unpause ${name}
+    while [ "$(get_dag_state ${name})" = "running" ]
+    do
+        echo "Sleeping 1s..."
+        sleep 1
+        continue
+    done
+    res=$(get_dag_state ${name})
+    if ! [[ ${res} = "success" ]]; then
+        res="failed"
+    fi
+    echo ">>> FINISHED $name: "${res}
+    results+=("$name:"${res})
+done < <(ls ${AIRFLOW__CORE__DAGS_FOLDER} | grep '.*py$' | grep -Po '.*(?=\.)')
+# `ls ...` -> Get all .py files and remove the file extension from the names
+# ^ Process substitution to avoid the sub-shell and interact with array outside of the loop
+# https://unix.stackexchange.com/a/407794/78408
+
+echo ""
+echo "===== RESULTS: ====="
+for item in "${results[@]}"
+do
+    echo ${item}
+done
+echo ""
+
+for item in "${results[@]}"
+do
+    IFS=':' read -ra NAMES <<< "$item"
+    if [[ ${NAMES[1]} = "failed" ]]; then
+        dir_name="${NAMES[0]}"
+        for entry in "${AIRFLOW_HOME}/logs/${dir_name}/*"
+        do
+            echo ""
+            echo ""
+            echo "===== ERROR LOG [START]: ${dir_name} ===== "
+            echo ${entry}
+            echo ""
+            tail -n 50 ${entry}/$(date -d "1 day ago" '+%Y-%m-%dT00:00:00+00:00')/1.log
+            echo ""
+            echo "===== ERROR LOG [END]: ${dir_name} ===== "
+            echo ""
+        done
+    fi
+done
+
+echo "NUMBER OF TESTS RUN: ${#results[@]}"
+
+for item in "${results[@]}"
+do
+    if ! [[ ${item} = *"success"* ]]; then
+        echo "STATUS: TESTS FAILED"
+        exit 1
+    fi
+done
+
+echo "STATUS: ALL TESTS SUCCEEDED"
+exit 0
diff --git a/scripts/ci/1-setup-env.sh b/scripts/ci/1-setup-env.sh
index 0a976b35f0..d07b966a61 100755
--- a/scripts/ci/1-setup-env.sh
+++ b/scripts/ci/1-setup-env.sh
@@ -26,7 +26,7 @@ java -cp "/tmp/minicluster-1.1-SNAPSHOT/*" com.ing.minicluster.MiniCluster > /de
 # Set up ssh keys
 echo 'yes' | ssh-keygen -t rsa -C [email protected] -P '' -f ~/.ssh/id_rsa
 cat ~/.ssh/id_rsa.pub >> ~/.ssh/authorized_keys
-ln -s ~/.ssh/authorized_keys ~/.ssh/authorized_keys2
+ln -sf ~/.ssh/authorized_keys ~/.ssh/authorized_keys2
 chmod 600 ~/.ssh/*
 
 # SSH Service
diff --git a/scripts/ci/5-run-tests.sh b/scripts/ci/5-run-tests.sh
index 8a74d824ef..5a59b19a5d 100755
--- a/scripts/ci/5-run-tests.sh
+++ b/scripts/ci/5-run-tests.sh
@@ -42,7 +42,7 @@ export PYTHONPATH=${PYTHONPATH:-$ROOTDIR/tests/test_utils}
 echo Backend: $AIRFLOW__CORE__SQL_ALCHEMY_CONN
 
 # environment
-export AIRFLOW_HOME=${AIRFLOW_HOME:=~}
+export AIRFLOW_HOME=${AIRFLOW_HOME:=${HOME}}
 export AIRFLOW__CORE__UNIT_TEST_MODE=True
 
 # any argument received is overriding the default nose execution arguments:
diff --git a/scripts/ci/5a-run-int-tests.sh b/scripts/ci/5a-run-int-tests.sh
new file mode 100755
index 0000000000..2e8b8051fa
--- /dev/null
+++ b/scripts/ci/5a-run-int-tests.sh
@@ -0,0 +1,224 @@
+#!/usr/bin/env bash
+
+#
+#  Licensed to the Apache Software Foundation (ASF) under one
+#  or more contributor license agreements.  See the NOTICE file
+#  distributed with this work for additional information
+#  regarding copyright ownership.  The ASF licenses this file
+#  to you under the Apache License, Version 2.0 (the
+#  "License"); you may not use this file except in compliance
+#  with the License.  You may obtain a copy of the License at
+#
+#    http://www.apache.org/licenses/LICENSE-2.0
+#
+#  Unless required by applicable law or agreed to in writing,
+#  software distributed under the License is distributed on an
+#  "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+#  KIND, either express or implied.  See the License for the
+#  specific language governing permissions and limitations
+#  under the License.
+
+set -o verbose
+set +x
+
+pwd
+
+DIR="$( cd "$( dirname "${BASH_SOURCE[0]}" )" && pwd )"
+ROOTDIR="$(dirname $(dirname ${DIR}))"
+
+
+export AIRFLOW__CORE__DAGS_FOLDER="/tmp/dags"
+export AIRFLOW_HOME=${AIRFLOW_HOME:=/app}
+
+# Generate the `airflow` executable if needed
+which airflow > /dev/null || python setup.py develop
+
+# Cleaning up after potential previous run
+echo Removing all dags from the dags folder
+rm -vf ${AIRFLOW__CORE__DAGS_FOLDER}/*
+mkdir -pv ${AIRFLOW__CORE__DAGS_FOLDER}/
+
+
+echo "Killing scheduler and webserver daemons (just in case)"
+
+function kill_process_and_wait() {
+    export PID_FILE_NAME=$1
+    PID=$(cat /app/${PID_FILE_NAME}.pid 2>/dev/null)
+    while [ -f /app/${PID_FILE_NAME}.pid ]; do
+        kill ${PID} 2>/dev/null
+        echo "Sleeping until ${PID_FILE_NAME} gets killed"
+        sleep 1
+        if ps -p ${PID} > /dev/null
+        then
+           echo "Process ${PID} is running"
+        else
+           rm -f /app/${PID_FILE_NAME}.pid
+           break
+        fi
+    done
+}
+
+
+kill_process_and_wait airflow-scheduler
+kill_process_and_wait airflow-webserver-monitor
+kill_process_and_wait airflow-webserver
+
+# For impersonation tests on Travis, make airflow accessible to other users via the global PATH
+# (which contains /usr/local/bin)
+sudo ln -sf "${VIRTUAL_ENV}/bin/airflow" /usr/local/bin/
+sudo chown -R airflow.airflow "${HOME}/.config"
+# kdc init happens in setup_kdc.sh
+kinit -kt ${KRB5_KTNAME} airflow
+
+# For impersonation tests running on SQLite on Travis, make the database world readable so other
+# users can update it
+AIRFLOW_DB="$HOME/airflow.db"
+
+if [ -f "${AIRFLOW_DB}" ]; then
+  sudo chown airflow.airflow "${AIRFLOW_DB}"
+  chmod a+rw "${AIRFLOW_DB}"
+  chmod g+rwx "${AIRFLOW_HOME}"
+else
+  touch "${AIRFLOW_DB}"
+fi
+
+if [ ! -z "${GCLOUD_SERVICE_KEY_BASE64}" ]; then
+
+    echo "Initializing the DB"
+    yes | airflow initdb
+    yes | airflow resetdb
+    echo "Installing lsb_release"
+    echo
+    sudo apt-get update && sudo apt-get install -y --no-install-recommends lsb-core
+    echo
+    echo "Extracting the key"
+    echo ${GCLOUD_SERVICE_KEY_BASE64} | base64 --decode > /tmp/key.json
+    KEY_DIR=/tmp
+    GCP_SERVICE_ACCOUNT_KEY_NAME=key.json
+
+    echo
+    echo "Installing gcloud CLI"
+    echo
+    export CLOUD_SDK_REPO="cloud-sdk-$(lsb_release -c -s)"
+    echo "deb http://packages.cloud.google.com/apt $CLOUD_SDK_REPO main" | sudo tee -a /etc/apt/sources.list.d/google-cloud-sdk.list
+    curl https://packages.cloud.google.com/apt/doc/apt-key.gpg | sudo apt-key add -
+    sudo apt-get update && sudo apt-get install -y --no-install-recommends google-cloud-sdk
+    echo
+    echo "Activating service account with ${KEY_DIR}/${GCP_SERVICE_ACCOUNT_KEY_NAME}"
+    echo
+    export GOOGLE_APPLICATION_CREDENTIALS=${KEY_DIR}/${GCP_SERVICE_ACCOUNT_KEY_NAME}
+    gcloud auth activate-service-account \
+       --key-file="${KEY_DIR}/${GCP_SERVICE_ACCOUNT_KEY_NAME}"
+    ACCOUNT=$(cat "${KEY_DIR}/${GCP_SERVICE_ACCOUNT_KEY_NAME}" | \
+      python -c 'import json, sys; info=json.load(sys.stdin); print(info["client_email"])')
+    PROJECT=$(cat "${KEY_DIR}/${GCP_SERVICE_ACCOUNT_KEY_NAME}" | \
+      python -c 'import json, sys; info=json.load(sys.stdin); print(info["project_id"])')
+    echo "ACCOUNT=${ACCOUNT}"
+    echo "PROJECT=${PROJECT}"
+    gcloud config set account "${ACCOUNT}"
+    gcloud config set project "${PROJECT}"
+    python ${DIR}/_setup_gcp_connection.py "${PROJECT}"
+else
+    echo "Skipping integration tests as no GCLOUD_SERVICE_KEY_BASE64 defined."\
+         "Set the variable to base64-encoded service account private key .json file"
+    exit 0
+fi
+
+AIRFLOW_HOME=${AIRFLOW_HOME:-/app}
+INT_TEST_DAGS=${INT_TEST_DAGS:-${AIRFLOW_HOME}/airflow/contrib/example_dags/*.py}
+INT_TEST_VARS=${INT_TEST_VARS:-"[PROJECT_ID=project,LOCATION=europe-west1,SOURCE_REPOSITORY=https://example.com,ENTRYPOINT=helloWorld]"}
+
+echo "AIRFLOW_HOME = ${AIRFLOW_HOME}"
+echo "VARIABLES    = ${INT_TEST_VARS}"
+echo "DAGS         = ${INT_TEST_DAGS}"
+
+# Remove square brackets if they exist
+TEMP=${INT_TEST_VARS//[}
+SANITIZED_VARIABLES=${TEMP//]}
+echo ""
+echo "========= AIRFLOW VARIABLES =========="
+echo ${SANITIZED_VARIABLES}
+echo ""
+
+IFS=',' read -ra ENVS <<< "${SANITIZED_VARIABLES}"
+for item in "${ENVS[@]}"; do
+    IFS='=' read -ra ENV <<< "$item"
+    airflow variables -s "${ENV[0]}" "${ENV[1]}"
+    echo "Set Airflow variable:"" ${ENV[0]}"" ${ENV[1]}"
+done
+
+echo "Running test DAGs from: ${INT_TEST_DAGS}"
+cp -v ${INT_TEST_DAGS} ${AIRFLOW__CORE__DAGS_FOLDER}/
+
+airflow webserver --daemon
+airflow scheduler --daemon
+
+sleep 5
+
+function get_dag_state() {
+    tmp=$(airflow dag_state $1 $(date -d "1 day ago" '+%m-%dT00:00:00+00:00'))
+    result=$(echo "$tmp" | tail -1)
+    echo ${result}
+}
+
+results=()
+while read -r name ; do
+    echo "Unpausing $name"
+    airflow unpause ${name}
+    while [ "$(get_dag_state ${name})" = "running" ]
+    do
+        echo "Sleeping 1s..."
+        sleep 1
+        continue
+    done
+    res=$(get_dag_state ${name})
+    if ! [[ ${res} = "success" ]]; then
+        res="failed"
+    fi
+    echo ">>> FINISHED $name: "${res}
+    results+=("$name:"${res})
+done < <(ls ${AIRFLOW__CORE__DAGS_FOLDER} | grep '.*py$' | grep -Po '.*(?=\.)')
+# `ls ...` -> Get all .py files and remove the file extension from the names
+# ^ Process substitution to avoid the sub-shell and interact with array outside of the loop
+# https://unix.stackexchange.com/a/407794/78408
+
+echo ""
+echo "===== RESULTS: ====="
+for item in "${results[@]}"
+do
+    echo ${item}
+done
+echo ""
+
+for item in "${results[@]}"
+do
+    IFS=':' read -ra NAMES <<< "$item"
+    if [[ ${NAMES[1]} = "failed" ]]; then
+        dir_name="${NAMES[0]}"
+        for entry in "${AIRFLOW_HOME}/logs/${dir_name}/*"
+        do
+            echo ""
+            echo ""
+            echo "===== ERROR LOG [START]: ${dir_name} ===== "
+            echo ${entry}
+            echo ""
+            tail -n 50 ${entry}/$(date -d "1 day ago" '+%Y-%m-%dT00:00:00+00:00')/1.log
+            echo ""
+            echo "===== ERROR LOG [END]: ${dir_name} ===== "
+            echo ""
+        done
+    fi
+done
+
+echo "NUMBER OF TESTS RUN: ${#results[@]}"
+
+for item in "${results[@]}"
+do
+    if ! [[ ${item} = *"success"* ]]; then
+        echo "STATUS: TESTS FAILED"
+        exit 1
+    fi
+done
+
+echo "STATUS: ALL TESTS SUCCEEDED"
+exit 0
diff --git a/scripts/ci/5b-prepare-docs.sh b/scripts/ci/5b-prepare-docs.sh
new file mode 100755
index 0000000000..2a1927dc3f
--- /dev/null
+++ b/scripts/ci/5b-prepare-docs.sh
@@ -0,0 +1,66 @@
+#!/usr/bin/env bash
+
+#
+#  Licensed to the Apache Software Foundation (ASF) under one
+#  or more contributor license agreements.  See the NOTICE file
+#  distributed with this work for additional information
+#  regarding copyright ownership.  The ASF licenses this file
+#  to you under the Apache License, Version 2.0 (the
+#  "License"); you may not use this file except in compliance
+#  with the License.  You may obtain a copy of the License at
+#
+#    http://www.apache.org/licenses/LICENSE-2.0
+#
+#  Unless required by applicable law or agreed to in writing,
+#  software distributed under the License is distributed on an
+#  "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+#  KIND, either express or implied.  See the License for the
+#  specific language governing permissions and limitations
+#  under the License.
+
+set -o verbose
+
+pwd
+
+echo "Using travis airflow.cfg"
+DIR="$( cd "$( dirname "${BASH_SOURCE[0]}" )" && pwd )"
+cp -f ${DIR}/airflow_travis.cfg ~/unittests.cfg
+
+ROOTDIR="$(dirname $(dirname ${DIR}))"
+export AIRFLOW__CORE__DAGS_FOLDER="$ROOTDIR/tests/dags"
+
+# add test/contrib to PYTHONPATH
+export PYTHONPATH=${PYTHONPATH:-$ROOTDIR/tests/test_utils}
+
+# environment
+export AIRFLOW_HOME=${AIRFLOW_HOME:=${HOME}}
+export AIRFLOW__CORE__UNIT_TEST_MODE=True
+
+# configuration test
+export AIRFLOW__TESTSECTION__TESTKEY=testvalue
+
+# any argument received is overriding the default nose execution arguments:
+nose_args=$@
+
+# Generate the `airflow` executable if needed
+which airflow > /dev/null || python setup.py develop
+
+# For impersonation tests on Travis, make airflow accessible to other users via the global PATH
+# (which contains /usr/local/bin)
+sudo ln -sf "${VIRTUAL_ENV}/bin/airflow" /usr/local/bin/
+
+# kdc init happens in setup_kdc.sh
+kinit -kt ${KRB5_KTNAME} airflow
+
+# For impersonation tests running on SQLite on Travis, make the database world readable so other
+# users can update it
+AIRFLOW_DB="$HOME/airflow.db"
+
+if [ -f "${AIRFLOW_DB}" ]; then
+  chmod a+rw "${AIRFLOW_DB}"
+  chmod g+rwx "${AIRFLOW_HOME}"
+fi
+
+cd /app/docs
+./build.sh
+
diff --git a/scripts/ci/_setup_gcp_connection.py b/scripts/ci/_setup_gcp_connection.py
new file mode 100644
index 0000000000..355582eca2
--- /dev/null
+++ b/scripts/ci/_setup_gcp_connection.py
@@ -0,0 +1,49 @@
+# Copyright 2018 Google LLC
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+#     https://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+"""Writes GCP Connection to the airflow db."""
+import json
+import os
+import sys
+
+from airflow import models
+from airflow import settings
+
+KEYPATH_EXTRA = 'extra__google_cloud_platform__key_path'
+SCOPE_EXTRA = 'extra__google_cloud_platform__scope'
+PROJECT_EXTRA = 'extra__google_cloud_platform__project'
+
+full_key_path = '/tmp/key.json'
+if not os.path.isfile(full_key_path):
+    print()
+    print('The key file ' + full_key_path + ' is missing!')
+    print()
+    sys.exit(1)
+
+session = settings.Session()
+try:
+    conn = session.query(models.Connection).filter(
+        models.Connection.conn_id == 'google_cloud_default')[0]
+    extras = conn.extra_dejson
+    extras[KEYPATH_EXTRA] = full_key_path
+    print('Setting GCP key file to ' + full_key_path)
+    extras[SCOPE_EXTRA] = 'https://www.googleapis.com/auth/cloud-platform'
+    extras[PROJECT_EXTRA] = sys.argv[1]
+    conn.extra = json.dumps(extras)
+    session.commit()
+except BaseException as e:
+    print('session error' + str(e.message))
+    session.rollback()
+    raise
+finally:
+    session.close()
diff --git a/scripts/ci/docker-compose.yml b/scripts/ci/docker-compose.yml
index 101ad95297..8ce41ac5c9 100644
--- a/scripts/ci/docker-compose.yml
+++ b/scripts/ci/docker-compose.yml
@@ -81,6 +81,9 @@ services:
       - TRAVIS_REPO_SLUG
       - TRAVIS_OS_NAME
       - TRAVIS_TAG
+      - INT_TEST_DAGS
+      - INT_TEST_VARS
+      - GCLOUD_SERVICE_KEY_BASE64
     depends_on:
       - postgres
       - mysql
diff --git a/scripts/ci/run-ci.sh b/scripts/ci/run-ci.sh
index 2c893bcaa0..be1f9ddc0b 100755
--- a/scripts/ci/run-ci.sh
+++ b/scripts/ci/run-ci.sh
@@ -18,7 +18,7 @@
 #  specific language governing permissions and limitations
 #  under the License.
 
-set -x
+set +x
 
 DIRNAME=$(cd "$(dirname "$0")"; pwd)
 AIRFLOW_ROOT="$DIRNAME/../.."
@@ -26,16 +26,34 @@ AIRFLOW_ROOT="$DIRNAME/../.."
 # Fix file permissions
 sudo chown -R airflow.airflow . $HOME/.cache $HOME/.wheelhouse/ $HOME/.cache/pip $HOME/.kube $HOME/.minikube
 
-if [[ $PYTHON_VERSION == '3' ]]; then
+mkdir -pv ${AIRFLOW_ROOT}/dags
+
+if [[ ${PYTHON_VERSION} == '3' ]]; then
   PIP=pip3
 else
   PIP=pip2
 fi
 
-sudo -H $PIP install --upgrade pip
-sudo -H $PIP install tox
+sudo -H ${PIP} install --upgrade pip
+sudo -H ${PIP} install tox
+
+cd ${AIRFLOW_ROOT} && $PIP --version && tox --version
 
-cd $AIRFLOW_ROOT && $PIP --version && tox --version
+if [ -z "${TOX_ENV}" ]; then
+    echo "################################################################################################"
+    echo
+    echo "You need to specify TOX_ENV environment variable to run tox-based tests automatically"
+    echo
+    echo "################################################################################################"
+    echo
+    echo "For now we drop you directly in bash shell (in case you use docker-compose to run it manually)"
+    echo "You can run tests yourself: 'tox -e integration'"
+    echo
+    echo "################################################################################################"
+    echo
+    bash
+    exit
+fi
 
 if [ -z "$KUBERNETES_VERSION" ];
 then
diff --git a/tox.ini b/tox.ini
index 6a55df6e5d..7c53df88f2 100644
--- a/tox.ini
+++ b/tox.ini
@@ -17,7 +17,7 @@
 # under the License.
 
 [tox]
-envlist = flake8,{py27,py35}-backend_{mysql,sqlite,postgres}-env_{docker,kubernetes}
+envlist = flake8,integration,docs,{py27,py35}-backend_{mysql,sqlite,postgres}-env_{docker,kubernetes}
 skipsdist = True
 
 [global]
@@ -69,3 +69,33 @@ deps =
     flake8
 
 commands = flake8
+
+
+[testenv:integration]
+basepython = python2.7
+
+deps =
+    wheel
+
+commands =
+    ; You can comment below two lines if you run the command for the second time
+    ; to avoid time consuming pip installation
+    pip wheel --progress-bar off -w {homedir}/.wheelhouse -f {homedir}/.wheelhouse -e .[devel_ci]
+    pip install --progress-bar off --find-links={homedir}/.wheelhouse --no-index -e .[devel_ci]
+    {toxinidir}/scripts/ci/1-setup-env.sh
+    {toxinidir}/scripts/ci/3-setup-mysql.sh
+    {toxinidir}/scripts/ci/5a-run-int-tests.sh []
+
+[testenv:docs]
+basepython = python2.7
+
+deps =
+    wheel
+
+commands =
+    pip wheel --progress-bar off -w {homedir}/.wheelhouse -f {homedir}/.wheelhouse -e .[devel_ci]
+    pip install --progress-bar off --find-links={homedir}/.wheelhouse --no-index -e .[devel_ci]
+    {toxinidir}/scripts/ci/1-setup-env.sh
+    {toxinidir}/scripts/ci/2-setup-kdc.sh
+    {toxinidir}/scripts/ci/3-setup-mysql.sh
+    {toxinidir}/scripts/ci/5b-prepare-docs.sh []


 

----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
[email protected]


With regards,
Apache Git Services
