[GitHub] gerardo commented on issue #3815: [AIRFLOW-2973] Add Python 3.6 to Supported Prog Langs

2018-08-28 Thread GitBox
gerardo commented on issue #3815: [AIRFLOW-2973] Add Python 3.6 to Supported 
Prog Langs
URL: 
https://github.com/apache/incubator-airflow/pull/3815#issuecomment-416833941
 
 
   (posting again in here)
   
   @kaxil given we're using tox for this, it should be a matter of just adding/updating the runtimes inside the tox file [here](https://github.com/apache/incubator-airflow/blob/master/tox.ini#L20) and [here](https://github.com/apache/incubator-airflow/blob/master/tox.ini#L40) and then passing the desired Python version through the Travis script as we do now.


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] gerardo commented on a change in pull request #3393: [AIRFLOW-2499] Dockerised CI pipeline

2018-08-28 Thread GitBox
gerardo commented on a change in pull request #3393: [AIRFLOW-2499] Dockerised 
CI pipeline
URL: https://github.com/apache/incubator-airflow/pull/3393#discussion_r213552321
 
 

 ##
 File path: .travis.yml
 ##
 @@ -19,94 +19,40 @@
 sudo: true
 dist: trusty
 language: python
-jdk:
-  - oraclejdk8
-services:
-  - cassandra
-  - mongodb
-  - mysql
-  - postgresql
-  - rabbitmq
-addons:
-  apt:
-    packages:
-      - slapd
-      - ldap-utils
-      - openssh-server
-      - mysql-server-5.6
-      - mysql-client-core-5.6
-      - mysql-client-5.6
-      - krb5-user
-      - krb5-kdc
-      - krb5-admin-server
-      - oracle-java8-installer
-  postgresql: "9.2"
-python:
-  - "2.7"
-  - "3.5"
 env:
   global:
+    - DOCKER_COMPOSE_VERSION=1.20.0
     - SLUGIFY_USES_TEXT_UNIDECODE=yes
     - TRAVIS_CACHE=$HOME/.travis_cache/
-    - KRB5_CONFIG=/etc/krb5.conf
-    - KRB5_KTNAME=/etc/airflow.keytab
-    # Travis on google cloud engine has a global /etc/boto.cfg that
-    # does not work with python 3
-    - BOTO_CONFIG=/tmp/bogusvalue
   matrix:
+    - TOX_ENV=flake8
     - TOX_ENV=py27-backend_mysql
     - TOX_ENV=py27-backend_sqlite
     - TOX_ENV=py27-backend_postgres
-    - TOX_ENV=py35-backend_mysql
-    - TOX_ENV=py35-backend_sqlite
-    - TOX_ENV=py35-backend_postgres
-    - TOX_ENV=flake8
+    - TOX_ENV=py35-backend_mysql PYTHON_VERSION=3
 
 Review comment:
   @kaxil given we're using tox for this, it should be a matter of just adding/updating the runtimes inside the tox file [here](https://github.com/apache/incubator-airflow/blob/master/tox.ini#L20) and [here](https://github.com/apache/incubator-airflow/blob/master/tox.ini#L40) and then passing the desired Python version through the Travis script as we do now.
   
   There's an extra check inside the [run-ci script](https://github.com/apache/incubator-airflow/blob/master/scripts/ci/run-ci.sh#L29) to use the correct pip binary (pip vs pip3), but this shouldn't be affected, as that check only cares about the major version.


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] gerardo commented on issue #3797: [AIRFLOW-2952] Splits CI into k8s + docker-compose

2018-08-28 Thread GitBox
gerardo commented on issue #3797: [AIRFLOW-2952] Splits CI into k8s + 
docker-compose
URL: 
https://github.com/apache/incubator-airflow/pull/3797#issuecomment-416832081
 
 
   @dimberman I was trying to run the tests as-is inside our Docker image, but so far minikube doesn't seem to like running inside Docker.


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] gerardo commented on issue #3797: [AIRFLOW-2952] Splits CI into k8s + docker-compose

2018-08-28 Thread GitBox
gerardo commented on issue #3797: [AIRFLOW-2952] Splits CI into k8s + 
docker-compose
URL: 
https://github.com/apache/incubator-airflow/pull/3797#issuecomment-416831761
 
 
   Hey @dimberman, after your change, the build started failing in a different place: `sudo: kadmin: command not found`. This means tox is running the [`2-setup-kdc.sh` script](https://github.com/apache/incubator-airflow/blob/master/tox.ini#L61).
   
   I'm not sure what the best way to do this with tox is, but at this point we need to skip these setup scripts and only run the package installation steps, `5-run-tests.sh`, `6-check-license.sh`, and codecov.
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[jira] [Updated] (AIRFLOW-2901) WebHdfsSensor doesn't support HDFS HA

2018-08-28 Thread Manu Zhang (JIRA)


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-2901?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Manu Zhang updated AIRFLOW-2901:

Description: 
If  HDFS is configured with HA, we cannot use WebHdfsSensor to check for file 
existence since WebHdfs cannot resolve the name service ID. Consider using 
[pyarrow.hdfs|https://arrow.apache.org/docs/python/filesystems.html] as a 
replacement.

An alternative is to allow users to configure a list of namenodes if the dependencies of pyarrow (including libhdfs.so) are too heavy.

  was:If  HDFS is configured with HA, we cannot use WebHdfsSensor to check for 
file existence since WebHdfs cannot resolve the name service ID. Consider using 
[pyarrow.hdfs|https://arrow.apache.org/docs/python/filesystems.html] as a 
replacement.


> WebHdfsSensor doesn't support HDFS HA
> -
>
> Key: AIRFLOW-2901
> URL: https://issues.apache.org/jira/browse/AIRFLOW-2901
> Project: Apache Airflow
>  Issue Type: Improvement
>  Components: hooks
>Reporter: Manu Zhang
>Priority: Major
>
> If  HDFS is configured with HA, we cannot use WebHdfsSensor to check for file 
> existence since WebHdfs cannot resolve the name service ID. Consider using 
> [pyarrow.hdfs|https://arrow.apache.org/docs/python/filesystems.html] as a 
> replacement.
> An alternative way is to allow users to configure a list of namenodes if the 
> dependencies of pyarrow (including libhdfs.so) are too heavy



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[GitHub] yrqls21 commented on a change in pull request #3798: [AIRFLOW-2951] Update dag_run table end_date when state change

2018-08-28 Thread GitBox
yrqls21 commented on a change in pull request #3798: [AIRFLOW-2951] Update 
dag_run table end_date when state change
URL: https://github.com/apache/incubator-airflow/pull/3798#discussion_r213515276
 
 

 ##
 File path: tests/models.py
 ##
 @@ -896,6 +896,65 @@ def on_failure_callable(context):
         updated_dag_state = dag_run.update_state()
         self.assertEqual(State.FAILED, updated_dag_state)
 
+    def test_dagrun_set_state_end_date(self):
+        session = settings.Session()
+
+        dag = DAG(
+            'test_dagrun_set_state_end_date',
+            start_date=DEFAULT_DATE,
+            default_args={'owner': 'owner1'})
+
+        # A -> B
+        with dag:
 
 Review comment:
   you don't really need this line.
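   
   (For illustration only, one way to drop it would be to bind the operators to the DAG explicitly; the snippet below is hypothetical, not this PR's code.)
   
   ```python
   # Hypothetical alternative without the `with dag:` block
   op1 = DummyOperator(task_id='A', dag=dag)
   op2 = DummyOperator(task_id='B', dag=dag)
   op1.set_upstream(op2)
   ```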


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] yrqls21 commented on a change in pull request #3798: [AIRFLOW-2951] Update dag_run table end_date when state change

2018-08-28 Thread GitBox
yrqls21 commented on a change in pull request #3798: [AIRFLOW-2951] Update 
dag_run table end_date when state change
URL: https://github.com/apache/incubator-airflow/pull/3798#discussion_r213515493
 
 

 ##
 File path: tests/models.py
 ##
 @@ -896,6 +896,65 @@ def on_failure_callable(context):
         updated_dag_state = dag_run.update_state()
         self.assertEqual(State.FAILED, updated_dag_state)
 
+    def test_dagrun_set_state_end_date(self):
+        session = settings.Session()
+
+        dag = DAG(
+            'test_dagrun_set_state_end_date',
+            start_date=DEFAULT_DATE,
+            default_args={'owner': 'owner1'})
+
+        # A -> B
+        with dag:
+            op1 = DummyOperator(task_id='A')
+            op2 = DummyOperator(task_id='B')
+            op1.set_upstream(op2)
+
+        dag.clear()
+
+        now = timezone.utcnow()
+        dr = dag.create_dagrun(run_id='test_dagrun_set_state_end_date',
+                               state=State.RUNNING,
+                               execution_date=now,
+                               start_date=now)
+
+        # Initial end_date should be NULL
+        # State.SUCCESS and State.FAILED are all ending state and should set end_date
+        # State.RUNNING set end_date back to NULL
+        session.add(dr)
+        session.commit()
+        self.assertIsNone(dr.end_date)
+
+        dr.set_state(State.SUCCESS)
+        session.commit()
+
+        dr_database = session.query(DagRun).filter(
+            DagRun.run_id == 'test_dagrun_set_state_end_date'
+        ).one()
+        self.assertIsNotNone(dr.end_date)
+        self.assertIsNotNone(dr_database.end_date)
+        self.assertEqual(dr.end_date, dr_database.end_date)
 
 Review comment:
   assert dr_database should be enough
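   
   One possible trimmed form of the assertions (a sketch, not the PR's final code):
   
   ```python
   # Asserting on the row read back from the database covers the same ground
   dr_database = session.query(DagRun).filter(
       DagRun.run_id == 'test_dagrun_set_state_end_date'
   ).one()
   self.assertIsNotNone(dr_database.end_date)
   self.assertEqual(dr.end_date, dr_database.end_date)
   ```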


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[jira] [Closed] (AIRFLOW-2930) scheduler exit when using celery executor

2018-08-28 Thread Yingbo Wang (JIRA)


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-2930?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Yingbo Wang closed AIRFLOW-2930.


PR merged.

> scheduler exit when using celery executor
> -
>
> Key: AIRFLOW-2930
> URL: https://issues.apache.org/jira/browse/AIRFLOW-2930
> Project: Apache Airflow
>  Issue Type: Bug
>  Components: celery
>Reporter: Yingbo Wang
>Assignee: Yingbo Wang
>Priority: Major
>
> Caused by:
> [https://github.com/apache/incubator-airflow/pull/3740]
>  
> Use CeleryExecutor for airflow, scheduler exit after a Dag is activated. 



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[GitHub] isknight commented on a change in pull request #3813: [AIRFLOW-1998] Implemented DatabricksRunNowOperator for jobs/run-now …

2018-08-28 Thread GitBox
isknight commented on a change in pull request #3813: [AIRFLOW-1998] 
Implemented DatabricksRunNowOperator for jobs/run-now …
URL: https://github.com/apache/incubator-airflow/pull/3813#discussion_r213500076
 
 

 ##
 File path: airflow/contrib/hooks/databricks_hook.py
 ##
 @@ -143,6 +144,18 @@ def _do_api_call(self, endpoint_info, json):
         raise AirflowException(('API requests to Databricks failed {} times. ' +
                                 'Giving up.').format(self.retry_limit))
 
+    def run_now(self, json):
+        """
+        Utility function to call the ``api/2.0/jobs/run-now`` endpoint.
+
+        :param json: The data used in the body of the request to the ``submit`` endpoint.
 
 Review comment:
   Oops! nice catch. I read through my new docs like 5 times but managed to 
miss it. :) I'll update.


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] isknight commented on a change in pull request #3813: [AIRFLOW-1998] Implemented DatabricksRunNowOperator for jobs/run-now …

2018-08-28 Thread GitBox
isknight commented on a change in pull request #3813: [AIRFLOW-1998] 
Implemented DatabricksRunNowOperator for jobs/run-now …
URL: https://github.com/apache/incubator-airflow/pull/3813#discussion_r213499796
 
 

 ##
 File path: airflow/contrib/operators/databricks_operator.py
 ##
 @@ -30,6 +30,81 @@
 XCOM_RUN_PAGE_URL_KEY = 'run_page_url'
 
 
+def _deep_string_coerce(content, json_path='json'):
+    """
+    Coerces content or all values of content if it is a dict to a string. The
+    function will throw if content contains non-string or non-numeric types.
+
+    The reason why we have this function is because the ``self.json`` field must be a
+    dict with only string values. This is because ``render_template`` will fail
+    for numerical values.
+    """
+    c = _deep_string_coerce
+    if isinstance(content, six.string_types):
+        return content
+    elif isinstance(content, six.integer_types + (float,)):
+        # Databricks can tolerate either numeric or string types in the API backend.
+        return str(content)
+    elif isinstance(content, (list, tuple)):
+        return [c(e, '{0}[{1}]'.format(json_path, i)) for i, e in enumerate(content)]
+    elif isinstance(content, dict):
+        return {k: c(v, '{0}[{1}]'.format(json_path, k))
+                for k, v in list(content.items())}
+    else:
+        param_type = type(content)
+        msg = 'Type {0} used for parameter {1} is not a number or a string' \
+            .format(param_type, json_path)
+        raise AirflowException(msg)
+
+
+def _handle_databricks_operator_execution(operator, context):
+    """
+    Handles the Airflow + Databricks lifecycle logic for a data bricks operator
+    :param operator: databricks operator being handled
+    :param context: airflow context
+    """
+    hook = operator.get_hook()
+    operator.call_hook(hook)
+    # hook.submit_run(operator.json)
+    if operator.do_xcom_push:
+        context['ti'].xcom_push(key=XCOM_RUN_ID_KEY, value=operator.run_id)
+    operator.log.info('Run submitted with run_id: %s', operator.run_id)
+    run_page_url = hook.get_run_page_url(operator.run_id)
+    if operator.do_xcom_push:
+        context['ti'].xcom_push(key=XCOM_RUN_PAGE_URL_KEY, value=run_page_url)
+    operator.log_run_page_url(run_page_url)
+    while True:
+        run_state = hook.get_run_state(operator.run_id)
+        if run_state.is_terminal:
+            if run_state.is_successful:
+                operator.log.info('%s completed successfully.', operator.task_id)
+                operator.log_run_page_url(run_page_url)
+                return
+            else:
+                error_message = '{t} failed with terminal state: {s}'.format(
+                    t=operator.task_id,
+                    s=run_state)
+                raise AirflowException(error_message)
+        else:
+            operator.log.info('%s in run state: %s', operator.task_id, run_state)
+            operator.log_run_page_url(run_page_url)
+            operator.log.info('Sleeping for %s seconds.', operator.polling_period_seconds)
+            time.sleep(operator.polling_period_seconds)
+
+
+def _handle_databricks_operator_on_kill(operator):
 
 Review comment:
   Yeah, I felt iffy when I pulled that out. It's not good to duplicate code, 
but what if their lifecycle deviates? I'll update this.


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] andrewmchen commented on issue #3817: Feature/cluster operations

2018-08-28 Thread GitBox
andrewmchen commented on issue #3817: Feature/cluster operations
URL: 
https://github.com/apache/incubator-airflow/pull/3817#issuecomment-416743238
 
 
   Looks reasonable. I can take a review pass once the comments that @ashb brought up are addressed. Is the eventual goal to extend these to the DatabricksOperators?


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] andrewmchen commented on a change in pull request #3813: [AIRFLOW-1998] Implemented DatabricksRunNowOperator for jobs/run-now …

2018-08-28 Thread GitBox
andrewmchen commented on a change in pull request #3813: [AIRFLOW-1998] 
Implemented DatabricksRunNowOperator for jobs/run-now …
URL: https://github.com/apache/incubator-airflow/pull/3813#discussion_r213470207
 
 

 ##
 File path: airflow/contrib/operators/databricks_operator.py
 ##
 @@ -30,6 +30,81 @@
 XCOM_RUN_PAGE_URL_KEY = 'run_page_url'
 
 
+def _deep_string_coerce(content, json_path='json'):
+    """
+    Coerces content or all values of content if it is a dict to a string. The
+    function will throw if content contains non-string or non-numeric types.
+
+    The reason why we have this function is because the ``self.json`` field must be a
+    dict with only string values. This is because ``render_template`` will fail
+    for numerical values.
+    """
+    c = _deep_string_coerce
+    if isinstance(content, six.string_types):
+        return content
+    elif isinstance(content, six.integer_types + (float,)):
+        # Databricks can tolerate either numeric or string types in the API backend.
+        return str(content)
+    elif isinstance(content, (list, tuple)):
+        return [c(e, '{0}[{1}]'.format(json_path, i)) for i, e in enumerate(content)]
+    elif isinstance(content, dict):
+        return {k: c(v, '{0}[{1}]'.format(json_path, k))
+                for k, v in list(content.items())}
+    else:
+        param_type = type(content)
+        msg = 'Type {0} used for parameter {1} is not a number or a string' \
+            .format(param_type, json_path)
+        raise AirflowException(msg)
+
+
+def _handle_databricks_operator_execution(operator, context):
 
 Review comment:
   Can we explicitly pass the parameters `hook` and `log` into this function and lift the logic that initially invokes `run-now` or `run-submit` back into the operator? The end result would be a helper function responsible for polling for the run result.
   
   With this solution, we'd duplicate a bit of logic with the xcom business (which we can refactor later), but we avoid having to pass the wide interface of an `operator` (which is really a specific Databricks operator) to this function.
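   
   A rough sketch of what that narrower helper could look like (the name and signature below are illustrative, not this PR's actual API):
   
   ```python
   import time
   
   from airflow.exceptions import AirflowException
   
   
   def _poll_run_until_terminal(run_id, hook, log, task_id, polling_period_seconds):
       """Poll a Databricks run until it reaches a terminal state (sketch only)."""
       run_page_url = hook.get_run_page_url(run_id)
       while True:
           run_state = hook.get_run_state(run_id)
           if run_state.is_terminal:
               if run_state.is_successful:
                   log.info('%s completed successfully.', task_id)
                   return
               raise AirflowException('{t} failed with terminal state: {s}'.format(
                   t=task_id, s=run_state))
           log.info('%s in run state: %s', task_id, run_state)
           log.info('Run page URL: %s', run_page_url)
           time.sleep(polling_period_seconds)
   ```
   
   The initial `run-now`/`submit` call and the xcom pushes would then stay in each operator's `execute`.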


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] andrewmchen commented on a change in pull request #3813: [AIRFLOW-1998] Implemented DatabricksRunNowOperator for jobs/run-now …

2018-08-28 Thread GitBox
andrewmchen commented on a change in pull request #3813: [AIRFLOW-1998] 
Implemented DatabricksRunNowOperator for jobs/run-now …
URL: https://github.com/apache/incubator-airflow/pull/3813#discussion_r213473499
 
 

 ##
 File path: airflow/contrib/operators/databricks_operator.py
 ##
 @@ -30,6 +30,81 @@
 XCOM_RUN_PAGE_URL_KEY = 'run_page_url'
 
 
+def _deep_string_coerce(content, json_path='json'):
+    """
+    Coerces content or all values of content if it is a dict to a string. The
+    function will throw if content contains non-string or non-numeric types.
+
+    The reason why we have this function is because the ``self.json`` field must be a
+    dict with only string values. This is because ``render_template`` will fail
+    for numerical values.
+    """
+    c = _deep_string_coerce
+    if isinstance(content, six.string_types):
+        return content
+    elif isinstance(content, six.integer_types + (float,)):
+        # Databricks can tolerate either numeric or string types in the API backend.
+        return str(content)
+    elif isinstance(content, (list, tuple)):
+        return [c(e, '{0}[{1}]'.format(json_path, i)) for i, e in enumerate(content)]
+    elif isinstance(content, dict):
+        return {k: c(v, '{0}[{1}]'.format(json_path, k))
+                for k, v in list(content.items())}
+    else:
+        param_type = type(content)
+        msg = 'Type {0} used for parameter {1} is not a number or a string' \
+            .format(param_type, json_path)
+        raise AirflowException(msg)
+
+
+def _handle_databricks_operator_execution(operator, context):
+    """
+    Handles the Airflow + Databricks lifecycle logic for a data bricks operator
+    :param operator: databricks operator being handled
+    :param context: airflow context
+    """
+    hook = operator.get_hook()
+    operator.call_hook(hook)
+    # hook.submit_run(operator.json)
+    if operator.do_xcom_push:
+        context['ti'].xcom_push(key=XCOM_RUN_ID_KEY, value=operator.run_id)
+    operator.log.info('Run submitted with run_id: %s', operator.run_id)
+    run_page_url = hook.get_run_page_url(operator.run_id)
+    if operator.do_xcom_push:
+        context['ti'].xcom_push(key=XCOM_RUN_PAGE_URL_KEY, value=run_page_url)
+    operator.log_run_page_url(run_page_url)
+    while True:
+        run_state = hook.get_run_state(operator.run_id)
+        if run_state.is_terminal:
+            if run_state.is_successful:
+                operator.log.info('%s completed successfully.', operator.task_id)
+                operator.log_run_page_url(run_page_url)
+                return
+            else:
+                error_message = '{t} failed with terminal state: {s}'.format(
+                    t=operator.task_id,
+                    s=run_state)
+                raise AirflowException(error_message)
+        else:
+            operator.log.info('%s in run state: %s', operator.task_id, run_state)
+            operator.log_run_page_url(run_page_url)
+            operator.log.info('Sleeping for %s seconds.', operator.polling_period_seconds)
+            time.sleep(operator.polling_period_seconds)
+
+
+def _handle_databricks_operator_on_kill(operator):
 
 Review comment:
   For this one, I'm thinking we can simply duplicate the logic in both 
operators.


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] andrewmchen commented on a change in pull request #3813: [AIRFLOW-1998] Implemented DatabricksRunNowOperator for jobs/run-now …

2018-08-28 Thread GitBox
andrewmchen commented on a change in pull request #3813: [AIRFLOW-1998] 
Implemented DatabricksRunNowOperator for jobs/run-now …
URL: https://github.com/apache/incubator-airflow/pull/3813#discussion_r213427675
 
 

 ##
 File path: airflow/contrib/hooks/databricks_hook.py
 ##
 @@ -143,6 +144,18 @@ def _do_api_call(self, endpoint_info, json):
         raise AirflowException(('API requests to Databricks failed {} times. ' +
                                 'Giving up.').format(self.retry_limit))
 
+    def run_now(self, json):
+        """
+        Utility function to call the ``api/2.0/jobs/run-now`` endpoint.
+
+        :param json: The data used in the body of the request to the ``submit`` endpoint.
 
 Review comment:
   nit: `submit` -> `run-now`
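   
   For context, a hypothetical usage sketch of the new hook method (the connection id, job_id, and params below are illustrative, and this assumes `run_now` returns the run_id the same way `submit_run` does):
   
   ```python
   from airflow.contrib.hooks.databricks_hook import DatabricksHook
   
   # Trigger an existing Databricks job by id and pass notebook parameters
   hook = DatabricksHook(databricks_conn_id='databricks_default')
   run_id = hook.run_now({'job_id': 42, 'notebook_params': {'dry-run': 'true'}})
   ```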


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] codecov-io edited a comment on issue #3815: [AIRFLOW-2973] Add Python 3.6 to Supported Prog Langs

2018-08-28 Thread GitBox
codecov-io edited a comment on issue #3815: [AIRFLOW-2973] Add Python 3.6 to 
Supported Prog Langs
URL: 
https://github.com/apache/incubator-airflow/pull/3815#issuecomment-416603006
 
 
   # 
[Codecov](https://codecov.io/gh/apache/incubator-airflow/pull/3815?src=pr=h1)
 Report
   > Merging 
[#3815](https://codecov.io/gh/apache/incubator-airflow/pull/3815?src=pr=desc)
 into 
[master](https://codecov.io/gh/apache/incubator-airflow/commit/56bae60c139036ab506af595bd44b31eb21967df?src=pr=desc)
 will **not change** coverage.
   > The diff coverage is `n/a`.
   
   [![Impacted file tree 
graph](https://codecov.io/gh/apache/incubator-airflow/pull/3815/graphs/tree.svg?width=650=WdLKlKHOAU=150=pr)](https://codecov.io/gh/apache/incubator-airflow/pull/3815?src=pr=tree)
   
   ```diff
   @@            Coverage Diff            @@
   ##           master    #3815   +/-   ##
   =========================================
    Coverage   77.41%   77.41%
   =========================================
    Files         203      203
    Lines       15817    15817
   =========================================
    Hits        12244    12244
    Misses       3573     3573
   ```
   
   
   
   --
   
   [Continue to review full report at 
Codecov](https://codecov.io/gh/apache/incubator-airflow/pull/3815?src=pr=continue).
   > **Legend** - [Click here to learn 
more](https://docs.codecov.io/docs/codecov-delta)
   > `Δ = absolute  (impact)`, `ø = not affected`, `? = missing data`
   > Powered by 
[Codecov](https://codecov.io/gh/apache/incubator-airflow/pull/3815?src=pr=footer).
 Last update 
[56bae60...687e28f](https://codecov.io/gh/apache/incubator-airflow/pull/3815?src=pr=lastupdated).
 Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments).
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] codecov-io edited a comment on issue #3815: [AIRFLOW-2973] Add Python 3.6 to Supported Prog Langs

2018-08-28 Thread GitBox
codecov-io edited a comment on issue #3815: [AIRFLOW-2973] Add Python 3.6 to 
Supported Prog Langs
URL: 
https://github.com/apache/incubator-airflow/pull/3815#issuecomment-416603006
 
 
   # 
[Codecov](https://codecov.io/gh/apache/incubator-airflow/pull/3815?src=pr=h1)
 Report
   > Merging 
[#3815](https://codecov.io/gh/apache/incubator-airflow/pull/3815?src=pr=desc)
 into 
[master](https://codecov.io/gh/apache/incubator-airflow/commit/56bae60c139036ab506af595bd44b31eb21967df?src=pr=desc)
 will **not change** coverage.
   > The diff coverage is `n/a`.
   
   [![Impacted file tree 
graph](https://codecov.io/gh/apache/incubator-airflow/pull/3815/graphs/tree.svg?width=650=WdLKlKHOAU=150=pr)](https://codecov.io/gh/apache/incubator-airflow/pull/3815?src=pr=tree)
   
   ```diff
   @@            Coverage Diff            @@
   ##           master    #3815   +/-   ##
   =========================================
    Coverage   77.41%   77.41%
   =========================================
    Files         203      203
    Lines       15817    15817
   =========================================
    Hits        12244    12244
    Misses       3573     3573
   ```
   
   
   
   --
   
   [Continue to review full report at 
Codecov](https://codecov.io/gh/apache/incubator-airflow/pull/3815?src=pr=continue).
   > **Legend** - [Click here to learn 
more](https://docs.codecov.io/docs/codecov-delta)
   > `Δ = absolute  (impact)`, `ø = not affected`, `? = missing data`
   > Powered by 
[Codecov](https://codecov.io/gh/apache/incubator-airflow/pull/3815?src=pr=footer).
 Last update 
[56bae60...687e28f](https://codecov.io/gh/apache/incubator-airflow/pull/3815?src=pr=lastupdated).
 Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments).
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] codecov-io edited a comment on issue #3815: [AIRFLOW-2973] Add Python 3.6 to Supported Prog Langs

2018-08-28 Thread GitBox
codecov-io edited a comment on issue #3815: [AIRFLOW-2973] Add Python 3.6 to 
Supported Prog Langs
URL: 
https://github.com/apache/incubator-airflow/pull/3815#issuecomment-416603006
 
 
   # 
[Codecov](https://codecov.io/gh/apache/incubator-airflow/pull/3815?src=pr=h1)
 Report
   > Merging 
[#3815](https://codecov.io/gh/apache/incubator-airflow/pull/3815?src=pr=desc)
 into 
[master](https://codecov.io/gh/apache/incubator-airflow/commit/56bae60c139036ab506af595bd44b31eb21967df?src=pr=desc)
 will **not change** coverage.
   > The diff coverage is `n/a`.
   
   [![Impacted file tree 
graph](https://codecov.io/gh/apache/incubator-airflow/pull/3815/graphs/tree.svg?width=650=WdLKlKHOAU=150=pr)](https://codecov.io/gh/apache/incubator-airflow/pull/3815?src=pr=tree)
   
   ```diff
   @@            Coverage Diff            @@
   ##           master    #3815   +/-   ##
   =========================================
    Coverage   77.41%   77.41%
   =========================================
    Files         203      203
    Lines       15817    15817
   =========================================
    Hits        12244    12244
    Misses       3573     3573
   ```
   
   
   
   --
   
   [Continue to review full report at 
Codecov](https://codecov.io/gh/apache/incubator-airflow/pull/3815?src=pr=continue).
   > **Legend** - [Click here to learn 
more](https://docs.codecov.io/docs/codecov-delta)
   > `Δ = absolute  (impact)`, `ø = not affected`, `? = missing data`
   > Powered by 
[Codecov](https://codecov.io/gh/apache/incubator-airflow/pull/3815?src=pr=footer).
 Last update 
[56bae60...687e28f](https://codecov.io/gh/apache/incubator-airflow/pull/3815?src=pr=lastupdated).
 Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments).
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[jira] [Issue Comment Deleted] (AIRFLOW-2416) executor_config column in task_instance isn't getting created

2018-08-28 Thread Samuel Mullin (JIRA)


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-2416?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Samuel Mullin updated AIRFLOW-2416:
---
Comment: was deleted

(was: Worth mentioning that I came across this issue in 1.10.0 when upgrading 
from 1.9.0.  If anyone else using postgres stumbles across this issue and wants 
to fix it manually:
{code:java}
ALTER TABLE task_instance ADD COLUMN executor_config bytea
{code})

> executor_config column in task_instance isn't getting created
> -
>
> Key: AIRFLOW-2416
> URL: https://issues.apache.org/jira/browse/AIRFLOW-2416
> Project: Apache Airflow
>  Issue Type: Bug
>  Components: db
>Affects Versions: 1.9.0
> Environment: Running on a mac (System Version: OS X 10.11.6 
> (15G19009)) dev environment with Python 3.6.
>Reporter: Curtis Deems
>Assignee: Cameron Moberg
>Priority: Major
>
> There's a new column called 'executor_config' in the 'task_instance' table 
> that the scheduler is attempting to query.  The column isn't created with 
> initdb or upgradedb so the scheduler just loops and never picks up any dag 
> objects.  The only way I discovered this was to run the scheduler thru the 
> debugger and review the exceptions thrown by the scheduler.  This issue 
> doesn't show up in the scheduler logs or any other output that I could see.
> The workaround is to create the column manually but since the root issue is 
> not easily discoverable this could be a blocker for some people.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[GitHub] kaxil edited a comment on issue #3815: [AIRFLOW-2973] Add Python 3.6 to Supported Prog Langs

2018-08-28 Thread GitBox
kaxil edited a comment on issue #3815: [AIRFLOW-2973] Add Python 3.6 to 
Supported Prog Langs
URL: 
https://github.com/apache/incubator-airflow/pull/3815#issuecomment-416716841
 
 
   The way CI works now with Docker, we will have to change bits of our Docker CI image as well. I have pinged @gerardo on the Dockerised PR to help with the case where we would have to run tests for multiple Python versions. I am hence keeping the scope of this PR to adding 3.6 to setup.py; alternatively, we can close this PR entirely, sort out our CI pipeline first, and add these changes over there.
   
   cc @ashb @Fokko 


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] kaxil commented on issue #3815: [AIRFLOW-2973] Add Python 3.6 to Supported Prog Langs

2018-08-28 Thread GitBox
kaxil commented on issue #3815: [AIRFLOW-2973] Add Python 3.6 to Supported Prog 
Langs
URL: 
https://github.com/apache/incubator-airflow/pull/3815#issuecomment-416716841
 
 
   The way CI works now with Docker, we will have to change bits of our Docker CI image as well. I have pinged @gerardo on the Dockerised PR to help with the case where we would have to run tests for multiple Python versions.


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] kaxil commented on a change in pull request #3393: [AIRFLOW-2499] Dockerised CI pipeline

2018-08-28 Thread GitBox
kaxil commented on a change in pull request #3393: [AIRFLOW-2499] Dockerised CI 
pipeline
URL: https://github.com/apache/incubator-airflow/pull/3393#discussion_r213447645
 
 

 ##
 File path: .travis.yml
 ##
 @@ -19,94 +19,40 @@
 sudo: true
 dist: trusty
 language: python
-jdk:
-  - oraclejdk8
-services:
-  - cassandra
-  - mongodb
-  - mysql
-  - postgresql
-  - rabbitmq
-addons:
-  apt:
-    packages:
-      - slapd
-      - ldap-utils
-      - openssh-server
-      - mysql-server-5.6
-      - mysql-client-core-5.6
-      - mysql-client-5.6
-      - krb5-user
-      - krb5-kdc
-      - krb5-admin-server
-      - oracle-java8-installer
-  postgresql: "9.2"
-python:
-  - "2.7"
-  - "3.5"
 env:
   global:
+    - DOCKER_COMPOSE_VERSION=1.20.0
     - SLUGIFY_USES_TEXT_UNIDECODE=yes
     - TRAVIS_CACHE=$HOME/.travis_cache/
-    - KRB5_CONFIG=/etc/krb5.conf
-    - KRB5_KTNAME=/etc/airflow.keytab
-    # Travis on google cloud engine has a global /etc/boto.cfg that
-    # does not work with python 3
-    - BOTO_CONFIG=/tmp/bogusvalue
   matrix:
+    - TOX_ENV=flake8
     - TOX_ENV=py27-backend_mysql
     - TOX_ENV=py27-backend_sqlite
     - TOX_ENV=py27-backend_postgres
-    - TOX_ENV=py35-backend_mysql
-    - TOX_ENV=py35-backend_sqlite
-    - TOX_ENV=py35-backend_postgres
-    - TOX_ENV=flake8
+    - TOX_ENV=py35-backend_mysql PYTHON_VERSION=3
 
 Review comment:
   https://github.com/apache/incubator-airflow/pull/3815


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[jira] [Commented] (AIRFLOW-2416) executor_config column in task_instance isn't getting created

2018-08-28 Thread Samuel Mullin (JIRA)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-2416?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16595484#comment-16595484
 ] 

Samuel Mullin commented on AIRFLOW-2416:


Worth mentioning that I came across this issue in 1.10.0 when upgrading from 
1.9.0.  If anyone else using postgres stumbles across this issue and wants to 
fix it manually:
{code:java}
ALTER TABLE task_instance ADD COLUMN executor_config bytea
{code}

> executor_config column in task_instance isn't getting created
> -
>
> Key: AIRFLOW-2416
> URL: https://issues.apache.org/jira/browse/AIRFLOW-2416
> Project: Apache Airflow
>  Issue Type: Bug
>  Components: db
>Affects Versions: 1.9.0
> Environment: Running on a mac (System Version: OS X 10.11.6 
> (15G19009)) dev environment with Python 3.6.
>Reporter: Curtis Deems
>Assignee: Cameron Moberg
>Priority: Major
>
> There's a new column called 'executor_config' in the 'task_instance' table 
> that the scheduler is attempting to query.  The column isn't created with 
> initdb or upgradedb so the scheduler just loops and never picks up any dag 
> objects.  The only way I discovered this was to run the scheduler thru the 
> debugger and review the exceptions thrown by the scheduler.  This issue 
> doesn't show up in the scheduler logs or any other output that I could see.
> The workaround is to create the column manually but since the root issue is 
> not easily discoverable this could be a blocker for some people.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[GitHub] codecov-io edited a comment on issue #3817: Feature/cluster operations

2018-08-28 Thread GitBox
codecov-io edited a comment on issue #3817: Feature/cluster operations
URL: 
https://github.com/apache/incubator-airflow/pull/3817#issuecomment-416701478
 
 
   # 
[Codecov](https://codecov.io/gh/apache/incubator-airflow/pull/3817?src=pr=h1)
 Report
   > Merging 
[#3817](https://codecov.io/gh/apache/incubator-airflow/pull/3817?src=pr=desc)
 into 
[master](https://codecov.io/gh/apache/incubator-airflow/commit/56bae60c139036ab506af595bd44b31eb21967df?src=pr=desc)
 will **not change** coverage.
   > The diff coverage is `n/a`.
   
   [![Impacted file tree 
graph](https://codecov.io/gh/apache/incubator-airflow/pull/3817/graphs/tree.svg?width=650=WdLKlKHOAU=150=pr)](https://codecov.io/gh/apache/incubator-airflow/pull/3817?src=pr=tree)
   
   ```diff
   @@            Coverage Diff            @@
   ##           master    #3817   +/-   ##
   =========================================
    Coverage   77.41%   77.41%
   =========================================
    Files         203      203
    Lines       15817    15817
   =========================================
    Hits        12244    12244
    Misses       3573     3573
   ```
   
   
   
   --
   
   [Continue to review full report at 
Codecov](https://codecov.io/gh/apache/incubator-airflow/pull/3817?src=pr=continue).
   > **Legend** - [Click here to learn 
more](https://docs.codecov.io/docs/codecov-delta)
   > `Δ = absolute  (impact)`, `ø = not affected`, `? = missing data`
   > Powered by 
[Codecov](https://codecov.io/gh/apache/incubator-airflow/pull/3817?src=pr=footer).
 Last update 
[56bae60...fb2ea03](https://codecov.io/gh/apache/incubator-airflow/pull/3817?src=pr=lastupdated).
 Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments).
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] codecov-io edited a comment on issue #3817: Feature/cluster operations

2018-08-28 Thread GitBox
codecov-io edited a comment on issue #3817: Feature/cluster operations
URL: 
https://github.com/apache/incubator-airflow/pull/3817#issuecomment-416701478
 
 
   # 
[Codecov](https://codecov.io/gh/apache/incubator-airflow/pull/3817?src=pr=h1)
 Report
   > Merging 
[#3817](https://codecov.io/gh/apache/incubator-airflow/pull/3817?src=pr=desc)
 into 
[master](https://codecov.io/gh/apache/incubator-airflow/commit/56bae60c139036ab506af595bd44b31eb21967df?src=pr=desc)
 will **not change** coverage.
   > The diff coverage is `n/a`.
   
   [![Impacted file tree 
graph](https://codecov.io/gh/apache/incubator-airflow/pull/3817/graphs/tree.svg?width=650=WdLKlKHOAU=150=pr)](https://codecov.io/gh/apache/incubator-airflow/pull/3817?src=pr=tree)
   
   ```diff
   @@            Coverage Diff            @@
   ##           master    #3817   +/-   ##
   =========================================
    Coverage   77.41%   77.41%
   =========================================
    Files         203      203
    Lines       15817    15817
   =========================================
    Hits        12244    12244
    Misses       3573     3573
   ```
   
   
   
   --
   
   [Continue to review full report at 
Codecov](https://codecov.io/gh/apache/incubator-airflow/pull/3817?src=pr=continue).
   > **Legend** - [Click here to learn 
more](https://docs.codecov.io/docs/codecov-delta)
   > `Δ = absolute  (impact)`, `ø = not affected`, `? = missing data`
   > Powered by 
[Codecov](https://codecov.io/gh/apache/incubator-airflow/pull/3817?src=pr=footer).
 Last update 
[56bae60...fb2ea03](https://codecov.io/gh/apache/incubator-airflow/pull/3817?src=pr=lastupdated).
 Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments).
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] codecov-io edited a comment on issue #3817: Feature/cluster operations

2018-08-28 Thread GitBox
codecov-io edited a comment on issue #3817: Feature/cluster operations
URL: 
https://github.com/apache/incubator-airflow/pull/3817#issuecomment-416701478
 
 
   # 
[Codecov](https://codecov.io/gh/apache/incubator-airflow/pull/3817?src=pr=h1)
 Report
   > Merging 
[#3817](https://codecov.io/gh/apache/incubator-airflow/pull/3817?src=pr=desc)
 into 
[master](https://codecov.io/gh/apache/incubator-airflow/commit/56bae60c139036ab506af595bd44b31eb21967df?src=pr=desc)
 will **not change** coverage.
   > The diff coverage is `n/a`.
   
   [![Impacted file tree 
graph](https://codecov.io/gh/apache/incubator-airflow/pull/3817/graphs/tree.svg?width=650=WdLKlKHOAU=150=pr)](https://codecov.io/gh/apache/incubator-airflow/pull/3817?src=pr=tree)
   
   ```diff
   @@            Coverage Diff            @@
   ##           master    #3817   +/-   ##
   =========================================
    Coverage   77.41%   77.41%
   =========================================
    Files         203      203
    Lines       15817    15817
   =========================================
    Hits        12244    12244
    Misses       3573     3573
   ```
   
   
   
   --
   
   [Continue to review full report at 
Codecov](https://codecov.io/gh/apache/incubator-airflow/pull/3817?src=pr=continue).
   > **Legend** - [Click here to learn 
more](https://docs.codecov.io/docs/codecov-delta)
   > `Δ = absolute  (impact)`, `ø = not affected`, `? = missing data`
   > Powered by 
[Codecov](https://codecov.io/gh/apache/incubator-airflow/pull/3817?src=pr=footer).
 Last update 
[56bae60...fb2ea03](https://codecov.io/gh/apache/incubator-airflow/pull/3817?src=pr=lastupdated).
 Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments).
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] codecov-io edited a comment on issue #3817: Feature/cluster operations

2018-08-28 Thread GitBox
codecov-io edited a comment on issue #3817: Feature/cluster operations
URL: 
https://github.com/apache/incubator-airflow/pull/3817#issuecomment-416701478
 
 
   # 
[Codecov](https://codecov.io/gh/apache/incubator-airflow/pull/3817?src=pr=h1)
 Report
   > Merging 
[#3817](https://codecov.io/gh/apache/incubator-airflow/pull/3817?src=pr=desc)
 into 
[master](https://codecov.io/gh/apache/incubator-airflow/commit/56bae60c139036ab506af595bd44b31eb21967df?src=pr=desc)
 will **not change** coverage.
   > The diff coverage is `n/a`.
   
   [![Impacted file tree 
graph](https://codecov.io/gh/apache/incubator-airflow/pull/3817/graphs/tree.svg?width=650=WdLKlKHOAU=150=pr)](https://codecov.io/gh/apache/incubator-airflow/pull/3817?src=pr=tree)
   
   ```diff
   @@            Coverage Diff            @@
   ##           master    #3817   +/-   ##
   =========================================
    Coverage   77.41%   77.41%
   =========================================
    Files         203      203
    Lines       15817    15817
   =========================================
    Hits        12244    12244
    Misses       3573     3573
   ```
   
   
   
   --
   
   [Continue to review full report at 
Codecov](https://codecov.io/gh/apache/incubator-airflow/pull/3817?src=pr=continue).
   > **Legend** - [Click here to learn 
more](https://docs.codecov.io/docs/codecov-delta)
   > `Δ = absolute  (impact)`, `ø = not affected`, `? = missing data`
   > Powered by 
[Codecov](https://codecov.io/gh/apache/incubator-airflow/pull/3817?src=pr=footer).
 Last update 
[56bae60...fb2ea03](https://codecov.io/gh/apache/incubator-airflow/pull/3817?src=pr=lastupdated).
 Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments).
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] codecov-io commented on issue #3817: Feature/cluster operations

2018-08-28 Thread GitBox
codecov-io commented on issue #3817: Feature/cluster operations
URL: 
https://github.com/apache/incubator-airflow/pull/3817#issuecomment-416701478
 
 
   # 
[Codecov](https://codecov.io/gh/apache/incubator-airflow/pull/3817?src=pr=h1)
 Report
   > Merging 
[#3817](https://codecov.io/gh/apache/incubator-airflow/pull/3817?src=pr=desc)
 into 
[master](https://codecov.io/gh/apache/incubator-airflow/commit/56bae60c139036ab506af595bd44b31eb21967df?src=pr=desc)
 will **not change** coverage.
   > The diff coverage is `n/a`.
   
   [![Impacted file tree 
graph](https://codecov.io/gh/apache/incubator-airflow/pull/3817/graphs/tree.svg?width=650=WdLKlKHOAU=150=pr)](https://codecov.io/gh/apache/incubator-airflow/pull/3817?src=pr=tree)
   
   ```diff
   @@            Coverage Diff            @@
   ##           master    #3817   +/-   ##
   =========================================
    Coverage   77.41%   77.41%
   =========================================
    Files         203      203
    Lines       15817    15817
   =========================================
    Hits        12244    12244
    Misses       3573     3573
   ```
   
   
   
   --
   
   [Continue to review full report at 
Codecov](https://codecov.io/gh/apache/incubator-airflow/pull/3817?src=pr=continue).
   > **Legend** - [Click here to learn 
more](https://docs.codecov.io/docs/codecov-delta)
   > `Δ = absolute  (impact)`, `ø = not affected`, `? = missing data`
   > Powered by 
[Codecov](https://codecov.io/gh/apache/incubator-airflow/pull/3817?src=pr=footer).
 Last update 
[56bae60...fb2ea03](https://codecov.io/gh/apache/incubator-airflow/pull/3817?src=pr=lastupdated).
 Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments).
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] codecov-io edited a comment on issue #3817: Feature/cluster operations

2018-08-28 Thread GitBox
codecov-io edited a comment on issue #3817: Feature/cluster operations
URL: 
https://github.com/apache/incubator-airflow/pull/3817#issuecomment-416701478
 
 
   # 
[Codecov](https://codecov.io/gh/apache/incubator-airflow/pull/3817?src=pr=h1)
 Report
   > Merging 
[#3817](https://codecov.io/gh/apache/incubator-airflow/pull/3817?src=pr=desc)
 into 
[master](https://codecov.io/gh/apache/incubator-airflow/commit/56bae60c139036ab506af595bd44b31eb21967df?src=pr=desc)
 will **not change** coverage.
   > The diff coverage is `n/a`.
   
   [![Impacted file tree 
graph](https://codecov.io/gh/apache/incubator-airflow/pull/3817/graphs/tree.svg?width=650=WdLKlKHOAU=150=pr)](https://codecov.io/gh/apache/incubator-airflow/pull/3817?src=pr=tree)
   
   ```diff
   @@            Coverage Diff            @@
   ##           master    #3817   +/-   ##
   =========================================
    Coverage   77.41%   77.41%
   =========================================
    Files         203      203
    Lines       15817    15817
   =========================================
    Hits        12244    12244
    Misses       3573     3573
   ```
   
   
   
   --
   
   [Continue to review full report at 
Codecov](https://codecov.io/gh/apache/incubator-airflow/pull/3817?src=pr=continue).
   > **Legend** - [Click here to learn 
more](https://docs.codecov.io/docs/codecov-delta)
   > `Δ = absolute  (impact)`, `ø = not affected`, `? = missing data`
   > Powered by 
[Codecov](https://codecov.io/gh/apache/incubator-airflow/pull/3817?src=pr=footer).
 Last update 
[56bae60...fb2ea03](https://codecov.io/gh/apache/incubator-airflow/pull/3817?src=pr=lastupdated).
 Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments).
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] kaxil commented on a change in pull request #3393: [AIRFLOW-2499] Dockerised CI pipeline

2018-08-28 Thread GitBox
kaxil commented on a change in pull request #3393: [AIRFLOW-2499] Dockerised CI 
pipeline
URL: https://github.com/apache/incubator-airflow/pull/3393#discussion_r213433536
 
 

 ##
 File path: .travis.yml
 ##
 @@ -19,94 +19,40 @@
 sudo: true
 dist: trusty
 language: python
-jdk:
-  - oraclejdk8
-services:
-  - cassandra
-  - mongodb
-  - mysql
-  - postgresql
-  - rabbitmq
-addons:
-  apt:
-    packages:
-      - slapd
-      - ldap-utils
-      - openssh-server
-      - mysql-server-5.6
-      - mysql-client-core-5.6
-      - mysql-client-5.6
-      - krb5-user
-      - krb5-kdc
-      - krb5-admin-server
-      - oracle-java8-installer
-  postgresql: "9.2"
-python:
-  - "2.7"
-  - "3.5"
 env:
   global:
+    - DOCKER_COMPOSE_VERSION=1.20.0
     - SLUGIFY_USES_TEXT_UNIDECODE=yes
     - TRAVIS_CACHE=$HOME/.travis_cache/
-    - KRB5_CONFIG=/etc/krb5.conf
-    - KRB5_KTNAME=/etc/airflow.keytab
-    # Travis on google cloud engine has a global /etc/boto.cfg that
-    # does not work with python 3
-    - BOTO_CONFIG=/tmp/bogusvalue
   matrix:
+    - TOX_ENV=flake8
     - TOX_ENV=py27-backend_mysql
     - TOX_ENV=py27-backend_sqlite
     - TOX_ENV=py27-backend_postgres
-    - TOX_ENV=py35-backend_mysql
-    - TOX_ENV=py35-backend_sqlite
-    - TOX_ENV=py35-backend_postgres
-    - TOX_ENV=flake8
+    - TOX_ENV=py35-backend_mysql PYTHON_VERSION=3
 
 Review comment:
   Also, we need to keep in mind that in the future we will need to test both Python 3.6 and Python 3.7.


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] kaxil commented on a change in pull request #3393: [AIRFLOW-2499] Dockerised CI pipeline

2018-08-28 Thread GitBox
kaxil commented on a change in pull request #3393: [AIRFLOW-2499] Dockerised CI 
pipeline
URL: https://github.com/apache/incubator-airflow/pull/3393#discussion_r213433211
 
 

 ##
 File path: .travis.yml
 ##
 @@ -19,94 +19,40 @@
 sudo: true
 dist: trusty
 language: python
-jdk:
-  - oraclejdk8
-services:
-  - cassandra
-  - mongodb
-  - mysql
-  - postgresql
-  - rabbitmq
-addons:
-  apt:
-    packages:
-      - slapd
-      - ldap-utils
-      - openssh-server
-      - mysql-server-5.6
-      - mysql-client-core-5.6
-      - mysql-client-5.6
-      - krb5-user
-      - krb5-kdc
-      - krb5-admin-server
-      - oracle-java8-installer
-  postgresql: "9.2"
-python:
-  - "2.7"
-  - "3.5"
 env:
   global:
+    - DOCKER_COMPOSE_VERSION=1.20.0
     - SLUGIFY_USES_TEXT_UNIDECODE=yes
     - TRAVIS_CACHE=$HOME/.travis_cache/
-    - KRB5_CONFIG=/etc/krb5.conf
-    - KRB5_KTNAME=/etc/airflow.keytab
-    # Travis on google cloud engine has a global /etc/boto.cfg that
-    # does not work with python 3
-    - BOTO_CONFIG=/tmp/bogusvalue
   matrix:
+    - TOX_ENV=flake8
     - TOX_ENV=py27-backend_mysql
     - TOX_ENV=py27-backend_sqlite
     - TOX_ENV=py27-backend_postgres
-    - TOX_ENV=py35-backend_mysql
-    - TOX_ENV=py35-backend_sqlite
-    - TOX_ENV=py35-backend_postgres
-    - TOX_ENV=flake8
+    - TOX_ENV=py35-backend_mysql PYTHON_VERSION=3
 
 Review comment:
   @gerardo I am currently updating the Airflow CI to point to Python 3.6 instead of Python 3.5.
   
   But given the way the Python version is passed using this environment variable, can you help me figure out how this can be achieved?


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] andrewmchen commented on a change in pull request #3813: [AIRFLOW-1998] Implemented DatabricksRunNowOperator for jobs/run-now …

2018-08-28 Thread GitBox
andrewmchen commented on a change in pull request #3813: [AIRFLOW-1998] 
Implemented DatabricksRunNowOperator for jobs/run-now …
URL: https://github.com/apache/incubator-airflow/pull/3813#discussion_r213427570
 
 

 ##
 File path: airflow/contrib/hooks/databricks_hook.py
 ##
 @@ -143,6 +144,18 @@ def _do_api_call(self, endpoint_info, json):
         raise AirflowException(('API requests to Databricks failed {} times. ' +
                                 'Giving up.').format(self.retry_limit))
 
+    def run_now(self, json):
+        """
+        Utility function to call the ``api/2.0/jobs/run-now`` endpoint.
+
+        :param json: The data used in the body of the request to the ``submit`` endpoint.
 
 Review comment:
   nit: `submit` -> `run_now`.


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] andrewmchen commented on a change in pull request #3813: [AIRFLOW-1998] Implemented DatabricksRunNowOperator for jobs/run-now …

2018-08-28 Thread GitBox
andrewmchen commented on a change in pull request #3813: [AIRFLOW-1998] 
Implemented DatabricksRunNowOperator for jobs/run-now …
URL: https://github.com/apache/incubator-airflow/pull/3813#discussion_r213427570
 
 

 ##
 File path: airflow/contrib/hooks/databricks_hook.py
 ##
 @@ -143,6 +144,18 @@ def _do_api_call(self, endpoint_info, json):
         raise AirflowException(('API requests to Databricks failed {} times. ' +
                                 'Giving up.').format(self.retry_limit))
 
+    def run_now(self, json):
+        """
+        Utility function to call the ``api/2.0/jobs/run-now`` endpoint.
+
+        :param json: The data used in the body of the request to the ``submit`` endpoint.
 
 Review comment:
   nit: `submit` -> `run_now`.


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[jira] [Updated] (AIRFLOW-2974) Databricks Cluster Operations

2018-08-28 Thread Wayne Morris (JIRA)


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-2974?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Wayne Morris updated AIRFLOW-2974:
--
Labels: features  (was: )

> Databricks Cluster Operations
> -
>
> Key: AIRFLOW-2974
> URL: https://issues.apache.org/jira/browse/AIRFLOW-2974
> Project: Apache Airflow
>  Issue Type: New Feature
>  Components: contrib, hooks
>Affects Versions: 1.9.0
>Reporter: Wayne Morris
>Assignee: Wayne Morris
>Priority: Major
>  Labels: features
> Fix For: 1.9.0
>
>
> This extends the current Databricks hook by adding the functionality of 
> starting, restarting, or terminating clusters in Databricks.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Created] (AIRFLOW-2974) Databricks Cluster Operations

2018-08-28 Thread Wayne Morris (JIRA)
Wayne Morris created AIRFLOW-2974:
-

 Summary: Databricks Cluster Operations
 Key: AIRFLOW-2974
 URL: https://issues.apache.org/jira/browse/AIRFLOW-2974
 Project: Apache Airflow
  Issue Type: New Feature
  Components: contrib, hooks
Affects Versions: 1.9.0
Reporter: Wayne Morris
Assignee: Wayne Morris
 Fix For: 1.9.0


This extends the current Databricks hook by adding the functionality of starting, restarting, or terminating clusters in Databricks.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Assigned] (AIRFLOW-2973) Use Python 3.6.x everywhere possible

2018-08-28 Thread Taylor Edmiston (JIRA)


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-2973?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Taylor Edmiston reassigned AIRFLOW-2973:


Assignee: Kaxil Naik  (was: Taylor Edmiston)

Reassigning this to [~kaxilnaik] since we have merged my PR into his.

> Use Python 3.6.x everywhere possible
> 
>
> Key: AIRFLOW-2973
> URL: https://issues.apache.org/jira/browse/AIRFLOW-2973
> Project: Apache Airflow
>  Issue Type: Improvement
>  Components: ci, docs, tests
>Reporter: Taylor Edmiston
>Assignee: Kaxil Naik
>Priority: Minor
>
> - Add Python 3.6 support to PyPI, tox, Travis CI
> - Update dev docs to recommend 3.6.x
> - Update MLEngine operator tests to use 3.6
> - Update Kubernetes ExtractXcomPodRequestFactory pod spec to use 3.6



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[GitHub] kaxil edited a comment on issue #3816: [AIRFLOW-2973] Use Python 3.6.x everywhere possible

2018-08-28 Thread GitBox
kaxil edited a comment on issue #3816: [AIRFLOW-2973] Use Python 3.6.x 
everywhere possible
URL: 
https://github.com/apache/incubator-airflow/pull/3816#issuecomment-416648447
 
 
   I have merged these changes into my PR, @tedmiston. Thanks.
   
   It would be great if you could test Python 3.7; as Bolke mentioned, we can then replace 3.6 with 3.7 :)


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] kaxil commented on issue #3815: [AIRFLOW-2973] Add Python 3.6 to Supported Prog Langs

2018-08-28 Thread GitBox
kaxil commented on issue #3815: [AIRFLOW-2973] Add Python 3.6 to Supported Prog 
Langs
URL: 
https://github.com/apache/incubator-airflow/pull/3815#issuecomment-416649492
 
 
   @tedmiston I have merged the changes into this PR. As a follow-up, it would be great if you could test Python 3.7; as Bolke mentioned, we can then replace 3.6 with 3.7 :)


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] ashb commented on issue #3817: Feature/cluster operations

2018-08-28 Thread GitBox
ashb commented on issue #3817: Feature/cluster operations
URL: 
https://github.com/apache/incubator-airflow/pull/3817#issuecomment-416648773
 
 
   I'm not familiar with DataBricks but otherwise your changes look sensible. 
Someone with more familiarity with it will need to review it once you've fixed 
up your commits etc.


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] kaxil commented on issue #3816: [AIRFLOW-2973] Use Python 3.6.x everywhere possible

2018-08-28 Thread GitBox
kaxil commented on issue #3816: [AIRFLOW-2973] Use Python 3.6.x everywhere 
possible
URL: 
https://github.com/apache/incubator-airflow/pull/3816#issuecomment-416648447
 
 
   I have merged these changes into my PR, @tedmiston. Thanks.


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] ashb commented on issue #3817: Feature/cluster operations

2018-08-28 Thread GitBox
ashb commented on issue #3817: Feature/cluster operations
URL: 
https://github.com/apache/incubator-airflow/pull/3817#issuecomment-416648296
 
 
   Please follow the contributing guide, and avoid whitespace-only changes to 
lines.


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] wmorris75 opened a new pull request #3817: Feature/cluster operations

2018-08-28 Thread GitBox
wmorris75 opened a new pull request #3817: Feature/cluster operations
URL: https://github.com/apache/incubator-airflow/pull/3817
 
 
   Make sure you have checked _all_ steps below.
   
   ### Jira
   
   - [ ] My PR addresses the following [Airflow 
Jira](https://issues.apache.org/jira/browse/AIRFLOW/) issues and references 
them in the PR title. For example, "\[AIRFLOW-XXX\] My Airflow PR"
 - https://issues.apache.org/jira/browse/AIRFLOW-XXX
 - In case you are fixing a typo in the documentation you can prepend your 
commit with \[AIRFLOW-XXX\], code changes always need a Jira issue.
   
   ### Description
   
   - [ ] Here are some details about my PR, including screenshots of any UI 
changes:
   
   ### Tests
   
   - [ ] My PR adds the following unit tests __OR__ does not need testing for 
this extremely good reason:
   
   ### Commits
   
   - [ ] My commits all reference Jira issues in their subject lines, and I 
have squashed multiple commits if they address the same issue. In addition, my 
commits follow the guidelines from "[How to write a good git commit 
message](http://chris.beams.io/posts/git-commit/)":
 1. Subject is separated from body by a blank line
 1. Subject is limited to 50 characters (not including Jira issue reference)
 1. Subject does not end with a period
 1. Subject uses the imperative mood ("add", not "adding")
 1. Body wraps at 72 characters
 1. Body explains "what" and "why", not "how"
   
   ### Documentation
   
   - [ ] In case of new functionality, my PR adds documentation that describes 
how to use it.
 - When adding new operators/hooks/sensors, the autoclass documentation 
generation needs to be added.
   
   ### Code Quality
   
   - [ ] Passes `git diff upstream/master -u -- "*.py" | flake8 --diff`
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] tedmiston commented on issue #3815: [AIRFLOW-XXX] Add Python 3.6 to Supported Prog Langs

2018-08-28 Thread GitBox
tedmiston commented on issue #3815: [AIRFLOW-XXX] Add Python 3.6 to Supported 
Prog Langs
URL: 
https://github.com/apache/incubator-airflow/pull/3815#issuecomment-416642411
 
 
   @kaxil @Fokko I also put together a quick complementary PR 
https://github.com/apache/incubator-airflow/pull/3816 adding / upgrading to 3.6 
in all of the places.  CI is still running, so I may see some test failures, 
but hopefully nothing major.  I would be happy to merge this into Kaxil's 
current PR or as a follow-on PR.
   
   Personally I would prefer Travis to run against 3.6 over 3.5.  It's been out 
for a while now, and AFAIK we don't have a compelling reason to prefer 3.5 
today.
   
   
   
   
   That said, it would be nice if we could have some sort of nightly-ish build 
that runs on all supported minor versions for a very thorough test.
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] tedmiston opened a new pull request #3816: [AIRFLOW-2973] Use Python 3.6.x everywhere possible

2018-08-28 Thread GitBox
tedmiston opened a new pull request #3816: [AIRFLOW-2973] Use Python 3.6.x 
everywhere possible
URL: https://github.com/apache/incubator-airflow/pull/3816
 
 
   Make sure you have checked _all_ steps below.
   
   ### Jira
   
   - [x] My PR addresses the following [Airflow 
Jira](https://issues.apache.org/jira/browse/AIRFLOW/) issues and references 
them in the PR title. For example, "\[AIRFLOW-XXX\] My Airflow PR"
 - https://issues.apache.org/jira/browse/AIRFLOW-2973
 - In case you are fixing a typo in the documentation you can prepend your 
commit with \[AIRFLOW-XXX\], code changes always need a Jira issue.
   
   ### Description
   
   - [x] Here are some details about my PR, including screenshots of any UI 
changes:
   
   This PR is similar in nature / complementary to @kaxil's #3815.
   
   - Add Python 3.6 support to PyPI, tox, Travis CI
   - Update dev docs to recommend 3.6.x
   - Update MLEngine operator tests to use 3.6
   - Update Kubernetes ExtractXcomPodRequestFactory pod spec to use 3.6
   
   ### Tests
   
   - [x] My PR adds the following unit tests __OR__ does not need testing for 
this extremely good reason:
   
   Already covered by existing tests.  I'm currently waiting on the CI to see 
if this breaks any existing unit tests.
   
   ### Commits
   
   - [x] My commits all reference Jira issues in their subject lines, and I 
have squashed multiple commits if they address the same issue. In addition, my 
commits follow the guidelines from "[How to write a good git commit 
message](http://chris.beams.io/posts/git-commit/)":
 1. Subject is separated from body by a blank line
 1. Subject is limited to 50 characters (not including Jira issue reference)
 1. Subject does not end with a period
 1. Subject uses the imperative mood ("add", not "adding")
 1. Body wraps at 72 characters
 1. Body explains "what" and "why", not "how"
   
   ### Documentation
   
   - [x] In case of new functionality, my PR adds documentation that describes 
how to use it.
 - When adding new operators/hooks/sensors, the autoclass documentation 
generation needs to be added.
   
   ### Code Quality
   
   - [x] Passes `git diff upstream/master -u -- "*.py" | flake8 --diff`
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[jira] [Created] (AIRFLOW-2973) Use Python 3.6.x everywhere possible

2018-08-28 Thread Taylor Edmiston (JIRA)
Taylor Edmiston created AIRFLOW-2973:


 Summary: Use Python 3.6.x everywhere possible
 Key: AIRFLOW-2973
 URL: https://issues.apache.org/jira/browse/AIRFLOW-2973
 Project: Apache Airflow
  Issue Type: Improvement
  Components: ci, docs, tests
Reporter: Taylor Edmiston
Assignee: Taylor Edmiston


- Add Python 3.6 support to PyPI, tox, Travis CI
- Update dev docs to recommend 3.6.x
- Update MLEngine operator tests to use 3.6
- Update Kubernetes ExtractXcomPodRequestFactory pod spec to use 3.6
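
As a rough illustration of the PyPI part of the first item, the change usually amounts to an extra trove classifier in setup.py; the name and surrounding entries below are assumptions rather than an excerpt from the Airflow setup.py:

{code:python}
# Illustrative excerpt; the real setup() call has many more arguments.
from setuptools import setup

setup(
    name='apache-airflow',
    classifiers=[
        'Programming Language :: Python :: 2.7',
        'Programming Language :: Python :: 3.5',
        'Programming Language :: Python :: 3.6',  # new entry for this issue
    ],
)
{code}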



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[GitHub] bolkedebruin commented on issue #3815: [AIRFLOW-XXX] Add Python 3.6 to Supported Prog Langs

2018-08-28 Thread GitBox
bolkedebruin commented on issue #3815: [AIRFLOW-XXX] Add Python 3.6 to 
Supported Prog Langs
URL: 
https://github.com/apache/incubator-airflow/pull/3815#issuecomment-416635800
 
 
   2.7 and 3.6 (and maybe 3.7 instead)


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[jira] [Commented] (AIRFLOW-2972) Can't see the dag logs in the browser

2018-08-28 Thread Dedi (JIRA)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-2972?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16595120#comment-16595120
 ] 

Dedi commented on AIRFLOW-2972:
---

I also tried upgrading Airflow from 1.9 to 1.10 and the logs are still empty,

even old logs that I was previously able to read.

> Can't see the dag logs in the browser
> -
>
> Key: AIRFLOW-2972
> URL: https://issues.apache.org/jira/browse/AIRFLOW-2972
> Project: Apache Airflow
>  Issue Type: Bug
>  Components: webserver
>Affects Versions: 1.10
>Reporter: Dedi
>Priority: Major
> Attachments: empty log.jpg
>
>
>  
>  # I installed a clean Airflow 1.10
>  # Triggered the tutorial DAG
>  # Clicked on the DAG logs
> [http://192.168.56.13:8080/admin/airflow/log?task_id=print_date_id=tutorial_date=2018-08-28T14%3A54%3A34.326568%2B00%3A00]
> Expected:
> to see the print date
>  
> Actual:
> Empty, see attached image



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Created] (AIRFLOW-2972) Can't see the dag logs in the browser

2018-08-28 Thread Dedi (JIRA)
Dedi created AIRFLOW-2972:
-

 Summary: Can't see the dag logs in the browser
 Key: AIRFLOW-2972
 URL: https://issues.apache.org/jira/browse/AIRFLOW-2972
 Project: Apache Airflow
  Issue Type: Bug
  Components: webserver
Affects Versions: 1.10
Reporter: Dedi
 Attachments: empty log.jpg

 
 # I installed a clean Airflow 1.10
 # Triggered the tutorial DAG
 # Clicked on the DAG logs

[http://192.168.56.13:8080/admin/airflow/log?task_id=print_date_id=tutorial_date=2018-08-28T14%3A54%3A34.326568%2B00%3A00]

Expected:

to see the print date

 

Actual:

Empty, see attached image



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[GitHub] kaxil commented on issue #3815: [AIRFLOW-XXX] Add Python 3.6 to Supported Prog Langs

2018-08-28 Thread GitBox
kaxil commented on issue #3815: [AIRFLOW-XXX] Add Python 3.6 to Supported Prog 
Langs
URL: 
https://github.com/apache/incubator-airflow/pull/3815#issuecomment-416621083
 
 
   @Fokko Agreed. Would you like to keep it as it is, or change py3.5 to py3.6 
so we keep py2.7 and py3.6?


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] Fokko commented on issue #3815: [AIRFLOW-XXX] Add Python 3.6 to Supported Prog Langs

2018-08-28 Thread GitBox
Fokko commented on issue #3815: [AIRFLOW-XXX] Add Python 3.6 to Supported Prog 
Langs
URL: 
https://github.com/apache/incubator-airflow/pull/3815#issuecomment-416615867
 
 
   This would put too much load on Travis; Apache Infra is paying for the CPU hours.


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] kaxil commented on issue #3815: [AIRFLOW-XXX] Add Python 3.6 to Supported Prog Langs

2018-08-28 Thread GitBox
kaxil commented on issue #3815: [AIRFLOW-XXX] Add Python 3.6 to Supported Prog 
Langs
URL: 
https://github.com/apache/incubator-airflow/pull/3815#issuecomment-416612949
 
 
   @bolkedebruin @ashb Do you suggest adding both Python 3.4 and Python 3.6 to 
Travis? Currently we just have py27 and py35 on Travis.


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] bolkedebruin commented on a change in pull request #3803: [AIRFLOW-2779] Restore Copyright notice of GHE auth backend

2018-08-28 Thread GitBox
bolkedebruin commented on a change in pull request #3803: [AIRFLOW-2779] 
Restore Copyright notice of GHE auth backend
URL: https://github.com/apache/incubator-airflow/pull/3803#discussion_r213343026
 
 

 ##
 File path: airflow/contrib/auth/backends/github_enterprise_auth.py
 ##
 @@ -1,5 +1,7 @@
 # -*- coding: utf-8 -*-
 #
 
 Review comment:
   Yep :-)


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] ashb commented on a change in pull request #3803: [AIRFLOW-2779] Restore Copyright notice of GHE auth backend

2018-08-28 Thread GitBox
ashb commented on a change in pull request #3803: [AIRFLOW-2779] Restore 
Copyright notice of GHE auth backend
URL: https://github.com/apache/incubator-airflow/pull/3803#discussion_r213334804
 
 

 ##
 File path: airflow/contrib/auth/backends/github_enterprise_auth.py
 ##
 @@ -1,5 +1,7 @@
 # -*- coding: utf-8 -*-
 #
 
 Review comment:
   Oh right, yes: so the shorter license here, and the copyright notice in the NOTICES file:
   
   ```
   # See the NOTICE file distributed with this work for additional information
   # regarding copyright ownership.
   # 
   # Licensed under the Apache License, Version 2.0 (the "License");
   # you may not use this file except in compliance with the License.
   # You may obtain a copy of the License at
   # 
   # http://www.apache.org/licenses/LICENSE-2.0
   # 
   # Unless required by applicable law or agreed to in writing, software
   # distributed under the License is distributed on an "AS IS" BASIS,
   # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
   # See the License for the specific language governing permissions and
   # limitations under the License.
   ```
   
   (This is the short license, plus a note to see NOTICES file)


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] codecov-io edited a comment on issue #3815: [AIRFLOW-XXX] Add Python 3.6 to Supported Prog Langs

2018-08-28 Thread GitBox
codecov-io edited a comment on issue #3815: [AIRFLOW-XXX] Add Python 3.6 to 
Supported Prog Langs
URL: 
https://github.com/apache/incubator-airflow/pull/3815#issuecomment-416603006
 
 
   # 
[Codecov](https://codecov.io/gh/apache/incubator-airflow/pull/3815?src=pr=h1)
 Report
   > Merging 
[#3815](https://codecov.io/gh/apache/incubator-airflow/pull/3815?src=pr=desc)
 into 
[master](https://codecov.io/gh/apache/incubator-airflow/commit/56bae60c139036ab506af595bd44b31eb21967df?src=pr=desc)
 will **not change** coverage.
   > The diff coverage is `n/a`.
   
   [![Impacted file tree 
graph](https://codecov.io/gh/apache/incubator-airflow/pull/3815/graphs/tree.svg?width=650=WdLKlKHOAU=150=pr)](https://codecov.io/gh/apache/incubator-airflow/pull/3815?src=pr=tree)
   
   ```diff
   @@           Coverage Diff            @@
   ##           master    #3815    +/-   ##
   ========================================
     Coverage   77.41%   77.41%           
   ========================================
     Files         203      203           
     Lines       15817    15817           
   ========================================
     Hits        12244    12244           
     Misses       3573     3573
   ```
   
   
   
   --
   
   [Continue to review full report at 
Codecov](https://codecov.io/gh/apache/incubator-airflow/pull/3815?src=pr=continue).
   > **Legend** - [Click here to learn 
more](https://docs.codecov.io/docs/codecov-delta)
   > `Δ = absolute  (impact)`, `ø = not affected`, `? = missing data`
   > Powered by 
[Codecov](https://codecov.io/gh/apache/incubator-airflow/pull/3815?src=pr=footer).
 Last update 
[56bae60...35a4b44](https://codecov.io/gh/apache/incubator-airflow/pull/3815?src=pr=lastupdated).
 Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments).
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] codecov-io edited a comment on issue #3815: [AIRFLOW-XXX] Add Python 3.6 to Supported Prog Langs

2018-08-28 Thread GitBox
codecov-io edited a comment on issue #3815: [AIRFLOW-XXX] Add Python 3.6 to 
Supported Prog Langs
URL: 
https://github.com/apache/incubator-airflow/pull/3815#issuecomment-416603006
 
 
   # 
[Codecov](https://codecov.io/gh/apache/incubator-airflow/pull/3815?src=pr=h1)
 Report
   > Merging 
[#3815](https://codecov.io/gh/apache/incubator-airflow/pull/3815?src=pr=desc)
 into 
[master](https://codecov.io/gh/apache/incubator-airflow/commit/56bae60c139036ab506af595bd44b31eb21967df?src=pr=desc)
 will **not change** coverage.
   > The diff coverage is `n/a`.
   
   [![Impacted file tree 
graph](https://codecov.io/gh/apache/incubator-airflow/pull/3815/graphs/tree.svg?width=650=WdLKlKHOAU=150=pr)](https://codecov.io/gh/apache/incubator-airflow/pull/3815?src=pr=tree)
   
   ```diff
   @@           Coverage Diff            @@
   ##           master    #3815    +/-   ##
   ========================================
     Coverage   77.41%   77.41%           
   ========================================
     Files         203      203           
     Lines       15817    15817           
   ========================================
     Hits        12244    12244           
     Misses       3573     3573
   ```
   
   
   
   --
   
   [Continue to review full report at 
Codecov](https://codecov.io/gh/apache/incubator-airflow/pull/3815?src=pr=continue).
   > **Legend** - [Click here to learn 
more](https://docs.codecov.io/docs/codecov-delta)
   > `Δ = absolute  (impact)`, `ø = not affected`, `? = missing data`
   > Powered by 
[Codecov](https://codecov.io/gh/apache/incubator-airflow/pull/3815?src=pr=footer).
 Last update 
[56bae60...35a4b44](https://codecov.io/gh/apache/incubator-airflow/pull/3815?src=pr=lastupdated).
 Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments).
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] codecov-io edited a comment on issue #3815: [AIRFLOW-XXX] Add Python 3.6 to Supported Prog Langs

2018-08-28 Thread GitBox
codecov-io edited a comment on issue #3815: [AIRFLOW-XXX] Add Python 3.6 to 
Supported Prog Langs
URL: 
https://github.com/apache/incubator-airflow/pull/3815#issuecomment-416603006
 
 
   # 
[Codecov](https://codecov.io/gh/apache/incubator-airflow/pull/3815?src=pr=h1)
 Report
   > Merging 
[#3815](https://codecov.io/gh/apache/incubator-airflow/pull/3815?src=pr=desc)
 into 
[master](https://codecov.io/gh/apache/incubator-airflow/commit/56bae60c139036ab506af595bd44b31eb21967df?src=pr=desc)
 will **not change** coverage.
   > The diff coverage is `n/a`.
   
   [![Impacted file tree 
graph](https://codecov.io/gh/apache/incubator-airflow/pull/3815/graphs/tree.svg?width=650=WdLKlKHOAU=150=pr)](https://codecov.io/gh/apache/incubator-airflow/pull/3815?src=pr=tree)
   
   ```diff
   @@           Coverage Diff            @@
   ##           master    #3815    +/-   ##
   ========================================
     Coverage   77.41%   77.41%           
   ========================================
     Files         203      203           
     Lines       15817    15817           
   ========================================
     Hits        12244    12244           
     Misses       3573     3573
   ```
   
   
   
   --
   
   [Continue to review full report at 
Codecov](https://codecov.io/gh/apache/incubator-airflow/pull/3815?src=pr=continue).
   > **Legend** - [Click here to learn 
more](https://docs.codecov.io/docs/codecov-delta)
   > `Δ = absolute  (impact)`, `ø = not affected`, `? = missing data`
   > Powered by 
[Codecov](https://codecov.io/gh/apache/incubator-airflow/pull/3815?src=pr=footer).
 Last update 
[56bae60...35a4b44](https://codecov.io/gh/apache/incubator-airflow/pull/3815?src=pr=lastupdated).
 Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments).
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] codecov-io commented on issue #3815: [AIRFLOW-XXX] Add Python 3.6 to Supported Prog Langs

2018-08-28 Thread GitBox
codecov-io commented on issue #3815: [AIRFLOW-XXX] Add Python 3.6 to Supported 
Prog Langs
URL: 
https://github.com/apache/incubator-airflow/pull/3815#issuecomment-416603006
 
 
   # 
[Codecov](https://codecov.io/gh/apache/incubator-airflow/pull/3815?src=pr=h1)
 Report
   > Merging 
[#3815](https://codecov.io/gh/apache/incubator-airflow/pull/3815?src=pr=desc)
 into 
[master](https://codecov.io/gh/apache/incubator-airflow/commit/56bae60c139036ab506af595bd44b31eb21967df?src=pr=desc)
 will **not change** coverage.
   > The diff coverage is `n/a`.
   
   [![Impacted file tree 
graph](https://codecov.io/gh/apache/incubator-airflow/pull/3815/graphs/tree.svg?width=650=WdLKlKHOAU=150=pr)](https://codecov.io/gh/apache/incubator-airflow/pull/3815?src=pr=tree)
   
   ```diff
   @@           Coverage Diff            @@
   ##           master    #3815    +/-   ##
   ========================================
     Coverage   77.41%   77.41%           
   ========================================
     Files         203      203           
     Lines       15817    15817           
   ========================================
     Hits        12244    12244           
     Misses       3573     3573
   ```
   
   
   
   --
   
   [Continue to review full report at 
Codecov](https://codecov.io/gh/apache/incubator-airflow/pull/3815?src=pr=continue).
   > **Legend** - [Click here to learn 
more](https://docs.codecov.io/docs/codecov-delta)
   > `Δ = absolute  (impact)`, `ø = not affected`, `? = missing data`
   > Powered by 
[Codecov](https://codecov.io/gh/apache/incubator-airflow/pull/3815?src=pr=footer).
 Last update 
[56bae60...35a4b44](https://codecov.io/gh/apache/incubator-airflow/pull/3815?src=pr=lastupdated).
 Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments).
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] ashb commented on a change in pull request #3803: [AIRFLOW-2779] Restore Copyright notice of GHE auth backend

2018-08-28 Thread GitBox
ashb commented on a change in pull request #3803: [AIRFLOW-2779] Restore 
Copyright notice of GHE auth backend
URL: https://github.com/apache/incubator-airflow/pull/3803#discussion_r213332980
 
 

 ##
 File path: LICENSE
 ##
 @@ -208,6 +208,7 @@ limitations under the License.
subcomponents is subject to the terms and conditions of the following
licenses.
 
+(ALv2 License) airflow.contrib.auth.backends.github_enterprise_auth
 
 Review comment:
   I wasn't sure on this one: since it's "in" the Airflow module, it seemed like 
it was a sub-component, not a third-party library/dependency.
   
   Will move.


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] bolkedebruin commented on issue #3815: [AIRFLOW-XXX] Add Python 3.6 to Supported Prog Langs

2018-08-28 Thread GitBox
bolkedebruin commented on issue #3815: [AIRFLOW-XXX] Add Python 3.6 to 
Supported Prog Langs
URL: 
https://github.com/apache/incubator-airflow/pull/3815#issuecomment-416601817
 
 
   Agree with @ashb 


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] bolkedebruin commented on issue #3803: [AIRFLOW-2779] Restore Copyright notice of GHE auth backend

2018-08-28 Thread GitBox
bolkedebruin commented on issue #3803: [AIRFLOW-2779] Restore Copyright notice 
of GHE auth backend
URL: 
https://github.com/apache/incubator-airflow/pull/3803#issuecomment-416601548
 
 
   Please note that this file (and the other auth mechanisms) will be removed 
in the very near future when we remove the old ‘www’.


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] bolkedebruin commented on a change in pull request #3803: [AIRFLOW-2779] Restore Copyright notice of GHE auth backend

2018-08-28 Thread GitBox
bolkedebruin commented on a change in pull request #3803: [AIRFLOW-2779] 
Restore Copyright notice of GHE auth backend
URL: https://github.com/apache/incubator-airflow/pull/3803#discussion_r213330614
 
 

 ##
 File path: airflow/contrib/auth/backends/github_enterprise_auth.py
 ##
 @@ -1,5 +1,7 @@
 # -*- coding: utf-8 -*-
 #
+# Copyright 2015 Matthew Pelland (m...@pelland.io)
 
 Review comment:
   I mean this line.


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] bolkedebruin commented on a change in pull request #3803: [AIRFLOW-2779] Restore Copyright notice of GHE auth backend

2018-08-28 Thread GitBox
bolkedebruin commented on a change in pull request #3803: [AIRFLOW-2779] 
Restore Copyright notice of GHE auth backend
URL: https://github.com/apache/incubator-airflow/pull/3803#discussion_r213330249
 
 

 ##
 File path: airflow/contrib/auth/backends/github_enterprise_auth.py
 ##
 @@ -1,5 +1,7 @@
 # -*- coding: utf-8 -*-
 #
 
 Review comment:
   It should be the ‘short’ license header, see the one in the original commit


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] bolkedebruin commented on a change in pull request #3803: [AIRFLOW-2779] Restore Copyright notice of GHE auth backend

2018-08-28 Thread GitBox
bolkedebruin commented on a change in pull request #3803: [AIRFLOW-2779] 
Restore Copyright notice of GHE auth backend
URL: https://github.com/apache/incubator-airflow/pull/3803#discussion_r213330014
 
 

 ##
 File path: airflow/contrib/auth/backends/github_enterprise_auth.py
 ##
 @@ -1,5 +1,7 @@
 # -*- coding: utf-8 -*-
 #
 
 Review comment:
   This line should not be here. The license does not require stating copyright


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] bolkedebruin commented on a change in pull request #3803: [AIRFLOW-2779] Restore Copyright notice of GHE auth backend

2018-08-28 Thread GitBox
bolkedebruin commented on a change in pull request #3803: [AIRFLOW-2779] 
Restore Copyright notice of GHE auth backend
URL: https://github.com/apache/incubator-airflow/pull/3803#discussion_r213329780
 
 

 ##
 File path: LICENSE
 ##
 @@ -208,6 +208,7 @@ limitations under the License.
subcomponents is subject to the terms and conditions of the following
licenses.
 
+(ALv2 License) airflow.contrib.auth.backends.github_enterprise_auth
 
 Review comment:
   It should be below with the other 3rd party licenses


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[jira] [Created] (AIRFLOW-2971) Health check command for scheduler

2018-08-28 Thread Jon Davies (JIRA)
Jon Davies created AIRFLOW-2971:
---

 Summary: Health check command for scheduler
 Key: AIRFLOW-2971
 URL: https://issues.apache.org/jira/browse/AIRFLOW-2971
 Project: Apache Airflow
  Issue Type: Bug
Reporter: Jon Davies


As part of a Kubernetes deployment of Airflow, I would like to define an exec 
command based health check for the Airflow scheduler:

- 
https://kubernetes.io/docs/tasks/configure-pod-container/configure-liveness-readiness-probes/

...the webserver is simple, as all that is needed is to check that the HTTP port 
is available. For the scheduler, it would be neat to have a command such as:

airflow scheduler health

that returns OK with exit code 0, or NOT OK with a non-zero exit code when it 
cannot reach the database, for instance.
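
A minimal sketch of what such a check could do, run as a standalone script against the metadata database; the jobs-table heartbeat query and the two-minute tolerance below are illustrative assumptions rather than a proposed implementation:

{code:python}
# scheduler_health_check.py - illustrative sketch, not part of Airflow.
# Exits 0 when a running scheduler has heartbeated recently, 1 otherwise,
# so it could back an exec-based Kubernetes liveness probe.
import sys
from datetime import datetime, timedelta

from airflow import settings
from airflow.jobs import SchedulerJob

session = settings.Session()
try:
    latest = (session.query(SchedulerJob)
              .filter(SchedulerJob.state == 'running')
              .order_by(SchedulerJob.latest_heartbeat.desc())
              .first())
finally:
    session.close()

tolerance = timedelta(minutes=2)  # assumed threshold
if latest is not None and latest.latest_heartbeat > datetime.utcnow() - tolerance:
    print('OK')
    sys.exit(0)

print('NOT OK')
sys.exit(1)
{code}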



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Updated] (AIRFLOW-2970) Kubernetes logging is broken

2018-08-28 Thread Jon Davies (JIRA)


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-2970?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Jon Davies updated AIRFLOW-2970:

Description: 
I'm using Airflow with the Kubernetes executor and pod operator. My DAGs are 
configured with get_log=True and set to log to stdout, and I can see all the 
logs in kubectl logs.

I can see that the scheduler logs things to: 
$AIRFLOW_HOME/logs/scheduler/2018-08-28/*

However, this just consists of:

{code:java}
[2018-08-28 13:03:27,695] {jobs.py:385} INFO - Started process (PID=16994) to 
work on /home/airflow/dags/dag.py
[2018-08-28 13:03:27,697] {jobs.py:1782} INFO - Processing file 
/home/airflow/dags/dag.py for tasks to queue
[2018-08-28 13:03:27,697] {logging_mixin.py:95} INFO - [2018-08-28 
13:03:27,697] {models.py:258} INFO - Filling up the DagBag from 
/home/airflow/dags/dag.py
{code}

If I quickly exec into the executor pod that the scheduler spins up, I can see 
that things are properly logged to:

{code:java}
/home/airflow/logs/dag$ tail -f 
dag-downloader/2018-08-28T13\:05\:07.704072+00\:00/1.log
[2018-08-28 13:05:24,399] {logging_mixin.py:95} INFO - [2018-08-28 
13:05:24,399] {pod_launcher.py:112} INFO - Event: dag-downloader-015ca48c had 
an event of type Pending
...
[2018-08-28 13:05:37,193] {logging_mixin.py:95} INFO - [2018-08-28 
13:05:37,193] {pod_launcher.py:95} INFO - 
b'INFO:botocore.vendored.requests.packages.urllib3.connectionpool:Starting new 
HTTPS connection (7): blah-blah.s3.eu-west-1.amazonaws.com\n'
...
...all other log lines from pod...
{code}

However, this executor pod only exists for the duration of the lifetime of the 
task pod so the logs are lost pretty much immediately after the task runs. 
There is nothing that ships the logs back to the scheduler and/or web UI.

  was:
I'm using Airflow with the Kubernetes executor and pod operator. And my DAGs 
are configured to do get_log=True and all my DAGs are set to log to stdout and 
I can see all the logs in kubectl logs.

I can see that the scheduler logs things to: 
$AIRFLOW_HOME/logs/scheduler/2018-08-28/*

However, this just consists of:

{code:java}
[2018-08-28 13:03:27,695] {jobs.py:385} INFO - Started process (PID=16994) to 
work on /home/airflow/dags/dag.py
[2018-08-28 13:03:27,697] {jobs.py:1782} INFO - Processing file 
/home/airflow/dags/dag.py for tasks to queue
[2018-08-28 13:03:27,697] {logging_mixin.py:95} INFO - [2018-08-28 
13:03:27,697] {models.py:258} INFO - Filling up the DagBag from 
/home/airflow/dags/dag.py
{code}

If I quickly exec into the executor the scheduler spins up, I can see that 
things are properly logged to:

{code:java}
/home/airflow/logs/dag$ tail -f 
dag-downloader/2018-08-28T13\:05\:07.704072+00\:00/1.log
[2018-08-28 13:05:24,399] {logging_mixin.py:95} INFO - [2018-08-28 
13:05:24,399] {pod_launcher.py:112} INFO - Event: dag-downloader-015ca48c had 
an event of type Pending
...
[2018-08-28 13:05:37,193] {logging_mixin.py:95} INFO - [2018-08-28 
13:05:37,193] {pod_launcher.py:95} INFO - 
b'INFO:botocore.vendored.requests.packages.urllib3.connectionpool:Starting new 
HTTPS connection (7): blah-blah.s3.eu-west-1.amazonaws.com\n'
...
...all other log lines from pod...
{code}

However, this executor pod only exists for the duration of the lifetime of the 
task pod so the logs are lost pretty much immediately after the task runs. 
There is nothing that ships the logs back to the scheduler.


> Kubernetes logging is broken
> 
>
> Key: AIRFLOW-2970
> URL: https://issues.apache.org/jira/browse/AIRFLOW-2970
> Project: Apache Airflow
>  Issue Type: Bug
>Reporter: Jon Davies
>Assignee: Daniel Imberman
>Priority: Major
>
> I'm using Airflow with the Kubernetes executor and pod operator. And my DAGs 
> are configured to do get_log=True and all my DAGs are set to log to stdout 
> and I can see all the logs in kubectl logs.
> I can see that the scheduler logs things to: 
> $AIRFLOW_HOME/logs/scheduler/2018-08-28/*
> However, this just consists of:
> {code:java}
> [2018-08-28 13:03:27,695] {jobs.py:385} INFO - Started process (PID=16994) to 
> work on /home/airflow/dags/dag.py
> [2018-08-28 13:03:27,697] {jobs.py:1782} INFO - Processing file 
> /home/airflow/dags/dag.py for tasks to queue
> [2018-08-28 13:03:27,697] {logging_mixin.py:95} INFO - [2018-08-28 
> 13:03:27,697] {models.py:258} INFO - Filling up the DagBag from 
> /home/airflow/dags/dag.py
> {code}
> If I quickly exec into the executor the scheduler spins up, I can see that 
> things are properly logged to:
> {code:java}
> /home/airflow/logs/dag$ tail -f 
> dag-downloader/2018-08-28T13\:05\:07.704072+00\:00/1.log
> [2018-08-28 13:05:24,399] {logging_mixin.py:95} INFO - [2018-08-28 
> 13:05:24,399] {pod_launcher.py:112} INFO - Event: dag-downloader-015ca48c had 
> an event of type Pending
> ...
> [2018-08-28 

[GitHub] ashb commented on issue #3815: Add Python 3.6 to Supported Prog Langs

2018-08-28 Thread GitBox
ashb commented on issue #3815: Add Python 3.6 to Supported Prog Langs
URL: 
https://github.com/apache/incubator-airflow/pull/3815#issuecomment-416588704
 
 
   Is it worth adding/changing Py 3.5 to Py 3.6 in Travis? (I don't think we 
want to have both py 3.5 and py 3.6 tests, as not that much has changed and we 
don't want _more_ runtime.) But...?


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] kaxil opened a new pull request #3815: Add Python 3.6 to Supported Prog Langs

2018-08-28 Thread GitBox
kaxil opened a new pull request #3815: Add Python 3.6 to Supported Prog Langs
URL: https://github.com/apache/incubator-airflow/pull/3815
 
 
   Make sure you have checked _all_ steps below.
   
   ### Jira
   
   - [x] My PR addresses the following [Airflow 
Jira](https://issues.apache.org/jira/browse/AIRFLOW/) issues and references 
them in the PR title. For example, "\[AIRFLOW-XXX\] My Airflow PR"
 - https://issues.apache.org/jira/browse/AIRFLOW-XXX
 - In case you are fixing a typo in the documentation you can prepend your 
commit with \[AIRFLOW-XXX\], code changes always need a Jira issue.
   
   ### Description
   
   - [x] Here are some details about my PR, including screenshots of any UI 
changes:
   I remember Fokko and Joy had tested Airflow on Python 3.6. DigitalOcean has 
also been using Airflow with 3.6, as mentioned in the mailing 
list: 
https://lists.apache.org/thread.html/4b3ab976ad268f3da96c5317290d11120c26c7e0f92886eef5b8b50f@%3Cdev.airflow.apache.org%3E
   
   
   
   ### Tests
   
   - [x] My PR adds the following unit tests __OR__ does not need testing for 
this extremely good reason:
   
   ### Commits
   
   - [x] My commits all reference Jira issues in their subject lines, and I 
have squashed multiple commits if they address the same issue. In addition, my 
commits follow the guidelines from "[How to write a good git commit 
message](http://chris.beams.io/posts/git-commit/)":
 1. Subject is separated from body by a blank line
 1. Subject is limited to 50 characters (not including Jira issue reference)
 1. Subject does not end with a period
 1. Subject uses the imperative mood ("add", not "adding")
 1. Body wraps at 72 characters
 1. Body explains "what" and "why", not "how"
   
   ### Documentation
   
   - [x] In case of new functionality, my PR adds documentation that describes 
how to use it.
 - When adding new operators/hooks/sensors, the autoclass documentation 
generation needs to be added.
   
   ### Code Quality
   
   - [x] Passes `git diff upstream/master -u -- "*.py" | flake8 --diff`
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] kaxil commented on issue #3815: Add Python 3.6 to Supported Prog Langs

2018-08-28 Thread GitBox
kaxil commented on issue #3815: Add Python 3.6 to Supported Prog Langs
URL: 
https://github.com/apache/incubator-airflow/pull/3815#issuecomment-416588142
 
 
   cc @bolkedebruin @Fokko 


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[jira] [Updated] (AIRFLOW-2970) Kubernetes logging is broken

2018-08-28 Thread Jon Davies (JIRA)


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-2970?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Jon Davies updated AIRFLOW-2970:

Description: 
I'm using Airflow with the Kubernetes executor and pod operator. And my DAGs 
are configured to do get_log=True and all my DAGs are set to log to stdout and 
I can see all the logs in kubectl logs.

I can see that the scheduler logs things to: 
$AIRFLOW_HOME/logs/scheduler/2018-08-28/*

However, this just consists of:

{code:java}
[2018-08-28 13:03:27,695] {jobs.py:385} INFO - Started process (PID=16994) to 
work on /home/airflow/dags/dag.py
[2018-08-28 13:03:27,697] {jobs.py:1782} INFO - Processing file 
/home/airflow/dags/dag.py for tasks to queue
[2018-08-28 13:03:27,697] {logging_mixin.py:95} INFO - [2018-08-28 
13:03:27,697] {models.py:258} INFO - Filling up the DagBag from 
/home/airflow/dags/dag.py
{code}

If I quickly exec into the executor the scheduler spins up, I can see that 
things are properly logged to:

{code:java}
/home/airflow/logs/dag$ tail -f 
dag-downloader/2018-08-28T13\:05\:07.704072+00\:00/1.log
[2018-08-28 13:05:24,399] {logging_mixin.py:95} INFO - [2018-08-28 
13:05:24,399] {pod_launcher.py:112} INFO - Event: dag-downloader-015ca48c had 
an event of type Pending
...
[2018-08-28 13:05:37,193] {logging_mixin.py:95} INFO - [2018-08-28 
13:05:37,193] {pod_launcher.py:95} INFO - 
b'INFO:botocore.vendored.requests.packages.urllib3.connectionpool:Starting new 
HTTPS connection (7): blah-blah.s3.eu-west-1.amazonaws.com\n'
...
...all other log lines from pod...
{code}

However, this executor pod only exists for the duration of the lifetime of the 
task pod so the logs are lost pretty much immediately after the task runs. 
There is nothing that ships the logs back to the scheduler.

  was:
I'm using Airflow with the Kubernetes executor and pod operator. And my DAGs 
are configured to do get_log=True and all my DAGs are set to log to stdout and 
I can see all the logs in kubectl logs.

I can see that the scheduler logs things to: 
$AIRFLOW_HOME/logs/scheduler/2018-08-28/*

However, this just consists of:

{code:java}
[2018-08-28 13:03:27,695] {jobs.py:385} INFO - Started process (PID=16994) to 
work on /home/airflow/dags/dag.py
[2018-08-28 13:03:27,697] {jobs.py:1782} INFO - Processing file 
/home/airflow/dags/dag.py for tasks to queue
[2018-08-28 13:03:27,697] {logging_mixin.py:95} INFO - [2018-08-28 
13:03:27,697] {models.py:258} INFO - Filling up the DagBag from 
/home/airflow/dags/dag.py
{code}

If I quickly exec into the executor the scheduler spins up, I can see that 
things are properly logged to:

{code:java}
/home/airflow/logs/dag$ tail -f 
dag-downloader/2018-08-28T13\:05\:07.704072+00\:00/1.log
[2018-08-28 13:05:24,399] {logging_mixin.py:95} INFO - [2018-08-28 
13:05:24,399] {pod_launcher.py:112} INFO - Event: dag-downloader-015ca48c had 
an event of type Pending
...
[2018-08-28 13:05:37,193] {logging_mixin.py:95} INFO - [2018-08-28 
13:05:37,193] {pod_launcher.py:95} INFO - 
b'INFO:botocore.vendored.requests.packages.urllib3.connectionpool:Starting new 
HTTPS connection (7): blah-blah.s3.eu-west-1.amazonaws.com\n'
...
...all other log lines from pod...
{code}

However, this executor pod only exists for the duration of the lifetime of the 
task pod so the logs are lost pretty much immediately after the task runs.


> Kubernetes logging is broken
> 
>
> Key: AIRFLOW-2970
> URL: https://issues.apache.org/jira/browse/AIRFLOW-2970
> Project: Apache Airflow
>  Issue Type: Bug
>Reporter: Jon Davies
>Assignee: Daniel Imberman
>Priority: Major
>
> I'm using Airflow with the Kubernetes executor and pod operator. And my DAGs 
> are configured to do get_log=True and all my DAGs are set to log to stdout 
> and I can see all the logs in kubectl logs.
> I can see that the scheduler logs things to: 
> $AIRFLOW_HOME/logs/scheduler/2018-08-28/*
> However, this just consists of:
> {code:java}
> [2018-08-28 13:03:27,695] {jobs.py:385} INFO - Started process (PID=16994) to 
> work on /home/airflow/dags/dag.py
> [2018-08-28 13:03:27,697] {jobs.py:1782} INFO - Processing file 
> /home/airflow/dags/dag.py for tasks to queue
> [2018-08-28 13:03:27,697] {logging_mixin.py:95} INFO - [2018-08-28 
> 13:03:27,697] {models.py:258} INFO - Filling up the DagBag from 
> /home/airflow/dags/dag.py
> {code}
> If I quickly exec into the executor the scheduler spins up, I can see that 
> things are properly logged to:
> {code:java}
> /home/airflow/logs/dag$ tail -f 
> dag-downloader/2018-08-28T13\:05\:07.704072+00\:00/1.log
> [2018-08-28 13:05:24,399] {logging_mixin.py:95} INFO - [2018-08-28 
> 13:05:24,399] {pod_launcher.py:112} INFO - Event: dag-downloader-015ca48c had 
> an event of type Pending
> ...
> [2018-08-28 13:05:37,193] {logging_mixin.py:95} INFO - [2018-08-28 
> 13:05:37,193] 

[jira] [Created] (AIRFLOW-2970) Kubernetes logging is broken

2018-08-28 Thread Jon Davies (JIRA)
Jon Davies created AIRFLOW-2970:
---

 Summary: Kubernetes logging is broken
 Key: AIRFLOW-2970
 URL: https://issues.apache.org/jira/browse/AIRFLOW-2970
 Project: Apache Airflow
  Issue Type: Bug
Reporter: Jon Davies
Assignee: Daniel Imberman


I'm using Airflow with the Kubernetes executor and pod operator. My DAGs are 
configured with get_log=True and set to log to stdout, and I can see all the 
logs in kubectl logs.

I can see that the scheduler logs things to: 
$AIRFLOW_HOME/logs/scheduler/2018-08-28/*

However, this just consists of:

{code:java}
[2018-08-28 13:03:27,695] {jobs.py:385} INFO - Started process (PID=16994) to 
work on /home/airflow/dags/dag.py
[2018-08-28 13:03:27,697] {jobs.py:1782} INFO - Processing file 
/home/airflow/dags/dag.py for tasks to queue
[2018-08-28 13:03:27,697] {logging_mixin.py:95} INFO - [2018-08-28 
13:03:27,697] {models.py:258} INFO - Filling up the DagBag from 
/home/airflow/dags/dag.py
{code}

If I quickly exec into the executor pod that the scheduler spins up, I can see 
that things are properly logged to:

{code:java}
/home/airflow/logs/dag$ tail -f 
dag-downloader/2018-08-28T13\:05\:07.704072+00\:00/1.log
[2018-08-28 13:05:24,399] {logging_mixin.py:95} INFO - [2018-08-28 
13:05:24,399] {pod_launcher.py:112} INFO - Event: dag-downloader-015ca48c had 
an event of type Pending
...
[2018-08-28 13:05:37,193] {logging_mixin.py:95} INFO - [2018-08-28 
13:05:37,193] {pod_launcher.py:95} INFO - 
b'INFO:botocore.vendored.requests.packages.urllib3.connectionpool:Starting new 
HTTPS connection (7): blah-blah.s3.eu-west-1.amazonaws.com\n'
...
...all other log lines from pod...
{code}

However, this executor pod only exists for the duration of the lifetime of the 
task pod so the logs are lost pretty much immediately after the task runs.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[GitHub] ashb commented on issue #3810: [AIRFLOW-2960] Pin boto3 to <1.8

2018-08-28 Thread GitBox
ashb commented on issue #3810: [AIRFLOW-2960] Pin boto3 to <1.8
URL: 
https://github.com/apache/incubator-airflow/pull/3810#issuecomment-416579192
 
 
   The problem is with how Moto (the mocking library for Boto) works; reported 
upstream as https://github.com/spulec/moto/issues/1793


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] ashb commented on issue #3803: [AIRFLOW-2779] Restore Copyright notice of GHE auth backend

2018-08-28 Thread GitBox
ashb commented on issue #3803: [AIRFLOW-2779] Restore Copyright notice of GHE 
auth backend
URL: 
https://github.com/apache/incubator-airflow/pull/3803#issuecomment-416576674
 
 
   @bolkedebruin Updated to include the copyright in the header of the file. 
Also, as per the original comment in 
https://issues.apache.org/jira/browse/AIRFLOW-2779, I have included/kept the 
information in LICENSE and NOTICE; I think that's the right thing to do?


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] verdan commented on a change in pull request #3804: [AIRFLOW-2866] Fix missing CSRF token header when using RBAC UI

2018-08-28 Thread GitBox
verdan commented on a change in pull request #3804: [AIRFLOW-2866] Fix missing 
CSRF token header when using RBAC UI
URL: https://github.com/apache/incubator-airflow/pull/3804#discussion_r213296428
 
 

 ##
 File path: airflow/www_rbac/templates/appbuilder/baselayout.html
 ##
 @@ -70,6 +70,13 @@
   // below variables are used in clock.js
   var hostName = '{{ hostname }}';
   var csrfToken = '{{ csrf_token() }}';
+  $.ajaxSetup({
+beforeSend: function(xhr, settings) {
+  if (!/^(GET|HEAD|OPTIONS|TRACE)$/i.test(settings.type) && 
!this.crossDomain) {
+xhr.setRequestHeader("X-CSRFToken", "{{ csrf_token() }}");
 
 Review comment:
   @gsilk can you please use the `csrfToken` variable that is declared above 
instead of calling `csrf_token()` again?


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] kaxil commented on issue #3749: [AIRFLOW-2900] Show code for packaged DAGs

2018-08-28 Thread GitBox
kaxil commented on issue #3749: [AIRFLOW-2900] Show code for packaged DAGs
URL: 
https://github.com/apache/incubator-airflow/pull/3749#issuecomment-416568265
 
 
   @jakebiesinger Following is the error:
   ```
   ==
   29) ERROR: test_open_maybe_zipped_archive (tests.www.test_utils.UtilsTest)
   --
  Traceback (most recent call last):
     File ".tox/py27-backend_mysql/lib/python2.7/site-packages/mock/mock.py", line 1305, in patched
       return func(*args, **keywargs)
     File "tests/www/test_utils.py", line 218, in test_open_maybe_zipped_archive
       mocked_open.assert_called_once()
  NameError: global name 'mocked_open' is not defined
   ```
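   
   For context, that NameError is the usual symptom of a patched mock never 
being bound to a name; below is a minimal sketch of the pattern with an 
illustrative patch target and test body, not the actual tests/www/test_utils.py 
code:
   
   ```python
   # Illustrative only: the decorator injects the mock as an argument, so the
   # test must accept it and use it under that name, not a module-level one.
   import io
   import unittest

   import mock  # the Python 2.7 builds use the external mock package


   class UtilsTest(unittest.TestCase):

       @mock.patch('io.open')  # hypothetical patch target
       def test_open_maybe_zipped_archive(self, mocked_open):
           io.open('archive.zip')            # exercise the patched callable
           mocked_open.assert_called_once()  # bound here, so no NameError


   if __name__ == '__main__':
       unittest.main()
   ```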


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[jira] [Commented] (AIRFLOW-2916) Add argument `verify` for AwsHook() and S3 related sensors/operators

2018-08-28 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-2916?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16594912#comment-16594912
 ] 

ASF GitHub Bot commented on AIRFLOW-2916:
-

bolkedebruin closed pull request #3764: [AIRFLOW-2916] Arg `verify` for 
AwsHook() & S3 sensors/operators
URL: https://github.com/apache/incubator-airflow/pull/3764
 
 
   

This is a PR merged from a forked repository.
As GitHub hides the original diff on merge, it is displayed below for
the sake of provenance:

As this is a foreign pull request (from a fork), the diff is supplied
below (as it won't show otherwise due to GitHub magic):

diff --git a/airflow/contrib/hooks/aws_hook.py 
b/airflow/contrib/hooks/aws_hook.py
index 8ca1f3d744..448de63ffe 100644
--- a/airflow/contrib/hooks/aws_hook.py
+++ b/airflow/contrib/hooks/aws_hook.py
@@ -84,8 +84,9 @@ class AwsHook(BaseHook):
 This class is a thin wrapper around the boto3 python library.
 """
 
-def __init__(self, aws_conn_id='aws_default'):
+def __init__(self, aws_conn_id='aws_default', verify=None):
 self.aws_conn_id = aws_conn_id
+self.verify = verify
 
 def _get_credentials(self, region_name):
 aws_access_key_id = None
@@ -162,12 +163,14 @@ def _get_credentials(self, region_name):
 def get_client_type(self, client_type, region_name=None):
 session, endpoint_url = self._get_credentials(region_name)
 
-return session.client(client_type, endpoint_url=endpoint_url)
+return session.client(client_type, endpoint_url=endpoint_url,
+  verify=self.verify)
 
 def get_resource_type(self, resource_type, region_name=None):
 session, endpoint_url = self._get_credentials(region_name)
 
-return session.resource(resource_type, endpoint_url=endpoint_url)
+return session.resource(resource_type, endpoint_url=endpoint_url,
+verify=self.verify)
 
 def get_session(self, region_name=None):
 """Get the underlying boto3.session."""
diff --git a/airflow/contrib/operators/gcs_to_s3.py 
b/airflow/contrib/operators/gcs_to_s3.py
index a87aa3af5c..0df6170eab 100644
--- a/airflow/contrib/operators/gcs_to_s3.py
+++ b/airflow/contrib/operators/gcs_to_s3.py
@@ -47,6 +47,16 @@ class 
GoogleCloudStorageToS3Operator(GoogleCloudStorageListOperator):
 :type dest_aws_conn_id: str
 :param dest_s3_key: The base S3 key to be used to store the files. 
(templated)
 :type dest_s3_key: str
+:parame dest_verify: Whether or not to verify SSL certificates for S3 
connection.
+By default SSL certificates are verified.
+You can provide the following values:
+- False: do not validate SSL certificates. SSL will still be used
+ (unless use_ssl is False), but SSL certificates will not be
+ verified.
+- path/to/cert/bundle.pem: A filename of the CA cert bundle to uses.
+ You can specify this argument if you want to use a different
+ CA cert bundle than the one used by botocore.
+:type dest_verify: bool or str
 """
 template_fields = ('bucket', 'prefix', 'delimiter', 'dest_s3_key')
 ui_color = '#f0eee4'
@@ -60,6 +70,7 @@ def __init__(self,
  delegate_to=None,
  dest_aws_conn_id=None,
  dest_s3_key=None,
+ dest_verify=None,
  replace=False,
  *args,
  **kwargs):
@@ -75,12 +86,13 @@ def __init__(self,
 )
 self.dest_aws_conn_id = dest_aws_conn_id
 self.dest_s3_key = dest_s3_key
+self.dest_verify = dest_verify
 self.replace = replace
 
 def execute(self, context):
 # use the super to list all files in an Google Cloud Storage bucket
 files = super(GoogleCloudStorageToS3Operator, self).execute(context)
-s3_hook = S3Hook(aws_conn_id=self.dest_aws_conn_id)
+s3_hook = S3Hook(aws_conn_id=self.dest_aws_conn_id, 
verify=self.dest_verify)
 
 if not self.replace:
 # if we are not replacing -> list all files in the S3 bucket
diff --git a/airflow/contrib/operators/s3_list_operator.py 
b/airflow/contrib/operators/s3_list_operator.py
index b85691b005..a9e005eed3 100644
--- a/airflow/contrib/operators/s3_list_operator.py
+++ b/airflow/contrib/operators/s3_list_operator.py
@@ -38,6 +38,16 @@ class S3ListOperator(BaseOperator):
 :type delimiter: string
 :param aws_conn_id: The connection ID to use when connecting to S3 storage.
 :type aws_conn_id: string
+:parame verify: Whether or not to verify SSL certificates for S3 
connection.
+By default SSL certificates are verified.
+You can provide the following values:
+- False: do not validate SSL certificates. SSL will still be used
+ (unless use_ssl is False), but SSL certificates will not 
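
For reference, a small usage sketch of the new argument introduced by this change, assuming S3Hook passes keyword arguments through to AwsHook; the connection id, CA bundle path, and bucket below are placeholders:

{code:python}
# Illustrative usage only; the CA bundle path is a placeholder file name.
from airflow.hooks.S3_hook import S3Hook

# verify=False skips certificate validation; a string points at a CA bundle.
hook = S3Hook(aws_conn_id='aws_default', verify='/etc/ssl/certs/my-ca-bundle.pem')
keys = hook.list_keys(bucket_name='some-bucket', prefix='some/prefix/')
{code}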

[GitHub] bolkedebruin closed pull request #3764: [AIRFLOW-2916] Arg `verify` for AwsHook() & S3 sensors/operators

2018-08-28 Thread GitBox
bolkedebruin closed pull request #3764: [AIRFLOW-2916] Arg `verify` for 
AwsHook() & S3 sensors/operators
URL: https://github.com/apache/incubator-airflow/pull/3764
 
 
   

This is a PR merged from a forked repository.
As GitHub hides the original diff on merge, it is displayed below for
the sake of provenance:

As this is a foreign pull request (from a fork), the diff is supplied
below (as it won't show otherwise due to GitHub magic):

diff --git a/airflow/contrib/hooks/aws_hook.py 
b/airflow/contrib/hooks/aws_hook.py
index 8ca1f3d744..448de63ffe 100644
--- a/airflow/contrib/hooks/aws_hook.py
+++ b/airflow/contrib/hooks/aws_hook.py
@@ -84,8 +84,9 @@ class AwsHook(BaseHook):
 This class is a thin wrapper around the boto3 python library.
 """
 
-def __init__(self, aws_conn_id='aws_default'):
+def __init__(self, aws_conn_id='aws_default', verify=None):
 self.aws_conn_id = aws_conn_id
+self.verify = verify
 
 def _get_credentials(self, region_name):
 aws_access_key_id = None
@@ -162,12 +163,14 @@ def _get_credentials(self, region_name):
 def get_client_type(self, client_type, region_name=None):
 session, endpoint_url = self._get_credentials(region_name)
 
-return session.client(client_type, endpoint_url=endpoint_url)
+return session.client(client_type, endpoint_url=endpoint_url,
+  verify=self.verify)
 
 def get_resource_type(self, resource_type, region_name=None):
 session, endpoint_url = self._get_credentials(region_name)
 
-return session.resource(resource_type, endpoint_url=endpoint_url)
+return session.resource(resource_type, endpoint_url=endpoint_url,
+verify=self.verify)
 
 def get_session(self, region_name=None):
 """Get the underlying boto3.session."""
diff --git a/airflow/contrib/operators/gcs_to_s3.py 
b/airflow/contrib/operators/gcs_to_s3.py
index a87aa3af5c..0df6170eab 100644
--- a/airflow/contrib/operators/gcs_to_s3.py
+++ b/airflow/contrib/operators/gcs_to_s3.py
@@ -47,6 +47,16 @@ class 
GoogleCloudStorageToS3Operator(GoogleCloudStorageListOperator):
 :type dest_aws_conn_id: str
 :param dest_s3_key: The base S3 key to be used to store the files. 
(templated)
 :type dest_s3_key: str
+:parame dest_verify: Whether or not to verify SSL certificates for S3 
connection.
+By default SSL certificates are verified.
+You can provide the following values:
+- False: do not validate SSL certificates. SSL will still be used
+ (unless use_ssl is False), but SSL certificates will not be
+ verified.
+- path/to/cert/bundle.pem: A filename of the CA cert bundle to use.
+ You can specify this argument if you want to use a different
+ CA cert bundle than the one used by botocore.
+:type dest_verify: bool or str
 """
 template_fields = ('bucket', 'prefix', 'delimiter', 'dest_s3_key')
 ui_color = '#f0eee4'
@@ -60,6 +70,7 @@ def __init__(self,
  delegate_to=None,
  dest_aws_conn_id=None,
  dest_s3_key=None,
+ dest_verify=None,
  replace=False,
  *args,
  **kwargs):
@@ -75,12 +86,13 @@ def __init__(self,
 )
 self.dest_aws_conn_id = dest_aws_conn_id
 self.dest_s3_key = dest_s3_key
+self.dest_verify = dest_verify
 self.replace = replace
 
 def execute(self, context):
 # use the super to list all files in an Google Cloud Storage bucket
 files = super(GoogleCloudStorageToS3Operator, self).execute(context)
-s3_hook = S3Hook(aws_conn_id=self.dest_aws_conn_id)
+s3_hook = S3Hook(aws_conn_id=self.dest_aws_conn_id, 
verify=self.dest_verify)
 
 if not self.replace:
 # if we are not replacing -> list all files in the S3 bucket
diff --git a/airflow/contrib/operators/s3_list_operator.py 
b/airflow/contrib/operators/s3_list_operator.py
index b85691b005..a9e005eed3 100644
--- a/airflow/contrib/operators/s3_list_operator.py
+++ b/airflow/contrib/operators/s3_list_operator.py
@@ -38,6 +38,16 @@ class S3ListOperator(BaseOperator):
 :type delimiter: string
 :param aws_conn_id: The connection ID to use when connecting to S3 storage.
 :type aws_conn_id: string
+:param verify: Whether or not to verify SSL certificates for S3 
connection.
+By default SSL certificates are verified.
+You can provide the following values:
+- False: do not validate SSL certificates. SSL will still be used
+ (unless use_ssl is False), but SSL certificates will not be
+ verified.
+- path/to/cert/bundle.pem: A filename of the CA cert bundle to use.
+ You can specify this argument if you want to use a different
+ CA cert bundle than the one used by 
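
For reference, a minimal sketch (not part of the PR above) of how the new `verify` argument could be passed once merged, following the docstring: `False` skips certificate validation while SSL itself is still used, and a path points boto3 at a custom CA bundle. The connection id and bundle path are placeholders.

```python
from airflow.contrib.hooks.aws_hook import AwsHook
from airflow.hooks.S3_hook import S3Hook

# Skip certificate validation; SSL is still used unless use_ssl is False.
insecure_hook = AwsHook(aws_conn_id='aws_default', verify=False)

# Point boto3 at a custom CA bundle instead of the one shipped with botocore.
s3_hook = S3Hook(aws_conn_id='aws_default',
                 verify='/etc/ssl/certs/internal-ca-bundle.pem')
s3_client = s3_hook.get_client_type('s3')
```

The same value flows into GoogleCloudStorageToS3Operator through its `dest_verify` argument, as the diff above shows.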

[GitHub] bolkedebruin commented on issue #3749: [AIRFLOW-2900] Show code for packaged DAGs

2018-08-28 Thread GitBox
bolkedebruin commented on issue #3749: [AIRFLOW-2900] Show code for packaged 
DAGs
URL: 
https://github.com/apache/incubator-airflow/pull/3749#issuecomment-416566987
 
 
   All CI builds failed, can you check what is going on? It would be great to have this in.


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[jira] [Commented] (AIRFLOW-2930) scheduler exit when using celery executor

2018-08-28 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-2930?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16594909#comment-16594909
 ] 

ASF GitHub Bot commented on AIRFLOW-2930:
-

bolkedebruin closed pull request #3784: [AIRFLOW-2930] Fix celery excecutor 
scheduler crash
URL: https://github.com/apache/incubator-airflow/pull/3784
 
 
   

This is a PR merged from a forked repository.
As GitHub hides the original diff on merge, it is displayed below for
the sake of provenance:

As this is a foreign pull request (from a fork), the diff is supplied
below (as it won't show otherwise due to GitHub magic):

diff --git a/airflow/executors/celery_executor.py 
b/airflow/executors/celery_executor.py
index 2128ae7b09..61bbc66716 100644
--- a/airflow/executors/celery_executor.py
+++ b/airflow/executors/celery_executor.py
@@ -82,7 +82,7 @@ def execute_async(self, key, command,
 self.log.info("[celery] queuing {key} through celery, "
   "queue={queue}".format(**locals()))
 self.tasks[key] = execute_command.apply_async(
-args=command, queue=queue)
+args=[command], queue=queue)
 self.last_state[key] = celery_states.PENDING
 
 def sync(self):
diff --git a/tests/executors/dask_executor.py b/tests/executors/dask_executor.py
index 9bf051f580..4f0009e1cc 100644
--- a/tests/executors/dask_executor.py
+++ b/tests/executors/dask_executor.py
@@ -55,8 +55,8 @@ def assert_tasks_on_executor(self, executor):
 # start the executor
 executor.start()
 
-success_command = ['true', ]
-fail_command = ['false', ]
+success_command = ['true', 'some_parameter']
+fail_command = ['false', 'some_parameter']
 
 executor.execute_async(key='success', command=success_command)
 executor.execute_async(key='fail', command=fail_command)
diff --git a/tests/executors/test_celery_executor.py 
b/tests/executors/test_celery_executor.py
index 69f9fbfe9f..95ad58f6a2 100644
--- a/tests/executors/test_celery_executor.py
+++ b/tests/executors/test_celery_executor.py
@@ -34,8 +34,8 @@ def test_celery_integration(self):
 executor.start()
 with start_worker(app=app, logfile=sys.stdout, loglevel='debug'):
 
-success_command = ['true', ]
-fail_command = ['false', ]
+success_command = ['true', 'some_parameter']
+fail_command = ['false', 'some_parameter']
 
 executor.execute_async(key='success', command=success_command)
 # errors are propagated for some reason
diff --git a/tests/executors/test_local_executor.py 
b/tests/executors/test_local_executor.py
index 846e132561..59cb09c74e 100644
--- a/tests/executors/test_local_executor.py
+++ b/tests/executors/test_local_executor.py
@@ -33,8 +33,8 @@ def execution_parallelism(self, parallelism=0):
 executor.start()
 
 success_key = 'success {}'
-success_command = ['true', ]
-fail_command = ['false', ]
+success_command = ['true', 'some_parameter']
+fail_command = ['false', 'some_parameter']
 
 for i in range(self.TEST_SUCCESS_COMMANDS):
 key, command = success_key.format(i), success_command


 


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


> scheduler exit when using celery executor
> -
>
> Key: AIRFLOW-2930
> URL: https://issues.apache.org/jira/browse/AIRFLOW-2930
> Project: Apache Airflow
>  Issue Type: Bug
>  Components: celery
>Reporter: Yingbo Wang
>Assignee: Yingbo Wang
>Priority: Major
>
> Caused by:
> [https://github.com/apache/incubator-airflow/pull/3740]
>  
> Use CeleryExecutor for airflow, scheduler exit after a Dag is activated. 



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)
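
For context, a minimal sketch (not Airflow code) of why the fix above wraps the command in a list: Celery's `apply_async(args=...)` treats `args` as the positional-argument tuple for the task, just like `f(*args)` below, so passing the command list directly spreads each CLI token into a separate parameter and the one-argument task fails.

```python
# The dag/task names are placeholders for illustration only.
def execute_command(command):
    print("would run:", command)

command = ['airflow', 'run', 'example_dag', 'example_task', '2018-08-28']

# Before the fix: args=command unpacks every token as its own argument.
try:
    execute_command(*command)
except TypeError as exc:
    print("fails:", exc)

# After the fix: args=[command] delivers the whole list as one argument.
execute_command(*[command])
```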


[GitHub] bolkedebruin closed pull request #3784: [AIRFLOW-2930] Fix celery excecutor scheduler crash

2018-08-28 Thread GitBox
bolkedebruin closed pull request #3784: [AIRFLOW-2930] Fix celery excecutor 
scheduler crash
URL: https://github.com/apache/incubator-airflow/pull/3784
 
 
   

This is a PR merged from a forked repository.
As GitHub hides the original diff on merge, it is displayed below for
the sake of provenance:

As this is a foreign pull request (from a fork), the diff is supplied
below (as it won't show otherwise due to GitHub magic):

diff --git a/airflow/executors/celery_executor.py 
b/airflow/executors/celery_executor.py
index 2128ae7b09..61bbc66716 100644
--- a/airflow/executors/celery_executor.py
+++ b/airflow/executors/celery_executor.py
@@ -82,7 +82,7 @@ def execute_async(self, key, command,
 self.log.info("[celery] queuing {key} through celery, "
   "queue={queue}".format(**locals()))
 self.tasks[key] = execute_command.apply_async(
-args=command, queue=queue)
+args=[command], queue=queue)
 self.last_state[key] = celery_states.PENDING
 
 def sync(self):
diff --git a/tests/executors/dask_executor.py b/tests/executors/dask_executor.py
index 9bf051f580..4f0009e1cc 100644
--- a/tests/executors/dask_executor.py
+++ b/tests/executors/dask_executor.py
@@ -55,8 +55,8 @@ def assert_tasks_on_executor(self, executor):
 # start the executor
 executor.start()
 
-success_command = ['true', ]
-fail_command = ['false', ]
+success_command = ['true', 'some_parameter']
+fail_command = ['false', 'some_parameter']
 
 executor.execute_async(key='success', command=success_command)
 executor.execute_async(key='fail', command=fail_command)
diff --git a/tests/executors/test_celery_executor.py 
b/tests/executors/test_celery_executor.py
index 69f9fbfe9f..95ad58f6a2 100644
--- a/tests/executors/test_celery_executor.py
+++ b/tests/executors/test_celery_executor.py
@@ -34,8 +34,8 @@ def test_celery_integration(self):
 executor.start()
 with start_worker(app=app, logfile=sys.stdout, loglevel='debug'):
 
-success_command = ['true', ]
-fail_command = ['false', ]
+success_command = ['true', 'some_parameter']
+fail_command = ['false', 'some_parameter']
 
 executor.execute_async(key='success', command=success_command)
 # errors are propagated for some reason
diff --git a/tests/executors/test_local_executor.py 
b/tests/executors/test_local_executor.py
index 846e132561..59cb09c74e 100644
--- a/tests/executors/test_local_executor.py
+++ b/tests/executors/test_local_executor.py
@@ -33,8 +33,8 @@ def execution_parallelism(self, parallelism=0):
 executor.start()
 
 success_key = 'success {}'
-success_command = ['true', ]
-fail_command = ['false', ]
+success_command = ['true', 'some_parameter']
+fail_command = ['false', 'some_parameter']
 
 for i in range(self.TEST_SUCCESS_COMMANDS):
 key, command = success_key.format(i), success_command


 


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] bolkedebruin commented on issue #3805: [AIRFLOW-2062] Add per-connection KMS encryption.

2018-08-28 Thread GitBox
bolkedebruin commented on issue #3805: [AIRFLOW-2062] Add per-connection KMS 
encryption.
URL: 
https://github.com/apache/incubator-airflow/pull/3805#issuecomment-416565520
 
 
   This is very tied to a specific implementation. I would rather see the 
kms_Xx fields be added to the extra options of a connection


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services
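
A minimal sketch of the alternative suggested above: keep KMS settings in the connection's Extra JSON rather than adding dedicated columns. The kms_* key names are hypothetical placeholders, not part of any merged API.

```python
import json

from airflow.models import Connection

conn = Connection(
    conn_id='aws_kms_example',
    conn_type='aws',
    extra=json.dumps({
        'kms_key_id': 'alias/airflow-connections',  # hypothetical key name
        'kms_region_name': 'us-east-1',             # hypothetical key name
    }),
)

# A hook could then read the settings back without any schema change.
kms_key_id = conn.extra_dejson.get('kms_key_id')
```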


[GitHub] bolkedebruin commented on issue #3804: [AIRFLOW-2866] Fix missing CSRF token header when using RBAC UI

2018-08-28 Thread GitBox
bolkedebruin commented on issue #3804: [AIRFLOW-2866] Fix missing CSRF token 
header when using RBAC UI
URL: 
https://github.com/apache/incubator-airflow/pull/3804#issuecomment-416565051
 
 
   @verdan PTaL


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] bolkedebruin commented on issue #3806: [AIRFLOW-2956] added kubernetes tolerations to kubernetes pod operator

2018-08-28 Thread GitBox
bolkedebruin commented on issue #3806: [AIRFLOW-2956] added kubernetes 
tolerations to kubernetes pod operator
URL: 
https://github.com/apache/incubator-airflow/pull/3806#issuecomment-416564829
 
 
   Docs please?


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] bolkedebruin closed pull request #3814: [AIRFLOW-XXX] Remove residual line in Changelog

2018-08-28 Thread GitBox
bolkedebruin closed pull request #3814: [AIRFLOW-XXX] Remove residual line in 
Changelog
URL: https://github.com/apache/incubator-airflow/pull/3814
 
 
   

This is a PR merged from a forked repository.
As GitHub hides the original diff on merge, it is displayed below for
the sake of provenance:

As this is a foreign pull request (from a fork), the diff is supplied
below (as it won't show otherwise due to GitHub magic):

diff --git a/CHANGELOG.txt b/CHANGELOG.txt
index 06d5aed3f1..b4ee1755b4 100644
--- a/CHANGELOG.txt
+++ b/CHANGELOG.txt
@@ -780,7 +780,6 @@ AIRFLOW 1.10.0, 2018-08-03
 [AIRFLOW-1609] Fix gitignore to ignore all venvs
 [AIRFLOW-1601] Add configurable task cleanup time
 
->>> 862ad8b9... [AIRFLOW-XXX] Update changelog for 1.10
 AIRFLOW 1.9.0, 2018-01-02
 -
 


 


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] bolkedebruin commented on issue #3814: [AIRFLOW-XXX] Remove residual line in Changelog

2018-08-28 Thread GitBox
bolkedebruin commented on issue #3814: [AIRFLOW-XXX] Remove residual line in 
Changelog
URL: 
https://github.com/apache/incubator-airflow/pull/3814#issuecomment-416564624
 
 
   Oh I missed that :-)


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] bolkedebruin edited a comment on issue #3660: [AIRFLOW-2817] Force explicit choice on GPL dependency

2018-08-28 Thread GitBox
bolkedebruin edited a comment on issue #3660: [AIRFLOW-2817] Force explicit 
choice on GPL dependency
URL: 
https://github.com/apache/incubator-airflow/pull/3660#issuecomment-416564049
 
 
   The issue needs to be fixed by either removing the dependency on 
python-nvd3, fixing upstream, or vendoring in python-slugify. 
   
   The choice you are describing @cHYzZQo is our preferred option but doesn’t 
work with package managers in the current state of the upstream package.


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] codecov-io edited a comment on issue #3814: [AIRFLOW-XXX] Remove residual line in Changelog

2018-08-28 Thread GitBox
codecov-io edited a comment on issue #3814: [AIRFLOW-XXX] Remove residual line 
in Changelog
URL: 
https://github.com/apache/incubator-airflow/pull/3814#issuecomment-416554775
 
 
   # 
[Codecov](https://codecov.io/gh/apache/incubator-airflow/pull/3814?src=pr=h1)
 Report
   > Merging 
[#3814](https://codecov.io/gh/apache/incubator-airflow/pull/3814?src=pr=desc)
 into 
[master](https://codecov.io/gh/apache/incubator-airflow/commit/1801baefe44f361010c23e6ec4ee8b8569eab82d?src=pr=desc)
 will **not change** coverage.
   > The diff coverage is `n/a`.
   
   [![Impacted file tree 
graph](https://codecov.io/gh/apache/incubator-airflow/pull/3814/graphs/tree.svg?width=650=WdLKlKHOAU=150=pr)](https://codecov.io/gh/apache/incubator-airflow/pull/3814?src=pr=tree)
   
   ```diff
   @@           Coverage Diff           @@
   ##           master    #3814   +/-   ##
   =======================================
     Coverage   77.41%   77.41%           
   =======================================
     Files         203      203           
     Lines       15810    15810           
   =======================================
     Hits        12239    12239           
     Misses       3571     3571           
   ```
   
   
   
   --
   
   [Continue to review full report at 
Codecov](https://codecov.io/gh/apache/incubator-airflow/pull/3814?src=pr=continue).
   > **Legend** - [Click here to learn 
more](https://docs.codecov.io/docs/codecov-delta)
   > `Δ = absolute <relative> (impact)`, `ø = not affected`, `? = missing data`
   > Powered by 
[Codecov](https://codecov.io/gh/apache/incubator-airflow/pull/3814?src=pr=footer).
 Last update 
[1801bae...88784ce](https://codecov.io/gh/apache/incubator-airflow/pull/3814?src=pr=lastupdated).
 Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments).
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] codecov-io commented on issue #3814: [AIRFLOW-XXX] Remove residual line in Changelog

2018-08-28 Thread GitBox
codecov-io commented on issue #3814: [AIRFLOW-XXX] Remove residual line in 
Changelog
URL: 
https://github.com/apache/incubator-airflow/pull/3814#issuecomment-416554775
 
 
   # 
[Codecov](https://codecov.io/gh/apache/incubator-airflow/pull/3814?src=pr=h1)
 Report
   > Merging 
[#3814](https://codecov.io/gh/apache/incubator-airflow/pull/3814?src=pr=desc)
 into 
[master](https://codecov.io/gh/apache/incubator-airflow/commit/1801baefe44f361010c23e6ec4ee8b8569eab82d?src=pr=desc)
 will **not change** coverage.
   > The diff coverage is `n/a`.
   
   [![Impacted file tree 
graph](https://codecov.io/gh/apache/incubator-airflow/pull/3814/graphs/tree.svg?width=650=WdLKlKHOAU=150=pr)](https://codecov.io/gh/apache/incubator-airflow/pull/3814?src=pr=tree)
   
   ```diff
   @@           Coverage Diff           @@
   ##           master    #3814   +/-   ##
   =======================================
     Coverage   77.41%   77.41%           
   =======================================
     Files         203      203           
     Lines       15810    15810           
   =======================================
     Hits        12239    12239           
     Misses       3571     3571           
   ```
   
   
   
   --
   
   [Continue to review full report at 
Codecov](https://codecov.io/gh/apache/incubator-airflow/pull/3814?src=pr=continue).
   > **Legend** - [Click here to learn 
more](https://docs.codecov.io/docs/codecov-delta)
   > `Δ = absolute <relative> (impact)`, `ø = not affected`, `? = missing data`
   > Powered by 
[Codecov](https://codecov.io/gh/apache/incubator-airflow/pull/3814?src=pr=footer).
 Last update 
[1801bae...88784ce](https://codecov.io/gh/apache/incubator-airflow/pull/3814?src=pr=lastupdated).
 Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments).
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[jira] [Commented] (AIRFLOW-2934) Pools not respected for internal subdag tasks

2018-08-28 Thread JIRA


[ 
https://issues.apache.org/jira/browse/AIRFLOW-2934?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16594845#comment-16594845
 ] 

Andreas Költringer commented on AIRFLOW-2934:
-

We have the same problem. I created [a thread on the dev mailing 
list.|https://lists.apache.org/thread.html/a482ace7c609143910364b0305159f653c307e4f4fb680d4a7cf314b@%3Cdev.airflow.apache.org%3E]

Besides that, I [created a 
Gist|https://gist.github.com/akoeltringer/63fcf0340ae219c112b2a5377e6d2715] to 
reproduce the issue.

> Pools not respected for internal subdag tasks
> -
>
> Key: AIRFLOW-2934
> URL: https://issues.apache.org/jira/browse/AIRFLOW-2934
> Project: Apache Airflow
>  Issue Type: Bug
>  Components: pools, subdag
>Affects Versions: 1.9.0
> Environment: Linux victorvianna 4.14.65-1-MANJARO #1 SMP PREEMPT Sat 
> Aug 18 13:29:56 UTC 2018 x86_64 GNU/Linux
> Python 3.6.6
>Reporter: Victor Vianna
>Priority: Blocker
>  Labels: pool, subdag
> Attachments: Screenshot from 2018-08-22 12-32-32.png, dag_pool.py
>
>
> I'm trying to have some subdags execute one task at a time. The way I found 
> was to create a first pool for the SubdagOperators (pool1 in the attached 
> code file) and a second one for the internal tasks (pool2). However, it 
> appears that pools for subdag elements are not being respected. Running 
> airflow 1.9.0 with LocalExecutor.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)
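
For readers following the report, a minimal sketch (hypothetical dag and task ids) of the setup it describes: the SubDagOperator itself runs in one pool, while the tasks inside the subdag are assigned a second pool meant to run them one at a time, which is what the report says is not respected.

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.dummy_operator import DummyOperator
from airflow.operators.subdag_operator import SubDagOperator

default_args = {'owner': 'airflow', 'start_date': datetime(2018, 8, 1)}


def make_subdag(parent_dag_id, child_id):
    # Tasks inside the subdag use pool2 (e.g. a single-slot pool).
    subdag = DAG('{}.{}'.format(parent_dag_id, child_id),
                 default_args=default_args, schedule_interval=None)
    for i in range(3):
        DummyOperator(task_id='inner_{}'.format(i), pool='pool2', dag=subdag)
    return subdag


with DAG('pool_demo', default_args=default_args,
         schedule_interval=None) as dag:
    # The SubDagOperator itself is throttled by pool1.
    SubDagOperator(task_id='section_1',
                   subdag=make_subdag('pool_demo', 'section_1'),
                   pool='pool1')
```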


[GitHub] feluelle edited a comment on issue #3561: [AIRFLOW-2713] Rename async variable to async_packages in setup.py for Python 3.7.0 compatibility

2018-08-28 Thread GitBox
feluelle edited a comment on issue #3561: [AIRFLOW-2713] Rename async variable 
to async_packages in setup.py for Python 3.7.0 compatibility
URL: 
https://github.com/apache/incubator-airflow/pull/3561#issuecomment-416543200
 
 
   You are right, not this one but a similar one 
https://github.com/apache/incubator-airflow/pull/3578
   It fixes async await for Python 3.7 even if Python 3.7 is not supported for 
this release.


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] feluelle edited a comment on issue #3561: [AIRFLOW-2713] Rename async variable to async_packages in setup.py for Python 3.7.0 compatibility

2018-08-28 Thread GitBox
feluelle edited a comment on issue #3561: [AIRFLOW-2713] Rename async variable 
to async_packages in setup.py for Python 3.7.0 compatibility
URL: 
https://github.com/apache/incubator-airflow/pull/3561#issuecomment-416543200
 
 
   You are right, not this one but a similar one 
([2716](https://github.com/apache/incubator-airflow/pull/3578)). 
   It fixes async await for Python 3.7 even if Python 3.7 is not supported for 
this release.


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] kaxil opened a new pull request #3814: [AIRFLOW-XXX] Remove residual line in Changelog

2018-08-28 Thread GitBox
kaxil opened a new pull request #3814: [AIRFLOW-XXX] Remove residual line in 
Changelog
URL: https://github.com/apache/incubator-airflow/pull/3814
 
 
   Make sure you have checked _all_ steps below.
   
   ### Jira
   
   - [x] My PR addresses the following [Airflow 
Jira](https://issues.apache.org/jira/browse/AIRFLOW/) issues and references 
them in the PR title. For example, "\[AIRFLOW-XXX\] My Airflow PR"
 - https://issues.apache.org/jira/browse/AIRFLOW-XXX
 - In case you are fixing a typo in the documentation, you can prepend your 
commit with \[AIRFLOW-XXX\], code changes always need a Jira issue.
   
   ### Description
   
   - [x] Here are some details about my PR, including screenshots of any UI 
changes:
   The residual line might be because someone tried to resolve a conflict and forgot to delete this line.
   
   ### Tests
   
   - [x] My PR adds the following unit tests __OR__ does not need testing for 
this extremely good reason:
   
   ### Commits
   
   - [x] My commits all reference Jira issues in their subject lines, and I 
have squashed multiple commits if they address the same issue. In addition, my 
commits follow the guidelines from "[How to write a good git commit 
message](http://chris.beams.io/posts/git-commit/)":
 1. Subject is separated from body by a blank line
 1. Subject is limited to 50 characters (not including Jira issue reference)
 1. Subject does not end with a period
 1. Subject uses the imperative mood ("add", not "adding")
 1. Body wraps at 72 characters
 1. Body explains "what" and "why", not "how"
   
   ### Documentation
   
   - [x] In case of new functionality, my PR adds documentation that describes 
how to use it.
 - When adding new operators/hooks/sensors, the autoclass documentation 
generation needs to be added.
   
   ### Code Quality
   
   - [x] Passes `git diff upstream/master -u -- "*.py" | flake8 --diff`
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] feluelle edited a comment on issue #3561: [AIRFLOW-2713] Rename async variable to async_packages in setup.py for Python 3.7.0 compatibility

2018-08-28 Thread GitBox
feluelle edited a comment on issue #3561: [AIRFLOW-2713] Rename async variable 
to async_packages in setup.py for Python 3.7.0 compatibility
URL: 
https://github.com/apache/incubator-airflow/pull/3561#issuecomment-416543200
 
 
   Not this one. But a similar one 
([2716](https://github.com/apache/incubator-airflow/pull/3578)) on Line 9. It 
fixes async await for Python 3.7 even if Python 3.7 is not supported for this 
release.


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] feluelle edited a comment on issue #3561: [AIRFLOW-2713] Rename async variable to async_packages in setup.py for Python 3.7.0 compatibility

2018-08-28 Thread GitBox
feluelle edited a comment on issue #3561: [AIRFLOW-2713] Rename async variable 
to async_packages in setup.py for Python 3.7.0 compatibility
URL: 
https://github.com/apache/incubator-airflow/pull/3561#issuecomment-416543200
 
 
   Not this one (2713). But a similar one (2716) on Line 9. It fixes async 
await for Python 3.7 even if Python 3.7 is not supported for this release.


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] feluelle commented on issue #3561: [AIRFLOW-2713] Rename async variable to async_packages in setup.py for Python 3.7.0 compatibility

2018-08-28 Thread GitBox
feluelle commented on issue #3561: [AIRFLOW-2713] Rename async variable to 
async_packages in setup.py for Python 3.7.0 compatibility
URL: 
https://github.com/apache/incubator-airflow/pull/3561#issuecomment-416543200
 
 
   Not this one (2713). But a similar one (2716) on Line 9. It fixes async await for Python 3.7 even if Python 3.7 is not supported for this release.


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] Perados commented on issue #3561: [AIRFLOW-2713] Rename async variable to async_packages in setup.py for Python 3.7.0 compatibility

2018-08-28 Thread GitBox
Perados commented on issue #3561: [AIRFLOW-2713] Rename async variable to 
async_packages in setup.py for Python 3.7.0 compatibility
URL: 
https://github.com/apache/incubator-airflow/pull/3561#issuecomment-416542155
 
 
   I think there is a problem: I didn't see this commit in the changelog for 1.10 nor in the 1.10 tag: 
https://github.com/apache/incubator-airflow/blob/master/CHANGELOG.txt and 
https://github.com/apache/incubator-airflow/commits/1.10.0
   
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] codecov-io edited a comment on issue #3813: [AIRFLOW-1998] Implemented DatabricksRunNowOperator for jobs/run-now …

2018-08-28 Thread GitBox
codecov-io edited a comment on issue #3813: [AIRFLOW-1998] Implemented 
DatabricksRunNowOperator for jobs/run-now …
URL: 
https://github.com/apache/incubator-airflow/pull/3813#issuecomment-416533459
 
 
   # 
[Codecov](https://codecov.io/gh/apache/incubator-airflow/pull/3813?src=pr=h1)
 Report
   > Merging 
[#3813](https://codecov.io/gh/apache/incubator-airflow/pull/3813?src=pr=desc)
 into 
[master](https://codecov.io/gh/apache/incubator-airflow/commit/1801baefe44f361010c23e6ec4ee8b8569eab82d?src=pr=desc)
 will **not change** coverage.
   > The diff coverage is `n/a`.
   
   [![Impacted file tree 
graph](https://codecov.io/gh/apache/incubator-airflow/pull/3813/graphs/tree.svg?width=650=WdLKlKHOAU=150=pr)](https://codecov.io/gh/apache/incubator-airflow/pull/3813?src=pr=tree)
   
   ```diff
   @@           Coverage Diff           @@
   ##           master    #3813   +/-   ##
   =======================================
     Coverage   77.41%   77.41%           
   =======================================
     Files         203      203           
     Lines       15810    15810           
   =======================================
     Hits        12239    12239           
     Misses       3571     3571           
   ```
   
   
   
   --
   
   [Continue to review full report at 
Codecov](https://codecov.io/gh/apache/incubator-airflow/pull/3813?src=pr=continue).
   > **Legend** - [Click here to learn 
more](https://docs.codecov.io/docs/codecov-delta)
   > `Δ = absolute <relative> (impact)`, `ø = not affected`, `? = missing data`
   > Powered by 
[Codecov](https://codecov.io/gh/apache/incubator-airflow/pull/3813?src=pr=footer).
 Last update 
[1801bae...5dcef17](https://codecov.io/gh/apache/incubator-airflow/pull/3813?src=pr=lastupdated).
 Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments).
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] codecov-io commented on issue #3813: [AIRFLOW-1998] Implemented DatabricksRunNowOperator for jobs/run-now …

2018-08-28 Thread GitBox
codecov-io commented on issue #3813: [AIRFLOW-1998] Implemented 
DatabricksRunNowOperator for jobs/run-now …
URL: 
https://github.com/apache/incubator-airflow/pull/3813#issuecomment-416533459
 
 
   # 
[Codecov](https://codecov.io/gh/apache/incubator-airflow/pull/3813?src=pr=h1)
 Report
   > Merging 
[#3813](https://codecov.io/gh/apache/incubator-airflow/pull/3813?src=pr=desc)
 into 
[master](https://codecov.io/gh/apache/incubator-airflow/commit/1801baefe44f361010c23e6ec4ee8b8569eab82d?src=pr=desc)
 will **not change** coverage.
   > The diff coverage is `n/a`.
   
   [![Impacted file tree 
graph](https://codecov.io/gh/apache/incubator-airflow/pull/3813/graphs/tree.svg?width=650=WdLKlKHOAU=150=pr)](https://codecov.io/gh/apache/incubator-airflow/pull/3813?src=pr=tree)
   
   ```diff
   @@           Coverage Diff           @@
   ##           master    #3813   +/-   ##
   =======================================
     Coverage   77.41%   77.41%           
   =======================================
     Files         203      203           
     Lines       15810    15810           
   =======================================
     Hits        12239    12239           
     Misses       3571     3571           
   ```
   
   
   
   --
   
   [Continue to review full report at 
Codecov](https://codecov.io/gh/apache/incubator-airflow/pull/3813?src=pr=continue).
   > **Legend** - [Click here to learn 
more](https://docs.codecov.io/docs/codecov-delta)
   > `Δ = absolute <relative> (impact)`, `ø = not affected`, `? = missing data`
   > Powered by 
[Codecov](https://codecov.io/gh/apache/incubator-airflow/pull/3813?src=pr=footer).
 Last update 
[1801bae...5dcef17](https://codecov.io/gh/apache/incubator-airflow/pull/3813?src=pr=lastupdated).
 Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments).
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] codecov-io edited a comment on issue #3813: [AIRFLOW-1998] Implemented DatabricksRunNowOperator for jobs/run-now …

2018-08-28 Thread GitBox
codecov-io edited a comment on issue #3813: [AIRFLOW-1998] Implemented 
DatabricksRunNowOperator for jobs/run-now …
URL: 
https://github.com/apache/incubator-airflow/pull/3813#issuecomment-416533459
 
 
   # 
[Codecov](https://codecov.io/gh/apache/incubator-airflow/pull/3813?src=pr=h1)
 Report
   > Merging 
[#3813](https://codecov.io/gh/apache/incubator-airflow/pull/3813?src=pr=desc)
 into 
[master](https://codecov.io/gh/apache/incubator-airflow/commit/1801baefe44f361010c23e6ec4ee8b8569eab82d?src=pr=desc)
 will **not change** coverage.
   > The diff coverage is `n/a`.
   
   [![Impacted file tree 
graph](https://codecov.io/gh/apache/incubator-airflow/pull/3813/graphs/tree.svg?width=650=WdLKlKHOAU=150=pr)](https://codecov.io/gh/apache/incubator-airflow/pull/3813?src=pr=tree)
   
   ```diff
   @@           Coverage Diff           @@
   ##           master    #3813   +/-   ##
   =======================================
     Coverage   77.41%   77.41%           
   =======================================
     Files         203      203           
     Lines       15810    15810           
   =======================================
     Hits        12239    12239           
     Misses       3571     3571           
   ```
   
   
   
   --
   
   [Continue to review full report at 
Codecov](https://codecov.io/gh/apache/incubator-airflow/pull/3813?src=pr=continue).
   > **Legend** - [Click here to learn 
more](https://docs.codecov.io/docs/codecov-delta)
   > `Δ = absolute <relative> (impact)`, `ø = not affected`, `? = missing data`
   > Powered by 
[Codecov](https://codecov.io/gh/apache/incubator-airflow/pull/3813?src=pr=footer).
 Last update 
[1801bae...5dcef17](https://codecov.io/gh/apache/incubator-airflow/pull/3813?src=pr=lastupdated).
 Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments).
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] codecov-io edited a comment on issue #3813: [AIRFLOW-1998] Implemented DatabricksRunNowOperator for jobs/run-now …

2018-08-28 Thread GitBox
codecov-io edited a comment on issue #3813: [AIRFLOW-1998] Implemented 
DatabricksRunNowOperator for jobs/run-now …
URL: 
https://github.com/apache/incubator-airflow/pull/3813#issuecomment-416533459
 
 
   # 
[Codecov](https://codecov.io/gh/apache/incubator-airflow/pull/3813?src=pr=h1)
 Report
   > Merging 
[#3813](https://codecov.io/gh/apache/incubator-airflow/pull/3813?src=pr=desc)
 into 
[master](https://codecov.io/gh/apache/incubator-airflow/commit/1801baefe44f361010c23e6ec4ee8b8569eab82d?src=pr=desc)
 will **not change** coverage.
   > The diff coverage is `n/a`.
   
   [![Impacted file tree 
graph](https://codecov.io/gh/apache/incubator-airflow/pull/3813/graphs/tree.svg?width=650=WdLKlKHOAU=150=pr)](https://codecov.io/gh/apache/incubator-airflow/pull/3813?src=pr=tree)
   
   ```diff
   @@           Coverage Diff           @@
   ##           master    #3813   +/-   ##
   =======================================
     Coverage   77.41%   77.41%           
   =======================================
     Files         203      203           
     Lines       15810    15810           
   =======================================
     Hits        12239    12239           
     Misses       3571     3571           
   ```
   
   
   
   --
   
   [Continue to review full report at 
Codecov](https://codecov.io/gh/apache/incubator-airflow/pull/3813?src=pr=continue).
   > **Legend** - [Click here to learn 
more](https://docs.codecov.io/docs/codecov-delta)
   > `Δ = absolute <relative> (impact)`, `ø = not affected`, `? = missing data`
   > Powered by 
[Codecov](https://codecov.io/gh/apache/incubator-airflow/pull/3813?src=pr=footer).
 Last update 
[1801bae...5dcef17](https://codecov.io/gh/apache/incubator-airflow/pull/3813?src=pr=lastupdated).
 Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments).
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[jira] [Commented] (AIRFLOW-1998) Implement Databricks Operator for jobs/run-now endpoint

2018-08-28 Thread Israel Knight (JIRA)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-1998?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16594775#comment-16594775
 ] 

Israel Knight commented on AIRFLOW-1998:


Hi friend! Looks like you and I have similar needs. I didn't see your reply 
until just now. I was just delighted there was an existing issue for a problem 
we had and that perhaps I was helping someone!

We needed this functionality ASAP at my company last week, so I had to implement it. I conferred with Databricks on their preferred implementation; they said to make a new operator and extract the shared functions to module-level functions. I submitted a PR: 
[https://github.com/apache/incubator-airflow/pull/3813]

I await their feedback on my PR. (yours too!)

I'm also down for collaborating on remaining endpoint operators we might both 
need if you want! :)

 

> Implement Databricks Operator for jobs/run-now endpoint
> ---
>
> Key: AIRFLOW-1998
> URL: https://issues.apache.org/jira/browse/AIRFLOW-1998
> Project: Apache Airflow
>  Issue Type: Improvement
>  Components: hooks, operators
>Affects Versions: 1.9.0
>Reporter: Diego Rabatone Oliveira
>Assignee: Israel Knight
>Priority: Major
>
> Implement a Operator to deal with Databricks '2.0/jobs/run-now' API Endpoint.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

