[GitHub] [airflow] codecov-io commented on issue #4949: [AIRFLOW-1557][WIP] Backfill respect pool limit

2019-03-20 Thread GitBox
codecov-io commented on issue #4949: [AIRFLOW-1557][WIP] Backfill respect pool 
limit
URL: https://github.com/apache/airflow/pull/4949#issuecomment-475117194
 
 
   # [Codecov](https://codecov.io/gh/apache/airflow/pull/4949?src=pr=h1) 
Report
   > Merging 
[#4949](https://codecov.io/gh/apache/airflow/pull/4949?src=pr=desc) into 
[master](https://codecov.io/gh/apache/airflow/commit/bac8244e0a177f3e016b08d0ddbe3e8696424982?src=pr=desc)
 will **increase** coverage by `0.01%`.
   > The diff coverage is `85.71%`.
   
   [![Impacted file tree 
graph](https://codecov.io/gh/apache/airflow/pull/4949/graphs/tree.svg?width=650=WdLKlKHOAU=150=pr)](https://codecov.io/gh/apache/airflow/pull/4949?src=pr=tree)
   
   ```diff
   @@            Coverage Diff            @@
   ##           master   #4949      +/-   ##
   =========================================
   + Coverage   75.58%   75.6%     +0.01%
   =========================================
     Files         454     454
     Lines       29209   29241       +32
   =========================================
   + Hits        22078   22108       +30
   - Misses       7131    7133        +2
   ```
   
   
   | [Impacted Files](https://codecov.io/gh/apache/airflow/pull/4949?src=pr=tree) | Coverage Δ | |
   |---|---|---|
   | [airflow/jobs.py](https://codecov.io/gh/apache/airflow/pull/4949/diff?src=pr=tree#diff-YWlyZmxvdy9qb2JzLnB5) | `77.09% <85.71%> (+0.1%)` | :arrow_up: |
   | [airflow/models/\_\_init\_\_.py](https://codecov.io/gh/apache/airflow/pull/4949/diff?src=pr=tree#diff-YWlyZmxvdy9tb2RlbHMvX19pbml0X18ucHk=) | `92.92% <0%> (+0.03%)` | :arrow_up: |
   | [airflow/models/connection.py](https://codecov.io/gh/apache/airflow/pull/4949/diff?src=pr=tree#diff-YWlyZmxvdy9tb2RlbHMvY29ubmVjdGlvbi5weQ==) | `65.9% <0%> (+1.83%)` | :arrow_up: |
   
   --
   
   [Continue to review full report at 
Codecov](https://codecov.io/gh/apache/airflow/pull/4949?src=pr=continue).
   > **Legend** - [Click here to learn 
more](https://docs.codecov.io/docs/codecov-delta)
   > `Δ = absolute <relative> (impact)`, `ø = not affected`, `? = missing data`
   > Powered by 
[Codecov](https://codecov.io/gh/apache/airflow/pull/4949?src=pr=footer). 
Last update 
[bac8244...1a94cbe](https://codecov.io/gh/apache/airflow/pull/4949?src=pr=lastupdated).
 Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments).
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [airflow] mik-laj commented on a change in pull request #4788: [AIRFLOW-3811][3/3] Add automatic generation of API Reference

2019-03-20 Thread GitBox
mik-laj commented on a change in pull request #4788: [AIRFLOW-3811][3/3] Add 
automatic generation of API Reference
URL: https://github.com/apache/airflow/pull/4788#discussion_r267626989
 
 

 ##
 File path: setup.py
 ##
 @@ -160,11 +160,12 @@ def write_version(filename=os.path.join(*['airflow',
 databricks = ['requests>=2.20.0, <3']
 datadog = ['datadog>=0.14.0']
 doc = [
-    'sphinx>=1.2.3',
     'sphinx-argparse>=0.1.13',
+    'sphinx-autoapi>=0.7.1',
+    'Sphinx-PyPI-upload>=0.2.1',
     'sphinx-rtd-theme>=0.1.6',
+    'sphinx>=1.2.3',
     'sphinxcontrib-httpdomain>=1.7.0',
-    'Sphinx-PyPI-upload>=0.2.1'
 
 Review comment:
   I wanna introduce alphabetical order. 
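   For context, "alphabetical order" for mixed-case requirement names like `Sphinx-PyPI-upload` usually means a case-insensitive sort. A quick sketch (the list is copied from the diff above; the PR's new ordering matches a case-insensitive sort of the requirement strings):

   ```python
   # doc extras in the order the diff above introduces them
   doc = [
       'sphinx-argparse>=0.1.13',
       'sphinx-autoapi>=0.7.1',
       'Sphinx-PyPI-upload>=0.2.1',
       'sphinx-rtd-theme>=0.1.6',
       'sphinx>=1.2.3',
       'sphinxcontrib-httpdomain>=1.7.0',
   ]
   # Lowercasing the sort key makes 'Sphinx-PyPI-upload' sort among the
   # lowercase 'sphinx-*' entries instead of before all of them.
   assert doc == sorted(doc, key=str.lower)
   ```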




[GitHub] [airflow] milton0825 commented on a change in pull request #4889: [AIRFLOW-4057] statsd should escape invalid characters

2019-03-20 Thread GitBox
milton0825 commented on a change in pull request #4889: [AIRFLOW-4057] statsd 
should escape invalid characters
URL: https://github.com/apache/airflow/pull/4889#discussion_r267625586
 
 

 ##
 File path: airflow/stats.py
 ##
 @@ -0,0 +1,124 @@
+# -*- coding: utf-8 -*-
+#
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+
+
+from functools import wraps
+import logging
+import socket
+import string
+import textwrap
+from typing import Any
+
+from airflow import configuration as conf
+from airflow.exceptions import InvalidStatsNameException
+
+log = logging.getLogger(__name__)
+
+
+class DummyStatsLogger(object):
+    @classmethod
+    def incr(cls, stat, count=1, rate=1):
+        pass
+
+    @classmethod
+    def decr(cls, stat, count=1, rate=1):
+        pass
+
+    @classmethod
+    def gauge(cls, stat, value, rate=1, delta=False):
+        pass
+
+    @classmethod
+    def timing(cls, stat, dt):
+        pass
+
+
+# Only characters in the character set are considered valid
+# for the stat_name if stat_name_default_handler is used.
+ALLOWED_CHARACTERS = set(string.ascii_letters + string.digits + '_.-')
+
+
+def stat_name_default_handler(stat_name, max_length=250):
+    if not isinstance(stat_name, str):
+        raise InvalidStatsNameException('The stat_name has to be a string')
+    if len(stat_name) > max_length:
+        raise InvalidStatsNameException(textwrap.dedent("""\
+            The stat_name ({stat_name}) has to be less than {max_length} characters.
+        """.format(stat_name=stat_name, max_length=max_length)))
+    if not all((c in ALLOWED_CHARACTERS) for c in stat_name):
+        raise InvalidStatsNameException(textwrap.dedent("""\
+            The stat name ({stat_name}) has to be composed with characters in
+            {allowed_characters}.
+        """.format(stat_name=stat_name,
+                   allowed_characters=ALLOWED_CHARACTERS)))
+    return stat_name
+
+
+def validate_stat(f):
+    @wraps(f)
+    def wrapper(stat, *args, **kwargs):
+        try:
+            from airflow.plugins_manager import stat_name_handler
+            if stat_name_handler:
+                handle_stat_name_func = stat_name_handler
+            else:
+                handle_stat_name_func = stat_name_default_handler
+            stat_name = handle_stat_name_func(stat)
+        except Exception as err:
+            log.warning('Invalid stat name: {stat}.'.format(stat=stat), err)
+            return
 
 Review comment:
   @ashb any comments?
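   For reviewers following along, here is a standalone sketch of the validation logic in the diff above, substituting `ValueError` for Airflow's `InvalidStatsNameException` and a plain message for the `textwrap.dedent` formatting, so it runs outside Airflow:

   ```python
   import string

   # Same allowed set as the diff: letters, digits, underscore, dot, dash.
   ALLOWED_CHARACTERS = set(string.ascii_letters + string.digits + '_.-')

   def stat_name_default_handler(stat_name, max_length=250):
       # Reject non-strings, overlong names, and names containing
       # characters outside the allowed set; return the name otherwise.
       if not isinstance(stat_name, str):
           raise ValueError('The stat_name has to be a string')
       if len(stat_name) > max_length:
           raise ValueError('The stat_name has to be less than %d characters' % max_length)
       if not all(c in ALLOWED_CHARACTERS for c in stat_name):
           raise ValueError('The stat_name contains invalid characters')
       return stat_name

   print(stat_name_default_handler('dag_run.duration.success'))  # valid, returned unchanged
   ```

   A name with spaces, such as `'bad name'`, would raise here rather than being sent to statsd with characters the backend cannot digest.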




[GitHub] [airflow] kaxil commented on issue #4948: [AIRFLOW-4129] Escape HTML in generated tooltips (#4944)

2019-03-20 Thread GitBox
kaxil commented on issue #4948: [AIRFLOW-4129] Escape HTML in generated 
tooltips (#4944)
URL: https://github.com/apache/airflow/pull/4948#issuecomment-475090903
 
 
   Agree with @mik-laj 




[GitHub] [airflow] XD-DENG commented on a change in pull request #4919: [AIRFLOW-4093] Throw exception if job failed or cancelled or retry too many times

2019-03-20 Thread GitBox
XD-DENG commented on a change in pull request #4919: [AIRFLOW-4093] Throw 
exception if job failed or cancelled or retry too many times
URL: https://github.com/apache/airflow/pull/4919#discussion_r267606803
 
 

 ##
 File path: airflow/contrib/operators/aws_athena_operator.py
 ##
 @@ -74,7 +78,17 @@ def execute(self, context):
         self.result_configuration['OutputLocation'] = self.output_location
         self.query_execution_id = self.hook.run_query(self.query, self.query_execution_context,
                                                       self.result_configuration, self.client_request_token)
-        self.hook.poll_query_status(self.query_execution_id)
+        query_status = self.hook.poll_query_status(self.query_execution_id, self.max_tries)
+
+        if query_status in AWSAthenaHook.FAILURE_STATES:
+            raise Exception(
+                'Final state of Athena job is {}, query_execution_id is {}.'
+                .format(query_status, self.query_execution_id))
+        elif not query_status or query_status in AWSAthenaHook.INTERMEDIATE_STATES:
+            raise Exception(
+                'Final state of Athena job is {}. \
+ Max tries of poll status exceeded, query_execution_id is {}.'
 
 Review comment:
   Hi @bryanyang0528, the line-breaking here will cause a minor display issue. 
   
   You can try to run
   ```python
   raise Exception(
   'Final state of Athena job is {}. \
   Max tries of poll status exceeded, query_execution_id is {}.'
   .format("AAA", "BBB"))
   ```
   
   What you will see is 
   ```
   Exception: Final state of Athena job is AAA. Max tries of poll status exceeded, query_execution_id is BBB.
   ```
   in which we have unnecessary extra spaces between the two sentences.
   
   Please change how you break the line. Thanks.
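   One hypothetical way to break the line without leaking indentation into the message (not necessarily what the author will choose) is implicit concatenation of adjacent string literals:

   ```python
   # Adjacent string literals are joined at compile time, so the source
   # indentation of the second literal never ends up in the message.
   msg = ('Final state of Athena job is {}. '
          'Max tries of poll status exceeded, query_execution_id is {}.'
          .format("AAA", "BBB"))
   print(msg)
   # → Final state of Athena job is AAA. Max tries of poll status exceeded, query_execution_id is BBB.
   ```

   Unlike a backslash continuation, this keeps the code indented normally while producing exactly one space between the sentences.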




[GitHub] [airflow] cixuuz commented on a change in pull request #4788: [AIRFLOW-3811][3/3] Add automatic generation of API Reference

2019-03-20 Thread GitBox
cixuuz commented on a change in pull request #4788: [AIRFLOW-3811][3/3] Add 
automatic generation of API Reference
URL: https://github.com/apache/airflow/pull/4788#discussion_r267597950
 
 

 ##
 File path: setup.py
 ##
 @@ -160,11 +160,12 @@ def write_version(filename=os.path.join(*['airflow',
 databricks = ['requests>=2.20.0, <3']
 datadog = ['datadog>=0.14.0']
 doc = [
-    'sphinx>=1.2.3',
     'sphinx-argparse>=0.1.13',
+    'sphinx-autoapi>=0.7.1',
+    'Sphinx-PyPI-upload>=0.2.1',
     'sphinx-rtd-theme>=0.1.6',
+    'sphinx>=1.2.3',
     'sphinxcontrib-httpdomain>=1.7.0',
-    'Sphinx-PyPI-upload>=0.2.1'
 
 Review comment:
   sphinx and Sphinx-PyPI-upload could keep the same order; it looks like no 
changes happened to them. 




[jira] [Resolved] (AIRFLOW-2806) test_mark_success_no_kill test breaks intermittently on CI

2019-03-20 Thread Kaxil Naik (JIRA)


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-2806?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Kaxil Naik resolved AIRFLOW-2806.
-
Resolution: Cannot Reproduce

> test_mark_success_no_kill test breaks intermittently on CI
> --
>
> Key: AIRFLOW-2806
> URL: https://issues.apache.org/jira/browse/AIRFLOW-2806
> Project: Apache Airflow
>  Issue Type: Bug
>Reporter: Taylor Edmiston
>Assignee: Taylor Edmiston
>Priority: Minor
>
> The test_mark_success_no_kill test is breaking intermittently on the CI for 
> some versions of Python and some databases, particularly Python 3.5 for both 
> PostgreSQL and MySQL.
> A traceback of the error is 
> ([link|https://travis-ci.org/apache/incubator-airflow/jobs/407522994#L5668-L5701]):
> {code:java}
> 10) ERROR: test_mark_success_no_kill (tests.transplant_class..C)
> --
>  Traceback (most recent call last):
>  tests/jobs.py line 1116 in test_mark_success_no_kill
>  ti.refresh_from_db()
>  airflow/utils/db.py line 74 in wrapper
>  return func(*args, **kwargs)
>  /opt/python/3.5.5/lib/python3.5/contextlib.py line 66 in __exit__
>  next(self.gen)
>  airflow/utils/db.py line 45 in create_session
>  session.commit()
>  
> .tox/py35-backend_postgres/lib/python3.5/site-packages/sqlalchemy/orm/session.py
>  line 927 in commit
>  self.transaction.commit()
>  
> .tox/py35-backend_postgres/lib/python3.5/site-packages/sqlalchemy/orm/session.py
>  line 471 in commit
>  t[1].commit()
>  
> .tox/py35-backend_postgres/lib/python3.5/site-packages/sqlalchemy/engine/base.py
>  line 1632 in commit
>  self._do_commit()
>  
> .tox/py35-backend_postgres/lib/python3.5/site-packages/sqlalchemy/engine/base.py
>  line 1663 in _do_commit
>  self.connection._commit_impl()
>  
> .tox/py35-backend_postgres/lib/python3.5/site-packages/sqlalchemy/engine/base.py
>  line 723 in _commit_impl
>  self._handle_dbapi_exception(e, None, None, None, None)
>  
> .tox/py35-backend_postgres/lib/python3.5/site-packages/sqlalchemy/engine/base.py
>  line 1402 in _handle_dbapi_exception
>  exc_info
>  
> .tox/py35-backend_postgres/lib/python3.5/site-packages/sqlalchemy/util/compat.py
>  line 203 in raise_from_cause
>  reraise(type(exception), exception, tb=exc_tb, cause=cause)
>  
> .tox/py35-backend_postgres/lib/python3.5/site-packages/sqlalchemy/util/compat.py
>  line 186 in reraise
>  raise value.with_traceback(tb)
>  
> .tox/py35-backend_postgres/lib/python3.5/site-packages/sqlalchemy/engine/base.py
>  line 721 in _commit_impl
>  self.engine.dialect.do_commit(self.connection)
>  
> .tox/py35-backend_postgres/lib/python3.5/site-packages/sqlalchemy/engine/default.py
>  line 443 in do_commit
>  dbapi_connection.commit()
>  OperationalError: (psycopg2.OperationalError) server closed the connection 
> unexpectedly
>  This probably means the server terminated abnormally{code}
> It seems to be erroring out on trying to 
> [commit|http://initd.org/psycopg/docs/connection.html#connection.commit] the 
> pending transaction to the database, possibly because the connection has been 
> closed. What's weird is that this line is already in a try-except block 
> catching all exceptions, but I think it's somehow not entering the except 
> clause.
> [https://github.com/apache/incubator-airflow/blob/f3b6b60c4809afdde916e8982a300f942f26109b/airflow/utils/db.py#L36-L50]
> Note: This is a follow up to AIRFLOW-2801 ([PR 
> #3642|https://github.com/apache/incubator-airflow/pull/3642]) which provided 
> a short-term solution by skipping the flaky test.
>  
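The helper linked in the report follows the usual commit-or-rollback contextmanager pattern. A minimal runnable sketch with a stand-in session object (the `FakeSession` class is illustrative, not Airflow's actual implementation):

```python
from contextlib import contextmanager

class FakeSession:
    """Stand-in for a SQLAlchemy session that just records calls."""
    def __init__(self):
        self.events = []
    def commit(self):
        self.events.append('commit')
    def rollback(self):
        self.events.append('rollback')
    def close(self):
        self.events.append('close')

@contextmanager
def create_session(session):
    # Commit on success, roll back and re-raise on any exception.
    # Note that commit() runs inside the try block, so a connection
    # dropped at commit time should still reach the except clause --
    # which is why the behaviour described above is surprising.
    try:
        yield session
        session.commit()
    except Exception:
        session.rollback()
        raise
    finally:
        session.close()

s = FakeSession()
with create_session(s):
    pass
print(s.events)  # → ['commit', 'close']
```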



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (AIRFLOW-3458) Refactor: Move Connection out of models.py

2019-03-20 Thread ASF subversion and git services (JIRA)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-3458?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16797663#comment-16797663
 ] 

ASF subversion and git services commented on AIRFLOW-3458:
--

Commit 3899027251532beecbfd8d4afe8886ae43fd9d1a in airflow's branch 
refs/heads/v1-10-stable from Ash Berlin-Taylor
[ https://gitbox.apache.org/repos/asf?p=airflow.git;h=3899027 ]

[AIRFLOW-3458] Deprecation path for moving models.Connection


> Refactor: Move Connection out of models.py
> --
>
> Key: AIRFLOW-3458
> URL: https://issues.apache.org/jira/browse/AIRFLOW-3458
> Project: Apache Airflow
>  Issue Type: Task
>  Components: models
>Affects Versions: 1.10.1
>Reporter: Fokko Driesprong
>Assignee: Bas Harenslak
>Priority: Major
> Fix For: 1.10.3
>
>






[jira] [Commented] (AIRFLOW-3584) Switch dags index view to orm dags

2019-03-20 Thread ASF subversion and git services (JIRA)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-3584?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16797669#comment-16797669
 ] 

ASF subversion and git services commented on AIRFLOW-3584:
--

Commit 6bae99bd77a513ed240796ae121d931588dee282 in airflow's branch 
refs/heads/v1-10-stable from Joshua Carp
[ https://gitbox.apache.org/repos/asf?p=airflow.git;h=6bae99b ]

[AIRFLOW-3584] Use ORM DAGs for index view. (#4390)

* [AIRFLOW-3584] Use ORM DAGs for index view.

* Serialize schedule interval to json rather than pickle.


> Switch dags index view to orm dags
> --
>
> Key: AIRFLOW-3584
> URL: https://issues.apache.org/jira/browse/AIRFLOW-3584
> Project: Apache Airflow
>  Issue Type: Improvement
>Reporter: Josh Carp
>Assignee: Josh Carp
>Priority: Minor
> Fix For: 2.0.0
>
>
> Part of AIRFLOW-3562.





[jira] [Commented] (AIRFLOW-3606) Fix Flake8 test & Fix the Flake8 errors introduced since Flake8 test was broken

2019-03-20 Thread ASF subversion and git services (JIRA)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-3606?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16797672#comment-16797672
 ] 

ASF subversion and git services commented on AIRFLOW-3606:
--

Commit 58d85a1a8f8f5cebf149c46405541ae5f0712bd5 in airflow's branch 
refs/heads/v1-10-stable from Xiaodong
[ https://gitbox.apache.org/repos/asf?p=airflow.git;h=58d85a1 ]

[AIRFLOW-3606] Fix Flake8 test & fix the Flake8 errors introduced since Flake8 
test was broken (#4415)

The flake8 test in the Travis CI was broken since 
https://github.com/apache/incubator-airflow/pull/4361
(https://github.com/apache/incubator-airflow/commit/7a6acbf5b343e4a6895d1cc8af75ecc02b4fd0e8
 )

And some Flake8 errors (code style/quality issues, found in 10 files) were 
introduced since the flake8 test was broken.


>  Fix Flake8 test & Fix the Flake8 errors introduced since Flake8 test was 
> broken
> 
>
> Key: AIRFLOW-3606
> URL: https://issues.apache.org/jira/browse/AIRFLOW-3606
> Project: Apache Airflow
>  Issue Type: Bug
>Reporter: Xiaodong DENG
>Assignee: Xiaodong DENG
>Priority: Critical
> Fix For: 2.0.0
>
>
> The flake8 test in the Travis CI was broken since 
> [https://github.com/apache/incubator-airflow/pull/4361] 
> ([https://github.com/apache/incubator-airflow/commit/7a6acbf5b343e4a6895d1cc8af75ecc02b4fd0e8]
>  )
>  
> And some Flake8 errors (code style/quality issues, found in 10 files) were 
> introduced since the flake8 test was broken.





[jira] [Commented] (AIRFLOW-3561) Improve some views

2019-03-20 Thread ASF subversion and git services (JIRA)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-3561?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16797666#comment-16797666
 ] 

ASF subversion and git services commented on AIRFLOW-3561:
--

Commit ad7a70297b639142e0a89cfb89a116371d90e605 in airflow's branch 
refs/heads/v1-10-stable from Peter van 't Hof
[ https://gitbox.apache.org/repos/asf?p=airflow.git;h=ad7a702 ]

[AIRFLOW-3561] Improve queries (#4368)

* improve queries

* Adding field to the database

* Set length of field

* remove dagbag use in xcom call

* Fixing typo

* Adding test

* Remove default_view

* fixing test

* rename var

* Fixing rbac dag_stats

* Fixing rbac task_stats

* Fixing rbac code

* Fixing rbac xcom

* Fixing template

* Fixing default view call

* Added timezone to DagModel

* Fixing timezone


> Improve some views
> --
>
> Key: AIRFLOW-3561
> URL: https://issues.apache.org/jira/browse/AIRFLOW-3561
> Project: Apache Airflow
>  Issue Type: Improvement
>  Components: webserver
>Reporter: Peter van 't Hof
>Assignee: Peter van 't Hof
>Priority: Minor
> Fix For: 2.0.0
>
>
> Some views interact with the dag bag although it is not needed for the 
> query.





[jira] [Commented] (AIRFLOW-3458) Refactor: Move Connection out of models.py

2019-03-20 Thread ASF subversion and git services (JIRA)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-3458?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16797662#comment-16797662
 ] 

ASF subversion and git services commented on AIRFLOW-3458:
--

Commit d4b9c069f8caf34b4f1dee40dafd963690edd954 in airflow's branch 
refs/heads/v1-10-stable from BasPH
[ https://gitbox.apache.org/repos/asf?p=airflow.git;h=d4b9c06 ]

[AIRFLOW-3458] Move models.Connection into separate file (#4335)


> Refactor: Move Connection out of models.py
> --
>
> Key: AIRFLOW-3458
> URL: https://issues.apache.org/jira/browse/AIRFLOW-3458
> Project: Apache Airflow
>  Issue Type: Task
>  Components: models
>Affects Versions: 1.10.1
>Reporter: Fokko Driesprong
>Assignee: Bas Harenslak
>Priority: Major
> Fix For: 1.10.3
>
>






[jira] [Commented] (AIRFLOW-2821) Doc "Plugins" can be improved

2019-03-20 Thread ASF subversion and git services (JIRA)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-2821?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16797667#comment-16797667
 ] 

ASF subversion and git services commented on AIRFLOW-2821:
--

Commit 628548bd8a6433d1b33d621574ce9f36bab94b85 in airflow's branch 
refs/heads/v1-10-stable from Xiaodong
[ https://gitbox.apache.org/repos/asf?p=airflow.git;h=628548b ]

[AIRFLOW-2821] Refine Doc "Plugins" (#3664)



> Doc "Plugins" can be improved
> -
>
> Key: AIRFLOW-2821
> URL: https://issues.apache.org/jira/browse/AIRFLOW-2821
> Project: Apache Airflow
>  Issue Type: Improvement
>  Components: Documentation
>Reporter: Xiaodong DENG
>Assignee: Xiaodong DENG
>Priority: Minor
>
> The doc "Plugins" is not very clear in some respects.
> Some things that users need to take note of are not mentioned.





[jira] [Commented] (AIRFLOW-2548) Output Plugin Import Errors to WebUI

2019-03-20 Thread ASF subversion and git services (JIRA)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-2548?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16797668#comment-16797668
 ] 

ASF subversion and git services commented on AIRFLOW-2548:
--

Commit 95cfffdaef33b85531f7b856fe5c8da643138846 in airflow's branch 
refs/heads/v1-10-stable from Jimmy Cao
[ https://gitbox.apache.org/repos/asf?p=airflow.git;h=95cfffd ]

[AIRFLOW-2548] Output plugin import errors to web UI (#3930)



> Output Plugin Import Errors to WebUI
> 
>
> Key: AIRFLOW-2548
> URL: https://issues.apache.org/jira/browse/AIRFLOW-2548
> Project: Apache Airflow
>  Issue Type: Bug
>Reporter: Andy Cooper
>Priority: Major
> Fix For: 1.10.3
>
>
> All,
>  
> We currently output all DAG import errors to the webUI. I propose we do the 
> same with plugin errors as well. This will provide a better user experience 
> by bubbling up all errors to the webUI instead of hiding them in stdOut.
>  
> Proposal...
>  * Extend models.ImportError to have a "type" field to distinguish from error 
> types.
>  * Prevent class SchedulerJob methods from clearing out and pulling from 
> models.ImportError if type = 'plugin'
>  * Create new ImportError records in plugins_manager.py for each plugin that 
> fails to import
>  * Prompt user in views.py with plugin ImportErrors - specifying that they 
> need to fix and restart webserver to resolve.
>  
> Does this seem reasonable to everyone? I'd be interested in taking on this 
> work if needed





[jira] [Commented] (AIRFLOW-3584) Switch dags index view to orm dags

2019-03-20 Thread ASF subversion and git services (JIRA)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-3584?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16797670#comment-16797670
 ] 

ASF subversion and git services commented on AIRFLOW-3584:
--

Commit 6bae99bd77a513ed240796ae121d931588dee282 in airflow's branch 
refs/heads/v1-10-stable from Joshua Carp
[ https://gitbox.apache.org/repos/asf?p=airflow.git;h=6bae99b ]

[AIRFLOW-3584] Use ORM DAGs for index view. (#4390)

* [AIRFLOW-3584] Use ORM DAGs for index view.

* Serialize schedule interval to json rather than pickle.


> Switch dags index view to orm dags
> --
>
> Key: AIRFLOW-3584
> URL: https://issues.apache.org/jira/browse/AIRFLOW-3584
> Project: Apache Airflow
>  Issue Type: Improvement
>Reporter: Josh Carp
>Assignee: Josh Carp
>Priority: Minor
> Fix For: 2.0.0
>
>
> Part of AIRFLOW-3562.





[jira] [Commented] (AIRFLOW-3600) Remove dagBag from trigger call

2019-03-20 Thread ASF subversion and git services (JIRA)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-3600?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16797665#comment-16797665
 ] 

ASF subversion and git services commented on AIRFLOW-3600:
--

Commit 8271250a7e2cde3c1d7f193ca41bd2595c717b4e in airflow's branch 
refs/heads/v1-10-stable from Peter van 't Hof
[ https://gitbox.apache.org/repos/asf?p=airflow.git;h=8271250 ]

[AIRFLOW-3600] Remove dagbag from trigger (#4407)

* Remove dagbag from trigger call

* Adding fix to rbac

* empty commit

* Added create_dagrun to DagModel

* Adding testing to /trigger calls

* Make session a class var


> Remove dagBag from trigger call
> ---
>
> Key: AIRFLOW-3600
> URL: https://issues.apache.org/jira/browse/AIRFLOW-3600
> Project: Apache Airflow
>  Issue Type: Improvement
>  Components: webserver
>Affects Versions: 1.10.1
>Reporter: Peter van 't Hof
>Assignee: Peter van 't Hof
>Priority: Minor
> Fix For: 1.10.3
>
>






[jira] [Commented] (AIRFLOW-3543) rescheduled tasks block DAG deletion

2019-03-20 Thread ASF subversion and git services (JIRA)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-3543?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16797671#comment-16797671
 ] 

ASF subversion and git services commented on AIRFLOW-3543:
--

Commit 1e8ab63ffd03612c282b793cc740591cd8aeefd7 in airflow's branch 
refs/heads/v1-10-stable from Stefan Seelmann
[ https://gitbox.apache.org/repos/asf?p=airflow.git;h=1e8ab63 ]

AIRFLOW-3543: Fix deletion of DAG with rescheduled tasks (#4646)


> rescheduled tasks block DAG deletion
> 
>
> Key: AIRFLOW-3543
> URL: https://issues.apache.org/jira/browse/AIRFLOW-3543
> Project: Apache Airflow
>  Issue Type: Bug
>  Components: cli, database
> Environment: postgres 10 database
>Reporter: Christopher
>Assignee: Stefan Seelmann
>Priority: Critical
> Fix For: 1.10.3
>
>
> This applies to current master branch after 
> [AIRFLOW-2747|https://github.com/apache/incubator-airflow/commit/dc59d7e2750aa90e099afad8689f2646f18f92a6]
>  was merged. 
> Once a sensor task is rescheduled, the task cannot be deleted from the DB due 
> to a foreign key constraint. This prevents deletion of tasks and DAGS. This 
> occurs regardless of whether the DAG is still running or whether the sensor 
> is actually rescheduled to run in the future or not (i.e. the task may complete 
> successfully but its entry still resides as a row in the task_reschedule 
> table).
>  
> I am running a postgres-backed airflow instance.
>  
> {{Traceback (most recent call last):}}
> {{ File "/usr/local/lib/python3.6/site-packages/sqlalchemy/engine/base.py", 
> line 1193, in _execute_context}}
> {{context)}}
> {{File "/usr/local/lib/python3.6/site-packages/sqlalchemy/engine/default.py", 
> line 509, in do_execute}}
> {{cursor.execute(statement, parameters)}}
> {{psycopg2.IntegrityError: update or delete on table "task_instance" violates 
> foreign key constraint "task_reschedule_dag_task_date_fkey" on table 
> "task_reschedule"}}
> {{DETAIL: Key (task_id, dag_id, execution_date)=(check_images_ready_11504, 
> flight5105_v0.0.1, 2018-12-13 00:00:00+00) is still referenced from table 
> "task_reschedule".}}
> {{sqlalchemy.exc.IntegrityError: (psycopg2.IntegrityError) update or delete 
> on table "task_instance" violates foreign key constraint 
> "task_reschedule_dag_task_date_fkey" on table "task_reschedule"}}
>  





[jira] [Commented] (AIRFLOW-3713) Updating.md should be updated with information about Optional Project Id in GCP operators

2019-03-20 Thread ASF subversion and git services (JIRA)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-3713?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16797664#comment-16797664
 ] 

ASF subversion and git services commented on AIRFLOW-3713:
--

Commit ad80e69535b1e1c96416cd2f376fb36de1915d46 in airflow's branch 
refs/heads/v1-10-stable from Jarek Potiuk
[ https://gitbox.apache.org/repos/asf?p=airflow.git;h=ad80e69 ]

[AIRFLOW-3713] Updated documentation for GCP optional project_id (#4541)


> Updating.md should be updated with information about Optional Project Id in 
> GCP operators 
> --
>
> Key: AIRFLOW-3713
> URL: https://issues.apache.org/jira/browse/AIRFLOW-3713
> Project: Apache Airflow
>  Issue Type: Improvement
>Reporter: Jarek Potiuk
>Priority: Minor
>






[GitHub] [airflow] mik-laj commented on a change in pull request #4787: [AIRFLOW-3967] Extract Jinja directive from Javascript

2019-03-20 Thread GitBox
mik-laj commented on a change in pull request #4787: [AIRFLOW-3967] Extract 
Jinja directive from Javascript
URL: https://github.com/apache/airflow/pull/4787#discussion_r267585002
 
 

 ##
 File path: airflow/www/templates/airflow/dag.html
 ##
 @@ -290,6 +294,7 @@ 
 {% endblock %}
 {% block tail %}
   {{ super() }}
+  
 
 Review comment:
   I guess you would like me to apply a solution similar to the one presented in 
PR #4950, but in my opinion that is not the right solution. It creates one 
big file that contains the major part of the JS code. I would like to divide the 
code into smaller parts and mark the requirements for each JS snippet to make 
future refactoring easier.




[GitHub] [airflow] mans2singh commented on issue #4887: [AIRFLOW-4055] Add AWS SQS Sensor

2019-03-20 Thread GitBox
mans2singh commented on issue #4887: [AIRFLOW-4055] Add AWS SQS Sensor
URL: https://github.com/apache/airflow/pull/4887#issuecomment-475063699
 
 
   @mik-laj - I've updated the code to reflect your recommendations (added a 
default conn id and moved the hook to the poke method). Please let me know if you 
have any additional recommendations. Thanks




[GitHub] [airflow] mik-laj commented on a change in pull request #4787: [AIRFLOW-3967] Extract Jinja directive from Javascript

2019-03-20 Thread GitBox
mik-laj commented on a change in pull request #4787: [AIRFLOW-3967] Extract 
Jinja directive from Javascript
URL: https://github.com/apache/airflow/pull/4787#discussion_r267573261
 
 

 ##
 File path: airflow/www/templates/airflow/dag.html
 ##
 @@ -290,6 +294,7 @@ 
 {% endblock %}
 {% block tail %}
   {{ super() }}
+  
 
 Review comment:
   I would prefer to deal with this problem in a separate PR. Once the JS code 
is in separate files, it will be possible to introduce mechanisms that are much 
more efficient at loading the JS code. Currently, it is not possible to 
implement a solution that solves the problem completely and of which I can be 
100% sure. I could make another change depending on this in a separate PR, but 
that creates another chain of PRs. Already in the case of the CSS code we are 
blocked from making changes, because introducing new changes involves fixing 
the old problems first.
   
   I could introduce your solution now, but it would complicate code that is 
not in good condition.
   
   It is worth noting that if a reference to the same JS file appears twice in 
the HTML code, the file is only loaded once. This is a simple and easy module 
loader. Seeing the dependency twice in the HTML code is not a bad thing; it 
means that a given fragment is needed in two places. The browser guarantees, 
automatically and reliably, that the code will not be executed twice. This is 
not bad practice, but a simple way to ensure that the module is loaded.




[GitHub] [airflow] GrantSheehan commented on issue #4553: [AIRFLOW-3541] Add Avro logical type conversion to bigquery hook

2019-03-20 Thread GitBox
GrantSheehan commented on issue #4553: [AIRFLOW-3541] Add Avro logical type 
conversion to bigquery hook
URL: https://github.com/apache/airflow/pull/4553#issuecomment-475052592
 
 
   Any reason this is languishing? Anything I can do to help get it merged?




[GitHub] [airflow] mik-laj commented on a change in pull request #4787: [AIRFLOW-3967] Extract Jinja directive from Javascript

2019-03-20 Thread GitBox
mik-laj commented on a change in pull request #4787: [AIRFLOW-3967] Extract 
Jinja directive from Javascript
URL: https://github.com/apache/airflow/pull/4787#discussion_r267568275
 
 

 ##
 File path: airflow/www/templates/airflow/dag.html
 ##
 @@ -290,6 +294,7 @@ 
 {% endblock %}
 {% block tail %}
   {{ super() }}
+  
 
 Review comment:
   Currently, the `tail` and `tail_js` blocks are not used correctly in our 
application, so introducing your suggestion could be problematic: it is 
difficult to predict the order in which items are loaded.




[jira] [Updated] (AIRFLOW-4134) "DB connection invalidated" warning at every zombie check

2019-03-20 Thread Daniel Standish (JIRA)


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-4134?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Daniel Standish updated AIRFLOW-4134:
-
Description: 
With 1.10.2, I am finding that I get the warning {{DB connection 
invalidated. Reconnecting...}} very frequently.

To try to diagnose this, I added logging of the triggering error on line 79 in 
airflow/utils/sqlalchemy.py, from which this warning is generated.

Call stack:
{code}
Call stack:
  File "/usr/local/bin/airflow", line 32, in <module>
args.func(args)
  File "/usr/local/lib/python3.6/site-packages/airflow/utils/cli.py", line 74, 
in wrapper
return f(*args, **kwargs)
  File "/usr/local/lib/python3.6/site-packages/airflow/bin/cli.py", line 992, 
in scheduler
job.run()
  File "/usr/local/lib/python3.6/site-packages/airflow/jobs.py", line 205, in 
run
self._execute()
  File "/usr/local/lib/python3.6/site-packages/airflow/jobs.py", line 1532, in 
_execute
self._execute_helper()
  File "/usr/local/lib/python3.6/site-packages/airflow/jobs.py", line 1562, in 
_execute_helper
self.processor_agent.start()
  File 
"/usr/local/lib/python3.6/site-packages/airflow/utils/dag_processing.py", line 
511, in start
self._async_mode)
  File 
"/usr/local/lib/python3.6/site-packages/airflow/utils/dag_processing.py", line 
565, in _launch_process
p.start()
  File "/usr/local/lib/python3.6/multiprocessing/process.py", line 105, in start
self._popen = self._Popen(self)
  File "/usr/local/lib/python3.6/multiprocessing/context.py", line 223, in 
_Popen
return _default_context.get_context().Process._Popen(process_obj)
  File "/usr/local/lib/python3.6/multiprocessing/context.py", line 277, in 
_Popen
return Popen(process_obj)
  File "/usr/local/lib/python3.6/multiprocessing/popen_fork.py", line 19, in 
__init__
self._launch(process_obj)
  File "/usr/local/lib/python3.6/multiprocessing/popen_fork.py", line 73, in 
_launch
code = process_obj._bootstrap()
  File "/usr/local/lib/python3.6/multiprocessing/process.py", line 258, in 
_bootstrap
self.run()
  File "/usr/local/lib/python3.6/multiprocessing/process.py", line 93, in run
self._target(*self._args, **self._kwargs)
  File 
"/usr/local/lib/python3.6/site-packages/airflow/utils/dag_processing.py", line 
560, in helper
processor_manager.start()
  File 
"/usr/local/lib/python3.6/site-packages/airflow/utils/dag_processing.py", line 
797, in start
self.start_in_async()
  File 
"/usr/local/lib/python3.6/site-packages/airflow/utils/dag_processing.py", line 
820, in start_in_async
simple_dags = self.heartbeat()
  File 
"/usr/local/lib/python3.6/site-packages/airflow/utils/dag_processing.py", line 
1190, in heartbeat
zombies = self._find_zombies()
  File "/usr/local/lib/python3.6/site-packages/airflow/utils/db.py", line 73, 
in wrapper
return func(*args, **kwargs)
  File 
"/usr/local/lib/python3.6/site-packages/airflow/utils/dag_processing.py", line 
1236, in _find_zombies
LJ.latest_heartbeat < limit_dttm,
  File "/usr/local/lib/python3.6/site-packages/sqlalchemy/orm/query.py", line 
2925, in all
return list(self)
  File "/usr/local/lib/python3.6/site-packages/sqlalchemy/orm/query.py", line 
3081, in __iter__
return self._execute_and_instances(context)
  File "/usr/local/lib/python3.6/site-packages/sqlalchemy/orm/query.py", line 
3103, in _execute_and_instances
querycontext, self._connection_from_session, close_with_result=True
  File "/usr/local/lib/python3.6/site-packages/sqlalchemy/orm/query.py", line 
3111, in _get_bind_args
mapper=self._bind_mapper(), clause=querycontext.statement, **kw
  File "/usr/local/lib/python3.6/site-packages/sqlalchemy/orm/query.py", line 
3096, in _connection_from_session
conn = self.session.connection(**kw)
  File "/usr/local/lib/python3.6/site-packages/sqlalchemy/orm/session.py", line 
1120, in connection
execution_options=execution_options,
  File "/usr/local/lib/python3.6/site-packages/sqlalchemy/orm/session.py", line 
1126, in _connection_for_bind
engine, execution_options
  File "/usr/local/lib/python3.6/site-packages/sqlalchemy/orm/session.py", line 
424, in _connection_for_bind
conn = bind.contextual_connect()
  File "/usr/local/lib/python3.6/site-packages/sqlalchemy/engine/base.py", line 
2194, in contextual_connect
**kwargs
  File "/usr/local/lib/python3.6/site-packages/sqlalchemy/engine/base.py", line 
125, in __init__
self.dispatch.engine_connect(self, self.__branch)
  File "/usr/local/lib/python3.6/site-packages/sqlalchemy/event/attr.py", line 
297, in __call__
fn(*args, **kw)
  File "/usr/local/lib/python3.6/site-packages/airflow/utils/sqlalchemy.py", 
line 79, in ping_connection
log.warning("DB connection invalidated. Reconnecting...", err)
Message: 'DB connection invalidated. Reconnecting...'
Arguments: (OperationalError('(psycopg2.OperationalError) server closed the 
connection 
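
The warning above comes from a pessimistic connection check: before handing out a pooled connection, issue a cheap probe query, and on failure invalidate the connection and retry. The real handler is `ping_connection` in airflow/utils/sqlalchemy.py; the sketch below is stdlib-only, with a fake connection standing in for SQLAlchemy, so the names and error type are illustrative assumptions.

```python
import logging

log = logging.getLogger(__name__)

class StaleConnectionError(Exception):
    """Stand-in for psycopg2.OperationalError."""

class FakeConnection:
    """Mock connection: the probe fails until invalidate() is called."""
    def __init__(self):
        self.invalidated = False

    def scalar(self, sql):
        if not self.invalidated:
            raise StaleConnectionError("server closed the connection unexpectedly")
        return 1

    def invalidate(self):
        self.invalidated = True

def ping_connection(connection):
    try:
        connection.scalar("SELECT 1")  # cheap liveness probe
    except StaleConnectionError as err:
        # this is the branch that emits the warning seen in the report
        log.warning("DB connection invalidated. Reconnecting... %s", err)
        connection.invalidate()
        connection.scalar("SELECT 1")  # retry on the refreshed connection
```

If the probe fails on every checkout, as the report describes, the warning fires at every zombie check even though the retry succeeds.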

[GitHub] [airflow] mik-laj commented on issue #4846: [AIRFLOW-4030] adding start to singularity for airflow

2019-03-20 Thread GitBox
mik-laj commented on issue #4846: [AIRFLOW-4030] adding start to singularity 
for airflow
URL: https://github.com/apache/airflow/pull/4846#issuecomment-475046471
 
 
   @vsoch Yes. Test locally, and when we finish the work we will make Travis happy.




[GitHub] [airflow] dimberman commented on a change in pull request #4952: feat/AIRFLOW-4008/k8s-executor-env-from

2019-03-20 Thread GitBox
dimberman commented on a change in pull request #4952: 
feat/AIRFLOW-4008/k8s-executor-env-from
URL: https://github.com/apache/airflow/pull/4952#discussion_r267557050
 
 

 ##
 File path: 
tests/contrib/kubernetes/kubernetes_request_factory/test_pod_request_factory.py
 ##
 @@ -0,0 +1,154 @@
+# -*- coding: utf-8 -*-
+#
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+
+from airflow.contrib.kubernetes.kubernetes_request_factory.\
+pod_request_factory import SimplePodRequestFactory, \
+ExtractXcomPodRequestFactory
+from airflow.contrib.kubernetes.pod import Pod
+from mock import ANY
+import unittest
+
+
+class TestSimplePodRequestFactory(unittest.TestCase):
+
+def setUp(self):
+self.simple_pod_request_factory = SimplePodRequestFactory()
+self.xcom_pod_request_factory = ExtractXcomPodRequestFactory()
+self.pod = Pod(
+image='busybox',
+envs={
+'ENVIRONMENT': 'prod',
+'LOG_LEVEL': 'warning'
+},
+name='myapp-pod',
+cmds=['sh', '-c', 'echo Hello Kubernetes!'],
+labels={'app': 'myapp'},
+env_from_configmap_ref='env_from_configmap',
+env_from_secret_ref='env_from_secret_a,env_from_secret_b',
+image_pull_secrets='pull_secret_a,pull_secret_b'
+)
+self.maxDiff = None
+
+def test_simple_pod_request_factory_create(self):
+expected_result = {
+'apiVersion': 'v1',
+'kind': 'Pod',
 
 Review comment:
   There seems to be a lot of boilerplate code copied between these tests. 
Could you please create a base object and then copy and modify that object as 
needed for each test function?
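
The reviewer's suggestion can be sketched as follows: keep one base expected-pod dict and derive per-test variants with a deepcopy plus targeted edits, instead of repeating the whole literal in every test. The field names below are illustrative, taken loosely from the diff, not the exact test fixtures.

```python
import copy

# single source of truth for the expected pod request
BASE_POD = {
    'apiVersion': 'v1',
    'kind': 'Pod',
    'metadata': {'name': 'myapp-pod', 'labels': {'app': 'myapp'}},
    'spec': {'containers': [{'image': 'busybox'}]},
}

def pod_with(**overrides):
    """Return a fresh copy of BASE_POD with container fields overridden."""
    pod = copy.deepcopy(BASE_POD)
    pod['spec']['containers'][0].update(overrides)
    return pod
```

Each test then calls `pod_with(...)` with only the fields it cares about, and the deepcopy guarantees no test mutates the shared base.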




[jira] [Updated] (AIRFLOW-4134) "DB connection invalidated" warning at every zombie check

2019-03-20 Thread Daniel Standish (JIRA)


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-4134?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Daniel Standish updated AIRFLOW-4134:
-
Description: 
With 1.10.2, I am finding that I get the warning {{DB connection 
invalidated. Reconnecting...}} very frequently.

To try to diagnose this, I added logging of the triggering error on line 79 in 
airflow/utils/sqlalchemy.py, from which this warning is generated.

Call stack:
{code}
webserver_1  | Call stack:
webserver_1  |   File "/usr/local/bin/airflow", line 32, in <module>
webserver_1  | args.func(args)
webserver_1  |   File 
"/usr/local/lib/python3.6/site-packages/airflow/utils/cli.py", line 74, in 
wrapper
webserver_1  | return f(*args, **kwargs)
webserver_1  |   File 
"/usr/local/lib/python3.6/site-packages/airflow/bin/cli.py", line 992, in 
scheduler
webserver_1  | job.run()
webserver_1  |   File "/usr/local/lib/python3.6/site-packages/airflow/jobs.py", 
line 205, in run
webserver_1  | self._execute()
webserver_1  |   File "/usr/local/lib/python3.6/site-packages/airflow/jobs.py", 
line 1532, in _execute
webserver_1  | self._execute_helper()
webserver_1  |   File "/usr/local/lib/python3.6/site-packages/airflow/jobs.py", 
line 1562, in _execute_helper
webserver_1  | self.processor_agent.start()
webserver_1  |   File 
"/usr/local/lib/python3.6/site-packages/airflow/utils/dag_processing.py", line 
511, in start
webserver_1  | self._async_mode)
webserver_1  |   File 
"/usr/local/lib/python3.6/site-packages/airflow/utils/dag_processing.py", line 
565, in _launch_process
webserver_1  | p.start()
webserver_1  |   File "/usr/local/lib/python3.6/multiprocessing/process.py", 
line 105, in start
webserver_1  | self._popen = self._Popen(self)
webserver_1  |   File "/usr/local/lib/python3.6/multiprocessing/context.py", 
line 223, in _Popen
webserver_1  | return 
_default_context.get_context().Process._Popen(process_obj)
webserver_1  |   File "/usr/local/lib/python3.6/multiprocessing/context.py", 
line 277, in _Popen
webserver_1  | return Popen(process_obj)
webserver_1  |   File "/usr/local/lib/python3.6/multiprocessing/popen_fork.py", 
line 19, in __init__
webserver_1  | self._launch(process_obj)
webserver_1  |   File "/usr/local/lib/python3.6/multiprocessing/popen_fork.py", 
line 73, in _launch
webserver_1  | code = process_obj._bootstrap()
webserver_1  |   File "/usr/local/lib/python3.6/multiprocessing/process.py", 
line 258, in _bootstrap
webserver_1  | self.run()
webserver_1  |   File "/usr/local/lib/python3.6/multiprocessing/process.py", 
line 93, in run
webserver_1  | self._target(*self._args, **self._kwargs)
webserver_1  |   File 
"/usr/local/lib/python3.6/site-packages/airflow/utils/dag_processing.py", line 
560, in helper
webserver_1  | processor_manager.start()
webserver_1  |   File 
"/usr/local/lib/python3.6/site-packages/airflow/utils/dag_processing.py", line 
797, in start
webserver_1  | self.start_in_async()
webserver_1  |   File 
"/usr/local/lib/python3.6/site-packages/airflow/utils/dag_processing.py", line 
820, in start_in_async
webserver_1  | simple_dags = self.heartbeat()
webserver_1  |   File 
"/usr/local/lib/python3.6/site-packages/airflow/utils/dag_processing.py", line 
1190, in heartbeat
webserver_1  | zombies = self._find_zombies()
webserver_1  |   File 
"/usr/local/lib/python3.6/site-packages/airflow/utils/db.py", line 73, in 
wrapper
webserver_1  | return func(*args, **kwargs)
webserver_1  |   File 
"/usr/local/lib/python3.6/site-packages/airflow/utils/dag_processing.py", line 
1236, in _find_zombies
webserver_1  | LJ.latest_heartbeat < limit_dttm,
webserver_1  |   File 
"/usr/local/lib/python3.6/site-packages/sqlalchemy/orm/query.py", line 2925, in 
all
webserver_1  | return list(self)
webserver_1  |   File 
"/usr/local/lib/python3.6/site-packages/sqlalchemy/orm/query.py", line 3081, in 
__iter__
webserver_1  | return self._execute_and_instances(context)
webserver_1  |   File 
"/usr/local/lib/python3.6/site-packages/sqlalchemy/orm/query.py", line 3103, in 
_execute_and_instances
webserver_1  | querycontext, self._connection_from_session, 
close_with_result=True
webserver_1  |   File 
"/usr/local/lib/python3.6/site-packages/sqlalchemy/orm/query.py", line 3111, in 
_get_bind_args
webserver_1  | mapper=self._bind_mapper(), clause=querycontext.statement, 
**kw
webserver_1  |   File 
"/usr/local/lib/python3.6/site-packages/sqlalchemy/orm/query.py", line 3096, in 
_connection_from_session
webserver_1  | conn = self.session.connection(**kw)
webserver_1  |   File 
"/usr/local/lib/python3.6/site-packages/sqlalchemy/orm/session.py", line 1120, 
in connection
webserver_1  | execution_options=execution_options,
webserver_1  |   File 
"/usr/local/lib/python3.6/site-packages/sqlalchemy/orm/session.py", line 1126, 
in _connection_for_bind
webserver_1  | engine, 

[GitHub] [airflow] dimberman commented on issue #4852: [AIRFLOW-3152] Implements init-containers in Kubernetes Pod Operator

2019-03-20 Thread GitBox
dimberman commented on issue #4852: [AIRFLOW-3152] Implements init-containers 
in Kubernetes Pod Operator
URL: https://github.com/apache/airflow/pull/4852#issuecomment-475037297
 
 
   @mgaggero why are you PRing this against the stable branch? Also, could you 
please add a test?




[GitHub] [airflow] dimberman commented on a change in pull request #4943: [AIRFLOW-XXX] add description of is_delete_operator_pod

2019-03-20 Thread GitBox
dimberman commented on a change in pull request #4943: [AIRFLOW-XXX] add 
description of is_delete_operator_pod
URL: https://github.com/apache/airflow/pull/4943#discussion_r267554892
 
 

 ##
 File path: airflow/contrib/operators/kubernetes_pod_operator.py
 ##
 @@ -80,6 +80,10 @@ class KubernetesPodOperator(BaseOperator):
 /airflow/xcom/return.json in the container will also be pushed to an
 XCom when the container completes.
 :type do_xcom_push: bool
+:param is_delete_operator_pod: If True, delete the pod after it's reached
+a final state (independent of pod success), or the task instance gets 
 
 Review comment:
   +1




[jira] [Updated] (AIRFLOW-4133) CLI `test` mode writes to DB on task failure

2019-03-20 Thread Rob (JIRA)


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-4133?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Rob updated AIRFLOW-4133:
-
Summary: CLI `test` mode writes to DB on task failure  (was: CLI `test` 
mode writes to DB on failure)

> CLI `test` mode writes to DB on task failure
> 
>
> Key: AIRFLOW-4133
> URL: https://issues.apache.org/jira/browse/AIRFLOW-4133
> Project: Apache Airflow
>  Issue Type: Bug
>  Components: cli
>Affects Versions: 1.10.2
>Reporter: Rob
>Priority: Minor
>
> The documentation for running {{airflow test}} says that this mode does NOT 
> record any state in the DB.
> However, {{def handle_failure}} will *always* add {{TaskFail}} to the session
> [https://github.com/apache/airflow/blob/ae295382a0f1ad7a7d8d0368483de22d03e88493/airflow/models/__init__.py#L1536]
> and then commit at the end
> [https://github.com/apache/airflow/blob/ae295382a0f1ad7a7d8d0368483de22d03e88493/airflow/models/__init__.py#L1576]
> regardless of the value of {{test_mode}}.
>  
>  
>  
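
The fix this report implies can be sketched as a guard in `handle_failure`. Everything below is a stand-in for illustration: `session` is a plain list rather than the SQLAlchemy session, and the real method does far more than record the failure.

```python
def handle_failure(task_id, session, test_mode=False):
    """Mark a task failed; persist a TaskFail record only outside test mode."""
    state = 'failed'
    if not test_mode:
        # skip both the TaskFail insert and the commit under `airflow test`
        session.append(('TaskFail', task_id))
    return state
```

With the guard in place, `airflow test` still reports the failure but leaves the database untouched, matching the documented behaviour.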



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Updated] (AIRFLOW-4134) "DB connection invalidated" warning at every zombie check

2019-03-20 Thread Daniel Standish (JIRA)


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-4134?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Daniel Standish updated AIRFLOW-4134:
-
Description: 
I am finding with 1.10.2 that I seem to get a warning {{DB connection 
invalidated. Reconnecting...}} very frequently.

I to try to diagnose I added logging of the triggering error on line 79 in 
airflow/utils/sqlalchemy.py, from which this warning is generated.

Here's the traceback:

{code}
webserver_1  | Traceback (most recent call last):
webserver_1  |   File 
"/usr/local/lib/python3.6/site-packages/airflow/utils/sqlalchemy.py", line 68, 
in ping_connection
webserver_1  | connection.scalar(select([1]))
webserver_1  |   File 
"/usr/local/lib/python3.6/site-packages/sqlalchemy/engine/base.py", line 912, 
in scalar
webserver_1  | return self.execute(object_, *multiparams, **params).scalar()
webserver_1  |   File 
"/usr/local/lib/python3.6/site-packages/sqlalchemy/engine/base.py", line 980, 
in execute
webserver_1  | return meth(self, multiparams, params)
webserver_1  |   File 
"/usr/local/lib/python3.6/site-packages/sqlalchemy/sql/elements.py", line 273, 
in _execute_on_connection
webserver_1  | return connection._execute_clauseelement(self, multiparams, 
params)
webserver_1  |   File 
"/usr/local/lib/python3.6/site-packages/sqlalchemy/engine/base.py", line 1099, 
in _execute_clauseelement
webserver_1  | distilled_params,
webserver_1  |   File 
"/usr/local/lib/python3.6/site-packages/sqlalchemy/engine/base.py", line 1240, 
in _execute_context
webserver_1  | e, statement, parameters, cursor, context
webserver_1  |   File 
"/usr/local/lib/python3.6/site-packages/sqlalchemy/engine/base.py", line 1458, 
in _handle_dbapi_exception
webserver_1  | util.raise_from_cause(sqlalchemy_exception, exc_info)
webserver_1  |   File 
"/usr/local/lib/python3.6/site-packages/sqlalchemy/util/compat.py", line 296, 
in raise_from_cause
webserver_1  | reraise(type(exception), exception, tb=exc_tb, cause=cause)
webserver_1  |   File 
"/usr/local/lib/python3.6/site-packages/sqlalchemy/util/compat.py", line 276, 
in reraise
webserver_1  | raise value.with_traceback(tb)
webserver_1  |   File 
"/usr/local/lib/python3.6/site-packages/sqlalchemy/engine/base.py", line 1236, 
in _execute_context
webserver_1  | cursor, statement, parameters, context
webserver_1  |   File 
"/usr/local/lib/python3.6/site-packages/sqlalchemy/engine/default.py", line 
536, in do_execute
webserver_1  | cursor.execute(statement, parameters)
webserver_1  | sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) 
server closed the connection unexpectedly
webserver_1  |  This probably means the server terminated abnormally
webserver_1  |  before or while processing the request.
webserver_1  |  [SQL: 'SELECT 1'] (Background on this error at: 
http://sqlalche.me/e/e3q8)
{code}

It has something to do with the configure_orm function in airflow/settings.py, 
because that is the only usage of setup_event_handlers (from 
airflow/utils/sqlalchemy.py).

If I disable connection pooling, the warning seems to go away.
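
If pooling is indeed the trigger, the workaround can be expressed as a config change. This is a hedged sketch: the option name should be verified against the 1.10.2 default airflow.cfg.

```ini
[core]
# Disable SQLAlchemy connection pooling so every session opens a fresh
# connection; per the report above, this silences the warning.
sql_alchemy_pool_enabled = False
```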

Beyond that, I am not sure where to go from here, but something must be wrong. 
 




  was:
With 1.10.2, I am finding that I get the warning {{DB connection 
invalidated. Reconnecting...}} very frequently.

To try to diagnose this, I added logging of the triggering error on line 79 in 
airflow/utils/sqlalchemy.py, from which this warning is generated.

Here's the traceback:

{code}
webserver_1  | Traceback (most recent call last):
webserver_1  |   File 
"/usr/local/lib/python3.6/site-packages/airflow/utils/sqlalchemy.py", line 68, 
in ping_connection
webserver_1  | connection.scalar(select([1]))
webserver_1  |   File 
"/usr/local/lib/python3.6/site-packages/sqlalchemy/engine/base.py", line 912, 
in scalar
webserver_1  | return self.execute(object_, *multiparams, **params).scalar()
webserver_1  |   File 
"/usr/local/lib/python3.6/site-packages/sqlalchemy/engine/base.py", line 980, 
in execute
webserver_1  | return meth(self, multiparams, params)
webserver_1  |   File 
"/usr/local/lib/python3.6/site-packages/sqlalchemy/sql/elements.py", line 273, 
in _execute_on_connection
webserver_1  | return connection._execute_clauseelement(self, multiparams, 
params)
webserver_1  |   File 
"/usr/local/lib/python3.6/site-packages/sqlalchemy/engine/base.py", line 1099, 
in _execute_clauseelement
webserver_1  | distilled_params,
webserver_1  |   File 
"/usr/local/lib/python3.6/site-packages/sqlalchemy/engine/base.py", line 1240, 
in _execute_context
webserver_1  | e, statement, parameters, cursor, context
webserver_1  |   File 
"/usr/local/lib/python3.6/site-packages/sqlalchemy/engine/base.py", line 1458, 
in _handle_dbapi_exception
webserver_1  | util.raise_from_cause(sqlalchemy_exception, exc_info)
webserver_1  |   File 

[jira] [Updated] (AIRFLOW-4134) DB connection invalidated warning at every zombie check

2019-03-20 Thread Daniel Standish (JIRA)


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-4134?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Daniel Standish updated AIRFLOW-4134:
-
Description: 
With 1.10.2, I am finding that I get the warning {{DB connection 
invalidated. Reconnecting...}} very frequently.

To try to diagnose this, I added logging of the triggering error on line 79 in 
airflow/utils/sqlalchemy.py, from which this warning is generated.

Here's the traceback:

{code}
webserver_1  | Traceback (most recent call last):
webserver_1  |   File 
"/usr/local/lib/python3.6/site-packages/airflow/utils/sqlalchemy.py", line 68, 
in ping_connection
webserver_1  | connection.scalar(select([1]))
webserver_1  |   File 
"/usr/local/lib/python3.6/site-packages/sqlalchemy/engine/base.py", line 912, 
in scalar
webserver_1  | return self.execute(object_, *multiparams, **params).scalar()
webserver_1  |   File 
"/usr/local/lib/python3.6/site-packages/sqlalchemy/engine/base.py", line 980, 
in execute
webserver_1  | return meth(self, multiparams, params)
webserver_1  |   File 
"/usr/local/lib/python3.6/site-packages/sqlalchemy/sql/elements.py", line 273, 
in _execute_on_connection
webserver_1  | return connection._execute_clauseelement(self, multiparams, 
params)
webserver_1  |   File 
"/usr/local/lib/python3.6/site-packages/sqlalchemy/engine/base.py", line 1099, 
in _execute_clauseelement
webserver_1  | distilled_params,
webserver_1  |   File 
"/usr/local/lib/python3.6/site-packages/sqlalchemy/engine/base.py", line 1240, 
in _execute_context
webserver_1  | e, statement, parameters, cursor, context
webserver_1  |   File 
"/usr/local/lib/python3.6/site-packages/sqlalchemy/engine/base.py", line 1458, 
in _handle_dbapi_exception
webserver_1  | util.raise_from_cause(sqlalchemy_exception, exc_info)
webserver_1  |   File 
"/usr/local/lib/python3.6/site-packages/sqlalchemy/util/compat.py", line 296, 
in raise_from_cause
webserver_1  | reraise(type(exception), exception, tb=exc_tb, cause=cause)
webserver_1  |   File 
"/usr/local/lib/python3.6/site-packages/sqlalchemy/util/compat.py", line 276, 
in reraise
webserver_1  | raise value.with_traceback(tb)
webserver_1  |   File 
"/usr/local/lib/python3.6/site-packages/sqlalchemy/engine/base.py", line 1236, 
in _execute_context
webserver_1  | cursor, statement, parameters, context
webserver_1  |   File 
"/usr/local/lib/python3.6/site-packages/sqlalchemy/engine/default.py", line 
536, in do_execute
webserver_1  | cursor.execute(statement, parameters)
webserver_1  | sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) 
server closed the connection unexpectedly
webserver_1  |  This probably means the server terminated abnormally
webserver_1  |  before or while processing the request.
webserver_1  |  [SQL: 'SELECT 1'] (Background on this error at: 
http://sqlalche.me/e/e3q8)
{code}

It has something to do with the configure_orm function in airflow/settings.py, 
because that is the only usage of setup_event_handlers (from 
airflow/utils/sqlalchemy.py).

I am not sure where to go from here, but something must be wrong.




  was:
With 1.10.2, I am finding that I get the warning {{DB connection 
invalidated. Reconnecting...}} very frequently.

To try to diagnose this, I added logging of the triggering error on line 79 in 
airflow/utils/sqlalchemy.py, from which this warning is generated.

Here's the traceback:

{code}
webserver_1  | Traceback (most recent call last):
webserver_1  |   File 
"/usr/local/lib/python3.6/site-packages/airflow/utils/sqlalchemy.py", line 68, 
in ping_connection
webserver_1  | connection.scalar(select([1]))
webserver_1  |   File 
"/usr/local/lib/python3.6/site-packages/sqlalchemy/engine/base.py", line 912, 
in scalar
webserver_1  | return self.execute(object_, *multiparams, **params).scalar()
webserver_1  |   File 
"/usr/local/lib/python3.6/site-packages/sqlalchemy/engine/base.py", line 980, 
in execute
webserver_1  | return meth(self, multiparams, params)
webserver_1  |   File 
"/usr/local/lib/python3.6/site-packages/sqlalchemy/sql/elements.py", line 273, 
in _execute_on_connection
webserver_1  | return connection._execute_clauseelement(self, multiparams, 
params)
webserver_1  |   File 
"/usr/local/lib/python3.6/site-packages/sqlalchemy/engine/base.py", line 1099, 
in _execute_clauseelement
webserver_1  | distilled_params,
webserver_1  |   File 
"/usr/local/lib/python3.6/site-packages/sqlalchemy/engine/base.py", line 1240, 
in _execute_context
webserver_1  | e, statement, parameters, cursor, context
webserver_1  |   File 
"/usr/local/lib/python3.6/site-packages/sqlalchemy/engine/base.py", line 1458, 
in _handle_dbapi_exception
webserver_1  | util.raise_from_cause(sqlalchemy_exception, exc_info)
webserver_1  |   File 
"/usr/local/lib/python3.6/site-packages/sqlalchemy/util/compat.py", line 296, 
in raise_from_cause
webserver_1  | 

[jira] [Updated] (AIRFLOW-4134) "DB connection invalidated" warning at every zombie check

2019-03-20 Thread Daniel Standish (JIRA)


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-4134?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Daniel Standish updated AIRFLOW-4134:
-
Summary: "DB connection invalidated" warning at every zombie check  (was: 
DB connection invalidated warning at every zombie check)

> "DB connection invalidated" warning at every zombie check
> -
>
> Key: AIRFLOW-4134
> URL: https://issues.apache.org/jira/browse/AIRFLOW-4134
> Project: Apache Airflow
>  Issue Type: Bug
>Affects Versions: 1.10.2
>Reporter: Daniel Standish
>Priority: Major
>
> I am finding with 1.10.2 that I get the warning {{DB connection 
> invalidated. Reconnecting...}} very frequently.
> To try to diagnose, I added logging of the triggering error on line 79 in 
> airflow/utils/sqlalchemy.py, from which this warning is generated.
> Here's the traceback:
> {code}
> webserver_1  | Traceback (most recent call last):
> webserver_1  |   File 
> "/usr/local/lib/python3.6/site-packages/airflow/utils/sqlalchemy.py", line 
> 68, in ping_connection
> webserver_1  | connection.scalar(select([1]))
> webserver_1  |   File 
> "/usr/local/lib/python3.6/site-packages/sqlalchemy/engine/base.py", line 912, 
> in scalar
> webserver_1  | return self.execute(object_, *multiparams, 
> **params).scalar()
> webserver_1  |   File 
> "/usr/local/lib/python3.6/site-packages/sqlalchemy/engine/base.py", line 980, 
> in execute
> webserver_1  | return meth(self, multiparams, params)
> webserver_1  |   File 
> "/usr/local/lib/python3.6/site-packages/sqlalchemy/sql/elements.py", line 
> 273, in _execute_on_connection
> webserver_1  | return connection._execute_clauseelement(self, 
> multiparams, params)
> webserver_1  |   File 
> "/usr/local/lib/python3.6/site-packages/sqlalchemy/engine/base.py", line 
> 1099, in _execute_clauseelement
> webserver_1  | distilled_params,
> webserver_1  |   File 
> "/usr/local/lib/python3.6/site-packages/sqlalchemy/engine/base.py", line 
> 1240, in _execute_context
> webserver_1  | e, statement, parameters, cursor, context
> webserver_1  |   File 
> "/usr/local/lib/python3.6/site-packages/sqlalchemy/engine/base.py", line 
> 1458, in _handle_dbapi_exception
> webserver_1  | util.raise_from_cause(sqlalchemy_exception, exc_info)
> webserver_1  |   File 
> "/usr/local/lib/python3.6/site-packages/sqlalchemy/util/compat.py", line 296, 
> in raise_from_cause
> webserver_1  | reraise(type(exception), exception, tb=exc_tb, cause=cause)
> webserver_1  |   File 
> "/usr/local/lib/python3.6/site-packages/sqlalchemy/util/compat.py", line 276, 
> in reraise
> webserver_1  | raise value.with_traceback(tb)
> webserver_1  |   File 
> "/usr/local/lib/python3.6/site-packages/sqlalchemy/engine/base.py", line 
> 1236, in _execute_context
> webserver_1  | cursor, statement, parameters, context
> webserver_1  |   File 
> "/usr/local/lib/python3.6/site-packages/sqlalchemy/engine/default.py", line 
> 536, in do_execute
> webserver_1  | cursor.execute(statement, parameters)
> webserver_1  | sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) 
> server closed the connection unexpectedly
> webserver_1  |This probably means the server terminated abnormally
> webserver_1  |before or while processing the request.
> webserver_1  |  [SQL: 'SELECT 1'] (Background on this error at: 
> http://sqlalche.me/e/e3q8)
> {code}
> It has something to do with the configure_orm function in 
> airflow/settings.py, because that is the only usage of setup_event_handlers 
> (from airflow/utils/sqlalchemy.py).
> I am not sure where to go from here, but something must be wrong.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Created] (AIRFLOW-4133) CLI `test` mode writes to DB on failure

2019-03-20 Thread Rob (JIRA)
Rob created AIRFLOW-4133:


 Summary: CLI `test` mode writes to DB on failure
 Key: AIRFLOW-4133
 URL: https://issues.apache.org/jira/browse/AIRFLOW-4133
 Project: Apache Airflow
  Issue Type: Bug
  Components: cli
Affects Versions: 1.10.2
Reporter: Rob


The documentation for running {{airflow test}} says that this mode does NOT 
record any state in the DB.

However, {{def handle_failure}} will *always* add {{TaskFail}} to the session

[https://github.com/apache/airflow/blob/ae295382a0f1ad7a7d8d0368483de22d03e88493/airflow/models/__init__.py#L1536]

and then commit at the end

[https://github.com/apache/airflow/blob/ae295382a0f1ad7a7d8d0368483de22d03e88493/airflow/models/__init__.py#L1576]

regardless of the value of {{test_mode}}.
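One possible shape of a fix, sketched here with a fake session object (names are illustrative, not Airflow's actual handle_failure implementation):

```python
class FakeSession:
    # Minimal stand-in for a SQLAlchemy session, just to show the guard.
    def __init__(self):
        self.added = []
        self.committed = False
    def add(self, obj):
        self.added.append(obj)
    def commit(self):
        self.committed = True

def handle_failure(session, test_mode=False):
    # Guard both the TaskFail insert and the commit on test_mode, so that
    # `airflow test` really records no state in the DB.
    if not test_mode:
        session.add("TaskFail(...)")  # placeholder for the real TaskFail row
        session.commit()

session = FakeSession()
handle_failure(session, test_mode=True)
assert session.added == [] and not session.committed

session = FakeSession()
handle_failure(session, test_mode=False)
assert session.added and session.committed
```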

 

 

 





[jira] [Created] (AIRFLOW-4134) DB connection invalidated warning at every zombie check

2019-03-20 Thread Daniel Standish (JIRA)
Daniel Standish created AIRFLOW-4134:


 Summary: DB connection invalidated warning at every zombie check
 Key: AIRFLOW-4134
 URL: https://issues.apache.org/jira/browse/AIRFLOW-4134
 Project: Apache Airflow
  Issue Type: Bug
Affects Versions: 1.10.2
Reporter: Daniel Standish


I am finding with 1.10.2 that I get the warning {{DB connection 
invalidated. Reconnecting...}} very frequently.

To try to diagnose, I added logging of the triggering error on line 79 in 
airflow/utils/sqlalchemy.py, from which this warning is generated.

Here's the traceback:

{code}
webserver_1  | Traceback (most recent call last):
webserver_1  |   File 
"/usr/local/lib/python3.6/site-packages/airflow/utils/sqlalchemy.py", line 68, 
in ping_connection
webserver_1  | connection.scalar(select([1]))
webserver_1  |   File 
"/usr/local/lib/python3.6/site-packages/sqlalchemy/engine/base.py", line 912, 
in scalar
webserver_1  | return self.execute(object_, *multiparams, **params).scalar()
webserver_1  |   File 
"/usr/local/lib/python3.6/site-packages/sqlalchemy/engine/base.py", line 980, 
in execute
webserver_1  | return meth(self, multiparams, params)
webserver_1  |   File 
"/usr/local/lib/python3.6/site-packages/sqlalchemy/sql/elements.py", line 273, 
in _execute_on_connection
webserver_1  | return connection._execute_clauseelement(self, multiparams, 
params)
webserver_1  |   File 
"/usr/local/lib/python3.6/site-packages/sqlalchemy/engine/base.py", line 1099, 
in _execute_clauseelement
webserver_1  | distilled_params,
webserver_1  |   File 
"/usr/local/lib/python3.6/site-packages/sqlalchemy/engine/base.py", line 1240, 
in _execute_context
webserver_1  | e, statement, parameters, cursor, context
webserver_1  |   File 
"/usr/local/lib/python3.6/site-packages/sqlalchemy/engine/base.py", line 1458, 
in _handle_dbapi_exception
webserver_1  | util.raise_from_cause(sqlalchemy_exception, exc_info)
webserver_1  |   File 
"/usr/local/lib/python3.6/site-packages/sqlalchemy/util/compat.py", line 296, 
in raise_from_cause
webserver_1  | reraise(type(exception), exception, tb=exc_tb, cause=cause)
webserver_1  |   File 
"/usr/local/lib/python3.6/site-packages/sqlalchemy/util/compat.py", line 276, 
in reraise
webserver_1  | raise value.with_traceback(tb)
webserver_1  |   File 
"/usr/local/lib/python3.6/site-packages/sqlalchemy/engine/base.py", line 1236, 
in _execute_context
webserver_1  | cursor, statement, parameters, context
webserver_1  |   File 
"/usr/local/lib/python3.6/site-packages/sqlalchemy/engine/default.py", line 
536, in do_execute
webserver_1  | cursor.execute(statement, parameters)
webserver_1  | sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) 
server closed the connection unexpectedly
webserver_1  |  This probably means the server terminated abnormally
webserver_1  |  before or while processing the request.
webserver_1  |  [SQL: 'SELECT 1'] (Background on this error at: 
http://sqlalche.me/e/e3q8)
{code}

I am not sure where to go from here, but something must be wrong.







[jira] [Updated] (AIRFLOW-4133) CLI `test` mode writes to DB on task failure

2019-03-20 Thread Rob (JIRA)


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-4133?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Rob updated AIRFLOW-4133:
-
Description: 
The documentation for running {{airflow test}} says that this mode does NOT 
record any state in the DB. However, failed tasks are currently recorded into 
table {{task_fail}} regardless of the {{test}} mode. Details:

{{def handle_failure}} will *always* add {{TaskFail}} to the session

[https://github.com/apache/airflow/blob/ae295382a0f1ad7a7d8d0368483de22d03e88493/airflow/models/__init__.py#L1536]

and then commit at the end

[https://github.com/apache/airflow/blob/ae295382a0f1ad7a7d8d0368483de22d03e88493/airflow/models/__init__.py#L1576]

regardless of the value of {{test_mode}}.

 

 

 

  was:
The documentation for running {{airflow test}} says that this mode does NOT 
record any state in the DB.

However, {{def handle_failure}} will *always* add {{TaskFail}} to the session

[https://github.com/apache/airflow/blob/ae295382a0f1ad7a7d8d0368483de22d03e88493/airflow/models/__init__.py#L1536]

and then commit at the end

[https://github.com/apache/airflow/blob/ae295382a0f1ad7a7d8d0368483de22d03e88493/airflow/models/__init__.py#L1576]

regardless of the value of {{test_mode}}.

 

 

 


> CLI `test` mode writes to DB on task failure
> 
>
> Key: AIRFLOW-4133
> URL: https://issues.apache.org/jira/browse/AIRFLOW-4133
> Project: Apache Airflow
>  Issue Type: Bug
>  Components: cli
>Affects Versions: 1.10.2
>Reporter: Rob
>Priority: Minor
>
> The documentation for running {{airflow test}} says that this mode does NOT 
> record any state in the DB. However, failed tasks are currently recorded into 
> table {{task_fail}} regardless of the {{test}} mode. Details:
> {{def handle_failure}} will *always* add {{TaskFail}} to the session
> [https://github.com/apache/airflow/blob/ae295382a0f1ad7a7d8d0368483de22d03e88493/airflow/models/__init__.py#L1536]
> and then commit at the end
> [https://github.com/apache/airflow/blob/ae295382a0f1ad7a7d8d0368483de22d03e88493/airflow/models/__init__.py#L1576]
> regardless of the value of {{test_mode}}.
>  
>  
>  





[GitHub] [airflow] codecov-io commented on issue #4951: [AIRFLOW-4131] Make template undefined behavior configurable.

2019-03-20 Thread GitBox
codecov-io commented on issue #4951: [AIRFLOW-4131] Make template undefined 
behavior configurable.
URL: https://github.com/apache/airflow/pull/4951#issuecomment-475010786
 
 
   # [Codecov](https://codecov.io/gh/apache/airflow/pull/4951?src=pr=h1) 
Report
   > Merging 
[#4951](https://codecov.io/gh/apache/airflow/pull/4951?src=pr=desc) into 
[master](https://codecov.io/gh/apache/airflow/commit/ae295382a0f1ad7a7d8d0368483de22d03e88493?src=pr=desc)
 will **decrease** coverage by `<.01%`.
   > The diff coverage is `100%`.
   
   [![Impacted file tree 
graph](https://codecov.io/gh/apache/airflow/pull/4951/graphs/tree.svg?width=650=WdLKlKHOAU=150=pr)](https://codecov.io/gh/apache/airflow/pull/4951?src=pr=tree)
   
   ```diff
   @@            Coverage Diff            @@
   ##           master    #4951      +/-   ##
   ==========================================
   - Coverage    75.6%   75.59%   -0.01%
   ==========================================
     Files         454      454
     Lines       29227    29228       +1
   ==========================================
     Hits        22096    22096
   - Misses       7131     7132       +1
   ```
   
   
   | [Impacted 
Files](https://codecov.io/gh/apache/airflow/pull/4951?src=pr=tree) | 
Coverage Δ | |
   |---|---|---|
   | 
[airflow/models/\_\_init\_\_.py](https://codecov.io/gh/apache/airflow/pull/4951/diff?src=pr=tree#diff-YWlyZmxvdy9tb2RlbHMvX19pbml0X18ucHk=)
 | `92.88% <100%> (-0.05%)` | :arrow_down: |
   
   --
   
   [Continue to review full report at 
Codecov](https://codecov.io/gh/apache/airflow/pull/4951?src=pr=continue).
   > **Legend** - [Click here to learn 
more](https://docs.codecov.io/docs/codecov-delta)
   > `Δ = absolute <relative> (impact)`, `ø = not affected`, `? = missing data`
   > Powered by 
[Codecov](https://codecov.io/gh/apache/airflow/pull/4951?src=pr=footer). 
Last update 
[ae29538...65af90c](https://codecov.io/gh/apache/airflow/pull/4951?src=pr=lastupdated).
 Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments).
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[jira] [Closed] (AIRFLOW-4056) Dag file processing does not respect dag_dir_list_interval

2019-03-20 Thread Daniel Standish (JIRA)


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-4056?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Daniel Standish closed AIRFLOW-4056.

Resolution: Invalid

> Dag file processing does not respect dag_dir_list_interval
> --
>
> Key: AIRFLOW-4056
> URL: https://issues.apache.org/jira/browse/AIRFLOW-4056
> Project: Apache Airflow
>  Issue Type: Bug
>  Components: scheduler
>Affects Versions: 1.10.2
> Environment: I have confirmed this issue on mac and centos 
> environments, using mysql backend.
>Reporter: Daniel Standish
>Priority: Major
>
> The conf parameter {{dag_dir_list_interval}} seems to have no effect on dag 
> directory scanning.
> It seems to happen every 2 seconds, no matter what.  The default is supposed 
> to be 5 minutes.
> As  a result I see a scheduler output like this:
> {code:java}
> [2019-03-09 17:06:24,579] {jobs.py:1559} INFO - Harvesting DAG parsing results
> [2019-03-09 17:06:26,587] {jobs.py:1559} INFO - Harvesting DAG parsing results
> [2019-03-09 17:06:28,590] {jobs.py:1559} INFO - Harvesting DAG parsing results
> [2019-03-09 17:06:30,597] {jobs.py:1559} INFO - Harvesting DAG parsing results
> [2019-03-09 17:06:32,603] {jobs.py:1559} INFO - Harvesting DAG parsing results
> [2019-03-09 17:06:34,611] {jobs.py:1559} INFO - Harvesting DAG parsing results
> [2019-03-09 17:06:35,195] {sqlalchemy.py:79} WARNING - DB connection 
> invalidated. Reconnecting...
> [2019-03-09 17:06:36,615] {jobs.py:1559} INFO - Harvesting DAG parsing results
> [2019-03-09 17:06:38,623] {jobs.py:1559} INFO - Harvesting DAG parsing results
> [2019-03-09 17:06:40,631] {jobs.py:1559} INFO - Harvesting DAG parsing results
> [2019-03-09 17:06:42,637] {jobs.py:1559} INFO - Harvesting DAG parsing results
> [2019-03-09 17:06:44,644] {jobs.py:1559} INFO - Harvesting DAG parsing results
> [2019-03-09 17:06:46,205] {sqlalchemy.py:79} WARNING - DB connection 
> invalidated. Reconnecting...
> [2019-03-09 17:06:46,651] {jobs.py:1559} INFO - Harvesting DAG parsing results
> [2019-03-09 17:06:48,658] {jobs.py:1559} INFO - Harvesting DAG parsing results
> [2019-03-09 17:06:50,666] {jobs.py:1559} INFO - Harvesting DAG parsing results
> [2019-03-09 17:06:52,670] {jobs.py:1559} INFO - Harvesting DAG parsing results
> [2019-03-09 17:06:54,680] {jobs.py:1559} INFO - Harvesting DAG parsing results
> [2019-03-09 17:06:56,687] {jobs.py:1559} INFO - Harvesting DAG parsing 
> results{code}
> And no more is there the periodic printing of dag stats, like there was in 
> 1.10.1.
> I can confirm that this is happening by adding this to a file in the dag 
> folder:
> {code:python}
> with open(Path('~/temp/test.log').expanduser(), 'at') as f:
> f.write(f"{datetime.now()}: i am imported\n")
> {code}
> Here is some scheduler output with debug log level:
> {code}
> [Airflow ASCII-art startup banner]
> [2019-03-09 17:20:59,042] {jobs.py:1477} INFO - Starting the scheduler
> [2019-03-09 17:20:59,042] {jobs.py:1485} INFO - Running execute loop for -1 
> seconds
> [2019-03-09 17:20:59,043] {jobs.py:1486} INFO - Processing each file at most 
> -1 times
> [2019-03-09 17:20:59,043] {jobs.py:1489} INFO - Searching for files in 
> /Users/dstandish/code/python_tfgetl/tfgetl/dags
> [2019-03-09 17:20:59,046] {jobs.py:1491} INFO - There are 11 files in 
> /Users/dstandish/code/python_tfgetl/tfgetl/dags
> [2019-03-09 17:20:59,105] {jobs.py:1534} INFO - Resetting orphaned tasks for 
> active dag runs
> [2019-03-09 17:20:59,121] {dag_processing.py:453} INFO - Launched 
> DagFileProcessorManager with pid: 57333
> [2019-03-09 17:20:59,122] {jobs.py:1548} DEBUG - Starting Loop...
> [2019-03-09 17:20:59,122] {jobs.py:1559} INFO - Harvesting DAG parsing results
> [2019-03-09 17:20:59,123] {jobs.py:1595} DEBUG - Heartbeating the executor
> [2019-03-09 17:20:59,123] {base_executor.py:118} DEBUG - 0 running task 
> instances
> [2019-03-09 17:20:59,123] {base_executor.py:119} DEBUG - 0 in queue
> [2019-03-09 17:20:59,123] {base_executor.py:120} DEBUG - 32 open slots
> [2019-03-09 17:20:59,124] {base_executor.py:149} DEBUG - Calling the <class 'airflow.executors.local_executor.LocalExecutor'> sync method
> [2019-03-09 17:20:59,128] {jobs.py:1613} DEBUG - Ran scheduling loop in 0.01 
> seconds
> [2019-03-09 17:20:59,129] {jobs.py:1614} DEBUG - Sleeping for 1.00 seconds
> [2019-03-09 17:20:59,130] {settings.py:51} INFO - Configured default timezone 
> 
> [2019-03-09 17:20:59,133] {logging_config.py:63} DEBUG - Unable to load 
> custom logging, using default config instead
> [2019-03-09 17:20:59,143] {settings.py:146} DEBUG - Setting up DB connection 

[GitHub] [airflow] ashb commented on issue #4569: [AIRFLOW-3745] Fix viewer not able to view dag details

2019-03-20 Thread GitBox
ashb commented on issue #4569: [AIRFLOW-3745] Fix viewer not able to view dag 
details
URL: https://github.com/apache/airflow/pull/4569#issuecomment-474986844
 
 
   @feng-tao I haven't managed to finish it, but I got somewhere with migrations 
to deal with perms:
   
   https://gist.github.com/ashb/f43741740fb0eae59948d52634cda575
   
   The biggest issue is working out either 1) what query to run, or 2) how to 
call the right classes/methods from within fab.security.* to do the update.




[jira] [Reopened] (AIRFLOW-2548) Output Plugin Import Errors to WebUI

2019-03-20 Thread Ash Berlin-Taylor (JIRA)


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-2548?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Ash Berlin-Taylor reopened AIRFLOW-2548:


> Output Plugin Import Errors to WebUI
> 
>
> Key: AIRFLOW-2548
> URL: https://issues.apache.org/jira/browse/AIRFLOW-2548
> Project: Apache Airflow
>  Issue Type: Bug
>Reporter: Andy Cooper
>Priority: Major
> Fix For: 2.0.0
>
>
> All,
>  
> We currently output all DAG import errors to the webUI. I propose we do the 
> same with plugin errors as well. This will provide a better user experience 
> by bubbling up all errors to the webUI instead of hiding them in stdOut.
>  
> Proposal...
>  * Extend models.ImportError to have a "type" field to distinguish from error 
> types.
>  * Prevent class SchedulerJob methods from clearing out and pulling from 
> models.ImportError if type = 'plugin'
>  * Create new ImportError records in plugins_manager.py for each plugin that 
> fails to import
>  * Prompt user in views.py with plugin ImportErrors - specifying that they 
> need to fix and restart webserver to resolve.
>  
> Does this seem reasonable to everyone? I'd be interested in taking on this 
> work if needed





[jira] [Commented] (AIRFLOW-4118) Instrument dagrun duration

2019-03-20 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-4118?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16797452#comment-16797452
 ] 

ASF GitHub Bot commented on AIRFLOW-4118:
-

feng-tao commented on pull request #4946: [AIRFLOW-4118] instrument DagRun 
duration
URL: https://github.com/apache/airflow/pull/4946
 
 
   
 



> Instrument dagrun duration
> --
>
> Key: AIRFLOW-4118
> URL: https://issues.apache.org/jira/browse/AIRFLOW-4118
> Project: Apache Airflow
>  Issue Type: New Feature
>Reporter: Chao-Han Tsai
>Assignee: Chao-Han Tsai
>Priority: Major
>






[jira] [Commented] (AIRFLOW-4118) Instrument dagrun duration

2019-03-20 Thread ASF subversion and git services (JIRA)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-4118?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16797453#comment-16797453
 ] 

ASF subversion and git services commented on AIRFLOW-4118:
--

Commit ae295382a0f1ad7a7d8d0368483de22d03e88493 in airflow's branch 
refs/heads/master from Chao-Han Tsai
[ https://gitbox.apache.org/repos/asf?p=airflow.git;h=ae29538 ]

[AIRFLOW-4118] instrument DagRun duration (#4946)



> Instrument dagrun duration
> --
>
> Key: AIRFLOW-4118
> URL: https://issues.apache.org/jira/browse/AIRFLOW-4118
> Project: Apache Airflow
>  Issue Type: New Feature
>Reporter: Chao-Han Tsai
>Assignee: Chao-Han Tsai
>Priority: Major
>






[GitHub] [airflow] feng-tao merged pull request #4946: [AIRFLOW-4118] instrument DagRun duration

2019-03-20 Thread GitBox
feng-tao merged pull request #4946: [AIRFLOW-4118] instrument DagRun duration
URL: https://github.com/apache/airflow/pull/4946
 
 
   




[jira] [Resolved] (AIRFLOW-2548) Output Plugin Import Errors to WebUI

2019-03-20 Thread Ash Berlin-Taylor (JIRA)


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-2548?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Ash Berlin-Taylor resolved AIRFLOW-2548.

   Resolution: Fixed
Fix Version/s: (was: 2.0.0)
   1.10.3

> Output Plugin Import Errors to WebUI
> 
>
> Key: AIRFLOW-2548
> URL: https://issues.apache.org/jira/browse/AIRFLOW-2548
> Project: Apache Airflow
>  Issue Type: Bug
>Reporter: Andy Cooper
>Priority: Major
> Fix For: 1.10.3
>
>
> All,
>  
> We currently output all DAG import errors to the webUI. I propose we do the 
> same with plugin errors as well. This will provide a better user experience 
> by bubbling up all errors to the webUI instead of hiding them in stdOut.
>  
> Proposal...
>  * Extend models.ImportError to have a "type" field to distinguish from error 
> types.
>  * Prevent class SchedulerJob methods from clearing out and pulling from 
> models.ImportError if type = 'plugin'
>  * Create new ImportError records in plugins_manager.py for each plugin that 
> fails to import
>  * Prompt user in views.py with plugin ImportErrors - specifying that they 
> need to fix and restart webserver to resolve.
>  
> Does this seem reasonable to everyone? I'd be interested in taking on this 
> work if needed





[jira] [Resolved] (AIRFLOW-3458) Refactor: Move Connection out of models.py

2019-03-20 Thread Ash Berlin-Taylor (JIRA)


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-3458?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Ash Berlin-Taylor resolved AIRFLOW-3458.

   Resolution: Fixed
Fix Version/s: (was: 2.0.0)
   1.10.3

> Refactor: Move Connection out of models.py
> --
>
> Key: AIRFLOW-3458
> URL: https://issues.apache.org/jira/browse/AIRFLOW-3458
> Project: Apache Airflow
>  Issue Type: Task
>  Components: models
>Affects Versions: 1.10.1
>Reporter: Fokko Driesprong
>Assignee: Bas Harenslak
>Priority: Major
> Fix For: 1.10.3
>
>






[jira] [Commented] (AIRFLOW-2548) Output Plugin Import Errors to WebUI

2019-03-20 Thread Ash Berlin-Taylor (JIRA)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-2548?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16797446#comment-16797446
 ] 

Ash Berlin-Taylor commented on AIRFLOW-2548:


Reopening to set fix version

> Output Plugin Import Errors to WebUI
> 
>
> Key: AIRFLOW-2548
> URL: https://issues.apache.org/jira/browse/AIRFLOW-2548
> Project: Apache Airflow
>  Issue Type: Bug
>Reporter: Andy Cooper
>Priority: Major
> Fix For: 2.0.0
>
>
> All,
>  
> We currently output all DAG import errors to the webUI. I propose we do the 
> same with plugin errors as well. This will provide a better user experience 
> by bubbling up all errors to the webUI instead of hiding them in stdOut.
>  
> Proposal...
>  * Extend models.ImportError to have a "type" field to distinguish from error 
> types.
>  * Prevent class SchedulerJob methods from clearing out and pulling from 
> models.ImportError if type = 'plugin'
>  * Create new ImportError records in plugins_manager.py for each plugin that 
> fails to import
>  * Prompt user in views.py with plugin ImportErrors - specifying that they 
> need to fix and restart webserver to resolve.
>  
> Does this seem reasonable to everyone? I'd be interested in taking on this 
> work if needed





[jira] [Commented] (AIRFLOW-3458) Refactor: Move Connection out of models.py

2019-03-20 Thread Ash Berlin-Taylor (JIRA)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-3458?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16797447#comment-16797447
 ] 

Ash Berlin-Taylor commented on AIRFLOW-3458:


Reopening to set fix version

> Refactor: Move Connection out of models.py
> --
>
> Key: AIRFLOW-3458
> URL: https://issues.apache.org/jira/browse/AIRFLOW-3458
> Project: Apache Airflow
>  Issue Type: Task
>  Components: models
>Affects Versions: 1.10.1
>Reporter: Fokko Driesprong
>Assignee: Bas Harenslak
>Priority: Major
> Fix For: 2.0.0
>
>






[jira] [Reopened] (AIRFLOW-3458) Refactor: Move Connection out of models.py

2019-03-20 Thread Ash Berlin-Taylor (JIRA)


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-3458?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Ash Berlin-Taylor reopened AIRFLOW-3458:


> Refactor: Move Connection out of models.py
> --
>
> Key: AIRFLOW-3458
> URL: https://issues.apache.org/jira/browse/AIRFLOW-3458
> Project: Apache Airflow
>  Issue Type: Task
>  Components: models
>Affects Versions: 1.10.1
>Reporter: Fokko Driesprong
>Assignee: Bas Harenslak
>Priority: Major
> Fix For: 2.0.0
>
>






[jira] [Updated] (AIRFLOW-3623) Support download log file from UI

2019-03-20 Thread Ash Berlin-Taylor (JIRA)


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-3623?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Ash Berlin-Taylor updated AIRFLOW-3623:
---
Fix Version/s: (was: 2.0.0)
   1.10.3

> Support download log file from UI
> -
>
> Key: AIRFLOW-3623
> URL: https://issues.apache.org/jira/browse/AIRFLOW-3623
> Project: Apache Airflow
>  Issue Type: Improvement
>  Components: ui
>Reporter: Ping Zhang
>Assignee: Ping Zhang
>Priority: Major
>  Labels: newbie
> Fix For: 1.10.3
>
>
> For some large log files, it is not a good idea to fetch and render them in 
> the UI. This adds the ability for users to download logs by try_number in the 
> dag modal.





[jira] [Updated] (AIRFLOW-3009) Python 3.7 import collections.abc deprecation warning

2019-03-20 Thread Ash Berlin-Taylor (JIRA)


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-3009?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Ash Berlin-Taylor updated AIRFLOW-3009:
---
Fix Version/s: 1.10.3

> Python 3.7 import collections.abc deprecation warning 
> --
>
> Key: AIRFLOW-3009
> URL: https://issues.apache.org/jira/browse/AIRFLOW-3009
> Project: Apache Airflow
>  Issue Type: Bug
>  Components: core
>Affects Versions: 2.0.0
> Environment: Arch Linux, Python 3.7
>Reporter: Francis Lalonde
>Priority: Minor
> Fix For: 1.10.3
>
>
> After pip-installing Airflow from source, the following warning message 
> appears upon entering any airflow command from the prompt:
> {{/usr/lib/python3.7/site-packages/airflow/models.py:29: DeprecationWarning: 
> Using or importing the ABCs from 'collections' instead of from 
> 'collections.abc' is deprecated, and in 3.8 it will stop working}}{{from 
> collections import namedtuple, defaultdict, Hashable}}
>  
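The fix for the warning quoted above is to import the ABCs from collections.abc while the plain containers stay in collections; a sketch of the corrected import (valid on Python 3.3+):

```python
# Corrected import: ABCs come from collections.abc, plain containers stay
# in collections. This avoids the DeprecationWarning on Python 3.7 and the
# breakage on 3.8+.
from collections import namedtuple, defaultdict
from collections.abc import Hashable

assert isinstance("airflow", Hashable)
```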





[jira] [Updated] (AIRFLOW-3591) Fix start date, end date, duration for rescheduled tasks

2019-03-20 Thread Ash Berlin-Taylor (JIRA)


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-3591?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Ash Berlin-Taylor updated AIRFLOW-3591:
---
Fix Version/s: (was: 2.0.0)

> Fix start date, end date, duration for rescheduled tasks
> 
>
> Key: AIRFLOW-3591
> URL: https://issues.apache.org/jira/browse/AIRFLOW-3591
> Project: Apache Airflow
>  Issue Type: Improvement
>  Components: webserver
>Affects Versions: 1.10.1
>Reporter: Stefan Seelmann
>Assignee: Stefan Seelmann
>Priority: Major
> Fix For: 1.10.3
>
>
> For each reschedule, the task instance start date is currently set to now(), 
> which also means the computed duration is wrong (usually off by one or a few 
> seconds).





[jira] [Updated] (AIRFLOW-3218) Support for DAG level poking in external task sensor

2019-03-20 Thread Ash Berlin-Taylor (JIRA)


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-3218?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Ash Berlin-Taylor updated AIRFLOW-3218:
---
Fix Version/s: (was: 2.0.0)

> Support for DAG level poking in external task sensor
> 
>
> Key: AIRFLOW-3218
> URL: https://issues.apache.org/jira/browse/AIRFLOW-3218
> Project: Apache Airflow
>  Issue Type: New Feature
>Affects Versions: 1.10.0
>Reporter: Marcin Szymanski
>Assignee: Marcin Szymanski
>Priority: Major
> Fix For: 1.10.3
>
>
> Currently the external task sensor requires a specific task to finish. This
> change adds support for DAG-level checks when no task ID is provided.
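A rough sketch of the poke logic this implies, in plain Python with a fake state store (names are hypothetical, not the sensor's real internals): with a task ID the sensor checks that task's state; without one it checks the external DAG run itself.

```python
def poke(states, external_dag_id, execution_date, external_task_id=None):
    """Return True when the external task - or, if no task ID is given,
    the external DAG run itself - has reached 'success'."""
    if external_task_id is not None:
        key = (external_dag_id, external_task_id, execution_date)
    else:
        key = (external_dag_id, execution_date)  # DAG-level check
    return states.get(key) == "success"

# Fake metadata-store contents for illustration:
states = {
    ("other_dag", "final_task", "2019-03-20"): "success",
    ("other_dag", "2019-03-20"): "running",
}

print(poke(states, "other_dag", "2019-03-20", "final_task"))  # → True (task-level)
print(poke(states, "other_dag", "2019-03-20"))                # → False (DAG-level)
```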





[jira] [Updated] (AIRFLOW-3590) In case of reschedule executor should not log success

2019-03-20 Thread Ash Berlin-Taylor (JIRA)


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-3590?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Ash Berlin-Taylor updated AIRFLOW-3590:
---
Fix Version/s: (was: 2.0.0)

> In case of reschedule executor should not log success
> -
>
> Key: AIRFLOW-3590
> URL: https://issues.apache.org/jira/browse/AIRFLOW-3590
> Project: Apache Airflow
>  Issue Type: Improvement
>  Components: executor
>Reporter: Stefan Seelmann
>Assignee: Stefan Seelmann
>Priority: Major
> Fix For: 1.10.3
>
>
> Based on comment from [~ashb] 
> https://github.com/apache/airflow/pull/3596#issuecomment-447590657
> The scheduler (when using SequentialExecutor, but that isn't relevant) logs 
> this task as Success!
> {code}
> [2018-12-15 18:59:13,635] {jobs.py:1100} INFO - 1 tasks up for execution:
>  
> [2018-12-15 18:59:13,649] {jobs.py:1135} INFO - Figuring out tasks to run in 
> Pool(name=None) with 128 open slots and 1 task instances in queue
> [2018-12-15 18:59:13,656] {jobs.py:1171} INFO - DAG hello_world has 0/16 
> running and queued tasks
> [2018-12-15 18:59:13,656] {jobs.py:1209} INFO - Setting the follow tasks to 
> queued state:
>  
> [2018-12-15 18:59:13,698] {jobs.py:1293} INFO - Setting the following 1 tasks 
> to queued state:
>  
> [2018-12-15 18:59:13,699] {jobs.py:1335} INFO - Sending ('hello_world', 
> 'wait', datetime.datetime(2018, 12, 15, 18, 50, tzinfo=), 1) 
> to executor with priority 2 and queue default
> [2018-12-15 18:59:13,701] {base_executor.py:56} INFO - Adding to queue: 
> airflow run hello_world wait 2018-12-15T18:50:00+00:00 --local -sd 
> /Users/ash/airflow/dags/foo.py
> [2018-12-15 18:59:13,742] {sequential_executor.py:45} INFO - Executing 
> command: airflow run hello_world wait 2018-12-15T18:50:00+00:00 --local -sd 
> /Users/ash/airflow/dags/foo.py
> [2018-12-15 18:59:15,558] {__init__.py:51} INFO - Using executor 
> SequentialExecutor
> [2018-12-15 18:59:15,755] {models.py:273} INFO - Filling up the DagBag from 
> /Users/ash/airflow/dags/foo.py
> [2018-12-15 18:59:15,833] {cli.py:530} INFO - Running <TaskInstance:
> hello_world.wait 2018-12-15T18:50:00+00:00 [queued]> on host
> themisto.localdomain
> [2018-12-15 18:59:21,427] {jobs.py:1439} INFO - Executor reports 
> hello_world.wait execution_date=2018-12-15 18:50:00+00:00 as success for 
> try_number 1
> {code}





[jira] [Updated] (AIRFLOW-3881) Correct hive_hook to_csv row number

2019-03-20 Thread Ash Berlin-Taylor (JIRA)


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-3881?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Ash Berlin-Taylor updated AIRFLOW-3881:
---
Fix Version/s: (was: 2.0.0)

> Correct hive_hook to_csv row number
> ---
>
> Key: AIRFLOW-3881
> URL: https://issues.apache.org/jira/browse/AIRFLOW-3881
> Project: Apache Airflow
>  Issue Type: Bug
>  Components: hive_hooks
>Affects Versions: 1.10.2
>Reporter: zhongjiajie
>Assignee: zhongjiajie
>Priority: Minor
>  Labels: easyfix
> Fix For: 1.10.3
>
>
> The hive_hooks to_csv function has so far reported a wrong row number for
> each path; this ticket corrects the row number.
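One plausible shape of such a fix, as a self-contained sketch (illustrative only; the actual hive_hooks change may differ): count only the data rows written, excluding the header, so the reported number matches the data.

```python
import csv
import io

rows = [("a", 1), ("b", 2), ("c", 3)]

buf = io.StringIO()
writer = csv.writer(buf)
writer.writerow(["col1", "col2"])  # header row, excluded from the count

row_count = 0
for row in rows:
    writer.writerow(row)
    row_count += 1

print(row_count)  # → 3 data rows, regardless of the header
```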





[GitHub] [airflow] BasPH commented on issue #4335: [AIRFLOW-3458] Move models.Connection into separate file

2019-03-20 Thread GitBox
BasPH commented on issue #4335: [AIRFLOW-3458] Move models.Connection into 
separate file
URL: https://github.com/apache/airflow/pull/4335#issuecomment-474972002
 
 
   @ashb sounds good to me!


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [airflow] ashb edited a comment on issue #4335: [AIRFLOW-3458] Move models.Connection into separate file

2019-03-20 Thread GitBox
ashb edited a comment on issue #4335: [AIRFLOW-3458] Move models.Connection 
into separate file
URL: https://github.com/apache/airflow/pull/4335#issuecomment-474851078
 
 
   @BasPH @Fokko I'd like to cherry-pick this into 1.10.3, but to do that we 
need a deprecation shim.
   
   I can do this with a few lines at the _end_ of models/__init__.py:
   
   ```python
   # To avoid circular import on Python2.7 we need to define this at the 
_bottom_
   from zope.deprecation import deprecated
   from airflow.models.connection import Connection  # noqa
   
   deprecated('Connection', 'has been moved to airflow.models.connection 
package. Please update your code before Airflow 2.0')
   
   ```
   
   Are you happy for me to "just" do that on the release branch?
   
   Trying that 
https://github.com/ashb/airflow/commit/f6d200876c9ab41aa68e8d51bb1e56aa38720393 
 - tests running https://travis-ci.org/ashb/airflow/builds/509059430
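For illustration, a stdlib-only sketch of what such a shim does at import time (using `warnings` directly rather than zope.deprecation; the names mirror the snippet above but this is not the actual patch):

```python
import warnings

class Connection:  # stand-in for airflow.models.connection.Connection
    def __init__(self, conn_id=None):
        self.conn_id = conn_id

def _deprecated(obj, message):
    """Wrap a class/callable so each use emits a DeprecationWarning."""
    def proxy(*args, **kwargs):
        warnings.warn(message, DeprecationWarning, stacklevel=2)
        return obj(*args, **kwargs)
    return proxy

# The alias that old code imports keeps working, but now warns:
Connection = _deprecated(
    Connection,
    "Connection has been moved to the airflow.models.connection package. "
    "Please update your code before Airflow 2.0")

with warnings.catch_warnings(record=True) as caught:
    warnings.simplefilter("always")
    conn = Connection(conn_id="my_db")

print(conn.conn_id, caught[0].category.__name__)  # → my_db DeprecationWarning
```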




[GitHub] [airflow] ashb commented on a change in pull request #4787: [AIRFLOW-3967] Extract Jinja directive from Javascript

2019-03-20 Thread GitBox
ashb commented on a change in pull request #4787: [AIRFLOW-3967] Extract Jinja 
directive from Javascript
URL: https://github.com/apache/airflow/pull/4787#discussion_r267476864
 
 

 ##
 File path: airflow/www/templates/airflow/dag.html
 ##
 @@ -290,6 +294,7 @@ 
 {% endblock %}
 {% block tail %}
   {{ super() }}
+  
 
 Review comment:
   It seems most/all pages include this file. Should we either add it in to 
base.js, or add this file in baselayout instead?




[jira] [Commented] (AIRFLOW-3746) DockerOperator tasks in Airflow celery worker are stuck in "Running" state

2019-03-20 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-3746?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16797395#comment-16797395
 ] 

ASF GitHub Bot commented on AIRFLOW-3746:
-

ashwiniadiga commented on pull request #4583: [AIRFLOW-3746] Fix to prevent 
missing container exit
URL: https://github.com/apache/airflow/pull/4583
 
 
   
 



>  DockerOperator tasks in Airflow celery worker are stuck in "Running" state
> ---
>
> Key: AIRFLOW-3746
> URL: https://issues.apache.org/jira/browse/AIRFLOW-3746
> Project: Apache Airflow
>  Issue Type: Bug
>  Components: celery
>Reporter: Ashwini
>Priority: Major
>
> With the following DAG and task, using the celery executor, the task runs
> but never completes:
> {code}
> from airflow import DAG
> from airflow.operators.bash_operator import BashOperator
> from airflow.operators.docker_operator import DockerOperator
> from datetime import datetime, timedelta
>
> default_args = {
>     "owner": "airflow",
>     "depends_on_past": False,
>     "start_date": datetime(2018, 12, 31),
>     "email": ["airf...@airflow.com"],
>     "email_on_failure": False,
>     "email_on_retry": False,
>     "retries": 1,
>     "retry_delay": timedelta(minutes=5),
> }
>
> dag = DAG("celery_test", default_args=default_args,
>           schedule_interval=timedelta(1))
> DockerOperator(task_id="test_docker",
>                image="gitlab-registry.nordstrom.com/merchprice/airflow:hello_python",
>                retries=0, xcom_all=True, dag=dag)
>
> t2.set_upstream(t1)
> {code}
> These are the versions of Airflow, Celery and Docker in use, with Python 3.6:
> apache-airflow   1.10.1
> celery           4.1.1
> docker           3.7.0
>  
> --
> Here is the logs:
> *** Log file does not exist: 
> /home/x9eu/airflow/logs/celery_test/test_docker/2019-01-16T00:00:00+00:00/1.log
> *** Fetching from: 
> http://test.com:8793/log/celery_test/test_docker/2019-01-16T00:00:00+00:00/1.log
>  
> [2019-01-21 20:49:26,260] \{models.py:1361} INFO - Dependencies all met for 
> 
> [2019-01-21 20:49:26,742] \{models.py:1361} INFO - Dependencies all met for 
> 
> [2019-01-21 20:49:26,742] \{models.py:1573} INFO - 
> 
> Starting attempt 1 of 1
> 
>  
> [2019-01-21 20:49:26,925] \{models.py:1595} INFO - Executing 
>  on 2019-01-16T00:00:00+00:00
> [2019-01-21 20:49:26,925] \{base_task_runner.py:118} INFO - Running: ['bash', 
> '-c', 'airflow run celery_test test_docker 2019-01-16T00:00:00+00:00 --pickle 
> 20 --job_id 59 --raw --cfg_path /tmp/tmps0u9a_e0']
> [2019-01-21 20:49:27,524] \{base_task_runner.py:101} INFO - Job 59: Subtask 
> test_docker [2019-01-21 20:49:27,523] \{settings.py:174} INFO - 
> setting.configure_orm(): Using pool settings. pool_size=5, pool_recycle=1800
> [2019-01-21 20:49:28,187] \{base_task_runner.py:101} INFO - Job 59: Subtask 
> test_docker [2019-01-21 20:49:28,185] \{__init__.py:51} INFO - Using executor 
> CeleryExecutor
> [2019-01-21 20:49:29,544] \{base_task_runner.py:101} INFO - Job 59: Subtask 
> test_docker [2019-01-21 20:49:29,542] \{cli.py:470} INFO - Loading pickle id 
> 20
> [2019-01-21 20:49:31,140] \{base_task_runner.py:101} INFO - Job 59: Subtask 
> test_docker [2019-01-21 20:49:31,137] \{cli.py:484} INFO - Running 
>  
> on host test.com
> [2019-01-21 20:49:32,603] \{docker_operator.py:182} INFO - Starting docker 
> container from image registry/airflow:hello_python
> [2019-01-21 20:49:48,770] \{docker_operator.py:228} INFO - Hello, %d 0
> Hello, %d 1
> Hello, %d 2
> Hello, %d 3
> Hello, %d 4
> Hello, %d 5
> Hello, %d 6
> Hello, %d 7
> Hello, %d 8
> Hello, %d 9
> Hello, %d 10
> Hello, %d 11
> Hello, %d 12
> Hello, %d 13
> Hello, %d 14
>  
>  





[jira] [Commented] (AIRFLOW-3746) DockerOperator tasks in Airflow celery worker are stuck in "Running" state

2019-03-20 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-3746?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16797396#comment-16797396
 ] 

ASF GitHub Bot commented on AIRFLOW-3746:
-

ashwiniadiga commented on pull request #4583: [AIRFLOW-3746] Fix to prevent 
missing container exit
URL: https://github.com/apache/airflow/pull/4583
 
 
   My PR addresses the following [Airflow 
Jira](https://issues.apache.org/jira/browse/AIRFLOW-3746)
   
   Switch from cli.logs to cli.attach to prevent missing the container exit.
   
   Make sure you have checked _all_ steps below.
   
   ### Jira
   
   - [ ] My PR addresses the following 
[AIRFLOW-3746](https://issues.apache.org/jira/browse/AIRFLOW-3746)
 - https://issues.apache.org/jira/browse/AIRFLOW-3746
   
   
   ### Description
   
   - My PR addresses the following [Airflow 
Jira](https://issues.apache.org/jira/browse/AIRFLOW/) 
   Switch from cli.logs to cli.attach to prevent missing the container exit.
   With DockerOperator and the celery executor, the task runs but never 
completes: it hangs because container.logs() hangs on certain containers.
   The container.logs() function is a wrapper around the container.attach() 
method, which can be used instead to fetch/stream container output without 
first retrieving the entire backlog. Changing container.logs() to 
container.attach() therefore gives the same output and lets the task 
complete, with the following log line:
   The docker_operator Task exited with return code 0
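A toy illustration of the difference described above, using a fake container object rather than docker-py's real API: streaming via an attach-style iterator delivers output (and the eventual exit) incrementally, instead of buffering the whole backlog in one call.

```python
class FakeContainer:
    """Stand-in for a Docker container's output APIs (illustrative only)."""

    def __init__(self, lines):
        self._lines = lines

    def logs(self):
        # Fetches the entire backlog in one call; if the stream never
        # terminates cleanly, this call can hang forever.
        return b"\n".join(self._lines)

    def attach(self):
        # Yields output incrementally as it is produced, so the caller
        # sees each line without waiting for the full backlog.
        for line in self._lines:
            yield line

container = FakeContainer([b"Hello, %d 0", b"Hello, %d 1", b"Hello, %d 2"])
streamed = list(container.attach())
print(len(streamed))  # → 3
```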
   
   - [ ] My PR adds the following unit tests __OR__ does not need testing for 
this extremely good reason: all the existing tests pass; no new tests are 
required.
   
   ### Commits
   
   - [ ] My commits all reference Jira issues in their subject lines, and I 
have squashed multiple commits if they address the same issue. In addition, my 
commits follow the guidelines from "[How to write a good git commit 
message](http://chris.beams.io/posts/git-commit/)":
 1. Subject is separated from body by a blank line
 1. Subject is limited to 50 characters (not including Jira issue reference)
 1. Subject does not end with a period
 1. Subject uses the imperative mood ("add", not "adding")
 1. Body wraps at 72 characters
 1. Body explains "what" and "why", not "how"
   
   ### Documentation
   
   - [ ] In case of new functionality, my PR adds documentation that describes 
how to use it.
 - When adding new operators/hooks/sensors, the autoclass documentation 
generation needs to be added.
 - All the public functions and the classes in the PR contain docstrings 
that explain what it does
   
   ### Code Quality
   
   - [ ] Passes `flake8`
   
 



>  DockerOperator tasks in Airflow celery worker are stuck in "Running" state
> ---
>
> Key: AIRFLOW-3746
> URL: https://issues.apache.org/jira/browse/AIRFLOW-3746
> Project: Apache Airflow
>  Issue Type: Bug
>  Components: celery
>Reporter: Ashwini
>Priority: Major
>
> With the following DAG and task, using the celery executor, the task runs
> but never completes:
> {code}
> from airflow import DAG
> from airflow.operators.bash_operator import BashOperator
> from airflow.operators.docker_operator import DockerOperator
> from datetime import datetime, timedelta
>
> default_args = {
>     "owner": "airflow",
>     "depends_on_past": False,
>     "start_date": datetime(2018, 12, 31),
>     "email": ["airf...@airflow.com"],
>     "email_on_failure": False,
>     "email_on_retry": False,
>     "retries": 1,
>     "retry_delay": timedelta(minutes=5),
> }
>
> dag = DAG("celery_test", default_args=default_args,
>           schedule_interval=timedelta(1))
> DockerOperator(task_id="test_docker",
>                image="gitlab-registry.nordstrom.com/merchprice/airflow:hello_python",
>                retries=0, xcom_all=True, dag=dag)
>
> t2.set_upstream(t1)
> {code}
> These are the versions of Airflow, Celery and Docker in use, with Python 3.6:
> apache-airflow   1.10.1
> celery           4.1.1
> docker           3.7.0
>  
> --
> Here is the logs:
> *** Log file does not exist: 
> /home/x9eu/airflow/logs/celery_test/test_docker/2019-01-16T00:00:00+00:00/1.log
> *** Fetching from: 
> http://test.com:8793/log/celery_test/test_docker/2019-01-16T00:00:00+00:00/1.log
>  
> [2019-01-21 20:49:26,260] \{models.py:1361} INFO - Dependencies all met for 
> 
> [2019-01-21 20:49:26,742] \{models.py:1361} INFO - 

[GitHub] [airflow] ashwiniadiga opened a new pull request #4583: [AIRFLOW-3746] Fix to prevent missing container exit

2019-03-20 Thread GitBox
ashwiniadiga opened a new pull request #4583: [AIRFLOW-3746] Fix to prevent 
missing container exit
URL: https://github.com/apache/airflow/pull/4583
 
 
   My PR addresses the following [Airflow 
Jira](https://issues.apache.org/jira/browse/AIRFLOW-3746)
   
   Switch from cli.logs to cli.attach to prevent missing the container exit.
   
   Make sure you have checked _all_ steps below.
   
   ### Jira
   
   - [ ] My PR addresses the following 
[AIRFLOW-3746](https://issues.apache.org/jira/browse/AIRFLOW-3746)
 - https://issues.apache.org/jira/browse/AIRFLOW-3746
   
   
   ### Description
   
   - My PR addresses the following [Airflow 
Jira](https://issues.apache.org/jira/browse/AIRFLOW/) 
   Switch from cli.logs to cli.attach to prevent missing the container exit.
   With DockerOperator and the celery executor, the task runs but never 
completes: it hangs because container.logs() hangs on certain containers.
   The container.logs() function is a wrapper around the container.attach() 
method, which can be used instead to fetch/stream container output without 
first retrieving the entire backlog. Changing container.logs() to 
container.attach() therefore gives the same output and lets the task 
complete, with the following log line:
   The docker_operator Task exited with return code 0
   
   - [ ] My PR adds the following unit tests __OR__ does not need testing for 
this extremely good reason: all the existing tests pass; no new tests are 
required.
   
   ### Commits
   
   - [ ] My commits all reference Jira issues in their subject lines, and I 
have squashed multiple commits if they address the same issue. In addition, my 
commits follow the guidelines from "[How to write a good git commit 
message](http://chris.beams.io/posts/git-commit/)":
 1. Subject is separated from body by a blank line
 1. Subject is limited to 50 characters (not including Jira issue reference)
 1. Subject does not end with a period
 1. Subject uses the imperative mood ("add", not "adding")
 1. Body wraps at 72 characters
 1. Body explains "what" and "why", not "how"
   
   ### Documentation
   
   - [ ] In case of new functionality, my PR adds documentation that describes 
how to use it.
 - When adding new operators/hooks/sensors, the autoclass documentation 
generation needs to be added.
 - All the public functions and the classes in the PR contain docstrings 
that explain what it does
   
   ### Code Quality
   
   - [ ] Passes `flake8`
   




[GitHub] [airflow] ashwiniadiga commented on issue #4583: [AIRFLOW-3746] Fix to prevent missing container exit

2019-03-20 Thread GitBox
ashwiniadiga commented on issue #4583: [AIRFLOW-3746] Fix to prevent missing 
container exit
URL: https://github.com/apache/airflow/pull/4583#issuecomment-474960633
 
 
   Done




[GitHub] [airflow] ashwiniadiga commented on issue #4583: [AIRFLOW-3746] Fix to prevent missing container exit

2019-03-20 Thread GitBox
ashwiniadiga commented on issue #4583: [AIRFLOW-3746] Fix to prevent missing 
container exit
URL: https://github.com/apache/airflow/pull/4583#issuecomment-474960618
 
 
   Done




[GitHub] [airflow] ashwiniadiga closed pull request #4583: [AIRFLOW-3746] Fix to prevent missing container exit

2019-03-20 Thread GitBox
ashwiniadiga closed pull request #4583: [AIRFLOW-3746] Fix to prevent missing 
container exit
URL: https://github.com/apache/airflow/pull/4583
 
 
   




[jira] [Created] (AIRFLOW-4132) No default connection id for some operators

2019-03-20 Thread Kamil Bregula (JIRA)
Kamil Bregula created AIRFLOW-4132:
--

 Summary: No default connection id for some operators
 Key: AIRFLOW-4132
 URL: https://issues.apache.org/jira/browse/AIRFLOW-4132
 Project: Apache Airflow
  Issue Type: Improvement
Reporter: Kamil Bregula


Hello,

Some operators do not have a default connection ID set, which makes them 
harder to use.

Regards





[GitHub] [airflow] vsoch commented on issue #4846: [AIRFLOW-4030] adding start to singularity for airflow

2019-03-20 Thread GitBox
vsoch commented on issue #4846: [AIRFLOW-4030] adding start to singularity for 
airflow
URL: https://github.com/apache/airflow/pull/4846#issuecomment-474939864
 
 
   Random testing failure? 
   
![image](https://user-images.githubusercontent.com/814322/54705136-825ae400-4b12-11e9-8f39-52ab417dcd75.png)
   




[GitHub] [airflow] bryanyang0528 commented on a change in pull request #4919: [AIRFLOW-4093] Throw exception if job failed or cancelled or retry too many times

2019-03-20 Thread GitBox
bryanyang0528 commented on a change in pull request #4919: [AIRFLOW-4093] Throw 
exception if job failed or cancelled or retry too many times
URL: https://github.com/apache/airflow/pull/4919#discussion_r267442260
 
 

 ##
 File path: airflow/contrib/operators/aws_athena_operator.py
 ##
 @@ -74,7 +76,16 @@ def execute(self, context):
 self.result_configuration['OutputLocation'] = self.output_location
 self.query_execution_id = self.hook.run_query(self.query, 
self.query_execution_context,
   
self.result_configuration, self.client_request_token)
-self.hook.poll_query_status(self.query_execution_id)
+query_status = self.hook.poll_query_status(self.query_execution_id, 
self.max_tries)
+
+if not query_status or query_status in AWSAthenaHook.FAILURE_STATES:
+raise Exception(
+'Athena job failed. Final state is {}, query_execution_id is 
{}.'
+.format(query_status, self.query_execution_id))
+elif query_status in AWSAthenaHook.INTERMEDIATE_STATES:
 
 Review comment:
   @XD-DENG Thank you for the detailed explanation. It's very clear and helpful.




[GitHub] [airflow] r39132 commented on a change in pull request #4943: [AIRFLOW-XXX] add description of is_delete_operator_pod

2019-03-20 Thread GitBox
r39132 commented on a change in pull request #4943: [AIRFLOW-XXX] add 
description of is_delete_operator_pod
URL: https://github.com/apache/airflow/pull/4943#discussion_r267428719
 
 

 ##
 File path: airflow/contrib/operators/kubernetes_pod_operator.py
 ##
 @@ -80,6 +80,10 @@ class KubernetesPodOperator(BaseOperator):
 /airflow/xcom/return.json in the container will also be pushed to an
 XCom when the container completes.
 :type do_xcom_push: bool
+:param is_delete_operator_pod: If True, delete the pod after it's reached
+a final state (independent of pod success), or the task instance gets 
 
 Review comment:
   Nit: since this is a boolean param, can you change the description to "If ... 
else" instead of "If ... or"? Reading it as written, it's a bit confusing what 
this parameter does.




[jira] [Assigned] (AIRFLOW-2603) Set up Intersphinx for cross-project references in the docs

2019-03-20 Thread Kamil Bregula (JIRA)


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-2603?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Kamil Bregula reassigned AIRFLOW-2603:
--

Assignee: Kamil Bregula  (was: Tim Swast)

> Set up Intersphinx for cross-project references in the docs
> ---
>
> Key: AIRFLOW-2603
> URL: https://issues.apache.org/jira/browse/AIRFLOW-2603
> Project: Apache Airflow
>  Issue Type: Improvement
>  Components: Documentation
>Reporter: Tim Swast
>Assignee: Kamil Bregula
>Priority: Minor
>
> Currently there are many modules mocked out in docs/conf.py. These modules 
> could be unmocked and linked to from the API reference documentation by using 
> Intersphinx.
>  
> For example, see how this is configured in Pandas at 
> https://github.com/pandas-dev/pandas/blob/master/doc/source/conf.py#L366-L374
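For reference, a minimal intersphinx setup in a Sphinx `conf.py` looks roughly like this (the mapping targets here are illustrative, not Airflow's actual configuration):

```python
# docs/conf.py (excerpt)
extensions = [
    "sphinx.ext.intersphinx",
]

# Map external project names to their docs; Sphinx fetches each project's
# objects.inv inventory so cross-references like :class:`pandas.DataFrame`
# resolve to the external site instead of needing mocked-out modules.
intersphinx_mapping = {
    "python": ("https://docs.python.org/3/", None),
    "pandas": ("https://pandas.pydata.org/pandas-docs/stable/", None),
}

print(sorted(intersphinx_mapping))  # → ['pandas', 'python']
```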





[jira] [Commented] (AIRFLOW-4008) Add ability to set a ConfigMap to EnvFrom for pods brought up by the Kubernetes Executor

2019-03-20 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-4008?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16797277#comment-16797277
 ] 

ASF GitHub Bot commented on AIRFLOW-4008:
-

davlum commented on pull request #4952: feat/AIRFLOW-4008/k8s-executor-env-from
URL: https://github.com/apache/airflow/pull/4952
 
 
   Make sure you have checked _all_ steps below.
   
   ### Jira
   
   - [ ] My PR addresses the following [Airflow 
Jira](https://issues.apache.org/jira/browse/AIRFLOW/) issues and references 
them in the PR title. For example, "\[AIRFLOW-XXX\] My Airflow PR"
 - https://issues.apache.org/jira/browse/AIRFLOW-XXX
 - In case you are fixing a typo in the documentation you can prepend your 
commit with \[AIRFLOW-XXX\], code changes always need a Jira issue.
 - In case you are proposing a fundamental code change, you need to create 
an Airflow Improvement Proposal 
([AIP](https://cwiki.apache.org/confluence/display/AIRFLOW/Airflow+Improvements+Proposals)).
 - In case you are adding a dependency, check if the license complies with 
the [ASF 3rd Party License 
Policy](https://www.apache.org/legal/resolved.html#category-x).
   
   ### Description
   
   - [ ] Here are some details about my PR, including screenshots of any UI 
changes:
   
   ### Tests
   
   - [ ] My PR adds the following unit tests __OR__ does not need testing for 
this extremely good reason:
   
   ### Commits
   
   - [ ] My commits all reference Jira issues in their subject lines, and I 
have squashed multiple commits if they address the same issue. In addition, my 
commits follow the guidelines from "[How to write a good git commit 
message](http://chris.beams.io/posts/git-commit/)":
 1. Subject is separated from body by a blank line
 1. Subject is limited to 50 characters (not including Jira issue reference)
 1. Subject does not end with a period
 1. Subject uses the imperative mood ("add", not "adding")
 1. Body wraps at 72 characters
 1. Body explains "what" and "why", not "how"
   
   ### Documentation
   
   - [ ] In case of new functionality, my PR adds documentation that describes 
how to use it.
 - When adding new operators/hooks/sensors, the autoclass documentation 
generation needs to be added.
 - All the public functions and the classes in the PR contain docstrings 
that explain what it does
 - If you implement backwards incompatible changes, please leave a note in 
the [Updating.md](https://github.com/apache/airflow/blob/master/UPDATING.md) so 
we can assign it to an appropriate release
   
   ### Code Quality
   
   - [ ] Passes `flake8`
   
 



> Add ability to set a ConfigMap to EnvFrom for pods brought up by the 
> Kubernetes Executor
> 
>
> Key: AIRFLOW-4008
> URL: https://issues.apache.org/jira/browse/AIRFLOW-4008
> Project: Apache Airflow
>  Issue Type: Improvement
>  Components: kubernetes
>Reporter: David Lum
>Assignee: David Lum
>Priority: Minor
>  Labels: Kubernetes, kubernetes
>
> This ticket is split off from AIRFLOW-3258. This ticket will allow users to 
> specify configMapRefs and secretRefs to envFrom for worker pods brought up by 
> the KubernetesExecutor.
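A hedged sketch of what the ticket describes, in plain-dict form of a pod spec (all names here are hypothetical): the worker container references a ConfigMap and a Secret via envFrom, so every key becomes an environment variable without listing each one individually.

```python
# Fragment of a KubernetesExecutor worker pod spec (dict form of the
# Kubernetes API object; names are illustrative):
worker_container = {
    "name": "base",
    "image": "apache/airflow:1.10.3",
    "envFrom": [
        {"configMapRef": {"name": "airflow-worker-config"}},
        {"secretRef": {"name": "airflow-worker-secrets"}},
    ],
}

print(worker_container["envFrom"][0]["configMapRef"]["name"])
```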





[jira] [Commented] (AIRFLOW-2603) Set up Intersphinx for cross-project references in the docs

2019-03-20 Thread Kamil Bregula (JIRA)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-2603?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16797278#comment-16797278
 ] 

Kamil Bregula commented on AIRFLOW-2603:


Done via: [https://github.com/apache/airflow/pull/4655]

> Set up Intersphinx for cross-project references in the docs
> ---
>
> Key: AIRFLOW-2603
> URL: https://issues.apache.org/jira/browse/AIRFLOW-2603
> Project: Apache Airflow
>  Issue Type: Improvement
>  Components: Documentation
>Reporter: Tim Swast
>Assignee: Tim Swast
>Priority: Minor
>
> Currently there are many modules mocked out in docs/conf.py. These modules 
> could be unmocked and linked to from the API reference documentation by using 
> Intersphinx.
>  
> For example, see how this is configured in Pandas at 
> https://github.com/pandas-dev/pandas/blob/master/doc/source/conf.py#L366-L374





[GitHub] [airflow] davlum opened a new pull request #4952: feat/AIRFLOW-4008/k8s-executor-env-from

2019-03-20 Thread GitBox
davlum opened a new pull request #4952: feat/AIRFLOW-4008/k8s-executor-env-from
URL: https://github.com/apache/airflow/pull/4952
 
 
   Make sure you have checked _all_ steps below.
   
   ### Jira
   
   - [ ] My PR addresses the following [Airflow 
Jira](https://issues.apache.org/jira/browse/AIRFLOW/) issues and references 
them in the PR title. For example, "\[AIRFLOW-XXX\] My Airflow PR"
 - https://issues.apache.org/jira/browse/AIRFLOW-XXX
 - In case you are fixing a typo in the documentation you can prepend your 
commit with \[AIRFLOW-XXX\], code changes always need a Jira issue.
 - In case you are proposing a fundamental code change, you need to create 
an Airflow Improvement Proposal 
([AIP](https://cwiki.apache.org/confluence/display/AIRFLOW/Airflow+Improvements+Proposals)).
 - In case you are adding a dependency, check if the license complies with 
the [ASF 3rd Party License 
Policy](https://www.apache.org/legal/resolved.html#category-x).
   
   ### Description
   
   - [ ] Here are some details about my PR, including screenshots of any UI 
changes:
   
   ### Tests
   
   - [ ] My PR adds the following unit tests __OR__ does not need testing for 
this extremely good reason:
   
   ### Commits
   
   - [ ] My commits all reference Jira issues in their subject lines, and I 
have squashed multiple commits if they address the same issue. In addition, my 
commits follow the guidelines from "[How to write a good git commit 
message](http://chris.beams.io/posts/git-commit/)":
 1. Subject is separated from body by a blank line
 1. Subject is limited to 50 characters (not including Jira issue reference)
 1. Subject does not end with a period
 1. Subject uses the imperative mood ("add", not "adding")
 1. Body wraps at 72 characters
 1. Body explains "what" and "why", not "how"
   
   ### Documentation
   
   - [ ] In case of new functionality, my PR adds documentation that describes 
how to use it.
 - When adding new operators/hooks/sensors, the autoclass documentation 
generation needs to be added.
 - All the public functions and classes in the PR contain docstrings 
that explain what they do
 - If you implement backwards-incompatible changes, please leave a note in 
[UPDATING.md](https://github.com/apache/airflow/blob/master/UPDATING.md) so 
we can assign it to an appropriate release
   
   ### Code Quality
   
   - [ ] Passes `flake8`
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[jira] [Commented] (AIRFLOW-4012) Upgrade upper bound of tabulate to 0.8.3

2019-03-20 Thread ASF subversion and git services (JIRA)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-4012?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16797236#comment-16797236
 ] 

ASF subversion and git services commented on AIRFLOW-4012:
--

Commit ed10eb4e01e40d44ac6bc71e28a5909f60943ce8 in airflow's branch 
refs/heads/v1-10-stable from OmerJog
[ https://gitbox.apache.org/repos/asf?p=airflow.git;h=ed10eb4 ]

[AIRFLOW-4012] - Upgrade tabulate to 0.8.3 (#4838)

Pin the upper version bound to 0.9 too, so we don't have to update this again
until 0.9 is out (when we can check for compatibility)
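The pin described in the commit message would look roughly like the following requirement specifier. This is an illustrative fragment, not the actual contents of Airflow's setup.py:

```python
# Illustrative setup.py fragment: the upper bound keeps installs off
# 0.9.x until compatibility can be verified, while still picking up
# 0.8.3+ bug fixes.
install_requires = [
    'tabulate>=0.8.3, <0.9',
]
```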


> Upgrade upper bound of tabulate to 0.8.3
> 
>
> Key: AIRFLOW-4012
> URL: https://issues.apache.org/jira/browse/AIRFLOW-4012
> Project: Apache Airflow
>  Issue Type: Improvement
>Reporter: omerjog
>Priority: Major
> Fix For: 1.10.3
>
>
> Version 0.8.3 was released in January 2019



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[GitHub] [airflow] ashb edited a comment on issue #4335: [AIRFLOW-3458] Move models.Connection into separate file

2019-03-20 Thread GitBox
ashb edited a comment on issue #4335: [AIRFLOW-3458] Move models.Connection 
into separate file
URL: https://github.com/apache/airflow/pull/4335#issuecomment-474851078
 
 
   @BasPH @Fokko I'd like to cherry-pick this into 1.10.3, but to do that we 
need a deprecation shim.
   
   I can do this with a few lines at the _end_ of models/__init__.py:
   
   ```python
   # To avoid circular import on Python 2.7 we need to define this at the _bottom_
   from zope.deprecation import deprecated
   from airflow.models.connection import Connection  # noqa

   deprecated('Connection', 'has been moved to airflow.models.connection package. Please update your code before Airflow 2.0')
   ```
   
   Are you happy for me to "just" do that on the release branch?
   
   Trying that 
https://github.com/ashb/airflow/commit/ae414097f46b9c4eef5e41907112b328fb13203d 
 - tests running https://travis-ci.org/ashb/airflow/builds/508974769
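A stdlib-only sketch of what such a shim achieves: importing through the old path still works, but the caller gets a `DeprecationWarning`. This is purely illustrative (the real shim uses `zope.deprecation`, and the class name here is a stand-in):

```python
import warnings


class Connection:
    """Stand-in for airflow.models.connection.Connection."""


def deprecated_connection(*args, **kwargs):
    # Emulates the deprecation wrapper: warn, then delegate to the
    # relocated class so old call sites keep working.
    warnings.warn(
        'Connection has been moved to the airflow.models.connection '
        'package. Please update your code before Airflow 2.0',
        DeprecationWarning, stacklevel=2)
    return Connection(*args, **kwargs)


with warnings.catch_warnings(record=True) as caught:
    warnings.simplefilter('always')
    conn = deprecated_connection()  # old entry point, now warns
```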




[jira] [Commented] (AIRFLOW-2767) Upgrade gunicorn to 19.5.0 or greater to avoid moderate-severity CVE

2019-03-20 Thread ASF subversion and git services (JIRA)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-2767?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16797239#comment-16797239
 ] 

ASF subversion and git services commented on AIRFLOW-2767:
--

Commit f2d570385b0b7c6ca1a871bf9d92aca0231077e7 in airflow's branch 
refs/heads/v1-10-stable from RosterIn
[ https://gitbox.apache.org/repos/asf?p=airflow.git;h=f2d5703 ]

[AIRFLOW-2767] - Upgrade gunicorn to 19.5.0 to avoid moderate-severity CVE 
(#4795)

Upgrade gunicorn to 19.5.0 to avoid moderate-severity CVE

> Upgrade gunicorn to 19.5.0 or greater to avoid moderate-severity CVE
> 
>
> Key: AIRFLOW-2767
> URL: https://issues.apache.org/jira/browse/AIRFLOW-2767
> Project: Apache Airflow
>  Issue Type: Improvement
>Reporter: Siddharth Anand
>Priority: Major
> Fix For: 1.10.3
>
>
> Refer to the moderate-severity CVE in gunicorn 19.4.5 (apparently fixed in 
> 19.5.0)
> [https://nvd.nist.gov/vuln/detail/CVE-2018-1000164] 
> Currently, apache airflow's setup.py allows 19.4.0





[jira] [Commented] (AIRFLOW-3701) Google Cloud Vision Product Search operators

2019-03-20 Thread ASF subversion and git services (JIRA)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-3701?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16797234#comment-16797234
 ] 

ASF subversion and git services commented on AIRFLOW-3701:
--

Commit b14d070836165d046056f9c6a7d474fbb34aca28 in airflow's branch 
refs/heads/v1-10-stable from Szymon Przedwojski
[ https://gitbox.apache.org/repos/asf?p=airflow.git;h=b14d070 ]

[AIRFLOW-3701] Add Google Cloud Vision Product Search operators (#4665)


> Google Cloud Vision Product Search operators
> 
>
> Key: AIRFLOW-3701
> URL: https://issues.apache.org/jira/browse/AIRFLOW-3701
> Project: Apache Airflow
>  Issue Type: New Feature
>  Components: gcp
>Reporter: Szymon Przedwojski
>Assignee: Szymon Przedwojski
>Priority: Minor
> Fix For: 1.10.3
>
>
> Implement operators to access the Cloud Vision 
> [Product|https://cloud.google.com/vision/product-search/docs/reference/rest/v1/projects.locations.products]
>  and [Product 
> Set|https://cloud.google.com/vision/product-search/docs/reference/rest/v1/projects.locations.productSets]
>  APIs.
>  
> |CloudVisionProductSetCreateOperator|
> |CloudVisionProductSetUpdateOperator|
> |CloudVisionProductSetGetOperator|
> |CloudVisionProductSetDeleteOperator|
> |CloudVisionProductCreateOperator|
> |CloudVisionProductUpdateOperator|
> |CloudVisionProductGetOperator|
> |CloudVisionProductDeleteOperator|





[jira] [Commented] (AIRFLOW-3795) provide_context is not a passable parameter for PythonVirtualenvOperator

2019-03-20 Thread ASF subversion and git services (JIRA)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-3795?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16797238#comment-16797238
 ] 

ASF subversion and git services commented on AIRFLOW-3795:
--

Commit 94f355c7cd3bb4e64974c2f8f0f6e796da9423ea in airflow's branch 
refs/heads/v1-10-stable from Sergio Soto
[ https://gitbox.apache.org/repos/asf?p=airflow.git;h=94f355c ]

[AIRFLOW-3795] provide_context param is now used (#4735)

* provide_context param is now used

* Fixed new PythonVirtualenvOperator test


> provide_context is not a passable parameter for PythonVirtualenvOperator
> 
>
> Key: AIRFLOW-3795
> URL: https://issues.apache.org/jira/browse/AIRFLOW-3795
> Project: Apache Airflow
>  Issue Type: Improvement
>  Components: operators
>Reporter: Susannah Doss
>Assignee: Sergio Soto Núñez
>Priority: Trivial
> Fix For: 1.10.3
>
>
> `PythonVirtualenvOperator` does not allow me to specify 
> `provide_context=True`: 
> https://github.com/apache/airflow/blob/83cb9c3acdd3b4eeadf1cab3cb45d644c3e9ede0/airflow/operators/python_operator.py#L242
> However, I am able to do so when I use the plain `PythonOperator`. I can't 
> see a reason why I wouldn't be allowed to have it be set to `True` when using 
> a `PythonVirtualenvOperator`.





[jira] [Commented] (AIRFLOW-3758) Airflow command fails when remote logging enabled to azure blob

2019-03-20 Thread ASF subversion and git services (JIRA)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-3758?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16797235#comment-16797235
 ] 

ASF subversion and git services commented on AIRFLOW-3758:
--

Commit 753cc45730d4afe7a0f15e3fc1d59d3c0d06d1d2 in airflow's branch 
refs/heads/v1-10-stable from Tanay Tummalapalli
[ https://gitbox.apache.org/repos/asf?p=airflow.git;h=753cc45 ]

[AIRFLOW-3758] Fix circular import in WasbTaskHandler (#4601)

WasbHook was causing a circular import error when configure_logging() was 
called.
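The usual shape of this kind of fix is to defer the hook import from module level into the point of use, so the logging config can load the handler module before the rest of the package is importable. The sketch below is hypothetical (not the exact Airflow patch), with names kept only for illustration:

```python
class WasbTaskHandler:
    """Sketch of a lazy-import fix: the WasbHook import moves from
    module level into a property, breaking the circular chain
    wasb_task_handler -> wasb_hook -> base_hook -> models."""

    def __init__(self, wasb_conn_id='wasb_default'):
        self.wasb_conn_id = wasb_conn_id
        self._hook = None

    @property
    def hook(self):
        if self._hook is None:
            # Imported lazily so configure_logging() can import this
            # module without pulling in airflow.models first.
            from airflow.contrib.hooks.wasb_hook import WasbHook
            self._hook = WasbHook(self.wasb_conn_id)
        return self._hook
```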

> Airflow command fails when remote logging enabled to azure blob
> ---
>
> Key: AIRFLOW-3758
> URL: https://issues.apache.org/jira/browse/AIRFLOW-3758
> Project: Apache Airflow
>  Issue Type: Bug
>  Components: logging
>Affects Versions: 1.10.2
>Reporter: Marko Kattelus
>Assignee: Tanay Tummalapalli
>Priority: Major
> Fix For: 1.10.3
>
>
> I tried to update to 1.10.2 but when running
> {code:java}
> airflow initdb
> {code}
> I get following:
> {code:java}
> webserver_1 | Updating database
> webserver_1 | Unable to load the config, contains a configuration error.
> webserver_1 | Traceback (most recent call last):
> webserver_1 | File "/usr/local/lib/python3.6/logging/config.py", line 390, in 
> resolve
> webserver_1 | found = getattr(found, frag)
> webserver_1 | AttributeError: module 'airflow.utils.log' has no attribute 
> 'wasb_task_handler'
> webserver_1 |
> webserver_1 | During handling of the above exception, another exception 
> occurred:
> webserver_1 |
> webserver_1 | Traceback (most recent call last):
> webserver_1 | File "/usr/local/lib/python3.6/logging/config.py", line 392, in 
> resolve
> webserver_1 | self.importer(used)
> webserver_1 | File 
> "/usr/local/lib/python3.6/site-packages/airflow/utils/log/wasb_task_handler.py",
>  line 23, in 
> webserver_1 | from airflow.contrib.hooks.wasb_hook import WasbHook
> webserver_1 | File 
> "/usr/local/lib/python3.6/site-packages/airflow/contrib/hooks/wasb_hook.py", 
> line 22, in 
> webserver_1 | from airflow.hooks.base_hook import BaseHook
> webserver_1 | File 
> "/usr/local/lib/python3.6/site-packages/airflow/hooks/base_hook.py", line 28, 
> in 
> webserver_1 | from airflow.models import Connection
> webserver_1 | File 
> "/usr/local/lib/python3.6/site-packages/airflow/models.py", line 86, in 
> 
> webserver_1 | from airflow.utils.dag_processing import list_py_file_paths
> webserver_1 | File 
> "/usr/local/lib/python3.6/site-packages/airflow/utils/dag_processing.py", 
> line 49, in 
> webserver_1 | from airflow.settings import logging_class_path
> webserver_1 | ImportError: cannot import name 'logging_class_path'
> webserver_1 |
> webserver_1 | The above exception was the direct cause of the following 
> exception:
> webserver_1 |
> webserver_1 | Traceback (most recent call last):
> webserver_1 | File "/usr/local/lib/python3.6/logging/config.py", line 565, in 
> configure
> webserver_1 | handler = self.configure_handler(handlers[name])
> webserver_1 | File "/usr/local/lib/python3.6/logging/config.py", line 715, in 
> configure_handler
> webserver_1 | klass = self.resolve(cname)
> webserver_1 | File "/usr/local/lib/python3.6/logging/config.py", line 399, in 
> resolve
> webserver_1 | raise v
> webserver_1 | File "/usr/local/lib/python3.6/logging/config.py", line 392, in 
> resolve
> webserver_1 | self.importer(used)
> webserver_1 | File 
> "/usr/local/lib/python3.6/site-packages/airflow/utils/log/wasb_task_handler.py",
>  line 23, in 
> webserver_1 | from airflow.contrib.hooks.wasb_hook import WasbHook
> webserver_1 | File 
> "/usr/local/lib/python3.6/site-packages/airflow/contrib/hooks/wasb_hook.py", 
> line 22, in 
> webserver_1 | from airflow.hooks.base_hook import BaseHook
> webserver_1 | File 
> "/usr/local/lib/python3.6/site-packages/airflow/hooks/base_hook.py", line 28, 
> in 
> webserver_1 | from airflow.models import Connection
> webserver_1 | File 
> "/usr/local/lib/python3.6/site-packages/airflow/models.py", line 86, in 
> 
> webserver_1 | from airflow.utils.dag_processing import list_py_file_paths
> webserver_1 | File 
> "/usr/local/lib/python3.6/site-packages/airflow/utils/dag_processing.py", 
> line 49, in 
> webserver_1 | from airflow.settings import logging_class_path
> webserver_1 | ValueError: Cannot resolve 
> 'airflow.utils.log.wasb_task_handler.WasbTaskHandler': cannot import name 
> 'logging_class_path'
> webserver_1 |
> webserver_1 | During handling of the above exception, another exception 
> occurred:
> webserver_1 |
> webserver_1 | Traceback (most recent call last):
> webserver_1 | File "/usr/local/bin/airflow", line 21, in 
> webserver_1 | from airflow import configuration
> webserver_1 | File 
> "/usr/local/lib/python3.6/site-packages/airflow/__init__.py", 

[jira] [Commented] (AIRFLOW-4019) AWS Athena Sensor's object has no attribute 'mode'

2019-03-20 Thread ASF subversion and git services (JIRA)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-4019?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16797237#comment-16797237
 ] 

ASF subversion and git services commented on AIRFLOW-4019:
--

Commit 0fb32166825d630cc5e87b39588e280737567448 in airflow's branch 
refs/heads/v1-10-stable from Mariko Wakabayashi
[ https://gitbox.apache.org/repos/asf?p=airflow.git;h=0fb3216 ]

[AIRFLOW-4019] Fix AWS Athena Sensor object has no attribute 'mode' (#4844)



> AWS Athena Sensor's object has no attribute 'mode'
> --
>
> Key: AIRFLOW-4019
> URL: https://issues.apache.org/jira/browse/AIRFLOW-4019
> Project: Apache Airflow
>  Issue Type: Bug
>  Components: contrib
>Affects Versions: 1.10.2
>Reporter: Mariko Wakabayashi
>Assignee: Mariko Wakabayashi
>Priority: Major
> Fix For: 1.10.3
>
>
> *Bug*
> {code:java}
> [2019-03-05 18:52:32,317] {models.py:1788} ERROR - 'AthenaSensor' object has 
> no attribute 'mode'
> Traceback (most recent call last):
>   File "/usr/local/lib/python3.6/site-packages/airflow/models.py", line 1657, 
> in _run_raw_task
> result = task_copy.execute(context=context)
>   File 
> "/usr/local/lib/python3.6/site-packages/airflow/sensors/base_sensor_operator.py",
>  line 92, in execute
> if self.reschedule:
>   File 
> "/usr/local/lib/python3.6/site-packages/airflow/sensors/base_sensor_operator.py",
>  line 123, in reschedule
> return self.mode == 'reschedule'{code}
> *Fix*
>  * AWS Athena Sensor's first argument to super() is incorrect
>  * 
> [https://github.com/apache/airflow/blob/master/airflow/contrib/sensors/aws_athena_sensor.py#L59]
>  
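The class of bug described above can be sketched as follows (names illustrative, not the actual Airflow code): in a Python 2-style `super()` call, naming the wrong class skips the base sensor's `__init__`, so attributes it sets (like `mode`) are never created.

```python
# Minimal sketch of the fix: the first argument to super() must be the
# class it appears in, otherwise the base __init__ that sets `mode`
# never runs and later `self.mode` access raises AttributeError.
class BaseSensorOperator(object):
    def __init__(self, mode='poke'):
        self.mode = mode


class AthenaSensor(BaseSensorOperator):
    def __init__(self):
        # Correct: name this class, not a sibling or unrelated class.
        super(AthenaSensor, self).__init__(mode='poke')


sensor = AthenaSensor()
```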





[jira] [Commented] (AIRFLOW-3615) Connection parsed from URI - case-insensitive UNIX socket paths in python 2.7 -> 3.5 (but not in 3.6)

2019-03-20 Thread ASF subversion and git services (JIRA)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-3615?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16797229#comment-16797229
 ] 

ASF subversion and git services commented on AIRFLOW-3615:
--

Commit dd8ce6a7123170ef4b0f719fb773527b17d9348c in airflow's branch 
refs/heads/master from Kamil Breguła
[ https://gitbox.apache.org/repos/asf?p=airflow.git;h=dd8ce6a ]

[AIRFLOW-3615] Preserve case of UNIX socket paths in Connections (#4591)



> Connection parsed from URI - case-insensitive UNIX socket paths in python 2.7 
> -> 3.5 (but not in 3.6) 
> --
>
> Key: AIRFLOW-3615
> URL: https://issues.apache.org/jira/browse/AIRFLOW-3615
> Project: Apache Airflow
>  Issue Type: Bug
>Reporter: Jarek Potiuk
>Assignee: Kamil Bregula
>Priority: Major
> Fix For: 1.10.3
>
>
> There is a problem with case sensitivity of parsing URI for database 
> connections which are using local UNIX sockets rather than TCP connection.
> In case of local UNIX sockets the hostname part of the URI contains 
> url-encoded local socket path rather than actual hostname and in case this 
> path contains uppercase characters, urlparse will deliberately lowercase them 
> when parsing. This is perfectly fine for hostnames (according to 
> [https://tools.ietf.org/html/rfc3986#section-6.2.3]) case normalisation 
> should be done for hostnames.
> However, urlparse still uses hostname if the URI does not contain a host but 
> only a local path (i.e. when the location starts with %2F ("/")). What's more, 
> the host gets converted to lowercase for Python 2.7 - 3.5. Surprisingly, this 
> is somewhat "fixed" in 3.6 (i.e. if the URL location starts with %2F, the 
> hostname is no longer normalized to lowercase; see the snippets below 
> showing the behaviour for different Python versions).
> In Airflow's Connection this problem bubbles up. Airflow uses urlparse to get 
> the hostname/path in models.py:parse_from_uri, and in the case of UNIX sockets 
> this is done via the hostname. There is no other reliable way when using 
> urlparse, because the path can also contain the 'authority' (user/password) 
> and it is urlparse's job to separate them out. Airflow's Connection similarly 
> makes no distinction between TCP and local socket connections, and it uses the 
> host field to store the socket path (which is case sensitive, however). So you 
> can use UPPERCASE when you define a connection in the database, but this is a 
> problem for parsing connections from environment variables, because we 
> currently cannot pass a URI whose socket path contains UPPERCASE characters.
> Since urlparse is really there to parse URLs and is not good for parsing 
> non-URL URIs, we should likely use a different parser that handles more 
> generic URIs, including not lowercasing the path, on all versions of Python.
> I think we could also consider adding a local path to the Connection model and 
> using it instead of hostname to store the socket path. That approach would be 
> the "correct" one, but it might introduce compatibility issues, so maybe it's 
> not worth it, considering that host is case sensitive in Airflow.
> Snippet showing urlparse behaviour in different Python versions:
> {quote}Python 2.7.10 (default, Aug 17 2018, 19:45:58)
>  [GCC 4.2.1 Compatible Apple LLVM 10.0.0 (clang-1000.0.42)] on darwin
>  Type "help", "copyright", "credits" or "license" for more information.
>  >>> from urlparse import urlparse,unquote
>  >>> conn = urlparse("http://AAA")
>  >>> conn.hostname
>  'aaa'
>  >>> conn = urlparse("http://%2FAAA")
>  >>> conn.hostname
>  '%2faaa'
> {quote}
>  
> {quote}Python 3.5.4 (v3.5.4:3f56838976, Aug 7 2017, 12:56:33)
>  [GCC 4.2.1 (Apple Inc. build 5666) (dot 3)] on darwin
>  Type "help", "copyright", "credits" or "license" for more information.
>  >>> from urlparse import urlparse,unquote
>  Traceback (most recent call last):
>  File "", line 1, in 
>  ImportError: No module named 'urlparse'
>  >>> from urllib.parse import urlparse,unquote
>  >>> conn = urlparse("http://AAA")
>  >>> conn.hostname
>  'aaa'
>  >>> conn = urlparse("http://%2FAAA")
>  >>> conn.hostname
>  '%2faaa'
> {quote}
>  
> {quote}Python 3.6.7 (v3.6.7:6ec5cf24b7, Oct 20 2018, 03:02:14)
>  [GCC 4.2.1 Compatible Apple LLVM 6.0 (clang-600.0.57)] on darwin
>  Type "help", "copyright", "credits" or "license" for more information.
>  >>> from urllib.parse import urlparse,unquote
>  >>> conn = urlparse("http://AAA")
>  >>> conn.hostname
>  'aaa'
>  >>> conn = urlparse("http://%2FAAA")
>  >>> conn.hostname
>  '%2FAAA'
> {quote}
>  
>  
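The merged fix for this issue (titled "Parse hostname using netloc") goes in the direction the description suggests. A rough stdlib sketch of the idea follows; the helper name and the userinfo/port handling are illustrative and naive (no IPv6 brackets), not Airflow's actual code:

```python
from urllib.parse import urlsplit, unquote


def hostname_preserving_case(uri):
    # urlsplit().hostname lowercases the host, but the raw netloc keeps
    # its original case, so strip userinfo and port from it manually.
    netloc = urlsplit(uri).netloc
    host = netloc.rsplit('@', 1)[-1].split(':', 1)[0]
    return unquote(host)
```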





[jira] [Commented] (AIRFLOW-3615) Connection parsed from URI - case-insensitive UNIX socket paths in python 2.7 -> 3.5 (but not in 3.6)

2019-03-20 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-3615?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16797228#comment-16797228
 ] 

ASF GitHub Bot commented on AIRFLOW-3615:
-

ashb commented on pull request #4591: [AIRFLOW-3615] Parse hostname using netloc
URL: https://github.com/apache/airflow/pull/4591
 
 
   
 



> Connection parsed from URI - case-insensitive UNIX socket paths in python 2.7 
> -> 3.5 (but not in 3.6) 
> --
>
> Key: AIRFLOW-3615
> URL: https://issues.apache.org/jira/browse/AIRFLOW-3615
> Project: Apache Airflow
>  Issue Type: Bug
>Reporter: Jarek Potiuk
>Assignee: Kamil Bregula
>Priority: Major
>




[jira] [Resolved] (AIRFLOW-3615) Connection parsed from URI - case-insensitive UNIX socket paths in python 2.7 -> 3.5 (but not in 3.6)

2019-03-20 Thread Ash Berlin-Taylor (JIRA)


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-3615?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Ash Berlin-Taylor resolved AIRFLOW-3615.

   Resolution: Fixed
Fix Version/s: 1.10.3

> Connection parsed from URI - case-insensitive UNIX socket paths in python 2.7 
> -> 3.5 (but not in 3.6) 
> --
>
> Key: AIRFLOW-3615
> URL: https://issues.apache.org/jira/browse/AIRFLOW-3615
> Project: Apache Airflow
>  Issue Type: Bug
>Reporter: Jarek Potiuk
>Assignee: Kamil Bregula
>Priority: Major
> Fix For: 1.10.3
>
>





[GitHub] [airflow] ashb merged pull request #4591: [AIRFLOW-3615] Parse hostname using netloc

2019-03-20 Thread GitBox
ashb merged pull request #4591: [AIRFLOW-3615] Parse hostname using netloc
URL: https://github.com/apache/airflow/pull/4591
 
 
   




[GitHub] [airflow] dimberman commented on issue #4941: [AIRFLOW-4123] Add Exception handling for _change_state method in K8 …

2019-03-20 Thread GitBox
dimberman commented on issue #4941: [AIRFLOW-4123] Add Exception handling for 
_change_state method in K8 …
URL: https://github.com/apache/airflow/pull/4941#issuecomment-474855831
 
 
   @ashb worked thank you!
   
   I set it to 1.10.3 since it really shouldn't affect much else.




[jira] [Updated] (AIRFLOW-4123) Add Exception handling for _change_state method in K8 Executor

2019-03-20 Thread Daniel Imberman (JIRA)


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-4123?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Daniel Imberman updated AIRFLOW-4123:
-
Fix Version/s: 1.10.3

> Add Exception handling for _change_state method in K8 Executor
> --
>
> Key: AIRFLOW-4123
> URL: https://issues.apache.org/jira/browse/AIRFLOW-4123
> Project: Apache Airflow
>  Issue Type: Bug
>Reporter: Anand
>Assignee: Anand
>Priority: Minor
> Fix For: 1.10.3
>
>






[GitHub] [airflow] ashb commented on issue #4335: [AIRFLOW-3458] Move models.Connection into separate file

2019-03-20 Thread GitBox
ashb commented on issue #4335: [AIRFLOW-3458] Move models.Connection into 
separate file
URL: https://github.com/apache/airflow/pull/4335#issuecomment-474851078
 
 
   @BasPH @Fokko I'd like to cherry-pick this into 1.10.3, but to do that we 
need a deprecation shim.
   
   I can do this with a few lines at the _end_ of models/__init__.py:
   
   ```python
   # To avoid circular import on Python 2.7 we need to define this at the _bottom_
   from zope.deprecation import deprecated
   from airflow.models.connection import Connection  # noqa

   deprecated('Connection', 'has been moved to airflow.models.connection package. Please update your code before Airflow 2.0')
   ```
   
   Are you happy for me to "just" do that on the release branch?




[GitHub] [airflow] XD-DENG commented on a change in pull request #4919: [AIRFLOW-4093] Throw exception if job failed or cancelled or retry too many times

2019-03-20 Thread GitBox
XD-DENG commented on a change in pull request #4919: [AIRFLOW-4093] Throw 
exception if job failed or cancelled or retry too many times
URL: https://github.com/apache/airflow/pull/4919#discussion_r267356523
 
 

 ##
 File path: airflow/contrib/operators/aws_athena_operator.py
 ##
 @@ -74,7 +76,16 @@ def execute(self, context):
         self.result_configuration['OutputLocation'] = self.output_location
         self.query_execution_id = self.hook.run_query(self.query, self.query_execution_context,
                                                       self.result_configuration, self.client_request_token)
-        self.hook.poll_query_status(self.query_execution_id)
+        query_status = self.hook.poll_query_status(self.query_execution_id, self.max_tries)
+
+        if not query_status or query_status in AWSAthenaHook.FAILURE_STATES:
+            raise Exception(
+                'Athena job failed. Final state is {}, query_execution_id is {}.'
+                .format(query_status, self.query_execution_id))
+        elif query_status in AWSAthenaHook.INTERMEDIATE_STATES:
 
 Review comment:
   Hi @bryanyang0528 , if you check 
https://github.com/apache/airflow/blob/4655c3f2bbd6dbb442a9c8482559748bd9db0bd7/airflow/contrib/hooks/aws_athena_hook.py#L123-L140
   
   You will notice that `query_state is None` or `query_state in 
self.INTERMEDIATE_STATES` would not result in `break`. Instead, 
`poll_query_status()` will only end with `query_state is None` or `query_state 
in self.INTERMEDIATE_STATES` when `max_tries` is reached. It may be too strong 
an assumption to say "`query_status` is `None` means `failed`".
   
   On the other hand, the `else:` branch (which covers `FAILURE_STATES`) causes 
an explicit `break`, so those states are definitely `failed`.
   
   Hope this clarifies.
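
   The loop semantics described above can be sketched independently of the 
hook; the state sets mirror `AWSAthenaHook`, while `get_state` is a 
hypothetical stand-in for the per-try status call:

   ```python
   FAILURE_STATES = {'FAILED', 'CANCELLED'}
   INTERMEDIATE_STATES = {'QUEUED', 'RUNNING'}
   SUCCESS_STATES = {'SUCCEEDED'}

   def poll_query_status(get_state, max_tries=3):
       """Poll until a terminal state, or until max_tries is exhausted.

       Only SUCCESS_STATES / FAILURE_STATES break the loop early, so a
       None or intermediate return value means max_tries ran out, not
       that the query itself failed.
       """
       state = None
       for _try_number in range(1, max_tries + 1):
           state = get_state()
           if state in SUCCESS_STATES or state in FAILURE_STATES:
               break
       return state

   # A query stuck in RUNNING returns an intermediate state after 3 tries:
   stuck = iter(['QUEUED', 'RUNNING', 'RUNNING'])
   print(poll_query_status(lambda: next(stuck)))    # RUNNING

   # A query that fails breaks out of the loop immediately:
   failing = iter(['QUEUED', 'FAILED', 'RUNNING'])
   print(poll_query_status(lambda: next(failing)))  # FAILED
   ```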


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[jira] [Created] (AIRFLOW-4131) Allow users to control undefined behavior in templates

2019-03-20 Thread Josh Carp (JIRA)
Josh Carp created AIRFLOW-4131:
--

 Summary: Allow users to control undefined behavior in templates
 Key: AIRFLOW-4131
 URL: https://issues.apache.org/jira/browse/AIRFLOW-4131
 Project: Apache Airflow
  Issue Type: Improvement
Reporter: Josh Carp


As a user, I want to configure templates to raise exceptions on undefined 
variables rather than silently replacing them with the empty string. I propose 
adding a `template_undefined` option to `DAG` to support this.
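
The difference in behavior comes straight from Jinja2's `undefined` classes; a 
`template_undefined` option would presumably pass one of these through to the 
DAG's Jinja environment (a sketch of the underlying mechanism, not the final 
Airflow API):

```python
from jinja2 import Environment, StrictUndefined, Undefined
from jinja2.exceptions import UndefinedError

template = 's3://bucket/{{ ds_nodash }}/out'

# Default Undefined: the missing variable silently renders as ''.
lenient = Environment(undefined=Undefined)
print(lenient.from_string(template).render())  # s3://bucket//out

# StrictUndefined: rendering raises instead of hiding the mistake.
strict = Environment(undefined=StrictUndefined)
try:
    strict.from_string(template).render()
except UndefinedError as exc:
    print('raised:', exc)
```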



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[GitHub] [airflow] nritholtz commented on a change in pull request #4903: [AIRFLOW-4069] Add Opsgenie Alert Hook and Operator

2019-03-20 Thread GitBox
nritholtz commented on a change in pull request #4903: [AIRFLOW-4069] Add 
Opsgenie Alert Hook and Operator
URL: https://github.com/apache/airflow/pull/4903#discussion_r267347821
 
 

 ##
 File path: airflow/contrib/operators/opsgenie_alert_operator.py
 ##
 @@ -0,0 +1,142 @@
+# -*- coding: utf-8 -*-
+#
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+#
+from airflow.contrib.hooks.opsgenie_alert_hook import OpsgenieAlertHook
+from airflow.exceptions import AirflowException
+from airflow.models import BaseOperator
+from airflow.utils.decorators import apply_defaults
+
+
+class OpsgenieAlertOperator(BaseOperator):
+"""
+This operator allows you to post alerts to Opsgenie.
+Accepts a connection that has an Opsgenie API key as the connection's 
password.
+
+Each Opsgenie API key can be pre-configured to a team integration.
+You can override these defaults in this operator.
+
+:param http_conn_id: Http connection ID with host as "https://api.opsgenie.com/"
+  and Opsgenie API key as the connection's password
+  (e.g. "eb243592-faa2-4ba2-a551q-1afdf565c889")
+:type http_conn_id: str
+:param message: The Message of the Opsgenie alert
+:type message: str
+:param alias: Client-defined identifier of the alert
+:type alias: str
+:param description: Description field of the alert
+:type description: str
+:param responders: Teams, users, escalations and schedules that
+  the alert will be routed to send notifications.
+:type responders: list[dict]
+:param visibleTo: Teams and users that the alert will become visible
+  to without sending any notification.
+:type visibleTo: list[dict]
+:param actions: Custom actions that will be available for the alert.
+:type actions: list[str]
+:param tags: Tags of the alert.
+:type tags: list[str]
+:param details: Map of key-value pairs to use as custom properties of the 
alert.
+:type details: dict
+:param entity: Entity field of the alert that is
+generally used to specify which domain alert is related to.
+:type entity: str
+:param source: Source field of the alert. Default value is
+IP address of the incoming request.
+:type source: str
+:param priority: Priority level of the alert. Default value is P3.
+:type priority: str
+:param user: Display name of the request owner.
+:type user: str
+:param note: Additional note that will be added while creating the alert.
+:type note: str
+:param proxy: Proxy to use to make the Opsgenie Alert API call
+:type proxy: str
+"""
+
+@apply_defaults
+def __init__(self,
+ http_conn_id=None,
 
 Review comment:
   Added.
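
   As an illustration of how those optional parameters typically map onto an 
Opsgenie alert body, here is a hypothetical helper (the function name and 
payload shape are assumptions for illustration, not the operator's actual 
code):

   ```python
   def build_alert_payload(message, alias=None, description=None,
                           responders=None, tags=None, priority='P3'):
       """Collect the alert fields, dropping optional ones left unset."""
       payload = {'message': message, 'priority': priority}
       optional = {'alias': alias, 'description': description,
                   'responders': responders, 'tags': tags}
       payload.update({k: v for k, v in optional.items() if v})
       return payload

   payload = build_alert_payload(
       'Task failed in DAG example_dag',
       tags=['airflow', 'prod'],
       responders=[{'type': 'team', 'name': 'data-eng'}])
   print(sorted(payload))  # ['message', 'priority', 'responders', 'tags']
   ```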


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[jira] [Commented] (AIRFLOW-4131) Allow users to control undefined behavior in templates

2019-03-20 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-4131?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16797186#comment-16797186
 ] 

ASF GitHub Bot commented on AIRFLOW-4131:
-

jmcarp commented on pull request #4951: [AIRFLOW-4131] Make template undefined 
behavior configurable.
URL: https://github.com/apache/airflow/pull/4951
 
 
   Make sure you have checked _all_ steps below.
   
   ### Jira
   
   - [x] My PR addresses the following [Airflow 
Jira](https://issues.apache.org/jira/browse/AIRFLOW/) issues and references 
them in the PR title. For example, "\[AIRFLOW-XXX\] My Airflow PR"
 - https://issues.apache.org/jira/browse/AIRFLOW-4131
 - In case you are fixing a typo in the documentation you can prepend your 
commit with \[AIRFLOW-XXX\], code changes always need a Jira issue.
 - In case you are proposing a fundamental code change, you need to create 
an Airflow Improvement Proposal 
([AIP](https://cwiki.apache.org/confluence/display/AIRFLOW/Airflow+Improvements+Proposals)).
 - In case you are adding a dependency, check if the license complies with 
the [ASF 3rd Party License 
Policy](https://www.apache.org/legal/resolved.html#category-x).
   
   ### Description
   
   - [x] Here are some details about my PR, including screenshots of any UI 
changes:
   
   ### Tests
   
   - [x] My PR adds the following unit tests __OR__ does not need testing for 
this extremely good reason:
   
   ### Commits
   
   - [x] My commits all reference Jira issues in their subject lines, and I 
have squashed multiple commits if they address the same issue. In addition, my 
commits follow the guidelines from "[How to write a good git commit 
message](http://chris.beams.io/posts/git-commit/)":
 1. Subject is separated from body by a blank line
 1. Subject is limited to 50 characters (not including Jira issue reference)
 1. Subject does not end with a period
 1. Subject uses the imperative mood ("add", not "adding")
 1. Body wraps at 72 characters
 1. Body explains "what" and "why", not "how"
   
   ### Documentation
   
   - [x] In case of new functionality, my PR adds documentation that describes 
how to use it.
 - When adding new operators/hooks/sensors, the autoclass documentation 
generation needs to be added.
 - All the public functions and the classes in the PR contain docstrings 
that explain what it does
 - If you implement backwards incompatible changes, please leave a note in 
the [Updating.md](https://github.com/apache/airflow/blob/master/UPDATING.md) so 
we can assign it to an appropriate release
   
   ### Code Quality
   
   - [x] Passes `flake8`
   
 

This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


> Allow users to control undefined behavior in templates
> --
>
> Key: AIRFLOW-4131
> URL: https://issues.apache.org/jira/browse/AIRFLOW-4131
> Project: Apache Airflow
>  Issue Type: Improvement
>Reporter: Josh Carp
>Priority: Trivial
>
> As a user, I want to configure templates to raise exceptions on undefined 
> variables rather than silently replacing them with the empty string. I 
> propose adding a `template_undefined` option to `DAG` to support this.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[GitHub] [airflow] jmcarp opened a new pull request #4951: [AIRFLOW-4131] Make template undefined behavior configurable.

2019-03-20 Thread GitBox
jmcarp opened a new pull request #4951: [AIRFLOW-4131] Make template undefined 
behavior configurable.
URL: https://github.com/apache/airflow/pull/4951
 
 
   Make sure you have checked _all_ steps below.
   
   ### Jira
   
   - [x] My PR addresses the following [Airflow 
Jira](https://issues.apache.org/jira/browse/AIRFLOW/) issues and references 
them in the PR title. For example, "\[AIRFLOW-XXX\] My Airflow PR"
 - https://issues.apache.org/jira/browse/AIRFLOW-4131
 - In case you are fixing a typo in the documentation you can prepend your 
commit with \[AIRFLOW-XXX\], code changes always need a Jira issue.
 - In case you are proposing a fundamental code change, you need to create 
an Airflow Improvement Proposal 
([AIP](https://cwiki.apache.org/confluence/display/AIRFLOW/Airflow+Improvements+Proposals)).
 - In case you are adding a dependency, check if the license complies with 
the [ASF 3rd Party License 
Policy](https://www.apache.org/legal/resolved.html#category-x).
   
   ### Description
   
   - [x] Here are some details about my PR, including screenshots of any UI 
changes:
   
   ### Tests
   
   - [x] My PR adds the following unit tests __OR__ does not need testing for 
this extremely good reason:
   
   ### Commits
   
   - [x] My commits all reference Jira issues in their subject lines, and I 
have squashed multiple commits if they address the same issue. In addition, my 
commits follow the guidelines from "[How to write a good git commit 
message](http://chris.beams.io/posts/git-commit/)":
 1. Subject is separated from body by a blank line
 1. Subject is limited to 50 characters (not including Jira issue reference)
 1. Subject does not end with a period
 1. Subject uses the imperative mood ("add", not "adding")
 1. Body wraps at 72 characters
 1. Body explains "what" and "why", not "how"
   
   ### Documentation
   
   - [x] In case of new functionality, my PR adds documentation that describes 
how to use it.
 - When adding new operators/hooks/sensors, the autoclass documentation 
generation needs to be added.
 - All the public functions and the classes in the PR contain docstrings 
that explain what it does
 - If you implement backwards incompatible changes, please leave a note in 
the [Updating.md](https://github.com/apache/airflow/blob/master/UPDATING.md) so 
we can assign it to an appropriate release
   
   ### Code Quality
   
   - [x] Passes `flake8`
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [airflow] bryanyang0528 commented on a change in pull request #4919: [AIRFLOW-4093] Throw exception if job failed or cancelled or retry too many times

2019-03-20 Thread GitBox
bryanyang0528 commented on a change in pull request #4919: [AIRFLOW-4093] Throw 
exception if job failed or cancelled or retry too many times
URL: https://github.com/apache/airflow/pull/4919#discussion_r267341823
 
 

 ##
 File path: airflow/contrib/operators/aws_athena_operator.py
 ##
 @@ -74,7 +76,16 @@ def execute(self, context):
         self.result_configuration['OutputLocation'] = self.output_location
         self.query_execution_id = self.hook.run_query(self.query, self.query_execution_context,
                                                       self.result_configuration, self.client_request_token)
-        self.hook.poll_query_status(self.query_execution_id)
+        query_status = self.hook.poll_query_status(self.query_execution_id, self.max_tries)
+
+        if not query_status or query_status in AWSAthenaHook.FAILURE_STATES:
+            raise Exception(
+                'Athena job failed. Final state is {}, query_execution_id is {}.'
+                .format(query_status, self.query_execution_id))
+        elif query_status in AWSAthenaHook.INTERMEDIATE_STATES:
 
 Review comment:
   @XD-DENG  According to the source code of `query_status`:
   ```python
   if query_state is None:
       self.log.info('Trial {try_number}: Invalid query state. Retrying again'.format(
           try_number=try_number))
   ```
   Something might be wrong with this query, so that it never reaches Athena's 
queue. That is why I think it should show `failed` if `query_status` is `None`. 
Does that make sense, or is there something I misunderstand?   


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[jira] [Commented] (AIRFLOW-4100) Possible to create invalid JS on dag pages

2019-03-20 Thread ASF subversion and git services (JIRA)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-4100?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16797146#comment-16797146
 ] 

ASF subversion and git services commented on AIRFLOW-4100:
--

Commit 13a40a9ad88d02aeb543643e0ceecfd5c6949159 in airflow's branch 
refs/heads/v1-10-stable from Ash Berlin-Taylor
[ https://gitbox.apache.org/repos/asf?p=airflow.git;h=13a40a9 ]

[AIRFLOW-4100] Correctly JSON escape data for tree/graph views (#4921)


> Possible to create invalid JS on dag pages
> --
>
> Key: AIRFLOW-4100
> URL: https://issues.apache.org/jira/browse/AIRFLOW-4100
> Project: Apache Airflow
>  Issue Type: Bug
>  Components: webserver
>Reporter: Ash Berlin-Taylor
>Assignee: Ash Berlin-Taylor
>Priority: Major
> Fix For: 1.10.3
>
>
> If you have odd dag/task/run ids it is possible to break the charts on the 
> various pages.
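
The general fix for this class of bug is to JSON-escape server-side data 
before embedding it in an inline `<script>` block, escaping `<`/`>` as well so 
a crafted id cannot close the script element early (a sketch of the technique, 
not the exact patch):

```python
import json

# A dag_id crafted to break out of an inline <script> block:
dag_id = 'demo</script><script>alert(1)'

def js_string(value):
    """JSON-encode a value so it is safe inside an inline <script>."""
    # json.dumps escapes the quotes; escaping < and > as well keeps a
    # literal "</script>" from terminating the surrounding element.
    return json.dumps(value).replace('<', '\\u003c').replace('>', '\\u003e')

safe = js_string(dag_id)
print(safe)
print('<' in safe, '>' in safe)  # False False
```

The `\u003c`/`\u003e` escapes round-trip through any JSON parser, so the 
JavaScript on the page still sees the original string.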



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[GitHub] [airflow] bryanyang0528 commented on a change in pull request #4919: [AIRFLOW-4093] Throw exception if job failed or cancelled or retry too many times

2019-03-20 Thread GitBox
bryanyang0528 commented on a change in pull request #4919: [AIRFLOW-4093] Throw 
exception if job failed or cancelled or retry too many times
URL: https://github.com/apache/airflow/pull/4919#discussion_r266544955
 
 

 ##
 File path: airflow/contrib/operators/aws_athena_operator.py
 ##
 @@ -47,7 +47,8 @@ class AWSAthenaOperator(BaseOperator):
 
 @apply_defaults
 def __init__(self, query, database, output_location, 
aws_conn_id='aws_default', client_request_token=None,
- query_execution_context=None, result_configuration=None, 
sleep_time=30, *args, **kwargs):
+ query_execution_context=None, result_configuration=None, 
sleep_time=30, max_tries=None,
 
 Review comment:
   @XD-DENG  Thx. I've updated it.


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [airflow] mik-laj commented on issue #4788: [AIRFLOW-3811][3/3] Add automatic generation of API Reference

2019-03-20 Thread GitBox
mik-laj commented on issue #4788: [AIRFLOW-3811][3/3] Add automatic generation 
of API Reference
URL: https://github.com/apache/airflow/pull/4788#issuecomment-474805300
 
 
   The required changes have landed on master. Documentation is built 
correctly. 
   
   Latest preview: http://puny-spark.surge.sh/_autoapi/index.html
   
   PTAL @ashb @Fokko @XD-DENG @feng-tao @cixuuz 


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [airflow] mik-laj commented on a change in pull request #4788: [AIRFLOW-3811][3/3] Add automatic generation of API Reference

2019-03-20 Thread GitBox
mik-laj commented on a change in pull request #4788: [AIRFLOW-3811][3/3] Add 
automatic generation of API Reference
URL: https://github.com/apache/airflow/pull/4788#discussion_r267306767
 
 

 ##
 File path: .gitignore
 ##
 @@ -83,6 +83,7 @@ instance/
 
 # Sphinx documentation
 docs/_build/
+docs/autoapi/
 
 Review comment:
   Done.


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [airflow] mik-laj commented on a change in pull request #4788: [AIRFLOW-3811][3/3] Add automatic generation of API Reference

2019-03-20 Thread GitBox
mik-laj commented on a change in pull request #4788: [AIRFLOW-3811][3/3] Add 
automatic generation of API Reference
URL: https://github.com/apache/airflow/pull/4788#discussion_r267306693
 
 

 ##
 File path: .gitignore
 ##
 @@ -83,6 +83,7 @@ instance/
 
 
 Review comment:
   First, this file must be created in a separate PR. 


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [airflow] codecov-io edited a comment on issue #4591: [AIRFLOW-3615] Parse hostname using netloc

2019-03-20 Thread GitBox
codecov-io edited a comment on issue #4591: [AIRFLOW-3615] Parse hostname using 
netloc
URL: https://github.com/apache/airflow/pull/4591#issuecomment-467251879
 
 
   # [Codecov](https://codecov.io/gh/apache/airflow/pull/4591?src=pr=h1) 
Report
   > Merging 
[#4591](https://codecov.io/gh/apache/airflow/pull/4591?src=pr=desc) into 
[master](https://codecov.io/gh/apache/airflow/commit/de7e0d8a9d78b0f5e03f3057a1fad6f965a0ecae?src=pr=desc)
 will **increase** coverage by `<.01%`.
   > The diff coverage is `100%`.
   
   [![Impacted file tree 
graph](https://codecov.io/gh/apache/airflow/pull/4591/graphs/tree.svg?width=650=WdLKlKHOAU=150=pr)](https://codecov.io/gh/apache/airflow/pull/4591?src=pr=tree)
   
   ```diff
   @@            Coverage Diff             @@
   ##           master    #4591      +/-   ##
   ==========================================
   + Coverage   75.58%   75.59%    +<.01% 
   ==========================================
     Files         454      454            
     Lines       29209    29218       +9 
   ==========================================
   + Hits        22079    22087       +8 
   - Misses       7130     7131       +1
   ```
   
   
   | [Impacted 
Files](https://codecov.io/gh/apache/airflow/pull/4591?src=pr=tree) | 
Coverage Δ | |
   |---|---|---|
   | 
[airflow/models/connection.py](https://codecov.io/gh/apache/airflow/pull/4591/diff?src=pr=tree#diff-YWlyZmxvdy9tb2RlbHMvY29ubmVjdGlvbi5weQ==)
 | `65.9% <100%> (+1.83%)` | :arrow_up: |
   | 
[airflow/contrib/operators/ssh\_operator.py](https://codecov.io/gh/apache/airflow/pull/4591/diff?src=pr=tree#diff-YWlyZmxvdy9jb250cmliL29wZXJhdG9ycy9zc2hfb3BlcmF0b3IucHk=)
 | `82.27% <0%> (-1.27%)` | :arrow_down: |
   
   --
   
   [Continue to review full report at 
Codecov](https://codecov.io/gh/apache/airflow/pull/4591?src=pr=continue).
   > **Legend** - [Click here to learn 
more](https://docs.codecov.io/docs/codecov-delta)
   > `Δ = absolute  (impact)`, `ø = not affected`, `? = missing data`
   > Powered by 
[Codecov](https://codecov.io/gh/apache/airflow/pull/4591?src=pr=footer). 
Last update 
[de7e0d8...dcdf882](https://codecov.io/gh/apache/airflow/pull/4591?src=pr=lastupdated).
 Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments).
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


