[GitHub] [airflow] KevinYang21 commented on issue #5225: [AIRFLOW-4452] Webserver and Scheduler keep crashing because of slack…

2019-05-01 Thread GitBox
KevinYang21 commented on issue #5225: [AIRFLOW-4452] Webserver and Scheduler 
keep crashing because of slack…
URL: https://github.com/apache/airflow/pull/5225#issuecomment-488556836
 
 
   I see, I see :D I just set it up this afternoon and I guess it is all good now 
:D This is actually the first PR I merged.


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [airflow] apraovjr commented on issue #4530: [AIRFLOW-3282] Implement Azure Kubernetes Service Operator

2019-05-01 Thread GitBox
apraovjr commented on issue #4530: [AIRFLOW-3282] Implement Azure Kubernetes 
Service Operator
URL: https://github.com/apache/airflow/pull/4530#issuecomment-48815
 
 
   @mik-laj I have updated the changes.




[GitHub] [airflow] feng-tao commented on issue #5225: [AIRFLOW-4452] Webserver and Scheduler keep crashing because of slack…

2019-05-01 Thread GitBox
feng-tao commented on issue #5225: [AIRFLOW-4452] Webserver and Scheduler keep 
crashing because of slack…
URL: https://github.com/apache/airflow/pull/5225#issuecomment-488553048
 
 
   @KevinYang21 that's correct. I wasn't sure whether you had merged any PR yet, 
so I let you merge this one to check that you have everything set up correctly. 
Of course I could merge it myself :)




[jira] [Resolved] (AIRFLOW-4452) Webserver and Scheduler keep crashing because of slackclient update

2019-05-01 Thread Tao Feng (JIRA)


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-4452?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Tao Feng resolved AIRFLOW-4452.
---
   Resolution: Fixed
Fix Version/s: 1.10.4

> Webserver and Scheduler keep crashing because of slackclient update
> ---
>
> Key: AIRFLOW-4452
> URL: https://issues.apache.org/jira/browse/AIRFLOW-4452
> Project: Apache Airflow
>  Issue Type: Bug
>  Components: scheduler, webserver
>Affects Versions: 1.10.1
>Reporter: Abhishek Ray
>Assignee: Tao Feng
>Priority: Blocker
> Fix For: 1.10.4
>
>
> Webserver and Scheduler get into a crash loop if Airflow is installed with 
> slack dependencies.
> Airflow relies on slackclient which released a new major version (2.0.0) 
> today ([https://pypi.org/project/slackclient/#history]). This new version 
> seems to be incompatible with Airflow causing the webserver to get into a 
> crash loop.
> The root cause of the issue is that Airflow doesn't pin requirements for 
> slackclient:
> [https://github.com/apache/airflow/blob/v1-10-stable/setup.py#L229]
> {code:java}
> slack = ['slackclient>=1.0.0']{code}
>  
> This is the exception in the logs:
>  
> {code:java}
> File "/Users/abhishek.ray/airflow/dags/test_dag.py", line 3, in <module>
>     from airflow.operators import SlackAPIPostOperator
>   File "/Users/abhishek.ray/.virtualenvs/airflow-test/lib/python3.6/site-packages/airflow/utils/helpers.py", line 372, in __getattr__
>     loaded_attribute = self._load_attribute(attribute)
>   File "/Users/abhishek.ray/.virtualenvs/airflow-test/lib/python3.6/site-packages/airflow/utils/helpers.py", line 336, in _load_attribute
>     self._loaded_modules[module] = imp.load_module(module, f, filename, description)
>   File "/Users/abhishek.ray/.virtualenvs/airflow-test/lib/python3.6/imp.py", line 235, in load_module
>     return load_source(name, filename, file)
>   File "/Users/abhishek.ray/.virtualenvs/airflow-test/lib/python3.6/imp.py", line 172, in load_source
>     module = _load(spec)
>   File "/Users/abhishek.ray/.virtualenvs/airflow-test/lib/python3.6/site-packages/airflow/operators/slack_operator.py", line 24, in <module>
>     from airflow.hooks.slack_hook import SlackHook
>   File "/Users/abhishek.ray/.virtualenvs/airflow-test/lib/python3.6/site-packages/airflow/hooks/slack_hook.py", line 20, in <module>
>     from slackclient import SlackClient
>   File "/Users/abhishek.ray/.virtualenvs/airflow-test/lib/python3.6/site-packages/slackclient/__init__.py", line 1, in <module>
>     from .client import SlackClient  # noqa
>   File "/Users/abhishek.ray/.virtualenvs/airflow-test/lib/python3.6/site-packages/slackclient/client.py", line 8, in <module>
>     from .server import Server
>   File "/Users/abhishek.ray/.virtualenvs/airflow-test/lib/python3.6/site-packages/slackclient/server.py", line 14, in <module>
>     from websocket import create_connection
> ModuleNotFoundError: No module named 'websocket'
> {code}
>  
>  
> This is how to reproduce this issue:
> Install Apache Airflow with the Slack extra: 
> {code:java}
> pip install apache-airflow[slack]==1.10.1{code}
>  
> Create a DAG which uses *SlackAPIPostOperator*
> {code:java}
> from datetime import datetime
> from airflow import DAG
> from airflow.operators.bash_operator import BashOperator
> from airflow.operators import SlackAPIPostOperator
> dag_default_args = {
>     "owner": "airflow",
>     "depends_on_past": False,
>     "start_date": datetime(2019, 4, 22),
>     "email": ["airf...@airflow.com"],
>     "email_on_failure": False,
>     "email_on_retry": False,
>     "retries": 1,
>     "catchup": True,
> }
> dag = DAG("test_dag", default_args=dag_default_args, 
> schedule_interval="@daily")
> BashOperator(task_id="print_date", bash_command="date", dag=dag){code}
>  
> I think the fix should be pretty straightforward: add a max version for 
> slackclient.
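
Hypothetically, the cap suggested above could be expressed like this (an illustrative sketch only; the exact bound in the merged PR may differ):

```python
# Hypothetical sketch of the suggested fix: cap the slackclient extra in
# setup.py so pip never resolves the incompatible 2.x line. The extra name
# mirrors the issue description; the exact pin used by the PR may differ.
slack = ['slackclient>=1.0.0,<2.0.0']

# A toy check (not part of Airflow) showing which versions the specifier is
# meant to admit, using a plain tuple comparison on (major, minor, micro).
def allowed(version):
    parts = tuple(int(p) for p in version.split('.'))
    return (1, 0, 0) <= parts < (2, 0, 0)

print(allowed('1.3.2'))  # 1.x releases remain installable
print(allowed('2.0.0'))  # the breaking 2.0.0 release is excluded
```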



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[GitHub] [airflow] KevinYang21 commented on issue #5225: [AIRFLOW-4452] Webserver and Scheduler keep crashing because of slack…

2019-05-01 Thread GitBox
KevinYang21 commented on issue #5225: [AIRFLOW-4452] Webserver and Scheduler 
keep crashing because of slack…
URL: https://github.com/apache/airflow/pull/5225#issuecomment-488552428
 
 
   Actually, we only need a +1, right? The author can merge as long as another 
committer has given a +1?




[GitHub] [airflow] feng-tao commented on issue #5225: [AIRFLOW-4452] Webserver and Scheduler keep crashing because of slack…

2019-05-01 Thread GitBox
feng-tao commented on issue #5225: [AIRFLOW-4452] Webserver and Scheduler keep 
crashing because of slack…
URL: https://github.com/apache/airflow/pull/5225#issuecomment-488552351
 
 
   Thanks @KevinYang21. Welcome aboard :)




[GitHub] [airflow] KevinYang21 commented on issue #5225: [AIRFLOW-4452] Webserver and Scheduler keep crashing because of slack…

2019-05-01 Thread GitBox
KevinYang21 commented on issue #5225: [AIRFLOW-4452] Webserver and Scheduler 
keep crashing because of slack…
URL: https://github.com/apache/airflow/pull/5225#issuecomment-488552257
 
 
   Aaah I didn't finish my dinner fast enough :D




[jira] [Commented] (AIRFLOW-4452) Webserver and Scheduler keep crashing because of slackclient update

2019-05-01 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-4452?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16831411#comment-16831411
 ] 

ASF GitHub Bot commented on AIRFLOW-4452:
-

KevinYang21 commented on pull request #5225: [AIRFLOW-4452] Webserver and 
Scheduler keep crashing because of slack…
URL: https://github.com/apache/airflow/pull/5225
 
 
   
 



[jira] [Commented] (AIRFLOW-4452) Webserver and Scheduler keep crashing because of slackclient update

2019-05-01 Thread ASF subversion and git services (JIRA)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-4452?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16831412#comment-16831412
 ] 

ASF subversion and git services commented on AIRFLOW-4452:
--

Commit 406882f5e5b271d2f9aef98f56d197c9b8b6784f in airflow's branch 
refs/heads/master from Tao Feng
[ https://gitbox.apache.org/repos/asf?p=airflow.git;h=406882f ]

[AIRFLOW-4452] Webserver and Scheduler keep crashing because of slackclient 
update (#5225)





[GitHub] [airflow] KevinYang21 merged pull request #5225: [AIRFLOW-4452] Webserver and Scheduler keep crashing because of slack…

2019-05-01 Thread GitBox
KevinYang21 merged pull request #5225: [AIRFLOW-4452] Webserver and Scheduler 
keep crashing because of slack…
URL: https://github.com/apache/airflow/pull/5225
 
 
   




[GitHub] [airflow] feng-tao commented on issue #5225: [AIRFLOW-4452] Webserver and Scheduler keep crashing because of slack…

2019-05-01 Thread GitBox
feng-tao commented on issue #5225: [AIRFLOW-4452] Webserver and Scheduler keep 
crashing because of slack…
URL: https://github.com/apache/airflow/pull/5225#issuecomment-488552043
 
 
   @KevinYang21, CI is green now :) Could you help merge the PR?




[GitHub] [airflow] feng-tao commented on issue #5030: [AIRFLOW-4227] Use python3-style type annotations.

2019-05-01 Thread GitBox
feng-tao commented on issue #5030: [AIRFLOW-4227] Use python3-style type 
annotations.
URL: https://github.com/apache/airflow/pull/5030#issuecomment-488549097
 
 
   We could revisit this PR once we drop PY2?
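
For context, a minimal illustration of the two annotation styles being discussed (illustrative only, not code from the PR): Python 2-compatible "type comments" versus the Python 3 inline syntax that becomes possible once PY2 support is dropped. The function names here are hypothetical.

```python
# Python 2-compatible "type comment" style, needed while PY2 is supported:
def render_py2(template, context):
    # type: (str, dict) -> str
    return template.format(**context)

# Python 3 inline annotation style, usable once PY2 is dropped:
def render_py3(template: str, context: dict) -> str:
    return template.format(**context)

print(render_py3("Hello, {name}!", {"name": "Airflow"}))  # Hello, Airflow!
```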




[GitHub] [airflow] codecov-io commented on issue #5225: [AIRFLOW-4452] Webserver and Scheduler keep crashing because of slack…

2019-05-01 Thread GitBox
codecov-io commented on issue #5225: [AIRFLOW-4452] Webserver and Scheduler 
keep crashing because of slack…
URL: https://github.com/apache/airflow/pull/5225#issuecomment-488548932
 
 
   # [Codecov](https://codecov.io/gh/apache/airflow/pull/5225?src=pr=h1) 
Report
   > Merging 
[#5225](https://codecov.io/gh/apache/airflow/pull/5225?src=pr=desc) into 
[master](https://codecov.io/gh/apache/airflow/commit/9b34bf541a4181b20586326508ba754c04f5cc68?src=pr=desc)
 will **not change** coverage.
   > The diff coverage is `n/a`.
   
   [![Impacted file tree 
graph](https://codecov.io/gh/apache/airflow/pull/5225/graphs/tree.svg?width=650=WdLKlKHOAU=150=pr)](https://codecov.io/gh/apache/airflow/pull/5225?src=pr=tree)
   
   ```diff
   @@   Coverage Diff   @@
   ##   master   #5225   +/-   ##
   ===
 Coverage   78.54%   78.54%   
   ===
 Files 469  469   
 Lines   2998329983   
   ===
 Hits2355123551   
 Misses   6432 6432
   ```
   
   
   
   --
   
   [Continue to review full report at 
Codecov](https://codecov.io/gh/apache/airflow/pull/5225?src=pr=continue).
   > **Legend** - [Click here to learn 
more](https://docs.codecov.io/docs/codecov-delta)
   > `Δ = absolute <relative> (impact)`, `ø = not affected`, `? = missing data`
   > Powered by 
[Codecov](https://codecov.io/gh/apache/airflow/pull/5225?src=pr=footer). 
Last update 
[9b34bf5...a9e60d5](https://codecov.io/gh/apache/airflow/pull/5225?src=pr=lastupdated).
 Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments).
   




[GitHub] [airflow] KevinYang21 commented on issue #5225: [AIRFLOW-4452] Webserver and Scheduler keep crashing because of slack…

2019-05-01 Thread GitBox
KevinYang21 commented on issue #5225: [AIRFLOW-4452] Webserver and Scheduler 
keep crashing because of slack…
URL: https://github.com/apache/airflow/pull/5225#issuecomment-488545413
 
 
   Will merge after CI passes.




[jira] [Commented] (AIRFLOW-4452) Webserver and Scheduler keep crashing because of slackclient update

2019-05-01 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-4452?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16831400#comment-16831400
 ] 

ASF GitHub Bot commented on AIRFLOW-4452:
-

feng-tao commented on pull request #5225: [AIRFLOW-4452] Webserver and 
Scheduler keep crashing because of slack…
URL: https://github.com/apache/airflow/pull/5225
 
 
   …client update
   
   Make sure you have checked _all_ steps below.
   
   ### Jira
   
   - [ ] My PR addresses the following [Airflow 
Jira](https://issues.apache.org/jira/browse/AIRFLOW/) issues and references 
them in the PR title. For example, "\[AIRFLOW-XXX\] My Airflow PR"
 - https://issues.apache.org/jira/browse/AIRFLOW-4452
 - In case you are fixing a typo in the documentation you can prepend your 
commit with \[AIRFLOW-XXX\], code changes always need a Jira issue.
 - In case you are proposing a fundamental code change, you need to create 
an Airflow Improvement Proposal 
([AIP](https://cwiki.apache.org/confluence/display/AIRFLOW/Airflow+Improvements+Proposals)).
 - In case you are adding a dependency, check if the license complies with 
the [ASF 3rd Party License 
Policy](https://www.apache.org/legal/resolved.html#category-x).
   
   ### Description
   
   - [ ] Here are some details about my PR, including screenshots of any UI 
changes:
   
   ### Tests
   
   - [ ] My PR adds the following unit tests __OR__ does not need testing for 
this extremely good reason:
   
   ### Commits
   
   - [ ] My commits all reference Jira issues in their subject lines, and I 
have squashed multiple commits if they address the same issue. In addition, my 
commits follow the guidelines from "[How to write a good git commit 
message](http://chris.beams.io/posts/git-commit/)":
 1. Subject is separated from body by a blank line
 1. Subject is limited to 50 characters (not including Jira issue reference)
 1. Subject does not end with a period
 1. Subject uses the imperative mood ("add", not "adding")
 1. Body wraps at 72 characters
 1. Body explains "what" and "why", not "how"
   
   ### Documentation
   
   - [ ] In case of new functionality, my PR adds documentation that describes 
how to use it.
 - All the public functions and the classes in the PR contain docstrings 
that explain what they do
 - If you implement backwards-incompatible changes, please leave a note in 
the [Updating.md](https://github.com/apache/airflow/blob/master/UPDATING.md) so 
we can assign it to an appropriate release
   
   ### Code Quality
   
   - [ ] Passes `flake8`
   
 


[GitHub] [airflow] feng-tao edited a comment on issue #5225: [AIRFLOW-4452] Webserver and Scheduler keep crashing because of slack…

2019-05-01 Thread GitBox
feng-tao edited a comment on issue #5225: [AIRFLOW-4452] Webserver and 
Scheduler keep crashing because of slack…
URL: https://github.com/apache/airflow/pull/5225#issuecomment-488545219
 
 
   PTAL @XD-DENG @ashb @potiuk  @KevinYang21 




[GitHub] [airflow] feng-tao opened a new pull request #5225: [AIRFLOW-4452] Webserver and Scheduler keep crashing because of slack…

2019-05-01 Thread GitBox
feng-tao opened a new pull request #5225: [AIRFLOW-4452] Webserver and 
Scheduler keep crashing because of slack…
URL: https://github.com/apache/airflow/pull/5225
 
 


[GitHub] [airflow] feng-tao commented on issue #5225: [AIRFLOW-4452] Webserver and Scheduler keep crashing because of slack…

2019-05-01 Thread GitBox
feng-tao commented on issue #5225: [AIRFLOW-4452] Webserver and Scheduler keep 
crashing because of slack…
URL: https://github.com/apache/airflow/pull/5225#issuecomment-488545219
 
 
   PTAL @XD-DENG @ashb @potiuk 




[GitHub] [airflow] codecov-io edited a comment on issue #4551: [AIRFLOW-2955] Fix kubernetes pod operator to set requests and limits on task pods

2019-05-01 Thread GitBox
codecov-io edited a comment on issue #4551: [AIRFLOW-2955] Fix kubernetes pod 
operator to set requests and limits on task pods
URL: https://github.com/apache/airflow/pull/4551#issuecomment-455531251
 
 
   # [Codecov](https://codecov.io/gh/apache/airflow/pull/4551?src=pr=h1) 
Report
   > Merging 
[#4551](https://codecov.io/gh/apache/airflow/pull/4551?src=pr=desc) into 
[master](https://codecov.io/gh/apache/airflow/commit/60b9023ed92b31a75dbdf8b33ce7e9c2bc3637d1?src=pr=desc)
 will **decrease** coverage by `<.01%`.
   > The diff coverage is `100%`.
   
   [![Impacted file tree 
graph](https://codecov.io/gh/apache/airflow/pull/4551/graphs/tree.svg?width=650=WdLKlKHOAU=150=pr)](https://codecov.io/gh/apache/airflow/pull/4551?src=pr=tree)
   
   ```diff
   @@Coverage Diff @@
   ##   master   #4551  +/-   ##
   ==
   - Coverage   78.56%   78.55%   -0.01% 
   ==
 Files 466  469   +3 
 Lines   2980529989 +184 
   ==
   + Hits2341523558 +143 
   - Misses   6390 6431  +41
   ```
   
   
   | [Impacted 
Files](https://codecov.io/gh/apache/airflow/pull/4551?src=pr=tree) | 
Coverage Δ | |
   |---|---|---|
   | 
[...rflow/contrib/operators/kubernetes\_pod\_operator.py](https://codecov.io/gh/apache/airflow/pull/4551/diff?src=pr=tree#diff-YWlyZmxvdy9jb250cmliL29wZXJhdG9ycy9rdWJlcm5ldGVzX3BvZF9vcGVyYXRvci5weQ==)
 | `98.73% <100%> (+0.1%)` | :arrow_up: |
   | 
[airflow/contrib/executors/kubernetes\_executor.py](https://codecov.io/gh/apache/airflow/pull/4551/diff?src=pr=tree#diff-YWlyZmxvdy9jb250cmliL2V4ZWN1dG9ycy9rdWJlcm5ldGVzX2V4ZWN1dG9yLnB5)
 | `62.24% <0%> (-1.04%)` | :arrow_down: |
   | 
[airflow/models/dag.py](https://codecov.io/gh/apache/airflow/pull/4551/diff?src=pr=tree#diff-YWlyZmxvdy9tb2RlbHMvZGFnLnB5)
 | `93.2% <0%> (-0.16%)` | :arrow_down: |
   | 
[airflow/jobs.py](https://codecov.io/gh/apache/airflow/pull/4551/diff?src=pr=tree#diff-YWlyZmxvdy9qb2JzLnB5)
 | `78.35% <0%> (-0.16%)` | :arrow_down: |
   | 
[airflow/config\_templates/airflow\_local\_settings.py](https://codecov.io/gh/apache/airflow/pull/4551/diff?src=pr=tree#diff-YWlyZmxvdy9jb25maWdfdGVtcGxhdGVzL2FpcmZsb3dfbG9jYWxfc2V0dGluZ3MucHk=)
 | `76.47% <0%> (ø)` | :arrow_up: |
   | 
[airflow/operators/docker\_operator.py](https://codecov.io/gh/apache/airflow/pull/4551/diff?src=pr=tree#diff-YWlyZmxvdy9vcGVyYXRvcnMvZG9ja2VyX29wZXJhdG9yLnB5)
 | `96.55% <0%> (ø)` | :arrow_up: |
   | 
[...rib/example\_dags/example\_gcp\_video\_intelligence.py](https://codecov.io/gh/apache/airflow/pull/4551/diff?src=pr=tree#diff-YWlyZmxvdy9jb250cmliL2V4YW1wbGVfZGFncy9leGFtcGxlX2djcF92aWRlb19pbnRlbGxpZ2VuY2UucHk=)
 | `0% <0%> (ø)` | |
   | 
[...ntrib/operators/gcp\_video\_intelligence\_operator.py](https://codecov.io/gh/apache/airflow/pull/4551/diff?src=pr=tree#diff-YWlyZmxvdy9jb250cmliL29wZXJhdG9ycy9nY3BfdmlkZW9faW50ZWxsaWdlbmNlX29wZXJhdG9yLnB5)
 | `100% <0%> (ø)` | |
   | 
[...rflow/contrib/hooks/gcp\_video\_intelligence\_hook.py](https://codecov.io/gh/apache/airflow/pull/4551/diff?src=pr=tree#diff-YWlyZmxvdy9jb250cmliL2hvb2tzL2djcF92aWRlb19pbnRlbGxpZ2VuY2VfaG9vay5weQ==)
 | `69.23% <0%> (ø)` | |
   | ... and [11 
more](https://codecov.io/gh/apache/airflow/pull/4551/diff?src=pr=tree-more) 
| |
   
   --
   
   [Continue to review full report at 
Codecov](https://codecov.io/gh/apache/airflow/pull/4551?src=pr=continue).
   > **Legend** - [Click here to learn 
more](https://docs.codecov.io/docs/codecov-delta)
   > `Δ = absolute <relative> (impact)`, `ø = not affected`, `? = missing data`
   > Powered by 
[Codecov](https://codecov.io/gh/apache/airflow/pull/4551?src=pr=footer). 
Last update 
[60b9023...374920d](https://codecov.io/gh/apache/airflow/pull/4551?src=pr=lastupdated).
 Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments).
   




[jira] [Assigned] (AIRFLOW-4452) Webserver and Scheduler keep crashing because of slackclient update

2019-05-01 Thread Tao Feng (JIRA)


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-4452?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Tao Feng reassigned AIRFLOW-4452:
-

Assignee: Tao Feng

> Webserver and Scheduler keep crashing because of slackclient update
> ---
>
> Key: AIRFLOW-4452
> URL: https://issues.apache.org/jira/browse/AIRFLOW-4452
> Project: Apache Airflow
>  Issue Type: Bug
>  Components: scheduler, webserver
>Affects Versions: 1.10.1
>Reporter: Abhishek Ray
>Assignee: Tao Feng
>Priority: Blocker
>
> Webserver and Scheduler get into a crash loop if Airflow is installed with 
> slack dependencies.
> Airflow relies on slackclient which released a new major version (2.0.0) 
> today ([https://pypi.org/project/slackclient/#history]). This new version 
> seems to be incompatible with Airflow causing the webserver to get into a 
> crash loop.
> The root cause of the issue is that Airflow doesn't pin requirements for 
> slackclient:
> [https://github.com/apache/airflow/blob/v1-10-stable/setup.py#L229]
> {code:java}
> slack = ['slackclient>=1.0.0']{code}
>  
> This is the exception that appears in the logs:
>  
> {code:java}
> File "/Users/abhishek.ray/airflow/dags/test_dag.py", line 3, in <module>
>     from airflow.operators import SlackAPIPostOperator
>   File "/Users/abhishek.ray/.virtualenvs/airflow-test/lib/python3.6/site-packages/airflow/utils/helpers.py", line 372, in __getattr__
>     loaded_attribute = self._load_attribute(attribute)
>   File "/Users/abhishek.ray/.virtualenvs/airflow-test/lib/python3.6/site-packages/airflow/utils/helpers.py", line 336, in _load_attribute
>     self._loaded_modules[module] = imp.load_module(module, f, filename, description)
>   File "/Users/abhishek.ray/.virtualenvs/airflow-test/lib/python3.6/imp.py", line 235, in load_module
>     return load_source(name, filename, file)
>   File "/Users/abhishek.ray/.virtualenvs/airflow-test/lib/python3.6/imp.py", line 172, in load_source
>     module = _load(spec)
>   File "/Users/abhishek.ray/.virtualenvs/airflow-test/lib/python3.6/site-packages/airflow/operators/slack_operator.py", line 24, in <module>
>     from airflow.hooks.slack_hook import SlackHook
>   File "/Users/abhishek.ray/.virtualenvs/airflow-test/lib/python3.6/site-packages/airflow/hooks/slack_hook.py", line 20, in <module>
>     from slackclient import SlackClient
>   File "/Users/abhishek.ray/.virtualenvs/airflow-test/lib/python3.6/site-packages/slackclient/__init__.py", line 1, in <module>
>     from .client import SlackClient # noqa
>   File "/Users/abhishek.ray/.virtualenvs/airflow-test/lib/python3.6/site-packages/slackclient/client.py", line 8, in <module>
>     from .server import Server
>   File "/Users/abhishek.ray/.virtualenvs/airflow-test/lib/python3.6/site-packages/slackclient/server.py", line 14, in <module>
>     from websocket import create_connection
> ModuleNotFoundError: No module named 'websocket'
> {code}
>  
>  
> This is how to reproduce this issue:
> Install apache airflow with slack: 
> {code:java}
> pip install apache-airflow[slack]==1.10.1{code}
>  
> Create a DAG which uses *SlackAPIPostOperator*
> {code:java}
> from datetime import datetime
>
> from airflow import DAG
> from airflow.operators.bash_operator import BashOperator
> from airflow.operators import SlackAPIPostOperator
> dag_default_args = {
>     "owner": "airflow",
>     "depends_on_past": False,
>     "start_date": datetime(2019, 4, 22),
>     "email": ["airf...@airflow.com"],
>     "email_on_failure": False,
>     "email_on_retry": False,
>     "retries": 1,
>     "catchup": True,
> }
> dag = DAG("test_dag", default_args=dag_default_args, 
> schedule_interval="@daily")
> BashOperator(task_id="print_date", bash_command="date", dag=dag){code}
>  
> I think the fix should be pretty straightforward: add a max version for 
> slackclient.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Created] (AIRFLOW-4452) Webserver and Scheduler keep crashing because of slackclient update

2019-05-01 Thread Abhishek Ray (JIRA)
Abhishek Ray created AIRFLOW-4452:
-

 Summary: Webserver and Scheduler keep crashing because of 
slackclient update
 Key: AIRFLOW-4452
 URL: https://issues.apache.org/jira/browse/AIRFLOW-4452
 Project: Apache Airflow
  Issue Type: Bug
  Components: scheduler, webserver
Affects Versions: 1.10.1
Reporter: Abhishek Ray


Webserver and Scheduler get into a crash loop if Airflow is installed with 
slack dependencies.


Airflow relies on slackclient which released a new major version (2.0.0) today 
([https://pypi.org/project/slackclient/#history]). This new version seems to be 
incompatible with Airflow causing the webserver to get into a crash loop.

The root cause of the issue is that Airflow doesn't pin requirements for 
slackclient:

[https://github.com/apache/airflow/blob/v1-10-stable/setup.py#L229]
{code:java}
slack = ['slackclient>=1.0.0']{code}
 

This is the exception that appears in the logs:

 
{code:java}
File "/Users/abhishek.ray/airflow/dags/test_dag.py", line 3, in <module>
    from airflow.operators import SlackAPIPostOperator
  File "/Users/abhishek.ray/.virtualenvs/airflow-test/lib/python3.6/site-packages/airflow/utils/helpers.py", line 372, in __getattr__
    loaded_attribute = self._load_attribute(attribute)
  File "/Users/abhishek.ray/.virtualenvs/airflow-test/lib/python3.6/site-packages/airflow/utils/helpers.py", line 336, in _load_attribute
    self._loaded_modules[module] = imp.load_module(module, f, filename, description)
  File "/Users/abhishek.ray/.virtualenvs/airflow-test/lib/python3.6/imp.py", line 235, in load_module
    return load_source(name, filename, file)
  File "/Users/abhishek.ray/.virtualenvs/airflow-test/lib/python3.6/imp.py", line 172, in load_source
    module = _load(spec)
  File "/Users/abhishek.ray/.virtualenvs/airflow-test/lib/python3.6/site-packages/airflow/operators/slack_operator.py", line 24, in <module>
    from airflow.hooks.slack_hook import SlackHook
  File "/Users/abhishek.ray/.virtualenvs/airflow-test/lib/python3.6/site-packages/airflow/hooks/slack_hook.py", line 20, in <module>
    from slackclient import SlackClient
  File "/Users/abhishek.ray/.virtualenvs/airflow-test/lib/python3.6/site-packages/slackclient/__init__.py", line 1, in <module>
    from .client import SlackClient # noqa
  File "/Users/abhishek.ray/.virtualenvs/airflow-test/lib/python3.6/site-packages/slackclient/client.py", line 8, in <module>
    from .server import Server
  File "/Users/abhishek.ray/.virtualenvs/airflow-test/lib/python3.6/site-packages/slackclient/server.py", line 14, in <module>
    from websocket import create_connection
ModuleNotFoundError: No module named 'websocket'
{code}
 

 

This is how to reproduce this issue:

Install apache airflow with slack: 
{code:java}
pip install apache-airflow[slack]==1.10.1{code}
 

Create a DAG which uses *SlackAPIPostOperator*
{code:java}
from datetime import datetime

from airflow import DAG
from airflow.operators.bash_operator import BashOperator
from airflow.operators import SlackAPIPostOperator

dag_default_args = {
    "owner": "airflow",
    "depends_on_past": False,
    "start_date": datetime(2019, 4, 22),
    "email": ["airf...@airflow.com"],
    "email_on_failure": False,
    "email_on_retry": False,
    "retries": 1,
    "catchup": True,
}


dag = DAG("test_dag", default_args=dag_default_args, schedule_interval="@daily")

BashOperator(task_id="print_date", bash_command="date", dag=dag){code}
 

I think the fix should be pretty straightforward: add a max version for 
slackclient.
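
A hedged sketch of what that pin could look like in the setup.py extras list (the exact upper bound would be for the fix PR to decide; `<2.0.0` is shown here as an assumption):

```python
# Hypothetical pin for the `slack` extra in setup.py: stay on the 1.x line,
# which still provides the `SlackClient` class that Airflow imports.
slack = ['slackclient>=1.0.0,<2.0.0']

# setup() would consume it as before, e.g.:
# extras_require={'slack': slack, ...}
```

With this range, pip resolves a 1.x release and never picks up the incompatible 2.0.0.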





[GitHub] [airflow] suensummit commented on a change in pull request #4551: [AIRFLOW-2955] Fix kubernetes pod operator to set requests and limits on task pods

2019-05-01 Thread GitBox
suensummit commented on a change in pull request #4551: [AIRFLOW-2955] Fix 
kubernetes pod operator to set requests and limits on task pods
URL: https://github.com/apache/airflow/pull/4551#discussion_r280280581
 
 

 ##########
 File path: airflow/contrib/operators/kubernetes_pod_operator.py
 ##########
 @@ -140,6 +142,15 @@ def execute(self, context):
         except AirflowException as ex:
             raise AirflowException('Pod Launching failed: {error}'.format(error=ex))
 
+    def _set_resources(self, resources):
+        inputResource = Resources()
+        if resources is not None:
 
 Review comment:
   OK cool, thanks.
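
For readers following along, a minimal self-contained sketch of the pattern the diff introduces — overlaying an optional dict of requests/limits onto a resources object with defaults. The `Resources` class below is a stand-in for the one in `airflow.contrib.kubernetes.pod`, not the real implementation:

```python
class Resources:
    """Stand-in for airflow.contrib.kubernetes.pod.Resources (fields hypothetical)."""
    def __init__(self):
        self.request_memory = None
        self.request_cpu = None
        self.limit_memory = None
        self.limit_cpu = None

def set_resources(resources):
    # Start from defaults, then overlay whatever the caller supplied.
    input_resource = Resources()
    if resources is not None:
        for name, value in resources.items():
            setattr(input_resource, name, value)
    return input_resource

res = set_resources({'request_cpu': '500m', 'limit_memory': '512Mi'})
# res.request_cpu is '500m'; unset fields keep their defaults (None)
```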




[GitHub] [airflow] codecov-io edited a comment on issue #5079: [WIP][AIRFLOW-4285] Update task dependency context defination and usage

2019-05-01 Thread GitBox
codecov-io edited a comment on issue #5079: [WIP][AIRFLOW-4285] Update task 
dependency context defination and usage
URL: https://github.com/apache/airflow/pull/5079#issuecomment-486600978
 
 
   # [Codecov](https://codecov.io/gh/apache/airflow/pull/5079?src=pr=h1) 
Report
   > Merging 
[#5079](https://codecov.io/gh/apache/airflow/pull/5079?src=pr=desc) into 
[master](https://codecov.io/gh/apache/airflow/commit/9b34bf541a4181b20586326508ba754c04f5cc68?src=pr=desc)
 will **increase** coverage by `0.03%`.
   > The diff coverage is `96.22%`.
   
   [![Impacted file tree 
graph](https://codecov.io/gh/apache/airflow/pull/5079/graphs/tree.svg?width=650=WdLKlKHOAU=150=pr)](https://codecov.io/gh/apache/airflow/pull/5079?src=pr=tree)
   
   ```diff
   @@            Coverage Diff             @@
   ##           master    #5079      +/-   ##
   ==========================================
   + Coverage   78.54%   78.58%   +0.03%
   ==========================================
     Files         469      471       +2
     Lines       29983    30022      +39
   ==========================================
   + Hits        23551    23592      +41
   + Misses       6432     6430       -2
   ```
   
   
   | [Impacted 
Files](https://codecov.io/gh/apache/airflow/pull/5079?src=pr=tree) | 
Coverage Δ | |
   |---|---|---|
   | 
[airflow/models/pool.py](https://codecov.io/gh/apache/airflow/pull/5079/diff?src=pr=tree#diff-YWlyZmxvdy9tb2RlbHMvcG9vbC5weQ==)
 | `97.05% <100%> (+0.08%)` | :arrow_up: |
   | 
[airflow/ti\_deps/deps/dagrun\_id\_dep.py](https://codecov.io/gh/apache/airflow/pull/5079/diff?src=pr=tree#diff-YWlyZmxvdy90aV9kZXBzL2RlcHMvZGFncnVuX2lkX2RlcC5weQ==)
 | `100% <100%> (ø)` | |
   | 
[airflow/www/views.py](https://codecov.io/gh/apache/airflow/pull/5079/diff?src=pr=tree#diff-YWlyZmxvdy93d3cvdmlld3MucHk=)
 | `76.27% <100%> (ø)` | :arrow_up: |
   | 
[airflow/models/taskinstance.py](https://codecov.io/gh/apache/airflow/pull/5079/diff?src=pr=tree#diff-YWlyZmxvdy9tb2RlbHMvdGFza2luc3RhbmNlLnB5)
 | `92.88% <100%> (+0.45%)` | :arrow_up: |
   | 
[airflow/ti\_deps/dep\_context.py](https://codecov.io/gh/apache/airflow/pull/5079/diff?src=pr=tree#diff-YWlyZmxvdy90aV9kZXBzL2RlcF9jb250ZXh0LnB5)
 | `100% <100%> (ø)` | :arrow_up: |
   | 
[airflow/bin/cli.py](https://codecov.io/gh/apache/airflow/pull/5079/diff?src=pr=tree#diff-YWlyZmxvdy9iaW4vY2xpLnB5)
 | `66.83% <50%> (ø)` | :arrow_up: |
   | 
[airflow/jobs.py](https://codecov.io/gh/apache/airflow/pull/5079/diff?src=pr=tree#diff-YWlyZmxvdy9qb2JzLnB5)
 | `78.35% <88.88%> (ø)` | :arrow_up: |
   | 
[airflow/ti\_deps/deps/pool\_slots\_available\_dep.py](https://codecov.io/gh/apache/airflow/pull/5079/diff?src=pr=tree#diff-YWlyZmxvdy90aV9kZXBzL2RlcHMvcG9vbF9zbG90c19hdmFpbGFibGVfZGVwLnB5)
 | `96% <96%> (ø)` | |
   | 
[airflow/models/dagbag.py](https://codecov.io/gh/apache/airflow/pull/5079/diff?src=pr=tree#diff-YWlyZmxvdy9tb2RlbHMvZGFnYmFnLnB5)
 | `91.38% <0%> (-0.48%)` | :arrow_down: |
   | ... and [2 
more](https://codecov.io/gh/apache/airflow/pull/5079/diff?src=pr=tree-more) 
| |
   
   --
   
   [Continue to review full report at 
Codecov](https://codecov.io/gh/apache/airflow/pull/5079?src=pr=continue).
   > **Legend** - [Click here to learn 
more](https://docs.codecov.io/docs/codecov-delta)
   > `Δ = absolute <relative> (impact)`, `ø = not affected`, `? = missing data`
   > Powered by 
[Codecov](https://codecov.io/gh/apache/airflow/pull/5079?src=pr=footer). 
Last update 
[9b34bf5...1b4e685](https://codecov.io/gh/apache/airflow/pull/5079?src=pr=lastupdated).
 Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments).
   




[GitHub] [airflow] codecov-io edited a comment on issue #5079: [WIP][AIRFLOW-4285] Update task dependency context defination and usage

2019-05-01 Thread GitBox
codecov-io edited a comment on issue #5079: [WIP][AIRFLOW-4285] Update task 
dependency context defination and usage
URL: https://github.com/apache/airflow/pull/5079#issuecomment-486600978
 
 
   # [Codecov](https://codecov.io/gh/apache/airflow/pull/5079?src=pr=h1) 
Report
   > Merging 
[#5079](https://codecov.io/gh/apache/airflow/pull/5079?src=pr=desc) into 
[master](https://codecov.io/gh/apache/airflow/commit/9b34bf541a4181b20586326508ba754c04f5cc68?src=pr=desc)
 will **increase** coverage by `0.04%`.
   > The diff coverage is `96.22%`.
   
   [![Impacted file tree 
graph](https://codecov.io/gh/apache/airflow/pull/5079/graphs/tree.svg?width=650=WdLKlKHOAU=150=pr)](https://codecov.io/gh/apache/airflow/pull/5079?src=pr=tree)
   
   ```diff
   @@            Coverage Diff             @@
   ##           master    #5079      +/-   ##
   ==========================================
   + Coverage   78.54%   78.58%   +0.04%
   ==========================================
     Files         469      471       +2
     Lines       29983    30022      +39
   ==========================================
   + Hits        23551    23594      +43
   + Misses       6432     6428       -4
   ```
   
   
   | [Impacted 
Files](https://codecov.io/gh/apache/airflow/pull/5079?src=pr=tree) | 
Coverage Δ | |
   |---|---|---|
   | 
[airflow/models/pool.py](https://codecov.io/gh/apache/airflow/pull/5079/diff?src=pr=tree#diff-YWlyZmxvdy9tb2RlbHMvcG9vbC5weQ==)
 | `97.05% <100%> (+0.08%)` | :arrow_up: |
   | 
[airflow/ti\_deps/deps/dagrun\_id\_dep.py](https://codecov.io/gh/apache/airflow/pull/5079/diff?src=pr=tree#diff-YWlyZmxvdy90aV9kZXBzL2RlcHMvZGFncnVuX2lkX2RlcC5weQ==)
 | `100% <100%> (ø)` | |
   | 
[airflow/www/views.py](https://codecov.io/gh/apache/airflow/pull/5079/diff?src=pr=tree#diff-YWlyZmxvdy93d3cvdmlld3MucHk=)
 | `76.27% <100%> (ø)` | :arrow_up: |
   | 
[airflow/models/taskinstance.py](https://codecov.io/gh/apache/airflow/pull/5079/diff?src=pr=tree#diff-YWlyZmxvdy9tb2RlbHMvdGFza2luc3RhbmNlLnB5)
 | `93.05% <100%> (+0.62%)` | :arrow_up: |
   | 
[airflow/ti\_deps/dep\_context.py](https://codecov.io/gh/apache/airflow/pull/5079/diff?src=pr=tree#diff-YWlyZmxvdy90aV9kZXBzL2RlcF9jb250ZXh0LnB5)
 | `100% <100%> (ø)` | :arrow_up: |
   | 
[airflow/bin/cli.py](https://codecov.io/gh/apache/airflow/pull/5079/diff?src=pr=tree#diff-YWlyZmxvdy9iaW4vY2xpLnB5)
 | `66.83% <50%> (ø)` | :arrow_up: |
   | 
[airflow/jobs.py](https://codecov.io/gh/apache/airflow/pull/5079/diff?src=pr=tree#diff-YWlyZmxvdy9qb2JzLnB5)
 | `78.35% <88.88%> (ø)` | :arrow_up: |
   | 
[airflow/ti\_deps/deps/pool\_slots\_available\_dep.py](https://codecov.io/gh/apache/airflow/pull/5079/diff?src=pr=tree#diff-YWlyZmxvdy90aV9kZXBzL2RlcHMvcG9vbF9zbG90c19hdmFpbGFibGVfZGVwLnB5)
 | `96% <96%> (ø)` | |
   | ... and [2 
more](https://codecov.io/gh/apache/airflow/pull/5079/diff?src=pr=tree-more) 
| |
   
   --
   
   [Continue to review full report at 
Codecov](https://codecov.io/gh/apache/airflow/pull/5079?src=pr=continue).
   > **Legend** - [Click here to learn 
more](https://docs.codecov.io/docs/codecov-delta)
   > `Δ = absolute <relative> (impact)`, `ø = not affected`, `? = missing data`
   > Powered by 
[Codecov](https://codecov.io/gh/apache/airflow/pull/5079?src=pr=footer). 
Last update 
[9b34bf5...1b4e685](https://codecov.io/gh/apache/airflow/pull/5079?src=pr=lastupdated).
 Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments).
   




[GitHub] [airflow] codecov-io edited a comment on issue #5079: [WIP][AIRFLOW-4285] Update task dependency context defination and usage

2019-05-01 Thread GitBox
codecov-io edited a comment on issue #5079: [WIP][AIRFLOW-4285] Update task 
dependency context defination and usage
URL: https://github.com/apache/airflow/pull/5079#issuecomment-486600978
 
 
   # [Codecov](https://codecov.io/gh/apache/airflow/pull/5079?src=pr=h1) 
Report
   > Merging 
[#5079](https://codecov.io/gh/apache/airflow/pull/5079?src=pr=desc) into 
[master](https://codecov.io/gh/apache/airflow/commit/9b34bf541a4181b20586326508ba754c04f5cc68?src=pr=desc)
 will **increase** coverage by `0.03%`.
   > The diff coverage is `96.22%`.
   
   [![Impacted file tree 
graph](https://codecov.io/gh/apache/airflow/pull/5079/graphs/tree.svg?width=650=WdLKlKHOAU=150=pr)](https://codecov.io/gh/apache/airflow/pull/5079?src=pr=tree)
   
   ```diff
   @@            Coverage Diff             @@
   ##           master    #5079      +/-   ##
   ==========================================
   + Coverage   78.54%   78.58%   +0.03%
   ==========================================
     Files         469      471       +2
     Lines       29983    30022      +39
   ==========================================
   + Hits        23551    23593      +42
   + Misses       6432     6429       -3
   ```
   
   
   | [Impacted 
Files](https://codecov.io/gh/apache/airflow/pull/5079?src=pr=tree) | 
Coverage Δ | |
   |---|---|---|
   | 
[airflow/models/pool.py](https://codecov.io/gh/apache/airflow/pull/5079/diff?src=pr=tree#diff-YWlyZmxvdy9tb2RlbHMvcG9vbC5weQ==)
 | `97.05% <100%> (+0.08%)` | :arrow_up: |
   | 
[airflow/ti\_deps/deps/dagrun\_id\_dep.py](https://codecov.io/gh/apache/airflow/pull/5079/diff?src=pr=tree#diff-YWlyZmxvdy90aV9kZXBzL2RlcHMvZGFncnVuX2lkX2RlcC5weQ==)
 | `100% <100%> (ø)` | |
   | 
[airflow/www/views.py](https://codecov.io/gh/apache/airflow/pull/5079/diff?src=pr=tree#diff-YWlyZmxvdy93d3cvdmlld3MucHk=)
 | `76.27% <100%> (ø)` | :arrow_up: |
   | 
[airflow/models/taskinstance.py](https://codecov.io/gh/apache/airflow/pull/5079/diff?src=pr=tree#diff-YWlyZmxvdy9tb2RlbHMvdGFza2luc3RhbmNlLnB5)
 | `92.88% <100%> (+0.45%)` | :arrow_up: |
   | 
[airflow/ti\_deps/dep\_context.py](https://codecov.io/gh/apache/airflow/pull/5079/diff?src=pr=tree#diff-YWlyZmxvdy90aV9kZXBzL2RlcF9jb250ZXh0LnB5)
 | `100% <100%> (ø)` | :arrow_up: |
   | 
[airflow/bin/cli.py](https://codecov.io/gh/apache/airflow/pull/5079/diff?src=pr=tree#diff-YWlyZmxvdy9iaW4vY2xpLnB5)
 | `66.83% <50%> (ø)` | :arrow_up: |
   | 
[airflow/jobs.py](https://codecov.io/gh/apache/airflow/pull/5079/diff?src=pr=tree#diff-YWlyZmxvdy9qb2JzLnB5)
 | `78.35% <88.88%> (ø)` | :arrow_up: |
   | 
[airflow/ti\_deps/deps/pool\_slots\_available\_dep.py](https://codecov.io/gh/apache/airflow/pull/5079/diff?src=pr=tree#diff-YWlyZmxvdy90aV9kZXBzL2RlcHMvcG9vbF9zbG90c19hdmFpbGFibGVfZGVwLnB5)
 | `96% <96%> (ø)` | |
   | ... and [1 
more](https://codecov.io/gh/apache/airflow/pull/5079/diff?src=pr=tree-more) 
| |
   
   --
   
   [Continue to review full report at 
Codecov](https://codecov.io/gh/apache/airflow/pull/5079?src=pr=continue).
   > **Legend** - [Click here to learn 
more](https://docs.codecov.io/docs/codecov-delta)
   > `Δ = absolute <relative> (impact)`, `ø = not affected`, `? = missing data`
   > Powered by 
[Codecov](https://codecov.io/gh/apache/airflow/pull/5079?src=pr=footer). 
Last update 
[9b34bf5...1b4e685](https://codecov.io/gh/apache/airflow/pull/5079?src=pr=lastupdated).
 Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments).
   




[GitHub] [airflow] feng-tao commented on issue #5224: [AIRFLOW-4146] Fix CgroupTaskRunner errors

2019-05-01 Thread GitBox
feng-tao commented on issue #5224: [AIRFLOW-4146] Fix CgroupTaskRunner errors
URL: https://github.com/apache/airflow/pull/5224#issuecomment-488537399
 
 
   @youngyjd do we have existing tests for CgroupTaskRunner? If not, could we 
add some in which we mock the actual cgroup routines?
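
One way to test a cgroup-backed runner without a mounted cgroupfs is to mock the cgroup layer. A rough, hypothetical sketch of the idea (all names here are invented stand-ins; the real runner lives in `airflow.contrib.task_runner.cgroup_task_runner`):

```python
from unittest import mock

class CgroupClient:
    """Stand-in for the cgroupspy-backed helper a runner would use."""
    def create_cgroup(self, path):
        # On a real box this would touch /sys/fs/cgroup; on CI it fails.
        raise OSError("cgroupfs not mounted")

def start_task(client):
    # The runner logic under test: create the cgroup, then report success.
    client.create_cgroup("memory/airflow_task")
    return "started"

# In a unit test, autospec the client so no cgroupfs is required:
fake = mock.create_autospec(CgroupClient, instance=True)
result = start_task(fake)
fake.create_cgroup.assert_called_once_with("memory/airflow_task")
```

The same trick (patching the cgroup client at the module boundary) would let the runner's bookkeeping be exercised on any CI machine.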




[jira] [Resolved] (AIRFLOW-4447) Display task duration as human friendly format in Tree View

2019-05-01 Thread Tao Feng (JIRA)


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-4447?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Tao Feng resolved AIRFLOW-4447.
---
   Resolution: Fixed
Fix Version/s: 1.10.4

> Display task duration as human friendly format in Tree View
> ---
>
> Key: AIRFLOW-4447
> URL: https://issues.apache.org/jira/browse/AIRFLOW-4447
> Project: Apache Airflow
>  Issue Type: Improvement
>  Components: webserver
>Reporter: Ping Zhang
>Assignee: Ping Zhang
>Priority: Minor
> Fix For: 1.10.4
>
>






[GitHub] [airflow] feng-tao merged pull request #5218: [AIRFLOW-4447] Display task duration as human friendly format in UI

2019-05-01 Thread GitBox
feng-tao merged pull request #5218: [AIRFLOW-4447] Display task duration as 
human friendly format in UI
URL: https://github.com/apache/airflow/pull/5218
 
 
   




[jira] [Commented] (AIRFLOW-4447) Display task duration as human friendly format in Tree View

2019-05-01 Thread ASF subversion and git services (JIRA)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-4447?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16831375#comment-16831375
 ] 

ASF subversion and git services commented on AIRFLOW-4447:
--

Commit 9b34bf541a4181b20586326508ba754c04f5cc68 in airflow's branch 
refs/heads/master from Ping Zhang
[ https://gitbox.apache.org/repos/asf?p=airflow.git;h=9b34bf5 ]

[AIRFLOW-4447] Display task duration as human friendly format in UI (#5218)



> Display task duration as human friendly format in Tree View
> ---
>
> Key: AIRFLOW-4447
> URL: https://issues.apache.org/jira/browse/AIRFLOW-4447
> Project: Apache Airflow
>  Issue Type: Improvement
>  Components: webserver
>Reporter: Ping Zhang
>Assignee: Ping Zhang
>Priority: Minor
>






[jira] [Commented] (AIRFLOW-4447) Display task duration as human friendly format in Tree View

2019-05-01 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-4447?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16831374#comment-16831374
 ] 

ASF GitHub Bot commented on AIRFLOW-4447:
-

feng-tao commented on pull request #5218: [AIRFLOW-4447] Display task duration 
as human friendly format in UI
URL: https://github.com/apache/airflow/pull/5218
 
 
   
 



> Display task duration as human friendly format in Tree View
> ---
>
> Key: AIRFLOW-4447
> URL: https://issues.apache.org/jira/browse/AIRFLOW-4447
> Project: Apache Airflow
>  Issue Type: Improvement
>  Components: webserver
>Reporter: Ping Zhang
>Assignee: Ping Zhang
>Priority: Minor
>






[GitHub] [airflow] codecov-io commented on issue #5224: [AIRFLOW-4146] Fix CgroupTaskRunner errors

2019-05-01 Thread GitBox
codecov-io commented on issue #5224: [AIRFLOW-4146] Fix CgroupTaskRunner errors
URL: https://github.com/apache/airflow/pull/5224#issuecomment-488498739
 
 
   # [Codecov](https://codecov.io/gh/apache/airflow/pull/5224?src=pr=h1) 
Report
   > Merging 
[#5224](https://codecov.io/gh/apache/airflow/pull/5224?src=pr=desc) into 
[master](https://codecov.io/gh/apache/airflow/commit/52b121797b328383cd39905ef762fc79e2e6a653?src=pr=desc)
 will **decrease** coverage by `<.01%`.
   > The diff coverage is `0%`.
   
   [![Impacted file tree 
graph](https://codecov.io/gh/apache/airflow/pull/5224/graphs/tree.svg?width=650=WdLKlKHOAU=150=pr)](https://codecov.io/gh/apache/airflow/pull/5224?src=pr=tree)
   
   ```diff
   @@            Coverage Diff             @@
   ##           master    #5224      +/-   ##
   ==========================================
   - Coverage   78.54%   78.54%   -0.01%
   ==========================================
     Files         469      469
     Lines       29983    29984       +1
   ==========================================
     Hits        23551    23551
   - Misses       6432     6433       +1
   ```
   
   
   | [Impacted 
Files](https://codecov.io/gh/apache/airflow/pull/5224?src=pr=tree) | 
Coverage Δ | |
   |---|---|---|
   | 
[airflow/contrib/task\_runner/cgroup\_task\_runner.py](https://codecov.io/gh/apache/airflow/pull/5224/diff?src=pr=tree#diff-YWlyZmxvdy9jb250cmliL3Rhc2tfcnVubmVyL2Nncm91cF90YXNrX3J1bm5lci5weQ==)
 | `0% <0%> (ø)` | :arrow_up: |
   
   --
   
   [Continue to review full report at 
Codecov](https://codecov.io/gh/apache/airflow/pull/5224?src=pr=continue).
   > **Legend** - [Click here to learn 
more](https://docs.codecov.io/docs/codecov-delta)
   > `Δ = absolute <relative> (impact)`, `ø = not affected`, `? = missing data`
   > Powered by 
[Codecov](https://codecov.io/gh/apache/airflow/pull/5224?src=pr=footer). 
Last update 
[52b1217...7b86b9e](https://codecov.io/gh/apache/airflow/pull/5224?src=pr=lastupdated).
 Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments).
   




[GitHub] [airflow] youngyjd commented on issue #5224: [AIRFLOW-4146] Fix CgroupTaskRunner errors

2019-05-01 Thread GitBox
youngyjd commented on issue #5224: [AIRFLOW-4146] Fix CgroupTaskRunner errors
URL: https://github.com/apache/airflow/pull/5224#issuecomment-488475516
 
 
   @ashb how do you want this to be tested? cgroup must be mounted on the 
system first to make this taskrunner work.




[jira] [Commented] (AIRFLOW-4450) has_dag_access does not handle form parameters

2019-05-01 Thread ASF subversion and git services (JIRA)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-4450?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16831295#comment-16831295
 ] 

ASF subversion and git services commented on AIRFLOW-4450:
--

Commit 22c559dce7974c217bd687d4e468296b980a8e21 in airflow's branch 
refs/heads/v1-10-test from Chris McLennon
[ https://gitbox.apache.org/repos/asf?p=airflow.git;h=22c559d ]

[AIRFLOW-4450] Fix request arguments in has_dag_access (#5220)

has_dag_access needs to use request arguments from both
request.args and request.form.

This is related to the changes made in AIRFLOW-4240/#5039.

(cherry picked from commit 52b121797b328383cd39905ef762fc79e2e6a653)


> has_dag_access does not handle form parameters
> --
>
> Key: AIRFLOW-4450
> URL: https://issues.apache.org/jira/browse/AIRFLOW-4450
> Project: Apache Airflow
>  Issue Type: Bug
>  Components: webserver
>Affects Versions: 1.10.3
>Reporter: Chris McLennon
>Assignee: Chris McLennon
>Priority: Major
> Fix For: 1.10.4
>
>
> With the update to Airflow 1.10.3, we noticed that users that have permission 
> to clear task instances are no longer able to do so from the webserver, and 
> are receiving an "Access denied" error. They are able to clear task instances 
> from /taskinstance, but not from /clear.
> I believe this is related to the change made in AIRFLOW-4240, which had the 
> unintended side effect of breaking the has_dag_access decorator for POST 
> requests.
> This may affect other endpoints.





[GitHub] [airflow] ashb commented on issue #5224: [AIRFLOW-4146] Fix CgroupTaskRunner errors

2019-05-01 Thread GitBox
ashb commented on issue #5224: [AIRFLOW-4146] Fix CgroupTaskRunner errors
URL: https://github.com/apache/airflow/pull/5224#issuecomment-488466036
 
 
   Any chance you could add some unit tests covering some of this too? The 
CGroup runner isn't much used and without unit tests it will be hard for the 
Airflow community to continue supporting it (meaning we might remove it in a 
future version)




[GitHub] [airflow] ashb commented on issue #4610: [AIRFLOW-3783] Switch to HEADER option being provided already by Redshift

2019-05-01 Thread GitBox
ashb commented on issue #4610: [AIRFLOW-3783] Switch to HEADER option being 
provided already by Redshift
URL: https://github.com/apache/airflow/pull/4610#issuecomment-488465522
 
 
   In which case this probably needs to be an option (that can default to on) 
to continue working with older clusters, right? Or do clusters auto/forcibly 
get upgraded?




[jira] [Commented] (AIRFLOW-4146) CgroupTaskRunner is not functioning

2019-05-01 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-4146?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16831297#comment-16831297
 ] 

ASF GitHub Bot commented on AIRFLOW-4146:
-

youngyjd commented on pull request #5224: [AIRFLOW-4146] Fix CgroupTaskRunner 
errors
URL: https://github.com/apache/airflow/pull/5224
 
 
   Make sure you have checked _all_ steps below.
   
   ### Jira
   
   - [X] My PR addresses the following 
[AIRFLOW-4146](https://issues.apache.org/jira/browse/AIRFLOW-4146) issues and 
references them in the PR title. For example, "\[AIRFLOW-XXX\] My Airflow PR"
 - https://issues.apache.org/jira/browse/AIRFLOW-4146
 - In case you are fixing a typo in the documentation you can prepend your 
commit with \[AIRFLOW-XXX\], code changes always need a Jira issue.
 - In case you are proposing a fundamental code change, you need to create 
an Airflow Improvement Proposal 
([AIP](https://cwiki.apache.org/confluence/display/AIRFLOW/Airflow+Improvements+Proposals)).
 - In case you are adding a dependency, check if the license complies with 
the [ASF 3rd Party License 
Policy](https://www.apache.org/legal/resolved.html#category-x).
   
   ### Description
   
   - [X] Here are some details about my PR, including screenshots of any UI 
changes:
   1. This PR fixes the CgroupTaskRunner import errors.
   2. Node.name() returns an encoded (bytes) value, which works fine in Python 2. 
However, Python 3 returns False for `'abc' == b'abc'`, so we have to decode it, 
since Python 2 support has been dropped in this project.
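   The bytes-vs-str point can be sketched in a few lines (the helper name is illustrative, not the actual PR code):

   ```python
   def normalize_proc_name(name):
       """Process names read back from cgroup/psutil APIs may be bytes;
       in Python 3, 'abc' == b'abc' is False, so decode before comparing
       against str task-runner names."""
       if isinstance(name, bytes):
           return name.decode("utf-8")
       return name

   assert "abc" != b"abc"  # the Python 3 behaviour that broke the comparison
   ```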
   
   ### Tests
   
   - [X] My PR adds the following unit tests __OR__ does not need testing for 
this extremely good reason:
   1. Not a fundamental change. 
   2. There is no test written for this task runner.
   
   ### Commits
   
   - [X] My commits all reference Jira issues in their subject lines, and I 
have squashed multiple commits if they address the same issue. In addition, my 
commits follow the guidelines from "[How to write a good git commit 
message](http://chris.beams.io/posts/git-commit/)":
 1. Subject is separated from body by a blank line
 1. Subject is limited to 50 characters (not including Jira issue reference)
 1. Subject does not end with a period
 1. Subject uses the imperative mood ("add", not "adding")
 1. Body wraps at 72 characters
 1. Body explains "what" and "why", not "how"
   
   ### Documentation
   
   - [X] In case of new functionality, my PR adds documentation that describes 
how to use it.
 - All the public functions and the classes in the PR contain docstrings 
that explain what it does
 - If you implement backwards incompatible changes, please leave a note in 
the [Updating.md](https://github.com/apache/airflow/blob/master/UPDATING.md) so 
we can assign it to an appropriate release
   
   ### Code Quality
   
   - [X] Passes `flake8`
   
 



> CgroupTaskRunner is not functioning
> ---
>
> Key: AIRFLOW-4146
> URL: https://issues.apache.org/jira/browse/AIRFLOW-4146
> Project: Apache Airflow
>  Issue Type: Bug
>Reporter: Junda Yang
>Assignee: Junda Yang
>Priority: Major
>
> Tried to switch from StandardTaskRunner to CgroupTaskRunner but tasks were 
> stuck and unable to finish. 





[GitHub] [airflow] youngyjd opened a new pull request #5224: [AIRFLOW-4146] Fix CgroupTaskRunner errors

2019-05-01 Thread GitBox
youngyjd opened a new pull request #5224: [AIRFLOW-4146] Fix CgroupTaskRunner 
errors
URL: https://github.com/apache/airflow/pull/5224
 
 
   Make sure you have checked _all_ steps below.
   
   ### Jira
   
   - [X] My PR addresses the following 
[AIRFLOW-4146](https://issues.apache.org/jira/browse/AIRFLOW-4146) issues and 
references them in the PR title. For example, "\[AIRFLOW-XXX\] My Airflow PR"
 - https://issues.apache.org/jira/browse/AIRFLOW-4146
 - In case you are fixing a typo in the documentation you can prepend your 
commit with \[AIRFLOW-XXX\], code changes always need a Jira issue.
 - In case you are proposing a fundamental code change, you need to create 
an Airflow Improvement Proposal 
([AIP](https://cwiki.apache.org/confluence/display/AIRFLOW/Airflow+Improvements+Proposals)).
 - In case you are adding a dependency, check if the license complies with 
the [ASF 3rd Party License 
Policy](https://www.apache.org/legal/resolved.html#category-x).
   
   ### Description
   
   - [X] Here are some details about my PR, including screenshots of any UI 
changes:
   1. This PR fixes the CgroupTaskRunner import errors.
   2. Node.name() returns an encoded (bytes) value, which works fine in Python 2. 
However, Python 3 returns False for `'abc' == b'abc'`, so we have to decode it, 
since Python 2 support has been dropped in this project.
   
   ### Tests
   
   - [X] My PR adds the following unit tests __OR__ does not need testing for 
this extremely good reason:
   1. Not a fundamental change. 
   2. There is no test written for this task runner.
   
   ### Commits
   
   - [X] My commits all reference Jira issues in their subject lines, and I 
have squashed multiple commits if they address the same issue. In addition, my 
commits follow the guidelines from "[How to write a good git commit 
message](http://chris.beams.io/posts/git-commit/)":
 1. Subject is separated from body by a blank line
 1. Subject is limited to 50 characters (not including Jira issue reference)
 1. Subject does not end with a period
 1. Subject uses the imperative mood ("add", not "adding")
 1. Body wraps at 72 characters
 1. Body explains "what" and "why", not "how"
   
   ### Documentation
   
   - [X] In case of new functionality, my PR adds documentation that describes 
how to use it.
 - All the public functions and the classes in the PR contain docstrings 
that explain what it does
 - If you implement backwards incompatible changes, please leave a note in 
the [Updating.md](https://github.com/apache/airflow/blob/master/UPDATING.md) so 
we can assign it to an appropriate release
   
   ### Code Quality
   
   - [X] Passes `flake8`
   




[jira] [Commented] (AIRFLOW-4240) State changing actions shouldn't be GET requests

2019-05-01 Thread ASF subversion and git services (JIRA)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-4240?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16831296#comment-16831296
 ] 

ASF subversion and git services commented on AIRFLOW-4240:
--

Commit 22c559dce7974c217bd687d4e468296b980a8e21 in airflow's branch 
refs/heads/v1-10-test from Chris McLennon
[ https://gitbox.apache.org/repos/asf?p=airflow.git;h=22c559d ]

[AIRFLOW-4450] Fix request arguments in has_dag_access (#5220)

has_dag_access needs to use request arguments from both
request.args and request.form.

This is related to the changes made in AIRFLOW-4240/#5039.

(cherry picked from commit 52b121797b328383cd39905ef762fc79e2e6a653)


> State changing actions shouldn't be GET requests
> 
>
> Key: AIRFLOW-4240
> URL: https://issues.apache.org/jira/browse/AIRFLOW-4240
> Project: Apache Airflow
>  Issue Type: Bug
>Reporter: Ash Berlin-Taylor
>Assignee: Ash Berlin-Taylor
>Priority: Major
> Fix For: 1.10.3
>
>
> We have a number of actions which perform actions (trigger, clear, etc) that 
> are performed over GET requests.
> That should be avoided as browsers/corporate proxies might prefetch the URLs 
> causing things to behave oddly.





[jira] [Resolved] (AIRFLOW-4450) has_dag_access does not handle form parameters

2019-05-01 Thread Ash Berlin-Taylor (JIRA)


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-4450?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Ash Berlin-Taylor resolved AIRFLOW-4450.

Resolution: Fixed

> has_dag_access does not handle form parameters
> --
>
> Key: AIRFLOW-4450
> URL: https://issues.apache.org/jira/browse/AIRFLOW-4450
> Project: Apache Airflow
>  Issue Type: Bug
>  Components: webserver
>Affects Versions: 1.10.3
>Reporter: Chris McLennon
>Assignee: Chris McLennon
>Priority: Major
> Fix For: 1.10.4
>
>
> With the update to Airflow 1.10.3, we noticed that users that have permission 
> to clear task instances are no longer able to do so from the webserver, and 
> are receiving an "Access denied" error. They are able to clear task instances 
> from /taskinstance, but not from /clear.
> I believe this is related to the change made in AIRFLOW-4240, which had the 
> unintended side effect of breaking the has_dag_access decorator for POST 
> requests.
> This may affect other endpoints.





[jira] [Commented] (AIRFLOW-4450) has_dag_access does not handle form parameters

2019-05-01 Thread ASF subversion and git services (JIRA)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-4450?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16831293#comment-16831293
 ] 

ASF subversion and git services commented on AIRFLOW-4450:
--

Commit 52b121797b328383cd39905ef762fc79e2e6a653 in airflow's branch 
refs/heads/master from Chris McLennon
[ https://gitbox.apache.org/repos/asf?p=airflow.git;h=52b1217 ]

[AIRFLOW-4450] Fix request arguments in has_dag_access (#5220)

has_dag_access needs to use request arguments from both
request.args and request.form.

This is related to the changes made in AIRFLOW-4240/#5039.

> has_dag_access does not handle form parameters
> --
>
> Key: AIRFLOW-4450
> URL: https://issues.apache.org/jira/browse/AIRFLOW-4450
> Project: Apache Airflow
>  Issue Type: Bug
>  Components: webserver
>Affects Versions: 1.10.3
>Reporter: Chris McLennon
>Assignee: Chris McLennon
>Priority: Major
> Fix For: 1.10.4
>
>
> With the update to Airflow 1.10.3, we noticed that users that have permission 
> to clear task instances are no longer able to do so from the webserver, and 
> are receiving an "Access denied" error. They are able to clear task instances 
> from /taskinstance, but not from /clear.
> I believe this is related to the change made in AIRFLOW-4240, which had the 
> unintended side effect of breaking the has_dag_access decorator for POST 
> requests.
> This may affect other endpoints.





[jira] [Commented] (AIRFLOW-4450) has_dag_access does not handle form parameters

2019-05-01 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-4450?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16831292#comment-16831292
 ] 

ASF GitHub Bot commented on AIRFLOW-4450:
-

ashb commented on pull request #5220: [AIRFLOW-4450] has_dag_access does not 
handle form parameters
URL: https://github.com/apache/airflow/pull/5220
 
 
   
 



> has_dag_access does not handle form parameters
> --
>
> Key: AIRFLOW-4450
> URL: https://issues.apache.org/jira/browse/AIRFLOW-4450
> Project: Apache Airflow
>  Issue Type: Bug
>  Components: webserver
>Affects Versions: 1.10.3
>Reporter: Chris McLennon
>Assignee: Chris McLennon
>Priority: Major
> Fix For: 1.10.4
>
>
> With the update to Airflow 1.10.3, we noticed that users that have permission 
> to clear task instances are no longer able to do so from the webserver, and 
> are receiving an "Access denied" error. They are able to clear task instances 
> from /taskinstance, but not from /clear.
> I believe this is related to the change made in AIRFLOW-4240, which had the 
> unintended side effect of breaking the has_dag_access decorator for POST 
> requests.
> This may affect other endpoints.





[jira] [Commented] (AIRFLOW-4240) State changing actions shouldn't be GET requests

2019-05-01 Thread ASF subversion and git services (JIRA)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-4240?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16831294#comment-16831294
 ] 

ASF subversion and git services commented on AIRFLOW-4240:
--

Commit 52b121797b328383cd39905ef762fc79e2e6a653 in airflow's branch 
refs/heads/master from Chris McLennon
[ https://gitbox.apache.org/repos/asf?p=airflow.git;h=52b1217 ]

[AIRFLOW-4450] Fix request arguments in has_dag_access (#5220)

has_dag_access needs to use request arguments from both
request.args and request.form.

This is related to the changes made in AIRFLOW-4240/#5039.

> State changing actions shouldn't be GET requests
> 
>
> Key: AIRFLOW-4240
> URL: https://issues.apache.org/jira/browse/AIRFLOW-4240
> Project: Apache Airflow
>  Issue Type: Bug
>Reporter: Ash Berlin-Taylor
>Assignee: Ash Berlin-Taylor
>Priority: Major
> Fix For: 1.10.3
>
>
> We have a number of actions which perform actions (trigger, clear, etc) that 
> are performed over GET requests.
> That should be avoided as browsers/corporate proxies might prefetch the URLs 
> causing things to behave oddly.
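> The rationale can be sketched as a simple method guard (names here are
> illustrative, not Airflow's actual code): browsers and corporate proxies may
> prefetch GET URLs, so trigger/clear endpoints must only accept non-safe methods.

```python
SAFE_METHODS = {"GET", "HEAD", "OPTIONS"}

def allow_state_change(method):
    # Only non-safe HTTP methods (POST, PUT, DELETE, ...) may perform
    # state-changing actions; prefetchers only issue safe methods.
    return method.upper() not in SAFE_METHODS
```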





[GitHub] [airflow] ashb merged pull request #5220: [AIRFLOW-4450] has_dag_access does not handle form parameters

2019-05-01 Thread GitBox
ashb merged pull request #5220: [AIRFLOW-4450] has_dag_access does not handle 
form parameters
URL: https://github.com/apache/airflow/pull/5220
 
 
   




[GitHub] [airflow] ashb commented on issue #5206: [AIRFLOW-4434] fix support for impala in hive hook

2019-05-01 Thread GitBox
ashb commented on issue #5206: [AIRFLOW-4434] fix support for impala in hive 
hook
URL: https://github.com/apache/airflow/pull/5206#issuecomment-488462937
 
 
   May or June time-frame seems likely, yes.




[GitHub] [airflow] KevinYang21 commented on issue #5218: [AIRFLOW-4447] Display task duration as human friendly format in UI

2019-05-01 Thread GitBox
KevinYang21 commented on issue #5218: [AIRFLOW-4447] Display task duration as 
human friendly format in UI
URL: https://github.com/apache/airflow/pull/5218#issuecomment-488461595
 
 
   Ty @pingzh 




[jira] [Commented] (AIRFLOW-4368) Document Apache Airflow architecture with diagrams

2019-05-01 Thread Pooja Gadige (JIRA)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-4368?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16831280#comment-16831280
 ] 

Pooja Gadige commented on AIRFLOW-4368:
---

[~aizhamal], may I please start working on this one if it's available? 

I've already quickly skimmed through the *building and deploying docs* and 
*contributors' guide* topics.

How would you like the initial drafts of the content to be submitted for review?

Thank you!

> Document Apache Airflow architecture with diagrams
> --
>
> Key: AIRFLOW-4368
> URL: https://issues.apache.org/jira/browse/AIRFLOW-4368
> Project: Apache Airflow
>  Issue Type: Improvement
>  Components: docs
>Reporter: Aizhamal Nurmamat kyzy
>Priority: Major
>  Labels: gsod2019
>
> h4. Expected deliverables
>  * A page describing the architecture
>  * The page should have detailed descriptions of each component:
>  * Scheduler
>  * Web Server
>  * Worker
>  * Metadata DB
>  * The page should also contain a diagram on [Apache Airflow 
> architecture|https://imgur.com/a/YGpg5Wa]
>  * Description of how Apache Airflow [schedules 
> tasks|https://blog.sicara.com/using-airflow-with-celery-workers-54cb5212d405]
>  * Detailed examples with diagrams and text [using 
> PlantUML|https://github.com/plantuml/plantuml]





[GitHub] [airflow] serkef commented on a change in pull request #5083: [AIRFLOW-4292] Cleanup and improve SLA code

2019-05-01 Thread GitBox
serkef commented on a change in pull request #5083: [AIRFLOW-4292] Cleanup and 
improve SLA code
URL: https://github.com/apache/airflow/pull/5083#discussion_r280211931
 
 

 ##
 File path: airflow/jobs.py
 ##
 @@ -660,86 +660,88 @@ def manage_slas(self, dag, session=None):
 dttm = dag.following_schedule(dttm)
 session.commit()
 
+# Identify tasks that should send notification
 slas = (
 session
 .query(SlaMiss)
 .filter(SlaMiss.notification_sent == False, SlaMiss.dag_id == 
dag.dag_id)  # noqa: E712
 .all()
 )
 
-if slas:
-sla_dates = [sla.execution_date for sla in slas]
-qry = (
-session
-.query(TI)
-.filter(
-TI.state != State.SUCCESS,
-TI.execution_date.in_(sla_dates),
-TI.dag_id == dag.dag_id
-).all()
-)
-blocking_tis = []
-for ti in qry:
-if ti.task_id in dag.task_ids:
-ti.task = dag.get_task(ti.task_id)
-blocking_tis.append(ti)
-else:
-session.delete(ti)
-session.commit()
+if not slas:
 
 Review comment:
   Since there is no else, we can reverse the check to reduce the indentation 
and make it more readable.
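   The suggested guard-clause shape, as a plain-Python sketch (dicts stand in for SlaMiss rows; the function name is illustrative):

   ```python
   def manage_slas_sketch(slas):
       # Guard clause: with no matching SLA misses there is nothing to do,
       # and returning early keeps the main logic one level less indented
       # than nesting it all under `if slas:`.
       if not slas:
           return []
       # ...notification logic that previously lived inside `if slas:`...
       return [s for s in slas if not s.get("notification_sent")]
   ```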




[jira] [Commented] (AIRFLOW-3877) Scheduler sending the absolute path to celery

2019-05-01 Thread James Coder (JIRA)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-3877?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16831229#comment-16831229
 ] 

James Coder commented on AIRFLOW-3877:
--

I'm seeing the same thing using a Redis and Postgres backend. Tasks fail if they 
try to run on a different host where the DAG location is different than on the 
scheduler.

> Scheduler sending the absolute path to celery
> -
>
> Key: AIRFLOW-3877
> URL: https://issues.apache.org/jira/browse/AIRFLOW-3877
> Project: Apache Airflow
>  Issue Type: Bug
>  Components: scheduler
>Affects Versions: 1.10.1
>Reporter: Vipul Pandey
>Priority: Major
>
> Hi,
> Upgraded the airflow version from 1.7.3 to 1.10.1. After up-gradation of the 
> scheduler, webserver and workers, the dags have stopped working showing below 
> error on scheduler- 
> {{Either the dag did not exist or it failed to parse.}}
> I have not made any changes to the config. While investigating, the 
> scheduler logs revealed the issue. Earlier the scheduler ran the task as - 
> Adding to queue: airflow run    --local -sd 
> DAGS_FOLDER/
> While now it is running with an absolute path - 
> Adding to queue: airflow run    --local -sd 
> //     
> PATH_TO_DAGS_FOLDER is like /home//Airflow/dags...
> which is the same path it pushes to the workers; since the worker runs as a 
> different user, it is not able to find the specified DAG location.
> I am using mysql as the backend and rabbitmq for message passing.
>  





[GitHub] [airflow] andrewhharmon opened a new pull request #5223: [AIRFLOW-4417] Add AWS IAM authentication for PostgresHook

2019-05-01 Thread GitBox
andrewhharmon opened a new pull request #5223: [AIRFLOW-4417] Add AWS IAM 
authentication for PostgresHook
URL: https://github.com/apache/airflow/pull/5223
 
 
   ### Jira
   
   - [x] My PR addresses the following [Airflow 
Jira](https://issues.apache.org/jira/browse/AIRFLOW/) issues and references 
them in the PR title.
 - https://issues.apache.org/jira/browse/AIRFLOW-4417
   
   ### Description
   
   - [x] Enhance the existing PostgresHook to allow for IAM authentication for 
RDS Postgres and Redshift.
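   The shape of such an enhancement can be sketched as assembling connection kwargs, with the IAM token injected as a short-lived password. This is a hedged illustration only: `token_provider` stands in for boto3's RDS `generate_db_auth_token` call, which the sketch does not invoke, and none of these names are the PR's actual API.

   ```python
   def build_pg_conn_kwargs(host, port, user, dbname, use_iam=False,
                            token_provider=None, password=None):
       """Assemble psycopg2-style connection kwargs. When use_iam is set,
       the password becomes a short-lived auth token obtained from
       token_provider (a stand-in for an AWS RDS token call)."""
       if use_iam:
           password = token_provider(host, port, user)
       return dict(host=host, port=port, user=user, dbname=dbname,
                   password=password)
   ```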
   
   ### Tests
   
   - [x] My PR adds the following unit tests
 - test_get_conn()
 - test_get_conn_rds_iam_postgres()
 - test_get_conn_rds_iam_redshift
   
   ### Commits
   
   - [x] My commits all reference Jira issues in their subject lines, and I 
have squashed multiple commits if they address the same issue. In addition, my 
commits follow the guidelines from "[How to write a good git commit 
message](http://chris.beams.io/posts/git-commit/)":
 1. Subject is separated from body by a blank line
 1. Subject is limited to 50 characters (not including Jira issue reference)
 1. Subject does not end with a period
 1. Subject uses the imperative mood ("add", not "adding")
 1. Body wraps at 72 characters
 1. Body explains "what" and "why", not "how"
   
   ### Documentation
   
   - [x] In case of new functionality, my PR adds documentation that describes 
how to use it.
 - All the public functions and the classes in the PR contain docstrings 
that explain what it does
 - If you implement backwards incompatible changes, please leave a note in 
the [Updating.md](https://github.com/apache/airflow/blob/master/UPDATING.md) so 
we can assign it to an appropriate release
   
   ### Code Quality
   
   - [x] Passes `flake8`
   




[GitHub] [airflow] alejandrounravel commented on a change in pull request #5118: [AIRFLOW-4315] Add monitoring API's to airflow

2019-05-01 Thread GitBox
alejandrounravel commented on a change in pull request #5118: [AIRFLOW-4315] 
Add monitoring API's to airflow
URL: https://github.com/apache/airflow/pull/5118#discussion_r280169672
 
 

 ##
 File path: airflow/models/taskinstance.py
 ##
 @@ -215,6 +215,49 @@ def try_number(self, value):
 def next_try_number(self):
 return self._try_number + 1
 
+@staticmethod
+@provide_session
+def find(session=None, dag_id=None, task_id=None, execution_date=None,
+ execution_date_before=None, execution_date_after=None,
+ state=None, state_not_equal=None):
+"""
+Returns a set of task instances for the given search criteria.
+
+:param dag_id: the dag_id to find task instances for
+:type dag_id: str
+:param task_id: the task_id to find task instances for
+:type task_id: str
+:param execution_date_before: filter on execution date before the 
provided one
+:type execution_date_before: datetime.datetime
+:param execution_date_after: filter on execution date after the 
provided one
+:type execution_date_after: datetime.datetime
+:param state_not_equal: the state of the task instance not to be in 
the results
+:type state: airflow.utils.state.State
+:param state: the state of the task instance
+:type state: airflow.utils.state.State
+:param session: database session
+:type session: sqlalchemy.orm.session.Session
+"""
+TI = TaskInstance
+
+query = session.query(TI)
+if dag_id:
 
 Review comment:
   Shouldn't at least dag_id be required?
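   The incremental-filter pattern under discussion can be shown with plain Python (dicts stand in for TaskInstance rows and list comprehensions for chained `.filter()` calls; the function name is illustrative):

   ```python
   def find_task_instances(instances, dag_id=None, task_id=None, state=None):
       """Each criterion that is not None narrows the result set, so callers
       can pass any combination of filters - the same shape as building up
       a SQLAlchemy query with successive .filter() calls."""
       results = instances
       if dag_id is not None:
           results = [ti for ti in results if ti["dag_id"] == dag_id]
       if task_id is not None:
           results = [ti for ti in results if ti["task_id"] == task_id]
       if state is not None:
           results = [ti for ti in results if ti["state"] == state]
       return results
   ```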




[GitHub] [airflow] alejandrounravel commented on a change in pull request #5118: [AIRFLOW-4315] Add monitoring API's to airflow

2019-05-01 Thread GitBox
alejandrounravel commented on a change in pull request #5118: [AIRFLOW-4315] 
Add monitoring API's to airflow
URL: https://github.com/apache/airflow/pull/5118#discussion_r280168804
 
 

 ##
 File path: airflow/www/api/experimental/endpoints.py
 ##
 @@ -41,6 +47,124 @@
 api_experimental = Blueprint('api_experimental', __name__)
 
 
+@api_experimental.route(
+
'/dags//dag_runs//tasks//logs',
+methods=['GET'])
+@requires_authentication
+def logs(dag_id, execution_date, task_id):
+"""
+Return logs for the specified task identified by dag_id, execution_date 
and task_id
+"""
+try:
+log = get_task_logs(dag_id, task_id, execution_date)
+except AirflowException as err:
+_log.info(err)
+response = jsonify(error="{}".format(err))
+response.status_code = err.status_code
+return response
+except ValueError:
+error_message = (
+'Given execution date, {}, could not be identified '
+'as a date. Example date format: 2015-11-16T14:34:15+00:00'
+.format(execution_date))
+response = jsonify({'error': error_message})
+response.status_code = 400
+return response
+except AttributeError as e:
+error_message = ["Unable to read logs.\n{}\n".format(str(e))]
+metadata = {}
+metadata['end_of_log'] = True
+return jsonify(message=error_message, error=True, metadata=metadata)
+
+return log
+
+
+@api_experimental.route('/dag_runs', methods=['GET'])
+@requires_authentication
+def dag_runs_filter():
+"""
+Return the list of all dag_runs
+:query param state: a query string parameter 
'?state=queued|running|success...'
+:query param state_not_equal: a query string parameter 
'?state_not_equal=queued|running|success...'
+:query param execution_date_before: a query string parameter to find all 
runs before provided date,
+should be in format "YYYY-mm-DDTHH:MM:SS", for example: 
"2016-11-16T11:34:15"
+:query param execution_date_after: a query string parameter to find all 
runs after provided date,
+should be in format "YYYY-mm-DDTHH:MM:SS", for example: 
"2016-11-16T11:34:15"
+:query param dag_id: String identifier of a DAG
+:return: List of DAG runs of a DAG with requested state,
+"""
+state = request.args.get('state')
+state_not_equal = request.args.get('state_not_equal')
+execution_date_before = request.args.get('execution_date_before')
+execution_date_after = request.args.get('execution_date_after')
+dag_id = request.args.get('dag_id')
+
+dagruns = get_all_dag_runs(dag_id=dag_id, state=state, 
state_not_equal=state_not_equal,
+   execution_date_before=execution_date_before,
+   execution_date_after=execution_date_after)
+
+return jsonify(dagruns)
+
+
+@api_experimental.route('/task_instances', methods=['GET'])
+@requires_authentication
+def task_instances_filter():
+"""
+Return the list of all task instances
+:query param state: a query string parameter 
'?state=queued|running|success...'
+:query param state_not_equal: a query string parameter 
'?state_not_equal=queued|running|success...'
+:query param execution_date_before: a query string parameter to find all 
runs before provided date,
+should be in format "YYYY-mm-DDTHH:MM:SS", for example: 
"2016-11-16T11:34:15".'
+:query param execution_date_after: a query string parameter to find all 
runs after provided date,
+should be in format "YYYY-mm-DDTHH:MM:SS", for example: 
"2016-11-16T11:34:15".'
+:query param dag_id: String identifier of a DAG
+:query param task_id: String identifier of a task
+:return: List of task instances
+"""
+state = request.args.get('state')
+state_not_equal = request.args.get('state_not_equal')
+execution_date_before = request.args.get('execution_date_before')
+execution_date_after = request.args.get('execution_date_after')
+dag_id = request.args.get('dag_id')
+task_id = request.args.get('task_id')
+
+task_instances = get_all_task_instances(dag_id=dag_id, state=state, 
state_not_equal=state_not_equal,
+
execution_date_before=execution_date_before,
+
execution_date_after=execution_date_after, task_id=task_id)
+
+return jsonify(task_instances)
+
+
+@api_experimental.route('/dags', methods=['GET'])
+@requires_authentication
+def get_all_dags():
+"""
+Returns a list of Dags
+:query param is_paused: a query string parameter '?is_paused=true|false'
+:return: List of all DAGs
+"""
+is_paused = request.args.get('is_paused')
 
 Review comment:
   Perhaps add a robustness check here to ensure that is_paused is either true 
or false (not sure if str or boolean). What if the user enters an erroneous 
value?
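   A robustness check along the lines the reviewer suggests could look like this (a hypothetical validator, not Airflow's actual behaviour; query-string values arrive as strings):

   ```python
   def parse_bool_param(value):
       """Accept only an explicit true/false spelling for a boolean query
       parameter and fail fast on anything else; None means the parameter
       was not supplied at all."""
       if value is None:
           return None
       lowered = value.strip().lower()
       if lowered in ("true", "1"):
           return True
       if lowered in ("false", "0"):
           return False
       raise ValueError("invalid boolean parameter: %r" % value)
   ```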



[GitHub] [airflow] alejandrounravel commented on a change in pull request #5118: [AIRFLOW-4315] Add monitoring API's to airflow

2019-05-01 Thread GitBox
alejandrounravel commented on a change in pull request #5118: [AIRFLOW-4315] 
Add monitoring API's to airflow
URL: https://github.com/apache/airflow/pull/5118#discussion_r280165948
 
 

 ##
 File path: airflow/www/api/experimental/endpoints.py
 ##
 @@ -255,18 +413,14 @@ def dag_run_status(dag_id, execution_date):
 _log.info(error_message)
 response = jsonify({'error': error_message})
 response.status_code = 400
-
 return response
 
-try:
-info = get_dag_run_state(dag_id, execution_date)
-except AirflowException as err:
-_log.info(err)
-response = jsonify(error="{}".format(err))
-response.status_code = err.status_code
+if not dagruns:
+error_message = "No Dag run found with provided execution date"
+response = jsonify(error="{}".format(error_message))
 
 Review comment:
   Can simplify this with just ```jsonify(error=error_message)```.
   This happens in multiple places.




[GitHub] [airflow] alejandrounravel commented on a change in pull request #5118: [AIRFLOW-4315] Add monitoring API's to airflow

2019-05-01 Thread GitBox
alejandrounravel commented on a change in pull request #5118: [AIRFLOW-4315] 
Add monitoring API's to airflow
URL: https://github.com/apache/airflow/pull/5118#discussion_r280168152
 
 

 ##
 File path: airflow/www/api/experimental/endpoints.py
 ##
 @@ -115,13 +239,23 @@ def dag_runs(dag_id):
 """
 Returns a list of Dag Runs for a specific DAG ID.
 :query param state: a query string parameter 
'?state=queued|running|success...'
+:query param state_not_equal: a query string parameter 
'?state_not_equal=queued|running|success...'
+:query param execution_date_before: a query string parameter to find all 
runs before provided date,
+should be in format "YYYY-mm-DDTHH:MM:SS", for example: 
"2016-11-16T11:34:15".'
 
 Review comment:
   Nitpick: Comments have a superfluous .' at the end




[GitHub] [airflow] alejandrounravel commented on a change in pull request #5118: [AIRFLOW-4315] Add monitoring API's to airflow

2019-05-01 Thread GitBox
alejandrounravel commented on a change in pull request #5118: [AIRFLOW-4315] 
Add monitoring API's to airflow
URL: https://github.com/apache/airflow/pull/5118#discussion_r280166583
 
 

 ##
 File path: airflow/www/api/experimental/endpoints.py
 ##
 @@ -150,23 +284,39 @@ def get_dag_code(dag_id):
 return response
 
 
+@api_experimental.route('/dags//tasks', methods=['GET'])
+@requires_authentication
+def tasks(dag_id):
+"""Returns a JSON with all tasks associated with the dag_id. """
+try:
+task_list = list()
+task_ids = get_tasks(dag_id)
+for task_id in task_ids:
+task_list.append(get_task_as_dict(dag_id, task_id))
+
+except AirflowException as err:
+_log.info(err)
+response = jsonify(error="{}".format(err))
+response.status_code = err.status_code
+return response
+
+return jsonify(task_list)
 
 Review comment:
   Good practice to initialize ```info = None``` just before the try-except, since the return statement depends on it; alternatively, return the info object inside the try-block and let the final return statement return None.
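   The pattern the review describes can be sketched as follows; `fetch_tasks` is a hypothetical stand-in for the endpoint's lookup helpers, not Airflow's actual API:

```python
# Bind the name before the try-block so the final return never hits an
# unbound local when the lookup raises.
def fetch_tasks(dag_id):
    # Hypothetical helper: raises for unknown DAG ids.
    if dag_id != "example_dag":
        raise KeyError(dag_id)
    return ["extract", "load"]

def list_tasks(dag_id):
    task_list = []  # initialized up front, so it always exists
    try:
        task_list = fetch_tasks(dag_id)
    except KeyError:
        pass  # real code would build an error response here
    return task_list

assert list_tasks("example_dag") == ["extract", "load"]
assert list_tasks("missing") == []
```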




[GitHub] [airflow] alejandrounravel commented on a change in pull request #5118: [AIRFLOW-4315] Add monitoring API's to airflow

2019-05-01 Thread GitBox
alejandrounravel commented on a change in pull request #5118: [AIRFLOW-4315] 
Add monitoring API's to airflow
URL: https://github.com/apache/airflow/pull/5118#discussion_r280167143
 
 

 ##
 File path: airflow/www/api/experimental/endpoints.py
 ##
 @@ -41,6 +47,124 @@
 api_experimental = Blueprint('api_experimental', __name__)
 
 
+@api_experimental.route(
+
'/dags//dag_runs//tasks//logs',
+methods=['GET'])
+@requires_authentication
+def logs(dag_id, execution_date, task_id):
+"""
+Return logs for the specified task identified by dag_id, execution_date 
and task_id
+"""
+try:
+log = get_task_logs(dag_id, task_id, execution_date)
+except AirflowException as err:
+_log.info(err)
+response = jsonify(error="{}".format(err))
+response.status_code = err.status_code
+return response
+except ValueError:
+error_message = (
+'Given execution date, {}, could not be identified '
+'as a date. Example date format: 2015-11-16T14:34:15+00:00'
+.format(execution_date))
+response = jsonify({'error': error_message})
+response.status_code = 400
+return response
+except AttributeError as e:
+error_message = ["Unable to read logs.\n{}\n".format(str(e))]
+metadata = {}
+metadata['end_of_log'] = True
+return jsonify(message=error_message, error=True, metadata=metadata)
+
+return log
+
+
+@api_experimental.route('/dag_runs', methods=['GET'])
+@requires_authentication
+def dag_runs_filter():
+"""
+Return the list of all dag_runs
+:query param state: a query string parameter 
'?state=queued|running|success...'
+:query param state_not_equal: a query string parameter 
'?state_not_equal=queued|running|success...'
+:query param execution_date_before: a query string parameter to find all 
runs before provided date,
+should be in format "YYYY-mm-DDTHH:MM:SS", for example: "2016-11-16T11:34:15"
+:query param execution_date_after: a query string parameter to find all 
runs after provided date,
+should be in format "YYYY-mm-DDTHH:MM:SS", for example: "2016-11-16T11:34:15"
+:query param dag_id: String identifier of a DAG
+:return: List of DAG runs of a DAG with the requested state.
+"""
+state = request.args.get('state')
+state_not_equal = request.args.get('state_not_equal')
+execution_date_before = request.args.get('execution_date_before')
+execution_date_after = request.args.get('execution_date_after')
+dag_id = request.args.get('dag_id')
+
+dagruns = get_all_dag_runs(dag_id=dag_id, state=state, 
state_not_equal=state_not_equal,
+   execution_date_before=execution_date_before,
+   execution_date_after=execution_date_after)
+
+return jsonify(dagruns)
+
+
+@api_experimental.route('/task_instances', methods=['GET'])
+@requires_authentication
+def task_instances_filter():
+"""
+Return the list of all task instances
+:query param state: a query string parameter 
'?state=queued|running|success...'
+:query param state_not_equal: a query string parameter 
'?state_not_equal=queued|running|success...'
+:query param execution_date_before: a query string parameter to find all 
runs before provided date,
+should be in format "YYYY-mm-DDTHH:MM:SS", for example: "2016-11-16T11:34:15".'
+:query param execution_date_after: a query string parameter to find all 
runs after provided date,
+should be in format "YYYY-mm-DDTHH:MM:SS", for example: "2016-11-16T11:34:15".'
+:query param dag_id: String identifier of a DAG
+:query param task_id: String identifier of a task
+:return: List of task instances
+"""
+state = request.args.get('state')
+state_not_equal = request.args.get('state_not_equal')
+execution_date_before = request.args.get('execution_date_before')
+execution_date_after = request.args.get('execution_date_after')
+dag_id = request.args.get('dag_id')
+task_id = request.args.get('task_id')
+
+task_instances = get_all_task_instances(dag_id=dag_id, state=state, 
state_not_equal=state_not_equal,
+
execution_date_before=execution_date_before,
+
execution_date_after=execution_date_after, task_id=task_id)
+
+return jsonify(task_instances)
+
+
+@api_experimental.route('/dags', methods=['GET'])
+@requires_authentication
+def get_all_dags():
+"""
+Returns a list of Dags
+:query param is_paused: a query string parameter '?is_paused=true|false'
+:return: List of all DAGs
+"""
+is_paused = request.args.get('is_paused')
+dag_list = get_dags(is_paused)
+
+return jsonify(dag_list)
+
+
+@api_experimental.route('/dags/', methods=['GET'])
+@requires_authentication
+def get_dag_info(dag_id):
+"""
+Returns information for a single dag
+"""
+try:
+   

[GitHub] [airflow] alejandrounravel commented on a change in pull request #5118: [AIRFLOW-4315] Add monitoring API's to airflow

2019-05-01 Thread GitBox
alejandrounravel commented on a change in pull request #5118: [AIRFLOW-4315] 
Add monitoring API's to airflow
URL: https://github.com/apache/airflow/pull/5118#discussion_r280166583
 
 

 ##
 File path: airflow/www/api/experimental/endpoints.py
 ##
 @@ -150,23 +284,39 @@ def get_dag_code(dag_id):
 return response
 
 
+@api_experimental.route('/dags//tasks', methods=['GET'])
+@requires_authentication
+def tasks(dag_id):
+"""Returns a JSON with all tasks associated with the dag_id. """
+try:
+task_list = list()
+task_ids = get_tasks(dag_id)
+for task_id in task_ids:
+task_list.append(get_task_as_dict(dag_id, task_id))
+
+except AirflowException as err:
+_log.info(err)
+response = jsonify(error="{}".format(err))
+response.status_code = err.status_code
+return response
+
+return jsonify(task_list)
 
 Review comment:
   Good practice to initialize ```task_list = list()``` just before the try-except, since the return statement depends on it.




[GitHub] [airflow] alejandrounravel commented on a change in pull request #5118: [AIRFLOW-4315] Add monitoring API's to airflow

2019-05-01 Thread GitBox
alejandrounravel commented on a change in pull request #5118: [AIRFLOW-4315] 
Add monitoring API's to airflow
URL: https://github.com/apache/airflow/pull/5118#discussion_r280166358
 
 

 ##
 File path: airflow/www/api/experimental/endpoints.py
 ##
 @@ -150,23 +284,39 @@ def get_dag_code(dag_id):
 return response
 
 
+@api_experimental.route('/dags//tasks', methods=['GET'])
+@requires_authentication
+def tasks(dag_id):
+"""Returns a JSON with all tasks associated with the dag_id. """
+try:
+task_list = list()
+task_ids = get_tasks(dag_id)
+for task_id in task_ids:
+task_list.append(get_task_as_dict(dag_id, task_id))
+
+except AirflowException as err:
+_log.info(err)
+response = jsonify(error="{}".format(err))
 
 Review comment:
  Can simplify this to just ```jsonify(error=error_message)```




[GitHub] [airflow] alejandrounravel commented on a change in pull request #5118: [AIRFLOW-4315] Add monitoring API's to airflow

2019-05-01 Thread GitBox
alejandrounravel commented on a change in pull request #5118: [AIRFLOW-4315] 
Add monitoring API's to airflow
URL: https://github.com/apache/airflow/pull/5118#discussion_r280165948
 
 

 ##
 File path: airflow/www/api/experimental/endpoints.py
 ##
 @@ -255,18 +413,14 @@ def dag_run_status(dag_id, execution_date):
 _log.info(error_message)
 response = jsonify({'error': error_message})
 response.status_code = 400
-
 return response
 
-try:
-info = get_dag_run_state(dag_id, execution_date)
-except AirflowException as err:
-_log.info(err)
-response = jsonify(error="{}".format(err))
-response.status_code = err.status_code
+if not dagruns:
+error_message = "No Dag run found with provided execution date"
+response = jsonify(error="{}".format(error_message))
 
 Review comment:
  Can simplify this to just ```jsonify(error=error_message)```




[GitHub] [airflow] pkrish commented on issue #5213: AIRFLOW-4174 Fix run with backoff

2019-05-01 Thread GitBox
pkrish commented on issue #5213: AIRFLOW-4174 Fix run with backoff
URL: https://github.com/apache/airflow/pull/5213#issuecomment-488367798
 
 
   Ran into this issue as well. When using run_with_advanced_retry, response is 
always None. 




[GitHub] [airflow] cong-zhu commented on issue #5177: [AIRFLOW-4084][WIP] Fix bug downloading incomplete logs from ElasticSearch

2019-05-01 Thread GitBox
cong-zhu commented on issue #5177: [AIRFLOW-4084][WIP] Fix bug downloading 
incomplete logs from ElasticSearch
URL: https://github.com/apache/airflow/pull/5177#issuecomment-488361791
 
 
   @KevinYang21 Yes, we have a unit test for this log download feature, though it doesn't test against ElasticSearch. CI failed because of the Python version. I marked this PR as `WIP`.




[GitHub] [airflow] kurtqq commented on issue #4610: [AIRFLOW-3783] Switch to HEADER option being provided already by Redshift

2019-05-01 Thread GitBox
kurtqq commented on issue #4610: [AIRFLOW-3783] Switch to HEADER option being 
provided already by Redshift
URL: https://github.com/apache/airflow/pull/4610#issuecomment-488361296
 
 
   @ashb  to answer your question - yes.
   "Versions 1.0.3945, 1.0.4081, 1.0.4222
   Time period: September 19–October 10, 2018
   Features and Improvements
   Amazon Redshift: You can specify the HEADER option in the UNLOAD command 
to add a specific header row to each created output file. The header row 
contains the column names created by the unload query.
   "
   
https://docs.aws.amazon.com/redshift/latest/mgmt/rs-mgmt-cluster-version-notes.html
   
   also info on stackoverflow (see fez answer):
   
https://stackoverflow.com/questions/24681214/unloading-from-redshift-to-s3-with-headers
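   The changelog entry above can be sketched as follows; the bucket, table, and IAM role values are hypothetical, and the statement shape is an assumption based on Redshift's documented UNLOAD syntax:

```python
# Build an UNLOAD statement that lets Redshift emit the header row itself
# (the HEADER option) instead of UNION-ing column names into the query.
def build_unload_sql(select_query, s3_path, iam_role):
    return (
        "UNLOAD ('{query}')\n"
        "TO '{path}'\n"
        "IAM_ROLE '{role}'\n"
        "HEADER\n"
        "PARALLEL OFF;"
    ).format(
        # Single quotes inside the query must be doubled within UNLOAD's
        # quoted string.
        query=select_query.replace("'", "''"),
        path=s3_path,
        role=iam_role,
    )

sql = build_unload_sql("SELECT id, name FROM users",
                       "s3://my-bucket/export/users_",
                       "arn:aws:iam::123456789012:role/redshift-unload")
print(sql)
```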




[GitHub] [airflow] pingzh commented on issue #5218: [AIRFLOW-4447] Display task duration as human friendly format in UI

2019-05-01 Thread GitBox
pingzh commented on issue #5218: [AIRFLOW-4447] Display task duration as human 
friendly format in UI
URL: https://github.com/apache/airflow/pull/5218#issuecomment-488354904
 
 
   @BasPH thanks. could you please merge the pr?




[GitHub] [airflow] codecov-io edited a comment on issue #5220: [AIRFLOW-4450] has_dag_access does not handle form parameters

2019-05-01 Thread GitBox
codecov-io edited a comment on issue #5220: [AIRFLOW-4450] has_dag_access does 
not handle form parameters
URL: https://github.com/apache/airflow/pull/5220#issuecomment-488171889
 
 
   # [Codecov](https://codecov.io/gh/apache/airflow/pull/5220?src=pr=h1) 
Report
   > Merging 
[#5220](https://codecov.io/gh/apache/airflow/pull/5220?src=pr=desc) into 
[master](https://codecov.io/gh/apache/airflow/commit/6a70ed65878abfce953488b3cd747456cc22af30?src=pr=desc)
 will **not change** coverage.
   > The diff coverage is `100%`.
   
   [![Impacted file tree 
graph](https://codecov.io/gh/apache/airflow/pull/5220/graphs/tree.svg?width=650=WdLKlKHOAU=150=pr)](https://codecov.io/gh/apache/airflow/pull/5220?src=pr=tree)
   
   ```diff
    @@           Coverage Diff           @@
    ##           master    #5220   +/-   ##
    =======================================
      Coverage   78.55%   78.55%
    =======================================
      Files         469      469
      Lines       29983    29983
    =======================================
      Hits        23552    23552
      Misses       6431     6431
   ```
   
   
   | [Impacted 
Files](https://codecov.io/gh/apache/airflow/pull/5220?src=pr=tree) | 
Coverage Δ | |
   |---|---|---|
   | 
[airflow/www/decorators.py](https://codecov.io/gh/apache/airflow/pull/5220/diff?src=pr=tree#diff-YWlyZmxvdy93d3cvZGVjb3JhdG9ycy5weQ==)
 | `74.5% <100%> (ø)` | :arrow_up: |
   
   --
   
   [Continue to review full report at 
Codecov](https://codecov.io/gh/apache/airflow/pull/5220?src=pr=continue).
   > **Legend** - [Click here to learn 
more](https://docs.codecov.io/docs/codecov-delta)
   > `Δ = absolute  (impact)`, `ø = not affected`, `? = missing data`
   > Powered by 
[Codecov](https://codecov.io/gh/apache/airflow/pull/5220?src=pr=footer). 
Last update 
[6a70ed6...a708148](https://codecov.io/gh/apache/airflow/pull/5220?src=pr=lastupdated).
 Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments).
   




[GitHub] [airflow] chrismclennon commented on issue #5220: [AIRFLOW-4450] has_dag_access does not handle form parameters

2019-05-01 Thread GitBox
chrismclennon commented on issue #5220: [AIRFLOW-4450] has_dag_access does not 
handle form parameters
URL: https://github.com/apache/airflow/pull/5220#issuecomment-488331873
 
 
   > Use flask built-in for this
   
   Great catch. I made the change and squashed commits.
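   The "flask built-in" referred to here is presumably `request.values`, which merges query-string args and form data (with args taking precedence). A stand-alone sketch of that merge order, without Flask (the dict arguments are hypothetical stand-ins for `request.args` and `request.form`):

```python
# Model the lookup a decorator like has_dag_access needs: dag_id may
# arrive in the query string or the POST form, so one merged lookup
# covers both.
def combined_lookup(args, form, key, default=None):
    if key in args:  # Flask's request.values consults args first
        return args[key]
    return form.get(key, default)

assert combined_lookup({"dag_id": "a"}, {}, "dag_id") == "a"
assert combined_lookup({}, {"dag_id": "b"}, "dag_id") == "b"
assert combined_lookup({}, {}, "dag_id") is None
```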




[GitHub] [airflow] andriisoldatenko commented on a change in pull request #4751: [AIRFLOW-3607] collected trigger rule dep check per dag run

2019-05-01 Thread GitBox
andriisoldatenko commented on a change in pull request #4751: [AIRFLOW-3607] 
collected trigger rule dep check per dag run
URL: https://github.com/apache/airflow/pull/4751#discussion_r280125265
 
 

 ##
 File path: airflow/ti_deps/deps/trigger_rule_dep.py
 ##
 @@ -49,33 +50,46 @@ def _get_dep_statuses(self, ti, session, dep_context):
 yield self._passing_status(reason="The task had a dummy trigger 
rule set.")
 return
 
-# TODO(unknown): this query becomes quite expensive with dags that 
have many
-# tasks. It should be refactored to let the task report to the dag run 
and get the
-# aggregates from there.
-qry = (
-session
-.query(
-func.coalesce(func.sum(
-case([(TI.state == State.SUCCESS, 1)], else_=0)), 0),
-func.coalesce(func.sum(
-case([(TI.state == State.SKIPPED, 1)], else_=0)), 0),
-func.coalesce(func.sum(
-case([(TI.state == State.FAILED, 1)], else_=0)), 0),
-func.coalesce(func.sum(
-case([(TI.state == State.UPSTREAM_FAILED, 1)], else_=0)), 
0),
-func.count(TI.task_id),
+successes, skipped, failed, upstream_failed, done = 0, 0, 0, 0, 0
+if dep_context.finished_tasks is None:
+qry = (
+session
+.query(
+func.coalesce(func.sum(
+case([(TI.state == State.SUCCESS, 1)], else_=0)), 0),
+func.coalesce(func.sum(
+case([(TI.state == State.SKIPPED, 1)], else_=0)), 0),
+func.coalesce(func.sum(
+case([(TI.state == State.FAILED, 1)], else_=0)), 0),
+func.coalesce(func.sum(
+case([(TI.state == State.UPSTREAM_FAILED, 1)], 
else_=0)), 0),
+func.count(TI.task_id),
+)
+.filter(
+TI.dag_id == ti.dag_id,
+TI.task_id.in_(ti.task.upstream_task_ids),
+TI.execution_date == ti.execution_date,
+TI.state.in_(State.finished()),
+)
 )
-.filter(
-TI.dag_id == ti.dag_id,
-TI.task_id.in_(ti.task.upstream_task_ids),
-TI.execution_date == ti.execution_date,
-TI.state.in_([
-State.SUCCESS, State.FAILED,
-State.UPSTREAM_FAILED, State.SKIPPED]),
-)
-)
+successes, skipped, failed, upstream_failed, done = qry.first()
+else:
+# see if the task name is in the task upstream for our task
+upstream_tasks = [finished_task for finished_task in 
dep_context.finished_tasks
+  if finished_task.task_id in 
ti.task.upstream_task_ids]
+if upstream_tasks:
+upstream_tasks_sorted = sorted(upstream_tasks, key=lambda x: 
x.state)
 
 Review comment:
   I think we need more tests here too ^




[GitHub] [airflow] andriisoldatenko commented on a change in pull request #4751: [AIRFLOW-3607] collected trigger rule dep check per dag run

2019-05-01 Thread GitBox
andriisoldatenko commented on a change in pull request #4751: [AIRFLOW-3607] 
collected trigger rule dep check per dag run
URL: https://github.com/apache/airflow/pull/4751#discussion_r280125168
 
 

 ##
 File path: airflow/ti_deps/deps/trigger_rule_dep.py
 ##
 @@ -49,33 +50,46 @@ def _get_dep_statuses(self, ti, session, dep_context):
 yield self._passing_status(reason="The task had a dummy trigger 
rule set.")
 return
 
-# TODO(unknown): this query becomes quite expensive with dags that 
have many
-# tasks. It should be refactored to let the task report to the dag run 
and get the
-# aggregates from there.
-qry = (
-session
-.query(
-func.coalesce(func.sum(
-case([(TI.state == State.SUCCESS, 1)], else_=0)), 0),
-func.coalesce(func.sum(
-case([(TI.state == State.SKIPPED, 1)], else_=0)), 0),
-func.coalesce(func.sum(
-case([(TI.state == State.FAILED, 1)], else_=0)), 0),
-func.coalesce(func.sum(
-case([(TI.state == State.UPSTREAM_FAILED, 1)], else_=0)), 
0),
-func.count(TI.task_id),
+successes, skipped, failed, upstream_failed, done = 0, 0, 0, 0, 0
+if dep_context.finished_tasks is None:
+qry = (
+session
+.query(
+func.coalesce(func.sum(
+case([(TI.state == State.SUCCESS, 1)], else_=0)), 0),
+func.coalesce(func.sum(
+case([(TI.state == State.SKIPPED, 1)], else_=0)), 0),
+func.coalesce(func.sum(
+case([(TI.state == State.FAILED, 1)], else_=0)), 0),
+func.coalesce(func.sum(
+case([(TI.state == State.UPSTREAM_FAILED, 1)], 
else_=0)), 0),
+func.count(TI.task_id),
+)
+.filter(
+TI.dag_id == ti.dag_id,
+TI.task_id.in_(ti.task.upstream_task_ids),
+TI.execution_date == ti.execution_date,
+TI.state.in_(State.finished()),
+)
 )
-.filter(
-TI.dag_id == ti.dag_id,
-TI.task_id.in_(ti.task.upstream_task_ids),
-TI.execution_date == ti.execution_date,
-TI.state.in_([
-State.SUCCESS, State.FAILED,
-State.UPSTREAM_FAILED, State.SKIPPED]),
-)
-)
+successes, skipped, failed, upstream_failed, done = qry.first()
+else:
+# see if the task name is in the task upstream for our task
+upstream_tasks = [finished_task for finished_task in 
dep_context.finished_tasks
+  if finished_task.task_id in 
ti.task.upstream_task_ids]
+if upstream_tasks:
 
 Review comment:
   I was wondering: before this change we had only a SQL query, but now it's both Python and SQL.




[GitHub] [airflow] r-richmond commented on issue #5206: [AIRFLOW-4434] fix support for impala in hive hook

2019-05-01 Thread GitBox
r-richmond commented on issue #5206: [AIRFLOW-4434] fix support for impala in 
hive hook
URL: https://github.com/apache/airflow/pull/5206#issuecomment-488318329
 
 
   Thanks @ashb. Also, looking at past releases, it looks like 1.10.4 will land around June; is that roughly correct?




[jira] [Comment Edited] (AIRFLOW-1013) airflow/jobs.py:manage_slas() exception for @once dag

2019-05-01 Thread Xiaodong DENG (JIRA)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-1013?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16831028#comment-16831028
 ] 

Xiaodong DENG edited comment on AIRFLOW-1013 at 5/1/19 2:36 PM:


Seems so, and it didn't get enough attention.

Shall I close this ticket as duplicate of 
https://issues.apache.org/jira/browse/AIRFLOW-4390 (I will track everything 
there)? [~ash]


was (Author: xd-deng):
Seems so, and it didn't get enough attention.

Shall I close this ticket as duplicate of 
https://issues.apache.org/jira/browse/AIRFLOW-4390 ? [~ash]

> airflow/jobs.py:manage_slas() exception for @once dag
> -
>
> Key: AIRFLOW-1013
> URL: https://issues.apache.org/jira/browse/AIRFLOW-1013
> Project: Apache Airflow
>  Issue Type: Bug
>Affects Versions: 1.8.0, 1.8.1, 1.8.2
>Reporter: Ruslan Dautkhanov
>Assignee: Muhammad Ahmmad
>Priority: Critical
>  Labels: dagrun, once, scheduler, sla
> Fix For: 1.10.0
>
>
> Getting following exception 
> {noformat}
> [2017-03-19 20:16:25,786] {jobs.py:354} DagFileProcessor2638 ERROR - Got an 
> exception! Propagating...
> Traceback (most recent call last):
>   File 
> "/opt/cloudera/parcels/Anaconda/lib/python2.7/site-packages/airflow/jobs.py", 
> line 346, in helper
> pickle_dags)
>   File 
> "/opt/cloudera/parcels/Anaconda/lib/python2.7/site-packages/airflow/utils/db.py",
>  line 53, in wrapper
> result = func(*args, **kwargs)
>   File 
> "/opt/cloudera/parcels/Anaconda/lib/python2.7/site-packages/airflow/jobs.py", 
> line 1581, in process_file
> self._process_dags(dagbag, dags, ti_keys_to_schedule)
>   File 
> "/opt/cloudera/parcels/Anaconda/lib/python2.7/site-packages/airflow/jobs.py", 
> line 1175, in _process_dags
> self.manage_slas(dag)
>   File 
> "/opt/cloudera/parcels/Anaconda/lib/python2.7/site-packages/airflow/utils/db.py",
>  line 53, in wrapper
> result = func(*args, **kwargs)
>   File 
> "/opt/cloudera/parcels/Anaconda/lib/python2.7/site-packages/airflow/jobs.py", 
> line 595, in manage_slas
> while dttm < datetime.now():
> TypeError: can't compare datetime.datetime to NoneType
> {noformat}
> Exception is in airflow/jobs.py:manage_slas() :
> https://github.com/apache/incubator-airflow/blob/v1-8-stable/airflow/jobs.py#L595
> {code}
> ts = datetime.now()
> SlaMiss = models.SlaMiss
> for ti in max_tis:
> task = dag.get_task(ti.task_id)
> dttm = ti.execution_date
> if task.sla:
> dttm = dag.following_schedule(dttm)
>   >>>   while dttm < datetime.now():  <<< here
> following_schedule = dag.following_schedule(dttm)
> if following_schedule + task.sla < datetime.now():
> session.merge(models.SlaMiss(
> task_id=ti.task_id,
> {code}
> It seems that dag.following_schedule() returns None for @once dag?
> Here's how dag is defined:
> {code}
> main_dag = DAG(
> dag_id = 'DISCOVER-Oracle-Load',
> default_args   = default_args,   
> user_defined_macros= dag_macros,   
> start_date = datetime.now(), 
> catchup= False,  
> schedule_interval  = '@once',
> concurrency= 2,  
> max_active_runs= 1,  
> dagrun_timeout = timedelta(days=4),  
> )
> {code}
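A minimal sketch of the failure mode above and a possible guard; the helper names and backstop window are hypothetical, not Airflow's actual fix:

```python
# For an '@once' schedule there is no "next" run, so following_schedule
# returns None and the `dttm < datetime.now()` comparison raises
# TypeError. Guarding the comparison avoids the crash.
from datetime import datetime, timedelta

def following_schedule(dttm, schedule_interval):
    if schedule_interval == "@once":
        return None  # one-shot DAGs have no following run
    return dttm + timedelta(hours=1)  # stand-in for an hourly schedule

def sla_windows(start, schedule_interval):
    windows = []
    dttm = following_schedule(start, schedule_interval)
    # Check for None before comparing, instead of crashing:
    while dttm is not None and dttm < start + timedelta(hours=3):
        windows.append(dttm)
        dttm = following_schedule(dttm, schedule_interval)
    return windows

assert sla_windows(datetime(2017, 3, 19), "@once") == []
assert len(sla_windows(datetime(2017, 3, 19), "@hourly")) == 2
```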



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (AIRFLOW-1013) airflow/jobs.py:manage_slas() exception for @once dag

2019-05-01 Thread Ash Berlin-Taylor (JIRA)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-1013?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16831032#comment-16831032
 ] 

Ash Berlin-Taylor commented on AIRFLOW-1013:


Yeah probably.






[jira] [Closed] (AIRFLOW-1013) airflow/jobs.py:manage_slas() exception for @once dag

2019-05-01 Thread Xiaodong DENG (JIRA)


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-1013?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Xiaodong DENG closed AIRFLOW-1013.
--
Resolution: Duplicate

 This issue has been noted and will be tracked under 
https://issues.apache.org/jira/browse/AIRFLOW-4390 from now on.






[jira] [Resolved] (AIRFLOW-2340) SQLalchemy pessimistic connection handling not working

2019-05-01 Thread Ash Berlin-Taylor (JIRA)


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-2340?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Ash Berlin-Taylor resolved AIRFLOW-2340.

Resolution: Duplicate

I think this issue is fixed by AIRFLOW-2703, which is in 1.10.2

> SQLalchemy pessimistic connection handling not working
> --
>
> Key: AIRFLOW-2340
> URL: https://issues.apache.org/jira/browse/AIRFLOW-2340
> Project: Apache Airflow
>  Issue Type: Bug
>  Components: scheduler
>Affects Versions: 1.9.0
>Reporter: John Arnold
>Priority: Critical
> Attachments: airflow_traceback.txt, webserver.txt
>
>
> Our scheduler keeps crashing, about once a day.  It seems to be triggered by 
> a failure to connect to the postgresql database, but then it doesn't recover 
> and crashes the scheduler over and over.
> The scheduler runs in a container in our environment, so after several 
> container restarts, docker gives up and the container stays down.
> Airflow should be able to recover from a connection failure without blowing 
> up the container altogether.  Perhaps some exponential backoff is needed?
>  
> See attached log from the scheduler.
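The exponential backoff the reporter suggests can be sketched with the standard library alone. The names here are illustrative, not Airflow's actual API (Airflow itself later gained pessimistic disconnect handling via SQLAlchemy in AIRFLOW-2703):

```python
import time

def call_with_backoff(fn, max_attempts=5, base_delay=1.0, sleep=time.sleep):
    """Retry a flaky call with exponential backoff instead of letting one
    failed DB connection take the whole scheduler process down."""
    for attempt in range(max_attempts):
        try:
            return fn()
        except ConnectionError:
            if attempt == max_attempts - 1:
                raise  # exhausted: surface the real error
            sleep(base_delay * (2 ** attempt))  # 1s, 2s, 4s, ...
```

The `sleep` parameter is injected so the delay schedule can be tested without actually sleeping.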





[jira] [Closed] (AIRFLOW-508) Can't delete task instance in sqlite

2019-05-01 Thread Ash Berlin-Taylor (JIRA)


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-508?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Ash Berlin-Taylor closed AIRFLOW-508.
-
Resolution: Cannot Reproduce

> Can't delete task instance in sqlite
> 
>
> Key: AIRFLOW-508
> URL: https://issues.apache.org/jira/browse/AIRFLOW-508
> Project: Apache Airflow
>  Issue Type: Bug
>Affects Versions: 1.7.1.3
>Reporter: Li Xuanji
>Assignee: Li Xuanji
>Priority: Minor
>






[jira] [Comment Edited] (AIRFLOW-1013) airflow/jobs.py:manage_slas() exception for @once dag

2019-05-01 Thread Xiaodong DENG (JIRA)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-1013?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16831028#comment-16831028
 ] 

Xiaodong DENG edited comment on AIRFLOW-1013 at 5/1/19 2:37 PM:


Seems so, and it didn't get enough attention.

Shall I close this ticket as duplicate of 
https://issues.apache.org/jira/browse/AIRFLOW-4390 (I will track everything 
there)? [~ash]


was (Author: xd-deng):
Seems yet, and didn't get enough attention.

Shall I close this ticket as duplicate of 
https://issues.apache.org/jira/browse/AIRFLOW-4390 (I will track everything 
there)? [~ash]

> airflow/jobs.py:manage_slas() exception for @once dag
> -
>
> Key: AIRFLOW-1013
> URL: https://issues.apache.org/jira/browse/AIRFLOW-1013
> Project: Apache Airflow
>  Issue Type: Bug
>Affects Versions: 1.8.0, 1.8.1, 1.8.2
>Reporter: Ruslan Dautkhanov
>Assignee: Muhammad Ahmmad
>Priority: Critical
>  Labels: dagrun, once, scheduler, sla
> Fix For: 1.10.0
>
>
> Getting following exception 
> {noformat}
> [2017-03-19 20:16:25,786] {jobs.py:354} DagFileProcessor2638 ERROR - Got an 
> exception! Propagating...
> Traceback (most recent call last):
>   File 
> "/opt/cloudera/parcels/Anaconda/lib/python2.7/site-packages/airflow/jobs.py", 
> line 346, in helper
> pickle_dags)
>   File 
> "/opt/cloudera/parcels/Anaconda/lib/python2.7/site-packages/airflow/utils/db.py",
>  line 53, in wrapper
> result = func(*args, **kwargs)
>   File 
> "/opt/cloudera/parcels/Anaconda/lib/python2.7/site-packages/airflow/jobs.py", 
> line 1581, in process_file
> self._process_dags(dagbag, dags, ti_keys_to_schedule)
>   File 
> "/opt/cloudera/parcels/Anaconda/lib/python2.7/site-packages/airflow/jobs.py", 
> line 1175, in _process_dags
> self.manage_slas(dag)
>   File 
> "/opt/cloudera/parcels/Anaconda/lib/python2.7/site-packages/airflow/utils/db.py",
>  line 53, in wrapper
> result = func(*args, **kwargs)
>   File 
> "/opt/cloudera/parcels/Anaconda/lib/python2.7/site-packages/airflow/jobs.py", 
> line 595, in manage_slas
> while dttm < datetime.now():
> TypeError: can't compare datetime.datetime to NoneType
> {noformat}
> Exception is in airflow/jobs.py:manage_slas() :
> https://github.com/apache/incubator-airflow/blob/v1-8-stable/airflow/jobs.py#L595
> {code}
> ts = datetime.now()
> SlaMiss = models.SlaMiss
> for ti in max_tis:
> task = dag.get_task(ti.task_id)
> dttm = ti.execution_date
> if task.sla:
> dttm = dag.following_schedule(dttm)
>   >>>   while dttm < datetime.now():  <<< here
> following_schedule = dag.following_schedule(dttm)
> if following_schedule + task.sla < datetime.now():
> session.merge(models.SlaMiss(
> task_id=ti.task_id,
> {code}
> It seems that dag.following_schedule() returns None for @once dag?
> Here's how dag is defined:
> {code}
> main_dag = DAG(
> dag_id = 'DISCOVER-Oracle-Load',
> default_args   = default_args,   
> user_defined_macros= dag_macros,   
> start_date = datetime.now(), 
> catchup= False,  
> schedule_interval  = '@once',
> concurrency= 2,  
> max_active_runs= 1,  
> dagrun_timeout = timedelta(days=4),  
> )
> {code}





[jira] [Closed] (AIRFLOW-245) Access to task instance from custom Executor

2019-05-01 Thread Ash Berlin-Taylor (JIRA)


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-245?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Ash Berlin-Taylor closed AIRFLOW-245.
-
Resolution: Won't Fix

No longer relevant. If you want a pattern to copy that doesn't need Airflow 
installed on the remote end, look at the KubeExecutor.

> Access to task instance from custom Executor
> 
>
> Key: AIRFLOW-245
> URL: https://issues.apache.org/jira/browse/AIRFLOW-245
> Project: Apache Airflow
>  Issue Type: Improvement
>  Components: executor
>Reporter: Alexandr Nikitin
>Priority: Major
>
> I'm writing a custom executor that executes tasks on mesos and I want to have 
> access to task instances from it. So that I can reuse all existing operators 
> e.g. DockerOperator and access its fields like image, command, volumes and 
> transform them to mesos.
> This can be done by changing `def execute_async(self, key, command, 
> queue=None):` in `BaseExecutor`.
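The proposed signature change can be sketched as follows; the class and field names are illustrative (modelled on `DockerOperator`-style attributes), not Airflow's actual `BaseExecutor` API:

```python
from types import SimpleNamespace

class MesosLikeExecutor:
    """Sketch of the proposal: execute_async receives the task instance,
    so a custom executor can read operator fields such as `image` or
    `command` and map them onto its own launch API."""

    def __init__(self):
        self.launched = []

    def execute_async(self, key, command, queue=None, task_instance=None):
        # With the task instance in hand, operator attributes are reachable:
        image = getattr(task_instance, "image", None)
        self.launched.append((key, command, image))

# A stand-in task instance carrying DockerOperator-like fields:
ti = SimpleNamespace(image="python:3.7", command="echo hi")
executor = MesosLikeExecutor()
executor.execute_async(key=("dag", "task"), command="airflow run ...", task_instance=ti)
```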





[jira] [Updated] (AIRFLOW-4390) manage_slas() needs refactoring

2019-05-01 Thread Xiaodong DENG (JIRA)


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-4390?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Xiaodong DENG updated AIRFLOW-4390:
---
Affects Version/s: 1.8.0
   1.8.1
   1.8.2

> manage_slas() needs refactoring
> ---
>
> Key: AIRFLOW-4390
> URL: https://issues.apache.org/jira/browse/AIRFLOW-4390
> Project: Apache Airflow
>  Issue Type: Improvement
>  Components: scheduler
>Affects Versions: 1.8.0, 1.8.1, 1.8.2, 1.10.3, 1.10.4
>Reporter: Xiaodong DENG
>Assignee: Xiaodong DENG
>Priority: Blocker
> Fix For: 2.0.0
>
>
> As analyzed in 
> https://issues.apache.org/jira/browse/AIRFLOW-4297?focusedCommentId=16822945=com.atlassian.jira.plugin.system.issuetabpanels%3Acomment-tabpanel#comment-16822945
>  , current implementation of manage_slas() has a few flaws, and is not 
> functioning in some cases.
> We need to refactor it somehow.
> As discussed in [https://github.com/apache/airflow/pull/5150] , we put it as 
> a blocker for 2.0.0.





[jira] [Commented] (AIRFLOW-4390) manage_slas() needs refactoring

2019-05-01 Thread Xiaodong DENG (JIRA)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-4390?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16831029#comment-16831029
 ] 

Xiaodong DENG commented on AIRFLOW-4390:


A related issue was reported a long time ago in 
https://issues.apache.org/jira/browse/AIRFLOW-1013

> manage_slas() needs refactoring
> ---
>
> Key: AIRFLOW-4390
> URL: https://issues.apache.org/jira/browse/AIRFLOW-4390
> Project: Apache Airflow
>  Issue Type: Improvement
>  Components: scheduler
>Affects Versions: 1.10.3, 1.10.4
>Reporter: Xiaodong DENG
>Assignee: Xiaodong DENG
>Priority: Blocker
> Fix For: 2.0.0
>
>





[jira] [Commented] (AIRFLOW-1013) airflow/jobs.py:manage_slas() exception for @once dag

2019-05-01 Thread Ash Berlin-Taylor (JIRA)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-1013?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16831023#comment-16831023
 ] 

Ash Berlin-Taylor commented on AIRFLOW-1013:


Paging [~xd.den...@gmail.com] - looks like the SLA issue for {{@once}} dags has 
been around for a while.

> airflow/jobs.py:manage_slas() exception for @once dag
> -
>
> Key: AIRFLOW-1013
> URL: https://issues.apache.org/jira/browse/AIRFLOW-1013
> Project: Apache Airflow
>  Issue Type: Bug
>Affects Versions: 1.8.0, 1.8.1, 1.8.2
>Reporter: Ruslan Dautkhanov
>Assignee: Muhammad Ahmmad
>Priority: Critical
>  Labels: dagrun, once, scheduler, sla
> Fix For: 1.10.0
>
>





[GitHub] [airflow] galak75 commented on issue #4743: [AIRFLOW-3871] render Operators template fields recursively

2019-05-01 Thread GitBox
galak75 commented on issue #4743: [AIRFLOW-3871] render Operators template 
fields recursively
URL: https://github.com/apache/airflow/pull/4743#issuecomment-488292399
 
 
   > Minor changes to the code.
   > 
   > Sorry it sat un-reviewed for so long @galak75
   > 
   > This needs adding to the docs/ tree somewhere - probably in 
https://airflow.apache.org/concepts.html#id1
   > 
   > I missed your first ping, but please `@ashb` me once you've made these 
changes. (Oh, though I'm not around for most of May.)
   
   @ashb : thank you for the review. I'll rework it pretty soon.


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [airflow] codecov-io commented on issue #5222: Adding new contributor to G Adventures

2019-05-01 Thread GitBox
codecov-io commented on issue #5222: Adding new contributor to G Adventures
URL: https://github.com/apache/airflow/pull/5222#issuecomment-488291475
 
 
   # [Codecov](https://codecov.io/gh/apache/airflow/pull/5222?src=pr=h1) 
Report
   > Merging 
[#5222](https://codecov.io/gh/apache/airflow/pull/5222?src=pr=desc) into 
[master](https://codecov.io/gh/apache/airflow/commit/0ae15d84b7d2aee4119a7432578b4431b13a08ee?src=pr=desc)
 will **decrease** coverage by `<.01%`.
   > The diff coverage is `n/a`.
   
   [![Impacted file tree 
graph](https://codecov.io/gh/apache/airflow/pull/5222/graphs/tree.svg?width=650=WdLKlKHOAU=150=pr)](https://codecov.io/gh/apache/airflow/pull/5222?src=pr=tree)
   
   ```diff
   @@            Coverage Diff             @@
   ##           master    #5222      +/-   ##
   ==========================================
   - Coverage   78.55%   78.54%   -0.01%
   ==========================================
     Files         469      469
     Lines       29983    29983
   ==========================================
   - Hits        23552    23551       -1
   - Misses       6431     6432       +1
   ```
   
   
   | [Impacted 
Files](https://codecov.io/gh/apache/airflow/pull/5222?src=pr=tree) | 
Coverage Δ | |
   |---|---|---|
   | 
[airflow/models/taskinstance.py](https://codecov.io/gh/apache/airflow/pull/5222/diff?src=pr=tree#diff-YWlyZmxvdy9tb2RlbHMvdGFza2luc3RhbmNlLnB5)
 | `92.42% <0%> (-0.18%)` | :arrow_down: |
   
   --
   
   [Continue to review full report at 
Codecov](https://codecov.io/gh/apache/airflow/pull/5222?src=pr=continue).
   > **Legend** - [Click here to learn 
more](https://docs.codecov.io/docs/codecov-delta)
   > `Δ = absolute  (impact)`, `ø = not affected`, `? = missing data`
   > Powered by 
[Codecov](https://codecov.io/gh/apache/airflow/pull/5222?src=pr=footer). 
Last update 
[0ae15d8...a27c933](https://codecov.io/gh/apache/airflow/pull/5222?src=pr=lastupdated).
 Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments).
   




[GitHub] [airflow] ashb merged pull request #5222: Adding new contributor to G Adventures

2019-05-01 Thread GitBox
ashb merged pull request #5222: Adding new contributor to G Adventures
URL: https://github.com/apache/airflow/pull/5222
 
 
   




[jira] [Resolved] (AIRFLOW-4434) AIRFLOW-2463 broke support for impala

2019-05-01 Thread Ash Berlin-Taylor (JIRA)


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-4434?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Ash Berlin-Taylor resolved AIRFLOW-4434.

   Resolution: Fixed
Fix Version/s: (was: 2.0.0)

> AIRFLOW-2463 broke support for impala 
> --
>
> Key: AIRFLOW-4434
> URL: https://issues.apache.org/jira/browse/AIRFLOW-4434
> Project: Apache Airflow
>  Issue Type: Bug
>  Components: hive_hooks
>Affects Versions: 1.10.0, 1.10.1, 1.10.2, 1.10.3
>Reporter: r-richmond
>Assignee: r-richmond
>Priority: Critical
> Fix For: 1.10.4
>
>
> [PR#3405|https://github.com/apache/airflow/pull/3405] added 
> [functionality|https://github.com/apache/airflow/blame/master/airflow/hooks/hive_hooks.py#L800]
>  that made task instance context available for hive queries. Unfortunately 
> this usage of the set command is [not 
> supported|https://impala.apache.org/docs/build/html/topics/impala_set.html] 
> on impala (currently only impala-shell supports set and only for query 
> options not variables like hive).
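The shape of the fix can be sketched like this; the function and parameter names are illustrative, not the hook's real API:

```python
def render_statements(sql, hive_conf=None, set_variables=True):
    """Sketch of a config flag that suppresses the `SET k=v;` preamble
    Airflow prepends for task-instance context, since Impala rejects it."""
    stmts = []
    if set_variables and hive_conf:
        # Hive accepts these; Impala does not, so callers targeting
        # Impala pass set_variables=False.
        stmts += ["SET {}={};".format(k, v) for k, v in sorted(hive_conf.items())]
    stmts.append(sql)
    return "\n".join(stmts)
```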





[jira] [Commented] (AIRFLOW-4434) AIRFLOW-2463 broke support for impala

2019-05-01 Thread ASF subversion and git services (JIRA)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-4434?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16831003#comment-16831003
 ] 

ASF subversion and git services commented on AIRFLOW-4434:
--

Commit c4daf707e3256667791635f7f38c43a27ec3d32a in airflow's branch 
refs/heads/v1-10-test from r-richmond
[ https://gitbox.apache.org/repos/asf?p=airflow.git;h=c4daf70 ]

[AIRFLOW-4434] Support Impala with the HiveServer2Hook (#5206)

Fix support for impala in hive hook through a config option
that turns off set variable commands. Breaking changes were
introduced in [AIRFLOW-2463].

(cherry picked from commit 0ae15d84b7d2aee4119a7432578b4431b13a08ee)


> AIRFLOW-2463 broke support for impala 
> --
>
> Key: AIRFLOW-4434
> URL: https://issues.apache.org/jira/browse/AIRFLOW-4434
> Project: Apache Airflow
>  Issue Type: Bug
>  Components: hive_hooks
>Affects Versions: 1.10.0, 1.10.1, 1.10.2, 1.10.3
>Reporter: r-richmond
>Assignee: r-richmond
>Priority: Critical
> Fix For: 1.10.4
>
>





[jira] [Issue Comment Deleted] (AIRFLOW-152) Add --task_params option to 'airflow run'

2019-05-01 Thread jack (JIRA)


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-152?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

jack updated AIRFLOW-152:
-
Comment: was deleted

(was: Is this still needed?)

> Add --task_params option to 'airflow run'
> -
>
> Key: AIRFLOW-152
> URL: https://issues.apache.org/jira/browse/AIRFLOW-152
> Project: Apache Airflow
>  Issue Type: New Feature
>  Components: cli
>Reporter: Jeffrey Picard
>Priority: Minor
>
> Currently there is a 'task_params' option which can add to or override
> values in the params dictionary for a task, but it is only available
> when running a task with 'airflow test'.
> By accepting this parameter in 'airflow run' and then passing it to the
> subprocess through the command method in the TaskInstance class this
> option can be supported.
> This has use cases in running tasks in an ad-hoc manner where a
> parameter may define an environment (i.e. testing vs. production) or
> input / output locations and a developer may want to tweak them on the
> fly.
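A minimal sketch of the requested flag: parse `--task_params` as JSON and merge it over the task's default params. The parser shape and names are illustrative, not Airflow's actual CLI definition:

```python
import argparse
import json

parser = argparse.ArgumentParser(prog="airflow-run-sketch")
# json.loads as the type converter turns the CLI string into a dict:
parser.add_argument("--task_params", type=json.loads, default={},
                    help="JSON dict merged into the task's params")

defaults = {"env": "production", "out_dir": "/data/out"}
args = parser.parse_args(["--task_params", '{"env": "testing"}'])
params = {**defaults, **args.task_params}  # CLI values override defaults
```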





[jira] [Commented] (AIRFLOW-4434) AIRFLOW-2463 broke support for impala

2019-05-01 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-4434?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16830998#comment-16830998
 ] 

ASF GitHub Bot commented on AIRFLOW-4434:
-

ashb commented on pull request #5206: [AIRFLOW-4434] fix support for impala in 
hive hook
URL: https://github.com/apache/airflow/pull/5206
 
 
   
 



> AIRFLOW-2463 broke support for impala 
> --
>
> Key: AIRFLOW-4434
> URL: https://issues.apache.org/jira/browse/AIRFLOW-4434
> Project: Apache Airflow
>  Issue Type: Bug
>  Components: hive_hooks
>Affects Versions: 1.10.0, 1.10.1, 1.10.2, 1.10.3
>Reporter: r-richmond
>Assignee: r-richmond
>Priority: Critical
> Fix For: 1.10.4, 2.0.0
>
>





[jira] [Commented] (AIRFLOW-2898) Task not entering queued state for pool

2019-05-01 Thread jack (JIRA)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-2898?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16831006#comment-16831006
 ] 

jack commented on AIRFLOW-2898:
---

Using your DAG example, I'm unable to reproduce this on 1.10.3:

I see 2 running as expected: !airflowcheck.PNG!

> Task not entering queued state for pool
> ---
>
> Key: AIRFLOW-2898
> URL: https://issues.apache.org/jira/browse/AIRFLOW-2898
> Project: Apache Airflow
>  Issue Type: Bug
>  Components: pools, scheduler
>Affects Versions: 1.8.0, 1.9.0
>Reporter: rana
>Assignee: rana
>Priority: Blocker
> Attachments: airflowcheck.PNG, dag_test.py
>
>
> I have a pool of 3 and have several jobs (over 10) which use the pool.
> Tasks timeout (after 10 mins) from being stuck in scheduled state when the 
> tasks should be in queued state for the pool.
> to reproduce:
> use attached dag, and create a pool called "backfill" with 2 slots
> start the dag, and go to this url - >
> /admin/taskinstance/?flt0_pool_equals=backfill
> you'll see two light green "running" state, loads of white "scheduled" state 
> and none in grey "queued" state.
> what I expect would be two in running, the rest in queued state.
>  
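The expectation in the report can be written down as a toy model (not scheduler code): as many tasks run as the pool has slots, and the remainder wait in "queued" rather than sitting in "scheduled":

```python
def expected_pool_states(num_tasks, pool_slots):
    """States the reporter expects for tasks sharing one pool."""
    running = min(num_tasks, pool_slots)
    return ["running"] * running + ["queued"] * (num_tasks - running)
```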





[jira] [Updated] (AIRFLOW-2898) Task not entering queued state for pool

2019-05-01 Thread jack (JIRA)


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-2898?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

jack updated AIRFLOW-2898:
--
Attachment: airflowcheck.PNG

> Task not entering queued state for pool
> ---
>
> Key: AIRFLOW-2898
> URL: https://issues.apache.org/jira/browse/AIRFLOW-2898
> Project: Apache Airflow
>  Issue Type: Bug
>  Components: pools, scheduler
>Affects Versions: 1.8.0, 1.9.0
>Reporter: rana
>Assignee: rana
>Priority: Blocker
> Attachments: airflowcheck.PNG, dag_test.py
>
>





[jira] [Commented] (AIRFLOW-3449) Airflow DAG parsing logs aren't written when using S3 logging

2019-05-01 Thread ASF subversion and git services (JIRA)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-3449?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16831005#comment-16831005
 ] 

ASF subversion and git services commented on AIRFLOW-3449:
--

Commit cec54947d7cf0489284995a6a899b9b94620e301 in airflow's branch 
refs/heads/v1-10-stable from Ash Berlin-Taylor
[ https://gitbox.apache.org/repos/asf?p=airflow.git;h=cec5494 ]

[AIRFLOW-3449] Write local dag parsing logs when remote logging enabled. (#5175)

The default "processor" handler is a FileProcessorHandler (compare to
FileTaskHandler as the default "task" handler) so setting this as a
subclass of FileTaskHandler ended up with an invalid path for the
processor logs, meaning they never got written to a file when remote
logging is enabled.

This was likely a mis-configuration introduced in #2793

(cherry picked from commit d474f70ed846f64420cce8b54e4d42f3955c4ed0)


> Airflow DAG parsing logs aren't written when using S3 logging
> -
>
> Key: AIRFLOW-3449
> URL: https://issues.apache.org/jira/browse/AIRFLOW-3449
> Project: Apache Airflow
>  Issue Type: Bug
>  Components: logging, scheduler
>Affects Versions: 1.10.0, 1.10.1
>Reporter: James Meickle
>Assignee: Ash Berlin-Taylor
>Priority: Critical
> Fix For: 1.10.4
>
>
> The default Airflow logging configuration provides some logs to stdout, some 
> to "task" folders, and some to "processor" folders (generated during DAG 
> parsing). The 1.10.0 logging update broke this, but only for users who are 
> also using S3 logging. This is because of this feature in the default logging 
> config file:
> {code:python}
> if REMOTE_LOGGING and REMOTE_BASE_LOG_FOLDER.startswith('s3://'):
> DEFAULT_LOGGING_CONFIG['handlers'].update(REMOTE_HANDLERS['s3'])
> {code}
> That replaces this functioning handlers block:
> {code:python}
> 'task': {
> 'class': 'airflow.utils.log.file_task_handler.FileTaskHandler',
> 'formatter': 'airflow',
> 'base_log_folder': os.path.expanduser(BASE_LOG_FOLDER),
> 'filename_template': FILENAME_TEMPLATE,
> },
> 'processor': {
> 'class': 
> 'airflow.utils.log.file_processor_handler.FileProcessorHandler',
> 'formatter': 'airflow',
> 'base_log_folder': os.path.expanduser(PROCESSOR_LOG_FOLDER),
> 'filename_template': PROCESSOR_FILENAME_TEMPLATE,
> },
> {code}
> With this non-functioning block:
> {code:python}
> 'task': {
> 'class': 'airflow.utils.log.s3_task_handler.S3TaskHandler',
> 'formatter': 'airflow',
> 'base_log_folder': os.path.expanduser(BASE_LOG_FOLDER),
> 's3_log_folder': REMOTE_BASE_LOG_FOLDER,
> 'filename_template': FILENAME_TEMPLATE,
> },
> 'processor': {
> 'class': 'airflow.utils.log.s3_task_handler.S3TaskHandler',
> 'formatter': 'airflow',
> 'base_log_folder': os.path.expanduser(PROCESSOR_LOG_FOLDER),
> 's3_log_folder': REMOTE_BASE_LOG_FOLDER,
> 'filename_template': PROCESSOR_FILENAME_TEMPLATE,
> },
> {code}
> The key issue here is that both "task" and "processor" are being given a 
> "S3TaskHandler" class to use for logging. But that is not a generic S3 class; 
> it's actually a subclass of FileTaskHandler! 
> https://github.com/apache/incubator-airflow/blob/1.10.1/airflow/utils/log/s3_task_handler.py#L26
> Since the template vars don't match the template string, the path to log to 
> evaluates to garbage. The handler then silently fails to log anything at all. 
> It is likely that anyone using a default-like logging config, plus the remote 
> S3 logging feature, stopped getting DAG parsing logs (either locally *or* in 
> S3) as of 1.10.0
> Commenting out the DAG parsing section of the S3 block fixed this on my 
> instance.
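One way to picture the fix: only the 'task' handler should be swapped for the S3 handler, while 'processor' keeps its file-based class. The handler class paths come from the snippets above; the dict merge itself is an illustrative sketch, not the actual config file:

```python
DEFAULT_HANDLERS = {
    'task': {'class': 'airflow.utils.log.file_task_handler.FileTaskHandler'},
    'processor': {'class': 'airflow.utils.log.file_processor_handler.FileProcessorHandler'},
}
REMOTE_S3_OVERRIDES = {
    'task': {'class': 'airflow.utils.log.s3_task_handler.S3TaskHandler'},
    # deliberately no 'processor' entry: overriding it with S3TaskHandler
    # is the bug described above -- its filename template variables don't
    # match, so DAG-parsing logs silently vanish.
}
handlers = {**DEFAULT_HANDLERS, **REMOTE_S3_OVERRIDES}
```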





[jira] [Commented] (AIRFLOW-2463) Make task instance context available for hive queries

2019-05-01 Thread ASF subversion and git services (JIRA)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-2463?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16831004#comment-16831004
 ] 

ASF subversion and git services commented on AIRFLOW-2463:
--

Commit c4daf707e3256667791635f7f38c43a27ec3d32a in airflow's branch 
refs/heads/v1-10-test from r-richmond
[ https://gitbox.apache.org/repos/asf?p=airflow.git;h=c4daf70 ]

[AIRFLOW-4434] Support Impala with the HiveServer2Hook (#5206)

Fix support for impala in hive hook through a config option
that turns off set variable commands. Breaking changes were
introduced in [AIRFLOW-2463].

(cherry picked from commit 0ae15d84b7d2aee4119a7432578b4431b13a08ee)


> Make task instance context available for hive queries
> -
>
> Key: AIRFLOW-2463
> URL: https://issues.apache.org/jira/browse/AIRFLOW-2463
> Project: Apache Airflow
>  Issue Type: Improvement
>Reporter: Kevin Yang
>Assignee: Kevin Yang
>Priority: Major
> Fix For: 1.10.2
>
>
> Currently hive queries run through HiveOperator() would receive task_instance 
> context as hive_conf. But the context is not available when 
> HiveCliHook()/HiveServer2Hook() was called through PythonOperator(), nor 
> available when hive cli was called in BashOperator() nor available when 
> HiveServer2Hook() was called in any operator.
> Having the context available would provide users the capability to audit hive 
> queries.





[GitHub] [airflow] tomwross opened a new pull request #5222: Adding new contributor to G Adventures

2019-05-01 Thread GitBox
tomwross opened a new pull request #5222: Adding new contributor to G Adventures
URL: https://github.com/apache/airflow/pull/5222
 
 
   G Adventures' Airflow-based project has a new contributor
   
   Make sure you have checked _all_ steps below.
   
   ### Jira
   
   - [x] My PR addresses the following [Airflow 
Jira](https://issues.apache.org/jira/browse/AIRFLOW/) issues and references 
them in the PR title. For example, "\[AIRFLOW-XXX\] My Airflow PR"
 - https://issues.apache.org/jira/browse/AIRFLOW-XXX
 - In case you are fixing a typo in the documentation you can prepend your 
commit with \[AIRFLOW-XXX\], code changes always need a Jira issue.
 - In case you are proposing a fundamental code change, you need to create 
an Airflow Improvement Proposal 
([AIP](https://cwiki.apache.org/confluence/display/AIRFLOW/Airflow+Improvements+Proposals)).
 - In case you are adding a dependency, check if the license complies with 
the [ASF 3rd Party License 
Policy](https://www.apache.org/legal/resolved.html#category-x).
   
   _No ticket, updating readme per general request_
   
   ### Description
   
   - [x] Here are some details about my PR, including screenshots of any UI 
changes:
   
   _G Adventures' Airflow-based project has a new contributor, @chchtv11_
   
   ### Tests
   
   - [x] My PR adds the following unit tests __OR__ does not need testing for 
this extremely good reason:
   
   _No changes to code or functionality_
   
   ### Commits
   
   - [x] My commits all reference Jira issues in their subject lines, and I 
have squashed multiple commits if they address the same issue. In addition, my 
commits follow the guidelines from "[How to write a good git commit 
message](http://chris.beams.io/posts/git-commit/)":
 1. Subject is separated from body by a blank line
 1. Subject is limited to 50 characters (not including Jira issue reference)
 1. Subject does not end with a period
 1. Subject uses the imperative mood ("add", not "adding")
 1. Body wraps at 72 characters
 1. Body explains "what" and "why", not "how"
   
   _No JIRA ticket, see above_
   
   ### Documentation
   
   - [x] In case of new functionality, my PR adds documentation that describes 
how to use it.
 - All the public functions and the classes in the PR contain docstrings 
that explain what it does
 - If you implement backwards incompatible changes, please leave a note in 
the [Updating.md](https://github.com/apache/airflow/blob/master/UPDATING.md) so 
we can assign it to an appropriate release
   
   _No new functionality or changes to functionality that would require 
updating documentation_
   
   ### Code Quality
   
   - [x] Passes `flake8`
   




[jira] [Commented] (AIRFLOW-4434) AIRFLOW-2463 broke support for impala

2019-05-01 Thread ASF subversion and git services (JIRA)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-4434?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16830999#comment-16830999
 ] 

ASF subversion and git services commented on AIRFLOW-4434:
--

Commit 0ae15d84b7d2aee4119a7432578b4431b13a08ee in airflow's branch 
refs/heads/master from r-richmond
[ https://gitbox.apache.org/repos/asf?p=airflow.git;h=0ae15d8 ]

[AIRFLOW-4434] Support Impala with the HiveServer2Hook (#5206)

Fix support for impala in hive hook through a config option
that turns off set variable commands. Breaking changes were
introduced in [AIRFLOW-2463].

> AIRFLOW-2463 broke support for impala 
> --
>
> Key: AIRFLOW-4434
> URL: https://issues.apache.org/jira/browse/AIRFLOW-4434
> Project: Apache Airflow
>  Issue Type: Bug
>  Components: hive_hooks
>Affects Versions: 1.10.0, 1.10.1, 1.10.2, 1.10.3
>Reporter: r-richmond
>Assignee: r-richmond
>Priority: Critical
> Fix For: 1.10.4, 2.0.0
>
>
> [PR#3405|https://github.com/apache/airflow/pull/3405] added 
> [functionality|https://github.com/apache/airflow/blame/master/airflow/hooks/hive_hooks.py#L800]
>  that made task instance context available for hive queries. Unfortunately 
> this usage of the set command is [not 
> supported|https://impala.apache.org/docs/build/html/topics/impala_set.html] 
> on impala (currently only impala-shell supports set and only for query 
> options not variables like hive).
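The commit message above describes the idea of the fix: only emit Hive-style `SET key=value` statements when the backend supports them, since Impala rejects them. A small self-contained sketch of that idea (the function and flag names here are illustrative stand-ins, not the hook's exact internals):

```python
def build_statements(hql, hive_conf, run_set_variable_statements=True):
    """Prefix a query with SET statements unless disabled (e.g. for Impala).

    Illustrative sketch of the AIRFLOW-4434 approach: a flag controls
    whether task-instance context is injected as Hive SET variables.
    """
    stmts = []
    if run_set_variable_statements and hive_conf:
        stmts = ['SET {}={};'.format(k, v)
                 for k, v in sorted(hive_conf.items())]
    return stmts + [hql]


# Hive: context variables are passed through as SET statements.
print(build_statements('SELECT 1', {'airflow.ctx.dag_id': 'my_dag'}))
# Impala: the flag suppresses them, so the query is sent unmodified.
print(build_statements('SELECT 1', {'airflow.ctx.dag_id': 'my_dag'},
                       run_set_variable_statements=False))
```

The actual hook reads the equivalent switch from its connection configuration; the point here is only the branching on backend capability.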



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (AIRFLOW-2463) Make task instance context available for hive queries

2019-05-01 Thread ASF subversion and git services (JIRA)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-2463?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16831000#comment-16831000
 ] 

ASF subversion and git services commented on AIRFLOW-2463:
--

Commit 0ae15d84b7d2aee4119a7432578b4431b13a08ee in airflow's branch 
refs/heads/master from r-richmond
[ https://gitbox.apache.org/repos/asf?p=airflow.git;h=0ae15d8 ]

[AIRFLOW-4434] Support Impala with the HiveServer2Hook (#5206)

Fix support for impala in hive hook through a config option
that turns off set variable commands. Breaking changes were
introduced in [AIRFLOW-2463].

> Make task instance context available for hive queries
> -
>
> Key: AIRFLOW-2463
> URL: https://issues.apache.org/jira/browse/AIRFLOW-2463
> Project: Apache Airflow
>  Issue Type: Improvement
>Reporter: Kevin Yang
>Assignee: Kevin Yang
>Priority: Major
> Fix For: 1.10.2
>
>
> Currently hive queries run through HiveOperator() would receive task_instance 
> context as hive_conf. But the context is not available when 
> HiveCliHook()/HiveServer2Hook() was called through PythonOperator(), nor 
> available when hive cli was called in BashOperator() nor available when 
> HiveServer2Hook() was called in any operator.
> Having the context available would provide users the capability to audit hive 
> queries.





[GitHub] [airflow] ashb merged pull request #5206: [AIRFLOW-4434] fix support for impala in hive hook

2019-05-01 Thread GitBox
ashb merged pull request #5206: [AIRFLOW-4434] fix support for impala in hive 
hook
URL: https://github.com/apache/airflow/pull/5206
 
 
   




[jira] [Closed] (AIRFLOW-4441) Recent tasks and dag runs stuck loading

2019-05-01 Thread Rafael (JIRA)


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-4441?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Rafael closed AIRFLOW-4441.
---
Resolution: Invalid

Seems I spoke too soon. I cannot create a minimal setup to reproduce it; it 
seems related to my setup after all. 

Closing for now.

> Recent tasks and dag runs stuck loading
> ---
>
> Key: AIRFLOW-4441
> URL: https://issues.apache.org/jira/browse/AIRFLOW-4441
> Project: Apache Airflow
>  Issue Type: Bug
>Affects Versions: 1.10.3
>Reporter: Rafael
>Priority: Major
> Attachments: Screenshot 2019-04-30 at 15.00.37.png
>
>
> After starting a fresh installation of Airflow 1.10.3, I see the UI is 
> "stuck" loading Recent Tasks and DAG Runs in the /admin/ page.
> I see no errors in the logs (logging_level and fab_logging_level set to INFO) 
> or in the browser console, so I don't know how to provide more details.
> Attached is a screenshot of what I see.
> !Screenshot 2019-04-30 at 15.00.37.png!





[GitHub] [airflow] mik-laj commented on issue #5118: [AIRFLOW-4315] Add monitoring API's to airflow

2019-05-01 Thread GitBox
mik-laj commented on issue #5118: [AIRFLOW-4315] Add monitoring API's to airflow
URL: https://github.com/apache/airflow/pull/5118#issuecomment-488280603
 
 
   http://www.sphinx-doc.org/en/master/usage/restructuredtext/domains.html




[jira] [Commented] (AIRFLOW-503) ExternalTaskSensor causes runtime exception

2019-05-01 Thread jack (JIRA)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-503?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16830989#comment-16830989
 ] 

jack commented on AIRFLOW-503:
--

From the docs:

 
{code:java}
param allowed_states: list of allowed states, default is ``['success']``
{code}
So in your case it should be:
{code:java}
allowed_states=['all']
{code}
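To make the point concrete, here is a small self-contained sketch (no Airflow install needed) of why the value must be a plain list of state strings; `count_matching` is an illustrative stand-in for the sensor's `poke()` query, not Airflow's API:

```python
def count_matching(task_instances, allowed_states=None):
    """Stand-in for ExternalTaskSensor.poke(): count task instances whose
    state is in `allowed_states` (documented default: ['success'])."""
    if allowed_states is None:
        allowed_states = ['success']
    if not isinstance(allowed_states, (list, tuple)):
        # Passing the builtin `all` (a function) instead of a list is what
        # produced the "can't adapt type 'builtin_function_or_method'"
        # error in the traceback quoted below.
        raise TypeError('allowed_states must be a list of state strings')
    return sum(1 for ti in task_instances if ti['state'] in allowed_states)


tis = [{'state': 'success'}, {'state': 'failed'}]
print(count_matching(tis))                         # → 1
print(count_matching(tis, ['success', 'failed']))  # → 2
```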
 

> ExternalTaskSensor causes runtime exception
> ---
>
> Key: AIRFLOW-503
> URL: https://issues.apache.org/jira/browse/AIRFLOW-503
> Project: Apache Airflow
>  Issue Type: Bug
>  Components: db, operators
>Affects Versions: 1.7.1
> Environment: airflow 1.7.1.3.
> postgress 9.2.13 (backend DB)
> OS   Red Hat Enterprise Linux Server 7.2 (Maipo)
> python 2.7.5
>Reporter: Hila Visan
>Priority: Critical
>
> I just created a new task using ExternalTaskSensor between weekly dag and 
> daily dag (named 'services_daily_sensor') .
> When I tried to test it, i ran the command:
> 'airflow test weekly_agg services_daily_sensor 2016-09-11T06:00:00'.
> The command failed with the following error:
>  
> ervices_daily_sensor> on 2016-09-11 06:00:00
> [2016-09-11 08:59:09,602] {sensors.py:195} INFO - Poking for 
> daily_agg.services_daily_task on 2016-09-11 02:00:00 ...
> [2016-09-11 08:59:09,614] {models.py:1286} ERROR - 
> (psycopg2.ProgrammingError) can't adapt type 'builtin_function_or_method' 
> [SQL: 'SELECT count(*) AS count_1 \nFROM (SELECT task_instance.task_id AS 
> task_instance_task_id, task_instance.dag_id AS task_instance_dag_id, 
> task_instance.execution_date AS task_instance_execution_date, 
> task_instance.start_date AS task_instance_start_date, task_instance.end_date 
> AS task_instance_end_date, task_instance.duration AS task_instance_duration, 
> task_instance.state AS task_instance_state, task_instance.try_number AS 
> task_instance_try_number, task_instance.hostname AS task_instance_hostname, 
> task_instance.unixname AS task_instance_unixname, task_instance.job_id AS 
> task_instance_job_id, task_instance.pool AS task_instance_pool, 
> task_instance.queue AS task_instance_queue, task_instance.priority_weight AS 
> task_instance_priority_weight, task_instance.operator AS 
> task_instance_operator, task_instance.queued_dttm AS 
> task_instance_queued_dttm \nFROM task_instance \nWHERE task_instance.dag_id = 
> %(dag_id_1)s AND task_instance.task_id = %(task_id_1)s AND 
> task_instance.state IN (%(state_1)s) AND task_instance.execution_date = 
> %(execution_date_1)s) AS anon_1'] [parameters: {'state_1': <built-in function 
> all>, 'execution_date_1': datetime.datetime(2016, 9, 11, 2, 0), 'dag_id_1': 
> 'daily_agg', 'task_id_1': 'services_daily_task'}]
> Traceback (most recent call last):
>   File "/usr/lib/python2.7/site-packages/airflow/models.py", line 1242, in run
> result = task_copy.execute(context=context)
>   File "/usr/lib/python2.7/site-packages/airflow/operators/sensors.py", line 
> 56, in execute
> while not self.poke(context):
>   File "/usr/lib/python2.7/site-packages/airflow/operators/sensors.py", line 
> 203, in poke
> TI.execution_date == dttm,
>   File "/usr/lib64/python2.7/site-packages/sqlalchemy/orm/query.py", line 
> 2980, in count
> return self.from_self(col).scalar()
>   File "/usr/lib64/python2.7/site-packages/sqlalchemy/orm/query.py", line 
> 2749, in scalar
> ret = self.one()
>   File "/usr/lib64/python2.7/site-packages/sqlalchemy/orm/query.py", line 
> 2718, in one
> ret = list(self)
>   File "/usr/lib64/python2.7/site-packages/sqlalchemy/orm/query.py", line 
> 2761, in __iter__
> return self._execute_and_instances(context)
>   File "/usr/lib64/python2.7/site-packages/sqlalchemy/orm/query.py", line 
> 2776, in _execute_and_instances
> result = conn.execute(querycontext.statement, self._params)
>   File "/usr/lib64/python2.7/site-packages/sqlalchemy/engine/base.py", line 
> 914, in execute
> return meth(self, multiparams, params)
>   File "/usr/lib64/python2.7/site-packages/sqlalchemy/sql/elements.py", line 
> 323, in _execute_on_connection
> return connection._execute_clauseelement(self, multiparams, params)
>   File "/usr/lib64/python2.7/site-packages/sqlalchemy/engine/base.py", line 
> 1010, in _execute_clauseelement
> compiled_sql, distilled_params
>   File "/usr/lib64/python2.7/site-packages/sqlalchemy/engine/base.py", line 
> 1146, in _execute_context
> context)
>   File "/usr/lib64/python2.7/site-packages/sqlalchemy/engine/base.py", line 
> 1341, in _handle_dbapi_exception
> exc_info
>   File "/usr/lib64/python2.7/site-packages/sqlalchemy/util/compat.py", line 
> 202, in raise_from_cause
> reraise(type(exception), exception, tb=exc_tb, cause=cause)
>   File "/usr/lib64/python2.7/site-packages/sqlalchemy/engine/base.py", line 
> 1139, in _execute_context
> context)
>   File 

[GitHub] [airflow] raviagarwalunravel commented on issue #5118: [AIRFLOW-4315] Add monitoring API's to airflow

2019-05-01 Thread GitBox
raviagarwalunravel commented on issue #5118: [AIRFLOW-4315] Add monitoring 
API's to airflow
URL: https://github.com/apache/airflow/pull/5118#issuecomment-488280167
 
 
   
   > Could you add an `:rtype:` to the docs to say what type of object(s) the 
api.common.* methods return? (`dict`, `List[TaskInstance]`, `list[dict]` etc.)
   
   What is `:rtype:`? Where can I find more information about it, or an 
example?
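For reference, `:rtype:` is a Sphinx info field (see the link mik-laj posted above) that documents a function's return type in its docstring. A minimal sketch — the function name and body are hypothetical, not an actual `airflow.api.common` method:

```python
def latest_dag_runs(dag_id):
    """Return summaries of the most recent runs of a DAG.

    Hypothetical helper used only to illustrate Sphinx field lists.

    :param dag_id: ID of the DAG to summarise
    :type dag_id: str
    :return: one dict per run, with its DAG id and state
    :rtype: list[dict]
    """
    return [{'dag_id': dag_id, 'state': 'success'}]


print(latest_dag_runs('example')[0]['state'])  # → success
```

Sphinx renders the `:rtype:` field as a "Return type" entry in the generated API docs, which is what the review asked for.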




[GitHub] [airflow] andriisoldatenko commented on issue #5083: [AIRFLOW-4292] Cleanup and improve SLA code

2019-05-01 Thread GitBox
andriisoldatenko commented on issue #5083: [AIRFLOW-4292] Cleanup and improve 
SLA code
URL: https://github.com/apache/airflow/pull/5083#issuecomment-488279398
 
 
   @serkef can you please make sure all checks pass?




[GitHub] [airflow] andriisoldatenko commented on a change in pull request #5083: [AIRFLOW-4292] Cleanup and improve SLA code

2019-05-01 Thread GitBox
andriisoldatenko commented on a change in pull request #5083: [AIRFLOW-4292] 
Cleanup and improve SLA code
URL: https://github.com/apache/airflow/pull/5083#discussion_r280068209
 
 

 ##
 File path: airflow/jobs.py
 ##
 @@ -660,86 +660,88 @@ def manage_slas(self, dag, session=None):
 dttm = dag.following_schedule(dttm)
 session.commit()
 
+# Identify tasks that should send a notification
 slas = (
 session
 .query(SlaMiss)
 .filter(SlaMiss.notification_sent == False, SlaMiss.dag_id == 
dag.dag_id)  # noqa: E712
 .all()
 )
 
-if slas:
-sla_dates = [sla.execution_date for sla in slas]
-qry = (
-session
-.query(TI)
-.filter(
-TI.state != State.SUCCESS,
-TI.execution_date.in_(sla_dates),
-TI.dag_id == dag.dag_id
-).all()
-)
-blocking_tis = []
-for ti in qry:
-if ti.task_id in dag.task_ids:
-ti.task = dag.get_task(ti.task_id)
-blocking_tis.append(ti)
-else:
-session.delete(ti)
-session.commit()
+if not slas:
 
 Review comment:
   what is the purpose of this change?
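For context, the diff above replaces the `if slas:` wrapper with an early `return` when there are no unsent SLA misses. A minimal sketch of that guard-clause pattern (names simplified from the diff; `slas` stands in for the queried `SlaMiss` rows):

```python
def notify_sla_misses(slas):
    """Guard-clause version: bail out early when there is nothing to do,
    which removes one indentation level from the rest of the function."""
    if not slas:
        return []  # no unsent SLA misses; skip the notification work
    # ... the remaining (previously nested) notification logic runs
    # un-indented here ...
    return [s for s in slas if not s['notification_sent']]


print(notify_sla_misses([]))                             # → []
print(notify_sla_misses([{'notification_sent': False}]))
```

The behavior is unchanged; the refactor only flattens the control flow.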




[jira] [Commented] (AIRFLOW-1868) Packaged Dags not added to dag table, unable to execute tasks

2019-05-01 Thread jack (JIRA)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-1868?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16830988#comment-16830988
 ] 

jack commented on AIRFLOW-1868:
---

Can you please check if this is still an issue in a newer Airflow version?

> Packaged Dags not added to dag table, unable to execute tasks
> -
>
> Key: AIRFLOW-1868
> URL: https://issues.apache.org/jira/browse/AIRFLOW-1868
> Project: Apache Airflow
>  Issue Type: Bug
> Environment: airflow 1.8.2, celery, rabbitMQ, mySQL, aws
>Reporter: nathan warshauer
>Priority: Major
> Attachments: Screen Shot 2017-11-29 at 2.31.02 PM.png, Screen Shot 
> 2017-11-29 at 4.40.39 PM.png, Screen Shot 2017-11-29 at 4.42.39 PM.png
>
>
> .zip files in the dag directory do not appear to be getting added to the dag 
> table on the airflow database.  When a .zip file is placed within the dags 
> folder and it contains executable .py files, the dag_id should be added to 
> the dag table and airflow should allow the dag to be unpaused and run through 
> the web server.
> SELECT distinct dag.dag_id AS dag_dag_id FROM dag confirms the dag does not 
> exist in the dags table but shows up on the UI with the warning message "This 
> Dag seems to be existing only locally" however the dag exists in all 3 dag 
> directories (master and two workers) and the airflow.cfg has donot_pickle = 
> True
> When the dag is triggered manually via airflow trigger_dag  the 
> process goes to the web server and does not execute any tasks.  When I go to 
> the task and click start through the UI the task will execute successfully 
> and shows the attached state upon completion.  When I do not do this process 
> the tasks will not enter the queue and the run sits idle as the 3rd attached 
> image shows.
> Basically, the dag CAN run manually from the zip BUT the scheduler and 
> underlying database tables appear to not be functioning correctly for 
> packaged dags.
> Please let me know if I can provide any additional information regarding this 
> issue, or if you all have any leads that I can check out for resolving this.
> from datetime import datetime, timedelta
> from airflow import DAG
> default_args = {
>   'depends_on_past': False,
>   'email': ['airf...@airflow.com'],
>   'email_on_failure': True,
>   'email_on_retry': False,
>   'owner': 'airflow',
>   'provide_context': True,
>   'retries': 0,
>   'retry_delay': timedelta(minutes=5),
>   'start_date': datetime(2017, 11, 28)
> }
> dag = DAG('MY-DAG-NAME',
>   default_args=default_args,
>   schedule_interval='*/5 * * * *',
>   max_active_runs=1,
>   dagrun_timeout=timedelta(minutes=4, seconds=30))





[GitHub] [airflow] codecov-io commented on issue #5152: [AIRFLOW-4385] Add namespace into pod's yaml

2019-05-01 Thread GitBox
codecov-io commented on issue #5152: [AIRFLOW-4385] Add namespace into pod's 
yaml
URL: https://github.com/apache/airflow/pull/5152#issuecomment-488277566
 
 
   # [Codecov](https://codecov.io/gh/apache/airflow/pull/5152?src=pr=h1) 
Report
   > Merging 
[#5152](https://codecov.io/gh/apache/airflow/pull/5152?src=pr=desc) into 
[master](https://codecov.io/gh/apache/airflow/commit/60b9023ed92b31a75dbdf8b33ce7e9c2bc3637d1?src=pr=desc)
 will **increase** coverage by `0.12%`.
   > The diff coverage is `100%`.
   
   [![Impacted file tree 
graph](https://codecov.io/gh/apache/airflow/pull/5152/graphs/tree.svg?width=650=WdLKlKHOAU=150=pr)](https://codecov.io/gh/apache/airflow/pull/5152?src=pr=tree)
   
   ```diff
    @@            Coverage Diff             @@
    ##           master    #5152      +/-   ##
    ==========================================
    + Coverage   78.56%   78.68%   +0.12%
    ==========================================
      Files         466      470       +4
      Lines       29805    31901    +2096
    ==========================================
    + Hits        23415    25101    +1686
    - Misses       6390     6800     +410
   ```
   
   
   | [Impacted 
Files](https://codecov.io/gh/apache/airflow/pull/5152?src=pr=tree) | 
Coverage Δ | |
   |---|---|---|
   | 
[...etes\_request\_factory/kubernetes\_request\_factory.py](https://codecov.io/gh/apache/airflow/pull/5152/diff?src=pr=tree#diff-YWlyZmxvdy9jb250cmliL2t1YmVybmV0ZXMva3ViZXJuZXRlc19yZXF1ZXN0X2ZhY3Rvcnkva3ViZXJuZXRlc19yZXF1ZXN0X2ZhY3RvcnkucHk=)
 | `99.05% <100%> (+0.01%)` | :arrow_up: |
   | 
[.../kubernetes\_request\_factory/pod\_request\_factory.py](https://codecov.io/gh/apache/airflow/pull/5152/diff?src=pr=tree#diff-YWlyZmxvdy9jb250cmliL2t1YmVybmV0ZXMva3ViZXJuZXRlc19yZXF1ZXN0X2ZhY3RvcnkvcG9kX3JlcXVlc3RfZmFjdG9yeS5weQ==)
 | `100% <100%> (ø)` | :arrow_up: |
   | 
[airflow/contrib/sensors/gcs\_sensor.py](https://codecov.io/gh/apache/airflow/pull/5152/diff?src=pr=tree#diff-YWlyZmxvdy9jb250cmliL3NlbnNvcnMvZ2NzX3NlbnNvci5weQ==)
 | `53.65% <0%> (-15.82%)` | :arrow_down: |
   | 
[...rflow/contrib/operators/kubernetes\_pod\_operator.py](https://codecov.io/gh/apache/airflow/pull/5152/diff?src=pr=tree#diff-YWlyZmxvdy9jb250cmliL29wZXJhdG9ycy9rdWJlcm5ldGVzX3BvZF9vcGVyYXRvci5weQ==)
 | `87.35% <0%> (-11.28%)` | :arrow_down: |
   | 
[airflow/config\_templates/airflow\_local\_settings.py](https://codecov.io/gh/apache/airflow/pull/5152/diff?src=pr=tree#diff-YWlyZmxvdy9jb25maWdfdGVtcGxhdGVzL2FpcmZsb3dfbG9jYWxfc2V0dGluZ3MucHk=)
 | `66.66% <0%> (-9.81%)` | :arrow_down: |
   | 
[airflow/executors/base\_executor.py](https://codecov.io/gh/apache/airflow/pull/5152/diff?src=pr=tree#diff-YWlyZmxvdy9leGVjdXRvcnMvYmFzZV9leGVjdXRvci5weQ==)
 | `92.2% <0%> (-3.45%)` | :arrow_down: |
   | 
[airflow/contrib/operators/gcs\_to\_s3.py](https://codecov.io/gh/apache/airflow/pull/5152/diff?src=pr=tree#diff-YWlyZmxvdy9jb250cmliL29wZXJhdG9ycy9nY3NfdG9fczMucHk=)
 | `98% <0%> (-2%)` | :arrow_down: |
   | 
[airflow/utils/dag\_processing.py](https://codecov.io/gh/apache/airflow/pull/5152/diff?src=pr=tree#diff-YWlyZmxvdy91dGlscy9kYWdfcHJvY2Vzc2luZy5weQ==)
 | `57.93% <0%> (-1.2%)` | :arrow_down: |
   | 
[airflow/utils/synchronized\_queue.py](https://codecov.io/gh/apache/airflow/pull/5152/diff?src=pr=tree#diff-YWlyZmxvdy91dGlscy9zeW5jaHJvbml6ZWRfcXVldWUucHk=)
 | `92.3% <0%> (ø)` | |
   | 
[...rib/example\_dags/example\_gcp\_video\_intelligence.py](https://codecov.io/gh/apache/airflow/pull/5152/diff?src=pr=tree#diff-YWlyZmxvdy9jb250cmliL2V4YW1wbGVfZGFncy9leGFtcGxlX2djcF92aWRlb19pbnRlbGxpZ2VuY2UucHk=)
 | `0% <0%> (ø)` | |
   | ... and [17 
more](https://codecov.io/gh/apache/airflow/pull/5152/diff?src=pr=tree-more) 
| |
   
   --
   
   [Continue to review full report at 
Codecov](https://codecov.io/gh/apache/airflow/pull/5152?src=pr=continue).
   > **Legend** - [Click here to learn 
more](https://docs.codecov.io/docs/codecov-delta)
   > `Δ = absolute  (impact)`, `ø = not affected`, `? = missing data`
   > Powered by 
[Codecov](https://codecov.io/gh/apache/airflow/pull/5152?src=pr=footer). 
Last update 
[60b9023...dbc9dcb](https://codecov.io/gh/apache/airflow/pull/5152?src=pr=lastupdated).
 Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments).
   



