[GitHub] [airflow] kani5hk commented on a change in pull request #4877: [AIRFLOW-4046] Added validations for poke_interval & timeout for Airflow Sensor

2019-03-07 Thread GitBox
kani5hk commented on a change in pull request #4877: [AIRFLOW-4046] Added 
validations for poke_interval & timeout for Airflow Sensor
URL: https://github.com/apache/airflow/pull/4877#discussion_r263690806
 
 

 ##
 File path: airflow/sensors/base_sensor_operator.py
 ##
 @@ -70,6 +70,14 @@ def __init__(self,
                  *args,
                  **kwargs):
         super(BaseSensorOperator, self).__init__(*args, **kwargs)
+        if isinstance(poke_interval, str) or poke_interval < 0:
+            raise AirflowException(
+                "The poke_interval must be a non-negative number"
+            )
+        if isinstance(timeout, str) or timeout < 0:
+            raise AirflowException(
+                "The timeout must be a non-negative number"
+            )
 
 Review comment:
   Thanks @XD-DENG for the quick review. Actually, sleep(True) is treated as 
sleep(1) and sleep(False) as sleep(0), so booleans are also kind of handled 
already. Thoughts?
   I'll remove the string type comparison, though.
   
   Also, this PR got closed due to a goof-up on my end, so I've 
opened another one: https://github.com/apache/airflow/pull/4878
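   
   To illustrate the point: `bool` is a subclass of `int` in Python, so 
`time.sleep()` accepts booleans and treats them as 0 or 1 seconds:
   
   ```python
   # bool is an int subclass, so time.sleep() treats True/False as 1/0 seconds
   import time
   
   print(issubclass(bool, int))  # True
   time.sleep(False)             # same as time.sleep(0)
   time.sleep(True)              # same as time.sleep(1)
   ```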


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[jira] [Commented] (AIRFLOW-4046) Validate poke_interval and timeout value in Airflow Sensor

2019-03-07 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-4046?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16787634#comment-16787634
 ] 

ASF GitHub Bot commented on AIRFLOW-4046:
-

kani5hk commented on pull request #4878: [AIRFLOW-4046] Added validations for 
poke_interval & timeout for Airflow Sensor
URL: https://github.com/apache/airflow/pull/4878
 
 
   Make sure you have checked _all_ steps below.
   
   ### Jira
   
   - [ ] My PR addresses the following [Airflow 
Jira](https://issues.apache.org/jira/browse/AIRFLOW/) issues and references 
them in the PR title. For example, "\[AIRFLOW-XXX\] My Airflow PR"
 - https://issues.apache.org/jira/browse/AIRFLOW-4046
 - In case you are fixing a typo in the documentation you can prepend your 
commit with \[AIRFLOW-XXX\], code changes always need a Jira issue.
 - In case you are proposing a fundamental code change, you need to create 
an Airflow Improvement 
Proposal([AIP](https://cwiki.apache.org/confluence/display/AIRFLOW/Airflow+Improvements+Proposals)).
   
   ### Description
   
   - [ ] Here are some details about my PR, including screenshots of any UI 
changes:
   
   ### Tests
   
   - [ ] My PR adds the following unit tests __OR__ does not need testing for 
this extremely good reason:
   
   ### Commits
   
   - [ ] My commits all reference Jira issues in their subject lines, and I 
have squashed multiple commits if they address the same issue. In addition, my 
commits follow the guidelines from "[How to write a good git commit 
message](http://chris.beams.io/posts/git-commit/)":
 1. Subject is separated from body by a blank line
 1. Subject is limited to 50 characters (not including Jira issue reference)
 1. Subject does not end with a period
 1. Subject uses the imperative mood ("add", not "adding")
 1. Body wraps at 72 characters
 1. Body explains "what" and "why", not "how"
   
   ### Documentation
   
   - [ ] In case of new functionality, my PR adds documentation that describes 
how to use it.
 - When adding new operators/hooks/sensors, the autoclass documentation 
generation needs to be added.
 - All the public functions and the classes in the PR contain docstrings 
that explain what they do
 - If you implement backwards incompatible changes, please leave a note in 
the [Updating.md](https://github.com/apache/airflow/blob/master/UPDATING.md) so 
we can assign it to an appropriate release
   
   ### Code Quality
   
   - [ ] Passes `flake8`
 

This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


> Validate poke_interval and timeout value in Airflow Sensor
> --
>
> Key: AIRFLOW-4046
> URL: https://issues.apache.org/jira/browse/AIRFLOW-4046
> Project: Apache Airflow
>  Issue Type: Improvement
>  Components: operators
>Affects Versions: 1.10.2
>Reporter: Kanishk Lohumi
>Assignee: Kanishk Lohumi
>Priority: Major
>
> There is no validation being done on the poke_interval & timeout values provided 
> to BaseSensor. This results in a ValueError for a negative value of 
> poke_interval when the mode is *poke*, because time.sleep() doesn't support 
> negative values and gives the error below.
> {noformat}
> [2019-03-07 11:48:08,182] {models.py:1790} ERROR - sleep length must be 
> non-negative:cG9rZS1rYW5pc2hrcy1tYnAuY29ycC5hZG9iZS5jb20= Traceback (most 
> recent call last): File 
> "/Users/lohumi/Documents/airflow_1.10.2/lib/python3.6/site-packages/airflow/models.py",
>  line 1659, in _run_raw_task result = task_copy.execute(context=context) File 
> "/Users/lohumi/Documents/airflow_1.10.2/lib/python3.6/site-packages/airflow/sensors/base_sensor_operator.py",
>  line 112, in execute sleep(self.poke_interval) ValueError: sleep length must 
> be non-negative{noformat}



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[GitHub] [airflow] kani5hk opened a new pull request #4878: [AIRFLOW-4046] Added validations for poke_interval & timeout for Airflow Sensor

2019-03-07 Thread GitBox
kani5hk opened a new pull request #4878: [AIRFLOW-4046] Added validations for 
poke_interval & timeout for Airflow Sensor
URL: https://github.com/apache/airflow/pull/4878
 
 
   Make sure you have checked _all_ steps below.
   
   ### Jira
   
   - [ ] My PR addresses the following [Airflow 
Jira](https://issues.apache.org/jira/browse/AIRFLOW/) issues and references 
them in the PR title. For example, "\[AIRFLOW-XXX\] My Airflow PR"
 - https://issues.apache.org/jira/browse/AIRFLOW-4046
 - In case you are fixing a typo in the documentation you can prepend your 
commit with \[AIRFLOW-XXX\], code changes always need a Jira issue.
 - In case you are proposing a fundamental code change, you need to create 
an Airflow Improvement 
Proposal([AIP](https://cwiki.apache.org/confluence/display/AIRFLOW/Airflow+Improvements+Proposals)).
   
   ### Description
   
   - [ ] Here are some details about my PR, including screenshots of any UI 
changes:
   
   ### Tests
   
   - [ ] My PR adds the following unit tests __OR__ does not need testing for 
this extremely good reason:
   
   ### Commits
   
   - [ ] My commits all reference Jira issues in their subject lines, and I 
have squashed multiple commits if they address the same issue. In addition, my 
commits follow the guidelines from "[How to write a good git commit 
message](http://chris.beams.io/posts/git-commit/)":
 1. Subject is separated from body by a blank line
 1. Subject is limited to 50 characters (not including Jira issue reference)
 1. Subject does not end with a period
 1. Subject uses the imperative mood ("add", not "adding")
 1. Body wraps at 72 characters
 1. Body explains "what" and "why", not "how"
   
   ### Documentation
   
   - [ ] In case of new functionality, my PR adds documentation that describes 
how to use it.
 - When adding new operators/hooks/sensors, the autoclass documentation 
generation needs to be added.
 - All the public functions and the classes in the PR contain docstrings 
that explain what they do
 - If you implement backwards incompatible changes, please leave a note in 
the [Updating.md](https://github.com/apache/airflow/blob/master/UPDATING.md) so 
we can assign it to an appropriate release
   
   ### Code Quality
   
   - [ ] Passes `flake8`


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[jira] [Commented] (AIRFLOW-4046) Validate poke_interval and timeout value in Airflow Sensor

2019-03-07 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-4046?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16787622#comment-16787622
 ] 

ASF GitHub Bot commented on AIRFLOW-4046:
-

kani5hk commented on pull request #4877: [AIRFLOW-4046] Added validations for 
poke_interval & timeout for Airflow Sensor
URL: https://github.com/apache/airflow/pull/4877
 
 
   
 

This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


> Validate poke_interval and timeout value in Airflow Sensor
> --
>
> Key: AIRFLOW-4046
> URL: https://issues.apache.org/jira/browse/AIRFLOW-4046
> Project: Apache Airflow
>  Issue Type: Improvement
>  Components: operators
>Affects Versions: 1.10.2
>Reporter: Kanishk Lohumi
>Assignee: Kanishk Lohumi
>Priority: Major
>
> There is no validation being done on the poke_interval & timeout values provided 
> to BaseSensor. This results in a ValueError for a negative value of 
> poke_interval when the mode is *poke*, because time.sleep() doesn't support 
> negative values and gives the error below.
> {noformat}
> [2019-03-07 11:48:08,182] {models.py:1790} ERROR - sleep length must be 
> non-negative:cG9rZS1rYW5pc2hrcy1tYnAuY29ycC5hZG9iZS5jb20= Traceback (most 
> recent call last): File 
> "/Users/lohumi/Documents/airflow_1.10.2/lib/python3.6/site-packages/airflow/models.py",
>  line 1659, in _run_raw_task result = task_copy.execute(context=context) File 
> "/Users/lohumi/Documents/airflow_1.10.2/lib/python3.6/site-packages/airflow/sensors/base_sensor_operator.py",
>  line 112, in execute sleep(self.poke_interval) ValueError: sleep length must 
> be non-negative{noformat}



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[GitHub] [airflow] XD-DENG commented on a change in pull request #4877: [AIRFLOW-4046] Added validations for poke_interval & timeout for Airflow Sensor

2019-03-07 Thread GitBox
XD-DENG commented on a change in pull request #4877: [AIRFLOW-4046] Added 
validations for poke_interval & timeout for Airflow Sensor
URL: https://github.com/apache/airflow/pull/4877#discussion_r263688154
 
 

 ##
 File path: airflow/sensors/base_sensor_operator.py
 ##
 @@ -70,6 +70,14 @@ def __init__(self,
                  *args,
                  **kwargs):
         super(BaseSensorOperator, self).__init__(*args, **kwargs)
+        if isinstance(poke_interval, str) or poke_interval < 0:
+            raise AirflowException(
+                "The poke_interval must be a non-negative number"
+            )
+        if isinstance(timeout, str) or timeout < 0:
+            raise AirflowException(
+                "The timeout must be a non-negative number"
+            )
 
 Review comment:
   I'm not sure it's necessary to check types here, because if you provide 
an invalid type for `timeout` or `poke_interval`, the `execute` method will 
fail and raise an exception later anyway.
   
   On the other hand, if I provide a Boolean value here for `timeout` or 
`poke_interval`, it should be caught as an invalid type as well, but that case 
is not covered here.
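   
   One way such a check could look (a minimal sketch, not the PR's actual diff; 
it rejects booleans explicitly because `bool` is an `int` subclass):
   
   ```python
   # Minimal sketch of a stricter check: reject booleans, non-numeric types,
   # and negative values in one place.
   import numbers
   
   from airflow.exceptions import AirflowException
   
   
   def _validate_non_negative_number(name, value):
       if isinstance(value, bool) or not isinstance(value, numbers.Number) or value < 0:
           raise AirflowException(
               "The {} must be a non-negative number".format(name)
           )
   ```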


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [airflow] milton0825 closed pull request #4860: [AIRFLOW-XXXX] create user in quick start

2019-03-07 Thread GitBox
milton0825 closed pull request #4860: [AIRFLOW-XXXX] create user in quick start
URL: https://github.com/apache/airflow/pull/4860
 
 
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [airflow] milton0825 opened a new pull request #4860: [AIRFLOW-XXXX] create user in quick start

2019-03-07 Thread GitBox
milton0825 opened a new pull request #4860: [AIRFLOW-XXXX] create user in quick 
start
URL: https://github.com/apache/airflow/pull/4860
 
 
   Make sure you have checked _all_ steps below.
   
   ### Jira
   
   - [x] My PR addresses the following [Airflow 
Jira](https://issues.apache.org/jira/browse/AIRFLOW/) issues and references 
them in the PR title. For example, "\[AIRFLOW-XXX\] My Airflow PR"
 - https://issues.apache.org/jira/browse/AIRFLOW-XXX
 - In case you are fixing a typo in the documentation you can prepend your 
commit with \[AIRFLOW-XXX\], code changes always need a Jira issue.
 - In case you are proposing a fundamental code change, you need to create 
an Airflow Improvement 
Proposal([AIP](https://cwiki.apache.org/confluence/display/AIRFLOW/Airflow+Improvements+Proposals)).
   
   ### Description
   
   - [x] Here are some details about my PR, including screenshots of any UI 
changes:
   
   ### Tests
   
   - [ ] My PR adds the following unit tests __OR__ does not need testing for 
this extremely good reason:
   
   ### Commits
   
   - [x] My commits all reference Jira issues in their subject lines, and I 
have squashed multiple commits if they address the same issue. In addition, my 
commits follow the guidelines from "[How to write a good git commit 
message](http://chris.beams.io/posts/git-commit/)":
 1. Subject is separated from body by a blank line
 1. Subject is limited to 50 characters (not including Jira issue reference)
 1. Subject does not end with a period
 1. Subject uses the imperative mood ("add", not "adding")
 1. Body wraps at 72 characters
 1. Body explains "what" and "why", not "how"
   
   ### Documentation
   
   - [x] In case of new functionality, my PR adds documentation that describes 
how to use it.
 - When adding new operators/hooks/sensors, the autoclass documentation 
generation needs to be added.
 - All the public functions and the classes in the PR contain docstrings 
that explain what they do
 - If you implement backwards incompatible changes, please leave a note in 
the [Updating.md](https://github.com/apache/airflow/blob/master/UPDATING.md) so 
we can assign it to an appropriate release
   
   ### Code Quality
   
   - [x] Passes `flake8`
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[jira] [Commented] (AIRFLOW-4046) Validate poke_interval and timeout value in Airflow Sensor

2019-03-07 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-4046?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16787619#comment-16787619
 ] 

ASF GitHub Bot commented on AIRFLOW-4046:
-

kani5hk commented on pull request #4877: [AIRFLOW-4046] Added validations for 
poke_interval & timeout for Airflow Sensor
URL: https://github.com/apache/airflow/pull/4877
 
 
   
   Make sure you have checked _all_ steps below.
   
   ### Jira
   
   - [ ] My PR addresses the following [Airflow 
Jira](https://issues.apache.org/jira/browse/AIRFLOW/) issues and references 
them in the PR title. For example, "\[AIRFLOW-XXX\] My Airflow PR"
 - https://issues.apache.org/jira/browse/AIRFLOW-XXX
 - In case you are fixing a typo in the documentation you can prepend your 
commit with \[AIRFLOW-XXX\], code changes always need a Jira issue.
 - In case you are proposing a fundamental code change, you need to create 
an Airflow Improvement 
Proposal([AIP](https://cwiki.apache.org/confluence/display/AIRFLOW/Airflow+Improvements+Proposals)).
   
   ### Description
   
   - [ ] Here are some details about my PR, including screenshots of any UI 
changes:
   
   ### Tests
   
   - [ ] My PR adds the following unit tests __OR__ does not need testing for 
this extremely good reason:
   
   ### Commits
   
   - [ ] My commits all reference Jira issues in their subject lines, and I 
have squashed multiple commits if they address the same issue. In addition, my 
commits follow the guidelines from "[How to write a good git commit 
message](http://chris.beams.io/posts/git-commit/)":
 1. Subject is separated from body by a blank line
 1. Subject is limited to 50 characters (not including Jira issue reference)
 1. Subject does not end with a period
 1. Subject uses the imperative mood ("add", not "adding")
 1. Body wraps at 72 characters
 1. Body explains "what" and "why", not "how"
   
   ### Documentation
   
   - [ ] In case of new functionality, my PR adds documentation that describes 
how to use it.
 - When adding new operators/hooks/sensors, the autoclass documentation 
generation needs to be added.
 - All the public functions and the classes in the PR contain docstrings 
that explain what they do
 - If you implement backwards incompatible changes, please leave a note in 
the [Updating.md](https://github.com/apache/airflow/blob/master/UPDATING.md) so 
we can assign it to an appropriate release
   
   ### Code Quality
   
   - [ ] Passes `flake8`
   
 

This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


> Validate poke_interval and timeout value in Airflow Sensor
> --
>
> Key: AIRFLOW-4046
> URL: https://issues.apache.org/jira/browse/AIRFLOW-4046
> Project: Apache Airflow
>  Issue Type: Improvement
>  Components: operators
>Affects Versions: 1.10.2
>Reporter: Kanishk Lohumi
>Assignee: Kanishk Lohumi
>Priority: Major
>
> There is no validation being done on the poke_interval & timeout values provided 
> to BaseSensor. This results in a ValueError for a negative value of 
> poke_interval when the mode is *poke*, because time.sleep() doesn't support 
> negative values and gives the error below.
> {noformat}
> [2019-03-07 11:48:08,182] {models.py:1790} ERROR - sleep length must be 
> non-negative:cG9rZS1rYW5pc2hrcy1tYnAuY29ycC5hZG9iZS5jb20= Traceback (most 
> recent call last): File 
> "/Users/lohumi/Documents/airflow_1.10.2/lib/python3.6/site-packages/airflow/models.py",
>  line 1659, in _run_raw_task result = task_copy.execute(context=context) File 
> "/Users/lohumi/Documents/airflow_1.10.2/lib/python3.6/site-packages/airflow/sensors/base_sensor_operator.py",
>  line 112, in execute sleep(self.poke_interval) ValueError: sleep length must 
> be non-negative{noformat}



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[GitHub] [airflow] kani5hk opened a new pull request #4877: [AIRFLOW-4046] Added validations for poke_interval & timeout for Airflow Sensor

2019-03-07 Thread GitBox
kani5hk opened a new pull request #4877: [AIRFLOW-4046] Added validations for 
poke_interval & timeout for Airflow Sensor
URL: https://github.com/apache/airflow/pull/4877
 
 
   
   Make sure you have checked _all_ steps below.
   
   ### Jira
   
   - [ ] My PR addresses the following [Airflow 
Jira](https://issues.apache.org/jira/browse/AIRFLOW/) issues and references 
them in the PR title. For example, "\[AIRFLOW-XXX\] My Airflow PR"
 - https://issues.apache.org/jira/browse/AIRFLOW-XXX
 - In case you are fixing a typo in the documentation you can prepend your 
commit with \[AIRFLOW-XXX\], code changes always need a Jira issue.
 - In case you are proposing a fundamental code change, you need to create 
an Airflow Improvement 
Proposal([AIP](https://cwiki.apache.org/confluence/display/AIRFLOW/Airflow+Improvements+Proposals)).
   
   ### Description
   
   - [ ] Here are some details about my PR, including screenshots of any UI 
changes:
   
   ### Tests
   
   - [ ] My PR adds the following unit tests __OR__ does not need testing for 
this extremely good reason:
   
   ### Commits
   
   - [ ] My commits all reference Jira issues in their subject lines, and I 
have squashed multiple commits if they address the same issue. In addition, my 
commits follow the guidelines from "[How to write a good git commit 
message](http://chris.beams.io/posts/git-commit/)":
 1. Subject is separated from body by a blank line
 1. Subject is limited to 50 characters (not including Jira issue reference)
 1. Subject does not end with a period
 1. Subject uses the imperative mood ("add", not "adding")
 1. Body wraps at 72 characters
 1. Body explains "what" and "why", not "how"
   
   ### Documentation
   
   - [ ] In case of new functionality, my PR adds documentation that describes 
how to use it.
 - When adding new operators/hooks/sensors, the autoclass documentation 
generation needs to be added.
 - All the public functions and the classes in the PR contain docstrings 
that explain what they do
 - If you implement backwards incompatible changes, please leave a note in 
the [Updating.md](https://github.com/apache/airflow/blob/master/UPDATING.md) so 
we can assign it to an appropriate release
   
   ### Code Quality
   
   - [ ] Passes `flake8`
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[jira] [Created] (AIRFLOW-4046) Validate poke_interval and timeout value in Airflow Sensor

2019-03-07 Thread Kanishk Lohumi (JIRA)
Kanishk Lohumi created AIRFLOW-4046:
---

 Summary: Validate poke_interval and timeout value in Airflow Sensor
 Key: AIRFLOW-4046
 URL: https://issues.apache.org/jira/browse/AIRFLOW-4046
 Project: Apache Airflow
  Issue Type: Improvement
  Components: operators
Affects Versions: 1.10.2
Reporter: Kanishk Lohumi
Assignee: Kanishk Lohumi


There is no validation being done on the poke_interval & timeout values provided to 
BaseSensor. This results in a ValueError for a negative value of poke_interval 
when the mode is *poke*, because time.sleep() doesn't support negative values and 
gives the error below.
{noformat}
[2019-03-07 11:48:08,182] {models.py:1790} ERROR - sleep length must be 
non-negative:cG9rZS1rYW5pc2hrcy1tYnAuY29ycC5hZG9iZS5jb20= Traceback (most 
recent call last): File 
"/Users/lohumi/Documents/airflow_1.10.2/lib/python3.6/site-packages/airflow/models.py",
 line 1659, in _run_raw_task result = task_copy.execute(context=context) File 
"/Users/lohumi/Documents/airflow_1.10.2/lib/python3.6/site-packages/airflow/sensors/base_sensor_operator.py",
 line 112, in execute sleep(self.poke_interval) ValueError: sleep length must 
be non-negative{noformat}
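
A minimal reproduction (the sensor subclass below is hypothetical, just enough to trigger the code path):
{code:python}
# With a negative poke_interval, BaseSensorOperator.execute() eventually calls
# time.sleep(-5), which raises "ValueError: sleep length must be non-negative".
from airflow.sensors.base_sensor_operator import BaseSensorOperator


class AlwaysFalseSensor(BaseSensorOperator):
    def poke(self, context):
        return False  # never succeeds, so the sensor keeps sleeping between pokes


sensor = AlwaysFalseSensor(
    task_id='repro_negative_poke_interval',
    poke_interval=-5,  # invalid value, passed straight to time.sleep()
    timeout=60,
)
# sensor.execute(context={})  # raises ValueError on 1.10.2 without validation
{code}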



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Created] (AIRFLOW-4045) BASE_URL not working when RBAC enabled

2019-03-07 Thread Janne Keskitalo (JIRA)
Janne Keskitalo created AIRFLOW-4045:


 Summary: BASE_URL not working when RBAC enabled
 Key: AIRFLOW-4045
 URL: https://issues.apache.org/jira/browse/AIRFLOW-4045
 Project: Apache Airflow
  Issue Type: Bug
  Components: webserver
Affects Versions: 1.10.2
 Environment: Ubuntu 18.04
Reporter: Janne Keskitalo


Web UI breaks when using RBAC with a subpath. Settings used:
{quote}
rbac = true
base_url = http://localhost:8080/airflow
{quote}

Going to address "http://localhost:8080/airflow; gets redirected to 
"http://localhost:8080/home; which doesn't exist.






--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[GitHub] [airflow] ffinfo edited a comment on issue #4872: [AIRFLOW-4038] Restructure database queries on /home

2019-03-07 Thread GitBox
ffinfo edited a comment on issue #4872: [AIRFLOW-4038] Restructure database 
queries on /home
URL: https://github.com/apache/airflow/pull/4872#issuecomment-470598941
 
 
   @XD-DENG: Almost. Indeed, the main thing is the switch from `provide_session` 
to `create_session`. The order of statements also changes a bit, because dags_query 
was built up over multiple statements with the error query in between; this is now 
grouped together.
   
   About the switch from provide_session to create_session: @Fokko and I did a 
little research, and using create_session keeps the session locked to a scope for 
less time. `@provide_session` on nested calls can keep the session open for a long 
time when it is not really needed. Of course this is bigger than just this call, 
but I try to convert things where I can. I originally targeted this method to 
remove the DagBag from the webserver, but then noticed that this call does not use 
the DagBag at all. By that time I had already changed the provide_session ;)
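   
   For readers unfamiliar with the difference, a minimal sketch (assuming 
Airflow 1.10.x, where both helpers live in `airflow.utils.db`) contrasting the 
two session-handling styles:
   
   ```python
   from airflow.models import DagModel
   from airflow.utils.db import create_session, provide_session
   
   
   @provide_session
   def count_paused_dags(session=None):
       # The decorator injects a session that stays open for the whole call,
       # including anything nested underneath it.
       return session.query(DagModel).filter(DagModel.is_paused).count()
   
   
   def count_paused_dags_scoped():
       # The context manager scopes the session to exactly this block, so it is
       # closed as soon as the query finishes.
       with create_session() as session:
           return session.query(DagModel).filter(DagModel.is_paused).count()
   ```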


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[jira] [Commented] (AIRFLOW-4044) The documentation of `query_params` in `BigQueryOperator` is wrong

2019-03-07 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-4044?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16787552#comment-16787552
 ] 

ASF GitHub Bot commented on AIRFLOW-4044:
-

hengfengli commented on pull request #4876: [AIRFLOW-4044] The documentation of 
`query_params` in `BigQueryOperator` is wrong. 
URL: https://github.com/apache/airflow/pull/4876
 
 
   Make sure you have checked _all_ steps below.
   
   ### Jira
   
   - [x] My PR addresses the following 
[AIRFLOW-4044](https://issues.apache.org/jira/browse/AIRFLOW-4044) 
   
   ### Description
   
   - [x] Here are some details about my PR, including screenshots of any UI 
changes:
   
   Currently, the doc 
(https://airflow.apache.org/code.html?highlight=query_params) says: 
   
   ```
   query_params (dict) - a dictionary containing query parameter types and 
values, passed to BigQuery. 
   ```
   
   However, in BigQueryBaseCursor 
(https://github.com/apache/airflow/blob/0c797a830e3370bd6e39f5fcfc128a8fd776912e/airflow/contrib/hooks/bigquery_hook.py#L694-L696),
 the doc indicates that this parameter should be a list of `dict`. Also, it is 
unclear what this `query_params` should look like, and no examples are available. 
   
   ### Tests
   
   - [x] My PR adds the following unit tests __OR__ does not need testing for 
this extremely good reason:
   
   Updating the documentation and there are no code changes. 
   
   ### Commits
   
   - [x] My commits all reference Jira issues in their subject lines, and I 
have squashed multiple commits if they address the same issue. In addition, my 
commits follow the guidelines from "[How to write a good git commit 
message](http://chris.beams.io/posts/git-commit/)":
 1. Subject is separated from body by a blank line
 1. Subject is limited to 50 characters (not including Jira issue reference)
 1. Subject does not end with a period
 1. Subject uses the imperative mood ("add", not "adding")
 1. Body wraps at 72 characters
 1. Body explains "what" and "why", not "how"
   
   ### Documentation
   
   - [x] In case of new functionality, my PR adds documentation that describes 
how to use it.
 - When adding new operators/hooks/sensors, the autoclass documentation 
generation needs to be added.
 - All the public functions and the classes in the PR contain docstrings 
that explain what they do
 - If you implement backwards incompatible changes, please leave a note in 
the [Updating.md](https://github.com/apache/airflow/blob/master/UPDATING.md) so 
we can assign it to an appropriate release
   
   ### Code Quality
   
   - [x] Passes `flake8`
   
 

This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


> The documentation of `query_params` in `BigQueryOperator` is wrong
> --
>
> Key: AIRFLOW-4044
> URL: https://issues.apache.org/jira/browse/AIRFLOW-4044
> Project: Apache Airflow
>  Issue Type: Bug
>  Components: Documentation
>Affects Versions: 1.10.2
>Reporter: Hengfeng Li
>Priority: Minor
>
> Currently, the doc 
> (https://airflow.apache.org/code.html?highlight=query_params) says: 
>  * *query_params* (_dict_) - a dictionary containing query parameter types 
> and values, passed to BigQuery. 
> However, in BigQueryBaseCursor 
> (https://github.com/apache/airflow/blob/0c797a830e3370bd6e39f5fcfc128a8fd776912e/airflow/contrib/hooks/bigquery_hook.py#L694-L696),
>  the doc indicates that this parameter should be a list of `dict`. Also, it 
> is unclear what this `query_params` should look like, and no examples are available. 



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[GitHub] [airflow] hengfengli opened a new pull request #4876: [AIRFLOW-4044] The documentation of `query_params` in `BigQueryOperator` is wrong.

2019-03-07 Thread GitBox
hengfengli opened a new pull request #4876: [AIRFLOW-4044] The documentation of 
`query_params` in `BigQueryOperator` is wrong. 
URL: https://github.com/apache/airflow/pull/4876
 
 
   Make sure you have checked _all_ steps below.
   
   ### Jira
   
   - [x] My PR addresses the following 
[AIRFLOW-4044](https://issues.apache.org/jira/browse/AIRFLOW-4044) 
   
   ### Description
   
   - [x] Here are some details about my PR, including screenshots of any UI 
changes:
   
   Currently, the doc 
(https://airflow.apache.org/code.html?highlight=query_params) says: 
   
   ```
   query_params (dict) - a dictionary containing query parameter types and 
values, passed to BigQuery. 
   ```
   
   However, in BigQueryBaseCursor 
(https://github.com/apache/airflow/blob/0c797a830e3370bd6e39f5fcfc128a8fd776912e/airflow/contrib/hooks/bigquery_hook.py#L694-L696),
 the doc indicates that this parameter should be a list of `dict`. Also, it is 
unclear what this `query_params` should look like, and no examples are available. 
   
   ### Tests
   
   - [x] My PR adds the following unit tests __OR__ does not need testing for 
this extremely good reason:
   
   Updating the documentation and there are no code changes. 
   
   ### Commits
   
   - [x] My commits all reference Jira issues in their subject lines, and I 
have squashed multiple commits if they address the same issue. In addition, my 
commits follow the guidelines from "[How to write a good git commit 
message](http://chris.beams.io/posts/git-commit/)":
 1. Subject is separated from body by a blank line
 1. Subject is limited to 50 characters (not including Jira issue reference)
 1. Subject does not end with a period
 1. Subject uses the imperative mood ("add", not "adding")
 1. Body wraps at 72 characters
 1. Body explains "what" and "why", not "how"
   
   ### Documentation
   
   - [x] In case of new functionality, my PR adds documentation that describes 
how to use it.
 - When adding new operators/hooks/sensors, the autoclass documentation 
generation needs to be added.
 - All the public functions and the classes in the PR contain docstrings 
that explain what they do
 - If you implement backwards incompatible changes, please leave a note in 
the [Updating.md](https://github.com/apache/airflow/blob/master/UPDATING.md) so 
we can assign it to an appropriate release
   
   ### Code Quality
   
   - [x] Passes `flake8`
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[jira] [Assigned] (AIRFLOW-4044) The documentation of `query_params` in `BigQueryOperator` is wrong

2019-03-07 Thread Hengfeng Li (JIRA)


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-4044?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Hengfeng Li reassigned AIRFLOW-4044:


Assignee: (was: Hengfeng Li)

> The documentation of `query_params` in `BigQueryOperator` is wrong
> --
>
> Key: AIRFLOW-4044
> URL: https://issues.apache.org/jira/browse/AIRFLOW-4044
> Project: Apache Airflow
>  Issue Type: Bug
>  Components: Documentation
>Affects Versions: 1.10.2
>Reporter: Hengfeng Li
>Priority: Minor
>
> Currently, the doc 
> (https://airflow.apache.org/code.html?highlight=query_params) says: 
>  * *query_params* (_dict_) - a dictionary containing query parameter types 
> and values, passed to BigQuery. 
> However, in BigQueryBaseCursor 
> (https://github.com/apache/airflow/blob/0c797a830e3370bd6e39f5fcfc128a8fd776912e/airflow/contrib/hooks/bigquery_hook.py#L694-L696),
>  the doc indicates that this parameter should be a list of `dict`. Also, it 
> is unclear what this `query_params` should look like, and no examples are available. 



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Created] (AIRFLOW-4044) The documentation of `query_params` in `BigQueryOperator` is wrong

2019-03-07 Thread Hengfeng Li (JIRA)
Hengfeng Li created AIRFLOW-4044:


 Summary: The documentation of `query_params` in `BigQueryOperator` 
is wrong
 Key: AIRFLOW-4044
 URL: https://issues.apache.org/jira/browse/AIRFLOW-4044
 Project: Apache Airflow
  Issue Type: Bug
  Components: Documentation
Affects Versions: 1.10.2
Reporter: Hengfeng Li
Assignee: Hengfeng Li


Currently, the doc 
(https://airflow.apache.org/code.html?highlight=query_params) says: 
 * *query_params* (_dict_) - a dictionary containing query parameter types and 
values, passed to BigQuery. 

However, in BigQueryBaseCursor 
(https://github.com/apache/airflow/blob/0c797a830e3370bd6e39f5fcfc128a8fd776912e/airflow/contrib/hooks/bigquery_hook.py#L694-L696),
 the doc indicates that this parameter should be a list of `dict`. Also, it is 
unclear what this `query_params` should look like, and no examples are available. 
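
For illustration, the list-of-dicts form could look roughly like this (field names follow BigQuery's standard-SQL query-parameter structure; the values are invented for the example):
{code:python}
# Hypothetical example of query_params as a list of dicts, referenced in the
# SQL as @ds and @min_views (with use_legacy_sql=False).
query_params = [
    {
        'name': 'ds',
        'parameterType': {'type': 'STRING'},
        'parameterValue': {'value': '2019-03-07'},
    },
    {
        'name': 'min_views',
        'parameterType': {'type': 'INT64'},
        'parameterValue': {'value': '1000'},
    },
]
{code}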



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[GitHub] [airflow] XD-DENG commented on a change in pull request #4875: [AIRFLOW-4031] Allow for key pair auth in snowflake hook

2019-03-07 Thread GitBox
XD-DENG commented on a change in pull request #4875: [AIRFLOW-4031] Allow for 
key pair auth in snowflake hook
URL: https://github.com/apache/airflow/pull/4875#discussion_r263654163
 
 

 ##
 File path: tests/contrib/hooks/test_snowflake_hook.py
 ##
 @@ -71,3 +108,30 @@ def test_get_conn_params(self):
 
     def test_get_conn(self):
         self.assertEqual(self.db_hook.get_conn(), self.conn)
+
+    def test_key_pair_auth_encrypted(self):
+        self.conn.extra_dejson = {'database': 'db',
+                                  'account': 'airflow',
+                                  'warehouse': 'af_wh',
+                                  'region': 'af_region',
+                                  'role': 'af_role',
+                                  'private_key_file': '/tmp/test_key.p8'}
+
+        params = self.db_hook._get_conn_params()
+        self.assertTrue('private_key' in params)
+
+    def test_key_pair_auth_not_encrypted(self):
+        self.conn.extra_dejson = {'database': 'db',
+                                  'account': 'airflow',
+                                  'warehouse': 'af_wh',
+                                  'region': 'af_region',
+                                  'role': 'af_role',
+                                  'private_key_file': '/tmp/test_key.pem'}
 
 Review comment:
   Similarly, it may be better to refer to the `self.nonEncryptedPrivateKey` you 
prepared earlier rather than hardcoding the path here.


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [airflow] XD-DENG commented on a change in pull request #4875: [AIRFLOW-4031] Allow for key pair auth in snowflake hook

2019-03-07 Thread GitBox
XD-DENG commented on a change in pull request #4875: [AIRFLOW-4031] Allow for 
key pair auth in snowflake hook
URL: https://github.com/apache/airflow/pull/4875#discussion_r263654064
 
 

 ##
 File path: tests/contrib/hooks/test_snowflake_hook.py
 ##
 @@ -71,3 +108,30 @@ def test_get_conn_params(self):
 
     def test_get_conn(self):
         self.assertEqual(self.db_hook.get_conn(), self.conn)
+
+    def test_key_pair_auth_encrypted(self):
+        self.conn.extra_dejson = {'database': 'db',
+                                  'account': 'airflow',
+                                  'warehouse': 'af_wh',
+                                  'region': 'af_region',
+                                  'role': 'af_role',
+                                  'private_key_file': '/tmp/test_key.p8'}
 
 Review comment:
   You have already prepared `self.encryptedPrivateKey = "/tmp/test_key.p8"` 
above, so I believe it's better to refer to `self.encryptedPrivateKey` rather 
than hardcoding the path here.
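   
   Roughly what the suggested change would look like (assuming the 
`self.encryptedPrivateKey` fixture mentioned above):
   
   ```python
   # Same test, referring to the fixture instead of the literal path
   def test_key_pair_auth_encrypted(self):
       self.conn.extra_dejson = {'database': 'db',
                                 'account': 'airflow',
                                 'warehouse': 'af_wh',
                                 'region': 'af_region',
                                 'role': 'af_role',
                                 'private_key_file': self.encryptedPrivateKey}
   
       params = self.db_hook._get_conn_params()
       self.assertTrue('private_key' in params)
   ```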


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [airflow] XD-DENG commented on a change in pull request #4875: [AIRFLOW-4031] Allow for key pair auth in snowflake hook

2019-03-07 Thread GitBox
XD-DENG commented on a change in pull request #4875: [AIRFLOW-4031] Allow for 
key pair auth in snowflake hook
URL: https://github.com/apache/airflow/pull/4875#discussion_r263653773
 
 

 ##
 File path: airflow/contrib/hooks/snowflake_hook.py
 ##
 @@ -64,6 +66,33 @@ def _get_conn_params(self):
             "region": self.region or region or '',
             "role": self.role or role or '',
         }
+
+        """
+        If private_key_file is specified in the extra json, load the contents of the file as a private
+        key and specify that in the connection configuration. The connection password then becomes the
+        passphrase for the private key. If your private key file is not encrypted (not recommended), then
+        leave the password empty.
+        """
+        private_key_file = conn.extra_dejson.get('private_key_file', None)
+        if private_key_file is not None:
+            with open(private_key_file, "rb") as key:
+                passphrase = None
+                if conn.password is not None and conn.password.strip() != '':
+                    passphrase = conn.password.strip().encode()
+
+                p_key = serialization.load_pem_private_key(
+                    key.read(),
+                    password=passphrase,
+                    backend=default_backend()
+                )
+
+                pkb = p_key.private_bytes(encoding=serialization.Encoding.DER,
+                                          format=serialization.PrivateFormat.PKCS8,
+                                          encryption_algorithm=serialization.NoEncryption())
+
+                conn_config['private_key'] = pkb
+                conn_config.pop('password', None)
 
 Review comment:
   Curious why we pop out `password` here?
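   
   For context on the question above, a sketch (assuming the Snowflake Python 
connector's key pair authentication) of how the resulting config is typically 
consumed: `private_key` replaces `password`, and the connection password has 
only served as the key passphrase. The values below are hypothetical.
   
   ```python
   import snowflake.connector
   
   conn_config = {
       'user': 'af_user',        # hypothetical values
       'account': 'airflow',
       'private_key': b'...',    # DER-encoded key bytes produced above
   }
   # conn = snowflake.connector.connect(**conn_config)
   ```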


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [airflow] XD-DENG commented on a change in pull request #4875: [AIRFLOW-4031] Allow for key pair auth in snowflake hook

2019-03-07 Thread GitBox
XD-DENG commented on a change in pull request #4875: [AIRFLOW-4031] Allow for 
key pair auth in snowflake hook
URL: https://github.com/apache/airflow/pull/4875#discussion_r263653355
 
 

 ##
 File path: airflow/contrib/hooks/snowflake_hook.py
 ##
 @@ -64,6 +66,33 @@ def _get_conn_params(self):
             "region": self.region or region or '',
             "role": self.role or role or '',
         }
+
+        """
+        If private_key_file is specified in the extra json, load the contents of the file as a private
+        key and specify that in the connection configuration. The connection password then becomes the
+        passphrase for the private key. If your private key file is not encrypted (not recommended), then
+        leave the password empty.
+        """
+        private_key_file = conn.extra_dejson.get('private_key_file', None)
+        if private_key_file is not None:
 
 Review comment:
   This line can be `if private_key_file:`.


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [airflow] XD-DENG commented on a change in pull request #4875: [AIRFLOW-4031] Allow for key pair auth in snowflake hook

2019-03-07 Thread GitBox
XD-DENG commented on a change in pull request #4875: [AIRFLOW-4031] Allow for 
key pair auth in snowflake hook
URL: https://github.com/apache/airflow/pull/4875#discussion_r263653467
 
 

 ##
 File path: airflow/contrib/hooks/snowflake_hook.py
 ##
 @@ -64,6 +66,33 @@ def _get_conn_params(self):
             "region": self.region or region or '',
             "role": self.role or role or '',
         }
+
+        """
+        If private_key_file is specified in the extra json, load the contents of the file as a private
+        key and specify that in the connection configuration. The connection password then becomes the
+        passphrase for the private key. If your private key file is not encrypted (not recommended), then
+        leave the password empty.
+        """
+        private_key_file = conn.extra_dejson.get('private_key_file', None)
+        if private_key_file is not None:
+            with open(private_key_file, "rb") as key:
+                passphrase = None
+                if conn.password is not None and conn.password.strip() != '':
 
 Review comment:
   `if conn.password is not None` - similar to the point above.


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [airflow] XD-DENG commented on a change in pull request #4875: [AIRFLOW-4031] Allow for key pair auth in snowflake hook

2019-03-07 Thread GitBox
XD-DENG commented on a change in pull request #4875: [AIRFLOW-4031] Allow for 
key pair auth in snowflake hook
URL: https://github.com/apache/airflow/pull/4875#discussion_r263654163
 
 

 ##
 File path: tests/contrib/hooks/test_snowflake_hook.py
 ##
 @@ -71,3 +108,30 @@ def test_get_conn_params(self):
 
     def test_get_conn(self):
         self.assertEqual(self.db_hook.get_conn(), self.conn)
+
+    def test_key_pair_auth_encrypted(self):
+        self.conn.extra_dejson = {'database': 'db',
+                                  'account': 'airflow',
+                                  'warehouse': 'af_wh',
+                                  'region': 'af_region',
+                                  'role': 'af_role',
+                                  'private_key_file': '/tmp/test_key.p8'}
+
+        params = self.db_hook._get_conn_params()
+        self.assertTrue('private_key' in params)
+
+    def test_key_pair_auth_not_encrypted(self):
+        self.conn.extra_dejson = {'database': 'db',
+                                  'account': 'airflow',
+                                  'warehouse': 'af_wh',
+                                  'region': 'af_region',
+                                  'role': 'af_role',
+                                  'private_key_file': '/tmp/test_key.pem'}
 
 Review comment:
   Similarly, it may be better to refer to `self.nonEncryptedPrivateKey` rather 
than hardcoding the path here.


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [airflow] mik-laj commented on issue #4804: [AIRFLOW-3980] Unify logger

2019-03-07 Thread GitBox
mik-laj commented on issue #4804: [AIRFLOW-3980] Unify logger
URL: https://github.com/apache/airflow/pull/4804#issuecomment-470759253
 
 
   @Fokko Now Travis is green.


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [airflow] codecov-io commented on issue #4804: [AIRFLOW-3980] Unify logger

2019-03-07 Thread GitBox
codecov-io commented on issue #4804: [AIRFLOW-3980] Unify logger
URL: https://github.com/apache/airflow/pull/4804#issuecomment-470758658
 
 
   # [Codecov](https://codecov.io/gh/apache/airflow/pull/4804?src=pr=h1) 
Report
   > Merging 
[#4804](https://codecov.io/gh/apache/airflow/pull/4804?src=pr=desc) into 
[master](https://codecov.io/gh/apache/airflow/commit/f8dacae03340cb8423e37d7b053e7625a157f89e?src=pr=desc)
 will **increase** coverage by `<.01%`.
   > The diff coverage is `71.53%`.
   
   [![Impacted file tree 
graph](https://codecov.io/gh/apache/airflow/pull/4804/graphs/tree.svg?width=650=WdLKlKHOAU=150=pr)](https://codecov.io/gh/apache/airflow/pull/4804?src=pr=tree)
   
   ```diff
    @@            Coverage Diff            @@
    ##           master    #4804     +/-   ##
    =========================================
    + Coverage   75.28%   75.29%   +<.01%
    =========================================
      Files         450      450
      Lines       29026    29019       -7
    =========================================
    - Hits        21852    21849       -3
    + Misses       7174     7170       -4
   ```
   
   
   | [Impacted 
Files](https://codecov.io/gh/apache/airflow/pull/4804?src=pr=tree) | 
Coverage Δ | |
   |---|---|---|
   | 
[airflow/contrib/auth/backends/ldap\_auth.py](https://codecov.io/gh/apache/airflow/pull/4804/diff?src=pr=tree#diff-YWlyZmxvdy9jb250cmliL2F1dGgvYmFja2VuZHMvbGRhcF9hdXRoLnB5)
 | `0% <ø> (ø)` | :arrow_up: |
   | 
[airflow/sensors/external\_task\_sensor.py](https://codecov.io/gh/apache/airflow/pull/4804/diff?src=pr=tree#diff-YWlyZmxvdy9zZW5zb3JzL2V4dGVybmFsX3Rhc2tfc2Vuc29yLnB5)
 | `94.64% <ø> (ø)` | :arrow_up: |
   | 
[airflow/contrib/operators/s3\_list\_operator.py](https://codecov.io/gh/apache/airflow/pull/4804/diff?src=pr=tree#diff-YWlyZmxvdy9jb250cmliL29wZXJhdG9ycy9zM19saXN0X29wZXJhdG9yLnB5)
 | `100% <ø> (ø)` | :arrow_up: |
   | 
[airflow/hooks/webhdfs\_hook.py](https://codecov.io/gh/apache/airflow/pull/4804/diff?src=pr=tree#diff-YWlyZmxvdy9ob29rcy93ZWJoZGZzX2hvb2sucHk=)
 | `36.36% <ø> (ø)` | :arrow_up: |
   | 
[airflow/sensors/hive\_partition\_sensor.py](https://codecov.io/gh/apache/airflow/pull/4804/diff?src=pr=tree#diff-YWlyZmxvdy9zZW5zb3JzL2hpdmVfcGFydGl0aW9uX3NlbnNvci5weQ==)
 | `0% <ø> (ø)` | :arrow_up: |
   | 
[...ntrib/sensors/aws\_glue\_catalog\_partition\_sensor.py](https://codecov.io/gh/apache/airflow/pull/4804/diff?src=pr=tree#diff-YWlyZmxvdy9jb250cmliL3NlbnNvcnMvYXdzX2dsdWVfY2F0YWxvZ19wYXJ0aXRpb25fc2Vuc29yLnB5)
 | `100% <ø> (ø)` | :arrow_up: |
   | 
[airflow/utils/log/gcs\_task\_handler.py](https://codecov.io/gh/apache/airflow/pull/4804/diff?src=pr=tree#diff-YWlyZmxvdy91dGlscy9sb2cvZ2NzX3Rhc2tfaGFuZGxlci5weQ==)
 | `0% <ø> (ø)` | :arrow_up: |
   | 
[airflow/bin/cli.py](https://codecov.io/gh/apache/airflow/pull/4804/diff?src=pr=tree#diff-YWlyZmxvdy9iaW4vY2xpLnB5)
 | `67.22% <0%> (ø)` | :arrow_up: |
   | 
[airflow/operators/check\_operator.py](https://codecov.io/gh/apache/airflow/pull/4804/diff?src=pr=tree#diff-YWlyZmxvdy9vcGVyYXRvcnMvY2hlY2tfb3BlcmF0b3IucHk=)
 | `60.16% <0%> (+1.89%)` | :arrow_up: |
   | 
[airflow/hooks/http\_hook.py](https://codecov.io/gh/apache/airflow/pull/4804/diff?src=pr=tree#diff-YWlyZmxvdy9ob29rcy9odHRwX2hvb2sucHk=)
 | `95.45% <0%> (ø)` | :arrow_up: |
   | ... and [45 
more](https://codecov.io/gh/apache/airflow/pull/4804/diff?src=pr=tree-more) 
| |
   
   --
   
   [Continue to review full report at 
Codecov](https://codecov.io/gh/apache/airflow/pull/4804?src=pr=continue).
   > **Legend** - [Click here to learn 
more](https://docs.codecov.io/docs/codecov-delta)
   > `Δ = absolute  (impact)`, `ø = not affected`, `? = missing data`
   > Powered by 
[Codecov](https://codecov.io/gh/apache/airflow/pull/4804?src=pr=footer). 
Last update 
[f8dacae...c415804](https://codecov.io/gh/apache/airflow/pull/4804?src=pr=lastupdated).
 Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments).
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [airflow] mik-laj commented on a change in pull request #4846: [AIRFLOW-4030] adding start to singularity for airflow

2019-03-07 Thread GitBox
mik-laj commented on a change in pull request #4846: [AIRFLOW-4030] adding 
start to singularity for airflow
URL: https://github.com/apache/airflow/pull/4846#discussion_r263609499
 
 

 ##
 File path: airflow/operators/singularity_operator.py
 ##
 @@ -0,0 +1,179 @@
+# -*- coding: utf-8 -*-
+#
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+
+import json
+
+from airflow.exceptions import AirflowException
+from airflow.models import BaseOperator
+from airflow.utils.decorators import apply_defaults
+from airflow.utils.file import TemporaryDirectory
+from spython.main import Client
+import shutil
+import ast
+import os
+
+
+class SingularityOperator(BaseOperator):
+    """
+    Execute a command inside a Singularity container
+
+    Singularity has more seamless connection to the host than Docker, so
+    no special binds are needed to ensure binding content in the user $HOME
+    and temporary directories. If the user needs custom binds, this can
+    be done with --volumes
+
+    :param image: Singularity image or URI from which to create the container.
+    :type image: str
+    :param auto_remove: Delete the container when the process exits.
+        The default is False.
+    :type auto_remove: bool
+    :param command: Command to be run in the container. (templated)
+    :type command: str or list
+    :param start_command: start command to pass to the container instance
+    :type start_command: string or list
+    :param environment: Environment variables to set in the container. (templated)
+    :type environment: dict
+    :param force_pull: Pull the image on every run. Default is False.
+    :type force_pull: bool
+    :param volumes: List of volumes to mount into the container, e.g.
+        ``['/host/path:/container/path', '/host/path2:/container/path2']``.
+    :param options: other flags (list) to provide to the instance start
+    :type options: list
+    :param working_dir: Working directory to
+        set on the container (equivalent to the -w switch of the docker client)
+    :type working_dir: str
+    """
+    template_fields = ('command', 'environment',)
+    template_ext = ('.sh', '.bash',)
+
+    @apply_defaults
+    def __init__(
+            self,
+            image,
+            api_version=None,
+            command=None,
+            start_command=None,
+            environment=None,
+            pull_folder=None,
+            force_pull=False,
+            volumes=None,
+            options=None,
+            working_dir=None,
+            auto_remove=False,
+            *args,
+            **kwargs):
+
+        super(SingularityOperator, self).__init__(*args, **kwargs)
+        self.api_version = api_version
+        self.auto_remove = auto_remove
+        self.command = command
+        self.start_command = start_command
+        self.environment = environment or {}
+        self.force_pull = force_pull
+        self.image = image
+        self.instance = None
+        self.options = options or []
+        self.volumes = volumes or []
+        self.working_dir = working_dir
+        self.cli = None
+        self.container = None
+
+    def execute(self, context):
+        self.log.info('Preparing Singularity container %s', self.image)
+        self.cli = Client
 
 Review comment:
   The operator should not call the external library directly. It is 
recommended to introduce an intermediate layer: a hook.
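   
   A rough sketch of what such a layer could look like (the hook name and 
methods are hypothetical, not part of this PR):
   
   ```python
   # Hypothetical hook wrapping the spython client, so the operator never
   # imports spython.main.Client directly.
   from airflow.hooks.base_hook import BaseHook
   
   
   class SingularityHook(BaseHook):
       """Thin wrapper around the Singularity Python client (spython)."""
   
       def __init__(self):
           super(SingularityHook, self).__init__(source=None)
   
       def get_conn(self):
           # Imported here so the dependency stays inside the hook.
           from spython.main import Client
           return Client
   
       def pull(self, image, pull_folder=None):
           client = self.get_conn()
           return client.pull(image, pull_folder=pull_folder)
   ```
   
   The operator would then call something like `SingularityHook().pull(...)` 
instead of using the client directly.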


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [airflow] mik-laj commented on issue #4846: [AIRFLOW-4030] adding start to singularity for airflow

2019-03-07 Thread GitBox
mik-laj commented on issue #4846: [AIRFLOW-4030] adding start to singularity 
for airflow
URL: https://github.com/apache/airflow/pull/4846#issuecomment-470738564
 
 
   @vsoch Yes. You understand well. 
   Example: 
   ```python
   
    @mock.patch('airflow.contrib.hooks.gcp_vision_hook.CloudVisionHook.get_conn')
    def test_add_product_to_product_set(self, get_conn):
        # Given
        add_product_to_product_set_method = get_conn.return_value.add_product_to_product_set

        # When
        self.hook.add_product_to_product_set(
            product_set_id=PRODUCTSET_ID_TEST,
            product_id=PRODUCT_ID_TEST,
            location=LOC_ID_TEST,
            project_id=PROJECT_ID_TEST,
        )
        # Then
        # Product ID was provided explicitly in the method call above,
        # should be returned from the method
        add_product_to_product_set_method.assert_called_once_with(
            name=PRODUCTSET_NAME_TEST, product=PRODUCT_NAME_TEST,
            retry=None, timeout=None, metadata=None
        )
   ```
   where `get_conn` is a method that returns an authorised API client.  
   Source: https://github.com/apache/airflow/pull/4791/files
   
   As I look at it now, you have not created a hook. 
   This is not required, but it is good practice to introduce an intermediate 
layer between the external library and the operator. This lets you reuse the 
same code.
   For example: my team created an integration with Google Translate and Google 
Speech, so we created two operators, CloudTranslateOperator and 
CloudTextToSpeech, and two hooks, CloudTranslateHook and CloudSpeechHook. As a 
next step, we are working on improving the most common use cases, so we've 
created the new CloudTranslateAndSpeechOperator with no new hook. 
   
   I do not demand Slack; I just wanted to help you. Comments are okay for 
me.


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[jira] [Created] (AIRFLOW-4043) 405 when using REST api to get status of a DAG

2019-03-07 Thread Paymahn Moghadasian (JIRA)
Paymahn Moghadasian created AIRFLOW-4043:


 Summary: 405 when using REST api to get status of a DAG
 Key: AIRFLOW-4043
 URL: https://issues.apache.org/jira/browse/AIRFLOW-4043
 Project: Apache Airflow
  Issue Type: Bug
  Components: api
Affects Versions: 1.10.2
 Environment: I'm running airflow in kubernetes on ubuntu 16.04
Reporter: Paymahn Moghadasian


I'm trying to use the Airflow REST API (v1.10.2), but I'm having problems any 
time I try to query information about a specific DAG.

Here's an example of the latest_runs endpoint working:


{noformat}
 ❯❯❯ curl -X GET http://192.168.99.100:30080/api/experimental/latest_runs
{
  "items": [
{
  "dag_id": "test_dag",
  "dag_run_url": 
"/admin/airflow/graph?dag_id=test_dag_date=2019-03-07+21%3A18%3A23.387031%2B00%3A00",
  "execution_date": "2019-03-07T21:18:23.387031+00:00",
  "start_date": "2019-03-07T21:18:23.683240+00:00"
}
  ]
}
{noformat}

However, when I try to query test_dag I get an error:


{noformat}
 ❯❯❯ curl -X GET 
"http://192.168.99.100:30080/api/experimental/dags/test_dag/dag_runs;

405 Method Not Allowed
Method Not Allowed
The method is not allowed for the requested URL.
{noformat}

I've also tried looking in the source code and I found that there's a [state 
param|https://github.com/apache/airflow/blob/f4277cb32a3b75591ed6decb9f8d6c33f60986be/airflow/www/api/experimental/endpoints.py#L117]
 that can be used:


{noformat}
 ❯❯❯ curl -X GET 
"http://192.168.99.100:30080/api/experimental/dags/test_dag/dag_runs?state=success;

405 Method Not Allowed
Method Not Allowed
The method is not allowed for the requested URL.
{noformat}

but adding that in doesn't seem to help.

I find that triggering a DAG works:


{noformat}
 ❮❮❮ curl -X POST \
  http://192.168.99.100:30080/api/experimental/dags/test_dag/dag_runs \
  -H 'Content-Type: application/json' \
  -d '{}'
{
  "message": "Created "
}
{noformat}

Does anyone have an idea why I can't query the status of a DAG and instead get a 405?



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Comment Edited] (AIRFLOW-3885) Improve Travis buildtime

2019-03-07 Thread Ash Berlin-Taylor (JIRA)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-3885?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16787271#comment-16787271
 ] 

Ash Berlin-Taylor edited comment on AIRFLOW-3885 at 3/7/19 10:28 PM:
-

https://travis-ci.org/ashb/airflow/jobs/503289086#L3445
{code}
airflow.exceptions.AirflowException: dag_id could not be found: 
SchedulerJobTest.test_execute_task_instances_limit. Either the dag did not 
exist or it failed to parse.
{code}

Which is a dag defined inline in the test/jobs.py (as it is on the branch 
still) on line 2056. But I don't see how the changes could affect this.

And it's somehow broken the WebUITests too:

https://travis-ci.org/ashb/airflow/jobs/503289086#L5731

Something doesn't add up. But it is definitely this commit that is failing, as 
the previous commit passes.


was (Author: ashb):
https://travis-ci.org/ashb/airflow/jobs/503289086#L3445
{code}
airflow.exceptions.AirflowException: dag_id could not be found: 
SchedulerJobTest.test_execute_task_instances_limit. Either the dag did not 
exist or it failed to parse.
{code}

Which is a dag defined inline in the test/jobs.py (as it is on the branch 
still) on line 2056. But I don't see how the changes could affect this.

> Improve Travis buildtime
> 
>
> Key: AIRFLOW-3885
> URL: https://issues.apache.org/jira/browse/AIRFLOW-3885
> Project: Apache Airflow
>  Issue Type: Bug
>  Components: travis
>Affects Versions: 1.10.2
>Reporter: Drew Sonne
>Assignee: Drew Sonne
>Priority: Major
> Fix For: 1.10.3
>
>
> * Remove the "install" action on the "pre-test" stage to avoid performing 
> lengthy Docker pulls to perform pre-checks
>  * Set nosetests to return on any test failures
> ** Given the lengthy runtime of the airflow CI test suites, if any tests 
> fail, we should fail immediately and return the failed test locally. Users 
> can run the tests locally to get full lists of failed tests.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (AIRFLOW-3885) Improve Travis buildtime

2019-03-07 Thread Ash Berlin-Taylor (JIRA)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-3885?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16787277#comment-16787277
 ] 

Ash Berlin-Taylor commented on AIRFLOW-3885:


Oh I wonder if it is because [AIRFLOW-3239] isn't on the release branch (that 
renamed tests/jobs.py to tests/test_jobs.py) so something is being scoped 
differently.

I'll try pulling that one in tomorrow.

> Improve Travis buildtime
> 
>
> Key: AIRFLOW-3885
> URL: https://issues.apache.org/jira/browse/AIRFLOW-3885
> Project: Apache Airflow
>  Issue Type: Bug
>  Components: travis
>Affects Versions: 1.10.2
>Reporter: Drew Sonne
>Assignee: Drew Sonne
>Priority: Major
> Fix For: 1.10.3
>
>
> * Remove the "install" action on the "pre-test" stage to avoid performing 
> lengthy Docker pulls to perform pre-checks
>  * Set nosetests to return on any test failures
> ** Given the lengthy runtime of the airflow CI test suites, if any tests 
> fail, we should fail immediately and return the failed test locally. Users 
> can run the tests locally to get full lists of failed tests.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Updated] (AIRFLOW-2509) Separate Configuration page into separate how-to guides

2019-03-07 Thread Ash Berlin-Taylor (JIRA)


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-2509?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Ash Berlin-Taylor updated AIRFLOW-2509:
---
Fix Version/s: (was: 2.0.0)
   1.10.3

> Separate Configuration page into separate how-to guides
> ---
>
> Key: AIRFLOW-2509
> URL: https://issues.apache.org/jira/browse/AIRFLOW-2509
> Project: Apache Airflow
>  Issue Type: Improvement
>  Components: Documentation
>Reporter: Tim Swast
>Assignee: Tim Swast
>Priority: Critical
> Fix For: 1.10.3
>
>
> The existing "Configuration" page is attempting to be both a tutorial 
> (teaching the basics of Airflow configuration & a minimal production 
> deployment) as well as how-tos for specific tasks.
>  
> I propose we separate Configuration into separate how-to guides, keeping the 
> current sequence so that it can still maintain the tutorial properties (at 
> least until a "Deploying a Production Airflow Environment" tutorial is 
> written).
>  
> There's a principle that the [distinct kinds of 
> documentation|http://www.writethedocs.org/videos/eu/2017/the-four-kinds-of-documentation-and-why-you-need-to-understand-what-they-are-daniele-procida/]
>  should be organized separately. The Django project does this 
> [https://docs.djangoproject.com/en/2.0/] by splitting into
>  
>  * Tutorials
>  * Topic guides (what Airflow calls Concepts)
>  * Reference guides
>  * How-to guides
> I think the same could apply well here. (This issue covers only How-to for 
> Configuration. More work would be required to separate other docs into proper 
> document types.)



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Assigned] (AIRFLOW-3362) Template to support jinja2 native python types

2019-03-07 Thread Ryan Yuan (JIRA)


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-3362?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Ryan Yuan reassigned AIRFLOW-3362:
--

Assignee: Ryan Yuan

> Template to support jinja2 native python types
> --
>
> Key: AIRFLOW-3362
> URL: https://issues.apache.org/jira/browse/AIRFLOW-3362
> Project: Apache Airflow
>  Issue Type: Improvement
>  Components: core, DAG
>Reporter: Duan Shiqiang
>Assignee: Ryan Yuan
>Priority: Major
>
> In the latest Airflow (1.10.x), templates can only render into strings, which is fine 
> most of the time, but it would be better to support rendering into Python types.
> It can be very useful if the template system supports rendering into native 
> Python types like list, dictionary, etc., especially when using XCom to pass 
> values between operators.
> Jinja2 supports this feature from 2.10; more info can be found here: 
> http://jinja.pocoo.org/docs/2.10/nativetypes/
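
A minimal, standalone sketch of the Jinja2 feature referenced above (requires jinja2 >= 2.10; not Airflow-specific):

{code}
from jinja2.nativetypes import NativeEnvironment

env = NativeEnvironment()
result = env.from_string('{{ [1, 2] + [3] }}').render()
print(type(result), result)  # <class 'list'> [1, 2, 3]
{code}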



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Updated] (AIRFLOW-2961) Speed up test_backfill_examples test

2019-03-07 Thread Ash Berlin-Taylor (JIRA)


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-2961?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Ash Berlin-Taylor updated AIRFLOW-2961:
---
Fix Version/s: (was: 2.0.0)
   1.10.3

> Speed up test_backfill_examples test
> 
>
> Key: AIRFLOW-2961
> URL: https://issues.apache.org/jira/browse/AIRFLOW-2961
> Project: Apache Airflow
>  Issue Type: Bug
>Reporter: Fokko Driesprong
>Priority: Major
> Fix For: 1.10.3
>
>




--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Updated] (AIRFLOW-2843) ExternalTaskSensor: Add option to cease waiting immediately if the external DAG/task doesn't exist

2019-03-07 Thread Ash Berlin-Taylor (JIRA)


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-2843?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Ash Berlin-Taylor updated AIRFLOW-2843:
---
Fix Version/s: 1.10.3

> ExternalTaskSensor: Add option to cease waiting immediately if the external 
> DAG/task doesn't exist
> --
>
> Key: AIRFLOW-2843
> URL: https://issues.apache.org/jira/browse/AIRFLOW-2843
> Project: Apache Airflow
>  Issue Type: Improvement
>  Components: operators
>Reporter: Xiaodong DENG
>Assignee: Xiaodong DENG
>Priority: Minor
> Fix For: 1.10.3
>
>
> h2. Background
> *ExternalTaskSensor* will keep waiting (given restrictions of retries, 
> poke_interval, etc), even if the external DAG/task specified doesn't exist at 
> all. In some cases, this waiting may still make sense as new DAG may backfill.
> But it may be good to provide an option to cease waiting immediately if the 
> external DAG/task specified doesn't exist.
> h2. Proposal
> Provide an argument "check_existence". Set to *True* to check if the external 
> DAG/task exists, and immediately cease waiting if the external DAG/task does 
> not exist.
> The default value is set to *False* (no check or ceasing will happen) so it 
> will not affect any existing DAGs or user expectation.
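
A sketch of how the proposed argument would be used (assumes a DAG object named dag and an upstream DAG/task already exist):

{code}
from airflow.sensors.external_task_sensor import ExternalTaskSensor

wait_for_upstream = ExternalTaskSensor(
    task_id='wait_for_upstream',
    external_dag_id='upstream_dag',
    external_task_id='final_task',
    check_existence=True,  # cease waiting immediately if the DAG/task does not exist
    poke_interval=60,
    dag=dag,
)
{code}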



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Resolved] (AIRFLOW-3600) Remove dagBag from trigger call

2019-03-07 Thread Ash Berlin-Taylor (JIRA)


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-3600?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Ash Berlin-Taylor resolved AIRFLOW-3600.

   Resolution: Fixed
Fix Version/s: (was: 2.0.0)
   1.10.3

> Remove dagBag from trigger call
> ---
>
> Key: AIRFLOW-3600
> URL: https://issues.apache.org/jira/browse/AIRFLOW-3600
> Project: Apache Airflow
>  Issue Type: Improvement
>  Components: webserver
>Affects Versions: 1.10.1
>Reporter: Peter van 't Hof
>Assignee: Peter van 't Hof
>Priority: Minor
> Fix For: 1.10.3
>
>




--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Reopened] (AIRFLOW-3600) Remove dagBag from trigger call

2019-03-07 Thread Ash Berlin-Taylor (JIRA)


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-3600?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Ash Berlin-Taylor reopened AIRFLOW-3600:


> Remove dagBag from trigger call
> ---
>
> Key: AIRFLOW-3600
> URL: https://issues.apache.org/jira/browse/AIRFLOW-3600
> Project: Apache Airflow
>  Issue Type: Improvement
>  Components: webserver
>Affects Versions: 1.10.1
>Reporter: Peter van 't Hof
>Assignee: Peter van 't Hof
>Priority: Minor
> Fix For: 2.0.0
>
>




--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[GitHub] [airflow] vsoch commented on issue #4846: [AIRFLOW-4030] adding start to singularity for airflow

2019-03-07 Thread GitBox
vsoch commented on issue #4846: [AIRFLOW-4030] adding start to singularity for 
airflow
URL: https://github.com/apache/airflow/pull/4846#issuecomment-470710098
 
 
   I'm not sure I understand - is the idea of "mock" that it doesn't actually 
call Singularity (on the host) to create the instance? It just checks that the 
right commands would go into the (never actually made) calls?
   
   I can join (another) Slack if it's absolutely necessary, but I already have 
at least 15, and it would be more reproducible / a better record to have our 
discussion here with the PR (and version control!).


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [airflow] mik-laj commented on issue #4846: [AIRFLOW-4030] adding start to singularity for airflow

2019-03-07 Thread GitBox
mik-laj commented on issue #4846: [AIRFLOW-4030] adding start to singularity 
for airflow
URL: https://github.com/apache/airflow/pull/4846#issuecomment-470708776
 
 
   @vsoch You can write two types of tests:
   
   * integration tests,
   * unit tests.
   
   In integration tests, you check the behaviour of your application together with all 
its components. Your application should work as a closed box. 
   In unit tests, you check the behaviour of only one class, e.g. you check whether 
this class calls another class with the appropriate parameters. You should 
mock all other classes via `unittest.mock`. Your class should work as a closed 
box.
   
   The selection of the appropriate type of test depends on the required level 
of stability and the complexity of the problem. In simple cases, **it is sufficient 
to use only unit tests**. You do not have to check everything if you trust the 
documentation of another project/class.
   
   I develop operators for GCP, so I also write another type of test - system 
tests. In this case, I write tests for my code, but the code also communicates with 
other applications that I do not manage. It communicates with the real Google 
Cloud Platform.
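
   For what it's worth, a minimal, self-contained sketch of such a unit test (the 
operator and command below are purely illustrative, not the actual Singularity 
integration):

   ```python
   # Illustrative sketch: a fake operator that shells out to an external CLI,
   # and a unit test that mocks the call so nothing really runs.
   import subprocess
   import unittest
   from unittest import mock


   class StartInstanceOperator(object):
       """Stand-in for an operator wrapping an external command-line tool."""

       def __init__(self, image, name):
           self.image = image
           self.name = name

       def execute(self, context):
           subprocess.check_call(
               ['singularity', 'instance', 'start', self.image, self.name])


   class TestStartInstanceOperator(unittest.TestCase):
       @mock.patch('subprocess.check_call')
       def test_execute_builds_expected_command(self, check_call_mock):
           StartInstanceOperator('docker://busybox', 'my-instance').execute(context={})
           # Nothing was actually executed; we only assert on the call that would be made.
           check_call_mock.assert_called_once_with(
               ['singularity', 'instance', 'start', 'docker://busybox', 'my-instance'])


   if __name__ == '__main__':
       unittest.main()
   ```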


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [airflow] codecov-io edited a comment on issue #4685: [AIRFLOW-3862] Check types with mypy.

2019-03-07 Thread GitBox
codecov-io edited a comment on issue #4685: [AIRFLOW-3862] Check types with 
mypy.
URL: https://github.com/apache/airflow/pull/4685#issuecomment-462176915
 
 
   # [Codecov](https://codecov.io/gh/apache/airflow/pull/4685?src=pr=h1) 
Report
   > Merging 
[#4685](https://codecov.io/gh/apache/airflow/pull/4685?src=pr=desc) into 
[master](https://codecov.io/gh/apache/airflow/commit/f8dacae03340cb8423e37d7b053e7625a157f89e?src=pr=desc)
 will **increase** coverage by `0.02%`.
   > The diff coverage is `93.54%`.
   
   [![Impacted file tree 
graph](https://codecov.io/gh/apache/airflow/pull/4685/graphs/tree.svg?width=650=WdLKlKHOAU=150=pr)](https://codecov.io/gh/apache/airflow/pull/4685?src=pr=tree)
   
   ```diff
   @@Coverage Diff @@
   ##   master#4685  +/-   ##
   ==
   + Coverage   75.28%   75.31%   +0.02% 
   ==
 Files 450  450  
 Lines   2902629056  +30 
   ==
   + Hits2185221883  +31 
   + Misses   7174 7173   -1
   ```
   
   
   | [Impacted 
Files](https://codecov.io/gh/apache/airflow/pull/4685?src=pr=tree) | 
Coverage Δ | |
   |---|---|---|
   | 
[airflow/contrib/sensors/python\_sensor.py](https://codecov.io/gh/apache/airflow/pull/4685/diff?src=pr=tree#diff-YWlyZmxvdy9jb250cmliL3NlbnNvcnMvcHl0aG9uX3NlbnNvci5weQ==)
 | `85% <ø> (-0.72%)` | :arrow_down: |
   | 
[...rflow/contrib/operators/kubernetes\_pod\_operator.py](https://codecov.io/gh/apache/airflow/pull/4685/diff?src=pr=tree#diff-YWlyZmxvdy9jb250cmliL29wZXJhdG9ycy9rdWJlcm5ldGVzX3BvZF9vcGVyYXRvci5weQ==)
 | `98.61% <ø> (-0.06%)` | :arrow_down: |
   | 
[airflow/operators/dummy\_operator.py](https://codecov.io/gh/apache/airflow/pull/4685/diff?src=pr=tree#diff-YWlyZmxvdy9vcGVyYXRvcnMvZHVtbXlfb3BlcmF0b3IucHk=)
 | `100% <ø> (ø)` | :arrow_up: |
   | 
[airflow/operators/subdag\_operator.py](https://codecov.io/gh/apache/airflow/pull/4685/diff?src=pr=tree#diff-YWlyZmxvdy9vcGVyYXRvcnMvc3ViZGFnX29wZXJhdG9yLnB5)
 | `90% <ø> (-0.33%)` | :arrow_down: |
   | 
[airflow/operators/python\_operator.py](https://codecov.io/gh/apache/airflow/pull/4685/diff?src=pr=tree#diff-YWlyZmxvdy9vcGVyYXRvcnMvcHl0aG9uX29wZXJhdG9yLnB5)
 | `95.8% <ø> (-0.03%)` | :arrow_down: |
   | 
[airflow/operators/dagrun\_operator.py](https://codecov.io/gh/apache/airflow/pull/4685/diff?src=pr=tree#diff-YWlyZmxvdy9vcGVyYXRvcnMvZGFncnVuX29wZXJhdG9yLnB5)
 | `94.59% <ø> (-0.15%)` | :arrow_down: |
   | 
[...low/contrib/example\_dags/example\_winrm\_operator.py](https://codecov.io/gh/apache/airflow/pull/4685/diff?src=pr=tree#diff-YWlyZmxvdy9jb250cmliL2V4YW1wbGVfZGFncy9leGFtcGxlX3dpbnJtX29wZXJhdG9yLnB5)
 | `0% <0%> (ø)` | :arrow_up: |
   | 
[...contrib/example\_dags/example\_gcs\_to\_bq\_operator.py](https://codecov.io/gh/apache/airflow/pull/4685/diff?src=pr=tree#diff-YWlyZmxvdy9jb250cmliL2V4YW1wbGVfZGFncy9leGFtcGxlX2djc190b19icV9vcGVyYXRvci5weQ==)
 | `0% <0%> (ø)` | :arrow_up: |
   | 
[airflow/www/views.py](https://codecov.io/gh/apache/airflow/pull/4685/diff?src=pr=tree#diff-YWlyZmxvdy93d3cvdmlld3MucHk=)
 | `76.47% <0%> (ø)` | :arrow_up: |
   | 
[airflow/operators/check\_operator.py](https://codecov.io/gh/apache/airflow/pull/4685/diff?src=pr=tree#diff-YWlyZmxvdy9vcGVyYXRvcnMvY2hlY2tfb3BlcmF0b3IucHk=)
 | `58.59% <100%> (+0.32%)` | :arrow_up: |
   | ... and [32 
more](https://codecov.io/gh/apache/airflow/pull/4685/diff?src=pr=tree-more) 
| |
   
   --
   
   [Continue to review full report at 
Codecov](https://codecov.io/gh/apache/airflow/pull/4685?src=pr=continue).
   > **Legend** - [Click here to learn 
more](https://docs.codecov.io/docs/codecov-delta)
   > `Δ = absolute  (impact)`, `ø = not affected`, `? = missing data`
   > Powered by 
[Codecov](https://codecov.io/gh/apache/airflow/pull/4685?src=pr=footer). 
Last update 
[f8dacae...a14f72f](https://codecov.io/gh/apache/airflow/pull/4685?src=pr=lastupdated).
 Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments).
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [airflow] jmcarp commented on issue #4828: [AIRFLOW-4002] Optionally open debugger on errors in airflow test.

2019-03-07 Thread GitBox
jmcarp commented on issue #4828: [AIRFLOW-4002] Optionally open debugger on 
errors in airflow test.
URL: https://github.com/apache/airflow/pull/4828#issuecomment-470693508
 
 
   Anything else to revise here? I think I addressed everyone's feedback.


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [airflow] vsoch commented on issue #4846: [AIRFLOW-4030] adding start to singularity for airflow

2019-03-07 Thread GitBox
vsoch commented on issue #4846: [AIRFLOW-4030] adding start to singularity for 
airflow
URL: https://github.com/apache/airflow/pull/4846#issuecomment-470691659
 
 
   How are we supposed to test the operator without having Singularity 
installed?


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[jira] [Commented] (AIRFLOW-3885) Improve Travis buildtime

2019-03-07 Thread Ash Berlin-Taylor (JIRA)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-3885?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16787174#comment-16787174
 ] 

Ash Berlin-Taylor commented on AIRFLOW-3885:


Thanks, trying with that one https://travis-ci.org/ashb/airflow/builds/503289084

However the error I'm seeing doesn't line up with that.

{code}
7) ERROR: test_delete_dag_dag_still_in_dagbag 
(tests.api.common.experimental.test_delete_dag.TestDeleteDAGCatchError)
--
   Traceback (most recent call last):
tests/api/common/experimental/test_delete_dag.py line 62 in 
test_delete_dag_dag_still_in_dagbag
  delete_dag(self.dag_id)
airflow/utils/db.py line 73 in wrapper
  return func(*args, **kwargs)
airflow/api/common/experimental/delete_dag.py line 40 in delete_dag
  raise DagNotFound("Dag id {} not found".format(dag_id))
   DagNotFound: Dag id example_bash_operator not found
{code}

I think I'm missing something else.

> Improve Travis buildtime
> 
>
> Key: AIRFLOW-3885
> URL: https://issues.apache.org/jira/browse/AIRFLOW-3885
> Project: Apache Airflow
>  Issue Type: Bug
>  Components: travis
>Affects Versions: 1.10.2
>Reporter: Drew Sonne
>Assignee: Drew Sonne
>Priority: Major
> Fix For: 1.10.3
>
>
> * Remove the "install" action on the "pre-test" stage to avoid performing 
> lengthy Docker pulls to perform pre-checks
>  * Set nosetests to return on any test failures
> ** Given the lengthy runtime of the airflow CI test suites, if any tests 
> fail, we should fail immediately and return the failed test locally. Users 
> can run the tests locally to get full lists of failed tests.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[GitHub] [airflow] jmcarp commented on a change in pull request #4685: [AIRFLOW-3862] Check types with mypy.

2019-03-07 Thread GitBox
jmcarp commented on a change in pull request #4685: [AIRFLOW-3862] Check types 
with mypy.
URL: https://github.com/apache/airflow/pull/4685#discussion_r263557675
 
 

 ##
 File path: airflow/models/__init__.py
 ##
 @@ -1021,9 +1027,8 @@ def are_dependents_done(self, session=None):
 count = ti[0][0]
 return count == len(task.downstream_task_ids)
 
-@property
 @provide_session
-def previous_ti(self, session=None):
+def get_previous_ti(self, session=None):
 
 Review comment:
   I agree, revised to make the new methods private and put docstrings in the 
public properties.


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [airflow] feng-tao commented on issue #4860: [AIRFLOW-XXXX] create user in quick start

2019-03-07 Thread GitBox
feng-tao commented on issue #4860: [AIRFLOW-] create user in quick start
URL: https://github.com/apache/airflow/pull/4860#issuecomment-470684195
 
 
   @r39132 good point, maybe change the note to something like:
   ```
   if you build it with 1.10.2 or 1.10.1 version, use `airflow create_user ...`
   If you build it with master, use `airflow user -c` something
   ```


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [airflow] feng-tao commented on issue #4860: [AIRFLOW-XXXX] create user in quick start

2019-03-07 Thread GitBox
feng-tao commented on issue #4860: [AIRFLOW-] create user in quick start
URL: https://github.com/apache/airflow/pull/4860#issuecomment-470684354
 
 
   Depends on whether @ashb cherry-picks that PR.


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[jira] [Commented] (AIRFLOW-3885) Improve Travis buildtime

2019-03-07 Thread Tao Feng (JIRA)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-3885?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16787168#comment-16787168
 ] 

Tao Feng commented on AIRFLOW-3885:
---

[~ashb], you need this PR ([https://github.com/apache/airflow/pull/4737]) as 
well.

> Improve Travis buildtime
> 
>
> Key: AIRFLOW-3885
> URL: https://issues.apache.org/jira/browse/AIRFLOW-3885
> Project: Apache Airflow
>  Issue Type: Bug
>  Components: travis
>Affects Versions: 1.10.2
>Reporter: Drew Sonne
>Assignee: Drew Sonne
>Priority: Major
> Fix For: 1.10.3
>
>
> * Remove the "install" action on the "pre-test" stage to avoid performing 
> lengthy Docker pulls to perform pre-checks
>  * Set nosetests to return on any test failures
> ** Given the lengthy runtime of the airflow CI test suites, if any tests 
> fail, we should fail immediately and return the failed test locally. Users 
> can run the tests locally to get full lists of failed tests.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Created] (AIRFLOW-4042) resources parameter is not documented in KubernetesPodOperator

2019-03-07 Thread laurent (JIRA)
laurent created AIRFLOW-4042:


 Summary: resources parameter is not documented in 
KubernetesPodOperator
 Key: AIRFLOW-4042
 URL: https://issues.apache.org/jira/browse/AIRFLOW-4042
 Project: Apache Airflow
  Issue Type: Bug
  Components: contrib
Affects Versions: 1.10.2
Reporter: laurent


This operator takes a resources parameter to specify resources for a Pod, but 
this is not mentioned in the pydoc documentation and hence is missing from the 
online documentation.
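
For illustration, a sketch of the undocumented parameter in use (the accepted keys come 
from the contrib Resources class and are an assumption here; they may differ by version):

{code}
from airflow.contrib.operators.kubernetes_pod_operator import KubernetesPodOperator

task = KubernetesPodOperator(
    task_id='resource_limited_pod',
    name='resource-limited-pod',
    namespace='default',
    image='python:3.6',
    cmds=['python', '-c', 'print("hello")'],
    resources={
        'request_cpu': '200m',
        'request_memory': '256Mi',
        'limit_cpu': '500m',
        'limit_memory': '512Mi',
    },
    dag=dag,  # assumes a DAG object named dag already exists
)
{code}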



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (AIRFLOW-3885) Improve Travis buildtime

2019-03-07 Thread Ash Berlin-Taylor (JIRA)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-3885?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16787127#comment-16787127
 ] 

Ash Berlin-Taylor commented on AIRFLOW-3885:


Just tried cherry picking it again: 
https://travis-ci.org/ashb/airflow/builds/503259016

I'll know if it passes in about an hour or not.

> Improve Travis buildtime
> 
>
> Key: AIRFLOW-3885
> URL: https://issues.apache.org/jira/browse/AIRFLOW-3885
> Project: Apache Airflow
>  Issue Type: Bug
>  Components: travis
>Affects Versions: 1.10.2
>Reporter: Drew Sonne
>Assignee: Drew Sonne
>Priority: Major
> Fix For: 1.10.3
>
>
> * Remove the "install" action on the "pre-test" stage to avoid performing 
> lengthy Docker pulls to perform pre-checks
>  * Set nosetests to return on any test failures
> ** Given the lengthy runtime of the airflow CI test suites, if any tests 
> fail, we should fail immediately and return the failed test locally. Users 
> can run the tests locally to get full lists of failed tests.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (AIRFLOW-2993) Addition of S3_to_SFTP and SFTP_to_S3 Operators

2019-03-07 Thread ASF subversion and git services (JIRA)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-2993?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16787071#comment-16787071
 ] 

ASF subversion and git services commented on AIRFLOW-2993:
--

Commit 6f485248ad8cb9182694ebaf1cc1e75b6c98cff7 in airflow's branch 
refs/heads/v1-10-stable from wmorris75
[ https://gitbox.apache.org/repos/asf?p=airflow.git;h=6f48524 ]

[AIRFLOW-2993] s3_to_sftp and sftp_to_s3 operators (#3828)

Add operators for transferring files between s3 and sftp.

> Addition of S3_to_SFTP  and SFTP_to_S3 Operators
> 
>
> Key: AIRFLOW-2993
> URL: https://issues.apache.org/jira/browse/AIRFLOW-2993
> Project: Apache Airflow
>  Issue Type: New Feature
>  Components: operators
>Affects Versions: 1.9.0
>Reporter: Wayne Morris
>Assignee: Wayne Morris
>Priority: Major
> Fix For: 1.10.3
>
>
> New features enable transferring of files or data from S3 to a SFTP remote 
> path and SFTP to S3 path. 



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (AIRFLOW-3933) Fix various typos

2019-03-07 Thread ASF subversion and git services (JIRA)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-3933?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16787075#comment-16787075
 ] 

ASF subversion and git services commented on AIRFLOW-3933:
--

Commit 8b339481cec14e5ee240d4b31f8543ea3210c35a in airflow's branch 
refs/heads/v1-10-stable from Ryan Yuan
[ https://gitbox.apache.org/repos/asf?p=airflow.git;h=8b33948 ]

[AIRFLOW-3933] Fix various typos (#4747)

Fix typos


> Fix various typos
> -
>
> Key: AIRFLOW-3933
> URL: https://issues.apache.org/jira/browse/AIRFLOW-3933
> Project: Apache Airflow
>  Issue Type: Improvement
>Affects Versions: 1.10.2
>Reporter: Ryan Yuan
>Assignee: Ryan Yuan
>Priority: Trivial
> Fix For: 1.10.3
>
>
> Fix various typos



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (AIRFLOW-2993) Addition of S3_to_SFTP and SFTP_to_S3 Operators

2019-03-07 Thread ASF subversion and git services (JIRA)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-2993?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16787076#comment-16787076
 ] 

ASF subversion and git services commented on AIRFLOW-2993:
--

Commit c67a3960f25cf31501dd80546ed56f686b5b4637 in airflow's branch 
refs/heads/v1-10-stable from wmorris75
[ https://gitbox.apache.org/repos/asf?p=airflow.git;h=c67a396 ]

[AIRFLOW-2993] s3_to_sftp and sftp_to_s3 operators (#3828)

Add operators for transferring files between s3 and sftp.

> Addition of S3_to_SFTP  and SFTP_to_S3 Operators
> 
>
> Key: AIRFLOW-2993
> URL: https://issues.apache.org/jira/browse/AIRFLOW-2993
> Project: Apache Airflow
>  Issue Type: New Feature
>  Components: operators
>Affects Versions: 1.9.0
>Reporter: Wayne Morris
>Assignee: Wayne Morris
>Priority: Major
> Fix For: 1.10.3
>
>
> New features enable transferring of files or data from S3 to a SFTP remote 
> path and SFTP to S3 path. 



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (AIRFLOW-3905) Allow using parameters for sql statement in SqlSensor

2019-03-07 Thread ASF subversion and git services (JIRA)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-3905?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16787073#comment-16787073
 ] 

ASF subversion and git services commented on AIRFLOW-3905:
--

Commit 4f277394eaba0f37733671faa21a8ba9a2e0df97 in airflow's branch 
refs/heads/v1-10-stable from Xiaodong
[ https://gitbox.apache.org/repos/asf?p=airflow.git;h=4f27739 ]

[AIRFLOW-3905] Allow using "parameters" in SqlSensor (#4723)

* [AIRFLOW-3905] Allow 'parameters' in SqlSensor

* Add check on conn_type & add test

Not all SQL-related connections are supported by SqlSensor,
due to limitation in Connection model and hook implementation.


> Allow using parameters for sql statement in SqlSensor
> -
>
> Key: AIRFLOW-3905
> URL: https://issues.apache.org/jira/browse/AIRFLOW-3905
> Project: Apache Airflow
>  Issue Type: Improvement
>  Components: operators
>Affects Versions: 1.10.2
>Reporter: Xiaodong DENG
>Assignee: Xiaodong DENG
>Priority: Minor
> Fix For: 1.10.3
>
>
> In most SQL-related operators/sensors, argument `parameters` is available to 
> help render SQL command conveniently. But this is not available in SqlSensor 
> yet.
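
A sketch of the new usage (assumes a DAG object named dag and a connection my_postgres already exist):

{code}
from airflow.sensors.sql_sensor import SqlSensor

wait_for_rows = SqlSensor(
    task_id='wait_for_rows',
    conn_id='my_postgres',
    sql='SELECT COUNT(1) FROM events WHERE ds = %(ds)s',
    parameters={'ds': '2019-03-07'},
    poke_interval=60,
    dag=dag,
)
{code}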



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (AIRFLOW-2985) Operators for S3 object copying/deleting [boto3.client.copy_object()/delete_object()]

2019-03-07 Thread ASF subversion and git services (JIRA)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-2985?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16787077#comment-16787077
 ] 

ASF subversion and git services commented on AIRFLOW-2985:
--

Commit 1744c0afeaf48244c5d4ee7fcd9b5698b908cd48 in airflow's branch 
refs/heads/v1-10-stable from Xiaodong
[ https://gitbox.apache.org/repos/asf?p=airflow.git;h=1744c0a ]

[AIRFLOW-2985] Operators for S3 object copying/deleting (#3823)

1. Copying:
Under the hood, it's `boto3.client.copy_object()`.
It can only handle the situation in which the
S3 connection used can access both source and
destination bucket/key.

2. Deleting:
2.1 Under the hood, it's `boto3.client.delete_objects()`.
It supports either deleting one single object or
multiple objects.
2.2 If users try to delete a non-existent object, the
request will still succeed, but there will be an
entry 'Errors' in the response. There may also be
other reasons which may cause similar 'Errors' (
request itself would succeed without explicit
exception). So an argument `silent_on_errors` is added
to let users decide if this sort of 'Errors' should
fail the operator.

The corresponding methods are added into S3Hook, and
these two operators are 'wrappers' of these methods.
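
For context, a plain boto3 sketch of what happens under the hood (bucket and key names are placeholders):

{code}
import boto3

s3 = boto3.client('s3')

# Copy: the connection must be able to read the source and write the destination.
s3.copy_object(
    Bucket='dest-bucket',
    Key='dest/key.csv',
    CopySource={'Bucket': 'src-bucket', 'Key': 'src/key.csv'},
)

# Delete: one request can remove one or many objects. The request itself can
# succeed while individual objects fail, so check the 'Errors' entry.
response = s3.delete_objects(
    Bucket='dest-bucket',
    Delete={'Objects': [{'Key': 'dest/key.csv'}]},
)
if response.get('Errors'):
    raise RuntimeError('Some objects could not be deleted: %s' % response['Errors'])
{code}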


> Operators for S3 object copying/deleting 
> [boto3.client.copy_object()/delete_object()]
> -
>
> Key: AIRFLOW-2985
> URL: https://issues.apache.org/jira/browse/AIRFLOW-2985
> Project: Apache Airflow
>  Issue Type: New Feature
>  Components: operators
>Reporter: Xiaodong DENG
>Assignee: Xiaodong DENG
>Priority: Minor
> Fix For: 1.10.3
>
>
> Currently we don't have an operator in Airflow to help copy/delete objects 
> within S3, while they may be quite common use case when we deal with the data 
> in S3.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (AIRFLOW-3905) Allow using parameters for sql statement in SqlSensor

2019-03-07 Thread ASF subversion and git services (JIRA)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-3905?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16787074#comment-16787074
 ] 

ASF subversion and git services commented on AIRFLOW-3905:
--

Commit 4f277394eaba0f37733671faa21a8ba9a2e0df97 in airflow's branch 
refs/heads/v1-10-stable from Xiaodong
[ https://gitbox.apache.org/repos/asf?p=airflow.git;h=4f27739 ]

[AIRFLOW-3905] Allow using "parameters" in SqlSensor (#4723)

* [AIRFLOW-3905] Allow 'parameters' in SqlSensor

* Add check on conn_type & add test

Not all SQL-related connections are supported by SqlSensor,
due to limitation in Connection model and hook implementation.


> Allow using parameters for sql statement in SqlSensor
> -
>
> Key: AIRFLOW-3905
> URL: https://issues.apache.org/jira/browse/AIRFLOW-3905
> Project: Apache Airflow
>  Issue Type: Improvement
>  Components: operators
>Affects Versions: 1.10.2
>Reporter: Xiaodong DENG
>Assignee: Xiaodong DENG
>Priority: Minor
> Fix For: 1.10.3
>
>
> In most SQL-related operators/sensors, argument `parameters` is available to 
> help render SQL command conveniently. But this is not available in SqlSensor 
> yet.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (AIRFLOW-3905) Allow using parameters for sql statement in SqlSensor

2019-03-07 Thread ASF subversion and git services (JIRA)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-3905?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16787069#comment-16787069
 ] 

ASF subversion and git services commented on AIRFLOW-3905:
--

Commit 7069279aee3a614dbd7bc748dfee4e8bc6323398 in airflow's branch 
refs/heads/v1-10-stable from Xiaodong
[ https://gitbox.apache.org/repos/asf?p=airflow.git;h=7069279 ]

[AIRFLOW-3905] Allow using "parameters" in SqlSensor (#4723)

* [AIRFLOW-3905] Allow 'parameters' in SqlSensor

* Add check on conn_type & add test

Not all SQL-related connections are supported by SqlSensor,
due to limitation in Connection model and hook implementation.


> Allow using parameters for sql statement in SqlSensor
> -
>
> Key: AIRFLOW-3905
> URL: https://issues.apache.org/jira/browse/AIRFLOW-3905
> Project: Apache Airflow
>  Issue Type: Improvement
>  Components: operators
>Affects Versions: 1.10.2
>Reporter: Xiaodong DENG
>Assignee: Xiaodong DENG
>Priority: Minor
> Fix For: 1.10.3
>
>
> In most SQL-related operators/sensors, argument `parameters` is available to 
> help render SQL command conveniently. But this is not available in SqlSensor 
> yet.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (AIRFLOW-2985) Operators for S3 object copying/deleting [boto3.client.copy_object()/delete_object()]

2019-03-07 Thread ASF subversion and git services (JIRA)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-2985?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16787072#comment-16787072
 ] 

ASF subversion and git services commented on AIRFLOW-2985:
--

Commit ef931056f9e9caa9c8d05dcf28677762fb14e553 in airflow's branch 
refs/heads/v1-10-stable from Xiaodong
[ https://gitbox.apache.org/repos/asf?p=airflow.git;h=ef93105 ]

[AIRFLOW-2985] Operators for S3 object copying/deleting (#3823)

1. Copying:
Under the hood, it's `boto3.client.copy_object()`.
It can only handle the situation in which the
S3 connection used can access both source and
destination bucket/key.

2. Deleting:
2.1 Under the hood, it's `boto3.client.delete_objects()`.
It supports either deleting one single object or
multiple objects.
2.2 If users try to delete a non-existent object, the
request will still succeed, but there will be an
entry 'Errors' in the response. There may also be
other reasons which may cause similar 'Errors' (
request itself would succeed without explicit
exception). So an argument `silent_on_errors` is added
to let users decide if this sort of 'Errors' should
fail the operator.

The corresponding methods are added into S3Hook, and
these two operators are 'wrappers' of these methods.


> Operators for S3 object copying/deleting 
> [boto3.client.copy_object()/delete_object()]
> -
>
> Key: AIRFLOW-2985
> URL: https://issues.apache.org/jira/browse/AIRFLOW-2985
> Project: Apache Airflow
>  Issue Type: New Feature
>  Components: operators
>Reporter: Xiaodong DENG
>Assignee: Xiaodong DENG
>Priority: Minor
> Fix For: 1.10.3
>
>
> Currently we don't have an operator in Airflow to help copy/delete objects 
> within S3, while they may be quite common use case when we deal with the data 
> in S3.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (AIRFLOW-3905) Allow using parameters for sql statement in SqlSensor

2019-03-07 Thread ASF subversion and git services (JIRA)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-3905?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16787068#comment-16787068
 ] 

ASF subversion and git services commented on AIRFLOW-3905:
--

Commit 7069279aee3a614dbd7bc748dfee4e8bc6323398 in airflow's branch 
refs/heads/v1-10-stable from Xiaodong
[ https://gitbox.apache.org/repos/asf?p=airflow.git;h=7069279 ]

[AIRFLOW-3905] Allow using "parameters" in SqlSensor (#4723)

* [AIRFLOW-3905] Allow 'parameters' in SqlSensor

* Add check on conn_type & add test

Not all SQL-related connections are supported by SqlSensor,
due to limitation in Connection model and hook implementation.


> Allow using parameters for sql statement in SqlSensor
> -
>
> Key: AIRFLOW-3905
> URL: https://issues.apache.org/jira/browse/AIRFLOW-3905
> Project: Apache Airflow
>  Issue Type: Improvement
>  Components: operators
>Affects Versions: 1.10.2
>Reporter: Xiaodong DENG
>Assignee: Xiaodong DENG
>Priority: Minor
> Fix For: 1.10.3
>
>
> In most SQL-related operators/sensors, argument `parameters` is available to 
> help render SQL command conveniently. But this is not available in SqlSensor 
> yet.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Resolved] (AIRFLOW-3834) Remove DagBag from /log

2019-03-07 Thread Ash Berlin-Taylor (JIRA)


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-3834?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Ash Berlin-Taylor resolved AIRFLOW-3834.

   Resolution: Done
Fix Version/s: 2.0.0

> Remove DagBag from /log
> ---
>
> Key: AIRFLOW-3834
> URL: https://issues.apache.org/jira/browse/AIRFLOW-3834
> Project: Apache Airflow
>  Issue Type: Improvement
>  Components: webserver
>Reporter: Peter van 't Hof
>Assignee: Peter van 't Hof
>Priority: Major
> Fix For: 2.0.0
>
>




--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (AIRFLOW-3981) Make Airflow UI timezone aware

2019-03-07 Thread Andrew Stahlman (JIRA)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-3981?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16787047#comment-16787047
 ] 

Andrew Stahlman commented on AIRFLOW-3981:
--

Yeah, using moment to convert the time should be doable - I'll take a stab at it 
if I get some time this weekend. Unfortunately, I don't see a way to customize 
the formatting of the selected date to include the time zone, which could be a 
little confusing since all of the other timestamps in the UI would include the 
UTC offset.

> Make Airflow UI timezone aware
> --
>
> Key: AIRFLOW-3981
> URL: https://issues.apache.org/jira/browse/AIRFLOW-3981
> Project: Apache Airflow
>  Issue Type: Improvement
>Reporter: Tao Feng
>Assignee: Andrew Stahlman
>Priority: Major
>




--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[GitHub] [airflow] feng-tao commented on issue #4849: [AIRFLOW-XXX] Avoid spamming the log from Airflow security manager

2019-03-07 Thread GitBox
feng-tao commented on issue #4849: [AIRFLOW-XXX] Avoid spamming the log from 
Airflow security manager
URL: https://github.com/apache/airflow/pull/4849#issuecomment-470625869
 
 
   ping @Fokko @ashb 


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [airflow] coufon commented on issue #4791: [AIRFLOW-3908] Add more Google Cloud Vision operators

2019-03-07 Thread GitBox
coufon commented on issue #4791:  [AIRFLOW-3908] Add more Google Cloud Vision 
operators
URL: https://github.com/apache/airflow/pull/4791#issuecomment-470623177
 
 
   LGTM


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [airflow] mik-laj commented on a change in pull request #4787: [AIRFLOW-3967] Extract Jinja directive from Javascript

2019-03-07 Thread GitBox
mik-laj commented on a change in pull request #4787: [AIRFLOW-3967] Extract 
Jinja directive from Javascript
URL: https://github.com/apache/airflow/pull/4787#discussion_r263337590
 
 

 ##
 File path: airflow/www/templates/airflow/dags.html
 ##
 @@ -256,11 +270,7 @@ DAGs
   });
 
   var $input = $(".typeahead");
-  unique_options_search = new Set([
-  {% for token in auto_complete_data %}
-"{{token}}",
-  {% endfor %}
-]);
+  unique_options_search = new Set(getMetaValue('tokens').split(','));
 
 Review comment:
   I would like to rewrite it to use Ajax in a separate PR, to speed up page load. Not 
everyone uses the search box, but everyone has all the data loaded, even when 
there is a lot of it.


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [airflow] ffinfo commented on issue #4872: [AIRFLOW-4038] Restructure database queries on /home

2019-03-07 Thread GitBox
ffinfo commented on issue #4872: [AIRFLOW-4038] Restructure database queries on 
/home
URL: https://github.com/apache/airflow/pull/4872#issuecomment-470598941
 
 
   @XD-DENG: Almost; indeed the main thing is the switch from `provide_session` 
to `create_session`. The order of statements also changes a bit, because 
dags_query was built up over multiple statements with the error query in between; 
this is now grouped.
   
   About the switch from provide_session to create_session: @Fokko and I did a 
little research, and using create_session keeps the session locked to a scope for 
less time. `@provide_session` on nested sessions can keep the session open for a 
long time when it is not really needed. Of course this is bigger than just this call, 
but I try to convert things where I can. I was targeting this method to remove the 
DagBag from the webserver, but I noticed that this call has no usage of DagBag. 
By that time I had already changed it to create_session ;)
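
   A simplified sketch of the difference (not the actual /home view code):

   ```python
   # Simplified sketch, not the real Airflow view code.
   from airflow.models import DagModel
   from airflow.utils.db import create_session, provide_session


   @provide_session
   def count_dags_decorated(session=None):
       # The decorator provides a session for the whole call (and reuses an
       # outer one when nested), so it can stay open longer than needed.
       return session.query(DagModel).count()


   def count_dags_scoped():
       # The context manager bounds the session to exactly this block and
       # commits/closes it as soon as the block exits.
       with create_session() as session:
           return session.query(DagModel).count()
   ```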


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [airflow] mik-laj commented on issue #4860: [AIRFLOW-XXXX] create user in quick start

2019-03-07 Thread GitBox
mik-laj commented on issue #4860: [AIRFLOW-] create user in quick start
URL: https://github.com/apache/airflow/pull/4860#issuecomment-470596757
 
 
   I think you should also update the `docs/security.rst` file. This statement is 
there: 
   > By default, all gates are opened. 


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [airflow] mik-laj commented on a change in pull request #4787: [AIRFLOW-3967] Extract Jinja directive from Javascript

2019-03-07 Thread GitBox
mik-laj commented on a change in pull request #4787: [AIRFLOW-3967] Extract 
Jinja directive from Javascript
URL: https://github.com/apache/airflow/pull/4787#discussion_r263338149
 
 

 ##
 File path: airflow/www/templates/airflow/dag_code.html
 ##
 @@ -18,18 +18,25 @@
 {% extends "airflow/dag.html" %}
 {% block title %}Airflow - DAGs{% endblock %}
 
+{% block head_css %}
+  {{ super() }}
+  
+{% endblock %}
+
 {% block content %}
 {{ super() }}
 {{ title }}
 {{ html_code|safe }}
 {% endblock %}
 
 {% block tail %}
+
 {{ super() }}
 

[jira] [Commented] (AIRFLOW-3885) Improve Travis buildtime

2019-03-07 Thread Andrew Stahlman (JIRA)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-3885?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16786936#comment-16786936
 ] 

Andrew Stahlman commented on AIRFLOW-3885:
--

[~ashb] if you can point me to a failed Travis run or tell me how to recreate 
your merges/cherry-picks then I can help take a look.

> Improve Travis buildtime
> 
>
> Key: AIRFLOW-3885
> URL: https://issues.apache.org/jira/browse/AIRFLOW-3885
> Project: Apache Airflow
>  Issue Type: Bug
>  Components: travis
>Affects Versions: 1.10.2
>Reporter: Drew Sonne
>Assignee: Drew Sonne
>Priority: Major
> Fix For: 1.10.3
>
>
> * Remove the "install" action on the "pre-test" stage to avoid performing 
> lengthy Docker pulls to perform pre-checks
>  * Set nosetests to return on any test failures
> ** Given the lengthy runtime of the airflow CI test suites, if any tests 
> fail, we should fail immediately and return the failed test locally. Users 
> can run the tests locally to get full lists of failed tests.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[GitHub] [airflow] codecov-io edited a comment on issue #4863: [AIRFLOW-3841] Remove DagBag from /tree

2019-03-07 Thread GitBox
codecov-io edited a comment on issue #4863: [AIRFLOW-3841] Remove DagBag from 
/tree
URL: https://github.com/apache/airflow/pull/4863#issuecomment-470475904
 
 
   # [Codecov](https://codecov.io/gh/apache/airflow/pull/4863?src=pr=h1) 
Report
   > Merging 
[#4863](https://codecov.io/gh/apache/airflow/pull/4863?src=pr=desc) into 
[master](https://codecov.io/gh/apache/airflow/commit/68c66a1f8ae2398bbfc8b9556fbf72b7becb?src=pr=desc)
 will **decrease** coverage by `0.01%`.
   > The diff coverage is `91.66%`.
   
   [![Impacted file tree 
graph](https://codecov.io/gh/apache/airflow/pull/4863/graphs/tree.svg?width=650=WdLKlKHOAU=150=pr)](https://codecov.io/gh/apache/airflow/pull/4863?src=pr=tree)
   
   ```diff
   @@Coverage Diff @@
   ##   master#4863  +/-   ##
   ==
   - Coverage75.3%   75.28%   -0.02% 
   ==
 Files 450  450  
 Lines   2902029027   +7 
   ==
 Hits2185321853  
   - Misses   7167 7174   +7
   ```
   
   
   | [Impacted 
Files](https://codecov.io/gh/apache/airflow/pull/4863?src=pr=tree) | 
Coverage Δ | |
   |---|---|---|
   | 
[airflow/models/\_\_init\_\_.py](https://codecov.io/gh/apache/airflow/pull/4863/diff?src=pr=tree#diff-YWlyZmxvdy9tb2RlbHMvX19pbml0X18ucHk=)
 | `92.81% <90.9%> (-0.05%)` | :arrow_down: |
   | 
[airflow/www/views.py](https://codecov.io/gh/apache/airflow/pull/4863/diff?src=pr=tree#diff-YWlyZmxvdy93d3cvdmlld3MucHk=)
 | `76.45% <92.3%> (-0.02%)` | :arrow_down: |
   | 
[airflow/contrib/auth/backends/ldap\_auth.py](https://codecov.io/gh/apache/airflow/pull/4863/diff?src=pr=tree#diff-YWlyZmxvdy9jb250cmliL2F1dGgvYmFja2VuZHMvbGRhcF9hdXRoLnB5)
 | `0% <0%> (ø)` | :arrow_up: |
   
   --
   
   [Continue to review full report at 
Codecov](https://codecov.io/gh/apache/airflow/pull/4863?src=pr=continue).
   > **Legend** - [Click here to learn 
more](https://docs.codecov.io/docs/codecov-delta)
   > `Δ = absolute  (impact)`, `ø = not affected`, `? = missing data`
   > Powered by 
[Codecov](https://codecov.io/gh/apache/airflow/pull/4863?src=pr=footer). 
Last update 
[68c66a1...1aa50be](https://codecov.io/gh/apache/airflow/pull/4863?src=pr=lastupdated).
 Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments).
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services




[jira] [Commented] (AIRFLOW-3009) Python 3.7 import collections.abc deprecation warning

2019-03-07 Thread ASF subversion and git services (JIRA)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-3009?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16786901#comment-16786901
 ] 

ASF subversion and git services commented on AIRFLOW-3009:
--

Commit 231e4c44af0adfcf6aca8b9157b3c950a12897b1 in airflow's branch 
refs/heads/v1-10-stable from Francis Lalonde
[ https://gitbox.apache.org/repos/asf?p=airflow.git;h=231e4c4 ]

[AIRFLOW-3009] Import Hashable from collection.abc to fix Python 3.7 
deprecation warning (#3849)
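
Roughly, the fix boils down to importing Hashable in a version-compatible way (a sketch, not the exact diff):

{code}
try:
    from collections.abc import Hashable  # Python 3.3+
except ImportError:
    from collections import Hashable      # Python 2
{code}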


> Python 3.7 import collections.abc deprecation warning 
> --
>
> Key: AIRFLOW-3009
> URL: https://issues.apache.org/jira/browse/AIRFLOW-3009
> Project: Apache Airflow
>  Issue Type: Bug
>  Components: core
>Affects Versions: 2.0.0
> Environment: Arch Linux, Python 3.7
>Reporter: Francis Lalonde
>Priority: Minor
>
> After pip-installing Airflow from source, the following warning message 
> appears upon entering any airflow command from the prompt:
> {{/usr/lib/python3.7/site-packages/airflow/models.py:29: DeprecationWarning: 
> Using or importing the ABCs from 'collections' instead of from 
> 'collections.abc' is deprecated, and in 3.8 it will stop working}}{{from 
> collections import namedtuple, defaultdict, Hashable}}
>  



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[GitHub] [airflow] XD-DENG edited a comment on issue #4873: [AIRFLOW-4037] In SimpleHttpOperator response should be logged even if the response check fails

2019-03-07 Thread GitBox
XD-DENG edited a comment on issue #4873: [AIRFLOW-4037] In SimpleHttpOperator 
response should be logged even if the response check fails
URL: https://github.com/apache/airflow/pull/4873#issuecomment-470579593
 
 
   LGTM (other than that the commit subject could have been a bit shorter).


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[jira] [Commented] (AIRFLOW-2231) DAG with a relativedelta schedule_interval fails

2019-03-07 Thread ASF subversion and git services (JIRA)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-2231?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16786900#comment-16786900
 ] 

ASF subversion and git services commented on AIRFLOW-2231:
--

Commit f68938fd629ada00f14f10ea27e094e28e7247a8 in airflow's branch 
refs/heads/v1-10-stable from brookskd
[ https://gitbox.apache.org/repos/asf?p=airflow.git;h=f68938f ]

[AIRFLOW-2231] Fix relativedelta DAG schedule_interval (#3174)

Fixes issues when specifying a DAG with a schedule_interval of type 
relativedelta.


> DAG with a relativedelta schedule_interval fails
> 
>
> Key: AIRFLOW-2231
> URL: https://issues.apache.org/jira/browse/AIRFLOW-2231
> Project: Apache Airflow
>  Issue Type: Bug
>  Components: DAG
>Reporter: Kyle Brooks
>Priority: Major
> Fix For: 1.10.3
>
> Attachments: test_reldel.py
>
>
> The documentation for the DAG class says using 
> dateutil.relativedelta.relativedelta as a schedule_interval is supported but 
> it fails:
>  
> Traceback (most recent call last):
>   File "/usr/local/lib/python3.6/site-packages/airflow/models.py", line 285, 
> in process_file
>     m = imp.load_source(mod_name, filepath)
>   File 
> "/usr/local/Cellar/python3/3.6.1/Frameworks/Python.framework/Versions/3.6/lib/python3.6/imp.py",
>  line 172, in load_source
>     module = _load(spec)
>   File "", line 675, in _load
>   File "", line 655, in _load_unlocked
>   File "", line 678, in exec_module
>   File "", line 205, in _call_with_frames_removed
>   File "/Users/k398995/airflow/dags/test_reldel.py", line 33, in 
>     dagrun_timeout=timedelta(minutes=60))
>   File "/usr/local/lib/python3.6/site-packages/airflow/models.py", line 2914, 
> in __init__
>     if schedule_interval in cron_presets:
> TypeError: unhashable type: 'relativedelta'
>  
> It looks like the __init__ function for class DAG assumes the 
> schedule_interval is hashable.
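
A minimal reproduction sketch (mirrors the attached test_reldel.py in spirit; the DAG id and dates are placeholders):

{code}
from datetime import datetime, timedelta
from dateutil.relativedelta import relativedelta
from airflow import DAG

dag = DAG(
    dag_id='reldel_example',
    start_date=datetime(2019, 1, 1),
    schedule_interval=relativedelta(months=1),  # raised TypeError before the fix
    dagrun_timeout=timedelta(minutes=60),
)
{code}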



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[GitHub] [airflow] XD-DENG commented on issue #4873: [AIRFLOW-4037] In SimpleHttpOperator response should be logged even if the response check fails

2019-03-07 Thread GitBox
XD-DENG commented on issue #4873: [AIRFLOW-4037] In SimpleHttpOperator response 
should be logged even if the response check fails
URL: https://github.com/apache/airflow/pull/4873#issuecomment-470579593
 
 
   LGTM.


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [airflow] kaxil commented on issue #4857: [AIRFLOW-XXX] Improve airflow-jira script to make RelManager's life easier

2019-03-07 Thread GitBox
kaxil commented on issue #4857: [AIRFLOW-XXX] Improve airflow-jira script to 
make RelManager's life easier
URL: https://github.com/apache/airflow/pull/4857#issuecomment-470578862
 
 
   Had a quick look. LGTM. Last day of my annual leave :-D 


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [airflow] codecov-io commented on issue #4873: [AIRFLOW-4037] In SimpleHttpOperator response should be logged even if the response check fails

2019-03-07 Thread GitBox
codecov-io commented on issue #4873: [AIRFLOW-4037] In SimpleHttpOperator 
response should be logged even if the response check fails
URL: https://github.com/apache/airflow/pull/4873#issuecomment-470573662
 
 
   # [Codecov](https://codecov.io/gh/apache/airflow/pull/4873?src=pr=h1) 
Report
   > Merging 
[#4873](https://codecov.io/gh/apache/airflow/pull/4873?src=pr=desc) into 
[master](https://codecov.io/gh/apache/airflow/commit/3dd79558b65fd0f5ae98ae1ab330afdf4f4c1840?src=pr=desc)
 will **increase** coverage by `<.01%`.
   > The diff coverage is `100%`.
   
   [![Impacted file tree 
graph](https://codecov.io/gh/apache/airflow/pull/4873/graphs/tree.svg?width=650=WdLKlKHOAU=150=pr)](https://codecov.io/gh/apache/airflow/pull/4873?src=pr=tree)
   
   ```diff
    @@            Coverage Diff             @@
    ##           master    #4873      +/-   ##
    ==========================================
    + Coverage   75.28%   75.29%    +<.01%
    ==========================================
      Files         450      450
      Lines       29026    29026
    ==========================================
    + Hits        21853    21854        +1
    + Misses       7173     7172        -1
   ```
   
   
   | [Impacted 
Files](https://codecov.io/gh/apache/airflow/pull/4873?src=pr=tree) | 
Coverage Δ | |
   |---|---|---|
   | 
[airflow/operators/http\_operator.py](https://codecov.io/gh/apache/airflow/pull/4873/diff?src=pr=tree#diff-YWlyZmxvdy9vcGVyYXRvcnMvaHR0cF9vcGVyYXRvci5weQ==)
 | `96.77% <100%> (+3.22%)` | :arrow_up: |
   
   --
   
   [Continue to review full report at 
Codecov](https://codecov.io/gh/apache/airflow/pull/4873?src=pr=continue).
   > **Legend** - [Click here to learn 
more](https://docs.codecov.io/docs/codecov-delta)
   > `Δ = absolute  (impact)`, `ø = not affected`, `? = missing data`
   > Powered by 
[Codecov](https://codecov.io/gh/apache/airflow/pull/4873?src=pr=footer). 
Last update 
[3dd7955...adc47d7](https://codecov.io/gh/apache/airflow/pull/4873?src=pr=lastupdated).
 Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments).
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [airflow] codecov-io edited a comment on issue #4873: [AIRFLOW-4037] In SimpleHttpOperator response should be logged even if the response check fails

2019-03-07 Thread GitBox
codecov-io edited a comment on issue #4873: [AIRFLOW-4037] In 
SimpleHttpOperator response should be logged even if the response check fails
URL: https://github.com/apache/airflow/pull/4873#issuecomment-470573662
 
 
   # [Codecov](https://codecov.io/gh/apache/airflow/pull/4873?src=pr=h1) 
Report
   > Merging 
[#4873](https://codecov.io/gh/apache/airflow/pull/4873?src=pr=desc) into 
[master](https://codecov.io/gh/apache/airflow/commit/3dd79558b65fd0f5ae98ae1ab330afdf4f4c1840?src=pr=desc)
 will **increase** coverage by `<.01%`.
   > The diff coverage is `100%`.
   
   [![Impacted file tree 
graph](https://codecov.io/gh/apache/airflow/pull/4873/graphs/tree.svg?width=650=WdLKlKHOAU=150=pr)](https://codecov.io/gh/apache/airflow/pull/4873?src=pr=tree)
   
   ```diff
    @@            Coverage Diff             @@
    ##           master    #4873      +/-   ##
    ==========================================
    + Coverage   75.28%   75.29%    +<.01%
    ==========================================
      Files         450      450
      Lines       29026    29026
    ==========================================
    + Hits        21853    21854        +1
    + Misses       7173     7172        -1
   ```
   
   
   | [Impacted 
Files](https://codecov.io/gh/apache/airflow/pull/4873?src=pr=tree) | 
Coverage Δ | |
   |---|---|---|
   | 
[airflow/operators/http\_operator.py](https://codecov.io/gh/apache/airflow/pull/4873/diff?src=pr=tree#diff-YWlyZmxvdy9vcGVyYXRvcnMvaHR0cF9vcGVyYXRvci5weQ==)
 | `96.77% <100%> (+3.22%)` | :arrow_up: |
   
   --
   
   [Continue to review full report at 
Codecov](https://codecov.io/gh/apache/airflow/pull/4873?src=pr=continue).
   > **Legend** - [Click here to learn 
more](https://docs.codecov.io/docs/codecov-delta)
   > `Δ = absolute  (impact)`, `ø = not affected`, `? = missing data`
   > Powered by 
[Codecov](https://codecov.io/gh/apache/airflow/pull/4873?src=pr=footer). 
Last update 
[3dd7955...adc47d7](https://codecov.io/gh/apache/airflow/pull/4873?src=pr=lastupdated).
 Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments).
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[jira] [Assigned] (AIRFLOW-3833) Remove DagBag from /get_logs_with_metadata

2019-03-07 Thread Peter van 't Hof (JIRA)


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-3833?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Peter van 't Hof reassigned AIRFLOW-3833:
-

Assignee: Julian de Ruiter  (was: Peter van 't Hof)

> Remove DagBag from /get_logs_with_metadata
> --
>
> Key: AIRFLOW-3833
> URL: https://issues.apache.org/jira/browse/AIRFLOW-3833
> Project: Apache Airflow
>  Issue Type: Improvement
>  Components: webserver
>Reporter: Peter van 't Hof
>Assignee: Julian de Ruiter
>Priority: Major
>




--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Work started] (AIRFLOW-3833) Remove DagBag from /get_logs_with_metadata

2019-03-07 Thread Peter van 't Hof (JIRA)


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-3833?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Work on AIRFLOW-3833 started by Peter van 't Hof.
-
> Remove DagBag from /get_logs_with_metadata
> --
>
> Key: AIRFLOW-3833
> URL: https://issues.apache.org/jira/browse/AIRFLOW-3833
> Project: Apache Airflow
>  Issue Type: Improvement
>  Components: webserver
>Reporter: Peter van 't Hof
>Assignee: Peter van 't Hof
>Priority: Major
>




--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (AIRFLOW-3834) Remove DagBag from /log

2019-03-07 Thread ASF subversion and git services (JIRA)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-3834?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16786868#comment-16786868
 ] 

ASF subversion and git services commented on AIRFLOW-3834:
--

Commit f8dacae03340cb8423e37d7b053e7625a157f89e in airflow's branch 
refs/heads/master from Peter van 't Hof
[ https://gitbox.apache.org/repos/asf?p=airflow.git;h=f8dacae ]

[AIRFLOW-3834] Remove dagbag from /log (#4841)



> Remove DagBag from /log
> ---
>
> Key: AIRFLOW-3834
> URL: https://issues.apache.org/jira/browse/AIRFLOW-3834
> Project: Apache Airflow
>  Issue Type: Improvement
>  Components: webserver
>Reporter: Peter van 't Hof
>Assignee: Peter van 't Hof
>Priority: Major
>




--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[GitHub] [airflow] codecov-io commented on issue #4872: [AIRFLOW-4038] Restructure database queries on /home

2019-03-07 Thread GitBox
codecov-io commented on issue #4872: [AIRFLOW-4038] Restructure database 
queries on /home
URL: https://github.com/apache/airflow/pull/4872#issuecomment-470564577
 
 
   # [Codecov](https://codecov.io/gh/apache/airflow/pull/4872?src=pr=h1) 
Report
   > Merging 
[#4872](https://codecov.io/gh/apache/airflow/pull/4872?src=pr=desc) into 
[master](https://codecov.io/gh/apache/airflow/commit/3dd79558b65fd0f5ae98ae1ab330afdf4f4c1840?src=pr=desc)
 will **decrease** coverage by `<.01%`.
   > The diff coverage is `87.5%`.
   
   [![Impacted file tree 
graph](https://codecov.io/gh/apache/airflow/pull/4872/graphs/tree.svg?width=650=WdLKlKHOAU=150=pr)](https://codecov.io/gh/apache/airflow/pull/4872?src=pr=tree)
   
   ```diff
    @@            Coverage Diff             @@
    ##           master    #4872      +/-   ##
    ==========================================
    - Coverage   75.28%   75.28%    -0.01%
    ==========================================
      Files         450      450
      Lines       29026    29023        -3
    ==========================================
    - Hits        21853    21849        -4
    - Misses       7173     7174        +1
   ```
   
   
   | [Impacted 
Files](https://codecov.io/gh/apache/airflow/pull/4872?src=pr=tree) | 
Coverage Δ | |
   |---|---|---|
   | 
[airflow/www/views.py](https://codecov.io/gh/apache/airflow/pull/4872/diff?src=pr=tree#diff-YWlyZmxvdy93d3cvdmlld3MucHk=)
 | `76.41% <87.5%> (-0.06%)` | :arrow_down: |
   | 
[airflow/models/\_\_init\_\_.py](https://codecov.io/gh/apache/airflow/pull/4872/diff?src=pr=tree#diff-YWlyZmxvdy9tb2RlbHMvX19pbml0X18ucHk=)
 | `92.81% <0%> (-0.06%)` | :arrow_down: |
   
   --
   
   [Continue to review full report at 
Codecov](https://codecov.io/gh/apache/airflow/pull/4872?src=pr=continue).
   > **Legend** - [Click here to learn 
more](https://docs.codecov.io/docs/codecov-delta)
   > `Δ = absolute  (impact)`, `ø = not affected`, `? = missing data`
   > Powered by 
[Codecov](https://codecov.io/gh/apache/airflow/pull/4872?src=pr=footer). 
Last update 
[3dd7955...90b2b1b](https://codecov.io/gh/apache/airflow/pull/4872?src=pr=lastupdated).
 Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments).
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [airflow] codecov-io commented on issue #4872: [AIRFLOW-4038] Restructure database queries on /home

2019-03-07 Thread GitBox
codecov-io commented on issue #4872: [AIRFLOW-4038] Restructure database 
queries on /home
URL: https://github.com/apache/airflow/pull/4872#issuecomment-470564582
 
 
   # [Codecov](https://codecov.io/gh/apache/airflow/pull/4872?src=pr=h1) 
Report
   > Merging 
[#4872](https://codecov.io/gh/apache/airflow/pull/4872?src=pr=desc) into 
[master](https://codecov.io/gh/apache/airflow/commit/3dd79558b65fd0f5ae98ae1ab330afdf4f4c1840?src=pr=desc)
 will **decrease** coverage by `<.01%`.
   > The diff coverage is `87.5%`.
   
   [![Impacted file tree 
graph](https://codecov.io/gh/apache/airflow/pull/4872/graphs/tree.svg?width=650=WdLKlKHOAU=150=pr)](https://codecov.io/gh/apache/airflow/pull/4872?src=pr=tree)
   
   ```diff
    @@            Coverage Diff             @@
    ##           master    #4872      +/-   ##
    ==========================================
    - Coverage   75.28%   75.28%    -0.01%
    ==========================================
      Files         450      450
      Lines       29026    29023        -3
    ==========================================
    - Hits        21853    21849        -4
    - Misses       7173     7174        +1
   ```
   
   
   | [Impacted 
Files](https://codecov.io/gh/apache/airflow/pull/4872?src=pr=tree) | 
Coverage Δ | |
   |---|---|---|
   | 
[airflow/www/views.py](https://codecov.io/gh/apache/airflow/pull/4872/diff?src=pr=tree#diff-YWlyZmxvdy93d3cvdmlld3MucHk=)
 | `76.41% <87.5%> (-0.06%)` | :arrow_down: |
   | 
[airflow/models/\_\_init\_\_.py](https://codecov.io/gh/apache/airflow/pull/4872/diff?src=pr=tree#diff-YWlyZmxvdy9tb2RlbHMvX19pbml0X18ucHk=)
 | `92.81% <0%> (-0.06%)` | :arrow_down: |
   
   --
   
   [Continue to review full report at 
Codecov](https://codecov.io/gh/apache/airflow/pull/4872?src=pr=continue).
   > **Legend** - [Click here to learn 
more](https://docs.codecov.io/docs/codecov-delta)
   > `Δ = absolute  (impact)`, `ø = not affected`, `? = missing data`
   > Powered by 
[Codecov](https://codecov.io/gh/apache/airflow/pull/4872?src=pr=footer). 
Last update 
[3dd7955...90b2b1b](https://codecov.io/gh/apache/airflow/pull/4872?src=pr=lastupdated).
 Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments).
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[jira] [Work started] (AIRFLOW-3843) Remove DagBag from /tries

2019-03-07 Thread Peter van 't Hof (JIRA)


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-3843?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Work on AIRFLOW-3843 started by Peter van 't Hof.
-
> Remove DagBag from /tries
> -
>
> Key: AIRFLOW-3843
> URL: https://issues.apache.org/jira/browse/AIRFLOW-3843
> Project: Apache Airflow
>  Issue Type: Improvement
>  Components: webserver
>Reporter: Peter van 't Hof
>Assignee: Peter van 't Hof
>Priority: Major
>




--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Assigned] (AIRFLOW-3832) Remove DagBag from /rendered

2019-03-07 Thread Peter van 't Hof (JIRA)


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-3832?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Peter van 't Hof reassigned AIRFLOW-3832:
-

Assignee: Julian de Ruiter  (was: Peter van 't Hof)

> Remove DagBag from /rendered
> 
>
> Key: AIRFLOW-3832
> URL: https://issues.apache.org/jira/browse/AIRFLOW-3832
> Project: Apache Airflow
>  Issue Type: Improvement
>  Components: webserver
>Reporter: Peter van 't Hof
>Assignee: Julian de Ruiter
>Priority: Major
>




--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (AIRFLOW-3834) Remove DagBag from /log

2019-03-07 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-3834?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16786867#comment-16786867
 ] 

ASF GitHub Bot commented on AIRFLOW-3834:
-

ashb commented on pull request #4841: [AIRFLOW-3834] Remove dagbag from /log
URL: https://github.com/apache/airflow/pull/4841
 
 
   
 

This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


> Remove DagBag from /log
> ---
>
> Key: AIRFLOW-3834
> URL: https://issues.apache.org/jira/browse/AIRFLOW-3834
> Project: Apache Airflow
>  Issue Type: Improvement
>  Components: webserver
>Reporter: Peter van 't Hof
>Assignee: Peter van 't Hof
>Priority: Major
>




--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Work started] (AIRFLOW-3846) Remove DagBag from /gantt

2019-03-07 Thread Peter van 't Hof (JIRA)


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-3846?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Work on AIRFLOW-3846 started by Peter van 't Hof.
-
> Remove DagBag from /gantt
> -
>
> Key: AIRFLOW-3846
> URL: https://issues.apache.org/jira/browse/AIRFLOW-3846
> Project: Apache Airflow
>  Issue Type: Improvement
>  Components: webserver
>Reporter: Peter van 't Hof
>Assignee: Peter van 't Hof
>Priority: Major
>




--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[GitHub] [airflow] ashb merged pull request #4841: [AIRFLOW-3834] Remove dagbag from /log

2019-03-07 Thread GitBox
ashb merged pull request #4841: [AIRFLOW-3834] Remove dagbag from /log
URL: https://github.com/apache/airflow/pull/4841
 
 
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[jira] [Assigned] (AIRFLOW-3846) Remove DagBag from /gantt

2019-03-07 Thread Peter van 't Hof (JIRA)


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-3846?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Peter van 't Hof reassigned AIRFLOW-3846:
-

Assignee: Julian de Ruiter  (was: Peter van 't Hof)

> Remove DagBag from /gantt
> -
>
> Key: AIRFLOW-3846
> URL: https://issues.apache.org/jira/browse/AIRFLOW-3846
> Project: Apache Airflow
>  Issue Type: Improvement
>  Components: webserver
>Reporter: Peter van 't Hof
>Assignee: Julian de Ruiter
>Priority: Major
>




--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[GitHub] [airflow] codecov-io commented on issue #4870: [AIRFLOW-3843] Remove DagBag from /tries.

2019-03-07 Thread GitBox
codecov-io commented on issue #4870: [AIRFLOW-3843] Remove DagBag from /tries.
URL: https://github.com/apache/airflow/pull/4870#issuecomment-470558956
 
 
   # [Codecov](https://codecov.io/gh/apache/airflow/pull/4870?src=pr=h1) 
Report
   > Merging 
[#4870](https://codecov.io/gh/apache/airflow/pull/4870?src=pr=desc) into 
[master](https://codecov.io/gh/apache/airflow/commit/3dd79558b65fd0f5ae98ae1ab330afdf4f4c1840?src=pr=desc)
 will **decrease** coverage by `<.01%`.
   > The diff coverage is `66.66%`.
   
   [![Impacted file tree 
graph](https://codecov.io/gh/apache/airflow/pull/4870/graphs/tree.svg?width=650=WdLKlKHOAU=150=pr)](https://codecov.io/gh/apache/airflow/pull/4870?src=pr=tree)
   
   ```diff
    @@            Coverage Diff             @@
    ##           master    #4870      +/-   ##
    ==========================================
    - Coverage   75.28%   75.28%    -0.01%
    ==========================================
      Files         450      450
      Lines       29026    29030        +4
    ==========================================
    + Hits        21853    21855        +2
    - Misses       7173     7175        +2
   ```
   
   
   | [Impacted 
Files](https://codecov.io/gh/apache/airflow/pull/4870?src=pr=tree) | 
Coverage Δ | |
   |---|---|---|
   | 
[airflow/www/views.py](https://codecov.io/gh/apache/airflow/pull/4870/diff?src=pr=tree#diff-YWlyZmxvdy93d3cvdmlld3MucHk=)
 | `76.39% <66.66%> (-0.08%)` | :arrow_down: |
   
   --
   
   [Continue to review full report at 
Codecov](https://codecov.io/gh/apache/airflow/pull/4870?src=pr=continue).
   > **Legend** - [Click here to learn 
more](https://docs.codecov.io/docs/codecov-delta)
   > `Δ = absolute  (impact)`, `ø = not affected`, `? = missing data`
   > Powered by 
[Codecov](https://codecov.io/gh/apache/airflow/pull/4870?src=pr=footer). 
Last update 
[3dd7955...9782b0c](https://codecov.io/gh/apache/airflow/pull/4870?src=pr=lastupdated).
 Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments).
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[jira] [Updated] (AIRFLOW-4039) Fix various issues on the /blocked endpoint

2019-03-07 Thread Bas Harenslak (JIRA)


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-4039?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Bas Harenslak updated AIRFLOW-4039:
---
Description: 
I think the /blocked endpoint could be improved in various ways:
 * Give more meaningful name. It returns a list of DAG ids and corresponding 
active dag runs & max dag runs. Could be named e.g. "max_dagruns".
 * The tooltip doesn't work for (running on Chrome 72). tooltip() is called in 
dags.html, but I only see the attribute when hovering over the schedule of a 
DAG.
 * The info is fetched and returned for all DAG runs, while there is pagination 
in the Airflow UI. Would make sense to fetch only the dagrun counts for the DAG 
ids on the shown page.
 * We should persist max_active_runs in the database to avoid re-parsing DAG 
files.

  was:
I think the /blocked endpoint could be improved in various ways:
 * Give more meaningful name. It returns a list of DAG ids and corresponding 
active dag runs & max dag runs. Could be named e.g. "max_dagruns".
 * The tooltip doesn't work for (running on Chrome 72). tooltip() is called in 
dags.html, but I only see the attribute when hovering over the schedule of a 
DAG.
 * The info is fetched and returned for all DAG runs, while there is pagination 
in the Airflow UI. Would make sense to fetch only the dagrun counts for the DAG 
ids on the shown page.


> Fix various issues on the /blocked endpoint
> ---
>
> Key: AIRFLOW-4039
> URL: https://issues.apache.org/jira/browse/AIRFLOW-4039
> Project: Apache Airflow
>  Issue Type: Improvement
>Reporter: Bas Harenslak
>Priority: Major
>
> I think the /blocked endpoint could be improved in various ways:
>  * Give more meaningful name. It returns a list of DAG ids and corresponding 
> active dag runs & max dag runs. Could be named e.g. "max_dagruns".
>  * The tooltip doesn't work for (running on Chrome 72). tooltip() is called 
> in dags.html, but I only see the attribute when hovering over the schedule of 
> a DAG.
>  * The info is fetched and returned for all DAG runs, while there is 
> pagination in the Airflow UI. Would make sense to fetch only the dagrun 
> counts for the DAG ids on the shown page.
>  * We should persist max_active_runs in the database to avoid re-parsing DAG 
> files.
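
On the third bullet, one way to scope the lookup to the DAG ids actually rendered on the page is a single grouped count query instead of touching the DagBag. A sketch only, assuming Airflow's `DagRun` model and the `provide_session` helper; the `page_dag_ids` argument is hypothetical:

```python
from sqlalchemy import func

from airflow.models import DagRun
from airflow.utils.db import provide_session
from airflow.utils.state import State


@provide_session
def active_dagrun_counts(page_dag_ids, session=None):
    """Return {dag_id: running dag run count} for the DAGs shown on the page."""
    query = (
        session.query(DagRun.dag_id, func.count(DagRun.dag_id))
        .filter(DagRun.state == State.RUNNING)
        .filter(DagRun.dag_id.in_(page_dag_ids))
        .group_by(DagRun.dag_id)
    )
    return dict(query.all())
```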



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Assigned] (AIRFLOW-3844) Remove DagBag from /landing_times

2019-03-07 Thread Peter van 't Hof (JIRA)


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-3844?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Peter van 't Hof reassigned AIRFLOW-3844:
-

Assignee: Julian de Ruiter  (was: Peter van 't Hof)

> Remove DagBag from /landing_times
> -
>
> Key: AIRFLOW-3844
> URL: https://issues.apache.org/jira/browse/AIRFLOW-3844
> Project: Apache Airflow
>  Issue Type: Improvement
>  Components: webserver
>Reporter: Peter van 't Hof
>Assignee: Julian de Ruiter
>Priority: Major
>




--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Work started] (AIRFLOW-3844) Remove DagBag from /landing_times

2019-03-07 Thread Peter van 't Hof (JIRA)


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-3844?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Work on AIRFLOW-3844 started by Peter van 't Hof.
-
> Remove DagBag from /landing_times
> -
>
> Key: AIRFLOW-3844
> URL: https://issues.apache.org/jira/browse/AIRFLOW-3844
> Project: Apache Airflow
>  Issue Type: Improvement
>  Components: webserver
>Reporter: Peter van 't Hof
>Assignee: Peter van 't Hof
>Priority: Major
>




--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Work started] (AIRFLOW-3832) Remove DagBag from /rendered

2019-03-07 Thread Peter van 't Hof (JIRA)


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-3832?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Work on AIRFLOW-3832 started by Peter van 't Hof.
-
> Remove DagBag from /rendered
> 
>
> Key: AIRFLOW-3832
> URL: https://issues.apache.org/jira/browse/AIRFLOW-3832
> Project: Apache Airflow
>  Issue Type: Improvement
>  Components: webserver
>Reporter: Peter van 't Hof
>Assignee: Peter van 't Hof
>Priority: Major
>




--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Work started] (AIRFLOW-3837) Remove DagBag from /blocked

2019-03-07 Thread Peter van 't Hof (JIRA)


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-3837?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Work on AIRFLOW-3837 started by Peter van 't Hof.
-
> Remove DagBag from /blocked
> ---
>
> Key: AIRFLOW-3837
> URL: https://issues.apache.org/jira/browse/AIRFLOW-3837
> Project: Apache Airflow
>  Issue Type: Improvement
>  Components: webserver
>Reporter: Peter van 't Hof
>Assignee: Peter van 't Hof
>Priority: Major
>




--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Assigned] (AIRFLOW-3837) Remove DagBag from /blocked

2019-03-07 Thread Peter van 't Hof (JIRA)


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-3837?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Peter van 't Hof reassigned AIRFLOW-3837:
-

Assignee: Bas Harenslak  (was: Peter van 't Hof)

> Remove DagBag from /blocked
> ---
>
> Key: AIRFLOW-3837
> URL: https://issues.apache.org/jira/browse/AIRFLOW-3837
> Project: Apache Airflow
>  Issue Type: Improvement
>  Components: webserver
>Reporter: Peter van 't Hof
>Assignee: Bas Harenslak
>Priority: Major
>




--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Assigned] (AIRFLOW-3842) Remove DagBag from /duration

2019-03-07 Thread Peter van 't Hof (JIRA)


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-3842?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Peter van 't Hof reassigned AIRFLOW-3842:
-

Assignee: Julian de Ruiter  (was: Peter van 't Hof)

> Remove DagBag from /duration
> 
>
> Key: AIRFLOW-3842
> URL: https://issues.apache.org/jira/browse/AIRFLOW-3842
> Project: Apache Airflow
>  Issue Type: Improvement
>  Components: webserver
>Reporter: Peter van 't Hof
>Assignee: Julian de Ruiter
>Priority: Major
>




--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Updated] (AIRFLOW-2985) Operators for S3 object copying/deleting [boto3.client.copy_object()/delete_object()]

2019-03-07 Thread Ash Berlin-Taylor (JIRA)


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-2985?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Ash Berlin-Taylor updated AIRFLOW-2985:
---
Fix Version/s: (was: 2.0.0)
   1.10.3

> Operators for S3 object copying/deleting 
> [boto3.client.copy_object()/delete_object()]
> -
>
> Key: AIRFLOW-2985
> URL: https://issues.apache.org/jira/browse/AIRFLOW-2985
> Project: Apache Airflow
>  Issue Type: Improvement
>  Components: operators
>Reporter: Xiaodong DENG
>Assignee: Xiaodong DENG
>Priority: Minor
> Fix For: 1.10.3
>
>
> Currently we don't have an operator in Airflow to help copy/delete objects 
> within S3, while these may be quite common use cases when we deal with the data 
> in S3.
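
The boto3 calls named in the title are the ones such operators would wrap; a short sketch with placeholder bucket and key names:

```python
import boto3

s3 = boto3.client("s3")

# Copy an object within S3 (server-side, no local download).
s3.copy_object(
    Bucket="dest-bucket",
    Key="dest/key.csv",
    CopySource={"Bucket": "source-bucket", "Key": "source/key.csv"},
)

# Delete the original object; combined with the copy this gives a "move".
s3.delete_object(Bucket="source-bucket", Key="source/key.csv")
```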



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Updated] (AIRFLOW-2993) Addition of S3_to_SFTP and SFTP_to_S3 Operators

2019-03-07 Thread Ash Berlin-Taylor (JIRA)


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-2993?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Ash Berlin-Taylor updated AIRFLOW-2993:
---
Fix Version/s: (was: 2.0.0)
   1.10.3

> Addition of S3_to_SFTP  and SFTP_to_S3 Operators
> 
>
> Key: AIRFLOW-2993
> URL: https://issues.apache.org/jira/browse/AIRFLOW-2993
> Project: Apache Airflow
>  Issue Type: New Feature
>  Components: operators
>Affects Versions: 1.9.0
>Reporter: Wayne Morris
>Assignee: Wayne Morris
>Priority: Major
> Fix For: 1.10.3
>
>
> New features enable transferring of files or data from S3 to a SFTP remote 
> path and SFTP to S3 path. 
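
A hedged sketch of the S3-to-SFTP direction using the existing hooks (connection ids, bucket, key, and remote path are placeholders; the actual operators may be structured differently):

```python
import tempfile

from airflow.contrib.hooks.ssh_hook import SSHHook
from airflow.hooks.S3_hook import S3Hook


def s3_to_sftp(s3_conn_id, bucket, key, sftp_conn_id, remote_path):
    s3_hook = S3Hook(aws_conn_id=s3_conn_id)
    ssh_hook = SSHHook(ssh_conn_id=sftp_conn_id)

    with tempfile.NamedTemporaryFile() as tmp:
        # Download the S3 object to a local temporary file.
        s3_hook.get_key(key, bucket_name=bucket).download_file(tmp.name)
        # Push the temporary file over SFTP.
        ssh_client = ssh_hook.get_conn()
        try:
            sftp_client = ssh_client.open_sftp()
            sftp_client.put(tmp.name, remote_path)
        finally:
            ssh_client.close()
```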



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Updated] (AIRFLOW-2985) Operators for S3 object copying/deleting [boto3.client.copy_object()/delete_object()]

2019-03-07 Thread Ash Berlin-Taylor (JIRA)


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-2985?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Ash Berlin-Taylor updated AIRFLOW-2985:
---
Issue Type: New Feature  (was: Improvement)

> Operators for S3 object copying/deleting 
> [boto3.client.copy_object()/delete_object()]
> -
>
> Key: AIRFLOW-2985
> URL: https://issues.apache.org/jira/browse/AIRFLOW-2985
> Project: Apache Airflow
>  Issue Type: New Feature
>  Components: operators
>Reporter: Xiaodong DENG
>Assignee: Xiaodong DENG
>Priority: Minor
> Fix For: 1.10.3
>
>
> Currently we don't have an operator in Airflow to help copy/delete objects 
> within S3, while these may be quite common use cases when we deal with the data 
> in S3.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (AIRFLOW-3751) LDAP - Malformed Schema

2019-03-07 Thread ASF subversion and git services (JIRA)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-3751?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16786835#comment-16786835
 ] 

ASF subversion and git services commented on AIRFLOW-3751:
--

Commit b8ad457bf88d97fbb78880eda8e5c36a0cf7542a in airflow's branch 
refs/heads/v1-10-stable from Colin
[ https://gitbox.apache.org/repos/asf?p=airflow.git;h=b8ad457 ]

[AIRFLOW-3751] Option to allow malformed schemas for LDAP authentication (#4574)



> LDAP - Malformed Schema
> ---
>
> Key: AIRFLOW-3751
> URL: https://issues.apache.org/jira/browse/AIRFLOW-3751
> Project: Apache Airflow
>  Issue Type: Bug
>  Components: authentication
>Affects Versions: 1.10.1
>Reporter: Colin Streicher
>Assignee: Colin Streicher
>Priority: Minor
> Fix For: 1.10.3
>
>
> This issue only appears to happen when using an LDAP server from which schema 
> is not available. This came up specifically when using Foxpass, but my 
> assumption is that this sort of thing is likely to happen for any LDAP as a 
> Service offering.
> Essentially, the issue is that the default setting for the ldap3 library is 
> to try to pull the schema from the server. From a normal ldap server, this is 
> just a call with a baseDN of '', however because of security 
> concerns (presumably), services like foxpass do not return anything when the 
> basedn is set to nothing.
> When the basedn is set to the normal search dn, there are no schema objects 
> returned. Since the get_info parameter in the Server() call validates the 
> schema by default, the call fails.
> In terms of fixing, this is pretty simple, adding a parameter that reflects 
> the setting in ldap3 that ignores this fixes the issue handily.
> In my dev environment, I made the following changes to ldap_auth.py
> {code:java}
> import ldap3
> ...
> def get_ldap_connection(dn=None, password=None):
> ...
> try:
> ignore_malformed_schema = configuration.conf.get("ldap", 
> "ignore_malformed_schema")
> except AirflowConfigException:
> pass
> if ignore_malformed_schema:
> 
> ldap3.set_config_parameter('IGNORE_MALFORMED_SCHEMA',ignore_malformed_schema)
> ...
> {code}
> Now, with AIRFLOW__LDAP__IGNORE_MALFORMED_SCHEMA=True, things work as 
> expected.
> I will open a PR for this, but before I do, I would welcome any feedback on 
> if this should be done, or if it should be done differently.
> Thank you in advance for any feedback.
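
A slightly more defensive variant of the snippet above, assuming the same config option; `getboolean` avoids treating the string "False" as truthy, and the flag defaults to off when the option is absent:

```python
import ldap3
from airflow import configuration
from airflow.exceptions import AirflowConfigException

try:
    ignore_malformed_schema = configuration.conf.getboolean(
        "ldap", "ignore_malformed_schema")
except AirflowConfigException:
    # Option not set in airflow.cfg: keep ldap3's default schema validation.
    ignore_malformed_schema = False

if ignore_malformed_schema:
    ldap3.set_config_parameter("IGNORE_MALFORMED_SCHEMA", True)
```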



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (AIRFLOW-2641) Fix MySqlToHiveTransfer to handle MySQL DECIMAL correctly

2019-03-07 Thread ASF subversion and git services (JIRA)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-2641?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16786836#comment-16786836
 ] 

ASF subversion and git services commented on AIRFLOW-2641:
--

Commit e7343a3db108f5aea8467c73ca9856819e130d73 in airflow's branch 
refs/heads/v1-10-stable from OmerJog
[ https://gitbox.apache.org/repos/asf?p=airflow.git;h=e7343a3 ]

[AIRFLOW-2641] Fix MySqlToHiveTransfer to handle MySQL DECIMAL correctly



> Fix MySqlToHiveTransfer to handle MySQL DECIMAL correctly
> -
>
> Key: AIRFLOW-2641
> URL: https://issues.apache.org/jira/browse/AIRFLOW-2641
> Project: Apache Airflow
>  Issue Type: Bug
>Reporter: Kengo Seki
>Priority: Major
> Fix For: 1.10.3
>
>
> This line
> {code:title=airflow/operators/mysql_to_hive.py}
>  98 @classmethod
>  99 def type_map(cls, mysql_type):
> 100 t = MySQLdb.constants.FIELD_TYPE
> 101 d = {
> 102 t.BIT: 'INT',
> 103 t.DECIMAL: 'DOUBLE',
> {code}
> perhaps intends to convert MySQL DECIMAL to Hive DOUBLE, but it doesn't work 
> as expected.
> {code}
> mysql> DESC t;
> +---+---+--+-+-+---+
> | Field | Type  | Null | Key | Default | Extra |
> +---+---+--+-+-+---+
> | c | decimal(10,0) | YES  | | NULL|   |
> +---+---+--+-+-+---+
> 1 row in set (0.00 sec)
> {code}
> {code}
> In [1]: from airflow.operators.mysql_to_hive import MySqlToHiveTransfer
> In [2]: t = MySqlToHiveTransfer(mysql_conn_id="airflow_db", sql="SELECT * 
> FROM t", hive_table="t", recreate=True, task_id="t", ignore_ti_state=True)
> In [3]: t.execute(None)
> [2018-06-18 23:37:53,193] {base_hook.py:83} INFO - Using connection to: 
> localhost
> [2018-06-18 23:37:53,199] {hive_hooks.py:429} INFO - DROP TABLE IF EXISTS t;
> CREATE TABLE IF NOT EXISTS t (
> c STRING)
> ROW FORMAT DELIMITED
> FIELDS TERMINATED BY ''
> STORED AS textfile
> ;
> (snip)
> [2018-06-18 23:38:25,048] {hive_hooks.py:235} INFO - Loading data to table 
> default.t
> [2018-06-18 23:38:25,866] {hive_hooks.py:235} INFO - Table default.t stats: 
> [numFiles=1, numRows=0, totalSize=0, rawDataSize=0]
> [2018-06-18 23:38:25,868] {hive_hooks.py:235} INFO - OK
> [2018-06-18 23:38:25,871] {hive_hooks.py:235} INFO - Time taken: 1.498 seconds
> {code}
> {code}
> $ hive -e 'DESC t'
> Logging initialized using configuration in 
> file:/etc/hive/conf.dist/hive-log4j.properties
> OK
> c   string
> Time taken: 2.342 seconds, Fetched: 1 row(s)
> {code}
> This is because {{MySQLdb.constants.FIELD_TYPE.DECIMAL}} does not stand for 
> DECIMAL type on MySQL 5.0+. {{MySQLdb.constants.FIELD_TYPE.NEWDECIMAL}} 
> should be used here.
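
A sketch of the corrected mapping described above: on MySQL 5.0+ the client reports DECIMAL columns as NEWDECIMAL, so both constants need an entry for the Hive DDL to come out as DOUBLE (abbreviated to the relevant keys):

```python
import MySQLdb.constants

t = MySQLdb.constants.FIELD_TYPE
mysql_to_hive_types = {
    t.BIT: 'INT',
    t.DECIMAL: 'DOUBLE',     # legacy pre-5.0 decimal type
    t.NEWDECIMAL: 'DOUBLE',  # what modern MySQL servers actually report
}
```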



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[GitHub] [airflow] codecov-io commented on issue #4868: [AIRFLOW-3836] Remove DagBag from /run

2019-03-07 Thread GitBox
codecov-io commented on issue #4868: [AIRFLOW-3836] Remove DagBag from /run
URL: https://github.com/apache/airflow/pull/4868#issuecomment-470548668
 
 
   # [Codecov](https://codecov.io/gh/apache/airflow/pull/4868?src=pr=h1) 
Report
   > Merging 
[#4868](https://codecov.io/gh/apache/airflow/pull/4868?src=pr=desc) into 
[master](https://codecov.io/gh/apache/airflow/commit/3dd79558b65fd0f5ae98ae1ab330afdf4f4c1840?src=pr=desc)
 will **decrease** coverage by `0.01%`.
   > The diff coverage is `57.14%`.
   
   [![Impacted file tree 
graph](https://codecov.io/gh/apache/airflow/pull/4868/graphs/tree.svg?width=650=WdLKlKHOAU=150=pr)](https://codecov.io/gh/apache/airflow/pull/4868?src=pr=tree)
   
   ```diff
    @@            Coverage Diff             @@
    ##           master    #4868      +/-   ##
    ==========================================
    - Coverage   75.28%   75.27%    -0.02%
    ==========================================
      Files         450      450
      Lines       29026    29030        +4
    ==========================================
      Hits        21853    21853
    - Misses       7173     7177        +4
   ```
   
   
   | [Impacted 
Files](https://codecov.io/gh/apache/airflow/pull/4868?src=pr=tree) | 
Coverage Δ | |
   |---|---|---|
   | 
[airflow/www/views.py](https://codecov.io/gh/apache/airflow/pull/4868/diff?src=pr=tree#diff-YWlyZmxvdy93d3cvdmlld3MucHk=)
 | `76.32% <57.14%> (-0.15%)` | :arrow_down: |
   | 
[airflow/models/\_\_init\_\_.py](https://codecov.io/gh/apache/airflow/pull/4868/diff?src=pr=tree#diff-YWlyZmxvdy9tb2RlbHMvX19pbml0X18ucHk=)
 | `92.81% <0%> (-0.06%)` | :arrow_down: |
   
   --
   
   [Continue to review full report at 
Codecov](https://codecov.io/gh/apache/airflow/pull/4868?src=pr=continue).
   > **Legend** - [Click here to learn 
more](https://docs.codecov.io/docs/codecov-delta)
   > `Δ = absolute  (impact)`, `ø = not affected`, `? = missing data`
   > Powered by 
[Codecov](https://codecov.io/gh/apache/airflow/pull/4868?src=pr=footer). 
Last update 
[3dd7955...187bbd6](https://codecov.io/gh/apache/airflow/pull/4868?src=pr=lastupdated).
 Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments).
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [airflow] r39132 commented on issue #4860: [AIRFLOW-XXXX] create user in quick start

2019-03-07 Thread GitBox
r39132 commented on issue #4860: [AIRFLOW-] create user in quick start
URL: https://github.com/apache/airflow/pull/4860#issuecomment-470547927
 
 
   @milton0825 so, this is for 2.0+. Till 2.0+, it seems the `airflow 
create_user` is the relevant API.
   
   @ashb @Fokko @feng-tao Are we planning to release `airflow users` in any 
release before 2.0? If so, should he change his API call?
   
   I'm assuming this is not specific to the new RBAC webserver and applies to 
our current webserver as well?


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services

