[GitHub] [airflow] codecov-io commented on issue #6138: [AIRFLOW-XXX] Fix typos in CONTRIBUTING.md

2019-09-18 Thread GitBox
codecov-io commented on issue #6138: [AIRFLOW-XXX] Fix typos in CONTRIBUTING.md
URL: https://github.com/apache/airflow/pull/6138#issuecomment-532533453
 
 
   # [Codecov](https://codecov.io/gh/apache/airflow/pull/6138?src=pr=h1) 
Report
   > Merging 
[#6138](https://codecov.io/gh/apache/airflow/pull/6138?src=pr=desc) into 
[master](https://codecov.io/gh/apache/airflow/commit/fa8e18a2a833a2774f48c413a72df052e8c8f4f0?src=pr=desc)
 will **decrease** coverage by `0.07%`.
   > The diff coverage is `n/a`.
   
   [![Impacted file tree 
graph](https://codecov.io/gh/apache/airflow/pull/6138/graphs/tree.svg?width=650=WdLKlKHOAU=150=pr)](https://codecov.io/gh/apache/airflow/pull/6138?src=pr=tree)
   
   ```diff
   @@            Coverage Diff             @@
   ##           master    #6138      +/-   ##
   ==========================================
   - Coverage   80.04%   79.97%   -0.08%     
   ==========================================
     Files         607      607             
     Lines       35019    35019             
   ==========================================
   - Hits        28032    28006     -26     
   - Misses       6987     7013     +26
   ```
   
   
   | [Impacted 
Files](https://codecov.io/gh/apache/airflow/pull/6138?src=pr=tree) | 
Coverage Δ | |
   |---|---|---|
   | 
[airflow/operators/postgres\_operator.py](https://codecov.io/gh/apache/airflow/pull/6138/diff?src=pr=tree#diff-YWlyZmxvdy9vcGVyYXRvcnMvcG9zdGdyZXNfb3BlcmF0b3IucHk=)
 | `0% <0%> (-100%)` | :arrow_down: |
   | 
[airflow/hooks/postgres\_hook.py](https://codecov.io/gh/apache/airflow/pull/6138/diff?src=pr=tree#diff-YWlyZmxvdy9ob29rcy9wb3N0Z3Jlc19ob29rLnB5)
 | `94.73% <0%> (-1.76%)` | :arrow_down: |
   | 
[airflow/utils/sqlalchemy.py](https://codecov.io/gh/apache/airflow/pull/6138/diff?src=pr=tree#diff-YWlyZmxvdy91dGlscy9zcWxhbGNoZW15LnB5)
 | `91.52% <0%> (-1.7%)` | :arrow_down: |
   | 
[airflow/hooks/dbapi\_hook.py](https://codecov.io/gh/apache/airflow/pull/6138/diff?src=pr=tree#diff-YWlyZmxvdy9ob29rcy9kYmFwaV9ob29rLnB5)
 | `86.44% <0%> (-1.7%)` | :arrow_down: |
   
   --
   
   [Continue to review full report at 
Codecov](https://codecov.io/gh/apache/airflow/pull/6138?src=pr=continue).
   > **Legend** - [Click here to learn 
more](https://docs.codecov.io/docs/codecov-delta)
   > `Δ = absolute <relative> (impact)`, `ø = not affected`, `? = missing data`
   > Powered by 
[Codecov](https://codecov.io/gh/apache/airflow/pull/6138?src=pr=footer). 
Last update 
[fa8e18a...c61c0b2](https://codecov.io/gh/apache/airflow/pull/6138?src=pr=lastupdated).
 Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments).
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [airflow] nuclearpinguin opened a new pull request #6139: [AIRFLOW-5513] Move example_pubsub_flow.py to GCP package

2019-09-18 Thread GitBox
nuclearpinguin opened a new pull request #6139: [AIRFLOW-5513] Move 
example_pubsub_flow.py to GCP package
URL: https://github.com/apache/airflow/pull/6139
 
 
   Make sure you have checked _all_ steps below.
   
   ### Jira
   
   - [ ] My PR addresses the following [Airflow 
Jira](https://issues.apache.org/jira/browse/AIRFLOW/) issues and references 
them in the PR title. For example, "\[AIRFLOW-XXX\] My Airflow PR"
 - https://issues.apache.org/jira/browse/AIRFLOW-5513
   
   ### Description
   
   - [ ] Here are some details about my PR, including screenshots of any UI 
changes:
   
   ### Tests
   
   - [ ] My PR adds the following unit tests __OR__ does not need testing for 
this extremely good reason:
   
   ### Commits
   
   - [ ] My commits all reference Jira issues in their subject lines, and I 
have squashed multiple commits if they address the same issue. In addition, my 
commits follow the guidelines from "[How to write a good git commit 
message](http://chris.beams.io/posts/git-commit/)":
 1. Subject is separated from body by a blank line
 1. Subject is limited to 50 characters (not including Jira issue reference)
 1. Subject does not end with a period
 1. Subject uses the imperative mood ("add", not "adding")
 1. Body wraps at 72 characters
 1. Body explains "what" and "why", not "how"
   
   ### Documentation
   
   - [ ] In case of new functionality, my PR adds documentation that describes 
how to use it.
 - All the public functions and the classes in the PR contain docstrings 
that explain what they do
 - If you implement backwards incompatible changes, please leave a note in 
the [Updating.md](https://github.com/apache/airflow/blob/master/UPDATING.md) so 
we can assign it to an appropriate release
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[jira] [Created] (AIRFLOW-5513) Move example_pubsub_flow.py to GCP package

2019-09-18 Thread Tomasz Urbaszek (Jira)
Tomasz Urbaszek created AIRFLOW-5513:


 Summary: Move example_pubsub_flow.py to GCP package
 Key: AIRFLOW-5513
 URL: https://issues.apache.org/jira/browse/AIRFLOW-5513
 Project: Apache Airflow
  Issue Type: Improvement
  Components: gcp
Affects Versions: 2.0.0
Reporter: Tomasz Urbaszek






--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[GitHub] [airflow] ashb commented on a change in pull request #6064: [AIRFLOW-5444] Fix action_logging so that request.form for POST is logged

2019-09-18 Thread GitBox
ashb commented on a change in pull request #6064: [AIRFLOW-5444] Fix 
action_logging so that request.form for POST is logged
URL: https://github.com/apache/airflow/pull/6064#discussion_r325564835
 
 

 ##
 File path: airflow/www/decorators.py
 ##
 @@ -39,17 +39,19 @@ def wrapper(*args, **kwargs):
             else:
                 user = g.user.username
 
+            params = request.form if request.method == 'POST' else request.args
+
             log = Log(
                 event=f.__name__,
                 task_instance=None,
                 owner=user,
-                extra=str(list(request.args.items())),
-                task_id=request.args.get('task_id'),
-                dag_id=request.args.get('dag_id'))
+                extra=str(list(params.items())),
 
 Review comment:
   Though I do worry that this will capture passwords and other sensitive info 
from the Variables form etc.


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [airflow] ashb commented on a change in pull request #6064: [AIRFLOW-5444] Fix action_logging so that request.form for POST is logged

2019-09-18 Thread GitBox
ashb commented on a change in pull request #6064: [AIRFLOW-5444] Fix 
action_logging so that request.form for POST is logged
URL: https://github.com/apache/airflow/pull/6064#discussion_r325564835
 
 

 ##
 File path: airflow/www/decorators.py
 ##
 @@ -39,17 +39,19 @@ def wrapper(*args, **kwargs):
             else:
                 user = g.user.username
 
+            params = request.form if request.method == 'POST' else request.args
+
             log = Log(
                 event=f.__name__,
                 task_instance=None,
                 owner=user,
-                extra=str(list(request.args.items())),
-                task_id=request.args.get('task_id'),
-                dag_id=request.args.get('dag_id'))
+                extra=str(list(params.items())),
 
 Review comment:
   Though I do worry that this will capture passwords and other sensitive info 
from the Connections form etc.


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [airflow] ashb commented on a change in pull request #6064: [AIRFLOW-5444] Fix action_logging so that request.form for POST is logged

2019-09-18 Thread GitBox
ashb commented on a change in pull request #6064: [AIRFLOW-5444] Fix 
action_logging so that request.form for POST is logged
URL: https://github.com/apache/airflow/pull/6064#discussion_r325564568
 
 

 ##
 File path: airflow/www/decorators.py
 ##
 @@ -39,17 +39,19 @@ def wrapper(*args, **kwargs):
             else:
                 user = g.user.username
 
+            params = request.form if request.method == 'POST' else request.args
+
             log = Log(
                 event=f.__name__,
                 task_instance=None,
                 owner=user,
-                extra=str(list(request.args.items())),
-                task_id=request.args.get('task_id'),
-                dag_id=request.args.get('dag_id'))
+                extra=str(list(params.items())),
 
 Review comment:
   Just use 
[request.values](https://werkzeug.readthedocs.io/en/0.15.x/wrappers/#werkzeug.wrappers.BaseRequest.values)
 instead.
   
   ```suggestion
   extra=str(list(request.values.items())),
   ```
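    As a hedged illustration of that suggestion: `request.values` is Werkzeug's 
    combined view over the query string and the form body, so one lookup covers 
    both GET and POST. The sketch below is illustrative only - the `print` 
    stand-in for the `Log` model and the `SENSITIVE_KEYS` redaction (addressing 
    the password concern raised above) are assumptions, not code from this PR.
    
    ```python
    # Minimal sketch: request.values merges request.args and request.form.
    # SENSITIVE_KEYS is a hypothetical deny-list; the real decorator writes a
    # Log row, print() merely stands in for it here.
    import functools
    
    from flask import g, request
    
    SENSITIVE_KEYS = {'password', 'conn_password'}
    
    def action_logging(f):
        @functools.wraps(f)
        def wrapper(*args, **kwargs):
            user = g.user.username if getattr(g, 'user', None) else 'anonymous'
            extra = [
                (k, '***' if k in SENSITIVE_KEYS else v)
                for k, v in request.values.items()   # query args + form body
            ]
            print(f"event={f.__name__} owner={user} extra={extra}")
            return f(*args, **kwargs)
        return wrapper
    ```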


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [airflow] eladkal commented on issue #5998: [AIRFLOW-5398] Update contrib example DAGs to context manager

2019-09-18 Thread GitBox
eladkal commented on issue #5998: [AIRFLOW-5398] Update contrib example DAGs to 
context manager
URL: https://github.com/apache/airflow/pull/5998#issuecomment-532550769
 
 
   @mik-laj Done


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[jira] [Commented] (AIRFLOW-3871) Allow Jinja templating recursively on object attributes

2019-09-18 Thread ASF GitHub Bot (Jira)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-3871?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16932167#comment-16932167
 ] 

ASF GitHub Bot commented on AIRFLOW-3871:
-

ashb commented on pull request #4743: [AIRFLOW-3871] render Operators template 
fields recursively
URL: https://github.com/apache/airflow/pull/4743
 
 
   
 

This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


> Allow Jinja templating recursively on object attributes
> ---
>
> Key: AIRFLOW-3871
> URL: https://issues.apache.org/jira/browse/AIRFLOW-3871
> Project: Apache Airflow
>  Issue Type: Wish
>  Components: operators
>Affects Versions: 1.10.0
>Reporter: Galak
>Assignee: Björn Pollex
>Priority: Minor
>
> Some {{Operator}} fields can be templated (using Jinja). Template rendering 
> only works for string values (either direct values or values stored in 
> collections).
> But a templated string inside a custom class instance won't be rendered
> Here is my scenario: 
> I have a python method {{transform_data_file}} which is designed to call a 
> command object. This command object constructor 
> ({{MyAwesomeDataFileTransformer}}) has parameters that could be templated. 
> These templated parameters are not rendered so far (see 
> {{BaseOperator.render_template_from_field}} method). 
> {code}
> simple_task = PythonOperator(
> task_id='simple_task',
> provide_context=True,
> python_callable=transform_data_file,
> templates_dict={
>   'transformer': MyAwesomeDataFileTransformer(
> "/data/{{ dag.dag_id }}/{{ ts }}/input_file",
> "/data/{{ dag.dag_id }}/{{ ts }}/output_file",
> )
> },
> dag=dag
> )
> {code}
> I have 3 alternatives in mind to allow rendering inner attributes:
> # Either define an Abstract Base Class declaring an abstract method 
> {{render_template}}; then my command object would have to extend this 
> Abstract Base Class, and then implement {{render_template}} method.
> # Or use duck typing in {{BaseOperator.render_template_from_field}} to call 
> {{render_template}} method when it exists on templated custom objects; then 
> my command object would just have to implement {{render_template}} method.
> # Or traverse object attributes when rendering templates and call 
> {{BaseOperator.render_template}} recursively; then my command object would 
> not need any change
> My preferred solution is the 3rd one, but I would like to hear your opinion 
> on this first. Maybe there is a 4th and better solution? (A rough sketch of 
> the 3rd option follows below.)
> I would be glad to submit a PR if this functionality is accepted.
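A minimal sketch of the 3rd alternative above (recursive traversal of object 
attributes); the `render` helper and `Transformer` class here are illustrative 
assumptions, not the code that was eventually merged in #4743.

```python
# Recursively render Jinja templates found in strings, collections, and the
# attributes of plain custom objects. Stand-in for the real BaseOperator logic.
from jinja2 import Template

def render(value, context):
    if isinstance(value, str):
        return Template(value).render(**context)
    if isinstance(value, (list, tuple)):
        return type(value)(render(item, context) for item in value)
    if isinstance(value, dict):
        return {key: render(item, context) for key, item in value.items()}
    if hasattr(value, '__dict__'):            # plain object: recurse into attributes
        for attr, attr_value in vars(value).items():
            setattr(value, attr, render(attr_value, context))
        return value
    return value

class Transformer:                            # stand-in for MyAwesomeDataFileTransformer
    def __init__(self, src, dst):
        self.src, self.dst = src, dst

task_args = render(Transformer("/data/{{ dag_id }}/input", "/data/{{ dag_id }}/output"),
                   {"dag_id": "my_dag"})
print(task_args.src, task_args.dst)           # /data/my_dag/input /data/my_dag/output
```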



--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[jira] [Commented] (AIRFLOW-3871) Allow Jinja templating recursively on object attributes

2019-09-18 Thread ASF subversion and git services (Jira)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-3871?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16932168#comment-16932168
 ] 

ASF subversion and git services commented on AIRFLOW-3871:
--

Commit d567f9ab8d5fdd6d18509fd8d623b26795eca25c in airflow's branch 
refs/heads/master from Géraud
[ https://gitbox.apache.org/repos/asf?p=airflow.git;h=d567f9a ]

[AIRFLOW-3871] Operators template fields can now render fields inside objects 
(#4743)




> Allow Jinja templating recursively on object attributes
> ---
>
> Key: AIRFLOW-3871
> URL: https://issues.apache.org/jira/browse/AIRFLOW-3871
> Project: Apache Airflow
>  Issue Type: Wish
>  Components: operators
>Affects Versions: 1.10.0
>Reporter: Galak
>Assignee: Björn Pollex
>Priority: Minor
>
> Some {{Operator}} fields can be templated (using Jinja). Template rendering 
> only works for string values (either direct values or values stored in 
> collections).
> But a templated string inside a custom class instance won't be rendered
> Here is my scenario: 
> I have a python method {{transform_data_file}} which is designed to call a 
> command object. This command object constructor 
> ({{MyAwesomeDataFileTransformer}}) has parameters that could be templated. 
> These templated parameters are not rendered so far (see 
> {{BaseOperator.render_template_from_field}} method). 
> {code}
> simple_task = PythonOperator(
> task_id='simple_task',
> provide_context=True,
> python_callable=transform_data_file,
> templates_dict={
>   'transformer': MyAwesomeDataFileTransformer(
> "/data/{{ dag.dag_id }}/{{ ts }}/input_file",
> "/data/{{ dag.dag_id }}/{{ ts }}/output_file",
> )
> },
> dag=dag
> )
> {code}
> I have 3 alternatives in mind to allow rendering inner attributes:
> # Either define an Abstract Base Class declaring an abstract method 
> {{render_template}}; then my command object would have to extend this 
> Abstract Base Class, and then implement {{render_template}} method.
> # Or use duck typing in {{BaseOperator.render_template_from_field}} to call 
> {{render_template}} method when it exists on templated custom objects; then 
> my command object would just have to implement {{render_template}} method.
> # Or traverse object attributes when rendering templates and call 
> {{BaseOperator.render_template}} recursively; then my command object would 
> not need any change
> My preferred solution is the 3rd one, but I would like to hear your opinion 
> on this first. Maybe there is a 4th and better solution?
> I would be glad to submit a PR if this functionality is accepted.



--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[GitHub] [airflow] ashb merged pull request #4743: [AIRFLOW-3871] render Operators template fields recursively

2019-09-18 Thread GitBox
ashb merged pull request #4743: [AIRFLOW-3871] render Operators template fields 
recursively
URL: https://github.com/apache/airflow/pull/4743
 
 
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [airflow] ashb commented on issue #4743: [AIRFLOW-3871] render Operators template fields recursively

2019-09-18 Thread GitBox
ashb commented on issue #4743: [AIRFLOW-3871] render Operators template fields 
recursively
URL: https://github.com/apache/airflow/pull/4743#issuecomment-532563068
 
 
   Sorry it took so many months to get this in, and thanks for sticking with us!


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [airflow] feluelle merged pull request #6138: [AIRFLOW-XXX] Fix typos in CONTRIBUTING.md

2019-09-18 Thread GitBox
feluelle merged pull request #6138: [AIRFLOW-XXX] Fix typos in CONTRIBUTING.md
URL: https://github.com/apache/airflow/pull/6138
 
 
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [airflow] lepture opened a new pull request #6140: [AIRFLOW-3753] Replace Flask-OAuthlib with Authlib

2019-09-18 Thread GitBox
lepture opened a new pull request #6140: [AIRFLOW-3753] Replace Flask-OAuthlib 
with Authlib
URL: https://github.com/apache/airflow/pull/6140
 
 
   Make sure you have checked _all_ steps below.
   
   ### Jira
   
   - [x] My PR addresses the following [Airflow 
Jira](https://issues.apache.org/jira/browse/AIRFLOW/) issues and references 
them in the PR title. For example, "\[AIRFLOW-XXX\] My Airflow PR"
 - https://issues.apache.org/jira/browse/AIRFLOW-3753
   
   ### Description
   
   - [x] Here are some details about my PR, including screenshots of any UI 
changes:
   
   Flask-OAuthlib is deprecated; use Authlib instead. Updated the Google and 
GitHub Enterprise OAuth backends. (A minimal sketch of the Authlib registration 
follows after this checklist.)
   
   ### Tests
   
   - [ ] My PR adds the following unit tests __OR__ does not need testing for 
this extremely good reason:
   
   ### Commits
   
   - [x] My commits all reference Jira issues in their subject lines, and I 
have squashed multiple commits if they address the same issue. In addition, my 
commits follow the guidelines from "[How to write a good git commit 
message](http://chris.beams.io/posts/git-commit/)":
 1. Subject is separated from body by a blank line
 1. Subject is limited to 50 characters (not including Jira issue reference)
 1. Subject does not end with a period
 1. Subject uses the imperative mood ("add", not "adding")
 1. Body wraps at 72 characters
 1. Body explains "what" and "why", not "how"
   
   ### Documentation
   
   - [ ] In case of new functionality, my PR adds documentation that describes 
how to use it.
 - All the public functions and the classes in the PR contain docstrings 
that explain what they do
 - If you implement backwards incompatible changes, please leave a note in 
the [Updating.md](https://github.com/apache/airflow/blob/master/UPDATING.md) so 
we can assign it to an appropriate release
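
   For context on the description above, here is a hedged sketch of registering 
   a Google client with Authlib's Flask integration, assuming a recent Authlib 
   release where the client lives under `authlib.integrations.flask_client`; the 
   credentials and routes are placeholders, not the code in this PR.
   
   ```python
   # Illustrative only: Authlib's Flask OAuth client replacing Flask-OAuthlib.
   from flask import Flask, url_for
   from authlib.integrations.flask_client import OAuth
   
   app = Flask(__name__)
   oauth = OAuth(app)
   google = oauth.register(
       name='google',
       client_id='YOUR_CLIENT_ID',
       client_secret='YOUR_CLIENT_SECRET',
       server_metadata_url='https://accounts.google.com/.well-known/openid-configuration',
       client_kwargs={'scope': 'openid email profile'},
   )
   
   @app.route('/login')
   def login():
       # Send the user to Google's consent screen; the callback route is below.
       return google.authorize_redirect(url_for('authorized', _external=True))
   
   @app.route('/oauth-authorized')
   def authorized():
       token = google.authorize_access_token()   # exchange the code for a token
       return dict(google.userinfo())            # fetch the user's profile
   ```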
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[jira] [Commented] (AIRFLOW-3753) Upgrade Google Oauth Backend from Flask Oauthlib to Authlib

2019-09-18 Thread ASF GitHub Bot (Jira)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-3753?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16932243#comment-16932243
 ] 

ASF GitHub Bot commented on AIRFLOW-3753:
-

lepture commented on pull request #6140: [AIRFLOW-3753] Replace Flask-OAuthlib 
with Authlib
URL: https://github.com/apache/airflow/pull/6140
 
 
   Make sure you have checked _all_ steps below.
   
   ### Jira
   
   - [x] My PR addresses the following [Airflow 
Jira](https://issues.apache.org/jira/browse/AIRFLOW/) issues and references 
them in the PR title. For example, "\[AIRFLOW-XXX\] My Airflow PR"
 - https://issues.apache.org/jira/browse/AIRFLOW-3753
   
   ### Description
   
   - [x] Here are some details about my PR, including screenshots of any UI 
changes:
   
   Flask-OAuthlib is deprecated; use Authlib instead. Updated the Google and 
GitHub Enterprise OAuth backends.
   
   ### Tests
   
   - [ ] My PR adds the following unit tests __OR__ does not need testing for 
this extremely good reason:
   
   ### Commits
   
   - [x] My commits all reference Jira issues in their subject lines, and I 
have squashed multiple commits if they address the same issue. In addition, my 
commits follow the guidelines from "[How to write a good git commit 
message](http://chris.beams.io/posts/git-commit/)":
 1. Subject is separated from body by a blank line
 1. Subject is limited to 50 characters (not including Jira issue reference)
 1. Subject does not end with a period
 1. Subject uses the imperative mood ("add", not "adding")
 1. Body wraps at 72 characters
 1. Body explains "what" and "why", not "how"
   
   ### Documentation
   
   - [ ] In case of new functionality, my PR adds documentation that describes 
how to use it.
 - All the public functions and the classes in the PR contain docstrings 
that explain what they do
 - If you implement backwards incompatible changes, please leave a note in 
the [Updating.md](https://github.com/apache/airflow/blob/master/UPDATING.md) so 
we can assign it to an appropriate release
   
 

This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


> Upgrade Google Oauth Backend from Flask Oauthlib to Authlib
> ---
>
> Key: AIRFLOW-3753
> URL: https://issues.apache.org/jira/browse/AIRFLOW-3753
> Project: Apache Airflow
>  Issue Type: Wish
>Reporter: Daniel Rubenstein
>Priority: Minor
>
> [Airflow's contributed backend for Google 
> Authentication|https://github.com/apache/airflow/blob/master/airflow/contrib/auth/backends/google_auth.py]
>  relies on the python library [Flask 
> Oauthlib|https://github.com/lepture/flask-oauthlib], and is tied to having a 
> version greater than 0.9.1 ([see 
> setup.py|https://github.com/apache/airflow/blob/master/setup.py#L186]).
> As of March 2018 (see [this 
> update|https://github.com/lepture/flask-oauthlib/commit/7a58b6d48a99a2d3aca4e887a4bb56458fcc6e46#diff-88b99bb28683bd5b7e3a204826ead112]),
>  [Authlib|https://github.com/lepture/authlib] (maintained by the same owner) 
> has been recommended as the replacement.
> Since that time, external changes to Oauth have caused various breaks in 
> builds that are pinned to this version, as Flask-Oauthlib is deprecated at 
> this point and does not receive standard PyPI updates and other maintenance.
> I'm wondering if the maintainers of the project would accept a PR that 
> updates the backend to use the newer library. I'd be happy to take a crack at 
> it, just want to make sure it's something that the community sees as 
> worthwhile before getting started.
> Thanks!



--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[GitHub] [airflow] codecov-io edited a comment on issue #5998: [AIRFLOW-5398] Update contrib example DAGs to context manager

2019-09-18 Thread GitBox
codecov-io edited a comment on issue #5998: [AIRFLOW-5398] Update contrib 
example DAGs to context manager
URL: https://github.com/apache/airflow/pull/5998#issuecomment-531900151
 
 
   # [Codecov](https://codecov.io/gh/apache/airflow/pull/5998?src=pr=h1) 
Report
   > Merging 
[#5998](https://codecov.io/gh/apache/airflow/pull/5998?src=pr=desc) into 
[master](https://codecov.io/gh/apache/airflow/commit/25c53b0a10f078417a82be7acbc6485e7a38c9a6?src=pr=desc)
 will **decrease** coverage by `0.35%`.
   > The diff coverage is `0%`.
   
   [![Impacted file tree 
graph](https://codecov.io/gh/apache/airflow/pull/5998/graphs/tree.svg?width=650=WdLKlKHOAU=150=pr)](https://codecov.io/gh/apache/airflow/pull/5998?src=pr=tree)
   
   ```diff
   @@            Coverage Diff             @@
   ##           master    #5998      +/-   ##
   ==========================================
   - Coverage   80.09%   79.73%   -0.36%     
   ==========================================
     Files         596      608      +12     
     Lines       34872    35014     +142     
   ==========================================
   - Hits        27932    27920      -12     
   - Misses       6940     7094     +154
   ```
   
   
   | [Impacted 
Files](https://codecov.io/gh/apache/airflow/pull/5998?src=pr=tree) | 
Coverage Δ | |
   |---|---|---|
   | 
[...ontrib/example\_dags/example\_databricks\_operator.py](https://codecov.io/gh/apache/airflow/pull/5998/diff?src=pr=tree#diff-YWlyZmxvdy9jb250cmliL2V4YW1wbGVfZGFncy9leGFtcGxlX2RhdGFicmlja3Nfb3BlcmF0b3IucHk=)
 | `0% <0%> (ø)` | :arrow_up: |
   | 
[...dags/example\_azure\_container\_instances\_operator.py](https://codecov.io/gh/apache/airflow/pull/5998/diff?src=pr=tree#diff-YWlyZmxvdy9jb250cmliL2V4YW1wbGVfZGFncy9leGFtcGxlX2F6dXJlX2NvbnRhaW5lcl9pbnN0YW5jZXNfb3BlcmF0b3IucHk=)
 | `0% <0%> (ø)` | :arrow_up: |
   | 
[...example\_dags/example\_kubernetes\_executor\_config.py](https://codecov.io/gh/apache/airflow/pull/5998/diff?src=pr=tree#diff-YWlyZmxvdy9jb250cmliL2V4YW1wbGVfZGFncy9leGFtcGxlX2t1YmVybmV0ZXNfZXhlY3V0b3JfY29uZmlnLnB5)
 | `0% <0%> (ø)` | :arrow_up: |
   | 
[...ow/contrib/example\_dags/example\_qubole\_operator.py](https://codecov.io/gh/apache/airflow/pull/5998/diff?src=pr=tree#diff-YWlyZmxvdy9jb250cmliL2V4YW1wbGVfZGFncy9leGFtcGxlX3F1Ym9sZV9vcGVyYXRvci5weQ==)
 | `0% <0%> (ø)` | :arrow_up: |
   | 
[...ample\_dags/example\_emr\_job\_flow\_automatic\_steps.py](https://codecov.io/gh/apache/airflow/pull/5998/diff?src=pr=tree#diff-YWlyZmxvdy9jb250cmliL2V4YW1wbGVfZGFncy9leGFtcGxlX2Vtcl9qb2JfZmxvd19hdXRvbWF0aWNfc3RlcHMucHk=)
 | `0% <0%> (ø)` | :arrow_up: |
   | 
[...trib/example\_dags/example\_azure\_cosmosdb\_sensor.py](https://codecov.io/gh/apache/airflow/pull/5998/diff?src=pr=tree#diff-YWlyZmxvdy9jb250cmliL2V4YW1wbGVfZGFncy9leGFtcGxlX2F6dXJlX2Nvc21vc2RiX3NlbnNvci5weQ==)
 | `0% <0%> (ø)` | :arrow_up: |
   | 
[...flow/contrib/example\_dags/example\_qubole\_sensor.py](https://codecov.io/gh/apache/airflow/pull/5998/diff?src=pr=tree#diff-YWlyZmxvdy9jb250cmliL2V4YW1wbGVfZGFncy9leGFtcGxlX3F1Ym9sZV9zZW5zb3IucHk=)
 | `0% <0%> (ø)` | :arrow_up: |
   | 
[.../contrib/example\_dags/example\_dingding\_operator.py](https://codecov.io/gh/apache/airflow/pull/5998/diff?src=pr=tree#diff-YWlyZmxvdy9jb250cmliL2V4YW1wbGVfZGFncy9leGFtcGxlX2RpbmdkaW5nX29wZXJhdG9yLnB5)
 | `0% <0%> (ø)` | :arrow_up: |
   | 
[...contrib/example\_dags/example\_papermill\_operator.py](https://codecov.io/gh/apache/airflow/pull/5998/diff?src=pr=tree#diff-YWlyZmxvdy9jb250cmliL2V4YW1wbGVfZGFncy9leGFtcGxlX3BhcGVybWlsbF9vcGVyYXRvci5weQ==)
 | `0% <0%> (ø)` | |
   | 
[...low/contrib/example\_dags/example\_winrm\_operator.py](https://codecov.io/gh/apache/airflow/pull/5998/diff?src=pr=tree#diff-YWlyZmxvdy9jb250cmliL2V4YW1wbGVfZGFncy9leGFtcGxlX3dpbnJtX29wZXJhdG9yLnB5)
 | `0% <0%> (ø)` | :arrow_up: |
   | ... and [90 
more](https://codecov.io/gh/apache/airflow/pull/5998/diff?src=pr=tree-more) 
| |
   
   --
   
   [Continue to review full report at 
Codecov](https://codecov.io/gh/apache/airflow/pull/5998?src=pr=continue).
   > **Legend** - [Click here to learn 
more](https://docs.codecov.io/docs/codecov-delta)
   > `Δ = absolute <relative> (impact)`, `ø = not affected`, `? = missing data`
   > Powered by 
[Codecov](https://codecov.io/gh/apache/airflow/pull/5998?src=pr=footer). 
Last update 
[25c53b0...ad7b5d3](https://codecov.io/gh/apache/airflow/pull/5998?src=pr=lastupdated).
 Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments).
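
   For context on the PR title (converting the contrib example DAGs to the DAG 
   context-manager style), here is a minimal hedged sketch of that style; the 
   dag_id, dates, and task below are illustrative, not taken from this PR.
   
   ```python
   # Illustrative only: operators created inside the `with DAG(...)` block are
   # attached to that DAG automatically, so dag=dag is no longer passed around.
   from datetime import datetime
   
   from airflow import DAG
   from airflow.operators.bash_operator import BashOperator
   
   with DAG(dag_id='example_context_manager',
            start_date=datetime(2019, 1, 1),
            schedule_interval=None) as dag:
       hello = BashOperator(task_id='hello', bash_command='echo hello')
   ```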
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[jira] [Commented] (AIRFLOW-5513) Move example_pubsub_flow.py to GCP package

2019-09-18 Thread ASF GitHub Bot (Jira)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-5513?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16932189#comment-16932189
 ] 

ASF GitHub Bot commented on AIRFLOW-5513:
-

nuclearpinguin commented on pull request #6139: [AIRFLOW-5513] Move 
example_pubsub_flow.py to GCP package
URL: https://github.com/apache/airflow/pull/6139
 
 
   Make sure you have checked _all_ steps below.
   
   ### Jira
   
   - [ ] My PR addresses the following [Airflow 
Jira](https://issues.apache.org/jira/browse/AIRFLOW/) issues and references 
them in the PR title. For example, "\[AIRFLOW-XXX\] My Airflow PR"
 - https://issues.apache.org/jira/browse/AIRFLOW-5513
   
   ### Description
   
   - [ ] Here are some details about my PR, including screenshots of any UI 
changes:
   
   ### Tests
   
   - [ ] My PR adds the following unit tests __OR__ does not need testing for 
this extremely good reason:
   
   ### Commits
   
   - [ ] My commits all reference Jira issues in their subject lines, and I 
have squashed multiple commits if they address the same issue. In addition, my 
commits follow the guidelines from "[How to write a good git commit 
message](http://chris.beams.io/posts/git-commit/)":
 1. Subject is separated from body by a blank line
 1. Subject is limited to 50 characters (not including Jira issue reference)
 1. Subject does not end with a period
 1. Subject uses the imperative mood ("add", not "adding")
 1. Body wraps at 72 characters
 1. Body explains "what" and "why", not "how"
   
   ### Documentation
   
   - [ ] In case of new functionality, my PR adds documentation that describes 
how to use it.
 - All the public functions and the classes in the PR contain docstrings 
that explain what they do
 - If you implement backwards incompatible changes, please leave a note in 
the [Updating.md](https://github.com/apache/airflow/blob/master/UPDATING.md) so 
we can assign it to an appropriate release
   
 

This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


> Move example_pubsub_flow.py to GCP package
> --
>
> Key: AIRFLOW-5513
> URL: https://issues.apache.org/jira/browse/AIRFLOW-5513
> Project: Apache Airflow
>  Issue Type: Improvement
>  Components: gcp
>Affects Versions: 2.0.0
>Reporter: Tomasz Urbaszek
>Priority: Major
>




--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[jira] [Created] (AIRFLOW-5512) Development dependencies need gcp modules

2019-09-18 Thread Hao Liang (Jira)
Hao Liang created AIRFLOW-5512:
--

 Summary: Development dependencies need gcp modules
 Key: AIRFLOW-5512
 URL: https://issues.apache.org/jira/browse/AIRFLOW-5512
 Project: Apache Airflow
  Issue Type: Improvement
  Components: dependencies
Affects Versions: 1.10.5
Reporter: Hao Liang
Assignee: Hao Liang


Since airflow/example_dags/example_gcs_to_bq.py requires GCP modules, running 
{code:bash}
airflow db init
{code}
after
{code:bash}
pip install -e ".[devel]"
{code}
raises an error: ModuleNotFoundError: No module named 'google.cloud'.

Maybe we should add GCP dependencies to devel.






--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[GitHub] [airflow] ashb commented on issue #6066: [AIRFLOW-XXX] Update to new logo

2019-09-18 Thread GitBox
ashb commented on issue #6066: [AIRFLOW-XXX] Update to new logo
URL: https://github.com/apache/airflow/pull/6066#issuecomment-532562131
 
 
   Curious


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[jira] [Resolved] (AIRFLOW-3871) Allow Jinja templating recursively on object attributes

2019-09-18 Thread Ash Berlin-Taylor (Jira)


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-3871?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Ash Berlin-Taylor resolved AIRFLOW-3871.

Fix Version/s: 1.10.6
   Resolution: Fixed

> Allow Jinja templating recursively on object attributes
> ---
>
> Key: AIRFLOW-3871
> URL: https://issues.apache.org/jira/browse/AIRFLOW-3871
> Project: Apache Airflow
>  Issue Type: Wish
>  Components: operators
>Affects Versions: 1.10.0
>Reporter: Galak
>Assignee: Björn Pollex
>Priority: Minor
> Fix For: 1.10.6
>
>
> Some {{Operator}} fields can be templated (using Jinja). Template rendering 
> only works for string values (either direct values or values stored in 
> collections).
> But a templated string inside a custom class instance won't be rendered
> Here is my scenario: 
> I have a python method {{transform_data_file}} which is designed to call a 
> command object. This command object constructor 
> ({{MyAwesomeDataFileTransformer}}) has parameters that could be templated. 
> These templated parameters are not rendered so far (see 
> {{BaseOperator.render_template_from_field}} method). 
> {code}
> simple_task = PythonOperator(
> task_id='simple_task',
> provide_context=True,
> python_callable=transform_data_file,
> templates_dict={
>   'transformer': MyAwesomeDataFileTransformer(
> "/data/{{ dag.dag_id }}/{{ ts }}/input_file",
> "/data/{{ dag.dag_id }}/{{ ts }}/output_file",
> )
> },
> dag=dag
> )
> {code}
> I have 3 alternatives in mind to allow rendering inner attributes:
> # Either define an Abstract Base Class declaring an abstract method 
> {{render_template}}; then my command object would have to extend this 
> Abstract Base Class, and then implement {{render_template}} method.
> # Or use duck typing in {{BaseOperator.render_template_from_field}} to call 
> {{render_template}} method when it exists on templated custom objects; then 
> my command object would just have to implement {{render_template}} method.
> # Or traverse object attributes when rendering templates and call 
> {{BaseOperator.render_template}} recursively; then my command object would 
> not need any change
> My preferred solution is the 3rd one, but I would like to hear your opinion 
> on this first. Maybe there is a 4th and better solution?
> I would be glad to submit a PR if this functionality is accepted.



--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[GitHub] [airflow] TobKed opened a new pull request #6141: [AIRFLOW-XXX] No implicit optional flag for mypy

2019-09-18 Thread GitBox
TobKed opened a new pull request #6141: [AIRFLOW-XXX] No implicit optional flag 
for mypy
URL: https://github.com/apache/airflow/pull/6141
 
 
   Make sure you have checked _all_ steps below.
   
   ### Jira
   
   - [ ] My PR addresses the following [Airflow 
Jira](https://issues.apache.org/jira/browse/AIRFLOW/) issues and references 
them in the PR title. For example, "\[AIRFLOW-XXX\] My Airflow PR"
 - https://issues.apache.org/jira/browse/AIRFLOW-XXX
 - In case you are fixing a typo in the documentation you can prepend your 
commit with \[AIRFLOW-XXX\], code changes always need a Jira issue.
 - In case you are proposing a fundamental code change, you need to create 
an Airflow Improvement Proposal 
([AIP](https://cwiki.apache.org/confluence/display/AIRFLOW/Airflow+Improvements+Proposals)).
 - In case you are adding a dependency, check if the license complies with 
the [ASF 3rd Party License 
Policy](https://www.apache.org/legal/resolved.html#category-x).
   
   ### Description
   
   - [ ] Here are some details about my PR, including screenshots of any UI 
changes:
   
   ### Tests
   
   - [ ] My PR adds the following unit tests __OR__ does not need testing for 
this extremely good reason:
   
   ### Commits
   
   - [ ] My commits all reference Jira issues in their subject lines, and I 
have squashed multiple commits if they address the same issue. In addition, my 
commits follow the guidelines from "[How to write a good git commit 
message](http://chris.beams.io/posts/git-commit/)":
 1. Subject is separated from body by a blank line
 1. Subject is limited to 50 characters (not including Jira issue reference)
 1. Subject does not end with a period
 1. Subject uses the imperative mood ("add", not "adding")
 1. Body wraps at 72 characters
 1. Body explains "what" and "why", not "how"
   
   ### Documentation
   
   - [ ] In case of new functionality, my PR adds documentation that describes 
how to use it.
 - All the public functions and the classes in the PR contain docstrings 
that explain what they do
 - If you implement backwards incompatible changes, please leave a note in 
the [Updating.md](https://github.com/apache/airflow/blob/master/UPDATING.md) so 
we can assign it to an appropriate release
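
   For context on the PR title (enabling mypy's `no_implicit_optional` flag), 
   here is an illustrative snippet of what that flag enforces; it is not code 
   from this PR.
   
   ```python
   # With no_implicit_optional set, a None default no longer implies Optional,
   # so the annotation has to say it explicitly.
   from typing import Optional
   
   def fetch(timeout: int = None):                      # rejected under the flag
       return timeout
   
   def fetch_explicit(timeout: Optional[int] = None):   # accepted
       return timeout
   ```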
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [airflow] TobKed commented on issue #6141: [AIRFLOW-5514] No implicit optional flag for mypy

2019-09-18 Thread GitBox
TobKed commented on issue #6141: [AIRFLOW-5514] No implicit optional flag for 
mypy
URL: https://github.com/apache/airflow/pull/6141#issuecomment-532616248
 
 
   cc @mik-laj @potiuk @kaxil 


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[jira] [Commented] (AIRFLOW-5256) Related pylint changes for common licences in python files

2019-09-18 Thread ASF subversion and git services (Jira)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-5256?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16932324#comment-16932324
 ] 

ASF subversion and git services commented on AIRFLOW-5256:
--

Commit f8be014d9fa299e461acec85eac5276bd7a977d1 in airflow's branch 
refs/heads/v1-10-test from Jarek Potiuk
[ https://gitbox.apache.org/repos/asf?p=airflow.git;h=f8be014 ]

[AIRFLOW-5256] Related pylint changes for common licences in python files 
(#5786)

  (cherry picked from commit 47801057989046dfcf7b424ce54afee103803815)


> Related pylint changes for common licences in python files
> --
>
> Key: AIRFLOW-5256
> URL: https://issues.apache.org/jira/browse/AIRFLOW-5256
> Project: Apache Airflow
>  Issue Type: Sub-task
>  Components: ci, core
>Affects Versions: 2.0.0
>Reporter: Jarek Potiuk
>Priority: Major
> Fix For: 2.0.0
>
>




--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[jira] [Commented] (AIRFLOW-5480) Fix flaky impersonation test

2019-09-18 Thread ASF subversion and git services (Jira)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-5480?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16932325#comment-16932325
 ] 

ASF subversion and git services commented on AIRFLOW-5480:
--

Commit 0679c5064dc0caee1d8bf3800e15772a177b4af2 in airflow's branch 
refs/heads/v1-10-test from Kamil Breguła
[ https://gitbox.apache.org/repos/asf?p=airflow.git;h=0679c50 ]

[AIRFLOW-5480] Fix flaky impersonation (#6098)


(cherry picked from commit 4459592e523bbeeda0747dd9f888c3e215a241d1)


> Fix flaky impersonation test
> 
>
> Key: AIRFLOW-5480
> URL: https://issues.apache.org/jira/browse/AIRFLOW-5480
> Project: Apache Airflow
>  Issue Type: Improvement
>  Components: core
>Affects Versions: 1.10.5
>Reporter: Kamil Bregula
>Priority: Critical
> Fix For: 1.10.6
>
>




--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[jira] [Commented] (AIRFLOW-5447) KubernetesExecutor hangs on task queueing

2019-09-18 Thread ASF subversion and git services (Jira)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-5447?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16932326#comment-16932326
 ] 

ASF subversion and git services commented on AIRFLOW-5447:
--

Commit b88d4c5a7721c061a6488d519be1646ae0fdddea in airflow's branch 
refs/heads/v1-10-test from Daniel Imberman
[ https://gitbox.apache.org/repos/asf?p=airflow.git;h=b88d4c5 ]

[AIRFLOW-5447] Scheduler stalls because second watcher thread in default args


> KubernetesExecutor hangs on task queueing
> -
>
> Key: AIRFLOW-5447
> URL: https://issues.apache.org/jira/browse/AIRFLOW-5447
> Project: Apache Airflow
>  Issue Type: Bug
>  Components: executor-kubernetes
>Affects Versions: 1.10.4, 1.10.5
> Environment: Kubernetes version v1.14.3, Airflow version 1.10.4-1.10.5
>Reporter: Henry Cohen
>Assignee: Daniel Imberman
>Priority: Blocker
>
> Starting in 1.10.4, and continuing in 1.10.5, when using the 
> KubernetesExecutor, with the webserver and scheduler running in the 
> kubernetes cluster, tasks are scheduled, but when added to the task queue, 
> the executor process hangs indefinitely. Based on log messages, it appears to 
> be stuck at this line 
> https://github.com/apache/airflow/blob/v1-10-stable/airflow/contrib/executors/kubernetes_executor.py#L761



--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[jira] [Updated] (AIRFLOW-4835) Refactor template rendering functions

2019-09-18 Thread Ash Berlin-Taylor (Jira)


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-4835?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Ash Berlin-Taylor updated AIRFLOW-4835:
---
Fix Version/s: (was: 2.0.0)
   1.10.6

> Refactor template rendering functions
> -
>
> Key: AIRFLOW-4835
> URL: https://issues.apache.org/jira/browse/AIRFLOW-4835
> Project: Apache Airflow
>  Issue Type: Improvement
>  Components: core
>Affects Versions: 2.0.0
>Reporter: Bas Harenslak
>Priority: Major
> Fix For: 1.10.6
>
>




--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[jira] [Updated] (AIRFLOW-5256) Related pylint changes for common licences in python files

2019-09-18 Thread Ash Berlin-Taylor (Jira)


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-5256?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Ash Berlin-Taylor updated AIRFLOW-5256:
---
Fix Version/s: (was: 2.0.0)
   1.10.6

> Related pylint changes for common licences in python files
> --
>
> Key: AIRFLOW-5256
> URL: https://issues.apache.org/jira/browse/AIRFLOW-5256
> Project: Apache Airflow
>  Issue Type: Sub-task
>  Components: ci, core
>Affects Versions: 2.0.0
>Reporter: Jarek Potiuk
>Priority: Major
> Fix For: 1.10.6
>
>




--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[jira] [Updated] (AIRFLOW-4864) Don't call load_test_configuration multiple times

2019-09-18 Thread Ash Berlin-Taylor (Jira)


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-4864?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Ash Berlin-Taylor updated AIRFLOW-4864:
---
Fix Version/s: (was: 2.0.0)
   1.10.6

> Don't call load_test_configuration multiple times
> -
>
> Key: AIRFLOW-4864
> URL: https://issues.apache.org/jira/browse/AIRFLOW-4864
> Project: Apache Airflow
>  Issue Type: Sub-task
>  Components: tests
>Affects Versions: 2.0.0
>Reporter: Ash Berlin-Taylor
>Assignee: Ash Berlin-Taylor
>Priority: Minor
> Fix For: 1.10.6
>
>
> We have a pattern that many test files have cargo-culted from each other to 
> call {{load_test_configuration}}, either at the module level, or inside the 
> {{setUp}} of the test classes.
> This isn't needed - we already load the test config from the environment 
> variable we set in travis, so this just causes a (tiny) delay.



--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[GitHub] [airflow] potiuk merged pull request #6143: [AIRFLOW-XXX] Add example of running pre-commit hooks on single file

2019-09-18 Thread GitBox
potiuk merged pull request #6143: [AIRFLOW-XXX] Add example of running 
pre-commit hooks on single file
URL: https://github.com/apache/airflow/pull/6143
 
 
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [airflow] kaxil opened a new pull request #6143: [AIRFLOW-XXX] Add example of running pre-commit hooks on single file

2019-09-18 Thread GitBox
kaxil opened a new pull request #6143: [AIRFLOW-XXX] Add example of running 
pre-commit hooks on single file
URL: https://github.com/apache/airflow/pull/6143
 
 
   Or a list of files
   
   Make sure you have checked _all_ steps below.
   
   ### Jira
   
   - [x] My PR addresses the following [Airflow 
Jira](https://issues.apache.org/jira/browse/AIRFLOW/) issues and references 
them in the PR title. For example, "\[AIRFLOW-XXX\] My Airflow PR"
   
   
   ### Description
   
   - [ ] Here are some details about my PR, including screenshots of any UI 
changes:
   Add example of running pre-commit hooks on single file or list of files
   
   ### Tests
   
   - [ ] My PR adds the following unit tests __OR__ does not need testing for 
this extremely good reason:
   
   ### Commits
   
   - [ ] My commits all reference Jira issues in their subject lines, and I 
have squashed multiple commits if they address the same issue. In addition, my 
commits follow the guidelines from "[How to write a good git commit 
message](http://chris.beams.io/posts/git-commit/)":
 1. Subject is separated from body by a blank line
 1. Subject is limited to 50 characters (not including Jira issue reference)
 1. Subject does not end with a period
 1. Subject uses the imperative mood ("add", not "adding")
 1. Body wraps at 72 characters
 1. Body explains "what" and "why", not "how"
   
   ### Documentation
   
   - [ ] In case of new functionality, my PR adds documentation that describes 
how to use it.
 - All the public functions and the classes in the PR contain docstrings 
that explain what they do
 - If you implement backwards incompatible changes, please leave a note in 
the [Updating.md](https://github.com/apache/airflow/blob/master/UPDATING.md) so 
we can assign it to an appropriate release
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[jira] [Resolved] (AIRFLOW-5369) Add interactivity to pre-commit image building

2019-09-18 Thread Jarek Potiuk (Jira)


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-5369?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Jarek Potiuk resolved AIRFLOW-5369.
---
Fix Version/s: 1.10.6
   Resolution: Fixed

> Add interactivity to pre-commit image building
> --
>
> Key: AIRFLOW-5369
> URL: https://issues.apache.org/jira/browse/AIRFLOW-5369
> Project: Apache Airflow
>  Issue Type: Improvement
>  Components: ci
>Affects Versions: 2.0.0, 1.10.5
>Reporter: Jarek Potiuk
>Assignee: Jarek Potiuk
>Priority: Major
> Fix For: 1.10.6
>
>
> Currently, when images are out-dated for pre-commit, it just fails with a 
> message on how to re-run it with rebuild enabled next time. Also, when you 
> already run pre-commit, ^C does not work as expected - the main script is 
> killed but the docker images running the checks continue running in the 
> background until they finish. This is pretty annoying, as killing such 
> running docker containers is not trivial, and for pylint/mypy/flake8 we can 
> run multiple containers if we run it on many modified files. This is because 
> we are not using dumb-init to run the checks.
>  
> This is discouraging a bit, so instead a bit of interactivity can be added:
> 1) If image gets out-dated a question is asked whether to rebuild it while 
> pre-commit is executed
> 2) If you run pre-commit directly you do not get asked for rebuild because 
> you can run multiple pre-commit scripts in parallel  (pre-commit does it) so 
> you should fail fast (with helpful instructions)
> 3) If you run pre-commit via breeze, it is optimised because only the image 
> that is actually needed is rebuilt (and question is asked then)
> 4) Additionally - you should be able to press ^C and kill all containers 
> running in the background as well as the main script. 



--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[jira] [Commented] (AIRFLOW-5369) Add interactivity to pre-commit image building

2019-09-18 Thread ASF subversion and git services (Jira)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-5369?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16932366#comment-16932366
 ] 

ASF subversion and git services commented on AIRFLOW-5369:
--

Commit 37c5ca07dd4f07b00d2b8feff74b956b433f032f in airflow's branch 
refs/heads/v1-10-test from Jarek Potiuk
[ https://gitbox.apache.org/repos/asf?p=airflow.git;h=37c5ca0 ]

[AIRFLOW-5369] Adds interactivity to pre-commits (#5976)

This commit adds full interactivity to pre-commits. Whenever you run pre-commit
and it detects that the image should be rebuilt, an interactive question will
pop up instead of failing the build and asking you to rebuild with REBUILD=yes.

This is much nicer from the user perspective. You can choose whether to:
1) Rebuild the image (which will take some time)
2) Not rebuild the image (this will use the old image with hope it's OK)
3) Quit.

The answer to that question is carried across all images needed to rebuild.
There is a special "build" pre-commit hook that takes care of that.

Note that this interactive question cannot be asked if you run only a
single pre-commit hook with Dockerfile, because pre-commit can run multiple
processes and you can start building in parallel. This is not desired, so
instead we fail such builds.

(cherry picked from commit 857788e305bbefe4566a1988e2072a21c7aab319)


> Add interactivity to pre-commit image building
> --
>
> Key: AIRFLOW-5369
> URL: https://issues.apache.org/jira/browse/AIRFLOW-5369
> Project: Apache Airflow
>  Issue Type: Improvement
>  Components: ci
>Affects Versions: 2.0.0, 1.10.5
>Reporter: Jarek Potiuk
>Assignee: Jarek Potiuk
>Priority: Major
>
> Currently, when images are out-dated for pre-commit, it just fails with a 
> message on how to re-run it with rebuild enabled next time. Also, when you 
> already run pre-commit, ^C does not work as expected - the main script is 
> killed but the docker images running the checks continue running in the 
> background until they finish. This is pretty annoying, as killing such 
> running docker containers is not trivial, and for pylint/mypy/flake8 we can 
> run multiple containers if we run it on many modified files. This is because 
> we are not using dumb-init to run the checks.
>  
> This is discouraging a bit, so instead a bit of interactivity can be added:
> 1) If image gets out-dated a question is asked whether to rebuild it while 
> pre-commit is executed
> 2) If you run pre-commit directly you do not get asked for rebuild because 
> you can run multiple pre-commit scripts in parallel  (pre-commit does it) so 
> you should fail fast (with helpful instructions)
> 3) If you run pre-commit via breeze, it is optimised because only the image 
> that is actually needed is rebuilt (and question is asked then)
> 4) Additionally - you should be able to press ^C and kill all containers 
> running in the background as well as the main script. 



--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[GitHub] [airflow] codecov-io edited a comment on issue #6095: [AIRFLOW-5474] Add Basic auth to Druid hook

2019-09-18 Thread GitBox
codecov-io edited a comment on issue #6095: [AIRFLOW-5474] Add Basic auth to 
Druid hook
URL: https://github.com/apache/airflow/pull/6095#issuecomment-532669538
 
 
   # [Codecov](https://codecov.io/gh/apache/airflow/pull/6095?src=pr=h1) 
Report
   > Merging 
[#6095](https://codecov.io/gh/apache/airflow/pull/6095?src=pr=desc) into 
[master](https://codecov.io/gh/apache/airflow/commit/6a66ece8cae8c8d209efefcde2639dca8919e975?src=pr=desc)
 will **increase** coverage by `70.49%`.
   > The diff coverage is `100%`.
   
   [![Impacted file tree 
graph](https://codecov.io/gh/apache/airflow/pull/6095/graphs/tree.svg?width=650=WdLKlKHOAU=150=pr)](https://codecov.io/gh/apache/airflow/pull/6095?src=pr=tree)
   
   ```diff
   @@             Coverage Diff              @@
   ##           master    #6095       +/-   ##
   ===========================================
   + Coverage    9.55%   80.05%    +70.49%     
   ===========================================
     Files         606      607         +1     
     Lines       34893    35039       +146     
   ===========================================
   + Hits         3334    28050     +24716     
   + Misses      31559     6989     -24570
   ```
   
   
   | [Impacted 
Files](https://codecov.io/gh/apache/airflow/pull/6095?src=pr=tree) | 
Coverage Δ | |
   |---|---|---|
   | 
[airflow/hooks/druid\_hook.py](https://codecov.io/gh/apache/airflow/pull/6095/diff?src=pr=tree#diff-YWlyZmxvdy9ob29rcy9kcnVpZF9ob29rLnB5)
 | `88.88% <100%> (+88.88%)` | :arrow_up: |
   | 
[airflow/models/\_\_init\_\_.py](https://codecov.io/gh/apache/airflow/pull/6095/diff?src=pr=tree#diff-YWlyZmxvdy9tb2RlbHMvX19pbml0X18ucHk=)
 | `100% <0%> (ø)` | :arrow_up: |
   | 
[airflow/kubernetes/volume\_mount.py](https://codecov.io/gh/apache/airflow/pull/6095/diff?src=pr=tree#diff-YWlyZmxvdy9rdWJlcm5ldGVzL3ZvbHVtZV9tb3VudC5weQ==)
 | `100% <0%> (ø)` | :arrow_up: |
   | 
[airflow/kubernetes/volume.py](https://codecov.io/gh/apache/airflow/pull/6095/diff?src=pr=tree#diff-YWlyZmxvdy9rdWJlcm5ldGVzL3ZvbHVtZS5weQ==)
 | `100% <0%> (ø)` | :arrow_up: |
   | 
[airflow/gcp/utils/mlengine\_prediction\_summary.py](https://codecov.io/gh/apache/airflow/pull/6095/diff?src=pr=tree#diff-YWlyZmxvdy9nY3AvdXRpbHMvbWxlbmdpbmVfcHJlZGljdGlvbl9zdW1tYXJ5LnB5)
 | `0% <0%> (ø)` | :arrow_up: |
   | 
[...low/contrib/operators/google\_api\_to\_s3\_transfer.py](https://codecov.io/gh/apache/airflow/pull/6095/diff?src=pr=tree#diff-YWlyZmxvdy9jb250cmliL29wZXJhdG9ycy9nb29nbGVfYXBpX3RvX3MzX3RyYW5zZmVyLnB5)
 | | |
   | 
[airflow/contrib/hooks/google\_discovery\_api\_hook.py](https://codecov.io/gh/apache/airflow/pull/6095/diff?src=pr=tree#diff-YWlyZmxvdy9jb250cmliL2hvb2tzL2dvb2dsZV9kaXNjb3ZlcnlfYXBpX2hvb2sucHk=)
 | | |
   | 
[airflow/gcp/example\_dags/example\_dataflow.py](https://codecov.io/gh/apache/airflow/pull/6095/diff?src=pr=tree#diff-YWlyZmxvdy9nY3AvZXhhbXBsZV9kYWdzL2V4YW1wbGVfZGF0YWZsb3cucHk=)
 | `0% <0%> (ø)` | |
   | 
[airflow/operators/google\_api\_to\_s3\_transfer.py](https://codecov.io/gh/apache/airflow/pull/6095/diff?src=pr=tree#diff-YWlyZmxvdy9vcGVyYXRvcnMvZ29vZ2xlX2FwaV90b19zM190cmFuc2Zlci5weQ==)
 | `100% <0%> (ø)` | |
   | 
[airflow/gcp/hooks/discovery\_api.py](https://codecov.io/gh/apache/airflow/pull/6095/diff?src=pr=tree#diff-YWlyZmxvdy9nY3AvaG9va3MvZGlzY292ZXJ5X2FwaS5weQ==)
 | `100% <0%> (ø)` | |
   | ... and [507 
more](https://codecov.io/gh/apache/airflow/pull/6095/diff?src=pr=tree-more) 
| |
   
   --
   
   [Continue to review full report at 
Codecov](https://codecov.io/gh/apache/airflow/pull/6095?src=pr=continue).
   > **Legend** - [Click here to learn 
more](https://docs.codecov.io/docs/codecov-delta)
   > `Δ = absolute <relative> (impact)`, `ø = not affected`, `? = missing data`
   > Powered by 
[Codecov](https://codecov.io/gh/apache/airflow/pull/6095?src=pr=footer). 
Last update 
[6a66ece...fd4cf8e](https://codecov.io/gh/apache/airflow/pull/6095?src=pr=lastupdated).
 Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments).
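
   For context on the PR title (adding Basic auth to the Druid hook), below is 
   a hedged sketch of one common way to attach HTTP Basic auth to a request 
   with the `requests` library; the helper and endpoint are illustrative and 
   not necessarily how the hook in this PR implements it.
   
   ```python
   # Illustrative only: POST an ingestion spec, adding Basic auth when
   # credentials are configured on the connection.
   import requests
   from requests.auth import HTTPBasicAuth
   
   def submit_indexing_job(url, spec, user=None, password=None):
       auth = HTTPBasicAuth(user, password) if user else None
       response = requests.post(url, json=spec, auth=auth, timeout=60)
       response.raise_for_status()
       return response.json()
   ```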
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [airflow] codecov-io commented on issue #6095: [AIRFLOW-5474] Add Basic auth to Druid hook

2019-09-18 Thread GitBox
codecov-io commented on issue #6095: [AIRFLOW-5474] Add Basic auth to Druid hook
URL: https://github.com/apache/airflow/pull/6095#issuecomment-532669538
 
 
   # [Codecov](https://codecov.io/gh/apache/airflow/pull/6095?src=pr=h1) 
Report
   > Merging 
[#6095](https://codecov.io/gh/apache/airflow/pull/6095?src=pr=desc) into 
[master](https://codecov.io/gh/apache/airflow/commit/6a66ece8cae8c8d209efefcde2639dca8919e975?src=pr=desc)
 will **increase** coverage by `70.49%`.
   > The diff coverage is `100%`.
   
   [![Impacted file tree 
graph](https://codecov.io/gh/apache/airflow/pull/6095/graphs/tree.svg?width=650=WdLKlKHOAU=150=pr)](https://codecov.io/gh/apache/airflow/pull/6095?src=pr=tree)
   
   ```diff
   @@             Coverage Diff             @@
   ##           master    #6095       +/-   ##
   ============================================
   + Coverage    9.55%   80.05%   +70.49%     
   ============================================
     Files         606      607        +1     
     Lines       34893    35039      +146     
   ============================================
   + Hits         3334    28050    +24716     
   + Misses      31559     6989    -24570     
   ```
   
   
   | [Impacted 
Files](https://codecov.io/gh/apache/airflow/pull/6095?src=pr=tree) | 
Coverage Δ | |
   |---|---|---|
   | 
[airflow/hooks/druid\_hook.py](https://codecov.io/gh/apache/airflow/pull/6095/diff?src=pr=tree#diff-YWlyZmxvdy9ob29rcy9kcnVpZF9ob29rLnB5)
 | `88.88% <100%> (+88.88%)` | :arrow_up: |
   | 
[airflow/models/\_\_init\_\_.py](https://codecov.io/gh/apache/airflow/pull/6095/diff?src=pr=tree#diff-YWlyZmxvdy9tb2RlbHMvX19pbml0X18ucHk=)
 | `100% <0%> (ø)` | :arrow_up: |
   | 
[airflow/kubernetes/volume\_mount.py](https://codecov.io/gh/apache/airflow/pull/6095/diff?src=pr=tree#diff-YWlyZmxvdy9rdWJlcm5ldGVzL3ZvbHVtZV9tb3VudC5weQ==)
 | `100% <0%> (ø)` | :arrow_up: |
   | 
[airflow/kubernetes/volume.py](https://codecov.io/gh/apache/airflow/pull/6095/diff?src=pr=tree#diff-YWlyZmxvdy9rdWJlcm5ldGVzL3ZvbHVtZS5weQ==)
 | `100% <0%> (ø)` | :arrow_up: |
   | 
[airflow/gcp/utils/mlengine\_prediction\_summary.py](https://codecov.io/gh/apache/airflow/pull/6095/diff?src=pr=tree#diff-YWlyZmxvdy9nY3AvdXRpbHMvbWxlbmdpbmVfcHJlZGljdGlvbl9zdW1tYXJ5LnB5)
 | `0% <0%> (ø)` | :arrow_up: |
   | 
[...low/contrib/operators/google\_api\_to\_s3\_transfer.py](https://codecov.io/gh/apache/airflow/pull/6095/diff?src=pr=tree#diff-YWlyZmxvdy9jb250cmliL29wZXJhdG9ycy9nb29nbGVfYXBpX3RvX3MzX3RyYW5zZmVyLnB5)
 | | |
   | 
[airflow/contrib/hooks/google\_discovery\_api\_hook.py](https://codecov.io/gh/apache/airflow/pull/6095/diff?src=pr=tree#diff-YWlyZmxvdy9jb250cmliL2hvb2tzL2dvb2dsZV9kaXNjb3ZlcnlfYXBpX2hvb2sucHk=)
 | | |
   | 
[airflow/gcp/example\_dags/example\_dataflow.py](https://codecov.io/gh/apache/airflow/pull/6095/diff?src=pr=tree#diff-YWlyZmxvdy9nY3AvZXhhbXBsZV9kYWdzL2V4YW1wbGVfZGF0YWZsb3cucHk=)
 | `0% <0%> (ø)` | |
   | 
[airflow/operators/google\_api\_to\_s3\_transfer.py](https://codecov.io/gh/apache/airflow/pull/6095/diff?src=pr=tree#diff-YWlyZmxvdy9vcGVyYXRvcnMvZ29vZ2xlX2FwaV90b19zM190cmFuc2Zlci5weQ==)
 | `100% <0%> (ø)` | |
   | 
[airflow/gcp/hooks/discovery\_api.py](https://codecov.io/gh/apache/airflow/pull/6095/diff?src=pr=tree#diff-YWlyZmxvdy9nY3AvaG9va3MvZGlzY292ZXJ5X2FwaS5weQ==)
 | `100% <0%> (ø)` | |
   | ... and [507 
more](https://codecov.io/gh/apache/airflow/pull/6095/diff?src=pr=tree-more) 
| |
   
   --
   
   [Continue to review full report at 
Codecov](https://codecov.io/gh/apache/airflow/pull/6095?src=pr=continue).
   > **Legend** - [Click here to learn 
more](https://docs.codecov.io/docs/codecov-delta)
   > `Δ = absolute <relative> (impact)`, `ø = not affected`, `? = missing data`
   > Powered by 
[Codecov](https://codecov.io/gh/apache/airflow/pull/6095?src=pr=footer). 
Last update 
[6a66ece...fd4cf8e](https://codecov.io/gh/apache/airflow/pull/6095?src=pr=lastupdated).
 Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments).
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [airflow] TobKed commented on issue #6142: [AIRFLOW-5515] Add stacklevel to GCP deprecation warnings

2019-09-18 Thread GitBox
TobKed commented on issue #6142: [AIRFLOW-5515] Add stacklevel to GCP 
deprecation warnings
URL: https://github.com/apache/airflow/pull/6142#issuecomment-532676438
 
 
   @kaxil thank you for noticing the mistakes. I've fixed them.


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[jira] [Created] (AIRFLOW-5515) Add stacklevel to GCP deprecation warnings

2019-09-18 Thread Tobiasz Kedzierski (Jira)
Tobiasz Kedzierski created AIRFLOW-5515:
---

 Summary: Add stacklevel to GCP deprecation warnings
 Key: AIRFLOW-5515
 URL: https://issues.apache.org/jira/browse/AIRFLOW-5515
 Project: Apache Airflow
  Issue Type: Improvement
  Components: gcp
Affects Versions: 1.10.5
Reporter: Tobiasz Kedzierski
Assignee: Tobiasz Kedzierski
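
For context, `stacklevel` tells Python's `warnings.warn` which stack frame to attribute
the warning to, so deprecation messages point at the caller rather than at Airflow's own
shim. A minimal sketch of the pattern, assuming nothing about the actual GCP hook code
(the function and parameter names below are illustrative only):

```python
import warnings


def get_conn(gcp_conn_id="google_cloud_default", google_cloud_storage_conn_id=None):
    """Illustrative shim that warns about a deprecated parameter name."""
    if google_cloud_storage_conn_id is not None:
        # stacklevel=2 makes the warning point at the caller's line instead of
        # this line inside the deprecated shim, which is what the issue asks for.
        warnings.warn(
            "google_cloud_storage_conn_id is deprecated; please use gcp_conn_id.",
            DeprecationWarning,
            stacklevel=2,
        )
        gcp_conn_id = google_cloud_storage_conn_id
    return gcp_conn_id
```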






--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[jira] [Resolved] (AIRFLOW-5516) Webserver crashes when there are too many dags or tasks to handle

2019-09-18 Thread Ash Berlin-Taylor (Jira)


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-5516?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Ash Berlin-Taylor resolved AIRFLOW-5516.

Resolution: Duplicate

We are working on this already via 
https://cwiki.apache.org/confluence/display/AIRFLOW/AIP-24+DAG+Persistence+in+DB+using+JSON+for+Airflow+Webserver+and+%28optional%29+Scheduler
 and hope to have it released in 1.10.6 (or .7 if we decide .6 needs to be a 
bug fix release first.)

> Webserver crashes when there are too many dags or tasks to handle
> -
>
> Key: AIRFLOW-5516
> URL: https://issues.apache.org/jira/browse/AIRFLOW-5516
> Project: Apache Airflow
>  Issue Type: Bug
>  Components: DAG, ui, webserver
>Affects Versions: 1.8.1, 1.8.2, 1.9.0, 1.10.0, 1.10.1, 1.10.2
> Environment: Google Cloud Composer
>Reporter: Vivien Morlet
>Priority: Major
>  Labels: DAG, Error, UI, tasks, webserver
> Attachments: ServerError502.png, badGateway.png
>
>
> Hello everyone,
>  
> I am using Apache Airflow with the orchestration service Composer of Google 
> Cloud.
> I have an issue with the UI Webserver. I noticed that the webserver crashes 
> when there are too many DAGs or tasks to handle; by "crashes" I mean that the 
> workers are still handling the tasks and DAGs are triggered but the Webserver 
> throws 502 errors (see related screenshots).
>  
> I have approximately 50 DAGs that Airflow parses, and some of those DAGs 
> sometimes have thousands of tasks to handle and schedule.
>  
> As I said, the workers are still doing their jobs, but it is totally 
> inconvenient (effectively impossible) to manage DAGs (e.g. trigger DAGs, show 
> graphs, etc.) without the UI Webserver, especially since the Composer service 
> on Google Cloud doesn't give full access to every Airflow tool.
>  
> Theoretically Airflow can handle an infinite number of tasks, but the 
> webserver visibly suffers.
> What can I do about this issue?
>  
> Thanks for your help, I hope that we will be able to fix this issue soon.
> Regards,
> Bibimorlet



--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[jira] [Commented] (AIRFLOW-401) scheduler gets stuck without a trace

2019-09-18 Thread Ash Berlin-Taylor (Jira)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-401?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16932357#comment-16932357
 ] 

Ash Berlin-Taylor commented on AIRFLOW-401:
---

[~ms-nmcalabroso] The problem you are seeing there may be caused by 
AIRFLOW-5447 (which affects the Kube and Local executors); the fix for it 
will be released soon in 1.10.6.

> scheduler gets stuck without a trace
> 
>
> Key: AIRFLOW-401
> URL: https://issues.apache.org/jira/browse/AIRFLOW-401
> Project: Apache Airflow
>  Issue Type: Bug
>  Components: executors, scheduler
>Affects Versions: 1.7.1.3
>Reporter: Nadeem Ahmed Nazeer
>Assignee: Bolke de Bruin
>Priority: Minor
>  Labels: celery, kombu
> Attachments: Dag_code.txt, schduler_cpu100%.png, scheduler_stuck.png, 
> scheduler_stuck_7hours.png
>
>
> The scheduler gets stuck without a trace or error. When this happens, the CPU 
> usage of scheduler service is at 100%. No jobs get submitted and everything 
> comes to a halt. Looks it goes into some kind of infinite loop. 
> The only way I could make it run again is by manually restarting the 
> scheduler service. But again, after running some tasks it gets stuck. I've 
> tried with both Celery and Local executors but same issue occurs. I am using 
> the -n 3 parameter while starting scheduler. 
> Scheduler configs,
> job_heartbeat_sec = 5
> scheduler_heartbeat_sec = 5
> executor = LocalExecutor
> parallelism = 32
> Please help. I would be happy to provide any other information needed



--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[GitHub] [airflow] kaxil edited a comment on issue #6144: [AIRFLOW-4858] Deprecate "Historical convenience functions" in airflow.configuration

2019-09-18 Thread GitBox
kaxil edited a comment on issue #6144: [AIRFLOW-4858] Deprecate "Historical 
convenience functions" in airflow.configuration
URL: https://github.com/apache/airflow/pull/6144#issuecomment-532662112
 
 
   Need to remove the following too:
   
   - 
https://github.com/apache/airflow/blob/74ee98d2b619dc1a4d203834afa45796b4733e05/tests/operators/test_bash_operator.py#L26
   - 
https://github.com/apache/airflow/blob/74ee98d2b619dc1a4d203834afa45796b4733e05/tests/operators/test_operators.py#L20


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [airflow] kaxil commented on issue #6144: [AIRFLOW-4858] Deprecate "Historical convenience functions" in airflow.configuration

2019-09-18 Thread GitBox
kaxil commented on issue #6144: [AIRFLOW-4858] Deprecate "Historical 
convenience functions" in airflow.configuration
URL: https://github.com/apache/airflow/pull/6144#issuecomment-532662112
 
 
   Need to remove the following too:
   
   - 
https://github.com/apache/airflow/blob/master/tests/operators/test_bash_operator.py#L26
   - 
https://github.com/apache/airflow/blob/master/tests/operators/test_operators.py#L20


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[jira] [Created] (AIRFLOW-5514) No implicit optional flag for mypy

2019-09-18 Thread Tobiasz Kedzierski (Jira)
Tobiasz Kedzierski created AIRFLOW-5514:
---

 Summary: No implicit optional flag for mypy 
 Key: AIRFLOW-5514
 URL: https://issues.apache.org/jira/browse/AIRFLOW-5514
 Project: Apache Airflow
  Issue Type: Improvement
  Components: core
Affects Versions: 1.10.5
Reporter: Tobiasz Kedzierski
Assignee: Tobiasz Kedzierski






--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[GitHub] [airflow] andrei-l commented on a change in pull request #5819: [AIRFLOW-5147] extended character set for for k8s worker pods annotations

2019-09-18 Thread GitBox
andrei-l commented on a change in pull request #5819: [AIRFLOW-5147] extended 
character set for for k8s worker pods annotations
URL: https://github.com/apache/airflow/pull/5819#discussion_r325607947
 
 

 ##
 File path: airflow/config_templates/default_airflow.cfg
 ##
 @@ -771,15 +771,17 @@ run_as_user =
 # that allows for the key to be read, e.g. 65533
 fs_group =
 
+# Annotations configuration as a single line formatted JSON object.
+# See the naming convention in:
+#   
https://kubernetes.io/docs/concepts/overview/working-with-objects/annotations/
+worker_annotations =
+
+
 [kubernetes_node_selectors]
 # The Key-value pairs to be given to worker pods.
 # The worker pods will be scheduled to the nodes of the specified key-value 
pairs.
 # Should be supplied in the format: key = value
 
-[kubernetes_annotations]
 
 Review comment:
   updated!
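   
   A minimal sketch of what the new single-line JSON value might hold and how it 
   could be consumed — the annotation keys below are made-up examples, and the 
   parsing shown is an assumption, not the PR's actual code:
   
   ```python
   import json
   
   # Hypothetical value of the new [kubernetes] worker_annotations option,
   # written as a single-line JSON object as the config comment describes.
   worker_annotations_raw = (
       '{"iam.amazonaws.com/role": "example-role", "prometheus.io/scrape": "true"}'
   )
   
   # Sketch of turning the raw string into a dict of annotations for worker
   # pods; the real parsing code in the PR may differ.
   annotations = json.loads(worker_annotations_raw) if worker_annotations_raw else {}
   print(annotations)
   ```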


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [airflow] ashb commented on issue #6144: [AIRFLOW-4858] Deprecate "Historical convenience functions" in airflow.configuration

2019-09-18 Thread GitBox
ashb commented on issue #6144: [AIRFLOW-4858] Deprecate "Historical convenience 
functions" in airflow.configuration
URL: https://github.com/apache/airflow/pull/6144#issuecomment-532672860
 
 
   I can update those, but they are using `configuration.conf.get(...)` so 
won't hit the deprecated functions. Should I update them anyway?
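   
   The distinction being discussed is roughly the one sketched below; the 
   section and option names are only examples, not code from the PR:
   
   ```python
   from airflow import configuration
   from airflow.configuration import conf
   
   # Deprecated "historical convenience function": a module-level wrapper that
   # this change makes emit a DeprecationWarning when called.
   dags_folder_old = configuration.get("core", "dags_folder")
   
   # The style the tests above already use: call the method on the conf object,
   # which does not go through the deprecated module-level wrappers.
   dags_folder_new = conf.get("core", "dags_folder")
   
   assert dags_folder_old == dags_folder_new
   ```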


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [airflow] TobKed commented on issue #6141: [AIRFLOW-5514] No implicit optional flag for mypy

2019-09-18 Thread GitBox
TobKed commented on issue #6141: [AIRFLOW-5514] No implicit optional flag for 
mypy
URL: https://github.com/apache/airflow/pull/6141#issuecomment-532615740
 
 
   [PEP-484](https://www.python.org/dev/peps/pep-0484/#id29) says:
   
   > A past version of this PEP allowed type checkers to assume an optional 
type when the default value is None.
   ...
   This is no longer the recommended behavior. Type checkers should move 
towards requiring the optional type to be made explicit.
   
   Other references:
   https://github.com/python/peps/pull/689/files
   https://github.com/apache/airflow/pull/5965#discussion_r319720604
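   
   As a concrete illustration of what the flag enforces (the function names here 
   are arbitrary, not taken from the Airflow code base):
   
   ```python
   from typing import Optional
   
   
   # Implicit optional: mypy with --no-implicit-optional rejects this signature,
   # because a default of None no longer silently widens `retries` to Optional[int].
   def run_task_implicit(retries: int = None):
       return retries
   
   
   # Explicit optional: the style PEP 484 now recommends and the flag enforces.
   def run_task_explicit(retries: Optional[int] = None) -> Optional[int]:
       return retries
   ```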


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[jira] [Created] (AIRFLOW-5516) Webserver crashes when there are too many dags or tasks to handle

2019-09-18 Thread Vivien Morlet (Jira)
Vivien Morlet created AIRFLOW-5516:
--

 Summary: Webserver crashes when there are too many dags or tasks 
to handle
 Key: AIRFLOW-5516
 URL: https://issues.apache.org/jira/browse/AIRFLOW-5516
 Project: Apache Airflow
  Issue Type: Bug
  Components: DAG, ui, webserver
Affects Versions: 1.10.2, 1.10.1, 1.10.0, 1.9.0, 1.8.2, 1.8.1
 Environment: Google Cloud Composer
Reporter: Vivien Morlet
 Attachments: ServerError502.png, badGateway.png

Hello everyone,

 

I am using Apache Airflow with the orchestration service Composer of Google 
Cloud.

I have an issue with the UI Webserver. I noticed that the webserver crashes 
when there are too many DAGs or tasks to handle; by "crashes" I mean that the 
workers are still handling the tasks and DAGs are triggered but the Webserver 
throws 502 errors (see related screenshots).

 

I have approximately 50 DAGs that Airflow parses, and some of those DAGs 
sometimes have thousands of tasks to handle and schedule.

As I said, the workers are still doing their jobs, but it is totally 
inconvenient (effectively impossible) to manage DAGs (e.g. trigger DAGs, show 
graphs, etc.) without the UI Webserver, especially since the Composer service 
on Google Cloud doesn't give full access to every Airflow tool.

Theoretically Airflow can handle an infinite number of tasks, but the 
webserver visibly suffers.
What can I do about this issue?

 

Thanks for your help, I hope that we will be able to fix this issue soon.

Regards,
Bibimorlet



--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[jira] [Commented] (AIRFLOW-401) scheduler gets stuck without a trace

2019-09-18 Thread Neil Calabroso (Jira)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-401?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16932286#comment-16932286
 ] 

Neil Calabroso commented on AIRFLOW-401:


Currently experiencing this issue in `Ubuntu 14.04` using `python 3.6.8`. This 
started when we upgraded our staging environment from `1.10.1` to `1.10.4`. 
We're using `LocalExecutor` and the process is handled by upstart.

I'm also getting the issue in the Web UI:  The scheduler does not appear to be 
running. Last heartbeat was received 9 minutes ago.

For this sample, I got 3 stuck processes:

 
{code:java}
root@airflow-staging/home/ubuntu# ps aux | grep scheduler
airflow  21595  0.2  1.3 469868 109976 ?   S09:52   0:04 
/usr/bin/python3.6 /usr/local/bin/airflow scheduler -n 5
airflow  21602  0.0  1.1 1500268 95992 ?   Tl   09:52   0:00 
/usr/bin/python3.6 /usr/local/bin/airflow scheduler -n 5
airflow  21648  0.0  1.1 467796 94628 ?S09:52   0:00 
/usr/bin/python3.6 /usr/local/bin/airflow scheduler -n 5
root 25735  0.0  0.0  10472   920 pts/3S+   10:24   0:00 grep 
--color=auto scheduler
{code}
 

Running py-spy to each process gives

 
{code:java}
Collecting samples from 'pid: 21595' (python v3.6.8)
Total Samples 500
GIL: 0.00%, Active: 100.00%, Threads: 1  %Own   %Total  OwnTime  TotalTime  
Function (filename:line)
100.00% 100.00%5.00s 5.00s   _recv (multiprocessing/connection.py:379)
  0.00% 100.00%   0.000s 5.00s   wrapper (airflow/utils/cli.py:74)
  0.00% 100.00%   0.000s 5.00s   scheduler (airflow/bin/cli.py:1013)
  0.00% 100.00%   0.000s 5.00s   end 
(airflow/executors/local_executor.py:233)
  0.00% 100.00%   0.000s 5.00s(airflow:32)
  0.00% 100.00%   0.000s 5.00s   recv (multiprocessing/connection.py:250)
  0.00% 100.00%   0.000s 5.00s   _execute 
(airflow/jobs/scheduler_job.py:1323)
  0.00% 100.00%   0.000s 5.00s   end 
(airflow/executors/local_executor.py:212)
  0.00% 100.00%   0.000s 5.00s   _callmethod 
(multiprocessing/managers.py:757)
  0.00% 100.00%   0.000s 5.00s   join (:2)
  0.00% 100.00%   0.000s 5.00s   _recv_bytes 
(multiprocessing/connection.py:407)
  0.00% 100.00%   0.000s 5.00s   _execute_helper 
(airflow/jobs/scheduler_job.py:1463)
  0.00% 100.00%   0.000s 5.00s   run (airflow/jobs/base_job.py:213){code}
 
{code:java}
root@airflow-staging:/home/ubuntu# py-spy --pid 21602
Error: Failed to suspend process
Reason: EPERM: Operation not permitted{code}
 
{code:java}
Collecting samples from 'pid: 21648' (python v3.6.8)
Total Samples 28381
GIL: 0.00%, Active: 100.00%, Threads: 1  %Own   %Total  OwnTime  TotalTime  
Function (filename:line)
100.00% 100.00%   283.8s283.8s   _try_wait (subprocess.py:1424)
  0.00% 100.00%   0.000s283.8s   call (subprocess.py:289)
  0.00% 100.00%   0.000s283.8s   start 
(airflow/executors/local_executor.py:184)
  0.00% 100.00%   0.000s283.8s   wrapper (airflow/utils/cli.py:74)
  0.00% 100.00%   0.000s283.8s   _bootstrap (multiprocessing/process.py:258)
  0.00% 100.00%   0.000s283.8s   _execute_helper 
(airflow/jobs/scheduler_job.py:1347)
  0.00% 100.00%   0.000s283.8s   execute_work 
(airflow/executors/local_executor.py:86)
  0.00% 100.00%   0.000s283.8s(airflow:32)
  0.00% 100.00%   0.000s283.8s   _launch (multiprocessing/popen_fork.py:73)
  0.00% 100.00%   0.000s283.8s   run (airflow/jobs/base_job.py:213)
  0.00% 100.00%   0.000s283.8s   check_call (subprocess.py:306)
  0.00% 100.00%   0.000s283.8s   start (multiprocessing/process.py:105)
  0.00% 100.00%   0.000s283.8s   run 
(airflow/executors/local_executor.py:116)
  0.00% 100.00%   0.000s283.8s   wait (subprocess.py:1477)
  0.00% 100.00%   0.000s283.8s   scheduler (airflow/bin/cli.py:1013)
  0.00% 100.00%   0.000s283.8s   _Popen (multiprocessing/context.py:277)
  0.00% 100.00%   0.000s283.8s   _Popen (multiprocessing/context.py:223)
  0.00% 100.00%   0.000s283.8s   start 
(airflow/executors/local_executor.py:224)
  0.00% 100.00%   0.000s283.8s   _execute 
(airflow/jobs/scheduler_job.py:1323)
  0.00% 100.00%   0.000s283.8s   __init__ (multiprocessing/popen_fork.py:19)
{code}
 

We will try to downgrade to `1.10.3` first and see if this problem persists.

 

> scheduler gets stuck without a trace
> 
>
> Key: AIRFLOW-401
> URL: https://issues.apache.org/jira/browse/AIRFLOW-401
> Project: Apache Airflow
>  Issue Type: Bug
>  Components: executors, scheduler
>Affects Versions: 1.7.1.3
>Reporter: Nadeem Ahmed Nazeer
>Assignee: Bolke de Bruin
>Priority: Minor
>  Labels: celery, kombu
> Attachments: Dag_code.txt, schduler_cpu100%.png, scheduler_stuck.png, 
> scheduler_stuck_7hours.png
>
>
> The scheduler gets stuck without a trace or error. 

[GitHub] [airflow] alrolorojas commented on issue #6100: [AIRFLOW-5387] Fix show paused pagination bug

2019-09-18 Thread GitBox
alrolorojas commented on issue #6100: [AIRFLOW-5387] Fix show paused pagination 
bug
URL: https://github.com/apache/airflow/pull/6100#issuecomment-532624568
 
 
   @feluelle @ashb I've addressed the feedback. Please take another look


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [airflow] alrolorojas edited a comment on issue #6100: [AIRFLOW-5387] Fix show paused pagination bug

2019-09-18 Thread GitBox
alrolorojas edited a comment on issue #6100: [AIRFLOW-5387] Fix show paused 
pagination bug
URL: https://github.com/apache/airflow/pull/6100#issuecomment-532644657
 
 
   > You could even use @conf_vars to annotate the test method to use the env 
var only in the whole test method. But either way it is better than before 
(try-finally...).
   
   @feluelle I did not find a way to pass arguments dynamically from 
`@parameterized.expand` to @conf_vars. Is there a way I can do that annotating 
the whole `conf_vars`?


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [airflow] alrolorojas commented on issue #6100: [AIRFLOW-5387] Fix show paused pagination bug

2019-09-18 Thread GitBox
alrolorojas commented on issue #6100: [AIRFLOW-5387] Fix show paused pagination 
bug
URL: https://github.com/apache/airflow/pull/6100#issuecomment-532644657
 
 
   > You could even use @conf_vars to annotate the test method to use the env 
var only in the whole test method. But either way it is better than before 
(try-finally...).
   
   @feluelle I did not find a way to pass arguments dynamically from 
`@parameterized.expand`. Is there a way I can do that annotating the whole 
`conf_vars`?


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [airflow] alrolorojas edited a comment on issue #6100: [AIRFLOW-5387] Fix show paused pagination bug

2019-09-18 Thread GitBox
alrolorojas edited a comment on issue #6100: [AIRFLOW-5387] Fix show paused 
pagination bug
URL: https://github.com/apache/airflow/pull/6100#issuecomment-532644657
 
 
   > You could even use @conf_vars to annotate the test method to use the env 
var only in the whole test method. But either way it is better than before 
(try-finally...).
   
   @feluelle I did not find a way to pass arguments dynamically from 
`@parameterized.expand` to @conf_vars when used as a decorator. Is there a way 
I can do that annotating the whole method with `conf_vars` and use an argument 
from @parameterized.expand within it?


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [airflow] alrolorojas edited a comment on issue #6100: [AIRFLOW-5387] Fix show paused pagination bug

2019-09-18 Thread GitBox
alrolorojas edited a comment on issue #6100: [AIRFLOW-5387] Fix show paused 
pagination bug
URL: https://github.com/apache/airflow/pull/6100#issuecomment-532644657
 
 
   > You could even use @conf_vars to annotate the test method to use the env 
var only in the whole test method. But either way it is better than before 
(try-finally...).
   
   @feluelle I did not find a way to pass arguments dynamically from 
`@parameterized.expand` to @conf_vars. Is there a way I can do that annotating 
the whole method with `conf_vars` and use an argument from 
@parameterized.expand within it?
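   
   One pattern that might address this — shown only as a sketch, assuming 
   `conf_vars` from `tests.test_utils.config` also works as a context manager — 
   is to apply the override inside the test body, where the parameterized 
   argument is in scope (the test class and option used below are illustrative):
   
   ```python
   import unittest
   
   from parameterized import parameterized
   
   from airflow.configuration import conf
   from tests.test_utils.config import conf_vars
   
   
   class TestShowPausedPagination(unittest.TestCase):
       @parameterized.expand([("True",), ("False",)])
       def test_hide_paused_setting(self, hide_by_default):
           # Using conf_vars as a context manager (instead of a decorator) lets
           # the value supplied by parameterized.expand feed the config override.
           with conf_vars({("webserver", "hide_paused_dags_by_default"): hide_by_default}):
               self.assertEqual(
                   conf.get("webserver", "hide_paused_dags_by_default"),
                   hide_by_default,
               )
   ```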


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [airflow] ashb opened a new pull request #6144: [AIRFLOW-4858] Deprecate "Historical convenience functions" in airflow.configuration

2019-09-18 Thread GitBox
ashb opened a new pull request #6144: [AIRFLOW-4858] Deprecate "Historical 
convenience functions" in airflow.configuration
URL: https://github.com/apache/airflow/pull/6144
 
 
   Make sure you have checked _all_ steps below.
   
   ### Jira
   
   - [x] https://issues.apache.org/jira/browse/AIRFLOW-4858
   
   ### Description
   
   - [x] These places were missed in the original PR (#5495)


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[jira] [Commented] (AIRFLOW-4858) Conf historical convenience functions are not deprecated properly

2019-09-18 Thread ASF GitHub Bot (Jira)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-4858?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16932374#comment-16932374
 ] 

ASF GitHub Bot commented on AIRFLOW-4858:
-

ashb commented on pull request #6144: [AIRFLOW-4858] Deprecate "Historical 
convenience functions" in airflow.configuration
URL: https://github.com/apache/airflow/pull/6144
 
 
   Make sure you have checked _all_ steps below.
   
   ### Jira
   
   - [x] https://issues.apache.org/jira/browse/AIRFLOW-4858
   
   ### Description
   
   - [x] These places were missed in the original PR (#5495)
 

This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


> Conf historical convenience functions are not deprecated properly
> -
>
> Key: AIRFLOW-4858
> URL: https://issues.apache.org/jira/browse/AIRFLOW-4858
> Project: Apache Airflow
>  Issue Type: Bug
>  Components: configuration
>Affects Versions: 1.10.3
>Reporter: Hao Liang
>Assignee: Hao Liang
>Priority: Minor
> Fix For: 1.10.6
>
>
> In conf we want to deprecate the "historical convenience functions" get, getint, 
> getboolean, etc. However, the current code doesn't issue a deprecation warning 
> if any of these functions is called.
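
A simplified sketch of the kind of shim this describes (not the exact code merged in
#5495 or #6144): each module-level helper forwards to `conf` and emits a warning first.

```python
import warnings

from airflow.configuration import conf


def get(section, key, **kwargs):
    """Historical convenience function; forwards to conf.get with a warning."""
    warnings.warn(
        "Accessing configuration method 'get' directly from the configuration "
        "module is deprecated. Please access the configuration from the "
        "'configuration.conf' object via 'conf.get'.",
        DeprecationWarning,
        stacklevel=2,
    )
    return conf.get(section, key, **kwargs)


# getint, getboolean, getfloat, etc. follow the same pattern.
```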



--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[GitHub] [airflow] kaxil commented on issue #6142: [AIRFLOW-5515] Add stacklevel to GCP deprecation warnings

2019-09-18 Thread GitBox
kaxil commented on issue #6142: [AIRFLOW-5515] Add stacklevel to GCP 
deprecation warnings
URL: https://github.com/apache/airflow/pull/6142#issuecomment-532663381
 
 
   Please follow the contribution guidelines and fill in the details above (such 
as the Description), and add a link to your Jira issue.
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[jira] [Commented] (AIRFLOW-4858) Conf historical convenience functions are not deprecated properly

2019-09-18 Thread ASF subversion and git services (Jira)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-4858?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16932439#comment-16932439
 ] 

ASF subversion and git services commented on AIRFLOW-4858:
--

Commit b2e06d0edcdbcfbe65e5135554bd45b3e2d76cd0 in airflow's branch 
refs/heads/master from Ash Berlin-Taylor
[ https://gitbox.apache.org/repos/asf?p=airflow.git;h=b2e06d0 ]

[AIRFLOW-4858] Deprecate "Historical convenience functions" in 
airflow.configuration (#6144)

These places were missed in the original PR (#5495)

> Conf historical convenience functions are not deprecated properly
> -
>
> Key: AIRFLOW-4858
> URL: https://issues.apache.org/jira/browse/AIRFLOW-4858
> Project: Apache Airflow
>  Issue Type: Bug
>  Components: configuration
>Affects Versions: 1.10.3
>Reporter: Hao Liang
>Assignee: Hao Liang
>Priority: Minor
> Fix For: 1.10.6
>
>
> In conf we want to deprecate the "historical convenience functions" get, getint, 
> getboolean, etc. However, the current code doesn't issue a deprecation warning 
> if any of these functions is called.



--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[jira] [Commented] (AIRFLOW-4858) Conf historical convenience functions are not deprecated properly

2019-09-18 Thread ASF GitHub Bot (Jira)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-4858?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16932438#comment-16932438
 ] 

ASF GitHub Bot commented on AIRFLOW-4858:
-

ashb commented on pull request #6144: [AIRFLOW-4858] Deprecate "Historical 
convenience functions" in airflow.configuration
URL: https://github.com/apache/airflow/pull/6144
 
 
   
 

This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


> Conf historical convenience functions are not deprecated properly
> -
>
> Key: AIRFLOW-4858
> URL: https://issues.apache.org/jira/browse/AIRFLOW-4858
> Project: Apache Airflow
>  Issue Type: Bug
>  Components: configuration
>Affects Versions: 1.10.3
>Reporter: Hao Liang
>Assignee: Hao Liang
>Priority: Minor
> Fix For: 1.10.6
>
>
> In conf we want to deprecate the "historical convenience functions" get, getint, 
> getboolean, etc. However, the current code doesn't issue a deprecation warning 
> if any of these functions is called.



--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[jira] [Commented] (AIRFLOW-5515) Add stacklevel to GCP deprecation warnings

2019-09-18 Thread ASF GitHub Bot (Jira)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-5515?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16932279#comment-16932279
 ] 

ASF GitHub Bot commented on AIRFLOW-5515:
-

TobKed commented on pull request #6142: [AIRFLOW-5515] Add stacklevel to GCP 
deprecation warnings
URL: https://github.com/apache/airflow/pull/6142
 
 
   Make sure you have checked _all_ steps below.
   
   ### Jira
   
   - [ ] My PR addresses the following [Airflow 
Jira](https://issues.apache.org/jira/browse/AIRFLOW/) issues and references 
them in the PR title. For example, "\[AIRFLOW-XXX\] My Airflow PR"
 - https://issues.apache.org/jira/browse/AIRFLOW-XXX
 - In case you are fixing a typo in the documentation you can prepend your 
commit with \[AIRFLOW-XXX\], code changes always need a Jira issue.
 - In case you are proposing a fundamental code change, you need to create 
an Airflow Improvement Proposal 
([AIP](https://cwiki.apache.org/confluence/display/AIRFLOW/Airflow+Improvements+Proposals)).
 - In case you are adding a dependency, check if the license complies with 
the [ASF 3rd Party License 
Policy](https://www.apache.org/legal/resolved.html#category-x).
   
   ### Description
   
   - [ ] Here are some details about my PR, including screenshots of any UI 
changes:
   
   ### Tests
   
   - [ ] My PR adds the following unit tests __OR__ does not need testing for 
this extremely good reason:
   
   ### Commits
   
   - [ ] My commits all reference Jira issues in their subject lines, and I 
have squashed multiple commits if they address the same issue. In addition, my 
commits follow the guidelines from "[How to write a good git commit 
message](http://chris.beams.io/posts/git-commit/)":
 1. Subject is separated from body by a blank line
 1. Subject is limited to 50 characters (not including Jira issue reference)
 1. Subject does not end with a period
 1. Subject uses the imperative mood ("add", not "adding")
 1. Body wraps at 72 characters
 1. Body explains "what" and "why", not "how"
   
   ### Documentation
   
   - [ ] In case of new functionality, my PR adds documentation that describes 
how to use it.
 - All the public functions and the classes in the PR contain docstrings 
that explain what it does
 - If you implement backwards incompatible changes, please leave a note in 
the [Updating.md](https://github.com/apache/airflow/blob/master/UPDATING.md) so 
we can assign it to an appropriate release
   
 

This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


> Add stacklevel to GCP deprecation warnings
> --
>
> Key: AIRFLOW-5515
> URL: https://issues.apache.org/jira/browse/AIRFLOW-5515
> Project: Apache Airflow
>  Issue Type: Improvement
>  Components: gcp
>Affects Versions: 1.10.5
>Reporter: Tobiasz Kedzierski
>Assignee: Tobiasz Kedzierski
>Priority: Minor
>




--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[GitHub] [airflow] TobKed opened a new pull request #6142: [AIRFLOW-5515] Add stacklevel to GCP deprecation warnings

2019-09-18 Thread GitBox
TobKed opened a new pull request #6142: [AIRFLOW-5515] Add stacklevel to GCP 
deprecation warnings
URL: https://github.com/apache/airflow/pull/6142
 
 
   Make sure you have checked _all_ steps below.
   
   ### Jira
   
   - [ ] My PR addresses the following [Airflow 
Jira](https://issues.apache.org/jira/browse/AIRFLOW/) issues and references 
them in the PR title. For example, "\[AIRFLOW-XXX\] My Airflow PR"
 - https://issues.apache.org/jira/browse/AIRFLOW-XXX
 - In case you are fixing a typo in the documentation you can prepend your 
commit with \[AIRFLOW-XXX\], code changes always need a Jira issue.
 - In case you are proposing a fundamental code change, you need to create 
an Airflow Improvement Proposal 
([AIP](https://cwiki.apache.org/confluence/display/AIRFLOW/Airflow+Improvements+Proposals)).
 - In case you are adding a dependency, check if the license complies with 
the [ASF 3rd Party License 
Policy](https://www.apache.org/legal/resolved.html#category-x).
   
   ### Description
   
   - [ ] Here are some details about my PR, including screenshots of any UI 
changes:
   
   ### Tests
   
   - [ ] My PR adds the following unit tests __OR__ does not need testing for 
this extremely good reason:
   
   ### Commits
   
   - [ ] My commits all reference Jira issues in their subject lines, and I 
have squashed multiple commits if they address the same issue. In addition, my 
commits follow the guidelines from "[How to write a good git commit 
message](http://chris.beams.io/posts/git-commit/)":
 1. Subject is separated from body by a blank line
 1. Subject is limited to 50 characters (not including Jira issue reference)
 1. Subject does not end with a period
 1. Subject uses the imperative mood ("add", not "adding")
 1. Body wraps at 72 characters
 1. Body explains "what" and "why", not "how"
   
   ### Documentation
   
   - [ ] In case of new functionality, my PR adds documentation that describes 
how to use it.
 - All the public functions and the classes in the PR contain docstrings 
that explain what it does
 - If you implement backwards incompatible changes, please leave a note in 
the [Updating.md](https://github.com/apache/airflow/blob/master/UPDATING.md) so 
we can assign it to an appropriate release
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[jira] [Resolved] (AIRFLOW-5437) When you run Breeze, no matter what image is build python 3.5 from checks overrides run version

2019-09-18 Thread Ash Berlin-Taylor (Jira)


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-5437?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Ash Berlin-Taylor resolved AIRFLOW-5437.

Fix Version/s: 1.10.6
   Resolution: Resolved

> When you run Breeze, no matter what image is build python 3.5 from checks 
> overrides run version
> ---
>
> Key: AIRFLOW-5437
> URL: https://issues.apache.org/jira/browse/AIRFLOW-5437
> Project: Apache Airflow
>  Issue Type: Bug
>  Components: ci
>Affects Versions: 2.0.0, 1.10.5
>Reporter: Jarek Potiuk
>Priority: Major
> Fix For: 1.10.6
>
>
> When you run Breeze, it rebuilds several images and overrides the Python 
> version to 3.5.



--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[jira] [Resolved] (AIRFLOW-5376) Codecov does not work for new CI :(

2019-09-18 Thread Ash Berlin-Taylor (Jira)


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-5376?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Ash Berlin-Taylor resolved AIRFLOW-5376.

Fix Version/s: 1.10.6
   Resolution: Resolved

> Codecov does not work for new CI  :(
> 
>
> Key: AIRFLOW-5376
> URL: https://issues.apache.org/jira/browse/AIRFLOW-5376
> Project: Apache Airflow
>  Issue Type: Bug
>  Components: ci
>Affects Versions: 2.0.0, 1.10.5
>Reporter: Jarek Potiuk
>Assignee: Jarek Potiuk
>Priority: Major
> Fix For: 1.10.6
>
>




--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[jira] [Resolved] (AIRFLOW-5364) Port number variables are not set for local CI scripts

2019-09-18 Thread Ash Berlin-Taylor (Jira)


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-5364?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Ash Berlin-Taylor resolved AIRFLOW-5364.

Fix Version/s: 1.10.6
   Resolution: Resolved

> Port number variables are not set for local CI scripts
> --
>
> Key: AIRFLOW-5364
> URL: https://issues.apache.org/jira/browse/AIRFLOW-5364
> Project: Apache Airflow
>  Issue Type: Bug
>  Components: ci
>Affects Versions: 2.0.0, 1.10.5
>Reporter: Jarek Potiuk
>Priority: Major
> Fix For: 1.10.6
>
>




--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[jira] [Resolved] (AIRFLOW-5449) Can't commit blank values on variable edits

2019-09-18 Thread Ash Berlin-Taylor (Jira)


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-5449?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Ash Berlin-Taylor resolved AIRFLOW-5449.

Fix Version/s: 1.10.6
   Resolution: Resolved

> Can't commit blank values on variable edits
> ---
>
> Key: AIRFLOW-5449
> URL: https://issues.apache.org/jira/browse/AIRFLOW-5449
> Project: Apache Airflow
>  Issue Type: Bug
>  Components: configuration
>Affects Versions: 1.10.5
>Reporter: Andrew Desousa
>Assignee: Andrew Desousa
>Priority: Minor
> Fix For: 1.10.6
>
>
> On Airflow, when going to "admin >> variables" and attempting to change the 
> value of an existing variable to nothing, the changes aren't being set 
> properly and saving the edit does nothing. This bug fix should allow users to 
> blank out the values of existing variables.



--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[jira] [Resolved] (AIRFLOW-5447) KubernetesExecutor hangs on task queueing

2019-09-18 Thread Ash Berlin-Taylor (Jira)


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-5447?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Ash Berlin-Taylor resolved AIRFLOW-5447.

Fix Version/s: 1.10.6
   Resolution: Resolved

> KubernetesExecutor hangs on task queueing
> -
>
> Key: AIRFLOW-5447
> URL: https://issues.apache.org/jira/browse/AIRFLOW-5447
> Project: Apache Airflow
>  Issue Type: Bug
>  Components: executor-kubernetes
>Affects Versions: 1.10.4, 1.10.5
> Environment: Kubernetes version v1.14.3, Airflow version 1.10.4-1.10.5
>Reporter: Henry Cohen
>Assignee: Daniel Imberman
>Priority: Blocker
> Fix For: 1.10.6
>
>
> Starting in 1.10.4, and continuing in 1.10.5, when using the 
> KubernetesExecutor, with the webserver and scheduler running in the 
> kubernetes cluster, tasks are scheduled, but when added to the task queue, 
> the executor process hangs indefinitely. Based on log messages, it appears to 
> be stuck at this line 
> https://github.com/apache/airflow/blob/v1-10-stable/airflow/contrib/executors/kubernetes_executor.py#L761



--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[jira] [Resolved] (AIRFLOW-5425) Use logging not printing in LoggingCommandExecutor

2019-09-18 Thread Ash Berlin-Taylor (Jira)


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-5425?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Ash Berlin-Taylor resolved AIRFLOW-5425.

Fix Version/s: 1.10.6
   Resolution: Resolved

> Use logging not printing in LoggingCommandExecutor
> --
>
> Key: AIRFLOW-5425
> URL: https://issues.apache.org/jira/browse/AIRFLOW-5425
> Project: Apache Airflow
>  Issue Type: Improvement
>  Components: tests
>Affects Versions: 1.10.5
>Reporter: Tomasz Urbaszek
>Priority: Trivial
> Fix For: 1.10.6
>
>




--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[GitHub] [airflow] kaxil commented on issue #6144: [AIRFLOW-4858] Deprecate "Historical convenience functions" in airflow.configuration

2019-09-18 Thread GitBox
kaxil commented on issue #6144: [AIRFLOW-4858] Deprecate "Historical 
convenience functions" in airflow.configuration
URL: https://github.com/apache/airflow/pull/6144#issuecomment-532677316
 
 
   > I can update those, but they are using `configuration.conf.get(...)` so 
won't hit the deprecated functions. Should I update them anyway?
   
   Oh yes, you are right. In that case, I don't mind either way.


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [airflow] TobKed commented on issue #6093: [AIRFLOW-5475] Normalize gcp_conn_id in operators and hooks

2019-09-18 Thread GitBox
TobKed commented on issue #6093: [AIRFLOW-5475] Normalize gcp_conn_id in 
operators and hooks
URL: https://github.com/apache/airflow/pull/6093#issuecomment-532677520
 
 
   cc @mik-laj 


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[jira] [Commented] (AIRFLOW-5369) Add interactivity to pre-commit image building

2019-09-18 Thread ASF GitHub Bot (Jira)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-5369?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16932331#comment-16932331
 ] 

ASF GitHub Bot commented on AIRFLOW-5369:
-

potiuk commented on pull request #5976: [AIRFLOW-5369] Add interactivity to 
pre-commits
URL: https://github.com/apache/airflow/pull/5976
 
 
   
 

This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


> Add interactivity to pre-commit image building
> --
>
> Key: AIRFLOW-5369
> URL: https://issues.apache.org/jira/browse/AIRFLOW-5369
> Project: Apache Airflow
>  Issue Type: Improvement
>  Components: ci
>Affects Versions: 2.0.0, 1.10.5
>Reporter: Jarek Potiuk
>Assignee: Jarek Potiuk
>Priority: Major
>
> Currently when images are out-dated for pre-commit it just fails with a message 
> on how to re-run it with rebuild next time. Also when you already run 
> pre-commit, ^C does not work as expected - the main script is killed but the 
> docker images running the checks continue running in the background until 
> they finish. This is pretty annoying as killing such running docker 
> containers is not trivial and for pylint/mypy/flake we can run multiple 
> containers if we run it on many modified files. This is because we are not 
> using dumb-init to run the checks.
>  
> This is discouraging a bit, so instead a bit of interactivity can be added:
> 1) If image gets out-dated a question is asked whether to rebuild it while 
> pre-commit is executed
> 2) If you run pre-commit directly you do not get asked for rebuild because 
> you can run multiple pre-commit scripts in parallel  (pre-commit does it) so 
> you should fail fast (with helpful instructions)
> 3) If you run pre-commit via breeze, it is optimised because only the image 
> that is actually needed is rebuilt (and question is asked then)
> 4) Additionally - you should be able to press ^C and kill all containers 
> running in the background as well as the main script. 



--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[jira] [Commented] (AIRFLOW-5369) Add interactivity to pre-commit image building

2019-09-18 Thread ASF subversion and git services (Jira)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-5369?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16932332#comment-16932332
 ] 

ASF subversion and git services commented on AIRFLOW-5369:
--

Commit 857788e305bbefe4566a1988e2072a21c7aab319 in airflow's branch 
refs/heads/master from Jarek Potiuk
[ https://gitbox.apache.org/repos/asf?p=airflow.git;h=857788e ]

[AIRFLOW-5369] Adds interactivity to pre-commits (#5976)

This commit adds full interactivity to pre-commits. Whenever you run pre-commit
and it detects that the image should be rebuild, an interactive question will
pop up instead of failing the build and asking to rebuild with REBUILD=yes

This is much nicer from the user perspective. You can choose whether to:
1) Rebuild the image (which will take some time)
2) Not rebuild the image (this will use the old image with hope it's OK)
3) Quit.

The answer to that question is carried across all images that need rebuilding.
There is a special "build" pre-commit hook that takes care of that.

Note that this interactive question cannot be asked if you run only a
single pre-commit hook with Dockerfile, because it can run multiple processes
and you can start building in parallel. This is not desired, so instead we fail
such builds.

> Add interactivity to pre-commit image building
> --
>
> Key: AIRFLOW-5369
> URL: https://issues.apache.org/jira/browse/AIRFLOW-5369
> Project: Apache Airflow
>  Issue Type: Improvement
>  Components: ci
>Affects Versions: 2.0.0, 1.10.5
>Reporter: Jarek Potiuk
>Assignee: Jarek Potiuk
>Priority: Major
>
> Currently when images are out-dated for pre-commit it just fails with a message 
> on how to re-run it with rebuild next time. Also when you already run 
> pre-commit, ^C does not work as expected - the main script is killed but the 
> docker images running the checks continue running in the background until 
> they finish. This is pretty annoying as killing such running docker 
> containers is not trivial and for pylint/mypy/flake we can run multiple 
> containers if we run it on many modified files. This is because we are not 
> using dumb-init to run the checks.
>  
> This is discouraging a bit, so instead a bit of interactivity can be added:
> 1) If image gets out-dated a question is asked whether to rebuild it while 
> pre-commit is executed
> 2) If you run pre-commit directly you do not get asked for rebuild because 
> you can run multiple pre-commit scripts in parallel  (pre-commit does it) so 
> you should fail fast (with helpful instructions)
> 3) If you run pre-commit via breeze, it is optimised because only the image 
> that is actually needed is rebuilt (and question is asked then)
> 4) Additionally - you should be able to press ^C and kill all containers 
> running in the background as well as the main script. 



--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[GitHub] [airflow] potiuk merged pull request #5976: [AIRFLOW-5369] Add interactivity to pre-commits

2019-09-18 Thread GitBox
potiuk merged pull request #5976: [AIRFLOW-5369] Add interactivity to 
pre-commits
URL: https://github.com/apache/airflow/pull/5976
 
 
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [airflow] ashb commented on issue #5908: Revert "[AIRFLOW-4797] Improve performance and behaviour of zombie de…

2019-09-18 Thread GitBox
ashb commented on issue #5908: Revert "[AIRFLOW-4797] Improve performance and 
behaviour of zombie de…
URL: https://github.com/apache/airflow/pull/5908#issuecomment-532654231
 
 
   What have we decided to do about zombie detection? Is it okay as it is?


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [airflow] ashb commented on issue #6021: [AIRFLOW-5416] pre-load requirements for airflow image

2019-09-18 Thread GitBox
ashb commented on issue #6021: [AIRFLOW-5416] pre-load requirements for airflow 
image
URL: https://github.com/apache/airflow/pull/6021#issuecomment-532656472
 
 
   Please follow the commit message convention.


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [airflow] kaxil edited a comment on issue #6144: [AIRFLOW-4858] Deprecate "Historical convenience functions" in airflow.configuration

2019-09-18 Thread GitBox
kaxil edited a comment on issue #6144: [AIRFLOW-4858] Deprecate "Historical 
convenience functions" in airflow.configuration
URL: https://github.com/apache/airflow/pull/6144#issuecomment-532662112
 
 
   Need to update the following too:
   
   - 
https://github.com/apache/airflow/blob/74ee98d2b619dc1a4d203834afa45796b4733e05/tests/operators/test_bash_operator.py#L26
   - 
https://github.com/apache/airflow/blob/74ee98d2b619dc1a4d203834afa45796b4733e05/tests/operators/test_operators.py#L20


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[jira] [Updated] (AIRFLOW-5511) MSSQL resetdb /initdb for 1.10.5 fails to create tables

2019-09-18 Thread ROHIT K RAMWAL (Jira)


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-5511?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

ROHIT K RAMWAL updated AIRFLOW-5511:

Priority: Minor  (was: Major)

> MSSQL resetdb /initdb for 1.10.5 fails to create tables
> ---
>
> Key: AIRFLOW-5511
> URL: https://issues.apache.org/jira/browse/AIRFLOW-5511
> Project: Apache Airflow
>  Issue Type: Bug
>  Components: celery, database
>Affects Versions: 1.10.5
>Reporter: ROHIT K RAMWAL
>Priority: Minor
> Attachments: airflowbug
>
>
> Airflow 1.10.5 installation with MSSQL backend and celery fails with below 
> error .
>  
> Traceback (most recent call last):
>   File "/bin/airflow", line 32, in 
>     args.func(args)
>   File "/usr/lib/python2.7/site-packages/airflow/bin/cli.py", line 1112, in 
> resetdb
>     db.resetdb(settings.RBAC)
>   File "/usr/lib/python2.7/site-packages/airflow/utils/db.py", line 406, in 
> resetdb
>     initdb(rbac)
>   File "/usr/lib/python2.7/site-packages/airflow/utils/db.py", line 106, in 
> initdb
>     upgradedb()
>   File "/usr/lib/python2.7/site-packages/airflow/utils/db.py", line 377, in 
> upgradedb
>     command.upgrade(config, 'heads')
>   File "/usr/lib/python2.7/site-packages/alembic/command.py", line 279, in 
> upgrade
>     script.run_env()
>   File "/usr/lib/python2.7/site-packages/alembic/script/base.py", line 475, 
> in run_env
>     util.load_python_file(self.dir, "env.py")
>   File "/usr/lib/python2.7/site-packages/alembic/util/pyfiles.py", line 98, 
> in load_python_file
>     module = load_module_py(module_id, path)
>   File "/usr/lib/python2.7/site-packages/alembic/util/compat.py", line 240, 
> in load_module_py
>     mod = imp.load_source(module_id, path, fp)
>   File "/usr/lib/python2.7/site-packages/airflow/migrations/env.py", line 92, 
> in 
>     run_migrations_online()
>   File "/usr/lib/python2.7/site-packages/airflow/migrations/env.py", line 86, 
> in run_migrations_online
>     context.run_migrations()
>   File "", line 8, in run_migrations
>   File "/usr/lib/python2.7/site-packages/alembic/runtime/environment.py", 
> line 846, in run_migrations
>     self.get_context().run_migrations(**kw)
>   File "/usr/lib/python2.7/site-packages/alembic/runtime/migration.py", line 
> 365, in run_migrations
>     step.migration_fn(**kw)
>   File 
> "/usr/lib/python2.7/site-packages/airflow/migrations/versions/6e96a59344a4_make_taskinstance_pool_not_nullable.py",
>  line 101, in upgrade
>     nullable=False,
>   File "/usr/lib64/python2.7/contextlib.py", line 24, in __exit__
>     self.gen.next()
>   File "/usr/lib/python2.7/site-packages/alembic/operations/base.py", line 
> 325, in batch_alter_table
>     impl.flush()
>   File "/usr/lib/python2.7/site-packages/alembic/operations/batch.py", line 
> 79, in flush
>     fn(*arg, **kw)
>   File "/usr/lib/python2.7/site-packages/alembic/ddl/mssql.py", line 85, in 
> alter_column
>     **kw
>   File "/usr/lib/python2.7/site-packages/alembic/ddl/impl.py", line 172, in 
> alter_column
>     existing_comment=existing_comment,
>   File "/usr/lib/python2.7/site-packages/alembic/ddl/mssql.py", line 36, in 
> _exec
>     result = super(MSSQLImpl, self)._exec(construct, *args, **kw)
>   File "/usr/lib/python2.7/site-packages/alembic/ddl/impl.py", line 134, in 
> _exec
>     return conn.execute(construct, *multiparams, **params)
>   File "/usr/lib64/python2.7/site-packages/sqlalchemy/engine/base.py", line 
> 988, in execute
>     return meth(self, multiparams, params)
>   File "/usr/lib64/python2.7/site-packages/sqlalchemy/sql/ddl.py", line 72, 
> in _execute_on_connection
>     return connection._execute_ddl(self, multiparams, params)
>   File "/usr/lib64/python2.7/site-packages/sqlalchemy/engine/base.py", line 
> 1050, in _execute_ddl
>     compiled,
>   File "/usr/lib64/python2.7/site-packages/sqlalchemy/engine/base.py", line 
> 1253, in _execute_context
>     e, statement, parameters, cursor, context
>   File "/usr/lib64/python2.7/site-packages/sqlalchemy/engine/base.py", line 
> 1473, in _handle_dbapi_exception
>     util.raise_from_cause(sqlalchemy_exception, exc_info)
>   File "/usr/lib64/python2.7/site-packages/sqlalchemy/util/compat.py", line 
> 398, in raise_from_cause
>     reraise(type(exception), exception, tb=exc_tb, cause=cause)
>   File "/usr/lib64/python2.7/site-packages/sqlalchemy/engine/base.py", line 
> 1249, in _execute_context
>     cursor, statement, parameters, context
>   File "/usr/lib64/python2.7/site-packages/sqlalchemy/engine/default.py", 
> line 552, in do_execute
>     cursor.execute(statement, parameters)
> sqlalchemy.exc.ProgrammingError: (pyodbc.ProgrammingError) ('42000', "[42000] 
> [Microsoft][ODBC Driver 17 for SQL Server][SQL Server]The index 'ti_pool' is 
> dependent on column 'pool'. (5074) (SQLExecDirectW)")
> [SQL: 
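
The failure above is the generic MSSQL restriction that a column cannot be altered while
an index depends on it. A hedged sketch of the usual migration-level workaround — drop
the dependent index, alter the column, recreate the index — illustrative only, not
necessarily the fix Airflow will ship, and the exact column list of ti_pool may differ
by version:

```python
import sqlalchemy as sa
from alembic import op


def upgrade():
    # Drop the index that depends on task_instance.pool so MSSQL allows the
    # ALTER COLUMN, then recreate it once the column is NOT NULL.
    op.drop_index("ti_pool", table_name="task_instance")
    with op.batch_alter_table("task_instance") as batch_op:
        batch_op.alter_column("pool", existing_type=sa.String(50), nullable=False)
    op.create_index("ti_pool", "task_instance", ["pool", "state", "priority_weight"])
```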

[GitHub] [airflow] codecov-io commented on issue #6066: [AIRFLOW-XXX] Update to new logo

2019-09-18 Thread GitBox
codecov-io commented on issue #6066: [AIRFLOW-XXX] Update to new logo
URL: https://github.com/apache/airflow/pull/6066#issuecomment-532678459
 
 
   # [Codecov](https://codecov.io/gh/apache/airflow/pull/6066?src=pr=h1) 
Report
   > :exclamation: No coverage uploaded for pull request base 
(`master@e326633`). [Click here to learn what that 
means](https://docs.codecov.io/docs/error-reference#section-missing-base-commit).
   > The diff coverage is `n/a`.
   
   [![Impacted file tree 
graph](https://codecov.io/gh/apache/airflow/pull/6066/graphs/tree.svg?width=650=WdLKlKHOAU=150=pr)](https://codecov.io/gh/apache/airflow/pull/6066?src=pr=tree)
   
   ```diff
   @@Coverage Diff@@
   ## master#6066   +/-   ##
   =
 Coverage  ?   80.03%   
   =
 Files ?  607   
 Lines ?35032   
 Branches  ?0   
   =
 Hits  ?28039   
 Misses? 6993   
 Partials  ?0
   ```
   
   
   | [Impacted 
Files](https://codecov.io/gh/apache/airflow/pull/6066?src=pr=tree) | 
Coverage Δ | |
   |---|---|---|
   | 
[airflow/operators/slack\_operator.py](https://codecov.io/gh/apache/airflow/pull/6066/diff?src=pr=tree#diff-YWlyZmxvdy9vcGVyYXRvcnMvc2xhY2tfb3BlcmF0b3IucHk=)
 | `100% <ø> (ø)` | |
   
   --
   
   [Continue to review full report at 
Codecov](https://codecov.io/gh/apache/airflow/pull/6066?src=pr=continue).
   > **Legend** - [Click here to learn 
more](https://docs.codecov.io/docs/codecov-delta)
   > `Δ = absolute  (impact)`, `ø = not affected`, `? = missing data`
   > Powered by 
[Codecov](https://codecov.io/gh/apache/airflow/pull/6066?src=pr=footer). 
Last update 
[e326633...c0f95a3](https://codecov.io/gh/apache/airflow/pull/6066?src=pr=lastupdated).
 Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments).
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [airflow] ashb merged pull request #6144: [AIRFLOW-4858] Deprecate "Historical convenience functions" in airflow.configuration

2019-09-18 Thread GitBox
ashb merged pull request #6144: [AIRFLOW-4858] Deprecate "Historical 
convenience functions" in airflow.configuration
URL: https://github.com/apache/airflow/pull/6144
 
 
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [airflow] codecov-io edited a comment on issue #6066: [AIRFLOW-XXX] Update to new logo

2019-09-18 Thread GitBox
codecov-io edited a comment on issue #6066: [AIRFLOW-XXX] Update to new logo
URL: https://github.com/apache/airflow/pull/6066#issuecomment-532678459
 
 
   # [Codecov](https://codecov.io/gh/apache/airflow/pull/6066?src=pr=h1) 
Report
   > :exclamation: No coverage uploaded for pull request base 
(`master@e326633`). [Click here to learn what that 
means](https://docs.codecov.io/docs/error-reference#section-missing-base-commit).
   > The diff coverage is `n/a`.
   
   [![Impacted file tree 
graph](https://codecov.io/gh/apache/airflow/pull/6066/graphs/tree.svg?width=650=WdLKlKHOAU=150=pr)](https://codecov.io/gh/apache/airflow/pull/6066?src=pr=tree)
   
   ```diff
   @@Coverage Diff@@
   ## master#6066   +/-   ##
   =
 Coverage  ?   80.03%   
   =
 Files ?  607   
 Lines ?35032   
 Branches  ?0   
   =
 Hits  ?28039   
 Misses? 6993   
 Partials  ?0
   ```
   
   
   | [Impacted 
Files](https://codecov.io/gh/apache/airflow/pull/6066?src=pr=tree) | 
Coverage Δ | |
   |---|---|---|
   | 
[airflow/operators/slack\_operator.py](https://codecov.io/gh/apache/airflow/pull/6066/diff?src=pr=tree#diff-YWlyZmxvdy9vcGVyYXRvcnMvc2xhY2tfb3BlcmF0b3IucHk=)
 | `100% <ø> (ø)` | |
   
   --
   
   [Continue to review full report at 
Codecov](https://codecov.io/gh/apache/airflow/pull/6066?src=pr=continue).
   > **Legend** - [Click here to learn 
more](https://docs.codecov.io/docs/codecov-delta)
   > `Δ = absolute  (impact)`, `ø = not affected`, `? = missing data`
   > Powered by 
[Codecov](https://codecov.io/gh/apache/airflow/pull/6066?src=pr=footer). 
Last update 
[e326633...c0f95a3](https://codecov.io/gh/apache/airflow/pull/6066?src=pr=lastupdated).
 Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments).
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[jira] [Updated] (AIRFLOW-5462) Google Authentication redirection page crashes with KeyError: 'login'

2019-09-18 Thread Qian Yu (Jira)


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-5462?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Qian Yu updated AIRFLOW-5462:
-
Priority: Minor  (was: Major)

> Google Authentication redirection page crashes with KeyError: 'login'
> -
>
> Key: AIRFLOW-5462
> URL: https://issues.apache.org/jira/browse/AIRFLOW-5462
> Project: Apache Airflow
>  Issue Type: Bug
>  Components: authentication
>Affects Versions: 1.10.3, 1.10.4, 1.10.5
>Reporter: Qian Yu
>Priority: Minor
>
> When a user is logged out and tries to access the URL of a DAG's TreeView, they are 
> redirected to the login page. However, after they log in with Google 
> authentication, the page redirects to the oauth-authorized/login page and 
> crashes with a mushroom cloud.
>  
> Expected:
> Redirects back to the original URL the user wanted to access
> Actual:
> Crash with mushroom cloud
>  
> Configuration:
> {code:python}
> [webserver]
> authenticate = True
> auth_backend = airflow.contrib.auth.backends.google_auth
> [google]
> client_id = google_client_id
> client_secret = google_client_secret
> oauth_callback_route = /oauth2callback
> domain = "example1.com,example2.com"
> {code}
>  
> The issue looks similar to the one described here: 
> [https://github.com/apache/incubator-superset/issues/7739]
> {code:python}
>   / (  ()   )  \___
>  /( (  (  )   _))  )   )\
>(( (   )()  )   (   )  )
>  ((/  ( _(   )   (   _) ) (  () )  )
> ( (  ( (_)   (((   )  .((_ ) .  )_
>( (  )(  (  ))   ) . ) (   )
>   (  (   (  (   ) (  _  ( _) ).  ) . ) ) ( )
>   ( (  (   ) (  )   (  )) ) _)(   )  )  )
>  ( (  ( \ ) ((_  ( ) ( )  )   ) )  )) ( )
>   (  (   (  (   (_ ( ) ( _)  ) (  )  )   )
>  ( (  ( (  (  ) (_  )  ) )  _)   ) _( ( )
>   ((  (   )(( _)   _) _(_ (  (_ )
>(_((__(_(__(( ( ( |  ) ) ) )_))__))_)___)
>((__)\\||lll|l||///  \_))
> (   /(/ (  )  ) )\   )
>   (( ( ( | | ) ) )\   )
>(   /(| / ( )) ) ) )) )
>  ( ( _(|)_) )
>   (  ||\(|(|)|/|| )
> (|(||(||))
>   ( //|/l|||)|\\ \ )
> (/ / //  /|//\\  \ \  \ _)
> ---
> Node: ...
> ---
> Traceback (most recent call last):
>   File "/lib/python3.6/site-packages/flask/app.py", line 2446, in wsgi_app
> response = self.full_dispatch_request()
>   File "/lib/python3.6/site-packages/flask/app.py", line 1951, in 
> full_dispatch_request
> rv = self.handle_user_exception(e)
>   File "/lib/python3.6/site-packages/flask/app.py", line 1820, in 
> handle_user_exception
> reraise(exc_type, exc_value, tb)
>   File "/lib/python3.6/site-packages/flask/_compat.py", line 39, in reraise
> raise value
>   File "/lib/python3.6/site-packages/flask/app.py", line 1949, in 
> full_dispatch_request
> rv = self.dispatch_request()
>   File "/lib/python3.6/site-packages/flask/app.py", line 1935, in 
> dispatch_request
> return self.view_functions[rule.endpoint](**req.view_args)
>   File "/lib/python3.6/site-packages/flask_appbuilder/security/views.py", 
> line 633, in oauth_authorized
> resp = self.appbuilder.sm.oauth_remotes[provider].authorized_response()
> KeyError: 'login'
> {code}



--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[GitHub] [airflow] ashb commented on a change in pull request #6130: [AIRFLOW-5508] Add a whitelist mechanism for the StatsD metrics

2019-09-18 Thread GitBox
ashb commented on a change in pull request #6130: [AIRFLOW-5508] Add a 
whitelist mechanism for the StatsD metrics
URL: https://github.com/apache/airflow/pull/6130#discussion_r325678548
 
 

 ##
 File path: scripts/ci/kubernetes/kube/templates/configmaps.template.yaml
 ##
 @@ -54,6 +54,9 @@ data:
 statsd_port = 8125
 statsd_prefix = airflow
 
+# optional comma-separated prefix whitelist (e.g: 
scheduler,executor,dagrun)
+statsd_whitelist =
 
 Review comment:
   You don't need to add this here.


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [airflow] galak75 commented on issue #4743: [AIRFLOW-3871] render Operators template fields recursively

2019-09-18 Thread GitBox
galak75 commented on issue #4743: [AIRFLOW-3871] render Operators template 
fields recursively
URL: https://github.com/apache/airflow/pull/4743#issuecomment-532693090
 
 
   > Sorry it took so many months to get this in, and thanks for sticking with 
us!
   
   Sometimes, things don't go as smoothly as we want...
   I'm happy this PR has been merged!
   Thank you
   :tada: 
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [airflow] yuqian90 commented on a change in pull request #6064: [AIRFLOW-5444] Fix action_logging so that request.form for POST is logged

2019-09-18 Thread GitBox
yuqian90 commented on a change in pull request #6064: [AIRFLOW-5444] Fix 
action_logging so that request.form for POST is logged
URL: https://github.com/apache/airflow/pull/6064#discussion_r325690143
 
 

 ##
 File path: airflow/www/decorators.py
 ##
 @@ -39,17 +39,19 @@ def wrapper(*args, **kwargs):
 else:
 user = g.user.username
 
+params = request.form if request.method == 'POST' else request.args
+
 log = Log(
 event=f.__name__,
 task_instance=None,
 owner=user,
-extra=str(list(request.args.items())),
-task_id=request.args.get('task_id'),
-dag_id=request.args.get('dag_id'))
+extra=str(list(params.items())),
 
 Review comment:
   Thanks for the suggestion. I updated the code.
   
   Regarding your other concern: `this will capture passwords and other 
sensitive info`
   I don't have a great way of addressing that, so I did an unscientific 
search of the code base: the @action_logging decorator is not used on the 
ConnectionModelView, so sensitive values entered in that form are not logged.
   
   `$ grep -A5 action_logging airflow/www/views.py`
   
   I also started a test server, entered some info in ConnectionModelView, 
and checked that nothing is logged. 
   That said, please do let me know if you have better ways to address or verify 
this.
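   
   A possible belt-and-braces option (just a sketch of one way to address the concern, not what this PR does) would be to scrub obviously sensitive keys before building the `Log` row; the key names below are assumptions, not Airflow's actual field names:
   
   ```python
   # Hypothetical scrubbing helper; the set of sensitive key names is an assumption.
   SENSITIVE_KEYS = {"password", "conn_password", "extra"}

   def scrub(params):
       """Mask likely-sensitive form fields before they are written to the Log table."""
       return {
           key: "***" if key.lower() in SENSITIVE_KEYS else value
           for key, value in params.items()
       }
   ```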


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [airflow] bjoernpollex-sc commented on issue #4743: [AIRFLOW-3871] render Operators template fields recursively

2019-09-18 Thread GitBox
bjoernpollex-sc commented on issue #4743: [AIRFLOW-3871] render Operators 
template fields recursively
URL: https://github.com/apache/airflow/pull/4743#issuecomment-532723570
 
 
   Same here, this is really awesome, can't wait for the next release!
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [airflow] codecov-io edited a comment on issue #6064: [AIRFLOW-5444] Fix action_logging so that request.form for POST is logged

2019-09-18 Thread GitBox
codecov-io edited a comment on issue #6064: [AIRFLOW-5444] Fix action_logging 
so that request.form for POST is logged
URL: https://github.com/apache/airflow/pull/6064#issuecomment-529516855
 
 
   # [Codecov](https://codecov.io/gh/apache/airflow/pull/6064?src=pr=h1) 
Report
   > Merging 
[#6064](https://codecov.io/gh/apache/airflow/pull/6064?src=pr=desc) into 
[master](https://codecov.io/gh/apache/airflow/commit/b2e06d0edcdbcfbe65e5135554bd45b3e2d76cd0?src=pr=desc)
 will **decrease** coverage by `0.39%`.
   > The diff coverage is `100%`.
   
   [![Impacted file tree 
graph](https://codecov.io/gh/apache/airflow/pull/6064/graphs/tree.svg?width=650=WdLKlKHOAU=150=pr)](https://codecov.io/gh/apache/airflow/pull/6064?src=pr=tree)
   
   ```diff
   @@Coverage Diff@@
   ##   master#6064 +/-   ##
   =
   - Coverage   80.04%   79.65%   -0.4% 
   =
 Files 607  607 
 Lines   3503335033 
   =
   - Hits2804327904-139 
   - Misses   6990 7129+139
   ```
   
   
   | [Impacted 
Files](https://codecov.io/gh/apache/airflow/pull/6064?src=pr=tree) | 
Coverage Δ | |
   |---|---|---|
   | 
[airflow/www/decorators.py](https://codecov.io/gh/apache/airflow/pull/6064/diff?src=pr=tree#diff-YWlyZmxvdy93d3cvZGVjb3JhdG9ycy5weQ==)
 | `74.5% <100%> (ø)` | :arrow_up: |
   | 
[airflow/operators/postgres\_operator.py](https://codecov.io/gh/apache/airflow/pull/6064/diff?src=pr=tree#diff-YWlyZmxvdy9vcGVyYXRvcnMvcG9zdGdyZXNfb3BlcmF0b3IucHk=)
 | `0% <0%> (-100%)` | :arrow_down: |
   | 
[airflow/kubernetes/volume\_mount.py](https://codecov.io/gh/apache/airflow/pull/6064/diff?src=pr=tree#diff-YWlyZmxvdy9rdWJlcm5ldGVzL3ZvbHVtZV9tb3VudC5weQ==)
 | `44.44% <0%> (-55.56%)` | :arrow_down: |
   | 
[airflow/kubernetes/volume.py](https://codecov.io/gh/apache/airflow/pull/6064/diff?src=pr=tree#diff-YWlyZmxvdy9rdWJlcm5ldGVzL3ZvbHVtZS5weQ==)
 | `52.94% <0%> (-47.06%)` | :arrow_down: |
   | 
[airflow/kubernetes/pod\_launcher.py](https://codecov.io/gh/apache/airflow/pull/6064/diff?src=pr=tree#diff-YWlyZmxvdy9rdWJlcm5ldGVzL3BvZF9sYXVuY2hlci5weQ==)
 | `45.25% <0%> (-46.72%)` | :arrow_down: |
   | 
[airflow/kubernetes/kube\_client.py](https://codecov.io/gh/apache/airflow/pull/6064/diff?src=pr=tree#diff-YWlyZmxvdy9rdWJlcm5ldGVzL2t1YmVfY2xpZW50LnB5)
 | `33.33% <0%> (-41.67%)` | :arrow_down: |
   | 
[...rflow/contrib/operators/kubernetes\_pod\_operator.py](https://codecov.io/gh/apache/airflow/pull/6064/diff?src=pr=tree#diff-YWlyZmxvdy9jb250cmliL29wZXJhdG9ycy9rdWJlcm5ldGVzX3BvZF9vcGVyYXRvci5weQ==)
 | `70.14% <0%> (-28.36%)` | :arrow_down: |
   | 
[airflow/hooks/postgres\_hook.py](https://codecov.io/gh/apache/airflow/pull/6064/diff?src=pr=tree#diff-YWlyZmxvdy9ob29rcy9wb3N0Z3Jlc19ob29rLnB5)
 | `94.73% <0%> (-1.76%)` | :arrow_down: |
   | 
[airflow/utils/sqlalchemy.py](https://codecov.io/gh/apache/airflow/pull/6064/diff?src=pr=tree#diff-YWlyZmxvdy91dGlscy9zcWxhbGNoZW15LnB5)
 | `91.52% <0%> (-1.7%)` | :arrow_down: |
   | 
[airflow/hooks/dbapi\_hook.py](https://codecov.io/gh/apache/airflow/pull/6064/diff?src=pr=tree#diff-YWlyZmxvdy9ob29rcy9kYmFwaV9ob29rLnB5)
 | `86.44% <0%> (-1.7%)` | :arrow_down: |
   | ... and [1 
more](https://codecov.io/gh/apache/airflow/pull/6064/diff?src=pr=tree-more) 
| |
   
   --
   
   [Continue to review full report at 
Codecov](https://codecov.io/gh/apache/airflow/pull/6064?src=pr=continue).
   > **Legend** - [Click here to learn 
more](https://docs.codecov.io/docs/codecov-delta)
   > `Δ = absolute  (impact)`, `ø = not affected`, `? = missing data`
   > Powered by 
[Codecov](https://codecov.io/gh/apache/airflow/pull/6064?src=pr=footer). 
Last update 
[b2e06d0...c8c4a12](https://codecov.io/gh/apache/airflow/pull/6064?src=pr=lastupdated).
 Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments).
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [airflow] codecov-io edited a comment on issue #6064: [AIRFLOW-5444] Fix action_logging so that request.form for POST is logged

2019-09-18 Thread GitBox
codecov-io edited a comment on issue #6064: [AIRFLOW-5444] Fix action_logging 
so that request.form for POST is logged
URL: https://github.com/apache/airflow/pull/6064#issuecomment-529516855
 
 
   # [Codecov](https://codecov.io/gh/apache/airflow/pull/6064?src=pr=h1) 
Report
   > Merging 
[#6064](https://codecov.io/gh/apache/airflow/pull/6064?src=pr=desc) into 
[master](https://codecov.io/gh/apache/airflow/commit/b2e06d0edcdbcfbe65e5135554bd45b3e2d76cd0?src=pr=desc)
 will **decrease** coverage by `0.39%`.
   > The diff coverage is `100%`.
   
   [![Impacted file tree 
graph](https://codecov.io/gh/apache/airflow/pull/6064/graphs/tree.svg?width=650=WdLKlKHOAU=150=pr)](https://codecov.io/gh/apache/airflow/pull/6064?src=pr=tree)
   
   ```diff
   @@Coverage Diff@@
   ##   master#6064 +/-   ##
   =
   - Coverage   80.04%   79.65%   -0.4% 
   =
 Files 607  607 
 Lines   3503335033 
   =
   - Hits2804327904-139 
   - Misses   6990 7129+139
   ```
   
   
   | [Impacted 
Files](https://codecov.io/gh/apache/airflow/pull/6064?src=pr=tree) | 
Coverage Δ | |
   |---|---|---|
   | 
[airflow/www/decorators.py](https://codecov.io/gh/apache/airflow/pull/6064/diff?src=pr=tree#diff-YWlyZmxvdy93d3cvZGVjb3JhdG9ycy5weQ==)
 | `74.5% <100%> (ø)` | :arrow_up: |
   | 
[airflow/operators/postgres\_operator.py](https://codecov.io/gh/apache/airflow/pull/6064/diff?src=pr=tree#diff-YWlyZmxvdy9vcGVyYXRvcnMvcG9zdGdyZXNfb3BlcmF0b3IucHk=)
 | `0% <0%> (-100%)` | :arrow_down: |
   | 
[airflow/kubernetes/volume\_mount.py](https://codecov.io/gh/apache/airflow/pull/6064/diff?src=pr=tree#diff-YWlyZmxvdy9rdWJlcm5ldGVzL3ZvbHVtZV9tb3VudC5weQ==)
 | `44.44% <0%> (-55.56%)` | :arrow_down: |
   | 
[airflow/kubernetes/volume.py](https://codecov.io/gh/apache/airflow/pull/6064/diff?src=pr=tree#diff-YWlyZmxvdy9rdWJlcm5ldGVzL3ZvbHVtZS5weQ==)
 | `52.94% <0%> (-47.06%)` | :arrow_down: |
   | 
[airflow/kubernetes/pod\_launcher.py](https://codecov.io/gh/apache/airflow/pull/6064/diff?src=pr=tree#diff-YWlyZmxvdy9rdWJlcm5ldGVzL3BvZF9sYXVuY2hlci5weQ==)
 | `45.25% <0%> (-46.72%)` | :arrow_down: |
   | 
[airflow/kubernetes/kube\_client.py](https://codecov.io/gh/apache/airflow/pull/6064/diff?src=pr=tree#diff-YWlyZmxvdy9rdWJlcm5ldGVzL2t1YmVfY2xpZW50LnB5)
 | `33.33% <0%> (-41.67%)` | :arrow_down: |
   | 
[...rflow/contrib/operators/kubernetes\_pod\_operator.py](https://codecov.io/gh/apache/airflow/pull/6064/diff?src=pr=tree#diff-YWlyZmxvdy9jb250cmliL29wZXJhdG9ycy9rdWJlcm5ldGVzX3BvZF9vcGVyYXRvci5weQ==)
 | `70.14% <0%> (-28.36%)` | :arrow_down: |
   | 
[airflow/hooks/postgres\_hook.py](https://codecov.io/gh/apache/airflow/pull/6064/diff?src=pr=tree#diff-YWlyZmxvdy9ob29rcy9wb3N0Z3Jlc19ob29rLnB5)
 | `94.73% <0%> (-1.76%)` | :arrow_down: |
   | 
[airflow/utils/sqlalchemy.py](https://codecov.io/gh/apache/airflow/pull/6064/diff?src=pr=tree#diff-YWlyZmxvdy91dGlscy9zcWxhbGNoZW15LnB5)
 | `91.52% <0%> (-1.7%)` | :arrow_down: |
   | 
[airflow/hooks/dbapi\_hook.py](https://codecov.io/gh/apache/airflow/pull/6064/diff?src=pr=tree#diff-YWlyZmxvdy9ob29rcy9kYmFwaV9ob29rLnB5)
 | `86.44% <0%> (-1.7%)` | :arrow_down: |
   | ... and [1 
more](https://codecov.io/gh/apache/airflow/pull/6064/diff?src=pr=tree-more) 
| |
   
   --
   
   [Continue to review full report at 
Codecov](https://codecov.io/gh/apache/airflow/pull/6064?src=pr=continue).
   > **Legend** - [Click here to learn 
more](https://docs.codecov.io/docs/codecov-delta)
   > `Δ = absolute  (impact)`, `ø = not affected`, `? = missing data`
   > Powered by 
[Codecov](https://codecov.io/gh/apache/airflow/pull/6064?src=pr=footer). 
Last update 
[b2e06d0...c8c4a12](https://codecov.io/gh/apache/airflow/pull/6064?src=pr=lastupdated).
 Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments).
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[jira] [Commented] (AIRFLOW-4835) Refactor template rendering functions

2019-09-18 Thread ASF subversion and git services (Jira)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-4835?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16932577#comment-16932577
 ] 

ASF subversion and git services commented on AIRFLOW-4835:
--

Commit 6107622671c8f2633ea34957fd7719b30beb3bb7 in airflow's branch 
refs/heads/v1-10-test from Bas Harenslak
[ https://gitbox.apache.org/repos/asf?p=airflow.git;h=6107622 ]

[AIRFLOW-4835] Refactor operator render_template (#5461)

- Refactors `BaseOperator.render_template()` and removes 
`render_template_from_field()`. The functionality could be greatly simplified 
into a single `render_template()` function.
- Improves performance by removing two `hasattr` calls and avoiding recreating 
Jinja environments.
- Removes the argument `attr` to `render_template()` which wasn't used.
- Squashes multiple similar tests into two parameterized tests.
- Adheres to 110 line length.
- Adds support for templating sets.
- Adds Pydoc.
- Adds typing.

(cherry picked from commit 47dd4c99a7324cb52121a25001599d4e7597959b)
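
A minimal illustration of the recursive rendering idea described above - a standalone sketch using plain Jinja2, not the actual Airflow implementation:

{code:python}
from jinja2 import Environment

def render_value(value, env, context):
    # Render strings in place and recurse into lists, tuples, sets and dicts,
    # mirroring the container support mentioned in the commit message above.
    if isinstance(value, str):
        return env.from_string(value).render(**context)
    if isinstance(value, (list, tuple)):
        return type(value)(render_value(v, env, context) for v in value)
    if isinstance(value, set):
        return {render_value(v, env, context) for v in value}
    if isinstance(value, dict):
        return {k: render_value(v, env, context) for k, v in value.items()}
    return value

env = Environment()
print(render_value({"cmd": ["echo {{ ds }}"]}, env, {"ds": "2019-09-18"}))  # {'cmd': ['echo 2019-09-18']}
{code}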


> Refactor template rendering functions
> -
>
> Key: AIRFLOW-4835
> URL: https://issues.apache.org/jira/browse/AIRFLOW-4835
> Project: Apache Airflow
>  Issue Type: Improvement
>  Components: core
>Affects Versions: 2.0.0
>Reporter: Bas Harenslak
>Priority: Major
> Fix For: 1.10.6
>
>




--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[jira] [Commented] (AIRFLOW-5210) Resolving Template Files for large DAGs hurts performance

2019-09-18 Thread ASF subversion and git services (Jira)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-5210?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16932578#comment-16932578
 ] 

ASF subversion and git services commented on AIRFLOW-5210:
--

Commit 85ac7f748f13cb71ba989acb6800dbcea383541e in airflow's branch 
refs/heads/v1-10-test from Daniel Frank
[ https://gitbox.apache.org/repos/asf?p=airflow.git;h=85ac7f7 ]

[AIRFLOW-5210] Make finding template files more efficient (#5815)

For large DAGs, iterating over template fields to find template files can be 
time intensive.
Save this time for tasks that do not specify a template file extension.

(cherry picked from commit eeac82318a6440b2d65f9a35b3437b91813945f4)


> Resolving Template Files for large DAGs hurts performance 
> --
>
> Key: AIRFLOW-5210
> URL: https://issues.apache.org/jira/browse/AIRFLOW-5210
> Project: Apache Airflow
>  Issue Type: Bug
>  Components: DAG
>Affects Versions: 1.10.4
>Reporter: Daniel Frank
>Priority: Major
> Fix For: 1.10.6
>
>
> During task execution, "resolve_template_files" runs for all tasks in a 
> given DAG. For large DAGs this takes a long time and is not necessary for 
> tasks that do not use the template_ext field.
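
A rough sketch of that kind of short-circuit, with an assumed task interface (illustrative only, not the actual Airflow code):

{code:python}
def resolve_template_files(task):
    # Skip the expensive file lookup entirely when the operator declares no
    # template file extensions.
    if not getattr(task, "template_ext", None):
        return
    for attr in getattr(task, "template_fields", ()):
        content = getattr(task, attr, None)
        if isinstance(content, str) and content.endswith(tuple(task.template_ext)):
            # Only now pay the cost of reading the file so it can be templated later.
            with open(content) as handle:
                setattr(task, attr, handle.read())
{code}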



--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[GitHub] [airflow] feluelle commented on issue #6100: [AIRFLOW-5387] Fix show paused pagination bug

2019-09-18 Thread GitBox
feluelle commented on issue #6100: [AIRFLOW-5387] Fix show paused pagination bug
URL: https://github.com/apache/airflow/pull/6100#issuecomment-532685557
 
 
   Good point - I haven't tried `@parameterized.expand` along with another 
annotation that needs to retrieve the `@parameterized.expand` args.
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[jira] [Updated] (AIRFLOW-5410) AirflowConfigParser no longer finds conf entries with capitalized letters

2019-09-18 Thread Qian Yu (Jira)


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-5410?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Qian Yu updated AIRFLOW-5410:
-
Priority: Minor  (was: Major)

> AirflowConfigParser no longer finds conf entries with capitalized letters
> 
>
> Key: AIRFLOW-5410
> URL: https://issues.apache.org/jira/browse/AIRFLOW-5410
> Project: Apache Airflow
>  Issue Type: Bug
>  Components: configuration
>Affects Versions: 1.10.5
>Reporter: Qian Yu
>Priority: Minor
>
> The following commit changed the behavior of configuration parsing globally 
> (although the commit message only says "kubernetes"):
>  AIRFLOW-4316 Support setting kubernetes_environment_variables config section 
> from env var (#5668)
>  
> This used to work:
> {code:python}
> [mysection]
> Guess = "test"{code}
> {code:python}
> assert conf.get("mysection", "Guess")  == "test"{code}
> It stopped working in 1.10.5: it breaks because the key "Guess" is not 
> found. To get around the issue, "Guess" needs to be changed to all lower case
>  "guess". This change can cause surprises because it is not obvious from the 
> changelog that this commit changed conf parsing everywhere.
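
For reference, a plausible contributing factor is that the stock {{configparser}} stores option names lower-cased through its default {{optionxform}}, so any custom lookup built on the stored keys only matches lower-case names. A tiny standalone sketch (not Airflow code):

{code:python}
from configparser import ConfigParser

parser = ConfigParser()
parser.read_string("[mysection]\nGuess = test\n")

# Option names are normalised to lower case on storage, so code that compares
# against the stored keys directly will never see the capitalised spelling.
print(list(parser["mysection"].keys()))  # ['guess']
print(parser.get("mysection", "guess"))  # 'test'
{code}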



--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[jira] [Updated] (AIRFLOW-5517) SparkSubmitOperator: spark-binary parameter no longer taken from connection extra

2019-09-18 Thread Alexander Kazarin (Jira)


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-5517?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Alexander Kazarin updated AIRFLOW-5517:
---
Summary: SparkSubmitOperator: spark-binary parameter no longer taken from 
connection extra  (was: SparkSubmitHook: spark-binary parameter no longer taken 
from extra)

> SparkSubmitOperator: spark-binary parameter no longer taken from connection 
> extra
> -
>
> Key: AIRFLOW-5517
> URL: https://issues.apache.org/jira/browse/AIRFLOW-5517
> Project: Apache Airflow
>  Issue Type: Bug
>  Components: contrib
>Affects Versions: 1.10.4, 1.10.5
>Reporter: Alexander Kazarin
>Priority: Major
>
> We have extra parameters in the Spark connection:
> {code:java}
> {"deploy-mode": "cluster", "spark-binary": "spark2-submit"}
> {code}
> After upgrading to 1.10.5 from 1.10.3, the 'spark-binary' parameter in extra no 
> longer takes effect.
>  Broken after 
> [this|https://github.com/apache/airflow/commit/8be59fb4edf0f2a132b13d0ffd1df0b8908191ab]
>  commit, I think
> Workaround: call SparkSubmitOperator with spark_binary=None argument



--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[jira] [Updated] (AIRFLOW-5517) SparkSubmitOperator: spark-binary parameter no longer taken from connection extra

2019-09-18 Thread Alexander Kazarin (Jira)


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-5517?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Alexander Kazarin updated AIRFLOW-5517:
---
Flags:   (was: Patch)

> SparkSubmitOperator: spark-binary parameter no longer taken from connection 
> extra
> -
>
> Key: AIRFLOW-5517
> URL: https://issues.apache.org/jira/browse/AIRFLOW-5517
> Project: Apache Airflow
>  Issue Type: Bug
>  Components: contrib
>Affects Versions: 1.10.4, 1.10.5
>Reporter: Alexander Kazarin
>Priority: Major
>
> We have extra parameters in the Spark connection:
> {code:java}
> {"deploy-mode": "cluster", "spark-binary": "spark2-submit"}
> {code}
> After upgrading to 1.10.5 from 1.10.3, the 'spark-binary' parameter in extra no 
> longer takes effect.
>  Broken after 
> [this|https://github.com/apache/airflow/commit/8be59fb4edf0f2a132b13d0ffd1df0b8908191ab]
>  commit, I think
> Workaround: call SparkSubmitOperator with spark_binary=None argument



--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[jira] [Commented] (AIRFLOW-5518) Mix between scheme and schema for HTTP connections

2019-09-18 Thread Ash Berlin-Taylor (Jira)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-5518?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16932582#comment-16932582
 ] 

Ash Berlin-Taylor commented on AIRFLOW-5518:


The {{schema}} field is probably confusingly named and _only_ makes sense for 
database connections, which aren't the only thing that is created.

I think for Airflow 2.0 we should rename that field in the model to {{path}} 
and then show it as schema in the UI, but make the code refer to that all as 
path.

In this specific case the HttpConnection should not look at schema for its 
scheme. That's a bug.

> Mix between scheme and schema for HTTP connections
> --
>
> Key: AIRFLOW-5518
> URL: https://issues.apache.org/jira/browse/AIRFLOW-5518
> Project: Apache Airflow
>  Issue Type: Bug
>  Components: hooks
>Affects Versions: 1.10.5
>Reporter: Mickael V
>Priority: Minor
> Fix For: 1.10.6
>
>   Original Estimate: 24h
>  Remaining Estimate: 24h
>
> There is an inconsistency in the usage of *scheme* and *schema* when using an 
> HTTP connection.
> If the connection is made through the UI or imported through config files (or 
> to sum up, if it's kept in the Airflow DB), the *schema* represents the 
> *scheme* ({{http}}, {{https}}).
> But if the connection is parsed from a URI (for example if it's overloaded 
> through an environment variable), the {{schema}} is the {{path}} of the URI.
> This is wrong because then the {{HttpHook}} uses the {{schema}} to prefix the 
> {{base_url}}.
>  
> There are two possibilities that I see to fix this :
>  * At the {{Connection}} level, in {{parse_from_uri()}}, implement a special 
> treatment for http connection to have {{conn_type='http'}} and 
> {{schema=scheme}}
>  * At the {{HttpHook}} level, in {{get_conn}}, look up {{conn_type}} to get 
> the real {{scheme}}.
>  
> I propose using the first solution, as it is more consistent with what 
> happens when a connection is added in other ways (the UI and config files).
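
A rough sketch of what that first option could look like, using only the standard library (a hypothetical helper, not Airflow's actual parsing code):

{code:python}
from urllib.parse import urlsplit

def split_http_uri(uri):
    # For http/https URIs, keep the scheme in the "schema" slot that the UI and
    # config files already use, instead of storing the URI path there.
    parts = urlsplit(uri)
    if parts.scheme in ("http", "https"):
        return {"conn_type": "http", "host": parts.hostname, "schema": parts.scheme}
    return {"conn_type": parts.scheme, "host": parts.hostname, "schema": parts.path.lstrip("/")}

print(split_http_uri("https://example.com/api"))
# {'conn_type': 'http', 'host': 'example.com', 'schema': 'https'}
{code}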



--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[jira] [Commented] (AIRFLOW-2219) Race condition to DagRun.verify_integrity between Scheduler and Webserver

2019-09-18 Thread Dmitry (Jira)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-2219?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16932479#comment-16932479
 ] 

Dmitry commented on AIRFLOW-2219:
-

I had the same error a couple of times per day from TriggerDagRunOperator, but 
after I rewrote it the error is almost gone - the new operator has been working for a month 
and a half, and the issue happened only once during this time. My operator code is 
attached; the two key features are that I check whether the DAG run already exists, and I use the 
same session in the get_dagrun and create_dagrun functions.
{code:python}
import logging
from datetime import datetime

from airflow import settings
from airflow.utils.state import State
from airflow.models import DagBag
from airflow.operators.dagrun_operator import TriggerDagRunOperator, DagRunOrder


class DagRunOperator(TriggerDagRunOperator):
    template_fields = ('execution_date',)
    ui_color = '#e6ccff'

    def __init__(
            self,
            trigger_dag_id,
            python_callable,
            execution_date=None,
            *args, **kwargs):
        self.execution_date = execution_date
        super(DagRunOperator, self).__init__(
            trigger_dag_id=trigger_dag_id,
            python_callable=python_callable,
            *args, **kwargs)

    def execute(self, context):
        run_id_dt = (datetime.strptime(self.execution_date, '%Y-%m-%d %H:%M:%S')
                     if self.execution_date is not None
                     else datetime.now())
        dro = DagRunOrder(run_id='trig__' + run_id_dt.isoformat())
        dro = self.python_callable(context, dro)
        if dro:
            session = settings.Session()
            dbag = DagBag(settings.DAGS_FOLDER)
            trigger_dag = dbag.get_dag(self.trigger_dag_id)

            # Check and create within the same session so this operator and the
            # scheduler do not race on creating the same DagRun.
            if not trigger_dag.get_dagrun(self.execution_date, session=session):
                logging.info("Creating DagRun...")
                dr = trigger_dag.create_dagrun(
                    run_id=dro.run_id,
                    state=State.RUNNING,
                    execution_date=self.execution_date,
                    conf=dro.payload,
                    session=session,
                    external_trigger=True)
                logging.info("DagRun Created: {}".format(dr))
                session.add(dr)
                session.commit()
            else:
                logging.info("DagRun already exists {}".format(trigger_dag))

            session.close()
        else:
            logging.info("Criteria not met, moving on")

{code}
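
Hypothetical usage of the operator above inside a DAG; the dag object and the downstream DAG id are placeholders:

{code:python}
# Assumes a `dag` object and a DAG with id "downstream_dag" already exist.
trigger = DagRunOperator(
    task_id="trigger_downstream",
    trigger_dag_id="downstream_dag",
    python_callable=lambda context, dro: dro,  # always trigger, keep the default payload
    execution_date="{{ execution_date.strftime('%Y-%m-%d %H:%M:%S') }}",
    dag=dag,
)
{code}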

> Race condition to DagRun.verify_integrity between Scheduler and Webserver
> -
>
> Key: AIRFLOW-2219
> URL: https://issues.apache.org/jira/browse/AIRFLOW-2219
> Project: Apache Airflow
>  Issue Type: Bug
>  Components: database
>Affects Versions: 1.8.1, 1.9.0
>Reporter: Will Wong
>Priority: Trivial
>
> Symptoms:
>  * Triggering a dag causes the 404 nuke page with an error message along the 
> lines of: {{psycopg2.IntegrityError: duplicate key value violates unique 
> constraint "task_instance_pkey"}} when calling {{DagRun.verify_integrity}}
> Or
>  * Similar error in scheduler log for dag file when scheduling a DAG. 
> (Example exception at the end of description)
> This occurs because {{Dag.create_dagrun}} commits the dag_run entry to the 
> database and then runs {{verify_integrity}} to add the task_instances 
> immediately. However, the scheduler already picks up a dag run before all 
> task_instances are created and also calls {{verify_integrity}} to create 
> task_instances at the same time.
> I don't _think_ this actually breaks anything in particular. The exception 
> happens either on the webpage or in the scheduler logs:
>  * If it occurs in the UI, it just scares people thinking something broke but 
> the task_instances will be created by the scheduler.
>  * If the error shows up in the scheduler, the task_instances are created by 
> the webserver and it continues processing the DAG during the next loop.
>  
>  I'm not sure if {{DagRun.verify_integrity}} is necessary for both 
> {{SchedulerJob._process_task_instances}} as well as {{Dag.create_dagrun}}, but 
> perhaps we can just stick to one?
>  
> {noformat}
> Traceback (most recent call last):
>   File "/usr/local/lib/python3.6/site-packages/sqlalchemy/engine/base.py", 
> line 1170, in _execute_context
>     context)
>   File 
> "/usr/local/lib/python3.6/site-packages/sqlalchemy/dialects/postgresql/psycopg2.py",
>  line 683, in do_executemany
>     cursor.executemany(statement, parameters)
> psycopg2.IntegrityError: duplicate key value violates unique constraint 
> "task_instance_pkey"
> DETAIL:  Key (task_id, dag_id, execution_date)=(docker_task_10240_7680_0, 
> chunkedgraph_edgetask_scheduler, 2018-03-15 23:46:57.116673) already exists.
> The above exception was the direct cause of the following 

[jira] [Updated] (AIRFLOW-5511) MSSQL resetdb /initdb for 1.10.5 fails to set

2019-09-18 Thread ROHIT K RAMWAL (Jira)


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-5511?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

ROHIT K RAMWAL updated AIRFLOW-5511:

Summary: MSSQL resetdb /initdb for 1.10.5 fails to set   (was: MSSQL 
resetdb /initdb for 1.10.5 fails to create tables)

> MSSQL resetdb /initdb for 1.10.5 fails to set 
> --
>
> Key: AIRFLOW-5511
> URL: https://issues.apache.org/jira/browse/AIRFLOW-5511
> Project: Apache Airflow
>  Issue Type: Bug
>  Components: celery, database
>Affects Versions: 1.10.5
>Reporter: ROHIT K RAMWAL
>Priority: Minor
> Attachments: airflowbug
>
>
> An Airflow 1.10.5 installation with the MSSQL backend and Celery fails with the 
> error below.
>  
> Traceback (most recent call last):
>   File "/bin/airflow", line 32, in 
>     args.func(args)
>   File "/usr/lib/python2.7/site-packages/airflow/bin/cli.py", line 1112, in 
> resetdb
>     db.resetdb(settings.RBAC)
>   File "/usr/lib/python2.7/site-packages/airflow/utils/db.py", line 406, in 
> resetdb
>     initdb(rbac)
>   File "/usr/lib/python2.7/site-packages/airflow/utils/db.py", line 106, in 
> initdb
>     upgradedb()
>   File "/usr/lib/python2.7/site-packages/airflow/utils/db.py", line 377, in 
> upgradedb
>     command.upgrade(config, 'heads')
>   File "/usr/lib/python2.7/site-packages/alembic/command.py", line 279, in 
> upgrade
>     script.run_env()
>   File "/usr/lib/python2.7/site-packages/alembic/script/base.py", line 475, 
> in run_env
>     util.load_python_file(self.dir, "env.py")
>   File "/usr/lib/python2.7/site-packages/alembic/util/pyfiles.py", line 98, 
> in load_python_file
>     module = load_module_py(module_id, path)
>   File "/usr/lib/python2.7/site-packages/alembic/util/compat.py", line 240, 
> in load_module_py
>     mod = imp.load_source(module_id, path, fp)
>   File "/usr/lib/python2.7/site-packages/airflow/migrations/env.py", line 92, 
> in <module>
>     run_migrations_online()
>   File "/usr/lib/python2.7/site-packages/airflow/migrations/env.py", line 86, 
> in run_migrations_online
>     context.run_migrations()
>   File "", line 8, in run_migrations
>   File "/usr/lib/python2.7/site-packages/alembic/runtime/environment.py", 
> line 846, in run_migrations
>     self.get_context().run_migrations(**kw)
>   File "/usr/lib/python2.7/site-packages/alembic/runtime/migration.py", line 
> 365, in run_migrations
>     step.migration_fn(**kw)
>   File 
> "/usr/lib/python2.7/site-packages/airflow/migrations/versions/6e96a59344a4_make_taskinstance_pool_not_nullable.py",
>  line 101, in upgrade
>     nullable=False,
>   File "/usr/lib64/python2.7/contextlib.py", line 24, in __exit__
>     self.gen.next()
>   File "/usr/lib/python2.7/site-packages/alembic/operations/base.py", line 
> 325, in batch_alter_table
>     impl.flush()
>   File "/usr/lib/python2.7/site-packages/alembic/operations/batch.py", line 
> 79, in flush
>     fn(*arg, **kw)
>   File "/usr/lib/python2.7/site-packages/alembic/ddl/mssql.py", line 85, in 
> alter_column
>     **kw
>   File "/usr/lib/python2.7/site-packages/alembic/ddl/impl.py", line 172, in 
> alter_column
>     existing_comment=existing_comment,
>   File "/usr/lib/python2.7/site-packages/alembic/ddl/mssql.py", line 36, in 
> _exec
>     result = super(MSSQLImpl, self)._exec(construct, *args, **kw)
>   File "/usr/lib/python2.7/site-packages/alembic/ddl/impl.py", line 134, in 
> _exec
>     return conn.execute(construct, *multiparams, **params)
>   File "/usr/lib64/python2.7/site-packages/sqlalchemy/engine/base.py", line 
> 988, in execute
>     return meth(self, multiparams, params)
>   File "/usr/lib64/python2.7/site-packages/sqlalchemy/sql/ddl.py", line 72, 
> in _execute_on_connection
>     return connection._execute_ddl(self, multiparams, params)
>   File "/usr/lib64/python2.7/site-packages/sqlalchemy/engine/base.py", line 
> 1050, in _execute_ddl
>     compiled,
>   File "/usr/lib64/python2.7/site-packages/sqlalchemy/engine/base.py", line 
> 1253, in _execute_context
>     e, statement, parameters, cursor, context
>   File "/usr/lib64/python2.7/site-packages/sqlalchemy/engine/base.py", line 
> 1473, in _handle_dbapi_exception
>     util.raise_from_cause(sqlalchemy_exception, exc_info)
>   File "/usr/lib64/python2.7/site-packages/sqlalchemy/util/compat.py", line 
> 398, in raise_from_cause
>     reraise(type(exception), exception, tb=exc_tb, cause=cause)
>   File "/usr/lib64/python2.7/site-packages/sqlalchemy/engine/base.py", line 
> 1249, in _execute_context
>     cursor, statement, parameters, context
>   File "/usr/lib64/python2.7/site-packages/sqlalchemy/engine/default.py", 
> line 552, in do_execute
>     cursor.execute(statement, parameters)
> sqlalchemy.exc.ProgrammingError: (pyodbc.ProgrammingError) ('42000', "[42000] 
> [Microsoft][ODBC Driver 17 for SQL Server][SQL Server]The index 

[jira] [Updated] (AIRFLOW-5511) MSSQL resetdb /initdb for 1.10.5 fails to set ti_pool non nullable

2019-09-18 Thread ROHIT K RAMWAL (Jira)


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-5511?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

ROHIT K RAMWAL updated AIRFLOW-5511:

Summary: MSSQL resetdb /initdb for 1.10.5 fails to set ti_pool non nullable 
 (was: MSSQL resetdb /initdb for 1.10.5 fails to set )

> MSSQL resetdb /initdb for 1.10.5 fails to set ti_pool non nullable
> --
>
> Key: AIRFLOW-5511
> URL: https://issues.apache.org/jira/browse/AIRFLOW-5511
> Project: Apache Airflow
>  Issue Type: Bug
>  Components: celery, database
>Affects Versions: 1.10.5
>Reporter: ROHIT K RAMWAL
>Priority: Minor
> Attachments: airflowbug
>
>
> An Airflow 1.10.5 installation with the MSSQL backend and Celery fails with the 
> error below.
>  
> Traceback (most recent call last):
>   File "/bin/airflow", line 32, in 
>     args.func(args)
>   File "/usr/lib/python2.7/site-packages/airflow/bin/cli.py", line 1112, in 
> resetdb
>     db.resetdb(settings.RBAC)
>   File "/usr/lib/python2.7/site-packages/airflow/utils/db.py", line 406, in 
> resetdb
>     initdb(rbac)
>   File "/usr/lib/python2.7/site-packages/airflow/utils/db.py", line 106, in 
> initdb
>     upgradedb()
>   File "/usr/lib/python2.7/site-packages/airflow/utils/db.py", line 377, in 
> upgradedb
>     command.upgrade(config, 'heads')
>   File "/usr/lib/python2.7/site-packages/alembic/command.py", line 279, in 
> upgrade
>     script.run_env()
>   File "/usr/lib/python2.7/site-packages/alembic/script/base.py", line 475, 
> in run_env
>     util.load_python_file(self.dir, "env.py")
>   File "/usr/lib/python2.7/site-packages/alembic/util/pyfiles.py", line 98, 
> in load_python_file
>     module = load_module_py(module_id, path)
>   File "/usr/lib/python2.7/site-packages/alembic/util/compat.py", line 240, 
> in load_module_py
>     mod = imp.load_source(module_id, path, fp)
>   File "/usr/lib/python2.7/site-packages/airflow/migrations/env.py", line 92, 
> in <module>
>     run_migrations_online()
>   File "/usr/lib/python2.7/site-packages/airflow/migrations/env.py", line 86, 
> in run_migrations_online
>     context.run_migrations()
>   File "", line 8, in run_migrations
>   File "/usr/lib/python2.7/site-packages/alembic/runtime/environment.py", 
> line 846, in run_migrations
>     self.get_context().run_migrations(**kw)
>   File "/usr/lib/python2.7/site-packages/alembic/runtime/migration.py", line 
> 365, in run_migrations
>     step.migration_fn(**kw)
>   File 
> "/usr/lib/python2.7/site-packages/airflow/migrations/versions/6e96a59344a4_make_taskinstance_pool_not_nullable.py",
>  line 101, in upgrade
>     nullable=False,
>   File "/usr/lib64/python2.7/contextlib.py", line 24, in __exit__
>     self.gen.next()
>   File "/usr/lib/python2.7/site-packages/alembic/operations/base.py", line 
> 325, in batch_alter_table
>     impl.flush()
>   File "/usr/lib/python2.7/site-packages/alembic/operations/batch.py", line 
> 79, in flush
>     fn(*arg, **kw)
>   File "/usr/lib/python2.7/site-packages/alembic/ddl/mssql.py", line 85, in 
> alter_column
>     **kw
>   File "/usr/lib/python2.7/site-packages/alembic/ddl/impl.py", line 172, in 
> alter_column
>     existing_comment=existing_comment,
>   File "/usr/lib/python2.7/site-packages/alembic/ddl/mssql.py", line 36, in 
> _exec
>     result = super(MSSQLImpl, self)._exec(construct, *args, **kw)
>   File "/usr/lib/python2.7/site-packages/alembic/ddl/impl.py", line 134, in 
> _exec
>     return conn.execute(construct, *multiparams, **params)
>   File "/usr/lib64/python2.7/site-packages/sqlalchemy/engine/base.py", line 
> 988, in execute
>     return meth(self, multiparams, params)
>   File "/usr/lib64/python2.7/site-packages/sqlalchemy/sql/ddl.py", line 72, 
> in _execute_on_connection
>     return connection._execute_ddl(self, multiparams, params)
>   File "/usr/lib64/python2.7/site-packages/sqlalchemy/engine/base.py", line 
> 1050, in _execute_ddl
>     compiled,
>   File "/usr/lib64/python2.7/site-packages/sqlalchemy/engine/base.py", line 
> 1253, in _execute_context
>     e, statement, parameters, cursor, context
>   File "/usr/lib64/python2.7/site-packages/sqlalchemy/engine/base.py", line 
> 1473, in _handle_dbapi_exception
>     util.raise_from_cause(sqlalchemy_exception, exc_info)
>   File "/usr/lib64/python2.7/site-packages/sqlalchemy/util/compat.py", line 
> 398, in raise_from_cause
>     reraise(type(exception), exception, tb=exc_tb, cause=cause)
>   File "/usr/lib64/python2.7/site-packages/sqlalchemy/engine/base.py", line 
> 1249, in _execute_context
>     cursor, statement, parameters, context
>   File "/usr/lib64/python2.7/site-packages/sqlalchemy/engine/default.py", 
> line 552, in do_execute
>     cursor.execute(statement, parameters)
> sqlalchemy.exc.ProgrammingError: (pyodbc.ProgrammingError) ('42000', "[42000] 
> [Microsoft][ODBC 

[jira] [Updated] (AIRFLOW-5517) SparkSubmitHook: spark-binary parameter no longer taken from extra

2019-09-18 Thread Alexander Kazarin (Jira)


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-5517?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Alexander Kazarin updated AIRFLOW-5517:
---
Description: 
We have extra parameters in the Spark connection:
{code:java}
{"deploy-mode": "cluster", "spark-binary": "spark2-submit"}
{code}
After upgrading to 1.10.5 from 1.10.3, the 'spark-binary' parameter in extra no 
longer takes effect.
 Broken after 
[this|https://github.com/apache/airflow/commit/8be59fb4edf0f2a132b13d0ffd1df0b8908191ab]
 commit, I think

Workaround: call SparkSubmitOperator with spark_binary=None argument

  was:
We have extra parameters in the Spark connection:
{code:java}
{"deploy-mode": "cluster", "spark-binary": "spark2-submit"}
{code}
After upgrading to 1.10.5 from 1.10.3, the 'spark-binary' parameter in extra no 
longer takes effect.
 Broken after 
[this|https://github.com/apache/airflow/commit/8be59fb4edf0f2a132b13d0ffd1df0b8908191ab]
 commit, I think

Workaround: call SparkSubmitHook with spark_binary=None argument


> SparkSubmitHook: spark-binary parameter no longer taken from extra
> --
>
> Key: AIRFLOW-5517
> URL: https://issues.apache.org/jira/browse/AIRFLOW-5517
> Project: Apache Airflow
>  Issue Type: Bug
>  Components: contrib
>Affects Versions: 1.10.4, 1.10.5
>Reporter: Alexander Kazarin
>Priority: Major
>
> We have extra parameters in the Spark connection:
> {code:java}
> {"deploy-mode": "cluster", "spark-binary": "spark2-submit"}
> {code}
> After upgrading to 1.10.5 from 1.10.3, the 'spark-binary' parameter in extra no 
> longer takes effect.
>  Broken after 
> [this|https://github.com/apache/airflow/commit/8be59fb4edf0f2a132b13d0ffd1df0b8908191ab]
>  commit, I think
> Workaround: call SparkSubmitOperator with spark_binary=None argument



--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[jira] [Created] (AIRFLOW-5517) SparkSubmitHook: spark-binary parameter no longer taken from extra

2019-09-18 Thread Alexander Kazarin (Jira)
Alexander Kazarin created AIRFLOW-5517:
--

 Summary: SparkSubmitHook: spark-binary parameter no longer taken 
from extra
 Key: AIRFLOW-5517
 URL: https://issues.apache.org/jira/browse/AIRFLOW-5517
 Project: Apache Airflow
  Issue Type: Bug
  Components: contrib
Affects Versions: 1.10.5, 1.10.4
Reporter: Alexander Kazarin


We have extra parameters in the Spark connection:
{code:java}
{"deploy-mode": "cluster", "spark-binary": "spark2-submit"}
{code}
After upgrading to 1.10.5 from 1.10.3, the 'spark-binary' parameter in extra no 
 longer takes effect.
 Broken after 
[this|https://github.com/apache/airflow/commit/8be59fb4edf0f2a132b13d0ffd1df0b8908191ab]
 commit, I think

Workaround: call SparkSubmitHook with spark_binary=None argument



--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[jira] [Commented] (AIRFLOW-5517) SparkSubmitOperator: spark-binary parameter no longer taken from connection extra

2019-09-18 Thread Alexander Kazarin (Jira)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-5517?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16932548#comment-16932548
 ] 

Alexander Kazarin commented on AIRFLOW-5517:


Fix:
{code}
diff --git a/airflow/contrib/operators/spark_submit_operator.py 
b/airflow/contrib/operators/spark_submit_operator.py
index 8325e1f..4c57e34 100644
--- a/airflow/contrib/operators/spark_submit_operator.py
+++ b/airflow/contrib/operators/spark_submit_operator.py
@@ -117,7 +117,7 @@ class SparkSubmitOperator(BaseOperator):
  application_args=None,
  env_vars=None,
  verbose=False,
- spark_binary="spark-submit",
+ spark_binary=None,
  *args,
  **kwargs):
 super().__init__(*args, **kwargs)
{code}

> SparkSubmitOperator: spark-binary parameter no longer taken from connection 
> extra
> -
>
> Key: AIRFLOW-5517
> URL: https://issues.apache.org/jira/browse/AIRFLOW-5517
> Project: Apache Airflow
>  Issue Type: Bug
>  Components: contrib
>Affects Versions: 1.10.4, 1.10.5
>Reporter: Alexander Kazarin
>Priority: Major
>
> We have extra parameters in the Spark connection:
> {code:java}
> {"deploy-mode": "cluster", "spark-binary": "spark2-submit"}
> {code}
> After upgrading to 1.10.5 from 1.10.3, the 'spark-binary' parameter in extra no 
> longer takes effect.
>  Broken after 
> [this|https://github.com/apache/airflow/commit/8be59fb4edf0f2a132b13d0ffd1df0b8908191ab]
>  commit, I think
> Workaround: call SparkSubmitOperator with spark_binary=None argument



--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[jira] [Comment Edited] (AIRFLOW-5517) SparkSubmitOperator: spark-binary parameter no longer taken from connection extra

2019-09-18 Thread Alexander Kazarin (Jira)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-5517?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16932548#comment-16932548
 ] 

Alexander Kazarin edited comment on AIRFLOW-5517 at 9/18/19 3:36 PM:
-

Fix:
{code}
diff --git a/airflow/contrib/hooks/spark_submit_hook.py 
b/airflow/contrib/hooks/spark_submit_hook.py
index 449f072..bab5a71 100644
--- a/airflow/contrib/hooks/spark_submit_hook.py
+++ b/airflow/contrib/hooks/spark_submit_hook.py
@@ -174,7 +174,7 @@ class SparkSubmitHook(BaseHook, LoggingMixin):
  'queue': None,
  'deploy_mode': None,
  'spark_home': None,
- 'spark_binary': self._spark_binary or "spark-submit",
+ 'spark_binary': self._spark_binary,
  'namespace': None}

 try:
diff --git a/airflow/contrib/operators/spark_submit_operator.py 
b/airflow/contrib/operators/spark_submit_operator.py
index 8325e1f..4c57e34 100644
--- a/airflow/contrib/operators/spark_submit_operator.py
+++ b/airflow/contrib/operators/spark_submit_operator.py
@@ -117,7 +117,7 @@ class SparkSubmitOperator(BaseOperator):
  application_args=None,
  env_vars=None,
  verbose=False,
- spark_binary="spark-submit",
+ spark_binary=None,
  *args,
  **kwargs):
 super().__init__(*args, **kwargs)
{code}


was (Author: boiler):
Fix:
{code}
diff --git a/airflow/contrib/operators/spark_submit_operator.py 
b/airflow/contrib/operators/spark_submit_operator.py
index 8325e1f..4c57e34 100644
--- a/airflow/contrib/operators/spark_submit_operator.py
+++ b/airflow/contrib/operators/spark_submit_operator.py
@@ -117,7 +117,7 @@ class SparkSubmitOperator(BaseOperator):
  application_args=None,
  env_vars=None,
  verbose=False,
- spark_binary="spark-submit",
+ spark_binary=None,
  *args,
  **kwargs):
 super().__init__(*args, **kwargs)
{code}

> SparkSubmitOperator: spark-binary parameter no longer taken from connection 
> extra
> -
>
> Key: AIRFLOW-5517
> URL: https://issues.apache.org/jira/browse/AIRFLOW-5517
> Project: Apache Airflow
>  Issue Type: Bug
>  Components: contrib
>Affects Versions: 1.10.4, 1.10.5
>Reporter: Alexander Kazarin
>Priority: Major
>
> We have extra parameters in the Spark connection:
> {code:java}
> {"deploy-mode": "cluster", "spark-binary": "spark2-submit"}
> {code}
> After upgrading to 1.10.5 from 1.10.3, the 'spark-binary' parameter in extra no 
> longer takes effect.
>  Broken after 
> [this|https://github.com/apache/airflow/commit/8be59fb4edf0f2a132b13d0ffd1df0b8908191ab]
>  commit, I think
> Workaround: call SparkSubmitOperator with spark_binary=None argument



--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[jira] [Commented] (AIRFLOW-5517) SparkSubmitOperator: spark-binary parameter no longer taken from connection extra

2019-09-18 Thread Ash Berlin-Taylor (Jira)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-5517?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16932571#comment-16932571
 ] 

Ash Berlin-Taylor commented on AIRFLOW-5517:


The hunk from spark_submit_hook is fine as it is (that path is the fallback for 
when the connection doesn't exist) so it's just the path from the operator that 
is buggy.
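
In other words the intended precedence is roughly: an explicitly passed operator argument, then the connection extra, then the "spark-submit" default. A tiny sketch of that resolution order (illustrative only, not the hook's real code):

{code:python}
def resolve_spark_binary(operator_arg, connection_extra):
    # An explicit operator argument wins, then the connection extra, then the
    # hard-coded default; the bug was that the operator's old default of
    # "spark-submit" made the first term always win.
    return operator_arg or connection_extra.get("spark-binary") or "spark-submit"

print(resolve_spark_binary(None, {"spark-binary": "spark2-submit"}))  # spark2-submit
print(resolve_spark_binary(None, {}))                                 # spark-submit
{code}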

> SparkSubmitOperator: spark-binary parameter no longer taken from connection 
> extra
> -
>
> Key: AIRFLOW-5517
> URL: https://issues.apache.org/jira/browse/AIRFLOW-5517
> Project: Apache Airflow
>  Issue Type: Bug
>  Components: contrib
>Affects Versions: 1.10.4, 1.10.5
>Reporter: Alexander Kazarin
>Priority: Major
>
> We have extra parameters in the Spark connection:
> {code:java}
> {"deploy-mode": "cluster", "spark-binary": "spark2-submit"}
> {code}
> After upgrading to 1.10.5 from 1.10.3, the 'spark-binary' parameter in extra no 
> longer takes effect.
>  Broken after 
> [this|https://github.com/apache/airflow/commit/8be59fb4edf0f2a132b13d0ffd1df0b8908191ab]
>  commit, I think
> Workaround: call SparkSubmitOperator with spark_binary=None argument



--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[jira] [Updated] (AIRFLOW-5517) SparkSubmitOperator: spark-binary parameter no longer taken from connection extra

2019-09-18 Thread Ash Berlin-Taylor (Jira)


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-5517?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Ash Berlin-Taylor updated AIRFLOW-5517:
---
Fix Version/s: 1.10.6

> SparkSubmitOperator: spark-binary parameter no longer taken from connection 
> extra
> -
>
> Key: AIRFLOW-5517
> URL: https://issues.apache.org/jira/browse/AIRFLOW-5517
> Project: Apache Airflow
>  Issue Type: Bug
>  Components: contrib
>Affects Versions: 1.10.4, 1.10.5
>Reporter: Alexander Kazarin
>Priority: Major
> Fix For: 1.10.6
>
>
> We have extra parameters in the Spark connection:
> {code:java}
> {"deploy-mode": "cluster", "spark-binary": "spark2-submit"}
> {code}
> After upgrading to 1.10.5 from 1.10.3, the 'spark-binary' parameter in extra no 
> longer takes effect.
>  Broken after 
> [this|https://github.com/apache/airflow/commit/8be59fb4edf0f2a132b13d0ffd1df0b8908191ab]
>  commit, I think
> Workaround: call SparkSubmitOperator with spark_binary=None argument



--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[GitHub] [airflow] ashb commented on a change in pull request #6130: [AIRFLOW-5508] Add a whitelist mechanism for the StatsD metrics

2019-09-18 Thread GitBox
ashb commented on a change in pull request #6130: [AIRFLOW-5508] Add a 
whitelist mechanism for the StatsD metrics
URL: https://github.com/apache/airflow/pull/6130#discussion_r325677754
 
 

 ##
 File path: docs/metrics.rst
 ##
 @@ -41,6 +41,13 @@ Add the following lines to your configuration file e.g. 
``airflow.cfg``
 statsd_port = 8125
 statsd_prefix = airflow
 
+If you want to avoid sending all the available metrics to StatsD, you can 
configure a whitelist of prefixes to send only
+the metrics that start with the elements of the list:
+
+.. code-block:: bash
 
 Review comment:
   `.. code-block:: ini` was probably what he meant.
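   
   For reference, a minimal sketch of the prefix check described in the quoted paragraph above, with assumed semantics for `statsd_whitelist` (not the PR's actual implementation):
   
   ```python
   def is_whitelisted(metric_name, whitelist):
       # An empty whitelist means everything is sent; otherwise a metric is sent
       # only if its name starts with one of the comma-separated prefixes.
       prefixes = [p.strip() for p in whitelist.split(",") if p.strip()]
       return not prefixes or any(metric_name.startswith(p) for p in prefixes)

   print(is_whitelisted("scheduler_heartbeat", "scheduler,executor,dagrun"))  # True
   print(is_whitelisted("ti_failures", "scheduler,executor,dagrun"))          # False
   ```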


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [airflow] alrolorojas edited a comment on issue #6100: [AIRFLOW-5387] Fix show paused pagination bug

2019-09-18 Thread GitBox
alrolorojas edited a comment on issue #6100: [AIRFLOW-5387] Fix show paused 
pagination bug
URL: https://github.com/apache/airflow/pull/6100#issuecomment-532644657
 
 
   > You could even use @conf_vars to annotate the test method to use the env 
var only in the whole test method. But either way it is better than before 
(try-finally...).
   
   @feluelle I did not find a way to pass arguments dynamically from 
`@parameterized.expand` to `@conf_vars` when used as a decorator. Is there a 
way I can do that by annotating the whole method with `conf_vars` and using an 
argument from `@parameterized.expand` within it?
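   
   One possible shape, assuming `conf_vars` can also be used as a context manager (an assumption about the test utility, not verified here), is to move the override into the test body so the `@parameterized.expand` argument can drive it:
   
   ```python
   import unittest

   from parameterized import parameterized
   from tests.test_utils.config import conf_vars  # assumed import path for the helper


   class TestShowPaused(unittest.TestCase):
       @parameterized.expand([("True",), ("False",)])
       def test_pagination(self, hide_by_default):
           # The override only applies inside the with-block, driven by the expand argument.
           with conf_vars({("webserver", "hide_paused_dags_by_default"): hide_by_default}):
               ...  # exercise the paused-DAG pagination here
   ```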


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services

