[jira] [Commented] (AIRFLOW-3353) redis-py 3.0.0 dependency breaks celery executor

2018-11-16 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-3353?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16690421#comment-16690421
 ] 

ASF GitHub Bot commented on AIRFLOW-3353:
-

r39132 closed pull request #4203: [AIRFLOW-3353] Upgrade redis client.
URL: https://github.com/apache/incubator-airflow/pull/4203
 
 
   

This is a PR merged from a forked repository.
As GitHub hides the original diff on merge, it is displayed below for
the sake of provenance:

As this is a foreign pull request (from a fork), the diff is supplied
below (as it won't show otherwise due to GitHub magic):

diff --git a/airflow/contrib/hooks/redis_hook.py 
b/airflow/contrib/hooks/redis_hook.py
index 650cc9308b..cf684b862f 100644
--- a/airflow/contrib/hooks/redis_hook.py
+++ b/airflow/contrib/hooks/redis_hook.py
@@ -20,7 +20,7 @@
 """
 RedisHook module
 """
-from redis import StrictRedis
+from redis import Redis
 
 from airflow.exceptions import AirflowException
 from airflow.hooks.base_hook import BaseHook
@@ -69,7 +69,7 @@ def get_conn(self):
             self.redis_conn_id, self.host, self.port, self.db
         )
         try:
-            self.client = StrictRedis(
+            self.client = Redis(
                 host=self.host,
                 port=self.port,
                 password=self.password,
diff --git a/setup.py b/setup.py
index e651f5a66e..70adeb2b2e 100644
--- a/setup.py
+++ b/setup.py
@@ -218,7 +218,7 @@ def write_version(filename=os.path.join(*['airflow',
 postgres = ['psycopg2-binary>=2.7.4']
 qds = ['qds-sdk>=1.9.6']
 rabbitmq = ['librabbitmq>=1.6.1']
-redis = ['redis>=2.10.5,<3.0.0']
+redis = ['redis>=3.0.0,<4.0.0']
 s3 = ['boto3>=1.7.0, <1.8.0']
 salesforce = ['simple-salesforce>=0.72']
 samba = ['pysmbclient>=0.1.3']
diff --git a/tests/contrib/hooks/test_redis_hook.py 
b/tests/contrib/hooks/test_redis_hook.py
index 12c30680e1..74d4b6c2e3 100644
--- a/tests/contrib/hooks/test_redis_hook.py
+++ b/tests/contrib/hooks/test_redis_hook.py
@@ -35,7 +35,7 @@ def test_get_conn(self):
         self.assertEqual(
             repr(hook.get_conn()),
             (
-                'StrictRedis<ConnectionPool<Connection<host=localhost,port=6379,db=0>>>'
+                'Redis<ConnectionPool<Connection<host=localhost,port=6379,db=0>>>'
             )
         )
diff --git a/tests/contrib/sensors/test_redis_sensor.py 
b/tests/contrib/sensors/test_redis_sensor.py
index 394c8e574b..95cbf67d13 100644
--- a/tests/contrib/sensors/test_redis_sensor.py
+++ b/tests/contrib/sensors/test_redis_sensor.py
@@ -55,7 +55,7 @@ def test_poke(self, key_exists):
         key_exists.return_value = False
         self.assertFalse(self.sensor.poke(None))
 
-    @patch("airflow.contrib.hooks.redis_hook.StrictRedis.exists")
+    @patch("airflow.contrib.hooks.redis_hook.Redis.exists")
     def test_existing_key_called(self, redis_client_exists):
         self.sensor.run(
             start_date=DEFAULT_DATE,


 


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


> redis-py 3.0.0 dependency breaks celery executor
> 
>
> Key: AIRFLOW-3353
> URL: https://issues.apache.org/jira/browse/AIRFLOW-3353
> Project: Apache Airflow
>  Issue Type: Bug
>  Components: celery
>Affects Versions: 1.10.0
>Reporter: Stefan Seelmann
>Assignee: Ash Berlin-Taylor
>Priority: Major
> Fix For: 2.0.0
>
>
> redis-py 3.0.0 was just released. Airflow 1.10.0 defines redis>=2.10.5, so it 
> now installs redis-py 3.0.0.
> The resulting worker error is shown below.
> Workaround: pin redis==2.10.6 (e.g. in constraints.txt).
> {code}
> [2018-11-15 12:06:18,441: CRITICAL/MainProcess] Unrecoverable error: 
> AttributeError("'float' object has no attribute 'items'",)
> Traceback (most recent call last):
>   File "/usr/local/lib/python3.6/site-packages/celery/worker/worker.py", line 
> 205, in start
> self.blueprint.start(self)
>   File "/usr/local/lib/python3.6/site-packages/celery/bootsteps.py", line 
> 119, in start
> step.start(parent)
>   File "/usr/local/lib/python3.6/site-packages/celery/bootsteps.py", line 
> 369, in start
> return self.obj.start()
>   File 
> "/usr/local/lib/python3.6/site-packages/celery/worker/consumer/consumer.py", 
> line 317, in start
> blueprint.start(self)
>   File "/usr/local/lib/python3.6/site-packages/celery/bootsteps.py", line 
> 119, in start
> step.start(parent)
>   File 
> "/usr/local/lib/python3.6/site-packages/celery/worker/consumer/consumer.py", 
> line 593, in start
> c.loop(*c.loop_args())
>   File "/usr/local/lib/python3.6/site-packages/celery/worker/loops.py", line 
> 91, in asynloop
> next(loop)
>   File 

[jira] [Commented] (AIRFLOW-3358) POC: Refactor command line to make it more testable and easy to develop

2018-11-16 Thread Iuliia Volkova (JIRA)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-3358?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16690425#comment-16690425
 ] 

Iuliia Volkova commented on AIRFLOW-3358:
-

[~kaxilnaik], actually 'Click' is only one of the possible options; we could 
stay with argparse and that would be fine. The point is to refactor the cli to 
follow the 'Command pattern', so that command execution, the methods that build 
the final command, and the utility methods are separated and each step can be 
covered by tests. For example, in the PR I attached above I refactored just one 
part to make it possible to add tests. I also looked at the recent PRs that 
touch the cli: most of them come without tests, which is understandable because 
there is no easy way to add a test for a new feature without first refactoring 
a big part of the code.
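
A rough, hypothetical sketch of the split I mean, using plain argparse (all 
names below are invented for illustration; this is not Airflow's actual cli 
code):

{code}
import argparse


class PauseDagCommand(object):
    """Hypothetical command object: knows how to run exactly one cli action."""

    def __init__(self, dag_id):
        self.dag_id = dag_id

    def execute(self):
        # Real code would talk to Airflow's models/session here.
        return "paused %s" % self.dag_id


def build_command(args):
    """Factory that turns parsed args into a command object (testable on its own)."""
    return PauseDagCommand(dag_id=args.dag_id)


def make_parser():
    parser = argparse.ArgumentParser(prog="airflow")
    subparsers = parser.add_subparsers(dest="subcommand")
    pause = subparsers.add_parser("pause")
    pause.add_argument("dag_id")
    return parser


if __name__ == "__main__":
    args = make_parser().parse_args(["pause", "example_dag"])
    print(build_command(args).execute())  # -> paused example_dag
{code}

With this split, a test can call build_command() or execute() directly instead 
of spawning the whole cli.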

> POC: Refactor command line to make it  more testable and easy to develop
> 
>
> Key: AIRFLOW-3358
> URL: https://issues.apache.org/jira/browse/AIRFLOW-3358
> Project: Apache Airflow
>  Issue Type: Improvement
>  Components: cli
>Affects Versions: 2.0.0
>Reporter: Iuliia Volkova
>Assignee: Iuliia Volkova
>Priority: Major
>
> Hi all!
> In PR https://github.com/apache/incubator-airflow/pull/4174 Ashb and I 
> discussed that it would be great to refactor the cli to make the code more 
> testable and readable.
> I want to prepare a POC based on one command, implemented either with Click 
> (if we want to use it) or with Argparse and the Command pattern, and covered 
> with tests, as a basis for discussing the Airflow cli architecture.
> Click already exists in Airflow's dependencies.
> Main motivations:
> - A more readable and changeable cli, so commands are easy to add or change
> - The ability to add more tests
> It would be good to hear your concerns about this initiative; if there are no 
> objections, I will be happy to start the POC.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Updated] (AIRFLOW-3358) POC: Refactor command line to make it more testable and easy to develop

2018-11-16 Thread Iuliia Volkova (JIRA)


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-3358?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Iuliia Volkova updated AIRFLOW-3358:

Description: 
Hi all! 

In PR https://github.com/apache/incubator-airflow/pull/4174 Ashb and I 
discussed that it would be great to refactor the cli to make the code more 
testable and readable.

I want to prepare a POC based on one command, implemented either with Click 
(if we want to use it) or with Argparse and the Command pattern, and covered 
with tests, as a basis for discussing the Airflow cli architecture.


Click already exists in Airflow's dependencies.

Main motivations: 

- A more readable and changeable cli, so commands are easy to add or change
- The ability to add more tests 

It would be good to hear your concerns about this initiative; if there are no 
objections, I will be happy to start the POC.

  was:
Hi all! 

In one of PR: https://github.com/apache/incubator-airflow/pull/4174 we had a 
talk with Ashb, what will be cool to refactor the cli for getting more testable 
and readable code.

I want to prepare POC based on one command with implementation with Click and 
covering with tests for discussing Airflow Cli architecture.


Click already exists in Airflow dependencies.

Main stimulus: 

- Get more readable and changeable cli - for easy adding command or changing 
commands
- Get possible to add more tests 

 Will be good to know your concerns about such initiative and if there are no 
disagrees about it, I will be happy to start POC


> POC: Refactor command line to make it  more testable and easy to develop
> 
>
> Key: AIRFLOW-3358
> URL: https://issues.apache.org/jira/browse/AIRFLOW-3358
> Project: Apache Airflow
>  Issue Type: Improvement
>  Components: cli
>Affects Versions: 2.0.0
>Reporter: Iuliia Volkova
>Assignee: Iuliia Volkova
>Priority: Major
>
> Hi all! 
> In PR https://github.com/apache/incubator-airflow/pull/4174 Ashb and I 
> discussed that it would be great to refactor the cli to make the code more 
> testable and readable.
> I want to prepare a POC based on one command, implemented either with Click 
> (if we want to use it) or with Argparse and the Command pattern, and covered 
> with tests, as a basis for discussing the Airflow cli architecture.
> Click already exists in Airflow's dependencies.
> Main motivations: 
> - A more readable and changeable cli, so commands are easy to add or change
> - The ability to add more tests 
> It would be good to hear your concerns about this initiative; if there are no 
> objections, I will be happy to start the POC.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Updated] (AIRFLOW-3358) POC: Refactor command line to make it more testable and easy to develop

2018-11-16 Thread Iuliia Volkova (JIRA)


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-3358?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Iuliia Volkova updated AIRFLOW-3358:

Summary: POC: Refactor command line to make it  more testable and easy to 
develop  (was: POC: Refactor command line to use Click)

> POC: Refactor command line to make it  more testable and easy to develop
> 
>
> Key: AIRFLOW-3358
> URL: https://issues.apache.org/jira/browse/AIRFLOW-3358
> Project: Apache Airflow
>  Issue Type: Improvement
>  Components: cli
>Affects Versions: 2.0.0
>Reporter: Iuliia Volkova
>Assignee: Iuliia Volkova
>Priority: Major
>
> Hi all! 
> In one of PR: https://github.com/apache/incubator-airflow/pull/4174 we had a 
> talk with Ashb, what will be cool to refactor the cli for getting more 
> testable and readable code.
> I want to prepare POC based on one command with implementation with Click and 
> covering with tests for discussing Airflow Cli architecture.
> Click already exists in Airflow dependencies.
> Main stimulus: 
> - Get more readable and changeable cli - for easy adding command or changing 
> commands
> - Get possible to add more tests 
>  Will be good to know your concerns about such initiative and if there are no 
> disagrees about it, I will be happy to start POC



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[GitHub] r39132 closed pull request #4203: [AIRFLOW-3353] Upgrade redis client.

2018-11-16 Thread GitBox
r39132 closed pull request #4203: [AIRFLOW-3353] Upgrade redis client.
URL: https://github.com/apache/incubator-airflow/pull/4203
 
 
   

This is a PR merged from a forked repository.
As GitHub hides the original diff on merge, it is displayed below for
the sake of provenance:

As this is a foreign pull request (from a fork), the diff is supplied
below (as it won't show otherwise due to GitHub magic):

diff --git a/airflow/contrib/hooks/redis_hook.py 
b/airflow/contrib/hooks/redis_hook.py
index 650cc9308b..cf684b862f 100644
--- a/airflow/contrib/hooks/redis_hook.py
+++ b/airflow/contrib/hooks/redis_hook.py
@@ -20,7 +20,7 @@
 """
 RedisHook module
 """
-from redis import StrictRedis
+from redis import Redis
 
 from airflow.exceptions import AirflowException
 from airflow.hooks.base_hook import BaseHook
@@ -69,7 +69,7 @@ def get_conn(self):
             self.redis_conn_id, self.host, self.port, self.db
         )
         try:
-            self.client = StrictRedis(
+            self.client = Redis(
                 host=self.host,
                 port=self.port,
                 password=self.password,
diff --git a/setup.py b/setup.py
index e651f5a66e..70adeb2b2e 100644
--- a/setup.py
+++ b/setup.py
@@ -218,7 +218,7 @@ def write_version(filename=os.path.join(*['airflow',
 postgres = ['psycopg2-binary>=2.7.4']
 qds = ['qds-sdk>=1.9.6']
 rabbitmq = ['librabbitmq>=1.6.1']
-redis = ['redis>=2.10.5,<3.0.0']
+redis = ['redis>=3.0.0,<4.0.0']
 s3 = ['boto3>=1.7.0, <1.8.0']
 salesforce = ['simple-salesforce>=0.72']
 samba = ['pysmbclient>=0.1.3']
diff --git a/tests/contrib/hooks/test_redis_hook.py 
b/tests/contrib/hooks/test_redis_hook.py
index 12c30680e1..74d4b6c2e3 100644
--- a/tests/contrib/hooks/test_redis_hook.py
+++ b/tests/contrib/hooks/test_redis_hook.py
@@ -35,7 +35,7 @@ def test_get_conn(self):
         self.assertEqual(
             repr(hook.get_conn()),
             (
-                'StrictRedis<ConnectionPool<Connection<host=localhost,port=6379,db=0>>>'
+                'Redis<ConnectionPool<Connection<host=localhost,port=6379,db=0>>>'
             )
         )
diff --git a/tests/contrib/sensors/test_redis_sensor.py 
b/tests/contrib/sensors/test_redis_sensor.py
index 394c8e574b..95cbf67d13 100644
--- a/tests/contrib/sensors/test_redis_sensor.py
+++ b/tests/contrib/sensors/test_redis_sensor.py
@@ -55,7 +55,7 @@ def test_poke(self, key_exists):
         key_exists.return_value = False
         self.assertFalse(self.sensor.poke(None))
 
-    @patch("airflow.contrib.hooks.redis_hook.StrictRedis.exists")
+    @patch("airflow.contrib.hooks.redis_hook.Redis.exists")
     def test_existing_key_called(self, redis_client_exists):
         self.sensor.run(
             start_date=DEFAULT_DATE,


 


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services
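
A note on why the rename in the diff above is safe: in redis-py 3.0 the Redis 
class took over the old StrictRedis behaviour, and StrictRedis is kept only as 
an alias, so the hook behaves the same after the change. A quick sanity check 
(assumes redis-py >= 3.0 is installed; constructing the client does not open a 
connection):

```python
import redis  # redis-py >= 3.0

# StrictRedis is just an alias of Redis in redis-py 3.x.
assert redis.StrictRedis is redis.Redis

# Same arguments RedisHook.get_conn() passes.
client = redis.Redis(host='localhost', port=6379, db=0)
print(repr(client))  # Redis<ConnectionPool<Connection<host=localhost,port=6379,db=0>>>
```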


[GitHub] r39132 closed pull request #4204: [AIRFLOW-XXX] Add Etsy to companies list

2018-11-16 Thread GitBox
r39132 closed pull request #4204: [AIRFLOW-XXX] Add Etsy to companies list
URL: https://github.com/apache/incubator-airflow/pull/4204
 
 
   

This is a PR merged from a forked repository.
As GitHub hides the original diff on merge, it is displayed below for
the sake of provenance:

As this is a foreign pull request (from a fork), the diff is supplied
below (as it won't show otherwise due to GitHub magic):

diff --git a/README.md b/README.md
index cff59f9dff..06e618bf49 100644
--- a/README.md
+++ b/README.md
@@ -180,6 +180,7 @@ Currently **officially** using Airflow:
 1. [Easy Taxi](http://www.easytaxi.com/) 
[[@caique-lima](https://github.com/caique-lima) & 
[@diraol](https://github.com/diraol)]
 1. [Enigma](https://www.enigma.com) 
[[@hydrosquall](https://github.com/hydrosquall)]
 1. [eRevalue](https://www.datamaran.com) 
[[@hamedhsn](https://github.com/hamedhsn)]
+1. [Etsy](https://www.etsy.com) [[@mchalek](https://github.com/mchalek)]
 1. [evo.company](https://evo.company/) 
[[@orhideous](https://github.com/orhideous)]
 1. [Fathom Health](https://www.fathomhealth.co/)
 1. [Flipp](https://www.flipp.com) 
[[@sethwilsonwishabi](https://github.com/sethwilsonwishabi)]


 


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] r39132 opened a new pull request #4204: [AIRFLOW-XXX] Add Etsy to companies list

2018-11-16 Thread GitBox
r39132 opened a new pull request #4204: [AIRFLOW-XXX] Add Etsy to companies list
URL: https://github.com/apache/incubator-airflow/pull/4204
 
 
   Make sure you have checked _all_ steps below.
   
   ### Jira
   
   - [x] My PR addresses the following [Airflow 
Jira](https://issues.apache.org/jira/browse/AIRFLOW/) issues and references 
them in the PR title. For example, "\[AIRFLOW-XXX\] My Airflow PR"
 - https://issues.apache.org/jira/browse/AIRFLOW-XXX
 - In case you are fixing a typo in the documentation you can prepend your 
commit with \[AIRFLOW-XXX\], code changes always need a Jira issue.
   
   ### Description
   
   - [x] Here are some details about my PR, including screenshots of any UI 
changes:
   
   ### Tests
   
   - [x] My PR adds the following unit tests __OR__ does not need testing for 
this extremely good reason:
   
   ### Commits
   
   - [x] My commits all reference Jira issues in their subject lines, and I 
have squashed multiple commits if they address the same issue. In addition, my 
commits follow the guidelines from "[How to write a good git commit 
message](http://chris.beams.io/posts/git-commit/)":
 1. Subject is separated from body by a blank line
 1. Subject is limited to 50 characters (not including Jira issue reference)
 1. Subject does not end with a period
 1. Subject uses the imperative mood ("add", not "adding")
 1. Body wraps at 72 characters
 1. Body explains "what" and "why", not "how"
   
   ### Documentation
   
   - [x] In case of new functionality, my PR adds documentation that describes 
how to use it.
 - When adding new operators/hooks/sensors, the autoclass documentation 
generation needs to be added.
   
   ### Code Quality
   
   - [x] Passes `flake8`
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] codecov-io edited a comment on issue #4203: Upgrade redis client.

2018-11-16 Thread GitBox
codecov-io edited a comment on issue #4203: Upgrade redis client.
URL: 
https://github.com/apache/incubator-airflow/pull/4203#issuecomment-439587778
 
 
   # 
[Codecov](https://codecov.io/gh/apache/incubator-airflow/pull/4203?src=pr=h1)
 Report
   > Merging 
[#4203](https://codecov.io/gh/apache/incubator-airflow/pull/4203?src=pr=desc)
 into 
[master](https://codecov.io/gh/apache/incubator-airflow/commit/4fac6b9b1d6f3ca697906f28ec9e4271706f2c07?src=pr=desc)
 will **increase** coverage by `<.01%`.
   > The diff coverage is `n/a`.
   
   [![Impacted file tree 
graph](https://codecov.io/gh/apache/incubator-airflow/pull/4203/graphs/tree.svg?width=650=WdLKlKHOAU=150=pr)](https://codecov.io/gh/apache/incubator-airflow/pull/4203?src=pr=tree)
   
   ```diff
   @@            Coverage Diff            @@
   ##           master    #4203      +/-   ##
   ==========================================
   + Coverage    77.7%    77.7%    +<.01%
   ==========================================
     Files         199      199
     Lines       16315    16315
   ==========================================
   + Hits        12677    12678        +1
   + Misses       3638     3637        -1
   ```
   
   
   | [Impacted 
Files](https://codecov.io/gh/apache/incubator-airflow/pull/4203?src=pr=tree) 
| Coverage Δ | |
   |---|---|---|
   | 
[airflow/configuration.py](https://codecov.io/gh/apache/incubator-airflow/pull/4203/diff?src=pr=tree#diff-YWlyZmxvdy9jb25maWd1cmF0aW9uLnB5)
 | `89.05% <0%> (+0.36%)` | :arrow_up: |
   
   --
   
   [Continue to review full report at 
Codecov](https://codecov.io/gh/apache/incubator-airflow/pull/4203?src=pr=continue).
   > **Legend** - [Click here to learn 
more](https://docs.codecov.io/docs/codecov-delta)
   > `Δ = absolute  (impact)`, `ø = not affected`, `? = missing data`
   > Powered by 
[Codecov](https://codecov.io/gh/apache/incubator-airflow/pull/4203?src=pr=footer).
 Last update 
[4fac6b9...465b91e](https://codecov.io/gh/apache/incubator-airflow/pull/4203?src=pr=lastupdated).
 Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments).
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] phani8996 commented on a change in pull request #4194: [AIRFLOW-3352] Fix showing config on RBAC UI when expose_config is False

2018-11-16 Thread GitBox
phani8996 commented on a change in pull request #4194: [AIRFLOW-3352] Fix 
showing config on RBAC UI when expose_config is False
URL: https://github.com/apache/incubator-airflow/pull/4194#discussion_r234397259
 
 

 ##
 File path: tests/www_rbac/test_views.py
 ##
 @@ -448,13 +448,23 @@ def test_refresh(self):
 
 
 class TestConfigurationView(TestBase):
-    def test_configuration(self):
+    def test_configuration_do_not_expose_config(self):
         self.logout()
         self.login()
+        conf.set("webserver", "expose_config", "False")
 
 Review comment:
   My bad, I added it in the wrong direction.


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] feng-tao closed pull request #4202: [AIRFLOW-XXX] Fix incorrect URL for Task Tries and Task Duration

2018-11-16 Thread GitBox
feng-tao closed pull request #4202: [AIRFLOW-XXX] Fix incorrect URL for Task 
Tries and Task Duration
URL: https://github.com/apache/incubator-airflow/pull/4202
 
 
   

This is a PR merged from a forked repository.
As GitHub hides the original diff on merge, it is displayed below for
the sake of provenance:

As this is a foreign pull request (from a fork), the diff is supplied
below (as it won't show otherwise due to GitHub magic):

diff --git a/airflow/www/templates/airflow/list_dags.html 
b/airflow/www/templates/airflow/list_dags.html
index c7f2497b94..047b579726 100644
--- a/airflow/www/templates/airflow/list_dags.html
+++ b/airflow/www/templates/airflow/list_dags.html
@@ -153,10 +153,10 @@ DAGs
   
   
 
-  
+  
   
   
-  
+  
   
   
   


 


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] feng-tao commented on issue #4202: [AIRFLOW-XXX] Fix incorrect URL for Task Tries and Task Duration

2018-11-16 Thread GitBox
feng-tao commented on issue #4202: [AIRFLOW-XXX] Fix incorrect URL for Task 
Tries and Task Duration
URL: 
https://github.com/apache/incubator-airflow/pull/4202#issuecomment-439589069
 
 
   LGTM


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] codecov-io commented on issue #4203: [WIP] Upgrade redis client.

2018-11-16 Thread GitBox
codecov-io commented on issue #4203: [WIP] Upgrade redis client.
URL: 
https://github.com/apache/incubator-airflow/pull/4203#issuecomment-439587778
 
 
   # 
[Codecov](https://codecov.io/gh/apache/incubator-airflow/pull/4203?src=pr=h1)
 Report
   > Merging 
[#4203](https://codecov.io/gh/apache/incubator-airflow/pull/4203?src=pr=desc)
 into 
[master](https://codecov.io/gh/apache/incubator-airflow/commit/6b68f08edd57792e0757faa87ef1b4d935f7f8dc?src=pr=desc)
 will **increase** coverage by `62.47%`.
   > The diff coverage is `n/a`.
   
   [![Impacted file tree 
graph](https://codecov.io/gh/apache/incubator-airflow/pull/4203/graphs/tree.svg?width=650=WdLKlKHOAU=150=pr)](https://codecov.io/gh/apache/incubator-airflow/pull/4203?src=pr=tree)
   
   ```diff
   @@            Coverage Diff             @@
   ##           master    #4203        +/-   ##
   ============================================
   + Coverage   15.23%    77.7%    +62.47%
   ============================================
     Files         199      199
     Lines       16315    16315
   ============================================
   + Hits         2486    12678     +10192
   + Misses      13829     3637     -10192
   ```
   
   
   | [Impacted 
Files](https://codecov.io/gh/apache/incubator-airflow/pull/4203?src=pr=tree) 
| Coverage Δ | |
   |---|---|---|
   | 
[airflow/executors/dask\_executor.py](https://codecov.io/gh/apache/incubator-airflow/pull/4203/diff?src=pr=tree#diff-YWlyZmxvdy9leGVjdXRvcnMvZGFza19leGVjdXRvci5weQ==)
 | `2% <0%> (+2%)` | :arrow_up: |
   | 
[airflow/exceptions.py](https://codecov.io/gh/apache/incubator-airflow/pull/4203/diff?src=pr=tree#diff-YWlyZmxvdy9leGNlcHRpb25zLnB5)
 | `100% <0%> (+2.85%)` | :arrow_up: |
   | 
[airflow/utils/operator\_resources.py](https://codecov.io/gh/apache/incubator-airflow/pull/4203/diff?src=pr=tree#diff-YWlyZmxvdy91dGlscy9vcGVyYXRvcl9yZXNvdXJjZXMucHk=)
 | `86.95% <0%> (+4.34%)` | :arrow_up: |
   | 
[airflow/executors/\_\_init\_\_.py](https://codecov.io/gh/apache/incubator-airflow/pull/4203/diff?src=pr=tree#diff-YWlyZmxvdy9leGVjdXRvcnMvX19pbml0X18ucHk=)
 | `55.76% <0%> (+5.76%)` | :arrow_up: |
   | 
[airflow/utils/decorators.py](https://codecov.io/gh/apache/incubator-airflow/pull/4203/diff?src=pr=tree#diff-YWlyZmxvdy91dGlscy9kZWNvcmF0b3JzLnB5)
 | `91.66% <0%> (+14.58%)` | :arrow_up: |
   | 
[airflow/settings.py](https://codecov.io/gh/apache/incubator-airflow/pull/4203/diff?src=pr=tree#diff-YWlyZmxvdy9zZXR0aW5ncy5weQ==)
 | `80.41% <0%> (+14.68%)` | :arrow_up: |
   | 
[airflow/hooks/oracle\_hook.py](https://codecov.io/gh/apache/incubator-airflow/pull/4203/diff?src=pr=tree#diff-YWlyZmxvdy9ob29rcy9vcmFjbGVfaG9vay5weQ==)
 | `15.47% <0%> (+15.47%)` | :arrow_up: |
   | 
[airflow/task/task\_runner/\_\_init\_\_.py](https://codecov.io/gh/apache/incubator-airflow/pull/4203/diff?src=pr=tree#diff-YWlyZmxvdy90YXNrL3Rhc2tfcnVubmVyL19faW5pdF9fLnB5)
 | `63.63% <0%> (+18.18%)` | :arrow_up: |
   | 
[airflow/utils/db.py](https://codecov.io/gh/apache/incubator-airflow/pull/4203/diff?src=pr=tree#diff-YWlyZmxvdy91dGlscy9kYi5weQ==)
 | `33.6% <0%> (+18.4%)` | :arrow_up: |
   | 
[airflow/\_\_init\_\_.py](https://codecov.io/gh/apache/incubator-airflow/pull/4203/diff?src=pr=tree#diff-YWlyZmxvdy9fX2luaXRfXy5weQ==)
 | `74.28% <0%> (+19.99%)` | :arrow_up: |
   | ... and [160 
more](https://codecov.io/gh/apache/incubator-airflow/pull/4203/diff?src=pr=tree-more)
 | |
   
   --
   
   [Continue to review full report at 
Codecov](https://codecov.io/gh/apache/incubator-airflow/pull/4203?src=pr=continue).
   > **Legend** - [Click here to learn 
more](https://docs.codecov.io/docs/codecov-delta)
   > `Δ = absolute  (impact)`, `ø = not affected`, `? = missing data`
   > Powered by 
[Codecov](https://codecov.io/gh/apache/incubator-airflow/pull/4203?src=pr=footer).
 Last update 
[6b68f08...3046e5e](https://codecov.io/gh/apache/incubator-airflow/pull/4203?src=pr=lastupdated).
 Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments).
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] jmcarp opened a new pull request #4203: [WIP] Upgrade redis client.

2018-11-16 Thread GitBox
jmcarp opened a new pull request #4203: [WIP] Upgrade redis client.
URL: https://github.com/apache/incubator-airflow/pull/4203
 
 
   Make sure you have checked _all_ steps below.
   
   ### Jira
   
   - [ ] My PR addresses the following [Airflow 
Jira](https://issues.apache.org/jira/browse/AIRFLOW/) issues and references 
them in the PR title. For example, "\[AIRFLOW-XXX\] My Airflow PR"
 - https://issues.apache.org/jira/browse/AIRFLOW-XXX
 - In case you are fixing a typo in the documentation you can prepend your 
commit with \[AIRFLOW-XXX\], code changes always need a Jira issue.
   
   ### Description
   
   - [ ] Here are some details about my PR, including screenshots of any UI 
changes:
   
   ### Tests
   
   - [ ] My PR adds the following unit tests __OR__ does not need testing for 
this extremely good reason:
   
   ### Commits
   
   - [ ] My commits all reference Jira issues in their subject lines, and I 
have squashed multiple commits if they address the same issue. In addition, my 
commits follow the guidelines from "[How to write a good git commit 
message](http://chris.beams.io/posts/git-commit/)":
 1. Subject is separated from body by a blank line
 1. Subject is limited to 50 characters (not including Jira issue reference)
 1. Subject does not end with a period
 1. Subject uses the imperative mood ("add", not "adding")
 1. Body wraps at 72 characters
 1. Body explains "what" and "why", not "how"
   
   ### Documentation
   
   - [ ] In case of new functionality, my PR adds documentation that describes 
how to use it.
 - When adding new operators/hooks/sensors, the autoclass documentation 
generation needs to be added.
   
   ### Code Quality
   
   - [ ] Passes `flake8`
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] phani8996 commented on issue #4194: [AIRFLOW-3352] Fix showing config on RBAC UI when expose_config is False

2018-11-16 Thread GitBox
phani8996 commented on issue #4194: [AIRFLOW-3352] Fix showing config on RBAC 
UI when expose_config is False
URL: 
https://github.com/apache/incubator-airflow/pull/4194#issuecomment-439577478
 
 
   Thanks @kaxil for guiding me with tests and reviewing.


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] kaxil opened a new pull request #4202: [AIRFLOW-XXX] Fix incorrect URL for Task Tries and Task Duration

2018-11-16 Thread GitBox
kaxil opened a new pull request #4202: [AIRFLOW-XXX] Fix incorrect URL for Task 
Tries and Task Duration
URL: https://github.com/apache/incubator-airflow/pull/4202
 
 
   Make sure you have checked _all_ steps below.
   
   ### Jira
   
   - [x] My PR addresses the following [Airflow 
Jira](https://issues.apache.org/jira/browse/AIRFLOW/) issues and references 
them in the PR title. 
   
   
   ### Description
   
   - [x] Here are some details about my PR, including screenshots of any UI 
changes:
   
   ### Tests
   
   - [x] My PR adds the following unit tests __OR__ does not need testing for 
this extremely good reason:
   n/a
   
   ### Commits
   
   - [x] My commits all reference Jira issues in their subject lines, and I 
have squashed multiple commits if they address the same issue. In addition, my 
commits follow the guidelines from "[How to write a good git commit 
message](http://chris.beams.io/posts/git-commit/)":
 1. Subject is separated from body by a blank line
 1. Subject is limited to 50 characters (not including Jira issue reference)
 1. Subject does not end with a period
 1. Subject uses the imperative mood ("add", not "adding")
 1. Body wraps at 72 characters
 1. Body explains "what" and "why", not "how"
   
   ### Documentation
   
   - [x] In case of new functionality, my PR adds documentation that describes 
how to use it.
 - When adding new operators/hooks/sensors, the autoclass documentation 
generation needs to be added.
   
   ### Code Quality
   
   - [x] Passes `flake8`
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[jira] [Updated] (AIRFLOW-3346) Add gcp transfer service hook and operators

2018-11-16 Thread Kaxil Naik (JIRA)


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-3346?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Kaxil Naik updated AIRFLOW-3346:

Fix Version/s: 1.10.2

> Add gcp transfer service hook and operators
> ---
>
> Key: AIRFLOW-3346
> URL: https://issues.apache.org/jira/browse/AIRFLOW-3346
> Project: Apache Airflow
>  Issue Type: Bug
>Reporter: Josh Carp
>Priority: Minor
> Fix For: 1.10.2
>
>
> Add a hook and operator(s) to connect to gcp storage transfer service and 
> transfer files from s3 to gcp (and gcp to gcp) without copying to local disk.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Resolved] (AIRFLOW-3346) Add gcp transfer service hook and operators

2018-11-16 Thread Kaxil Naik (JIRA)


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-3346?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Kaxil Naik resolved AIRFLOW-3346.
-
Resolution: Fixed

Resolved by https://github.com/apache/incubator-airflow/pull/4189

> Add gcp transfer service hook and operators
> ---
>
> Key: AIRFLOW-3346
> URL: https://issues.apache.org/jira/browse/AIRFLOW-3346
> Project: Apache Airflow
>  Issue Type: Bug
>Reporter: Josh Carp
>Priority: Minor
>
> Add a hook and operator(s) to connect to gcp storage transfer service and 
> transfer files from s3 to gcp (and gcp to gcp) without copying to local disk.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[GitHub] kaxil edited a comment on issue #4189: [AIRFLOW-3346] Add hook and operator for GCP transfer service.

2018-11-16 Thread GitBox
kaxil edited a comment on issue #4189: [AIRFLOW-3346] Add hook and operator for 
GCP transfer service.
URL: 
https://github.com/apache/incubator-airflow/pull/4189#issuecomment-439563848
 
 
   I have squashed and merged it. Thanks @jmcarp for your contribution :)


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[jira] [Commented] (AIRFLOW-3346) Add gcp transfer service hook and operators

2018-11-16 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-3346?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16690146#comment-16690146
 ] 

ASF GitHub Bot commented on AIRFLOW-3346:
-

kaxil closed pull request #4189: [AIRFLOW-3346] Add hook and operator for GCP 
transfer service.
URL: https://github.com/apache/incubator-airflow/pull/4189
 
 
   

This is a PR merged from a forked repository.
As GitHub hides the original diff on merge, it is displayed below for
the sake of provenance:

As this is a foreign pull request (from a fork), the diff is supplied
below (as it won't show otherwise due to GitHub magic):

diff --git a/airflow/contrib/hooks/gcp_transfer_hook.py 
b/airflow/contrib/hooks/gcp_transfer_hook.py
new file mode 100644
index 00..88534a5103
--- /dev/null
+++ b/airflow/contrib/hooks/gcp_transfer_hook.py
@@ -0,0 +1,107 @@
+# -*- coding: utf-8 -*-
+#
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+
+import json
+import time
+import datetime
+from googleapiclient.discovery import build
+
+from airflow.exceptions import AirflowException
+from airflow.contrib.hooks.gcp_api_base_hook import GoogleCloudBaseHook
+
+# Time to sleep between active checks of the operation results
+TIME_TO_SLEEP_IN_SECONDS = 1
+
+
+# noinspection PyAbstractClass
+class GCPTransferServiceHook(GoogleCloudBaseHook):
+    """
+    Hook for GCP Storage Transfer Service.
+    """
+    _conn = None
+
+    def __init__(self,
+                 api_version='v1',
+                 gcp_conn_id='google_cloud_default',
+                 delegate_to=None):
+        super(GCPTransferServiceHook, self).__init__(gcp_conn_id, delegate_to)
+        self.api_version = api_version
+
+    def get_conn(self):
+        """
+        Retrieves connection to Google Storage Transfer service.
+
+        :return: Google Storage Transfer service object
+        :rtype: dict
+        """
+        if not self._conn:
+            http_authorized = self._authorize()
+            self._conn = build('storagetransfer', self.api_version,
+                               http=http_authorized, cache_discovery=False)
+        return self._conn
+
+    def create_transfer_job(self, project_id, transfer_spec, **kwargs):
+        conn = self.get_conn()
+        now = datetime.datetime.utcnow()
+        transfer_job = {
+            'status': 'ENABLED',
+            'projectId': project_id,
+            'transferSpec': transfer_spec,
+            'schedule': {
+                'scheduleStartDate': {
+                    'day': now.day,
+                    'month': now.month,
+                    'year': now.year,
+                },
+                'scheduleEndDate': {
+                    'day': now.day,
+                    'month': now.month,
+                    'year': now.year,
+                }
+            }
+        }
+        transfer_job.update(kwargs)
+        result = conn.transferJobs().create(body=transfer_job).execute()
+        self.wait_for_transfer_job(result, conn=conn)
+
+    def wait_for_transfer_job(self, job, conn=None):
+        conn = conn or self.get_conn()
+        while True:
+            result = conn.transferOperations().list(
+                name='transferOperations',
+                filter=json.dumps({
+                    'project_id': job['projectId'],
+                    'job_names': [job['name']],
+                }),
+            ).execute()
+            if self._check_operations_result(result):
+                return True
+            time.sleep(TIME_TO_SLEEP_IN_SECONDS)
+
+    def _check_operations_result(self, result):
+        operations = result.get('operations', [])
+        if len(operations) == 0:
+            return False
+        for operation in operations:
+            if operation['metadata']['status'] in {'FAILED', 'ABORTED'}:
+                raise AirflowException('Operation {} {}'.format(
+                    operation['name'], operation['metadata']['status']))
+            if operation['metadata']['status'] != 'SUCCESS':
+                return False
+        return True
diff --git a/airflow/contrib/operators/s3_to_gcs_transfer_operator.py 
b/airflow/contrib/operators/s3_to_gcs_transfer_operator.py
new 

[GitHub] kaxil commented on issue #4189: [AIRFLOW-3346] Add hook and operator for GCP transfer service.

2018-11-16 Thread GitBox
kaxil commented on issue #4189: [AIRFLOW-3346] Add hook and operator for GCP 
transfer service.
URL: 
https://github.com/apache/incubator-airflow/pull/4189#issuecomment-439563848
 
 
   I have squashed and merged it. Thanks @jmcarp :)


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] kaxil commented on issue #4194: [AIRFLOW-3352] Fix showing config on RBAC UI when expose_config is False

2018-11-16 Thread GitBox
kaxil commented on issue #4194: [AIRFLOW-3352] Fix showing config on RBAC UI 
when expose_config is False
URL: 
https://github.com/apache/incubator-airflow/pull/4194#issuecomment-439563808
 
 
   Thanks @phani8996 


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] kaxil closed pull request #4189: [AIRFLOW-3346] Add hook and operator for GCP transfer service.

2018-11-16 Thread GitBox
kaxil closed pull request #4189: [AIRFLOW-3346] Add hook and operator for GCP 
transfer service.
URL: https://github.com/apache/incubator-airflow/pull/4189
 
 
   

This is a PR merged from a forked repository.
As GitHub hides the original diff on merge, it is displayed below for
the sake of provenance:

As this is a foreign pull request (from a fork), the diff is supplied
below (as it won't show otherwise due to GitHub magic):

diff --git a/airflow/contrib/hooks/gcp_transfer_hook.py 
b/airflow/contrib/hooks/gcp_transfer_hook.py
new file mode 100644
index 00..88534a5103
--- /dev/null
+++ b/airflow/contrib/hooks/gcp_transfer_hook.py
@@ -0,0 +1,107 @@
+# -*- coding: utf-8 -*-
+#
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+
+import json
+import time
+import datetime
+from googleapiclient.discovery import build
+
+from airflow.exceptions import AirflowException
+from airflow.contrib.hooks.gcp_api_base_hook import GoogleCloudBaseHook
+
+# Time to sleep between active checks of the operation results
+TIME_TO_SLEEP_IN_SECONDS = 1
+
+
+# noinspection PyAbstractClass
+class GCPTransferServiceHook(GoogleCloudBaseHook):
+    """
+    Hook for GCP Storage Transfer Service.
+    """
+    _conn = None
+
+    def __init__(self,
+                 api_version='v1',
+                 gcp_conn_id='google_cloud_default',
+                 delegate_to=None):
+        super(GCPTransferServiceHook, self).__init__(gcp_conn_id, delegate_to)
+        self.api_version = api_version
+
+    def get_conn(self):
+        """
+        Retrieves connection to Google Storage Transfer service.
+
+        :return: Google Storage Transfer service object
+        :rtype: dict
+        """
+        if not self._conn:
+            http_authorized = self._authorize()
+            self._conn = build('storagetransfer', self.api_version,
+                               http=http_authorized, cache_discovery=False)
+        return self._conn
+
+    def create_transfer_job(self, project_id, transfer_spec, **kwargs):
+        conn = self.get_conn()
+        now = datetime.datetime.utcnow()
+        transfer_job = {
+            'status': 'ENABLED',
+            'projectId': project_id,
+            'transferSpec': transfer_spec,
+            'schedule': {
+                'scheduleStartDate': {
+                    'day': now.day,
+                    'month': now.month,
+                    'year': now.year,
+                },
+                'scheduleEndDate': {
+                    'day': now.day,
+                    'month': now.month,
+                    'year': now.year,
+                }
+            }
+        }
+        transfer_job.update(kwargs)
+        result = conn.transferJobs().create(body=transfer_job).execute()
+        self.wait_for_transfer_job(result, conn=conn)
+
+    def wait_for_transfer_job(self, job, conn=None):
+        conn = conn or self.get_conn()
+        while True:
+            result = conn.transferOperations().list(
+                name='transferOperations',
+                filter=json.dumps({
+                    'project_id': job['projectId'],
+                    'job_names': [job['name']],
+                }),
+            ).execute()
+            if self._check_operations_result(result):
+                return True
+            time.sleep(TIME_TO_SLEEP_IN_SECONDS)
+
+    def _check_operations_result(self, result):
+        operations = result.get('operations', [])
+        if len(operations) == 0:
+            return False
+        for operation in operations:
+            if operation['metadata']['status'] in {'FAILED', 'ABORTED'}:
+                raise AirflowException('Operation {} {}'.format(
+                    operation['name'], operation['metadata']['status']))
+            if operation['metadata']['status'] != 'SUCCESS':
+                return False
+        return True
diff --git a/airflow/contrib/operators/s3_to_gcs_transfer_operator.py 
b/airflow/contrib/operators/s3_to_gcs_transfer_operator.py
new file mode 100644
index 00..e2fbf95b73
--- /dev/null
+++ b/airflow/contrib/operators/s3_to_gcs_transfer_operator.py
@@ -0,0 +1,124 @@
+# -*- coding: utf-8 -*-
+#
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more 
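
A hypothetical usage sketch of the new hook (not part of the PR): the 
transfer_spec keys follow the GCP Storage Transfer API (awsS3DataSource, 
gcsDataSink); the project id, bucket names, and connection id below are 
placeholders.

```python
from airflow.contrib.hooks.gcp_transfer_hook import GCPTransferServiceHook

hook = GCPTransferServiceHook(gcp_conn_id='google_cloud_default')
hook.create_transfer_job(
    project_id='my-gcp-project',  # placeholder project id
    transfer_spec={
        'awsS3DataSource': {'bucketName': 'my-s3-bucket'},  # source (credentials omitted)
        'gcsDataSink': {'bucketName': 'my-gcs-bucket'},     # destination
    },
)
# create_transfer_job() schedules the job to start (and end) today, then
# blocks in wait_for_transfer_job(), polling transferOperations every
# TIME_TO_SLEEP_IN_SECONDS until SUCCESS, and raising AirflowException on
# FAILED or ABORTED.
```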

[jira] [Commented] (AIRFLOW-3352) Don't Show Airflow config in rbac view based on flag

2018-11-16 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-3352?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16690143#comment-16690143
 ] 

ASF GitHub Bot commented on AIRFLOW-3352:
-

kaxil closed pull request #4194: [AIRFLOW-3352] Fix showing config on RBAC UI 
when expose_config is False
URL: https://github.com/apache/incubator-airflow/pull/4194
 
 
   

This is a PR merged from a forked repository.
As GitHub hides the original diff on merge, it is displayed below for
the sake of provenance:

As this is a foreign pull request (from a fork), the diff is supplied
below (as it won't show otherwise due to GitHub magic):

diff --git a/airflow/www_rbac/views.py b/airflow/www_rbac/views.py
index 29e8da1b9c..f5ca0ca0c7 100644
--- a/airflow/www_rbac/views.py
+++ b/airflow/www_rbac/views.py
@@ -1798,11 +1798,18 @@ def conf(self):
         raw = request.args.get('raw') == "true"
         title = "Airflow Configuration"
         subtitle = conf.AIRFLOW_CONFIG
-        with open(conf.AIRFLOW_CONFIG, 'r') as f:
-            config = f.read()
-        table = [(section, key, value, source)
-                 for section, parameters in conf.as_dict(True, True).items()
-                 for key, (value, source) in parameters.items()]
+        # Don't show config when expose_config variable is False in airflow config
+        if conf.getboolean("webserver", "expose_config"):
+            with open(conf.AIRFLOW_CONFIG, 'r') as f:
+                config = f.read()
+            table = [(section, key, value, source)
+                     for section, parameters in conf.as_dict(True, True).items()
+                     for key, (value, source) in parameters.items()]
+        else:
+            config = (
+                "# Your Airflow administrator chose not to expose the "
+                "configuration, most likely for security reasons.")
+            table = None
 
         if raw:
             return Response(
diff --git a/tests/www_rbac/test_views.py b/tests/www_rbac/test_views.py
index af5fee3180..2520cfe340 100644
--- a/tests/www_rbac/test_views.py
+++ b/tests/www_rbac/test_views.py
@@ -448,9 +448,19 @@ def test_refresh(self):
 
 
 class TestConfigurationView(TestBase):
-    def test_configuration(self):
+    def test_configuration_do_not_expose_config(self):
         self.logout()
         self.login()
+        conf.set("webserver", "expose_config", "False")
+        resp = self.client.get('configuration', follow_redirects=True)
+        self.check_content_in_response(
+            ['Airflow Configuration',
+             '# Your Airflow administrator chose not to expose the configuration, '
+             'most likely for security reasons.'], resp)
+
+    def test_configuration_expose_config(self):
+        self.logout()
+        self.login()
+        conf.set("webserver", "expose_config", "True")
         resp = self.client.get('configuration', follow_redirects=True)
         self.check_content_in_response(
             ['Airflow Configuration', 'Running Configuration'], resp)


 


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


> Don't Show Airflow config in rbac view based on flag
> 
>
> Key: AIRFLOW-3352
> URL: https://issues.apache.org/jira/browse/AIRFLOW-3352
> Project: Apache Airflow
>  Issue Type: Bug
>  Components: webserver
>Affects Versions: 1.10.0
>Reporter: Sai Phanindhra
>Assignee: Sai Phanindhra
>Priority: Major
>  Labels: configuration, rbac, webserver
> Fix For: 1.10.2
>
>
> Earlier, the expose_config flag was used to toggle whether the configuration 
> is shown in the UI. That flag is not honoured when RBAC is enabled. Add a 
> provision to toggle this feature when RBAC is enabled in Airflow.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[GitHub] codecov-io commented on issue #4194: [AIRFLOW-3352] Fix showing config on RBAC UI when expose_config is False

2018-11-16 Thread GitBox
codecov-io commented on issue #4194: [AIRFLOW-3352] Fix showing config on RBAC 
UI when expose_config is False
URL: 
https://github.com/apache/incubator-airflow/pull/4194#issuecomment-439556402
 
 
   # 
[Codecov](https://codecov.io/gh/apache/incubator-airflow/pull/4194?src=pr=h1)
 Report
   > Merging 
[#4194](https://codecov.io/gh/apache/incubator-airflow/pull/4194?src=pr=desc)
 into 
[master](https://codecov.io/gh/apache/incubator-airflow/commit/8668ef869d3d844dac746ec88609d3710a1264ab?src=pr=desc)
 will **increase** coverage by `<.01%`.
   > The diff coverage is `100%`.
   
   [![Impacted file tree 
graph](https://codecov.io/gh/apache/incubator-airflow/pull/4194/graphs/tree.svg?width=650=WdLKlKHOAU=150=pr)](https://codecov.io/gh/apache/incubator-airflow/pull/4194?src=pr=tree)
   
   ```diff
   @@            Coverage Diff            @@
   ##           master    #4194      +/-   ##
   ==========================================
   + Coverage   77.69%    77.7%    +<.01%
   ==========================================
     Files         199      199
     Lines       16309    16315        +6
   ==========================================
   + Hits        12672    12677        +5
   - Misses       3637     3638        +1
   ```
   
   
   | [Impacted 
Files](https://codecov.io/gh/apache/incubator-airflow/pull/4194?src=pr=tree) 
| Coverage Δ | |
   |---|---|---|
   | 
[airflow/www\_rbac/views.py](https://codecov.io/gh/apache/incubator-airflow/pull/4194/diff?src=pr=tree#diff-YWlyZmxvdy93d3dfcmJhYy92aWV3cy5weQ==)
 | `72.38% <100%> (+0.06%)` | :arrow_up: |
   | 
[airflow/models.py](https://codecov.io/gh/apache/incubator-airflow/pull/4194/diff?src=pr=tree#diff-YWlyZmxvdy9tb2RlbHMucHk=)
 | `92.29% <0%> (-0.04%)` | :arrow_down: |
   
   --
   
   [Continue to review full report at 
Codecov](https://codecov.io/gh/apache/incubator-airflow/pull/4194?src=pr=continue).
   > **Legend** - [Click here to learn 
more](https://docs.codecov.io/docs/codecov-delta)
   > `Δ = absolute  (impact)`, `ø = not affected`, `? = missing data`
   > Powered by 
[Codecov](https://codecov.io/gh/apache/incubator-airflow/pull/4194?src=pr=footer).
 Last update 
[8668ef8...ba93f0a](https://codecov.io/gh/apache/incubator-airflow/pull/4194?src=pr=lastupdated).
 Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments).
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] codecov-io edited a comment on issue #4194: [AIRFLOW-3352] Fix showing config on RBAC UI when expose_config is False

2018-11-16 Thread GitBox
codecov-io edited a comment on issue #4194: [AIRFLOW-3352] Fix showing config 
on RBAC UI when expose_config is False
URL: 
https://github.com/apache/incubator-airflow/pull/4194#issuecomment-439556402
 
 
   # 
[Codecov](https://codecov.io/gh/apache/incubator-airflow/pull/4194?src=pr=h1)
 Report
   > Merging 
[#4194](https://codecov.io/gh/apache/incubator-airflow/pull/4194?src=pr=desc)
 into 
[master](https://codecov.io/gh/apache/incubator-airflow/commit/8668ef869d3d844dac746ec88609d3710a1264ab?src=pr=desc)
 will **increase** coverage by `<.01%`.
   > The diff coverage is `100%`.
   
   [![Impacted file tree 
graph](https://codecov.io/gh/apache/incubator-airflow/pull/4194/graphs/tree.svg?width=650=WdLKlKHOAU=150=pr)](https://codecov.io/gh/apache/incubator-airflow/pull/4194?src=pr=tree)
   
   ```diff
   @@            Coverage Diff            @@
   ##           master    #4194      +/-   ##
   ==========================================
   + Coverage   77.69%    77.7%    +<.01%
   ==========================================
     Files         199      199
     Lines       16309    16315        +6
   ==========================================
   + Hits        12672    12677        +5
   - Misses       3637     3638        +1
   ```
   
   
   | [Impacted 
Files](https://codecov.io/gh/apache/incubator-airflow/pull/4194?src=pr=tree) 
| Coverage Δ | |
   |---|---|---|
   | 
[airflow/www\_rbac/views.py](https://codecov.io/gh/apache/incubator-airflow/pull/4194/diff?src=pr=tree#diff-YWlyZmxvdy93d3dfcmJhYy92aWV3cy5weQ==)
 | `72.38% <100%> (+0.06%)` | :arrow_up: |
   | 
[airflow/models.py](https://codecov.io/gh/apache/incubator-airflow/pull/4194/diff?src=pr=tree#diff-YWlyZmxvdy9tb2RlbHMucHk=)
 | `92.29% <0%> (-0.04%)` | :arrow_down: |
   
   --
   
   [Continue to review full report at 
Codecov](https://codecov.io/gh/apache/incubator-airflow/pull/4194?src=pr=continue).
   > **Legend** - [Click here to learn 
more](https://docs.codecov.io/docs/codecov-delta)
   > `Δ = absolute  (impact)`, `ø = not affected`, `? = missing data`
   > Powered by 
[Codecov](https://codecov.io/gh/apache/incubator-airflow/pull/4194?src=pr=footer).
 Last update 
[8668ef8...ba93f0a](https://codecov.io/gh/apache/incubator-airflow/pull/4194?src=pr=lastupdated).
 Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments).
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] jmcarp commented on issue #4189: [AIRFLOW-3346] Add hook and operator for GCP transfer service.

2018-11-16 Thread GitBox
jmcarp commented on issue #4189: [AIRFLOW-3346] Add hook and operator for GCP 
transfer service.
URL: 
https://github.com/apache/incubator-airflow/pull/4189#issuecomment-439554912
 
 
   Thanks for the update @kaxil! Let me know when you're done and I'll squash 
commits.


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[jira] [Resolved] (AIRFLOW-3355) Fix BigQueryCursor.execute to work with Python3

2018-11-16 Thread Kaxil Naik (JIRA)


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-3355?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Kaxil Naik resolved AIRFLOW-3355.
-
Resolution: Fixed

Resolved by  https://github.com/apache/incubator-airflow/pull/4198

> Fix BigQueryCursor.execute to work with Python3
> ---
>
> Key: AIRFLOW-3355
> URL: https://issues.apache.org/jira/browse/AIRFLOW-3355
> Project: Apache Airflow
>  Issue Type: Bug
>  Components: gcp, hooks
>Reporter: Kengo Seki
>Assignee: Kengo Seki
>Priority: Major
>
> {{BigQueryCursor.execute}} uses {{dict.iteritems}} internally, so it fails 
> with Python3 if binding parameters are provided.
> {code}
> In [1]: import sys
> In [2]: sys.version
> Out[2]: '3.6.6 (default, Sep 12 2018, 18:26:19) \n[GCC 8.0.1 20180414 
> (experimental) [trunk revision 259383]]'
> In [3]: from airflow.contrib.hooks.bigquery_hook import BigQueryHook
> In [4]: hook = BigQueryHook()
> In [5]: conn = hook.get_conn()
> [2018-11-15 19:01:35,856] {discovery.py:267} INFO - URL being requested: GET 
> https://www.googleapis.com/discovery/v1/apis/bigquery/v2/rest
> In [6]: cur = conn.cursor()
> In [7]: cur.execute("SELECT count(*) FROM ds.t WHERE c = %(v)d", {"v": 0})
> ---------------------------------------------------------------------------
> AttributeError                            Traceback (most recent call last)
> <ipython-input-...> in <module>()
> ----> 1 cur.execute("SELECT count(*) FROM ds.t WHERE c = %(v)d", {"v": 0})
> ~/dev/incubator-airflow/airflow/contrib/hooks/bigquery_hook.py in 
> execute(self, operation, parameters)
>    1561         """
>    1562         sql = _bind_parameters(operation,
> -> 1563                                parameters) if parameters else operation
>    1564         self.job_id = self.run_query(sql)
>    1565
> ~/dev/incubator-airflow/airflow/contrib/hooks/bigquery_hook.py in 
> _bind_parameters(operation, parameters)
>    1684     # inspired by MySQL Python Connector (conversion.py)
>    1685     string_parameters = {}
> -> 1686     for (name, value) in parameters.iteritems():
>    1687         if value is None:
>    1688             string_parameters[name] = 'NULL'
> AttributeError: 'dict' object has no attribute 'iteritems'
> {code}



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Updated] (AIRFLOW-3355) Fix BigQueryCursor.execute to work with Python3

2018-11-16 Thread Kaxil Naik (JIRA)


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-3355?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Kaxil Naik updated AIRFLOW-3355:

Fix Version/s: 1.10.2

> Fix BigQueryCursor.execute to work with Python3
> ---
>
> Key: AIRFLOW-3355
> URL: https://issues.apache.org/jira/browse/AIRFLOW-3355
> Project: Apache Airflow
>  Issue Type: Bug
>  Components: gcp, hooks
>Reporter: Kengo Seki
>Assignee: Kengo Seki
>Priority: Major
> Fix For: 2.0.0
>
>
> {{BigQueryCursor.execute}} uses {{dict.iteritems}} internally, so it fails 
> with Python3 if binding parameters are provided.
> {code}
> In [1]: import sys
> In [2]: sys.version
> Out[2]: '3.6.6 (default, Sep 12 2018, 18:26:19) \n[GCC 8.0.1 20180414 
> (experimental) [trunk revision 259383]]'
> In [3]: from airflow.contrib.hooks.bigquery_hook import BigQueryHook
> In [4]: hook = BigQueryHook()
> In [5]: conn = hook.get_conn()
> [2018-11-15 19:01:35,856] {discovery.py:267} INFO - URL being requested: GET 
> https://www.googleapis.com/discovery/v1/apis/bigquery/v2/rest
> In [6]: cur = conn.cursor()
> In [7]: cur.execute("SELECT count(*) FROM ds.t WHERE c = %(v)d", {"v": 0})
> ---------------------------------------------------------------------------
> AttributeError                            Traceback (most recent call last)
> <ipython-input-...> in <module>()
> ----> 1 cur.execute("SELECT count(*) FROM ds.t WHERE c = %(v)d", {"v": 0})
> ~/dev/incubator-airflow/airflow/contrib/hooks/bigquery_hook.py in 
> execute(self, operation, parameters)
>    1561         """
>    1562         sql = _bind_parameters(operation,
> -> 1563                                parameters) if parameters else operation
>    1564         self.job_id = self.run_query(sql)
>    1565
> ~/dev/incubator-airflow/airflow/contrib/hooks/bigquery_hook.py in 
> _bind_parameters(operation, parameters)
>    1684     # inspired by MySQL Python Connector (conversion.py)
>    1685     string_parameters = {}
> -> 1686     for (name, value) in parameters.iteritems():
>    1687         if value is None:
>    1688             string_parameters[name] = 'NULL'
> AttributeError: 'dict' object has no attribute 'iteritems'
> {code}



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (AIRFLOW-3355) Fix BigQueryCursor.execute to work with Python3

2018-11-16 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-3355?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16690077#comment-16690077
 ] 

ASF GitHub Bot commented on AIRFLOW-3355:
-

kaxil closed pull request #4198: [AIRFLOW-3355] Fix BigQueryCursor.execute to 
work with Python3
URL: https://github.com/apache/incubator-airflow/pull/4198
 
 
   

This is a PR merged from a forked repository.
As GitHub hides the original diff on merge, it is displayed below for
the sake of provenance:

As this is a foreign pull request (from a fork), the diff is supplied
below (as it won't show otherwise due to GitHub magic):

diff --git a/airflow/contrib/hooks/bigquery_hook.py 
b/airflow/contrib/hooks/bigquery_hook.py
index d300dbe6b7..98e66c405c 100644
--- a/airflow/contrib/hooks/bigquery_hook.py
+++ b/airflow/contrib/hooks/bigquery_hook.py
@@ -25,6 +25,7 @@
 import time
 from builtins import range
 from copy import deepcopy
+from six import iteritems
 
 from past.builtins import basestring
 
@@ -1683,7 +1684,7 @@ def _bind_parameters(operation, parameters):
 """ Helper method that binds parameters to a SQL query. """
 # inspired by MySQL Python Connector (conversion.py)
 string_parameters = {}
-for (name, value) in parameters.iteritems():
+for (name, value) in iteritems(parameters):
 if value is None:
 string_parameters[name] = 'NULL'
 elif isinstance(value, basestring):
diff --git a/tests/contrib/hooks/test_bigquery_hook.py 
b/tests/contrib/hooks/test_bigquery_hook.py
index 8f350ff2ee..82bd00e4f4 100644
--- a/tests/contrib/hooks/test_bigquery_hook.py
+++ b/tests/contrib/hooks/test_bigquery_hook.py
@@ -295,6 +295,14 @@ def test_duplication_check(self):
 "key_one", key_one, {"key_one": True}))
 
 
+class TestBigQueryCursor(unittest.TestCase):
+@mock.patch.object(hook.BigQueryBaseCursor, 'run_with_configuration')
+def test_execute_with_parameters(self, mocked_rwc):
+hook.BigQueryCursor("test", "test").execute(
+"SELECT %(foo)s", {"foo": "bar"})
+mocked_rwc.assert_called_once()
+
+
 class TestLabelsInRunJob(unittest.TestCase):
 @mock.patch.object(hook.BigQueryBaseCursor, 'run_with_configuration')
 def test_run_query_with_arg(self, mocked_rwc):
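
For context, a minimal standalone sketch (not part of the PR) of why the original call breaks on Python 3 and why the six.iteritems shim works on both interpreter lines:

{code}
# Standalone illustration: dict.iteritems() was removed in Python 3, while
# six.iteritems() dispatches to .iteritems() on Python 2 and .items() on Python 3.
from six import iteritems

parameters = {"v": 0, "name": None}

# Works on both Python 2 and Python 3.
for key, value in iteritems(parameters):
    print(key, "->", "NULL" if value is None else value)

# The original code path only exists on Python 2.
try:
    parameters.iteritems()
except AttributeError as exc:
    print("Python 3 raises:", exc)
{code}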


 


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


> Fix BigQueryCursor.execute to work with Python3
> ---
>
> Key: AIRFLOW-3355
> URL: https://issues.apache.org/jira/browse/AIRFLOW-3355
> Project: Apache Airflow
>  Issue Type: Bug
>  Components: gcp, hooks
>Reporter: Kengo Seki
>Assignee: Kengo Seki
>Priority: Major
> Fix For: 2.0.0
>
>
> {{BigQueryCursor.execute}} uses {{dict.iteritems}} internally, so it fails 
> with Python3 if binding parameters are provided.
> {code}
> In [1]: import sys
> In [2]: sys.version
> Out[2]: '3.6.6 (default, Sep 12 2018, 18:26:19) \n[GCC 8.0.1 20180414 
> (experimental) [trunk revision 259383]]'
> In [3]: from airflow.contrib.hooks.bigquery_hook import BigQueryHook
> In [4]: hook = BigQueryHook()
> In [5]: conn = hook.get_conn()
> [2018-11-15 19:01:35,856] {discovery.py:267} INFO - URL being requested: GET 
> https://www.googleapis.com/discovery/v1/apis/bigquery/v2/rest
> In [6]: cur = conn.cursor()
> In [7]: cur.execute("SELECT count(*) FROM ds.t WHERE c = %(v)d", {"v": 0})
> ---
> AttributeError                            Traceback (most recent call last)
>  in 
> ----> 1 cur.execute("SELECT count(*) FROM ds.t WHERE c = %(v)d", {"v": 0})
> ~/dev/incubator-airflow/airflow/contrib/hooks/bigquery_hook.py in execute(self, operation, parameters)
>    1561         """
>    1562         sql = _bind_parameters(operation,
> -> 1563                                parameters) if parameters else operation
>    1564         self.job_id = self.run_query(sql)
>    1565
> ~/dev/incubator-airflow/airflow/contrib/hooks/bigquery_hook.py in _bind_parameters(operation, parameters)
>    1684     # inspired by MySQL Python Connector (conversion.py)
>    1685     string_parameters = {}
> -> 1686     for (name, value) in parameters.iteritems():
>    1687         if value is None:
>    1688             string_parameters[name] = 'NULL'
> AttributeError: 'dict' object has no attribute 'iteritems'
> {code}



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Updated] (AIRFLOW-3355) Fix BigQueryCursor.execute to work with Python3

2018-11-16 Thread Kaxil Naik (JIRA)


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-3355?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Kaxil Naik updated AIRFLOW-3355:

Fix Version/s: (was: 1.10.2)
   2.0.0

> Fix BigQueryCursor.execute to work with Python3
> ---
>
> Key: AIRFLOW-3355
> URL: https://issues.apache.org/jira/browse/AIRFLOW-3355
> Project: Apache Airflow
>  Issue Type: Bug
>  Components: gcp, hooks
>Reporter: Kengo Seki
>Assignee: Kengo Seki
>Priority: Major
> Fix For: 2.0.0
>
>
> {{BigQueryCursor.execute}} uses {{dict.iteritems}} internally, so it fails 
> with Python3 if binding parameters are provided.
> {code}
> In [1]: import sys
> In [2]: sys.version
> Out[2]: '3.6.6 (default, Sep 12 2018, 18:26:19) \n[GCC 8.0.1 20180414 
> (experimental) [trunk revision 259383]]'
> In [3]: from airflow.contrib.hooks.bigquery_hook import BigQueryHook
> In [4]: hook = BigQueryHook()
> In [5]: conn = hook.get_conn()
> [2018-11-15 19:01:35,856] {discovery.py:267} INFO - URL being requested: GET 
> https://www.googleapis.com/discovery/v1/apis/bigquery/v2/rest
> In [6]: cur = conn.cursor()
> In [7]: cur.execute("SELECT count(*) FROM ds.t WHERE c = %(v)d", {"v": 0})
> ---
> AttributeError                            Traceback (most recent call last)
>  in 
> ----> 1 cur.execute("SELECT count(*) FROM ds.t WHERE c = %(v)d", {"v": 0})
> ~/dev/incubator-airflow/airflow/contrib/hooks/bigquery_hook.py in execute(self, operation, parameters)
>    1561         """
>    1562         sql = _bind_parameters(operation,
> -> 1563                                parameters) if parameters else operation
>    1564         self.job_id = self.run_query(sql)
>    1565
> ~/dev/incubator-airflow/airflow/contrib/hooks/bigquery_hook.py in _bind_parameters(operation, parameters)
>    1684     # inspired by MySQL Python Connector (conversion.py)
>    1685     string_parameters = {}
> -> 1686     for (name, value) in parameters.iteritems():
>    1687         if value is None:
>    1688             string_parameters[name] = 'NULL'
> AttributeError: 'dict' object has no attribute 'iteritems'
> {code}



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[GitHub] kaxil closed pull request #4198: [AIRFLOW-3355] Fix BigQueryCursor.execute to work with Python3

2018-11-16 Thread GitBox
kaxil closed pull request #4198: [AIRFLOW-3355] Fix BigQueryCursor.execute to 
work with Python3
URL: https://github.com/apache/incubator-airflow/pull/4198
 
 
   

This is a PR merged from a forked repository.
As GitHub hides the original diff on merge, it is displayed below for
the sake of provenance:

As this is a foreign pull request (from a fork), the diff is supplied
below (as it won't show otherwise due to GitHub magic):

diff --git a/airflow/contrib/hooks/bigquery_hook.py 
b/airflow/contrib/hooks/bigquery_hook.py
index d300dbe6b7..98e66c405c 100644
--- a/airflow/contrib/hooks/bigquery_hook.py
+++ b/airflow/contrib/hooks/bigquery_hook.py
@@ -25,6 +25,7 @@
 import time
 from builtins import range
 from copy import deepcopy
+from six import iteritems
 
 from past.builtins import basestring
 
@@ -1683,7 +1684,7 @@ def _bind_parameters(operation, parameters):
 """ Helper method that binds parameters to a SQL query. """
 # inspired by MySQL Python Connector (conversion.py)
 string_parameters = {}
-for (name, value) in parameters.iteritems():
+for (name, value) in iteritems(parameters):
 if value is None:
 string_parameters[name] = 'NULL'
 elif isinstance(value, basestring):
diff --git a/tests/contrib/hooks/test_bigquery_hook.py 
b/tests/contrib/hooks/test_bigquery_hook.py
index 8f350ff2ee..82bd00e4f4 100644
--- a/tests/contrib/hooks/test_bigquery_hook.py
+++ b/tests/contrib/hooks/test_bigquery_hook.py
@@ -295,6 +295,14 @@ def test_duplication_check(self):
 "key_one", key_one, {"key_one": True}))
 
 
+class TestBigQueryCursor(unittest.TestCase):
+@mock.patch.object(hook.BigQueryBaseCursor, 'run_with_configuration')
+def test_execute_with_parameters(self, mocked_rwc):
+hook.BigQueryCursor("test", "test").execute(
+"SELECT %(foo)s", {"foo": "bar"})
+mocked_rwc.assert_called_once()
+
+
 class TestLabelsInRunJob(unittest.TestCase):
 @mock.patch.object(hook.BigQueryBaseCursor, 'run_with_configuration')
 def test_run_query_with_arg(self, mocked_rwc):


 


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] kaxil commented on a change in pull request #4194: [AIRFLOW-3352] Fix showing config on RBAC UI when expose_config is False

2018-11-16 Thread GitBox
kaxil commented on a change in pull request #4194: [AIRFLOW-3352] Fix showing 
config on RBAC UI when expose_config is False
URL: https://github.com/apache/incubator-airflow/pull/4194#discussion_r234367024
 
 

 ##
 File path: tests/www_rbac/test_views.py
 ##
 @@ -448,13 +448,23 @@ def test_refresh(self):
 
 
 class TestConfigurationView(TestBase):
-    def test_configuration(self):
+    def test_configuration_do_not_expose_config(self):
         self.logout()
         self.login()
+        conf.set("webserver", "expose_config", "False")
 
 Review comment:
   The tests should pass now


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[jira] [Commented] (AIRFLOW-3360) Search does not respect ShowPaused Dags querystring

2018-11-16 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-3360?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16690025#comment-16690025
 ] 

ASF GitHub Bot commented on AIRFLOW-3360:
-

jongsy opened a new pull request #4201: [AIRFLOW-3360] Make the DAGs search 
respect other querystring parameters
URL: https://github.com/apache/incubator-airflow/pull/4201
 
 
   Make sure you have checked _all_ steps below.
   
   ### Jira
   
   - [x] My PR addresses the following [Airflow 
Jira](https://issues.apache.org/jira/browse/AIRFLOW/) issues and references 
them in the PR title. For example, "\[AIRFLOW-XXX\] My Airflow PR"
 - https://issues.apache.org/jira/browse/AIRFLOW-XXX
 - In case you are fixing a typo in the documentation you can prepend your 
commit with \[AIRFLOW-XXX\], code changes always need a Jira issue.
   
   ### Description
   
   - [x] Here are some details about my PR, including screenshots of any UI 
changes:
   
   Search will append to the query string rather than replacing it. Allows 
users to decide whether they want to search for hidden DAGs.
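
   The merge-instead-of-replace behaviour is easy to illustrate outside the template; a small standalone sketch (not the actual Jinja/JS change in this PR):

{code}
# Standalone illustration of "append to the query string rather than replacing it".
from urllib.parse import parse_qs, urlencode


def build_search_url(base_path, current_query, search_term):
    """Keep existing parameters (e.g. showPaused=False) and add/overwrite 'search'."""
    params = {key: values[-1] for key, values in parse_qs(current_query).items()}
    params["search"] = search_term
    return "%s?%s" % (base_path, urlencode(params))


print(build_search_url("/admin/", "showPaused=False&page=2", "my_dag"))
# -> /admin/?showPaused=False&page=2&search=my_dag (parameter order may vary)
{code}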
   
   ### Tests
   
   - [x] My PR adds the following unit tests __OR__ does not need testing for 
this extremely good reason:
   
   No existing tests around the DAGs template.
   
   ### Commits
   
   - [x] My commits all reference Jira issues in their subject lines, and I 
have squashed multiple commits if they address the same issue. In addition, my 
commits follow the guidelines from "[How to write a good git commit 
message](http://chris.beams.io/posts/git-commit/)":
 1. Subject is separated from body by a blank line
 1. Subject is limited to 50 characters (not including Jira issue reference)
 1. Subject does not end with a period
 1. Subject uses the imperative mood ("add", not "adding")
 1. Body wraps at 72 characters
 1. Body explains "what" and "why", not "how"
   
   ### Documentation
   
   - [x] In case of new functionality, my PR adds documentation that describes 
how to use it.
 - When adding new operators/hooks/sensors, the autoclass documentation 
generation needs to be added.
   
   ### Code Quality
   
   - [x] Passes `flake8`
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


> Search does not respect ShowPaused Dags querystring
> ---
>
> Key: AIRFLOW-3360
> URL: https://issues.apache.org/jira/browse/AIRFLOW-3360
> Project: Apache Airflow
>  Issue Type: Bug
>  Components: ui
>Affects Versions: 1.10.0, 2.0.0
> Environment: All
>Reporter: Jonathan Burgess
>Assignee: Jonathan Burgess
>Priority: Minor
>  Labels: rbac, ui
>
> After submitting a search query on the DAGs page the showPaused parameter 
> will revert to the default. This is troublesome when using the search to find 
> paused dags if the default is ShowPaused is False.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[GitHub] jongsy opened a new pull request #4201: [AIRFLOW-3360] Make the DAGs search respect other querystring parameters

2018-11-16 Thread GitBox
jongsy opened a new pull request #4201: [AIRFLOW-3360] Make the DAGs search 
respect other querystring parameters
URL: https://github.com/apache/incubator-airflow/pull/4201
 
 
   Make sure you have checked _all_ steps below.
   
   ### Jira
   
   - [x] My PR addresses the following [Airflow 
Jira](https://issues.apache.org/jira/browse/AIRFLOW/) issues and references 
them in the PR title. For example, "\[AIRFLOW-XXX\] My Airflow PR"
 - https://issues.apache.org/jira/browse/AIRFLOW-XXX
 - In case you are fixing a typo in the documentation you can prepend your 
commit with \[AIRFLOW-XXX\], code changes always need a Jira issue.
   
   ### Description
   
   - [x] Here are some details about my PR, including screenshots of any UI 
changes:
   
   Search will append to the query string rather than replacing it. Allows 
users to decide whether they want to search for hidden DAGs.
   
   ### Tests
   
   - [x] My PR adds the following unit tests __OR__ does not need testing for 
this extremely good reason:
   
   No existing tests around the DAGs template.
   
   ### Commits
   
   - [x] My commits all reference Jira issues in their subject lines, and I 
have squashed multiple commits if they address the same issue. In addition, my 
commits follow the guidelines from "[How to write a good git commit 
message](http://chris.beams.io/posts/git-commit/)":
 1. Subject is separated from body by a blank line
 1. Subject is limited to 50 characters (not including Jira issue reference)
 1. Subject does not end with a period
 1. Subject uses the imperative mood ("add", not "adding")
 1. Body wraps at 72 characters
 1. Body explains "what" and "why", not "how"
   
   ### Documentation
   
   - [x] In case of new functionality, my PR adds documentation that describes 
how to use it.
 - When adding new operators/hooks/sensors, the autoclass documentation 
generation needs to be added.
   
   ### Code Quality
   
   - [x] Passes `flake8`
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] SamWildmo commented on issue #4030: [AIRFLOW-XXX] Log the task_id in the PendingDeprecationWarning for BaseOperator

2018-11-16 Thread GitBox
SamWildmo commented on issue #4030: [AIRFLOW-XXX] Log the task_id in the 
PendingDeprecationWarning for BaseOperator
URL: 
https://github.com/apache/incubator-airflow/pull/4030#issuecomment-439521234
 
 
   @ashb this might be usefull in 1.10.1


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[jira] [Commented] (AIRFLOW-3358) POC: Refactor command line to use Click

2018-11-16 Thread Kaxil Naik (JIRA)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-3358?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16689962#comment-16689962
 ] 

Kaxil Naik commented on AIRFLOW-3358:
-

From what I know and read, Click is not fully customizable, but I don't have a 
strong opinion. I tried looking on the internet for what others are thinking:

* 
https://www.reddit.com/r/Python/comments/73xb5y/click_reviews_should_i_migrate_to_click_from/dntuqlb/
* https://medium.com/@collectiveacuity/argparse-vs-click-227f53f023dc

I would suggest asking this on the mailing list as well to get opinions of 
others who have used click and know the pros and cons compared to argparse.
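
For readers unfamiliar with Click, a minimal self-contained sketch of the decorator style it encourages (the command and option names below are made up for illustration, not a proposal for the Airflow CLI):

{code}
# Toy Click program; "backfill" and "--dry-run" are hypothetical examples.
import click


@click.group()
def cli():
    """Toy command group used only to show Click's decorator-based style."""


@cli.command()
@click.argument("dag_id")
@click.option("--dry-run", is_flag=True, help="Print what would run without running it.")
def backfill(dag_id, dry_run):
    """Pretend to backfill DAG_ID."""
    click.echo("backfilling %s (dry_run=%s)" % (dag_id, dry_run))


if __name__ == "__main__":
    cli()
{code}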



> POC: Refactor command line to use Click
> ---
>
> Key: AIRFLOW-3358
> URL: https://issues.apache.org/jira/browse/AIRFLOW-3358
> Project: Apache Airflow
>  Issue Type: Improvement
>  Components: cli
>Affects Versions: 2.0.0
>Reporter: Iuliia Volkova
>Assignee: Iuliia Volkova
>Priority: Major
>
> Hi all! 
> In one of PR: https://github.com/apache/incubator-airflow/pull/4174 we had a 
> talk with Ashb, what will be cool to refactor the cli for getting more 
> testable and readable code.
> I want to prepare POC based on one command with implementation with Click and 
> covering with tests for discussing Airflow Cli architecture.
> Click already exists in Airflow dependencies.
> Main stimulus: 
> - Get more readable and changeable cli - for easy adding command or changing 
> commands
> - Get possible to add more tests 
>  Will be good to know your concerns about such initiative and if there are no 
> disagrees about it, I will be happy to start POC



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[GitHub] codecov-io edited a comment on issue #4189: [AIRFLOW-3346] Add hook and operator for GCP transfer service.

2018-11-16 Thread GitBox
codecov-io edited a comment on issue #4189: [AIRFLOW-3346] Add hook and 
operator for GCP transfer service.
URL: 
https://github.com/apache/incubator-airflow/pull/4189#issuecomment-439172483
 
 
   # 
[Codecov](https://codecov.io/gh/apache/incubator-airflow/pull/4189?src=pr=h1)
 Report
   > Merging 
[#4189](https://codecov.io/gh/apache/incubator-airflow/pull/4189?src=pr=desc)
 into 
[master](https://codecov.io/gh/apache/incubator-airflow/commit/2b707aba3cf7aa78fff81065432e9fbebf3b15ca?src=pr=desc)
 will **not change** coverage.
   > The diff coverage is `n/a`.
   
   [![Impacted file tree 
graph](https://codecov.io/gh/apache/incubator-airflow/pull/4189/graphs/tree.svg?width=650=WdLKlKHOAU=150=pr)](https://codecov.io/gh/apache/incubator-airflow/pull/4189?src=pr=tree)
   
   ```diff
   @@           Coverage Diff           @@
   ##           master    #4189   +/-   ##
   =======================================
     Coverage    77.7%    77.7%           
   =======================================
     Files         199      199           
     Lines       16312    16312           
   =======================================
     Hits        12675    12675           
     Misses       3637     3637           
   ```
   
   
   
   --
   
   [Continue to review full report at 
Codecov](https://codecov.io/gh/apache/incubator-airflow/pull/4189?src=pr=continue).
   > **Legend** - [Click here to learn 
more](https://docs.codecov.io/docs/codecov-delta)
   > `Δ = absolute  (impact)`, `ø = not affected`, `? = missing data`
   > Powered by 
[Codecov](https://codecov.io/gh/apache/incubator-airflow/pull/4189?src=pr=footer).
 Last update 
[2b707ab...137b91d](https://codecov.io/gh/apache/incubator-airflow/pull/4189?src=pr=lastupdated).
 Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments).
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] codecov-io commented on issue #4200: [AIRFLOW-3359] Added customer managed encryption keys as an option to…

2018-11-16 Thread GitBox
codecov-io commented on issue #4200: [AIRFLOW-3359] Added customer managed 
encryption keys as an option to…
URL: 
https://github.com/apache/incubator-airflow/pull/4200#issuecomment-439485743
 
 
   # 
[Codecov](https://codecov.io/gh/apache/incubator-airflow/pull/4200?src=pr=h1)
 Report
   > Merging 
[#4200](https://codecov.io/gh/apache/incubator-airflow/pull/4200?src=pr=desc)
 into 
[master](https://codecov.io/gh/apache/incubator-airflow/commit/2b707aba3cf7aa78fff81065432e9fbebf3b15ca?src=pr=desc)
 will **not change** coverage.
   > The diff coverage is `n/a`.
   
   [![Impacted file tree 
graph](https://codecov.io/gh/apache/incubator-airflow/pull/4200/graphs/tree.svg?width=650=WdLKlKHOAU=150=pr)](https://codecov.io/gh/apache/incubator-airflow/pull/4200?src=pr=tree)
   
   ```diff
   @@           Coverage Diff           @@
   ##           master    #4200   +/-   ##
   =======================================
     Coverage    77.7%    77.7%           
   =======================================
     Files         199      199           
     Lines       16312    16312           
   =======================================
     Hits        12675    12675           
     Misses       3637     3637           
   ```
   
   
   
   --
   
   [Continue to review full report at 
Codecov](https://codecov.io/gh/apache/incubator-airflow/pull/4200?src=pr=continue).
   > **Legend** - [Click here to learn 
more](https://docs.codecov.io/docs/codecov-delta)
   > `Δ = absolute  (impact)`, `ø = not affected`, `? = missing data`
   > Powered by 
[Codecov](https://codecov.io/gh/apache/incubator-airflow/pull/4200?src=pr=footer).
 Last update 
[2b707ab...3a4f7ea](https://codecov.io/gh/apache/incubator-airflow/pull/4200?src=pr=lastupdated).
 Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments).
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] feng-tao commented on issue #3197: [AIRFLOW-2267] Airflow DAG level access

2018-11-16 Thread GitBox
feng-tao commented on issue #3197: [AIRFLOW-2267] Airflow DAG level access
URL: 
https://github.com/apache/incubator-airflow/pull/3197#issuecomment-439476832
 
 
   hey @gauthiermartin, just want to check with you and see if you want to 
contribute the remaining features for this part (e.g. group filtering, etc.)


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[jira] [Commented] (AIRFLOW-3359) Add CMEK as a disk encryption option to the Dataproc operator

2018-11-16 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-3359?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16689689#comment-16689689
 ] 

ASF GitHub Bot commented on AIRFLOW-3359:
-

emailbob opened a new pull request #4200: [AIRFLOW-3359] Added customer managed 
encryption keys as an option to…
URL: https://github.com/apache/incubator-airflow/pull/4200
 
 
   … the Dataproc operator
   
   Make sure you have checked _all_ steps below.
   
   ### Jira
   
   - [ ] My PR addresses the following [Airflow 
Jira](https://issues.apache.org/jira/browse/AIRFLOW/) issues and references 
them in the PR title. For example, "\[AIRFLOW-XXX\] My Airflow PR"
 - https://issues.apache.org/jira/browse/AIRFLOW-XXX
 - In case you are fixing a typo in the documentation you can prepend your 
commit with \[AIRFLOW-XXX\], code changes always need a Jira issue.
   
   ### Description
   
   - [ ] Here are some details about my PR, including screenshots of any UI 
changes:
   
   We have a requirement to use customer managed keys to encrypt disks. This PR 
adds an option so the Dataproc cluster instances are created with disks 
encrypted with customer managed keys.
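
   At the Dataproc API level this corresponds to a KMS key on the cluster's encryption config; a rough sketch of the payload (field names as documented for Dataproc CMEK, which may differ from the operator argument name added in this PR):

{code}
# Rough sketch of the Dataproc ClusterConfig fragment for CMEK disk encryption.
# The project, location, key ring and key names below are placeholders.
cluster_config = {
    "encryptionConfig": {
        "gcePdKmsKeyName": (
            "projects/my-project/locations/us-central1/"
            "keyRings/my-ring/cryptoKeys/my-key"
        )
    }
}
{code}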
   
   ### Tests
   
   - [ ] My PR adds the following unit tests __OR__ does not need testing for 
this extremely good reason:
   
   ### Commits
   
   - [ ] My commits all reference Jira issues in their subject lines, and I 
have squashed multiple commits if they address the same issue. In addition, my 
commits follow the guidelines from "[How to write a good git commit 
message](http://chris.beams.io/posts/git-commit/)":
 1. Subject is separated from body by a blank line
 1. Subject is limited to 50 characters (not including Jira issue reference)
 1. Subject does not end with a period
 1. Subject uses the imperative mood ("add", not "adding")
 1. Body wraps at 72 characters
 1. Body explains "what" and "why", not "how"
   
   ### Documentation
   
   - [ ] In case of new functionality, my PR adds documentation that describes 
how to use it.
 - When adding new operators/hooks/sensors, the autoclass documentation 
generation needs to be added.
   
   ### Code Quality
   
   - [ ] Passes `flake8`
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


> Add CMEK as a disk encryption option to the Dataproc operator
> -
>
> Key: AIRFLOW-3359
> URL: https://issues.apache.org/jira/browse/AIRFLOW-3359
> Project: Apache Airflow
>  Issue Type: New Feature
>  Components: gcp, operators
>Reporter: Bob Lee
>Assignee: Bob Lee
>Priority: Minor
>
> Add customer managed encryption keys as an option to the Dataproc operator



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[GitHub] emailbob opened a new pull request #4200: [AIRFLOW-3359] Added customer managed encryption keys as an option to…

2018-11-16 Thread GitBox
emailbob opened a new pull request #4200: [AIRFLOW-3359] Added customer managed 
encryption keys as an option to…
URL: https://github.com/apache/incubator-airflow/pull/4200
 
 
   … the Dataproc operator
   
   Make sure you have checked _all_ steps below.
   
   ### Jira
   
   - [ ] My PR addresses the following [Airflow 
Jira](https://issues.apache.org/jira/browse/AIRFLOW/) issues and references 
them in the PR title. For example, "\[AIRFLOW-XXX\] My Airflow PR"
 - https://issues.apache.org/jira/browse/AIRFLOW-XXX
 - In case you are fixing a typo in the documentation you can prepend your 
commit with \[AIRFLOW-XXX\], code changes always need a Jira issue.
   
   ### Description
   
   - [ ] Here are some details about my PR, including screenshots of any UI 
changes:
   
   We have a requirement to use customer managed keys to encrypt disks. This PR 
adds an option so the Dataproc cluster instances are created with disks 
encrypted with customer managed keys.
   
   ### Tests
   
   - [ ] My PR adds the following unit tests __OR__ does not need testing for 
this extremely good reason:
   
   ### Commits
   
   - [ ] My commits all reference Jira issues in their subject lines, and I 
have squashed multiple commits if they address the same issue. In addition, my 
commits follow the guidelines from "[How to write a good git commit 
message](http://chris.beams.io/posts/git-commit/)":
 1. Subject is separated from body by a blank line
 1. Subject is limited to 50 characters (not including Jira issue reference)
 1. Subject does not end with a period
 1. Subject uses the imperative mood ("add", not "adding")
 1. Body wraps at 72 characters
 1. Body explains "what" and "why", not "how"
   
   ### Documentation
   
   - [ ] In case of new functionality, my PR adds documentation that describes 
how to use it.
 - When adding new operators/hooks/sensors, the autoclass documentation 
generation needs to be added.
   
   ### Code Quality
   
   - [ ] Passes `flake8`
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] codecov-io edited a comment on issue #4182: [AIRFLOW-3336] Add new TriggerRule that will consider skipped ancestors as success

2018-11-16 Thread GitBox
codecov-io edited a comment on issue #4182: [AIRFLOW-3336] Add new TriggerRule 
that will consider skipped ancestors as success 
URL: 
https://github.com/apache/incubator-airflow/pull/4182#issuecomment-439204135
 
 
   # 
[Codecov](https://codecov.io/gh/apache/incubator-airflow/pull/4182?src=pr=h1)
 Report
   > Merging 
[#4182](https://codecov.io/gh/apache/incubator-airflow/pull/4182?src=pr=desc)
 into 
[master](https://codecov.io/gh/apache/incubator-airflow/commit/2b707aba3cf7aa78fff81065432e9fbebf3b15ca?src=pr=desc)
 will **decrease** coverage by `0.01%`.
   > The diff coverage is `60%`.
   
   [![Impacted file tree 
graph](https://codecov.io/gh/apache/incubator-airflow/pull/4182/graphs/tree.svg?width=650=WdLKlKHOAU=150=pr)](https://codecov.io/gh/apache/incubator-airflow/pull/4182?src=pr=tree)
   
   ```diff
   @@            Coverage Diff             @@
   ##           master    #4182      +/-   ##
   ==========================================
   - Coverage    77.7%   77.69%   -0.02%     
   ==========================================
     Files         199      199             
     Lines       16312    16322      +10     
   ==========================================
   + Hits        12675    12681       +6     
   - Misses       3637     3641       +4     
   ```
   
   
   | [Impacted 
Files](https://codecov.io/gh/apache/incubator-airflow/pull/4182?src=pr=tree) 
| Coverage Δ | |
   |---|---|---|
   | 
[airflow/models.py](https://codecov.io/gh/apache/incubator-airflow/pull/4182/diff?src=pr=tree#diff-YWlyZmxvdy9tb2RlbHMucHk=)
 | `92.33% <ø> (ø)` | :arrow_up: |
   | 
[airflow/utils/trigger\_rule.py](https://codecov.io/gh/apache/incubator-airflow/pull/4182/diff?src=pr=tree#diff-YWlyZmxvdy91dGlscy90cmlnZ2VyX3J1bGUucHk=)
 | `100% <100%> (ø)` | :arrow_up: |
   | 
[airflow/ti\_deps/deps/trigger\_rule\_dep.py](https://codecov.io/gh/apache/incubator-airflow/pull/4182/diff?src=pr=tree#diff-YWlyZmxvdy90aV9kZXBzL2RlcHMvdHJpZ2dlcl9ydWxlX2RlcC5weQ==)
 | `90.14% <55.55%> (-5.03%)` | :arrow_down: |
   
   --
   
   [Continue to review full report at 
Codecov](https://codecov.io/gh/apache/incubator-airflow/pull/4182?src=pr=continue).
   > **Legend** - [Click here to learn 
more](https://docs.codecov.io/docs/codecov-delta)
   > `Δ = absolute  (impact)`, `ø = not affected`, `? = missing data`
   > Powered by 
[Codecov](https://codecov.io/gh/apache/incubator-airflow/pull/4182?src=pr=footer).
 Last update 
[2b707ab...62e8227](https://codecov.io/gh/apache/incubator-airflow/pull/4182?src=pr=lastupdated).
 Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments).
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[jira] [Created] (AIRFLOW-3359) Add CMEK as a disk encryption option to the Dataproc operator

2018-11-16 Thread Bob Lee (JIRA)
Bob Lee created AIRFLOW-3359:


 Summary: Add CMEK as a disk encryption option to the Dataproc 
operator
 Key: AIRFLOW-3359
 URL: https://issues.apache.org/jira/browse/AIRFLOW-3359
 Project: Apache Airflow
  Issue Type: New Feature
  Components: gcp, operators
Reporter: Bob Lee
Assignee: Bob Lee


Add customer managed encryption keys as an option to the Dataproc operator



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[GitHub] rmn36 commented on issue #4182: [AIRFLOW-3336] Add new TriggerRule that will consider skipped ancestors as success

2018-11-16 Thread GitBox
rmn36 commented on issue #4182: [AIRFLOW-3336] Add new TriggerRule that will 
consider skipped ancestors as success 
URL: 
https://github.com/apache/incubator-airflow/pull/4182#issuecomment-439445281
 
 
   @dlamblin updated the name to be `none_failed`; others had requested that as 
well. Also fixed the documentation issue. I'm not sure exactly what you mean 
about `example_short_circuit_operator.py` though. Can you explain further 
and I'll look into it?
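
   As a usage sketch (assuming the rule is exposed under the name `none_failed`, per the discussion above):

```python
# Sketch only: assumes the new trigger rule is exposed as "none_failed".
from datetime import datetime

from airflow import DAG
from airflow.operators.dummy_operator import DummyOperator

with DAG("none_failed_example", start_date=datetime(2018, 1, 1),
         schedule_interval=None) as dag:
    branch_a = DummyOperator(task_id="branch_a")  # may be skipped at runtime
    branch_b = DummyOperator(task_id="branch_b")
    # Runs as long as no direct upstream task failed; skipped parents are OK.
    join = DummyOperator(task_id="join", trigger_rule="none_failed")

    branch_a >> join
    branch_b >> join
```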


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[jira] [Commented] (AIRFLOW-987) `airflow kerberos` ignores --keytab and --principal arguments

2018-11-16 Thread Iuliia Volkova (JIRA)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-987?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16689493#comment-16689493
 ] 

Iuliia Volkova commented on AIRFLOW-987:


[~pratap20] this issue is about the command line; you define your settings in 
the config file.
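
For reference, these settings live in the [kerberos] section of airflow.cfg; a sketch along the lines of the default config template (the values below are placeholders for your environment):

{code}
[kerberos]
ccache = /tmp/airflow_krb5_ccache
principal = airflow
reinit_frequency = 3600
kinit_path = kinit
keytab = airflow.keytab
{code}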

> `airflow kerberos` ignores --keytab and --principal arguments
> -
>
> Key: AIRFLOW-987
> URL: https://issues.apache.org/jira/browse/AIRFLOW-987
> Project: Apache Airflow
>  Issue Type: Bug
>  Components: security
>Affects Versions: 1.8.0
> Environment: 1.8-rc5
>Reporter: Ruslan Dautkhanov
>Assignee: Pratap20
>Priority: Major
>  Labels: easyfix, kerberos, security
>
> No matter which arguments I pass to `airflow kerberos`, 
> it always executes as `kinit -r 3600m -k -t airflow.keytab -c 
> /tmp/airflow_krb5_ccache airflow`
> So it fails with the expected "kinit: Keytab contains no suitable keys for 
> airf...@corp.some.com while getting initial credentials"
> Tried different arguments, -kt and --keytab, here's one of the runs (some 
> lines wrapped for readability):
> {noformat}
> $ airflow kerberos -kt /home/rdautkha/.keytab rdautkha...@corp.some.com
> [2017-03-14 23:50:11,523] {__init__.py:57} INFO - Using executor LocalExecutor
> [2017-03-14 23:50:12,069] {kerberos.py:43} INFO - Reinitting kerberos from 
> keytab: 
> kinit -r 3600m -k -t airflow.keytab -c /tmp/airflow_krb5_ccache airflow
> [2017-03-14 23:50:12,080] {kerberos.py:55} ERROR -
>  Couldn't reinit from keytab! `kinit' exited with 1.
> kinit: Keytab contains no suitable keys for airf...@corp.some.com 
> while getting initial credentials
> {noformat}
> 1.8-rc5



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (AIRFLOW-1945) Pass --autoscale to celery workers

2018-11-16 Thread Iuliia Volkova (JIRA)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-1945?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16689475#comment-16689475
 ] 

Iuliia Volkova commented on AIRFLOW-1945:
-

[~ashb], [~Fokko], please close the task, PR was already merged - 
https://github.com/apache/incubator-airflow/pull/3989/files 

> Pass --autoscale to celery workers
> --
>
> Key: AIRFLOW-1945
> URL: https://issues.apache.org/jira/browse/AIRFLOW-1945
> Project: Apache Airflow
>  Issue Type: Improvement
>  Components: celery, cli
>Reporter: Michael O.
>Assignee: Sai Phanindhra
>Priority: Trivial
>  Labels: easyfix
>   Original Estimate: 0.5h
>  Remaining Estimate: 0.5h
>
> Celery supports autoscaling of the worker pool size (number of tasks that can 
> parallelize within one worker node).  I'd like to propose to support passing 
> the --autoscale parameter to {{airflow worker}}.
> Since this is a trivial change, I am not sure if there's any reason why it is 
> not supported already.
> For example
> {{airflow worker --concurrency=4}} will set a fixed pool size of 4.
> With minimal changes in 
> [https://github.com/apache/incubator-airflow/blob/4ce4faaeae7a76d97defcf9a9d3304ac9d78b9bd/airflow/bin/cli.py#L855]
>  it could support
> {{airflow worker --autoscale=2,10}} to set an autoscaled pool size of 2 to 10
> Some references:
> * 
> http://docs.celeryproject.org/en/latest/internals/reference/celery.worker.autoscale.html
> * 
> https://github.com/apache/incubator-airflow/blob/4ce4faaeae7a76d97defcf9a9d3304ac9d78b9bd/airflow/bin/cli.py#L855



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (AIRFLOW-390) [AIRFLOW-Don't load example dags by default]

2018-11-16 Thread Iuliia Volkova (JIRA)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-390?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16689486#comment-16689486
 ] 

Iuliia Volkova commented on AIRFLOW-390:


[~ashb], [~Fokko], [~bolke], any concerns on this scope? Could we set up 
'load_examples = False' by default?

> [AIRFLOW-Don't load example dags by default]
> 
>
> Key: AIRFLOW-390
> URL: https://issues.apache.org/jira/browse/AIRFLOW-390
> Project: Apache Airflow
>  Issue Type: Improvement
>Reporter: Sunny Sun
>Priority: Trivial
>  Labels: easyfix
>   Original Estimate: 24h
>  Remaining Estimate: 24h
>
> Load examples should by default be set to False, so they are not 
> automatically deployed into production environments. This is especially heavy 
> because the twitter example dag requires Hive, which users may or may not use 
> in their own deployments.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[GitHub] codecov-io edited a comment on issue #4090: [AIRFLOW-3250] Fix for Redis Hook for not authorised connection calls

2018-11-16 Thread GitBox
codecov-io edited a comment on issue #4090: [AIRFLOW-3250] Fix for Redis Hook 
for not authorised connection calls
URL: 
https://github.com/apache/incubator-airflow/pull/4090#issuecomment-437704538
 
 
   # 
[Codecov](https://codecov.io/gh/apache/incubator-airflow/pull/4090?src=pr=h1)
 Report
   > Merging 
[#4090](https://codecov.io/gh/apache/incubator-airflow/pull/4090?src=pr=desc)
 into 
[master](https://codecov.io/gh/apache/incubator-airflow/commit/2b707aba3cf7aa78fff81065432e9fbebf3b15ca?src=pr=desc)
 will **decrease** coverage by `0.04%`.
   > The diff coverage is `n/a`.
   
   [![Impacted file tree 
graph](https://codecov.io/gh/apache/incubator-airflow/pull/4090/graphs/tree.svg?width=650=WdLKlKHOAU=150=pr)](https://codecov.io/gh/apache/incubator-airflow/pull/4090?src=pr=tree)
   
   ```diff
   @@            Coverage Diff             @@
   ##           master    #4090      +/-   ##
   ==========================================
   - Coverage    77.7%   77.66%   -0.05%     
   ==========================================
     Files         199      199             
     Lines       16312    16273      -39     
   ==========================================
   - Hits        12675    12638      -37     
   + Misses       3637     3635       -2     
   ```
   
   
   | [Impacted 
Files](https://codecov.io/gh/apache/incubator-airflow/pull/4090?src=pr=tree) 
| Coverage Δ | |
   |---|---|---|
   | 
[airflow/www\_rbac/security.py](https://codecov.io/gh/apache/incubator-airflow/pull/4090/diff?src=pr=tree#diff-YWlyZmxvdy93d3dfcmJhYy9zZWN1cml0eS5weQ==)
 | `92.61% <0%> (-0.15%)` | :arrow_down: |
   | 
[airflow/models.py](https://codecov.io/gh/apache/incubator-airflow/pull/4090/diff?src=pr=tree#diff-YWlyZmxvdy9tb2RlbHMucHk=)
 | `92.19% <0%> (-0.15%)` | :arrow_down: |
   | 
[airflow/www\_rbac/app.py](https://codecov.io/gh/apache/incubator-airflow/pull/4090/diff?src=pr=tree#diff-YWlyZmxvdy93d3dfcmJhYy9hcHAucHk=)
 | `97.05% <0%> (-0.03%)` | :arrow_down: |
   | 
[airflow/operators/bash\_operator.py](https://codecov.io/gh/apache/incubator-airflow/pull/4090/diff?src=pr=tree#diff-YWlyZmxvdy9vcGVyYXRvcnMvYmFzaF9vcGVyYXRvci5weQ==)
 | `91.37% <0%> (ø)` | :arrow_up: |
   | 
[airflow/operators/docker\_operator.py](https://codecov.io/gh/apache/incubator-airflow/pull/4090/diff?src=pr=tree#diff-YWlyZmxvdy9vcGVyYXRvcnMvZG9ja2VyX29wZXJhdG9yLnB5)
 | `97.67% <0%> (ø)` | :arrow_up: |
   | 
[airflow/bin/cli.py](https://codecov.io/gh/apache/incubator-airflow/pull/4090/diff?src=pr=tree#diff-YWlyZmxvdy9iaW4vY2xpLnB5)
 | `64.82% <0%> (+0.23%)` | :arrow_up: |
   
   --
   
   [Continue to review full report at 
Codecov](https://codecov.io/gh/apache/incubator-airflow/pull/4090?src=pr=continue).
   > **Legend** - [Click here to learn 
more](https://docs.codecov.io/docs/codecov-delta)
   > `Δ = absolute  (impact)`, `ø = not affected`, `? = missing data`
   > Powered by 
[Codecov](https://codecov.io/gh/apache/incubator-airflow/pull/4090?src=pr=footer).
 Last update 
[2b707ab...f68a61a](https://codecov.io/gh/apache/incubator-airflow/pull/4090?src=pr=lastupdated).
 Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments).
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] codecov-io edited a comment on issue #4090: [AIRFLOW-3250] Fix for Redis Hook for not authorised connection calls

2018-11-16 Thread GitBox
codecov-io edited a comment on issue #4090: [AIRFLOW-3250] Fix for Redis Hook 
for not authorised connection calls
URL: 
https://github.com/apache/incubator-airflow/pull/4090#issuecomment-437704538
 
 
   # 
[Codecov](https://codecov.io/gh/apache/incubator-airflow/pull/4090?src=pr=h1)
 Report
   > Merging 
[#4090](https://codecov.io/gh/apache/incubator-airflow/pull/4090?src=pr=desc)
 into 
[master](https://codecov.io/gh/apache/incubator-airflow/commit/2b707aba3cf7aa78fff81065432e9fbebf3b15ca?src=pr=desc)
 will **decrease** coverage by `0.04%`.
   > The diff coverage is `n/a`.
   
   [![Impacted file tree 
graph](https://codecov.io/gh/apache/incubator-airflow/pull/4090/graphs/tree.svg?width=650=WdLKlKHOAU=150=pr)](https://codecov.io/gh/apache/incubator-airflow/pull/4090?src=pr=tree)
   
   ```diff
   @@            Coverage Diff             @@
   ##           master    #4090      +/-   ##
   ==========================================
   - Coverage    77.7%   77.66%   -0.05%     
   ==========================================
     Files         199      199             
     Lines       16312    16273      -39     
   ==========================================
   - Hits        12675    12638      -37     
   + Misses       3637     3635       -2     
   ```
   
   
   | [Impacted 
Files](https://codecov.io/gh/apache/incubator-airflow/pull/4090?src=pr=tree) 
| Coverage Δ | |
   |---|---|---|
   | 
[airflow/www\_rbac/security.py](https://codecov.io/gh/apache/incubator-airflow/pull/4090/diff?src=pr=tree#diff-YWlyZmxvdy93d3dfcmJhYy9zZWN1cml0eS5weQ==)
 | `92.61% <0%> (-0.15%)` | :arrow_down: |
   | 
[airflow/models.py](https://codecov.io/gh/apache/incubator-airflow/pull/4090/diff?src=pr=tree#diff-YWlyZmxvdy9tb2RlbHMucHk=)
 | `92.19% <0%> (-0.15%)` | :arrow_down: |
   | 
[airflow/www\_rbac/app.py](https://codecov.io/gh/apache/incubator-airflow/pull/4090/diff?src=pr=tree#diff-YWlyZmxvdy93d3dfcmJhYy9hcHAucHk=)
 | `97.05% <0%> (-0.03%)` | :arrow_down: |
   | 
[airflow/operators/bash\_operator.py](https://codecov.io/gh/apache/incubator-airflow/pull/4090/diff?src=pr=tree#diff-YWlyZmxvdy9vcGVyYXRvcnMvYmFzaF9vcGVyYXRvci5weQ==)
 | `91.37% <0%> (ø)` | :arrow_up: |
   | 
[airflow/operators/docker\_operator.py](https://codecov.io/gh/apache/incubator-airflow/pull/4090/diff?src=pr=tree#diff-YWlyZmxvdy9vcGVyYXRvcnMvZG9ja2VyX29wZXJhdG9yLnB5)
 | `97.67% <0%> (ø)` | :arrow_up: |
   | 
[airflow/bin/cli.py](https://codecov.io/gh/apache/incubator-airflow/pull/4090/diff?src=pr=tree#diff-YWlyZmxvdy9iaW4vY2xpLnB5)
 | `64.82% <0%> (+0.23%)` | :arrow_up: |
   
   --
   
   [Continue to review full report at 
Codecov](https://codecov.io/gh/apache/incubator-airflow/pull/4090?src=pr=continue).
   > **Legend** - [Click here to learn 
more](https://docs.codecov.io/docs/codecov-delta)
   > `Δ = absolute  (impact)`, `ø = not affected`, `? = missing data`
   > Powered by 
[Codecov](https://codecov.io/gh/apache/incubator-airflow/pull/4090?src=pr=footer).
 Last update 
[2b707ab...f68a61a](https://codecov.io/gh/apache/incubator-airflow/pull/4090?src=pr=lastupdated).
 Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments).
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[jira] [Commented] (AIRFLOW-801) Outdated docstring on baseclass

2018-11-16 Thread Iuliia Volkova (JIRA)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-801?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16689468#comment-16689468
 ] 

Iuliia Volkova commented on AIRFLOW-801:


[~jackjack10] The PR was merged and the changes are in master, so the task should 
be closed. [~dseisun] [~ashb]

> Outdated docstring on baseclass
> ---
>
> Key: AIRFLOW-801
> URL: https://issues.apache.org/jira/browse/AIRFLOW-801
> Project: Apache Airflow
>  Issue Type: Bug
>Reporter: Daniel Seisun
>Assignee: Kengo Seki
>Priority: Trivial
>
> The docstring of the BaseOperator still makes reference to it inheriting from 
> SQL Alchemy's Base class, which it no longer does. 



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (AIRFLOW-3307) Update insecure node dependencies

2018-11-16 Thread Iuliia Volkova (JIRA)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-3307?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16689460#comment-16689460
 ] 

Iuliia Volkova commented on AIRFLOW-3307:
-

[~jmcarp], please do not forget to close the task once the PR is merged :) Thank you!

> Update insecure node dependencies
> -
>
> Key: AIRFLOW-3307
> URL: https://issues.apache.org/jira/browse/AIRFLOW-3307
> Project: Apache Airflow
>  Issue Type: Bug
>Reporter: Josh Carp
>Assignee: Josh Carp
>Priority: Trivial
>
> `npm audit` shows some node dependencies that are out of date and potentially 
> insecure. We should update them with `npm audit fix`.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (AIRFLOW-3306) Disable unused flask-sqlalchemy modification tracking

2018-11-16 Thread Iuliia Volkova (JIRA)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-3306?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16689458#comment-16689458
 ] 

Iuliia Volkova commented on AIRFLOW-3306:
-

[~jmcarp], please do not forget to close the task once the PR is merged :) Thank you!

> Disable unused flask-sqlalchemy modification tracking
> -
>
> Key: AIRFLOW-3306
> URL: https://issues.apache.org/jira/browse/AIRFLOW-3306
> Project: Apache Airflow
>  Issue Type: Bug
>Reporter: Josh Carp
>Assignee: Josh Carp
>Priority: Trivial
>
> By default, flask-sqlalchemy tracks model changes for its event system, which 
> adds some overhead. Since I don't think we're using the flask-sqlalchemy 
> event system, we should be able to turn off modification tracking and improve 
> performance.
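
For reference, the knob in question is Flask-SQLAlchemy's standard setting; a minimal sketch outside of Airflow:

{code}
# Minimal Flask app showing the setting in isolation.
from flask import Flask
from flask_sqlalchemy import SQLAlchemy

app = Flask(__name__)
app.config["SQLALCHEMY_DATABASE_URI"] = "sqlite:///:memory:"
# Turn off the event-system bookkeeping that Flask-SQLAlchemy enables (with a
# warning) by default; nothing in this app consumes those modification events.
app.config["SQLALCHEMY_TRACK_MODIFICATIONS"] = False

db = SQLAlchemy(app)
{code}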



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (AIRFLOW-1822) Add gaiohttp and gthread gunicorn workerclass option in cli

2018-11-16 Thread Iuliia Volkova (JIRA)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-1822?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16689450#comment-16689450
 ] 

Iuliia Volkova commented on AIRFLOW-1822:
-

covered in this PR: https://github.com/apache/incubator-airflow/pull/4174 

> Add gaiohttp and gthread gunicorn workerclass option in cli
> ---
>
> Key: AIRFLOW-1822
> URL: https://issues.apache.org/jira/browse/AIRFLOW-1822
> Project: Apache Airflow
>  Issue Type: Improvement
>  Components: webserver
>Reporter: Sanjay Pillai
>Assignee: Iuliia Volkova
>Priority: Minor
>
> The minimum gunicorn version has been updated to 19.4.0; 
> we need to add CLI support for the gthread and gaiohttp worker classes.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (AIRFLOW-1592) Add keep-alive argument supported by gunicorn backend to the airflow configuration

2018-11-16 Thread Iuliia Volkova (JIRA)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-1592?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16689449#comment-16689449
 ] 

Iuliia Volkova commented on AIRFLOW-1592:
-

covered in this PR: https://github.com/apache/incubator-airflow/pull/4174 

> Add keep-alive argument supported by gunicorn backend to the airflow 
> configuration
> --
>
> Key: AIRFLOW-1592
> URL: https://issues.apache.org/jira/browse/AIRFLOW-1592
> Project: Apache Airflow
>  Issue Type: Improvement
>Reporter: Demian Ginther
>Assignee: Iuliia Volkova
>Priority: Minor
>
> The --keep-alive option is necessary for gunicorn to function properly with 
> AWS ELBs, as gunicorn appears to have an issue with the ELB timeouts as set 
> by default.
> In addition, it makes no sense to provide a wrapper for a program but not 
> allow all configuration options to be set.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Resolved] (AIRFLOW-3251) KubernetesPodOperator does not use 'image_pull_secrets' argument

2018-11-16 Thread Ash Berlin-Taylor (JIRA)


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-3251?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Ash Berlin-Taylor resolved AIRFLOW-3251.

   Resolution: Fixed
Fix Version/s: 1.10.2
   2.0.0

> KubernetesPodOperator does not use 'image_pull_secrets' argument
> 
>
> Key: AIRFLOW-3251
> URL: https://issues.apache.org/jira/browse/AIRFLOW-3251
> Project: Apache Airflow
>  Issue Type: Bug
>Affects Versions: 1.10.0
>Reporter: Padarn Wilson
>Assignee: Padarn Wilson
>Priority: Minor
> Fix For: 2.0.0, 1.10.2
>
>
> The KubernetesPodOperator accepts argument `image_pull_secrets`, and the Pod 
> object can use this variable when deploying the pod, but currently the 
> argument to the operator is not added to the Pod before it is launched
> Relevant code in 
> `incubator-airflow/airflow/contrib/operators/kubernetes_pod_operator.py`. 



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[GitHub] ashb closed pull request #4188: [AIRFLOW-3251] KubernetesPodOperator does not use 'image_pull_secrets…

2018-11-16 Thread GitBox
ashb closed pull request #4188: [AIRFLOW-3251] KubernetesPodOperator does not 
use 'image_pull_secrets…
URL: https://github.com/apache/incubator-airflow/pull/4188
 
 
   

This is a PR merged from a forked repository.
As GitHub hides the original diff on merge, it is displayed below for
the sake of provenance:

As this is a foreign pull request (from a fork), the diff is supplied
below (as it won't show otherwise due to GitHub magic):

diff --git a/airflow/contrib/kubernetes/pod.py 
b/airflow/contrib/kubernetes/pod.py
index 5de23ff5bd..bad5caa738 100644
--- a/airflow/contrib/kubernetes/pod.py
+++ b/airflow/contrib/kubernetes/pod.py
@@ -54,6 +54,10 @@ class Pod:
     :type result: any
     :param image_pull_policy: Specify a policy to cache or always pull an image
     :type image_pull_policy: str
+    :param image_pull_secrets: Any image pull secrets to be given to the pod.
+                               If more than one secret is required, provide a
+                               comma separated list: secret_a,secret_b
+    :type image_pull_secrets: str
     :param affinity: A dict containing a group of affinity scheduling rules
     :type affinity: dict
     """
diff --git a/airflow/contrib/operators/kubernetes_pod_operator.py 
b/airflow/contrib/operators/kubernetes_pod_operator.py
index d4f1013876..99c6da11b3 100644
--- a/airflow/contrib/operators/kubernetes_pod_operator.py
+++ b/airflow/contrib/operators/kubernetes_pod_operator.py
@@ -45,6 +45,12 @@ class KubernetesPodOperator(BaseOperator):
     :param arguments: arguments of to the entrypoint. (templated)
         The docker image's CMD is used if this is not provided.
     :type arguments: list of str
+    :param image_pull_policy: Specify a policy to cache or always pull an image
+    :type image_pull_policy: str
+    :param image_pull_secrets: Any image pull secrets to be given to the pod.
+                               If more than one secret is required, provide a
+                               comma separated list: secret_a,secret_b
+    :type image_pull_secrets: str
     :param volume_mounts: volumeMounts for launched pod
     :type volume_mounts: list of VolumeMount
     :param volumes: volumes for launched pod. Includes ConfigMaps and PersistentVolumes
@@ -108,6 +114,7 @@ def execute(self, context):
             pod.secrets = self.secrets
             pod.envs = self.env_vars
             pod.image_pull_policy = self.image_pull_policy
+            pod.image_pull_secrets = self.image_pull_secrets
             pod.annotations = self.annotations
             pod.resources = self.resources
             pod.affinity = self.affinity
diff --git a/tests/contrib/minikube/test_kubernetes_pod_operator.py 
b/tests/contrib/minikube/test_kubernetes_pod_operator.py
index f39fcdb03d..f808a2f47f 100644
--- a/tests/contrib/minikube/test_kubernetes_pod_operator.py
+++ b/tests/contrib/minikube/test_kubernetes_pod_operator.py
@@ -84,6 +84,29 @@ def test_config_path(self, client_mock, launcher_mock):
                                cluster_context='default',
                                config_file=file_path)
 
+    @mock.patch("airflow.contrib.kubernetes.pod_launcher.PodLauncher.run_pod")
+    @mock.patch("airflow.contrib.kubernetes.kube_client.get_kube_client")
+    def test_image_pull_secrets_correctly_set(self, client_mock, launcher_mock):
+        from airflow.utils.state import State
+
+        fake_pull_secrets = "fakeSecret"
+        k = KubernetesPodOperator(
+            namespace='default',
+            image="ubuntu:16.04",
+            cmds=["bash", "-cx"],
+            arguments=["echo 10"],
+            labels={"foo": "bar"},
+            name="test",
+            task_id="task",
+            image_pull_secrets=fake_pull_secrets,
+            in_cluster=False,
+            cluster_context='default'
+        )
+        launcher_mock.return_value = (State.SUCCESS, None)
+        k.execute(None)
+        self.assertEqual(launcher_mock.call_args[0][0].image_pull_secrets,
+                         fake_pull_secrets)
+
     @staticmethod
     def test_working_pod():
         k = KubernetesPodOperator(
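
A usage sketch of the now-honoured argument (image, registry and secret names below are placeholders; the secrets must already exist in the pod's namespace):

```python
# Sketch: pass one or more comma-separated Kubernetes image pull secrets.
from airflow.contrib.operators.kubernetes_pod_operator import KubernetesPodOperator

pull_private_image = KubernetesPodOperator(
    task_id="pull_private_image",
    name="pull-private-image",
    namespace="default",
    image="registry.example.com/team/app:latest",
    cmds=["bash", "-cx"],
    arguments=["echo hello"],
    image_pull_secrets="regcred_a,regcred_b",  # comma separated list of secrets
    in_cluster=False,
    cluster_context="default",
)
```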


 


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] codecov-io edited a comment on issue #4199: [AIRFLOW-251] Add option SQL_ALCHEMY_SCHEMA parameter to use SQL Serv…

2018-11-16 Thread GitBox
codecov-io edited a comment on issue #4199: [AIRFLOW-251] Add option 
SQL_ALCHEMY_SCHEMA parameter to use SQL Serv…
URL: 
https://github.com/apache/incubator-airflow/pull/4199#issuecomment-439390637
 
 
   # 
[Codecov](https://codecov.io/gh/apache/incubator-airflow/pull/4199?src=pr=h1)
 Report
   > Merging 
[#4199](https://codecov.io/gh/apache/incubator-airflow/pull/4199?src=pr=desc)
 into 
[master](https://codecov.io/gh/apache/incubator-airflow/commit/8668ef869d3d844dac746ec88609d3710a1264ab?src=pr=desc)
 will **increase** coverage by `<.01%`.
   > The diff coverage is `75%`.
   
   [![Impacted file tree 
graph](https://codecov.io/gh/apache/incubator-airflow/pull/4199/graphs/tree.svg?width=650=WdLKlKHOAU=150=pr)](https://codecov.io/gh/apache/incubator-airflow/pull/4199?src=pr=tree)
   
   ```diff
   @@            Coverage Diff            @@
   ##           master    #4199     +/-   ##
   =========================================
   + Coverage   77.69%    77.7%   +<.01%    
   =========================================
     Files         199      199            
     Lines       16309    16312       +3    
   =========================================
   + Hits        12672    12675       +3    
     Misses       3637     3637            
   ```
   
   
   | [Impacted 
Files](https://codecov.io/gh/apache/incubator-airflow/pull/4199?src=pr=tree) 
| Coverage Δ | |
   |---|---|---|
   | 
[airflow/models.py](https://codecov.io/gh/apache/incubator-airflow/pull/4199/diff?src=pr=tree#diff-YWlyZmxvdy9tb2RlbHMucHk=)
 | `92.33% <75%> (ø)` | :arrow_up: |
   
   --
   
   [Continue to review full report at 
Codecov](https://codecov.io/gh/apache/incubator-airflow/pull/4199?src=pr=continue).
   > **Legend** - [Click here to learn 
more](https://docs.codecov.io/docs/codecov-delta)
   > `Δ = absolute  (impact)`, `ø = not affected`, `? = missing data`
   > Powered by 
[Codecov](https://codecov.io/gh/apache/incubator-airflow/pull/4199?src=pr=footer).
 Last update 
[8668ef8...e7c1ddc](https://codecov.io/gh/apache/incubator-airflow/pull/4199?src=pr=lastupdated).
 Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments).
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] codecov-io commented on issue #4199: [AIRFLOW-251] Add option SQL_ALCHEMY_SCHEMA parameter to use SQL Serv…

2018-11-16 Thread GitBox
codecov-io commented on issue #4199: [AIRFLOW-251] Add option 
SQL_ALCHEMY_SCHEMA parameter to use SQL Serv…
URL: 
https://github.com/apache/incubator-airflow/pull/4199#issuecomment-439390637
 
 
   # 
[Codecov](https://codecov.io/gh/apache/incubator-airflow/pull/4199?src=pr=h1)
 Report
   > Merging 
[#4199](https://codecov.io/gh/apache/incubator-airflow/pull/4199?src=pr=desc)
 into 
[master](https://codecov.io/gh/apache/incubator-airflow/commit/8668ef869d3d844dac746ec88609d3710a1264ab?src=pr=desc)
 will **increase** coverage by `<.01%`.
   > The diff coverage is `75%`.
   
   [![Impacted file tree 
graph](https://codecov.io/gh/apache/incubator-airflow/pull/4199/graphs/tree.svg?width=650=WdLKlKHOAU=150=pr)](https://codecov.io/gh/apache/incubator-airflow/pull/4199?src=pr=tree)
   
   ```diff
   @@            Coverage Diff            @@
   ##           master    #4199      +/-   ##
   ==========================================
   + Coverage   77.69%    77.7%    +<.01%
   ==========================================
     Files         199      199
     Lines       16309    16312       +3
   ==========================================
   + Hits        12672    12675       +3
     Misses       3637     3637
   ```
   
   
   | [Impacted 
Files](https://codecov.io/gh/apache/incubator-airflow/pull/4199?src=pr=tree) 
| Coverage Δ | |
   |---|---|---|
   | 
[airflow/models.py](https://codecov.io/gh/apache/incubator-airflow/pull/4199/diff?src=pr=tree#diff-YWlyZmxvdy9tb2RlbHMucHk=)
 | `92.33% <75%> (ø)` | :arrow_up: |
   
   --
   
   [Continue to review full report at 
Codecov](https://codecov.io/gh/apache/incubator-airflow/pull/4199?src=pr=continue).
   > **Legend** - [Click here to learn 
more](https://docs.codecov.io/docs/codecov-delta)
   > `Δ = absolute  (impact)`, `ø = not affected`, `? = missing data`
   > Powered by 
[Codecov](https://codecov.io/gh/apache/incubator-airflow/pull/4199?src=pr=footer).
 Last update 
[8668ef8...e7c1ddc](https://codecov.io/gh/apache/incubator-airflow/pull/4199?src=pr=lastupdated).
 Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments).
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] phani8996 commented on issue #4111: [AIRFLOW-3266] Add AWS Athena Operator and hook

2018-11-16 Thread GitBox
phani8996 commented on issue #4111: [AIRFLOW-3266] Add AWS Athena Operator and 
hook
URL: 
https://github.com/apache/incubator-airflow/pull/4111#issuecomment-439390099
 
 
   Thank you very much @ashb for guiding me along the right path. Thanks 
@fokko @kaxil @ckljohn @XD-DENG for reviewing this PR. 


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[jira] [Commented] (AIRFLOW-251) Add optional parameter SQL_ALCHEMY_SCHEMA to control schema for metadata repository

2018-11-16 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-251?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16689393#comment-16689393
 ] 

ASF GitHub Bot commented on AIRFLOW-251:


ashb closed pull request #4199: [AIRFLOW-251] Add option SQL_ALCHEMY_SCHEMA 
parameter to use SQL Serv…
URL: https://github.com/apache/incubator-airflow/pull/4199
 
 
   

This is a PR merged from a forked repository.
As GitHub hides the original diff on merge, it is displayed below for
the sake of provenance:

As this is a foreign pull request (from a fork), the diff is supplied
below (as it won't show otherwise due to GitHub magic):

diff --git a/airflow/config_templates/default_airflow.cfg 
b/airflow/config_templates/default_airflow.cfg
index 4d73fdf51d..a9473178c1 100644
--- a/airflow/config_templates/default_airflow.cfg
+++ b/airflow/config_templates/default_airflow.cfg
@@ -106,6 +106,10 @@ sql_alchemy_pool_recycle = 1800
 # disconnects. Setting this to 0 disables retries.
 sql_alchemy_reconnect_timeout = 300
 
+# The schema to use for the metadata database
+# SqlAlchemy supports databases with the concept of multiple schemas.
+sql_alchemy_schema =
+
 # The amount of parallelism as a setting to the executor. This defines
 # the max number of task instances that should run simultaneously
 # on this airflow installation
diff --git a/airflow/models.py b/airflow/models.py
index 92440a81a9..bb068499fe 100755
--- a/airflow/models.py
+++ b/airflow/models.py
@@ -64,7 +64,7 @@
 
 from sqlalchemy import (
 Boolean, Column, DateTime, Float, ForeignKey, ForeignKeyConstraint, Index,
-Integer, LargeBinary, PickleType, String, Text, UniqueConstraint,
+Integer, LargeBinary, PickleType, String, Text, UniqueConstraint, MetaData,
 and_, asc, func, or_, true as sqltrue
 )
 from sqlalchemy.ext.declarative import declarative_base, declared_attr
@@ -108,7 +108,13 @@
 
 install_aliases()
 
-Base = declarative_base()
+SQL_ALCHEMY_SCHEMA = configuration.get('core', 'SQL_ALCHEMY_SCHEMA')
+
+if not SQL_ALCHEMY_SCHEMA or SQL_ALCHEMY_SCHEMA.isspace():
+    Base = declarative_base()
+else:
+    Base = declarative_base(metadata=MetaData(schema=SQL_ALCHEMY_SCHEMA))
+
 ID_LEN = 250
 XCOM_RETURN_KEY = 'return_value'
 


 


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


> Add optional parameter SQL_ALCHEMY_SCHEMA to control schema for metadata 
> repository
> ---
>
> Key: AIRFLOW-251
> URL: https://issues.apache.org/jira/browse/AIRFLOW-251
> Project: Apache Airflow
>  Issue Type: Improvement
>  Components: core
>Reporter: Ed Parcell
>Assignee: Iuliia Volkova
>Priority: Minor
> Fix For: 2.0.0
>
>
> Using SQL Server as a database for metadata, it is preferable to group all 
> Airflow tables into a separate schema, rather than using dbo. I propose 
> adding an optional parameter SQL_ALCHEMY_SCHEMA to control this.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Resolved] (AIRFLOW-3266) AWS Athena Operator in Airflow

2018-11-16 Thread Ash Berlin-Taylor (JIRA)


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-3266?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Ash Berlin-Taylor resolved AIRFLOW-3266.

   Resolution: Fixed
Fix Version/s: 2.0.0

> AWS Athena Operator in Airflow
> --
>
> Key: AIRFLOW-3266
> URL: https://issues.apache.org/jira/browse/AIRFLOW-3266
> Project: Apache Airflow
>  Issue Type: New Feature
>  Components: aws
>Affects Versions: 1.10.0
>Reporter: Sai Phanindhra
>Assignee: Sai Phanindhra
>Priority: Minor
> Fix For: 2.0.0
>
>
> There is no official Athena operator in Airflow as of now. One either has to 
> do it using boto3 in a Python operator or using the AWS CLI in a Bash 
> operator. Neither of these takes care of the full life cycle of the query. 
> Create an Athena operator and hook to submit a Presto query and update the 
> task based on the state of the submitted query.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (AIRFLOW-3266) AWS Athena Operator in Airflow

2018-11-16 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-3266?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16689398#comment-16689398
 ] 

ASF GitHub Bot commented on AIRFLOW-3266:
-

ashb closed pull request #4111: [AIRFLOW-3266] Add AWS Athena Operator and hook
URL: https://github.com/apache/incubator-airflow/pull/4111
 
 
   

This is a PR merged from a forked repository.
As GitHub hides the original diff on merge, it is displayed below for
the sake of provenance:

As this is a foreign pull request (from a fork), the diff is supplied
below (as it won't show otherwise due to GitHub magic):

diff --git a/airflow/contrib/hooks/aws_athena_hook.py 
b/airflow/contrib/hooks/aws_athena_hook.py
new file mode 100644
index 00..f11ff23c51
--- /dev/null
+++ b/airflow/contrib/hooks/aws_athena_hook.py
@@ -0,0 +1,150 @@
+# -*- coding: utf-8 -*-
+#
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+
+from time import sleep
+from airflow.contrib.hooks.aws_hook import AwsHook
+
+
+class AWSAthenaHook(AwsHook):
+    """
+    Interact with AWS Athena to run, poll queries and return query results
+
+    :param aws_conn_id: aws connection to use.
+    :type aws_conn_id: str
+    :param sleep_time: Time to wait between two consecutive call to check query status on athena
+    :type sleep_time: int
+    """
+
+    INTERMEDIATE_STATES = ('QUEUED', 'RUNNING',)
+    FAILURE_STATES = ('FAILED', 'CANCELLED',)
+    SUCCESS_STATES = ('SUCCEEDED',)
+
+    def __init__(self, aws_conn_id='aws_default', sleep_time=30, *args, **kwargs):
+        super(AWSAthenaHook, self).__init__(aws_conn_id, **kwargs)
+        self.sleep_time = sleep_time
+        self.conn = None
+
+    def get_conn(self):
+        """
+        check if aws conn exists already or create one and return it
+
+        :return: boto3 session
+        """
+        if not self.conn:
+            self.conn = self.get_client_type('athena')
+        return self.conn
+
+    def run_query(self, query, query_context, result_configuration, client_request_token=None):
+        """
+        Run Presto query on athena with provided config and return submitted query_execution_id
+
+        :param query: Presto query to run
+        :type query: str
+        :param query_context: Context in which query need to be run
+        :type query_context: dict
+        :param result_configuration: Dict with path to store results in and config related to encryption
+        :type result_configuration: dict
+        :param client_request_token: Unique token created by user to avoid multiple executions of same query
+        :type client_request_token: str
+        :return: str
+        """
+        response = self.conn.start_query_execution(QueryString=query,
+                                                   ClientRequestToken=client_request_token,
+                                                   QueryExecutionContext=query_context,
+                                                   ResultConfiguration=result_configuration)
+        query_execution_id = response['QueryExecutionId']
+        return query_execution_id
+
+    def check_query_status(self, query_execution_id):
+        """
+        Fetch the status of submitted athena query. Returns None or one of valid query states.
+
+        :param query_execution_id: Id of submitted athena query
+        :type query_execution_id: str
+        :return: str
+        """
+        response = self.conn.get_query_execution(QueryExecutionId=query_execution_id)
+        state = None
+        try:
+            state = response['QueryExecution']['Status']['State']
+        except Exception as ex:
+            self.log.error('Exception while getting query state', ex)
+        finally:
+            return state
+
+    def get_query_results(self, query_execution_id):
+        """
+        Fetch submitted athena query results. returns none if query is in intermediate state or
+        failed/cancelled state else dict of query output
+
+        :param query_execution_id: Id of submitted athena query
+        :type query_execution_id: str
+        :return: dict
+        """
+        query_state = self.check_query_status(query_execution_id)
+ 

[GitHub] ashb closed pull request #4111: [AIRFLOW-3266] Add AWS Athena Operator and hook

2018-11-16 Thread GitBox
ashb closed pull request #4111: [AIRFLOW-3266] Add AWS Athena Operator and hook
URL: https://github.com/apache/incubator-airflow/pull/4111
 
 
   

This is a PR merged from a forked repository.
As GitHub hides the original diff on merge, it is displayed below for
the sake of provenance:

As this is a foreign pull request (from a fork), the diff is supplied
below (as it won't show otherwise due to GitHub magic):

diff --git a/airflow/contrib/hooks/aws_athena_hook.py 
b/airflow/contrib/hooks/aws_athena_hook.py
new file mode 100644
index 00..f11ff23c51
--- /dev/null
+++ b/airflow/contrib/hooks/aws_athena_hook.py
@@ -0,0 +1,150 @@
+# -*- coding: utf-8 -*-
+#
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+
+from time import sleep
+from airflow.contrib.hooks.aws_hook import AwsHook
+
+
+class AWSAthenaHook(AwsHook):
+    """
+    Interact with AWS Athena to run, poll queries and return query results
+
+    :param aws_conn_id: aws connection to use.
+    :type aws_conn_id: str
+    :param sleep_time: Time to wait between two consecutive call to check query status on athena
+    :type sleep_time: int
+    """
+
+    INTERMEDIATE_STATES = ('QUEUED', 'RUNNING',)
+    FAILURE_STATES = ('FAILED', 'CANCELLED',)
+    SUCCESS_STATES = ('SUCCEEDED',)
+
+    def __init__(self, aws_conn_id='aws_default', sleep_time=30, *args, **kwargs):
+        super(AWSAthenaHook, self).__init__(aws_conn_id, **kwargs)
+        self.sleep_time = sleep_time
+        self.conn = None
+
+    def get_conn(self):
+        """
+        check if aws conn exists already or create one and return it
+
+        :return: boto3 session
+        """
+        if not self.conn:
+            self.conn = self.get_client_type('athena')
+        return self.conn
+
+    def run_query(self, query, query_context, result_configuration, client_request_token=None):
+        """
+        Run Presto query on athena with provided config and return submitted query_execution_id
+
+        :param query: Presto query to run
+        :type query: str
+        :param query_context: Context in which query need to be run
+        :type query_context: dict
+        :param result_configuration: Dict with path to store results in and config related to encryption
+        :type result_configuration: dict
+        :param client_request_token: Unique token created by user to avoid multiple executions of same query
+        :type client_request_token: str
+        :return: str
+        """
+        response = self.conn.start_query_execution(QueryString=query,
+                                                   ClientRequestToken=client_request_token,
+                                                   QueryExecutionContext=query_context,
+                                                   ResultConfiguration=result_configuration)
+        query_execution_id = response['QueryExecutionId']
+        return query_execution_id
+
+    def check_query_status(self, query_execution_id):
+        """
+        Fetch the status of submitted athena query. Returns None or one of valid query states.
+
+        :param query_execution_id: Id of submitted athena query
+        :type query_execution_id: str
+        :return: str
+        """
+        response = self.conn.get_query_execution(QueryExecutionId=query_execution_id)
+        state = None
+        try:
+            state = response['QueryExecution']['Status']['State']
+        except Exception as ex:
+            self.log.error('Exception while getting query state', ex)
+        finally:
+            return state
+
+    def get_query_results(self, query_execution_id):
+        """
+        Fetch submitted athena query results. returns none if query is in intermediate state or
+        failed/cancelled state else dict of query output
+
+        :param query_execution_id: Id of submitted athena query
+        :type query_execution_id: str
+        :return: dict
+        """
+        query_state = self.check_query_status(query_execution_id)
+        if query_state is None:
+            self.log.error('Invalid Query state')
+            return None
+        elif query_state in self.INTERMEDIATE_STATES or query_state in self.FAILURE_STATES:
+            self.log.error('Query is in 
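
A rough usage sketch of the hook above (not taken from the PR; the database 
name and S3 output location are placeholders, and default AWS credentials are 
assumed):

```python
import uuid

from airflow.contrib.hooks.aws_athena_hook import AWSAthenaHook

hook = AWSAthenaHook(aws_conn_id='aws_default', sleep_time=10)
hook.get_conn()  # lazily creates the boto3 Athena client used by the other methods

query_execution_id = hook.run_query(
    query='SELECT 1',
    query_context={'Database': 'default'},              # placeholder database
    result_configuration={'OutputLocation': 's3://my-athena-results/'},  # placeholder bucket
    client_request_token=str(uuid.uuid4()),              # idempotency token for this submission
)

# Check the state once; a real caller would repeat this, sleeping sleep_time
# between checks, until the state is in SUCCESS_STATES or FAILURE_STATES.
print(query_execution_id, hook.check_query_status(query_execution_id))
```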

[jira] [Commented] (AIRFLOW-3354) Scheduler compares offset-naive and offset-aware dates

2018-11-16 Thread Ash Berlin-Taylor (JIRA)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-3354?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16689397#comment-16689397
 ] 

Ash Berlin-Taylor commented on AIRFLOW-3354:


{{start_date}} controls when a task is valid to be scheduled from - specifying 
it as {{now()}} is a bug and not something an operator should do.

What were you trying to achieve by specifying that in the operator? Bear in 
mind that Airflow parses your dag each and every time it runs tasks (and more 
often too) so this value is constantly changing.
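
To illustrate (a minimal sketch, not taken from the report): a fixed, 
timezone-aware {{start_date}} belongs in {{default_args}}, rather than being 
recomputed with {{now()}} on every parse of the file:

{code:python}
from airflow import DAG
from airflow.utils import timezone

default_args = {
    'owner': 'gta',
    # Fixed, timezone-aware date: identical every time the file is parsed.
    'start_date': timezone.datetime(2017, 3, 1),
}

dag = DAG('hello_world', default_args=default_args,
          schedule_interval='*/10 * * * *')

# Anti-pattern: recomputed on every parse, so the scheduler never sees a
# stable start date for the task.
moving_start_date = timezone.utcnow()
{code}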

> Scheduler compares offset-naive and offset-aware dates
> --
>
> Key: AIRFLOW-3354
> URL: https://issues.apache.org/jira/browse/AIRFLOW-3354
> Project: Apache Airflow
>  Issue Type: Bug
>Affects Versions: 1.10.1
>Reporter: Jakub Powierza
>Priority: Major
>
> New version of Airflow (1.10.1rc1 and 1.10.1rc2) tries to compare 
> offset-naive and offset-aware dates in Scheduler. I've tested a simple case 
> with schedule set to "*/10 * * * *". I've tried to clean my developer 
> instance with `airflow resetdb` and start from scratch but it does not help 
> at all. This issue does not occur on stable version 1.10.0.
> My setup: Python 3.6 on Ubuntu 14.04 with Airflow Scheduler based on Celery 
> with RabbitMQ backend.
> Exception found in Scheduler logs:
> {code:java}
> 2018-11-15 14:41:23,194:ERROR:airflow.processor:[CT=None] Got an exception! 
> Propagating...
> Traceback (most recent call last):
>  File 
> "/home/jpowierz/my_project/venv/lib/python3.6/site-packages/airflow/jobs.py", 
> line 389, in helper
>  pickle_dags)
>  File 
> "/home/jpowierz/my_project/venv/lib/python3.6/site-packages/airflow/utils/db.py",
>  line 74, in wrapper
>  return func(*args, **kwargs)
>  File 
> "/home/jpowierz/my_project/venv/lib/python3.6/site-packages/airflow/jobs.py", 
> line 1846, in process_file
>  self._process_dags(dagbag, dags, ti_keys_to_schedule)
>  File 
> "/home/jpowierz/my_project/venv/lib/python3.6/site-packages/airflow/jobs.py", 
> line 1426, in _process_dags
>  dag_run = self.create_dag_run(dag)
>  File 
> "/home/jpowierz/my_project/venv/lib/python3.6/site-packages/airflow/utils/db.py",
>  line 74, in wrapper
>  return func(*args, **kwargs)
>  File 
> "/home/jpowierz/my_project/venv/lib/python3.6/site-packages/airflow/jobs.py", 
> line 909, in create_dag_run
>  external_trigger=False
>  File 
> "/home/jpowierz/my_project/venv/lib/python3.6/site-packages/airflow/utils/db.py",
>  line 74, in wrapper
>  return func(*args, **kwargs)
>  File 
> "/home/jpowierz/my_project/venv/lib/python3.6/site-packages/airflow/models.py",
>  line 4270, in create_dagrun
>  run.verify_integrity(session=session)
>  File 
> "/home/jpowierz/my_project/venv/lib/python3.6/site-packages/airflow/utils/db.py",
>  line 70, in wrapper
>  return func(*args, **kwargs)
>  File 
> "/home/jpowierz/my_project/venv/lib/python3.6/site-packages/airflow/models.py",
>  line 5215, in verify_integrity
>  if task.start_date > self.execution_date and not self.is_backfill:
> TypeError: can't compare offset-naive and offset-aware datetimes
> Process DagFileProcessor40-Process:
> Traceback (most recent call last):
>  File "/usr/lib/python3.6/multiprocessing/process.py", line 249, in _bootstrap
>  self.run()
>  File "/usr/lib/python3.6/multiprocessing/process.py", line 93, in run
>  self._target(*self._args, **self._kwargs)
>  File 
> "/home/jpowierz/my_project/venv/lib/python3.6/site-packages/airflow/jobs.py", 
> line 389, in helper
>  pickle_dags)
>  File 
> "/home/jpowierz/my_project/venv/lib/python3.6/site-packages/airflow/utils/db.py",
>  line 74, in wrapper
>  return func(*args, **kwargs)
>  File 
> "/home/jpowierz/my_project/venv/lib/python3.6/site-packages/airflow/jobs.py", 
> line 1846, in process_file
>  self._process_dags(dagbag, dags, ti_keys_to_schedule)
>  File 
> "/home/jpowierz/my_project/venv/lib/python3.6/site-packages/airflow/jobs.py", 
> line 1426, in _process_dags
>  dag_run = self.create_dag_run(dag)
>  File 
> "/home/jpowierz/my_project/venv/lib/python3.6/site-packages/airflow/utils/db.py",
>  line 74, in wrapper
>  return func(*args, **kwargs)
>  File 
> "/home/jpowierz/my_project/venv/lib/python3.6/site-packages/airflow/jobs.py", 
> line 909, in create_dag_run
>  external_trigger=False
>  File 
> "/home/jpowierz/my_project/venv/lib/python3.6/site-packages/airflow/utils/db.py",
>  line 74, in wrapper
>  return func(*args, **kwargs)
>  File 
> "/home/jpowierz/my_project/venv/lib/python3.6/site-packages/airflow/models.py",
>  line 4270, in create_dagrun
>  run.verify_integrity(session=session)
>  File 
> "/home/jpowierz/my_project/venv/lib/python3.6/site-packages/airflow/utils/db.py",
>  line 70, in wrapper
>  return func(*args, **kwargs)
>  File 
> 

[GitHub] phani8996 commented on issue #4111: [AIRFLOW-3266] Add AWS Athena Operator and hook

2018-11-16 Thread GitBox
phani8996 commented on issue #4111: [AIRFLOW-3266] Add AWS Athena Operator and 
hook
URL: 
https://github.com/apache/incubator-airflow/pull/4111#issuecomment-439387510
 
 
   > Assuming it was just formatting changes since I last looked LGTM.
   
   The only changes are new lines in the docstrings; apart from that, 
everything else is the same. 


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[jira] [Updated] (AIRFLOW-251) Add optional parameter SQL_ALCHEMY_SCHEMA to control schema for metadata repository

2018-11-16 Thread Ash Berlin-Taylor (JIRA)


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-251?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Ash Berlin-Taylor updated AIRFLOW-251:
--
Affects Version/s: (was: 2.0.0)

> Add optional parameter SQL_ALCHEMY_SCHEMA to control schema for metadata 
> repository
> ---
>
> Key: AIRFLOW-251
> URL: https://issues.apache.org/jira/browse/AIRFLOW-251
> Project: Apache Airflow
>  Issue Type: Improvement
>  Components: core
>Reporter: Ed Parcell
>Assignee: Iuliia Volkova
>Priority: Minor
> Fix For: 2.0.0
>
>
> Using SQL Server as a database for metadata, it is preferable to group all 
> Airflow tables into a separate schema, rather than using dbo. I propose 
> adding an optional parameter SQL_ALCHEMY_SCHEMA to control this.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Resolved] (AIRFLOW-251) Add optional parameter SQL_ALCHEMY_SCHEMA to control schema for metadata repository

2018-11-16 Thread Ash Berlin-Taylor (JIRA)


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-251?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Ash Berlin-Taylor resolved AIRFLOW-251.
---
   Resolution: Fixed
Fix Version/s: 2.0.0

> Add optional parameter SQL_ALCHEMY_SCHEMA to control schema for metadata 
> repository
> ---
>
> Key: AIRFLOW-251
> URL: https://issues.apache.org/jira/browse/AIRFLOW-251
> Project: Apache Airflow
>  Issue Type: Improvement
>  Components: core
>Reporter: Ed Parcell
>Assignee: Iuliia Volkova
>Priority: Minor
> Fix For: 2.0.0
>
>
> Using SQL Server as a database for metadata, it is preferable to group all 
> Airflow tables into a separate schema, rather than using dbo. I propose 
> adding an optional parameter SQL_ALCHEMY_SCHEMA to control this.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[GitHub] ashb closed pull request #4199: [AIRFLOW-251] Add option SQL_ALCHEMY_SCHEMA parameter to use SQL Serv…

2018-11-16 Thread GitBox
ashb closed pull request #4199: [AIRFLOW-251] Add option SQL_ALCHEMY_SCHEMA 
parameter to use SQL Serv…
URL: https://github.com/apache/incubator-airflow/pull/4199
 
 
   

This is a PR merged from a forked repository.
As GitHub hides the original diff on merge, it is displayed below for
the sake of provenance:

As this is a foreign pull request (from a fork), the diff is supplied
below (as it won't show otherwise due to GitHub magic):

diff --git a/airflow/config_templates/default_airflow.cfg 
b/airflow/config_templates/default_airflow.cfg
index 4d73fdf51d..a9473178c1 100644
--- a/airflow/config_templates/default_airflow.cfg
+++ b/airflow/config_templates/default_airflow.cfg
@@ -106,6 +106,10 @@ sql_alchemy_pool_recycle = 1800
 # disconnects. Setting this to 0 disables retries.
 sql_alchemy_reconnect_timeout = 300
 
+# The schema to use for the metadata database
+# SqlAlchemy supports databases with the concept of multiple schemas.
+sql_alchemy_schema =
+
 # The amount of parallelism as a setting to the executor. This defines
 # the max number of task instances that should run simultaneously
 # on this airflow installation
diff --git a/airflow/models.py b/airflow/models.py
index 92440a81a9..bb068499fe 100755
--- a/airflow/models.py
+++ b/airflow/models.py
@@ -64,7 +64,7 @@
 
 from sqlalchemy import (
 Boolean, Column, DateTime, Float, ForeignKey, ForeignKeyConstraint, Index,
-Integer, LargeBinary, PickleType, String, Text, UniqueConstraint,
+Integer, LargeBinary, PickleType, String, Text, UniqueConstraint, MetaData,
 and_, asc, func, or_, true as sqltrue
 )
 from sqlalchemy.ext.declarative import declarative_base, declared_attr
@@ -108,7 +108,13 @@
 
 install_aliases()
 
-Base = declarative_base()
+SQL_ALCHEMY_SCHEMA = configuration.get('core', 'SQL_ALCHEMY_SCHEMA')
+
+if not SQL_ALCHEMY_SCHEMA or SQL_ALCHEMY_SCHEMA.isspace():
+    Base = declarative_base()
+else:
+    Base = declarative_base(metadata=MetaData(schema=SQL_ALCHEMY_SCHEMA))
+
 ID_LEN = 250
 XCOM_RETURN_KEY = 'return_value'
 


 


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[jira] [Assigned] (AIRFLOW-251) Add optional parameter SQL_ALCHEMY_SCHEMA to control schema for metadata repository

2018-11-16 Thread Iuliia Volkova (JIRA)


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-251?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Iuliia Volkova reassigned AIRFLOW-251:
--

Assignee: Iuliia Volkova

> Add optional parameter SQL_ALCHEMY_SCHEMA to control schema for metadata 
> repository
> ---
>
> Key: AIRFLOW-251
> URL: https://issues.apache.org/jira/browse/AIRFLOW-251
> Project: Apache Airflow
>  Issue Type: Improvement
>  Components: core
>Affects Versions: 2.0.0
>Reporter: Ed Parcell
>Assignee: Iuliia Volkova
>Priority: Minor
>
> Using SQL Server as a database for metadata, it is preferable to group all 
> Airflow tables into a separate schema, rather than using dbo. I propose 
> adding an optional parameter SQL_ALCHEMY_SCHEMA to control this.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (AIRFLOW-251) Add optional parameter SQL_ALCHEMY_SCHEMA to control schema for metadata repository

2018-11-16 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-251?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16689379#comment-16689379
 ] 

ASF GitHub Bot commented on AIRFLOW-251:


xnuinside opened a new pull request #4199: [AIRFLOW-251] Add option 
SQL_ALCHEMY_SCHEMA parameter to use SQL Serv…
URL: https://github.com/apache/incubator-airflow/pull/4199
 
 
   …er for metadata
   
   Make sure you have checked _all_ steps below.
   
   ### Jira
   
   - [ ] My PR addresses the following [Airflow 
Jira](https://issues.apache.org/jira/browse/AIRFLOW/) issues and references 
them in the PR title. For example, "\[AIRFLOW-XXX\] My Airflow PR"
 - https://issues.apache.org/jira/browse/AIRFLOW-XXX
 - In case you are fixing a typo in the documentation you can prepend your 
commit with \[AIRFLOW-XXX\], code changes always need a Jira issue.
   
   ### Description
   Adopted PR, original: https://github.com/apache/incubator-airflow/pull/1600 
   
   
   - [ ] Here are some details about my PR, including screenshots of any UI 
changes:
   
   ### Tests
   
   - [ ] My PR adds the following unit tests __OR__ does not need testing for 
this extremely good reason:
   
   ### Commits
   
   - [ ] My commits all reference Jira issues in their subject lines, and I 
have squashed multiple commits if they address the same issue. In addition, my 
commits follow the guidelines from "[How to write a good git commit 
message](http://chris.beams.io/posts/git-commit/)":
 1. Subject is separated from body by a blank line
 1. Subject is limited to 50 characters (not including Jira issue reference)
 1. Subject does not end with a period
 1. Subject uses the imperative mood ("add", not "adding")
 1. Body wraps at 72 characters
 1. Body explains "what" and "why", not "how"
   
   ### Documentation
   
   - [ ] In case of new functionality, my PR adds documentation that describes 
how to use it.
 - When adding new operators/hooks/sensors, the autoclass documentation 
generation needs to be added.
   
   ### Code Quality
   
   - [ ] Passes `flake8`
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


> Add optional parameter SQL_ALCHEMY_SCHEMA to control schema for metadata 
> repository
> ---
>
> Key: AIRFLOW-251
> URL: https://issues.apache.org/jira/browse/AIRFLOW-251
> Project: Apache Airflow
>  Issue Type: Improvement
>  Components: core
>Affects Versions: 2.0.0
>Reporter: Ed Parcell
>Assignee: Iuliia Volkova
>Priority: Minor
>
> Using SQL Server as a database for metadata, it is preferable to group all 
> Airflow tables into a separate schema, rather than using dbo. I propose 
> adding an optional parameter SQL_ALCHEMY_SCHEMA to control this.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[GitHub] xnuinside opened a new pull request #4199: [AIRFLOW-251] Add option SQL_ALCHEMY_SCHEMA parameter to use SQL Serv…

2018-11-16 Thread GitBox
xnuinside opened a new pull request #4199: [AIRFLOW-251] Add option 
SQL_ALCHEMY_SCHEMA parameter to use SQL Serv…
URL: https://github.com/apache/incubator-airflow/pull/4199
 
 
   …er for metadata
   
   Make sure you have checked _all_ steps below.
   
   ### Jira
   
   - [ ] My PR addresses the following [Airflow 
Jira](https://issues.apache.org/jira/browse/AIRFLOW/) issues and references 
them in the PR title. For example, "\[AIRFLOW-XXX\] My Airflow PR"
 - https://issues.apache.org/jira/browse/AIRFLOW-XXX
 - In case you are fixing a typo in the documentation you can prepend your 
commit with \[AIRFLOW-XXX\], code changes always need a Jira issue.
   
   ### Description
   Adopted PR, original: https://github.com/apache/incubator-airflow/pull/1600 
   
   
   - [ ] Here are some details about my PR, including screenshots of any UI 
changes:
   
   ### Tests
   
   - [ ] My PR adds the following unit tests __OR__ does not need testing for 
this extremely good reason:
   
   ### Commits
   
   - [ ] My commits all reference Jira issues in their subject lines, and I 
have squashed multiple commits if they address the same issue. In addition, my 
commits follow the guidelines from "[How to write a good git commit 
message](http://chris.beams.io/posts/git-commit/)":
 1. Subject is separated from body by a blank line
 1. Subject is limited to 50 characters (not including Jira issue reference)
 1. Subject does not end with a period
 1. Subject uses the imperative mood ("add", not "adding")
 1. Body wraps at 72 characters
 1. Body explains "what" and "why", not "how"
   
   ### Documentation
   
   - [ ] In case of new functionality, my PR adds documentation that describes 
how to use it.
 - When adding new operators/hooks/sensors, the autoclass documentation 
generation needs to be added.
   
   ### Code Quality
   
   - [ ] Passes `flake8`
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] ashb commented on issue #4111: [AIRFLOW-3266] Add AWS Athena Operator and hook

2018-11-16 Thread GitBox
ashb commented on issue #4111: [AIRFLOW-3266] Add AWS Athena Operator and hook
URL: 
https://github.com/apache/incubator-airflow/pull/4111#issuecomment-439380529
 
 
   Assuming it was just formatting changes since I last looked LGTM.


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] SamWildmo commented on issue #2538: [AIRFLOW-1491] Recover celery queue on restart

2018-11-16 Thread GitBox
SamWildmo commented on issue #2538: [AIRFLOW-1491] Recover celery queue on 
restart
URL: 
https://github.com/apache/incubator-airflow/pull/2538#issuecomment-439375716
 
 
   Is this still an issue in 1.10? I'm considering switching to Celery and 
wondering whether that's smart, given this bug hasn't been fixed yet. 


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[jira] [Commented] (AIRFLOW-3358) POC: Refactor command line to use Click

2018-11-16 Thread Iuliia Volkova (JIRA)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-3358?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16689297#comment-16689297
 ] 

Iuliia Volkova commented on AIRFLOW-3358:
-

[~sanand], [~ashb], [~bolke], [~Fokko], [~kaxilnaik], hi guys, sorry for 
pulling you in, but it would be great to get your comments, or let me know if 
I should ping somebody else from the maintainers team. Thanks in advance! 

> POC: Refactor command line to use Click
> ---
>
> Key: AIRFLOW-3358
> URL: https://issues.apache.org/jira/browse/AIRFLOW-3358
> Project: Apache Airflow
>  Issue Type: Improvement
>  Components: cli
>Affects Versions: 2.0.0
>Reporter: Iuliia Volkova
>Assignee: Iuliia Volkova
>Priority: Major
>
> Hi all! 
> In one of the PRs, https://github.com/apache/incubator-airflow/pull/4174, we 
> had a talk with Ashb about how it would be good to refactor the CLI to get 
> more testable and readable code.
> I want to prepare a POC, based on one command implemented with Click and 
> covered with tests, for discussing the Airflow CLI architecture.
> Click already exists in Airflow's dependencies.
> Main motivations: 
> - Get a more readable and changeable CLI, making it easy to add or change 
> commands
> - Make it possible to add more tests 
> It would be good to know your concerns about this initiative, and if there 
> are no objections, I will be happy to start the POC



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (AIRFLOW-987) `airflow kerberos` ignores --keytab and --principal arguments

2018-11-16 Thread Pratap20 (JIRA)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-987?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16689291#comment-16689291
 ] 

Pratap20 commented on AIRFLOW-987:
--

Hi, I am also facing the same issue. 

Can anyone help me out?

I am using Airflow version 1.8.0 and having an issue with renewing the 
Kerberos ticket: it frequently fails to renew the ticket and the airflow 
kerberos process exits. 

Below are the error logs.

 

[2018-11-16 01:00:48,899] \{kerberos.py:43} INFO - Reinitting kerberos from 
keytab: kinit -r 3600m -k -t /home/user_test/user_test.keytab -c 
/tmp/airflow_krb5_ccache user_test
[2018-11-16 01:00:48,910] \{kerberos.py:55} ERROR - Couldn't reinit from 
keytab! `kinit' exited with 1.

kinit: Pre-authentication failed: No key table entry found for 
user_t...@prod.org while getting initial credentials

 

Kerberos configuration in airflow.cfg:

 

[kerberos]
#ccache = /tmp/airflow_krb5_ccache
# gets augmented with fqdn
principal = user_test
reinit_frequency = 3600
kinit_path = /usr/bin/kinit
keytab = /home/user_test/user_test.keytab

To start the airflow kerberos process we are using the script below:

$cat startup_kerberos.sh

#! /bin/sh
# Startup Script for Airflow
echo "Starting Up Kerberos Renewer"
nohup airflow kerberos $* >> /data/airflow/logs/kerberos.logs &
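
One quick check (a suggestion, not from this thread): list what the keytab 
actually contains and compare it with the principal that airflow kerberos 
builds (principal plus fqdn/realm); "No key table entry found" usually means 
the two do not match. For example, via Python's subprocess, assuming MIT 
Kerberos' klist is on the path:

{code:python}
import subprocess

# Path taken from the configuration above; adjust for your environment.
keytab = '/home/user_test/user_test.keytab'

# 'klist -kt <keytab>' prints every principal stored in the keytab.
print(subprocess.check_output(['klist', '-kt', keytab]).decode())
{code}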

 

 

> `airflow kerberos` ignores --keytab and --principal arguments
> -
>
> Key: AIRFLOW-987
> URL: https://issues.apache.org/jira/browse/AIRFLOW-987
> Project: Apache Airflow
>  Issue Type: Bug
>  Components: security
>Affects Versions: 1.8.0
> Environment: 1.8-rc5
>Reporter: Ruslan Dautkhanov
>Assignee: Pratap20
>Priority: Major
>  Labels: easyfix, kerberos, security
>
> No matter which arguments I pass to `airflow kerberos`, 
> it always executes as `kinit -r 3600m -k -t airflow.keytab -c 
> /tmp/airflow_krb5_ccache airflow`
> So it failes with expected "kinit: Keytab contains no suitable keys for 
> airf...@corp.some.com while getting initial credentials"
> Tried different arguments, -kt and --keytab, here's one of the runs (some 
> lines wrapped for readability):
> {noformat}
> $ airflow kerberos -kt /home/rdautkha/.keytab rdautkha...@corp.some.com
> [2017-03-14 23:50:11,523] {__init__.py:57} INFO - Using executor LocalExecutor
> [2017-03-14 23:50:12,069] {kerberos.py:43} INFO - Reinitting kerberos from 
> keytab: 
> kinit -r 3600m -k -t airflow.keytab -c /tmp/airflow_krb5_ccache airflow
> [2017-03-14 23:50:12,080] {kerberos.py:55} ERROR -
>  Couldn't reinit from keytab! `kinit' exited with 1.
> kinit: Keytab contains no suitable keys for airf...@corp.some.com 
> while getting initial credentials
> {noformat}
> 1.8-rc5



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Created] (AIRFLOW-3358) POC: Refactor command line to use Click

2018-11-16 Thread Iuliia Volkova (JIRA)
Iuliia Volkova created AIRFLOW-3358:
---

 Summary: POC: Refactor command line to use Click
 Key: AIRFLOW-3358
 URL: https://issues.apache.org/jira/browse/AIRFLOW-3358
 Project: Apache Airflow
  Issue Type: Improvement
  Components: cli
Affects Versions: 2.0.0
Reporter: Iuliia Volkova
Assignee: Iuliia Volkova


Hi all! 

In one of the PRs, https://github.com/apache/incubator-airflow/pull/4174, we 
had a talk with Ashb about how it would be good to refactor the CLI to get 
more testable and readable code.

I want to prepare a POC, based on one command implemented with Click and 
covered with tests, for discussing the Airflow CLI architecture.


Click already exists in Airflow's dependencies.

Main motivations: 

- Get a more readable and changeable CLI, making it easy to add or change 
commands
- Make it possible to add more tests 

It would be good to know your concerns about this initiative, and if there 
are no objections, I will be happy to start the POC (see the sketch below)
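
To make the proposal concrete, a minimal hypothetical sketch of a single 
subcommand written with Click (command and option names are illustrative 
only, not an agreed design):

{code:python}
import click


@click.group()
def cli():
    """Hypothetical top-level command group, used only for this sketch."""


@cli.command('list_dags')
@click.option('--subdir', '-sd', default=None,
              help='File location or directory from which to look for the dag.')
def list_dags(subdir):
    """Placeholder body: just echo what the real command would do."""
    click.echo('Would list DAGs from: {}'.format(subdir or 'the default DAGs folder'))


if __name__ == '__main__':
    cli()
{code}

Commands written this way can be exercised in unit tests with 
click.testing.CliRunner, which is a big part of the testability argument.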



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Issue Comment Deleted] (AIRFLOW-987) `airflow kerberos` ignores --keytab and --principal arguments

2018-11-16 Thread Pratap20 (JIRA)


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-987?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Pratap20 updated AIRFLOW-987:
-
Comment: was deleted

(was: Hi )

> `airflow kerberos` ignores --keytab and --principal arguments
> -
>
> Key: AIRFLOW-987
> URL: https://issues.apache.org/jira/browse/AIRFLOW-987
> Project: Apache Airflow
>  Issue Type: Bug
>  Components: security
>Affects Versions: 1.8.0
> Environment: 1.8-rc5
>Reporter: Ruslan Dautkhanov
>Assignee: Bolke de Bruin
>Priority: Major
>  Labels: easyfix, kerberos, security
>
> No matter which arguments I pass to `airflow kerberos`, 
> it always executes as `kinit -r 3600m -k -t airflow.keytab -c 
> /tmp/airflow_krb5_ccache airflow`
> So it failes with expected "kinit: Keytab contains no suitable keys for 
> airf...@corp.some.com while getting initial credentials"
> Tried different arguments, -kt and --keytab, here's one of the runs (some 
> lines wrapped for readability):
> {noformat}
> $ airflow kerberos -kt /home/rdautkha/.keytab rdautkha...@corp.some.com
> [2017-03-14 23:50:11,523] {__init__.py:57} INFO - Using executor LocalExecutor
> [2017-03-14 23:50:12,069] {kerberos.py:43} INFO - Reinitting kerberos from 
> keytab: 
> kinit -r 3600m -k -t airflow.keytab -c /tmp/airflow_krb5_ccache airflow
> [2017-03-14 23:50:12,080] {kerberos.py:55} ERROR -
>  Couldn't reinit from keytab! `kinit' exited with 1.
> kinit: Keytab contains no suitable keys for airf...@corp.some.com 
> while getting initial credentials
> {noformat}
> 1.8-rc5



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (AIRFLOW-987) `airflow kerberos` ignores --keytab and --principal arguments

2018-11-16 Thread Pratap20 (JIRA)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-987?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16689283#comment-16689283
 ] 

Pratap20 commented on AIRFLOW-987:
--

Hi 

> `airflow kerberos` ignores --keytab and --principal arguments
> -
>
> Key: AIRFLOW-987
> URL: https://issues.apache.org/jira/browse/AIRFLOW-987
> Project: Apache Airflow
>  Issue Type: Bug
>  Components: security
>Affects Versions: 1.8.0
> Environment: 1.8-rc5
>Reporter: Ruslan Dautkhanov
>Assignee: Bolke de Bruin
>Priority: Major
>  Labels: easyfix, kerberos, security
>
> No matter which arguments I pass to `airflow kerberos`, 
> it always executes as `kinit -r 3600m -k -t airflow.keytab -c 
> /tmp/airflow_krb5_ccache airflow`
> So it failes with expected "kinit: Keytab contains no suitable keys for 
> airf...@corp.some.com while getting initial credentials"
> Tried different arguments, -kt and --keytab, here's one of the runs (some 
> lines wrapped for readability):
> {noformat}
> $ airflow kerberos -kt /home/rdautkha/.keytab rdautkha...@corp.some.com
> [2017-03-14 23:50:11,523] {__init__.py:57} INFO - Using executor LocalExecutor
> [2017-03-14 23:50:12,069] {kerberos.py:43} INFO - Reinitting kerberos from 
> keytab: 
> kinit -r 3600m -k -t airflow.keytab -c /tmp/airflow_krb5_ccache airflow
> [2017-03-14 23:50:12,080] {kerberos.py:55} ERROR -
>  Couldn't reinit from keytab! `kinit' exited with 1.
> kinit: Keytab contains no suitable keys for airf...@corp.some.com 
> while getting initial credentials
> {noformat}
> 1.8-rc5



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (AIRFLOW-3354) Scheduler compares offset-naive and offset-aware dates

2018-11-16 Thread Jakub Powierza (JIRA)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-3354?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16689197#comment-16689197
 ] 

Jakub Powierza commented on AIRFLOW-3354:
-

Here is my reproduction for this issue:
{code:java}
from typing import Mapping, Any

from datetime import datetime

from airflow import DAG
from airflow.models import BaseOperator
from airflow.utils.decorators import apply_defaults

default_args = {
'owner': 'gta',
'email': ['my_acco...@gmail.com'],
'retries': 3,
'start_date': datetime(2017, 3, 1),
'depends_on_past': False,
}
dag = DAG('hello_world', default_args=default_args, catchup=False, 
schedule_interval='*/1 * * * *')


class MyOperator(BaseOperator):

@apply_defaults
def __init__(self, *args: Any, **kwargs: Any) -> None:
super().__init__(*args, **kwargs)
self.start_date = datetime.utcnow()

def execute(self, context: Mapping) -> None:
print('Hello, World!')


hello_world_op = MyOperator(task_id='hello_world_op', dag=dag)
{code}
It seems that setting the start date with the datetime module inside my 
operator breaks the scheduler. This change fixes the issue:
{code:java}
from airflow.utils import timezone

...

class MyOperator(BaseOperator):

@apply_defaults
def __init__(self, *args: Any, **kwargs: Any) -> None:
super().__init__(*args, **kwargs)
self.start_date = timezone.utcnow()
{code}
 

However, no change to the default args is needed; they work fine as they are.
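
For completeness, a tiny standalone sketch (not part of the reproduction 
above) of why the scheduler's comparison fails:

{code:python}
from datetime import datetime

from airflow.utils import timezone

naive = datetime.utcnow()   # tzinfo is None
aware = timezone.utcnow()   # tzinfo is UTC

try:
    naive > aware
except TypeError as err:
    # The same error the scheduler hits in verify_integrity():
    print(err)  # can't compare offset-naive and offset-aware datetimes
{code}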

> Scheduler compares offset-naive and offset-aware dates
> --
>
> Key: AIRFLOW-3354
> URL: https://issues.apache.org/jira/browse/AIRFLOW-3354
> Project: Apache Airflow
>  Issue Type: Bug
>Affects Versions: 1.10.1
>Reporter: Jakub Powierza
>Priority: Major
>
> New version of Airflow (1.10.1rc1 and 1.10.1rc2) tries to compare 
> offset-naive and offset-aware dates in Scheduler. I've tested a simple case 
> with schedule set to "*/10 * * * *". I've tried to clean my developer 
> instance with `airflow resetdb` and start from scratch but it does not help 
> at all. This issue does not occur on stable version 1.10.0.
> My setup: Python 3.6 on Ubuntu 14.04 with Airflow Scheduler based on Celery 
> with RabbitMQ backend.
> Exception found in Scheduler logs:
> {code:java}
> 2018-11-15 14:41:23,194:ERROR:airflow.processor:[CT=None] Got an exception! 
> Propagating...
> Traceback (most recent call last):
>  File 
> "/home/jpowierz/my_project/venv/lib/python3.6/site-packages/airflow/jobs.py", 
> line 389, in helper
>  pickle_dags)
>  File 
> "/home/jpowierz/my_project/venv/lib/python3.6/site-packages/airflow/utils/db.py",
>  line 74, in wrapper
>  return func(*args, **kwargs)
>  File 
> "/home/jpowierz/my_project/venv/lib/python3.6/site-packages/airflow/jobs.py", 
> line 1846, in process_file
>  self._process_dags(dagbag, dags, ti_keys_to_schedule)
>  File 
> "/home/jpowierz/my_project/venv/lib/python3.6/site-packages/airflow/jobs.py", 
> line 1426, in _process_dags
>  dag_run = self.create_dag_run(dag)
>  File 
> "/home/jpowierz/my_project/venv/lib/python3.6/site-packages/airflow/utils/db.py",
>  line 74, in wrapper
>  return func(*args, **kwargs)
>  File 
> "/home/jpowierz/my_project/venv/lib/python3.6/site-packages/airflow/jobs.py", 
> line 909, in create_dag_run
>  external_trigger=False
>  File 
> "/home/jpowierz/my_project/venv/lib/python3.6/site-packages/airflow/utils/db.py",
>  line 74, in wrapper
>  return func(*args, **kwargs)
>  File 
> "/home/jpowierz/my_project/venv/lib/python3.6/site-packages/airflow/models.py",
>  line 4270, in create_dagrun
>  run.verify_integrity(session=session)
>  File 
> "/home/jpowierz/my_project/venv/lib/python3.6/site-packages/airflow/utils/db.py",
>  line 70, in wrapper
>  return func(*args, **kwargs)
>  File 
> "/home/jpowierz/my_project/venv/lib/python3.6/site-packages/airflow/models.py",
>  line 5215, in verify_integrity
>  if task.start_date > self.execution_date and not self.is_backfill:
> TypeError: can't compare offset-naive and offset-aware datetimes
> Process DagFileProcessor40-Process:
> Traceback (most recent call last):
>  File "/usr/lib/python3.6/multiprocessing/process.py", line 249, in _bootstrap
>  self.run()
>  File "/usr/lib/python3.6/multiprocessing/process.py", line 93, in run
>  self._target(*self._args, **self._kwargs)
>  File 
> "/home/jpowierz/my_project/venv/lib/python3.6/site-packages/airflow/jobs.py", 
> line 389, in helper
>  pickle_dags)
>  File 
> "/home/jpowierz/my_project/venv/lib/python3.6/site-packages/airflow/utils/db.py",
>  line 74, in wrapper
>  return func(*args, **kwargs)
>  File 
> "/home/jpowierz/my_project/venv/lib/python3.6/site-packages/airflow/jobs.py", 
> line 1846, in process_file
>  self._process_dags(dagbag, dags, ti_keys_to_schedule)
>  File 
> 

[jira] [Updated] (AIRFLOW-3333) New features enable transferring of files or data from GCS to a SFTP remote path and SFTP to GCS path.

2018-11-16 Thread Pulin Pathneja (JIRA)


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Pulin Pathneja updated AIRFLOW-:

Component/s: contrib

> New features enable transferring of files or data from GCS to a SFTP remote 
> path and SFTP to GCS path. 
> ---
>
> Key: AIRFLOW-
> URL: https://issues.apache.org/jira/browse/AIRFLOW-
> Project: Apache Airflow
>  Issue Type: New Feature
>  Components: contrib
>Reporter: Pulin Pathneja
>Assignee: Pulin Pathneja
>Priority: Major
> Fix For: 1.10.2
>
>
> New features enable transferring of files or data from S3 to a SFTP remote 
> path and SFTP to S3 path. 
>  



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[GitHub] codecov-io edited a comment on issue #4188: [AIRFLOW-3251] KubernetesPodOperator does not use 'image_pull_secrets…

2018-11-16 Thread GitBox
codecov-io edited a comment on issue #4188: [AIRFLOW-3251] 
KubernetesPodOperator does not use 'image_pull_secrets…
URL: 
https://github.com/apache/incubator-airflow/pull/4188#issuecomment-438759774
 
 
   # 
[Codecov](https://codecov.io/gh/apache/incubator-airflow/pull/4188?src=pr=h1)
 Report
   > Merging 
[#4188](https://codecov.io/gh/apache/incubator-airflow/pull/4188?src=pr=desc)
 into 
[master](https://codecov.io/gh/apache/incubator-airflow/commit/8668ef869d3d844dac746ec88609d3710a1264ab?src=pr=desc)
 will **increase** coverage by `<.01%`.
   > The diff coverage is `n/a`.
   
   [![Impacted file tree 
graph](https://codecov.io/gh/apache/incubator-airflow/pull/4188/graphs/tree.svg?width=650=WdLKlKHOAU=150=pr)](https://codecov.io/gh/apache/incubator-airflow/pull/4188?src=pr=tree)
   
   ```diff
   @@            Coverage Diff            @@
   ##           master    #4188      +/-   ##
   ==========================================
   + Coverage   77.69%    77.7%    +<.01%
   ==========================================
     Files         199      199
     Lines       16309    16309
   ==========================================
   + Hits        12672    12673       +1
   + Misses       3637     3636       -1
   ```
   
   
   | [Impacted 
Files](https://codecov.io/gh/apache/incubator-airflow/pull/4188?src=pr=tree) 
| Coverage Δ | |
   |---|---|---|
   | 
[airflow/models.py](https://codecov.io/gh/apache/incubator-airflow/pull/4188/diff?src=pr=tree#diff-YWlyZmxvdy9tb2RlbHMucHk=)
 | `92.37% <0%> (+0.04%)` | :arrow_up: |
   
   --
   
   [Continue to review full report at 
Codecov](https://codecov.io/gh/apache/incubator-airflow/pull/4188?src=pr=continue).
   > **Legend** - [Click here to learn 
more](https://docs.codecov.io/docs/codecov-delta)
   > `Δ = absolute  (impact)`, `ø = not affected`, `? = missing data`
   > Powered by 
[Codecov](https://codecov.io/gh/apache/incubator-airflow/pull/4188?src=pr=footer).
 Last update 
[8668ef8...b7bd02d](https://codecov.io/gh/apache/incubator-airflow/pull/4188?src=pr=lastupdated).
 Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments).
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[jira] [Updated] (AIRFLOW-3333) New features enable transferring of files or data from GCS to a SFTP remote path and SFTP to GCS path.

2018-11-16 Thread Pulin Pathneja (JIRA)


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Pulin Pathneja updated AIRFLOW-:

Description: 
New features enable transferring of files or data from GCS(Google Cloud 
Storage) to a SFTP remote path and SFTP to GCS(Google Cloud Storage) path. 
  

  was:
New features enable transferring of files or data from S3 to a SFTP remote path 
and SFTP to S3 path. 
 


> New features enable transferring of files or data from GCS to a SFTP remote 
> path and SFTP to GCS path. 
> ---
>
> Key: AIRFLOW-
> URL: https://issues.apache.org/jira/browse/AIRFLOW-
> Project: Apache Airflow
>  Issue Type: New Feature
>  Components: contrib
>Reporter: Pulin Pathneja
>Assignee: Pulin Pathneja
>Priority: Major
> Fix For: 1.10.2
>
>
> New features enable transferring of files or data from GCS(Google Cloud 
> Storage) to a SFTP remote path and SFTP to GCS(Google Cloud Storage) path. 
>   



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Updated] (AIRFLOW-3333) New features enable transferring of files or data from GCS to a SFTP remote path and SFTP to GCS path.

2018-11-16 Thread Pulin Pathneja (JIRA)


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Pulin Pathneja updated AIRFLOW-:

Fix Version/s: 1.10.2

> New features enable transferring of files or data from GCS to a SFTP remote 
> path and SFTP to GCS path. 
> ---
>
> Key: AIRFLOW-
> URL: https://issues.apache.org/jira/browse/AIRFLOW-
> Project: Apache Airflow
>  Issue Type: New Feature
>Reporter: Pulin Pathneja
>Assignee: Pulin Pathneja
>Priority: Major
> Fix For: 1.10.2
>
>
> New features enable transferring of files or data from S3 to a SFTP remote 
> path and SFTP to S3 path. 
>  



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Work started] (AIRFLOW-3333) New features enable transferring of files or data from GCS to a SFTP remote path and SFTP to GCS path.

2018-11-16 Thread Pulin Pathneja (JIRA)


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Work on AIRFLOW- started by Pulin Pathneja.
---
> New features enable transferring of files or data from GCS to a SFTP remote 
> path and SFTP to GCS path. 
> ---
>
> Key: AIRFLOW-
> URL: https://issues.apache.org/jira/browse/AIRFLOW-
> Project: Apache Airflow
>  Issue Type: New Feature
>Reporter: Pulin Pathneja
>Assignee: Pulin Pathneja
>Priority: Major
> Fix For: 1.10.2
>
>
> New features enable transferring of files or data from S3 to a SFTP remote 
> path and SFTP to S3 path. 
>  



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[GitHub] victornoel commented on issue #4188: [AIRFLOW-3251] KubernetesPodOperator does not use 'image_pull_secrets…

2018-11-16 Thread GitBox
victornoel commented on issue #4188: [AIRFLOW-3251] KubernetesPodOperator does 
not use 'image_pull_secrets…
URL: 
https://github.com/apache/incubator-airflow/pull/4188#issuecomment-439325577
 
 
   @ashb I have finally added a test using the mock to validate that the pod 
operator properly sets the parameter on the pod.


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] codecov-io edited a comment on issue #4165: [AIRFLOW-3322] Update qubole_hook to fetch command args dynamically from qds_sdk

2018-11-16 Thread GitBox
codecov-io edited a comment on issue #4165: [AIRFLOW-3322] Update qubole_hook 
to fetch command args dynamically from qds_sdk
URL: 
https://github.com/apache/incubator-airflow/pull/4165#issuecomment-437297736
 
 
   # 
[Codecov](https://codecov.io/gh/apache/incubator-airflow/pull/4165?src=pr=h1)
 Report
   > Merging 
[#4165](https://codecov.io/gh/apache/incubator-airflow/pull/4165?src=pr=desc)
 into 
[master](https://codecov.io/gh/apache/incubator-airflow/commit/8668ef869d3d844dac746ec88609d3710a1264ab?src=pr=desc)
 will **increase** coverage by `<.01%`.
   > The diff coverage is `n/a`.
   
   [![Impacted file tree 
graph](https://codecov.io/gh/apache/incubator-airflow/pull/4165/graphs/tree.svg?width=650=WdLKlKHOAU=150=pr)](https://codecov.io/gh/apache/incubator-airflow/pull/4165?src=pr=tree)
   
   ```diff
   @@            Coverage Diff             @@
   ##           master    #4165      +/-   ##
   ==========================================
   + Coverage   77.69%    77.7%    +<.01%
   ==========================================
     Files         199      199
     Lines       16309    16309
   ==========================================
   + Hits        12672    12673        +1
   + Misses       3637     3636        -1
   ```
   
   
   | [Impacted 
Files](https://codecov.io/gh/apache/incubator-airflow/pull/4165?src=pr=tree) 
| Coverage Δ | |
   |---|---|---|
   | 
[airflow/models.py](https://codecov.io/gh/apache/incubator-airflow/pull/4165/diff?src=pr=tree#diff-YWlyZmxvdy9tb2RlbHMucHk=)
 | `92.37% <0%> (+0.04%)` | :arrow_up: |
   
   --
   
   [Continue to review full report at 
Codecov](https://codecov.io/gh/apache/incubator-airflow/pull/4165?src=pr=continue).
   > **Legend** - [Click here to learn 
more](https://docs.codecov.io/docs/codecov-delta)
   > `Δ = absolute <relative> (impact)`, `ø = not affected`, `? = missing data`
   > Powered by 
[Codecov](https://codecov.io/gh/apache/incubator-airflow/pull/4165?src=pr=footer).
 Last update 
[8668ef8...c48f9f7](https://codecov.io/gh/apache/incubator-airflow/pull/4165?src=pr=lastupdated).
 Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments).
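
For context on the change this report covers, the sketch below illustrates the general idea of fetching command arguments dynamically rather than hard-coding them. It is a hedged illustration, not the code from PR #4165: it assumes that qds_sdk command classes (e.g. HiveCommand) expose an optparse-based `optparser` class attribute, and the `get_options_list` helper and the key names in `COMMAND_ARGS` are made up for the example.

```python
# Hedged sketch of the idea behind AIRFLOW-3322, not the merged implementation.
# Assumption: each qds_sdk command class carries an optparse OptionParser as
# its `optparser` attribute, so the accepted options can be derived at runtime
# instead of maintaining a hand-written list that drifts out of sync.
from qds_sdk.commands import HiveCommand, PrestoCommand, ShellCommand


def get_options_list(command_class):
    """Return the long-option names a qds_sdk command class accepts."""
    return [
        option.get_opt_string().lstrip("-")
        for option in command_class.optparser.option_list
    ]


# Map Airflow-facing command types to the arguments their qds_sdk class accepts.
COMMAND_ARGS = {
    "hivecmd": get_options_list(HiveCommand),
    "prestocmd": get_options_list(PrestoCommand),
    "shellcmd": get_options_list(ShellCommand),
}
```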
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services

