[GitHub] omusavi commented on a change in pull request #4121: [AIRFLOW-2568] Azure Container Instances operator

2018-11-18 Thread GitBox
omusavi commented on a change in pull request #4121: [AIRFLOW-2568] Azure 
Container Instances operator
URL: https://github.com/apache/incubator-airflow/pull/4121#discussion_r234518339
 
 

 ##
 File path: docs/integration.rst
 ##
 @@ -186,6 +186,43 @@ AzureDataLakeStorageListOperator
 
 .. autoclass:: airflow.contrib.operators.adls_list_operator.AzureDataLakeStorageListOperator
 
+Azure Container Instances
+'''''''''''''''''''''''''
+
+Azure Container Instances provides a method to run a docker container without having to worry
+about managing infrastructure. The AzureContainerInstanceHook requires a service principal. The
+credentials for this principal can either be defined in the extra field `key_path`, as an
+environment variable named `AZURE_AUTH_LOCATION`,
+or by providing a login/password and tenantId in extras.
+
+The AzureContainerRegistryHook requires a host/login/password to be defined in the connection.
+
+- :ref:`AzureContainerInstancesOperator` : Start/Monitor a new ACI.
+- :ref:`AzureContainerInstanceHook` : Wrapper around a single ACI.
+- :ref:`AzureContainerRegistryHook` : Wrapper around a ACR
+- :ref:`AzureContainerVolumeHook` : Wrapper around Container Volumes
+
+AzureContainerInstancesOperator
+"""""""""""""""""""""""""""""""
+
+.. autoclass:: airflow.contrib.operators.azure_container_instances_operator.AzureContainerInstancesOperator
+
+AzureContainerInstanceHook
+""""""""""""""""""""""""""
+
+.. autoclass:: airflow.contrib.hooks.azure_container_hook.AzureContainerInstanceHook
 
 Review comment:
   Will update and verify docs are fixed...
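
   For reference, a hypothetical example of the kind of connection the quoted documentation describes (a service principal supplied as login/password plus a tenantId in extras). The extras key follows the doc text above; the conn_id and placeholder values are illustrative, not taken from the PR:

```python
from airflow.models import Connection

# Illustrative only: a service-principal connection for the ACI hook described above.
aci_conn = Connection(
    conn_id='azure_container_instances_default',   # assumed name, not from the PR
    conn_type='azure_container_instances',
    login='<service_principal_client_id>',
    password='<service_principal_secret>',
    extra='{"tenantId": "<tenant_id>"}',
)
```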


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] omusavi commented on a change in pull request #4121: [AIRFLOW-2568] Azure Container Instances operator

2018-11-18 Thread GitBox
omusavi commented on a change in pull request #4121: [AIRFLOW-2568] Azure 
Container Instances operator
URL: https://github.com/apache/incubator-airflow/pull/4121#discussion_r234517627
 
 

 ##
 File path: airflow/models.py
 ##
 @@ -805,6 +806,9 @@ def get_hook(self):
         elif self.conn_type == 'azure_data_lake':
             from airflow.contrib.hooks.azure_data_lake_hook import AzureDataLakeHook
             return AzureDataLakeHook(azure_data_lake_conn_id=self.conn_id)
+        elif self.conn_type == 'azure_container_instances':
+            from airflow.contrib.hooks.azure_data_lake_hook import AzureContainerHook
 
 Review comment:
   Good catch, this was not updated in the last iteration. I will fix it and do some more testing to find additional issues like this.
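
   For reference, a minimal hypothetical sketch of what the corrected branch could look like, assuming the hook class is `AzureContainerInstanceHook` living in `airflow.contrib.hooks.azure_container_hook` as the docs hunk elsewhere in this thread suggests (not the PR's final code):

```python
def get_azure_container_instances_hook(conn_id):
    # Lazy import, mirroring how Connection.get_hook() imports the other contrib hooks.
    # The module path and constructor argument here are assumptions for illustration.
    from airflow.contrib.hooks.azure_container_hook import AzureContainerInstanceHook
    return AzureContainerInstanceHook(conn_id=conn_id)
```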


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] omusavi commented on a change in pull request #4121: [AIRFLOW-2568] Azure Container Instances operator

2018-11-18 Thread GitBox
omusavi commented on a change in pull request #4121: [AIRFLOW-2568] Azure 
Container Instances operator
URL: https://github.com/apache/incubator-airflow/pull/4121#discussion_r234517271
 
 

 ##
 File path: airflow/contrib/hooks/azure_container_registry_hook.py
 ##
 @@ -0,0 +1,32 @@
+# -*- coding: utf-8 -*-
+#
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+
+from airflow.hooks.base_hook import BaseHook
+from azure.mgmt.containerinstance.models import ImageRegistryCredential
+
+
+class AzureContainerRegistryHook(BaseHook):
+
+    def __init__(self, conn_id='azure_registry'):
+        self.conn_id = conn_id
+        self.connection = self.get_conn()
 
 Review comment:
   Looks like self.connection gets used inside `azure_container_instances_operator.py` when setting up the registry hook. Is there something else I am missing here? Admittedly the design is a bit confusing, as this was code I picked up from an existing PR, so if there is a best practice here for changing the code, please let me know.
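
   For context, a minimal sketch of how such a registry hook's `get_conn()` could build the `ImageRegistryCredential` this file imports, assuming the registry server and credentials live on the Airflow connection's host/login/password as the docs hunk states. This is illustrative, not necessarily the PR's final code:

```python
from airflow.hooks.base_hook import BaseHook
from azure.mgmt.containerinstance.models import ImageRegistryCredential


class AzureContainerRegistryHook(BaseHook):
    """Sketch: expose an Azure Container Registry credential stored in an
    Airflow connection (host = registry server, login/password = credentials)."""

    def __init__(self, conn_id='azure_registry'):
        self.conn_id = conn_id
        self.connection = self.get_conn()

    def get_conn(self):
        conn = self.get_connection(self.conn_id)
        # Map the Airflow connection fields onto the ACI registry credential model.
        return ImageRegistryCredential(server=conn.host,
                                       username=conn.login,
                                       password=conn.password)
```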


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[jira] [Assigned] (AIRFLOW-3053) Airflow scheduler - high availability ?

2018-11-18 Thread Sai Phanindhra (JIRA)


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-3053?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Sai Phanindhra reassigned AIRFLOW-3053:
---

Assignee: Sai Phanindhra

> Airflow scheduler - high availability ?
> ---
>
> Key: AIRFLOW-3053
> URL: https://issues.apache.org/jira/browse/AIRFLOW-3053
> Project: Apache Airflow
>  Issue Type: Wish
>  Components: scheduler
>Reporter: Damian Momot
>Assignee: Sai Phanindhra
>Priority: Major
>
> Probably more of a question than an issue.
> Does the Airflow scheduler support high availability? For example, running two 
> instances of airflow-scheduler connected to a single MySQL database?
> I tried to search for this question, but the only things I found are conversations 
> from 2-3 years ago saying that "at that time it wasn't supported but it was planned"



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[GitHub] codecov-io edited a comment on issue #4207: [WIP] Run celery integration test with redis broker.

2018-11-18 Thread GitBox
codecov-io edited a comment on issue #4207: [WIP] Run celery integration test 
with redis broker.
URL: 
https://github.com/apache/incubator-airflow/pull/4207#issuecomment-439668923
 
 
   # 
[Codecov](https://codecov.io/gh/apache/incubator-airflow/pull/4207?src=pr=h1)
 Report
   > Merging 
[#4207](https://codecov.io/gh/apache/incubator-airflow/pull/4207?src=pr=desc)
 into 
[master](https://codecov.io/gh/apache/incubator-airflow/commit/94d19707d7f3563bb48868c2d6442c3da923da20?src=pr=desc)
 will **increase** coverage by `<.01%`.
   > The diff coverage is `n/a`.
   
   [![Impacted file tree 
graph](https://codecov.io/gh/apache/incubator-airflow/pull/4207/graphs/tree.svg?width=650=WdLKlKHOAU=150=pr)](https://codecov.io/gh/apache/incubator-airflow/pull/4207?src=pr=tree)
   
   ```diff
   @@            Coverage Diff            @@
   ##           master    #4207      +/-   ##
   =========================================
   + Coverage    77.7%   77.71%   +<.01%    
   =========================================
     Files         199      199             
     Lines       16317    16317             
   =========================================
   + Hits        12679    12680       +1    
   + Misses       3638     3637       -1
   ```
   
   
   | [Impacted 
Files](https://codecov.io/gh/apache/incubator-airflow/pull/4207?src=pr=tree) 
| Coverage Δ | |
   |---|---|---|
   | 
[airflow/models.py](https://codecov.io/gh/apache/incubator-airflow/pull/4207/diff?src=pr=tree#diff-YWlyZmxvdy9tb2RlbHMucHk=)
 | `92.33% <0%> (+0.04%)` | :arrow_up: |
   
   --
   
   [Continue to review full report at 
Codecov](https://codecov.io/gh/apache/incubator-airflow/pull/4207?src=pr=continue).
   > **Legend** - [Click here to learn 
more](https://docs.codecov.io/docs/codecov-delta)
   > `Δ = absolute  (impact)`, `ø = not affected`, `? = missing data`
   > Powered by 
[Codecov](https://codecov.io/gh/apache/incubator-airflow/pull/4207?src=pr=footer).
 Last update 
[94d1970...675be46](https://codecov.io/gh/apache/incubator-airflow/pull/4207?src=pr=lastupdated).
 Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments).
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] feng-tao commented on issue #4153: [AIRFLOW-3308] Fix plugins import

2018-11-18 Thread GitBox
feng-tao commented on issue #4153: [AIRFLOW-3308] Fix plugins import
URL: 
https://github.com/apache/incubator-airflow/pull/4153#issuecomment-439774005
 
 
   @Fokko , could you take a look? I think plugin management functionality is 
very important.


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] codecov-io commented on issue #4209: [AIRFLOW-2966] Catch ApiException in the Kubernetes Executor

2018-11-18 Thread GitBox
codecov-io commented on issue #4209: [AIRFLOW-2966] Catch ApiException in the 
Kubernetes Executor
URL: 
https://github.com/apache/incubator-airflow/pull/4209#issuecomment-439729859
 
 
   # 
[Codecov](https://codecov.io/gh/apache/incubator-airflow/pull/4209?src=pr=h1)
 Report
   > Merging 
[#4209](https://codecov.io/gh/apache/incubator-airflow/pull/4209?src=pr=desc)
 into 
[master](https://codecov.io/gh/apache/incubator-airflow/commit/94d19707d7f3563bb48868c2d6442c3da923da20?src=pr=desc)
 will **increase** coverage by `<.01%`.
   > The diff coverage is `n/a`.
   
   [![Impacted file tree 
graph](https://codecov.io/gh/apache/incubator-airflow/pull/4209/graphs/tree.svg?width=650=WdLKlKHOAU=150=pr)](https://codecov.io/gh/apache/incubator-airflow/pull/4209?src=pr=tree)
   
   ```diff
   @@            Coverage Diff            @@
   ##           master    #4209      +/-   ##
   =========================================
   + Coverage    77.7%   77.71%   +<.01%    
   =========================================
     Files         199      199             
     Lines       16317    16317             
   =========================================
   + Hits        12679    12680       +1    
   + Misses       3638     3637       -1
   ```
   
   
   | [Impacted 
Files](https://codecov.io/gh/apache/incubator-airflow/pull/4209?src=pr=tree) 
| Coverage Δ | |
   |---|---|---|
   | 
[airflow/models.py](https://codecov.io/gh/apache/incubator-airflow/pull/4209/diff?src=pr=tree#diff-YWlyZmxvdy9tb2RlbHMucHk=)
 | `92.33% <0%> (+0.04%)` | :arrow_up: |
   
   --
   
   [Continue to review full report at 
Codecov](https://codecov.io/gh/apache/incubator-airflow/pull/4209?src=pr=continue).
   > **Legend** - [Click here to learn 
more](https://docs.codecov.io/docs/codecov-delta)
   > `Δ = absolute  (impact)`, `ø = not affected`, `? = missing data`
   > Powered by 
[Codecov](https://codecov.io/gh/apache/incubator-airflow/pull/4209?src=pr=footer).
 Last update 
[94d1970...b4365c4](https://codecov.io/gh/apache/incubator-airflow/pull/4209?src=pr=lastupdated).
 Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments).
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] codecov-io edited a comment on issue #4209: [AIRFLOW-2966] Catch ApiException in the Kubernetes Executor

2018-11-18 Thread GitBox
codecov-io edited a comment on issue #4209: [AIRFLOW-2966] Catch ApiException 
in the Kubernetes Executor
URL: 
https://github.com/apache/incubator-airflow/pull/4209#issuecomment-439729859
 
 
   # 
[Codecov](https://codecov.io/gh/apache/incubator-airflow/pull/4209?src=pr=h1)
 Report
   > Merging 
[#4209](https://codecov.io/gh/apache/incubator-airflow/pull/4209?src=pr=desc)
 into 
[master](https://codecov.io/gh/apache/incubator-airflow/commit/94d19707d7f3563bb48868c2d6442c3da923da20?src=pr=desc)
 will **increase** coverage by `<.01%`.
   > The diff coverage is `n/a`.
   
   [![Impacted file tree 
graph](https://codecov.io/gh/apache/incubator-airflow/pull/4209/graphs/tree.svg?width=650=WdLKlKHOAU=150=pr)](https://codecov.io/gh/apache/incubator-airflow/pull/4209?src=pr=tree)
   
   ```diff
   @@            Coverage Diff            @@
   ##           master    #4209      +/-   ##
   =========================================
   + Coverage    77.7%   77.71%   +<.01%    
   =========================================
     Files         199      199             
     Lines       16317    16317             
   =========================================
   + Hits        12679    12680       +1    
   + Misses       3638     3637       -1
   ```
   
   
   | [Impacted 
Files](https://codecov.io/gh/apache/incubator-airflow/pull/4209?src=pr=tree) 
| Coverage Δ | |
   |---|---|---|
   | 
[airflow/models.py](https://codecov.io/gh/apache/incubator-airflow/pull/4209/diff?src=pr=tree#diff-YWlyZmxvdy9tb2RlbHMucHk=)
 | `92.33% <0%> (+0.04%)` | :arrow_up: |
   
   --
   
   [Continue to review full report at 
Codecov](https://codecov.io/gh/apache/incubator-airflow/pull/4209?src=pr=continue).
   > **Legend** - [Click here to learn 
more](https://docs.codecov.io/docs/codecov-delta)
   > `Δ = absolute  (impact)`, `ø = not affected`, `? = missing data`
   > Powered by 
[Codecov](https://codecov.io/gh/apache/incubator-airflow/pull/4209?src=pr=footer).
 Last update 
[94d1970...b4365c4](https://codecov.io/gh/apache/incubator-airflow/pull/4209?src=pr=lastupdated).
 Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments).
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] johnhofman commented on issue #4209: [AIRFLOW-2966] Catch ApiException in the Kubernetes Executor

2018-11-18 Thread GitBox
johnhofman commented on issue #4209: [AIRFLOW-2966] Catch ApiException in the 
Kubernetes Executor
URL: 
https://github.com/apache/incubator-airflow/pull/4209#issuecomment-439727645
 
 
   @Fokko This is a resubmission of #3960, which was reverted when it broke the 
CI on master. This was due to the refactoring of the task instance key in PR 
#3994, which was not included in the PR #3960 tests, but got merged to master 
first.
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] johnhofman opened a new pull request #4209: [AIRFLOW-2966] Catch ApiException in the Kubernetes Executor

2018-11-18 Thread GitBox
johnhofman opened a new pull request #4209: [AIRFLOW-2966] Catch ApiException 
in the Kubernetes Executor
URL: https://github.com/apache/incubator-airflow/pull/4209
 
 
   ### Description
   
   Creating a pod that exceeds a namespace's resource quota throws an ApiException. This change catches the exception and re-queues the task inside the Executor instead of killing the scheduler.
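
   To illustrate the pattern described above, a minimal hypothetical sketch (not the PR's actual code); the `kube_scheduler.run_next` call and `task_queue` attribute are assumptions for illustration:

```python
from kubernetes.client.rest import ApiException


def run_next_safely(executor, next_job):
    """Try to launch the next task's pod; on an ApiException (for example a
    namespace quota violation) re-queue the task instead of letting the
    exception propagate and kill the scheduler."""
    try:
        # This ends up calling create_namespaced_pod on the Kubernetes API.
        executor.kube_scheduler.run_next(next_job)
    except ApiException as e:
        executor.log.warning('ApiException when attempting to run task, '
                             're-queueing. Message: %s', e)
        executor.task_queue.put(next_job)
```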
   
   `click 7.0` was recently released, but `flask-appbuilder` 1.11.1 has requirement `click==6.7`. I have pinned `click==6.7` to make the dependencies resolve.
   
   ### Tests
   
   This adds a single test, `TestKubernetesExecutor.test_run_next_exception`, that covers this scenario. Without the changes, this test fails because the ApiException is not caught.
   
   This is the first test case for the `KubernetesExecutor`, so I needed to add the `[kubernetes]` section to `default_test.cfg` so that the `KubernetesExecutor` can be built without exceptions.
   
   Jira ticket: https://issues.apache.org/jira/browse/AIRFLOW-2966
   
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[jira] [Commented] (AIRFLOW-2966) KubernetesExecutor + namespace quotas kills scheduler if the pod can't be launched

2018-11-18 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-2966?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16691099#comment-16691099
 ] 

ASF GitHub Bot commented on AIRFLOW-2966:
-

johnhofman opened a new pull request #4209: [AIRFLOW-2966] Catch ApiException 
in the Kubernetes Executor
URL: https://github.com/apache/incubator-airflow/pull/4209
 
 
   ### Description
   
   Creating a pod that exceeds a namespace's resource quota throws an ApiException. This change catches the exception and re-queues the task inside the Executor instead of killing the scheduler.
   
   `click 7.0` was recently released, but `flask-appbuilder` 1.11.1 has requirement `click==6.7`. I have pinned `click==6.7` to make the dependencies resolve.
   
   ### Tests
   
   This adds a single test, `TestKubernetesExecutor.test_run_next_exception`, that covers this scenario. Without the changes, this test fails because the ApiException is not caught.
   
   This is the first test case for the `KubernetesExecutor`, so I needed to add the `[kubernetes]` section to `default_test.cfg` so that the `KubernetesExecutor` can be built without exceptions.
   
   Jira ticket: https://issues.apache.org/jira/browse/AIRFLOW-2966
   
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


> KubernetesExecutor + namespace quotas kills scheduler if the pod can't be 
> launched
> --
>
> Key: AIRFLOW-2966
> URL: https://issues.apache.org/jira/browse/AIRFLOW-2966
> Project: Apache Airflow
>  Issue Type: Bug
>  Components: scheduler
>Affects Versions: 2.0.0
> Environment: Kubernetes 1.9.8
>Reporter: John Hofman
>Assignee: John Hofman
>Priority: Major
> Fix For: 2.0.0
>
>
> When running Airflow in Kubernetes with the KubernetesExecutor and resource 
> quotas set on the namespace Airflow is deployed in, if the scheduler tries 
> to launch a pod into the namespace that exceeds the namespace limits, it gets 
> an ApiException and crashes the scheduler.
> This stack trace is an example of the ApiException from the kubernetes client:
> {code:java}
> [2018-08-27 09:51:08,516] {pod_launcher.py:58} ERROR - Exception when 
> attempting to create Namespaced Pod.
> Traceback (most recent call last):
> File "/src/apache-airflow/airflow/contrib/kubernetes/pod_launcher.py", line 
> 55, in run_pod_async
> resp = self._client.create_namespaced_pod(body=req, namespace=pod.namespace)
> File 
> "/usr/local/lib/python3.6/site-packages/kubernetes/client/apis/core_v1_api.py",
>  line 6057, in create_namespaced_pod
> (data) = self.create_namespaced_pod_with_http_info(namespace, body, **kwargs)
> File 
> "/usr/local/lib/python3.6/site-packages/kubernetes/client/apis/core_v1_api.py",
>  line 6142, in create_namespaced_pod_with_http_info
> collection_formats=collection_formats)
> File 
> "/usr/local/lib/python3.6/site-packages/kubernetes/client/api_client.py", 
> line 321, in call_api
> _return_http_data_only, collection_formats, _preload_content, 
> _request_timeout)
> File 
> "/usr/local/lib/python3.6/site-packages/kubernetes/client/api_client.py", 
> line 155, in __call_api
> _request_timeout=_request_timeout)
> File 
> "/usr/local/lib/python3.6/site-packages/kubernetes/client/api_client.py", 
> line 364, in request
> body=body)
> File "/usr/local/lib/python3.6/site-packages/kubernetes/client/rest.py", line 
> 266, in POST
> body=body)
> File "/usr/local/lib/python3.6/site-packages/kubernetes/client/rest.py", line 
> 222, in request
> raise ApiException(http_resp=r)
> kubernetes.client.rest.ApiException: (403)
> Reason: Forbidden
> HTTP response headers: HTTPHeaderDict({'Audit-Id': 
> 'b00e2cbb-bdb2-41f3-8090-824aee79448c', 'Content-Type': 'application/json', 
> 'Date': 'Mon, 27 Aug 2018 09:51:08 GMT', 'Content-Length': '410'})
> HTTP response body: 
> {"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"pods
>  \"podname-ec366e89ef934d91b2d3ffe96234a725\" is forbidden: exceeded quota: 
> compute-resources, requested: limits.memory=4Gi, used: limits.memory=6508Mi, 
> limited: 
> limits.memory=10Gi","reason":"Forbidden","details":{"name":"podname-ec366e89ef934d91b2d3ffe96234a725","kind":"pods"},"code":403}{code}
>  
> I would expect the scheduler to catch the Exception and at least mark the 
> task as failed, or better yet retry the task later.
>  
>  



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (AIRFLOW-3361) Add the task_id to the Deprecation Warning when passing unsupported keywords to BaseOperator

2018-11-18 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-3361?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16691042#comment-16691042
 ] 

ASF GitHub Bot commented on AIRFLOW-3361:
-

ashb closed pull request #4030: [AIRFLOW-3361] Log the task_id in the 
PendingDeprecationWarning for BaseOperator
URL: https://github.com/apache/incubator-airflow/pull/4030
 
 
   

This is a PR merged from a forked repository.
As GitHub hides the original diff on merge, it is displayed below for
the sake of provenance:

As this is a foreign pull request (from a fork), the diff is supplied
below (as it won't show otherwise due to GitHub magic):

diff --git a/airflow/models.py b/airflow/models.py
index 239a3a5263..47a1827e09 100755
--- a/airflow/models.py
+++ b/airflow/models.py
@@ -2494,12 +2494,13 @@ def __init__(
         if args or kwargs:
             # TODO remove *args and **kwargs in Airflow 2.0
             warnings.warn(
-                'Invalid arguments were passed to {c}. Support for '
-                'passing such arguments will be dropped in Airflow 2.0. '
-                'Invalid arguments were:'
+                'Invalid arguments were passed to {c} (task_id: {t}). '
+                'Support for passing such arguments will be dropped in '
+                'Airflow 2.0. Invalid arguments were:'
                 '\n*args: {a}\n**kwargs: {k}'.format(
-                    c=self.__class__.__name__, a=args, k=kwargs),
-                category=PendingDeprecationWarning
+                    c=self.__class__.__name__, a=args, k=kwargs, t=task_id),
+                category=PendingDeprecationWarning,
+                stacklevel=3
             )
 
         validate_key(task_id)
diff --git a/tests/core.py b/tests/core.py
index c37b1f9c8b..5f24b6c1e9 100644
--- a/tests/core.py
+++ b/tests/core.py
@@ -443,7 +443,8 @@ def test_illegal_args(self):
         self.assertTrue(
             issubclass(w[0].category, PendingDeprecationWarning))
         self.assertIn(
-            'Invalid arguments were passed to BashOperator.',
+            ('Invalid arguments were passed to BashOperator '
+             '(task_id: test_illegal_args).'),
             w[0].message.args[0])
 
     def test_bash_operator(self):


 


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


> Add the task_id to the Deprecation Warning when passing unsupported keywords 
> to BaseOperator
> 
>
> Key: AIRFLOW-3361
> URL: https://issues.apache.org/jira/browse/AIRFLOW-3361
> Project: Apache Airflow
>  Issue Type: Task
>  Components: logging
>Affects Versions: 1.9.0
>Reporter: Martin Black
>Assignee: Martin Black
>Priority: Trivial
> Fix For: 2.0.0, 1.10.2
>
>
> In 2.0 passing invalid keywords to {{BaseOperator}} will be deprecated. Prior 
> to that, there is a {{PendingDeprecationWarning}} raised, however it can be 
> hard to track down which specific task is raising this warning.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Resolved] (AIRFLOW-3361) Add the task_id to the Deprecation Warning when passing unsupported keywords to BaseOperator

2018-11-18 Thread Ash Berlin-Taylor (JIRA)


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-3361?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Ash Berlin-Taylor resolved AIRFLOW-3361.

   Resolution: Fixed
Fix Version/s: 1.10.2
   2.0.0

> Add the task_id to the Deprecation Warning when passing unsupported keywords 
> to BaseOperator
> 
>
> Key: AIRFLOW-3361
> URL: https://issues.apache.org/jira/browse/AIRFLOW-3361
> Project: Apache Airflow
>  Issue Type: Improvement
>  Components: logging
>Affects Versions: 1.9.0
>Reporter: Martin Black
>Assignee: Martin Black
>Priority: Trivial
> Fix For: 2.0.0, 1.10.2
>
>
> In 2.0 passing invalid keywords to {{BaseOperator}} will be deprecated. Prior 
> to that, there is a {{PendingDeprecationWarning}} raised, however it can be 
> hard to track down which specific task is raising this warning.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Resolved] (AIRFLOW-801) Outdated docstring on baseclass

2018-11-18 Thread Ash Berlin-Taylor (JIRA)


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-801?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Ash Berlin-Taylor resolved AIRFLOW-801.
---
Resolution: Fixed

> Outdated docstring on baseclass
> ---
>
> Key: AIRFLOW-801
> URL: https://issues.apache.org/jira/browse/AIRFLOW-801
> Project: Apache Airflow
>  Issue Type: Bug
>Reporter: Daniel Seisun
>Assignee: Kengo Seki
>Priority: Trivial
>
> The docstring of the BaseOperator still makes reference to it inheriting from 
> SQL Alchemy's Base class, which it no longer does. 



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Resolved] (AIRFLOW-3307) Update insecure node dependencies

2018-11-18 Thread Ash Berlin-Taylor (JIRA)


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-3307?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Ash Berlin-Taylor resolved AIRFLOW-3307.

   Resolution: Fixed
Fix Version/s: 2.0.0

> Update insecure node dependencies
> -
>
> Key: AIRFLOW-3307
> URL: https://issues.apache.org/jira/browse/AIRFLOW-3307
> Project: Apache Airflow
>  Issue Type: Improvement
>Reporter: Josh Carp
>Assignee: Josh Carp
>Priority: Trivial
> Fix For: 2.0.0
>
>
> `npm audit` shows some node dependencies that are out of date and potentially 
> insecure. We should update them with `npm audit fix`.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Updated] (AIRFLOW-3307) Update insecure node dependencies

2018-11-18 Thread Ash Berlin-Taylor (JIRA)


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-3307?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Ash Berlin-Taylor updated AIRFLOW-3307:
---
Issue Type: Improvement  (was: Bug)

> Update insecure node dependencies
> -
>
> Key: AIRFLOW-3307
> URL: https://issues.apache.org/jira/browse/AIRFLOW-3307
> Project: Apache Airflow
>  Issue Type: Improvement
>Reporter: Josh Carp
>Assignee: Josh Carp
>Priority: Trivial
> Fix For: 2.0.0
>
>
> `npm audit` shows some node dependencies that are out of date and potentially 
> insecure. We should update them with `npm audit fix`.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Resolved] (AIRFLOW-3306) Disable unused flask-sqlalchemy modification tracking

2018-11-18 Thread Ash Berlin-Taylor (JIRA)


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-3306?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Ash Berlin-Taylor resolved AIRFLOW-3306.

   Resolution: Fixed
Fix Version/s: 1.10.2
   2.0.0

> Disable unused flask-sqlalchemy modification tracking
> -
>
> Key: AIRFLOW-3306
> URL: https://issues.apache.org/jira/browse/AIRFLOW-3306
> Project: Apache Airflow
>  Issue Type: Bug
>Reporter: Josh Carp
>Assignee: Josh Carp
>Priority: Trivial
> Fix For: 2.0.0, 1.10.2
>
>
> By default, flask-sqlalchemy tracks model changes for its event system, which 
> adds some overhead. Since I don't think we're using the flask-sqlalchemy 
> event system, we should be able to turn off modification tracking and improve 
> performance.
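
A minimal illustration of the flag in question, using plain Flask-SQLAlchemy rather than Airflow's own code:

```python
from flask import Flask
from flask_sqlalchemy import SQLAlchemy

app = Flask(__name__)
app.config['SQLALCHEMY_DATABASE_URI'] = 'sqlite:///example.db'
# Disable the event-system bookkeeping the issue refers to; Flask-SQLAlchemy
# otherwise tracks every model modification, which costs memory and CPU.
app.config['SQLALCHEMY_TRACK_MODIFICATIONS'] = False
db = SQLAlchemy(app)
```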



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Updated] (AIRFLOW-3361) Add the task_id to the Deprecation Warning when passing unsupported keywords to BaseOperator

2018-11-18 Thread Ash Berlin-Taylor (JIRA)


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-3361?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Ash Berlin-Taylor updated AIRFLOW-3361:
---
Issue Type: Improvement  (was: Task)

> Add the task_id to the Deprecation Warning when passing unsupported keywords 
> to BaseOperator
> 
>
> Key: AIRFLOW-3361
> URL: https://issues.apache.org/jira/browse/AIRFLOW-3361
> Project: Apache Airflow
>  Issue Type: Improvement
>  Components: logging
>Affects Versions: 1.9.0
>Reporter: Martin Black
>Assignee: Martin Black
>Priority: Trivial
> Fix For: 2.0.0, 1.10.2
>
>
> In 2.0 passing invalid keywords to {{BaseOperator}} will be deprecated. Prior 
> to that, there is a {{PendingDeprecationWarning}} raised, however it can be 
> hard to track down which specific task is raising this warning.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[GitHub] ashb closed pull request #4030: [AIRFLOW-3361] Log the task_id in the PendingDeprecationWarning for BaseOperator

2018-11-18 Thread GitBox
ashb closed pull request #4030: [AIRFLOW-3361] Log the task_id in the 
PendingDeprecationWarning for BaseOperator
URL: https://github.com/apache/incubator-airflow/pull/4030
 
 
   

This is a PR merged from a forked repository.
As GitHub hides the original diff on merge, it is displayed below for
the sake of provenance:

As this is a foreign pull request (from a fork), the diff is supplied
below (as it won't show otherwise due to GitHub magic):

diff --git a/airflow/models.py b/airflow/models.py
index 239a3a5263..47a1827e09 100755
--- a/airflow/models.py
+++ b/airflow/models.py
@@ -2494,12 +2494,13 @@ def __init__(
         if args or kwargs:
             # TODO remove *args and **kwargs in Airflow 2.0
             warnings.warn(
-                'Invalid arguments were passed to {c}. Support for '
-                'passing such arguments will be dropped in Airflow 2.0. '
-                'Invalid arguments were:'
+                'Invalid arguments were passed to {c} (task_id: {t}). '
+                'Support for passing such arguments will be dropped in '
+                'Airflow 2.0. Invalid arguments were:'
                 '\n*args: {a}\n**kwargs: {k}'.format(
-                    c=self.__class__.__name__, a=args, k=kwargs),
-                category=PendingDeprecationWarning
+                    c=self.__class__.__name__, a=args, k=kwargs, t=task_id),
+                category=PendingDeprecationWarning,
+                stacklevel=3
             )
 
         validate_key(task_id)
diff --git a/tests/core.py b/tests/core.py
index c37b1f9c8b..5f24b6c1e9 100644
--- a/tests/core.py
+++ b/tests/core.py
@@ -443,7 +443,8 @@ def test_illegal_args(self):
         self.assertTrue(
             issubclass(w[0].category, PendingDeprecationWarning))
         self.assertIn(
-            'Invalid arguments were passed to BashOperator.',
+            ('Invalid arguments were passed to BashOperator '
+             '(task_id: test_illegal_args).'),
             w[0].message.args[0])
 
     def test_bash_operator(self):


 


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[jira] [Commented] (AIRFLOW-3362) Template to support jinja2 native python types

2018-11-18 Thread Ash Berlin-Taylor (JIRA)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-3362?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16691041#comment-16691041
 ] 

Ash Berlin-Taylor commented on AIRFLOW-3362:


Oh interesting! I think that is a fairly new feature in Jinja

> Template to support jinja2 native python types
> --
>
> Key: AIRFLOW-3362
> URL: https://issues.apache.org/jira/browse/AIRFLOW-3362
> Project: Apache Airflow
>  Issue Type: Improvement
>  Components: core, DAG
>Reporter: Duan Shiqiang
>Priority: Major
>
> The latest Airflow (1.10.x) templates can only render into strings, which is fine 
> most of the time, but it would be better to support rendering into Python types.
> It would be very useful if the template system could render into native 
> Python types like list, dictionary, etc., especially when using XCom to pass 
> values between operators.
> Jinja2 has supported this feature since 2.10; more info can be found here: 
> http://jinja.pocoo.org/docs/2.10/nativetypes/
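
A minimal illustration of the Jinja2 native-types feature referenced above (plain Jinja2 2.10+, independent of Airflow):

```python
from jinja2.nativetypes import NativeEnvironment

env = NativeEnvironment()
# With a NativeEnvironment the template evaluates to a real Python list,
# not the string "[1, 2, 3]" that a regular Environment would return.
result = env.from_string('{{ [1, 2] + [3] }}').render()
print(result, type(result))  # [1, 2, 3] <class 'list'>
```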



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Updated] (AIRFLOW-1252) Experimental API - exception when conf is present in JSON body

2018-11-18 Thread Ash Berlin-Taylor (JIRA)


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-1252?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Ash Berlin-Taylor updated AIRFLOW-1252:
---
Fix Version/s: 1.10.2

> Experimental API - exception when conf is present in JSON body
> --
>
> Key: AIRFLOW-1252
> URL: https://issues.apache.org/jira/browse/AIRFLOW-1252
> Project: Apache Airflow
>  Issue Type: Bug
>  Components: api
>Affects Versions: 1.8.0, 1.8.1, 1.9.0, 1.10.0, 2.0.0
>Reporter: Sergio Herrera
>Assignee: Sergio Herrera
>Priority: Major
>  Labels: api
> Fix For: 2.0.0, 1.10.2
>
>
> When someone calls the endpoint _POST 
> :/api/experimental/dags//dag_runs {}_, Airflow never runs 
> that request if its body contains _conf_.
>  This occurs due to a mismatch between types when calling the function 
> _trigger_dag()_, which is also used by the *CLI*. That function performs a 
> _json.loads(conf)_ because from the CLI the type of conf is a _string_, but, on 
> the other side, from the *experimental API*, that type is a _dict_ (because the _Json_ 
> is processed beforehand to get all data, such as execution_date).
> There are two possibilities:
>  1. Look for every use of the _trigger_dag()_ function and do the _Json_ parsing 
> outside the function.
>  2. In the *experimental API*, put the conf in a string (with _json.dumps()_) 
> so that _trigger_dag()_ can transform it into a _dict_.
> I have implemented the second option, so I can make a PR with that if you 
> want.
> Thank you a lot.
> EDIT: Also, there are currently no tests which use a conf in the Json passed 
> through the request.
> Examples:
>  - Before fix (escaped json):
> {noformat}
> POST /api/experimental/dags/test_conf/dag_runs HTTP/1.1
> Content-Type: application/json
> {
>   "conf": "{
> \"k1\": \"v1\",
> \"k2\": \"v2\",
> \"k3\": [\"av1\", \"av2\", \"av3\"],
> \"k4\": {
>   \"sk1\": \"sv1\",
>   \"sk2\": \"sv2\"
> }
>   }"  
> }
> {noformat}
>  - After fix (pure json):
> {noformat}
> POST /api/experimental/dags/test_conf/dag_runs HTTP/1.1
> Content-Type: application/json
> {
>   "conf": {
> "k1": "v1",
> "k2": "v2",
> "k3": ["av1", "av2", "av3"],
> "k4": {
>   "sk1": "sv1",
>   "sk2": "sv2"
> }
>   }
> }
> {noformat}
>  



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[GitHub] Fokko commented on issue #3960: [AIRFLOW-2966] Catch ApiException in the Kubernetes Executor

2018-11-18 Thread GitBox
Fokko commented on issue #3960: [AIRFLOW-2966] Catch ApiException in the 
Kubernetes Executor
URL: 
https://github.com/apache/incubator-airflow/pull/3960#issuecomment-439722764
 
 
   @johnhofman I had to revert the commit again. The CI wasn't happy. Please 
take a look and open a new PR: 
https://travis-ci.org/apache/incubator-airflow/jobs/456661023
   Please tag me when I can take a look.


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] codecov-io edited a comment on issue #4139: [AIRFLOW-2715] Pick up the region setting while launching Dataflow templates

2018-11-18 Thread GitBox
codecov-io edited a comment on issue #4139: [AIRFLOW-2715] Pick up the region 
setting while launching Dataflow templates
URL: 
https://github.com/apache/incubator-airflow/pull/4139#issuecomment-438279332
 
 
   # 
[Codecov](https://codecov.io/gh/apache/incubator-airflow/pull/4139?src=pr=h1)
 Report
   > Merging 
[#4139](https://codecov.io/gh/apache/incubator-airflow/pull/4139?src=pr=desc)
 into 
[master](https://codecov.io/gh/apache/incubator-airflow/commit/e6291e8d50701b80d37850a63c555a42a6134775?src=pr=desc)
 will **decrease** coverage by `4.53%`.
   > The diff coverage is `n/a`.
   
   [![Impacted file tree 
graph](https://codecov.io/gh/apache/incubator-airflow/pull/4139/graphs/tree.svg?width=650=WdLKlKHOAU=150=pr)](https://codecov.io/gh/apache/incubator-airflow/pull/4139?src=pr=tree)
   
   ```diff
   @@            Coverage Diff            @@
   ##           master    #4139      +/-   ##
   =========================================
   - Coverage   77.66%   73.12%   -4.54%    
   =========================================
     Files         199      199             
     Lines       16290    17807    +1517    
   =========================================
   + Hits        12652    13022     +370    
   - Misses       3638     4785    +1147
   ```
   
   
   | [Impacted 
Files](https://codecov.io/gh/apache/incubator-airflow/pull/4139?src=pr=tree) 
| Coverage Δ | |
   |---|---|---|
   | 
[airflow/www\_rbac/security.py](https://codecov.io/gh/apache/incubator-airflow/pull/4139/diff?src=pr=tree#diff-YWlyZmxvdy93d3dfcmJhYy9zZWN1cml0eS5weQ==)
 | `63.88% <0%> (-28.73%)` | :arrow_down: |
   | 
[airflow/operators/docker\_operator.py](https://codecov.io/gh/apache/incubator-airflow/pull/4139/diff?src=pr=tree#diff-YWlyZmxvdy9vcGVyYXRvcnMvZG9ja2VyX29wZXJhdG9yLnB5)
 | `75% <0%> (-22.68%)` | :arrow_down: |
   | 
[airflow/operators/bash\_operator.py](https://codecov.io/gh/apache/incubator-airflow/pull/4139/diff?src=pr=tree#diff-YWlyZmxvdy9vcGVyYXRvcnMvYmFzaF9vcGVyYXRvci5weQ==)
 | `70% <0%> (-21.38%)` | :arrow_down: |
   | 
[airflow/models.py](https://codecov.io/gh/apache/incubator-airflow/pull/4139/diff?src=pr=tree#diff-YWlyZmxvdy9tb2RlbHMucHk=)
 | `70.95% <0%> (-21.29%)` | :arrow_down: |
   | 
[airflow/api/common/experimental/trigger\_dag.py](https://codecov.io/gh/apache/incubator-airflow/pull/4139/diff?src=pr=tree#diff-YWlyZmxvdy9hcGkvY29tbW9uL2V4cGVyaW1lbnRhbC90cmlnZ2VyX2RhZy5weQ==)
 | `80.39% <0%> (-19.61%)` | :arrow_down: |
   | 
[airflow/www\_rbac/views.py](https://codecov.io/gh/apache/incubator-airflow/pull/4139/diff?src=pr=tree#diff-YWlyZmxvdy93d3dfcmJhYy92aWV3cy5weQ==)
 | `65.24% <0%> (-7.08%)` | :arrow_down: |
   | 
[airflow/jobs.py](https://codecov.io/gh/apache/incubator-airflow/pull/4139/diff?src=pr=tree#diff-YWlyZmxvdy9qb2JzLnB5)
 | `77.36% <0%> (+0.27%)` | :arrow_up: |
   
   --
   
   [Continue to review full report at 
Codecov](https://codecov.io/gh/apache/incubator-airflow/pull/4139?src=pr=continue).
   > **Legend** - [Click here to learn 
more](https://docs.codecov.io/docs/codecov-delta)
   > `Δ = absolute  (impact)`, `ø = not affected`, `? = missing data`
   > Powered by 
[Codecov](https://codecov.io/gh/apache/incubator-airflow/pull/4139?src=pr=footer).
 Last update 
[e6291e8...75f8141](https://codecov.io/gh/apache/incubator-airflow/pull/4139?src=pr=lastupdated).
 Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments).
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] Fokko commented on a change in pull request #4121: [AIRFLOW-2568] Azure Container Instances operator

2018-11-18 Thread GitBox
Fokko commented on a change in pull request #4121: [AIRFLOW-2568] Azure 
Container Instances operator
URL: https://github.com/apache/incubator-airflow/pull/4121#discussion_r234454702
 
 

 ##
 File path: docs/integration.rst
 ##
 @@ -186,6 +186,43 @@ AzureDataLakeStorageListOperator
 
 .. autoclass:: airflow.contrib.operators.adls_list_operator.AzureDataLakeStorageListOperator
 
+Azure Container Instances
+'''''''''''''''''''''''''
+
+Azure Container Instances provides a method to run a docker container without having to worry
+about managing infrastructure. The AzureContainerInstanceHook requires a service principal. The
+credentials for this principal can either be defined in the extra field `key_path`, as an
+environment variable named `AZURE_AUTH_LOCATION`,
+or by providing a login/password and tenantId in extras.
+
+The AzureContainerRegistryHook requires a host/login/password to be defined in the connection.
+
+- :ref:`AzureContainerInstancesOperator` : Start/Monitor a new ACI.
+- :ref:`AzureContainerInstanceHook` : Wrapper around a single ACI.
+- :ref:`AzureContainerRegistryHook` : Wrapper around a ACR
+- :ref:`AzureContainerVolumeHook` : Wrapper around Container Volumes
+
+AzureContainerInstancesOperator
+"""""""""""""""""""""""""""""""
+
+.. autoclass:: airflow.contrib.operators.azure_container_instances_operator.AzureContainerInstancesOperator
+
+AzureContainerInstanceHook
+""""""""""""""""""""""""""
+
+.. autoclass:: airflow.contrib.hooks.azure_container_hook.AzureContainerInstanceHook
 
 Review comment:
   I don't think these resolve? 


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] Fokko commented on a change in pull request #4121: [AIRFLOW-2568] Azure Container Instances operator

2018-11-18 Thread GitBox
Fokko commented on a change in pull request #4121: [AIRFLOW-2568] Azure 
Container Instances operator
URL: https://github.com/apache/incubator-airflow/pull/4121#discussion_r234454674
 
 

 ##
 File path: airflow/models.py
 ##
 @@ -805,6 +806,9 @@ def get_hook(self):
         elif self.conn_type == 'azure_data_lake':
             from airflow.contrib.hooks.azure_data_lake_hook import AzureDataLakeHook
             return AzureDataLakeHook(azure_data_lake_conn_id=self.conn_id)
+        elif self.conn_type == 'azure_container_instances':
+            from airflow.contrib.hooks.azure_data_lake_hook import AzureContainerHook
 
 Review comment:
   Where is the `AzureContainerHook`?


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] Fokko commented on a change in pull request #4121: [AIRFLOW-2568] Azure Container Instances operator

2018-11-18 Thread GitBox
Fokko commented on a change in pull request #4121: [AIRFLOW-2568] Azure 
Container Instances operator
URL: https://github.com/apache/incubator-airflow/pull/4121#discussion_r234454490
 
 

 ##
 File path: airflow/contrib/hooks/azure_container_registry_hook.py
 ##
 @@ -0,0 +1,32 @@
+# -*- coding: utf-8 -*-
+#
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+
+from airflow.hooks.base_hook import BaseHook
+from azure.mgmt.containerinstance.models import ImageRegistryCredential
+
+
+class AzureContainerRegistryHook(BaseHook):
+
+    def __init__(self, conn_id='azure_registry'):
+        self.conn_id = conn_id
+        self.connection = self.get_conn()
 
 Review comment:
   Unused.


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[jira] [Resolved] (AIRFLOW-1252) Experimental API - exception when conf is present in JSON body

2018-11-18 Thread Fokko Driesprong (JIRA)


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-1252?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Fokko Driesprong resolved AIRFLOW-1252.
---
   Resolution: Fixed
Fix Version/s: 2.0.0

> Experimental API - exception when conf is present in JSON body
> --
>
> Key: AIRFLOW-1252
> URL: https://issues.apache.org/jira/browse/AIRFLOW-1252
> Project: Apache Airflow
>  Issue Type: Bug
>  Components: api
>Affects Versions: 1.8.0, 1.8.1, 1.9.0, 1.10.0, 2.0.0
>Reporter: Sergio Herrera
>Assignee: Sergio Herrera
>Priority: Major
>  Labels: api
> Fix For: 2.0.0
>
>
> When someone calls the endpoint _POST 
> :/api/experimental/dags//dag_runs {}_, Airflow never runs 
> that request if its body contains _conf_.
>  This occurs due to a mismatch between types when calling the function 
> _trigger_dag()_, which is also used by the *CLI*. That function performs a 
> _json.loads(conf)_ because from the CLI the type of conf is a _string_, but, on 
> the other side, from the *experimental API*, that type is a _dict_ (because the _Json_ 
> is processed beforehand to get all data, such as execution_date).
> There are two possibilities:
>  1. Look for every use of the _trigger_dag()_ function and do the _Json_ parsing 
> outside the function.
>  2. In the *experimental API*, put the conf in a string (with _json.dumps()_) 
> so that _trigger_dag()_ can transform it into a _dict_.
> I have implemented the second option, so I can make a PR with that if you 
> want.
> Thank you a lot.
> EDIT: Also, there are currently no tests which use a conf in the Json passed 
> through the request.
> Examples:
>  - Before fix (escaped json):
> {noformat}
> POST /api/experimental/dags/test_conf/dag_runs HTTP/1.1
> Content-Type: application/json
> {
>   "conf": "{
> \"k1\": \"v1\",
> \"k2\": \"v2\",
> \"k3\": [\"av1\", \"av2\", \"av3\"],
> \"k4\": {
>   \"sk1\": \"sv1\",
>   \"sk2\": \"sv2\"
> }
>   }"  
> }
> {noformat}
>  - After fix (pure json):
> {noformat}
> POST /api/experimental/dags/test_conf/dag_runs HTTP/1.1
> Content-Type: application/json
> {
>   "conf": {
> "k1": "v1",
> "k2": "v2",
> "k3": ["av1", "av2", "av3"],
> "k4": {
>   "sk1": "sv1",
>   "sk2": "sv2"
> }
>   }
> }
> {noformat}
>  



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[GitHub] Fokko commented on issue #3815: [AIRFLOW-2973] Add Python 3.6 to Supported Prog Langs

2018-11-18 Thread GitBox
Fokko commented on issue #3815: [AIRFLOW-2973] Add Python 3.6 to Supported Prog 
Langs
URL: 
https://github.com/apache/incubator-airflow/pull/3815#issuecomment-439710806
 
 
   Give me some time to look into this.


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] Fokko commented on issue #4139: [AIRFLOW-2715] Pick up the region setting while launching Dataflow templates

2018-11-18 Thread GitBox
Fokko commented on issue #4139: [AIRFLOW-2715] Pick up the region setting while 
launching Dataflow templates
URL: 
https://github.com/apache/incubator-airflow/pull/4139#issuecomment-439710673
 
 
   Rerunning the failed tests.


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[jira] [Commented] (AIRFLOW-1252) Experimental API - exception when conf is present in JSON body

2018-11-18 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-1252?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16690980#comment-16690980
 ] 

ASF GitHub Bot commented on AIRFLOW-1252:
-

Fokko closed pull request #2334: [AIRFLOW-1252] API - Fix when conf is in JSON 
body
URL: https://github.com/apache/incubator-airflow/pull/2334
 
 
   

This is a PR merged from a forked repository.
As GitHub hides the original diff on merge, it is displayed below for
the sake of provenance:

As this is a foreign pull request (from a fork), the diff is supplied
below (as it won't show otherwise due to GitHub magic):

diff --git a/airflow/api/common/experimental/trigger_dag.py b/airflow/api/common/experimental/trigger_dag.py
index d7353f66d1..0268752565 100644
--- a/airflow/api/common/experimental/trigger_dag.py
+++ b/airflow/api/common/experimental/trigger_dag.py
@@ -59,7 +59,10 @@ def _trigger_dag(
 
     run_conf = None
     if conf:
-        run_conf = json.loads(conf)
+        if type(conf) is dict:
+            run_conf = conf
+        else:
+            run_conf = json.loads(conf)
 
     triggers = list()
     dags_to_trigger = list()
diff --git a/tests/api/common/experimental/trigger_dag_tests.py b/tests/api/common/experimental/trigger_dag_tests.py
index d6354840e2..fc1d7cda8e 100644
--- a/tests/api/common/experimental/trigger_dag_tests.py
+++ b/tests/api/common/experimental/trigger_dag_tests.py
@@ -19,6 +19,7 @@
 
 import mock
 import unittest
+import json
 
 from airflow.exceptions import AirflowException
 from airflow.models import DAG, DagRun
@@ -88,6 +89,44 @@ def test_trigger_dag_include_subdags(self, dag_bag_mock, dag_run_mock, dag_mock)
 
         self.assertEqual(3, len(triggers))
 
+    @mock.patch('airflow.models.DagBag')
+    def test_trigger_dag_with_str_conf(self, dag_bag_mock):
+        dag_id = "trigger_dag_with_str_conf"
+        dag = DAG(dag_id)
+        dag_bag_mock.dags = [dag_id]
+        dag_bag_mock.get_dag.return_value = dag
+        conf = "{\"foo\": \"bar\"}"
+        dag_run = DagRun()
+        triggers = _trigger_dag(
+            dag_id,
+            dag_bag_mock,
+            dag_run,
+            run_id=None,
+            conf=conf,
+            execution_date=None,
+            replace_microseconds=True)
+
+        self.assertEquals(triggers[0].conf, json.loads(conf))
+
+    @mock.patch('airflow.models.DagBag')
+    def test_trigger_dag_with_dict_conf(self, dag_bag_mock):
+        dag_id = "trigger_dag_with_dict_conf"
+        dag = DAG(dag_id)
+        dag_bag_mock.dags = [dag_id]
+        dag_bag_mock.get_dag.return_value = dag
+        conf = dict(foo="bar")
+        dag_run = DagRun()
+        triggers = _trigger_dag(
+            dag_id,
+            dag_bag_mock,
+            dag_run,
+            run_id=None,
+            conf=conf,
+            execution_date=None,
+            replace_microseconds=True)
+
+        self.assertEquals(triggers[0].conf, conf)
+
 
 if __name__ == '__main__':
     unittest.main()


 


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


> Experimental API - exception when conf is present in JSON body
> --
>
> Key: AIRFLOW-1252
> URL: https://issues.apache.org/jira/browse/AIRFLOW-1252
> Project: Apache Airflow
>  Issue Type: Bug
>  Components: api
>Affects Versions: 1.8.0, 1.8.1, 1.9.0, 1.10.0, 2.0.0
>Reporter: Sergio Herrera
>Assignee: Sergio Herrera
>Priority: Major
>  Labels: api
> Fix For: 2.0.0
>
>
> When someone calls the endpoint _POST 
> :/api/experimental/dags//dag_runs {}_, Airflow never runs 
> that request if its body contains _conf_.
>  This occurs due to a mismatch between types when calling the function 
> _trigger_dag()_, which is also used by the *CLI*. That function performs a 
> _json.loads(conf)_ because from the CLI the type of conf is _string_, while, 
> on the other side, from the *experimental API*, that type is _dict_ (because 
> the _Json_ body is processed beforehand to get all data, such as 
> execution_date).
> There are two possibilities:
>  1. Look for every use of the _trigger_dag()_ function and move the _Json_ 
> parsing outside the function.
>  2. In the *experimental API*, put the conf into a string (with 
> _json.dumps()_) to allow _trigger_dag()_ to transform it into a _dict_.
> I have implemented the second option, so I can make a PR with that if you 
> want.
> Thank you a lot.
> EDIT: Also, there are currently no tests which use conf in the Json passed 
> through the request.
> Examples:
>  - Before fix (escaped json):
> {noformat}
> POST 
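For reference, a minimal sketch of option 2 from the description above: normalise conf on the API side so that _trigger_dag()_ can keep calling json.loads(). The helper name is made up; the change that was actually merged is the type check shown in the diff earlier in this message.

{code:python}
import json


def normalize_conf(conf):
    # The CLI passes conf as a JSON string; the experimental API passes a dict.
    # Serialising the dict back to a string keeps trigger_dag() unchanged.
    if isinstance(conf, dict):
        return json.dumps(conf)
    return conf
{code}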

[GitHub] Fokko closed pull request #2334: [AIRFLOW-1252] API - Fix when conf is in JSON body

2018-11-18 Thread GitBox
Fokko closed pull request #2334: [AIRFLOW-1252] API - Fix when conf is in JSON 
body
URL: https://github.com/apache/incubator-airflow/pull/2334
 
 
   

This is a PR merged from a forked repository.
As GitHub hides the original diff on merge, it is displayed below for
the sake of provenance:

As this is a foreign pull request (from a fork), the diff is supplied
below (as it won't show otherwise due to GitHub magic):

diff --git a/airflow/api/common/experimental/trigger_dag.py 
b/airflow/api/common/experimental/trigger_dag.py
index d7353f66d1..0268752565 100644
--- a/airflow/api/common/experimental/trigger_dag.py
+++ b/airflow/api/common/experimental/trigger_dag.py
@@ -59,7 +59,10 @@ def _trigger_dag(
 
     run_conf = None
     if conf:
-        run_conf = json.loads(conf)
+        if type(conf) is dict:
+            run_conf = conf
+        else:
+            run_conf = json.loads(conf)
 
     triggers = list()
     dags_to_trigger = list()
diff --git a/tests/api/common/experimental/trigger_dag_tests.py 
b/tests/api/common/experimental/trigger_dag_tests.py
index d6354840e2..fc1d7cda8e 100644
--- a/tests/api/common/experimental/trigger_dag_tests.py
+++ b/tests/api/common/experimental/trigger_dag_tests.py
@@ -19,6 +19,7 @@
 
 import mock
 import unittest
+import json
 
 from airflow.exceptions import AirflowException
 from airflow.models import DAG, DagRun
@@ -88,6 +89,44 @@ def test_trigger_dag_include_subdags(self, dag_bag_mock, dag_run_mock, dag_mock)
 
         self.assertEqual(3, len(triggers))
 
+    @mock.patch('airflow.models.DagBag')
+    def test_trigger_dag_with_str_conf(self, dag_bag_mock):
+        dag_id = "trigger_dag_with_str_conf"
+        dag = DAG(dag_id)
+        dag_bag_mock.dags = [dag_id]
+        dag_bag_mock.get_dag.return_value = dag
+        conf = "{\"foo\": \"bar\"}"
+        dag_run = DagRun()
+        triggers = _trigger_dag(
+            dag_id,
+            dag_bag_mock,
+            dag_run,
+            run_id=None,
+            conf=conf,
+            execution_date=None,
+            replace_microseconds=True)
+
+        self.assertEquals(triggers[0].conf, json.loads(conf))
+
+    @mock.patch('airflow.models.DagBag')
+    def test_trigger_dag_with_dict_conf(self, dag_bag_mock):
+        dag_id = "trigger_dag_with_dict_conf"
+        dag = DAG(dag_id)
+        dag_bag_mock.dags = [dag_id]
+        dag_bag_mock.get_dag.return_value = dag
+        conf = dict(foo="bar")
+        dag_run = DagRun()
+        triggers = _trigger_dag(
+            dag_id,
+            dag_bag_mock,
+            dag_run,
+            run_id=None,
+            conf=conf,
+            execution_date=None,
+            replace_microseconds=True)
+
+        self.assertEquals(triggers[0].conf, conf)
+
 
 if __name__ == '__main__':
     unittest.main()


 


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] Fokko commented on issue #2799: [AIRFLOW-1836] airflow uses OAuth Provider keycloak

2018-11-18 Thread GitBox
Fokko commented on issue #2799: [AIRFLOW-1836] airflow uses OAuth Provider 
keycloak
URL: 
https://github.com/apache/incubator-airflow/pull/2799#issuecomment-439710457
 
 
   I think it is hard to implement #3941 generically, i.e. to have some logic 
that tests an arbitrary connection. Implementing OAuth makes perfect sense. 
What do you suggest @ron819?


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[jira] [Closed] (AIRFLOW-2966) KubernetesExecutor + namespace quotas kills scheduler if the pod can't be launched

2018-11-18 Thread Fokko Driesprong (JIRA)


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-2966?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Fokko Driesprong closed AIRFLOW-2966.
-
   Resolution: Fixed
 Assignee: John Hofman
Fix Version/s: 2.0.0

> KubernetesExecutor + namespace quotas kills scheduler if the pod can't be 
> launched
> --
>
> Key: AIRFLOW-2966
> URL: https://issues.apache.org/jira/browse/AIRFLOW-2966
> Project: Apache Airflow
>  Issue Type: Bug
>  Components: scheduler
>Affects Versions: 2.0.0
> Environment: Kubernetes 1.9.8
>Reporter: John Hofman
>Assignee: John Hofman
>Priority: Major
> Fix For: 2.0.0
>
>
> When running Airflow in Kubernetes with the KubernetesExecutor and resource 
> quotas set on the namespace Airflow is deployed in, the scheduler can crash: 
> if it tries to launch a pod that exceeds the namespace limits, it gets an 
> ApiException from the Kubernetes client, which kills the scheduler.
> This stack trace is an example of the ApiException from the kubernetes client:
> {code:java}
> [2018-08-27 09:51:08,516] {pod_launcher.py:58} ERROR - Exception when 
> attempting to create Namespaced Pod.
> Traceback (most recent call last):
> File "/src/apache-airflow/airflow/contrib/kubernetes/pod_launcher.py", line 
> 55, in run_pod_async
> resp = self._client.create_namespaced_pod(body=req, namespace=pod.namespace)
> File 
> "/usr/local/lib/python3.6/site-packages/kubernetes/client/apis/core_v1_api.py",
>  line 6057, in create_namespaced_pod
> (data) = self.create_namespaced_pod_with_http_info(namespace, body, **kwargs)
> File 
> "/usr/local/lib/python3.6/site-packages/kubernetes/client/apis/core_v1_api.py",
>  line 6142, in create_namespaced_pod_with_http_info
> collection_formats=collection_formats)
> File 
> "/usr/local/lib/python3.6/site-packages/kubernetes/client/api_client.py", 
> line 321, in call_api
> _return_http_data_only, collection_formats, _preload_content, 
> _request_timeout)
> File 
> "/usr/local/lib/python3.6/site-packages/kubernetes/client/api_client.py", 
> line 155, in __call_api
> _request_timeout=_request_timeout)
> File 
> "/usr/local/lib/python3.6/site-packages/kubernetes/client/api_client.py", 
> line 364, in request
> body=body)
> File "/usr/local/lib/python3.6/site-packages/kubernetes/client/rest.py", line 
> 266, in POST
> body=body)
> File "/usr/local/lib/python3.6/site-packages/kubernetes/client/rest.py", line 
> 222, in request
> raise ApiException(http_resp=r)
> kubernetes.client.rest.ApiException: (403)
> Reason: Forbidden
> HTTP response headers: HTTPHeaderDict({'Audit-Id': 
> 'b00e2cbb-bdb2-41f3-8090-824aee79448c', 'Content-Type': 'application/json', 
> 'Date': 'Mon, 27 Aug 2018 09:51:08 GMT', 'Content-Length': '410'})
> HTTP response body: 
> {"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"pods
>  \"podname-ec366e89ef934d91b2d3ffe96234a725\" is forbidden: exceeded quota: 
> compute-resources, requested: limits.memory=4Gi, used: limits.memory=6508Mi, 
> limited: 
> limits.memory=10Gi","reason":"Forbidden","details":{"name":"podname-ec366e89ef934d91b2d3ffe96234a725","kind":"pods"},"code":403}{code}
>  
> I would expect the scheduler to catch the Exception and at least mark the 
> task as failed, or better yet retry the task later.
>  
>  
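For reference, a minimal sketch of the catch-and-requeue behaviour requested here, simplified from the fix that was merged (shown in the diffs further below); the function and its arguments are illustrative stand-ins for the executor's own attributes.

{code:python}
from kubernetes.client.rest import ApiException


def run_or_requeue(task_queue, kube_scheduler, log):
    # Instead of letting the quota ApiException propagate and kill the
    # scheduler, log it and put the task back on the queue to retry later.
    task = task_queue.get()
    try:
        kube_scheduler.run_next(task)
    except ApiException:
        log.exception('ApiException when attempting to run task, re-queueing.')
        task_queue.put(task)
{code}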



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (AIRFLOW-2966) KubernetesExecutor + namespace quotas kills scheduler if the pod can't be launched

2018-11-18 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-2966?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16690971#comment-16690971
 ] 

ASF GitHub Bot commented on AIRFLOW-2966:
-

Fokko closed pull request #3960: [AIRFLOW-2966] Catch ApiException in the 
Kubernetes Executor
URL: https://github.com/apache/incubator-airflow/pull/3960
 
 
   

This is a PR merged from a forked repository.
As GitHub hides the original diff on merge, it is displayed below for
the sake of provenance:

As this is a foreign pull request (from a fork), the diff is supplied
below (as it won't show otherwise due to GitHub magic):

diff --git a/airflow/config_templates/default_test.cfg 
b/airflow/config_templates/default_test.cfg
index f9279cce54..2630a60ce4 100644
--- a/airflow/config_templates/default_test.cfg
+++ b/airflow/config_templates/default_test.cfg
@@ -125,3 +125,6 @@ hide_sensitive_variable_fields = True
 elasticsearch_host =
 elasticsearch_log_id_template = 
{{dag_id}}-{{task_id}}-{{execution_date}}-{{try_number}}
 elasticsearch_end_of_log_mark = end_of_log
+
+[kubernetes]
+dags_volume_claim = default
diff --git a/airflow/contrib/executors/kubernetes_executor.py 
b/airflow/contrib/executors/kubernetes_executor.py
index de1f9f4235..f9e350d303 100644
--- a/airflow/contrib/executors/kubernetes_executor.py
+++ b/airflow/contrib/executors/kubernetes_executor.py
@@ -599,8 +599,14 @@ def sync(self):
             last_resource_version, session=self._session)
 
         if not self.task_queue.empty():
-            key, command, kube_executor_config = self.task_queue.get()
-            self.kube_scheduler.run_next((key, command, kube_executor_config))
+            task = self.task_queue.get()
+
+            try:
+                self.kube_scheduler.run_next(task)
+            except ApiException:
+                self.log.exception('ApiException when attempting ' +
+                                   'to run task, re-queueing.')
+                self.task_queue.put(task)
 
     def _change_state(self, key, state, pod_id):
         if state != State.RUNNING:
diff --git a/tests/contrib/executors/test_kubernetes_executor.py 
b/tests/contrib/executors/test_kubernetes_executor.py
index c203e18d5c..905beeec40 100644
--- a/tests/contrib/executors/test_kubernetes_executor.py
+++ b/tests/contrib/executors/test_kubernetes_executor.py
@@ -18,10 +18,13 @@
 import re
 import string
 import random
+from urllib3 import HTTPResponse
 from datetime import datetime
 
 try:
+    from kubernetes.client.rest import ApiException
     from airflow.contrib.executors.kubernetes_executor import AirflowKubernetesScheduler
+    from airflow.contrib.executors.kubernetes_executor import KubernetesExecutor
     from airflow.contrib.kubernetes.worker_configuration import WorkerConfiguration
 except ImportError:
     AirflowKubernetesScheduler = None
@@ -81,6 +84,7 @@ class TestKubernetesWorkerConfiguration(unittest.TestCase):
     Tests that if dags_volume_subpath/logs_volume_subpath configuration
     options are passed to worker pod config
     """
+
     def setUp(self):
         if AirflowKubernetesScheduler is None:
             self.skipTest("kubernetes python package is not installed")
@@ -152,5 +156,61 @@ def test_worker_environment_when_dags_folder_specified(self):
         self.assertEqual(dags_folder, env['AIRFLOW__CORE__DAGS_FOLDER'])
 
 
+class TestKubernetesExecutor(unittest.TestCase):
+    """
+    Tests if an ApiException from the Kube Client will cause the task to
+    be rescheduled.
+    """
+    @unittest.skipIf(AirflowKubernetesScheduler is None,
+                     'kubernetes python package is not installed')
+    @mock.patch('airflow.contrib.executors.kubernetes_executor.KubernetesJobWatcher')
+    @mock.patch('airflow.contrib.executors.kubernetes_executor.get_kube_client')
+    def test_run_next_exception(self, mock_get_kube_client, mock_kubernetes_job_watcher):
+
+        # When a quota is exceeded this is the ApiException we get
+        r = HTTPResponse()
+        r.body = {
+            "kind": "Status",
+            "apiVersion": "v1",
+            "metadata": {},
+            "status": "Failure",
+            "message": "pods \"podname\" is forbidden: " +
+                       "exceeded quota: compute-resources, " +
+                       "requested: limits.memory=4Gi, " +
+                       "used: limits.memory=6508Mi, " +
+                       "limited: limits.memory=10Gi",
+            "reason": "Forbidden",
+            "details": {"name": "podname", "kind": "pods"},
+            "code": 403},
+        r.status = 403
+        r.reason = "Forbidden"
+
+        # A mock kube_client that throws errors when making a pod
+        mock_kube_client = mock.patch('kubernetes.client.CoreV1Api', autospec=True)
+        mock_kube_client.create_namespaced_pod = mock.MagicMock(
+            side_effect=ApiException(http_resp=r))
+        mock_get_kube_client.return_value = mock_kube_client
+
+

[GitHub] Fokko closed pull request #3960: [AIRFLOW-2966] Catch ApiException in the Kubernetes Executor

2018-11-18 Thread GitBox
Fokko closed pull request #3960: [AIRFLOW-2966] Catch ApiException in the 
Kubernetes Executor
URL: https://github.com/apache/incubator-airflow/pull/3960
 
 
   

This is a PR merged from a forked repository.
As GitHub hides the original diff on merge, it is displayed below for
the sake of provenance:

As this is a foreign pull request (from a fork), the diff is supplied
below (as it won't show otherwise due to GitHub magic):

diff --git a/airflow/config_templates/default_test.cfg 
b/airflow/config_templates/default_test.cfg
index f9279cce54..2630a60ce4 100644
--- a/airflow/config_templates/default_test.cfg
+++ b/airflow/config_templates/default_test.cfg
@@ -125,3 +125,6 @@ hide_sensitive_variable_fields = True
 elasticsearch_host =
 elasticsearch_log_id_template = 
{{dag_id}}-{{task_id}}-{{execution_date}}-{{try_number}}
 elasticsearch_end_of_log_mark = end_of_log
+
+[kubernetes]
+dags_volume_claim = default
diff --git a/airflow/contrib/executors/kubernetes_executor.py 
b/airflow/contrib/executors/kubernetes_executor.py
index de1f9f4235..f9e350d303 100644
--- a/airflow/contrib/executors/kubernetes_executor.py
+++ b/airflow/contrib/executors/kubernetes_executor.py
@@ -599,8 +599,14 @@ def sync(self):
             last_resource_version, session=self._session)
 
         if not self.task_queue.empty():
-            key, command, kube_executor_config = self.task_queue.get()
-            self.kube_scheduler.run_next((key, command, kube_executor_config))
+            task = self.task_queue.get()
+
+            try:
+                self.kube_scheduler.run_next(task)
+            except ApiException:
+                self.log.exception('ApiException when attempting ' +
+                                   'to run task, re-queueing.')
+                self.task_queue.put(task)
 
     def _change_state(self, key, state, pod_id):
         if state != State.RUNNING:
diff --git a/tests/contrib/executors/test_kubernetes_executor.py 
b/tests/contrib/executors/test_kubernetes_executor.py
index c203e18d5c..905beeec40 100644
--- a/tests/contrib/executors/test_kubernetes_executor.py
+++ b/tests/contrib/executors/test_kubernetes_executor.py
@@ -18,10 +18,13 @@
 import re
 import string
 import random
+from urllib3 import HTTPResponse
 from datetime import datetime
 
 try:
+    from kubernetes.client.rest import ApiException
     from airflow.contrib.executors.kubernetes_executor import AirflowKubernetesScheduler
+    from airflow.contrib.executors.kubernetes_executor import KubernetesExecutor
     from airflow.contrib.kubernetes.worker_configuration import WorkerConfiguration
 except ImportError:
     AirflowKubernetesScheduler = None
@@ -81,6 +84,7 @@ class TestKubernetesWorkerConfiguration(unittest.TestCase):
     Tests that if dags_volume_subpath/logs_volume_subpath configuration
     options are passed to worker pod config
     """
+
    def setUp(self):
         if AirflowKubernetesScheduler is None:
             self.skipTest("kubernetes python package is not installed")
@@ -152,5 +156,61 @@ def test_worker_environment_when_dags_folder_specified(self):
         self.assertEqual(dags_folder, env['AIRFLOW__CORE__DAGS_FOLDER'])
 
 
+class TestKubernetesExecutor(unittest.TestCase):
+    """
+    Tests if an ApiException from the Kube Client will cause the task to
+    be rescheduled.
+    """
+    @unittest.skipIf(AirflowKubernetesScheduler is None,
+                     'kubernetes python package is not installed')
+    @mock.patch('airflow.contrib.executors.kubernetes_executor.KubernetesJobWatcher')
+    @mock.patch('airflow.contrib.executors.kubernetes_executor.get_kube_client')
+    def test_run_next_exception(self, mock_get_kube_client, mock_kubernetes_job_watcher):
+
+        # When a quota is exceeded this is the ApiException we get
+        r = HTTPResponse()
+        r.body = {
+            "kind": "Status",
+            "apiVersion": "v1",
+            "metadata": {},
+            "status": "Failure",
+            "message": "pods \"podname\" is forbidden: " +
+                       "exceeded quota: compute-resources, " +
+                       "requested: limits.memory=4Gi, " +
+                       "used: limits.memory=6508Mi, " +
+                       "limited: limits.memory=10Gi",
+            "reason": "Forbidden",
+            "details": {"name": "podname", "kind": "pods"},
+            "code": 403},
+        r.status = 403
+        r.reason = "Forbidden"
+
+        # A mock kube_client that throws errors when making a pod
+        mock_kube_client = mock.patch('kubernetes.client.CoreV1Api', autospec=True)
+        mock_kube_client.create_namespaced_pod = mock.MagicMock(
+            side_effect=ApiException(http_resp=r))
+        mock_get_kube_client.return_value = mock_kube_client
+
+        kubernetesExecutor = KubernetesExecutor()
+        kubernetesExecutor.start()
+
+        # Execute a task while the Api Throws errors
+        kubernetesExecutor.execute_async(key=('dag', 'task', datetime.utcnow()),
+

[GitHub] Fokko commented on issue #4101: [AIRFLOW-3272] Add base grpc hook

2018-11-18 Thread GitBox
Fokko commented on issue #4101: [AIRFLOW-3272] Add base grpc hook
URL: 
https://github.com/apache/incubator-airflow/pull/4101#issuecomment-439708448
 
 
   Would it be possible to add some tests to this PR as well?


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] jmcarp commented on issue #4207: [WIP] Run celery integration test with redis broker.

2018-11-18 Thread GitBox
jmcarp commented on issue #4207: [WIP] Run celery integration test with redis 
broker.
URL: 
https://github.com/apache/incubator-airflow/pull/4207#issuecomment-439708132
 
 
   @Fokko: sorry, I didn't realize this was a paid travis account. Anyway, the 
idea here is to test the redis backend for celery, not the redis hook or 
sensor, which is why I changed the test configuration but not the test code. 
I'll continue with this locally or on my own travis for now.


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] Fokko commented on issue #4207: [WIP] Run celery integration test with redis broker.

2018-11-18 Thread GitBox
Fokko commented on issue #4207: [WIP] Run celery integration test with redis 
broker.
URL: 
https://github.com/apache/incubator-airflow/pull/4207#issuecomment-439707713
 
 
   Please consult with https://github.com/apache/incubator-airflow/pull/4090 
before you do any duplicate work :-)


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] Fokko commented on a change in pull request #4090: [AIRFLOW-3250] Fix for Redis Hook for not authorised connection calls

2018-11-18 Thread GitBox
Fokko commented on a change in pull request #4090: [AIRFLOW-3250] Fix for Redis 
Hook for not authorised connection calls
URL: https://github.com/apache/incubator-airflow/pull/4090#discussion_r234452826
 
 

 ##
 File path: tests/contrib/hooks/test_redis_hook.py
 ##
 @@ -19,32 +19,116 @@
 
 
 import unittest
-from mock import patch
-
+from mock import patch, MagicMock
 from airflow import configuration
 from airflow.contrib.hooks.redis_hook import RedisHook
 
 
 class TestRedisHook(unittest.TestCase):
+
     def setUp(self):
         configuration.load_test_config()
 
-    def test_get_conn(self):
+    @patch('airflow.contrib.hooks.redis_hook.StrictRedis')
+    @patch('airflow.contrib.hooks.redis_hook.RedisHook.get_connection')
+    def test_get_conn(self, redis_hook_get_connection_mock, StrictRedisMock):
+        HOST = 'localhost'
+        PORT = 6379
+        PASSWORD = 's3cret!'
+        DB = 0
+
+        extra_dejson_mock = MagicMock()
+        extra_dejson_mock.get.return_value = DB
+        connection_parameters = MagicMock()
+        connection_parameters.configure_mock(
+            host=HOST,
+            port=PORT,
+            password=PASSWORD,
+            extra_dejson=extra_dejson_mock)
+        redis_hook_get_connection_mock.return_value = connection_parameters
+
+        hook = RedisHook(redis_conn_id='redis_default')
+        self.assertEqual(hook.redis, None)
+
+        self.assertEqual(hook.host, None, 'host initialised as None.')
+        self.assertEqual(hook.port, None, 'port initialised as None.')
+        self.assertEqual(hook.password, None, 'password initialised as None.')
+        self.assertEqual(hook.db, None, 'db initialised as None.')
+
+        self.assertIs(hook.get_conn(), hook.get_conn(), 'Connection initialized only if None.')
+
+        StrictRedisMock.assert_called_once_with(
+            host=HOST,
+            port=PORT,
+            password=PASSWORD,
+            db=DB)
+
+    @patch('airflow.contrib.hooks.redis_hook.StrictRedis')
+    @patch('airflow.contrib.hooks.redis_hook.RedisHook.get_connection')
 
 Review comment:
   Since we now have a Redis running, we don't need to mock the hook anymore.
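   For illustration, a minimal sketch of what such an unmocked test could look like, assuming the CI provides a reachable Redis behind the `redis_default` connection (class and method names here are made up):

```python
import unittest

from airflow import configuration
from airflow.contrib.hooks.redis_hook import RedisHook


class TestRedisHookIntegration(unittest.TestCase):
    def setUp(self):
        configuration.load_test_config()

    def test_ping_without_mocks(self):
        # Talks to the real Redis service provided by the CI environment,
        # so no patching of get_connection or StrictRedis is needed.
        hook = RedisHook(redis_conn_id='redis_default')
        self.assertTrue(hook.get_conn().ping())
```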


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] Fokko commented on a change in pull request #4090: [AIRFLOW-3250] Fix for Redis Hook for not authorised connection calls

2018-11-18 Thread GitBox
Fokko commented on a change in pull request #4090: [AIRFLOW-3250] Fix for Redis 
Hook for not authorised connection calls
URL: https://github.com/apache/incubator-airflow/pull/4090#discussion_r234452823
 
 

 ##
 File path: tests/contrib/hooks/test_redis_hook.py
 ##
 @@ -19,32 +19,116 @@
 
 
 import unittest
-from mock import patch
-
+from mock import patch, MagicMock
 from airflow import configuration
 from airflow.contrib.hooks.redis_hook import RedisHook
 
 
 class TestRedisHook(unittest.TestCase):
+
     def setUp(self):
         configuration.load_test_config()
 
-    def test_get_conn(self):
+    @patch('airflow.contrib.hooks.redis_hook.StrictRedis')
+    @patch('airflow.contrib.hooks.redis_hook.RedisHook.get_connection')
+    def test_get_conn(self, redis_hook_get_connection_mock, StrictRedisMock):
+        HOST = 'localhost'
+        PORT = 6379
+        PASSWORD = 's3cret!'
+        DB = 0
+
+        extra_dejson_mock = MagicMock()
+        extra_dejson_mock.get.return_value = DB
+        connection_parameters = MagicMock()
+        connection_parameters.configure_mock(
+            host=HOST,
+            port=PORT,
+            password=PASSWORD,
+            extra_dejson=extra_dejson_mock)
+        redis_hook_get_connection_mock.return_value = connection_parameters
+
+        hook = RedisHook(redis_conn_id='redis_default')
+        self.assertEqual(hook.redis, None)
+
+        self.assertEqual(hook.host, None, 'host initialised as None.')
+        self.assertEqual(hook.port, None, 'port initialised as None.')
+        self.assertEqual(hook.password, None, 'password initialised as None.')
+        self.assertEqual(hook.db, None, 'db initialised as None.')
+
+        self.assertIs(hook.get_conn(), hook.get_conn(), 'Connection initialized only if None.')
+
+        StrictRedisMock.assert_called_once_with(
+            host=HOST,
+            port=PORT,
+            password=PASSWORD,
+            db=DB)
+
+    @patch('airflow.contrib.hooks.redis_hook.StrictRedis')
+    @patch('airflow.contrib.hooks.redis_hook.RedisHook.get_connection')
+    def test_get_conn_password_stays_none(self, redis_hook_get_connection_mock, StrictRedisMock):
+        HOST = 'localhost'
+        PORT = 6379
+        PASSWORD = 'None'
+        DB = 0
+
+        extra_dejson_mock = MagicMock()
+        extra_dejson_mock.get.return_value = DB
+        connection_parameters = MagicMock()
+        connection_parameters.configure_mock(
+            host=HOST,
+            port=PORT,
+            password=PASSWORD,
+            extra_dejson=extra_dejson_mock)
+        redis_hook_get_connection_mock.return_value = connection_parameters
+
+        hook = RedisHook(redis_conn_id='redis_default')
+        hook.get_conn()
+        self.assertEqual(hook.password, None)
+
+    @patch('airflow.contrib.hooks.redis_hook.RedisHook.get_connection')
+    def test_real_ping(self, redis_hook_get_connection_mock):
+        HOST = 'redis'
+        PORT = 6379
+        PASSWORD = 'None'
+        DB = 0
+
+        extra_dejson_mock = MagicMock()
+        extra_dejson_mock.get.return_value = DB
+        connection_parameters = MagicMock()
+        connection_parameters.configure_mock(
+            host=HOST,
+            port=PORT,
+            password=PASSWORD,
+            extra_dejson=extra_dejson_mock)
+        redis_hook_get_connection_mock.return_value = connection_parameters
+
         hook = RedisHook(redis_conn_id='redis_default')
-        self.assertEqual(hook.client, None)
-        self.assertEqual(
-            repr(hook.get_conn()),
-            (
-                'StrictRedis>>'
-            )
-        )
-
-    @patch("airflow.contrib.hooks.redis_hook.RedisHook.get_conn")
-    def test_first_conn_instantiation(self, get_conn):
+        redis = hook.get_conn()
+
+        self.assertTrue(redis.ping(), 'Connection to Redis with PING works.')
+
+    @patch('airflow.contrib.hooks.redis_hook.RedisHook.get_connection')
 
 Review comment:
   Since we now have a Redis running, we don't need to mock the hook anymore.


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] Fokko commented on issue #4134: [AIRFLOW-3213] Create ADLS to GCS operator

2018-11-18 Thread GitBox
Fokko commented on issue #4134: [AIRFLOW-3213] Create ADLS to GCS operator
URL: 
https://github.com/apache/incubator-airflow/pull/4134#issuecomment-439706018
 
 
   @bkvarda Can you rebase on master as well?


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] Fokko commented on issue #4207: [WIP] Run celery integration test with redis broker.

2018-11-18 Thread GitBox
Fokko commented on issue #4207: [WIP] Run celery integration test with redis 
broker.
URL: 
https://github.com/apache/incubator-airflow/pull/4207#issuecomment-439705915
 
 
   Hi @jmcarp 
   
   Thanks for picking this up. Preferably, run the tests against your own 
Travis instance first instead of using the Apache Travis: the Apache Travis 
hours are paid for, while running on your own fork is free :-)
   
   I don't see any changes in the tests themselves. Maybe it would be nice to 
add a test like: run a DAG with a Redis sensor and make sure that it passes, so 
we have some actual code that talks to the Redis instance on Travis (and 
therefore also exercises the RedisHook).
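   A rough sketch of such an integration DAG, assuming the `redis_default` connection points at the Redis service started for the build (DAG id, schedule and operator choice are assumptions, not part of this PR):

```python
from datetime import datetime

from airflow import DAG
from airflow.contrib.hooks.redis_hook import RedisHook
from airflow.operators.python_operator import PythonOperator


def ping_redis():
    # Exercises the RedisHook against the broker started for the build.
    assert RedisHook(redis_conn_id='redis_default').get_conn().ping()


dag = DAG('redis_integration_check',
          start_date=datetime(2018, 1, 1),
          schedule_interval=None)

ping = PythonOperator(task_id='ping_redis',
                      python_callable=ping_redis,
                      dag=dag)
```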


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[jira] [Created] (AIRFLOW-3363) Plugin templates not rendered by Flask appbuilder's baseview

2018-11-18 Thread Ran Zvi (JIRA)
Ran Zvi created AIRFLOW-3363:


 Summary: Plugin templates not rendered by Flask appbuilder's 
baseview
 Key: AIRFLOW-3363
 URL: https://issues.apache.org/jira/browse/AIRFLOW-3363
 Project: Apache Airflow
  Issue Type: Bug
  Components: plugins
Affects Versions: 1.10.0
 Environment: Docker 
Reporter: Ran Zvi


Hello, I'm having issues with the new F.A.B plugins. The documentation lacks 
the import for `AppBuilderBaseView`; however, after a bit of digging I found 
out it is probably `from flask_appbuilder import BaseView as AppBuilderBaseView`.

The next issue is that the class lacks a `render` function, and F.A.B only 
provides a `render_template` function which uses a preconfigured path for the 
`template` folder that isn't under the airflow `plugins` folder.

Thanks in advance!
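For reference, a minimal sketch of a plugin view using that import; it assumes Flask-AppBuilder resolves templates relative to the view's `template_folder`, and the folder and template names are made up:

{code:python}
from flask_appbuilder import BaseView as AppBuilderBaseView
from flask_appbuilder import expose


class MyPluginView(AppBuilderBaseView):
    # Point F.A.B at the plugin's own template folder instead of the
    # default one, which is not under the airflow plugins directory.
    template_folder = 'templates'

    @expose('/')
    def index(self):
        # render_template() is provided by F.A.B's BaseView.
        return self.render_template('my_plugin/index.html')
{code}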



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Created] (AIRFLOW-3362) Template to support jinja2 native python types

2018-11-18 Thread Duan Shiqiang (JIRA)
Duan Shiqiang created AIRFLOW-3362:
--

 Summary: Template to support jinja2 native python types
 Key: AIRFLOW-3362
 URL: https://issues.apache.org/jira/browse/AIRFLOW-3362
 Project: Apache Airflow
  Issue Type: Improvement
  Components: core, DAG
Reporter: Duan Shiqiang


In the latest Airflow (1.10.x), templates can only render into strings, which 
is fine most of the time, but it would be better to also support rendering into 
native Python types.

It would be very useful if the template system could render into native Python 
types like list, dictionary, etc., especially when using XCom to pass values 
between operators.

Jinja2 supports this feature from 2.10; more info can be found here: 
http://jinja.pocoo.org/docs/2.10/nativetypes/
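For illustration, a minimal standalone sketch of the Jinja2 feature being referred to (it is not wired into Airflow's templating here):

{code:python}
from jinja2.nativetypes import NativeEnvironment

env = NativeEnvironment()
result = env.from_string('{{ [1, 2] + [3] }}').render()
print(result, type(result))  # [1, 2, 3] <class 'list'>
{code}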



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[GitHub] codecov-io edited a comment on issue #4201: [AIRFLOW-3360] Make the DAGs search respect other querystring parameters

2018-11-18 Thread GitBox
codecov-io edited a comment on issue #4201: [AIRFLOW-3360] Make the DAGs search 
respect other querystring parameters
URL: 
https://github.com/apache/incubator-airflow/pull/4201#issuecomment-439543567
 
 
   # 
[Codecov](https://codecov.io/gh/apache/incubator-airflow/pull/4201?src=pr=h1)
 Report
   > Merging 
[#4201](https://codecov.io/gh/apache/incubator-airflow/pull/4201?src=pr=desc)
 into 
[master](https://codecov.io/gh/apache/incubator-airflow/commit/abd6c1b371f71ee42e6040fa5cb3787f64a66ddd?src=pr=desc)
 will **increase** coverage by `<.01%`.
   > The diff coverage is `n/a`.
   
   [![Impacted file tree 
graph](https://codecov.io/gh/apache/incubator-airflow/pull/4201/graphs/tree.svg?width=650=WdLKlKHOAU=150=pr)](https://codecov.io/gh/apache/incubator-airflow/pull/4201?src=pr=tree)
   
   ```diff
   @@Coverage Diff@@
   ##   master   #4201  +/-   ##
   =
   + Coverage77.7%   77.7%   +<.01% 
   =
 Files 199 199  
 Lines   16315   16315  
   =
   + Hits12677   12678   +1 
   + Misses   36383637   -1
   ```
   
   
   | [Impacted 
Files](https://codecov.io/gh/apache/incubator-airflow/pull/4201?src=pr=tree) 
| Coverage Δ | |
   |---|---|---|
   | 
[airflow/models.py](https://codecov.io/gh/apache/incubator-airflow/pull/4201/diff?src=pr=tree#diff-YWlyZmxvdy9tb2RlbHMucHk=)
 | `92.33% <0%> (+0.04%)` | :arrow_up: |
   
   --
   
   [Continue to review full report at 
Codecov](https://codecov.io/gh/apache/incubator-airflow/pull/4201?src=pr=continue).
   > **Legend** - [Click here to learn 
more](https://docs.codecov.io/docs/codecov-delta)
   > `Δ = absolute  (impact)`, `ø = not affected`, `? = missing data`
   > Powered by 
[Codecov](https://codecov.io/gh/apache/incubator-airflow/pull/4201?src=pr=footer).
 Last update 
[abd6c1b...fcc1763](https://codecov.io/gh/apache/incubator-airflow/pull/4201?src=pr=lastupdated).
 Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments).
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[jira] [Updated] (AIRFLOW-3361) Add the task_id to the Deprecation Warning when passing unsupported keywords to BaseOperator

2018-11-18 Thread Martin Black (JIRA)


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-3361?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Martin Black updated AIRFLOW-3361:
--
Description: In 2.0 passing invalid keywords to {{BaseOperator}} will be 
deprecated. Prior to that, there is a {{PendingDeprecationWarning}} raised, 
however it can be hard to track down which specific task is raising this 
warning.  (was: In 2.0 passing invalid keywords to {{BaseOperator}} will be 
deprecated. Prior to that, there is a {{PendingDeprecationWarning}} raised, 
however it can be hard to track down which specific task is raising this 
warning. This PR adds the {{task_id}} to aid this.)

> Add the task_id to the Deprecation Warning when passing unsupported keywords 
> to BaseOperator
> 
>
> Key: AIRFLOW-3361
> URL: https://issues.apache.org/jira/browse/AIRFLOW-3361
> Project: Apache Airflow
>  Issue Type: Task
>  Components: logging
>Affects Versions: 1.9.0
>Reporter: Martin Black
>Assignee: Martin Black
>Priority: Trivial
>
> In 2.0 passing invalid keywords to {{BaseOperator}} will be deprecated. Prior 
> to that, there is a {{PendingDeprecationWarning}} raised, however it can be 
> hard to track down which specific task is raising this warning.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[GitHub] mblack20 commented on issue #4030: [AIRFLOW-3361] Log the task_id in the PendingDeprecationWarning for BaseOperator

2018-11-18 Thread GitBox
mblack20 commented on issue #4030: [AIRFLOW-3361] Log the task_id in the 
PendingDeprecationWarning for BaseOperator
URL: 
https://github.com/apache/incubator-airflow/pull/4030#issuecomment-439688383
 
 
   Done, JIRA issue 
[AIRFLOW-3361](https://issues.apache.org/jira/browse/AIRFLOW-3361)


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[jira] [Work started] (AIRFLOW-3361) Add the task_id to the Deprecation Warning when passing unsupported keywords to BaseOperator

2018-11-18 Thread Martin Black (JIRA)


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-3361?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Work on AIRFLOW-3361 started by Martin Black.
-
> Add the task_id to the Deprecation Warning when passing unsupported keywords 
> to BaseOperator
> 
>
> Key: AIRFLOW-3361
> URL: https://issues.apache.org/jira/browse/AIRFLOW-3361
> Project: Apache Airflow
>  Issue Type: Task
>  Components: logging
>Affects Versions: 1.9.0
>Reporter: Martin Black
>Assignee: Martin Black
>Priority: Trivial
>
> In 2.0 passing invalid keywords to {{BaseOperator}} will be deprecated. Prior 
> to that, there is a {{PendingDeprecationWarning}} raised, however it can be 
> hard to track down which specific task is raising this warning. This PR adds 
> the {{task_id}} to aid this.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (AIRFLOW-3361) Add the task_id to the Deprecation Warning when passing unsupported keywords to BaseOperator

2018-11-18 Thread Martin Black (JIRA)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-3361?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16690880#comment-16690880
 ] 

Martin Black commented on AIRFLOW-3361:
---

PR https://github.com/apache/incubator-airflow/pull/4030

> Add the task_id to the Deprecation Warning when passing unsupported keywords 
> to BaseOperator
> 
>
> Key: AIRFLOW-3361
> URL: https://issues.apache.org/jira/browse/AIRFLOW-3361
> Project: Apache Airflow
>  Issue Type: Task
>  Components: logging
>Affects Versions: 1.9.0
>Reporter: Martin Black
>Assignee: Martin Black
>Priority: Trivial
>
> In 2.0 passing invalid keywords to {{BaseOperator}} will be deprecated. Prior 
> to that, there is a {{PendingDeprecationWarning}} raised, however it can be 
> hard to track down which specific task is raising this warning. This PR adds 
> the {{task_id}} to aid this.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Created] (AIRFLOW-3361) Add the task_id to the Deprecation Warning when passing unsupported keywords to BaseOperator

2018-11-18 Thread Martin Black (JIRA)
Martin Black created AIRFLOW-3361:
-

 Summary: Add the task_id to the Deprecation Warning when passing 
unsupported keywords to BaseOperator
 Key: AIRFLOW-3361
 URL: https://issues.apache.org/jira/browse/AIRFLOW-3361
 Project: Apache Airflow
  Issue Type: Task
  Components: logging
Affects Versions: 1.9.0
Reporter: Martin Black
Assignee: Martin Black


In 2.0 passing invalid keywords to {{BaseOperator}} will be deprecated. Prior 
to that, there is a {{PendingDeprecationWarning}} raised, however it can be 
hard to track down which specific task is raising this warning. This PR adds 
the {{task_id}} to aid this.
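For illustration, a minimal sketch of the kind of message change proposed; this is a hypothetical helper, not the actual BaseOperator code, and it assumes `task_id` and the leftover keyword arguments are available where the warning is raised:

{code:python}
import warnings


def warn_invalid_args(task_id, invalid_kwargs):
    warnings.warn(
        'Invalid arguments were passed to the operator for task '
        '"{}": {}. Support for passing such arguments will be dropped '
        'in Airflow 2.0.'.format(task_id, sorted(invalid_kwargs)),
        category=PendingDeprecationWarning
    )
{code}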



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (AIRFLOW-2775) Recent Tasks Label Misleading

2018-11-18 Thread jack (JIRA)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-2775?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16690833#comment-16690833
 ] 

jack commented on AIRFLOW-2775:
---

Isn't it better to just say "tasks from the most recent run"? Whether the run 
is active or not, the status of the task will tell (light green or dark green).

> Recent Tasks Label Misleading
> -
>
> Key: AIRFLOW-2775
> URL: https://issues.apache.org/jira/browse/AIRFLOW-2775
> Project: Apache Airflow
>  Issue Type: Improvement
>  Components: ui
>Affects Versions: 1.9.0
>Reporter: Matthias Niehoff
>Priority: Major
> Attachments: recent-tasks.png
>
>
> The label for the Recent Tasks in the DAGs UI is misleading. 
> The mouse-over label says: "Status of tasks from all active DAG runs or, if 
> not currently active, from most recent run."
> While the "not currently active" part is correct, the "active DAG runs" part 
> is incorrect: shown are the statuses of the tasks from all active DAG runs 
> plus the tasks from the most recent run. When the run finishes, the tasks 
> from the previous run are removed from the view and only the tasks of the 
> most recent run are shown.
> Either the label should be updated to reflect this,
> or
> only the tasks of the current run should be shown, without the tasks of the 
> last finished run.
>  
> Imho the second option makes more sense.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[GitHub] ashb commented on issue #4195: [AIRFLOW-3353] Pin redis version

2018-11-18 Thread GitBox
ashb commented on issue #4195: [AIRFLOW-3353] Pin redis version
URL: 
https://github.com/apache/incubator-airflow/pull/4195#issuecomment-439676172
 
 
   No, we aren't planning on releasing a new 1.9. Also, Celery is working on a 
proper fix on their side (and if you were going to install a new version of 
anything, it is just as easy to downgrade redis-py yourself).


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services