[GitHub] [airflow] Fokko commented on a change in pull request #6210: [AIRFLOW-5567] BaseAsyncOperator

2019-10-18 Thread GitBox
Fokko commented on a change in pull request #6210: [AIRFLOW-5567] 
BaseAsyncOperator
URL: https://github.com/apache/airflow/pull/6210#discussion_r336723480
 
 

 ##
 File path: airflow/models/base_async_operator.py
 ##
 @@ -0,0 +1,161 @@
+# -*- coding: utf-8 -*-
+#
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+
+"""
+Base Asynchronous Operator for kicking off long-running
+operations and polling for completion with reschedule mode.
+"""
+
+from abc import abstractmethod
+from typing import Dict, List, Optional, Union
+
+from airflow.models import SkipMixin, TaskReschedule
+from airflow.models.xcom import XCOM_EXTERNAL_RESOURCE_ID_KEY
+from airflow.sensors.base_sensor_operator import BaseSensorOperator
+from airflow.utils.decorators import apply_defaults
+
+
 
 Review comment:
   My pleasure. This is some cool functionality that a lot of users would 
benefit from. As @potiuk mentioned, let's see if we can make this happen :)
   
   So the scheduler puts it to `SCHEDULED`, and then the job will be pushed 
to the Celery queue (CeleryExecutor), Kubernetes (KubernetesExecutor) or a 
local queue (LocalExecutor), depending on your executor. When there are resources 
available, in the case of Celery, a worker will pick it up from the queue and 
start executing. When this happens, the worker will first set the state from 
`QUEUED` to `RUNNING`.
   
   The explicit clearing of the state goes back to Airflow 1.7. I think we can 
just update the state, instead of explicitly clearing it. I'm working on a PR 
here: https://github.com/apache/airflow/pull/6370
   
   Since the xcom is a key-value structure, I think we should reserve some key 
for storing this state. Using xcom for this will also enable us to look at the 
state from the Airflow UI, which comes for free.
   The only downside of not clearing the xcom upfront, as far as I can see, is 
that some other task might still read old xcom data in case the task 
hasn't finished yet.
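   To make that concrete, here is a minimal, hypothetical sketch of reserving one xcom key for the external resource id. It re-uses the `XCOM_EXTERNAL_RESOURCE_ID_KEY` constant that the diff above introduces; `xcom_pull`/`xcom_push` are the standard task-instance methods, and everything else is illustrative rather than the PR's actual implementation:
   ```python
   # Hedged sketch, not the PR's code: store the id of the external long-running
   # job under a reserved xcom key, so later reschedules (and the UI, via the
   # xcom view) can find it again.
   from airflow.models.xcom import XCOM_EXTERNAL_RESOURCE_ID_KEY

   def submit_once(task_instance, submit_fn):
       """Submit the external job only on the first try; reuse the stored id after."""
       resource_id = task_instance.xcom_pull(key=XCOM_EXTERNAL_RESOURCE_ID_KEY)
       if resource_id is None:
           resource_id = submit_fn()  # e.g. returns a Dataflow or EMR job id
           task_instance.xcom_push(key=XCOM_EXTERNAL_RESOURCE_ID_KEY, value=resource_id)
       return resource_id
   ```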




[jira] [Commented] (AIRFLOW-5701) Don't clear xcom explicitly before execution

2019-10-18 Thread ASF GitHub Bot (Jira)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-5701?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16955073#comment-16955073
 ] 

ASF GitHub Bot commented on AIRFLOW-5701:
-

Fokko commented on pull request #6370: AIRFLOW-5701: Don't clear xcom 
explicitly before execution
URL: https://github.com/apache/airflow/pull/6370
 
 
   Make sure you have checked _all_ steps below.
   
   - Don't clear the xcom upfront; only clear it once the task has run and new 
values are provided.
   - Remove the `id` column from the xcom table and make `(dag_id, task_id, 
execution_date, key)` the PK.
   - Remove the commit in between the delete and the insert of the new xcom 
value. That intermediate commit could potentially lead to two values and violate the 
uniqueness constraint that we expect there (although until now this was 
implicit). If you had two jobs running, they could do a DELETE at the 
same time, and then an INSERT at the same time; now this is atomic (see the 
sketch below).
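   A minimal sketch of the atomic delete-and-insert with SQLAlchemy (simplified, and not the PR's actual code; the `XCom` model below is a stand-in with the composite primary key described above):
   ```python
   # Hedged sketch: the DELETE and the INSERT happen in one transaction, so two
   # concurrent writers cannot leave two rows behind for the same
   # (dag_id, task_id, execution_date, key) primary key.
   from sqlalchemy import Column, DateTime, LargeBinary, String
   from sqlalchemy.orm import declarative_base

   Base = declarative_base()

   class XCom(Base):  # simplified stand-in for airflow.models.xcom.XCom
       __tablename__ = "xcom"
       dag_id = Column(String(250), primary_key=True)
       task_id = Column(String(250), primary_key=True)
       execution_date = Column(DateTime, primary_key=True)
       key = Column(String(512), primary_key=True)
       value = Column(LargeBinary)

   def set_xcom(session, dag_id, task_id, execution_date, key, value):
       # No commit between the DELETE and the INSERT: both are flushed together.
       session.query(XCom).filter_by(
           dag_id=dag_id, task_id=task_id,
           execution_date=execution_date, key=key,
       ).delete()
       session.add(XCom(dag_id=dag_id, task_id=task_id,
                        execution_date=execution_date, key=key, value=value))
       session.commit()
   ```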
   
   ### Jira
   
   - [ ] My PR addresses the following [Airflow 
Jira](https://issues.apache.org/jira/browse/AIRFLOW/) issues and references 
them in the PR title. For example, "\[AIRFLOW-5701\] My Airflow PR"
 - https://issues.apache.org/jira/browse/AIRFLOW-5701
 - In case you are fixing a typo in the documentation you can prepend your 
commit with \[AIRFLOW-XXX\], code changes always need a Jira issue.
 - In case you are proposing a fundamental code change, you need to create 
an Airflow Improvement Proposal 
([AIP](https://cwiki.apache.org/confluence/display/AIRFLOW/Airflow+Improvements+Proposals)).
 - In case you are adding a dependency, check if the license complies with 
the [ASF 3rd Party License 
Policy](https://www.apache.org/legal/resolved.html#category-x).
   
   ### Description
   
   - [ ] Here are some details about my PR, including screenshots of any UI 
changes:
   
   ### Tests
   
   - [ ] My PR adds the following unit tests __OR__ does not need testing for 
this extremely good reason:
   
   ### Commits
   
   - [ ] My commits all reference Jira issues in their subject lines, and I 
have squashed multiple commits if they address the same issue. In addition, my 
commits follow the guidelines from "[How to write a good git commit 
message](http://chris.beams.io/posts/git-commit/)":
 1. Subject is separated from body by a blank line
 1. Subject is limited to 50 characters (not including Jira issue reference)
 1. Subject does not end with a period
 1. Subject uses the imperative mood ("add", not "adding")
 1. Body wraps at 72 characters
 1. Body explains "what" and "why", not "how"
   
   ### Documentation
   
   - [ ] In case of new functionality, my PR adds documentation that describes 
how to use it.
 - All the public functions and the classes in the PR contain docstrings 
that explain what it does
 - If you implement backwards incompatible changes, please leave a note in 
the [Updating.md](https://github.com/apache/airflow/blob/master/UPDATING.md) so 
we can assign it to an appropriate release
   
 



> Don't clear xcom explicitly before execution
> 
>
> Key: AIRFLOW-5701
> URL: https://issues.apache.org/jira/browse/AIRFLOW-5701
> Project: Apache Airflow
>  Issue Type: Bug
>  Components: xcom
>Affects Versions: 1.10.5
>Reporter: Fokko Driesprong
>Assignee: Fokko Driesprong
>Priority: Major
> Fix For: 2.0.0
>
>






[GitHub] [airflow] Fokko opened a new pull request #6370: AIRFLOW-5701: Don't clear xcom explicitly before execution

2019-10-18 Thread GitBox
Fokko opened a new pull request #6370: AIRFLOW-5701: Don't clear xcom 
explicitly before execution
URL: https://github.com/apache/airflow/pull/6370
 
 
   Make sure you have checked _all_ steps below.
   
   - Don't clear the xcom upfront; only clear it once the task has run and new 
values are provided.
   - Remove the `id` column from the xcom table and make `(dag_id, task_id, 
execution_date, key)` the PK.
   - Remove the commit in between the delete and the insert of the new xcom 
value. That intermediate commit could potentially lead to two values and violate the 
uniqueness constraint that we expect there (although until now this was 
implicit). If you had two jobs running, they could do a DELETE at the 
same time, and then an INSERT at the same time; now this is atomic.
   
   ### Jira
   
   - [ ] My PR addresses the following [Airflow 
Jira](https://issues.apache.org/jira/browse/AIRFLOW/) issues and references 
them in the PR title. For example, "\[AIRFLOW-5701\] My Airflow PR"
 - https://issues.apache.org/jira/browse/AIRFLOW-5701
 - In case you are fixing a typo in the documentation you can prepend your 
commit with \[AIRFLOW-XXX\], code changes always need a Jira issue.
 - In case you are proposing a fundamental code change, you need to create 
an Airflow Improvement Proposal 
([AIP](https://cwiki.apache.org/confluence/display/AIRFLOW/Airflow+Improvements+Proposals)).
 - In case you are adding a dependency, check if the license complies with 
the [ASF 3rd Party License 
Policy](https://www.apache.org/legal/resolved.html#category-x).
   
   ### Description
   
   - [ ] Here are some details about my PR, including screenshots of any UI 
changes:
   
   ### Tests
   
   - [ ] My PR adds the following unit tests __OR__ does not need testing for 
this extremely good reason:
   
   ### Commits
   
   - [ ] My commits all reference Jira issues in their subject lines, and I 
have squashed multiple commits if they address the same issue. In addition, my 
commits follow the guidelines from "[How to write a good git commit 
message](http://chris.beams.io/posts/git-commit/)":
 1. Subject is separated from body by a blank line
 1. Subject is limited to 50 characters (not including Jira issue reference)
 1. Subject does not end with a period
 1. Subject uses the imperative mood ("add", not "adding")
 1. Body wraps at 72 characters
 1. Body explains "what" and "why", not "how"
   
   ### Documentation
   
   - [ ] In case of new functionality, my PR adds documentation that describes 
how to use it.
 - All the public functions and the classes in the PR contain docstrings 
that explain what it does
 - If you implement backwards incompatible changes, please leave a note in 
the [Updating.md](https://github.com/apache/airflow/blob/master/UPDATING.md) so 
we can assign it to an appropriate release
   




[jira] [Created] (AIRFLOW-5701) Dont clear xcom explicitly before execution

2019-10-18 Thread Fokko Driesprong (Jira)
Fokko Driesprong created AIRFLOW-5701:
-

 Summary: Dont clear xcom explicitly before execution
 Key: AIRFLOW-5701
 URL: https://issues.apache.org/jira/browse/AIRFLOW-5701
 Project: Apache Airflow
  Issue Type: Bug
  Components: xcom
Affects Versions: 1.10.5
Reporter: Fokko Driesprong
Assignee: Fokko Driesprong
 Fix For: 2.0.0








[jira] [Updated] (AIRFLOW-5701) Don't clear xcom explicitly before execution

2019-10-18 Thread Fokko Driesprong (Jira)


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-5701?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Fokko Driesprong updated AIRFLOW-5701:
--
Summary: Don't clear xcom explicitly before execution  (was: Dont clear 
xcom explicitly before execution)

> Don't clear xcom explicitly before execution
> 
>
> Key: AIRFLOW-5701
> URL: https://issues.apache.org/jira/browse/AIRFLOW-5701
> Project: Apache Airflow
>  Issue Type: Bug
>  Components: xcom
>Affects Versions: 1.10.5
>Reporter: Fokko Driesprong
>Assignee: Fokko Driesprong
>Priority: Major
> Fix For: 2.0.0
>
>






[GitHub] [airflow] fritz-clicktripz edited a comment on issue #6230: [AIRFLOW-5413] Allow K8S pod to be configured from JSON/YAML file

2019-10-18 Thread GitBox
fritz-clicktripz edited a comment on issue #6230: [AIRFLOW-5413] Allow K8S pod 
to be configured from JSON/YAML file
URL: https://github.com/apache/airflow/pull/6230#issuecomment-544047860
 
 
   `path_to_deployment_file` is not super clear about what it means - could the name 
be something like `path_to_pod_template` or similar, to more closely match 
kubernetes terminology?
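   For illustration, a minimal sketch of what loading such a template could look like; `path_to_pod_template` is just the name suggested above, and this is plain PyYAML, not the executor's actual code:
   ```python
   # Hedged sketch: read a worker pod template from a YAML file. The executor
   # would then fill in task-specific fields (command, labels, ...) before
   # creating the pod.
   import yaml

   def load_pod_template(path_to_pod_template):
       with open(path_to_pod_template) as f:
           return yaml.safe_load(f)  # plain dict mirroring the Kubernetes Pod spec

   # pod_spec = load_pod_template("/etc/airflow/pod_template.yaml")
   ```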




[GitHub] [airflow] fritz-clicktripz edited a comment on issue #6230: [AIRFLOW-5413] Allow K8S pod to be configured from JSON/YAML file

2019-10-18 Thread GitBox
fritz-clicktripz edited a comment on issue #6230: [AIRFLOW-5413] Allow K8S pod 
to be configured from JSON/YAML file
URL: https://github.com/apache/airflow/pull/6230#issuecomment-544047860
 
 
   `path_to_deployment_file` is not super clear about what it means - could the name 
be something like `path_to_pod_template`, to more closely match kubernetes 
terminology, or `path_to_worker_template`, to more closely match airflow 
terminology?




[GitHub] [airflow] fritz-clicktripz commented on issue #6230: [AIRFLOW-5413] Allow K8S pod to be configured from JSON/YAML file

2019-10-18 Thread GitBox
fritz-clicktripz commented on issue #6230: [AIRFLOW-5413] Allow K8S pod to be 
configured from JSON/YAML file
URL: https://github.com/apache/airflow/pull/6230#issuecomment-544047860
 
 
   `path_to_deployment_file` is not super clear about what it means - could the name 
be something like `path_to_pod_template` or similar, to more closely match 
kubernetes terminology?




[jira] [Closed] (AIRFLOW-5700) KubeExecutor - Specify YAML template

2019-10-18 Thread F D (Jira)


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-5700?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

F D closed AIRFLOW-5700.

Resolution: Duplicate

> KubeExecutor - Specify YAML template
> 
>
> Key: AIRFLOW-5700
> URL: https://issues.apache.org/jira/browse/AIRFLOW-5700
> Project: Apache Airflow
>  Issue Type: New Feature
>  Components: executor-kubernetes
>Affects Versions: 1.10.5
>Reporter: F D
>Assignee: Daniel Imberman
>Priority: Critical
>
> As a developer - I find myself struggling with the Airflow implementation of 
> launching pods within the KubeExecutor. Things are implemented in ways that 
> are difficult to get working with my kubernetes implementation, are difficult 
> because certain features of the Kubernetes Spec aren't implemented in a way 
> that is configurable, or features are forced to be implemented in a certain 
> way because of design decisions in the airflow wrappers around the 
> kubernetes client. I believe supporting a YAML template would solve a lot of 
> problems and reduce the load on airflow maintainers for feature requests 
> around supporting more of the Kubernetes API.
> Dask has good Kubernetes support. They provide a hook to define a pod 
> template via YAML, OR to use defaults via a thin abstraction, `make_pod_spec`. 
> You can see an example here: https://kubernetes.dask.org/en/latest/#quickstart
> and source here:
> https://github.com/dask/dask-kubernetes/blob/master/dask_kubernetes/kubernetes.yaml#L35-L64
> https://github.com/dask/dask-kubernetes/blob/master/dask_kubernetes/objects.py#L100-L163
> https://github.com/dask/dask-kubernetes/blob/master/dask_kubernetes/core.py#L279-L291
> https://github.com/dask/dask-kubernetes/blob/master/dask_kubernetes/core.py#L531-L566





[jira] [Commented] (AIRFLOW-5700) KubeExecutor - Specify YAML template

2019-10-18 Thread F D (Jira)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-5700?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16955047#comment-16955047
 ] 

F D commented on AIRFLOW-5700:
--

Duplicate of https://issues.apache.org/jira/browse/AIRFLOW-5413

Closing

> KubeExecutor - Specify YAML template
> 
>
> Key: AIRFLOW-5700
> URL: https://issues.apache.org/jira/browse/AIRFLOW-5700
> Project: Apache Airflow
>  Issue Type: New Feature
>  Components: executor-kubernetes
>Affects Versions: 1.10.5
>Reporter: F D
>Assignee: Daniel Imberman
>Priority: Critical
>
> As a developer - I find myself struggling with the Airflow implementation of 
> launching pods within the KubeExecutor. Things are implemented in ways that 
> are difficult to get working with my kubernetes implementation, are difficult 
> because certain features of the Kubernetes Spec aren't implemented in a way 
> that is configurable, or features are forced to be implemented in a certain 
> way because of design decisions in the airflow wrappers around the 
> kubernetes client. I believe supporting a YAML template would solve a lot of 
> problems and reduce the load on airflow maintainers for feature requests 
> around supporting more of the Kubernetes API.
> Dask has good Kubernetes support. They provide a hook to define a pod 
> template via YAML, OR to use defaults via a thin abstraction, `make_pod_spec`. 
> You can see an example here: https://kubernetes.dask.org/en/latest/#quickstart
> and source here:
> https://github.com/dask/dask-kubernetes/blob/master/dask_kubernetes/kubernetes.yaml#L35-L64
> https://github.com/dask/dask-kubernetes/blob/master/dask_kubernetes/objects.py#L100-L163
> https://github.com/dask/dask-kubernetes/blob/master/dask_kubernetes/core.py#L279-L291
> https://github.com/dask/dask-kubernetes/blob/master/dask_kubernetes/core.py#L531-L566





[jira] [Updated] (AIRFLOW-5700) KubeExecutor - Specify YAML template

2019-10-18 Thread F D (Jira)


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-5700?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

F D updated AIRFLOW-5700:
-
Description: 
As a developer - I find myself struggling with the Airflow implementation of 
launching pods within the KubeExecutor. Things are implemented in ways that are 
difficult to get working with my kubernetes implementation, are difficult 
because certain features of the Kubernetes Spec aren't implemented in a way 
that is configurable, or features are forced to be implemented in a certain way 
because of design decisions in the airflow wrappers around the 
kubernetes client.

Dask has really good Kubernetes support. They provide a hook to define a pod 
template via YAML, OR to use defaults via a thin abstraction, `make_pod_spec`. 
You can see an example here: https://kubernetes.dask.org/en/latest/#quickstart
and source here:
https://github.com/dask/dask-kubernetes/blob/master/dask_kubernetes/kubernetes.yaml#L35-L64
https://github.com/dask/dask-kubernetes/blob/master/dask_kubernetes/objects.py#L100-L163
https://github.com/dask/dask-kubernetes/blob/master/dask_kubernetes/core.py#L279-L291
https://github.com/dask/dask-kubernetes/blob/master/dask_kubernetes/core.py#L531-L566


If Airflow implemented support for creating pods via a YAML template, new 
feature requests and maintenance for the KubernetesExecutor would drop to very 
little, and airflow users would have a lot of power and flexibility with what 
pods look like in their system.

  was:
As a developer - I find myself struggling with the Airflow implementation of 
launching pods within the KubeExecutor. Things are implemented in ways that are 
difficult to get working with my kubernetes implementation, are difficult 
because certain features of the Kubernetes Spec aren't implemented in a way 
that is configurable, or features are forced to be implemented in a certain way 
because of design decisions in the airflow wrappers around the 
kubernetes client.

Dask has really good Kubernetes support. They provide a hook to define a pod 
template via YAML, OR to use defaults via a thin abstraction, `make_pod_spec`. 
You can see an example here: https://kubernetes.dask.org/en/latest/#quickstart
and source here:
https://github.com/dask/dask-kubernetes/blob/master/dask_kubernetes/kubernetes.yaml#L35-L64
https://github.com/dask/dask-kubernetes/blob/master/dask_kubernetes/objects.py#L100-L163
https://github.com/dask/dask-kubernetes/blob/master/dask_kubernetes/core.py#L279-L291


If Airflow implemented support for creating pods via a YAML template, new 
feature requests and maintenance for the KubernetesExecutor would drop to very 
little, and airflow users would have a lot of power and flexibility with what 
pods look like in their system.


> KubeExecutor - Specify YAML template
> 
>
> Key: AIRFLOW-5700
> URL: https://issues.apache.org/jira/browse/AIRFLOW-5700
> Project: Apache Airflow
>  Issue Type: New Feature
>  Components: executor-kubernetes
>Affects Versions: 1.10.5
>Reporter: F D
>Assignee: Daniel Imberman
>Priority: Critical
>
> As a developer - I find myself struggling with the Airflow implementation of 
> launching pods within the KubeExecutor. Things are implemented in ways that 
> are difficult to get working with my kubernetes implementation, are difficult 
> because certain features of the Kubernetes Spec aren't implemented in a way 
> that is configurable, or features are forced to be implemented in a certain 
> way because of design decisions in the airflow wrappers around the 
> kubernetes client.
> Dask has really good Kubernetes support. They provide a hook to define a pod 
> template via YAML, OR to use defaults via a thin abstraction, `make_pod_spec`. 
> You can see an example here: https://kubernetes.dask.org/en/latest/#quickstart
> and source here:
> https://github.com/dask/dask-kubernetes/blob/master/dask_kubernetes/kubernetes.yaml#L35-L64
> https://github.com/dask/dask-kubernetes/blob/master/dask_kubernetes/objects.py#L100-L163
> https://github.com/dask/dask-kubernetes/blob/master/dask_kubernetes/core.py#L279-L291
> https://github.com/dask/dask-kubernetes/blob/master/dask_kubernetes/core.py#L531-L566
> If Airflow implemented support for creating pods via a YAML template, new 
> feature requests and maintenance for the KubernetesExecutor would drop to 
> very little, and airflow users would have a lot of power and flexibility with 
> what pods look like in their system.





[GitHub] [airflow] mingrammer edited a comment on issue #6364: [AIRFLOW-5693] Support the "blocks" component for the Slack messages

2019-10-18 Thread GitBox
mingrammer edited a comment on issue #6364: [AIRFLOW-5693] Support the "blocks" 
component for the Slack messages
URL: https://github.com/apache/airflow/pull/6364#issuecomment-544039958
 
 
   CI failed due to this issue. (https://github.com/docker/for-linux/issues/830)
   
   And, this issue seems to be fixed. Please re-run the CI :) 
(https://github.com/docker/for-linux/issues/830#issuecomment-543930808)
   
   ```
   Err:13 https://download.docker.com/linux/debian stretch/stable amd64 Packages
 Writing more data than expected (6 > 9983)
 Hashes of expected file:
  - Filesize:9983 [weak]
  - 
SHA512:4faf843dcb2544a9135a36ecfc4b914dd4e9cdada9a5e20e4994f5dd33d2d5aacb44f237bf9bf7048bac571e8be47d04975f8d93a19dab67dbfc231b68012011
  - SHA256:db9d0a054919cfa52440f1d59ec3f4f2fd15b89661c87f35f02a24f3832a6ce6
  - SHA1:d52b8071d1449718bc483c54a5bc698a26107aa6 [weak]
  - MD5Sum:d7b024e64ef3849c725f4ce1f05bf7a7 [weak]
 Release file created at: Sat, 17 Aug 2019 05:04:49 +
   ```




[GitHub] [airflow] mingrammer commented on issue #6364: [AIRFLOW-5693] Support the "blocks" component for the Slack messages

2019-10-18 Thread GitBox
mingrammer commented on issue #6364: [AIRFLOW-5693] Support the "blocks" 
component for the Slack messages
URL: https://github.com/apache/airflow/pull/6364#issuecomment-544039958
 
 
   CI has failed due to this issue. 
(https://github.com/docker/for-linux/issues/830)
   
   And, this issue seems to be fixed. Please re-run the CI :) 
(https://github.com/docker/for-linux/issues/830#issuecomment-543930808)
   
   ```
   Err:13 https://download.docker.com/linux/debian stretch/stable amd64 Packages
 Writing more data than expected (6 > 9983)
 Hashes of expected file:
  - Filesize:9983 [weak]
  - 
SHA512:4faf843dcb2544a9135a36ecfc4b914dd4e9cdada9a5e20e4994f5dd33d2d5aacb44f237bf9bf7048bac571e8be47d04975f8d93a19dab67dbfc231b68012011
  - SHA256:db9d0a054919cfa52440f1d59ec3f4f2fd15b89661c87f35f02a24f3832a6ce6
  - SHA1:d52b8071d1449718bc483c54a5bc698a26107aa6 [weak]
  - MD5Sum:d7b024e64ef3849c725f4ce1f05bf7a7 [weak]
 Release file created at: Sat, 17 Aug 2019 05:04:49 +
   ```




[jira] [Updated] (AIRFLOW-5700) KubeExecutor - Specify YAML template

2019-10-18 Thread F D (Jira)


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-5700?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

F D updated AIRFLOW-5700:
-
Description: 
As a developer - I find myself struggling with the Airflow implementation of 
launching pods within the KubeExecutor. Things are implemented in ways that are 
difficult to get working with my kubernetes implementation, are difficult 
because certain features of the Kubernetes Spec aren't implemented in a way 
that is configurable, or features are forced to be implemented in a certain way 
because of design decisions in the airflow wrappers around the 
kubernetes client. I believe supporting a YAML template would solve a lot of 
problems and reduce the load on airflow maintainers for feature requests around 
supporting more of the Kubernetes API.

Dask has good Kubernetes support. They provide a hook to define a pod template 
via YAML, OR to use defaults via a thin abstraction, `make_pod_spec`. 
You can see an example here: https://kubernetes.dask.org/en/latest/#quickstart
and source here:
https://github.com/dask/dask-kubernetes/blob/master/dask_kubernetes/kubernetes.yaml#L35-L64
https://github.com/dask/dask-kubernetes/blob/master/dask_kubernetes/objects.py#L100-L163
https://github.com/dask/dask-kubernetes/blob/master/dask_kubernetes/core.py#L279-L291
https://github.com/dask/dask-kubernetes/blob/master/dask_kubernetes/core.py#L531-L566

  was:
As a developer - I find myself struggling with the Airflow implementation of 
launching pods within the KubeExecutor. Things are implemented in ways that are 
difficult to get working with my kubernetes implementation, are difficult 
because certain features of the Kubernetes Spec aren't implemented in a way 
that is configurable, or features are forced to be implemented in a certain way 
because of design decisions in the airflow wrappers around the 
kubernetes client.

Dask has good Kubernetes support. They provide a hook to define a pod template 
via YAML, OR to use defaults via a thin abstraction, `make_pod_spec`. 
You can see an example here: https://kubernetes.dask.org/en/latest/#quickstart
and source here:
https://github.com/dask/dask-kubernetes/blob/master/dask_kubernetes/kubernetes.yaml#L35-L64
https://github.com/dask/dask-kubernetes/blob/master/dask_kubernetes/objects.py#L100-L163
https://github.com/dask/dask-kubernetes/blob/master/dask_kubernetes/core.py#L279-L291
https://github.com/dask/dask-kubernetes/blob/master/dask_kubernetes/core.py#L531-L566


If Airflow implemented support for creating pods via a YAML template, new 
feature requests and maintenance for the KubernetesExecutor would drop to very 
little, and airflow users would have a lot of power and flexibility with what 
pods look like in their system.


> KubeExecutor - Specify YAML template
> 
>
> Key: AIRFLOW-5700
> URL: https://issues.apache.org/jira/browse/AIRFLOW-5700
> Project: Apache Airflow
>  Issue Type: New Feature
>  Components: executor-kubernetes
>Affects Versions: 1.10.5
>Reporter: F D
>Assignee: Daniel Imberman
>Priority: Critical
>
> As a developer - I find myself struggling with the Airflow implementation of 
> launching pods within the KubeExecutor. Things are implemented in ways that 
> are difficult to get working with my kubernetes implementation, are difficult 
> because certain features of the Kubernetes Spec aren't implemented in a way 
> that is configurable, or features are forced to be implemented in a certain 
> way because of design decisions in the airflow wrappers around the 
> kubernetes client. I believe supporting a YAML template would solve a lot of 
> problems and reduce the load on airflow maintainers for feature requests 
> around supporting more of the Kubernetes API.
> Dask has good Kubernetes support. They provide a hook to define a pod 
> template via YAML, OR to use defaults via a thin abstraction, `make_pod_spec`. 
> You can see an example here: https://kubernetes.dask.org/en/latest/#quickstart
> and source here:
> https://github.com/dask/dask-kubernetes/blob/master/dask_kubernetes/kubernetes.yaml#L35-L64
> https://github.com/dask/dask-kubernetes/blob/master/dask_kubernetes/objects.py#L100-L163
> https://github.com/dask/dask-kubernetes/blob/master/dask_kubernetes/core.py#L279-L291
> https://github.com/dask/dask-kubernetes/blob/master/dask_kubernetes/core.py#L531-L566





[jira] [Updated] (AIRFLOW-5700) KubeExecutor - Specify YAML template

2019-10-18 Thread F D (Jira)


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-5700?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

F D updated AIRFLOW-5700:
-
Description: 
As a developer - I find myself struggling with the Airflow implementation of 
launching pods within the KubeExecutor. Things are implemented in ways that are 
difficult to get working with my kubernetes implementation, are difficult 
because certain features of the Kubernetes Spec aren't implemented in a way 
that is configurable, or features are forced to be implemented in a certain way 
because of design decisions in the airflow wrappers around the 
kubernetes client.

Dask has good Kubernetes support. They provide a hook to define a pod template 
via YAML, OR to use defaults via a thin abstraction, `make_pod_spec`. 
You can see an example here: https://kubernetes.dask.org/en/latest/#quickstart
and source here:
https://github.com/dask/dask-kubernetes/blob/master/dask_kubernetes/kubernetes.yaml#L35-L64
https://github.com/dask/dask-kubernetes/blob/master/dask_kubernetes/objects.py#L100-L163
https://github.com/dask/dask-kubernetes/blob/master/dask_kubernetes/core.py#L279-L291
https://github.com/dask/dask-kubernetes/blob/master/dask_kubernetes/core.py#L531-L566


If Airflow implemented support for creating pods via a YAML template, new 
feature requests and maintenance for the KubernetesExecutor would drop to very 
little, and airflow users would have a lot of power and flexibility with what 
pods look like in their system.

  was:
As a developer - I find myself struggling with the Airflow implementation of 
launching pods within the KubeExecutor. Things are implemented in ways that are 
difficult to get working with my kubernetes implementation, are difficult 
because certain features of the Kubernetes Spec aren't implemented in a way 
that is configurable, or features are forced to be implemented in a certain way 
because of design decisions in the airflow wrappers around the 
kubernetes client.

Dask has really good Kubernetes support. They provide a hook to define a pod 
template via YAML, OR to use defaults via a thin abstraction, `make_pod_spec`. 
You can see an example here: https://kubernetes.dask.org/en/latest/#quickstart
and source here:
https://github.com/dask/dask-kubernetes/blob/master/dask_kubernetes/kubernetes.yaml#L35-L64
https://github.com/dask/dask-kubernetes/blob/master/dask_kubernetes/objects.py#L100-L163
https://github.com/dask/dask-kubernetes/blob/master/dask_kubernetes/core.py#L279-L291
https://github.com/dask/dask-kubernetes/blob/master/dask_kubernetes/core.py#L531-L566


If Airflow implemented support for creating pods via a YAML template, new 
feature requests and maintenance for the KubernetesExecutor would drop to very 
little, and airflow users would have a lot of power and flexibility with what 
pods look like in their system.


> KubeExecutor - Specify YAML template
> 
>
> Key: AIRFLOW-5700
> URL: https://issues.apache.org/jira/browse/AIRFLOW-5700
> Project: Apache Airflow
>  Issue Type: New Feature
>  Components: executor-kubernetes
>Affects Versions: 1.10.5
>Reporter: F D
>Assignee: Daniel Imberman
>Priority: Critical
>
> As a developer - I find myself struggling with the Airflow implementation of 
> launching pods within the KubeExecutor. Things are implemented in ways that 
> are difficult to get working with my kubernetes implementation, are difficult 
> because certain features of the Kubernetes Spec aren't implemented in a way 
> that is configurable, or features are forced to be implemented in a certain 
> way because of design decisions in the airflow wrappers around the 
> kubernetes client.
> Dask has good Kubernetes support. They provide a hook to define a pod 
> template via YAML, OR to use defaults via a thin abstraction, `make_pod_spec`. 
> You can see an example here: https://kubernetes.dask.org/en/latest/#quickstart
> and source here:
> https://github.com/dask/dask-kubernetes/blob/master/dask_kubernetes/kubernetes.yaml#L35-L64
> https://github.com/dask/dask-kubernetes/blob/master/dask_kubernetes/objects.py#L100-L163
> https://github.com/dask/dask-kubernetes/blob/master/dask_kubernetes/core.py#L279-L291
> https://github.com/dask/dask-kubernetes/blob/master/dask_kubernetes/core.py#L531-L566
> If Airflow implemented support for creating pods via a YAML template, new 
> feature requests and maintenance for the KubernetesExecutor would drop to 
> very little, and airflow users would have a lot of power and flexibility with 
> what pods look like in their system.





[jira] [Updated] (AIRFLOW-5700) KubeExecutor - Specify YAML template

2019-10-18 Thread F D (Jira)


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-5700?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

F D updated AIRFLOW-5700:
-
Description: 
As a developer - I find myself struggling with the Airflow implementation of 
launching pods within the KubeExecutor. Things are implemented in ways that are 
difficult to get working with my kubernetes implementation, are difficult 
because certain features of the Kubernetes Spec aren't implemented in a way 
that is configurable, or features are forced to be implemented in a certain way 
because of design decisions in the airflow wrappers around the 
kubernetes client.

Dask has really good Kubernetes support. They provide a hook to define a pod 
template via YAML, OR to use defaults via a thin abstraction, `make_pod_spec`. 
You can see an example here: https://kubernetes.dask.org/en/latest/#quickstart
and source here:
https://github.com/dask/dask-kubernetes/blob/master/dask_kubernetes/kubernetes.yaml#L35-L64
https://github.com/dask/dask-kubernetes/blob/master/dask_kubernetes/objects.py#L100-L163
https://github.com/dask/dask-kubernetes/blob/master/dask_kubernetes/core.py#L279-L291


If Airflow implemented support for creating pods via a YAML template, new 
feature requests and maintenance for the KubernetesExecutor would drop to very 
little, and airflow users would have a lot of power and flexibility with what 
pods look like in their system.

  was:
As a developer - I find myself struggling with the Airflow implementation of 
launching pods within the KubeExecutor. Things are implemented in ways that 
don't work with my kubernetes implementation, are unnecessarily difficult 
because certain features of the Kubernetes Spec aren't implemented in a way I 
can configure, or features are forced to be implemented in a certain way 
because of historical design decisions by the airflow team.

Dask has really good Kubernetes support. They provide a hook to define a pod 
template via YAML, OR to use defaults. 
You can see an example here:
https://kubernetes.dask.org/en/latest/#quickstart

If Airflow implemented support for creating pods via a YAML template, new 
feature requests and maintenance for the KubernetesExecutor would drop to very 
little, and airflow users would have a lot of power and flexibility with what 
pods look like in their system.


> KubeExecutor - Specify YAML template
> 
>
> Key: AIRFLOW-5700
> URL: https://issues.apache.org/jira/browse/AIRFLOW-5700
> Project: Apache Airflow
>  Issue Type: New Feature
>  Components: executor-kubernetes
>Affects Versions: 1.10.5
>Reporter: F D
>Assignee: Daniel Imberman
>Priority: Critical
>
> As a developer - I find myself struggling with the Airflow implementation of 
> launching pods within the KubeExecutor. Things are implemented in ways that 
> are difficult to get working with my kubernetes implementation, are difficult 
> because certain features of the Kubernetes Spec aren't implemented in a way 
> that is configurable, or features are forced to be implemented in a certain 
> way because of design decisions in the airflow wrappers around the 
> kubernetes client.
> Dask has really good Kubernetes support. They provide a hook to define a pod 
> template via YAML, OR to use defaults via a thin abstraction, `make_pod_spec`. 
> You can see an example here: https://kubernetes.dask.org/en/latest/#quickstart
> and source here:
> https://github.com/dask/dask-kubernetes/blob/master/dask_kubernetes/kubernetes.yaml#L35-L64
> https://github.com/dask/dask-kubernetes/blob/master/dask_kubernetes/objects.py#L100-L163
> https://github.com/dask/dask-kubernetes/blob/master/dask_kubernetes/core.py#L279-L291
> If Airflow implemented support for creating pods via a YAML template, new 
> feature requests and maintenance for the KubernetesExecutor would drop to 
> very little, and airflow users would have a lot of power and flexibility with 
> what pods look like in their system.





[jira] [Created] (AIRFLOW-5700) KubeExecutor - Specify YAML template

2019-10-18 Thread F D (Jira)
F D created AIRFLOW-5700:


 Summary: KubeExecutor - Specify YAML template
 Key: AIRFLOW-5700
 URL: https://issues.apache.org/jira/browse/AIRFLOW-5700
 Project: Apache Airflow
  Issue Type: New Feature
  Components: executor-kubernetes
Affects Versions: 1.10.5
Reporter: F D
Assignee: Daniel Imberman


As a developer - I find myself struggling with the Airflow implementation of 
launching pods within the KubeExecutor. Things are implemented in ways that 
don't work with my kubernetes implementation, are unnecessarily difficult 
because certain features of the Kubernetes Spec aren't implemented in a way I 
can configure, or features are forced to be implemented in a certain way 
because of historical design decisions by the airflow team.

Dask has really good Kubernetes support. They provide a hook to define a pod 
template via YAML, OR to use defaults. 
You can see an example here:
https://kubernetes.dask.org/en/latest/#quickstart

If Airflow implemented support for creating pods via a YAML template, new 
feature requests and maintenance for the KubernetesExecutor would drop to very 
little, and airflow users would have a lot of power and flexibility with what 
pods look like in their system.





[GitHub] [airflow] codecov-io commented on issue #6369: [AIRFLOW-5699][part of AIRFLOW-5697][depends on AIRFLOW-5698] Add more tests for Dataflow integration

2019-10-18 Thread GitBox
codecov-io commented on issue #6369: [AIRFLOW-5699][part of 
AIRFLOW-5697][depends on AIRFLOW-5698] Add more tests for Dataflow integration
URL: https://github.com/apache/airflow/pull/6369#issuecomment-544034730
 
 
   # [Codecov](https://codecov.io/gh/apache/airflow/pull/6369?src=pr=h1) 
Report
   > :exclamation: No coverage uploaded for pull request base 
(`master@b8c0263`). [Click here to learn what that 
means](https://docs.codecov.io/docs/error-reference#section-missing-base-commit).
   > The diff coverage is `100%`.
   
   [![Impacted file tree 
graph](https://codecov.io/gh/apache/airflow/pull/6369/graphs/tree.svg?width=650=WdLKlKHOAU=150=pr)](https://codecov.io/gh/apache/airflow/pull/6369?src=pr=tree)
   
   ```diff
   @@Coverage Diff@@
   ## master#6369   +/-   ##
   =
 Coverage  ?   80.12%   
   =
 Files ?  616   
 Lines ?35807   
 Branches  ?0   
   =
 Hits  ?28690   
 Misses? 7117   
 Partials  ?0
   ```
   
   
   | [Impacted 
Files](https://codecov.io/gh/apache/airflow/pull/6369?src=pr=tree) | 
Coverage Δ | |
   |---|---|---|
   | 
[airflow/gcp/hooks/dataflow.py](https://codecov.io/gh/apache/airflow/pull/6369/diff?src=pr=tree#diff-YWlyZmxvdy9nY3AvaG9va3MvZGF0YWZsb3cucHk=)
 | `91.07% <100%> (ø)` | |
   
   --
   
   [Continue to review full report at 
Codecov](https://codecov.io/gh/apache/airflow/pull/6369?src=pr=continue).
   > **Legend** - [Click here to learn 
more](https://docs.codecov.io/docs/codecov-delta)
   > `Δ = absolute  (impact)`, `ø = not affected`, `? = missing data`
   > Powered by 
[Codecov](https://codecov.io/gh/apache/airflow/pull/6369?src=pr=footer). 
Last update 
[b8c0263...21eb83f](https://codecov.io/gh/apache/airflow/pull/6369?src=pr=lastupdated).
 Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments).
   




[GitHub] [airflow] codecov-io edited a comment on issue #6368: [AIRFLOW-5698][part of AIRFLOW-5697] Organize Dataflow tests

2019-10-18 Thread GitBox
codecov-io edited a comment on issue #6368: [AIRFLOW-5698][part of 
AIRFLOW-5697] Organize Dataflow tests
URL: https://github.com/apache/airflow/pull/6368#issuecomment-544025511
 
 
   # [Codecov](https://codecov.io/gh/apache/airflow/pull/6368?src=pr=h1) 
Report
   > :exclamation: No coverage uploaded for pull request base 
(`master@b8c0263`). [Click here to learn what that 
means](https://docs.codecov.io/docs/error-reference#section-missing-base-commit).
   > The diff coverage is `n/a`.
   
   [![Impacted file tree 
graph](https://codecov.io/gh/apache/airflow/pull/6368/graphs/tree.svg?width=650=WdLKlKHOAU=150=pr)](https://codecov.io/gh/apache/airflow/pull/6368?src=pr=tree)
   
   ```diff
   @@Coverage Diff@@
   ## master#6368   +/-   ##
   =
 Coverage  ?   80.09%   
   =
 Files ?  616   
 Lines ?35804   
 Branches  ?0   
   =
 Hits  ?28677   
 Misses? 7127   
 Partials  ?0
   ```
   
   
   
   --
   
   [Continue to review full report at 
Codecov](https://codecov.io/gh/apache/airflow/pull/6368?src=pr=continue).
   > **Legend** - [Click here to learn 
more](https://docs.codecov.io/docs/codecov-delta)
   > `Δ = absolute  (impact)`, `ø = not affected`, `? = missing data`
   > Powered by 
[Codecov](https://codecov.io/gh/apache/airflow/pull/6368?src=pr=footer). 
Last update 
[b8c0263...0216625](https://codecov.io/gh/apache/airflow/pull/6368?src=pr=lastupdated).
 Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments).
   




[GitHub] [airflow] codecov-io commented on issue #6368: [AIRFLOW-5698][part of AIRFLOW-5697] Organize Dataflow tests

2019-10-18 Thread GitBox
codecov-io commented on issue #6368: [AIRFLOW-5698][part of AIRFLOW-5697] 
Organize Dataflow tests
URL: https://github.com/apache/airflow/pull/6368#issuecomment-544025511
 
 
   # [Codecov](https://codecov.io/gh/apache/airflow/pull/6368?src=pr=h1) 
Report
   > :exclamation: No coverage uploaded for pull request base 
(`master@b8c0263`). [Click here to learn what that 
means](https://docs.codecov.io/docs/error-reference#section-missing-base-commit).
   > The diff coverage is `n/a`.
   
   [![Impacted file tree 
graph](https://codecov.io/gh/apache/airflow/pull/6368/graphs/tree.svg?width=650=WdLKlKHOAU=150=pr)](https://codecov.io/gh/apache/airflow/pull/6368?src=pr=tree)
   
   ```diff
   @@Coverage Diff@@
   ## master#6368   +/-   ##
   =
 Coverage  ?   80.09%   
   =
 Files ?  616   
 Lines ?35804   
 Branches  ?0   
   =
 Hits  ?28677   
 Misses? 7127   
 Partials  ?0
   ```
   
   
   
   --
   
   [Continue to review full report at 
Codecov](https://codecov.io/gh/apache/airflow/pull/6368?src=pr=continue).
   > **Legend** - [Click here to learn 
more](https://docs.codecov.io/docs/codecov-delta)
   > `Δ = absolute  (impact)`, `ø = not affected`, `? = missing data`
   > Powered by 
[Codecov](https://codecov.io/gh/apache/airflow/pull/6368?src=pr=footer). 
Last update 
[b8c0263...0216625](https://codecov.io/gh/apache/airflow/pull/6368?src=pr=lastupdated).
 Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments).
   




[GitHub] [airflow] mik-laj commented on a change in pull request #6344: [AIRFLOW-5665] Add path_exists method to SFTPHook

2019-10-18 Thread GitBox
mik-laj commented on a change in pull request #6344: [AIRFLOW-5665] Add 
path_exists method to SFTPHook
URL: https://github.com/apache/airflow/pull/6344#discussion_r336709991
 
 

 ##
 File path: airflow/contrib/hooks/sftp_hook.py
 ##
 @@ -214,3 +214,12 @@ def get_mod_time(self, path):
 conn = self.get_conn()
 ftp_mdtm = conn.stat(path).st_mtime
  return datetime.datetime.fromtimestamp(ftp_mdtm).strftime('%Y%m%d%H%M%S')
+
+def path_exists(self, path):
+"""
+Returns True if a remote entity exists
+:param path: full path to the remote file or directory
 
 Review comment:
   ```suggestion
   
   :param path: full path to the remote file or directory
   ```
   A new line is required in this place. Otherwise, the docs don't look good.
   https://user-images.githubusercontent.com/12058428/67134612-4a845e00-f213-11e9-96a2-b83efd76a7da.png
   
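   For context, a hedged sketch of what the new method could look like, assuming the hook's `get_conn()` returns a `pysftp.Connection` (which exposes `exists()`); this is illustrative, not necessarily the PR's exact body:
   ```python
   def path_exists(self, path):
       """
       Returns True if a remote entity exists.

       :param path: full path to the remote file or directory
       """
       conn = self.get_conn()  # pysftp.Connection
       return conn.exists(path)
   ```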




[GitHub] [airflow] KevinYang21 commented on issue #6367: [AIRFLOW-5695] use RUNNING_DEPS to check run from UI

2019-10-18 Thread GitBox
KevinYang21 commented on issue #6367: [AIRFLOW-5695] use RUNNING_DEPS to check 
run from UI
URL: https://github.com/apache/airflow/pull/6367#issuecomment-544007106
 
 
   So what's left are:
   ```
   DagrunRunningDep
   DagrunIdDep
   DagUnpausedDep
   ExecDateAfterStartDateDep
   ```
   
   I'm +1 for using `RUNNING_DEPS` when trying to trigger from the webserver. 
I don't think the deps in the gap are something people triggering a task from 
the UI care about. In fact, from the comment I think it was meant to use 
`RUNNING_DEPS` in the webserver.
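   Purely as an illustration of that gap, using the dep names listed above as plain strings (the real objects live in `airflow.ti_deps`; nothing here is actual Airflow code):
   ```python
   # Illustrative only: the deps a manual UI trigger would no longer check if
   # the webserver switched to RUNNING_DEPS, per the list above.
   gap_deps = {
       "DagrunRunningDep",
       "DagrunIdDep",
       "DagUnpausedDep",
       "ExecDateAfterStartDateDep",
   }
   print("Checks skipped for a manual UI trigger:", sorted(gap_deps))
   ```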




[jira] [Created] (AIRFLOW-5698) Organize Dataflow tests

2019-10-18 Thread Kamil Bregula (Jira)
Kamil Bregula created AIRFLOW-5698:
--

 Summary: Organize Dataflow tests
 Key: AIRFLOW-5698
 URL: https://issues.apache.org/jira/browse/AIRFLOW-5698
 Project: Apache Airflow
  Issue Type: Improvement
  Components: gcp
Affects Versions: 1.10.5
Reporter: Kamil Bregula
Assignee: Kamil Bregula








[jira] [Commented] (AIRFLOW-5698) Organize Dataflow tests

2019-10-18 Thread ASF GitHub Bot (Jira)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-5698?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16955005#comment-16955005
 ] 

ASF GitHub Bot commented on AIRFLOW-5698:
-

mik-laj commented on pull request #6368: [AIRFLOW-5698][part of AIRFLOW-5697] 
Organize Dataflow tests
URL: https://github.com/apache/airflow/pull/6368
 
 
   This PR is one of a series that aims to improve this integration
   https://issues.apache.org/jira/browse/AIRFLOW-5697
   
   ---
   
   Make sure you have checked _all_ steps below.
   
   ### Jira
   
   - [ ] My PR addresses the following [Airflow 
Jira](https://issues.apache.org/jira/browse/AIRFLOW/) issues and references 
them in the PR title. For example, "\[AIRFLOW-5698\] My Airflow PR"
 - https://issues.apache.org/jira/browse/AIRFLOW-XXX
 - In case you are fixing a typo in the documentation you can prepend your 
commit with \[AIRFLOW-XXX\], code changes always need a Jira issue.
 - In case you are proposing a fundamental code change, you need to create 
an Airflow Improvement Proposal 
([AIP](https://cwiki.apache.org/confluence/display/AIRFLOW/Airflow+Improvements+Proposals)).
 - In case you are adding a dependency, check if the license complies with 
the [ASF 3rd Party License 
Policy](https://www.apache.org/legal/resolved.html#category-x).
   
   ### Description
   
   - [ ] Here are some details about my PR, including screenshots of any UI 
changes:
   
   ### Tests
   
   - [ ] My PR adds the following unit tests __OR__ does not need testing for 
this extremely good reason:
   
   ### Commits
   
   - [ ] My commits all reference Jira issues in their subject lines, and I 
have squashed multiple commits if they address the same issue. In addition, my 
commits follow the guidelines from "[How to write a good git commit 
message](http://chris.beams.io/posts/git-commit/)":
 1. Subject is separated from body by a blank line
 1. Subject is limited to 50 characters (not including Jira issue reference)
 1. Subject does not end with a period
 1. Subject uses the imperative mood ("add", not "adding")
 1. Body wraps at 72 characters
 1. Body explains "what" and "why", not "how"
   
   ### Documentation
   
   - [ ] In case of new functionality, my PR adds documentation that describes 
how to use it.
 - All the public functions and the classes in the PR contain docstrings 
that explain what it does
 - If you implement backwards incompatible changes, please leave a note in 
the [Updating.md](https://github.com/apache/airflow/blob/master/UPDATING.md) so 
we can assign it to an appropriate release
   
 



> Organize Dataflow tests
> ---
>
> Key: AIRFLOW-5698
> URL: https://issues.apache.org/jira/browse/AIRFLOW-5698
> Project: Apache Airflow
>  Issue Type: Sub-task
>  Components: gcp
>Affects Versions: 1.10.5
>Reporter: Kamil Bregula
>Assignee: Kamil Bregula
>Priority: Trivial
>






[jira] [Commented] (AIRFLOW-5699) Add more tests for Dataflow integration

2019-10-18 Thread ASF GitHub Bot (Jira)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-5699?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16955008#comment-16955008
 ] 

ASF GitHub Bot commented on AIRFLOW-5699:
-

mik-laj commented on pull request #6369: [AIRFLOW-5699][part of 
AIRFLOW-5697][depends on AIRFLOW-5698] Add more tests for Dataflow integration
URL: https://github.com/apache/airflow/pull/6369
 
 
   This PR is one of a series that aims to improve this integration
   https://issues.apache.org/jira/browse/AIRFLOW-5697
   
   ---
   
   Make sure you have checked _all_ steps below.
   
   ### Jira
   
   - [ ] My PR addresses the following [Airflow 
Jira](https://issues.apache.org/jira/browse/AIRFLOW/) issues and references 
them in the PR title. For example, "\[AIRFLOW-XXX\] My Airflow PR"
 - https://issues.apache.org/jira/browse/AIRFLOW-5699
 - In case you are fixing a typo in the documentation you can prepend your 
commit with \[AIRFLOW-XXX\], code changes always need a Jira issue.
 - In case you are proposing a fundamental code change, you need to create 
an Airflow Improvement Proposal 
([AIP](https://cwiki.apache.org/confluence/display/AIRFLOW/Airflow+Improvements+Proposals)).
 - In case you are adding a dependency, check if the license complies with 
the [ASF 3rd Party License 
Policy](https://www.apache.org/legal/resolved.html#category-x).
   
   ### Description
   
   - [ ] Here are some details about my PR, including screenshots of any UI 
changes:
   
   ### Tests
   
   - [ ] My PR adds the following unit tests __OR__ does not need testing for 
this extremely good reason:
   
   ### Commits
   
   - [ ] My commits all reference Jira issues in their subject lines, and I 
have squashed multiple commits if they address the same issue. In addition, my 
commits follow the guidelines from "[How to write a good git commit 
message](http://chris.beams.io/posts/git-commit/)":
 1. Subject is separated from body by a blank line
 1. Subject is limited to 50 characters (not including Jira issue reference)
 1. Subject does not end with a period
 1. Subject uses the imperative mood ("add", not "adding")
 1. Body wraps at 72 characters
 1. Body explains "what" and "why", not "how"
   
   ### Documentation
   
   - [ ] In case of new functionality, my PR adds documentation that describes 
how to use it.
 - All the public functions and the classes in the PR contain docstrings 
that explain what it does
 - If you implement backwards incompatible changes, please leave a note in 
the [Updating.md](https://github.com/apache/airflow/blob/master/UPDATING.md) so 
we can assign it to an appropriate release
   
 



> Add more tests for Dataflow integration
> ---
>
> Key: AIRFLOW-5699
> URL: https://issues.apache.org/jira/browse/AIRFLOW-5699
> Project: Apache Airflow
>  Issue Type: Sub-task
>  Components: gcp
>Affects Versions: 1.10.5
>Reporter: Kamil Bregula
>Priority: Trivial
>






[GitHub] [airflow] mik-laj opened a new pull request #6369: [AIRFLOW-5699][part of AIRFLOW-5697][depends on AIRFLOW-5698] Add more tests for Dataflow integration

2019-10-18 Thread GitBox
mik-laj opened a new pull request #6369: [AIRFLOW-5699][part of 
AIRFLOW-5697][depends on AIRFLOW-5698] Add more tests for Dataflow integration
URL: https://github.com/apache/airflow/pull/6369
 
 
   This PR is one of a series that aims to improve this integration
   https://issues.apache.org/jira/browse/AIRFLOW-5697
   
   ---
   
   Make sure you have checked _all_ steps below.
   
   ### Jira
   
   - [ ] My PR addresses the following [Airflow 
Jira](https://issues.apache.org/jira/browse/AIRFLOW/) issues and references 
them in the PR title. For example, "\[AIRFLOW-XXX\] My Airflow PR"
 - https://issues.apache.org/jira/browse/AIRFLOW-5699
 - In case you are fixing a typo in the documentation you can prepend your 
commit with \[AIRFLOW-XXX\], code changes always need a Jira issue.
 - In case you are proposing a fundamental code change, you need to create 
an Airflow Improvement Proposal 
([AIP](https://cwiki.apache.org/confluence/display/AIRFLOW/Airflow+Improvements+Proposals)).
 - In case you are adding a dependency, check if the license complies with 
the [ASF 3rd Party License 
Policy](https://www.apache.org/legal/resolved.html#category-x).
   
   ### Description
   
   - [ ] Here are some details about my PR, including screenshots of any UI 
changes:
   
   ### Tests
   
   - [ ] My PR adds the following unit tests __OR__ does not need testing for 
this extremely good reason:
   
   ### Commits
   
   - [ ] My commits all reference Jira issues in their subject lines, and I 
have squashed multiple commits if they address the same issue. In addition, my 
commits follow the guidelines from "[How to write a good git commit 
message](http://chris.beams.io/posts/git-commit/)":
 1. Subject is separated from body by a blank line
 1. Subject is limited to 50 characters (not including Jira issue reference)
 1. Subject does not end with a period
 1. Subject uses the imperative mood ("add", not "adding")
 1. Body wraps at 72 characters
 1. Body explains "what" and "why", not "how"
   
   ### Documentation
   
   - [ ] In case of new functionality, my PR adds documentation that describes 
how to use it.
 - All the public functions and the classes in the PR contain docstrings 
that explain what they do
 - If you implement backwards incompatible changes, please leave a note in 
the [Updating.md](https://github.com/apache/airflow/blob/master/UPDATING.md) so 
we can assign it to an appropriate release
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[jira] [Updated] (AIRFLOW-5697) Retrofit Dataflow integration

2019-10-18 Thread Kamil Bregula (Jira)


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-5697?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Kamil Bregula updated AIRFLOW-5697:
---
Description: 
I have made many Dataflow patches that I would like to introduce
{code:java}
1a43b5aa0 [AIRFLOW-XXX] Use JobID to monitor statuses when running a Dataflow 
template
9d92f6e15 [AIRFLOW-XXX] Improve job_id detection
fdea0b2b9 [AIRFLOW-YYY] Fetch all jobs during searching
33a47d566 [AIRFLOW-YYY] Add fallback for connection's project ID in Dataflow 
hook
4ec35e8da [AIRFLOW-YYY] Simplify DataflowJobsController logic
d6db0fe20 [AIRFLOW-YYY] Use keywords arguments as a parameter
850f06f36 [AIRFLOW-YYY] Remove dead code
2eb94e9b2 [AIRFLOW-YYY] Rename internal Dataflow classes
6a1c8b823 [AIRFLOW-YYY] Add Dataflow tests with various job statuses
79c0de0aa [AIRFLOW-YYY] Add tests when Job starts without an custom interpreter
79009c692 [AIRFLOW-YYY] Move Dataflow tests to other class
d504bc29a [AIRFLOW-YYY] Use parameterized in Dataflow Hook tests{code}

> Retrofit Dataflow integration
> -
>
> Key: AIRFLOW-5697
> URL: https://issues.apache.org/jira/browse/AIRFLOW-5697
> Project: Apache Airflow
>  Issue Type: Improvement
>  Components: gcp
>Affects Versions: 1.10.5
>Reporter: Kamil Bregula
>Priority: Major
>
> I have made many Dataflow patches that I would like to introduce
> {code:java}
> 1a43b5aa0 [AIRFLOW-XXX] Use JobID to monitor statuses when running a Dataflow 
> template
> 9d92f6e15 [AIRFLOW-XXX] Improve job_id detection
> fdea0b2b9 [AIRFLOW-YYY] Fetch all jobs during searching
> 33a47d566 [AIRFLOW-YYY] Add fallback for connection's project ID in Dataflow 
> hook
> 4ec35e8da [AIRFLOW-YYY] Simplify DataflowJobsController logic
> d6db0fe20 [AIRFLOW-YYY] Use keywords arguments as a parameter
> 850f06f36 [AIRFLOW-YYY] Remove dead code
> 2eb94e9b2 [AIRFLOW-YYY] Rename internal Dataflow classes
> 6a1c8b823 [AIRFLOW-YYY] Add Dataflow tests with various job statuses
> 79c0de0aa [AIRFLOW-YYY] Add tests when Job starts without an custom 
> interpreter
> 79009c692 [AIRFLOW-YYY] Move Dataflow tests to other class
> d504bc29a [AIRFLOW-YYY] Use parameterized in Dataflow Hook tests{code}



--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[jira] [Closed] (AIRFLOW-5692) Retrofit Dataflow integration

2019-10-18 Thread Kamil Bregula (Jira)


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-5692?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Kamil Bregula closed AIRFLOW-5692.
--
Resolution: Duplicate

> Retrofit Dataflow integration
> -
>
> Key: AIRFLOW-5692
> URL: https://issues.apache.org/jira/browse/AIRFLOW-5692
> Project: Apache Airflow
>  Issue Type: Improvement
>  Components: gcp
>Affects Versions: 1.10.5
>Reporter: Kamil Bregula
>Priority: Minor
>




--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[jira] [Created] (AIRFLOW-5699) Add more tests for Dataflow integration

2019-10-18 Thread Kamil Bregula (Jira)
Kamil Bregula created AIRFLOW-5699:
--

 Summary: Add more tests for Dataflow integration
 Key: AIRFLOW-5699
 URL: https://issues.apache.org/jira/browse/AIRFLOW-5699
 Project: Apache Airflow
  Issue Type: Sub-task
  Components: gcp
Affects Versions: 1.10.5
Reporter: Kamil Bregula






--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[GitHub] [airflow] mik-laj opened a new pull request #6368: [AIRFLOW-5698][part of AIRFLOW-5697] Organize Dataflow tests

2019-10-18 Thread GitBox
mik-laj opened a new pull request #6368: [AIRFLOW-5698][part of AIRFLOW-5697] 
Organize Dataflow tests
URL: https://github.com/apache/airflow/pull/6368
 
 
   This PR is one of a series that aims to improve this integration
   https://issues.apache.org/jira/browse/AIRFLOW-5697
   
   ---
   
   Make sure you have checked _all_ steps below.
   
   ### Jira
   
   - [ ] My PR addresses the following [Airflow 
Jira](https://issues.apache.org/jira/browse/AIRFLOW/) issues and references 
them in the PR title. For example, "\[AIRFLOW-5698\] My Airflow PR"
 - https://issues.apache.org/jira/browse/AIRFLOW-XXX
 - In case you are fixing a typo in the documentation you can prepend your 
commit with \[AIRFLOW-XXX\], code changes always need a Jira issue.
 - In case you are proposing a fundamental code change, you need to create 
an Airflow Improvement Proposal 
([AIP](https://cwiki.apache.org/confluence/display/AIRFLOW/Airflow+Improvements+Proposals)).
 - In case you are adding a dependency, check if the license complies with 
the [ASF 3rd Party License 
Policy](https://www.apache.org/legal/resolved.html#category-x).
   
   ### Description
   
   - [ ] Here are some details about my PR, including screenshots of any UI 
changes:
   
   ### Tests
   
   - [ ] My PR adds the following unit tests __OR__ does not need testing for 
this extremely good reason:
   
   ### Commits
   
   - [ ] My commits all reference Jira issues in their subject lines, and I 
have squashed multiple commits if they address the same issue. In addition, my 
commits follow the guidelines from "[How to write a good git commit 
message](http://chris.beams.io/posts/git-commit/)":
 1. Subject is separated from body by a blank line
 1. Subject is limited to 50 characters (not including Jira issue reference)
 1. Subject does not end with a period
 1. Subject uses the imperative mood ("add", not "adding")
 1. Body wraps at 72 characters
 1. Body explains "what" and "why", not "how"
   
   ### Documentation
   
   - [ ] In case of new functionality, my PR adds documentation that describes 
how to use it.
 - All the public functions and the classes in the PR contain docstrings 
that explain what they do
 - If you implement backwards incompatible changes, please leave a note in 
the [Updating.md](https://github.com/apache/airflow/blob/master/UPDATING.md) so 
we can assign it to an appropriate release
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[jira] [Resolved] (AIRFLOW-5298) Move FileToGcs to core

2019-10-18 Thread Kamil Bregula (Jira)


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-5298?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Kamil Bregula resolved AIRFLOW-5298.

Fix Version/s: 2.0.0
   Resolution: Fixed

> Move FileToGcs to core
> --
>
> Key: AIRFLOW-5298
> URL: https://issues.apache.org/jira/browse/AIRFLOW-5298
> Project: Apache Airflow
>  Issue Type: Task
>  Components: gcp
>Affects Versions: 2.0.0
>Reporter: Tomasz Urbaszek
>Priority: Major
> Fix For: 2.0.0
>
>




--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[jira] [Created] (AIRFLOW-5697) Retrofit Dataflow integration

2019-10-18 Thread Kamil Bregula (Jira)
Kamil Bregula created AIRFLOW-5697:
--

 Summary: Retrofit Dataflow integration
 Key: AIRFLOW-5697
 URL: https://issues.apache.org/jira/browse/AIRFLOW-5697
 Project: Apache Airflow
  Issue Type: Improvement
  Components: gcp
Affects Versions: 1.10.5
Reporter: Kamil Bregula






--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[jira] [Updated] (AIRFLOW-5698) Organize Dataflow tests

2019-10-18 Thread Kamil Bregula (Jira)


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-5698?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Kamil Bregula updated AIRFLOW-5698:
---
Parent: AIRFLOW-5697
Issue Type: Sub-task  (was: Improvement)

> Organize Dataflow tests
> ---
>
> Key: AIRFLOW-5698
> URL: https://issues.apache.org/jira/browse/AIRFLOW-5698
> Project: Apache Airflow
>  Issue Type: Sub-task
>  Components: gcp
>Affects Versions: 1.10.5
>Reporter: Kamil Bregula
>Assignee: Kamil Bregula
>Priority: Trivial
>




--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[jira] [Resolved] (AIRFLOW-5294) Make GCP MLEngine pylint compatible

2019-10-18 Thread Kamil Bregula (Jira)


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-5294?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Kamil Bregula resolved AIRFLOW-5294.

Fix Version/s: 2.0.0
 Assignee: Tomasz Urbaszek
   Resolution: Fixed

> Make GCP MLEngine pylint compatible
> ---
>
> Key: AIRFLOW-5294
> URL: https://issues.apache.org/jira/browse/AIRFLOW-5294
> Project: Apache Airflow
>  Issue Type: Improvement
>  Components: gcp
>Affects Versions: 2.0.0
>Reporter: Tomasz Urbaszek
>Assignee: Tomasz Urbaszek
>Priority: Major
> Fix For: 2.0.0
>
>




--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[jira] [Resolved] (AIRFLOW-5300) Move GcsToService operators to core

2019-10-18 Thread Kamil Bregula (Jira)


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-5300?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Kamil Bregula resolved AIRFLOW-5300.

Fix Version/s: 2.0.0
   Resolution: Fixed

> Move GcsToService operators to core
> ---
>
> Key: AIRFLOW-5300
> URL: https://issues.apache.org/jira/browse/AIRFLOW-5300
> Project: Apache Airflow
>  Issue Type: Task
>  Components: gcp
>Affects Versions: 2.0.0
>Reporter: Tomasz Urbaszek
>Priority: Major
> Fix For: 2.0.0
>
>




--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[jira] [Resolved] (AIRFLOW-5299) Move SQLToGCS to core

2019-10-18 Thread Kamil Bregula (Jira)


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-5299?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Kamil Bregula resolved AIRFLOW-5299.

Fix Version/s: 2.0.0
   Resolution: Fixed

> Move SQLToGCS to core
> -
>
> Key: AIRFLOW-5299
> URL: https://issues.apache.org/jira/browse/AIRFLOW-5299
> Project: Apache Airflow
>  Issue Type: Task
>  Components: gcp
>Affects Versions: 2.0.0
>Reporter: Tomasz Urbaszek
>Priority: Major
> Fix For: 2.0.0
>
>




--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[jira] [Resolved] (AIRFLOW-5297) Move AdlsToGcs operator to core

2019-10-18 Thread Kamil Bregula (Jira)


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-5297?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Kamil Bregula resolved AIRFLOW-5297.

Fix Version/s: 2.0.0
 Assignee: Tomasz Urbaszek
   Resolution: Fixed

> Move AdlsToGcs operator to core
> ---
>
> Key: AIRFLOW-5297
> URL: https://issues.apache.org/jira/browse/AIRFLOW-5297
> Project: Apache Airflow
>  Issue Type: Task
>  Components: gcp
>Affects Versions: 2.0.0
>Reporter: Tomasz Urbaszek
>Assignee: Tomasz Urbaszek
>Priority: Major
> Fix For: 2.0.0
>
>




--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[jira] [Commented] (AIRFLOW-5695) Cannot run a task from UI if its state is None

2019-10-18 Thread ASF GitHub Bot (Jira)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-5695?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16954980#comment-16954980
 ] 

ASF GitHub Bot commented on AIRFLOW-5695:
-

pingzh commented on pull request #6367: [AIRFLOW-5695] use RUNNING_DEPS to 
check run from UI
URL: https://github.com/apache/airflow/pull/6367
 
 
   Make sure you have checked _all_ steps below.
   
   ### Jira
   
   - [ ] My PR addresses the following [Airflow 
Jira](https://issues.apache.org/jira/browse/AIRFLOW/) issues and references 
them in the PR title. For example, "\[AIRFLOW-XXX\] My Airflow PR"
 - https://issues.apache.org/jira/browse/AIRFLOW-XXX
 - In case you are fixing a typo in the documentation you can prepend your 
commit with \[AIRFLOW-XXX\], code changes always need a Jira issue.
 - In case you are proposing a fundamental code change, you need to create 
an Airflow Improvement Proposal 
([AIP](https://cwiki.apache.org/confluence/display/AIRFLOW/Airflow+Improvements+Proposals)).
 - In case you are adding a dependency, check if the license complies with 
the [ASF 3rd Party License 
Policy](https://www.apache.org/legal/resolved.html#category-x).
   
   ### Description
   
   - [ ] Here are some details about my PR, including screenshots of any UI 
changes:
   
   ### Tests
   
   - [ ] My PR adds the following unit tests __OR__ does not need testing for 
this extremely good reason:
   
   ### Commits
   
   - [ ] My commits all reference Jira issues in their subject lines, and I 
have squashed multiple commits if they address the same issue. In addition, my 
commits follow the guidelines from "[How to write a good git commit 
message](http://chris.beams.io/posts/git-commit/)":
 1. Subject is separated from body by a blank line
 1. Subject is limited to 50 characters (not including Jira issue reference)
 1. Subject does not end with a period
 1. Subject uses the imperative mood ("add", not "adding")
 1. Body wraps at 72 characters
 1. Body explains "what" and "why", not "how"
   
   ### Documentation
   
   - [ ] In case of new functionality, my PR adds documentation that describes 
how to use it.
 - All the public functions and the classes in the PR contain docstrings 
that explain what they do
 - If you implement backwards incompatible changes, please leave a note in 
the [Updating.md](https://github.com/apache/airflow/blob/master/UPDATING.md) so 
we can assign it to an appropriate release
   
 

This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


> Cannot run a task from UI if its state is None
> --
>
> Key: AIRFLOW-5695
> URL: https://issues.apache.org/jira/browse/AIRFLOW-5695
> Project: Apache Airflow
>  Issue Type: Bug
>  Components: ui
>Affects Versions: 1.10.4
>Reporter: Ping Zhang
>Priority: Major
>




--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[GitHub] [airflow] pingzh opened a new pull request #6367: [AIRFLOW-5695] use RUNNING_DEPS to check run from UI

2019-10-18 Thread GitBox
pingzh opened a new pull request #6367: [AIRFLOW-5695] use RUNNING_DEPS to 
check run from UI
URL: https://github.com/apache/airflow/pull/6367
 
 
   Make sure you have checked _all_ steps below.
   
   ### Jira
   
   - [ ] My PR addresses the following [Airflow 
Jira](https://issues.apache.org/jira/browse/AIRFLOW/) issues and references 
them in the PR title. For example, "\[AIRFLOW-XXX\] My Airflow PR"
 - https://issues.apache.org/jira/browse/AIRFLOW-XXX
 - In case you are fixing a typo in the documentation you can prepend your 
commit with \[AIRFLOW-XXX\], code changes always need a Jira issue.
 - In case you are proposing a fundamental code change, you need to create 
an Airflow Improvement Proposal 
([AIP](https://cwiki.apache.org/confluence/display/AIRFLOW/Airflow+Improvements+Proposals)).
 - In case you are adding a dependency, check if the license complies with 
the [ASF 3rd Party License 
Policy](https://www.apache.org/legal/resolved.html#category-x).
   
   ### Description
   
   - [ ] Here are some details about my PR, including screenshots of any UI 
changes:
   
   ### Tests
   
   - [ ] My PR adds the following unit tests __OR__ does not need testing for 
this extremely good reason:
   
   ### Commits
   
   - [ ] My commits all reference Jira issues in their subject lines, and I 
have squashed multiple commits if they address the same issue. In addition, my 
commits follow the guidelines from "[How to write a good git commit 
message](http://chris.beams.io/posts/git-commit/)":
 1. Subject is separated from body by a blank line
 1. Subject is limited to 50 characters (not including Jira issue reference)
 1. Subject does not end with a period
 1. Subject uses the imperative mood ("add", not "adding")
 1. Body wraps at 72 characters
 1. Body explains "what" and "why", not "how"
   
   ### Documentation
   
   - [ ] In case of new functionality, my PR adds documentation that describes 
how to use it.
 - All the public functions and the classes in the PR contain docstrings 
that explain what they do
 - If you implement backwards incompatible changes, please leave a note in 
the [Updating.md](https://github.com/apache/airflow/blob/master/UPDATING.md) so 
we can assign it to an appropriate release
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[jira] [Commented] (AIRFLOW-5696) Add GoogleCloudStorageToSFTPOperator

2019-10-18 Thread ASF GitHub Bot (Jira)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-5696?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16954925#comment-16954925
 ] 

ASF GitHub Bot commented on AIRFLOW-5696:
-

TobKed commented on pull request #6366: [AIRFLOW-5696] 
GoogleCloudStorageToSFTPOperator
URL: https://github.com/apache/airflow/pull/6366
 
 
   Make sure you have checked _all_ steps below.
   
   ### Jira
   
   - [ ] My PR addresses the following [Airflow 
Jira](https://issues.apache.org/jira/browse/AIRFLOW/) issues and references 
them in the PR title. For example, "\[AIRFLOW-XXX\] My Airflow PR"
 - https://issues.apache.org/jira/browse/AIRFLOW-5696
 - In case you are fixing a typo in the documentation you can prepend your 
commit with \[AIRFLOW-XXX\], code changes always need a Jira issue.
 - In case you are proposing a fundamental code change, you need to create 
an Airflow Improvement Proposal 
([AIP](https://cwiki.apache.org/confluence/display/AIRFLOW/Airflow+Improvements+Proposals)).
 - In case you are adding a dependency, check if the license complies with 
the [ASF 3rd Party License 
Policy](https://www.apache.org/legal/resolved.html#category-x).
   
   ### Description
   
   - [ ] Here are some details about my PR, including screenshots of any UI 
changes:
   
   ### Tests
   
   - [ ] My PR adds the following unit tests __OR__ does not need testing for 
this extremely good reason:
   
   ### Commits
   
   - [ ] My commits all reference Jira issues in their subject lines, and I 
have squashed multiple commits if they address the same issue. In addition, my 
commits follow the guidelines from "[How to write a good git commit 
message](http://chris.beams.io/posts/git-commit/)":
 1. Subject is separated from body by a blank line
 1. Subject is limited to 50 characters (not including Jira issue reference)
 1. Subject does not end with a period
 1. Subject uses the imperative mood ("add", not "adding")
 1. Body wraps at 72 characters
 1. Body explains "what" and "why", not "how"
   
   ### Documentation
   
   - [ ] In case of new functionality, my PR adds documentation that describes 
how to use it.
 - All the public functions and the classes in the PR contain docstrings 
that explain what they do
 - If you implement backwards incompatible changes, please leave a note in 
the [Updating.md](https://github.com/apache/airflow/blob/master/UPDATING.md) so 
we can assign it to an appropriate release
   
 

This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


> Add GoogleCloudStorageToSFTPOperator
> 
>
> Key: AIRFLOW-5696
> URL: https://issues.apache.org/jira/browse/AIRFLOW-5696
> Project: Apache Airflow
>  Issue Type: Sub-task
>  Components: gcp, operators
>Affects Versions: 1.10.5
>Reporter: Tobiasz Kedzierski
>Assignee: Tobiasz Kedzierski
>Priority: Major
>




--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[jira] [Comment Edited] (AIRFLOW-3333) New features enable transferring of files or data from GCS to a SFTP remote path and SFTP to GCS path.

2019-10-18 Thread Tobiasz Kedzierski (Jira)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-3333?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16954489#comment-16954489
 ] 

Tobiasz Kedzierski edited comment on AIRFLOW-3333 at 10/18/19 7:30 PM:
---

Hi [~aaronfowles], [~pulinpathneja], [~kamil.bregula],
 I was also working on this, I have `GoogleCloudStorageToSFTPOperator` in 
progress. WDYT about it?
 -here is link:- 
[-https://github.com/PolideaInternal/airflow/pull/361/files#diff-741a1ddda812aa42923ba786189f671f-]

Edit: I've created subtask and PR for GCS to SFTP Operator: 
https://issues.apache.org/jira/browse/AIRFLOW-5696


was (Author: tobked):
Hi [~aaronfowles], [~pulinpathneja], [~kamil.bregula],
I was also working on this, I have `GoogleCloudStorageToSFTPOperator` in 
progress. WDYT about it?
here is link: 
[https://github.com/PolideaInternal/airflow/pull/361/files#diff-741a1ddda812aa42923ba786189f671f]

> New features enable transferring of files or data from GCS to a SFTP remote 
> path and SFTP to GCS path. 
> ---
>
> Key: AIRFLOW-3333
> URL: https://issues.apache.org/jira/browse/AIRFLOW-3333
> Project: Apache Airflow
>  Issue Type: New Feature
>  Components: contrib, gcp
>Reporter: Pulin Pathneja
>Priority: Major
>
> New features enable transferring of files or data from GCS(Google Cloud 
> Storage) to a SFTP remote path and SFTP to GCS(Google Cloud Storage) path. 
>   



--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[GitHub] [airflow] TobKed opened a new pull request #6366: [AIRFLOW-5696] GoogleCloudStorageToSFTPOperator

2019-10-18 Thread GitBox
TobKed opened a new pull request #6366: [AIRFLOW-5696] 
GoogleCloudStorageToSFTPOperator
URL: https://github.com/apache/airflow/pull/6366
 
 
   Make sure you have checked _all_ steps below.
   
   ### Jira
   
   - [ ] My PR addresses the following [Airflow 
Jira](https://issues.apache.org/jira/browse/AIRFLOW/) issues and references 
them in the PR title. For example, "\[AIRFLOW-XXX\] My Airflow PR"
 - https://issues.apache.org/jira/browse/AIRFLOW-5696
 - In case you are fixing a typo in the documentation you can prepend your 
commit with \[AIRFLOW-XXX\], code changes always need a Jira issue.
 - In case you are proposing a fundamental code change, you need to create 
an Airflow Improvement Proposal 
([AIP](https://cwiki.apache.org/confluence/display/AIRFLOW/Airflow+Improvements+Proposals)).
 - In case you are adding a dependency, check if the license complies with 
the [ASF 3rd Party License 
Policy](https://www.apache.org/legal/resolved.html#category-x).
   
   ### Description
   
   - [ ] Here are some details about my PR, including screenshots of any UI 
changes:
   
   ### Tests
   
   - [ ] My PR adds the following unit tests __OR__ does not need testing for 
this extremely good reason:
   
   ### Commits
   
   - [ ] My commits all reference Jira issues in their subject lines, and I 
have squashed multiple commits if they address the same issue. In addition, my 
commits follow the guidelines from "[How to write a good git commit 
message](http://chris.beams.io/posts/git-commit/)":
 1. Subject is separated from body by a blank line
 1. Subject is limited to 50 characters (not including Jira issue reference)
 1. Subject does not end with a period
 1. Subject uses the imperative mood ("add", not "adding")
 1. Body wraps at 72 characters
 1. Body explains "what" and "why", not "how"
   
   ### Documentation
   
   - [ ] In case of new functionality, my PR adds documentation that describes 
how to use it.
 - All the public functions and the classes in the PR contain docstrings 
that explain what they do
 - If you implement backwards incompatible changes, please leave a note in 
the [Updating.md](https://github.com/apache/airflow/blob/master/UPDATING.md) so 
we can assign it to an appropriate release
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[jira] [Created] (AIRFLOW-5696) Add GoogleCloudStorageToSFTPOperator

2019-10-18 Thread Tobiasz Kedzierski (Jira)
Tobiasz Kedzierski created AIRFLOW-5696:
---

 Summary: Add GoogleCloudStorageToSFTPOperator
 Key: AIRFLOW-5696
 URL: https://issues.apache.org/jira/browse/AIRFLOW-5696
 Project: Apache Airflow
  Issue Type: Sub-task
  Components: gcp, operators
Affects Versions: 1.10.5
Reporter: Tobiasz Kedzierski
Assignee: Tobiasz Kedzierski






--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[GitHub] [airflow] potiuk commented on issue #6349: [AIRFLOW-XXX] Add logo info to readme

2019-10-18 Thread GitBox
potiuk commented on issue #6349: [AIRFLOW-XXX] Add logo info to readme
URL: https://github.com/apache/airflow/pull/6349#issuecomment-543897037
 
 
   Thanks @leahecole !


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [airflow] potiuk merged pull request #6349: [AIRFLOW-XXX] Add logo info to readme

2019-10-18 Thread GitBox
potiuk merged pull request #6349: [AIRFLOW-XXX] Add logo info to readme
URL: https://github.com/apache/airflow/pull/6349
 
 
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[jira] [Created] (AIRFLOW-5695) Cannot run a task from UI if its state is None

2019-10-18 Thread Ping Zhang (Jira)
Ping Zhang created AIRFLOW-5695:
---

 Summary: Cannot run a task from UI if its state is None
 Key: AIRFLOW-5695
 URL: https://issues.apache.org/jira/browse/AIRFLOW-5695
 Project: Apache Airflow
  Issue Type: Bug
  Components: ui
Affects Versions: 1.10.4
Reporter: Ping Zhang






--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[GitHub] [airflow] milton0825 commented on issue #5942: [AIRFLOW-5339] Fix infinite wait for Spark subprocess

2019-10-18 Thread GitBox
milton0825 commented on issue #5942: [AIRFLOW-5339] Fix infinite wait for Spark 
subprocess
URL: https://github.com/apache/airflow/pull/5942#issuecomment-54390
 
 
   Got it. I thought it would automatically resolve the Jira.


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [airflow] codecov-io edited a comment on issue #6299: [AIRFLOW-5631] Change way of running GCP system tests

2019-10-18 Thread GitBox
codecov-io edited a comment on issue #6299: [AIRFLOW-5631] Change way of 
running GCP system tests
URL: https://github.com/apache/airflow/pull/6299#issuecomment-540547492
 
 
   # [Codecov](https://codecov.io/gh/apache/airflow/pull/6299?src=pr=h1) 
Report
   > Merging 
[#6299](https://codecov.io/gh/apache/airflow/pull/6299?src=pr=desc) into 
[master](https://codecov.io/gh/apache/airflow/commit/9ec562f88ef8e690f0b17526878b46847f0422e7?src=pr=desc)
 will **increase** coverage by `0.01%`.
   > The diff coverage is `94.91%`.
   
   [![Impacted file tree 
graph](https://codecov.io/gh/apache/airflow/pull/6299/graphs/tree.svg?width=650=WdLKlKHOAU=150=pr)](https://codecov.io/gh/apache/airflow/pull/6299?src=pr=tree)
   
   ```diff
    @@            Coverage Diff             @@
    ##           master    #6299      +/-   ##
    ==========================================
    + Coverage   80.09%   80.11%   +0.01%     
    ==========================================
      Files         616      617       +1     
      Lines       35804    35863      +59     
    ==========================================
    + Hits        28677    28730      +53     
    - Misses       7127     7133       +6
   ```
   
   
   | [Impacted 
Files](https://codecov.io/gh/apache/airflow/pull/6299?src=pr=tree) | 
Coverage Δ | |
   |---|---|---|
   | 
[airflow/utils/log/colored\_log.py](https://codecov.io/gh/apache/airflow/pull/6299/diff?src=pr=tree#diff-YWlyZmxvdy91dGlscy9sb2cvY29sb3JlZF9sb2cucHk=)
 | `93.18% <ø> (ø)` | :arrow_up: |
   | 
[airflow/contrib/hooks/gcp\_api\_base\_hook.py](https://codecov.io/gh/apache/airflow/pull/6299/diff?src=pr=tree#diff-YWlyZmxvdy9jb250cmliL2hvb2tzL2djcF9hcGlfYmFzZV9ob29rLnB5)
 | `100% <ø> (ø)` | :arrow_up: |
   | 
[airflow/gcp/utils/credentials\_provider.py](https://codecov.io/gh/apache/airflow/pull/6299/diff?src=pr=tree#diff-YWlyZmxvdy9nY3AvdXRpbHMvY3JlZGVudGlhbHNfcHJvdmlkZXIucHk=)
 | `94.91% <94.91%> (ø)` | |
   | 
[airflow/models/taskinstance.py](https://codecov.io/gh/apache/airflow/pull/6299/diff?src=pr=tree#diff-YWlyZmxvdy9tb2RlbHMvdGFza2luc3RhbmNlLnB5)
 | `93.28% <0%> (-0.51%)` | :arrow_down: |
   
   --
   
   [Continue to review full report at 
Codecov](https://codecov.io/gh/apache/airflow/pull/6299?src=pr=continue).
   > **Legend** - [Click here to learn 
more](https://docs.codecov.io/docs/codecov-delta)
   > `Δ = absolute  (impact)`, `ø = not affected`, `? = missing data`
   > Powered by 
[Codecov](https://codecov.io/gh/apache/airflow/pull/6299?src=pr=footer). 
Last update 
[9ec562f...0a71e2d](https://codecov.io/gh/apache/airflow/pull/6299?src=pr=lastupdated).
 Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments).
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services



[jira] [Commented] (AIRFLOW-5694) Check for blinker when detecting Sentry packages

2019-10-18 Thread ASF GitHub Bot (Jira)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-5694?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16954855#comment-16954855
 ] 

ASF GitHub Bot commented on AIRFLOW-5694:
-

marcusianlevine commented on pull request #6365: [AIRFLOW-5694] Check for 
blinker in Sentry setup
URL: https://github.com/apache/airflow/pull/6365
 
 
   ### Jira
   
   - [x] My PR addresses the following [Airflow 
Jira](https://issues.apache.org/jira/browse/AIRFLOW/) issues and references 
them in the PR title. For example, "\[AIRFLOW-XXX\] My Airflow PR"
 - https://issues.apache.org/jira/browse/AIRFLOW-5694#
 - In case you are fixing a typo in the documentation you can prepend your 
commit with \[AIRFLOW-XXX\], code changes always need a Jira issue.
 - In case you are proposing a fundamental code change, you need to create 
an Airflow Improvement Proposal 
([AIP](https://cwiki.apache.org/confluence/display/AIRFLOW/Airflow+Improvements+Proposals)).
 - In case you are adding a dependency, check if the license complies with 
the [ASF 3rd Party License 
Policy](https://www.apache.org/legal/resolved.html#category-x).
   
   ### Description
   
   - [x] Here are some details about my PR, including screenshots of any UI 
changes:
   
   In order to avoid setting up Sentry when `sentry-sdk` is installed but not 
`blinker`, we import an unused module from `blinker` so that the `ImportError` 
catch will trigger and use the `DummySentry` instead
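    
    For illustration only (this is not the code from this PR), a minimal sketch of 
    the guard described above could look like the snippet below; the `DummySentry` 
    and `ConfiguredSentry` names follow the description, but their bodies here are 
    placeholders:
    
```python
# Minimal sketch of the import-guard described above; not the actual
# airflow/sentry.py code. Class bodies are illustrative placeholders.
import logging

log = logging.getLogger(__name__)


class DummySentry:
    """No-op fallback used when Sentry support cannot be enabled."""


Sentry = DummySentry()

try:
    # blinker itself is unused; importing it just makes a missing blinker
    # raise ImportError so we fall back to DummySentry even when
    # sentry-sdk is installed.
    import blinker  # noqa: F401
    import sentry_sdk

    class ConfiguredSentry(DummySentry):
        """Stand-in for the real wrapper that wires up integrations."""

        def __init__(self):
            super().__init__()
            sentry_sdk.init()  # the real code also passes integrations=...

    Sentry = ConfiguredSentry()
except ImportError as err:
    log.debug("Sentry disabled, using DummySentry: %s", err)
```
    
    With a guard like this, environments that have `sentry-sdk` but lack `blinker` 
    fall back to the no-op client instead of failing at import time.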
   
   ### Tests
   
   - [x] My PR adds the following unit tests __OR__ does not need testing for 
this extremely good reason:
   Prevents a known edge case
   
   ### Commits
   
   - [x] My commits all reference Jira issues in their subject lines, and I 
have squashed multiple commits if they address the same issue. In addition, my 
commits follow the guidelines from "[How to write a good git commit 
message](http://chris.beams.io/posts/git-commit/)":
 1. Subject is separated from body by a blank line
 1. Subject is limited to 50 characters (not including Jira issue reference)
 1. Subject does not end with a period
 1. Subject uses the imperative mood ("add", not "adding")
 1. Body wraps at 72 characters
 1. Body explains "what" and "why", not "how"
   
   ### Documentation
   
   - [x] In case of new functionality, my PR adds documentation that describes 
how to use it.
 - All the public functions and the classes in the PR contain docstrings 
that explain what they do
 - If you implement backwards incompatible changes, please leave a note in 
the [Updating.md](https://github.com/apache/airflow/blob/master/UPDATING.md) so 
we can assign it to an appropriate release
   
 

This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


> Check for blinker when detecting Sentry packages
> 
>
> Key: AIRFLOW-5694
> URL: https://issues.apache.org/jira/browse/AIRFLOW-5694
> Project: Apache Airflow
>  Issue Type: Bug
>  Components: dependencies
>Affects Versions: 1.10.6
>Reporter: Marcus Levine
>Assignee: Marcus Levine
>Priority: Minor
> Fix For: 1.10.6
>
>
> After upgrading to 1.10.6rc1 with `sentry-sdk` installed but not specifying 
> the `[sentry]` extra, the dependency `blinker` will cause failures of the 
> following form:
> {code:python}
> ../lib/python3.7/site-packages/airflow/__init__.py:40: in 
>     from airflow.models import DAG
> ../lib/python3.7/site-packages/airflow/models/__init__.py:21: in 
>     from airflow.models.baseoperator import BaseOperator  # noqa: F401
> ../lib/python3.7/site-packages/airflow/models/baseoperator.py:42: in 
>     from airflow.models.dag import DAG
> ../lib/python3.7/site-packages/airflow/models/dag.py:51: in 
>     from airflow.models.taskinstance import TaskInstance, clear_task_instances
> ../lib/python3.7/site-packages/airflow/models/taskinstance.py:53: in 
>     from airflow.sentry import Sentry
> ../lib/python3.7/site-packages/airflow/sentry.py:167: in 
>     Sentry = ConfiguredSentry()
> ../lib/python3.7/site-packages/airflow/sentry.py:94: in __init__
>     init(integrations=integrations)
> ../lib/python3.7/site-packages/sentry_sdk/hub.py:81: in _init
>     client = Client(*args, **kwargs)  # type: ignore
> ../lib/python3.7/site-packages/sentry_sdk/client.py:80: in __init__
>     self._init_impl()
> ../lib/python3.7/site-packages/sentry_sdk/client.py:108: in _init_impl
>     with_defaults=self.options["default_integrations"],
> ../lib/python3.7/site-packages/sentry_sdk/integrations/__init__.py:82: in 
> setup_integrations
>     

[GitHub] [airflow-site] kgabryje opened a new pull request #82: [WIP] accordion w icon, blogpost box, youtube section, features, principles

2019-10-18 Thread GitBox
kgabryje opened a new pull request #82: [WIP] accordion w icon, blogpost box, 
youtube section, features, principles
URL: https://github.com/apache/airflow-site/pull/82
 
 
   https://github.com/apache/airflow-site/issues/27
   https://github.com/apache/airflow-site/issues/35
   https://github.com/apache/airflow-site/issues/36


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [airflow] marcusianlevine opened a new pull request #6365: [AIRFLOW-5694] Check for blinker in Sentry setup

2019-10-18 Thread GitBox
marcusianlevine opened a new pull request #6365: [AIRFLOW-5694] Check for 
blinker in Sentry setup
URL: https://github.com/apache/airflow/pull/6365
 
 
   ### Jira
   
   - [x] My PR addresses the following [Airflow 
Jira](https://issues.apache.org/jira/browse/AIRFLOW/) issues and references 
them in the PR title. For example, "\[AIRFLOW-XXX\] My Airflow PR"
 - https://issues.apache.org/jira/browse/AIRFLOW-5694#
 - In case you are fixing a typo in the documentation you can prepend your 
commit with \[AIRFLOW-XXX\], code changes always need a Jira issue.
 - In case you are proposing a fundamental code change, you need to create 
an Airflow Improvement Proposal 
([AIP](https://cwiki.apache.org/confluence/display/AIRFLOW/Airflow+Improvements+Proposals)).
 - In case you are adding a dependency, check if the license complies with 
the [ASF 3rd Party License 
Policy](https://www.apache.org/legal/resolved.html#category-x).
   
   ### Description
   
   - [x] Here are some details about my PR, including screenshots of any UI 
changes:
   
   In order to avoid setting up Sentry when `sentry-sdk` is installed but not 
`blinker`, we import an unused module from `blinker` so that the `ImportError` 
catch will trigger and use the `DummySentry` instead
   
   ### Tests
   
   - [x] My PR adds the following unit tests __OR__ does not need testing for 
this extremely good reason:
   Prevents a known edge case
   
   ### Commits
   
   - [x] My commits all reference Jira issues in their subject lines, and I 
have squashed multiple commits if they address the same issue. In addition, my 
commits follow the guidelines from "[How to write a good git commit 
message](http://chris.beams.io/posts/git-commit/)":
 1. Subject is separated from body by a blank line
 1. Subject is limited to 50 characters (not including Jira issue reference)
 1. Subject does not end with a period
 1. Subject uses the imperative mood ("add", not "adding")
 1. Body wraps at 72 characters
 1. Body explains "what" and "why", not "how"
   
   ### Documentation
   
   - [x] In case of new functionality, my PR adds documentation that describes 
how to use it.
 - All the public functions and the classes in the PR contain docstrings 
that explain what they do
 - If you implement backwards incompatible changes, please leave a note in 
the [Updating.md](https://github.com/apache/airflow/blob/master/UPDATING.md) so 
we can assign it to an appropriate release
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[jira] [Updated] (AIRFLOW-5694) Check for blinker when detecting Sentry packages

2019-10-18 Thread Marcus Levine (Jira)


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-5694?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Marcus Levine updated AIRFLOW-5694:
---
Description: 
{{After upgrading to 1.10.6rc1 with `sentry-sdk` installed but not specifying 
the `[sentry]` extra, the dependency `blinker` will cause failures of the 
following form: }}
{{ ../lib/python3.7/site-packages/airflow/__init__.py:40: in }}
{{    from airflow.models import DAG}}
{{ ../lib/python3.7/site-packages/airflow/models/__init__.py:21: in }}
{{    from airflow.models.baseoperator import BaseOperator  # noqa: F401}}
{{ ../lib/python3.7/site-packages/airflow/models/baseoperator.py:42: in 
}}
{{    from airflow.models.dag import DAG}}
{{ ../lib/python3.7/site-packages/airflow/models/dag.py:51: in }}
{{    from airflow.models.taskinstance import TaskInstance, 
clear_task_instances}}
{{ ../lib/python3.7/site-packages/airflow/models/taskinstance.py:53: in 
}}
{{    from airflow.sentry import Sentry}}
{{ ../lib/python3.7/site-packages/airflow/sentry.py:167: in }}
{{    Sentry = ConfiguredSentry()}}
{{ ../lib/python3.7/site-packages/airflow/sentry.py:94: in __init__}}
{{    init(integrations=integrations)}}
{{ ../lib/python3.7/site-packages/sentry_sdk/hub.py:81: in _init}}
{{    client = Client(*args, **kwargs)  # type: ignore}}
{{ ../lib/python3.7/site-packages/sentry_sdk/client.py:80: in __init__}}
{{    self._init_impl()}}
{{ ../lib/python3.7/site-packages/sentry_sdk/client.py:108: in _init_impl}}
{{    with_defaults=self.options["default_integrations"],}}
{{ ../lib/python3.7/site-packages/sentry_sdk/integrations/__init__.py:82: in 
setup_integrations}}
{{    type(integration).setup_once()}}
{{ ../lib/python3.7/site-packages/sentry_sdk/integrations/flask.py:57: in 
setup_once}}
{{    appcontext_pushed.connect(_push_appctx)}}
{{ ../lib/python3.7/site-packages/flask/signals.py:39: in _fail}}
{{    "Signalling support is unavailable because the blinker"}}
{{ E   RuntimeError: Signalling support is unavailable because the blinker 
library is not installed.}}

  was:
After upgrading to 1.10.6rc1 with `sentry-sdk` installed but not specifying the 
`[sentry]` extra, the dependency `blinker` will cause failures of the following 
form: 
../lib/python3.7/site-packages/airflow/__init__.py:40: in 
    from airflow.models import DAG
../lib/python3.7/site-packages/airflow/models/__init__.py:21: in 
    from airflow.models.baseoperator import BaseOperator  # noqa: F401
../lib/python3.7/site-packages/airflow/models/baseoperator.py:42: in 
    from airflow.models.dag import DAG
../lib/python3.7/site-packages/airflow/models/dag.py:51: in 
    from airflow.models.taskinstance import TaskInstance, clear_task_instances
../lib/python3.7/site-packages/airflow/models/taskinstance.py:53: in 
    from airflow.sentry import Sentry
../lib/python3.7/site-packages/airflow/sentry.py:167: in 
    Sentry = ConfiguredSentry()
../lib/python3.7/site-packages/airflow/sentry.py:94: in __init__
    init(integrations=integrations)
../lib/python3.7/site-packages/sentry_sdk/hub.py:81: in _init
    client = Client(*args, **kwargs)  # type: ignore
../lib/python3.7/site-packages/sentry_sdk/client.py:80: in __init__
    self._init_impl()
../lib/python3.7/site-packages/sentry_sdk/client.py:108: in _init_impl
    with_defaults=self.options["default_integrations"],
../lib/python3.7/site-packages/sentry_sdk/integrations/__init__.py:82: in 
setup_integrations
    type(integration).setup_once()
../lib/python3.7/site-packages/sentry_sdk/integrations/flask.py:57: in 
setup_once
    appcontext_pushed.connect(_push_appctx)
../lib/python3.7/site-packages/flask/signals.py:39: in _fail
    "Signalling support is unavailable because the blinker"
E   RuntimeError: Signalling support is unavailable because the blinker library 
is not installed.


> Check for blinker when detecting Sentry packages
> 
>
> Key: AIRFLOW-5694
> URL: https://issues.apache.org/jira/browse/AIRFLOW-5694
> Project: Apache Airflow
>  Issue Type: Bug
>  Components: dependencies
>Affects Versions: 1.10.6
>Reporter: Marcus Levine
>Assignee: Marcus Levine
>Priority: Minor
> Fix For: 1.10.6
>
>
> {{After upgrading to 1.10.6rc1 with `sentry-sdk` installed but not specifying 
> the `[sentry]` extra, the dependency `blinker` will cause failures of the 
> following form: }}
> {{ ../lib/python3.7/site-packages/airflow/__init__.py:40: in }}
> {{    from airflow.models import DAG}}
> {{ ../lib/python3.7/site-packages/airflow/models/__init__.py:21: in }}
> {{    from airflow.models.baseoperator import BaseOperator  # noqa: F401}}
> {{ ../lib/python3.7/site-packages/airflow/models/baseoperator.py:42: in 
> }}
> {{    from airflow.models.dag import DAG}}
> {{ ../lib/python3.7/site-packages/airflow/models/dag.py:51: in }}
> {{    from 

[jira] [Updated] (AIRFLOW-5694) Check for blinker when detecting Sentry packages

2019-10-18 Thread Marcus Levine (Jira)


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-5694?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Marcus Levine updated AIRFLOW-5694:
---
Description: 
After upgrading to 1.10.6rc1 with `sentry-sdk` installed but not specifying the 
`[sentry]` extra, the dependency `blinker` will cause failures of the following 
form:

{code:python}
../lib/python3.7/site-packages/airflow/__init__.py:40: in <module>
    from airflow.models import DAG
../lib/python3.7/site-packages/airflow/models/__init__.py:21: in <module>
    from airflow.models.baseoperator import BaseOperator  # noqa: F401
../lib/python3.7/site-packages/airflow/models/baseoperator.py:42: in <module>
    from airflow.models.dag import DAG
../lib/python3.7/site-packages/airflow/models/dag.py:51: in <module>
    from airflow.models.taskinstance import TaskInstance, clear_task_instances
../lib/python3.7/site-packages/airflow/models/taskinstance.py:53: in <module>
    from airflow.sentry import Sentry
../lib/python3.7/site-packages/airflow/sentry.py:167: in <module>
    Sentry = ConfiguredSentry()
../lib/python3.7/site-packages/airflow/sentry.py:94: in __init__
    init(integrations=integrations)
../lib/python3.7/site-packages/sentry_sdk/hub.py:81: in _init
    client = Client(*args, **kwargs)  # type: ignore
../lib/python3.7/site-packages/sentry_sdk/client.py:80: in __init__
    self._init_impl()
../lib/python3.7/site-packages/sentry_sdk/client.py:108: in _init_impl
    with_defaults=self.options["default_integrations"],
../lib/python3.7/site-packages/sentry_sdk/integrations/__init__.py:82: in 
setup_integrations
    type(integration).setup_once()
../lib/python3.7/site-packages/sentry_sdk/integrations/flask.py:57: in 
setup_once
    appcontext_pushed.connect(_push_appctx)
../lib/python3.7/site-packages/flask/signals.py:39: in _fail
    "Signalling support is unavailable because the blinker"
E   RuntimeError: Signalling support is unavailable because the blinker library 
is not installed.
{code}

  was:
After upgrading to 1.10.6rc1 with `sentry-sdk` installed but not specifying the 
`[sentry]` extra, the dependency `blinker` will cause failures of the following 
form:
{code:java}
../lib/python3.7/site-packages/airflow/__init__.py:40: in     from 
airflow.models import 
DAG../lib/python3.7/site-packages/airflow/models/__init__.py:21: in     
from airflow.models.baseoperator import BaseOperator  # noqa: 
F401../lib/python3.7/site-packages/airflow/models/baseoperator.py:42: in 
    from airflow.models.dag import 
DAG../lib/python3.7/site-packages/airflow/models/dag.py:51: in     from 
airflow.models.taskinstance import TaskInstance, 
clear_task_instances../lib/python3.7/site-packages/airflow/models/taskinstance.py:53:
 in     from airflow.sentry import 
Sentry../lib/python3.7/site-packages/airflow/sentry.py:167: in     
Sentry = ConfiguredSentry()../lib/python3.7/site-packages/airflow/sentry.py:94: 
in __init__    
init(integrations=integrations)../lib/python3.7/site-packages/sentry_sdk/hub.py:81:
 in _init    client = Client(*args, **kwargs)  # type: 
ignore../lib/python3.7/site-packages/sentry_sdk/client.py:80: in __init__    
self._init_impl()../lib/python3.7/site-packages/sentry_sdk/client.py:108: in 
_init_impl    
with_defaults=self.options["default_integrations"],../lib/python3.7/site-packages/sentry_sdk/integrations/__init__.py:82:
 in setup_integrations    
type(integration).setup_once()../lib/python3.7/site-packages/sentry_sdk/integrations/flask.py:57:
 in setup_once    
appcontext_pushed.connect(_push_appctx)../lib/python3.7/site-packages/flask/signals.py:39:
 in _fail    "Signalling support is unavailable because the blinker"E   
RuntimeError: Signalling support is unavailable because the blinker library is 
not installed.


{code}


> Check for blinker when detecting Sentry packages
> 
>
> Key: AIRFLOW-5694
> URL: https://issues.apache.org/jira/browse/AIRFLOW-5694
> Project: Apache Airflow
>  Issue Type: Bug
>  Components: dependencies
>Affects Versions: 1.10.6
>Reporter: Marcus Levine
>Assignee: Marcus Levine
>Priority: Minor
> Fix For: 1.10.6
>
>
> After upgrading to 1.10.6rc1 with `sentry-sdk` installed but not specifying 
> the `[sentry]` extra, the dependency `blinker` will cause failures of the 
> following form:
> {code:python}
> ../lib/python3.7/site-packages/airflow/__init__.py:40: in 
>     from airflow.models import DAG
> ../lib/python3.7/site-packages/airflow/models/__init__.py:21: in 
>     from airflow.models.baseoperator import BaseOperator  # noqa: F401
> ../lib/python3.7/site-packages/airflow/models/baseoperator.py:42: in 
>     from airflow.models.dag import DAG
> ../lib/python3.7/site-packages/airflow/models/dag.py:51: in 
>     from airflow.models.taskinstance import TaskInstance, clear_task_instances
> 

[jira] [Updated] (AIRFLOW-5694) Check for blinker when detecting Sentry packages

2019-10-18 Thread Marcus Levine (Jira)


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-5694?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Marcus Levine updated AIRFLOW-5694:
---
Description: 
After upgrading to 1.10.6rc1 with `sentry-sdk` installed but not specifying the 
`[sentry]` extra, the dependency `blinker` will cause failures of the following 
form:
{code:java}
../lib/python3.7/site-packages/airflow/__init__.py:40: in 
    from airflow.models import DAG
../lib/python3.7/site-packages/airflow/models/__init__.py:21: in 
    from airflow.models.baseoperator import BaseOperator  # noqa: F401
../lib/python3.7/site-packages/airflow/models/baseoperator.py:42: in 
    from airflow.models.dag import DAG
../lib/python3.7/site-packages/airflow/models/dag.py:51: in 
    from airflow.models.taskinstance import TaskInstance, clear_task_instances
../lib/python3.7/site-packages/airflow/models/taskinstance.py:53: in 
    from airflow.sentry import Sentry
../lib/python3.7/site-packages/airflow/sentry.py:167: in 
    Sentry = ConfiguredSentry()
../lib/python3.7/site-packages/airflow/sentry.py:94: in __init__
    init(integrations=integrations)
../lib/python3.7/site-packages/sentry_sdk/hub.py:81: in _init
    client = Client(*args, **kwargs)  # type: ignore
../lib/python3.7/site-packages/sentry_sdk/client.py:80: in __init__
    self._init_impl()
../lib/python3.7/site-packages/sentry_sdk/client.py:108: in _init_impl
    with_defaults=self.options["default_integrations"],
../lib/python3.7/site-packages/sentry_sdk/integrations/__init__.py:82: in setup_integrations
    type(integration).setup_once()
../lib/python3.7/site-packages/sentry_sdk/integrations/flask.py:57: in setup_once
    appcontext_pushed.connect(_push_appctx)
../lib/python3.7/site-packages/flask/signals.py:39: in _fail
    "Signalling support is unavailable because the blinker"
E   RuntimeError: Signalling support is unavailable because the blinker library is not installed.


{code}

  was:
{{After upgrading to 1.10.6rc1 with `sentry-sdk` installed but not specifying 
the `[sentry]` extra, the dependency `blinker` will cause failures of the 
following form: }}
{{ ../lib/python3.7/site-packages/airflow/__init__.py:40: in }}
{{    from airflow.models import DAG}}
{{ ../lib/python3.7/site-packages/airflow/models/__init__.py:21: in }}
{{    from airflow.models.baseoperator import BaseOperator  # noqa: F401}}
{{ ../lib/python3.7/site-packages/airflow/models/baseoperator.py:42: in 
}}
{{    from airflow.models.dag import DAG}}
{{ ../lib/python3.7/site-packages/airflow/models/dag.py:51: in }}
{{    from airflow.models.taskinstance import TaskInstance, 
clear_task_instances}}
{{ ../lib/python3.7/site-packages/airflow/models/taskinstance.py:53: in 
}}
{{    from airflow.sentry import Sentry}}
{{ ../lib/python3.7/site-packages/airflow/sentry.py:167: in }}
{{    Sentry = ConfiguredSentry()}}
{{ ../lib/python3.7/site-packages/airflow/sentry.py:94: in __init__}}
{{    init(integrations=integrations)}}
{{ ../lib/python3.7/site-packages/sentry_sdk/hub.py:81: in _init}}
{{    client = Client(*args, **kwargs)  # type: ignore}}
{{ ../lib/python3.7/site-packages/sentry_sdk/client.py:80: in __init__}}
{{    self._init_impl()}}
{{ ../lib/python3.7/site-packages/sentry_sdk/client.py:108: in _init_impl}}
{{    with_defaults=self.options["default_integrations"],}}
{{ ../lib/python3.7/site-packages/sentry_sdk/integrations/__init__.py:82: in 
setup_integrations}}
{{    type(integration).setup_once()}}
{{ ../lib/python3.7/site-packages/sentry_sdk/integrations/flask.py:57: in 
setup_once}}
{{    appcontext_pushed.connect(_push_appctx)}}
{{ ../lib/python3.7/site-packages/flask/signals.py:39: in _fail}}
{{    "Signalling support is unavailable because the blinker"}}
{{ E   RuntimeError: Signalling support is unavailable because the blinker 
library is not installed.}}


> Check for blinker when detecting Sentry packages
> 
>
> Key: AIRFLOW-5694
> URL: https://issues.apache.org/jira/browse/AIRFLOW-5694
> Project: Apache Airflow
>  Issue Type: Bug
>  Components: dependencies
>Affects Versions: 1.10.6
>Reporter: Marcus Levine
>Assignee: Marcus Levine
>Priority: Minor
> Fix For: 1.10.6
>
>
> After upgrading to 1.10.6rc1 with `sentry-sdk` installed but not specifying 
> the `[sentry]` extra, the dependency `blinker` will cause failures of the 
> following form:
> {code:java}
> ../lib/python3.7/site-packages/airflow/__init__.py:40: in 
>     from airflow.models import DAG
> ../lib/python3.7/site-packages/airflow/models/__init__.py:21: in 
>     from airflow.models.baseoperator import BaseOperator  # noqa: F401
> ../lib/python3.7/site-packages/airflow/models/baseoperator.py:42: in 
>     from airflow.models.dag import DAG
> ../lib/python3.7/site-packages/airflow/models/dag.py:51: in 
>     from 

[jira] [Created] (AIRFLOW-5694) Check for blinker when detecting Sentry packages

2019-10-18 Thread Marcus Levine (Jira)
Marcus Levine created AIRFLOW-5694:
--

 Summary: Check for blinker when detecting Sentry packages
 Key: AIRFLOW-5694
 URL: https://issues.apache.org/jira/browse/AIRFLOW-5694
 Project: Apache Airflow
  Issue Type: Bug
  Components: dependencies
Affects Versions: 1.10.6
Reporter: Marcus Levine
Assignee: Marcus Levine
 Fix For: 1.10.6


After upgrading to 1.10.6rc1 with `sentry-sdk` installed but not specifying the 
`[sentry]` extra, the dependency `blinker` will cause failures of the following 
form: 
../lib/python3.7/site-packages/airflow/__init__.py:40: in 
    from airflow.models import DAG
../lib/python3.7/site-packages/airflow/models/__init__.py:21: in 
    from airflow.models.baseoperator import BaseOperator  # noqa: F401
../lib/python3.7/site-packages/airflow/models/baseoperator.py:42: in 
    from airflow.models.dag import DAG
../lib/python3.7/site-packages/airflow/models/dag.py:51: in 
    from airflow.models.taskinstance import TaskInstance, clear_task_instances
../lib/python3.7/site-packages/airflow/models/taskinstance.py:53: in 
    from airflow.sentry import Sentry
../lib/python3.7/site-packages/airflow/sentry.py:167: in 
    Sentry = ConfiguredSentry()
../lib/python3.7/site-packages/airflow/sentry.py:94: in __init__
    init(integrations=integrations)
../lib/python3.7/site-packages/sentry_sdk/hub.py:81: in _init
    client = Client(*args, **kwargs)  # type: ignore
../lib/python3.7/site-packages/sentry_sdk/client.py:80: in __init__
    self._init_impl()
../lib/python3.7/site-packages/sentry_sdk/client.py:108: in _init_impl
    with_defaults=self.options["default_integrations"],
../lib/python3.7/site-packages/sentry_sdk/integrations/__init__.py:82: in 
setup_integrations
    type(integration).setup_once()
../lib/python3.7/site-packages/sentry_sdk/integrations/flask.py:57: in 
setup_once
    appcontext_pushed.connect(_push_appctx)
../lib/python3.7/site-packages/flask/signals.py:39: in _fail
    "Signalling support is unavailable because the blinker"
E   RuntimeError: Signalling support is unavailable because the blinker library 
is not installed.
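
The fix implied by the summary is to enable the Sentry integration only when blinker, which sentry-sdk's Flask integration needs for signal support, is importable. A minimal sketch of such a guard, purely illustrative and not necessarily how airflow/sentry.py ends up structured:

{code:python}
import logging

log = logging.getLogger(__name__)

try:
    # Require both packages before wiring up Sentry: the traceback above shows
    # sentry-sdk's Flask integration failing during setup when blinker is absent.
    import blinker  # noqa: F401
    import sentry_sdk  # noqa: F401
    SENTRY_ENABLED = True
except ImportError:
    SENTRY_ENABLED = False
    log.debug("sentry-sdk and/or blinker not installed; skipping Sentry integration")
{code}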



--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[GitHub] [airflow] ashb commented on a change in pull request #6364: [AIRFLOW-5693] Support the "blocks" component for the Slack messages

2019-10-18 Thread GitBox
ashb commented on a change in pull request #6364: [AIRFLOW-5693] Support the 
"blocks" component for the Slack messages
URL: https://github.com/apache/airflow/pull/6364#discussion_r336595294
 
 

 ##
 File path: tests/contrib/hooks/test_slack_webhook_hook.py
 ##
 @@ -35,6 +35,7 @@ class TestSlackWebhookHook(unittest.TestCase):
 'webhook_token': 'manual_token',
 'message': 'Awesome message to put on Slack',
 'attachments': [{'fallback': 'Required plain-text summary'}],
+'blocks': [{'type': 'section', 'text': {'type': 'mrkdwn', 'text': 
'*bold text*'}}],
 
 Review comment:
   You make an excellent point :D


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[jira] [Updated] (AIRFLOW-5693) Support the "blocks" component for the Slack messages

2019-10-18 Thread MinJae Kwon (Jira)


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-5693?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

MinJae Kwon updated AIRFLOW-5693:
-
Description: 
Currently, the slack webhook hook does not support "blocks" components for 
Slack messages ([https://api.slack.com/reference/block-kit/blocks])

 

So, we should support the "blocks" field to slack webhook hook.

  was:
Currently, the slack webhook hook does support "blocks" components for Slack 
messages ([https://api.slack.com/reference/block-kit/blocks])

 

So, we should support the "blocks" field to slack webhook hook.


> Support the "blocks" component for the Slack messages
> -
>
> Key: AIRFLOW-5693
> URL: https://issues.apache.org/jira/browse/AIRFLOW-5693
> Project: Apache Airflow
>  Issue Type: Improvement
>  Components: hooks
>Affects Versions: 1.10.6
>Reporter: MinJae Kwon
>Assignee: MinJae Kwon
>Priority: Minor
> Fix For: 1.10.7
>
>
> Currently, the slack webhook hook does not support "blocks" components for 
> Slack messages ([https://api.slack.com/reference/block-kit/blocks])
>  
> So, we should support the "blocks" field to slack webhook hook.
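
A rough sketch of how a "blocks" field could travel in the webhook payload (hypothetical helper; key names follow the Slack Incoming Webhooks API and may differ from the final SlackWebhookHook change in PR #6364):

{code:python}
import json

def build_slack_payload(message, attachments=None, blocks=None, channel=None):
    """Assemble the JSON body for a Slack incoming webhook (sketch)."""
    payload = {"text": message}
    if channel:
        payload["channel"] = channel
    if attachments:
        payload["attachments"] = attachments
    if blocks:
        payload["blocks"] = blocks
    return json.dumps(payload)

body = build_slack_payload(
    "Awesome message to put on Slack",
    blocks=[{"type": "section", "text": {"type": "mrkdwn", "text": "*bold text*"}}],
)
{code}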



--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[jira] [Updated] (AIRFLOW-5693) Support the "blocks" component for the Slack messages

2019-10-18 Thread MinJae Kwon (Jira)


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-5693?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

MinJae Kwon updated AIRFLOW-5693:
-
Description: 
Currently, the slack webhook hook does not support "blocks" component for the 
Slack messages.([https://api.slack.com/reference/block-kit/blocks])

 

So, we should support the "blocks" field to slack webhook hook.

  was:
Currently, the slack webhook hook does not support "blocks" components for 
Slack messages ([https://api.slack.com/reference/block-kit/blocks])

 

So, we should support the "blocks" field to slack webhook hook.


> Support the "blocks" component for the Slack messages
> -
>
> Key: AIRFLOW-5693
> URL: https://issues.apache.org/jira/browse/AIRFLOW-5693
> Project: Apache Airflow
>  Issue Type: Improvement
>  Components: hooks
>Affects Versions: 1.10.6
>Reporter: MinJae Kwon
>Assignee: MinJae Kwon
>Priority: Minor
> Fix For: 1.10.7
>
>
> Currently, the slack webhook hook does not support "blocks" component for the 
> Slack messages.([https://api.slack.com/reference/block-kit/blocks])
>  
> So, we should support the "blocks" field to slack webhook hook.



--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[GitHub] [airflow] mingrammer commented on a change in pull request #6364: [AIRFLOW-5693] Support the "blocks" component for the Slack messages

2019-10-18 Thread GitBox
mingrammer commented on a change in pull request #6364: [AIRFLOW-5693] Support 
the "blocks" component for the Slack messages
URL: https://github.com/apache/airflow/pull/6364#discussion_r336593052
 
 

 ##
 File path: tests/contrib/hooks/test_slack_webhook_hook.py
 ##
 @@ -35,6 +35,7 @@ class TestSlackWebhookHook(unittest.TestCase):
 'webhook_token': 'manual_token',
 'message': 'Awesome message to put on Slack',
 'attachments': [{'fallback': 'Required plain-text summary'}],
+'blocks': [{'type': 'section', 'text': {'type': 'mrkdwn', 'text': 
'*bold text*'}}],
 
 Review comment:
   It does not use the slack client module; it just makes a plain HTTP POST request.


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[jira] [Updated] (AIRFLOW-5693) Support the "blocks" component for the Slack messages

2019-10-18 Thread Ash Berlin-Taylor (Jira)


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-5693?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Ash Berlin-Taylor updated AIRFLOW-5693:
---
Fix Version/s: (was: 1.10.6)
   1.10.7

> Support the "blocks" component for the Slack messages
> -
>
> Key: AIRFLOW-5693
> URL: https://issues.apache.org/jira/browse/AIRFLOW-5693
> Project: Apache Airflow
>  Issue Type: Improvement
>  Components: hooks
>Affects Versions: 1.10.6
>Reporter: MinJae Kwon
>Assignee: MinJae Kwon
>Priority: Minor
> Fix For: 1.10.7
>
>
> Currently, the slack webhook hook does support "blocks" components for Slack 
> messages ([https://api.slack.com/reference/block-kit/blocks])
>  
> So, we should support the "blocks" field to slack webhook hook.



--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[GitHub] [airflow] ashb commented on a change in pull request #6364: [AIRFLOW-5693] Support the "blocks" component for the Slack messages

2019-10-18 Thread GitBox
ashb commented on a change in pull request #6364: [AIRFLOW-5693] Support the 
"blocks" component for the Slack messages
URL: https://github.com/apache/airflow/pull/6364#discussion_r336591510
 
 

 ##
 File path: tests/contrib/hooks/test_slack_webhook_hook.py
 ##
 @@ -35,6 +35,7 @@ class TestSlackWebhookHook(unittest.TestCase):
 'webhook_token': 'manual_token',
 'message': 'Awesome message to put on Slack',
 'attachments': [{'fallback': 'Required plain-text summary'}],
+'blocks': [{'type': 'section', 'text': {'type': 'mrkdwn', 'text': 
'*bold text*'}}],
 
 Review comment:
   Does the version of the slack module we support (1.x, not 2.x) support this?


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[jira] [Commented] (AIRFLOW-5693) Support the "blocks" component for the Slack messages

2019-10-18 Thread ASF GitHub Bot (Jira)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-5693?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16954813#comment-16954813
 ] 

ASF GitHub Bot commented on AIRFLOW-5693:
-

mingrammer commented on pull request #6364: [AIRFLOW-5693] Support the "blocks" 
component for the Slack messages
URL: https://github.com/apache/airflow/pull/6364
 
 
   Make sure you have checked _all_ steps below.
   
   ### Jira
   
   - [x] My PR addresses the following [Airflow 
Jira](https://issues.apache.org/jira/browse/AIRFLOW/) issues and references 
them in the PR title. For example, "\[AIRFLOW-XXX\] My Airflow PR"
 - https://issues.apache.org/jira/browse/AIRFLOW-XXX
 - In case you are fixing a typo in the documentation you can prepend your 
commit with \[AIRFLOW-XXX\], code changes always need a Jira issue.
 - In case you are proposing a fundamental code change, you need to create 
an Airflow Improvement Proposal 
([AIP](https://cwiki.apache.org/confluence/display/AIRFLOW/Airflow+Improvements+Proposals)).
 - In case you are adding a dependency, check if the license complies with 
the [ASF 3rd Party License 
Policy](https://www.apache.org/legal/resolved.html#category-x).
   
   ### Description
   
   - [x] Here are some details about my PR, including screenshots of any UI 
changes:
   
   ### Tests
   
   - [x] My PR adds the following unit tests __OR__ does not need testing for 
this extremely good reason:
   
   ### Commits
   
   - [x] My commits all reference Jira issues in their subject lines, and I 
have squashed multiple commits if they address the same issue. In addition, my 
commits follow the guidelines from "[How to write a good git commit 
message](http://chris.beams.io/posts/git-commit/)":
 1. Subject is separated from body by a blank line
 1. Subject is limited to 50 characters (not including Jira issue reference)
 1. Subject does not end with a period
 1. Subject uses the imperative mood ("add", not "adding")
 1. Body wraps at 72 characters
 1. Body explains "what" and "why", not "how"
   
   ### Documentation
   
   - [x] In case of new functionality, my PR adds documentation that describes 
how to use it.
 - All the public functions and the classes in the PR contain docstrings 
that explain what it does
 - If you implement backwards incompatible changes, please leave a note in 
the [Updating.md](https://github.com/apache/airflow/blob/master/UPDATING.md) so 
we can assign it to an appropriate release
   
 

This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


> Support the "blocks" component for the Slack messages
> -
>
> Key: AIRFLOW-5693
> URL: https://issues.apache.org/jira/browse/AIRFLOW-5693
> Project: Apache Airflow
>  Issue Type: Improvement
>  Components: hooks
>Affects Versions: 1.10.6
>Reporter: MinJae Kwon
>Assignee: MinJae Kwon
>Priority: Minor
> Fix For: 1.10.6
>
>
> Currently, the slack webhook hook does support "blocks" components for Slack 
> messages ([https://api.slack.com/reference/block-kit/blocks])
>  
> So, we should support the "blocks" field to slack webhook hook.



--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[GitHub] [airflow] ashb commented on a change in pull request #6348: [AIRFLOW-XXX] GSoD: Adding 'Create a custom operator' doc

2019-10-18 Thread GitBox
ashb commented on a change in pull request #6348: [AIRFLOW-XXX] GSoD: Adding 
'Create a custom operator' doc
URL: https://github.com/apache/airflow/pull/6348#discussion_r336589019
 
 

 ##
 File path: docs/howto/operator/custom-operator.rst
 ##
 @@ -0,0 +1,185 @@
+ .. Licensed to the Apache Software Foundation (ASF) under one
+or more contributor license agreements.  See the NOTICE file
+distributed with this work for additional information
+regarding copyright ownership.  The ASF licenses this file
+to you under the Apache License, Version 2.0 (the
+"License"); you may not use this file except in compliance
+with the License.  You may obtain a copy of the License at
+
+ ..   http://www.apache.org/licenses/LICENSE-2.0
+
+ .. Unless required by applicable law or agreed to in writing,
+software distributed under the License is distributed on an
+"AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+KIND, either express or implied.  See the License for the
+specific language governing permissions and limitations
+under the License.
+
+
+Create Custom Operator
 
 Review comment:
   `Create a custom Operator` then?


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [airflow] ashb commented on a change in pull request #6348: [AIRFLOW-XXX] GSoD: Adding 'Create a custom operator' doc

2019-10-18 Thread GitBox
ashb commented on a change in pull request #6348: [AIRFLOW-XXX] GSoD: Adding 
'Create a custom operator' doc
URL: https://github.com/apache/airflow/pull/6348#discussion_r336588731
 
 

 ##
 File path: docs/howto/operator/custom-operator.rst
 ##
 @@ -0,0 +1,185 @@
+ .. Licensed to the Apache Software Foundation (ASF) under one
+or more contributor license agreements.  See the NOTICE file
+distributed with this work for additional information
+regarding copyright ownership.  The ASF licenses this file
+to you under the Apache License, Version 2.0 (the
+"License"); you may not use this file except in compliance
+with the License.  You may obtain a copy of the License at
+
+ ..   http://www.apache.org/licenses/LICENSE-2.0
+
+ .. Unless required by applicable law or agreed to in writing,
+software distributed under the License is distributed on an
+"AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+KIND, either express or implied.  See the License for the
+specific language governing permissions and limitations
+under the License.
+
+
+Create Custom Operator
+===
+
+
+Airflow allows you to create new operators to suit the requirements of you or 
your team. 
+The extensibility is one of the many reasons which makes Apache Airflow 
powerful. 
+
+You can create any operator you want by extending the 
:class:`airflow.models.baseoperator.BaseOperator`
+
+There are two methods that you need to override in a derived class:
+
+* Constructor - Define the parameters required for the operator. You only need 
to specify the arguments specific to your operator.
+  Use ``@apply_defaults`` decorator function to fill unspecified arguments 
with ``default_args``. You can specify the ``default_args``
+  in the dag file. See :ref:`Default args ` for more details.
+
+* Execute - The code to execute when the runner calls the operator. The method 
contains the 
+  airflow context as a parameter that can be used to read config values.
+
+Let's implement an example ``HelloOperator``:
+
+.. code::  python
+
+from airflow.models.baseoperator import BaseOperator
+from airflow.utils.decorators import apply_defaults
+
+class HelloOperator(BaseOperator):
+
+@apply_defaults
+def __init__(
+self,
+name: str,
+*args, **kwargs) -> None:
+super().__init__(*args, **kwargs)
+self.name = name
+
+def execute(self, context):
+message = "Hello {}".format(self.name)
+print(message)
+return message
+
+You can now use the derived custom operator as follows:
+
+.. code:: python
+
+hello_task = HelloOperator(task_id='sample-task', dag=dag, name='foo_bar')
 
 Review comment:
   It might, but the main reason is consistency: all the rest of the examples have 
(or should have) already been updated to use this new preferred way.


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[jira] [Updated] (AIRFLOW-5693) Support blocks components for the Slack message

2019-10-18 Thread MinJae Kwon (Jira)


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-5693?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

MinJae Kwon updated AIRFLOW-5693:
-
Summary: Support blocks components for the Slack message  (was: Support 
slack blocks componentss)

> Support blocks components for the Slack message
> ---
>
> Key: AIRFLOW-5693
> URL: https://issues.apache.org/jira/browse/AIRFLOW-5693
> Project: Apache Airflow
>  Issue Type: Improvement
>  Components: hooks
>Affects Versions: 1.10.6
>Reporter: MinJae Kwon
>Assignee: MinJae Kwon
>Priority: Minor
> Fix For: 1.10.6
>
>
> Currently, the slack webhook hook does support "blocks" components for Slack 
> messages ([https://api.slack.com/reference/block-kit/blocks])
>  
> So, we should support the "blocks" field to slack webhook hook.



--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[GitHub] [airflow] ashb commented on a change in pull request #6295: [AIRFLOW-XXX] GSoD: Adding Task re-run documentation

2019-10-18 Thread GitBox
ashb commented on a change in pull request #6295: [AIRFLOW-XXX] GSoD: Adding 
Task re-run documentation
URL: https://github.com/apache/airflow/pull/6295#discussion_r336587858
 
 

 ##
 File path: docs/dag-run.rst
 ##
 @@ -0,0 +1,189 @@
+ .. Licensed to the Apache Software Foundation (ASF) under one
+or more contributor license agreements.  See the NOTICE file
+distributed with this work for additional information
+regarding copyright ownership.  The ASF licenses this file
+to you under the Apache License, Version 2.0 (the
+"License"); you may not use this file except in compliance
+with the License.  You may obtain a copy of the License at
+
+ ..   http://www.apache.org/licenses/LICENSE-2.0
+
+ .. Unless required by applicable law or agreed to in writing,
+software distributed under the License is distributed on an
+"AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+KIND, either express or implied.  See the License for the
+specific language governing permissions and limitations
+under the License.
+
+DAG Runs
+=
+A DAG Run is an object representing an instantiation of the DAG in time.
+
+Each DAG may or may not have a schedule, which informs how DAG Runs are
+created. ``schedule_interval`` is defined as a DAG argument, and receives
+preferably a
+`cron expression `_ as
+a ``str``, or a ``datetime.timedelta`` object. Alternatively, you can also
+use one of these cron "presets":
+
++--------------+--------------------------------------------------------------+---------------+
+| preset       | meaning                                                      | cron          |
++==============+==============================================================+===============+
+| ``None``     | Don't schedule, use for exclusively "externally triggered"   |               |
+|              | DAGs                                                         |               |
++--------------+--------------------------------------------------------------+---------------+
+| ``@once``    | Schedule once and only once                                  |               |
++--------------+--------------------------------------------------------------+---------------+
+| ``@hourly``  | Run once an hour at the beginning of the hour                | ``0 * * * *`` |
++--------------+--------------------------------------------------------------+---------------+
+| ``@daily``   | Run once a day at midnight                                   | ``0 0 * * *`` |
++--------------+--------------------------------------------------------------+---------------+
+| ``@weekly``  | Run once a week at midnight on Sunday morning                | ``0 0 * * 0`` |
++--------------+--------------------------------------------------------------+---------------+
+| ``@monthly`` | Run once a month at midnight of the first day of the month   | ``0 0 1 * *`` |
++--------------+--------------------------------------------------------------+---------------+
+| ``@yearly``  | Run once a year at midnight of January 1                     | ``0 0 1 1 *`` |
++--------------+--------------------------------------------------------------+---------------+
+
+Your DAG will be instantiated for each schedule along with a corresponding 
+DAG Run entry in backend.
+
+**Note**: If you run a DAG on a schedule_interval of one day, the run stamped 
2020-01-01 
 
 Review comment:
   I think this and the next paragraph should be in a `.. note::` block to make 
them stand out more. What does that look like?
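
For reference, a minimal DAG wired to one of the presets from the table in the hunk above might look like this (sketch only; assumes Airflow 1.10 imports, and the dag_id is illustrative):

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.dummy_operator import DummyOperator

# ``@daily`` is shorthand for "0 0 * * *": one run per day, stamped with the
# execution_date of the interval that has just ended.
dag = DAG(
    dag_id="example_daily_dag",       # illustrative name
    start_date=datetime(2020, 1, 1),
    schedule_interval="@daily",
)

DummyOperator(task_id="noop", dag=dag)
```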


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [airflow] mingrammer opened a new pull request #6364: [AIRFLOW-5693] Support the "blocks" component for the Slack messages

2019-10-18 Thread GitBox
mingrammer opened a new pull request #6364: [AIRFLOW-5693] Support the "blocks" 
component for the Slack messages
URL: https://github.com/apache/airflow/pull/6364
 
 
   Make sure you have checked _all_ steps below.
   
   ### Jira
   
   - [x] My PR addresses the following [Airflow 
Jira](https://issues.apache.org/jira/browse/AIRFLOW/) issues and references 
them in the PR title. For example, "\[AIRFLOW-XXX\] My Airflow PR"
 - https://issues.apache.org/jira/browse/AIRFLOW-XXX
 - In case you are fixing a typo in the documentation you can prepend your 
commit with \[AIRFLOW-XXX\], code changes always need a Jira issue.
 - In case you are proposing a fundamental code change, you need to create 
an Airflow Improvement Proposal 
([AIP](https://cwiki.apache.org/confluence/display/AIRFLOW/Airflow+Improvements+Proposals)).
 - In case you are adding a dependency, check if the license complies with 
the [ASF 3rd Party License 
Policy](https://www.apache.org/legal/resolved.html#category-x).
   
   ### Description
   
   - [x] Here are some details about my PR, including screenshots of any UI 
changes:
   
   ### Tests
   
   - [x] My PR adds the following unit tests __OR__ does not need testing for 
this extremely good reason:
   
   ### Commits
   
   - [x] My commits all reference Jira issues in their subject lines, and I 
have squashed multiple commits if they address the same issue. In addition, my 
commits follow the guidelines from "[How to write a good git commit 
message](http://chris.beams.io/posts/git-commit/)":
 1. Subject is separated from body by a blank line
 1. Subject is limited to 50 characters (not including Jira issue reference)
 1. Subject does not end with a period
 1. Subject uses the imperative mood ("add", not "adding")
 1. Body wraps at 72 characters
 1. Body explains "what" and "why", not "how"
   
   ### Documentation
   
   - [x] In case of new functionality, my PR adds documentation that describes 
how to use it.
 - All the public functions and the classes in the PR contain docstrings 
that explain what it does
 - If you implement backwards incompatible changes, please leave a note in 
the [Updating.md](https://github.com/apache/airflow/blob/master/UPDATING.md) so 
we can assign it to an appropriate release
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[jira] [Updated] (AIRFLOW-5693) Support the "blocks" component for the Slack messages

2019-10-18 Thread MinJae Kwon (Jira)


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-5693?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

MinJae Kwon updated AIRFLOW-5693:
-
Summary: Support the "blocks" component for the Slack messages  (was: 
Support "blocks" component for the Slack messages)

> Support the "blocks" component for the Slack messages
> -
>
> Key: AIRFLOW-5693
> URL: https://issues.apache.org/jira/browse/AIRFLOW-5693
> Project: Apache Airflow
>  Issue Type: Improvement
>  Components: hooks
>Affects Versions: 1.10.6
>Reporter: MinJae Kwon
>Assignee: MinJae Kwon
>Priority: Minor
> Fix For: 1.10.6
>
>
> Currently, the slack webhook hook does support "blocks" components for Slack 
> messages ([https://api.slack.com/reference/block-kit/blocks])
>  
> So, we should support the "blocks" field to slack webhook hook.



--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[jira] [Updated] (AIRFLOW-5693) Support "blocks" components for the Slack messages

2019-10-18 Thread MinJae Kwon (Jira)


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-5693?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

MinJae Kwon updated AIRFLOW-5693:
-
Summary: Support "blocks" components for the Slack messages  (was: Support 
"blocks" components for the Slack message)

> Support "blocks" components for the Slack messages
> --
>
> Key: AIRFLOW-5693
> URL: https://issues.apache.org/jira/browse/AIRFLOW-5693
> Project: Apache Airflow
>  Issue Type: Improvement
>  Components: hooks
>Affects Versions: 1.10.6
>Reporter: MinJae Kwon
>Assignee: MinJae Kwon
>Priority: Minor
> Fix For: 1.10.6
>
>
> Currently, the slack webhook hook does support "blocks" components for Slack 
> messages ([https://api.slack.com/reference/block-kit/blocks])
>  
> So, we should support the "blocks" field to slack webhook hook.



--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[jira] [Updated] (AIRFLOW-5693) Support "blocks" component for the Slack messages

2019-10-18 Thread MinJae Kwon (Jira)


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-5693?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

MinJae Kwon updated AIRFLOW-5693:
-
Summary: Support "blocks" component for the Slack messages  (was: Support 
"blocks" components for the Slack messages)

> Support "blocks" component for the Slack messages
> -
>
> Key: AIRFLOW-5693
> URL: https://issues.apache.org/jira/browse/AIRFLOW-5693
> Project: Apache Airflow
>  Issue Type: Improvement
>  Components: hooks
>Affects Versions: 1.10.6
>Reporter: MinJae Kwon
>Assignee: MinJae Kwon
>Priority: Minor
> Fix For: 1.10.6
>
>
> Currently, the slack webhook hook does support "blocks" components for Slack 
> messages ([https://api.slack.com/reference/block-kit/blocks])
>  
> So, we should support the "blocks" field to slack webhook hook.



--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[jira] [Updated] (AIRFLOW-5693) Support "blocks" components for the Slack message

2019-10-18 Thread MinJae Kwon (Jira)


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-5693?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

MinJae Kwon updated AIRFLOW-5693:
-
Summary: Support "blocks" components for the Slack message  (was: Support 
blocks components for the Slack message)

> Support "blocks" components for the Slack message
> -
>
> Key: AIRFLOW-5693
> URL: https://issues.apache.org/jira/browse/AIRFLOW-5693
> Project: Apache Airflow
>  Issue Type: Improvement
>  Components: hooks
>Affects Versions: 1.10.6
>Reporter: MinJae Kwon
>Assignee: MinJae Kwon
>Priority: Minor
> Fix For: 1.10.6
>
>
> Currently, the slack webhook hook does support "blocks" components for Slack 
> messages ([https://api.slack.com/reference/block-kit/blocks])
>  
> So, we should support the "blocks" field to slack webhook hook.



--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[GitHub] [airflow] ashb commented on a change in pull request #6295: [AIRFLOW-XXX] GSoD: Adding Task re-run documentation

2019-10-18 Thread GitBox
ashb commented on a change in pull request #6295: [AIRFLOW-XXX] GSoD: Adding 
Task re-run documentation
URL: https://github.com/apache/airflow/pull/6295#discussion_r336583966
 
 

 ##
 File path: docs/scheduler.rst
 ##
 @@ -32,161 +30,10 @@ Airflow production environment. To kick it off, all you 
need to do is
 execute ``airflow scheduler``. It will use the configuration specified in
 ``airflow.cfg``.
 
-Note that if you run a DAG on a ``schedule_interval`` of one day,
 
 Review comment:
   This was quite an important point. I feel we should direct people to the new 
dag run page (wherever we decide it lives) from here too.


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[jira] [Created] (AIRFLOW-5693) Support slack blocks componentss

2019-10-18 Thread MinJae Kwon (Jira)
MinJae Kwon created AIRFLOW-5693:


 Summary: Support slack blocks componentss
 Key: AIRFLOW-5693
 URL: https://issues.apache.org/jira/browse/AIRFLOW-5693
 Project: Apache Airflow
  Issue Type: Improvement
  Components: hooks
Affects Versions: 1.10.6
Reporter: MinJae Kwon
Assignee: MinJae Kwon
 Fix For: 1.10.6


Currently, the slack webhook hook does support "blocks" components for Slack 
messages ([https://api.slack.com/reference/block-kit/blocks])

 

So, we should support the "blocks" field to slack webhook hook.



--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[GitHub] [airflow] ashb commented on a change in pull request #6295: [AIRFLOW-XXX] GSoD: Adding Task re-run documentation

2019-10-18 Thread GitBox
ashb commented on a change in pull request #6295: [AIRFLOW-XXX] GSoD: Adding 
Task re-run documentation
URL: https://github.com/apache/airflow/pull/6295#discussion_r336581003
 
 

 ##
 File path: docs/index.rst
 ##
 @@ -84,6 +84,7 @@ Content
 concepts
 scheduler
 executor/index
+dag-run
 
 Review comment:
   I think this probably makes more sense as a page under Concepts -- it 
doesn't fit with the rest of the top level items we have.
   
   WDYT?


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[jira] [Resolved] (AIRFLOW-5688) Merge alembic migrations

2019-10-18 Thread Kaxil Naik (Jira)


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-5688?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Kaxil Naik resolved AIRFLOW-5688.
-
Resolution: Fixed

> Merge alembic migrations
> 
>
> Key: AIRFLOW-5688
> URL: https://issues.apache.org/jira/browse/AIRFLOW-5688
> Project: Apache Airflow
>  Issue Type: Bug
>  Components: database
>Affects Versions: 2.0.0
>Reporter: Fokko Driesprong
>Assignee: Fokko Driesprong
>Priority: Major
> Fix For: 2.0.0
>
>




--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[GitHub] [airflow] kaxil commented on issue #6352: [WIP][AIRFLOW-5683] Add SubDagOperator that propagates skipped state

2019-10-18 Thread GitBox
kaxil commented on issue #6352: [WIP][AIRFLOW-5683] Add SubDagOperator that 
propagates skipped state
URL: https://github.com/apache/airflow/pull/6352#issuecomment-543821517
 
 
   +1 for a flag instead of a new operator. 


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [airflow] KKcorps commented on a change in pull request #6348: [AIRFLOW-XXX] GSoD: Adding 'Create a custom operator' doc

2019-10-18 Thread GitBox
KKcorps commented on a change in pull request #6348: [AIRFLOW-XXX] GSoD: Adding 
'Create a custom operator' doc
URL: https://github.com/apache/airflow/pull/6348#discussion_r336572794
 
 

 ##
 File path: docs/howto/operator/custom-operator.rst
 ##
 @@ -0,0 +1,185 @@
+ .. Licensed to the Apache Software Foundation (ASF) under one
+or more contributor license agreements.  See the NOTICE file
+distributed with this work for additional information
+regarding copyright ownership.  The ASF licenses this file
+to you under the Apache License, Version 2.0 (the
+"License"); you may not use this file except in compliance
+with the License.  You may obtain a copy of the License at
+
+ ..   http://www.apache.org/licenses/LICENSE-2.0
+
+ .. Unless required by applicable law or agreed to in writing,
+software distributed under the License is distributed on an
+"AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+KIND, either express or implied.  See the License for the
+specific language governing permissions and limitations
+under the License.
+
+
+Create Custom Operator
+===
+
+
+Airflow allows you to create new operators to suit the requirements of you or 
your team. 
+The extensibility is one of the many reasons which makes Apache Airflow 
powerful. 
+
+You can create any operator you want by extending the 
:class:`airflow.models.baseoperator.BaseOperator`
+
+There are two methods that you need to override in a derived class:
+
+* Constructor - Define the parameters required for the operator. You only need 
to specify the arguments specific to your operator.
+  Use ``@apply_defaults`` decorator function to fill unspecified arguments 
with ``default_args``. You can specify the ``default_args``
+  in the dag file. See :ref:`Default args ` for more details.
+
+* Execute - The code to execute when the runner calls the operator. The method 
contains the 
+  airflow context as a parameter that can be used to read config values.
+
+Let's implement an example ``HelloOperator``:
+
+.. code::  python
+
+from airflow.models.baseoperator import BaseOperator
+from airflow.utils.decorators import apply_defaults
+
+class HelloOperator(BaseOperator):
+
+@apply_defaults
+def __init__(
+self,
+name: str,
+*args, **kwargs) -> None:
+super().__init__(*args, **kwargs)
+self.name = name
+
+def execute(self, context):
+message = "Hello {}".format(self.name)
+print(message)
+return message
+
+You can now use the derived custom operator as follows:
+
+.. code:: python
+
+hello_task = HelloOperator(task_id='sample-task', dag=dag, name='foo_bar')
 
 Review comment:
   Sure, but is there any particular reason for this change? Is the old method 
getting deprecated?


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [airflow] kaxil merged pull request #6362: [AIRFLOW-5688] Merge alembic migrations

2019-10-18 Thread GitBox
kaxil merged pull request #6362: [AIRFLOW-5688] Merge alembic migrations
URL: https://github.com/apache/airflow/pull/6362
 
 
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [airflow] KKcorps commented on a change in pull request #6348: [AIRFLOW-XXX] GSoD: Adding 'Create a custom operator' doc

2019-10-18 Thread GitBox
KKcorps commented on a change in pull request #6348: [AIRFLOW-XXX] GSoD: Adding 
'Create a custom operator' doc
URL: https://github.com/apache/airflow/pull/6348#discussion_r336572034
 
 

 ##
 File path: docs/howto/operator/custom-operator.rst
 ##
 @@ -0,0 +1,185 @@
+ .. Licensed to the Apache Software Foundation (ASF) under one
+or more contributor license agreements.  See the NOTICE file
+distributed with this work for additional information
+regarding copyright ownership.  The ASF licenses this file
+to you under the Apache License, Version 2.0 (the
+"License"); you may not use this file except in compliance
+with the License.  You may obtain a copy of the License at
+
+ ..   http://www.apache.org/licenses/LICENSE-2.0
+
+ .. Unless required by applicable law or agreed to in writing,
+software distributed under the License is distributed on an
+"AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+KIND, either express or implied.  See the License for the
+specific language governing permissions and limitations
+under the License.
+
+
+Create Custom Operator
 
 Review comment:
   Changing it because this way the tense is also in sync with the rest of the 
headings


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[jira] [Commented] (AIRFLOW-5386) Move Google Dataproc to core

2019-10-18 Thread Prabhjot Singh Bharaj (Jira)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-5386?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16954735#comment-16954735
 ] 

Prabhjot Singh Bharaj commented on AIRFLOW-5386:


Which release is this being planned in ?

I've come across a use case where airflow times out a stuck dataproc job, but 
doesn't cancel it.

I'd like to cancel that job before airflow starts another execution.
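
One way this is commonly handled is through the operator's on_kill hook, which Airflow invokes when a task is killed (including execution timeouts). A rough sketch of the idea; the _submit/_wait/_cancel helpers are hypothetical placeholders, not real Dataproc hook methods:

{code:python}
from airflow.models import BaseOperator

class CancelAwareJobOperator(BaseOperator):
    """Sketch of an operator that cancels its external job when the task is killed."""

    def execute(self, context):
        self.job_id = self._submit_job()   # hypothetical helper
        self._wait_for_job(self.job_id)    # hypothetical helper

    def on_kill(self):
        # Airflow calls on_kill when the task is terminated (e.g. on timeout),
        # which is the place to cancel the still-running remote job.
        if getattr(self, "job_id", None):
            self._cancel_job(self.job_id)  # hypothetical helper
{code}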

> Move Google Dataproc to core
> 
>
> Key: AIRFLOW-5386
> URL: https://issues.apache.org/jira/browse/AIRFLOW-5386
> Project: Apache Airflow
>  Issue Type: Improvement
>  Components: gcp
>Affects Versions: 2.0.0
>Reporter: Tomasz Urbaszek
>Priority: Major
>




--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[GitHub] [airflow] KKcorps commented on a change in pull request #6295: [AIRFLOW-XXX] GSoD: Adding Task re-run documentation

2019-10-18 Thread GitBox
KKcorps commented on a change in pull request #6295: [AIRFLOW-XXX] GSoD: Adding 
Task re-run documentation
URL: https://github.com/apache/airflow/pull/6295#discussion_r336570463
 
 

 ##
 File path: docs/dag-run.rst
 ##
 @@ -0,0 +1,193 @@
+ .. Licensed to the Apache Software Foundation (ASF) under one
+or more contributor license agreements.  See the NOTICE file
+distributed with this work for additional information
+regarding copyright ownership.  The ASF licenses this file
+to you under the Apache License, Version 2.0 (the
+"License"); you may not use this file except in compliance
+with the License.  You may obtain a copy of the License at
+
+ ..   http://www.apache.org/licenses/LICENSE-2.0
+
+ .. Unless required by applicable law or agreed to in writing,
+software distributed under the License is distributed on an
+"AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+KIND, either express or implied.  See the License for the
+specific language governing permissions and limitations
+under the License.
+
+DAG Runs
+=
+A DAG Run is an object representing an instantiation of the DAG in time.
+
+Each DAG may or may not have a schedule, which informs how ``DAG Runs`` are
+created. ``schedule_interval`` is defined as a DAG arguments, and receives
+preferably a
+`cron expression `_ as
+a ``str``, or a ``datetime.timedelta`` object. Alternatively, you can also
+use one of these cron "preset":
+
++--------------+--------------------------------------------------------------+---------------+
+| preset       | meaning                                                      | cron          |
++==============+==============================================================+===============+
+| ``None``     | Don't schedule, use for exclusively "externally triggered"   |               |
+|              | DAGs                                                         |               |
++--------------+--------------------------------------------------------------+---------------+
+| ``@once``    | Schedule once and only once                                  |               |
++--------------+--------------------------------------------------------------+---------------+
+| ``@hourly``  | Run once an hour at the beginning of the hour                | ``0 * * * *`` |
++--------------+--------------------------------------------------------------+---------------+
+| ``@daily``   | Run once a day at midnight                                   | ``0 0 * * *`` |
++--------------+--------------------------------------------------------------+---------------+
+| ``@weekly``  | Run once a week at midnight on Sunday morning                | ``0 0 * * 0`` |
++--------------+--------------------------------------------------------------+---------------+
+| ``@monthly`` | Run once a month at midnight of the first day of the month   | ``0 0 1 * *`` |
++--------------+--------------------------------------------------------------+---------------+
+| ``@yearly``  | Run once a year at midnight of January 1                     | ``0 0 1 1 *`` |
++--------------+--------------------------------------------------------------+---------------+
+
+Your DAG will be instantiated for each schedule along with a corresponding 
+``DAG Run`` entry in backend.
+
+**Note**: If you run a DAG on a schedule_interval of one day, the run stamped 
2020-01-01 
+will be triggered soon after 2020-01-01T23:59. In other words, the job 
instance is 
+started once the period it covers has ended.  The execution_date passed in the 
dag 
+will also be 2020-01-01.
+
+The first ``DAG Run`` is created based on the minimum ``start_date`` for the 
tasks in your DAG. 
+Subsequent ``DAG Runs`` are created by the scheduler process, based on your 
DAG’s ``schedule_interval``, 
+sequentially. If your start_date is 2020-01-01 and schedule_interval is @daily 
the first run 
+will be created on 2020-01-02 i.e. after your start date has passed.
+
+Re-run DAG
+''
+There can be cases where you will want to execute your DAG again. One such 
case is when the scheduled
+DAG run fails. Another can be the scheduled DAG run wasn't executed due to low 
resources or the DAG being turned off.
+
+Catchup
+---
+
+An Airflow DAG with a ``start_date``, possibly an ``end_date``, and a 
``schedule_interval`` defines a 
+series of intervals which the scheduler turn into individual DAG Runs and 
execute. A key capability 
+of Airflow is that these DAG Runs are atomic and idempotent items. The 
scheduler, by default, will
+kick off a DAG Run for any interval that has not been run (or has been 
cleared). This concept is called Catchup.
 
 Review comment:
   Updated with changes.
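
As a concrete companion to the Catchup passage in the hunk above, a minimal sketch (Airflow 1.10 style; the dag_id is illustrative):

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.dummy_operator import DummyOperator

# With catchup=False the scheduler creates a DAG Run only for the latest
# interval instead of backfilling every interval since start_date.
dag = DAG(
    dag_id="example_no_catchup",      # illustrative name
    start_date=datetime(2020, 1, 1),
    schedule_interval="@daily",
    catchup=False,
)

DummyOperator(task_id="noop", dag=dag)
```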


This is an automated message from the Apache Git Service.
To respond to the message, 

[GitHub] [airflow] JonnyIncognito edited a comment on issue #6210: [AIRFLOW-5567] BaseAsyncOperator

2019-10-18 Thread GitBox
JonnyIncognito edited a comment on issue #6210: [AIRFLOW-5567] BaseAsyncOperator
URL: https://github.com/apache/airflow/pull/6210#issuecomment-543808908
 
 
   > @JonnyIncognito I'm a gcp-er don't have aws access.
   
   @jaketf I'm almost certainly going to write an operator that combines 
EmrAddStepsOperator and EmrStepSensor on top of your work once we figure out 
the state retention issue, so I'd be happy to contribute it to your PR as an 
example if you don't get to a GCP example first.
   
   Question is what we're going to name this style of operators, related to the 
[other 
thread](https://github.com/apache/airflow/pull/6210#discussion_r335271755) 
about the class hierarchy and base naming. Options:
   
   - Suffix to imply composition / doing more than one thing: XYZAsyncOperator 
vs. XYZAtomicOperator e.g. `Emr...StepsAsyncOperator` or 
`Emr...StepsAtomicOperator`
   
   - How to imply action and sensing: RunXYZ vs. EnsureXYZ e.g. 
`EmrRunSteps...Operator` or `EmrEnsureSteps...Operator`
   
   I suspect these are going to become very popular, so it's important to start 
off with a good convention!


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [airflow] JonnyIncognito edited a comment on issue #6210: [AIRFLOW-5567] BaseAsyncOperator

2019-10-18 Thread GitBox
JonnyIncognito edited a comment on issue #6210: [AIRFLOW-5567] BaseAsyncOperator
URL: https://github.com/apache/airflow/pull/6210#issuecomment-543808908
 
 
   > @JonnyIncognito I'm a gcp-er don't have aws access.
   
   @jaketf I'm almost certainly going to write an operator that combines 
EmrAddStepsOperator and EmrStepSensor on top of your work once we figure out 
the state retention issue, so I'd be happy to contribute it to your PR as an 
example if you don't get to a GCP example first.
   
   Question is what we're going to name this style of operators, related to the 
[other 
thread](https://github.com/apache/airflow/pull/6210#discussion_r335271755) 
about the class hierarchy and base naming. Options:
   
   - Suffix to imply composition / doing more than one thing: XYZAsyncOperator 
vs. XYZAtomicOperator e.g. `Emr...StepsAsyncOperator` or 
`Emr...StepsAtomicOperator`
   
   - How to imply action and sensing: RunXYZ vs. EnsureXYZ e.g. 
`EmrRunSteps...Operator` or `EmrEnsureSteps...Operator`


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [airflow] JonnyIncognito commented on issue #6210: [AIRFLOW-5567] BaseAsyncOperator

2019-10-18 Thread GitBox
JonnyIncognito commented on issue #6210: [AIRFLOW-5567] BaseAsyncOperator
URL: https://github.com/apache/airflow/pull/6210#issuecomment-543808908
 
 
   > @JonnyIncognito I'm a gcp-er don't have aws access.
   
   @jaketf I'm almost certainly going to write an operator that combines 
EmrAddStepsOperator and EmrStepSensor on top of your work once we figure out 
the state retention issue, so I'd be happy to contribute it to your PR as an 
example if you don't get to a GCP example first.
   
   Question is what we're going to name this style of operators, related to the 
[other 
thread](https://github.com/apache/airflow/pull/6210#discussion_r335271755) 
about the class hierarchy and base naming. Options:
   
   - Suffix to imply composition / doing more than one thing: XYZAsyncOperator 
vs. XYZAtomicOperator e.g. `EmrRunStepsAsyncOperator` or 
`EmrRunStepsAtomicOperator`
   
   - How to imply action and sensing: RunXYZ vs. EnsureXYZ e.g. 
`EmrRunSteps...Operator` or `EmrEnsureSteps...Operator`


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [airflow] JonnyIncognito commented on a change in pull request #6210: [AIRFLOW-5567] BaseAsyncOperator

2019-10-18 Thread GitBox
JonnyIncognito commented on a change in pull request #6210: [AIRFLOW-5567] 
BaseAsyncOperator
URL: https://github.com/apache/airflow/pull/6210#discussion_r336552413
 
 

 ##
 File path: airflow/models/base_async_operator.py
 ##
 @@ -0,0 +1,161 @@
+# -*- coding: utf-8 -*-
+#
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+
+"""
+Base Asynchronous Operator for kicking off a long running
+operations and polling for completion with reschedule mode.
+"""
+
+from abc import abstractmethod
+from typing import Dict, List, Optional, Union
+
+from airflow.models import SkipMixin, TaskReschedule
+from airflow.models.xcom import XCOM_EXTERNAL_RESOURCE_ID_KEY
+from airflow.sensors.base_sensor_operator import BaseSensorOperator
+from airflow.utils.decorators import apply_defaults
+
+
 
 Review comment:
   @seelmann and @Fokko thanks for jumping in and helping to build consensus!
   
   > State.UP_FOR_RESCHEDULE means that it is waiting to be rescheduled again. 
There is a cooldown period before poking again. By making the transition from 
State.UP_FOR_RESCHEDULE to State.UP_FOR_RETRY will let the scheduler know that 
it can be rerun again.
   
   Ref. the state changes, my interpretation of the [scheduler 
code](https://github.com/apache/airflow/blob/master/airflow/jobs/scheduler_job.py#L1557)
 is that the state change is from `UP_FOR_RESCHEDULE` to `SCHEDULED` to 
`QUEUED` to `RUNNING`? If that's the case then it does make my option 3 much 
more tricky, since it's entirely expected to want to retain the current 
behaviour in the `QUEUED` to `RUNNING` transition for other paths than 
`UP_FOR_RESCHEDULE`.
   
   I do like the semantic separation between `UP_FOR_RETRY` and 
`UP_FOR_RESCHEDULE` because the latter implies that it's continuing work and 
state should be retained, whereas the former implies that it failed and the 
entire task should be retried.
   
   > My suggestion would be to not clear the xcom value, and when the operator 
finishes, we just overwrite the existing xcom value. WDYT?
   ...
   > I'll open up a PR, and see if we can upsert the xcom instead of clearing 
and inserting it.
   
   Yes, upsert could be an interesting approach. It's not guaranteed to be 
clean if e.g. an operator were to have branches that cause different XCom keys 
to be output, though. I'm open to it as a compromise as it'll cover all 
practical cases that we're aware of today. Ref. backwards compatibility, how do 
we know that all operator instances will behave correctly if we stop clearing 
their state before starting?
   
   > I'm also thinking about the class structure. Right now we have the 
BaseOperator -> BaseSensor -> BaseAsyncOperator, which feels a bit awkward. 
Ideally we would like to push the retry logic up in the tree.
   
   Agreed, see [my 
comment](https://github.com/apache/airflow/pull/6210#discussion_r335271755) in 
the other thread about this.
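
   To make the upsert idea concrete, a rough sketch of "overwrite instead of clear-then-insert" (assumes a SQLAlchemy session and an XCom-like model keyed by key/task_id/dag_id/execution_date; not necessarily how the actual PR implements it):

```python
from sqlalchemy.orm import Session

def upsert_xcom(session: Session, xcom_model, *, key, value,
                task_id, dag_id, execution_date):
    """Overwrite an existing XCom row in place rather than clearing it up front."""
    row = (
        session.query(xcom_model)
        .filter_by(key=key, task_id=task_id, dag_id=dag_id,
                   execution_date=execution_date)
        .one_or_none()
    )
    if row is None:
        row = xcom_model(key=key, task_id=task_id, dag_id=dag_id,
                         execution_date=execution_date)
        session.add(row)
    row.value = value  # any previous value is simply replaced
    session.commit()
```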


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[airflow-site] branch aip-11 updated: Add case study page (#81)

2019-10-18 Thread kamilbregula
This is an automated email from the ASF dual-hosted git repository.

kamilbregula pushed a commit to branch aip-11
in repository https://gitbox.apache.org/repos/asf/airflow-site.git


The following commit(s) were added to refs/heads/aip-11 by this push:
 new dbb402c  Add case study page (#81)
dbb402c is described below

commit dbb402ceccfaa87cbdb5e08e0ad633b47344d90d
Author: Kamil Gabryjelski 
AuthorDate: Fri Oct 18 17:27:11 2019 +0200

Add case study page (#81)
---
 landing-pages/site/assets/icons/spotify-logo.svg   | 21 ++
 .../scss/{main-custom.scss => _base-layout.scss}   | 38 +++---
 landing-pages/site/assets/scss/_buttons.scss   | 24 ++-
 .../scss/{main-custom.scss => _case-study.scss}| 14 ++--
 landing-pages/site/assets/scss/_list-boxes.scss| 10 +++
 landing-pages/site/assets/scss/_ol-ul.scss |  2 -
 .../assets/scss/{main-custom.scss => _pager.scss}  | 16 ++---
 .../scss/{main-custom.scss => _paragraph.scss} | 19 ++---
 .../assets/scss/{main-custom.scss => _quote.scss}  | 41 ---
 landing-pages/site/assets/scss/main-custom.scss|  5 ++
 .../site/content/en/case-studies/_index.html   |  7 ++
 .../content/en/case-studies/example-case1.html | 27 +++
 .../content/en/case-studies/example-case2.html | 27 +++
 .../content/en/case-studies/example-case3.html | 27 +++
 .../content/en/case-studies/example-case4.html | 27 +++
 .../content/en/case-studies/example-case5.html | 27 +++
 .../content/en/case-studies/example-case6.html | 27 +++
 landing-pages/site/data/case_studies.json  | 25 +--
 landing-pages/site/layouts/_default/baseof.html|  9 ++-
 .../layouts/{_default => case-studies}/baseof.html |  9 ++-
 .../content.html}  |  6 +-
 .../case-study.html => case-studies/list.html} | 24 ---
 .../single.html}   |  7 +-
 landing-pages/site/layouts/examples/list.html  | 83 --
 .../site/layouts/partials/boxes/case-study.html|  2 +-
 .../layouts/partials/buttons/button-hollow.html|  2 +-
 .../layouts/partials/buttons/button-with-icon.html |  2 +-
 .../partials/{boxes/case-study.html => pager.html} | 17 ++---
 .../{buttons/button-hollow.html => quote.html} | 10 ++-
 .../paragraph.html}|  3 +-
 30 files changed, 435 insertions(+), 123 deletions(-)

diff --git a/landing-pages/site/assets/icons/spotify-logo.svg 
b/landing-pages/site/assets/icons/spotify-logo.svg
new file mode 100644
index 000..1aebfcd
--- /dev/null
+++ b/landing-pages/site/assets/icons/spotify-logo.svg
@@ -0,0 +1,21 @@
+http://www.w3.org/2000/svg; width="166.806" height="50.029" 
viewBox="0 0 166.806 50.029">
+
+
+
+
+
+
+
+
+
+
+
+
+
+
diff --git a/landing-pages/site/assets/scss/main-custom.scss 
b/landing-pages/site/assets/scss/_base-layout.scss
similarity index 62%
copy from landing-pages/site/assets/scss/main-custom.scss
copy to landing-pages/site/assets/scss/_base-layout.scss
index 0d0dc95..2c560ec 100644
--- a/landing-pages/site/assets/scss/main-custom.scss
+++ b/landing-pages/site/assets/scss/_base-layout.scss
@@ -17,13 +17,33 @@
  * under the License.
  */
 
-@import url('https://fonts.googleapis.com/css?family=Rubik:500&display=swap');
-@import url('https://fonts.googleapis.com/css?family=Roboto:400,500,700&display=swap');
-@import url('https://fonts.googleapis.com/css?family=Roboto+Mono&display=swap');
+.base-layout {
+  padding: 64px 0 40px;
 
-@import "typography";
-@import "accordion";
-@import "buttons";
-@import "ol-ul";
-@import "list-boxes";
-@import "avatar";
+  &--button {
+display: flex;
+justify-content: flex-end;
+margin-right: 45px;
+margin-top: 80px
+  }
+}
+
+.page-header {
+  @extend .header__medium--greyish-brown;
+  text-align: center;
+  margin-bottom: 16px;
+}
+
+.page-subtitle {
+  @extend .subtitle__large--brownish-grey;
+  text-align: center;
+  font-weight: normal;
+  margin-bottom: 80px;
+}
+
+.container {
+  margin-top: 44px;
+  @media(min-width: 1200px) {
+max-width: 1230px;
+  }
+}
diff --git a/landing-pages/site/assets/scss/_buttons.scss 
b/landing-pages/site/assets/scss/_buttons.scss
index a186550..4aa76fc 100644
--- a/landing-pages/site/assets/scss/_buttons.scss
+++ b/landing-pages/site/assets/scss/_buttons.scss
@@ -16,7 +16,6 @@
  * specific language governing permissions and limitations
  * under the License.
  */
-
 @import "colors";
 
 button {
@@ -26,6 +25,10 @@ button {
   padding: 9px 29px;
   transition: all ease-out 0.2s;
 
+  &:disabled {
+cursor: not-allowed;
+  }
+
   &.btn-filled {
 border-color: map_get($colors, cerulean-blue);
 background-color: map_get($colors, cerulean-blue);
@@ -60,7 +63,12 @@ button {
   border-color: map_get($colors, cerulean-blue);
   background-color: 

[GitHub] [airflow-site] mik-laj merged pull request #81: Add case study page

2019-10-18 Thread GitBox
mik-laj merged pull request #81: Add case study page
URL: https://github.com/apache/airflow-site/pull/81
 
 
   




[jira] [Created] (AIRFLOW-5692) Retrofit Dataflow integration

2019-10-18 Thread Kamil Bregula (Jira)
Kamil Bregula created AIRFLOW-5692:
--

 Summary: Retrofit Dataflow integration
 Key: AIRFLOW-5692
 URL: https://issues.apache.org/jira/browse/AIRFLOW-5692
 Project: Apache Airflow
  Issue Type: Improvement
  Components: gcp
Affects Versions: 1.10.5
Reporter: Kamil Bregula








[GitHub] [airflow] TobKed commented on a change in pull request #6345: [AIRFLOW-5667] Improve type annotations in GCP

2019-10-18 Thread GitBox
TobKed commented on a change in pull request #6345: [AIRFLOW-5667] Improve type 
annotations in GCP
URL: https://github.com/apache/airflow/pull/6345#discussion_r336535574
 
 

 ##
 File path: airflow/gcp/hooks/cloud_sql.py
 ##
 @@ -924,17 +924,17 @@ def _generate_connection_uri(self) -> str:
quote_plus(self.password) if self.password else 'PASSWORD', ''))
 return connection_uri
 
-def _get_instance_socket_name(self):
-return self.project_id + ":" + self.location + ":" + self.instance
+def _get_instance_socket_name(self) -> str:
+return self.project_id + ":" + self.location + ":" + self.instance  # type: ignore
 
-def _get_sqlproxy_instance_specification(self):
+def _get_sqlproxy_instance_specification(self) -> str:
 instance_specification = self._get_instance_socket_name()
 if self.sql_proxy_use_tcp:
 instance_specification += "=tcp:" + str(self.sql_proxy_tcp_port)
 return instance_specification
 
 @provide_session
-def create_connection(self, session: Session = None):
+def create_connection(self, session: Session = None) -> None:
 
 Review comment:
   Thanks! I've missed it. Fixed.
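   
   For anyone wondering about the `# type: ignore` in the hunk above: a minimal,
   self-contained sketch of the pattern, assuming the attributes are declared
   `Optional[str]` (hypothetical stand-in class, not the actual hook in
   cloud_sql.py):
   
   ```python
   from typing import Optional


   class ExampleHook:
       def __init__(self, project_id: Optional[str], location: Optional[str],
                    instance: Optional[str]) -> None:
           self.project_id = project_id
           self.location = location
           self.instance = instance

       def _get_instance_socket_name(self) -> str:
           # mypy flags Optional[str] + str, hence the ignore in the diff;
           # an explicit guard like this one avoids the suppression instead.
           if not (self.project_id and self.location and self.instance):
               raise ValueError("project_id, location and instance must be set")
           return self.project_id + ":" + self.location + ":" + self.instance
   ```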




[GitHub] [airflow] potiuk commented on issue #6356: [AIRFLOW-XXX] Changelog for 1.10.6 (rc1)

2019-10-18 Thread GitBox
potiuk commented on issue #6356: [AIRFLOW-XXX] Changelog for 1.10.6 (rc1)
URL: https://github.com/apache/airflow/pull/6356#issuecomment-543782523
 
 
   @OmerJog: there are also Python 3 compatibility changes. See the discussion in
   
https://lists.apache.org/thread.html/b07a93c9114e3d3c55d4ee514955bac79bc012c7a00db627c6b4c55f@%3Cdev.airflow.apache.org%3E
   
   If the community is happy with the approach I proposed there, we will release
   a back-ported "per-provider" package that people will be able to install to
   use some of the "new" operators in Python 3-compatible installations of 1.10.*.
   This way people will be able to start using the "new" operators from the new
   import path without having to wait for 2.0. Initially we are going to do it
   for Google operators (we are about to finish migrating those according to
   AIP-21).
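   
   A hedged illustration of what the "new" import path could look like once such
   a back-port package is installed (the module layout shown in the comment is
   assumed for illustration and was still being discussed on the list at this
   point):
   
   ```python
   # Airflow 1.10.x today: operators are imported from airflow.contrib
   from airflow.contrib.operators.bigquery_operator import BigQueryOperator

   # With a hypothetical back-ported per-provider package installed, the same
   # kind of operator would be importable from its new home, e.g. something like:
   #   from airflow.providers.google.cloud.operators.bigquery import BigQueryExecuteQueryOperator
   ```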
   
   Feel free to voice your comments in that thread.
   
   J.
   
   On Fri, Oct 18, 2019 at 4:17 PM Ash Berlin-Taylor 
   wrote:
   
   > 2 new operators were missed from this release?
   > GoogleCloudStorageToGoogleDrive
   > https://issues.apache.org/jira/browse/AIRFLOW-4758
   >
   > BigQueryToMySqlOperator
   > https://issues.apache.org/jira/browse/AIRFLOW-4161
   >
   > No, those are not included in the release. Because of the reorganization
   > of GCP (both file names and base classes) in master, it is a lot of effort
   > to pull these back.
   >
   > —
   > You are receiving this because you are subscribed to this thread.
   > Reply to this email directly, view it on GitHub
   > 
,
   > or unsubscribe
   > 

   > .
   >
   
   
   -- 
   
   Jarek Potiuk
   Polidea  | Principal Software Engineer
   
   M: +48 660 796 129 <+48660796129>
   




[jira] [Updated] (AIRFLOW-5691) Rewrite Dataproc operators to use python library

2019-10-18 Thread Tomasz Urbaszek (Jira)


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-5691?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Tomasz Urbaszek updated AIRFLOW-5691:
-
Affects Version/s: (was: 2.0.0)
   1.10.5

> Rewrite Dataproc operators to use python library
> 
>
> Key: AIRFLOW-5691
> URL: https://issues.apache.org/jira/browse/AIRFLOW-5691
> Project: Apache Airflow
>  Issue Type: Improvement
>  Components: gcp
>Affects Versions: 1.10.5
>Reporter: Tomasz Urbaszek
>Priority: Major
>






[jira] [Created] (AIRFLOW-5691) Rewrite Dataproc operators to use python library

2019-10-18 Thread Tomasz Urbaszek (Jira)
Tomasz Urbaszek created AIRFLOW-5691:


 Summary: Rewrite Dataproc operators to use python library
 Key: AIRFLOW-5691
 URL: https://issues.apache.org/jira/browse/AIRFLOW-5691
 Project: Apache Airflow
  Issue Type: Improvement
  Components: gcp
Affects Versions: 2.0.0
Reporter: Tomasz Urbaszek








[GitHub] [airflow] Fokko commented on a change in pull request #6317: [AIRFLOW-5644] Simplify TriggerDagRunOperator usage

2019-10-18 Thread GitBox
Fokko commented on a change in pull request #6317: [AIRFLOW-5644] Simplify 
TriggerDagRunOperator usage
URL: https://github.com/apache/airflow/pull/6317#discussion_r336513331
 
 

 ##
 File path: airflow/operators/dagrun_operator.py
 ##
 @@ -75,24 +63,21 @@ def __init__(
 self.execution_date = None
 else:
 raise TypeError(
-'Expected str or datetime.datetime type '
-'for execution_date. Got {}'.format(
-type(execution_date)))
+"Expected str or datetime.datetime type for execution_date. "
+"Got {}".format(type(execution_date))
+)
 
 def execute(self, context):
 
 Review comment:
   Opportunity for type annotation :-)




[GitHub] [airflow] ashb commented on issue #6356: [AIRFLOW-XXX] Changelog for 1.10.6 (rc1)

2019-10-18 Thread GitBox
ashb commented on issue #6356: [AIRFLOW-XXX] Changelog for 1.10.6 (rc1)
URL: https://github.com/apache/airflow/pull/6356#issuecomment-543767834
 
 
   > 2 new operators were missed from this release?
   > GoogleCloudStorageToGoogleDrive
   > https://issues.apache.org/jira/browse/AIRFLOW-4758
   > 
   > BigQueryToMySqlOperator
   > https://issues.apache.org/jira/browse/AIRFLOW-4161
   
   No, those are not included in the release. Because of the reorganization of
GCP (both file names and base classes) in master, it is a lot of effort to pull
these back.




[jira] [Commented] (AIRFLOW-5080) npm: not found with ./airflow/www_rbac/compile_assets.sh

2019-10-18 Thread ASF GitHub Bot (Jira)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-5080?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16954632#comment-16954632
 ] 

ASF GitHub Bot commented on AIRFLOW-5080:
-

potiuk commented on pull request #5693: [AIRFLOW-5080] Added npm to apt-get 
install in compile.sh for docker
URL: https://github.com/apache/airflow/pull/5693
 
 
   
 



> npm: not found with ./airflow/www_rbac/compile_assets.sh
> 
>
> Key: AIRFLOW-5080
> URL: https://issues.apache.org/jira/browse/AIRFLOW-5080
> Project: Apache Airflow
>  Issue Type: Bug
>  Components: build
>Affects Versions: 1.10.3
>Reporter: Esfahan
>Assignee: Esfahan
>Priority: Blocker
>
> h2. Problem
> When I run 
> [scripts/ci/kubernetes/docker/build.sh|https://github.com/apache/airflow/blob/1.10.3/scripts/ci/kubernetes/docker/build.sh#L45]
>  to build the Docker image, I encountered the following error.
> {code}
> running compile_assets
> ./airflow/www_rbac/compile_assets.sh: 26: 
> ./airflow/www_rbac/compile_assets.sh: npm: not found
> {code}
> So, http://YOUR_HOSTNAME/static was not created, and I got a `500 Internal
> Server Error` in the browser (Chrome).
> h2. Proposal
> I think npm needs to be installed in
> [scripts/ci/kubernetes/docker/compile.sh|https://github.com/apache/airflow/blob/1.10.3/scripts/ci/kubernetes/docker/compile.sh#L30].
> {code}
> apt-get install -y --no-install-recommends git nodejs npm
> {code}
> It works for me.
> Please consider my proposal.





[GitHub] [airflow] nuclearpinguin commented on issue #6317: [AIRFLOW-5644] Simplify TriggerDagRunOperator usage

2019-10-18 Thread GitBox
nuclearpinguin commented on issue #6317: [AIRFLOW-5644] Simplify 
TriggerDagRunOperator usage
URL: https://github.com/apache/airflow/pull/6317#issuecomment-543765994
 
 
   Overall, I like the proposed simplification ✅




[GitHub] [airflow] nuclearpinguin commented on a change in pull request #6317: [AIRFLOW-5644] Simplify TriggerDagRunOperator usage

2019-10-18 Thread GitBox
nuclearpinguin commented on a change in pull request #6317: [AIRFLOW-5644] 
Simplify TriggerDagRunOperator usage
URL: https://github.com/apache/airflow/pull/6317#discussion_r336510499
 
 

 ##
 File path: airflow/operators/dagrun_operator.py
 ##
 @@ -75,24 +63,21 @@ def __init__(
 self.execution_date = None
 else:
 raise TypeError(
-'Expected str or datetime.datetime type '
-'for execution_date. Got {}'.format(
-type(execution_date)))
+"Expected str or datetime.datetime type for execution_date. "
+"Got {}".format(type(execution_date))
+)
 
 def execute(self, context):
 
 Review comment:
   It's not a related change, but since you are refactoring the operator, I think 
adding a type hint would not hurt.
   ```suggestion
   def execute(self, context: Dict):
   ```
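   
   If the suggestion is taken, the module would also need the matching import; a
   minimal sketch of the annotated signature (the real class has many more
   arguments, of course):
   
   ```python
   from typing import Dict

   from airflow.models import BaseOperator


   class TriggerDagRunOperator(BaseOperator):  # sketch only
       def execute(self, context: Dict) -> None:
           ...
   ```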




[GitHub] [airflow] nuclearpinguin commented on a change in pull request #6317: [AIRFLOW-5644] Simplify TriggerDagRunOperator usage

2019-10-18 Thread GitBox
nuclearpinguin commented on a change in pull request #6317: [AIRFLOW-5644] 
Simplify TriggerDagRunOperator usage
URL: https://github.com/apache/airflow/pull/6317#discussion_r336508465
 
 

 ##
 File path: airflow/example_dags/example_trigger_target_dag.py
 ##
 @@ -18,65 +18,36 @@
 # under the License.
 
 """
-This example illustrates the use of the TriggerDagRunOperator. There are 2
-entities at work in this scenario:
-1. The Controller DAG - the DAG that conditionally executes the trigger
-   (in example_trigger_controller.py)
-2. The Target DAG - DAG being triggered
-
-This example illustrates the following features :
-1. A TriggerDagRunOperator that takes:
-  a. A python callable that decides whether or not to trigger the Target DAG
-  b. An optional params dict passed to the python callable to help in
- evaluating whether or not to trigger the Target DAG
-  c. The id (name) of the Target DAG
-  d. The python callable can add contextual info to the DagRun created by
- way of adding a Pickleable payload (e.g. dictionary of primitives). This
- state is then made available to the TargetDag
-2. A Target DAG : c.f. example_trigger_target_dag.py
+Example usage of the TriggerDagRunOperator. This example holds 2 DAGs:
+1. 1st DAG (example_trigger_controller_dag) holds a TriggerDagRunOperator, 
which will trigger the 2nd DAG
+2. 2nd DAG (example_trigger_target_dag) which will be triggered by the 
TriggerDagRunOperator in the 1st DAG
 """
 
-import pprint
-
-import airflow
+import airflow.utils.dates
 from airflow.models import DAG
 from airflow.operators.bash_operator import BashOperator
 from airflow.operators.python_operator import PythonOperator
 
-pp = pprint.PrettyPrinter(indent=4)
-
-args = {
-'start_date': airflow.utils.dates.days_ago(2),
-'owner': 'Airflow',
-}
-
 dag = DAG(
-dag_id='example_trigger_target_dag',
-default_args=args,
+dag_id="example_trigger_target_dag",
+default_args={"start_date": airflow.utils.dates.days_ago(2), "owner": "Airflow"},
 schedule_interval=None,
 )
 
 
-def run_this_func(**kwargs):
+def run_this_func(**context):
 """
 Print the payload "message" passed to the DagRun conf attribute.
 
 :param dict kwargs: Context
 """
-print("Remotely received value of {} for key=message".
-  format(kwargs['dag_run'].conf['message']))
+print("Remotely received value of {} for 
key=message".format(context["dag_run"].conf["message"]))
 
 
-run_this = PythonOperator(
-task_id='run_this',
-python_callable=run_this_func,
-dag=dag,
-)
+run_this = PythonOperator(task_id="run_this", python_callable=run_this_func, dag=dag)
 
-# You can also access the DagRun object in templates
 bash_task = BashOperator(
 task_id="bash_task",
-bash_command='echo "Here is the message: '
- '{{ dag_run.conf["message"] if dag_run else "" }}" ',
+bash_command='echo "Here is the message: ' '{{ dag_run.conf["message"] if dag_run else "" }}" ',
 
 Review comment:
   ```suggestion
   bash_command='echo "Here is the message: '{{ dag_run.conf["message"] if dag_run else "" }}'" ',
   ```
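   
   For anyone reading the example DAG in the diff above end to end, a rough
   sketch of the controller side that would feed `dag_run.conf["message"]` to
   this target DAG, assuming the simplified TriggerDagRunOperator interface this
   PR is introducing (the exact parameter names may differ in the final version):
   
   ```python
   import airflow.utils.dates
   from airflow.models import DAG
   from airflow.operators.dagrun_operator import TriggerDagRunOperator

   with DAG(
       dag_id="example_trigger_controller_dag",
       default_args={"start_date": airflow.utils.dates.days_ago(2), "owner": "Airflow"},
       schedule_interval="@once",
   ) as dag:
       TriggerDagRunOperator(
           task_id="trigger_target",
           trigger_dag_id="example_trigger_target_dag",
           conf={"message": "Hello from the controller"},  # assumed keyword in the new API
       )
   ```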




  1   2   >