[jira] [Created] (AIRFLOW-6654) AWS DataSync - bugfix when creating locations

2020-01-27 Thread Bjorn Olsen (Jira)
Bjorn Olsen created AIRFLOW-6654:


 Summary: AWS DataSync - bugfix when creating locations
 Key: AIRFLOW-6654
 URL: https://issues.apache.org/jira/browse/AIRFLOW-6654
 Project: Apache Airflow
  Issue Type: Bug
  Components: aws
Affects Versions: 1.10.7
Reporter: Bjorn Olsen
Assignee: Bjorn Olsen


When creating an AWS DataSync task, the operator attempts to create a Location 
if it does not exist. However, this should only happen if the appropriate 
create_kwargs were provided; otherwise creation should not be attempted and an 
error should be raised instead.

 

This is not what currently happens: the log below shows that a source location 
was missing and the DataSync operator tried to create it, even though no kwargs 
were provided. In that case it should fail immediately instead.

The log message is also a bit hard to understand and can be improved.
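The intended behaviour can be sketched as follows (a minimal illustration of the logic described above; the function and argument names are hypothetical, not the actual AWSDataSyncOperator API):

```python
def create_location(**kwargs):
    # Stand-in for the boto3 DataSync create_location_* call (illustrative only)
    return "arn:aws:datasync:::location/new"


def choose_or_create_location(existing_location_arns, create_kwargs):
    """Pick an existing location, or create one only if kwargs were provided."""
    if existing_location_arns:
        return existing_location_arns[0]
    if not create_kwargs:
        # No create_kwargs were given: fail fast instead of attempting creation
        raise ValueError(
            "Location not found and no create_kwargs were provided; "
            "refusing to create it automatically."
        )
    return create_location(**create_kwargs)
```

With this shape, a missing location plus empty kwargs produces an explicit error rather than a confusing creation attempt.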



--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[jira] [Updated] (AIRFLOW-6649) Google storage to Snowflake

2020-01-27 Thread Kamil Bregula (Jira)


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-6649?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Kamil Bregula updated AIRFLOW-6649:
---
Labels: snowflake  (was: )

> Google storage to Snowflake
> ---
>
> Key: AIRFLOW-6649
> URL: https://issues.apache.org/jira/browse/AIRFLOW-6649
> Project: Apache Airflow
>  Issue Type: New Feature
>  Components: gcp, operators
>Affects Versions: 1.10.6
>Reporter: nexoriv
>Priority: Major
>  Labels: snowflake
>
> Can someone share a Google Storage to Snowflake operator?





[jira] [Commented] (AIRFLOW-6650) Google Cloud Platform Connection key json documentation or code is wrong

2020-01-27 Thread Kamil Bregula (Jira)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-6650?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17024905#comment-17024905
 ] 

Kamil Bregula commented on AIRFLOW-6650:


Does this problem occur in the master branch? 
[https://airflow.readthedocs.io/en/latest/]

> Google Cloud Platform Connection key json documentation or code is wrong
> 
>
> Key: AIRFLOW-6650
> URL: https://issues.apache.org/jira/browse/AIRFLOW-6650
> Project: Apache Airflow
>  Issue Type: Bug
>  Components: gcp
>Affects Versions: 1.10.4, 1.10.5, 1.10.6, 1.10.7
>Reporter: Evgeniy Sokolov
>Priority: Minor
>
> According to the documentation: 
> [https://airflow.readthedocs.io/en/stable/howto/connection/gcp.html] 
> The name of the external configuration for Keyfile JSON is: 
>  * {{extra__google_cloud_platform__key_dict}} - Keyfile JSON
> Excluding the prefix ({{extra__google_cloud_platform__}}), the name of the 
> variable is *key_dict*.
> However, '*keyfile_dict*' is expected in the source code: 
> [https://github.com/apache/airflow/blob/master/airflow/gcp/hooks/base.py]
> {code:java}
> 146: keyfile_dict = self._get_field('keyfile_dict', None)  # type: 
> Optional[str]{code}
> [https://github.com/apache/airflow/blob/v1-10-stable/airflow/contrib/hooks/gcp_api_base_hook.py]
> {code:java}
> 138: keyfile_dict = self._get_field('keyfile_dict', False){code}
>  
>  
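The mismatch is easy to reproduce with a toy version of the hook's field lookup (a sketch only; the real method is `_get_field` on the GCP base hook, and the prefix handling below is assumed from the linked sources):

```python
PREFIX = "extra__google_cloud_platform__"


def get_field(extras, field_name, default=None):
    # Mimics _get_field: the short name is combined with the long extras prefix
    return extras.get(PREFIX + field_name, default)


# A connection whose extras were filled in using the name from the source code
extras = {PREFIX + "keyfile_dict": '{"type": "service_account"}'}

# Looking the field up under the documented name 'key_dict' finds nothing,
# while the name actually used in the source, 'keyfile_dict', works:
assert get_field(extras, "key_dict") is None
assert get_field(extras, "keyfile_dict") is not None
```

So either the documentation should say *keyfile_dict*, or the code should accept *key_dict*.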





[jira] [Reopened] (AIRFLOW-2289) Add additional quick start to INSTALL

2020-01-27 Thread Kamil Bregula (Jira)


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-2289?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Kamil Bregula reopened AIRFLOW-2289:


> Add additional quick start to INSTALL
> -
>
> Key: AIRFLOW-2289
> URL: https://issues.apache.org/jira/browse/AIRFLOW-2289
> Project: Apache Airflow
>  Issue Type: Improvement
>Reporter: Bolke de Bruin
>Priority: Blocker
>  Labels: gsoc, gsoc2020, mentor
> Fix For: 1.10.0
>
>






[jira] [Resolved] (AIRFLOW-6451) self._print_stat() in dag_processing.py should be skippable by config option

2020-01-27 Thread Kamil Bregula (Jira)


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-6451?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Kamil Bregula resolved AIRFLOW-6451.

Fix Version/s: (was: 2.0.0)
   1.10.8
   Resolution: Fixed

> self._print_stat() in dag_processing.py should be skippable by config option
> 
>
> Key: AIRFLOW-6451
> URL: https://issues.apache.org/jira/browse/AIRFLOW-6451
> Project: Apache Airflow
>  Issue Type: Improvement
>  Components: scheduler
>Affects Versions: 1.10.7
>Reporter: t oo
>Assignee: t oo
>Priority: Minor
> Fix For: 1.10.8
>
>
> perf benefit
> clean up extra poll, logs, typos
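For reference, the skip is presumably driven by the existing scheduler stats interval setting; a hedged airflow.cfg sketch (the exact key and the zero/negative semantics are assumed from the issue title, not verified against the merged commit):

```ini
[scheduler]
# Assumed behaviour after this change: a non-positive interval skips
# self._print_stat() in dag_processing.py entirely
print_stats_interval = 0
```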





[jira] [Commented] (AIRFLOW-6451) self._print_stat() in dag_processing.py should be skippable by config option

2020-01-27 Thread ASF subversion and git services (Jira)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-6451?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17024904#comment-17024904
 ] 

ASF subversion and git services commented on AIRFLOW-6451:
--

Commit 1d73fb34e5b82524cd723db991eb837dc6ce062e in airflow's branch 
refs/heads/master from tooptoop4
[ https://gitbox.apache.org/repos/asf?p=airflow.git;h=1d73fb3 ]

[AIRFLOW-6451] self._print_stat() in dag_processing.py should be skippable 
(#7134)



> self._print_stat() in dag_processing.py should be skippable by config option
> 
>
> Key: AIRFLOW-6451
> URL: https://issues.apache.org/jira/browse/AIRFLOW-6451
> Project: Apache Airflow
>  Issue Type: Improvement
>  Components: scheduler
>Affects Versions: 1.10.7
>Reporter: t oo
>Assignee: t oo
>Priority: Minor
> Fix For: 2.0.0
>
>
> perf benefit
> clean up extra poll, logs, typos





[jira] [Commented] (AIRFLOW-6451) self._print_stat() in dag_processing.py should be skippable by config option

2020-01-27 Thread ASF GitHub Bot (Jira)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-6451?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17024903#comment-17024903
 ] 

ASF GitHub Bot commented on AIRFLOW-6451:
-

mik-laj commented on pull request #7134: [AIRFLOW-6451] self._print_stat() in 
dag_processing.py should be skippable
URL: https://github.com/apache/airflow/pull/7134
 
 
   
 

This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


> self._print_stat() in dag_processing.py should be skippable by config option
> 
>
> Key: AIRFLOW-6451
> URL: https://issues.apache.org/jira/browse/AIRFLOW-6451
> Project: Apache Airflow
>  Issue Type: Improvement
>  Components: scheduler
>Affects Versions: 1.10.7
>Reporter: t oo
>Assignee: t oo
>Priority: Minor
> Fix For: 2.0.0
>
>
> perf benefit
> clean up extra poll, logs, typos





[jira] [Resolved] (AIRFLOW-2289) Add additional quick start to INSTALL

2020-01-27 Thread Kamil Bregula (Jira)


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-2289?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Kamil Bregula resolved AIRFLOW-2289.

Resolution: Fixed

> Add additional quick start to INSTALL
> -
>
> Key: AIRFLOW-2289
> URL: https://issues.apache.org/jira/browse/AIRFLOW-2289
> Project: Apache Airflow
>  Issue Type: Improvement
>Reporter: Bolke de Bruin
>Priority: Blocker
>  Labels: gsoc, gsoc2020, mentor
> Fix For: 1.10.0
>
>






[GitHub] [airflow] mik-laj merged pull request #7134: [AIRFLOW-6451] self._print_stat() in dag_processing.py should be skippable

2020-01-27 Thread GitBox
mik-laj merged pull request #7134: [AIRFLOW-6451] self._print_stat() in 
dag_processing.py should be skippable
URL: https://github.com/apache/airflow/pull/7134
 
 
   




[jira] [Created] (AIRFLOW-6653) Add option to use local or DAG's timezone in jinja template macros

2020-01-27 Thread Chuan Qiu (Jira)
Chuan Qiu created AIRFLOW-6653:
--

 Summary: Add option to use local or DAG's timezone in jinja 
template macros
 Key: AIRFLOW-6653
 URL: https://issues.apache.org/jira/browse/AIRFLOW-6653
 Project: Apache Airflow
  Issue Type: New Feature
  Components: DAG, DagRun
Affects Versions: 1.10.7
Reporter: Chuan Qiu


We run our DAGs in a different timezone, but jinja macros like "ds" always use 
UTC.

 

If the DAG's scheduling and triggering are based on some timezone, it would make 
more sense to use the DAG's timezone for such macros. (Internally it is still 
fine to store exact UTC times in the SQL database.)
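The requested behaviour can be sketched in plain Python (illustrative only: `ds_in_tz` is a hypothetical helper, and a fixed UTC+11 offset stands in for a DAG timezone such as Australia/Sydney in January):

```python
from datetime import datetime, timedelta, timezone


def ds_in_tz(logical_date_utc, tz):
    """Render a ds-like macro (YYYY-MM-DD) in the given timezone."""
    return logical_date_utc.astimezone(tz).strftime("%Y-%m-%d")


aedt = timezone(timedelta(hours=11))  # stand-in for the DAG's timezone
run = datetime(2020, 1, 27, 23, 30, tzinfo=timezone.utc)

assert ds_in_tz(run, timezone.utc) == "2020-01-27"  # what "ds" renders today
assert ds_in_tz(run, aedt) == "2020-01-28"          # what the DAG's tz expects
```

A late-evening UTC run already falls on the next calendar day in the DAG's timezone, which is exactly where the current UTC-only "ds" surprises users.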





[jira] [Created] (AIRFLOW-6652) Set is_delete_operator_pod=True by default in KubernetesPodOperator()

2020-01-27 Thread Mathew Wicks (Jira)
Mathew Wicks created AIRFLOW-6652:
-

 Summary: Set is_delete_operator_pod=True by default in 
KubernetesPodOperator()
 Key: AIRFLOW-6652
 URL: https://issues.apache.org/jira/browse/AIRFLOW-6652
 Project: Apache Airflow
  Issue Type: Improvement
  Components: operators
Affects Versions: 1.10.7
Reporter: Mathew Wicks


As we are not far from a major release (Airflow 2.0), could we consider changing 
the default value of the *is_delete_operator_pod* argument of 
*KubernetesPodOperator()* from False to *True*?

 

For most use cases it is not desirable for the pod to keep running independently 
of the Airflow task, and having an unsafe default is bad practice.

 

Thoughts?
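To illustrate the effect of the proposed default (a toy sketch of the cleanup behaviour, not the real operator):

```python
class PodOperatorSketch:
    """Toy model of the pod-cleanup flag discussed above (hypothetical)."""

    def __init__(self, is_delete_operator_pod=True):  # proposed safe default
        self.is_delete_operator_pod = is_delete_operator_pod

    def finish_task(self, cluster_pods, pod_name):
        # With the safe default, the pod does not outlive the Airflow task
        if self.is_delete_operator_pod:
            cluster_pods.discard(pod_name)
        return cluster_pods


pods = {"my-task-pod"}
assert PodOperatorSketch().finish_task(pods, "my-task-pod") == set()
```

Users who really want the pod kept around would then have to opt in explicitly with `is_delete_operator_pod=False`.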





[GitHub] [airflow] akki commented on a change in pull request #6552: [AIRFLOW-5850] Capture task logs in DockerSwarmOperator

2020-01-27 Thread GitBox
akki commented on a change in pull request #6552: [AIRFLOW-5850] Capture task 
logs in DockerSwarmOperator
URL: https://github.com/apache/airflow/pull/6552#discussion_r371607250
 
 

 ##
 File path: airflow/providers/docker/operators/docker_swarm.py
 ##
 @@ -123,11 +129,43 @@ def _run_image(self):
 
 self.log.info('Service started: %s', str(self.service))
 
-status = None
 # wait for the service to start the task
 while not self.cli.tasks(filters={'service': self.service['ID']}):
 continue
-while True:
+
+logs = self.cli.service_logs(
+self.service['ID'], follow=True, stdout=True, stderr=True, 
is_tty=self.tty
+)
+line = ''
+_stream_logs = self.enable_logging  # Status of the service_logs' 
generator
+while True:  # pylint: disable=too-many-nested-blocks
+if self.enable_logging:
 
 Review comment:
   Okay, I can think of a way to simplify the logic.
   
   (By putting the service termination check within the `except 
ConnectionError` clause itself.)
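A sketch of that suggestion (hypothetical names; the real code streams Docker service logs, replaced here by any generator that can drop the connection):

```python
def drain_logs(log_stream, service_terminated):
    """Collect log lines; on a dropped connection, consult the termination check."""
    lines = []
    try:
        for line in log_stream:
            lines.append(line)
    except ConnectionError:
        # The service-termination check lives inside the except clause itself
        if service_terminated():
            return lines  # the stream ended because the service finished: fine
        raise             # otherwise it is a genuine connection failure
    return lines
```

This keeps the happy path to a single loop and confines the termination logic to the one place a dropped stream can mean "service done".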




[GitHub] [airflow] zhongjiajie edited a comment on issue #5659: [AIRFLOW-5033] Switched to snakebite-py3 [DO NOT MERGE]

2020-01-27 Thread GitBox
zhongjiajie edited a comment on issue #5659: [AIRFLOW-5033] Switched to 
snakebite-py3 [DO NOT MERGE]
URL: https://github.com/apache/airflow/pull/5659#issuecomment-579059497
 
 
   @elukey Thanks for your work here. Yes, I think it could be done in a 
separate PR, but we should get it done before the Airflow 2.0 release, since we 
have a roadmap to drop py2 support before Airflow 2.0. WDYT @potiuk




[GitHub] [airflow] zhongjiajie edited a comment on issue #5659: [AIRFLOW-5033] Switched to snakebite-py3 [DO NOT MERGE]

2020-01-27 Thread GitBox
zhongjiajie edited a comment on issue #5659: [AIRFLOW-5033] Switched to 
snakebite-py3 [DO NOT MERGE]
URL: https://github.com/apache/airflow/pull/5659#issuecomment-579059497
 
 
   Thanks for your work here. Yes, I think it could be done in a separate PR, 
but we should get it done before the Airflow 2.0 release, since we have a 
roadmap to drop py2 support before Airflow 2.0. WDYT @potiuk




[GitHub] [airflow] zhongjiajie commented on issue #5659: [AIRFLOW-5033] Switched to snakebite-py3 [DO NOT MERGE]

2020-01-27 Thread GitBox
zhongjiajie commented on issue #5659: [AIRFLOW-5033] Switched to snakebite-py3 
[DO NOT MERGE]
URL: https://github.com/apache/airflow/pull/5659#issuecomment-579059497
 
 
   Thanks for your work here. Yes, I think it could be done in a separate PR, 
but we should get it done before the Airflow 2.0 release. WDYT @potiuk




[GitHub] [airflow] zhongjiajie commented on a change in pull request #6090: [AIRFLOW-5470] Add Apache Livy REST operator

2020-01-27 Thread GitBox
zhongjiajie commented on a change in pull request #6090: [AIRFLOW-5470] Add 
Apache Livy REST operator
URL: https://github.com/apache/airflow/pull/6090#discussion_r371591219
 
 

 ##
 File path: docs/operators-and-hooks-ref.rst
 ##
 @@ -98,6 +98,12 @@ Foundation.
:mod:`airflow.sensors.hive_partition_sensor`,
:mod:`airflow.sensors.metastore_partition_sensor`
 
+   * - `Apache Livy `__
+ -
+ - :mod:`airflow.contrib.hooks.livy_hook`
 
 Review comment:
   And the other paths too




[GitHub] [airflow] zhongjiajie commented on a change in pull request #6090: [AIRFLOW-5470] Add Apache Livy REST operator

2020-01-27 Thread GitBox
zhongjiajie commented on a change in pull request #6090: [AIRFLOW-5470] Add 
Apache Livy REST operator
URL: https://github.com/apache/airflow/pull/6090#discussion_r371591125
 
 

 ##
 File path: docs/operators-and-hooks-ref.rst
 ##
 @@ -98,6 +98,12 @@ Foundation.
:mod:`airflow.sensors.hive_partition_sensor`,
:mod:`airflow.sensors.metastore_partition_sensor`
 
+   * - `Apache Livy `__
+ -
+ - :mod:`airflow.contrib.hooks.livy_hook`
 
 Review comment:
   ```suggestion
- :mod:`airflow.providers.apache.livy.hooks.livy_hook`
   ```




[GitHub] [airflow] zhongjiajie commented on issue #6090: [AIRFLOW-5470] Add Apache Livy REST operator

2020-01-27 Thread GitBox
zhongjiajie commented on issue #6090: [AIRFLOW-5470] Add Apache Livy REST 
operator
URL: https://github.com/apache/airflow/pull/6090#issuecomment-579057953
 
 
   FYI, CI is unhappy: you should add the new hook, operator and sensor to 
`operators-and-hooks-ref.rst` @lucacavazzana
   
   ```
   Missing modules:
   airflow.providers.apache.livy.hooks.livy_hook
   airflow.providers.apache.livy.operators.livy_operator
   airflow.providers.apache.livy.sensors.livy_sensor
   
   Please add this module to operators-and-hooks-ref.rst file.
   ```




[GitHub] [airflow] roitvt commented on issue #7163: [AIRFLOW-6542] add spark-on-k8s operator/hook/sensor

2020-01-27 Thread GitBox
roitvt commented on issue #7163: [AIRFLOW-6542] add spark-on-k8s 
operator/hook/sensor
URL: https://github.com/apache/airflow/pull/7163#issuecomment-579006913
 
 
   @mik-laj can you take another look? :) I fixed the connection according to 
your recommendations
   




[GitHub] [airflow] codecov-io edited a comment on issue #7163: [AIRFLOW-6542] add spark-on-k8s operator/hook/sensor

2020-01-27 Thread GitBox
codecov-io edited a comment on issue #7163: [AIRFLOW-6542] add spark-on-k8s 
operator/hook/sensor
URL: https://github.com/apache/airflow/pull/7163#issuecomment-574641714
 
 
   # [Codecov](https://codecov.io/gh/apache/airflow/pull/7163?src=pr=h1) 
Report
   > Merging 
[#7163](https://codecov.io/gh/apache/airflow/pull/7163?src=pr=desc) into 
[master](https://codecov.io/gh/apache/airflow/commit/113ff1bcc9a90207e1aaa82e7f72b1322f835afd?src=pr=desc)
 will **decrease** coverage by `0.07%`.
   > The diff coverage is `40.49%`.
   
   [![Impacted file tree 
graph](https://codecov.io/gh/apache/airflow/pull/7163/graphs/tree.svg?width=650=WdLKlKHOAU=150=pr)](https://codecov.io/gh/apache/airflow/pull/7163?src=pr=tree)
   
   ```diff
   @@Coverage Diff @@
   ##   master#7163  +/-   ##
   ==
   - Coverage   85.39%   85.31%   -0.08% 
   ==
 Files 791  842  +51 
 Lines   4013740444 +307 
   ==
   + Hits3427334506 +233 
   - Misses   5864 5938  +74
   ```
   
   
   | [Impacted 
Files](https://codecov.io/gh/apache/airflow/pull/7163?src=pr=tree) | 
Coverage Δ | |
   |---|---|---|
   | 
[airflow/www/forms.py](https://codecov.io/gh/apache/airflow/pull/7163/diff?src=pr=tree#diff-YWlyZmxvdy93d3cvZm9ybXMucHk=)
 | `100% <100%> (ø)` | :arrow_up: |
   | 
[airflow/www/views.py](https://codecov.io/gh/apache/airflow/pull/7163/diff?src=pr=tree#diff-YWlyZmxvdy93d3cvdmlld3MucHk=)
 | `76.07% <100%> (ø)` | :arrow_up: |
   | 
[airflow/contrib/hooks/kubernetes\_hook.py](https://codecov.io/gh/apache/airflow/pull/7163/diff?src=pr=tree#diff-YWlyZmxvdy9jb250cmliL2hvb2tzL2t1YmVybmV0ZXNfaG9vay5weQ==)
 | `27.58% <27.58%> (ø)` | |
   | 
[airflow/models/connection.py](https://codecov.io/gh/apache/airflow/pull/7163/diff?src=pr=tree#diff-YWlyZmxvdy9tb2RlbHMvY29ubmVjdGlvbi5weQ==)
 | `76.77% <33.33%> (-0.63%)` | :arrow_down: |
   | 
[airflow/contrib/sensors/spark\_kubernetes\_sensor.py](https://codecov.io/gh/apache/airflow/pull/7163/diff?src=pr=tree#diff-YWlyZmxvdy9jb250cmliL3NlbnNvcnMvc3Bhcmtfa3ViZXJuZXRlc19zZW5zb3IucHk=)
 | `35.89% <35.89%> (ø)` | |
   | 
[...low/contrib/operators/spark\_kubernetes\_operator.py](https://codecov.io/gh/apache/airflow/pull/7163/diff?src=pr=tree#diff-YWlyZmxvdy9jb250cmliL29wZXJhdG9ycy9zcGFya19rdWJlcm5ldGVzX29wZXJhdG9yLnB5)
 | `43.33% <43.33%> (ø)` | |
   | 
[.../example\_dags/example\_spark\_kubernetes\_operator.py](https://codecov.io/gh/apache/airflow/pull/7163/diff?src=pr=tree#diff-YWlyZmxvdy9jb250cmliL2V4YW1wbGVfZGFncy9leGFtcGxlX3NwYXJrX2t1YmVybmV0ZXNfb3BlcmF0b3IucHk=)
 | `56.25% <56.25%> (ø)` | |
   | 
[airflow/contrib/hooks/grpc\_hook.py](https://codecov.io/gh/apache/airflow/pull/7163/diff?src=pr=tree#diff-YWlyZmxvdy9jb250cmliL2hvb2tzL2dycGNfaG9vay5weQ==)
 | `0% <0%> (-91.94%)` | :arrow_down: |
   | 
[airflow/hooks/jdbc\_hook.py](https://codecov.io/gh/apache/airflow/pull/7163/diff?src=pr=tree#diff-YWlyZmxvdy9ob29rcy9qZGJjX2hvb2sucHk=)
 | `100% <0%> (ø)` | :arrow_up: |
   | 
[airflow/contrib/operators/grpc\_operator.py](https://codecov.io/gh/apache/airflow/pull/7163/diff?src=pr=tree#diff-YWlyZmxvdy9jb250cmliL29wZXJhdG9ycy9ncnBjX29wZXJhdG9yLnB5)
 | `100% <0%> (ø)` | :arrow_up: |
   | ... and [97 
more](https://codecov.io/gh/apache/airflow/pull/7163/diff?src=pr=tree-more) 
| |
   
   --
   
   [Continue to review full report at 
Codecov](https://codecov.io/gh/apache/airflow/pull/7163?src=pr=continue).
   > **Legend** - [Click here to learn 
more](https://docs.codecov.io/docs/codecov-delta)
   > `Δ = absolute  (impact)`, `ø = not affected`, `? = missing data`
   > Powered by 
[Codecov](https://codecov.io/gh/apache/airflow/pull/7163?src=pr=footer). 
Last update 
[113ff1b...059245e](https://codecov.io/gh/apache/airflow/pull/7163?src=pr=lastupdated).
 Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments).
   






[GitHub] [airflow] codecov-io edited a comment on issue #7163: [AIRFLOW-6542] add spark-on-k8s operator/hook/sensor

2020-01-27 Thread GitBox
codecov-io edited a comment on issue #7163: [AIRFLOW-6542] add spark-on-k8s 
operator/hook/sensor
URL: https://github.com/apache/airflow/pull/7163#issuecomment-574641714
 
 
   # [Codecov](https://codecov.io/gh/apache/airflow/pull/7163?src=pr=h1) 
Report
   > Merging 
[#7163](https://codecov.io/gh/apache/airflow/pull/7163?src=pr=desc) into 
[master](https://codecov.io/gh/apache/airflow/commit/113ff1bcc9a90207e1aaa82e7f72b1322f835afd?src=pr=desc)
 will **decrease** coverage by `53.05%`.
   > The diff coverage is `2.47%`.
   
   [![Impacted file tree 
graph](https://codecov.io/gh/apache/airflow/pull/7163/graphs/tree.svg?width=650=WdLKlKHOAU=150=pr)](https://codecov.io/gh/apache/airflow/pull/7163?src=pr=tree)
   
   ```diff
   @@ Coverage Diff @@
   ##   master#7163   +/-   ##
   ===
   - Coverage   85.39%   32.33%   -53.06% 
   ===
 Files 791  841   +50 
 Lines   4013740431  +294 
   ===
   - Hits3427313074-21199 
   - Misses   586427357+21493
   ```
   
   
   | [Impacted 
Files](https://codecov.io/gh/apache/airflow/pull/7163?src=pr=tree) | 
Coverage Δ | |
   |---|---|---|
   | 
[.../example\_dags/example\_spark\_kubernetes\_operator.py](https://codecov.io/gh/apache/airflow/pull/7163/diff?src=pr=tree#diff-YWlyZmxvdy9jb250cmliL2V4YW1wbGVfZGFncy9leGFtcGxlX3NwYXJrX2t1YmVybmV0ZXNfb3BlcmF0b3IucHk=)
 | `0% <0%> (ø)` | |
   | 
[airflow/models/connection.py](https://codecov.io/gh/apache/airflow/pull/7163/diff?src=pr=tree#diff-YWlyZmxvdy9tb2RlbHMvY29ubmVjdGlvbi5weQ==)
 | `35.07% <0%> (-42.34%)` | :arrow_down: |
   | 
[airflow/www/views.py](https://codecov.io/gh/apache/airflow/pull/7163/diff?src=pr=tree#diff-YWlyZmxvdy93d3cvdmlld3MucHk=)
 | `25.53% <0%> (-50.54%)` | :arrow_down: |
   | 
[airflow/contrib/sensors/spark\_kubernetes\_sensor.py](https://codecov.io/gh/apache/airflow/pull/7163/diff?src=pr=tree#diff-YWlyZmxvdy9jb250cmliL3NlbnNvcnMvc3Bhcmtfa3ViZXJuZXRlc19zZW5zb3IucHk=)
 | `0% <0%> (ø)` | |
   | 
[airflow/contrib/hooks/kubernetes\_hook.py](https://codecov.io/gh/apache/airflow/pull/7163/diff?src=pr=tree#diff-YWlyZmxvdy9jb250cmliL2hvb2tzL2t1YmVybmV0ZXNfaG9vay5weQ==)
 | `0% <0%> (ø)` | |
   | 
[...low/contrib/operators/spark\_kubernetes\_operator.py](https://codecov.io/gh/apache/airflow/pull/7163/diff?src=pr=tree#diff-YWlyZmxvdy9jb250cmliL29wZXJhdG9ycy9zcGFya19rdWJlcm5ldGVzX29wZXJhdG9yLnB5)
 | `0% <0%> (ø)` | |
   | 
[airflow/www/forms.py](https://codecov.io/gh/apache/airflow/pull/7163/diff?src=pr=tree#diff-YWlyZmxvdy93d3cvZm9ybXMucHk=)
 | `95.83% <100%> (-4.17%)` | :arrow_down: |
   | 
[...low/contrib/operators/wasb\_delete\_blob\_operator.py](https://codecov.io/gh/apache/airflow/pull/7163/diff?src=pr=tree#diff-YWlyZmxvdy9jb250cmliL29wZXJhdG9ycy93YXNiX2RlbGV0ZV9ibG9iX29wZXJhdG9yLnB5)
 | `0% <0%> (-100%)` | :arrow_down: |
   | 
[...flow/contrib/example\_dags/example\_qubole\_sensor.py](https://codecov.io/gh/apache/airflow/pull/7163/diff?src=pr=tree#diff-YWlyZmxvdy9jb250cmliL2V4YW1wbGVfZGFncy9leGFtcGxlX3F1Ym9sZV9zZW5zb3IucHk=)
 | `0% <0%> (-100%)` | :arrow_down: |
   | 
[...ample\_dags/example\_emr\_job\_flow\_automatic\_steps.py](https://codecov.io/gh/apache/airflow/pull/7163/diff?src=pr=tree#diff-YWlyZmxvdy9jb250cmliL2V4YW1wbGVfZGFncy9leGFtcGxlX2Vtcl9qb2JfZmxvd19hdXRvbWF0aWNfc3RlcHMucHk=)
 | `0% <0%> (-100%)` | :arrow_down: |
   | ... and [708 
more](https://codecov.io/gh/apache/airflow/pull/7163/diff?src=pr=tree-more) 
| |
   
   --
   
   [Continue to review full report at 
Codecov](https://codecov.io/gh/apache/airflow/pull/7163?src=pr=continue).
   > **Legend** - [Click here to learn 
more](https://docs.codecov.io/docs/codecov-delta)
   > `Δ = absolute  (impact)`, `ø = not affected`, `? = missing data`
   > Powered by 
[Codecov](https://codecov.io/gh/apache/airflow/pull/7163?src=pr=footer). 
Last update 
[113ff1b...059245e](https://codecov.io/gh/apache/airflow/pull/7163?src=pr=lastupdated).
 Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments).
   




[GitHub] [airflow] tooptoop4 edited a comment on issue #7134: [AIRFLOW-6451] self._print_stat() in dag_processing.py should be skippable

2020-01-27 Thread GitBox
tooptoop4 edited a comment on issue #7134: [AIRFLOW-6451] self._print_stat() in 
dag_processing.py should be skippable
URL: https://github.com/apache/airflow/pull/7134#issuecomment-576452064
 
 
   @mik-laj  pls merge




[GitHub] [airflow] tooptoop4 commented on issue #7157: [AIRFLOW-6251] add config for max tasks per dag

2020-01-27 Thread GitBox
tooptoop4 commented on issue #7157: [AIRFLOW-6251] add config for max tasks per 
dag
URL: https://github.com/apache/airflow/pull/7157#issuecomment-578975631
 
 
   @kaxil WDYT




[jira] [Commented] (AIRFLOW-6651) Make Heartbeat available on Redis

2020-01-27 Thread ASF GitHub Bot (Jira)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-6651?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17024701#comment-17024701
 ] 

ASF GitHub Bot commented on AIRFLOW-6651:
-

saguziel commented on pull request #7269: [AIRFLOW-6651][WIP] Add Redis 
Heartbeat option
URL: https://github.com/apache/airflow/pull/7269
 
 
   Adds the option to use Redis to store the heartbeat data. This will reduce 
load on the DB: the full guarantees a DB provides are not needed for the 
heartbeat, and Redis is simple to use and widely available.
   
   Unit tests WIP
   ---
   Issue link: WILL BE INSERTED BY 
[boring-cyborg](https://github.com/kaxil/boring-cyborg)
   
   Make sure to mark the boxes below before creating PR: [x]
   
   - [x] Description above provides context of the change
   - [x] Commit message/PR title starts with `[AIRFLOW-XXXX]`. AIRFLOW-XXXX = 
JIRA ID*
   - [ ] Unit tests coverage for changes (not needed for documentation changes)
   - [x] Commits follow "[How to write a good git commit 
message](http://chris.beams.io/posts/git-commit/)"
   - [ ] Relevant documentation is updated including usage instructions.
   - [ ] I will engage committers as explained in [Contribution Workflow 
Example](https://github.com/apache/airflow/blob/master/CONTRIBUTING.rst#contribution-workflow-example).
   
   * For document-only changes commit message can start with 
`[AIRFLOW-XXXX]`.
   
   ---
   In case of fundamental code change, Airflow Improvement Proposal 
([AIP](https://cwiki.apache.org/confluence/display/AIRFLOW/Airflow+Improvements+Proposals))
 is needed.
   In case of a new dependency, check compliance with the [ASF 3rd Party 
License Policy](https://www.apache.org/legal/resolved.html#category-x).
   In case of backwards incompatible changes please leave a note in 
[UPDATING.md](https://github.com/apache/airflow/blob/master/UPDATING.md).
   Read the [Pull Request 
Guidelines](https://github.com/apache/airflow/blob/master/CONTRIBUTING.rst#pull-request-guidelines)
 for more information.
   
 



> Make Heartbeat available on Redis
> -
>
> Key: AIRFLOW-6651
> URL: https://issues.apache.org/jira/browse/AIRFLOW-6651
> Project: Apache Airflow
>  Issue Type: Improvement
>  Components: worker
>Affects Versions: 2.0.0
>Reporter: Alex Guziel
>Assignee: Alex Guziel
>Priority: Major
>
> Heartbeating puts a lot of load on MySQL and doesn't need all the guarantees 
> MySQL provides. Offering Redis as an alternative backend can make Airflow 
> much more scalable.



--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[GitHub] [airflow] saguziel opened a new pull request #7269: [AIRFLOW-6651][WIP] Add Redis Heartbeat option

2020-01-27 Thread GitBox
saguziel opened a new pull request #7269: [AIRFLOW-6651][WIP] Add Redis 
Heartbeat option
URL: https://github.com/apache/airflow/pull/7269
 
 
   Adds the option to use Redis to store the heartbeat data. This reduces load 
on the DB: the heartbeat does not need all the guarantees a DB provides, and 
Redis is simple to operate and widely available.
   
   Unit tests WIP
   ---
   Issue link: WILL BE INSERTED BY 
[boring-cyborg](https://github.com/kaxil/boring-cyborg)
   
   Make sure to mark the boxes below before creating PR: [x]
   
   - [x] Description above provides context of the change
   - [x] Commit message/PR title starts with `[AIRFLOW-XXXX]`. AIRFLOW-XXXX = 
JIRA ID*
   - [ ] Unit tests coverage for changes (not needed for documentation changes)
   - [x] Commits follow "[How to write a good git commit 
message](http://chris.beams.io/posts/git-commit/)"
   - [ ] Relevant documentation is updated including usage instructions.
   - [ ] I will engage committers as explained in [Contribution Workflow 
Example](https://github.com/apache/airflow/blob/master/CONTRIBUTING.rst#contribution-workflow-example).
   
   * For document-only changes commit message can start with 
`[AIRFLOW-XXXX]`.
   
   ---
   In case of fundamental code change, Airflow Improvement Proposal 
([AIP](https://cwiki.apache.org/confluence/display/AIRFLOW/Airflow+Improvements+Proposals))
 is needed.
   In case of a new dependency, check compliance with the [ASF 3rd Party 
License Policy](https://www.apache.org/legal/resolved.html#category-x).
   In case of backwards incompatible changes please leave a note in 
[UPDATING.md](https://github.com/apache/airflow/blob/master/UPDATING.md).
   Read the [Pull Request 
Guidelines](https://github.com/apache/airflow/blob/master/CONTRIBUTING.rst#pull-request-guidelines)
 for more information.
   




[jira] [Commented] (AIRFLOW-6629) Dask executor tests fail in master

2020-01-27 Thread Loren Brindze (Jira)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-6629?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17024694#comment-17024694
 ] 

Loren Brindze commented on AIRFLOW-6629:


Able to reproduce, working on a PR to fix these.

> Dask executor tests fail in master 
> ---
>
> Key: AIRFLOW-6629
> URL: https://issues.apache.org/jira/browse/AIRFLOW-6629
> Project: Apache Airflow
>  Issue Type: Improvement
>  Components: ci
>Affects Versions: 2.0.0
>Reporter: Jarek Potiuk
>Priority: Major
>






[jira] [Assigned] (AIRFLOW-6629) Dask executor tests fail in master

2020-01-27 Thread Loren Brindze (Jira)


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-6629?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Loren Brindze reassigned AIRFLOW-6629:
--

Assignee: Loren Brindze

> Dask executor tests fail in master 
> ---
>
> Key: AIRFLOW-6629
> URL: https://issues.apache.org/jira/browse/AIRFLOW-6629
> Project: Apache Airflow
>  Issue Type: Improvement
>  Components: ci
>Affects Versions: 2.0.0
>Reporter: Jarek Potiuk
>Assignee: Loren Brindze
>Priority: Major
>






[jira] [Created] (AIRFLOW-6651) Make Heartbeat available on Redis

2020-01-27 Thread Alex Guziel (Jira)
Alex Guziel created AIRFLOW-6651:


 Summary: Make Heartbeat available on Redis
 Key: AIRFLOW-6651
 URL: https://issues.apache.org/jira/browse/AIRFLOW-6651
 Project: Apache Airflow
  Issue Type: Improvement
  Components: worker
Affects Versions: 2.0.0
Reporter: Alex Guziel
Assignee: Alex Guziel


Heartbeating takes a lot of MySQL load and doesn't need all the guarantees 
MySQL provides. Offloading it to Redis as an option can make Airflow much more 
scalable.
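The proposed offloading can be sketched in a few lines. This is a hypothetical illustration, not the PR's actual code: the key names are assumptions, and `InMemoryRedis` is a tiny stand-in so the sketch runs without a server (a real deployment would use `redis.Redis`, which exposes the same `setex`/`get` calls).

```python
import time

class InMemoryRedis:
    """Tiny stand-in for redis.Redis so the sketch runs without a server."""
    def __init__(self):
        self._data = {}

    def setex(self, key, ttl, value):
        # Store the value together with its expiry time.
        self._data[key] = (value, time.time() + ttl)

    def get(self, key):
        item = self._data.get(key)
        if item is None or item[1] < time.time():
            return None  # missing or expired
        return item[0]

def record_heartbeat(conn, job_id, ttl_seconds=30):
    # One cheap write per heartbeat; no transactional guarantees needed.
    conn.setex("airflow:heartbeat:%d" % job_id, ttl_seconds, str(time.time()))

def is_alive(conn, job_id):
    # A job is alive while its heartbeat key has not expired.
    return conn.get("airflow:heartbeat:%d" % job_id) is not None

conn = InMemoryRedis()
record_heartbeat(conn, 42)
print(is_alive(conn, 42))  # True
```

Because `SETEX` attaches a TTL, a crashed worker's heartbeat simply expires on its own instead of requiring cleanup, which is why the DB's stronger guarantees aren't needed here.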





[jira] [Comment Edited] (AIRFLOW-5071) Thousand os Executor reports task instance X finished (success) although the task says its queued. Was the task killed externally?

2020-01-27 Thread Ignacio Peluffo (Jira)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-5071?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17024541#comment-17024541
 ] 

Ignacio Peluffo edited comment on AIRFLOW-5071 at 1/27/20 6:15 PM:
---

Another one here. Below is a capture of Flower showing that a task was 
scheduled multiple times (the capture shows only three, but there were 21 
schedules):

!image-2020-01-27-18-10-29-124.png!

Then, in the database's `task_instance` table, the task's row looks like:

{code:java}
task_id:          clients
dag_id:           xxx
execution_date:   2020-01-27 13:40:03.324162
start_date:       NULL
end_date:         2020-01-27 14:56:31.304658
duration:         NULL
state:            failed
try_number:       0
hostname:         (empty)
unixname:         airflow
job_id:           NULL
pool:             default_pool
queue:            default
priority_weight:  1
operator:         NULL
queued_dttm:      2020-01-27 13:53:36.976145
pid:              NULL
max_tries:        0
executor_config:  E'\\x80047D942E'
{code}
I hope this helps to find the cause of and a solution to the problem.

Also, any recommendation about which setting could be changed to work around 
this issue would be appreciated.

 



> Thousand os Executor reports task instance X finished (success) although the 
> task says its queued. Was the task killed externally?
> --
>
> Key: AIRFLOW-5071
> URL: https://issues.apache.org/jira/browse/AIRFLOW-5071
> Project: Apache Airflow
>  Issue Type: Bug
>  Components: DAG, scheduler
>Affects Versions: 1.10.3
>Reporter: msempere
>Priority: Critical
> Attachments: image-2020-01-27-18-10-29-124.png
>
>
> I'm opening this issue because since I update to 1.10.3 I'm seeing thousands 
> of daily messages like the following in the logs:
>  
> ```
>  

[jira] [Commented] (AIRFLOW-5071) Thousand os Executor reports task instance X finished (success) although the task says its queued. Was the task killed externally?

2020-01-27 Thread Ignacio Peluffo (Jira)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-5071?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17024541#comment-17024541
 ] 

Ignacio Peluffo commented on AIRFLOW-5071:
--

Another one here. Below is a capture of Flower showing that a task was 
scheduled multiple times (the capture shows only three, but there were 21 
schedules):

!image-2020-01-27-18-10-29-124.png!

Then, in the database's `task_instance` table, the task's row looks like:

{code:java}
task_id:          clients
dag_id:           xxx
execution_date:   2020-01-27 13:40:03.324162
start_date:       NULL
end_date:         2020-01-27 14:56:31.304658
duration:         NULL
state:            failed
try_number:       0
hostname:         (empty)
unixname:         airflow
job_id:           NULL
pool:             default_pool
queue:            default
priority_weight:  1
operator:         NULL
queued_dttm:      2020-01-27 13:53:36.976145
pid:              NULL
max_tries:        0
executor_config:  E'\\x80047D942E'
{code}
I hope this helps to find the cause of and a solution to the problem.

Also, any recommendation about which setting could be changed to work around 
this issue would be appreciated.

 

> Thousand os Executor reports task instance X finished (success) although the 
> task says its queued. Was the task killed externally?
> --
>
> Key: AIRFLOW-5071
> URL: https://issues.apache.org/jira/browse/AIRFLOW-5071
> Project: Apache Airflow
>  Issue Type: Bug
>  Components: DAG, scheduler
>Affects Versions: 1.10.3
>Reporter: msempere
>Priority: Critical
> Attachments: image-2020-01-27-18-10-29-124.png
>
>
> I'm opening this issue because since I update to 1.10.3 I'm seeing thousands 
> of daily messages like the following in the logs:
>  
> ```
>  {{__init__.py:1580}} ERROR - Executor reports task instance <TaskInstance: 
> ... 2019-07-29 00:00:00+00:00 [queued]> finished (success) although the task 
> says its queued. Was the task killed externally?
> {{jobs.py:1484}} ERROR - Executor reports task instance <TaskInstance: 
> ... 2019-07-29 00:00:00+00:00 [queued]> finished (success) although the task 
> says its queued. Was the task killed externally?
> ```
> -And looks like this is triggering also thousand of daily emails because the 
> flag to send email in case of failure is set to True.-
> I have Airflow setup to use Celery and Redis as a backend queue service.





[jira] [Updated] (AIRFLOW-5071) Thousand os Executor reports task instance X finished (success) although the task says its queued. Was the task killed externally?

2020-01-27 Thread Ignacio Peluffo (Jira)


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-5071?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Ignacio Peluffo updated AIRFLOW-5071:
-
Attachment: image-2020-01-27-18-10-29-124.png

> Thousand os Executor reports task instance X finished (success) although the 
> task says its queued. Was the task killed externally?
> --
>
> Key: AIRFLOW-5071
> URL: https://issues.apache.org/jira/browse/AIRFLOW-5071
> Project: Apache Airflow
>  Issue Type: Bug
>  Components: DAG, scheduler
>Affects Versions: 1.10.3
>Reporter: msempere
>Priority: Critical
> Attachments: image-2020-01-27-18-10-29-124.png
>
>
> I'm opening this issue because since I update to 1.10.3 I'm seeing thousands 
> of daily messages like the following in the logs:
>  
> ```
>  {{__init__.py:1580}} ERROR - Executor reports task instance <TaskInstance: 
> ... 2019-07-29 00:00:00+00:00 [queued]> finished (success) although the task 
> says its queued. Was the task killed externally?
> {{jobs.py:1484}} ERROR - Executor reports task instance <TaskInstance: 
> ... 2019-07-29 00:00:00+00:00 [queued]> finished (success) although the task 
> says its queued. Was the task killed externally?
> ```
> -And looks like this is triggering also thousand of daily emails because the 
> flag to send email in case of failure is set to True.-
> I have Airflow setup to use Celery and Redis as a backend queue service.





[GitHub] [airflow] JavierLopezT commented on issue #6670: [AIRFLOW-4816]MySqlToS3Operator

2020-01-27 Thread GitBox
JavierLopezT commented on issue #6670: [AIRFLOW-4816]MySqlToS3Operator
URL: https://github.com/apache/airflow/pull/6670#issuecomment-578857598
 
 
   Could anyone help me a little more with the unittest please? Thank you very 
much in advance




[GitHub] [airflow] nuclearpinguin commented on a change in pull request #7252: [AIRFLOW-6531] Initial Yandex.Cloud Dataproc support

2020-01-27 Thread GitBox
nuclearpinguin commented on a change in pull request #7252: [AIRFLOW-6531] 
Initial Yandex.Cloud Dataproc support
URL: https://github.com/apache/airflow/pull/7252#discussion_r371366880
 
 

 ##
 File path: airflow/contrib/hooks/yandexcloud_dataproc_hook.py
 ##
 @@ -0,0 +1,571 @@
+# -*- coding: utf-8 -*-
+#
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+#
+
+import random
+from datetime import datetime
+
+import yandex.cloud.dataproc.v1.cluster_pb2 as cluster_pb
+import yandex.cloud.dataproc.v1.cluster_service_pb2 as cluster_service_pb
+import yandex.cloud.dataproc.v1.cluster_service_pb2_grpc as 
cluster_service_grpc_pb
+import yandex.cloud.dataproc.v1.common_pb2 as common_pb
+import yandex.cloud.dataproc.v1.job_pb2 as job_pb
+import yandex.cloud.dataproc.v1.job_service_pb2 as job_service_pb
+import yandex.cloud.dataproc.v1.job_service_pb2_grpc as job_service_grpc_pb
+import yandex.cloud.dataproc.v1.subcluster_pb2 as subcluster_pb
+import yandex.cloud.dataproc.v1.subcluster_service_pb2 as subcluster_service_pb
+import yandex.cloud.dataproc.v1.subcluster_service_pb2_grpc as 
subcluster_service_grpc_pb
+from google.protobuf.field_mask_pb2 import FieldMask
+from six import string_types
+
+from airflow.contrib.hooks.yandexcloud_base_hook import YandexCloudBaseHook
+from airflow.exceptions import AirflowException
+
+
+class DataprocHook(YandexCloudBaseHook):
+"""
+A base hook for Yandex.Cloud Data Proc.
+
+:param connection_id: The connection ID to use when fetching connection 
info.
+:type connection_id: str
+"""
+
+def __init__(self, *args, **kwargs):
+super(DataprocHook, self).__init__(*args, **kwargs)
+self.cluster_id = None
+
+def _get_operation_result(self, operation, response_type=None, 
meta_type=None):
+message = 'Running Yandex.Cloud operation. ID: {}. Description: {}. 
Created at: {}. Created by: {}.'
+message = message.format(
+operation.id,
+operation.description,
+datetime.fromtimestamp(operation.created_at.seconds),
+operation.created_by,
+)
+if meta_type:
+unpacked_meta = meta_type()
+operation.metadata.Unpack(unpacked_meta)
+message += ' Meta: {}.'.format(unpacked_meta)
+self.log.info(message)
+result = self.wait_for_operation(operation)
+if result.error and result.error.code:
+error_message = 'Error Yandex.Cloud operation. ID: {}. Error code: 
{}. Details: {}. Message: {}.'
+error_message = error_message.format(
+result.id, result.error.code, result.error.details, 
result.error.message
+)
+self.log.error(error_message)
+raise AirflowException(error_message)
+else:
+log_message = 'Done Yandex.Cloud operation. ID: 
{}.'.format(operation.id)
+unpacked_response = None
+if response_type:
+unpacked_response = response_type()
+result.response.Unpack(unpacked_response)
+log_message += ' Response: {}.'.format(unpacked_response)
+self.log.info(log_message)
+if unpacked_response:
+return unpacked_response
+return None
+
+def add_subcluster(
+self,
+cluster_id,
+subcluster_type,
+name,
+subnet_id,
+resource_preset='s2.small',
+disk_size=15,
+disk_type='network-ssd',
+hosts_count=5,
+):
+"""
+Add subcluster to Yandex.Cloud Data Proc cluster.
+
+:param cluster_id: ID of the cluster.
+:type cluster_id: str
+:param name: Name of the subcluster. Must be unique in the cluster
+:type name: str
+:param subcluster_type: Type of the subcluster. Either "data" or 
"compute".
+:type subcluster_type: str
+:param subnet_id: Subnet ID of the cluster.
+:type subnet_id: str
+:param resource_preset: Resources preset (CPU+RAM configuration) for 
the nodes of the cluster.
+:type resource_preset: str
+:param disk_size: Storage size in GiB.
+:type disk_size: int
+:param disk_type: 

[GitHub] [airflow] osule commented on issue #6354: [AIRFLOW-5664] Store timestamps with microseconds precision in GCSToPSQL

2020-01-27 Thread GitBox
osule commented on issue #6354: [AIRFLOW-5664] Store timestamps with 
microseconds precision in GCSToPSQL
URL: https://github.com/apache/airflow/pull/6354#issuecomment-578836750
 
 
   > @osule can you please, rebase onto new master?
   
   Sure, I'll resolve the conflicts and push a changed commit.




[GitHub] [airflow] roitvt commented on a change in pull request #7163: [AIRFLOW-6542] add spark-on-k8s operator/hook/sensor

2020-01-27 Thread GitBox
roitvt commented on a change in pull request #7163: [AIRFLOW-6542] add 
spark-on-k8s operator/hook/sensor
URL: https://github.com/apache/airflow/pull/7163#discussion_r371317220
 
 

 ##
 File path: airflow/contrib/operators/spark_kubernetes_operator.py
 ##
 @@ -0,0 +1,77 @@
+# -*- coding: utf-8 -*-
+#
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+from typing import Optional
+
+from kubernetes import client
+
+from airflow.contrib.hooks.kubernetes_hook import Kuberneteshook
+from airflow.exceptions import AirflowException
+from airflow.models import BaseOperator
+from airflow.utils.decorators import apply_defaults
+
+
+class SparkKubernetesOperator(BaseOperator):
+"""
+creates sparkapplication object in kubernetes cluster
+
+:param sparkapplication_object: kubernetes custom_resource_definition of 
sparkApplication
+:param namespace: kubernetes namespace to put sparkApplication
+:param kube_config: kubernetes kube_config path
+:param in_cluster: if airflow runs inside kubernetes pod take 
configuration from inside the cluster.
+"""
+
+template_fields = ['sparkapplication_object', 'namespace', 'kube_config']
+template_ext = ()
+ui_color = '#f4a460'
+
+@apply_defaults
+def __init__(self,
+ sparkapplication_object: dict,
+ namespace: str = 'default',
+ kube_config: Optional[str] = None,
+ in_cluster: bool = False,
+ *args, **kwargs) -> None:
+super().__init__(*args, **kwargs)
+self.sparkapplication_object = sparkapplication_object
+self.namespace = namespace
+self.kube_config = kube_config
+self.in_cluster = in_cluster
+if kwargs.get('xcom_push') is not None:
+raise AirflowException("'xcom_push' was deprecated, use 
'BaseOperator.do_xcom_push' instead")
+
+def execute(self, context):
+self.log.info("creating sparkApplication")
+hook = Kuberneteshook(
+kube_config=self.kube_config,
+in_cluster=self.in_cluster
+)
+api_client = hook.get_conn()
+api = client.CustomObjectsApi(api_client)
 
 Review comment:
   in the connection you have a namespace field; the operator/sensor can still 
set a namespace, but unless they do we will use the namespace defined in the 
connection. If the connection doesn't have a namespace, we will use the 
"default" namespace.
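The precedence described above can be sketched as follows (a hypothetical helper for illustration; `resolve_namespace` is not part of the PR):

```python
from typing import Optional

def resolve_namespace(task_namespace: Optional[str],
                      connection_namespace: Optional[str]) -> str:
    # An explicit operator/sensor argument wins...
    if task_namespace:
        return task_namespace
    # ...then the namespace stored on the Kubernetes connection...
    if connection_namespace:
        return connection_namespace
    # ...and finally the cluster default.
    return "default"

print(resolve_namespace(None, "team-a"))  # team-a
```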




[jira] [Commented] (AIRFLOW-6650) Google Cloud Platform Connection key json documentation or code is wrong

2020-01-27 Thread Evgeniy Sokolov (Jira)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-6650?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17024434#comment-17024434
 ] 

Evgeniy Sokolov commented on AIRFLOW-6650:
--

Either the documentation or the code should be fixed.

I suppose fixing the documentation is easier.

> Google Cloud Platform Connection key json documentation or code is wrong
> 
>
> Key: AIRFLOW-6650
> URL: https://issues.apache.org/jira/browse/AIRFLOW-6650
> Project: Apache Airflow
>  Issue Type: Bug
>  Components: gcp
>Affects Versions: 1.10.4, 1.10.5, 1.10.6, 1.10.7
>Reporter: Evgeniy Sokolov
>Priority: Minor
>
> According to the documentation: 
> [https://airflow.readthedocs.io/en/stable/howto/connection/gcp.html] 
> The name of the external configuration for Keyfile JSON is: 
>  * {{extra__google_cloud_platform__key_dict}} - Keyfile JSON
> Excluding the prefix ({{extra__google_cloud_platform__}}), the name of the 
> variable is *key_dict*.
> However, '*keyfile_dict*' is expected in the source code: 
> [https://github.com/apache/airflow/blob/master/airflow/gcp/hooks/base.py]
> {code:java}
> 146: keyfile_dict = self._get_field('keyfile_dict', None)  # type: 
> Optional[str]{code}
> [https://github.com/apache/airflow/blob/v1-10-stable/airflow/contrib/hooks/gcp_api_base_hook.py]
> {code:java}
> 138: keyfile_dict = self._get_field('keyfile_dict', False){code}
>  
>  





[GitHub] [airflow] aviemzur commented on issue #6824: [AIRFLOW-6258] add CloudFormation operators to AWS providers

2020-01-27 Thread GitBox
aviemzur commented on issue #6824: [AIRFLOW-6258] add CloudFormation operators 
to AWS providers
URL: https://github.com/apache/airflow/pull/6824#issuecomment-578805415
 
 
   @feluelle committed all suggestions you made + made the changes you 
requested and all tests are passing on Travis.




[jira] [Created] (AIRFLOW-6650) Google Cloud Platform Connection key json documentation or code is wrong

2020-01-27 Thread Evgeniy Sokolov (Jira)
Evgeniy Sokolov created AIRFLOW-6650:


 Summary: Google Cloud Platform Connection key json documentation 
or code is wrong
 Key: AIRFLOW-6650
 URL: https://issues.apache.org/jira/browse/AIRFLOW-6650
 Project: Apache Airflow
  Issue Type: Bug
  Components: gcp
Affects Versions: 1.10.7, 1.10.6, 1.10.5, 1.10.4
Reporter: Evgeniy Sokolov


According to the documentation: 

[https://airflow.readthedocs.io/en/stable/howto/connection/gcp.html] 

The name of the external configuration for Keyfile JSON is: 
 * {{extra__google_cloud_platform__key_dict}} - Keyfile JSON

Excluding the prefix ({{extra__google_cloud_platform__}}), the name of the 
variable is *key_dict*.
However, '*keyfile_dict*' is expected in the source code: 

[https://github.com/apache/airflow/blob/master/airflow/gcp/hooks/base.py]
{code:java}
146: keyfile_dict = self._get_field('keyfile_dict', None)  # type: 
Optional[str]{code}

[https://github.com/apache/airflow/blob/v1-10-stable/airflow/contrib/hooks/gcp_api_base_hook.py]
{code:java}
138: keyfile_dict = self._get_field('keyfile_dict', False){code}
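The mismatch is easy to see by mimicking how the hook resolves extras: the short name used in code is joined with the connection-type prefix before lookup. The following is a simplified sketch of that behavior, not the real hook code; `get_field` here is an illustrative stand-in for `_get_field`.

```python
PREFIX = "extra__google_cloud_platform__"

def get_field(extras, field_name, default=None):
    # Mimic the hook's _get_field: prepend the connection-type prefix,
    # then look the long-form key up in the connection's extra dict.
    long_name = PREFIX + field_name
    return extras.get(long_name, default)

# The hook asks for 'keyfile_dict', so the extra must be named
# 'extra__google_cloud_platform__keyfile_dict' -- the documented
# 'key_dict' name is never looked up.
extras = {PREFIX + "keyfile_dict": '{"type": "service_account"}'}
print(get_field(extras, "keyfile_dict") is not None)  # True
print(get_field(extras, "key_dict"))                  # None
```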
 

 





[GitHub] [airflow] codecov-io edited a comment on issue #6824: [AIRFLOW-6258] add CloudFormation operators to AWS providers

2020-01-27 Thread GitBox
codecov-io edited a comment on issue #6824: [AIRFLOW-6258] add CloudFormation 
operators to AWS providers
URL: https://github.com/apache/airflow/pull/6824#issuecomment-571189190
 
 
   # [Codecov](https://codecov.io/gh/apache/airflow/pull/6824?src=pr=h1) 
Report
   > Merging 
[#6824](https://codecov.io/gh/apache/airflow/pull/6824?src=pr=desc) into 
[master](https://codecov.io/gh/apache/airflow/commit/be812bd660fac4621a25f3269fded8d4e03b0023?src=pr=desc)
 will **increase** coverage by `0.35%`.
   > The diff coverage is `99%`.
   
   [![Impacted file tree 
graph](https://codecov.io/gh/apache/airflow/pull/6824/graphs/tree.svg?width=650=WdLKlKHOAU=150=pr)](https://codecov.io/gh/apache/airflow/pull/6824?src=pr=tree)
   
    ```diff
    @@            Coverage Diff             @@
    ##           master    #6824      +/-   ##
    ==========================================
    + Coverage   84.85%    85.2%    +0.35%
    ==========================================
      Files         679      841      +162
      Lines       38536    40424     +1888
    ==========================================
    + Hits        32698    34445     +1747
    - Misses       5838     5979      +141
    ```
   
   
   | [Impacted 
Files](https://codecov.io/gh/apache/airflow/pull/6824?src=pr=tree) | 
Coverage Δ | |
   |---|---|---|
   | 
[.../providers/amazon/aws/operators/cloud\_formation.py](https://codecov.io/gh/apache/airflow/pull/6824/diff?src=pr=tree#diff-YWlyZmxvdy9wcm92aWRlcnMvYW1hem9uL2F3cy9vcGVyYXRvcnMvY2xvdWRfZm9ybWF0aW9uLnB5)
 | `100% <100%> (ø)` | |
   | 
[...ow/providers/amazon/aws/sensors/cloud\_formation.py](https://codecov.io/gh/apache/airflow/pull/6824/diff?src=pr=tree#diff-YWlyZmxvdy9wcm92aWRlcnMvYW1hem9uL2F3cy9zZW5zb3JzL2Nsb3VkX2Zvcm1hdGlvbi5weQ==)
 | `100% <100%> (ø)` | |
   | 
[...flow/providers/amazon/aws/hooks/cloud\_formation.py](https://codecov.io/gh/apache/airflow/pull/6824/diff?src=pr=tree#diff-YWlyZmxvdy9wcm92aWRlcnMvYW1hem9uL2F3cy9ob29rcy9jbG91ZF9mb3JtYXRpb24ucHk=)
 | `96.77% <96.77%> (ø)` | |
   | 
[airflow/operators/postgres\_operator.py](https://codecov.io/gh/apache/airflow/pull/6824/diff?src=pr=tree#diff-YWlyZmxvdy9vcGVyYXRvcnMvcG9zdGdyZXNfb3BlcmF0b3IucHk=)
 | `0% <0%> (-100%)` | :arrow_down: |
   | 
[...rflow/contrib/sensors/sagemaker\_training\_sensor.py](https://codecov.io/gh/apache/airflow/pull/6824/diff?src=pr=tree#diff-YWlyZmxvdy9jb250cmliL3NlbnNvcnMvc2FnZW1ha2VyX3RyYWluaW5nX3NlbnNvci5weQ==)
 | `0% <0%> (-100%)` | :arrow_down: |
   | 
[airflow/contrib/operators/snowflake\_operator.py](https://codecov.io/gh/apache/airflow/pull/6824/diff?src=pr=tree#diff-YWlyZmxvdy9jb250cmliL29wZXJhdG9ycy9zbm93Zmxha2Vfb3BlcmF0b3IucHk=)
 | `0% <0%> (-95.84%)` | :arrow_down: |
   | 
[airflow/contrib/hooks/azure\_data\_lake\_hook.py](https://codecov.io/gh/apache/airflow/pull/6824/diff?src=pr=tree#diff-YWlyZmxvdy9jb250cmliL2hvb2tzL2F6dXJlX2RhdGFfbGFrZV9ob29rLnB5)
 | `0% <0%> (-93.11%)` | :arrow_down: |
   | 
[airflow/contrib/hooks/grpc\_hook.py](https://codecov.io/gh/apache/airflow/pull/6824/diff?src=pr=tree#diff-YWlyZmxvdy9jb250cmliL2hvb2tzL2dycGNfaG9vay5weQ==)
 | `0% <0%> (-91.94%)` | :arrow_down: |
   | 
[airflow/contrib/sensors/azure\_cosmos\_sensor.py](https://codecov.io/gh/apache/airflow/pull/6824/diff?src=pr=tree#diff-YWlyZmxvdy9jb250cmliL3NlbnNvcnMvYXp1cmVfY29zbW9zX3NlbnNvci5weQ==)
 | `0% <0%> (-81.25%)` | :arrow_down: |
   | 
[airflow/contrib/hooks/snowflake\_hook.py](https://codecov.io/gh/apache/airflow/pull/6824/diff?src=pr=tree#diff-YWlyZmxvdy9jb250cmliL2hvb2tzL3Nub3dmbGFrZV9ob29rLnB5)
 | `0% <0%> (-81.14%)` | :arrow_down: |
   | ... and [481 
more](https://codecov.io/gh/apache/airflow/pull/6824/diff?src=pr=tree-more) 
| |
   
   --
   
   [Continue to review full report at 
Codecov](https://codecov.io/gh/apache/airflow/pull/6824?src=pr=continue).
   > **Legend** - [Click here to learn 
more](https://docs.codecov.io/docs/codecov-delta)
   > `Δ = absolute  (impact)`, `ø = not affected`, `? = missing data`
   > Powered by 
[Codecov](https://codecov.io/gh/apache/airflow/pull/6824?src=pr=footer). 
Last update 
[be812bd...78a5d82](https://codecov.io/gh/apache/airflow/pull/6824?src=pr=lastupdated).
 Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments).
   




[GitHub] [airflow] codecov-io edited a comment on issue #6824: [AIRFLOW-6258] add CloudFormation operators to AWS providers

2020-01-27 Thread GitBox
codecov-io edited a comment on issue #6824: [AIRFLOW-6258] add CloudFormation 
operators to AWS providers
URL: https://github.com/apache/airflow/pull/6824#issuecomment-571189190
 
 
   # [Codecov](https://codecov.io/gh/apache/airflow/pull/6824?src=pr=h1) 
Report
   > Merging 
[#6824](https://codecov.io/gh/apache/airflow/pull/6824?src=pr=desc) into 
[master](https://codecov.io/gh/apache/airflow/commit/be812bd660fac4621a25f3269fded8d4e03b0023?src=pr=desc)
 will **increase** coverage by `0.35%`.
   > The diff coverage is `99%`.
   
   [![Impacted file tree 
graph](https://codecov.io/gh/apache/airflow/pull/6824/graphs/tree.svg?width=650=WdLKlKHOAU=150=pr)](https://codecov.io/gh/apache/airflow/pull/6824?src=pr=tree)
   
   ```diff
   @@            Coverage Diff            @@
   ##           master   #6824      +/-   ##
   =========================================
   + Coverage   84.85%   85.2%     +0.35%   
   =========================================
     Files         679     841      +162    
     Lines       38536   40424     +1888    
   =========================================
   + Hits        32698   34445     +1747    
   - Misses       5838    5979      +141    
   ```
   
   
   | [Impacted 
Files](https://codecov.io/gh/apache/airflow/pull/6824?src=pr=tree) | 
Coverage Δ | |
   |---|---|---|
   | 
[.../providers/amazon/aws/operators/cloud\_formation.py](https://codecov.io/gh/apache/airflow/pull/6824/diff?src=pr=tree#diff-YWlyZmxvdy9wcm92aWRlcnMvYW1hem9uL2F3cy9vcGVyYXRvcnMvY2xvdWRfZm9ybWF0aW9uLnB5)
 | `100% <100%> (ø)` | |
   | 
[...ow/providers/amazon/aws/sensors/cloud\_formation.py](https://codecov.io/gh/apache/airflow/pull/6824/diff?src=pr=tree#diff-YWlyZmxvdy9wcm92aWRlcnMvYW1hem9uL2F3cy9zZW5zb3JzL2Nsb3VkX2Zvcm1hdGlvbi5weQ==)
 | `100% <100%> (ø)` | |
   | 
[...flow/providers/amazon/aws/hooks/cloud\_formation.py](https://codecov.io/gh/apache/airflow/pull/6824/diff?src=pr=tree#diff-YWlyZmxvdy9wcm92aWRlcnMvYW1hem9uL2F3cy9ob29rcy9jbG91ZF9mb3JtYXRpb24ucHk=)
 | `96.77% <96.77%> (ø)` | |
   | 
[airflow/operators/postgres\_operator.py](https://codecov.io/gh/apache/airflow/pull/6824/diff?src=pr=tree#diff-YWlyZmxvdy9vcGVyYXRvcnMvcG9zdGdyZXNfb3BlcmF0b3IucHk=)
 | `0% <0%> (-100%)` | :arrow_down: |
   | 
[...rflow/contrib/sensors/sagemaker\_training\_sensor.py](https://codecov.io/gh/apache/airflow/pull/6824/diff?src=pr=tree#diff-YWlyZmxvdy9jb250cmliL3NlbnNvcnMvc2FnZW1ha2VyX3RyYWluaW5nX3NlbnNvci5weQ==)
 | `0% <0%> (-100%)` | :arrow_down: |
   | 
[airflow/contrib/operators/snowflake\_operator.py](https://codecov.io/gh/apache/airflow/pull/6824/diff?src=pr=tree#diff-YWlyZmxvdy9jb250cmliL29wZXJhdG9ycy9zbm93Zmxha2Vfb3BlcmF0b3IucHk=)
 | `0% <0%> (-95.84%)` | :arrow_down: |
   | 
[airflow/contrib/hooks/azure\_data\_lake\_hook.py](https://codecov.io/gh/apache/airflow/pull/6824/diff?src=pr=tree#diff-YWlyZmxvdy9jb250cmliL2hvb2tzL2F6dXJlX2RhdGFfbGFrZV9ob29rLnB5)
 | `0% <0%> (-93.11%)` | :arrow_down: |
   | 
[airflow/contrib/hooks/grpc\_hook.py](https://codecov.io/gh/apache/airflow/pull/6824/diff?src=pr=tree#diff-YWlyZmxvdy9jb250cmliL2hvb2tzL2dycGNfaG9vay5weQ==)
 | `0% <0%> (-91.94%)` | :arrow_down: |
   | 
[airflow/contrib/sensors/azure\_cosmos\_sensor.py](https://codecov.io/gh/apache/airflow/pull/6824/diff?src=pr=tree#diff-YWlyZmxvdy9jb250cmliL3NlbnNvcnMvYXp1cmVfY29zbW9zX3NlbnNvci5weQ==)
 | `0% <0%> (-81.25%)` | :arrow_down: |
   | 
[airflow/contrib/hooks/snowflake\_hook.py](https://codecov.io/gh/apache/airflow/pull/6824/diff?src=pr=tree#diff-YWlyZmxvdy9jb250cmliL2hvb2tzL3Nub3dmbGFrZV9ob29rLnB5)
 | `0% <0%> (-81.14%)` | :arrow_down: |
   | ... and [481 
more](https://codecov.io/gh/apache/airflow/pull/6824/diff?src=pr=tree-more) 
| |
   
   --
   
   [Continue to review full report at 
Codecov](https://codecov.io/gh/apache/airflow/pull/6824?src=pr=continue).
   > **Legend** - [Click here to learn 
more](https://docs.codecov.io/docs/codecov-delta)
   > `Δ = absolute <relative> (impact)`, `ø = not affected`, `? = missing data`
   > Powered by 
[Codecov](https://codecov.io/gh/apache/airflow/pull/6824?src=pr=footer). 
Last update 
[be812bd...78a5d82](https://codecov.io/gh/apache/airflow/pull/6824?src=pr=lastupdated).
 Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments).
   




[GitHub] [airflow] robinedwards commented on a change in pull request #7251: [AIRFLOW-6628] Fix search auto complete behaviour

2020-01-27 Thread GitBox
robinedwards commented on a change in pull request #7251: [AIRFLOW-6628] Fix 
search auto complete behaviour
URL: https://github.com/apache/airflow/pull/7251#discussion_r371300306
 
 

 ##
 File path: airflow/www/views.py
 ##
 @@ -325,10 +319,36 @@ def get_int_arg(value, default=0):
 paging=wwwutils.generate_pages(current_page, num_of_pages,
                                search=escape(arg_search_query) if arg_search_query else None,
                                showPaused=not hide_paused),
-auto_complete_data=auto_complete_data,
 num_runs=num_runs,
 tags=tags)
 
+@expose('/dag_autocomplete')
+@has_access
 
 Review comment:
   Implemented the above




[jira] [Commented] (AIRFLOW-5777) Migrate AWS DynamoDB to /providers/aws [AIP-21]

2020-01-27 Thread ASF GitHub Bot (Jira)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-5777?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17024393#comment-17024393
 ] 

ASF GitHub Bot commented on AIRFLOW-5777:
-

potiuk commented on pull request #6455: [WIP][AIRFLOW-5777] Migrate AWS 
DynamoDB to /providers/aws [AIP-21]
URL: https://github.com/apache/airflow/pull/6455
 
 
   
 



> Migrate AWS DynamoDB to /providers/aws [AIP-21]
> ---
>
> Key: AIRFLOW-5777
> URL: https://issues.apache.org/jira/browse/AIRFLOW-5777
> Project: Apache Airflow
>  Issue Type: Improvement
>  Components: aws
>Affects Versions: 2.0.0
>Reporter: Bas Harenslak
>Assignee: Bas Harenslak
>Priority: Major
>




--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[GitHub] [airflow] potiuk closed pull request #6455: [WIP][AIRFLOW-5777] Migrate AWS DynamoDB to /providers/aws [AIP-21]

2020-01-27 Thread GitBox
potiuk closed pull request #6455: [WIP][AIRFLOW-5777] Migrate AWS DynamoDB to 
/providers/aws [AIP-21]
URL: https://github.com/apache/airflow/pull/6455
 
 
   




[GitHub] [airflow] potiuk commented on issue #6455: [WIP][AIRFLOW-5777] Migrate AWS DynamoDB to /providers/aws [AIP-21]

2020-01-27 Thread GitBox
potiuk commented on issue #6455: [WIP][AIRFLOW-5777] Migrate AWS DynamoDB to 
/providers/aws [AIP-21]
URL: https://github.com/apache/airflow/pull/6455#issuecomment-578787118
 
 
   Closing as the whole migration is automated by @mik-laj 




[GitHub] [airflow] aviemzur removed a comment on issue #6824: [AIRFLOW-6258] add CloudFormation operators to AWS providers

2020-01-27 Thread GitBox
aviemzur removed a comment on issue #6824: [AIRFLOW-6258] add CloudFormation 
operators to AWS providers
URL: https://github.com/apache/airflow/pull/6824#issuecomment-578776781
 
 
   @feluelle committed all suggested changes and all tests passed on Travis




[GitHub] [airflow] aviemzur commented on a change in pull request #6824: [AIRFLOW-6258] add CloudFormation operators to AWS providers

2020-01-27 Thread GitBox
aviemzur commented on a change in pull request #6824: [AIRFLOW-6258] add 
CloudFormation operators to AWS providers
URL: https://github.com/apache/airflow/pull/6824#discussion_r371262602
 
 

 ##
 File path: airflow/providers/amazon/aws/hooks/cloud_formation.py
 ##
 @@ -0,0 +1,88 @@
+# -*- coding: utf-8 -*-
+#
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+
+"""
+This module contains AWS CloudFormation Hook
+"""
+from botocore.exceptions import ClientError
+
+from airflow.contrib.hooks.aws_hook import AwsHook
+
+
+class AWSCloudFormationHook(AwsHook):
+    """
+    Interact with AWS CloudFormation.
+    """
+
+    def __init__(self, region_name=None, *args, **kwargs):
+        self.region_name = region_name
+        self.conn = None
+        super().__init__(*args, **kwargs)
+
+    def get_conn(self):
+        if not self.conn:
+            self.conn = self.get_client_type('cloudformation', self.region_name)
+        return self.conn
+
+    def get_stack_status(self, stack_name):
+        """
+        Get stack status from CloudFormation.
+        """
+        cloudformation = self.get_conn()
+
+        self.log.info('Poking for stack %s', stack_name)
+
+        try:
+            stacks = cloudformation.describe_stacks(StackName=stack_name)['Stacks']
+            return stacks[0]['StackStatus']
+        except ClientError as e:
+            if 'does not exist' in str(e):
+                return None
+            else:
+                raise e
 
 Review comment:
   
![image](https://user-images.githubusercontent.com/5708714/73183505-92285900-4123-11ea-881d-e9b161d12eff.png)
   
   
   We could check the error code instead of the message string, and assume that 
`ValidationError` returned from `describe_stacks` could only be due to stack 
not existing.
   
   However there could be other scenarios in which that can happen, for 
example, illegal characters in `StackName`:
   ```
   >>> cf.describe_stacks(StackName='||')
   Traceback (most recent call last):
     File "<stdin>", line 1, in <module>
     File "/usr/local/lib/python3.7/site-packages/botocore/client.py", line 357, in _api_call
       return self._make_api_call(operation_name, kwargs)
     File "/usr/local/lib/python3.7/site-packages/botocore/client.py", line 661, in _make_api_call
       raise error_class(parsed_response, operation_name)
   botocore.exceptions.ClientError: An error occurred (ValidationError) when calling the DescribeStacks operation: 1 validation error detected: Value '||' at 'stackName' failed to satisfy constraint: Member must satisfy regular expression pattern: [a-zA-Z][-a-zA-Z0-9]*|arn:[-a-zA-Z0-9:/._+]*
   ```




[GitHub] [airflow] feluelle commented on a change in pull request #6824: [AIRFLOW-6258] add CloudFormation operators to AWS providers

2020-01-27 Thread GitBox
feluelle commented on a change in pull request #6824: [AIRFLOW-6258] add 
CloudFormation operators to AWS providers
URL: https://github.com/apache/airflow/pull/6824#discussion_r371276037
 
 

 ##
 File path: airflow/providers/amazon/aws/sensors/cloud_formation.py
 ##
 @@ -0,0 +1,98 @@
+# -*- coding: utf-8 -*-
+#
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+"""
+This module contains sensors for AWS CloudFormation.
+"""
+from airflow.providers.amazon.aws.hooks.cloud_formation import 
AWSCloudFormationHook
+from airflow.sensors.base_sensor_operator import BaseSensorOperator
+from airflow.utils.decorators import apply_defaults
+
+
+class CloudFormationCreateStackSensor(BaseSensorOperator):
+    """
+    Waits for a stack to be created successfully on AWS CloudFormation.
+
+    :param stack_name: The name of the stack to wait for (templated)
+    :type stack_name: str
+    :param aws_conn_id: ID of the Airflow connection where credentials and extra configuration are
+        stored
+    :type aws_conn_id: str
+    :param poke_interval: Time in seconds that the job should wait between each try
+    :type poke_interval: int
+    """
+
+    template_fields = ['stack_name']
+    ui_color = '#C5CAE9'
+
+    @apply_defaults
+    def __init__(self,
+                 stack_name,
+                 aws_conn_id='aws_default',
+                 region_name=None,
+                 *args,
+                 **kwargs):
+        super().__init__(*args, **kwargs)
+        self.stack_name = stack_name
+        self.hook = AWSCloudFormationHook(aws_conn_id=aws_conn_id, region_name=region_name)
+
+    def poke(self, context):
+        stack_status = self.hook.get_stack_status(self.stack_name)
+        if stack_status == 'CREATE_COMPLETE':
+            return True
+        elif stack_status in ('CREATE_IN_PROGRESS', None):
+            return False
+        else:
+            raise ValueError(f'Stack {self.stack_name} in bad state: {stack_status}')
+
+
+class CloudFormationDeleteStackSensor(BaseSensorOperator):
+    """
+    Waits for a stack to be deleted successfully on AWS CloudFormation.
+
+    :param stack_name: The name of the stack to wait for (templated)
+    :type stack_name: str
+    :param aws_conn_id: ID of the Airflow connection where credentials and extra configuration are
+        stored
+    :type aws_conn_id: str
+    :param poke_interval: Time in seconds that the job should wait between each try
+    :type poke_interval: int
+    """
+
+    template_fields = ['stack_name']
+    ui_color = '#C5CAE9'
+
+    @apply_defaults
+    def __init__(self,
+                 stack_name,
+                 aws_conn_id='aws_default',
+                 region_name=None,
+                 *args,
+                 **kwargs):
+        super().__init__(*args, **kwargs)
+        self.stack_name = stack_name
+        self.hook = AWSCloudFormationHook(aws_conn_id=aws_conn_id, region_name=region_name)
+
+    def poke(self, context):
+        stack_status = self.hook.get_stack_status(self.stack_name)
+        if stack_status in ('DELETE_COMPLETE', None):
+            return True
+        elif stack_status == 'DELETE_IN_PROGRESS':
+            return False
+        else:
+            raise ValueError(f'Stack {self.stack_name} in bad state: {stack_status}')
 
 Review comment:
   Same here. https://github.com/apache/airflow/pull/6824#discussion_r371274926




[GitHub] [airflow] feluelle commented on a change in pull request #6824: [AIRFLOW-6258] add CloudFormation operators to AWS providers

2020-01-27 Thread GitBox
feluelle commented on a change in pull request #6824: [AIRFLOW-6258] add 
CloudFormation operators to AWS providers
URL: https://github.com/apache/airflow/pull/6824#discussion_r371274926
 
 

 ##
 File path: airflow/providers/amazon/aws/sensors/cloud_formation.py
 ##
 @@ -0,0 +1,98 @@
+# -*- coding: utf-8 -*-
+#
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+"""
+This module contains sensors for AWS CloudFormation.
+"""
+from airflow.providers.amazon.aws.hooks.cloud_formation import 
AWSCloudFormationHook
+from airflow.sensors.base_sensor_operator import BaseSensorOperator
+from airflow.utils.decorators import apply_defaults
+
+
+class CloudFormationCreateStackSensor(BaseSensorOperator):
+    """
+    Waits for a stack to be created successfully on AWS CloudFormation.
+
+    :param stack_name: The name of the stack to wait for (templated)
+    :type stack_name: str
+    :param aws_conn_id: ID of the Airflow connection where credentials and extra configuration are
+        stored
+    :type aws_conn_id: str
+    :param poke_interval: Time in seconds that the job should wait between each try
+    :type poke_interval: int
+    """
+
+    template_fields = ['stack_name']
+    ui_color = '#C5CAE9'
+
+    @apply_defaults
+    def __init__(self,
+                 stack_name,
+                 aws_conn_id='aws_default',
+                 region_name=None,
+                 *args,
+                 **kwargs):
+        super().__init__(*args, **kwargs)
+        self.stack_name = stack_name
+        self.hook = AWSCloudFormationHook(aws_conn_id=aws_conn_id, region_name=region_name)
+
+    def poke(self, context):
+        stack_status = self.hook.get_stack_status(self.stack_name)
+        if stack_status == 'CREATE_COMPLETE':
+            return True
+        elif stack_status in ('CREATE_IN_PROGRESS', None):
+            return False
+        else:
+            raise ValueError(f'Stack {self.stack_name} in bad state: {stack_status}')
 
 Review comment:
   You don't need the `elif`. You can remove two letters :) ..It can only 
return once. So you can have `if`, `if`, `raise`
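The flattened control flow suggested here might look like the sketch below (the `poke` body written as a free function with hypothetical arguments, just to show the `if`, `if`, `raise` shape):

```python
def poke_sketch(stack_status, stack_name):
    # Each branch exits the function, so no elif/else chain is needed.
    if stack_status == 'CREATE_COMPLETE':
        return True
    if stack_status in ('CREATE_IN_PROGRESS', None):
        return False
    raise ValueError(f'Stack {stack_name} in bad state: {stack_status}')
```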




[GitHub] [airflow] aviemzur commented on issue #6824: [AIRFLOW-6258] add CloudFormation operators to AWS providers

2020-01-27 Thread GitBox
aviemzur commented on issue #6824: [AIRFLOW-6258] add CloudFormation operators 
to AWS providers
URL: https://github.com/apache/airflow/pull/6824#issuecomment-578776781
 
 
   @feluelle committed all suggested changes and all tests passed on Travis




[GitHub] [airflow] feluelle commented on a change in pull request #6824: [AIRFLOW-6258] add CloudFormation operators to AWS providers

2020-01-27 Thread GitBox
feluelle commented on a change in pull request #6824: [AIRFLOW-6258] add 
CloudFormation operators to AWS providers
URL: https://github.com/apache/airflow/pull/6824#discussion_r371271765
 
 

 ##
 File path: airflow/providers/amazon/aws/hooks/cloud_formation.py
 ##
 @@ -0,0 +1,88 @@
+# -*- coding: utf-8 -*-
+#
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+
+"""
+This module contains AWS CloudFormation Hook
+"""
+from botocore.exceptions import ClientError
+
+from airflow.contrib.hooks.aws_hook import AwsHook
+
+
+class AWSCloudFormationHook(AwsHook):
+    """
+    Interact with AWS CloudFormation.
+    """
+
+    def __init__(self, region_name=None, *args, **kwargs):
+        self.region_name = region_name
+        self.conn = None
+        super().__init__(*args, **kwargs)
+
+    def get_conn(self):
+        if not self.conn:
+            self.conn = self.get_client_type('cloudformation', self.region_name)
+        return self.conn
+
+    def get_stack_status(self, stack_name):
+        """
+        Get stack status from CloudFormation.
+        """
+        cloudformation = self.get_conn()
+
+        self.log.info('Poking for stack %s', stack_name)
+
+        try:
+            stacks = cloudformation.describe_stacks(StackName=stack_name)['Stacks']
+            return stacks[0]['StackStatus']
+        except ClientError as e:
+            if 'does not exist' in str(e):
+                return None
+            else:
+                raise e
 
 Review comment:
   Yes, so let's better check the message string then.




[jira] [Commented] (AIRFLOW-3007) Scheduler docs use deprecated use of `schedule_interval`

2020-01-27 Thread Jira


[ 
https://issues.apache.org/jira/browse/AIRFLOW-3007?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17024378#comment-17024378
 ] 

Matjaž Mav commented on AIRFLOW-3007:
-

[~kaxilnaik] Does this fix the annoying warning message on every command?

Example:

{{$ airflow version}}
{{[2020-01-27 14:26:28,431] \{models.py:2450} WARNING - schedule_interval is 
used for , though it has been deprecated as 
a task parameter, you need to specify it as a DAG parameter instead}}
{{[2020-01-27 14:26:28,432] \{models.py:2450} WARNING - schedule_interval is 
used for , though it has been deprecated as 
a task parameter, you need to specify it as a DAG parameter instead}}
{{ [Airflow ASCII-art banner] }}
{{ v1.10.1}}

> Scheduler docs use deprecated use of `schedule_interval` 
> -
>
> Key: AIRFLOW-3007
> URL: https://issues.apache.org/jira/browse/AIRFLOW-3007
> Project: Apache Airflow
>  Issue Type: Improvement
>  Components: documentation
>Reporter: Kaxil Naik
>Assignee: Kaxil Naik
>Priority: Minor
> Fix For: 2.0.0
>
> Attachments: screenshot-1.png
>
>
> The scheduler docs at 
> https://airflow.apache.org/scheduler.html#backfill-and-catchup use a deprecated 
> way of passing {{schedule_interval}}. {{schedule_interval}} should be passed 
> to the DAG as a separate parameter and not as a default arg.
>  !screenshot-1.png! 
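As the description notes, the non-deprecated form passes `schedule_interval` to the DAG itself rather than through `default_args`. A minimal sketch of the two placements (the `DAG(...)` call is left as a comment because it needs an Airflow install; names are illustrative):

```python
from datetime import datetime, timedelta

# default_args are applied to every task; schedule_interval does NOT belong here.
default_args = {
    'owner': 'airflow',
    'start_date': datetime(2020, 1, 1),
}

# Deprecation-free form: schedule_interval is a DAG-level parameter, e.g.
#   dag = DAG('example_dag',
#             default_args=default_args,
#             schedule_interval=timedelta(days=1))
schedule = timedelta(days=1)
```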





[GitHub] [airflow] codecov-io edited a comment on issue #6824: [AIRFLOW-6258] add CloudFormation operators to AWS providers

2020-01-27 Thread GitBox
codecov-io edited a comment on issue #6824: [AIRFLOW-6258] add CloudFormation 
operators to AWS providers
URL: https://github.com/apache/airflow/pull/6824#issuecomment-571189190
 
 
   # [Codecov](https://codecov.io/gh/apache/airflow/pull/6824?src=pr=h1) 
Report
   > Merging 
[#6824](https://codecov.io/gh/apache/airflow/pull/6824?src=pr=desc) into 
[master](https://codecov.io/gh/apache/airflow/commit/be812bd660fac4621a25f3269fded8d4e03b0023?src=pr=desc)
 will **increase** coverage by `0.34%`.
   > The diff coverage is `99%`.
   
   [![Impacted file tree 
graph](https://codecov.io/gh/apache/airflow/pull/6824/graphs/tree.svg?width=650=WdLKlKHOAU=150=pr)](https://codecov.io/gh/apache/airflow/pull/6824?src=pr=tree)
   
   ```diff
   @@            Coverage Diff             @@
   ##           master    #6824      +/-   ##
   ==========================================
   + Coverage   84.85%   85.19%     +0.34%   
   ==========================================
     Files         679      841      +162    
     Lines       38536    40424     +1888    
   ==========================================
   + Hits        32698    34441     +1743    
   - Misses       5838     5983      +145    
   ```
   
   
   | [Impacted 
Files](https://codecov.io/gh/apache/airflow/pull/6824?src=pr=tree) | 
Coverage Δ | |
   |---|---|---|
   | 
[.../providers/amazon/aws/operators/cloud\_formation.py](https://codecov.io/gh/apache/airflow/pull/6824/diff?src=pr=tree#diff-YWlyZmxvdy9wcm92aWRlcnMvYW1hem9uL2F3cy9vcGVyYXRvcnMvY2xvdWRfZm9ybWF0aW9uLnB5)
 | `100% <100%> (ø)` | |
   | 
[...ow/providers/amazon/aws/sensors/cloud\_formation.py](https://codecov.io/gh/apache/airflow/pull/6824/diff?src=pr=tree#diff-YWlyZmxvdy9wcm92aWRlcnMvYW1hem9uL2F3cy9zZW5zb3JzL2Nsb3VkX2Zvcm1hdGlvbi5weQ==)
 | `100% <100%> (ø)` | |
   | 
[...flow/providers/amazon/aws/hooks/cloud\_formation.py](https://codecov.io/gh/apache/airflow/pull/6824/diff?src=pr=tree#diff-YWlyZmxvdy9wcm92aWRlcnMvYW1hem9uL2F3cy9ob29rcy9jbG91ZF9mb3JtYXRpb24ucHk=)
 | `96.77% <96.77%> (ø)` | |
   | 
[airflow/operators/postgres\_operator.py](https://codecov.io/gh/apache/airflow/pull/6824/diff?src=pr=tree#diff-YWlyZmxvdy9vcGVyYXRvcnMvcG9zdGdyZXNfb3BlcmF0b3IucHk=)
 | `0% <0%> (-100%)` | :arrow_down: |
   | 
[...rflow/contrib/sensors/sagemaker\_training\_sensor.py](https://codecov.io/gh/apache/airflow/pull/6824/diff?src=pr=tree#diff-YWlyZmxvdy9jb250cmliL3NlbnNvcnMvc2FnZW1ha2VyX3RyYWluaW5nX3NlbnNvci5weQ==)
 | `0% <0%> (-100%)` | :arrow_down: |
   | 
[airflow/contrib/operators/snowflake\_operator.py](https://codecov.io/gh/apache/airflow/pull/6824/diff?src=pr=tree#diff-YWlyZmxvdy9jb250cmliL29wZXJhdG9ycy9zbm93Zmxha2Vfb3BlcmF0b3IucHk=)
 | `0% <0%> (-95.84%)` | :arrow_down: |
   | 
[airflow/contrib/hooks/azure\_data\_lake\_hook.py](https://codecov.io/gh/apache/airflow/pull/6824/diff?src=pr=tree#diff-YWlyZmxvdy9jb250cmliL2hvb2tzL2F6dXJlX2RhdGFfbGFrZV9ob29rLnB5)
 | `0% <0%> (-93.11%)` | :arrow_down: |
   | 
[airflow/contrib/hooks/grpc\_hook.py](https://codecov.io/gh/apache/airflow/pull/6824/diff?src=pr=tree#diff-YWlyZmxvdy9jb250cmliL2hvb2tzL2dycGNfaG9vay5weQ==)
 | `0% <0%> (-91.94%)` | :arrow_down: |
   | 
[airflow/contrib/sensors/azure\_cosmos\_sensor.py](https://codecov.io/gh/apache/airflow/pull/6824/diff?src=pr=tree#diff-YWlyZmxvdy9jb250cmliL3NlbnNvcnMvYXp1cmVfY29zbW9zX3NlbnNvci5weQ==)
 | `0% <0%> (-81.25%)` | :arrow_down: |
   | 
[airflow/contrib/hooks/snowflake\_hook.py](https://codecov.io/gh/apache/airflow/pull/6824/diff?src=pr=tree#diff-YWlyZmxvdy9jb250cmliL2hvb2tzL3Nub3dmbGFrZV9ob29rLnB5)
 | `0% <0%> (-81.14%)` | :arrow_down: |
   | ... and [482 
more](https://codecov.io/gh/apache/airflow/pull/6824/diff?src=pr=tree-more) 
| |
   
   --
   
   [Continue to review full report at 
Codecov](https://codecov.io/gh/apache/airflow/pull/6824?src=pr=continue).
   > **Legend** - [Click here to learn 
more](https://docs.codecov.io/docs/codecov-delta)
   > `Δ = absolute <relative> (impact)`, `ø = not affected`, `? = missing data`
   > Powered by 
[Codecov](https://codecov.io/gh/apache/airflow/pull/6824?src=pr=footer). 
Last update 
[be812bd...eaa32b7](https://codecov.io/gh/apache/airflow/pull/6824?src=pr=lastupdated).
 Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments).
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [airflow] codecov-io edited a comment on issue #6824: [AIRFLOW-6258] add CloudFormation operators to AWS providers

2020-01-27 Thread GitBox
codecov-io edited a comment on issue #6824: [AIRFLOW-6258] add CloudFormation 
operators to AWS providers
URL: https://github.com/apache/airflow/pull/6824#issuecomment-571189190
 
 
   # [Codecov](https://codecov.io/gh/apache/airflow/pull/6824?src=pr=h1) 
Report
   > Merging 
[#6824](https://codecov.io/gh/apache/airflow/pull/6824?src=pr=desc) into 
[master](https://codecov.io/gh/apache/airflow/commit/be812bd660fac4621a25f3269fded8d4e03b0023?src=pr=desc)
 will **increase** coverage by `0.11%`.
   > The diff coverage is `99%`.
   
   [![Impacted file tree 
graph](https://codecov.io/gh/apache/airflow/pull/6824/graphs/tree.svg?width=650=WdLKlKHOAU=150=pr)](https://codecov.io/gh/apache/airflow/pull/6824?src=pr=tree)
   
   ```diff
   @@             Coverage Diff             @@
   ##           master    #6824      +/-   ##
   ==========================================
   + Coverage   84.85%   84.96%   +0.11%
   ==========================================
     Files         679      841     +162
     Lines       38536    40424    +1888
   ==========================================
   + Hits        32698    34348    +1650
   - Misses       5838     6076     +238
   ```
   
   
   | [Impacted 
Files](https://codecov.io/gh/apache/airflow/pull/6824?src=pr=tree) | 
Coverage Δ | |
   |---|---|---|
   | 
[.../providers/amazon/aws/operators/cloud\_formation.py](https://codecov.io/gh/apache/airflow/pull/6824/diff?src=pr=tree#diff-YWlyZmxvdy9wcm92aWRlcnMvYW1hem9uL2F3cy9vcGVyYXRvcnMvY2xvdWRfZm9ybWF0aW9uLnB5)
 | `100% <100%> (ø)` | |
   | 
[...ow/providers/amazon/aws/sensors/cloud\_formation.py](https://codecov.io/gh/apache/airflow/pull/6824/diff?src=pr=tree#diff-YWlyZmxvdy9wcm92aWRlcnMvYW1hem9uL2F3cy9zZW5zb3JzL2Nsb3VkX2Zvcm1hdGlvbi5weQ==)
 | `100% <100%> (ø)` | |
   | 
[...flow/providers/amazon/aws/hooks/cloud\_formation.py](https://codecov.io/gh/apache/airflow/pull/6824/diff?src=pr=tree#diff-YWlyZmxvdy9wcm92aWRlcnMvYW1hem9uL2F3cy9ob29rcy9jbG91ZF9mb3JtYXRpb24ucHk=)
 | `96.77% <96.77%> (ø)` | |
   | 
[airflow/operators/postgres\_operator.py](https://codecov.io/gh/apache/airflow/pull/6824/diff?src=pr=tree#diff-YWlyZmxvdy9vcGVyYXRvcnMvcG9zdGdyZXNfb3BlcmF0b3IucHk=)
 | `0% <0%> (-100%)` | :arrow_down: |
   | 
[...rflow/contrib/sensors/sagemaker\_training\_sensor.py](https://codecov.io/gh/apache/airflow/pull/6824/diff?src=pr=tree#diff-YWlyZmxvdy9jb250cmliL3NlbnNvcnMvc2FnZW1ha2VyX3RyYWluaW5nX3NlbnNvci5weQ==)
 | `0% <0%> (-100%)` | :arrow_down: |
   | 
[airflow/operators/mysql\_to\_hive.py](https://codecov.io/gh/apache/airflow/pull/6824/diff?src=pr=tree#diff-YWlyZmxvdy9vcGVyYXRvcnMvbXlzcWxfdG9faGl2ZS5weQ==)
 | `0% <0%> (-100%)` | :arrow_down: |
   | 
[airflow/contrib/operators/snowflake\_operator.py](https://codecov.io/gh/apache/airflow/pull/6824/diff?src=pr=tree#diff-YWlyZmxvdy9jb250cmliL29wZXJhdG9ycy9zbm93Zmxha2Vfb3BlcmF0b3IucHk=)
 | `0% <0%> (-95.84%)` | :arrow_down: |
   | 
[airflow/contrib/hooks/azure\_data\_lake\_hook.py](https://codecov.io/gh/apache/airflow/pull/6824/diff?src=pr=tree#diff-YWlyZmxvdy9jb250cmliL2hvb2tzL2F6dXJlX2RhdGFfbGFrZV9ob29rLnB5)
 | `0% <0%> (-93.11%)` | :arrow_down: |
   | 
[airflow/contrib/hooks/grpc\_hook.py](https://codecov.io/gh/apache/airflow/pull/6824/diff?src=pr=tree#diff-YWlyZmxvdy9jb250cmliL2hvb2tzL2dycGNfaG9vay5weQ==)
 | `0% <0%> (-91.94%)` | :arrow_down: |
   | 
[airflow/contrib/sensors/azure\_cosmos\_sensor.py](https://codecov.io/gh/apache/airflow/pull/6824/diff?src=pr=tree#diff-YWlyZmxvdy9jb250cmliL3NlbnNvcnMvYXp1cmVfY29zbW9zX3NlbnNvci5weQ==)
 | `0% <0%> (-81.25%)` | :arrow_down: |
   | ... and [485 
more](https://codecov.io/gh/apache/airflow/pull/6824/diff?src=pr=tree-more) 
| |
   
   --
   
   [Continue to review full report at 
Codecov](https://codecov.io/gh/apache/airflow/pull/6824?src=pr=continue).
   > **Legend** - [Click here to learn 
more](https://docs.codecov.io/docs/codecov-delta)
   > `Δ = absolute  (impact)`, `ø = not affected`, `? = missing data`
   > Powered by 
[Codecov](https://codecov.io/gh/apache/airflow/pull/6824?src=pr=footer). 
Last update 
[be812bd...eaa32b7](https://codecov.io/gh/apache/airflow/pull/6824?src=pr=lastupdated).
 Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments).
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [airflow] aviemzur commented on a change in pull request #6824: [AIRFLOW-6258] add CloudFormation operators to AWS providers

2020-01-27 Thread GitBox
aviemzur commented on a change in pull request #6824: [AIRFLOW-6258] add 
CloudFormation operators to AWS providers
URL: https://github.com/apache/airflow/pull/6824#discussion_r371262602
 
 

 ##
 File path: airflow/providers/amazon/aws/hooks/cloud_formation.py
 ##
 @@ -0,0 +1,88 @@
+# -*- coding: utf-8 -*-
+#
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+
+"""
+This module contains AWS CloudFormation Hook
+"""
+from botocore.exceptions import ClientError
+
+from airflow.contrib.hooks.aws_hook import AwsHook
+
+
+class AWSCloudFormationHook(AwsHook):
+    """
+    Interact with AWS CloudFormation.
+    """
+
+    def __init__(self, region_name=None, *args, **kwargs):
+        self.region_name = region_name
+        self.conn = None
+        super().__init__(*args, **kwargs)
+
+    def get_conn(self):
+        if not self.conn:
+            self.conn = self.get_client_type('cloudformation', self.region_name)
+        return self.conn
+
+    def get_stack_status(self, stack_name):
+        """
+        Get stack status from CloudFormation.
+        """
+        cloudformation = self.get_conn()
+
+        self.log.info('Poking for stack %s', stack_name)
+
+        try:
+            stacks = cloudformation.describe_stacks(StackName=stack_name)['Stacks']
+            return stacks[0]['StackStatus']
+        except ClientError as e:
+            if 'does not exist' in str(e):
+                return None
+            else:
+                raise e
 
 Review comment:
   
![image](https://user-images.githubusercontent.com/5708714/73180571-39a28d00-411e-11ea-960a-69a582d1b7e0.png)
   
   We could check the error code instead of the message string, and assume that 
`ValidationError` returned from `describe_stacks` could only be due to stack 
not existing.
   
   However, there could be other scenarios in which that error occurs, for 
example illegal characters in `StackName`:
   ```
   >>> cf.describe_stacks(StackName='||')
   
   Traceback (most recent call last):
 File "", line 1, in 
 File "/usr/local/lib/python3.7/site-packages/botocore/client.py", line 
357, in _api_call
   return self._make_api_call(operation_name, kwargs)
 File "/usr/local/lib/python3.7/site-packages/botocore/client.py", line 
661, in _make_api_call
   raise error_class(parsed_response, operation_name)
   botocore.exceptions.ClientError: An error occurred (ValidationError) when 
calling the DescribeStacks operation: 1 validation error detected: Value '||' 
at 'stackName' failed to satisfy constraint: Member must satisfy regular 
expression pattern: [a-zA-Z][-a-zA-Z0-9]*|arn:[-a-zA-Z0-9:/._+]*```
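   A sketch of the error-code-based classification discussed above (a hypothetical helper, not the PR's code; it assumes botocore's standard parsed-error layout `response['Error']['Code']/['Message']` and combines the code check with the message check, since `ValidationError` alone is ambiguous):
   ```python
   def is_stack_missing(error_response: dict) -> bool:
       """Classify a parsed botocore error payload: True only when the
       ValidationError's message indicates a nonexistent stack, so that
       other ValidationErrors (e.g. a malformed StackName) still raise."""
       err = error_response.get('Error', {})
       return (err.get('Code') == 'ValidationError'
               and 'does not exist' in err.get('Message', ''))

   # Simulated payloads, shaped like botocore's parsed responses:
   missing = {'Error': {'Code': 'ValidationError',
                        'Message': 'Stack with id foo does not exist'}}
   bad_name = {'Error': {'Code': 'ValidationError',
                         'Message': "Value '||' at 'stackName' failed "
                                    "to satisfy constraint"}}
   print(is_stack_missing(missing))   # → True
   print(is_stack_missing(bad_name))  # → False
   ```
   In the hook's `except ClientError as e:` branch, this check would be applied to `e.response` before deciding to return `None`.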


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [airflow] codecov-io edited a comment on issue #7252: [AIRFLOW-6531] Initial Yandex.Cloud Dataproc support

2020-01-27 Thread GitBox
codecov-io edited a comment on issue #7252: [AIRFLOW-6531] Initial Yandex.Cloud 
Dataproc support
URL: https://github.com/apache/airflow/pull/7252#issuecomment-578369583
 
 
   # [Codecov](https://codecov.io/gh/apache/airflow/pull/7252?src=pr=h1) 
Report
   > Merging 
[#7252](https://codecov.io/gh/apache/airflow/pull/7252?src=pr=desc) into 
[master](https://codecov.io/gh/apache/airflow/commit/9a04013b0e40b0d744ff4ac9f008491806d60df2?src=pr=desc)
 will **decrease** coverage by `0.41%`.
   > The diff coverage is `77.34%`.
   
   [![Impacted file tree 
graph](https://codecov.io/gh/apache/airflow/pull/7252/graphs/tree.svg?width=650=WdLKlKHOAU=150=pr)](https://codecov.io/gh/apache/airflow/pull/7252?src=pr=tree)
   
   ```diff
   @@            Coverage Diff             @@
   ##           master    #7252      +/-   ##
   ==========================================
   - Coverage   85.45%   85.04%   -0.42%
   ==========================================
     Files         838      843       +5
     Lines       40324    40654     +330
   ==========================================
   + Hits        34458    34573     +115
   - Misses       5866     6081     +215
   ```
   
   
   | [Impacted 
Files](https://codecov.io/gh/apache/airflow/pull/7252?src=pr=tree) | 
Coverage Δ | |
   |---|---|---|
   | 
[airflow/models/connection.py](https://codecov.io/gh/apache/airflow/pull/7252/diff?src=pr=tree#diff-YWlyZmxvdy9tb2RlbHMvY29ubmVjdGlvbi5weQ==)
 | `77.4% <ø> (ø)` | :arrow_up: |
   | 
[...andex/example\_dags/example\_yandexcloud\_dataproc.py](https://codecov.io/gh/apache/airflow/pull/7252/diff?src=pr=tree#diff-YWlyZmxvdy9wcm92aWRlcnMveWFuZGV4L2V4YW1wbGVfZGFncy9leGFtcGxlX3lhbmRleGNsb3VkX2RhdGFwcm9jLnB5)
 | `0% <0%> (ø)` | |
   | 
[airflow/utils/db.py](https://codecov.io/gh/apache/airflow/pull/7252/diff?src=pr=tree#diff-YWlyZmxvdy91dGlscy9kYi5weQ==)
 | `97.95% <100%> (+0.02%)` | :arrow_up: |
   | 
[airflow/www/forms.py](https://codecov.io/gh/apache/airflow/pull/7252/diff?src=pr=tree#diff-YWlyZmxvdy93d3cvZm9ybXMucHk=)
 | `100% <100%> (ø)` | :arrow_up: |
   | 
[.../yandex/operators/yandexcloud\_dataproc\_operator.py](https://codecov.io/gh/apache/airflow/pull/7252/diff?src=pr=tree#diff-YWlyZmxvdy9wcm92aWRlcnMveWFuZGV4L29wZXJhdG9ycy95YW5kZXhjbG91ZF9kYXRhcHJvY19vcGVyYXRvci5weQ==)
 | `100% <100%> (ø)` | |
   | 
[airflow/www/views.py](https://codecov.io/gh/apache/airflow/pull/7252/diff?src=pr=tree#diff-YWlyZmxvdy93d3cvdmlld3MucHk=)
 | `76.07% <100%> (ø)` | :arrow_up: |
   | 
[...ow/providers/yandex/hooks/yandexcloud\_base\_hook.py](https://codecov.io/gh/apache/airflow/pull/7252/diff?src=pr=tree#diff-YWlyZmxvdy9wcm92aWRlcnMveWFuZGV4L2hvb2tzL3lhbmRleGNsb3VkX2Jhc2VfaG9vay5weQ==)
 | `55% <55%> (ø)` | |
   | 
[...roviders/yandex/hooks/yandexcloud\_dataproc\_hook.py](https://codecov.io/gh/apache/airflow/pull/7252/diff?src=pr=tree#diff-YWlyZmxvdy9wcm92aWRlcnMveWFuZGV4L2hvb2tzL3lhbmRleGNsb3VkX2RhdGFwcm9jX2hvb2sucHk=)
 | `74.38% <74.38%> (ø)` | |
   | 
[...ders/yandex/operators/yandexcloud\_base\_operator.py](https://codecov.io/gh/apache/airflow/pull/7252/diff?src=pr=tree#diff-YWlyZmxvdy9wcm92aWRlcnMveWFuZGV4L29wZXJhdG9ycy95YW5kZXhjbG91ZF9iYXNlX29wZXJhdG9yLnB5)
 | `90% <90%> (ø)` | |
   | 
[...flow/providers/apache/cassandra/hooks/cassandra.py](https://codecov.io/gh/apache/airflow/pull/7252/diff?src=pr=tree#diff-YWlyZmxvdy9wcm92aWRlcnMvYXBhY2hlL2Nhc3NhbmRyYS9ob29rcy9jYXNzYW5kcmEucHk=)
 | `21.51% <0%> (-72.16%)` | :arrow_down: |
   | ... and [12 
more](https://codecov.io/gh/apache/airflow/pull/7252/diff?src=pr=tree-more) 
| |
   
   --
   
   [Continue to review full report at 
Codecov](https://codecov.io/gh/apache/airflow/pull/7252?src=pr=continue).
   > **Legend** - [Click here to learn 
more](https://docs.codecov.io/docs/codecov-delta)
   > `Δ = absolute  (impact)`, `ø = not affected`, `? = missing data`
   > Powered by 
[Codecov](https://codecov.io/gh/apache/airflow/pull/7252?src=pr=footer). 
Last update 
[9a04013...d79a67f](https://codecov.io/gh/apache/airflow/pull/7252?src=pr=lastupdated).
 Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments).
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [airflow] aviemzur commented on a change in pull request #6824: [AIRFLOW-6258] add CloudFormation operators to AWS providers

2020-01-27 Thread GitBox
aviemzur commented on a change in pull request #6824: [AIRFLOW-6258] add 
CloudFormation operators to AWS providers
URL: https://github.com/apache/airflow/pull/6824#discussion_r371262602
 
 

 ##
 File path: airflow/providers/amazon/aws/hooks/cloud_formation.py
 ##
 @@ -0,0 +1,88 @@
+# -*- coding: utf-8 -*-
+#
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+
+"""
+This module contains AWS CloudFormation Hook
+"""
+from botocore.exceptions import ClientError
+
+from airflow.contrib.hooks.aws_hook import AwsHook
+
+
+class AWSCloudFormationHook(AwsHook):
+"""
+Interact with AWS CloudFormation.
+"""
+
+def __init__(self, region_name=None, *args, **kwargs):
+self.region_name = region_name
+self.conn = None
+super().__init__(*args, **kwargs)
+
+def get_conn(self):
+if not self.conn:
+self.conn = self.get_client_type('cloudformation', 
self.region_name)
+return self.conn
+
+def get_stack_status(self, stack_name):
+"""
+Get stack status from CloudFormation.
+"""
+cloudformation = self.get_conn()
+
+self.log.info('Poking for stack %s', stack_name)
+
+try:
+stacks = 
cloudformation.describe_stacks(StackName=stack_name)['Stacks']
+return stacks[0]['StackStatus']
+except ClientError as e:
+if 'does not exist' in str(e):
+return None
+else:
+raise e
 
 Review comment:
   
![image](https://user-images.githubusercontent.com/5708714/73180571-39a28d00-411e-11ea-960a-69a582d1b7e0.png)
   
   We could check the error code instead of the message string, and assume that 
`ValidationError` returned from `describe_stacks` could only be due to stack 
not existing.
   
   However potentially there be other scenarios in which that can happen, for 
example, illegal characters in `StackName`:
   ```
   >>> cf.describe_stacks(StackName='||')
   
   Traceback (most recent call last):
 File "", line 1, in 
 File "/usr/local/lib/python3.7/site-packages/botocore/client.py", line 
357, in _api_call
   return self._make_api_call(operation_name, kwargs)
 File "/usr/local/lib/python3.7/site-packages/botocore/client.py", line 
661, in _make_api_call
   raise error_class(parsed_response, operation_name)
   botocore.exceptions.ClientError: An error occurred (ValidationError) when 
calling the DescribeStacks operation: 1 validation error detected: Value '||' 
at 'stackName' failed to satisfy constraint: Member must satisfy regular 
expression pattern: [a-zA-Z][-a-zA-Z0-9]*|arn:[-a-zA-Z0-9:/._+]*```


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [airflow] aviemzur commented on a change in pull request #6824: [AIRFLOW-6258] add CloudFormation operators to AWS providers

2020-01-27 Thread GitBox
aviemzur commented on a change in pull request #6824: [AIRFLOW-6258] add 
CloudFormation operators to AWS providers
URL: https://github.com/apache/airflow/pull/6824#discussion_r371262602
 
 

 ##
 File path: airflow/providers/amazon/aws/hooks/cloud_formation.py
 ##
 @@ -0,0 +1,88 @@
+# -*- coding: utf-8 -*-
+#
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+
+"""
+This module contains AWS CloudFormation Hook
+"""
+from botocore.exceptions import ClientError
+
+from airflow.contrib.hooks.aws_hook import AwsHook
+
+
+class AWSCloudFormationHook(AwsHook):
+"""
+Interact with AWS CloudFormation.
+"""
+
+def __init__(self, region_name=None, *args, **kwargs):
+self.region_name = region_name
+self.conn = None
+super().__init__(*args, **kwargs)
+
+def get_conn(self):
+if not self.conn:
+self.conn = self.get_client_type('cloudformation', 
self.region_name)
+return self.conn
+
+def get_stack_status(self, stack_name):
+"""
+Get stack status from CloudFormation.
+"""
+cloudformation = self.get_conn()
+
+self.log.info('Poking for stack %s', stack_name)
+
+try:
+stacks = 
cloudformation.describe_stacks(StackName=stack_name)['Stacks']
+return stacks[0]['StackStatus']
+except ClientError as e:
+if 'does not exist' in str(e):
+return None
+else:
+raise e
 
 Review comment:
   
![image](https://user-images.githubusercontent.com/5708714/73180571-39a28d00-411e-11ea-960a-69a582d1b7e0.png)
   
   We could check the error code instead of the message string, and assume that 
`ValidationError` returned from `describe_stacks` could only be due to stack 
not existing.
   
   However potentially there be other scenarios in which that can happen, for 
example, illegal characters in `StackName`:
   ```>>> cf.describe_stacks(StackName='||')
   Traceback (most recent call last):
 File "", line 1, in 
 File "/usr/local/lib/python3.7/site-packages/botocore/client.py", line 
357, in _api_call
   return self._make_api_call(operation_name, kwargs)
 File "/usr/local/lib/python3.7/site-packages/botocore/client.py", line 
661, in _make_api_call
   raise error_class(parsed_response, operation_name)
   botocore.exceptions.ClientError: An error occurred (ValidationError) when 
calling the DescribeStacks operation: 1 validation error detected: Value '||' 
at 'stackName' failed to satisfy constraint: Member must satisfy regular 
expression pattern: [a-zA-Z][-a-zA-Z0-9]*|arn:[-a-zA-Z0-9:/._+]*```


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [airflow] aviemzur commented on a change in pull request #6824: [AIRFLOW-6258] add CloudFormation operators to AWS providers

2020-01-27 Thread GitBox
aviemzur commented on a change in pull request #6824: [AIRFLOW-6258] add 
CloudFormation operators to AWS providers
URL: https://github.com/apache/airflow/pull/6824#discussion_r371262602
 
 

 ##
 File path: airflow/providers/amazon/aws/hooks/cloud_formation.py
 ##
 @@ -0,0 +1,88 @@
+# -*- coding: utf-8 -*-
+#
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+
+"""
+This module contains AWS CloudFormation Hook
+"""
+from botocore.exceptions import ClientError
+
+from airflow.contrib.hooks.aws_hook import AwsHook
+
+
+class AWSCloudFormationHook(AwsHook):
+"""
+Interact with AWS CloudFormation.
+"""
+
+def __init__(self, region_name=None, *args, **kwargs):
+self.region_name = region_name
+self.conn = None
+super().__init__(*args, **kwargs)
+
+def get_conn(self):
+if not self.conn:
+self.conn = self.get_client_type('cloudformation', 
self.region_name)
+return self.conn
+
+def get_stack_status(self, stack_name):
+"""
+Get stack status from CloudFormation.
+"""
+cloudformation = self.get_conn()
+
+self.log.info('Poking for stack %s', stack_name)
+
+try:
+stacks = 
cloudformation.describe_stacks(StackName=stack_name)['Stacks']
+return stacks[0]['StackStatus']
+except ClientError as e:
+if 'does not exist' in str(e):
+return None
+else:
+raise e
 
 Review comment:
   
![image](https://user-images.githubusercontent.com/5708714/73180571-39a28d00-411e-11ea-960a-69a582d1b7e0.png)
   
   We could check the error code instead of the message string, and assume that 
`ValidationError` returned from `describe_stacks` could only be due to stack 
not existing.
   
   However potentially there be other scenarios in which that can happen, for 
example, illegal characters in `StackName`:
   ```
   >>> cf.describe_stacks(StackName='||')
   Traceback (most recent call last):
     File "<stdin>", line 1, in <module>
     File "/usr/local/lib/python3.7/site-packages/botocore/client.py", line 357, in _api_call
       return self._make_api_call(operation_name, kwargs)
     File "/usr/local/lib/python3.7/site-packages/botocore/client.py", line 661, in _make_api_call
       raise error_class(parsed_response, operation_name)
   botocore.exceptions.ClientError: An error occurred (ValidationError) when calling the DescribeStacks operation: 1 validation error detected: Value '||' at 'stackName' failed to satisfy constraint: Member must satisfy regular expression pattern: [a-zA-Z][-a-zA-Z0-9]*|arn:[-a-zA-Z0-9:/._+]*
   ```
   
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [airflow] aviemzur commented on a change in pull request #6824: [AIRFLOW-6258] add CloudFormation operators to AWS providers

2020-01-27 Thread GitBox
aviemzur commented on a change in pull request #6824: [AIRFLOW-6258] add 
CloudFormation operators to AWS providers
URL: https://github.com/apache/airflow/pull/6824#discussion_r369977013
 
 

 ##
 File path: airflow/providers/amazon/aws/hooks/cloud_formation.py
 ##
 @@ -0,0 +1,88 @@
+# -*- coding: utf-8 -*-
+#
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+
+"""
+This module contains AWS CloudFormation Hook
+"""
+from botocore.exceptions import ClientError
+
+from airflow.contrib.hooks.aws_hook import AwsHook
+
+
+class AWSCloudFormationHook(AwsHook):
+    """
+    Interact with AWS CloudFormation.
+    """
+
+    def __init__(self, region_name=None, *args, **kwargs):
+        self.region_name = region_name
+        self.conn = None
+        super().__init__(*args, **kwargs)
+
+    def get_conn(self):
+        if not self.conn:
+            self.conn = self.get_client_type('cloudformation', self.region_name)
+        return self.conn
+
+    def get_stack_status(self, stack_name):
+        """
+        Get stack status from CloudFormation.
+        """
+        cloudformation = self.get_conn()
+
+        self.log.info('Poking for stack %s', stack_name)
+
+        try:
+            stacks = cloudformation.describe_stacks(StackName=stack_name)['Stacks']
+            return stacks[0]['StackStatus']
+        except ClientError as e:
+            if 'does not exist' in str(e):
+                return None
+            else:
+                raise e
 
 Review comment:
   Yep, I read that, however there is no class named 
`AmazonCloudFormationException` in boto.
   
   I could add:
   ```
   if e.response['Error']['Code'] == 'ValidationError':
   ```
   as suggested [here](https://stackoverflow.com/a/47040476/2489287)
   
   However, since we are using boto's CloudFormation client here, it is certain 
that this would be the value of the error code in the exception.
   
   Should I make this change, or should we go for the brevity of simply 
catching `ClientError` without checking its error code value?




[GitHub] [airflow] aviemzur commented on a change in pull request #6824: [AIRFLOW-6258] add CloudFormation operators to AWS providers

2020-01-27 Thread GitBox
aviemzur commented on a change in pull request #6824: [AIRFLOW-6258] add 
CloudFormation operators to AWS providers
URL: https://github.com/apache/airflow/pull/6824#discussion_r371262602
 
 

 ##
 File path: airflow/providers/amazon/aws/hooks/cloud_formation.py
 ##
 @@ -0,0 +1,88 @@
+# -*- coding: utf-8 -*-
+#
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+
+"""
+This module contains AWS CloudFormation Hook
+"""
+from botocore.exceptions import ClientError
+
+from airflow.contrib.hooks.aws_hook import AwsHook
+
+
+class AWSCloudFormationHook(AwsHook):
+    """
+    Interact with AWS CloudFormation.
+    """
+
+    def __init__(self, region_name=None, *args, **kwargs):
+        self.region_name = region_name
+        self.conn = None
+        super().__init__(*args, **kwargs)
+
+    def get_conn(self):
+        if not self.conn:
+            self.conn = self.get_client_type('cloudformation', self.region_name)
+        return self.conn
+
+    def get_stack_status(self, stack_name):
+        """
+        Get stack status from CloudFormation.
+        """
+        cloudformation = self.get_conn()
+
+        self.log.info('Poking for stack %s', stack_name)
+
+        try:
+            stacks = cloudformation.describe_stacks(StackName=stack_name)['Stacks']
+            return stacks[0]['StackStatus']
+        except ClientError as e:
+            if 'does not exist' in str(e):
+                return None
+            else:
+                raise e
 
 Review comment:
   We could check the error code instead of the message string, and assume that 
a `ValidationError` returned from `describe_stacks` could only be due to the 
stack not existing; however, there could potentially be other scenarios in 
which that error occurs.
   
   
![image](https://user-images.githubusercontent.com/5708714/73180571-39a28d00-411e-11ea-960a-69a582d1b7e0.png)
   




[jira] [Closed] (AIRFLOW-6633) S3 Logging Configurations Are Ignored For Local

2020-01-27 Thread Evan Hlavaty (Jira)


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-6633?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Evan Hlavaty closed AIRFLOW-6633.
-
Resolution: Not A Problem

Finally found logs in the Dask worker and discovered the S3 URL needed 
https:// prepended to it.

> S3 Logging Configurations Are Ignored For Local
> ---
>
> Key: AIRFLOW-6633
> URL: https://issues.apache.org/jira/browse/AIRFLOW-6633
> Project: Apache Airflow
>  Issue Type: Bug
>  Components: logging
>Affects Versions: 1.10.7
> Environment: Ubuntu 16.04 LTS
>Reporter: Evan Hlavaty
>Priority: Major
>
> When using the following config settings and an S3 connection ID created in
> the Admin UI, local logs are still being used and no logs are uploaded to S3.
> No errors are ever thrown to indicate whether the connection settings are
> working.
> [core]
> # Airflow can store logs remotely in AWS S3. Users must supply a remote
> # location URL (starting with 's3://...') and an Airflow connection
> # id that provides access to the storage location.
> remote_logging = True
> remote_base_log_folder = s3://my-bucket/path/to/logs
> remote_log_conn_id = MyS3Conn
> encrypt_s3_logs = False



--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[jira] [Resolved] (AIRFLOW-6646) Move protocols classes to providers package

2020-01-27 Thread Kamil Bregula (Jira)


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-6646?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Kamil Bregula resolved AIRFLOW-6646.

Resolution: Fixed

> Move protocols classes to providers package
> ---
>
> Key: AIRFLOW-6646
> URL: https://issues.apache.org/jira/browse/AIRFLOW-6646
> Project: Apache Airflow
>  Issue Type: Improvement
>  Components: hooks, operators
>Affects Versions: 1.10.7
>Reporter: Kamil Bregula
>Priority: Major
> Fix For: 2.0.0
>
>
> More information: 
> [https://cwiki.apache.org/confluence/display/AIRFLOW/AIP-21%3A+Changes+in+import+paths]



--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[jira] [Commented] (AIRFLOW-6646) Move protocols classes to providers package

2020-01-27 Thread ASF GitHub Bot (Jira)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-6646?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17024341#comment-17024341
 ] 

ASF GitHub Bot commented on AIRFLOW-6646:
-

mik-laj commented on pull request #7268: [AIRFLOW-6646][AIP-21] Move protocols 
classes to providers package
URL: https://github.com/apache/airflow/pull/7268
 
 
   
 



> Move protocols classes to providers package
> ---
>
> Key: AIRFLOW-6646
> URL: https://issues.apache.org/jira/browse/AIRFLOW-6646
> Project: Apache Airflow
>  Issue Type: Improvement
>  Components: hooks, operators
>Affects Versions: 1.10.7
>Reporter: Kamil Bregula
>Priority: Major
> Fix For: 2.0.0
>
>
> More information: 
> [https://cwiki.apache.org/confluence/display/AIRFLOW/AIP-21%3A+Changes+in+import+paths]



--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[jira] [Commented] (AIRFLOW-6646) Move protocols classes to providers package

2020-01-27 Thread ASF subversion and git services (Jira)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-6646?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17024343#comment-17024343
 ] 

ASF subversion and git services commented on AIRFLOW-6646:
--

Commit 9a04013b0e40b0d744ff4ac9f008491806d60df2 in airflow's branch 
refs/heads/master from Kamil Breguła
[ https://gitbox.apache.org/repos/asf?p=airflow.git;h=9a04013 ]

[AIRFLOW-6646][AIP-21] Move protocols classes to providers package (#7268)

* [AIP-21] Move contrib.hooks.ftp_hook providers.ftp.hooks.ftp

* [AIP-21] Move contrib.hooks.grpc_hook providers.grpc.hooks.grpc

* [AIP-21] Move contrib.hooks.imap_hook providers.imap.hooks.imap

* [AIP-21] Move contrib.hooks.ssh_hook providers.ssh.hooks.ssh

* [AIP-21] Move contrib.hooks.winrm_hook providers.microsoft.winrm.hooks.winrm

* [AIP-21] Move contrib.operators.grpc_operator providers.grpc.operators.grpc

* [AIP-21] Move contrib.operators.ssh_operator providers.ssh.operators.ssh

* [AIP-21] Move contrib.operators.winrm_operator 
providers.microsoft.winrm.operators.winrm

* [AIP-21] Move contrib.sensors.imap_attachment_sensor 
providers.imap.sensors.imap_attachment

* [AIP-21] Move hooks.http_hook providers.http.hooks.http

* [AIP-21] Move hooks.jdbc_hook providers.jdbc.hooks.jdbc

* [AIP-21] Move contrib.sensors.ftp_sensor providers.ftp.sensors.ftp

* [AIP-21] Move operators.email_operator providers.email.operators.email

* [AIP-21] Move operators.http_operator providers.http.operators.http

* [AIP-21] Move operators.jdbc_operator providers.jdbc.operators.jdbc

* [AIP-21] Move sensors.http_sensor providers.http.sensors.http

* Update docs


> Move protocols classes to providers package
> ---
>
> Key: AIRFLOW-6646
> URL: https://issues.apache.org/jira/browse/AIRFLOW-6646
> Project: Apache Airflow
>  Issue Type: Improvement
>  Components: hooks, operators
>Affects Versions: 1.10.7
>Reporter: Kamil Bregula
>Priority: Major
> Fix For: 2.0.0
>
>
> More information: 
> [https://cwiki.apache.org/confluence/display/AIRFLOW/AIP-21%3A+Changes+in+import+paths]



--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[GitHub] [airflow] mik-laj merged pull request #7268: [AIRFLOW-6646][AIP-21] Move protocols classes to providers package

2020-01-27 Thread GitBox
mik-laj merged pull request #7268: [AIRFLOW-6646][AIP-21] Move protocols 
classes to providers package
URL: https://github.com/apache/airflow/pull/7268
 
 
   




[GitHub] [airflow] mik-laj commented on issue #6455: [WIP][AIRFLOW-5777] Migrate AWS DynamoDB to /providers/aws [AIP-21]

2020-01-27 Thread GitBox
mik-laj commented on issue #6455: [WIP][AIRFLOW-5777] Migrate AWS DynamoDB to 
/providers/aws [AIP-21]
URL: https://github.com/apache/airflow/pull/6455#issuecomment-578748942
 
 
   Yes. I've already done it in my fork. 




[jira] [Commented] (AIRFLOW-6649) Google storage to Snowflake

2020-01-27 Thread Kamil Bregula (Jira)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-6649?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17024340#comment-17024340
 ] 

Kamil Bregula commented on AIRFLOW-6649:


Can't you use the Snowflake operator? This is the fastest way, because 
Snowflake will import the data without Airflow involvement. Here is a guide: 
[https://docs.snowflake.net/manuals/user-guide/data-load-gcs-copy.html]
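To make the suggestion concrete: with an external stage defined over the GCS bucket (as in the linked guide), the load is a single COPY INTO statement. Below is a hedged sketch in Python that only builds that SQL; the table, stage, and file-format names are hypothetical placeholders, and the resulting statement could be executed from Airflow with the Snowflake operator or any Snowflake client:

```python
def gcs_copy_sql(table, stage, file_format=None):
    """Build a Snowflake COPY INTO statement that reads from a named external stage.

    `table`, `stage`, and `file_format` are placeholders; the stage is assumed
    to already point at the GCS bucket (see the linked Snowflake guide).
    """
    sql = "COPY INTO {} FROM @{}".format(table, stage)
    if file_format:
        sql += " FILE_FORMAT = (FORMAT_NAME = '{}')".format(file_format)
    return sql


print(gcs_copy_sql('my_table', 'gcs_stage', 'my_csv_format'))
# COPY INTO my_table FROM @gcs_stage FILE_FORMAT = (FORMAT_NAME = 'my_csv_format')
```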

> Google storage to Snowflake
> ---
>
> Key: AIRFLOW-6649
> URL: https://issues.apache.org/jira/browse/AIRFLOW-6649
> Project: Apache Airflow
>  Issue Type: New Feature
>  Components: gcp, operators
>Affects Versions: 1.10.6
>Reporter: nexoriv
>Priority: Major
>
> can someone share google storage to snowflake operator?



--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[jira] [Created] (AIRFLOW-6649) Google storage to Snowflake

2020-01-27 Thread nexoriv (Jira)
nexoriv created AIRFLOW-6649:


 Summary: Google storage to Snowflake
 Key: AIRFLOW-6649
 URL: https://issues.apache.org/jira/browse/AIRFLOW-6649
 Project: Apache Airflow
  Issue Type: New Feature
  Components: gcp, operators
Affects Versions: 1.10.6
Reporter: nexoriv


can someone share google storage to snowflake operator?



--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[GitHub] [airflow] RosterIn commented on a change in pull request #6931: [AIRFLOW-6372] Align WASB remote logging URI scheme (2.0.x)

2020-01-27 Thread GitBox
RosterIn commented on a change in pull request #6931:  [AIRFLOW-6372] Align 
WASB remote logging URI scheme (2.0.x)
URL: https://github.com/apache/airflow/pull/6931#discussion_r371222582
 
 

 ##
 File path: airflow/config_templates/airflow_local_settings.py
 ##
 @@ -184,6 +184,9 @@
 
 DEFAULT_LOGGING_CONFIG['handlers'].update(GCS_REMOTE_HANDLERS)
 elif REMOTE_BASE_LOG_FOLDER.startswith('wasb'):
+
+# If you use the URI scheme, the parameter `wasb_container` is redundant.
 
 Review comment:
   Should an exception be raised if such redundancy is detected?
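   One way such redundancy could be surfaced instead of silently ignored is sketched below. This is a hypothetical helper, not the PR's code: `resolve_wasb_container` and its error message are invented for illustration, and it assumes the container is the netloc of the `wasb://` URI:

```python
from urllib.parse import urlparse


def resolve_wasb_container(remote_base_log_folder, wasb_container=None):
    """Pick the log container from a wasb:// URI, rejecting a conflicting
    explicit `wasb_container` setting (hypothetical helper for illustration)."""
    uri_container = urlparse(remote_base_log_folder).netloc or None
    if uri_container and wasb_container and uri_container != wasb_container:
        raise ValueError(
            "wasb_container={!r} conflicts with container {!r} parsed from {!r}".format(
                wasb_container, uri_container, remote_base_log_folder))
    # Fall back to whichever of the two was provided.
    return uri_container or wasb_container


print(resolve_wasb_container('wasb://airflow-logs/dag-logs'))  # airflow-logs
```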




[GitHub] [airflow] feluelle commented on a change in pull request #6824: [AIRFLOW-6258] add CloudFormation operators to AWS providers

2020-01-27 Thread GitBox
feluelle commented on a change in pull request #6824: [AIRFLOW-6258] add 
CloudFormation operators to AWS providers
URL: https://github.com/apache/airflow/pull/6824#discussion_r371219323
 
 

 ##
 File path: airflow/providers/amazon/aws/operators/cloud_formation.py
 ##
 @@ -0,0 +1,102 @@
+# -*- coding: utf-8 -*-
+#
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+"""
+This module contains CloudFormation create/delete stack operators.
+"""
+from typing import List
+
+from airflow.models import BaseOperator
+from airflow.providers.amazon.aws.hooks.cloud_formation import AWSCloudFormationHook
+from airflow.utils.decorators import apply_defaults
+
+
+class CloudFormationCreateStackOperator(BaseOperator):
+    """
+    An operator that creates a CloudFormation stack.
+
+    :param stack_name: stack name (templated)
+    :type params: str
 
 Review comment:
   ```suggestion
   :type stack_name: str
   ```




[GitHub] [airflow] feluelle commented on a change in pull request #6824: [AIRFLOW-6258] add CloudFormation operators to AWS providers

2020-01-27 Thread GitBox
feluelle commented on a change in pull request #6824: [AIRFLOW-6258] add 
CloudFormation operators to AWS providers
URL: https://github.com/apache/airflow/pull/6824#discussion_r371219248
 
 

 ##
 File path: airflow/providers/amazon/aws/hooks/cloud_formation.py
 ##
 @@ -0,0 +1,87 @@
+# -*- coding: utf-8 -*-
+#
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+
+"""
+This module contains AWS CloudFormation Hook
+"""
+from botocore.exceptions import ClientError
+
+from airflow.contrib.hooks.aws_hook import AwsHook
+
+
+class AWSCloudFormationHook(AwsHook):
+    """
+    Interact with AWS CloudFormation.
+    """
+
+    def __init__(self, region_name=None, *args, **kwargs):
+        self.region_name = region_name
+        self.conn = None
+        super().__init__(*args, **kwargs)
+
+    def get_conn(self):
+        if not self.conn:
+            self.conn = self.get_client_type('cloudformation', self.region_name)
+        return self.conn
+
+    def get_stack_status(self, stack_name):
+        """
+        Get stack status from CloudFormation.
+        """
+        cloudformation = self.get_conn()
+
+        self.log.info('Poking for stack %s', stack_name)
+
+        try:
+            stacks = cloudformation.describe_stacks(StackName=stack_name)['Stacks']
+            return stacks[0]['StackStatus']
+        except ClientError as e:
+            if 'does not exist' in str(e):
+                return None
+            else:
+                raise e
+
+    def create_stack(self, stack_name, params):
+        """
+        Create stack in CloudFormation.
+
+        :param stack_name: stack_name.
+        :type stack_name: str
+        :param params: parameters to be passed to CloudFormation.
+        :type: dict
+        """
+
+        if 'StackName' not in params:
+            params['StackName'] = stack_name
+        self.get_conn().create_stack(**params)
+
+    def delete_stack(self, stack_name, params=None):
+        """
+        Delete stack in CloudFormation.
+
+        :param stack_name: stack_name.
+        :type stack_name: str
+        :param params: parameters to be passed to CloudFormation (optional).
+        :type: dict
 
 Review comment:
   ```suggestion
   :type params: dict
   ```
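   As context for the docstring fix: the quoted `create_stack` merges the operator-level `stack_name` into the boto3 kwargs only when the caller has not set `StackName` explicitly. A tiny pure-Python sketch of that merge (the helper name is invented; the real hook mutates `params` in place before calling `create_stack(**params)`):

```python
def merge_stack_name(stack_name, params):
    """Mirror the hook's kwargs handling: an explicit 'StackName' in params
    wins, otherwise the given stack_name is injected (illustrative sketch)."""
    merged = dict(params)  # avoid mutating the caller's dict in this sketch
    merged.setdefault('StackName', stack_name)
    return merged


print(merge_stack_name('my-stack', {'TemplateBody': '{}'})['StackName'])    # my-stack
print(merge_stack_name('my-stack', {'StackName': 'explicit'})['StackName'])  # explicit
```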




[GitHub] [airflow] feluelle commented on a change in pull request #6824: [AIRFLOW-6258] add CloudFormation operators to AWS providers

2020-01-27 Thread GitBox
feluelle commented on a change in pull request #6824: [AIRFLOW-6258] add 
CloudFormation operators to AWS providers
URL: https://github.com/apache/airflow/pull/6824#discussion_r371219387
 
 

 ##
 File path: airflow/providers/amazon/aws/operators/cloud_formation.py
 ##
 @@ -0,0 +1,102 @@
+# -*- coding: utf-8 -*-
+#
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+"""
+This module contains CloudFormation create/delete stack operators.
+"""
+from typing import List
+
+from airflow.models import BaseOperator
+from airflow.providers.amazon.aws.hooks.cloud_formation import AWSCloudFormationHook
+from airflow.utils.decorators import apply_defaults
+
+
+class CloudFormationCreateStackOperator(BaseOperator):
+    """
+    An operator that creates a CloudFormation stack.
+
+    :param stack_name: stack name (templated)
+    :type params: str
+    :param params: parameters to be passed to CloudFormation.
+
+        .. seealso::
+            https://boto3.amazonaws.com/v1/documentation/api/latest/reference/services/cloudformation.html#CloudFormation.Client.create_stack
+    :type params: dict
+    :param aws_conn_id: aws connection to uses
+    :type aws_conn_id: str
+    """
+    template_fields: List[str] = ['stack_name']
+    template_ext = ()
+    ui_color = '#6b9659'
+
+    @apply_defaults
+    def __init__(
+            self,
+            stack_name,
+            params,
+            aws_conn_id='aws_default',
+            *args, **kwargs):
+        super().__init__(*args, **kwargs)
+        self.stack_name = stack_name
+        self.params = params
+        self.aws_conn_id = aws_conn_id
+
+    def execute(self, context):
+        self.log.info('Parameters: %s', self.params)
+
+        cloudformation_hook = AWSCloudFormationHook(aws_conn_id=self.aws_conn_id)
+        cloudformation_hook.create_stack(self.stack_name, self.params)
+
+
+class CloudFormationDeleteStackOperator(BaseOperator):
+    """
+    An operator that deletes a CloudFormation stack.
+
+    :param stack_name: stack name (templated)
+    :type params: str
 
 Review comment:
   ```suggestion
   :type stack_name: str
   ```




[GitHub] [airflow] feluelle commented on a change in pull request #6824: [AIRFLOW-6258] add CloudFormation operators to AWS providers

2020-01-27 Thread GitBox
feluelle commented on a change in pull request #6824: [AIRFLOW-6258] add 
CloudFormation operators to AWS providers
URL: https://github.com/apache/airflow/pull/6824#discussion_r371219164
 
 

 ##
 File path: airflow/providers/amazon/aws/hooks/cloud_formation.py
 ##
 @@ -0,0 +1,87 @@
+# -*- coding: utf-8 -*-
+#
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+
+"""
+This module contains AWS CloudFormation Hook
+"""
+from botocore.exceptions import ClientError
+
+from airflow.contrib.hooks.aws_hook import AwsHook
+
+
+class AWSCloudFormationHook(AwsHook):
+    """
+    Interact with AWS CloudFormation.
+    """
+
+    def __init__(self, region_name=None, *args, **kwargs):
+        self.region_name = region_name
+        self.conn = None
+        super().__init__(*args, **kwargs)
+
+    def get_conn(self):
+        if not self.conn:
+            self.conn = self.get_client_type('cloudformation', self.region_name)
+        return self.conn
+
+    def get_stack_status(self, stack_name):
+        """
+        Get stack status from CloudFormation.
+        """
+        cloudformation = self.get_conn()
+
+        self.log.info('Poking for stack %s', stack_name)
+
+        try:
+            stacks = cloudformation.describe_stacks(StackName=stack_name)['Stacks']
+            return stacks[0]['StackStatus']
+        except ClientError as e:
+            if 'does not exist' in str(e):
+                return None
+            else:
+                raise e
+
+    def create_stack(self, stack_name, params):
+        """
+        Create stack in CloudFormation.
+
+        :param stack_name: stack_name.
+        :type stack_name: str
+        :param params: parameters to be passed to CloudFormation.
+        :type: dict
 
 Review comment:
   ```suggestion
   :type params: dict
   ```




[GitHub] [airflow] feluelle commented on a change in pull request #6824: [AIRFLOW-6258] add CloudFormation operators to AWS providers

2020-01-27 Thread GitBox
feluelle commented on a change in pull request #6824: [AIRFLOW-6258] add 
CloudFormation operators to AWS providers
URL: https://github.com/apache/airflow/pull/6824#discussion_r371218379
 
 

 ##
 File path: airflow/providers/amazon/aws/hooks/cloud_formation.py
 ##
 @@ -0,0 +1,88 @@
+# -*- coding: utf-8 -*-
+#
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+
+"""
+This module contains AWS CloudFormation Hook
+"""
+from botocore.exceptions import ClientError
+
+from airflow.contrib.hooks.aws_hook import AwsHook
+
+
+class AWSCloudFormationHook(AwsHook):
+    """
+    Interact with AWS CloudFormation.
+    """
+
+    def __init__(self, region_name=None, *args, **kwargs):
+        self.region_name = region_name
+        self.conn = None
+        super().__init__(*args, **kwargs)
+
+    def get_conn(self):
+        if not self.conn:
+            self.conn = self.get_client_type('cloudformation', self.region_name)
+        return self.conn
+
+    def get_stack_status(self, stack_name):
+        """
+        Get stack status from CloudFormation.
+        """
+        cloudformation = self.get_conn()
+
+        self.log.info('Poking for stack %s', stack_name)
+
+        try:
+            stacks = cloudformation.describe_stacks(StackName=stack_name)['Stacks']
+            return stacks[0]['StackStatus']
+        except ClientError as e:
+            if 'does not exist' in str(e):
+                return None
+            else:
+                raise e
 
 Review comment:
   Hm ya then let's simply stick with it - so no change needed from your side :)
   
   Thanks for the explanation.  





[GitHub] [airflow] feluelle commented on a change in pull request #6824: [AIRFLOW-6258] add CloudFormation operators to AWS providers

2020-01-27 Thread GitBox
feluelle commented on a change in pull request #6824: [AIRFLOW-6258] add 
CloudFormation operators to AWS providers
URL: https://github.com/apache/airflow/pull/6824#discussion_r371217057
 
 

 ##
 File path: airflow/providers/amazon/aws/hooks/cloud_formation.py
 ##
 @@ -0,0 +1,88 @@
+# -*- coding: utf-8 -*-
+#
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+
+"""
+This module contains AWS CloudFormation Hook
+"""
+from botocore.exceptions import ClientError
+
+from airflow.contrib.hooks.aws_hook import AwsHook
+
+
+class AWSCloudFormationHook(AwsHook):
+    """
+    Interact with AWS CloudFormation.
+    """
+
+    def __init__(self, region_name=None, *args, **kwargs):
+        self.region_name = region_name
+        self.conn = None
+        super().__init__(*args, **kwargs)
+
+    def get_conn(self):
+        if not self.conn:
+            self.conn = self.get_client_type('cloudformation', self.region_name)
+        return self.conn
+
+    def get_stack_status(self, stack_name):
+        """
+        Get stack status from CloudFormation.
+        """
+        cloudformation = self.get_conn()
+
+        self.log.info('Poking for stack %s', stack_name)
+
+        try:
+            stacks = cloudformation.describe_stacks(StackName=stack_name)['Stacks']
+            return stacks[0]['StackStatus']
+        except ClientError as e:
+            if 'does not exist' in str(e):
+                return None
+            else:
+                raise e
 
 Review comment:
   Ah you mean we still need to check the `if 'does not exist' in str(e):` ?


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [airflow] feluelle commented on a change in pull request #6824: [AIRFLOW-6258] add CloudFormation operators to AWS providers

2020-01-27 Thread GitBox
feluelle commented on a change in pull request #6824: [AIRFLOW-6258] add 
CloudFormation operators to AWS providers
URL: https://github.com/apache/airflow/pull/6824#discussion_r371216580
 
 

 ##
 File path: airflow/providers/amazon/aws/hooks/cloud_formation.py
 ##
 @@ -0,0 +1,88 @@
+# -*- coding: utf-8 -*-
+#
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+
+"""
+This module contains AWS CloudFormation Hook
+"""
+from botocore.exceptions import ClientError
+
+from airflow.contrib.hooks.aws_hook import AwsHook
+
+
+class AWSCloudFormationHook(AwsHook):
+    """
+    Interact with AWS CloudFormation.
+    """
+
+    def __init__(self, region_name=None, *args, **kwargs):
+        self.region_name = region_name
+        self.conn = None
+        super().__init__(*args, **kwargs)
+
+    def get_conn(self):
+        if not self.conn:
+            self.conn = self.get_client_type('cloudformation', self.region_name)
+        return self.conn
+
+    def get_stack_status(self, stack_name):
+        """
+        Get stack status from CloudFormation.
+        """
+        cloudformation = self.get_conn()
+
+        self.log.info('Poking for stack %s', stack_name)
+
+        try:
+            stacks = cloudformation.describe_stacks(StackName=stack_name)['Stacks']
+            return stacks[0]['StackStatus']
+        except ClientError as e:
+            if 'does not exist' in str(e):
+                return None
+            else:
+                raise e
 
 Review comment:
   In my opinion going for the error code is far better than checking the 
message string.


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [airflow] codecov-io commented on issue #7251: [AIRFLOW-6628] Fix search auto complete behaviour

2020-01-27 Thread GitBox
codecov-io commented on issue #7251: [AIRFLOW-6628] Fix search auto complete 
behaviour
URL: https://github.com/apache/airflow/pull/7251#issuecomment-578727497
 
 
   # [Codecov](https://codecov.io/gh/apache/airflow/pull/7251?src=pr=h1) 
Report
   > Merging 
[#7251](https://codecov.io/gh/apache/airflow/pull/7251?src=pr=desc) into 
[master](https://codecov.io/gh/apache/airflow/commit/26c0c53e2009d1d4aaefbd5666f2aea97d7f360f?src=pr=desc)
 will **increase** coverage by `0.08%`.
   > The diff coverage is `86.66%`.
   
   [![Impacted file tree 
graph](https://codecov.io/gh/apache/airflow/pull/7251/graphs/tree.svg?width=650=WdLKlKHOAU=150=pr)](https://codecov.io/gh/apache/airflow/pull/7251?src=pr=tree)
   
   ```diff
   @@Coverage Diff @@
   ##   master#7251  +/-   ##
   ==
   + Coverage   85.34%   85.43%   +0.08% 
   ==
 Files 791  822  +31 
 Lines   4012840270 +142 
   ==
   + Hits3424934406 +157 
   + Misses   5879 5864  -15
   ```
   
   
   | [Impacted 
Files](https://codecov.io/gh/apache/airflow/pull/7251?src=pr=tree) | 
Coverage Δ | |
   |---|---|---|
   | 
[airflow/www/security.py](https://codecov.io/gh/apache/airflow/pull/7251/diff?src=pr=tree#diff-YWlyZmxvdy93d3cvc2VjdXJpdHkucHk=)
 | `91.5% <ø> (ø)` | :arrow_up: |
   | 
[airflow/www/views.py](https://codecov.io/gh/apache/airflow/pull/7251/diff?src=pr=tree#diff-YWlyZmxvdy93d3cvdmlld3MucHk=)
 | `76.1% <86.66%> (+0.02%)` | :arrow_up: |
   | 
[airflow/plugins\_manager.py](https://codecov.io/gh/apache/airflow/pull/7251/diff?src=pr=tree#diff-YWlyZmxvdy9wbHVnaW5zX21hbmFnZXIucHk=)
 | `87.57% <0%> (-3.11%)` | :arrow_down: |
   | 
[airflow/utils/dag\_processing.py](https://codecov.io/gh/apache/airflow/pull/7251/diff?src=pr=tree#diff-YWlyZmxvdy91dGlscy9kYWdfcHJvY2Vzc2luZy5weQ==)
 | `88.19% <0%> (-0.39%)` | :arrow_down: |
   | 
[airflow/contrib/hooks/datadog\_hook.py](https://codecov.io/gh/apache/airflow/pull/7251/diff?src=pr=tree#diff-YWlyZmxvdy9jb250cmliL2hvb2tzL2RhdGFkb2dfaG9vay5weQ==)
 | `100% <0%> (ø)` | :arrow_up: |
   | 
[airflow/contrib/hooks/cloudant\_hook.py](https://codecov.io/gh/apache/airflow/pull/7251/diff?src=pr=tree#diff-YWlyZmxvdy9jb250cmliL2hvb2tzL2Nsb3VkYW50X2hvb2sucHk=)
 | `100% <0%> (ø)` | :arrow_up: |
   | 
[airflow/operators/slack\_operator.py](https://codecov.io/gh/apache/airflow/pull/7251/diff?src=pr=tree#diff-YWlyZmxvdy9vcGVyYXRvcnMvc2xhY2tfb3BlcmF0b3IucHk=)
 | `100% <0%> (ø)` | :arrow_up: |
   | 
[airflow/contrib/operators/vertica\_operator.py](https://codecov.io/gh/apache/airflow/pull/7251/diff?src=pr=tree#diff-YWlyZmxvdy9jb250cmliL29wZXJhdG9ycy92ZXJ0aWNhX29wZXJhdG9yLnB5)
 | `100% <0%> (ø)` | :arrow_up: |
   | 
[airflow/contrib/operators/dingding\_operator.py](https://codecov.io/gh/apache/airflow/pull/7251/diff?src=pr=tree#diff-YWlyZmxvdy9jb250cmliL29wZXJhdG9ycy9kaW5nZGluZ19vcGVyYXRvci5weQ==)
 | `100% <0%> (ø)` | :arrow_up: |
   | 
[.../contrib/operators/segment\_track\_event\_operator.py](https://codecov.io/gh/apache/airflow/pull/7251/diff?src=pr=tree#diff-YWlyZmxvdy9jb250cmliL29wZXJhdG9ycy9zZWdtZW50X3RyYWNrX2V2ZW50X29wZXJhdG9yLnB5)
 | `100% <0%> (ø)` | :arrow_up: |
   | ... and [63 
more](https://codecov.io/gh/apache/airflow/pull/7251/diff?src=pr=tree-more) 
| |
   
   --
   
   [Continue to review full report at 
Codecov](https://codecov.io/gh/apache/airflow/pull/7251?src=pr=continue).
   > **Legend** - [Click here to learn 
more](https://docs.codecov.io/docs/codecov-delta)
   > `Δ = absolute  (impact)`, `ø = not affected`, `? = missing data`
   > Powered by 
[Codecov](https://codecov.io/gh/apache/airflow/pull/7251?src=pr=footer). 
Last update 
[26c0c53...6223a95](https://codecov.io/gh/apache/airflow/pull/7251?src=pr=lastupdated).
 Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments).
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [airflow] codecov-io edited a comment on issue #7251: [AIRFLOW-6628] Fix search auto complete behaviour

2020-01-27 Thread GitBox
codecov-io edited a comment on issue #7251: [AIRFLOW-6628] Fix search auto 
complete behaviour
URL: https://github.com/apache/airflow/pull/7251#issuecomment-578727497
 
 


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [airflow] peter-volkov commented on a change in pull request #7252: [AIRFLOW-6531] Initial Yandex.Cloud Dataproc support

2020-01-27 Thread GitBox
peter-volkov commented on a change in pull request #7252: [AIRFLOW-6531] 
Initial Yandex.Cloud Dataproc support
URL: https://github.com/apache/airflow/pull/7252#discussion_r371186897
 
 

 ##
 File path: airflow/contrib/hooks/yandexcloud_dataproc_hook.py
 ##
 @@ -0,0 +1,571 @@
+# -*- coding: utf-8 -*-
+#
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+#
+
+import random
+from datetime import datetime
+
+import yandex.cloud.dataproc.v1.cluster_pb2 as cluster_pb
+import yandex.cloud.dataproc.v1.cluster_service_pb2 as cluster_service_pb
+import yandex.cloud.dataproc.v1.cluster_service_pb2_grpc as cluster_service_grpc_pb
+import yandex.cloud.dataproc.v1.common_pb2 as common_pb
+import yandex.cloud.dataproc.v1.job_pb2 as job_pb
+import yandex.cloud.dataproc.v1.job_service_pb2 as job_service_pb
+import yandex.cloud.dataproc.v1.job_service_pb2_grpc as job_service_grpc_pb
+import yandex.cloud.dataproc.v1.subcluster_pb2 as subcluster_pb
+import yandex.cloud.dataproc.v1.subcluster_service_pb2 as subcluster_service_pb
+import yandex.cloud.dataproc.v1.subcluster_service_pb2_grpc as subcluster_service_grpc_pb
+from google.protobuf.field_mask_pb2 import FieldMask
+from six import string_types
+
+from airflow.contrib.hooks.yandexcloud_base_hook import YandexCloudBaseHook
+from airflow.exceptions import AirflowException
+
+
+class DataprocHook(YandexCloudBaseHook):
+    """
+    A base hook for Yandex.Cloud Data Proc.
+
+    :param connection_id: The connection ID to use when fetching connection info.
+    :type connection_id: str
+    """
+
+    def __init__(self, *args, **kwargs):
+        super(DataprocHook, self).__init__(*args, **kwargs)
+        self.cluster_id = None
+
+    def _get_operation_result(self, operation, response_type=None, meta_type=None):
+        message = 'Running Yandex.Cloud operation. ID: {}. Description: {}. Created at: {}. Created by: {}.'
+        message = message.format(
+            operation.id,
+            operation.description,
+            datetime.fromtimestamp(operation.created_at.seconds),
+            operation.created_by,
+        )
+        if meta_type:
+            unpacked_meta = meta_type()
+            operation.metadata.Unpack(unpacked_meta)
+            message += ' Meta: {}.'.format(unpacked_meta)
+        self.log.info(message)
+        result = self.wait_for_operation(operation)
+        if result.error and result.error.code:
+            error_message = 'Error Yandex.Cloud operation. ID: {}. Error code: {}. Details: {}. Message: {}.'
+            error_message = error_message.format(
+                result.id, result.error.code, result.error.details, result.error.message
+            )
+            self.log.error(error_message)
+            raise AirflowException(error_message)
+        else:
+            log_message = 'Done Yandex.Cloud operation. ID: {}.'.format(operation.id)
+            unpacked_response = None
+            if response_type:
+                unpacked_response = response_type()
+                result.response.Unpack(unpacked_response)
+                log_message += ' Response: {}.'.format(unpacked_response)
+            self.log.info(log_message)
+            if unpacked_response:
+                return unpacked_response
+            return None
+
+    def add_subcluster(
+        self,
+        cluster_id,
+        subcluster_type,
+        name,
+        subnet_id,
+        resource_preset='s2.small',
+        disk_size=15,
+        disk_type='network-ssd',
+        hosts_count=5,
+    ):
+        """
+        Add subcluster to Yandex.Cloud Data Proc cluster.
+
+        :param cluster_id: ID of the cluster.
+        :type cluster_id: str
+        :param name: Name of the subcluster. Must be unique in the cluster
+        :type name: str
+        :param subcluster_type: Type of the subcluster. Either "data" or "compute".
+        :type subcluster_type: str
+        :param subnet_id: Subnet ID of the cluster.
+        :type subnet_id: str
+        :param resource_preset: Resources preset (CPU+RAM configuration) for the nodes of the cluster.
+        :type resource_preset: str
+        :param disk_size: Storage size in GiB.
+        :type disk_size: int
+        :param disk_type:
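The `_get_operation_result` flow in the quoted hook (wait for a long-running operation, raise on error, otherwise unpack a typed response) can be sketched dependency-free as below. Plain dicts and a callable stand in for the gRPC operation objects and protobuf's `Any.Unpack()` used by the real hook; the names here are illustrative, not part of the PR.

```python
class OperationError(Exception):
    """Raised when the finished operation reports an error."""


def get_operation_result(operation, wait, response_type=None):
    """Block until `operation` completes and return its unpacked response.

    `wait` is a callable that blocks until the operation finishes and
    returns a dict with 'error' and 'response' keys (a stand-in for the
    hook's wait_for_operation).
    """
    result = wait(operation)          # blocks until the operation is done
    if result.get('error'):           # surface failures as exceptions
        raise OperationError('Operation {} failed: {}'.format(
            operation['id'], result['error']))
    if response_type is not None:
        # Stand-in for protobuf's result.response.Unpack(response_type()).
        return response_type(result['response'])
    return None
```

The real hook additionally logs metadata before waiting and unpacks typed protobuf messages; this sketch only captures the control flow.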

[GitHub] [airflow] peter-volkov commented on a change in pull request #7252: [AIRFLOW-6531] Initial Yandex.Cloud Dataproc support

2020-01-27 Thread GitBox
peter-volkov commented on a change in pull request #7252: [AIRFLOW-6531] 
Initial Yandex.Cloud Dataproc support
URL: https://github.com/apache/airflow/pull/7252#discussion_r371206590
 
 

 ##
 File path: airflow/contrib/operators/yandexcloud_base_operator.py
 ##
 @@ -0,0 +1,42 @@
+# -*- coding: utf-8 -*-
+#
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+#
+
+from airflow.contrib.hooks.yandexcloud_base_hook import YandexCloudBaseHook
+from airflow.models import BaseOperator
+from airflow.utils.decorators import apply_defaults
+
+
+class YandexCloudBaseOperator(BaseOperator):
+    """The base class for operators that poll on a Dataproc Operation."""
+    @apply_defaults
+    def __init__(self,
+                 folder_id=None,
+                 connection_id='yandexcloud_default',
+                 *args,
+                 **kwargs):
+        super(YandexCloudBaseOperator, self).__init__(*args, **kwargs)
 
 Review comment:
   Done


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [airflow] peter-volkov commented on a change in pull request #7252: [AIRFLOW-6531] Initial Yandex.Cloud Dataproc support

2020-01-27 Thread GitBox
peter-volkov commented on a change in pull request #7252: [AIRFLOW-6531] 
Initial Yandex.Cloud Dataproc support
URL: https://github.com/apache/airflow/pull/7252#discussion_r371206550
 
 

 ##
 File path: airflow/contrib/operators/yandexcloud_base_operator.py
 ##
 @@ -0,0 +1,42 @@
+# -*- coding: utf-8 -*-
+#
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+#
+
+from airflow.contrib.hooks.yandexcloud_base_hook import YandexCloudBaseHook
+from airflow.models import BaseOperator
+from airflow.utils.decorators import apply_defaults
+
+
+class YandexCloudBaseOperator(BaseOperator):
+    """The base class for operators that poll on a Dataproc Operation."""
+    @apply_defaults
+    def __init__(self,
+                 folder_id=None,
+                 connection_id='yandexcloud_default',
+                 *args,
+                 **kwargs):
+        super(YandexCloudBaseOperator, self).__init__(*args, **kwargs)
+        self.connection_id = connection_id
+        self.hook = YandexCloudBaseHook(
 
 Review comment:
   Done


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [airflow] nuclearpinguin commented on a change in pull request #7126: [AIRFLOW-6405] Add GCP BigQuery Table Upsert Operator

2020-01-27 Thread GitBox
nuclearpinguin commented on a change in pull request #7126: [AIRFLOW-6405] Add 
GCP BigQuery Table Upsert Operator
URL: https://github.com/apache/airflow/pull/7126#discussion_r371205169
 
 

 ##
 File path: airflow/gcp/hooks/bigquery.py
 ##
 @@ -1109,6 +1110,16 @@ def run_table_upsert(self, dataset_id: str, table_resource: Dict,
         """
         service = self.get_service()
         # check to see if the table exists
+        if 'tableReference' not in table_resource:
+            raise AirflowBadRequest(
+                '"tableReference" is required within table_resource parameter. '
+                'See https://cloud.google.com/bigquery/docs/reference/v2/tables#resource'
+            )
+        if 'tableId' not in table_resource["tableReference"]:
+            raise AirflowBadRequest(
+                '"tableId" is required within table_resource["tableReference"]. '
+                'See https://cloud.google.com/bigquery/docs/reference/v2/tables#resource'
+            )
 
 Review comment:
   I just wonder, does it break backward compatibility?
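The nested-key checks quoted above can be generalised as a small helper that walks a chain of required keys and raises one clear error naming the missing key. This is a hedged sketch, not code from the PR; `ValueError` stands in for Airflow's `AirflowBadRequest`, and `require_keys` is a hypothetical name.

```python
def require_keys(resource, *path):
    """Walk `path` through nested dicts, raising if any key is missing.

    Returns the value found at the end of the path.
    """
    node = resource
    for key in path:
        if not isinstance(node, dict) or key not in node:
            raise ValueError(
                '"{}" is required within the resource; '
                'see the API reference for the expected shape.'.format(key))
        node = node[key]
    return node

# e.g. require_keys(table_resource, 'tableReference', 'tableId')
```

One helper keeps the validation messages consistent as more required fields are added.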


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [airflow] nuclearpinguin commented on issue #6354: [AIRFLOW-5664] Store timestamps with microseconds precision in GCSToPSQL

2020-01-27 Thread GitBox
nuclearpinguin commented on issue #6354: [AIRFLOW-5664] Store timestamps with 
microseconds precision in GCSToPSQL
URL: https://github.com/apache/airflow/pull/6354#issuecomment-578713715
 
 
   @osule can you please rebase onto the new master? 


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[jira] [Created] (AIRFLOW-6648) Timeout Feature - Provided statistical solution to long running/stuck jobs and take appropriate actions

2020-01-27 Thread Golokesh Patra (Jira)
Golokesh Patra created AIRFLOW-6648:
---

 Summary: Timeout Feature - Provided statistical solution to long 
running/stuck jobs and take appropriate actions
 Key: AIRFLOW-6648
 URL: https://issues.apache.org/jira/browse/AIRFLOW-6648
 Project: Apache Airflow
  Issue Type: Improvement
  Components: aws, DAG, database, operators
Affects Versions: 1.10.0
 Environment: AWS Linux AMI -  Ubuntu 18.04.1 LTS (GNU/Linux 
4.15.0-1027-aws x86_64)
Reporter: Golokesh Patra
Assignee: Golokesh Patra
 Attachments: image-2020-01-27-17-07-51-822.png, 
image-2020-01-27-17-08-09-867.png, image-2020-01-27-17-08-33-088.png, 
image-2020-01-27-17-22-07-433.png, image2019-3-25_12-33-57.png

Sometimes, across different types of tasks/jobs,
 one might encounter issues where Airflow jobs/tasks get stuck while in the 
running state.
 Such issues leave the pipeline stalled for no apparent reason, blocking other 
jobs/tasks - a disaster when it happens in production.

This improvement aims not only to build on the timeout logic already in 
Airflow, but to make it more functional and automated.

*Diagrammatic explanation of the solution -* 
!image-2020-01-27-17-22-07-433.png!

*Detailed Theoretical Explanation -* 

With increasing data volume and task/job complexity, besides the growing load, 
memory leaks, stuck jobs, infrastructure issues, etc. may occur, producing 
unwanted results.
 On some days there may be more data, causing a steep jump in the job's 
duration; otherwise, growth is expected to be gradual.
 Sometimes jobs get stuck for various reasons and often require termination 
followed by a restart.
 So, we are trying to build logic that will automatically decide whether to:
 * _terminate the job_
 * _terminate and restart_
 * _terminate and mark as a failure so that downstream jobs don't get 
triggered_
 * _take no action and inform DevOps about the issue (manual action)_
 The question is, statistically, what is the most effective way to achieve 
the above outcomes.
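One minimal statistical sketch of the idea: derive a per-job threshold from historical run durations (mean plus k standard deviations) and classify a running task as stuck when it exceeds it. The 3-sigma cutoff and the function names are illustrative assumptions, not the ticket's final design.

```python
import statistics


def duration_threshold(history_seconds, k=3.0):
    """Upper bound on expected duration, from historical run durations."""
    mean = statistics.mean(history_seconds)
    stdev = statistics.pstdev(history_seconds)  # population std deviation
    return mean + k * stdev


def classify(running_seconds, history_seconds, k=3.0):
    """Return 'stuck' if the current run exceeds the derived threshold."""
    if running_seconds > duration_threshold(history_seconds, k):
        return 'stuck'
    return 'ok'
```

A real implementation would also need a minimum history size and could trigger one of the four actions above instead of returning a label.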

Let's consider two jobs, X and Y.

Jobs related Info -
 !image-2020-01-27-17-07-51-822.png!

!image-2020-01-27-17-08-09-867.png!

Then I was thinking of adding a new table, which would be structured as -

+Derived table-+ 
 !image-2020-01-27-17-08-33-088.png!

( The above Example is theoretical and actual implementation might differ )

*LIMITATIONS -* 
 # For now, we have only tested the above on EMR (personal use case)
 # Testing pending for Databricks (personal use case)

Please do suggest any other services where this needs/can be used.



--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[GitHub] [airflow] potiuk commented on issue #6455: [WIP][AIRFLOW-5777] Migrate AWS DynamoDB to /providers/aws [AIP-21]

2020-01-27 Thread GitBox
potiuk commented on issue #6455: [WIP][AIRFLOW-5777] Migrate AWS DynamoDB to 
/providers/aws [AIP-21]
URL: https://github.com/apache/airflow/pull/6455#issuecomment-578710607
 
 
   @mik-laj is doing automated migration now for all contrib packages - I guess 
this one can be closed?


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [airflow] aviemzur opened a new pull request #6824: [AIRFLOW-6258] add CloudFormation operators to AWS providers

2020-01-27 Thread GitBox
aviemzur opened a new pull request #6824: [AIRFLOW-6258] add CloudFormation 
operators to AWS providers
URL: https://github.com/apache/airflow/pull/6824
 
 
   Make sure you have checked _all_ steps below.
   
   ### Jira
   
   - [x] My PR addresses the following [Airflow 
Jira](https://issues.apache.org/jira/browse/AIRFLOW/) issues and references 
them in the PR title. For example, "\[AIRFLOW-6258\] My Airflow PR"
 - https://issues.apache.org/jira/browse/AIRFLOW-6258
 - In case you are fixing a typo in the documentation you can prepend your 
commit with \[AIRFLOW-XXX\], code changes always need a Jira issue.
 - In case you are proposing a fundamental code change, you need to create 
an Airflow Improvement Proposal 
([AIP](https://cwiki.apache.org/confluence/display/AIRFLOW/Airflow+Improvements+Proposals)).
 - In case you are adding a dependency, check if the license complies with 
the [ASF 3rd Party License 
Policy](https://www.apache.org/legal/resolved.html#category-x).
   
   ### Description
   
   - [x] Here are some details about my PR, including screenshots of any UI 
changes:
   
   Add 2 new operators: `CloudFormationCreateStackOperator` and 
`CloudFormationDeleteStackOperator`
   https://issues.apache.org/jira/browse/AIRFLOW-6258
   
   ### Tests
   
   - [x] My PR adds the following unit tests __OR__ does not need testing for 
this extremely good reason:
   
   `TestCloudFormationCreateStackOperator` and 
`TestCloudFormationDeleteStackOperator`
   
   ### Commits
   
   - [x] My commits all reference Jira issues in their subject lines, and I 
have squashed multiple commits if they address the same issue. In addition, my 
commits follow the guidelines from "[How to write a good git commit 
message](http://chris.beams.io/posts/git-commit/)":
 1. Subject is separated from body by a blank line
 1. Subject is limited to 50 characters (not including Jira issue reference)
 1. Subject does not end with a period
 1. Subject uses the imperative mood ("add", not "adding")
 1. Body wraps at 72 characters
 1. Body explains "what" and "why", not "how"
   
   ### Documentation
   
   - [x] In case of new functionality, my PR adds documentation that describes 
how to use it.
 - All the public functions and the classes in the PR contain docstrings 
that explain what it does
 - If you implement backwards incompatible changes, please leave a note in 
the [Updating.md](https://github.com/apache/airflow/blob/master/UPDATING.md) so 
we can assign it to an appropriate release
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[jira] [Commented] (AIRFLOW-6258) CloudFormation create_stack and delete_stack operators

2020-01-27 Thread ASF GitHub Bot (Jira)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-6258?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17024276#comment-17024276
 ] 

ASF GitHub Bot commented on AIRFLOW-6258:
-

aviemzur commented on pull request #6824: [AIRFLOW-6258] add CloudFormation 
operators to AWS providers
URL: https://github.com/apache/airflow/pull/6824
 
 
   Make sure you have checked _all_ steps below.
   
   ### Jira
   
   - [x] My PR addresses the following [Airflow 
Jira](https://issues.apache.org/jira/browse/AIRFLOW/) issues and references 
them in the PR title. For example, "\[AIRFLOW-6258\] My Airflow PR"
 - https://issues.apache.org/jira/browse/AIRFLOW-6258
 - In case you are fixing a typo in the documentation you can prepend your 
commit with \[AIRFLOW-XXX\], code changes always need a Jira issue.
 - In case you are proposing a fundamental code change, you need to create 
an Airflow Improvement Proposal 
([AIP](https://cwiki.apache.org/confluence/display/AIRFLOW/Airflow+Improvements+Proposals)).
 - In case you are adding a dependency, check if the license complies with 
the [ASF 3rd Party License 
Policy](https://www.apache.org/legal/resolved.html#category-x).
   
   ### Description
   
   - [x] Here are some details about my PR, including screenshots of any UI 
changes:
   
   Add 2 new operators: `CloudFormationCreateStackOperator` and 
`CloudFormationDeleteStackOperator`
   https://issues.apache.org/jira/browse/AIRFLOW-6258
   
   ### Tests
   
   - [x] My PR adds the following unit tests __OR__ does not need testing for 
this extremely good reason:
   
   `TestCloudFormationCreateStackOperator` and 
`TestCloudFormationDeleteStackOperator`
   
   ### Commits
   
   - [x] My commits all reference Jira issues in their subject lines, and I 
have squashed multiple commits if they address the same issue. In addition, my 
commits follow the guidelines from "[How to write a good git commit 
message](http://chris.beams.io/posts/git-commit/)":
 1. Subject is separated from body by a blank line
 1. Subject is limited to 50 characters (not including Jira issue reference)
 1. Subject does not end with a period
 1. Subject uses the imperative mood ("add", not "adding")
 1. Body wraps at 72 characters
 1. Body explains "what" and "why", not "how"
   
   ### Documentation
   
   - [x] In case of new functionality, my PR adds documentation that describes 
how to use it.
 - All the public functions and the classes in the PR contain docstrings 
that explain what it does
 - If you implement backwards incompatible changes, please leave a note in 
the [Updating.md](https://github.com/apache/airflow/blob/master/UPDATING.md) so 
we can assign it to an appropriate release
   
 

This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


> CloudFormation create_stack and delete_stack operators
> --
>
> Key: AIRFLOW-6258
> URL: https://issues.apache.org/jira/browse/AIRFLOW-6258
> Project: Apache Airflow
>  Issue Type: New Feature
>  Components: contrib
>Affects Versions: 1.10.6
>Reporter: Aviem Zur
>Assignee: Aviem Zur
>Priority: Major
>
> Add CloudFormation create_stack and delete_stack operators.
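For context, operators like the pair this PR adds would ultimately delegate to the CloudFormation `create_stack`/`delete_stack` client APIs. The sketch below is only an illustration of that shape, not the PR's actual implementation; the class names mirror the PR description, but the constructor signature and the injected `cloudformation_client` are hypothetical stand-ins:

```python
# Hypothetical sketch of a CloudFormation create/delete operator pair.
# A real Airflow operator would obtain the client via an AWS hook; here the
# client is injected so the sketch stays self-contained.

class CloudFormationCreateStackOperator:
    """Creates a CloudFormation stack from the given parameters (sketch)."""

    def __init__(self, stack_name, params):
        self.stack_name = stack_name
        self.params = dict(params)          # e.g. TemplateBody, OnFailure, ...
        self.params['StackName'] = stack_name

    def execute(self, cloudformation_client):
        # boto3's CloudFormation client exposes create_stack(**kwargs)
        return cloudformation_client.create_stack(**self.params)


class CloudFormationDeleteStackOperator:
    """Deletes a CloudFormation stack by name (sketch)."""

    def __init__(self, stack_name):
        self.stack_name = stack_name

    def execute(self, cloudformation_client):
        return cloudformation_client.delete_stack(StackName=self.stack_name)
```

The split into two operators lets a DAG create a stack at the start of a run and tear it down at the end as separate tasks.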



--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[GitHub] [airflow] aviemzur closed pull request #6824: [AIRFLOW-6258] add CloudFormation operators to AWS providers

2020-01-27 Thread GitBox
aviemzur closed pull request #6824: [AIRFLOW-6258] add CloudFormation operators 
to AWS providers
URL: https://github.com/apache/airflow/pull/6824
 
 
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[jira] [Commented] (AIRFLOW-6258) CloudFormation create_stack and delete_stack operators

2020-01-27 Thread ASF GitHub Bot (Jira)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-6258?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17024275#comment-17024275
 ] 

ASF GitHub Bot commented on AIRFLOW-6258:
-

aviemzur commented on pull request #6824: [AIRFLOW-6258] add CloudFormation 
operators to AWS providers
URL: https://github.com/apache/airflow/pull/6824
 
 
   
 

This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


> CloudFormation create_stack and delete_stack operators
> --
>
> Key: AIRFLOW-6258
> URL: https://issues.apache.org/jira/browse/AIRFLOW-6258
> Project: Apache Airflow
>  Issue Type: New Feature
>  Components: contrib
>Affects Versions: 1.10.6
>Reporter: Aviem Zur
>Assignee: Aviem Zur
>Priority: Major
>
> Add CloudFormation create_stack and delete_stack operators.



--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[GitHub] [airflow] peter-volkov commented on a change in pull request #7252: [AIRFLOW-6531] Initial Yandex.Cloud Dataproc support

2020-01-27 Thread GitBox
peter-volkov commented on a change in pull request #7252: [AIRFLOW-6531] 
Initial Yandex.Cloud Dataproc support
URL: https://github.com/apache/airflow/pull/7252#discussion_r371186897
 
 

 ##
 File path: airflow/contrib/hooks/yandexcloud_dataproc_hook.py
 ##
 @@ -0,0 +1,571 @@
+# -*- coding: utf-8 -*-
+#
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+#
+
+import random
+from datetime import datetime
+
+import yandex.cloud.dataproc.v1.cluster_pb2 as cluster_pb
+import yandex.cloud.dataproc.v1.cluster_service_pb2 as cluster_service_pb
+import yandex.cloud.dataproc.v1.cluster_service_pb2_grpc as cluster_service_grpc_pb
+import yandex.cloud.dataproc.v1.common_pb2 as common_pb
+import yandex.cloud.dataproc.v1.job_pb2 as job_pb
+import yandex.cloud.dataproc.v1.job_service_pb2 as job_service_pb
+import yandex.cloud.dataproc.v1.job_service_pb2_grpc as job_service_grpc_pb
+import yandex.cloud.dataproc.v1.subcluster_pb2 as subcluster_pb
+import yandex.cloud.dataproc.v1.subcluster_service_pb2 as subcluster_service_pb
+import yandex.cloud.dataproc.v1.subcluster_service_pb2_grpc as subcluster_service_grpc_pb
+from google.protobuf.field_mask_pb2 import FieldMask
+from six import string_types
+
+from airflow.contrib.hooks.yandexcloud_base_hook import YandexCloudBaseHook
+from airflow.exceptions import AirflowException
+
+
+class DataprocHook(YandexCloudBaseHook):
+    """
+    A base hook for Yandex.Cloud Data Proc.
+
+    :param connection_id: The connection ID to use when fetching connection info.
+    :type connection_id: str
+    """
+
+    def __init__(self, *args, **kwargs):
+        super(DataprocHook, self).__init__(*args, **kwargs)
+        self.cluster_id = None
+
+    def _get_operation_result(self, operation, response_type=None, meta_type=None):
+        message = 'Running Yandex.Cloud operation. ID: {}. Description: {}. Created at: {}. Created by: {}.'
+        message = message.format(
+            operation.id,
+            operation.description,
+            datetime.fromtimestamp(operation.created_at.seconds),
+            operation.created_by,
+        )
+        if meta_type:
+            unpacked_meta = meta_type()
+            operation.metadata.Unpack(unpacked_meta)
+            message += ' Meta: {}.'.format(unpacked_meta)
+        self.log.info(message)
+        result = self.wait_for_operation(operation)
+        if result.error and result.error.code:
+            error_message = 'Error Yandex.Cloud operation. ID: {}. Error code: {}. Details: {}. Message: {}.'
+            error_message = error_message.format(
+                result.id, result.error.code, result.error.details, result.error.message
+            )
+            self.log.error(error_message)
+            raise AirflowException(error_message)
+        else:
+            log_message = 'Done Yandex.Cloud operation. ID: {}.'.format(operation.id)
+            unpacked_response = None
+            if response_type:
+                unpacked_response = response_type()
+                result.response.Unpack(unpacked_response)
+                log_message += ' Response: {}.'.format(unpacked_response)
+            self.log.info(log_message)
+            if unpacked_response:
+                return unpacked_response
+            return None
+
+    def add_subcluster(
+        self,
+        cluster_id,
+        subcluster_type,
+        name,
+        subnet_id,
+        resource_preset='s2.small',
+        disk_size=15,
+        disk_type='network-ssd',
+        hosts_count=5,
+    ):
+        """
+        Add a subcluster to a Yandex.Cloud Data Proc cluster.
+
+        :param cluster_id: ID of the cluster.
+        :type cluster_id: str
+        :param name: Name of the subcluster. Must be unique in the cluster.
+        :type name: str
+        :param subcluster_type: Type of the subcluster. Either "data" or "compute".
+        :type subcluster_type: str
+        :param subnet_id: Subnet ID of the cluster.
+        :type subnet_id: str
+        :param resource_preset: Resources preset (CPU+RAM configuration) for the nodes of the cluster.
+        :type resource_preset: str
+        :param disk_size: Storage size in GiB.
+        :type disk_size: int
+        :param disk_type: 

[jira] [Assigned] (AIRFLOW-6647) Reduce the cluttering of Airflow UI by merging create and check into a single CHECK step.

2020-01-27 Thread Nidhi Chourasia (Jira)


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-6647?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Nidhi Chourasia reassigned AIRFLOW-6647:


Assignee: Nidhi Chourasia

> Reduce the cluttering of Airflow UI by merging create and check into a single 
> CHECK step.
> -
>
> Key: AIRFLOW-6647
> URL: https://issues.apache.org/jira/browse/AIRFLOW-6647
> Project: Apache Airflow
>  Issue Type: Improvement
>  Components: operators, ui
>Affects Versions: 1.10.0
> Environment:  Ubuntu 18.04.1 LTS (GNU/Linux 4.15.0-1027-aws x86_64)
>Reporter: Nidhi Chourasia
>Assignee: Nidhi Chourasia
>Priority: Minor
>
> This is another UI feature which makes the Airflow UI:
>  # Clutter free
>  # More readable
>  # More intuitive
> Presently on Airflow, for any job there are 2 steps – a CREATE step and a
> CHECK step.
> CREATE STEP – It only creates/runs the job as per the schedule and dies off
> immediately.
> CHECK STEP – After the create step, this step keeps tracking the status of
> the job (i.e. checks whether the job is in a RUNNING/FAILED/SUCCESSFUL
> state).
> We presently host approximately 40 jobs; before our enhancement there would
> have been 80 steps created in the form of a TREE (DAG), but after our
> enhancement the CREATE and CHECK steps have been merged on the UI side into
> a single CHECK step.
>  



--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[jira] [Created] (AIRFLOW-6647) Reduce the cluttering of Airflow UI by merging create and check into a single CHECK step.

2020-01-27 Thread Nidhi Chourasia (Jira)
Nidhi Chourasia created AIRFLOW-6647:


 Summary: Reduce the cluttering of Airflow UI by merging create and 
check into a single CHECK step.
 Key: AIRFLOW-6647
 URL: https://issues.apache.org/jira/browse/AIRFLOW-6647
 Project: Apache Airflow
  Issue Type: Improvement
  Components: operators, ui
Affects Versions: 1.10.0
 Environment:  Ubuntu 18.04.1 LTS (GNU/Linux 4.15.0-1027-aws x86_64)
Reporter: Nidhi Chourasia


This is another UI feature which makes the Airflow UI:
 # Clutter free

 # More readable

 # More intuitive

Presently on Airflow, for any job there are 2 steps – a CREATE step and a CHECK 
step.

CREATE STEP – It only creates/runs the job as per the schedule and dies off 
immediately.

CHECK STEP – After the create step, this step keeps tracking the status of the 
job (i.e. checks whether the job is in a RUNNING/FAILED/SUCCESSFUL state).

We presently host approximately 40 jobs; before our enhancement there would 
have been 80 steps created in the form of a TREE (DAG), but after our 
enhancement the CREATE and CHECK steps have been merged on the UI side into a 
single CHECK step.
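One way to realize the merge described above is to fold the submit call into the start of the status-checking task, so only a single CHECK task appears in the DAG. This is a minimal sketch of that idea, assuming a hypothetical backend client with `submit_job`/`get_job_status` methods (not the actual code behind this issue):

```python
# Sketch: collapse a CREATE + CHECK task pair into one CHECK step.
# `client`, `submit_job`, and `get_job_status` are hypothetical stand-ins
# for whatever backend the real jobs run on.

import time


def check_job(client, job_spec, poll_interval=1.0, timeout=10.0):
    """Submit the job, then poll until it reaches a terminal state."""
    job_id = client.submit_job(job_spec)          # the old CREATE step
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:            # the old CHECK step
        status = client.get_job_status(job_id)
        if status in ('SUCCESSFUL', 'FAILED'):
            return status
        time.sleep(poll_interval)
    raise TimeoutError('job %s did not finish in time' % job_id)
```

With this shape, 40 jobs produce 40 tasks in the tree view instead of 80, at the cost of the submit and poll phases no longer being retryable independently.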

 



--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[GitHub] [airflow] mik-laj commented on a change in pull request #6552: [AIRFLOW-5850] Capture task logs in DockerSwarmOperator

2020-01-27 Thread GitBox
mik-laj commented on a change in pull request #6552: [AIRFLOW-5850] Capture 
task logs in DockerSwarmOperator
URL: https://github.com/apache/airflow/pull/6552#discussion_r371165897
 
 

 ##
 File path: airflow/providers/docker/operators/docker_swarm.py
 ##
 @@ -123,11 +129,43 @@ def _run_image(self):
 
         self.log.info('Service started: %s', str(self.service))
 
-        status = None
         # wait for the service to start the task
         while not self.cli.tasks(filters={'service': self.service['ID']}):
             continue
-        while True:
+
+        logs = self.cli.service_logs(
+            self.service['ID'], follow=True, stdout=True, stderr=True, is_tty=self.tty
+        )
+        line = ''
+        _stream_logs = self.enable_logging  # Status of the service_logs' generator
+        while True:  # pylint: disable=too-many-nested-blocks
+            if self.enable_logging:
 
 Review comment:
   Why do we need so many nested blocks? Could you extract the logic into a new 
method that is only invoked when `self.enable_logging` is True? When 
`self.enable_logging` is False, there is also no need to call 
`self.cli.service_logs` at all.
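A possible refactor along the lines of this review suggestion (names are illustrative, not the PR's actual code) is to move the streaming loop into its own method guarded by an early return, so `service_logs` is never even requested when logging is disabled:

```python
# Illustrative sketch of the suggested refactor: the streaming logic lives in
# its own method and is skipped entirely when logging is disabled, so the
# service_logs generator is only created when it will actually be consumed.

class SwarmLogStreamer:
    def __init__(self, cli, service_id, enable_logging, log_fn=print):
        self.cli = cli
        self.service_id = service_id
        self.enable_logging = enable_logging
        self.log_fn = log_fn

    def stream_task_logs(self):
        """Forward service log frames to log_fn; return number of frames."""
        if not self.enable_logging:
            return 0  # early return: no nested blocks, no API call
        count = 0
        for frame in self.cli.service_logs(self.service_id, follow=True,
                                           stdout=True, stderr=True):
            self.log_fn(frame.rstrip('\n'))
            count += 1
        return count
```

The early return replaces one level of `if self.enable_logging:` nesting in the main loop, which is exactly the clutter the review comment objects to.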
   
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [airflow] akki edited a comment on issue #6552: [AIRFLOW-5850] Capture task logs in DockerSwarmOperator

2020-01-27 Thread GitBox
akki edited a comment on issue #6552: [AIRFLOW-5850] Capture task logs in 
DockerSwarmOperator
URL: https://github.com/apache/airflow/pull/6552#issuecomment-578582943
 
 
   Hi Airflow team
   I have rebased my work on top of master (again!) as suggested in the JIRA 
ticket recently. CI also seems green now.  
   
   Would anyone please help review and approve this PR?
   
   
   (PS - the postgres connection error in CI doesn't seem related to this PR)


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


  1   2   >