[jira] [Commented] (AIRFLOW-4076) Correct beeline_default port type in initdb function

2019-03-12 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-4076?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16791337#comment-16791337
 ] 

ASF GitHub Bot commented on AIRFLOW-4076:
-

zhongjiajie commented on pull request #4908: [AIRFLOW-4076] Correct port type 
of beeline_default
URL: https://github.com/apache/airflow/pull/4908
 
 
   airflow initdb will create the default beeline connection
   with port "1", but the airflow.models.connection
   port attribute is of Integer type. It's better to set
   the value as an int, matching the column type, even
   though it would be auto-converted.
   
   Make sure you have checked _all_ steps below.
   
   ### Jira
   
   - [ ] My PR addresses the following [Airflow 
Jira](https://issues.apache.org/jira/browse/AIRFLOW/) issues and references 
them in the PR title. For example, "\[AIRFLOW-XXX\] My Airflow PR"
 - https://issues.apache.org/jira/browse/AIRFLOW-XXX
 - In case you are fixing a typo in the documentation you can prepend your 
commit with \[AIRFLOW-XXX\], code changes always need a Jira issue.
 - In case you are proposing a fundamental code change, you need to create 
an Airflow Improvement 
Proposal([AIP](https://cwiki.apache.org/confluence/display/AIRFLOW/Airflow+Improvements+Proposals)).
   
   ### Description
   
   - [ ] Here are some details about my PR, including screenshots of any UI 
changes:
   
   ### Tests
   
   - [ ] My PR adds the following unit tests __OR__ does not need testing for 
this extremely good reason:
   
   ### Commits
   
   - [ ] My commits all reference Jira issues in their subject lines, and I 
have squashed multiple commits if they address the same issue. In addition, my 
commits follow the guidelines from "[How to write a good git commit 
message](http://chris.beams.io/posts/git-commit/)":
 1. Subject is separated from body by a blank line
 1. Subject is limited to 50 characters (not including Jira issue reference)
 1. Subject does not end with a period
 1. Subject uses the imperative mood ("add", not "adding")
 1. Body wraps at 72 characters
 1. Body explains "what" and "why", not "how"
   
   ### Documentation
   
   - [ ] In case of new functionality, my PR adds documentation that describes 
how to use it.
 - When adding new operators/hooks/sensors, the autoclass documentation 
generation needs to be added.
 - All the public functions and the classes in the PR contain docstrings 
that explain what it does
 - If you implement backwards incompatible changes, please leave a note in 
the [Updating.md](https://github.com/apache/airflow/blob/master/UPDATING.md) so 
we can assign it to an appropriate release
   
   ### Code Quality
   
   - [ ] Passes `flake8`
   
 

This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


> Correct beeline_default port type in initdb function
> 
>
> Key: AIRFLOW-4076
> URL: https://issues.apache.org/jira/browse/AIRFLOW-4076
> Project: Apache Airflow
>  Issue Type: Improvement
>  Components: db
>Affects Versions: 1.10.2
>Reporter: zhongjiajie
>Assignee: zhongjiajie
>Priority: Trivial
> Fix For: 1.10.3
>
>
> The `beeline_default` conn_id created by the command `airflow initdb` uses a
> String as the port type:
> {code:java}
> merge_conn(
>     Connection(
>         conn_id='beeline_default', conn_type='beeline', port="1", <== HERE
>         host='localhost', extra="{\"use_beeline\": true, \"auth\": \"\"}",
>         schema='default'))
> {code}
> but `airflow.models.connection` uses an int type to store the port:
> {code:java}
> class Connection(Base, LoggingMixin):
>     __tablename__ = "connection"
>
>     id = Column(Integer(), primary_key=True)
>     conn_id = Column(String(ID_LEN))
>     conn_type = Column(String(500))
>     host = Column(String(500))
>     schema = Column(String(500))
>     login = Column(String(500))
>     _password = Column('password', String(5000))
>     port = Column(Integer()) <== HERE
>     is_encrypted = Column(Boolean, unique=False, default=False)
>     is_extra_encrypted = Column(Boolean, unique=False, default=False)
>     _extra = Column('extra', String(5000))
> {code}
> The string could be converted to an int automatically, but I think it is better to pass int values.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[GitHub] [airflow] zhongjiajie opened a new pull request #4908: [AIRFLOW-4076] Correct port type of beeline_default

2019-03-12 Thread GitBox
zhongjiajie opened a new pull request #4908: [AIRFLOW-4076] Correct port type 
of beeline_default
URL: https://github.com/apache/airflow/pull/4908
 
 
   airflow initdb will create the default beeline connection
   with port "1", but the airflow.models.connection
   port attribute is of Integer type. It's better to set
   the value as an int, matching the column type, even
   though it would be auto-converted.
   
   Make sure you have checked _all_ steps below.
   
   ### Jira
   
   - [ ] My PR addresses the following [Airflow 
Jira](https://issues.apache.org/jira/browse/AIRFLOW/) issues and references 
them in the PR title. For example, "\[AIRFLOW-XXX\] My Airflow PR"
 - https://issues.apache.org/jira/browse/AIRFLOW-XXX
 - In case you are fixing a typo in the documentation you can prepend your 
commit with \[AIRFLOW-XXX\], code changes always need a Jira issue.
 - In case you are proposing a fundamental code change, you need to create 
an Airflow Improvement 
Proposal([AIP](https://cwiki.apache.org/confluence/display/AIRFLOW/Airflow+Improvements+Proposals)).
   
   ### Description
   
   - [ ] Here are some details about my PR, including screenshots of any UI 
changes:
   
   ### Tests
   
   - [ ] My PR adds the following unit tests __OR__ does not need testing for 
this extremely good reason:
   
   ### Commits
   
   - [ ] My commits all reference Jira issues in their subject lines, and I 
have squashed multiple commits if they address the same issue. In addition, my 
commits follow the guidelines from "[How to write a good git commit 
message](http://chris.beams.io/posts/git-commit/)":
 1. Subject is separated from body by a blank line
 1. Subject is limited to 50 characters (not including Jira issue reference)
 1. Subject does not end with a period
 1. Subject uses the imperative mood ("add", not "adding")
 1. Body wraps at 72 characters
 1. Body explains "what" and "why", not "how"
   
   ### Documentation
   
   - [ ] In case of new functionality, my PR adds documentation that describes 
how to use it.
 - When adding new operators/hooks/sensors, the autoclass documentation 
generation needs to be added.
 - All the public functions and the classes in the PR contain docstrings 
that explain what it does
 - If you implement backwards incompatible changes, please leave a note in 
the [Updating.md](https://github.com/apache/airflow/blob/master/UPDATING.md) so 
we can assign it to an appropriate release
   
   ### Code Quality
   
   - [ ] Passes `flake8`
   




[jira] [Created] (AIRFLOW-4076) Correct beeline_default port type in initdb function

2019-03-12 Thread zhongjiajie (JIRA)
zhongjiajie created AIRFLOW-4076:


 Summary: Correct beeline_default port type in initdb function
 Key: AIRFLOW-4076
 URL: https://issues.apache.org/jira/browse/AIRFLOW-4076
 Project: Apache Airflow
  Issue Type: Improvement
  Components: db
Affects Versions: 1.10.2
Reporter: zhongjiajie
Assignee: zhongjiajie
 Fix For: 1.10.3


The `beeline_default` conn_id created by the command `airflow initdb` uses a
String as the port type:
{code:java}
merge_conn(
    Connection(
        conn_id='beeline_default', conn_type='beeline', port="1", <== HERE
        host='localhost', extra="{\"use_beeline\": true, \"auth\": \"\"}",
        schema='default'))
{code}
but `airflow.models.connection` uses an int type to store the port:
{code:java}
class Connection(Base, LoggingMixin):
    __tablename__ = "connection"

    id = Column(Integer(), primary_key=True)
    conn_id = Column(String(ID_LEN))
    conn_type = Column(String(500))
    host = Column(String(500))
    schema = Column(String(500))
    login = Column(String(500))
    _password = Column('password', String(5000))
    port = Column(Integer()) <== HERE
    is_encrypted = Column(Boolean, unique=False, default=False)
    is_extra_encrypted = Column(Boolean, unique=False, default=False)
    _extra = Column('extra', String(5000))
{code}
The string could be converted to an int automatically, but I think it is better to pass int values.
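The fix the PR proposes is the small one implied above: pass the port as an int rather than a string. A minimal sketch of why the type matters, using a plain dict as a hypothetical stand-in for Airflow's `Connection(...)` call (the real code goes through SQLAlchemy, which only coerces on flush):

```python
def make_beeline_conn(port):
    # Hypothetical stand-in for the Connection(...) call in initdb;
    # real Airflow persists this through a SQLAlchemy Integer column.
    return {"conn_id": "beeline_default", "conn_type": "beeline",
            "host": "localhost", "schema": "default", "port": port}

before = make_beeline_conn("1")  # current initdb behaviour: str
after = make_beeline_conn(1)     # proposed fix: int, matching Column(Integer())

# Until the ORM flushes and coerces, the attribute keeps the passed-in
# type, so int comparisons against the str value silently fail:
assert before["port"] != 1
assert after["port"] == 1
```

Passing the int up front keeps the in-memory value consistent with the column type instead of relying on implicit coercion at flush time.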





[GitHub] [airflow] codecov-io edited a comment on issue #4396: [AIRFLOW-3585] - Add edges to database

2019-03-12 Thread GitBox
codecov-io edited a comment on issue #4396: [AIRFLOW-3585] - Add edges to 
database
URL: https://github.com/apache/airflow/pull/4396#issuecomment-450633941
 
 
   # [Codecov](https://codecov.io/gh/apache/airflow/pull/4396?src=pr=h1) 
Report
   > Merging 
[#4396](https://codecov.io/gh/apache/airflow/pull/4396?src=pr=desc) into 
[master](https://codecov.io/gh/apache/airflow/commit/e220ac5bcfd56f048f552f47e7e1c5813f7b928f?src=pr=desc)
 will **decrease** coverage by `1.35%`.
   > The diff coverage is `85.92%`.
   
   [![Impacted file tree 
graph](https://codecov.io/gh/apache/airflow/pull/4396/graphs/tree.svg?width=650=WdLKlKHOAU=150=pr)](https://codecov.io/gh/apache/airflow/pull/4396?src=pr=tree)
   
   ```diff
   @@            Coverage Diff             @@
   ##           master    #4396      +/-   ##
   ==========================================
   - Coverage    75.3%   73.94%   -1.36%
   ==========================================
     Files         450      451       +1
     Lines       29023    29117      +94
   ==========================================
   - Hits        21855    21532     -323
   - Misses       7168     7585     +417
   ```
   
   
   | [Impacted 
Files](https://codecov.io/gh/apache/airflow/pull/4396?src=pr=tree) | 
Coverage Δ | |
   |---|---|---|
   | 
[airflow/operators/subdag\_operator.py](https://codecov.io/gh/apache/airflow/pull/4396/diff?src=pr=tree#diff-YWlyZmxvdy9vcGVyYXRvcnMvc3ViZGFnX29wZXJhdG9yLnB5)
 | `90.9% <100%> (+0.58%)` | :arrow_up: |
   | 
[airflow/jobs.py](https://codecov.io/gh/apache/airflow/pull/4396/diff?src=pr=tree#diff-YWlyZmxvdy9qb2JzLnB5)
 | `76.29% <100%> (-0.57%)` | :arrow_down: |
   | 
[airflow/models/dag\_edge.py](https://codecov.io/gh/apache/airflow/pull/4396/diff?src=pr=tree#diff-YWlyZmxvdy9tb2RlbHMvZGFnX2VkZ2UucHk=)
 | `100% <100%> (ø)` | |
   | 
[airflow/www/views.py](https://codecov.io/gh/apache/airflow/pull/4396/diff?src=pr=tree#diff-YWlyZmxvdy93d3cvdmlld3MucHk=)
 | `75.41% <62.26%> (-0.94%)` | :arrow_down: |
   | 
[airflow/models/\_\_init\_\_.py](https://codecov.io/gh/apache/airflow/pull/4396/diff?src=pr=tree#diff-YWlyZmxvdy9tb2RlbHMvX19pbml0X18ucHk=)
 | `92.88% <93.02%> (-0.02%)` | :arrow_down: |
   | 
[airflow/operators/postgres\_operator.py](https://codecov.io/gh/apache/airflow/pull/4396/diff?src=pr=tree#diff-YWlyZmxvdy9vcGVyYXRvcnMvcG9zdGdyZXNfb3BlcmF0b3IucHk=)
 | `0% <0%> (-100%)` | :arrow_down: |
   | 
[airflow/operators/mysql\_operator.py](https://codecov.io/gh/apache/airflow/pull/4396/diff?src=pr=tree#diff-YWlyZmxvdy9vcGVyYXRvcnMvbXlzcWxfb3BlcmF0b3IucHk=)
 | `0% <0%> (-100%)` | :arrow_down: |
   | 
[airflow/operators/mysql\_to\_hive.py](https://codecov.io/gh/apache/airflow/pull/4396/diff?src=pr=tree#diff-YWlyZmxvdy9vcGVyYXRvcnMvbXlzcWxfdG9faGl2ZS5weQ==)
 | `0% <0%> (-100%)` | :arrow_down: |
   | 
[airflow/contrib/hooks/winrm\_hook.py](https://codecov.io/gh/apache/airflow/pull/4396/diff?src=pr=tree#diff-YWlyZmxvdy9jb250cmliL2hvb2tzL3dpbnJtX2hvb2sucHk=)
 | `0% <0%> (-100%)` | :arrow_down: |
   | 
[airflow/operators/generic\_transfer.py](https://codecov.io/gh/apache/airflow/pull/4396/diff?src=pr=tree#diff-YWlyZmxvdy9vcGVyYXRvcnMvZ2VuZXJpY190cmFuc2Zlci5weQ==)
 | `0% <0%> (-100%)` | :arrow_down: |
   | ... and [68 
more](https://codecov.io/gh/apache/airflow/pull/4396/diff?src=pr=tree-more) 
| |
   
   --
   
   [Continue to review full report at 
Codecov](https://codecov.io/gh/apache/airflow/pull/4396?src=pr=continue).
   > **Legend** - [Click here to learn 
more](https://docs.codecov.io/docs/codecov-delta)
   > `Δ = absolute <relative> (impact)`, `ø = not affected`, `? = missing data`
   > Powered by 
[Codecov](https://codecov.io/gh/apache/airflow/pull/4396?src=pr=footer). 
Last update 
[e220ac5...f58f552](https://codecov.io/gh/apache/airflow/pull/4396?src=pr=lastupdated).
 Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments).
   




[jira] [Assigned] (AIRFLOW-4074) Cannot put labels on Cloud Dataproc jobs

2019-03-12 Thread Ryan Yuan (JIRA)


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-4074?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Ryan Yuan reassigned AIRFLOW-4074:
--

Assignee: Ryan Yuan

> Cannot put labels on Cloud Dataproc jobs
> 
>
> Key: AIRFLOW-4074
> URL: https://issues.apache.org/jira/browse/AIRFLOW-4074
> Project: Apache Airflow
>  Issue Type: Wish
>  Components: operators
>Affects Versions: 1.10.2
>Reporter: Andri Renardi Lauw
>Assignee: Ryan Yuan
>Priority: Major
>  Labels: dataproc
>
> Hi
> After looking at 
> [https://github.com/apache/airflow/blob/master/airflow/contrib/operators/dataproc_operator.py]
> on PySparkDataprocOperator, I realize that there is no way to put labels on 
> invoked Dataproc jobs here unlike when submitting through GCP console or 
> command line job submission.
> Is it possible to add this functionality?





[GitHub] [airflow] XD-DENG commented on issue #4141: [AIRFLOW-4075] minikube enviroment lack of init airflow db step

2019-03-12 Thread GitBox
XD-DENG commented on issue #4141:  [AIRFLOW-4075] minikube enviroment lack of 
init airflow db step
URL: https://github.com/apache/airflow/pull/4141#issuecomment-472273450
 
 
   @jie8357IOII  These scripts are for CI usage only. I don't think you're 
supposed to deploy airflow on minikube using them.




[jira] [Commented] (AIRFLOW-4073) Add supporting of templated file of Athena Operator

2019-03-12 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-4073?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16791262#comment-16791262
 ] 

ASF GitHub Bot commented on AIRFLOW-4073:
-

feng-tao commented on pull request #4907: [AIRFLOW-4073] add template_ext for 
AWS Athena operator
URL: https://github.com/apache/airflow/pull/4907
 
 
   
 



> Add supporting of templated file of Athena Operator
> ---
>
> Key: AIRFLOW-4073
> URL: https://issues.apache.org/jira/browse/AIRFLOW-4073
> Project: Apache Airflow
>  Issue Type: Improvement
>  Components: aws
>Reporter: Bryan Yang
>Assignee: Bryan Yang
>Priority: Minor
>   Original Estimate: 24h
>  Remaining Estimate: 24h
>
> Unlike other SQL operators (e.g. hive, bigquery), the AWS Athena operator 
> doesn't support template_ext yet. I'll add template_ext to the operator.
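For context, `template_ext` on an Airflow operator is a class-level tuple of file extensions; when a templated field's value ends with one of them, Airflow loads that file and renders it with Jinja. A minimal sketch of the mechanism (the class below is a toy illustration, not the actual PR diff):

```python
class ToyAthenaOperator:
    # constructor arguments that Airflow runs through Jinja
    template_fields = ('query',)
    # the attribute the PR adds: values ending in .sql are read from file
    template_ext = ('.sql',)

    def __init__(self, query):
        self.query = query

op = ToyAthenaOperator(query='reports/daily.sql')
# Airflow checks roughly this condition before loading the file:
assert op.query.endswith(ToyAthenaOperator.template_ext)
```

With the attribute in place, `query='my_query.sql'` refers to a templated file instead of being sent to Athena as a literal string.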





[jira] [Resolved] (AIRFLOW-4073) Add supporting of templated file of Athena Operator

2019-03-12 Thread Tao Feng (JIRA)


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-4073?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Tao Feng resolved AIRFLOW-4073.
---
Resolution: Fixed

> Add supporting of templated file of Athena Operator
> ---
>
> Key: AIRFLOW-4073
> URL: https://issues.apache.org/jira/browse/AIRFLOW-4073
> Project: Apache Airflow
>  Issue Type: Improvement
>  Components: aws
>Reporter: Bryan Yang
>Assignee: Bryan Yang
>Priority: Minor
>   Original Estimate: 24h
>  Remaining Estimate: 24h
>
> Unlike other SQL operators (e.g. hive, bigquery), the AWS Athena operator 
> doesn't support template_ext yet. I'll add template_ext to the operator.





[jira] [Commented] (AIRFLOW-4073) Add supporting of templated file of Athena Operator

2019-03-12 Thread ASF subversion and git services (JIRA)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-4073?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16791263#comment-16791263
 ] 

ASF subversion and git services commented on AIRFLOW-4073:
--

Commit 17eb94f723502d8bbc716763a7513db0a8afa52d in airflow's branch 
refs/heads/master from Bryan Yang
[ https://gitbox.apache.org/repos/asf?p=airflow.git;h=17eb94f ]

[AIRFLOW-4073] add template_ext for AWS Athena operator (#4907)



> Add supporting of templated file of Athena Operator
> ---
>
> Key: AIRFLOW-4073
> URL: https://issues.apache.org/jira/browse/AIRFLOW-4073
> Project: Apache Airflow
>  Issue Type: Improvement
>  Components: aws
>Reporter: Bryan Yang
>Assignee: Bryan Yang
>Priority: Minor
>   Original Estimate: 24h
>  Remaining Estimate: 24h
>
> Unlike other SQL operators (e.g. hive, bigquery), the AWS Athena operator 
> doesn't support template_ext yet. I'll add template_ext to the operator.





[GitHub] [airflow] feng-tao merged pull request #4907: [AIRFLOW-4073] add template_ext for AWS Athena operator

2019-03-12 Thread GitBox
feng-tao merged pull request #4907: [AIRFLOW-4073] add template_ext for AWS 
Athena operator
URL: https://github.com/apache/airflow/pull/4907
 
 
   




[GitHub] [airflow] zhongjiajie commented on a change in pull request #4903: [AIRFLOW-4069] Add Opsgenie Alert Hook and Operator

2019-03-12 Thread GitBox
zhongjiajie commented on a change in pull request #4903: [AIRFLOW-4069] Add 
Opsgenie Alert Hook and Operator
URL: https://github.com/apache/airflow/pull/4903#discussion_r264964975
 
 

 ##########
 File path: airflow/contrib/hooks/opsgenie_alert_hook.py
 ##########
 @@ -0,0 +1,83 @@
+# -*- coding: utf-8 -*-
+#
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+#
+
+import json
+
+from airflow.hooks.http_hook import HttpHook
+from airflow.exceptions import AirflowException
+
+
+class OpsgenieAlertHook(HttpHook):
+    """
+    This hook allows you to post alerts to Opsgenie.
+    Accepts a connection that has an Opsgenie API key as the connection's password.
+    Each Opsgenie API key can be pre-configured to a team integration.
+    You can override these defaults in this hook.
+
+    :param http_conn_id: Http connection ID with host as "https://api.opsgenie.com/"
+        and Opsgenie API key as the connection's password
+        (e.g. "eb243592-faa2-4ba2-a551q-1afdf565c889")
+    :type http_conn_id: str
+    :param payload: Opsgenie API Create Alert payload values
+        See https://docs.opsgenie.com/docs/alert-api#section-create-alert
+    :type payload: dict
+    :param proxy: Proxy to use to make the Opsgenie Alert API call
+    :type proxy: str
+    """
+    def __init__(self,
+                 http_conn_id=None,
+                 payload={},
+                 proxy=None,
+                 *args,
+                 **kwargs
+                 ):
+        super(OpsgenieAlertHook, self).__init__(*args, **kwargs)
+        self.http_conn_id = http_conn_id
+        self.api_key = None
+        self.payload = payload
+        self.proxy = proxy
+
+    def _get_api_key(self, http_conn_id):
+        """
+        Given a conn_id, return the api_key to use
+        :param http_conn_id: The conn_id provided
+        :type http_conn_id: str
+        :return: api_key (str) to use
+        """
+        if http_conn_id:
+            conn = self.get_connection(http_conn_id)
+            return conn.password
+        else:
+            raise AirflowException('Cannot get api_key: No valid conn_id '
+                                   'supplied')
+
+    def execute(self):
+        """
+        Remote Popen (actually execute the Opsgenie Alert call)
+        """
+        proxies = {}
+        if self.proxy:
+            # we only need https proxy for Opsgenie, as the endpoint is https
+            proxies = {'https': self.proxy}
+        self.api_key = self.api_key or self._get_api_key(self.http_conn_id)
 
 Review comment:
   > In this case, I wasn't necessarily thinking of during the constructor but 
rather per `execute`.
   > 
   > For example, a use case is instantiating a single hook object and then 
calling `execute` more than once. The first call to `execute` will do the 
`get_connection`, subsequent calls will not.
   
   I'm not sure about that.
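The caching the reviewer describes is the `self.api_key = self.api_key or ...` idiom in `execute`. A toy sketch (not the real hook, names are illustrative) showing that only the first `execute` performs the connection lookup:

```python
class LazyKeyHook:
    """Toy model of the hook's lazy api_key caching."""
    def __init__(self):
        self.api_key = None
        self.lookups = 0  # counts simulated get_connection() calls

    def _get_api_key(self):
        self.lookups += 1  # stands in for the metadata-DB round trip
        return "secret-key"

    def execute(self):
        # fetch the key on first use only, then reuse the cached value
        self.api_key = self.api_key or self._get_api_key()
        return self.api_key

hook = LazyKeyHook()
hook.execute()
hook.execute()
assert hook.lookups == 1  # the second call reused the cached key
```

The trade-off under discussion: this saves repeated lookups per `execute`, at the cost of deferring any "missing conn_id" error from construction time to first execution.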




[GitHub] [airflow] zhongjiajie commented on issue #4903: [AIRFLOW-4069] Add Opsgenie Alert Hook and Operator

2019-03-12 Thread GitBox
zhongjiajie commented on issue #4903: [AIRFLOW-4069] Add Opsgenie Alert Hook 
and Operator
URL: https://github.com/apache/airflow/pull/4903#issuecomment-472266367
 
 
   cc @mik-laj we still have open discussions in 
https://github.com/apache/airflow/pull/4903#discussion_r264950380 and 
https://github.com/apache/airflow/pull/4903#discussion_r264949505.
   After we finish them, we could ask a committer to review this PR.




[GitHub] [airflow] zhongjiajie commented on a change in pull request #4903: [AIRFLOW-4069] Add Opsgenie Alert Hook and Operator

2019-03-12 Thread GitBox
zhongjiajie commented on a change in pull request #4903: [AIRFLOW-4069] Add 
Opsgenie Alert Hook and Operator
URL: https://github.com/apache/airflow/pull/4903#discussion_r264963604
 
 

 ##########
 File path: airflow/contrib/hooks/opsgenie_alert_hook.py
 ##########
 @@ -0,0 +1,83 @@
+# -*- coding: utf-8 -*-
+#
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+#
+
+import json
+
+from airflow.hooks.http_hook import HttpHook
+from airflow.exceptions import AirflowException
+
+
+class OpsgenieAlertHook(HttpHook):
+    """
+    This hook allows you to post alerts to Opsgenie.
+    Accepts a connection that has an Opsgenie API key as the connection's password.
+    Each Opsgenie API key can be pre-configured to a team integration.
+    You can override these defaults in this hook.
+
+    :param http_conn_id: Http connection ID with host as "https://api.opsgenie.com/"
+        and Opsgenie API key as the connection's password
+        (e.g. "eb243592-faa2-4ba2-a551q-1afdf565c889")
+    :type http_conn_id: str
+    :param payload: Opsgenie API Create Alert payload values
+        See https://docs.opsgenie.com/docs/alert-api#section-create-alert
+    :type payload: dict
+    :param proxy: Proxy to use to make the Opsgenie Alert API call
+    :type proxy: str
+    """
+    def __init__(self,
+                 http_conn_id=None,
+                 payload={},
+                 proxy=None,
+                 *args,
+                 **kwargs
+                 ):
+        super(OpsgenieAlertHook, self).__init__(*args, **kwargs)
+        self.http_conn_id = http_conn_id
+        self.api_key = None
+        self.payload = payload
+        self.proxy = proxy
+
+    def _get_api_key(self, http_conn_id):
 
 Review comment:
   Maybe we should ask someone else.




[GitHub] [airflow] zhongjiajie commented on a change in pull request #4903: [AIRFLOW-4069] Add Opsgenie Alert Hook and Operator

2019-03-12 Thread GitBox
zhongjiajie commented on a change in pull request #4903: [AIRFLOW-4069] Add 
Opsgenie Alert Hook and Operator
URL: https://github.com/apache/airflow/pull/4903#discussion_r264963604
 
 

 ##########
 File path: airflow/contrib/hooks/opsgenie_alert_hook.py
 ##########
 @@ -0,0 +1,83 @@
+# -*- coding: utf-8 -*-
+#
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+#
+
+import json
+
+from airflow.hooks.http_hook import HttpHook
+from airflow.exceptions import AirflowException
+
+
+class OpsgenieAlertHook(HttpHook):
+    """
+    This hook allows you to post alerts to Opsgenie.
+    Accepts a connection that has an Opsgenie API key as the connection's password.
+    Each Opsgenie API key can be pre-configured to a team integration.
+    You can override these defaults in this hook.
+
+    :param http_conn_id: Http connection ID with host as "https://api.opsgenie.com/"
+        and Opsgenie API key as the connection's password
+        (e.g. "eb243592-faa2-4ba2-a551q-1afdf565c889")
+    :type http_conn_id: str
+    :param payload: Opsgenie API Create Alert payload values
+        See https://docs.opsgenie.com/docs/alert-api#section-create-alert
+    :type payload: dict
+    :param proxy: Proxy to use to make the Opsgenie Alert API call
+    :type proxy: str
+    """
+    def __init__(self,
+                 http_conn_id=None,
+                 payload={},
+                 proxy=None,
+                 *args,
+                 **kwargs
+                 ):
+        super(OpsgenieAlertHook, self).__init__(*args, **kwargs)
+        self.http_conn_id = http_conn_id
+        self.api_key = None
+        self.payload = payload
+        self.proxy = proxy
+
+    def _get_api_key(self, http_conn_id):
 
 Review comment:
   Maybe we should ask for suggestions from someone else.




[jira] [Work started] (AIRFLOW-4075) minikube enviroment lack of init airflow db step

2019-03-12 Thread Jayce Li (JIRA)


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-4075?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Work on AIRFLOW-4075 started by Jayce Li.
-
> minikube enviroment lack of init airflow db step
> 
>
> Key: AIRFLOW-4075
> URL: https://issues.apache.org/jira/browse/AIRFLOW-4075
> Project: Apache Airflow
>  Issue Type: Bug
>  Components: ci, kubernetes
>Reporter: Jayce Li
>Assignee: Jayce Li
>Priority: Major
>  Labels: CI
>
> When using kubernetes/kube/deploy to deploy airflow on minikube, it always 
> fails in the init container, because the 'airflow' database is missing in 
> postgresql.





[jira] [Created] (AIRFLOW-4075) minikube enviroment lack of init airflow db step

2019-03-12 Thread Jayce Li (JIRA)
Jayce Li created AIRFLOW-4075:
-

 Summary: minikube enviroment lack of init airflow db step
 Key: AIRFLOW-4075
 URL: https://issues.apache.org/jira/browse/AIRFLOW-4075
 Project: Apache Airflow
  Issue Type: Bug
  Components: ci, kubernetes
Reporter: Jayce Li
Assignee: Jayce Li


When using kubernetes/kube/deploy to deploy airflow on minikube, it always fails 
in the init container, because the 'airflow' database is missing in postgresql.





[GitHub] [airflow] nritholtz commented on a change in pull request #4903: [AIRFLOW-4069] Add Opsgenie Alert Hook and Operator

2019-03-12 Thread GitBox
nritholtz commented on a change in pull request #4903: [AIRFLOW-4069] Add 
Opsgenie Alert Hook and Operator
URL: https://github.com/apache/airflow/pull/4903#discussion_r264961711
 
 

 ##########
 File path: airflow/contrib/hooks/opsgenie_alert_hook.py
 ##########
 @@ -0,0 +1,83 @@
+# -*- coding: utf-8 -*-
+#
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+#
+
+import json
+
+from airflow.hooks.http_hook import HttpHook
+from airflow.exceptions import AirflowException
+
+
+class OpsgenieAlertHook(HttpHook):
+    """
+    This hook allows you to post alerts to Opsgenie.
+    Accepts a connection that has an Opsgenie API key as the connection's password.
+    Each Opsgenie API key can be pre-configured to a team integration.
+    You can override these defaults in this hook.
+
+    :param http_conn_id: Http connection ID with host as "https://api.opsgenie.com/"
+        and Opsgenie API key as the connection's password
+        (e.g. "eb243592-faa2-4ba2-a551q-1afdf565c889")
+    :type http_conn_id: str
+    :param payload: Opsgenie API Create Alert payload values
+        See https://docs.opsgenie.com/docs/alert-api#section-create-alert
+    :type payload: dict
+    :param proxy: Proxy to use to make the Opsgenie Alert API call
+    :type proxy: str
+    """
+    def __init__(self,
+                 http_conn_id=None,
+                 payload={},
+                 proxy=None,
+                 *args,
+                 **kwargs
+                 ):
+        super(OpsgenieAlertHook, self).__init__(*args, **kwargs)
+        self.http_conn_id = http_conn_id
+        self.api_key = None
+        self.payload = payload
+        self.proxy = proxy
+
+    def _get_api_key(self, http_conn_id):
 
 Review comment:
   My original intention for this method dates from when I originally had the 
ability to explicitly pass the key. At this point, however, I see the current 
state of the method as giving better traceability for scenarios where the 
`http_conn_id` is not set or errors out (e.g. the connection doesn't exist).
   
   However, if you strongly believe in this change, I would make it along with 
the additional change of making `http_conn_id` a required parameter of the 
hook's constructor.
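   To make the trade-off concrete, here is a minimal standalone sketch (the 
`StrictHook` name is illustrative, not the hook's actual code) of how a 
required connection id moves the failure from `execute()` time to 
construction time:

```python
class StrictHook:
    """Hypothetical sketch: making the connection id a required
    constructor argument fails fast, at construction time, instead
    of surfacing a missing-connection error later in execute()."""

    def __init__(self, http_conn_id):
        if not http_conn_id:
            raise ValueError("http_conn_id is required")
        self.http_conn_id = http_conn_id
```

   The current lazy approach instead defers that error to the first call that 
needs the key, which preserves the traceability described above.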
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [airflow] nritholtz commented on a change in pull request #4903: [AIRFLOW-4069] Add Opsgenie Alert Hook and Operator

2019-03-12 Thread GitBox
nritholtz commented on a change in pull request #4903: [AIRFLOW-4069] Add 
Opsgenie Alert Hook and Operator
URL: https://github.com/apache/airflow/pull/4903#discussion_r264960756
 
 

 ##
 File path: airflow/contrib/hooks/opsgenie_alert_hook.py
 ##
 @@ -0,0 +1,83 @@
+# -*- coding: utf-8 -*-
+#
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+#
+
+import json
+
+from airflow.hooks.http_hook import HttpHook
+from airflow.exceptions import AirflowException
+
+
+class OpsgenieAlertHook(HttpHook):
+    """
+    This hook allows you to post alerts to Opsgenie.
+    Accepts a connection that has an Opsgenie API key as the connection's password.
+    Each Opsgenie API key can be pre-configured to a team integration.
+    You can override these defaults in this hook.
+
+    :param http_conn_id: Http connection ID with host as "https://api.opsgenie.com/"
+        and Opsgenie API key as the connection's password
+        (e.g. "eb243592-faa2-4ba2-a551q-1afdf565c889")
+    :type http_conn_id: str
+    :param payload: Opsgenie API Create Alert payload values
+        See https://docs.opsgenie.com/docs/alert-api#section-create-alert
+    :type payload: dict
+    :param proxy: Proxy to use to make the Opsgenie Alert API call
+    :type proxy: str
+    """
+    def __init__(self,
+                 http_conn_id=None,
+                 payload={},
+                 proxy=None,
+                 *args,
+                 **kwargs
+                 ):
+        super(OpsgenieAlertHook, self).__init__(*args, **kwargs)
+        self.http_conn_id = http_conn_id
+        self.api_key = None
+        self.payload = payload
+        self.proxy = proxy
+
+    def _get_api_key(self, http_conn_id):
+        """
+        Given a conn_id, return the api_key to use
+        :param http_conn_id: The conn_id provided
+        :type http_conn_id: str
+        :return: api_key (str) to use
+        """
+        if http_conn_id:
+            conn = self.get_connection(http_conn_id)
+            return conn.password
+        else:
+            raise AirflowException('Cannot get api_key: No valid conn_id '
+                                   'supplied')
+
+    def execute(self):
+        """
+        Remote Popen (actually execute the Opsgenie Alert call)
+        """
+        proxies = {}
+        if self.proxy:
+            # we only need https proxy for Opsgenie, as the endpoint is https
+            proxies = {'https': self.proxy}
+        self.api_key = self.api_key or self._get_api_key(self.http_conn_id)
 
 Review comment:
   In this case, I wasn't necessarily thinking of the constructor, but rather 
of each `execute`.
   
   For example, one use case is instantiating a single hook object and then 
calling `execute` more than once. The first call to `execute` will do the 
`get_connection`; subsequent calls will not.
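   A standalone sketch of that caching pattern (the `CachingHook` class and 
its `lookups` counter are illustrative stand-ins, not the hook's real code):

```python
class CachingHook:
    """Sketch of the pattern under discussion: cache the API key on the
    instance so the (potentially expensive) connection lookup runs once,
    even when execute() is called repeatedly on the same hook object."""

    def __init__(self, conn_id):
        self.conn_id = conn_id
        self.api_key = None
        self.lookups = 0  # instrumentation for this example only

    def _get_api_key(self, conn_id):
        # stands in for HttpHook.get_connection(conn_id).password
        self.lookups += 1
        return "key-for-" + conn_id

    def execute(self):
        # the first call populates the cache; later calls reuse it
        self.api_key = self.api_key or self._get_api_key(self.conn_id)
        return self.api_key
```

   Calling `execute()` three times on one instance performs exactly one lookup.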


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[jira] [Created] (AIRFLOW-4074) Cannot put labels on Cloud Dataproc jobs

2019-03-12 Thread Andri Renardi Lauw (JIRA)
Andri Renardi Lauw created AIRFLOW-4074:
---

 Summary: Cannot put labels on Cloud Dataproc jobs
 Key: AIRFLOW-4074
 URL: https://issues.apache.org/jira/browse/AIRFLOW-4074
 Project: Apache Airflow
  Issue Type: Wish
  Components: operators
Affects Versions: 1.10.2
Reporter: Andri Renardi Lauw


Hi

After looking at 
[https://github.com/apache/airflow/blob/master/airflow/contrib/operators/dataproc_operator.py]

on the PySpark Dataproc operator, I realize that there is no way to put labels 
on the invoked Dataproc jobs, unlike when submitting through the GCP console or 
command-line job submission.

Is it possible to add this functionality?
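The Dataproc jobs API accepts a `labels` map on the job resource, so a 
hypothetical sketch of what building such a request body could look like 
(`build_pyspark_job` and its parameters are illustrative, not the operator's 
actual signature):

```python
def build_pyspark_job(main_python_file_uri, labels=None):
    """Hypothetical helper: attach user-supplied labels to a Dataproc
    PySpark job request body, mirroring what the console and CLI allow."""
    job = {
        "pysparkJob": {"mainPythonFileUri": main_python_file_uri},
    }
    if labels:
        # e.g. {"team": "data-eng", "env": "prod"}
        job["labels"] = dict(labels)
    return job
```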



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[GitHub] [airflow] zhongjiajie commented on a change in pull request #4903: [AIRFLOW-4069] Add Opsgenie Alert Hook and Operator

2019-03-12 Thread GitBox
zhongjiajie commented on a change in pull request #4903: [AIRFLOW-4069] Add 
Opsgenie Alert Hook and Operator
URL: https://github.com/apache/airflow/pull/4903#discussion_r264957560
 
 

 ##
 File path: airflow/contrib/hooks/opsgenie_alert_hook.py
 ##
 @@ -0,0 +1,83 @@
+# -*- coding: utf-8 -*-
+#
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+#
+
+import json
+
+from airflow.hooks.http_hook import HttpHook
+from airflow.exceptions import AirflowException
+
+
+class OpsgenieAlertHook(HttpHook):
+    """
+    This hook allows you to post alerts to Opsgenie.
+    Accepts a connection that has an Opsgenie API key as the connection's password.
+    Each Opsgenie API key can be pre-configured to a team integration.
+    You can override these defaults in this hook.
+
+    :param http_conn_id: Http connection ID with host as "https://api.opsgenie.com/"
+        and Opsgenie API key as the connection's password
+        (e.g. "eb243592-faa2-4ba2-a551q-1afdf565c889")
+    :type http_conn_id: str
+    :param payload: Opsgenie API Create Alert payload values
+        See https://docs.opsgenie.com/docs/alert-api#section-create-alert
+    :type payload: dict
+    :param proxy: Proxy to use to make the Opsgenie Alert API call
+    :type proxy: str
+    """
+    def __init__(self,
+                 http_conn_id=None,
+                 payload={},
+                 proxy=None,
+                 *args,
+                 **kwargs
+                 ):
+        super(OpsgenieAlertHook, self).__init__(*args, **kwargs)
+        self.http_conn_id = http_conn_id
+        self.api_key = None
+        self.payload = payload
+        self.proxy = proxy
+
+    def _get_api_key(self, http_conn_id):
 
 Review comment:
   I think this review is acceptable. WDYT @nritholtz 


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [airflow] zhongjiajie commented on a change in pull request #4903: [AIRFLOW-4069] Add Opsgenie Alert Hook and Operator

2019-03-12 Thread GitBox
zhongjiajie commented on a change in pull request #4903: [AIRFLOW-4069] Add 
Opsgenie Alert Hook and Operator
URL: https://github.com/apache/airflow/pull/4903#discussion_r264956183
 
 

 ##
 File path: airflow/contrib/hooks/opsgenie_alert_hook.py
 ##
 @@ -0,0 +1,83 @@
+# -*- coding: utf-8 -*-
+#
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+#
+
+import json
+
+from airflow.hooks.http_hook import HttpHook
+from airflow.exceptions import AirflowException
+
+
+class OpsgenieAlertHook(HttpHook):
+    """
+    This hook allows you to post alerts to Opsgenie.
+    Accepts a connection that has an Opsgenie API key as the connection's password.
+    Each Opsgenie API key can be pre-configured to a team integration.
+    You can override these defaults in this hook.
+
+    :param http_conn_id: Http connection ID with host as "https://api.opsgenie.com/"
+        and Opsgenie API key as the connection's password
+        (e.g. "eb243592-faa2-4ba2-a551q-1afdf565c889")
+    :type http_conn_id: str
+    :param payload: Opsgenie API Create Alert payload values
+        See https://docs.opsgenie.com/docs/alert-api#section-create-alert
+    :type payload: dict
+    :param proxy: Proxy to use to make the Opsgenie Alert API call
+    :type proxy: str
+    """
+    def __init__(self,
+                 http_conn_id=None,
+                 payload={},
+                 proxy=None,
+                 *args,
+                 **kwargs
+                 ):
+        super(OpsgenieAlertHook, self).__init__(*args, **kwargs)
+        self.http_conn_id = http_conn_id
+        self.api_key = None
+        self.payload = payload
+        self.proxy = proxy
+
+    def _get_api_key(self, http_conn_id):
+        """
+        Given a conn_id, return the api_key to use
+        :param http_conn_id: The conn_id provided
+        :type http_conn_id: str
+        :return: api_key (str) to use
+        """
+        if http_conn_id:
+            conn = self.get_connection(http_conn_id)
+            return conn.password
+        else:
+            raise AirflowException('Cannot get api_key: No valid conn_id '
+                                   'supplied')
+
+    def execute(self):
+        """
+        Remote Popen (actually execute the Opsgenie Alert call)
+        """
+        proxies = {}
+        if self.proxy:
+            # we only need https proxy for Opsgenie, as the endpoint is https
+            proxies = {'https': self.proxy}
+        self.api_key = self.api_key or self._get_api_key(self.http_conn_id)
 
 Review comment:
   I don't think `get_connection` outside the constructor will be called many 
times by the scheduler, WDYT? 
   cc @mik-laj . 
   Only the constructor (meaning the `__init__` function) will be called many 
times.
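   A standalone sketch of why this matters (the `LazyConnHook` name and the 
dict standing in for a connection are illustrative assumptions, not Airflow 
code): `__init__` may run on every scheduler parse loop, so the expensive 
lookup is deferred to first use.

```python
class LazyConnHook:
    """Sketch: keep __init__ cheap, since the scheduler may construct
    the object repeatedly; defer the connection lookup to first use."""

    def __init__(self, conn_id):
        self.conn_id = conn_id  # cheap: just remember the id
        self._conn = None

    def get_conn(self):
        if self._conn is None:
            # stands in for the real get_connection() round-trip
            self._conn = {"conn_id": self.conn_id}
        return self._conn
```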


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [airflow] zhongjiajie commented on a change in pull request #4903: [AIRFLOW-4069] Add Opsgenie Alert Hook and Operator

2019-03-12 Thread GitBox
zhongjiajie commented on a change in pull request #4903: [AIRFLOW-4069] Add 
Opsgenie Alert Hook and Operator
URL: https://github.com/apache/airflow/pull/4903#discussion_r264956909
 
 

 ##
 File path: airflow/contrib/hooks/opsgenie_alert_hook.py
 ##
 @@ -0,0 +1,83 @@
+# -*- coding: utf-8 -*-
+#
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+#
+
+import json
+
+from airflow.hooks.http_hook import HttpHook
+from airflow.exceptions import AirflowException
+
+
+class OpsgenieAlertHook(HttpHook):
+    """
+    This hook allows you to post alerts to Opsgenie.
+    Accepts a connection that has an Opsgenie API key as the connection's password.
+    Each Opsgenie API key can be pre-configured to a team integration.
+    You can override these defaults in this hook.
+
+    :param http_conn_id: Http connection ID with host as "https://api.opsgenie.com/"
+        and Opsgenie API key as the connection's password
+        (e.g. "eb243592-faa2-4ba2-a551q-1afdf565c889")
+    :type http_conn_id: str
+    :param payload: Opsgenie API Create Alert payload values
+        See https://docs.opsgenie.com/docs/alert-api#section-create-alert
+    :type payload: dict
+    :param proxy: Proxy to use to make the Opsgenie Alert API call
+    :type proxy: str
+    """
+    def __init__(self,
+                 http_conn_id=None,
+                 payload={},
+                 proxy=None,
+                 *args,
+                 **kwargs
+                 ):
+        super(OpsgenieAlertHook, self).__init__(*args, **kwargs)
+        self.http_conn_id = http_conn_id
+        self.api_key = None
+        self.payload = payload
+        self.proxy = proxy
+
+    def _get_api_key(self, http_conn_id):
+        """
+        Given a conn_id, return the api_key to use
+        :param http_conn_id: The conn_id provided
+        :type http_conn_id: str
+        :return: api_key (str) to use
+        """
+        if http_conn_id:
+            conn = self.get_connection(http_conn_id)
+            return conn.password
 
 Review comment:
   You're right, we should keep it in `password` so it is not shown in the logs
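   To illustrate the kind of obfuscation at stake, here is a minimal sketch of 
masking a secret for log output (the `mask_secret` helper is hypothetical, not 
Airflow's actual masking code):

```python
def mask_secret(value, show=4):
    """Illustrative helper: obfuscate a secret the way connection
    passwords are typically rendered in logs, keeping a short suffix."""
    if not value:
        return ""
    return "*" * max(len(value) - show, 0) + value[-show:]
```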


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [airflow] zhongjiajie commented on a change in pull request #4903: [AIRFLOW-4069] Add Opsgenie Alert Hook and Operator

2019-03-12 Thread GitBox
zhongjiajie commented on a change in pull request #4903: [AIRFLOW-4069] Add 
Opsgenie Alert Hook and Operator
URL: https://github.com/apache/airflow/pull/4903#discussion_r264956183
 
 

 ##
 File path: airflow/contrib/hooks/opsgenie_alert_hook.py
 ##
 @@ -0,0 +1,83 @@
+# -*- coding: utf-8 -*-
+#
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+#
+
+import json
+
+from airflow.hooks.http_hook import HttpHook
+from airflow.exceptions import AirflowException
+
+
+class OpsgenieAlertHook(HttpHook):
+    """
+    This hook allows you to post alerts to Opsgenie.
+    Accepts a connection that has an Opsgenie API key as the connection's password.
+    Each Opsgenie API key can be pre-configured to a team integration.
+    You can override these defaults in this hook.
+
+    :param http_conn_id: Http connection ID with host as "https://api.opsgenie.com/"
+        and Opsgenie API key as the connection's password
+        (e.g. "eb243592-faa2-4ba2-a551q-1afdf565c889")
+    :type http_conn_id: str
+    :param payload: Opsgenie API Create Alert payload values
+        See https://docs.opsgenie.com/docs/alert-api#section-create-alert
+    :type payload: dict
+    :param proxy: Proxy to use to make the Opsgenie Alert API call
+    :type proxy: str
+    """
+    def __init__(self,
+                 http_conn_id=None,
+                 payload={},
+                 proxy=None,
+                 *args,
+                 **kwargs
+                 ):
+        super(OpsgenieAlertHook, self).__init__(*args, **kwargs)
+        self.http_conn_id = http_conn_id
+        self.api_key = None
+        self.payload = payload
+        self.proxy = proxy
+
+    def _get_api_key(self, http_conn_id):
+        """
+        Given a conn_id, return the api_key to use
+        :param http_conn_id: The conn_id provided
+        :type http_conn_id: str
+        :return: api_key (str) to use
+        """
+        if http_conn_id:
+            conn = self.get_connection(http_conn_id)
+            return conn.password
+        else:
+            raise AirflowException('Cannot get api_key: No valid conn_id '
+                                   'supplied')
+
+    def execute(self):
+        """
+        Remote Popen (actually execute the Opsgenie Alert call)
+        """
+        proxies = {}
+        if self.proxy:
+            # we only need https proxy for Opsgenie, as the endpoint is https
+            proxies = {'https': self.proxy}
+        self.api_key = self.api_key or self._get_api_key(self.http_conn_id)
 
 Review comment:
   I don't think `get_connection` outside the constructor will be called many 
times by the scheduler, WDYT? cc @mik-laj . Only the constructor (meaning the 
`__init__` function) will be called many times, I agree


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [airflow] zhongjiajie commented on a change in pull request #4903: [WIP] [AIRFLOW-4069] Add Opsgenie Alert Hook and Operator

2019-03-12 Thread GitBox
zhongjiajie commented on a change in pull request #4903: [WIP] [AIRFLOW-4069] 
Add Opsgenie Alert Hook and Operator
URL: https://github.com/apache/airflow/pull/4903#discussion_r264956183
 
 

 ##
 File path: airflow/contrib/hooks/opsgenie_alert_hook.py
 ##
 @@ -0,0 +1,83 @@
+# -*- coding: utf-8 -*-
+#
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+#
+
+import json
+
+from airflow.hooks.http_hook import HttpHook
+from airflow.exceptions import AirflowException
+
+
+class OpsgenieAlertHook(HttpHook):
+    """
+    This hook allows you to post alerts to Opsgenie.
+    Accepts a connection that has an Opsgenie API key as the connection's password.
+    Each Opsgenie API key can be pre-configured to a team integration.
+    You can override these defaults in this hook.
+
+    :param http_conn_id: Http connection ID with host as "https://api.opsgenie.com/"
+        and Opsgenie API key as the connection's password
+        (e.g. "eb243592-faa2-4ba2-a551q-1afdf565c889")
+    :type http_conn_id: str
+    :param payload: Opsgenie API Create Alert payload values
+        See https://docs.opsgenie.com/docs/alert-api#section-create-alert
+    :type payload: dict
+    :param proxy: Proxy to use to make the Opsgenie Alert API call
+    :type proxy: str
+    """
+    def __init__(self,
+                 http_conn_id=None,
+                 payload={},
+                 proxy=None,
+                 *args,
+                 **kwargs
+                 ):
+        super(OpsgenieAlertHook, self).__init__(*args, **kwargs)
+        self.http_conn_id = http_conn_id
+        self.api_key = None
+        self.payload = payload
+        self.proxy = proxy
+
+    def _get_api_key(self, http_conn_id):
+        """
+        Given a conn_id, return the api_key to use
+        :param http_conn_id: The conn_id provided
+        :type http_conn_id: str
+        :return: api_key (str) to use
+        """
+        if http_conn_id:
+            conn = self.get_connection(http_conn_id)
+            return conn.password
+        else:
+            raise AirflowException('Cannot get api_key: No valid conn_id '
+                                   'supplied')
+
+    def execute(self):
+        """
+        Remote Popen (actually execute the Opsgenie Alert call)
+        """
+        proxies = {}
+        if self.proxy:
+            # we only need https proxy for Opsgenie, as the endpoint is https
+            proxies = {'https': self.proxy}
+        self.api_key = self.api_key or self._get_api_key(self.http_conn_id)
 
 Review comment:
   I don't think `get_connection` outside the constructor will be called many 
times by the scheduler, WDYT? cc @mik-laj . Only the constructor (meaning the 
`__init__` function) will be called many times, I agree


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [airflow] zhongjiajie commented on a change in pull request #4903: [WIP] [AIRFLOW-4069] Add Opsgenie Alert Hook and Operator

2019-03-12 Thread GitBox
zhongjiajie commented on a change in pull request #4903: [WIP] [AIRFLOW-4069] 
Add Opsgenie Alert Hook and Operator
URL: https://github.com/apache/airflow/pull/4903#discussion_r264956183
 
 

 ##
 File path: airflow/contrib/hooks/opsgenie_alert_hook.py
 ##
 @@ -0,0 +1,83 @@
+# -*- coding: utf-8 -*-
+#
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+#
+
+import json
+
+from airflow.hooks.http_hook import HttpHook
+from airflow.exceptions import AirflowException
+
+
+class OpsgenieAlertHook(HttpHook):
+    """
+    This hook allows you to post alerts to Opsgenie.
+    Accepts a connection that has an Opsgenie API key as the connection's password.
+    Each Opsgenie API key can be pre-configured to a team integration.
+    You can override these defaults in this hook.
+
+    :param http_conn_id: Http connection ID with host as "https://api.opsgenie.com/"
+        and Opsgenie API key as the connection's password
+        (e.g. "eb243592-faa2-4ba2-a551q-1afdf565c889")
+    :type http_conn_id: str
+    :param payload: Opsgenie API Create Alert payload values
+        See https://docs.opsgenie.com/docs/alert-api#section-create-alert
+    :type payload: dict
+    :param proxy: Proxy to use to make the Opsgenie Alert API call
+    :type proxy: str
+    """
+    def __init__(self,
+                 http_conn_id=None,
+                 payload={},
+                 proxy=None,
+                 *args,
+                 **kwargs
+                 ):
+        super(OpsgenieAlertHook, self).__init__(*args, **kwargs)
+        self.http_conn_id = http_conn_id
+        self.api_key = None
+        self.payload = payload
+        self.proxy = proxy
+
+    def _get_api_key(self, http_conn_id):
+        """
+        Given a conn_id, return the api_key to use
+        :param http_conn_id: The conn_id provided
+        :type http_conn_id: str
+        :return: api_key (str) to use
+        """
+        if http_conn_id:
+            conn = self.get_connection(http_conn_id)
+            return conn.password
+        else:
+            raise AirflowException('Cannot get api_key: No valid conn_id '
+                                   'supplied')
+
+    def execute(self):
+        """
+        Remote Popen (actually execute the Opsgenie Alert call)
+        """
+        proxies = {}
+        if self.proxy:
+            # we only need https proxy for Opsgenie, as the endpoint is https
+            proxies = {'https': self.proxy}
+        self.api_key = self.api_key or self._get_api_key(self.http_conn_id)
 
 Review comment:
   I don't think `get_connection` outside the constructor will be called many 
times by the scheduler, WDYT? cc @mik-laj . Only the constructor (meaning the 
`__init__` function) will be called many times, I agree


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [airflow] nritholtz commented on a change in pull request #4903: [WIP] [AIRFLOW-4069] Add Opsgenie Alert Hook and Operator

2019-03-12 Thread GitBox
nritholtz commented on a change in pull request #4903: [WIP] [AIRFLOW-4069] Add 
Opsgenie Alert Hook and Operator
URL: https://github.com/apache/airflow/pull/4903#discussion_r264954700
 
 

 ##
 File path: airflow/contrib/hooks/opsgenie_alert_hook.py
 ##
 @@ -0,0 +1,83 @@
+# -*- coding: utf-8 -*-
+#
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+#
+
+import json
+
+from airflow.hooks.http_hook import HttpHook
+from airflow.exceptions import AirflowException
+
+
+class OpsgenieAlertHook(HttpHook):
+    """
+    This hook allows you to post alerts to Opsgenie.
+    Accepts a connection that has an Opsgenie API key as the connection's password.
+    Each Opsgenie API key can be pre-configured to a team integration.
+    You can override these defaults in this hook.
+
+    :param http_conn_id: Http connection ID with host as "https://api.opsgenie.com/"
+        and Opsgenie API key as the connection's password
+        (e.g. "eb243592-faa2-4ba2-a551q-1afdf565c889")
+    :type http_conn_id: str
+    :param payload: Opsgenie API Create Alert payload values
+        See https://docs.opsgenie.com/docs/alert-api#section-create-alert
+    :type payload: dict
+    :param proxy: Proxy to use to make the Opsgenie Alert API call
+    :type proxy: str
+    """
+    def __init__(self,
+                 http_conn_id=None,
+                 payload={},
+                 proxy=None,
+                 *args,
+                 **kwargs
+                 ):
+        super(OpsgenieAlertHook, self).__init__(*args, **kwargs)
+        self.http_conn_id = http_conn_id
+        self.api_key = None
+        self.payload = payload
+        self.proxy = proxy
+
+    def _get_api_key(self, http_conn_id):
+        """
+        Given a conn_id, return the api_key to use
+        :param http_conn_id: The conn_id provided
+        :type http_conn_id: str
+        :return: api_key (str) to use
+        """
+        if http_conn_id:
+            conn = self.get_connection(http_conn_id)
+            return conn.password
+        else:
+            raise AirflowException('Cannot get api_key: No valid conn_id '
+                                   'supplied')
+
+    def execute(self):
+        """
+        Remote Popen (actually execute the Opsgenie Alert call)
+        """
+        proxies = {}
+        if self.proxy:
+            # we only need https proxy for Opsgenie, as the endpoint is https
+            proxies = {'https': self.proxy}
+        self.api_key = self.api_key or self._get_api_key(self.http_conn_id)
 
 Review comment:
   @zhongjiajie My only worry was that this means every time `execute` is 
called, it does another `get_connection` to get the `api_key`. What I was 
attempting to do was store the api_key at the object level for the hook, so 
that further calls already have the key (keeping `get_connection` to one call 
per instantiated hook).
   
   If that is not a concern, I have no problem changing to your suggestion.
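   One unrelated detail in the diff worth noting: the constructor uses a 
mutable default (`payload={}`), which Python evaluates once at function 
definition, so instances can end up sharing state. A minimal sketch of the 
usual safer idiom (the `SafePayloadHook` name is illustrative):

```python
class SafePayloadHook:
    """Sketch: replace the shared mutable default with None plus a
    fallback, so two hook instances never share the same dict."""

    def __init__(self, payload=None):
        self.payload = payload if payload is not None else {}
```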


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [airflow] nritholtz commented on a change in pull request #4903: [WIP] [AIRFLOW-4069] Add Opsgenie Alert Hook and Operator

2019-03-12 Thread GitBox
nritholtz commented on a change in pull request #4903: [WIP] [AIRFLOW-4069] Add 
Opsgenie Alert Hook and Operator
URL: https://github.com/apache/airflow/pull/4903#discussion_r264953624
 
 

 ##
 File path: airflow/contrib/hooks/opsgenie_alert_hook.py
 ##
 @@ -0,0 +1,83 @@
+# -*- coding: utf-8 -*-
+#
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+#
+
+import json
+
+from airflow.hooks.http_hook import HttpHook
+from airflow.exceptions import AirflowException
+
+
+class OpsgenieAlertHook(HttpHook):
+    """
+    This hook allows you to post alerts to Opsgenie.
+    Accepts a connection that has an Opsgenie API key as the connection's password.
+    Each Opsgenie API key can be pre-configured to a team integration.
+    You can override these defaults in this hook.
+
+    :param http_conn_id: Http connection ID with host as "https://api.opsgenie.com/"
+        and Opsgenie API key as the connection's password
+        (e.g. "eb243592-faa2-4ba2-a551q-1afdf565c889")
+    :type http_conn_id: str
+    :param payload: Opsgenie API Create Alert payload values
+        See https://docs.opsgenie.com/docs/alert-api#section-create-alert
+    :type payload: dict
+    :param proxy: Proxy to use to make the Opsgenie Alert API call
+    :type proxy: str
+    """
+    def __init__(self,
+                 http_conn_id=None,
+                 payload={},
+                 proxy=None,
+                 *args,
+                 **kwargs
+                 ):
+        super(OpsgenieAlertHook, self).__init__(*args, **kwargs)
+        self.http_conn_id = http_conn_id
+        self.api_key = None
+        self.payload = payload
+        self.proxy = proxy
+
+    def _get_api_key(self, http_conn_id):
+        """
+        Given a conn_id, return the api_key to use
+        :param http_conn_id: The conn_id provided
+        :type http_conn_id: str
+        :return: api_key (str) to use
+        """
+        if http_conn_id:
+            conn = self.get_connection(http_conn_id)
+            return conn.password
 
 Review comment:
   @zhongjiajie As mentioned above in 
https://github.com/apache/airflow/pull/4903#issuecomment-472213740, I 
originally had it as `extra -> api_key`; however, when you do so it is logged 
in clear text.
   
   However, 
[passwords](https://github.com/apache/airflow/blob/master/airflow/models/connection.py#L266)
 are obfuscated in the logs.


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [airflow] nritholtz commented on a change in pull request #4903: [WIP] [AIRFLOW-4069] Add Opsgenie Alert Hook and Operator

2019-03-12 Thread GitBox
nritholtz commented on a change in pull request #4903: [WIP] [AIRFLOW-4069] Add 
Opsgenie Alert Hook and Operator
URL: https://github.com/apache/airflow/pull/4903#discussion_r264954921
 
 

 ##
 File path: airflow/contrib/hooks/opsgenie_alert_hook.py
 ##
 @@ -0,0 +1,83 @@
+# -*- coding: utf-8 -*-
+#
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+#
+
+import json
+
+from airflow.hooks.http_hook import HttpHook
+from airflow.exceptions import AirflowException
+
+
+class OpsgenieAlertHook(HttpHook):
+    """
+    This hook allows you to post alerts to Opsgenie.
+    Accepts a connection that has an Opsgenie API key as the connection's password.
+    Each Opsgenie API key can be pre-configured to a team integration.
+    You can override these defaults in this hook.
+
+    :param http_conn_id: Http connection ID with host as "https://api.opsgenie.com/"
+        and Opsgenie API key as the connection's password
+        (e.g. "eb243592-faa2-4ba2-a551q-1afdf565c889")
+    :type http_conn_id: str
+    :param payload: Opsgenie API Create Alert payload values
+        See https://docs.opsgenie.com/docs/alert-api#section-create-alert
+    :type payload: dict
+    :param proxy: Proxy to use to make the Opsgenie Alert API call
+    :type proxy: str
+    """
+    def __init__(self,
+                 http_conn_id=None,
+                 payload={},
+                 proxy=None,
+                 *args,
+                 **kwargs
+                 ):
+        super(OpsgenieAlertHook, self).__init__(*args, **kwargs)
+        self.http_conn_id = http_conn_id
+        self.api_key = None
 
 Review comment:
   See https://github.com/apache/airflow/pull/4903#discussion_r264954700




[GitHub] [airflow] nritholtz commented on a change in pull request #4903: [WIP] [AIRFLOW-4069] Add Opsgenie Alert Hook and Operator

2019-03-12 Thread GitBox
nritholtz commented on a change in pull request #4903: [WIP] [AIRFLOW-4069] Add 
Opsgenie Alert Hook and Operator
URL: https://github.com/apache/airflow/pull/4903#discussion_r264954700
 
 

 ##
 File path: airflow/contrib/hooks/opsgenie_alert_hook.py
 ##
 @@ -0,0 +1,83 @@
+# -*- coding: utf-8 -*-
+#
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+#
+
+import json
+
+from airflow.hooks.http_hook import HttpHook
+from airflow.exceptions import AirflowException
+
+
+class OpsgenieAlertHook(HttpHook):
+    """
+    This hook allows you to post alerts to Opsgenie.
+    Accepts a connection that has an Opsgenie API key as the connection's password.
+    Each Opsgenie API key can be pre-configured to a team integration.
+    You can override these defaults in this hook.
+
+    :param http_conn_id: Http connection ID with host as "https://api.opsgenie.com/"
+        and Opsgenie API key as the connection's password
+        (e.g. "eb243592-faa2-4ba2-a551q-1afdf565c889")
+    :type http_conn_id: str
+    :param payload: Opsgenie API Create Alert payload values
+        See https://docs.opsgenie.com/docs/alert-api#section-create-alert
+    :type payload: dict
+    :param proxy: Proxy to use to make the Opsgenie Alert API call
+    :type proxy: str
+    """
+    def __init__(self,
+                 http_conn_id=None,
+                 payload={},
+                 proxy=None,
+                 *args,
+                 **kwargs
+                 ):
+        super(OpsgenieAlertHook, self).__init__(*args, **kwargs)
+        self.http_conn_id = http_conn_id
+        self.api_key = None
+        self.payload = payload
+        self.proxy = proxy
+
+    def _get_api_key(self, http_conn_id):
+        """
+        Given a conn_id, return the api_key to use
+        :param http_conn_id: The conn_id provided
+        :type http_conn_id: str
+        :return: api_key (str) to use
+        """
+        if http_conn_id:
+            conn = self.get_connection(http_conn_id)
+            return conn.password
+        else:
+            raise AirflowException('Cannot get api_key: No valid conn_id '
+                                   'supplied')
+
+    def execute(self):
+        """
+        Remote Popen (actually execute the Opsgenie Alert call)
+        """
+        proxies = {}
+        if self.proxy:
+            # we only need https proxy for Opsgenie, as the endpoint is https
+            proxies = {'https': self.proxy}
+        self.api_key = self.api_key or self._get_api_key(self.http_conn_id)
 
 Review comment:
   @zhongjiajie My only worry is that this means every call to `execute` does another `get_connection` to fetch the `api_key`. What I was attempting to do was store the api_key at the object level for the hook, so that further calls already have the key (basically keeping `get_connection` to one call per instantiated hook).
   
   If that is not a concern, I have no problem changing to your suggestion.
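The caching pattern under discussion can be sketched as follows. This is a toy stand-in, not the PR's actual code: `get_connection` here just returns a dummy dict, whereas the real hook resolves an Airflow connection from the metadata DB.

```python
class ApiKeyCachingHook:
    """Toy stand-in for the hook under review: the API key is fetched
    lazily on first execute() and cached on the instance afterwards."""

    def __init__(self, http_conn_id=None):
        self.http_conn_id = http_conn_id
        self.api_key = None   # filled on the first execute()
        self.lookups = 0      # counts connection lookups, for demonstration

    def get_connection(self, conn_id):
        # Stand-in for Airflow's get_connection(), which hits the metadata DB.
        self.lookups += 1
        return {"password": "dummy-api-key"}

    def execute(self):
        # `or` short-circuits once api_key is set, keeping get_connection()
        # to one call per instantiated hook.
        self.api_key = self.api_key or self.get_connection(self.http_conn_id)["password"]
        return self.api_key


hook = ApiKeyCachingHook(http_conn_id="opsgenie_default")
hook.execute()
hook.execute()
print(hook.lookups)  # prints 1: the connection was fetched only once
```

The trade-off the reviewers weigh is exactly this: a local variable per call is simpler, while the instance-level cache avoids repeated metadata-DB lookups.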




[GitHub] [airflow] nritholtz commented on a change in pull request #4903: [WIP] [AIRFLOW-4069] Add Opsgenie Alert Hook and Operator

2019-03-12 Thread GitBox
nritholtz commented on a change in pull request #4903: [WIP] [AIRFLOW-4069] Add 
Opsgenie Alert Hook and Operator
URL: https://github.com/apache/airflow/pull/4903#discussion_r264953794
 
 

 ##
 File path: docs/code.rst
 ##
 @@ -240,6 +240,7 @@ Operators
 .. autoclass:: 
airflow.contrib.operators.mysql_to_gcs.MySqlToGoogleCloudStorageOperator
 .. autoclass:: 
airflow.contrib.operators.oracle_to_azure_data_lake_transfer.OracleToAzureDataLakeTransfer
 .. autoclass:: 
airflow.contrib.operators.oracle_to_oracle_transfer.OracleToOracleTransfer
+.. autoclass:: 
airflow.contrib.operators.opsgenie_alert_operator.OpsgenieAlertOperator
 
 Review comment:
   Nice find, fixed.




[GitHub] [airflow] nritholtz commented on a change in pull request #4903: [WIP] [AIRFLOW-4069] Add Opsgenie Alert Hook and Operator

2019-03-12 Thread GitBox
nritholtz commented on a change in pull request #4903: [WIP] [AIRFLOW-4069] Add 
Opsgenie Alert Hook and Operator
URL: https://github.com/apache/airflow/pull/4903#discussion_r264953624
 
 

 ##
 File path: airflow/contrib/hooks/opsgenie_alert_hook.py
 ##
 @@ -0,0 +1,83 @@
+# -*- coding: utf-8 -*-
+#
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+#
+
+import json
+
+from airflow.hooks.http_hook import HttpHook
+from airflow.exceptions import AirflowException
+
+
+class OpsgenieAlertHook(HttpHook):
+    """
+    This hook allows you to post alerts to Opsgenie.
+    Accepts a connection that has an Opsgenie API key as the connection's password.
+    Each Opsgenie API key can be pre-configured to a team integration.
+    You can override these defaults in this hook.
+
+    :param http_conn_id: Http connection ID with host as "https://api.opsgenie.com/"
+        and Opsgenie API key as the connection's password
+        (e.g. "eb243592-faa2-4ba2-a551q-1afdf565c889")
+    :type http_conn_id: str
+    :param payload: Opsgenie API Create Alert payload values
+        See https://docs.opsgenie.com/docs/alert-api#section-create-alert
+    :type payload: dict
+    :param proxy: Proxy to use to make the Opsgenie Alert API call
+    :type proxy: str
+    """
+    def __init__(self,
+                 http_conn_id=None,
+                 payload={},
+                 proxy=None,
+                 *args,
+                 **kwargs
+                 ):
+        super(OpsgenieAlertHook, self).__init__(*args, **kwargs)
+        self.http_conn_id = http_conn_id
+        self.api_key = None
+        self.payload = payload
+        self.proxy = proxy
+
+    def _get_api_key(self, http_conn_id):
+        """
+        Given a conn_id, return the api_key to use
+        :param http_conn_id: The conn_id provided
+        :type http_conn_id: str
+        :return: api_key (str) to use
+        """
+        if http_conn_id:
+            conn = self.get_connection(http_conn_id)
+            return conn.password
 
 Review comment:
   @zhongjiajie I originally had it as `extra -> api_key`; however, when you do so it is logged in clear text.
   
   However 
[passwords](https://github.com/apache/airflow/blob/master/airflow/models/connection.py#L266)
 are obfuscated in the logs.
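The point being made here can be illustrated with a toy masking function. This is not Airflow's actual logging code; it only mimics the behavior described: the connection's password field is redacted before logging, while a key stashed in `extra` would be logged verbatim.

```python
def log_connection_info(conn):
    """Toy illustration of password masking when logging a connection dict.
    Only the `password` field is redacted; `extra` passes through untouched."""
    masked = dict(conn)
    if masked.get("password"):
        masked["password"] = "XXXXXXXX"
    return "Connection: %s" % masked


conn = {
    "host": "https://api.opsgenie.com/",
    "password": "eb243592-faa2-4ba2-a551q-1afdf565c889",
    "extra": '{"anything": "stored here would be logged verbatim"}',
}
line = log_connection_info(conn)
print("eb243592" in line)  # prints False: the API key never reaches the log line
```

This is why storing the API key as the connection's password, rather than in `extra`, keeps it out of the logs.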




[jira] [Commented] (AIRFLOW-4073) Add supporting of templated file of Athena Operator

2019-03-12 Thread Bryan Yang (JIRA)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-4073?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16791218#comment-16791218
 ] 

Bryan Yang commented on AIRFLOW-4073:
-

It's my PR: [https://github.com/apache/airflow/pull/4907]

I've seen the test cases of other operators, and they didn't test template_ext, so I didn't add any test. If one is needed, please let me know. Thank you.
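For context, `template_ext` controls whether a templated field's string value is treated as a path to a file whose contents get rendered, rather than as the template itself. A rough sketch of that dispatch rule (a hypothetical helper, not Airflow's internals; actual rendering is omitted):

```python
def resolve_templated_value(value, template_ext, read_file):
    """If `value` ends with one of the extensions in `template_ext`,
    treat it as a file path and return the file's contents for rendering;
    otherwise the string itself is what gets rendered."""
    if value.endswith(tuple(template_ext)):
        return read_file(value)
    return value


# A fake "filesystem" standing in for a DAG folder.
files = {"query.sql": "SELECT * FROM logs WHERE ds = '{{ ds }}'"}

print(resolve_templated_value("query.sql", ('.sql',), files.__getitem__))
print(resolve_templated_value("SELECT 1", ('.sql',), files.__getitem__))
```

With `template_ext = ('.sql',)` on the operator, a query argument like `"query.sql"` would be read from the file, while an inline `"SELECT 1"` is rendered as-is.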

> Add supporting of templated file of Athena Operator
> ---
>
> Key: AIRFLOW-4073
> URL: https://issues.apache.org/jira/browse/AIRFLOW-4073
> Project: Apache Airflow
>  Issue Type: Improvement
>  Components: aws
>Reporter: Bryan Yang
>Assignee: Bryan Yang
>Priority: Minor
>   Original Estimate: 24h
>  Remaining Estimate: 24h
>
> Unlike other SQL operators (e.g. hive, bigquery), the AWS Athena operator
> does not support template_ext yet. I'll add template_ext to the operator.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[GitHub] [airflow] zhongjiajie commented on a change in pull request #4903: [WIP] [AIRFLOW-4069] Add Opsgenie Alert Hook and Operator

2019-03-12 Thread GitBox
zhongjiajie commented on a change in pull request #4903: [WIP] [AIRFLOW-4069] 
Add Opsgenie Alert Hook and Operator
URL: https://github.com/apache/airflow/pull/4903#discussion_r264949227
 
 

 ##
 File path: airflow/contrib/hooks/opsgenie_alert_hook.py
 ##
 @@ -0,0 +1,83 @@
+# -*- coding: utf-8 -*-
+#
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+#
+
+import json
+
+from airflow.hooks.http_hook import HttpHook
+from airflow.exceptions import AirflowException
+
+
+class OpsgenieAlertHook(HttpHook):
+    """
+    This hook allows you to post alerts to Opsgenie.
+    Accepts a connection that has an Opsgenie API key as the connection's password.
+    Each Opsgenie API key can be pre-configured to a team integration.
+    You can override these defaults in this hook.
+
+    :param http_conn_id: Http connection ID with host as "https://api.opsgenie.com/"
+        and Opsgenie API key as the connection's password
+        (e.g. "eb243592-faa2-4ba2-a551q-1afdf565c889")
+    :type http_conn_id: str
+    :param payload: Opsgenie API Create Alert payload values
+        See https://docs.opsgenie.com/docs/alert-api#section-create-alert
+    :type payload: dict
+    :param proxy: Proxy to use to make the Opsgenie Alert API call
+    :type proxy: str
+    """
+    def __init__(self,
+                 http_conn_id=None,
+                 payload={},
+                 proxy=None,
+                 *args,
+                 **kwargs
+                 ):
+        super(OpsgenieAlertHook, self).__init__(*args, **kwargs)
+        self.http_conn_id = http_conn_id
+        self.api_key = None
 
 Review comment:
   In your constructor, we never pass a value to `self.api_key`; should we remove it?




[GitHub] [airflow] zhongjiajie commented on a change in pull request #4903: [WIP] [AIRFLOW-4069] Add Opsgenie Alert Hook and Operator

2019-03-12 Thread GitBox
zhongjiajie commented on a change in pull request #4903: [WIP] [AIRFLOW-4069] 
Add Opsgenie Alert Hook and Operator
URL: https://github.com/apache/airflow/pull/4903#discussion_r264950380
 
 

 ##
 File path: airflow/contrib/hooks/opsgenie_alert_hook.py
 ##
 @@ -0,0 +1,83 @@
+# -*- coding: utf-8 -*-
+#
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+#
+
+import json
+
+from airflow.hooks.http_hook import HttpHook
+from airflow.exceptions import AirflowException
+
+
+class OpsgenieAlertHook(HttpHook):
+    """
+    This hook allows you to post alerts to Opsgenie.
+    Accepts a connection that has an Opsgenie API key as the connection's password.
+    Each Opsgenie API key can be pre-configured to a team integration.
+    You can override these defaults in this hook.
+
+    :param http_conn_id: Http connection ID with host as "https://api.opsgenie.com/"
+        and Opsgenie API key as the connection's password
+        (e.g. "eb243592-faa2-4ba2-a551q-1afdf565c889")
+    :type http_conn_id: str
+    :param payload: Opsgenie API Create Alert payload values
+        See https://docs.opsgenie.com/docs/alert-api#section-create-alert
+    :type payload: dict
+    :param proxy: Proxy to use to make the Opsgenie Alert API call
+    :type proxy: str
+    """
+    def __init__(self,
+                 http_conn_id=None,
+                 payload={},
+                 proxy=None,
+                 *args,
+                 **kwargs
+                 ):
+        super(OpsgenieAlertHook, self).__init__(*args, **kwargs)
+        self.http_conn_id = http_conn_id
+        self.api_key = None
+        self.payload = payload
+        self.proxy = proxy
+
+    def _get_api_key(self, http_conn_id):
 
 Review comment:
   Maybe we could simplify it by using
   ```suggestion
   def _get_api_key(self):
   ```
   and use `conn = self.get_connection(http_conn_id)` at L65, and likewise at L79
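The suggested simplification would read roughly like this. This is a sketch of the reviewer's proposal, not the merged code: `get_connection` is a stand-in returning a dummy connection object, and `ValueError` stands in for `AirflowException` to keep the example self-contained.

```python
class HookSketch:
    """Sketch of the proposal: _get_api_key reads the conn_id from self
    instead of taking it as a parameter."""

    def __init__(self, http_conn_id=None):
        self.http_conn_id = http_conn_id

    def get_connection(self, conn_id):
        # Stand-in for HttpHook.get_connection(); returns an object with .password
        class Conn:
            password = "dummy-api-key"
        return Conn()

    def _get_api_key(self):
        # No parameter needed: the hook already stores its own conn_id.
        if not self.http_conn_id:
            raise ValueError("Cannot get api_key: No valid conn_id supplied")
        return self.get_connection(self.http_conn_id).password


print(HookSketch("opsgenie_default")._get_api_key())  # prints dummy-api-key
```

The method then has one fewer argument to thread through, at the cost of being slightly harder to call with an ad-hoc conn_id.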




[GitHub] [airflow] zhongjiajie commented on a change in pull request #4903: [WIP] [AIRFLOW-4069] Add Opsgenie Alert Hook and Operator

2019-03-12 Thread GitBox
zhongjiajie commented on a change in pull request #4903: [WIP] [AIRFLOW-4069] 
Add Opsgenie Alert Hook and Operator
URL: https://github.com/apache/airflow/pull/4903#discussion_r264950087
 
 

 ##
 File path: airflow/contrib/operators/opsgenie_alert_operator.py
 ##
 @@ -0,0 +1,143 @@
+# -*- coding: utf-8 -*-
+#
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+#
+from airflow.contrib.hooks.opsgenie_alert_hook import OpsgenieAlertHook
+from airflow.exceptions import AirflowException
+from airflow.models import BaseOperator
+from airflow.utils.decorators import apply_defaults
+
+
+class OpsgenieAlertOperator(BaseOperator):
+    """
+    This operator allows you to post alerts to Opsgenie.
+    Accepts a connection that has an Opsgenie API key as the connection's password.
+
+    Each Opsgenie API key can be pre-configured to a team integration.
+    You can override these defaults in this operator.
+
+    :param http_conn_id: Http connection ID with host as "https://api.opsgenie.com/"
+        and Opsgenie API key as the connection's password
+        (e.g. "eb243592-faa2-4ba2-a551q-1afdf565c889")
+    :type http_conn_id: str
+    :param message: The Message of the Opsgenie alert
+    :type message: str
+    :param alias: Client-defined identifier of the alert
+    :type alias: str
+    :param description: Description field of the alert
+    :type description: str
+    :param responders: Teams, users, escalations and schedules that
+        the alert will be routed to send notifications.
+    :type responders: list[dict]
+    :param visibleTo: Teams and users that the alert will become visible
+        to without sending any notification.
+    :type visibleTo: list[dict]
+    :param actions: Custom actions that will be available for the alert.
+    :type actions: list[str]
+    :param tags: Tags of the alert.
+    :type tags: list[str]
+    :param details: Map of key-value pairs to use as custom properties of the alert.
+    :type details: dict
+    :param entity: Entity field of the alert that is
+        generally used to specify which domain alert is related to.
+    :type entity: str
+    :param source: Source field of the alert. Default value is
+        IP address of the incoming request.
+    :type source: str
+    :param priority: Priority level of the alert. Default value is P3.
+    :type priority: str
+    :param user: Display name of the request owner.
+    :type user: str
+    :param note: Additional note that will be added while creating the alert.
+    :type note: str
+    :param proxy: Proxy to use to make the Opsgenie Alert API call
+    :type proxy: str
+    """
+
+    @apply_defaults
+    def __init__(self,
+                 http_conn_id=None,
+                 message="",
+                 alias=None,
+                 description=None,
+                 responders=None,
+                 visibleTo=None,
+                 actions=None,
+                 tags=None,
+                 details=None,
+                 entity=None,
+                 source=None,
+                 priority=None,
+                 user=None,
+                 note=None,
+                 proxy=None,
+                 *args,
+                 **kwargs
+                 ):
+        super(OpsgenieAlertOperator, self).__init__(*args, **kwargs)
+
+        if not http_conn_id:
+            raise AirflowException('No valid Opsgenie http_conn_id supplied.')
+
+        self.http_conn_id = http_conn_id
+        self.message = message
+        self.alias = alias
+        self.description = description
+        self.responders = responders
+        self.visibleTo = visibleTo
+        self.actions = actions
+        self.tags = tags
+        self.details = details
+        self.entity = entity
+        self.source = source
+        self.priority = priority
+        self.user = user
+        self.note = note
+        self.proxy = proxy
+        self.hook = None
+
+    def _build_opsgenie_payload(self):
+        """
+        Construct the Opsgenie JSON payload. All relevant parameters are combined here
+        to a valid Opsgenie JSON payload.
+
+        :return: Opsgenie payload (dict) to 

[GitHub] [airflow] zhongjiajie commented on a change in pull request #4903: [WIP] [AIRFLOW-4069] Add Opsgenie Alert Hook and Operator

2019-03-12 Thread GitBox
zhongjiajie commented on a change in pull request #4903: [WIP] [AIRFLOW-4069] 
Add Opsgenie Alert Hook and Operator
URL: https://github.com/apache/airflow/pull/4903#discussion_r264950810
 
 

 ##
 File path: docs/code.rst
 ##
 @@ -240,6 +240,7 @@ Operators
 .. autoclass:: 
airflow.contrib.operators.mysql_to_gcs.MySqlToGoogleCloudStorageOperator
 .. autoclass:: 
airflow.contrib.operators.oracle_to_azure_data_lake_transfer.OracleToAzureDataLakeTransfer
 .. autoclass:: 
airflow.contrib.operators.oracle_to_oracle_transfer.OracleToOracleTransfer
+.. autoclass:: 
airflow.contrib.operators.opsgenie_alert_operator.OpsgenieAlertOperator
 
 Review comment:
   Should we order these alphabetically?




[GitHub] [airflow] zhongjiajie commented on a change in pull request #4903: [WIP] [AIRFLOW-4069] Add Opsgenie Alert Hook and Operator

2019-03-12 Thread GitBox
zhongjiajie commented on a change in pull request #4903: [WIP] [AIRFLOW-4069] 
Add Opsgenie Alert Hook and Operator
URL: https://github.com/apache/airflow/pull/4903#discussion_r264949505
 
 

 ##
 File path: airflow/contrib/hooks/opsgenie_alert_hook.py
 ##
 @@ -0,0 +1,83 @@
+# -*- coding: utf-8 -*-
+#
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+#
+
+import json
+
+from airflow.hooks.http_hook import HttpHook
+from airflow.exceptions import AirflowException
+
+
+class OpsgenieAlertHook(HttpHook):
+    """
+    This hook allows you to post alerts to Opsgenie.
+    Accepts a connection that has an Opsgenie API key as the connection's password.
+    Each Opsgenie API key can be pre-configured to a team integration.
+    You can override these defaults in this hook.
+
+    :param http_conn_id: Http connection ID with host as "https://api.opsgenie.com/"
+        and Opsgenie API key as the connection's password
+        (e.g. "eb243592-faa2-4ba2-a551q-1afdf565c889")
+    :type http_conn_id: str
+    :param payload: Opsgenie API Create Alert payload values
+        See https://docs.opsgenie.com/docs/alert-api#section-create-alert
+    :type payload: dict
+    :param proxy: Proxy to use to make the Opsgenie Alert API call
+    :type proxy: str
+    """
+    def __init__(self,
+                 http_conn_id=None,
+                 payload={},
+                 proxy=None,
+                 *args,
+                 **kwargs
+                 ):
+        super(OpsgenieAlertHook, self).__init__(*args, **kwargs)
+        self.http_conn_id = http_conn_id
+        self.api_key = None
+        self.payload = payload
+        self.proxy = proxy
+
+    def _get_api_key(self, http_conn_id):
+        """
+        Given a conn_id, return the api_key to use
+        :param http_conn_id: The conn_id provided
+        :type http_conn_id: str
+        :return: api_key (str) to use
+        """
+        if http_conn_id:
+            conn = self.get_connection(http_conn_id)
+            return conn.password
+        else:
+            raise AirflowException('Cannot get api_key: No valid conn_id '
+                                   'supplied')
+
+    def execute(self):
+        """
+        Remote Popen (actually execute the Opsgenie Alert call)
+        """
+        proxies = {}
+        if self.proxy:
+            # we only need https proxy for Opsgenie, as the endpoint is https
+            proxies = {'https': self.proxy}
+        self.api_key = self.api_key or self._get_api_key(self.http_conn_id)
 
 Review comment:
   Following the review at L53, maybe we could change this, WDYT?
   ```suggestion
   api_key = self._get_api_key(self.http_conn_id)
   ```
   and also change the variable name in L82




[jira] [Commented] (AIRFLOW-4073) Add supporting of templated file of Athena Operator

2019-03-12 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-4073?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16791214#comment-16791214
 ] 

ASF GitHub Bot commented on AIRFLOW-4073:
-

bryanyang0528 commented on pull request #4907: [AIRFLOW-4073] add template_ext 
for AWS Athena operator
URL: https://github.com/apache/airflow/pull/4907
 
 
   Make sure you have checked _all_ steps below.
   
   ### Jira
   
   - [x] My PR addresses the following [Airflow 
Jira](https://issues.apache.org/jira/browse/AIRFLOW/) issues and references 
them in the PR title. For example, "\[AIRFLOW-XXX\] My Airflow PR"
 - https://issues.apache.org/jira/browse/AIRFLOW-4073
 - In case you are fixing a typo in the documentation you can prepend your 
commit with \[AIRFLOW-XXX\], code changes always need a Jira issue.
 - In case you are proposing a fundamental code change, you need to create 
an Airflow Improvement 
Proposal([AIP](https://cwiki.apache.org/confluence/display/AIRFLOW/Airflow+Improvements+Proposals)).
   
   ### Description
   
   - [x] Here are some details about my PR, including screenshots of any UI 
changes:
   
   ### Tests
   
   - [x] My PR adds the following unit tests __OR__ does not need testing for 
this extremely good reason:
   
   ### Commits
   
   - [x] My commits all reference Jira issues in their subject lines, and I 
have squashed multiple commits if they address the same issue. In addition, my 
commits follow the guidelines from "[How to write a good git commit 
message](http://chris.beams.io/posts/git-commit/)":
 1. Subject is separated from body by a blank line
 1. Subject is limited to 50 characters (not including Jira issue reference)
 1. Subject does not end with a period
 1. Subject uses the imperative mood ("add", not "adding")
 1. Body wraps at 72 characters
 1. Body explains "what" and "why", not "how"
   
   ### Documentation
   
   - [ ] In case of new functionality, my PR adds documentation that describes 
how to use it.
 - When adding new operators/hooks/sensors, the autoclass documentation 
generation needs to be added.
 - All the public functions and the classes in the PR contain docstrings 
that explain what it does
 - If you implement backwards incompatible changes, please leave a note in 
the [Updating.md](https://github.com/apache/airflow/blob/master/UPDATING.md) so 
we can assign it to a appropriate release
   
   ### Code Quality
   
   - [x] Passes `flake8`
   
 

This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


> Add supporting of templated file of Athena Operator
> ---
>
> Key: AIRFLOW-4073
> URL: https://issues.apache.org/jira/browse/AIRFLOW-4073
> Project: Apache Airflow
>  Issue Type: Improvement
>  Components: aws
>Reporter: Bryan Yang
>Assignee: Bryan Yang
>Priority: Minor
>   Original Estimate: 24h
>  Remaining Estimate: 24h
>
> Unlike other SQL operators (e.g. Hive, BigQuery), the AWS Athena operator 
> doesn't support template_ext yet. I'll add template_ext to the operator.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[GitHub] [airflow] bryanyang0528 opened a new pull request #4907: [AIRFLOW-4073] add template_ext for AWS Athena operator

2019-03-12 Thread GitBox
bryanyang0528 opened a new pull request #4907: [AIRFLOW-4073] add template_ext 
for AWS Athena operator
URL: https://github.com/apache/airflow/pull/4907
 
 
   Make sure you have checked _all_ steps below.
   
   ### Jira
   
   - [x] My PR addresses the following [Airflow 
Jira](https://issues.apache.org/jira/browse/AIRFLOW/) issues and references 
them in the PR title. For example, "\[AIRFLOW-XXX\] My Airflow PR"
 - https://issues.apache.org/jira/browse/AIRFLOW-4073
 - In case you are fixing a typo in the documentation you can prepend your 
commit with \[AIRFLOW-XXX\], code changes always need a Jira issue.
 - In case you are proposing a fundamental code change, you need to create 
an Airflow Improvement 
Proposal([AIP](https://cwiki.apache.org/confluence/display/AIRFLOW/Airflow+Improvements+Proposals)).
   
   ### Description
   
   - [x] Here are some details about my PR, including screenshots of any UI 
changes:
   
   ### Tests
   
   - [x] My PR adds the following unit tests __OR__ does not need testing for 
this extremely good reason:
   
   ### Commits
   
   - [x] My commits all reference Jira issues in their subject lines, and I 
have squashed multiple commits if they address the same issue. In addition, my 
commits follow the guidelines from "[How to write a good git commit 
message](http://chris.beams.io/posts/git-commit/)":
 1. Subject is separated from body by a blank line
 1. Subject is limited to 50 characters (not including Jira issue reference)
 1. Subject does not end with a period
 1. Subject uses the imperative mood ("add", not "adding")
 1. Body wraps at 72 characters
 1. Body explains "what" and "why", not "how"
   
   ### Documentation
   
   - [ ] In case of new functionality, my PR adds documentation that describes 
how to use it.
 - When adding new operators/hooks/sensors, the autoclass documentation 
generation needs to be added.
 - All the public functions and classes in the PR contain docstrings 
that explain what they do
 - If you implement backwards-incompatible changes, please leave a note in 
the [Updating.md](https://github.com/apache/airflow/blob/master/UPDATING.md) so 
we can assign it to an appropriate release
   
   ### Code Quality
   
   - [x] Passes `flake8`
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[jira] [Commented] (AIRFLOW-3272) Create gRPC hook for creating generic grpc connection

2019-03-12 Thread ASF subversion and git services (JIRA)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-3272?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16791207#comment-16791207
 ] 

ASF subversion and git services commented on AIRFLOW-3272:
--

Commit 8d5d46022b5f7d102e94e05cb62d64b15658ad09 in airflow's branch 
refs/heads/master from morgendave
[ https://gitbox.apache.org/repos/asf?p=airflow.git;h=8d5d460 ]

[AIRFLOW-3272] Add base grpc hook (#4101)

* [AIRFLOW-3272] add base grpc hook

* [AIRFLOW-3272] fix based on comments and add more docs

* [AIRFLOW-3272] add extra fields to www_rabc view in connection model

* [AIRFLOW-3272] change url for grpc, fix some bugs

* [AIRFLOW-3272] Add mcck for grpc

* [AIRFLOW-3272] add unit tests for grpc hook

* [AIRFLOW-3272] add gRPC connection howto doc


> Create gRPC hook for creating generic grpc connection
> -
>
> Key: AIRFLOW-3272
> URL: https://issues.apache.org/jira/browse/AIRFLOW-3272
> Project: Apache Airflow
>  Issue Type: Improvement
>Reporter: Zhiwei Zhao
>Assignee: Zhiwei Zhao
>Priority: Minor
>
> Add support for gRPC connections in Airflow. 
> In Airflow there are use cases for calling gRPC services, so instead of 
> creating the channel in a PythonOperator each time, there should be a basic 
> GrpcHook to take care of it. The hook also needs to handle authentication.
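
The "create the channel once in a hook, not in every PythonOperator" idea above can be sketched like this. In the real hook the factory would be something like `grpc.insecure_channel` (or a secure, authenticated variant); here it is injected so the sketch runs without grpc installed, and the class name is illustrative.

```python
class GrpcHookSketch:
    """Minimal sketch of a hook that owns gRPC channel creation."""

    def __init__(self, target, channel_factory):
        # channel_factory stands in for e.g. grpc.insecure_channel
        self.target = target
        self._channel_factory = channel_factory
        self._channel = None

    def get_channel(self):
        # create the channel once and reuse it, instead of re-creating
        # it inside every PythonOperator callable
        if self._channel is None:
            self._channel = self._channel_factory(self.target)
        return self._channel
```

A caller asks the hook for a channel and gets the same object back on every call, which is the reuse the issue argues for.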



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[GitHub] [airflow] jgao54 commented on issue #4101: [AIRFLOW-3272] Add base grpc hook

2019-03-12 Thread GitBox
jgao54 commented on issue #4101: [AIRFLOW-3272] Add base grpc hook
URL: https://github.com/apache/airflow/pull/4101#issuecomment-472248758
 
 
   Thank you @morgendave !


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[jira] [Commented] (AIRFLOW-3272) Create gRPC hook for creating generic grpc connection

2019-03-12 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-3272?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16791202#comment-16791202
 ] 

ASF GitHub Bot commented on AIRFLOW-3272:
-

jgao54 commented on pull request #4101: [AIRFLOW-3272] Add base grpc hook
URL: https://github.com/apache/airflow/pull/4101
 
 
   
 

This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


> Create gRPC hook for creating generic grpc connection
> -
>
> Key: AIRFLOW-3272
> URL: https://issues.apache.org/jira/browse/AIRFLOW-3272
> Project: Apache Airflow
>  Issue Type: Improvement
>Reporter: Zhiwei Zhao
>Assignee: Zhiwei Zhao
>Priority: Minor
>
> Add support for gRPC connections in Airflow. 
> In Airflow there are use cases for calling gRPC services, so instead of 
> creating the channel in a PythonOperator each time, there should be a basic 
> GrpcHook to take care of it. The hook also needs to handle authentication.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[GitHub] [airflow] jgao54 merged pull request #4101: [AIRFLOW-3272] Add base grpc hook

2019-03-12 Thread GitBox
jgao54 merged pull request #4101: [AIRFLOW-3272] Add base grpc hook
URL: https://github.com/apache/airflow/pull/4101
 
 
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[jira] [Commented] (AIRFLOW-4073) Add supporting of templated file of Athena Operator

2019-03-12 Thread Bryan Yang (JIRA)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-4073?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16791200#comment-16791200
 ] 

Bryan Yang commented on AIRFLOW-4073:
-

Original Operator

{quote}class AWSAthenaOperator(BaseOperator):

"""
 An operator that submit presto query to athena.
 :param query: Presto to be run on athena. (templated)
 :type query: str
 :param database: Database to select. (templated)
 :type database: str
 :param output_location: s3 path to write the query results into. (templated)
 :type output_location: str
 :param aws_conn_id: aws connection to use
 :type aws_conn_id: str
 :param sleep_time: Time to wait between two consecutive call to check query 
status on athena
 :type sleep_time: int
 """

ui_color = '#44b5e2'
 template_fields = ('query', 'database', 'output_location')
{quote}
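
The proposed change would add a `template_ext` attribute next to the existing `template_fields`, so a `query` argument pointing at a file is read and templated by Airflow. A hedged sketch follows; `BaseOperator` machinery is omitted, the class name is illustrative, and `'.sql'` is an assumption based on what the Hive/BigQuery operators use.

```python
class AWSAthenaOperatorSketch:
    """Sketch of the proposed operator attributes, without BaseOperator."""

    # existing: these attributes are rendered through Jinja
    template_fields = ('query', 'database', 'output_location')
    # proposed addition: a query ending in .sql is treated as a file path,
    # read, and then templated
    template_ext = ('.sql',)

    def __init__(self, query, database, output_location):
        self.query = query
        self.database = database
        self.output_location = output_location
```

With `template_ext` declared, `AWSAthenaOperator(query='my_query.sql', ...)` would load the file's contents before rendering, matching the behavior of the other SQL operators.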

> Add supporting of templated file of Athena Operator
> ---
>
> Key: AIRFLOW-4073
> URL: https://issues.apache.org/jira/browse/AIRFLOW-4073
> Project: Apache Airflow
>  Issue Type: Improvement
>  Components: aws
>Reporter: Bryan Yang
>Assignee: Bryan Yang
>Priority: Minor
>   Original Estimate: 24h
>  Remaining Estimate: 24h
>
> Unlike other SQL operators (e.g. Hive, BigQuery), the AWS Athena operator 
> doesn't support template_ext yet. I'll add template_ext to the operator.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Created] (AIRFLOW-4073) Add supporting of templated file of Athena Operator

2019-03-12 Thread Bryan Yang (JIRA)
Bryan Yang created AIRFLOW-4073:
---

 Summary: Add supporting of templated file of Athena Operator
 Key: AIRFLOW-4073
 URL: https://issues.apache.org/jira/browse/AIRFLOW-4073
 Project: Apache Airflow
  Issue Type: Improvement
  Components: aws
Reporter: Bryan Yang
Assignee: Bryan Yang


Unlike other SQL operators (e.g. Hive, BigQuery), the AWS Athena operator doesn't 
support template_ext yet. I'll add template_ext to the operator.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[GitHub] [airflow] zhongjiajie commented on a change in pull request #4903: [WIP] [AIRFLOW-4069] Add Opsgenie Alert Hook and Operator

2019-03-12 Thread GitBox
zhongjiajie commented on a change in pull request #4903: [WIP] [AIRFLOW-4069] 
Add Opsgenie Alert Hook and Operator
URL: https://github.com/apache/airflow/pull/4903#discussion_r264948794
 
 

 ##
 File path: airflow/contrib/hooks/opsgenie_alert_hook.py
 ##
 @@ -0,0 +1,173 @@
+# -*- coding: utf-8 -*-
+#
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+#
+
+import json
+
+from airflow.hooks.http_hook import HttpHook
+from airflow.exceptions import AirflowException
+
+
+class OpsgenieAlertHook(HttpHook):
+"""
+This hook allows you to post alerts to Opsgenie.
+Accepts a connection that has an Opsgenie API key as the connection's 
password.
+Each Opsgenie API key can be pre-configured to a team integration.
+You can override these defaults in this hook.
+
+:param http_conn_id: Http connection ID with host as 
"https://api.opsgenie.com/"
+  and Opsgenie API key as the connection's password
+  (e.g. "eb243592-faa2-4ba2-a551q-1afdf565c889")
+:type http_conn_id: str
+:param message: The Message of the Opsgenie alert
+:type message: str
+:param alias: Client-defined identifier of the alert
+:type alias: str
+:param description: Description field of the alert
+:type description: str
+:param responders: Teams, users, escalations and schedules that
+  the alert will be routed to send notifications.
+:type responders: list[dict]
+:param visible_to: Teams and users that the alert will become visible
+  to without sending any notification.
+:type visible_to: list[dict]
+:param actions: Custom actions that will be available for the alert.
+:type actions: list[str]
+:param tags: Tags of the alert.
+:type tags: list[str]
+:param details: Map of key-value pairs to use as custom properties of the 
alert.
+:type details: dict
+:param entity: Entity field of the alert that is
+generally used to specify which domain alert is related to.
+:type entity: str
+:param source: Source field of the alert. Default value is
+IP address of the incoming request.
+:type source: str
+:param priority: Priority level of the alert. Default value is P3.
+:type priority: str
+:param user: Display name of the request owner.
+:type user: str
+:param note: Additional note that will be added while creating the alert.
+:type note: str
+:param proxy: Proxy to use to make the Opsgenie Alert API call
+:type proxy: str
+"""
+def __init__(self,
+ http_conn_id=None,
+ message="",
+ alias=None,
+ description=None,
+ responders=None,
+ visible_to=None,
+ actions=None,
+ tags=None,
+ details=None,
+ entity=None,
+ source=None,
+ priority=None,
+ user=None,
+ note=None,
+ proxy=None,
+ *args,
+ **kwargs
+ ):
+super(OpsgenieAlertHook, self).__init__(*args, **kwargs)
+self.http_conn_id = http_conn_id
+self.api_key = self._get_api_key(http_conn_id)
 
 Review comment:
   Agree with @mik-laj point, good catch


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [airflow] r39132 merged pull request #4835: [AIRFLOW-XXX] Improvements to formatted content in documentation

2019-03-12 Thread GitBox
r39132 merged pull request #4835: [AIRFLOW-XXX] Improvements to formatted 
content in documentation
URL: https://github.com/apache/airflow/pull/4835
 
 
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [airflow] jyothsnapk17 commented on a change in pull request #4835: [AIRFLOW-XXX] Improvements to formatted content in documentation

2019-03-12 Thread GitBox
jyothsnapk17 commented on a change in pull request #4835: [AIRFLOW-XXX] 
Improvements to formatted content in documentation
URL: https://github.com/apache/airflow/pull/4835#discussion_r264944793
 
 

 ##
 File path: docs/kubernetes.rst
 ##
 @@ -23,10 +23,20 @@ Kubernetes Executor
 
 The kubernetes executor is introduced in Apache Airflow 1.10.0. The Kubernetes 
executor will create a new pod for every task instance.
 
-Example helm charts are available at 
`scripts/ci/kubernetes/kube/{airflow,volumes,postgres}.yaml` in the source 
distribution. The volumes are optional and depend on your configuration. There 
are two volumes available:
+Example helm charts are available at 
``scripts/ci/kubernetes/kube/{airflow,volumes,postgres}.yaml`` in the source 
distribution.
+The volumes are optional and depend on your configuration. There are two 
volumes available:
 
-- Dags: by storing all the dags onto the persistent disks, all the workers can 
read the dags from there. Another option is using git-sync, before starting the 
container, a git pull of the dags repository will be performed and used 
throughout the lifecycle of the pod.
-- Logs: by storing the logs onto a persistent disk, all the logs will be 
available for all the workers and the webserver itself. If you don't configure 
this, the logs will be lost after the worker pods shuts down. Another option is 
to use S3/GCS/etc to store the logs.
+- **Dags**:
+
 
 Review comment:
   newlines required to render bullets as a nested list


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [airflow] jyothsnapk17 commented on a change in pull request #4835: [AIRFLOW-XXX] Improvements to formatted content in documentation

2019-03-12 Thread GitBox
jyothsnapk17 commented on a change in pull request #4835: [AIRFLOW-XXX] 
Improvements to formatted content in documentation
URL: https://github.com/apache/airflow/pull/4835#discussion_r264943098
 
 

 ##
 File path: docs/howto/operator/gcp/sql.rst
 ##
 @@ -639,5 +640,5 @@ Templating
 More information
 
 
-See `Google Cloud SQL Proxy documentation
+See `Google Cloud SQL documentation for more information
 
 Review comment:
   adding links for mysql and postgres documentation 


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [airflow] nritholtz commented on a change in pull request #4903: [WIP] [AIRFLOW-4069] Add Opsgenie Alert Hook and Operator

2019-03-12 Thread GitBox
nritholtz commented on a change in pull request #4903: [WIP] [AIRFLOW-4069] Add 
Opsgenie Alert Hook and Operator
URL: https://github.com/apache/airflow/pull/4903#discussion_r264942502
 
 

 ##
 File path: airflow/contrib/hooks/opsgenie_alert_hook.py
 ##
 @@ -0,0 +1,173 @@
+# -*- coding: utf-8 -*-
+#
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+#
+
+import json
+
+from airflow.hooks.http_hook import HttpHook
+from airflow.exceptions import AirflowException
+
+
+class OpsgenieAlertHook(HttpHook):
+"""
+This hook allows you to post alerts to Opsgenie.
+Accepts a connection that has an Opsgenie API key as the connection's 
password.
+Each Opsgenie API key can be pre-configured to a team integration.
+You can override these defaults in this hook.
+
+:param http_conn_id: Http connection ID with host as 
"https://api.opsgenie.com/"
+  and Opsgenie API key as the connection's password
+  (e.g. "eb243592-faa2-4ba2-a551q-1afdf565c889")
+:type http_conn_id: str
+:param message: The Message of the Opsgenie alert
+:type message: str
+:param alias: Client-defined identifier of the alert
+:type alias: str
+:param description: Description field of the alert
+:type description: str
+:param responders: Teams, users, escalations and schedules that
+  the alert will be routed to send notifications.
+:type responders: list[dict]
+:param visible_to: Teams and users that the alert will become visible
+  to without sending any notification.
+:type visible_to: list[dict]
+:param actions: Custom actions that will be available for the alert.
+:type actions: list[str]
+:param tags: Tags of the alert.
+:type tags: list[str]
+:param details: Map of key-value pairs to use as custom properties of the 
alert.
+:type details: dict
+:param entity: Entity field of the alert that is
+generally used to specify which domain alert is related to.
+:type entity: str
+:param source: Source field of the alert. Default value is
+IP address of the incoming request.
+:type source: str
+:param priority: Priority level of the alert. Default value is P3.
+:type priority: str
+:param user: Display name of the request owner.
+:type user: str
+:param note: Additional note that will be added while creating the alert.
+:type note: str
+:param proxy: Proxy to use to make the Opsgenie Alert API call
+:type proxy: str
+"""
+def __init__(self,
+ http_conn_id=None,
+ message="",
+ alias=None,
+ description=None,
+ responders=None,
+ visible_to=None,
+ actions=None,
+ tags=None,
+ details=None,
+ entity=None,
+ source=None,
+ priority=None,
+ user=None,
+ note=None,
+ proxy=None,
+ *args,
+ **kwargs
+ ):
+super(OpsgenieAlertHook, self).__init__(*args, **kwargs)
+self.http_conn_id = http_conn_id
+self.api_key = self._get_api_key(http_conn_id)
+self.message = message
+self.alias = alias
+self.description = description
+self.responders = responders
+self.visible_to = visible_to
+self.actions = actions
+self.tags = tags
+self.details = details
+self.entity = entity
+self.source = source
+self.priority = priority
+self.user = user
+self.note = note
+self.proxy = proxy
+
+def _get_api_key(self, http_conn_id):
+"""
+Given a conn_id, return the api_key to use
+:param http_conn_id: The conn_id provided
+:type http_conn_id: str
+:return: api_key (str) to use
+"""
+if http_conn_id:
+conn = self.get_connection(http_conn_id)
+return conn.password
+else:
+raise AirflowException('Cannot get api_key: No valid conn_id '
+  

[GitHub] [airflow] nritholtz commented on a change in pull request #4903: [WIP] [AIRFLOW-4069] Add Opsgenie Alert Hook and Operator

2019-03-12 Thread GitBox
nritholtz commented on a change in pull request #4903: [WIP] [AIRFLOW-4069] Add 
Opsgenie Alert Hook and Operator
URL: https://github.com/apache/airflow/pull/4903#discussion_r264942080
 
 

 ##
 File path: airflow/contrib/hooks/opsgenie_alert_hook.py
 ##
 @@ -0,0 +1,173 @@
+# -*- coding: utf-8 -*-
+#
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+#
+
+import json
+
+from airflow.hooks.http_hook import HttpHook
+from airflow.exceptions import AirflowException
+
+
+class OpsgenieAlertHook(HttpHook):
+    """
+    This hook allows you to post alerts to Opsgenie.
+    Accepts a connection that has an Opsgenie API key as the connection's password.
+    Each Opsgenie API key can be pre-configured to a team integration.
+    You can override these defaults in this hook.
+
+    :param http_conn_id: Http connection ID with host as "https://api.opsgenie.com/"
+        and Opsgenie API key as the connection's password
+        (e.g. "eb243592-faa2-4ba2-a551q-1afdf565c889")
+    :type http_conn_id: str
+    :param message: The Message of the Opsgenie alert
+    :type message: str
+    :param alias: Client-defined identifier of the alert
+    :type alias: str
+    :param description: Description field of the alert
+    :type description: str
+    :param responders: Teams, users, escalations and schedules that
+        the alert will be routed to send notifications.
+    :type responders: list[dict]
+    :param visible_to: Teams and users that the alert will become visible
+        to without sending any notification.
+    :type visible_to: list[dict]
+    :param actions: Custom actions that will be available for the alert.
+    :type actions: list[str]
+    :param tags: Tags of the alert.
+    :type tags: list[str]
+    :param details: Map of key-value pairs to use as custom properties of the alert.
+    :type details: dict
+    :param entity: Entity field of the alert that is
+        generally used to specify which domain the alert is related to.
+    :type entity: str
+    :param source: Source field of the alert. Default value is
+        IP address of the incoming request.
+    :type source: str
+    :param priority: Priority level of the alert. Default value is P3.
+    :type priority: str
+    :param user: Display name of the request owner.
+    :type user: str
+    :param note: Additional note that will be added while creating the alert.
+    :type note: str
+    :param proxy: Proxy to use to make the Opsgenie Alert API call
+    :type proxy: str
+    """
+    def __init__(self,
+                 http_conn_id=None,
+                 message="",
+                 alias=None,
+                 description=None,
+                 responders=None,
+                 visible_to=None,
+                 actions=None,
+                 tags=None,
+                 details=None,
+                 entity=None,
+                 source=None,
+                 priority=None,
+                 user=None,
+                 note=None,
+                 proxy=None,
+                 *args,
+                 **kwargs
+                 ):
+        super(OpsgenieAlertHook, self).__init__(*args, **kwargs)
+        self.http_conn_id = http_conn_id
+        self.api_key = self._get_api_key(http_conn_id)
 
 Review comment:
   Moved out of constructor
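The pattern under discussion (resolve a connection by ID and use its password as the API key, failing fast when no conn_id is given) can be sketched stand-alone. This is an illustrative toy, not Airflow's real Connection machinery; `StubConnection`, `CONNECTIONS`, and `get_api_key` are hypothetical names:

```python
class StubConnection:
    """Hypothetical stand-in for airflow.models.Connection."""
    def __init__(self, password):
        self.password = password


# Hypothetical in-memory connection store; Airflow would read this
# from its metadata database instead.
CONNECTIONS = {"opsgenie_default": StubConnection("fake-api-key")}


def get_api_key(http_conn_id):
    """Mirror of the hook's _get_api_key logic: no conn_id means no key."""
    if not http_conn_id:
        raise ValueError("Cannot get api_key: No valid conn_id supplied")
    return CONNECTIONS[http_conn_id].password
```

Pulling the lookup out of the constructor (as the review suggests) would let the hook be instantiated without touching the connection store until the key is actually needed.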


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [airflow] mik-laj commented on issue #4892: [AIRFLOW-XXX] Enforce order in imports

2019-03-12 Thread GitBox
mik-laj commented on issue #4892: [AIRFLOW-XXX]  Enforce order in imports 
URL: https://github.com/apache/airflow/pull/4892#issuecomment-472229446
 
 
   @feng-tao LGPL forces the source code to be made available when we modify 
it. Using precompiled applications is allowed without major restrictions. 
This is my opinion, and I'm not a lawyer.
   
   I did not use the tool because I wanted to use flake8. I did not want to add 
another step to the build process, but only to expand the current one. 
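Since flake8-import-order plugs into flake8 rather than running as a separate tool, the ordering rules live in flake8's own configuration. A hypothetical fragment (the style and package names here are illustrative, not taken from this PR):

```ini
; e.g. in tox.ini or setup.cfg
[flake8]
import-order-style = google
application-import-names = airflow, tests
```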




[GitHub] [airflow] feng-tao commented on issue #4892: [AIRFLOW-XXX] Enforce order in imports

2019-03-12 Thread GitBox
feng-tao commented on issue #4892: [AIRFLOW-XXX]  Enforce order in imports 
URL: https://github.com/apache/airflow/pull/4892#issuecomment-472225586
 
 
   Actually, an LGPL library doesn't seem to be allowed 
(https://issues.apache.org/jira/browse/LEGAL-192).




[GitHub] [airflow] feng-tao edited a comment on issue #4892: [AIRFLOW-XXX] Enforce order in imports

2019-03-12 Thread GitBox
feng-tao edited a comment on issue #4892: [AIRFLOW-XXX]  Enforce order in 
imports 
URL: https://github.com/apache/airflow/pull/4892#issuecomment-472225265
 
 
   Thanks @mik-laj for the PR. Your contributions to improving the project are 
always appreciated.
   
   Two questions:
   1. Will the license (LGPL) for the new lib be an issue? I will defer to 
Ash or Bolke, who are experts on license matters.
   2. I wonder whether you use any linting tool to do the ordering?




[GitHub] [airflow] feng-tao commented on issue #4892: [AIRFLOW-XXX] Enforce order in imports

2019-03-12 Thread GitBox
feng-tao commented on issue #4892: [AIRFLOW-XXX]  Enforce order in imports 
URL: https://github.com/apache/airflow/pull/4892#issuecomment-472225265
 
 
   Thanks @mik-laj for the PR. Your contributions to improving the project are 
always appreciated.
   
   Two questions:
   1. Will the license (LGPL) for the new lib be an issue? I will defer to 
Ash or Bolke, who are experts on license matters.
   2. I wonder whether you use any linting tool to do the ordering?




[GitHub] [airflow] feng-tao commented on a change in pull request #4892: [AIRFLOW-XXX] Enforce order in imports

2019-03-12 Thread GitBox
feng-tao commented on a change in pull request #4892: [AIRFLOW-XXX]  Enforce 
order in imports 
URL: https://github.com/apache/airflow/pull/4892#discussion_r264928852
 
 

 ##
 File path: setup.py
 ##
 @@ -234,6 +233,8 @@ def write_version(filename=os.path.join(*['airflow',
 
 devel = [
 'click==6.7',
+'flake8-import-order>=0.18',
 
 Review comment:
   The license of https://github.com/PyCQA/flake8-import-order is LGPL. Will 
this be an issue?
   cc @bolkedebruin @ashb 




[GitHub] [airflow] mik-laj commented on issue #4892: [AIRFLOW-XXX] Enforce order in imports

2019-03-12 Thread GitBox
mik-laj commented on issue #4892: [AIRFLOW-XXX]  Enforce order in imports 
URL: https://github.com/apache/airflow/pull/4892#issuecomment-472224357
 
 
   @feng-tao I did not notice it. I did not delete it when Travis turned green. 
Thanks for taking care of the matter.




[GitHub] [airflow] feng-tao commented on issue #4892: [AIRFLOW-XXX] [WIP] Enforce order in imports

2019-03-12 Thread GitBox
feng-tao commented on issue #4892: [AIRFLOW-XXX] [WIP] Enforce order in imports 
URL: https://github.com/apache/airflow/pull/4892#issuecomment-472224111
 
 
   @mik-laj , sure, your pr title is still [WIP] which is the reason I am 
asking.




[GitHub] [airflow] mik-laj commented on issue #4892: [AIRFLOW-XXX] [WIP] Enforce order in imports

2019-03-12 Thread GitBox
mik-laj commented on issue #4892: [AIRFLOW-XXX] [WIP] Enforce order in imports 
URL: https://github.com/apache/airflow/pull/4892#issuecomment-472223881
 
 
   Linting is included on Travis, so if it were not finished it would not be 
green.




[GitHub] [airflow] r39132 commented on a change in pull request #4835: [AIRFLOW-XXX] Improvements to formatted content in documentation

2019-03-12 Thread GitBox
r39132 commented on a change in pull request #4835: [AIRFLOW-XXX] Improvements 
to formatted content in documentation
URL: https://github.com/apache/airflow/pull/4835#discussion_r264918198
 
 

 ##
 File path: docs/howto/operator/gcp/sql.rst
 ##
 @@ -639,5 +640,5 @@ Templating
 More information
 
 
-See `Google Cloud SQL Proxy documentation
+See `Google Cloud SQL documentation for more information
 
 Review comment:
   Same here




[GitHub] [airflow] r39132 commented on a change in pull request #4835: [AIRFLOW-XXX] Improvements to formatted content in documentation

2019-03-12 Thread GitBox
r39132 commented on a change in pull request #4835: [AIRFLOW-XXX] Improvements 
to formatted content in documentation
URL: https://github.com/apache/airflow/pull/4835#discussion_r264925629
 
 

 ##
 File path: docs/timezone.rst
 ##
 @@ -34,7 +34,7 @@ happen. (The pendulum and pytz documentation discusses these 
issues in greater d
 for a simple DAG, but it’s a problem if you are in, for example, financial 
services where you have end of day
 deadlines to meet.
 
-The time zone is set in `airflow.cfg`. By default it is set to utc, but you 
change it to use the system’s settings or
+The time zone is set in ``airflow.cfg``. By default it is set to utc, but you 
change it to use the system’s settings or
 
 Review comment:
   We need to remove the new line here.
   




[GitHub] [airflow] r39132 commented on a change in pull request #4835: [AIRFLOW-XXX] Improvements to formatted content in documentation

2019-03-12 Thread GitBox
r39132 commented on a change in pull request #4835: [AIRFLOW-XXX] Improvements 
to formatted content in documentation
URL: https://github.com/apache/airflow/pull/4835#discussion_r264918967
 
 

 ##
 File path: docs/tutorial.rst
 ##
 @@ -368,8 +368,10 @@ Testing
 Running the Script
 ''
 
-Time to run some tests. First let's make sure that the pipeline
-parses. Let's assume we're saving the code from the previous step in
+Time to run some tests. First, let's make sure the pipeline
+is parsed successfully.
+
 
 Review comment:
   Remove this extra newline




[GitHub] [airflow] r39132 commented on a change in pull request #4835: [AIRFLOW-XXX] Improvements to formatted content in documentation

2019-03-12 Thread GitBox
r39132 commented on a change in pull request #4835: [AIRFLOW-XXX] Improvements 
to formatted content in documentation
URL: https://github.com/apache/airflow/pull/4835#discussion_r264918097
 
 

 ##
 File path: docs/howto/operator/gcp/spanner.rst
 ##
 @@ -272,5 +272,5 @@ Templating
 More information
 
 
-See `Google Cloud Spanner API documentation for instance delete
+See `Google Cloud Spanner API documentation to delete an instance
 
 Review comment:
   Change this one as well!




[GitHub] [airflow] r39132 commented on a change in pull request #4835: [AIRFLOW-XXX] Improvements to formatted content in documentation

2019-03-12 Thread GitBox
r39132 commented on a change in pull request #4835: [AIRFLOW-XXX] Improvements 
to formatted content in documentation
URL: https://github.com/apache/airflow/pull/4835#discussion_r264917282
 
 

 ##
 File path: docs/howto/check-health.rst
 ##
 @@ -33,15 +33,17 @@ To check the health status of your Airflow instance, you 
can simply access the e
 }
   }
 
-* The ``status`` of each component can be either "healthy" or "unhealthy".
+* The ``status`` of each component can be either "healthy" or "unhealthy"
 
-* The status of ``metadatabase`` is depending on whether a valid 
connection can be initiated
-  with the database backend of Airflow.
-* The status of ``scheduler`` is depending on when the latest scheduler 
heartbeat happened. If the latest
-  scheduler heartbeat happened 30 seconds (default value) earlier than the 
current time, scheduler component is
-  considered unhealthy. You can also specify this threshold value by 
changing
-  ``scheduler_health_check_threshold`` in ``scheduler`` section of the 
``airflow.cfg`` file.
+  * The status of ``metadatabase`` depends on whether a valid connection can 
be initiated with the database
+
+  * The status of ``scheduler`` depends on when the latest scheduler heartbeat 
was received
+
+* If the last heartbeat was received more than 30 seconds (default value) 
earlier than the current time, the scheduler is
 
 Review comment:
   There should not be newlines at line 39, 41, 42-43, 46
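The heartbeat rule described in this diff (scheduler considered unhealthy when its latest heartbeat is older than `scheduler_health_check_threshold`, 30 seconds by default) can be sketched as a client-side check of the `/health` payload. The helper name and exact timestamp format below are assumptions for illustration, not Airflow's API:

```python
import json
from datetime import datetime, timedelta


def scheduler_is_healthy(payload, now, threshold_seconds=30):
    """Return True when the latest scheduler heartbeat is within the threshold."""
    heartbeat = datetime.strptime(
        payload["scheduler"]["latest_scheduler_heartbeat"],
        "%Y-%m-%dT%H:%M:%S",  # assumed format; real payloads may differ
    )
    return now - heartbeat <= timedelta(seconds=threshold_seconds)


# Example payload shaped like the /health response discussed above.
health = json.loads(
    '{"metadatabase": {"status": "healthy"},'
    ' "scheduler": {"status": "healthy",'
    ' "latest_scheduler_heartbeat": "2019-03-12T10:00:00"}}'
)
print(scheduler_is_healthy(health, datetime(2019, 3, 12, 10, 0, 20)))
```

With a heartbeat 20 seconds old the check passes; at 60 seconds it would fail under the default threshold.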




[GitHub] [airflow] r39132 commented on a change in pull request #4835: [AIRFLOW-XXX] Improvements to formatted content in documentation

2019-03-12 Thread GitBox
r39132 commented on a change in pull request #4835: [AIRFLOW-XXX] Improvements 
to formatted content in documentation
URL: https://github.com/apache/airflow/pull/4835#discussion_r264925756
 
 

 ##
 File path: docs/lineage.rst
 ##
 @@ -64,10 +64,19 @@ works.
 run_this.set_downstream(run_this_last)
 
 
-Tasks take the parameters `inlets` and `outlets`. Inlets can be manually 
defined by a list of dataset `{"datasets":
-[dataset1, dataset2]}` or can be configured to look for outlets from upstream 
tasks `{"task_ids": ["task_id1", "task_id2"]}`
-or can be configured to pick up outlets from direct upstream tasks `{"auto": 
True}` or a combination of them. Outlets 
-are defined as list of dataset `{"datasets": [dataset1, dataset2]}`. Any 
fields for the dataset are templated with 
+Tasks take the parameters `inlets` and `outlets`.
+
 
 Review comment:
   Remove the new lines below




[GitHub] [airflow] r39132 commented on a change in pull request #4835: [AIRFLOW-XXX] Improvements to formatted content in documentation

2019-03-12 Thread GitBox
r39132 commented on a change in pull request #4835: [AIRFLOW-XXX] Improvements 
to formatted content in documentation
URL: https://github.com/apache/airflow/pull/4835#discussion_r264926576
 
 

 ##
 File path: docs/howto/write-logs.rst
 ##
 @@ -127,12 +128,12 @@ example:
 #. Verify that logs are showing up for newly executed tasks in the bucket 
you've defined.
 #. Verify that the Google Cloud Storage viewer is working in the UI. Pull up a 
newly executed task, and verify that you see something like:
 
-.. code-block:: bash
+.. code-block:: bash
 
-*** Reading remote log from gs:///example_bash_operator/run_this_last/2017-10-03T00:00:00/16.log.
-[2017-10-03 21:57:50,056] {cli.py:377} INFO - Running on host 
chrisr-00532
-[2017-10-03 21:57:50,093] {base_task_runner.py:115} INFO - Running: 
['bash', '-c', u'airflow run example_bash_operator run_this_last 
2017-10-03T00:00:00 --job_id 47 --raw -sd 
DAGS_FOLDER/example_dags/example_bash_operator.py']
-[2017-10-03 21:57:51,264] {base_task_runner.py:98} INFO - Subtask: 
[2017-10-03 21:57:51,263] {__init__.py:45} INFO - Using executor 
SequentialExecutor
-[2017-10-03 21:57:51,306] {base_task_runner.py:98} INFO - Subtask: 
[2017-10-03 21:57:51,306] {models.py:186} INFO - Filling up the DagBag from 
/airflow/dags/example_dags/example_bash_operator.py
+  *** Reading remote log from gs:///example_bash_operator/run_this_last/2017-10-03T00:00:00/16.log.
+  [2017-10-03 21:57:50,056] {cli.py:377} INFO - Running on host chrisr-00532
+  [2017-10-03 21:57:50,093] {base_task_runner.py:115} INFO - Running: ['bash', 
'-c', u'airflow run example_bash_operator run_this_last 2017-10-03T00:00:00 
--job_id 47 --raw -sd DAGS_FOLDER/example_dags/example_bash_operator.py']
+  [2017-10-03 21:57:51,264] {base_task_runner.py:98} INFO - Subtask: 
[2017-10-03 21:57:51,263] {__init__.py:45} INFO - Using executor 
SequentialExecutor
+  [2017-10-03 21:57:51,306] {base_task_runner.py:98} INFO - Subtask: 
[2017-10-03 21:57:51,306] {models.py:186} INFO - Filling up the DagBag from 
/airflow/dags/example_dags/example_bash_operator.py
 
-Note the top line that says it's reading from the remote log file.
+**Note** that the first line that says it's reading from the remote log file.
 
 Review comment:
   Readability ... `that .. that`? Remove the second `that`.




[GitHub] [airflow] r39132 commented on a change in pull request #4835: [AIRFLOW-XXX] Improvements to formatted content in documentation

2019-03-12 Thread GitBox
r39132 commented on a change in pull request #4835: [AIRFLOW-XXX] Improvements 
to formatted content in documentation
URL: https://github.com/apache/airflow/pull/4835#discussion_r264926325
 
 

 ##
 File path: docs/kubernetes.rst
 ##
 @@ -23,10 +23,20 @@ Kubernetes Executor
 
 The kubernetes executor is introduced in Apache Airflow 1.10.0. The Kubernetes 
executor will create a new pod for every task instance.
 
-Example helm charts are available at 
`scripts/ci/kubernetes/kube/{airflow,volumes,postgres}.yaml` in the source 
distribution. The volumes are optional and depend on your configuration. There 
are two volumes available:
+Example helm charts are available at 
``scripts/ci/kubernetes/kube/{airflow,volumes,postgres}.yaml`` in the source 
distribution.
+The volumes are optional and depend on your configuration. There are two 
volumes available:
 
-- Dags: by storing all the dags onto the persistent disks, all the workers can 
read the dags from there. Another option is using git-sync, before starting the 
container, a git pull of the dags repository will be performed and used 
throughout the lifecycle of the pod.
-- Logs: by storing the logs onto a persistent disk, all the logs will be 
available for all the workers and the webserver itself. If you don't configure 
this, the logs will be lost after the worker pods shuts down. Another option is 
to use S3/GCS/etc to store the logs.
+- **Dags**:
+
 
 Review comment:
   New lines here. Are they needed?




[GitHub] [airflow] elvijs commented on a change in pull request #4603: [AIRFLOW-3717] Use DAG context managers in documentation examples

2019-03-12 Thread GitBox
elvijs commented on a change in pull request #4603: [AIRFLOW-3717] Use DAG 
context managers in documentation examples
URL: https://github.com/apache/airflow/pull/4603#discussion_r264925236
 
 

 ##
 File path: docs/concepts.rst
 ##
 @@ -759,25 +759,22 @@ For example, consider the following dag:
   from airflow.utils.trigger_rule import TriggerRule
 
 
-  dag = DAG(
-  dag_id='latest_only_with_trigger',
-  schedule_interval=dt.timedelta(hours=4),
-  start_date=dt.datetime(2016, 9, 20),
-  )
+  with DAG(dag_id='latest_only_with_trigger',
+   schedule_interval=dt.timedelta(hours=4),
+   start_date=dt.datetime(2016, 9, 20)) as dag:
 
 Review comment:
   No, we don't, good spot! I'll fix when I get the time (probably this weekend)
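The `with DAG(...) as dag:` form in this diff relies on the DAG object being a context manager that registers itself while the block is open, so tasks created inside attach to it automatically. A toy illustration of that mechanism (these stand-in classes are not Airflow's real ones):

```python
class ToyDAG:
    """Minimal stand-in showing the context-manager pattern behind DAG."""
    _active = None  # the DAG currently open in a `with` block

    def __init__(self, dag_id):
        self.dag_id = dag_id
        self.tasks = []

    def __enter__(self):
        ToyDAG._active = self
        return self

    def __exit__(self, *exc_info):
        ToyDAG._active = None


class ToyTask:
    """Attaches itself to whichever ToyDAG is currently open."""
    def __init__(self, task_id):
        self.task_id = task_id
        if ToyDAG._active is not None:
            ToyDAG._active.tasks.append(self)


with ToyDAG(dag_id="latest_only_with_trigger") as dag:
    ToyTask("latest_only")
    ToyTask("task1")

print([t.task_id for t in dag.tasks])
```

This is why the tutorial examples no longer need to pass `dag=dag` to every operator inside the `with` block.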




[GitHub] [airflow] mik-laj commented on a change in pull request #4903: [WIP] [AIRFLOW-4069] Add Opsgenie Alert Hook and Operator

2019-03-12 Thread GitBox
mik-laj commented on a change in pull request #4903: [WIP] [AIRFLOW-4069] Add 
Opsgenie Alert Hook and Operator
URL: https://github.com/apache/airflow/pull/4903#discussion_r264925228
 
 

 ##
 File path: airflow/contrib/hooks/opsgenie_alert_hook.py
 ##
 @@ -0,0 +1,173 @@
+# -*- coding: utf-8 -*-
+#
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+#
+
+import json
+
+from airflow.hooks.http_hook import HttpHook
+from airflow.exceptions import AirflowException
+
+
+class OpsgenieAlertHook(HttpHook):
+    """
+    This hook allows you to post alerts to Opsgenie.
+    Accepts a connection that has an Opsgenie API key as the connection's password.
+    Each Opsgenie API key can be pre-configured to a team integration.
+    You can override these defaults in this hook.
+
+    :param http_conn_id: Http connection ID with host as "https://api.opsgenie.com/"
+        and Opsgenie API key as the connection's password
+        (e.g. "eb243592-faa2-4ba2-a551q-1afdf565c889")
+    :type http_conn_id: str
+    :param message: The Message of the Opsgenie alert
+    :type message: str
+    :param alias: Client-defined identifier of the alert
+    :type alias: str
+    :param description: Description field of the alert
+    :type description: str
+    :param responders: Teams, users, escalations and schedules that
+        the alert will be routed to send notifications.
+    :type responders: list[dict]
+    :param visible_to: Teams and users that the alert will become visible
+        to without sending any notification.
+    :type visible_to: list[dict]
+    :param actions: Custom actions that will be available for the alert.
+    :type actions: list[str]
+    :param tags: Tags of the alert.
+    :type tags: list[str]
+    :param details: Map of key-value pairs to use as custom properties of the alert.
+    :type details: dict
+    :param entity: Entity field of the alert that is
+        generally used to specify which domain the alert is related to.
+    :type entity: str
+    :param source: Source field of the alert. Default value is
+        IP address of the incoming request.
+    :type source: str
+    :param priority: Priority level of the alert. Default value is P3.
+    :type priority: str
+    :param user: Display name of the request owner.
+    :type user: str
+    :param note: Additional note that will be added while creating the alert.
+    :type note: str
+    :param proxy: Proxy to use to make the Opsgenie Alert API call
+    :type proxy: str
+    """
+    def __init__(self,
+                 http_conn_id=None,
+                 message="",
+                 alias=None,
+                 description=None,
+                 responders=None,
+                 visible_to=None,
+                 actions=None,
+                 tags=None,
+                 details=None,
+                 entity=None,
+                 source=None,
+                 priority=None,
+                 user=None,
+                 note=None,
+                 proxy=None,
+                 *args,
+                 **kwargs
+                 ):
+        super(OpsgenieAlertHook, self).__init__(*args, **kwargs)
+        self.http_conn_id = http_conn_id
+        self.api_key = self._get_api_key(http_conn_id)
+        self.message = message
+        self.alias = alias
+        self.description = description
+        self.responders = responders
+        self.visible_to = visible_to
+        self.actions = actions
+        self.tags = tags
+        self.details = details
+        self.entity = entity
+        self.source = source
+        self.priority = priority
+        self.user = user
+        self.note = note
+        self.proxy = proxy
+
+    def _get_api_key(self, http_conn_id):
+        """
+        Given a conn_id, return the api_key to use
+        :param http_conn_id: The conn_id provided
+        :type http_conn_id: str
+        :return: api_key (str) to use
+        """
+        if http_conn_id:
+            conn = self.get_connection(http_conn_id)
+            return conn.password
+        else:
+            raise AirflowException('Cannot get api_key: No valid conn_id '
+

[GitHub] [airflow] feng-tao commented on issue #4892: [AIRFLOW-XXX] [WIP] Enforce order in imports

2019-03-12 Thread GitBox
feng-tao commented on issue #4892: [AIRFLOW-XXX] [WIP] Enforce order in imports 
URL: https://github.com/apache/airflow/pull/4892#issuecomment-472221165
 
 
   Tests pass now. Is the PR ready to go, or still WIP?




[GitHub] [airflow] codecov-io commented on issue #4892: [AIRFLOW-XXX] [WIP] Enforce order in imports

2019-03-12 Thread GitBox
codecov-io commented on issue #4892: [AIRFLOW-XXX] [WIP] Enforce order in 
imports 
URL: https://github.com/apache/airflow/pull/4892#issuecomment-472220922
 
 
   # [Codecov](https://codecov.io/gh/apache/airflow/pull/4892?src=pr=h1) 
Report
   > Merging 
[#4892](https://codecov.io/gh/apache/airflow/pull/4892?src=pr=desc) into 
[master](https://codecov.io/gh/apache/airflow/commit/4fb91b0b674a187447e18e6171df7be8a85ac22c?src=pr=desc)
 will **increase** coverage by `<.01%`.
   > The diff coverage is `84.64%`.
   
   [![Impacted file tree 
graph](https://codecov.io/gh/apache/airflow/pull/4892/graphs/tree.svg?width=650=WdLKlKHOAU=150=pr)](https://codecov.io/gh/apache/airflow/pull/4892?src=pr=tree)
   
   ```diff
   @@            Coverage Diff            @@
   ##           master    #4892    +/-   ##
   ========================================
   + Coverage   75.53%   75.54%   +<.01%
   ========================================
     Files         450      450
     Lines       29034    29021      -13
   ========================================
   - Hits        21932    21925       -7
   + Misses       7102     7096       -6
   ```
   
   
   | [Impacted Files](https://codecov.io/gh/apache/airflow/pull/4892?src=pr=tree) | Coverage Δ | |
   |---|---|---|
   | [airflow/www/gunicorn\_config.py](https://codecov.io/gh/apache/airflow/pull/4892/diff?src=pr=tree#diff-YWlyZmxvdy93d3cvZ3VuaWNvcm5fY29uZmlnLnB5) | `0% <ø> (ø)` | :arrow_up: |
   | [airflow/contrib/hooks/redis\_hook.py](https://codecov.io/gh/apache/airflow/pull/4892/diff?src=pr=tree#diff-YWlyZmxvdy9jb250cmliL2hvb2tzL3JlZGlzX2hvb2sucHk=) | `100% <ø> (ø)` | :arrow_up: |
   | [airflow/contrib/hooks/spark\_jdbc\_script.py](https://codecov.io/gh/apache/airflow/pull/4892/diff?src=pr=tree#diff-YWlyZmxvdy9jb250cmliL2hvb2tzL3NwYXJrX2pkYmNfc2NyaXB0LnB5) | `0% <ø> (ø)` | :arrow_up: |
   | [airflow/contrib/hooks/aws\_athena\_hook.py](https://codecov.io/gh/apache/airflow/pull/4892/diff?src=pr=tree#diff-YWlyZmxvdy9jb250cmliL2hvb2tzL2F3c19hdGhlbmFfaG9vay5weQ==) | `60% <ø> (ø)` | :arrow_up: |
   | [airflow/contrib/sensors/weekday\_sensor.py](https://codecov.io/gh/apache/airflow/pull/4892/diff?src=pr=tree#diff-YWlyZmxvdy9jb250cmliL3NlbnNvcnMvd2Vla2RheV9zZW5zb3IucHk=) | `96.15% <ø> (ø)` | :arrow_up: |
   | [airflow/contrib/hooks/gcp\_compute\_hook.py](https://codecov.io/gh/apache/airflow/pull/4892/diff?src=pr=tree#diff-YWlyZmxvdy9jb250cmliL2hvb2tzL2djcF9jb21wdXRlX2hvb2sucHk=) | `78.65% <ø> (ø)` | :arrow_up: |
   | [...ow/contrib/auth/backends/github\_enterprise\_auth.py](https://codecov.io/gh/apache/airflow/pull/4892/diff?src=pr=tree#diff-YWlyZmxvdy9jb250cmliL2F1dGgvYmFja2VuZHMvZ2l0aHViX2VudGVycHJpc2VfYXV0aC5weQ==) | `0% <ø> (ø)` | :arrow_up: |
   | [airflow/utils/log/file\_task\_handler.py](https://codecov.io/gh/apache/airflow/pull/4892/diff?src=pr=tree#diff-YWlyZmxvdy91dGlscy9sb2cvZmlsZV90YXNrX2hhbmRsZXIucHk=) | `89.41% <ø> (ø)` | :arrow_up: |
   | [airflow/contrib/kubernetes/worker\_configuration.py](https://codecov.io/gh/apache/airflow/pull/4892/diff?src=pr=tree#diff-YWlyZmxvdy9jb250cmliL2t1YmVybmV0ZXMvd29ya2VyX2NvbmZpZ3VyYXRpb24ucHk=) | `93.63% <ø> (ø)` | :arrow_up: |
   | [airflow/contrib/hooks/spark\_jdbc\_hook.py](https://codecov.io/gh/apache/airflow/pull/4892/diff?src=pr=tree#diff-YWlyZmxvdy9jb250cmliL2hvb2tzL3NwYXJrX2pkYmNfaG9vay5weQ==) | `91.95% <ø> (ø)` | :arrow_up: |
   | ... and [238 more](https://codecov.io/gh/apache/airflow/pull/4892/diff?src=pr=tree-more) | |
   
   --
   
   [Continue to review full report at 
Codecov](https://codecov.io/gh/apache/airflow/pull/4892?src=pr=continue).
   > **Legend** - [Click here to learn 
more](https://docs.codecov.io/docs/codecov-delta)
   > `Δ = absolute <relative> (impact)`, `ø = not affected`, `? = missing data`
   > Powered by 
[Codecov](https://codecov.io/gh/apache/airflow/pull/4892?src=pr=footer). 
Last update 
[4fb91b0...34936cf](https://codecov.io/gh/apache/airflow/pull/4892?src=pr=lastupdated).
 Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments).
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [airflow] mik-laj commented on a change in pull request #4903: [WIP] [AIRFLOW-4069] Add Opsgenie Alert Hook and Operator

2019-03-12 Thread GitBox
mik-laj commented on a change in pull request #4903: [WIP] [AIRFLOW-4069] Add 
Opsgenie Alert Hook and Operator
URL: https://github.com/apache/airflow/pull/4903#discussion_r264924298
 
 

 ##
 File path: airflow/contrib/hooks/opsgenie_alert_hook.py
 ##
 @@ -0,0 +1,173 @@
+# -*- coding: utf-8 -*-
+#
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+#
+
+import json
+
+from airflow.hooks.http_hook import HttpHook
+from airflow.exceptions import AirflowException
+
+
+class OpsgenieAlertHook(HttpHook):
+    """
+    This hook allows you to post alerts to Opsgenie.
+    Accepts a connection that has an Opsgenie API key as the connection's password.
+    Each Opsgenie API key can be pre-configured to a team integration.
+    You can override these defaults in this hook.
+
+    :param http_conn_id: Http connection ID with host as "https://api.opsgenie.com/"
+        and Opsgenie API key as the connection's password
+        (e.g. "eb243592-faa2-4ba2-a551q-1afdf565c889")
+    :type http_conn_id: str
+    :param message: The Message of the Opsgenie alert
+    :type message: str
+    :param alias: Client-defined identifier of the alert
+    :type alias: str
+    :param description: Description field of the alert
+    :type description: str
+    :param responders: Teams, users, escalations and schedules that
+        the alert will be routed to send notifications.
+    :type responders: list[dict]
+    :param visible_to: Teams and users that the alert will become visible
+        to without sending any notification.
+    :type visible_to: list[dict]
+    :param actions: Custom actions that will be available for the alert.
+    :type actions: list[str]
+    :param tags: Tags of the alert.
+    :type tags: list[str]
+    :param details: Map of key-value pairs to use as custom properties of the alert.
+    :type details: dict
+    :param entity: Entity field of the alert that is
+        generally used to specify which domain alert is related to.
+    :type entity: str
+    :param source: Source field of the alert. Default value is
+        IP address of the incoming request.
+    :type source: str
+    :param priority: Priority level of the alert. Default value is P3.
+    :type priority: str
+    :param user: Display name of the request owner.
+    :type user: str
+    :param note: Additional note that will be added while creating the alert.
+    :type note: str
+    :param proxy: Proxy to use to make the Opsgenie Alert API call
+    :type proxy: str
+    """
+    def __init__(self,
+                 http_conn_id=None,
+                 message="",
+                 alias=None,
+                 description=None,
+                 responders=None,
+                 visible_to=None,
+                 actions=None,
+                 tags=None,
+                 details=None,
+                 entity=None,
+                 source=None,
+                 priority=None,
+                 user=None,
+                 note=None,
+                 proxy=None,
+                 *args,
+                 **kwargs
+                 ):
+        super(OpsgenieAlertHook, self).__init__(*args, **kwargs)
+        self.http_conn_id = http_conn_id
+        self.api_key = self._get_api_key(http_conn_id)
 
 Review comment:
   This will cause a performance problem. The operator is created every few 
seconds while the DAG is parsed, so your code will issue a new database query 
every few seconds to read the connection data, even though in most cases that 
value will never be used. The constructor should be light and should not create 
any new objects. My team currently has a similar problem with the operators for 
GCP: in those operators, we create a hook in the constructor.
   CC: @zhongjiajie 
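
   The lightweight-constructor advice above can be illustrated with a small 
sketch (all names here are hypothetical, not Airflow's actual API): the 
expensive connection lookup moves out of `__init__` into a lazily evaluated 
property, so constructing the object during DAG parsing costs nothing.

   ```python
   class LazyHookExample:
       """Hypothetical sketch: keep __init__ light, defer expensive lookups.

       The simulated connection lookup runs only when the value is first
       needed, not every time the object is constructed during DAG parsing.
       """

       def __init__(self, http_conn_id=None):
           self.http_conn_id = http_conn_id
           self._api_key = None  # nothing fetched at construction time

       @property
       def api_key(self):
           # Stand-in for the expensive database query; a real hook would
           # resolve the connection here instead of in __init__.
           if self._api_key is None:
               self._api_key = "key-for-%s" % self.http_conn_id
           return self._api_key

   hook = LazyHookExample("opsgenie_default")
   print(hook._api_key)  # still None: the constructor did no work
   print(hook.api_key)   # first access triggers the (simulated) lookup
   ```

   The same pattern (a cached property, or a `get_hook()` factory called only 
inside `execute`) keeps repeated DAG parsing cheap.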


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [airflow] vsoch commented on issue #4846: [AIRFLOW-4030] adding start to singularity for airflow

2019-03-12 Thread GitBox
vsoch commented on issue #4846: [AIRFLOW-4030] adding start to singularity for 
airflow
URL: https://github.com/apache/airflow/pull/4846#issuecomment-472213759
 
 
   okay, flake8 is green! Phew.


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [airflow] nritholtz commented on issue #4903: [WIP] [AIRFLOW-4069] Add Opsgenie Alert Hook and Operator

2019-03-12 Thread GitBox
nritholtz commented on issue #4903: [WIP] [AIRFLOW-4069] Add Opsgenie Alert 
Hook and Operator
URL: https://github.com/apache/airflow/pull/4903#issuecomment-472213740
 
 
   Added operator for the hook, and also converted the `api_key` field to a 
connection password. This way, the `BaseHook` `debug_info` call will obfuscate 
the key in the logs.
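
   A minimal sketch of the obfuscation idea described above, assuming a plain 
dict stands in for the connection object (`masked_debug_info` is illustrative, 
not the real `BaseHook.debug_info` implementation):

   ```python
   def masked_debug_info(conn):
       """Return connection details with the password obfuscated.

       Hypothetical helper: any field named 'password' is replaced with
       '***' before the details are logged, so the API key never appears
       in plain text.
       """
       return {k: ("***" if k == "password" else v) for k, v in conn.items()}

   conn = {"host": "https://api.opsgenie.com/", "password": "eb243592-faa2"}
   print(masked_debug_info(conn))  # password field shows '***'
   ```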


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [airflow] vsoch commented on issue #4846: [AIRFLOW-4030] adding start to singularity for airflow

2019-03-12 Thread GitBox
vsoch commented on issue #4846: [AIRFLOW-4030] adding start to singularity for 
airflow
URL: https://github.com/apache/airflow/pull/4846#issuecomment-472208249
 
 
   Strange, now it's complaining differently about the imports of Callable and 
Pod, etc.
   
   ```
   ./airflow/contrib/utils/gcp_field_validator.py:212:9: F821 undefined name 'Callable'
   ./airflow/contrib/kubernetes/pod_launcher.py:73:9: F821 undefined name 'Pod'
   ./airflow/contrib/kubernetes/pod_launcher.py:94:9: F821 undefined name 'Pod'
   ./airflow/contrib/kubernetes/kubernetes_request_factory/pod_request_factory.py:44:9: F821 undefined name 'Pod'
   ./airflow/contrib/kubernetes/kubernetes_request_factory/pod_request_factory.py:112:9: F821 undefined name 'Pod'
   ```
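
   For context, F821 means a name is referenced without being defined or 
imported in that module. One common remedy, sketched here under the assumption 
that the flagged names are only needed for type annotations, is a guarded 
import:

   ```python
   from typing import TYPE_CHECKING

   if TYPE_CHECKING:
       # Evaluated only by static checkers, so the name is defined for
       # flake8/pyflakes without any runtime import cost.
       from typing import Callable

   def apply_once(fn, value):
       # type: (Callable[[int], int], int) -> int
       return fn(value)

   print(apply_once(lambda x: x + 1, 41))  # prints 42
   ```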


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [airflow] vsoch commented on issue #4846: [AIRFLOW-4030] adding start to singularity for airflow

2019-03-12 Thread GitBox
vsoch commented on issue #4846: [AIRFLOW-4030] adding start to singularity for 
airflow
URL: https://github.com/apache/airflow/pull/4846#issuecomment-472199552
 
 
   Here are the linting issues. Strange that some of these files aren't touched 
in this pull request; I will fix those too.
   ```bash
   ./tests/contrib/operators/test_singularity_operator.py:21:1: F401 'logging' imported but unused
   ./tests/contrib/operators/test_singularity_operator.py:28:1: F401 'AirflowException' imported but unused
   ./tests/contrib/operators/test_singularity_operator.py:55:62: W291 trailing whitespace
   ./tests/contrib/operators/test_singularity_operator.py:56:71: W291 trailing whitespace
   ./tests/contrib/hooks/test_cassandra_hook.py:89:124: E261 at least two spaces before inline comment
   ./tests/cli/test_worker_initialisation.py:35:28: E261 at least two spaces before inline comment
   ./airflow/configuration.py:556:15: E261 at least two spaces before inline comment
   ./airflow/contrib/operators/singularity_operator.py:20:1: F401 'json' imported but unused
   ./airflow/contrib/operators/singularity_operator.py:76:20: E251 unexpected spaces around keyword / parameter equals
   ./airflow/contrib/operators/singularity_operator.py:76:22: E251 unexpected spaces around keyword / parameter equals
   ./airflow/contrib/operators/singularity_operator.py:101:25: E711 comparison to None should be 'if cond is None:'
   ./airflow/contrib/operators/singularity_operator.py:107:66: W291 trailing whitespace
   ./airflow/contrib/operators/singularity_operator.py:112:16: F821 undefined name 'pull_folder'
   ./airflow/contrib/operators/singularity_operator.py:112:28: E711 comparison to None should be 'if cond is not None:'
   ./airflow/contrib/operators/singularity_operator.py:113:43: F821 undefined name 'pull_folder'
   ./airflow/contrib/operators/singularity_operator.py:116:57: F841 local variable 'host_tmp_dir' is assigned to but never used
   ./airflow/contrib/operators/singularity_operator.py:123:33: E711 comparison to None should be 'if cond is not None:'
   ./airflow/contrib/operators/singularity_operator.py:125:1: W293 blank line contains whitespace
   ./airflow/contrib/operators/singularity_operator.py:134:58: W291 trailing whitespace
   ./airflow/contrib/operators/singularity_operator.py:145:53: W291 trailing whitespace
   ./airflow/contrib/utils/gcp_field_validator.py:136:1: F401 'Callable' imported but unused
   ./airflow/contrib/kubernetes/pod_launcher.py:23:1: F401 'Pod' imported but unused
   ./airflow/contrib/kubernetes/kubernetes_request_factory/pod_request_factory.py:19:1: F401 'Pod' imported but unused
   ./airflow/executors/__init__.py:24:57: E261 at least two spaces before inline comment
   ./airflow/macros/__init__.py:22:16: E261 at least two spaces before inline comment
   ./airflow/macros/__init__.py:23:26: E261 at least two spaces before inline comment
   ./airflow/macros/__init__.py:24:12: E261 at least two spaces before inline comment
   ./airflow/macros/__init__.py:25:19: E261 at least two spaces before inline comment
   ./airflow/macros/__init__.py:26:12: E261 at least two spaces before inline comment
   ./airflow/models/__init__.py:4650:46: E261 at least two spaces before inline comment
   ./airflow/www/views.py:1875:34: E261 at least two spaces before inline comment
   ```
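
   Two of the recurring codes above have simple idiomatic fixes; a small 
illustrative sketch (not code from the PR):

   ```python
   def pick(value=None):
       # E711 fix: test for None with 'is' / 'is not', never '==' or '!='.
       if value is None:
           return "default"
       return value

   result = pick()  # E261 fix: at least two spaces before an inline comment
   print(result)  # prints "default"
   ```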


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [airflow] vsoch commented on issue #4846: [AIRFLOW-4030] adding start to singularity for airflow

2019-03-12 Thread GitBox
vsoch commented on issue #4846: [AIRFLOW-4030] adding start to singularity for 
airflow
URL: https://github.com/apache/airflow/pull/4846#issuecomment-472197325
 
 
   oops - so sorry @mik-laj! I didn't check the CI to see that linting failed. 
I'll get that fixed up.


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [airflow] nritholtz edited a comment on issue #4903: [WIP] [AIRFLOW-4069] Add Opsgenie Alert Hook

2019-03-12 Thread GitBox
nritholtz edited a comment on issue #4903: [WIP] [AIRFLOW-4069] Add Opsgenie 
Alert Hook
URL: https://github.com/apache/airflow/pull/4903#issuecomment-471830443
 
 
   @zhongjiajie  On second thought, I'll add an operator as well since I'm 
already in this code.


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [airflow] nritholtz removed a comment on issue #4903: [WIP] [AIRFLOW-4069] Add Opsgenie Alert Hook

2019-03-12 Thread GitBox
nritholtz removed a comment on issue #4903: [WIP] [AIRFLOW-4069] Add Opsgenie 
Alert Hook
URL: https://github.com/apache/airflow/pull/4903#issuecomment-471831969
 
 
   Latest build failure was unrelated to this PR:
   ```
   36) ERROR: test_execution_unlimited_parallelism (tests.executors.test_local_executor.LocalExecutorTest)
   --
   Traceback (most recent call last):
   tests/executors/test_local_executor.py line 71 in test_execution_unlimited_parallelism
     self.execution_parallelism(parallelism=0)
   tests/executors/test_local_executor.py line 54 in execution_parallelism
     executor.end()
   airflow/executors/local_executor.py line 230 in end
     self.impl.end()
   airflow/executors/local_executor.py line 167 in end
     time.sleep(0.5)
   airflow/utils/timeout.py line 43 in handle_timeout
     raise AirflowTaskTimeout(self.error_message)
   AirflowTaskTimeout: Timeout, PID: 215
   ```


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [airflow] galak75 edited a comment on issue #4743: [AIRFLOW-3871] render Operators template fields recursively

2019-03-12 Thread GitBox
galak75 edited a comment on issue #4743: [AIRFLOW-3871] render Operators 
template fields recursively
URL: https://github.com/apache/airflow/pull/4743#issuecomment-472148500
 
 
   @bjoernpollex-sc: thanks a lot for your feedback!
   I actually didn't see your comment on Jira; I'm not sure why it didn't show 
up.
   
   I have some questions about it:
   
   > There might be fields I don't want rendered (maybe I use Jinja templating 
internally)
   
   This is a good point. In such a case, shouldn't the Airflow templating 
process be preferred?
   
   > Due to the dynamic nature of Python, I might want to render fields that 
can't be found via introspection
   
   I'm pretty new to Python; how could this happen? Isn't introspection meant 
to expose all existing attributes, methods, and so on, of an object? Do you 
have an example or any reading about it?
   
   > I think a viable alternative would be to use the same approach as for 
operators - declare which fields need templating using a template_fields class 
variable.
   
   It is a good alternative to avoid the first point: just choose the fields we 
want to be templated.
   But then we need to modify a class (add a class attribute) to make sure its 
fields will be rendered.
   
   One thing I really appreciate about approach #3 is that it works without any 
change to classes: just set a templated value on any attribute of any class (in 
an operator template_field), and the value will be rendered during DAG 
execution.
   
   If we have to customize a class so that its inner fields are templated, 
would you rather add a `template_field` class attribute, or define a custom 
`render_template` method (approach #2)?
   
   I hope I'm clear enough... Thank you in advance for your answer.
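
   The `template_fields` convention discussed above can be sketched roughly as 
follows (the renderer is a toy stand-in for Airflow's Jinja pass, and all 
names are illustrative):

   ```python
   class MyTask:
       # The class declares which attributes should be templated; the
       # renderer only touches those, leaving everything else alone.
       template_fields = ("path",)

       def __init__(self, path, note):
           self.path = path
           self.note = note

   def render(obj, context):
       # Minimal stand-in for the rendering step: substitute each context
       # key into the declared fields only.
       for field in getattr(obj, "template_fields", ()):
           raw = getattr(obj, field)
           for key, val in context.items():
               raw = raw.replace("{{ %s }}" % key, val)
           setattr(obj, field, raw)

   task = MyTask(path="/data/{{ ds }}/out", note="{{ ds }} untouched")
   render(task, {"ds": "2019-03-12"})
   print(task.path)  # only the declared 'path' field was rendered
   print(task.note)  # 'note' keeps its raw template text
   ```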
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [airflow] galak75 edited a comment on issue #4743: [AIRFLOW-3871] render Operators template fields recursively

2019-03-12 Thread GitBox
galak75 edited a comment on issue #4743: [AIRFLOW-3871] render Operators 
template fields recursively
URL: https://github.com/apache/airflow/pull/4743#issuecomment-472148500
 
 
   @bjoernpollex-sc: thanks a lot for your feedback!
   I actually didn't see your comment on Jira; I'm not sure why it didn't show 
up.
   
   I have some questions about it:
   
   > There might be fields I don't want rendered (maybe I use Jinja templating 
internally)
   
   This is a good point. In such a case, shouldn't the Airflow templating 
process be preferred?
   
   > Due to the dynamic nature of Python, I might want to render fields that 
can't be found via introspection
   
   I'm pretty new to Python; how could this happen? Isn't introspection meant 
to expose all existing attributes, methods, and so on, of an object? Do you 
have an example or any reading about it?
   
   > I think a viable alternative would be to use the same approach as for 
operators - declare which fields need templating using a template_fields class 
variable.
   
   It is a good alternative to avoid the first point: just choose the fields we 
want to be templated.
   But then we need to modify a class (add a class attribute) to make sure its 
fields will be rendered.
   
   One thing I really appreciate about approach #3 is that it works without any 
change to classes: just set a templated value on any attribute of any class (in 
an operator template_field), and the value will be rendered during DAG 
execution.
   
   If we have to customize a class so that its inner fields are templated, 
would you rather add a `template_field` class attribute, or define a custom 
`render_template` method (approach #2)?
   
   I hope I'm clear enough... Thank you in advance for your answer.
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [airflow] galak75 commented on issue #4743: [AIRFLOW-3871] render Operators template fields recursively

2019-03-12 Thread GitBox
galak75 commented on issue #4743: [AIRFLOW-3871] render Operators template 
fields recursively
URL: https://github.com/apache/airflow/pull/4743#issuecomment-472148500
 
 
   @bjoernpollex-sc: thanks a lot for your feedback!
   I actually didn't see your comment on Jira; I'm not sure why it didn't show 
up.
   
   I have some questions about it:
   
   > There might be fields I don't want rendered (maybe I use Jinja templating 
internally)
   
   This is a good point. In such a case, shouldn't the Airflow templating 
process be preferred?
   
   > Due to the dynamic nature of Python, I might want to render fields that 
can't be found via introspection
   
   I'm pretty new to Python; how could this happen? Isn't introspection meant 
to expose all existing attributes, methods, and so on, of an object? Do you 
have an example or any reading about it?
   
   > I think a viable alternative would be to use the same approach as for 
operators - declare which fields need templating using a template_fields class 
variable.
   
   It is a good alternative to avoid the first point: just choose the fields we 
want to be templated.
   But then we need to modify a class (add a class attribute) to make sure its 
fields will be rendered.
   
   One thing I really appreciate about approach #3 is that it works without any 
change to classes: just set a templated value on any attribute of any class (in 
an operator template_field), and the value will be rendered during DAG 
execution.
   
   If we have to customize a class so that its inner fields are templated, 
would you rather add a `template_field` class attribute, or define a custom 
`render_template` method (approach #2)?
   
   I hope I'm clear enough... Thank you in advance for your answer.
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [airflow] aoen edited a comment on issue #4904: [AIRFLOW-4070] AirflowException -> log.warning for duplicate task dependencies

2019-03-12 Thread GitBox
aoen edited a comment on issue #4904: [AIRFLOW-4070] AirflowException -> 
log.warning for duplicate task dependencies
URL: https://github.com/apache/airflow/pull/4904#issuecomment-472127972
 
 
   I agree; I don't see any risks to this (no need to throw breaking 
exceptions on a no-op).
   
   I think the original purpose of this may have been immutability, but I don't 
think there's much value in the current state, and without good constructs for 
copying DAGs it also leaves gaps. For example, to make composable "subdags" (a 
different concept from the SubdagOperator), we need some way of copying the 
tasks from one DAG to another, which is not currently feasible. By "subdags" I 
mean a process or flow encapsulated in a DAG that you want to reuse across 
several DAGs; currently there is no first-party way to do this.
   
   What do you think, @XD-DENG?


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [airflow] codecov-io commented on issue #4906: AIRFLOW-3736: SqoopOperator Numeric `extra_import_options` value

2019-03-12 Thread GitBox
codecov-io commented on issue #4906: AIRFLOW-3736: SqoopOperator Numeric 
`extra_import_options` value
URL: https://github.com/apache/airflow/pull/4906#issuecomment-472128138
 
 
   # [Codecov](https://codecov.io/gh/apache/airflow/pull/4906?src=pr=h1) 
Report
   > Merging 
[#4906](https://codecov.io/gh/apache/airflow/pull/4906?src=pr=desc) into 
[master](https://codecov.io/gh/apache/airflow/commit/effa9e8b9ca50bbab1687d11fd4ccc7c32c49c1a?src=pr=desc)
 will **decrease** coverage by `<.01%`.
   > The diff coverage is `100%`.
   
   [![Impacted file tree 
graph](https://codecov.io/gh/apache/airflow/pull/4906/graphs/tree.svg?width=650=WdLKlKHOAU=150=pr)](https://codecov.io/gh/apache/airflow/pull/4906?src=pr=tree)
   
   ```diff
   @@            Coverage Diff             @@
   ##           master    #4906      +/-   ##
   ==========================================
   - Coverage   75.53%   75.53%   -0.01%
   ==========================================
     Files         450      450
     Lines       29034    29034
   ==========================================
   - Hits        21931    21930       -1
   - Misses       7103     7104       +1
   ```
   
   
   | [Impacted Files](https://codecov.io/gh/apache/airflow/pull/4906?src=pr=tree) | Coverage Δ | |
   |---|---|---|
   | [airflow/contrib/hooks/sqoop\_hook.py](https://codecov.io/gh/apache/airflow/pull/4906/diff?src=pr=tree#diff-YWlyZmxvdy9jb250cmliL2hvb2tzL3Nxb29wX2hvb2sucHk=) | `95.33% <100%> (ø)` | :arrow_up: |
   | [airflow/models/\_\_init\_\_.py](https://codecov.io/gh/apache/airflow/pull/4906/diff?src=pr=tree#diff-YWlyZmxvdy9tb2RlbHMvX19pbml0X18ucHk=) | `92.87% <0%> (-0.06%)` | :arrow_down: |
   
   --
   
   [Continue to review full report at 
Codecov](https://codecov.io/gh/apache/airflow/pull/4906?src=pr=continue).
   > **Legend** - [Click here to learn 
more](https://docs.codecov.io/docs/codecov-delta)
   > `Δ = absolute <relative> (impact)`, `ø = not affected`, `? = missing data`
   > Powered by 
[Codecov](https://codecov.io/gh/apache/airflow/pull/4906?src=pr=footer). 
Last update 
[effa9e8...825f968](https://codecov.io/gh/apache/airflow/pull/4906?src=pr=lastupdated).
 Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments).
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [airflow] aoen commented on issue #4904: [AIRFLOW-4070] AirflowException -> log.warning for duplicate task dependencies

2019-03-12 Thread GitBox
aoen commented on issue #4904: [AIRFLOW-4070] AirflowException -> log.warning 
for duplicate task dependencies
URL: https://github.com/apache/airflow/pull/4904#issuecomment-472127972
 
 
   I agree; I don't see any risks to this (no need to throw breaking 
exceptions on a no-op).
   
   I think the original purpose of this may have been immutability, but I don't 
think there's much value in the current state, and without good constructs for 
copying DAGs it also leaves gaps. For example, to make composable "subdags" (a 
different concept from the SubdagOperator), we need some way of copying the 
tasks from one DAG to another, which is not currently feasible.


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [airflow] codecov-io commented on issue #4904: [AIRFLOW-4070] AirflowException -> log.warning for duplicate task dependencies

2019-03-12 Thread GitBox
codecov-io commented on issue #4904: [AIRFLOW-4070] AirflowException -> 
log.warning for duplicate task dependencies
URL: https://github.com/apache/airflow/pull/4904#issuecomment-472123249
 
 
   # [Codecov](https://codecov.io/gh/apache/airflow/pull/4904?src=pr=h1) 
Report
   > Merging 
[#4904](https://codecov.io/gh/apache/airflow/pull/4904?src=pr=desc) into 
[master](https://codecov.io/gh/apache/airflow/commit/4fb91b0b674a187447e18e6171df7be8a85ac22c?src=pr=desc)
 will **decrease** coverage by `<.01%`.
   > The diff coverage is `100%`.
   
   [![Impacted file tree 
graph](https://codecov.io/gh/apache/airflow/pull/4904/graphs/tree.svg?width=650=WdLKlKHOAU=150=pr)](https://codecov.io/gh/apache/airflow/pull/4904?src=pr=tree)
   
   ```diff
   @@            Coverage Diff             @@
   ##           master    #4904      +/-   ##
   ==========================================
   - Coverage   75.53%   75.53%   -0.01%
   ==========================================
     Files         450      450
     Lines       29034    29034
   ==========================================
   - Hits        21932    21930       -2
   - Misses       7102     7104       +2
   ```
   
   
   | [Impacted Files](https://codecov.io/gh/apache/airflow/pull/4904?src=pr=tree) | Coverage Δ | |
   |---|---|---|
   | [airflow/models/\_\_init\_\_.py](https://codecov.io/gh/apache/airflow/pull/4904/diff?src=pr=tree#diff-YWlyZmxvdy9tb2RlbHMvX19pbml0X18ucHk=) | `92.87% <100%> (-0.06%)` | :arrow_down: |
   | [airflow/contrib/operators/ssh\_operator.py](https://codecov.io/gh/apache/airflow/pull/4904/diff?src=pr=tree#diff-YWlyZmxvdy9jb250cmliL29wZXJhdG9ycy9zc2hfb3BlcmF0b3IucHk=) | `82.27% <0%> (-1.27%)` | :arrow_down: |
   
   --
   
   [Continue to review full report at 
Codecov](https://codecov.io/gh/apache/airflow/pull/4904?src=pr=continue).
   > **Legend** - [Click here to learn 
more](https://docs.codecov.io/docs/codecov-delta)
   > `Δ = absolute <relative> (impact)`, `ø = not affected`, `? = missing data`
   > Powered by 
[Codecov](https://codecov.io/gh/apache/airflow/pull/4904?src=pr=footer). 
Last update 
[4fb91b0...2afcafb](https://codecov.io/gh/apache/airflow/pull/4904?src=pr=lastupdated).
 Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments).
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [airflow] dossett commented on issue #4064: AIRFLOW-3149 Support dataproc cluster deletion on ERROR

2019-03-12 Thread GitBox
dossett commented on issue #4064: AIRFLOW-3149 Support dataproc cluster 
deletion on ERROR
URL: https://github.com/apache/airflow/pull/4064#issuecomment-472111972
 
 
   Yeah @OmerJog, I haven't been able to track that down.


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[jira] [Resolved] (AIRFLOW-3901) Add optional role parameter to snowflake hook

2019-03-12 Thread Daniel Standish (JIRA)


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-3901?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Daniel Standish resolved AIRFLOW-3901.
--
   Resolution: Fixed
Fix Version/s: (was: 1.10.3)

> Add optional role parameter to snowflake hook
> -
>
> Key: AIRFLOW-3901
> URL: https://issues.apache.org/jira/browse/AIRFLOW-3901
> Project: Apache Airflow
>  Issue Type: Improvement
>  Components: contrib
>Reporter: Daniel Standish
>Assignee: Daniel Standish
>Priority: Trivial
>   Original Estimate: 1h
>  Remaining Estimate: 1h
>
> Role is a parameter missing from snowflake hook.
> I will add it.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[GitHub] [airflow] mans2singh edited a comment on issue #4887: [AIRFLOW-4055] Add AWS SQS Sensor

2019-03-12 Thread GitBox
mans2singh edited a comment on issue #4887: [AIRFLOW-4055] Add AWS SQS Sensor
URL: https://github.com/apache/airflow/pull/4887#issuecomment-471737248
 
 
   @XD-DENG - 
   
   My use case is that we have files dropped on S3 that need to be processed as 
soon as possible. Since I do not know when a file will be available or its 
complete path, I am driving the DAG with an SQS event. The sensor retrieves the 
message containing the file path etc., and passes it to the next processor to 
consume the file. This lets me use Airflow's sensor features such as timeouts, 
task-dependency management, etc. It can also be used for any other type of SQS 
message. The sensor is similar to the [pub sub 
sensor](https://github.com/apache/airflow/blob/master/airflow/contrib/sensors/pubsub_sensor.py).
   
   Please let me know if there is a better way to implement this workflow.
   
   Thanks
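
   The event-driven pattern described here can be sketched as a poke loop over 
a queue (the deque below stands in for SQS; a real sensor would call boto3's 
`receive_message` and rely on Airflow's sensor timeout machinery):

   ```python
   import collections

   # Simulated queue; in practice this would be the SQS queue that S3
   # event notifications are delivered to.
   queue = collections.deque(["s3://bucket/incoming/file.csv"])

   def poke():
       """One sensor cycle: return a message if available, else None."""
       return queue.popleft() if queue else None

   msg = poke()
   print(msg)     # the S3 path carried by the event
   print(poke())  # queue drained: None, so a real sensor would keep waiting
   ```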




[jira] [Commented] (AIRFLOW-3736) A numeric import option breaks the Sqoop hook/operator

2019-03-12 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-3736?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16790753#comment-16790753
 ] 

ASF GitHub Bot commented on AIRFLOW-3736:
-

kik-kik commented on pull request #4906: AIRFLOW-3736: SqoopOperator numeric 
import
URL: https://github.com/apache/airflow/pull/4906
 
 
   Make sure you have checked _all_ steps below.
   
   ### Jira
   
   - [ ] My PR addresses the following [Airflow 
Jira](https://issues.apache.org/jira/browse/AIRFLOW/) issues and references 
them in the PR title. For example, "\[AIRFLOW-XXX\] My Airflow PR"
 - https://issues.apache.org/jira/browse/AIRFLOW-XXX
 - In case you are fixing a typo in the documentation you can prepend your 
commit with \[AIRFLOW-XXX\], code changes always need a Jira issue.
 - In case you are proposing a fundamental code change, you need to create 
an Airflow Improvement 
Proposal([AIP](https://cwiki.apache.org/confluence/display/AIRFLOW/Airflow+Improvements+Proposals)).
   
   ### Description
   
   - [ ] Here are some details about my PR, including screenshots of any UI 
changes:
   
   ### Tests
   
   - [ ] My PR adds the following unit tests __OR__ does not need testing for 
this extremely good reason:
   
   ### Commits
   
   - [ ] My commits all reference Jira issues in their subject lines, and I 
have squashed multiple commits if they address the same issue. In addition, my 
commits follow the guidelines from "[How to write a good git commit 
message](http://chris.beams.io/posts/git-commit/)":
 1. Subject is separated from body by a blank line
 1. Subject is limited to 50 characters (not including Jira issue reference)
 1. Subject does not end with a period
 1. Subject uses the imperative mood ("add", not "adding")
 1. Body wraps at 72 characters
 1. Body explains "what" and "why", not "how"
   
   ### Documentation
   
   - [ ] In case of new functionality, my PR adds documentation that describes 
how to use it.
 - When adding new operators/hooks/sensors, the autoclass documentation 
generation needs to be added.
  - All the public functions and classes in the PR contain docstrings 
that explain what they do.
  - If you implement backwards-incompatible changes, please leave a note in 
the [Updating.md](https://github.com/apache/airflow/blob/master/UPDATING.md) so 
we can assign it to an appropriate release.
   
   ### Code Quality
   
   - [ ] Passes `flake8`
   
 



> A numeric import option breaks the Sqoop hook/operator
> --
>
> Key: AIRFLOW-3736
> URL: https://issues.apache.org/jira/browse/AIRFLOW-3736
> Project: Apache Airflow
>  Issue Type: Bug
>  Components: contrib, hooks, operators
>Reporter: Alireza
>Assignee: kik
>Priority: Major
>
> Adding a numeric value to the _extra_import_options_ parameter breaks the 
> Sqoop hook.
>  
> {code:java}
> task1 = SqoopOperator(
>     task_id='import_tbl',
>     cmd_type='import',
>     extra_import_options={'fetch-size': 1},
>     dag=dag,
> )
> {code}
>  
> The following error is thrown:
> {code:java}
> ERROR - sequence item 18: expected string or Unicode, int found
> Traceback (most recent call last):
>   File "/usr/lib/python2.7/site-packages/airflow/models.py", line 1659, in _run_raw_task
>     result = task_copy.execute(context=context)
>   File "/usr/lib/python2.7/site-packages/airflow/contrib/operators/sqoop_operator.py", line 218, in execute
>     extra_import_options=self.extra_import_options)
>   File "/usr/lib/python2.7/site-packages/airflow/contrib/hooks/sqoop_hook.py", line 232, in import_table
>     self.Popen(cmd)
>   File "/usr/lib/python2.7/site-packages/airflow/contrib/hooks/sqoop_hook.py", line 100, in Popen
>     masked_cmd = ' '.join(self.cmd_mask_password(cmd))
> {code}
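The error comes from `' '.join(...)` receiving an int. A minimal sketch of the usual fix, casting option values to strings while building the command list (a hypothetical `build_cmd` helper, not the actual sqoop_hook code):

```python
def build_cmd(extra_import_options):
    # Build the sqoop argument list, stringifying every option value so
    # that ' '.join() never sees a non-string (e.g. {'fetch-size': 1}).
    cmd = ['sqoop', 'import']
    for option, value in extra_import_options.items():
        cmd.append('--' + option)
        if value is not None:
            cmd.append(str(value))  # str() avoids the TypeError on ints
    return ' '.join(cmd)
```

With this, `extra_import_options={'fetch-size': 1}` joins cleanly instead of raising.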



--


[GitHub] [airflow] kik-kik opened a new pull request #4906: AIRFLOW-3736: SqoopOperator numeric import

2019-03-12 Thread GitBox
kik-kik opened a new pull request #4906: AIRFLOW-3736: SqoopOperator numeric 
import
URL: https://github.com/apache/airflow/pull/4906
 
 
   Make sure you have checked _all_ steps below.
   
   ### Jira
   
   - [ ] My PR addresses the following [Airflow 
Jira](https://issues.apache.org/jira/browse/AIRFLOW/) issues and references 
them in the PR title. For example, "\[AIRFLOW-XXX\] My Airflow PR"
 - https://issues.apache.org/jira/browse/AIRFLOW-XXX
 - In case you are fixing a typo in the documentation you can prepend your 
commit with \[AIRFLOW-XXX\], code changes always need a Jira issue.
 - In case you are proposing a fundamental code change, you need to create 
an Airflow Improvement 
Proposal([AIP](https://cwiki.apache.org/confluence/display/AIRFLOW/Airflow+Improvements+Proposals)).
   
   ### Description
   
   - [ ] Here are some details about my PR, including screenshots of any UI 
changes:
   
   ### Tests
   
   - [ ] My PR adds the following unit tests __OR__ does not need testing for 
this extremely good reason:
   
   ### Commits
   
   - [ ] My commits all reference Jira issues in their subject lines, and I 
have squashed multiple commits if they address the same issue. In addition, my 
commits follow the guidelines from "[How to write a good git commit 
message](http://chris.beams.io/posts/git-commit/)":
 1. Subject is separated from body by a blank line
 1. Subject is limited to 50 characters (not including Jira issue reference)
 1. Subject does not end with a period
 1. Subject uses the imperative mood ("add", not "adding")
 1. Body wraps at 72 characters
 1. Body explains "what" and "why", not "how"
   
   ### Documentation
   
   - [ ] In case of new functionality, my PR adds documentation that describes 
how to use it.
 - When adding new operators/hooks/sensors, the autoclass documentation 
generation needs to be added.
  - All the public functions and classes in the PR contain docstrings 
that explain what they do.
  - If you implement backwards-incompatible changes, please leave a note in 
the [Updating.md](https://github.com/apache/airflow/blob/master/UPDATING.md) so 
we can assign it to an appropriate release.
   
   ### Code Quality
   
   - [ ] Passes `flake8`
   




[GitHub] [airflow] marengaz opened a new pull request #4905: [AIRFLOW-4072] enable GKEPodOperator xcom

2019-03-12 Thread GitBox
marengaz opened a new pull request #4905: [AIRFLOW-4072] enable GKEPodOperator 
xcom
URL: https://github.com/apache/airflow/pull/4905
 
 
   This provides consistent functionality with the KubernetesPodOperator.
   
   Make sure you have checked _all_ steps below.
   
   ### Jira
   
   - [ ] My PR addresses the following [Airflow 
Jira](https://issues.apache.org/jira/browse/AIRFLOW/) issues and references 
them in the PR title. For example, "\[AIRFLOW-XXX\] My Airflow PR"
 - https://issues.apache.org/jira/browse/AIRFLOW-XXX
 - In case you are fixing a typo in the documentation you can prepend your 
commit with \[AIRFLOW-XXX\], code changes always need a Jira issue.
 - In case you are proposing a fundamental code change, you need to create 
an Airflow Improvement 
Proposal([AIP](https://cwiki.apache.org/confluence/display/AIRFLOW/Airflow+Improvements+Proposals)).
   
   ### Description
   
   - [ ] Here are some details about my PR, including screenshots of any UI 
changes:
   
   ### Tests
   
   - [ ] My PR adds the following unit tests __OR__ does not need testing for 
this extremely good reason:
   
   ### Commits
   
   - [ ] My commits all reference Jira issues in their subject lines, and I 
have squashed multiple commits if they address the same issue. In addition, my 
commits follow the guidelines from "[How to write a good git commit 
message](http://chris.beams.io/posts/git-commit/)":
 1. Subject is separated from body by a blank line
 1. Subject is limited to 50 characters (not including Jira issue reference)
 1. Subject does not end with a period
 1. Subject uses the imperative mood ("add", not "adding")
 1. Body wraps at 72 characters
 1. Body explains "what" and "why", not "how"
   
   ### Documentation
   
   - [ ] In case of new functionality, my PR adds documentation that describes 
how to use it.
 - When adding new operators/hooks/sensors, the autoclass documentation 
generation needs to be added.
  - All the public functions and classes in the PR contain docstrings 
that explain what they do.
  - If you implement backwards-incompatible changes, please leave a note in 
the [Updating.md](https://github.com/apache/airflow/blob/master/UPDATING.md) so 
we can assign it to an appropriate release.
   
   ### Code Quality
   
   - [ ] Passes `flake8`
   



