mik-laj commented on a change in pull request #4903: [AIRFLOW-4069] Add Opsgenie Alert Hook and Operator
URL: https://github.com/apache/airflow/pull/4903#discussion_r265390331
 
 

 ##########
 File path: airflow/contrib/hooks/opsgenie_alert_hook.py
 ##########
 @@ -0,0 +1,83 @@
+# -*- coding: utf-8 -*-
+#
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+#
+
+import json
+
+from airflow.hooks.http_hook import HttpHook
+from airflow.exceptions import AirflowException
+
+
+class OpsgenieAlertHook(HttpHook):
+    """
+    This hook allows you to post alerts to Opsgenie.
+    Accepts a connection that has an Opsgenie API key as the connection's password.
+    Each Opsgenie API key can be pre-configured to a team integration.
+    You can override these defaults in this hook.
+
+    :param http_conn_id: Http connection ID with host as "https://api.opsgenie.com/"
+                      and Opsgenie API key as the connection's password
+                      (e.g. "eb243592-faa2-4ba2-a551q-1afdf565c889")
+    :type http_conn_id: str
+    :param payload: Opsgenie API Create Alert payload values
+                    See https://docs.opsgenie.com/docs/alert-api#section-create-alert
+    :type payload: dict
+    :param proxy: Proxy to use to make the Opsgenie Alert API call
+    :type proxy: str
+    """
+    def __init__(self,
+                 http_conn_id=None,
+                 payload={},
+                 proxy=None,
+                 *args,
+                 **kwargs
+                 ):
+        super(OpsgenieAlertHook, self).__init__(*args, **kwargs)
+        self.http_conn_id = http_conn_id
+        self.api_key = None
+        self.payload = payload
+        self.proxy = proxy
+
+    def _get_api_key(self, http_conn_id):
+        """
+        Given a conn_id, return the api_key to use
+        :param http_conn_id: The conn_id provided
+        :type http_conn_id: str
+        :return: api_key (str) to use
+        """
+        if http_conn_id:
+            conn = self.get_connection(http_conn_id)
+            return conn.password
+        else:
+            raise AirflowException('Cannot get api_key: No valid conn_id '
+                                   'supplied')
+
+    def execute(self):
+        """
+        Remote Popen (actually execute the Opsgenie Alert call)
+        """
+        proxies = {}
+        if self.proxy:
+            # we only need https proxy for Opsgenie, as the endpoint is https
+            proxies = {'https': self.proxy}
+        self.api_key = self.api_key or self._get_api_key(self.http_conn_id)
 
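One thing worth flagging in the hunk above: `payload={}` in `__init__` is a mutable default argument, so every hook instance that omits `payload` shares one dict object created at function-definition time. A minimal sketch of the pitfall and the usual `None`-sentinel fix (class names here are illustrative, not from the PR):

```python
class BadHook:
    # Anti-pattern: the default dict is created once, when the function
    # is defined, and is shared by every call that omits `payload`.
    def __init__(self, payload={}):
        self.payload = payload


class GoodHook:
    # Idiomatic fix: use None as a sentinel and build a fresh dict in
    # the body, so instances never share state through the default.
    def __init__(self, payload=None):
        self.payload = payload if payload is not None else {}


a, b = BadHook(), BadHook()
a.payload['alias'] = 'my-alert'
print(b.payload)   # {'alias': 'my-alert'} -- b was mutated through a

c, d = GoodHook(), GoodHook()
c.payload['alias'] = 'my-alert'
print(d.payload)   # {} -- each instance owns its own dict
```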
 Review comment:
   Each operator is executed in isolation; the previous state of an operation is not stored, so at best you can reduce the number of database queries within a single execution. Each operator execution runs in a new Python interpreter: Airflow is designed for distributed systems, where shared memory does not exist.
   
   Reference: 
   > In previous versions of the scheduler, user-supplied DAG definitions were 
parsed and loaded in the same process as the scheduler. Unfortunately, this 
made it possible for bad user code to adversely affect the scheduler process. 
For example, if a user DAG definition includes a `system.exit(-1)`, parsing the 
DAG definition would cause the scheduler process to exit.
   
   https://cwiki.apache.org/confluence/display/AIRFLOW/Scheduler+Basics
   
   https://github.com/apache/airflow/blob/master/airflow/executors/sequential_executor.py#L48
   https://github.com/apache/airflow/blob/master/airflow/executors/local_executor.py#L86
   https://github.com/apache/airflow/blob/master/airflow/executors/celery_executor.py#L66
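   The point above can be sketched outside Airflow: caching something like `self.api_key` on a hook instance (or in a module-level variable) cannot survive across task runs, because each run is a brand-new interpreter with its own memory. A small, Airflow-free illustration using only the standard library:

```python
import subprocess
import sys

# The snippet "caches" an API key in a module-level dict, then reports
# how many entries the cache held before this run. It is always 0:
# every invocation is a fresh interpreter, so nothing carries over --
# which is exactly why a hook cannot memoize state across task runs.
snippet = """
cache = {}
print('cached before this run:', len(cache))
cache['api_key'] = 'eb243592-...'
"""

for run in (1, 2):
    out = subprocess.run([sys.executable, "-c", snippet],
                         capture_output=True, text=True, check=True)
    print(f"run {run}:", out.stdout.strip())
# run 1: cached before this run: 0
# run 2: cached before this run: 0
```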

----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
[email protected]


With regards,
Apache Git Services
