[GitHub] [airflow] feluelle commented on a change in pull request #6090: [AIRFLOW-5470] Add Apache Livy REST operator

2020-02-12 Thread GitBox
feluelle commented on a change in pull request #6090: [AIRFLOW-5470] Add Apache 
Livy REST operator
URL: https://github.com/apache/airflow/pull/6090#discussion_r378272407
 
 

 ##
 File path: docs/operators-and-hooks-ref.rst
 ##
 @@ -98,6 +98,12 @@ Foundation.
:mod:`airflow.sensors.hive_partition_sensor`,
:mod:`airflow.sensors.metastore_partition_sensor`
 
+   * - `Apache Livy `__
+ -
+ - :mod:`airflow.contrib.hooks.livy_hook`
+ - :mod:`airflow.contrib.operators.livy_operator`
+ - :mod:`airflow.contrib.sensors.livy_sensor`
+
 
 Review comment:
   @zhongjiajie already pointed it out.


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [airflow] feluelle commented on a change in pull request #6090: [AIRFLOW-5470] Add Apache Livy REST operator

2020-02-12 Thread GitBox
feluelle commented on a change in pull request #6090: [AIRFLOW-5470] Add Apache 
Livy REST operator
URL: https://github.com/apache/airflow/pull/6090#discussion_r378270803
 
 

 ##
 File path: docs/operators-and-hooks-ref.rst
 ##
 @@ -98,6 +98,12 @@ Foundation.
:mod:`airflow.sensors.hive_partition_sensor`,
:mod:`airflow.sensors.metastore_partition_sensor`
 
+   * - `Apache Livy `__
+ -
+ - :mod:`airflow.contrib.hooks.livy_hook`
+ - :mod:`airflow.contrib.operators.livy_operator`
+ - :mod:`airflow.contrib.sensors.livy_sensor`
+
 
 Review comment:
   @lucacavazzana please update the links. That is the reason why the build 
docs test is failing.
   
   ```
   Missing modules:
   
   airflow.providers.apache.livy.hooks.livy_hook
   airflow.providers.apache.livy.operators.livy_operator
   airflow.providers.apache.livy.sensors.livy_sensor
   
   Please add this module to operators-and-hooks-ref.rst file.
   ```
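
   For reference, the updated rows would then presumably look something like
   this (the module paths are taken from the check output above; the Livy
   homepage URL and the empty guide column are assumptions based on the
   surrounding table):

   ```
      * - `Apache Livy <https://livy.apache.org/>`__
        -
        - :mod:`airflow.providers.apache.livy.hooks.livy_hook`
        - :mod:`airflow.providers.apache.livy.operators.livy_operator`
        - :mod:`airflow.providers.apache.livy.sensors.livy_sensor`
   ```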





[GitHub] [airflow] feluelle commented on a change in pull request #6090: [AIRFLOW-5470] Add Apache Livy REST operator

2019-10-10 Thread GitBox
feluelle commented on a change in pull request #6090: [AIRFLOW-5470] Add Apache 
Livy REST operator
URL: https://github.com/apache/airflow/pull/6090#discussion_r333669995
 
 

 ##
 File path: airflow/contrib/operators/livy_operator.py
 ##
 @@ -0,0 +1,175 @@
+# -*- coding: utf-8 -*-
+#
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+
+"""
+This module contains the Apache Livy operator.
+"""
+
+from time import sleep, gmtime, mktime
+
+from airflow.models import BaseOperator
+from airflow.utils.decorators import apply_defaults
+from airflow.exceptions import AirflowException
+from airflow.contrib.hooks.livy_hook import LivyHook, BatchState, 
TERMINAL_STATES
+
+
+class LivyOperator(BaseOperator):
+"""
+:param file: Path of the  file containing the application to execute 
(required).
+:type file: str
+:param class_name: Application Java/Spark main class string.
+:type class_name: str
+:param args: Command line arguments for the application s.
+:type args: list
+:param jars: jars to be used in this sessions.
+:type jars: list
+:param py_files: Python files to be used in this session.
+:type py_files: list
+:param files: files to be used in this session.
+:type files: list
+:param driver_memory: Amount of memory to use for the driver process  
string.
+:type driver_memory: str
+:param driver_cores: Number of cores to use for the driver process int.
+:type driver_cores: str
+:param executor_memory: Amount of memory to use per executor process  
string.
+:type executor_memory: str
+:param executor_cores: Number of cores to use for each executor  int.
+:type executor_cores: str
+:param num_executors: Number of executors to launch for this session  int.
+:type num_executors: str
+:param archives: Archives to be used in this session.
+:type archives: list
+:param queue: The name of the YARN queue to which submitted string.
+:type queue: str
+:param name: The name of this session  string.
+:type name: str
+:param conf: Spark configuration properties.
+:type conf: dict
+:param proxy_user: User to impersonate when running the job.
+:type proxy_user: str
+:param livy_conn_id: reference to a pre-defined Livy Connection.
+:type livy_conn_id: str
+:param polling_interval: time in seconds between polling for job 
completion. Don't poll for values >=0
+:type polling_interval: int
+:param timeout: for a value greater than zero, number of seconds to poll 
before killing the batch.
+:type timeout: int
+"""
+
+@apply_defaults
+def __init__(
+self,
+file=None,
+args=None,
+conf=None,
+livy_conn_id='livy_default',
+polling_interval=0,
+timeout=24 * 3600,
+*vargs,
+**kwargs
+):
+super(LivyOperator, self).__init__(*vargs, **kwargs)
+
+self._spark_params = {
+'file': file,
+'args': args,
+'conf': conf,
+}
+
+self._spark_params['proxy_user'] = kwargs.get('proxy_user')
+self._spark_params['class_name'] = kwargs.get('class_name')
+self._spark_params['jars'] = kwargs.get('jars')
+self._spark_params['py_files'] = kwargs.get('py_files')
+self._spark_params['files'] = kwargs.get('files')
+self._spark_params['driver_memory'] = kwargs.get('driver_memory')
+self._spark_params['driver_cores'] = kwargs.get('driver_cores')
+self._spark_params['executor_memory'] = kwargs.get('executor_memory')
+self._spark_params['executor_cores'] = kwargs.get('executor_cores')
+self._spark_params['num_executors'] = kwargs.get('num_executors')
+self._spark_params['archives'] = kwargs.get('archives')
+self._spark_params['queue'] = kwargs.get('queue')
+self._spark_params['name'] = kwargs.get('name')
+
+self._livy_conn_id = livy_conn_id
+self._polling_interval = polling_interval
+self._timeout = timeout
+
+self._livy_hook = None
+self._batch_id = None
+self._start_ts = None
+
+def _init_hook(self):
+if self._livy_conn_id:
+if self._livy_hook 

[GitHub] [airflow] feluelle commented on a change in pull request #6090: [AIRFLOW-5470] Add Apache Livy REST operator

2019-09-21 Thread GitBox
feluelle commented on a change in pull request #6090: [AIRFLOW-5470] Add Apache 
Livy REST operator
URL: https://github.com/apache/airflow/pull/6090#discussion_r326870689
 
 

 ##
 File path: airflow/contrib/hooks/livy_hook.py
 ##
 @@ -0,0 +1,297 @@
+# -*- coding: utf-8 -*-
+#
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+
+"""
+This module contains the Apache Livy hook.
+"""
+
+import re
+from enum import Enum
+import json
+import requests
+
+from airflow.exceptions import AirflowException
+from airflow.hooks.base_hook import BaseHook
+from airflow.utils.log.logging_mixin import LoggingMixin
+
+
+class BatchState(Enum):
+"""
+Batch session states
+"""
+NOT_STARTED = 'not_started'
+STARTING = 'starting'
+RUNNING = 'running'
+IDLE = 'idle'
+BUSY = 'busy'
+SHUTTING_DOWN = 'shutting_down'
+ERROR = 'error'
+DEAD = 'dead'
+KILLED = 'killed'
+SUCCESS = 'success'
+
+
+TERMINAL_STATES = {
+BatchState.SUCCESS,
+BatchState.DEAD,
+BatchState.KILLED,
+BatchState.ERROR,
+}
+
+
+class LivyHook(BaseHook, LoggingMixin):
+"""
+Hook for Apache Livy through the REST API.
+
+For more information about the API refer to
+https://livy.apache.org/docs/latest/rest-api.html
+
+:param livy_conn_id: reference to a pre-defined Livy Connection.
+:type livy_conn_id: str
+"""
+def __init__(self, livy_conn_id='livy_default'):
+super(LivyHook, self).__init__(livy_conn_id)
+self._livy_conn_id = livy_conn_id
+self._build_base_url()
+
+def _build_base_url(self):
+"""
+Build connection URL
+"""
+params = self.get_connection(self._livy_conn_id)
+
+base_url = params.host
+
+if not base_url:
+raise AirflowException("Missing Livy endpoint hostname")
+
+if '://' not in base_url:
+base_url = '{}://{}'.format('http', base_url)
+if not re.search(r':\d+$', base_url):
+base_url = '{}:{}'.format(base_url, str(params.port or 8998))
+
+self._base_url = base_url
+
+def get_conn(self):
+pass
+
+def post_batch(self, *args, **kwargs):
+"""
+Perform request to submit batch
+"""
+
+batch_submit_body = json.dumps(LivyHook.build_post_batch_body(*args, 
**kwargs))
+headers = {'Content-Type': 'application/json'}
+
+self.log.info("Submitting job {} to {}".format(batch_submit_body, 
self._base_url))
+response = requests.post(self._base_url + '/batches', 
data=batch_submit_body, headers=headers)
+self.log.debug("Got response: {}".format(response.text))
+
+if response.status_code != 201:
+raise AirflowException("Could not submit batch. Status code: 
{}".format(response.status_code))
+
+batch_id = LivyHook._parse_post_response(response.json())
+if batch_id is None:
+raise AirflowException("Unable to parse a batch session id")
+self.log.info("Batch submitted with session id: {}".format(batch_id))
+
+return batch_id
+
+def get_batch(self, session_id):
+"""
+Fetch info about the specified batch
+:param session_id: identifier of the batch sessions
+:type session_id: int
+"""
+LivyHook._validate_session_id(session_id)
+
+self.log.debug("Fetching info for batch session {}".format(session_id))
+response = requests.get('{}/batches/{}'.format(self._base_url, 
session_id))
+
+if response.status_code != 200:
+self.log.warning("Got status code {} for session 
{}".format(response.status_code, session_id))
+raise AirflowException("Unable to fetch batch with id: 
{}".format(session_id))
+
+return response.json()
+
+def get_batch_state(self, session_id):
+"""
+Fetch the state of the specified batch
+:param session_id: identifier of the batch sessions
+:type session_id: int
+"""
+LivyHook._validate_session_id(session_id)
+
+self.log.debug("Fetching info for batch session {}".format(session_id))
+response = requests.get('{}/batches/{}/state'.format(self._base_url, 
session_id))
+
+ 

[GitHub] [airflow] feluelle commented on a change in pull request #6090: [AIRFLOW-5470] Add Apache Livy REST operator

2019-09-21 Thread GitBox
feluelle commented on a change in pull request #6090: [AIRFLOW-5470] Add Apache 
Livy REST operator
URL: https://github.com/apache/airflow/pull/6090#discussion_r326871022
 
 

 ##
 File path: airflow/contrib/operators/livy_operator.py
 ##
 @@ -0,0 +1,175 @@
+# -*- coding: utf-8 -*-
+#
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+
+"""
+This module contains the Apache Livy operator.
+"""
+
+from time import sleep, gmtime, mktime
+
+from airflow.models import BaseOperator
+from airflow.utils.decorators import apply_defaults
+from airflow.exceptions import AirflowException
+from airflow.contrib.hooks.livy_hook import LivyHook, BatchState, 
TERMINAL_STATES
+
+
+class LivyOperator(BaseOperator):
+"""
 
 Review comment:
   Could you add a short description here of what the operator does?
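
   For instance, something along these lines (a sketch only; the wording is
   inferred from the hook calls, polling parameters and terminal states used
   in this PR, so adjust as needed):

   ```python
   from airflow.models import BaseOperator


   class LivyOperator(BaseOperator):
       """
       Wraps the Apache Livy batch REST API: submits a Spark application as
       a Livy batch session and can poll the session until it reaches a
       terminal state (success, error, dead or killed).
       """
   ```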




[GitHub] [airflow] feluelle commented on a change in pull request #6090: [AIRFLOW-5470] Add Apache Livy REST operator

2019-09-21 Thread GitBox
feluelle commented on a change in pull request #6090: [AIRFLOW-5470] Add Apache 
Livy REST operator
URL: https://github.com/apache/airflow/pull/6090#discussion_r326871046
 
 

 ##
 File path: airflow/contrib/operators/livy_operator.py
 ##
 @@ -0,0 +1,175 @@
+# -*- coding: utf-8 -*-
+#
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+
+"""
+This module contains the Apache Livy operator.
+"""
+
+from time import sleep, gmtime, mktime
+
+from airflow.models import BaseOperator
+from airflow.utils.decorators import apply_defaults
+from airflow.exceptions import AirflowException
+from airflow.contrib.hooks.livy_hook import LivyHook, BatchState, 
TERMINAL_STATES
+
+
+class LivyOperator(BaseOperator):
+"""
+:param file: Path of the  file containing the application to execute 
(required).
+:type file: str
+:param class_name: Application Java/Spark main class string.
+:type class_name: str
+:param args: Command line arguments for the application s.
 
 Review comment:
   ```suggestion
   :param args: Command line arguments for the application.
   ```




[GitHub] [airflow] feluelle commented on a change in pull request #6090: [AIRFLOW-5470] Add Apache Livy REST operator

2019-09-21 Thread GitBox
feluelle commented on a change in pull request #6090: [AIRFLOW-5470] Add Apache 
Livy REST operator
URL: https://github.com/apache/airflow/pull/6090#discussion_r326871710
 
 

 ##
 File path: tests/contrib/hooks/test_livy_hook.py
 ##
 @@ -0,0 +1,428 @@
+# -*- coding: utf-8 -*-
+#
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+
+import unittest
+from unittest.mock import patch, MagicMock
+import json
+from requests.exceptions import RequestException
+
+from airflow import AirflowException
+from airflow.models import Connection
+from airflow.utils import db
+
+from airflow.contrib.hooks.livy_hook import LivyHook, BatchState
+
+TEST_ID = 100
+SAMPLE_GET_RESPONSE = {'id': TEST_ID, 'state': BatchState.SUCCESS.value}
+
+
+class TestLivyHook(unittest.TestCase):
+
+def setUp(self):
 
 Review comment:
   I think [setUpClass](https://docs.python.org/3/library/unittest.html#unittest.TestCase.setUpClass) would also work, and it wouldn't run those "db merges" before each test. You can also specify [tearDownClass](https://docs.python.org/3/library/unittest.html#unittest.TestCase.tearDownClass) to drop the connections.
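
   A rough sketch of what that could look like (the connection id
   `livy_default` and port 8998 come from the hook's defaults; the host value
   is purely illustrative, not the one used in this test file):

   ```python
   import unittest

   from airflow.models import Connection
   from airflow.utils import db


   class TestLivyHook(unittest.TestCase):

       @classmethod
       def setUpClass(cls):
           # Merge the test connection into the db once for the whole test
           # case instead of before every single test.
           db.merge_conn(Connection(conn_id='livy_default', host='host', port=8998))

       @classmethod
       def tearDownClass(cls):
           # Drop the test connection again so the metadata db is left clean.
           with db.create_session() as session:
               session.query(Connection).filter(
                   Connection.conn_id == 'livy_default').delete()
   ```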
   
   




[GitHub] [airflow] feluelle commented on a change in pull request #6090: [AIRFLOW-5470] Add Apache Livy REST operator

2019-09-21 Thread GitBox
feluelle commented on a change in pull request #6090: [AIRFLOW-5470] Add Apache 
Livy REST operator
URL: https://github.com/apache/airflow/pull/6090#discussion_r326871060
 
 

 ##
 File path: airflow/contrib/operators/livy_operator.py
 ##
 @@ -0,0 +1,175 @@
+# -*- coding: utf-8 -*-
+#
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+
+"""
+This module contains the Apache Livy operator.
+"""
+
+from time import sleep, gmtime, mktime
+
+from airflow.models import BaseOperator
+from airflow.utils.decorators import apply_defaults
+from airflow.exceptions import AirflowException
+from airflow.contrib.hooks.livy_hook import LivyHook, BatchState, 
TERMINAL_STATES
+
+
+class LivyOperator(BaseOperator):
+"""
+:param file: Path of the  file containing the application to execute 
(required).
+:type file: str
+:param class_name: Application Java/Spark main class string.
+:type class_name: str
+:param args: Command line arguments for the application s.
+:type args: list
+:param jars: jars to be used in this sessions.
+:type jars: list
+:param py_files: Python files to be used in this session.
+:type py_files: list
+:param files: files to be used in this session.
+:type files: list
+:param driver_memory: Amount of memory to use for the driver process  
string.
 
 Review comment:
   ```suggestion
   :param driver_memory: Amount of memory to use for the driver process.
   ```




[GitHub] [airflow] feluelle commented on a change in pull request #6090: [AIRFLOW-5470] Add Apache Livy REST operator

2019-09-21 Thread GitBox
feluelle commented on a change in pull request #6090: [AIRFLOW-5470] Add Apache 
Livy REST operator
URL: https://github.com/apache/airflow/pull/6090#discussion_r326870791
 
 

 ##
 File path: airflow/contrib/hooks/livy_hook.py
 ##
 @@ -0,0 +1,297 @@
+# -*- coding: utf-8 -*-
+#
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+
+"""
+This module contains the Apache Livy hook.
+"""
+
+import re
+from enum import Enum
+import json
+import requests
+
+from airflow.exceptions import AirflowException
+from airflow.hooks.base_hook import BaseHook
+from airflow.utils.log.logging_mixin import LoggingMixin
+
+
+class BatchState(Enum):
+"""
+Batch session states
+"""
+NOT_STARTED = 'not_started'
+STARTING = 'starting'
+RUNNING = 'running'
+IDLE = 'idle'
+BUSY = 'busy'
+SHUTTING_DOWN = 'shutting_down'
+ERROR = 'error'
+DEAD = 'dead'
+KILLED = 'killed'
+SUCCESS = 'success'
+
+
+TERMINAL_STATES = {
+BatchState.SUCCESS,
+BatchState.DEAD,
+BatchState.KILLED,
+BatchState.ERROR,
+}
+
+
+class LivyHook(BaseHook, LoggingMixin):
+"""
+Hook for Apache Livy through the REST API.
+
+For more information about the API refer to
+https://livy.apache.org/docs/latest/rest-api.html
+
+:param livy_conn_id: reference to a pre-defined Livy Connection.
+:type livy_conn_id: str
+"""
+def __init__(self, livy_conn_id='livy_default'):
+super(LivyHook, self).__init__(livy_conn_id)
+self._livy_conn_id = livy_conn_id
+self._build_base_url()
+
+def _build_base_url(self):
+"""
+Build connection URL
+"""
+params = self.get_connection(self._livy_conn_id)
+
+base_url = params.host
+
+if not base_url:
+raise AirflowException("Missing Livy endpoint hostname")
+
+if '://' not in base_url:
+base_url = '{}://{}'.format('http', base_url)
+if not re.search(r':\d+$', base_url):
+base_url = '{}:{}'.format(base_url, str(params.port or 8998))
+
+self._base_url = base_url
+
+def get_conn(self):
+pass
+
+def post_batch(self, *args, **kwargs):
+"""
+Perform request to submit batch
+"""
+
+batch_submit_body = json.dumps(LivyHook.build_post_batch_body(*args, 
**kwargs))
+headers = {'Content-Type': 'application/json'}
+
+self.log.info("Submitting job {} to {}".format(batch_submit_body, 
self._base_url))
+response = requests.post(self._base_url + '/batches', 
data=batch_submit_body, headers=headers)
+self.log.debug("Got response: {}".format(response.text))
+
+if response.status_code != 201:
+raise AirflowException("Could not submit batch. Status code: 
{}".format(response.status_code))
+
+batch_id = LivyHook._parse_post_response(response.json())
+if batch_id is None:
+raise AirflowException("Unable to parse a batch session id")
+self.log.info("Batch submitted with session id: {}".format(batch_id))
+
+return batch_id
+
+def get_batch(self, session_id):
+"""
+Fetch info about the specified batch
+:param session_id: identifier of the batch sessions
+:type session_id: int
+"""
+LivyHook._validate_session_id(session_id)
+
+self.log.debug("Fetching info for batch session {}".format(session_id))
+response = requests.get('{}/batches/{}'.format(self._base_url, 
session_id))
+
+if response.status_code != 200:
+self.log.warning("Got status code {} for session 
{}".format(response.status_code, session_id))
+raise AirflowException("Unable to fetch batch with id: 
{}".format(session_id))
+
+return response.json()
+
+def get_batch_state(self, session_id):
+"""
+Fetch the state of the specified batch
+:param session_id: identifier of the batch sessions
+:type session_id: int
+"""
+LivyHook._validate_session_id(session_id)
+
+self.log.debug("Fetching info for batch session {}".format(session_id))
+response = requests.get('{}/batches/{}/state'.format(self._base_url, 
session_id))
+
+ 

[GitHub] [airflow] feluelle commented on a change in pull request #6090: [AIRFLOW-5470] Add Apache Livy REST operator

2019-09-21 Thread GitBox
feluelle commented on a change in pull request #6090: [AIRFLOW-5470] Add Apache 
Livy REST operator
URL: https://github.com/apache/airflow/pull/6090#discussion_r326871162
 
 

 ##
 File path: airflow/contrib/operators/livy_operator.py
 ##
 @@ -0,0 +1,175 @@
+# -*- coding: utf-8 -*-
+#
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+
+"""
+This module contains the Apache Livy operator.
+"""
+
+from time import sleep, gmtime, mktime
+
+from airflow.models import BaseOperator
+from airflow.utils.decorators import apply_defaults
+from airflow.exceptions import AirflowException
+from airflow.contrib.hooks.livy_hook import LivyHook, BatchState, 
TERMINAL_STATES
+
+
+class LivyOperator(BaseOperator):
+"""
+:param file: Path of the  file containing the application to execute 
(required).
+:type file: str
+:param class_name: Application Java/Spark main class string.
+:type class_name: str
+:param args: Command line arguments for the application s.
+:type args: list
+:param jars: jars to be used in this sessions.
+:type jars: list
+:param py_files: Python files to be used in this session.
+:type py_files: list
+:param files: files to be used in this session.
+:type files: list
+:param driver_memory: Amount of memory to use for the driver process  
string.
 
 Review comment:
   There are some more of those - see below.




[GitHub] [airflow] feluelle commented on a change in pull request #6090: [AIRFLOW-5470] Add Apache Livy REST operator

2019-09-21 Thread GitBox
feluelle commented on a change in pull request #6090: [AIRFLOW-5470] Add Apache 
Livy REST operator
URL: https://github.com/apache/airflow/pull/6090#discussion_r326870662
 
 

 ##
 File path: airflow/contrib/hooks/livy_hook.py
 ##
 @@ -0,0 +1,297 @@
+# -*- coding: utf-8 -*-
+#
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+
+"""
+This module contains the Apache Livy hook.
+"""
+
+import re
+from enum import Enum
+import json
+import requests
+
+from airflow.exceptions import AirflowException
+from airflow.hooks.base_hook import BaseHook
+from airflow.utils.log.logging_mixin import LoggingMixin
+
+
+class BatchState(Enum):
+"""
+Batch session states
+"""
+NOT_STARTED = 'not_started'
+STARTING = 'starting'
+RUNNING = 'running'
+IDLE = 'idle'
+BUSY = 'busy'
+SHUTTING_DOWN = 'shutting_down'
+ERROR = 'error'
+DEAD = 'dead'
+KILLED = 'killed'
+SUCCESS = 'success'
+
+
+TERMINAL_STATES = {
+BatchState.SUCCESS,
+BatchState.DEAD,
+BatchState.KILLED,
+BatchState.ERROR,
+}
+
+
+class LivyHook(BaseHook, LoggingMixin):
+"""
+Hook for Apache Livy through the REST API.
+
+For more information about the API refer to
+https://livy.apache.org/docs/latest/rest-api.html
+
+:param livy_conn_id: reference to a pre-defined Livy Connection.
+:type livy_conn_id: str
+"""
+def __init__(self, livy_conn_id='livy_default'):
+super(LivyHook, self).__init__(livy_conn_id)
+self._livy_conn_id = livy_conn_id
+self._build_base_url()
+
+def _build_base_url(self):
+"""
+Build connection URL
+"""
+params = self.get_connection(self._livy_conn_id)
+
+base_url = params.host
+
+if not base_url:
+raise AirflowException("Missing Livy endpoint hostname")
+
+if '://' not in base_url:
+base_url = '{}://{}'.format('http', base_url)
+if not re.search(r':\d+$', base_url):
+base_url = '{}:{}'.format(base_url, str(params.port or 8998))
+
+self._base_url = base_url
+
+def get_conn(self):
+pass
+
+def post_batch(self, *args, **kwargs):
+"""
+Perform request to submit batch
 
 Review comment:
   I think it is missing in every function you documented.




[GitHub] [airflow] feluelle commented on a change in pull request #6090: [AIRFLOW-5470] Add Apache Livy REST operator

2019-09-21 Thread GitBox
feluelle commented on a change in pull request #6090: [AIRFLOW-5470] Add Apache 
Livy REST operator
URL: https://github.com/apache/airflow/pull/6090#discussion_r326870579
 
 

 ##
 File path: airflow/contrib/hooks/livy_hook.py
 ##
 @@ -0,0 +1,297 @@
+# -*- coding: utf-8 -*-
+#
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+
+"""
+This module contains the Apache Livy hook.
+"""
+
+import re
+from enum import Enum
+import json
+import requests
+
+from airflow.exceptions import AirflowException
+from airflow.hooks.base_hook import BaseHook
+from airflow.utils.log.logging_mixin import LoggingMixin
+
+
+class BatchState(Enum):
+"""
+Batch session states
+"""
+NOT_STARTED = 'not_started'
+STARTING = 'starting'
+RUNNING = 'running'
+IDLE = 'idle'
+BUSY = 'busy'
+SHUTTING_DOWN = 'shutting_down'
+ERROR = 'error'
+DEAD = 'dead'
+KILLED = 'killed'
+SUCCESS = 'success'
+
+
+TERMINAL_STATES = {
+BatchState.SUCCESS,
+BatchState.DEAD,
+BatchState.KILLED,
+BatchState.ERROR,
+}
+
+
+class LivyHook(BaseHook, LoggingMixin):
+"""
+Hook for Apache Livy through the REST API.
+
+For more information about the API refer to
+https://livy.apache.org/docs/latest/rest-api.html
+
+:param livy_conn_id: reference to a pre-defined Livy Connection.
+:type livy_conn_id: str
+"""
+def __init__(self, livy_conn_id='livy_default'):
+super(LivyHook, self).__init__(livy_conn_id)
+self._livy_conn_id = livy_conn_id
+self._build_base_url()
+
+def _build_base_url(self):
+"""
+Build connection URL
+"""
+params = self.get_connection(self._livy_conn_id)
+
+base_url = params.host
+
+if not base_url:
+raise AirflowException("Missing Livy endpoint hostname")
+
+if '://' not in base_url:
+base_url = '{}://{}'.format('http', base_url)
+if not re.search(r':\d+$', base_url):
+base_url = '{}:{}'.format(base_url, str(params.port or 8998))
+
+self._base_url = base_url
+
+def get_conn(self):
+pass
+
+def post_batch(self, *args, **kwargs):
+"""
+Perform request to submit batch
+"""
+
+batch_submit_body = json.dumps(LivyHook.build_post_batch_body(*args, 
**kwargs))
+headers = {'Content-Type': 'application/json'}
+
+self.log.info("Submitting job {} to {}".format(batch_submit_body, 
self._base_url))
+response = requests.post(self._base_url + '/batches', 
data=batch_submit_body, headers=headers)
+self.log.debug("Got response: {}".format(response.text))
+
+if response.status_code != 201:
+raise AirflowException("Could not submit batch. Status code: 
{}".format(response.status_code))
+
+batch_id = LivyHook._parse_post_response(response.json())
+if batch_id is None:
+raise AirflowException("Unable to parse a batch session id")
+self.log.info("Batch submitted with session id: {}".format(batch_id))
+
+return batch_id
+
+def get_batch(self, session_id):
+"""
+Fetch info about the specified batch
+:param session_id: identifier of the batch sessions
+:type session_id: int
+"""
+LivyHook._validate_session_id(session_id)
+
+self.log.debug("Fetching info for batch session {}".format(session_id))
+response = requests.get('{}/batches/{}'.format(self._base_url, 
session_id))
+
+if response.status_code != 200:
+self.log.warning("Got status code {} for session 
{}".format(response.status_code, session_id))
+raise AirflowException("Unable to fetch batch with id: 
{}".format(session_id))
+
+return response.json()
+
+def get_batch_state(self, session_id):
+"""
+Fetch the state of the specified batch
+:param session_id: identifier of the batch sessions
+:type session_id: int
+"""
+LivyHook._validate_session_id(session_id)
+
+self.log.debug("Fetching info for batch session {}".format(session_id))
+response = requests.get('{}/batches/{}/state'.format(self._base_url, 
session_id))
+
+ 

[GitHub] [airflow] feluelle commented on a change in pull request #6090: [AIRFLOW-5470] Add Apache Livy REST operator

2019-09-21 Thread GitBox
feluelle commented on a change in pull request #6090: [AIRFLOW-5470] Add Apache 
Livy REST operator
URL: https://github.com/apache/airflow/pull/6090#discussion_r326870633
 
 

 ##
 File path: airflow/contrib/hooks/livy_hook.py
 ##
 @@ -0,0 +1,297 @@
+# -*- coding: utf-8 -*-
+#
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+
+"""
+This module contains the Apache Livy hook.
+"""
+
+import re
+from enum import Enum
+import json
+import requests
+
+from airflow.exceptions import AirflowException
+from airflow.hooks.base_hook import BaseHook
+from airflow.utils.log.logging_mixin import LoggingMixin
+
+
+class BatchState(Enum):
+"""
+Batch session states
+"""
+NOT_STARTED = 'not_started'
+STARTING = 'starting'
+RUNNING = 'running'
+IDLE = 'idle'
+BUSY = 'busy'
+SHUTTING_DOWN = 'shutting_down'
+ERROR = 'error'
+DEAD = 'dead'
+KILLED = 'killed'
+SUCCESS = 'success'
+
+
+TERMINAL_STATES = {
+BatchState.SUCCESS,
+BatchState.DEAD,
+BatchState.KILLED,
+BatchState.ERROR,
+}
+
+
+class LivyHook(BaseHook, LoggingMixin):
+"""
+Hook for Apache Livy through the REST API.
+
+For more information about the API refer to
+https://livy.apache.org/docs/latest/rest-api.html
+
+:param livy_conn_id: reference to a pre-defined Livy Connection.
+:type livy_conn_id: str
+"""
+def __init__(self, livy_conn_id='livy_default'):
+super(LivyHook, self).__init__(livy_conn_id)
+self._livy_conn_id = livy_conn_id
+self._build_base_url()
+
+def _build_base_url(self):
+"""
+Build connection URL
+"""
+params = self.get_connection(self._livy_conn_id)
+
+base_url = params.host
+
+if not base_url:
+raise AirflowException("Missing Livy endpoint hostname")
+
+if '://' not in base_url:
+base_url = '{}://{}'.format('http', base_url)
+if not re.search(r':\d+$', base_url):
+base_url = '{}:{}'.format(base_url, str(params.port or 8998))
+
+self._base_url = base_url
+
+def get_conn(self):
+pass
+
+def post_batch(self, *args, **kwargs):
+"""
+Perform request to submit batch
 
 Review comment:
   Could you document the `:return:` and the `:rtype:`, please? :)
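
   For example, for `post_batch` something like this (a sketch; the return
   type is inferred from how the batch id is used elsewhere in this PR, so
   please double-check it):

   ```python
   def post_batch(self, *args, **kwargs):
       """
       Perform request to submit batch.

       :return: the id of the batch session created by Livy
       :rtype: int
       """
   ```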




[GitHub] [airflow] feluelle commented on a change in pull request #6090: [AIRFLOW-5470] Add Apache Livy REST operator

2019-09-21 Thread GitBox
feluelle commented on a change in pull request #6090: [AIRFLOW-5470] Add Apache 
Livy REST operator
URL: https://github.com/apache/airflow/pull/6090#discussion_r326871236
 
 

 ##
 File path: airflow/contrib/operators/livy_operator.py
 ##
 @@ -0,0 +1,175 @@
+# -*- coding: utf-8 -*-
+#
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+
+"""
+This module contains the Apache Livy operator.
+"""
+
+from time import sleep, gmtime, mktime
+
+from airflow.models import BaseOperator
+from airflow.utils.decorators import apply_defaults
+from airflow.exceptions import AirflowException
+from airflow.contrib.hooks.livy_hook import LivyHook, BatchState, 
TERMINAL_STATES
+
+
+class LivyOperator(BaseOperator):
+"""
+:param file: Path of the  file containing the application to execute 
(required).
+:type file: str
+:param class_name: Application Java/Spark main class string.
+:type class_name: str
+:param args: Command line arguments for the application s.
+:type args: list
+:param jars: jars to be used in this sessions.
+:type jars: list
+:param py_files: Python files to be used in this session.
+:type py_files: list
+:param files: files to be used in this session.
+:type files: list
+:param driver_memory: Amount of memory to use for the driver process  
string.
+:type driver_memory: str
+:param driver_cores: Number of cores to use for the driver process int.
+:type driver_cores: str
+:param executor_memory: Amount of memory to use per executor process  
string.
+:type executor_memory: str
+:param executor_cores: Number of cores to use for each executor  int.
+:type executor_cores: str
+:param num_executors: Number of executors to launch for this session  int.
+:type num_executors: str
+:param archives: Archives to be used in this session.
+:type archives: list
+:param queue: The name of the YARN queue to which submitted string.
+:type queue: str
+:param name: The name of this session  string.
+:type name: str
+:param conf: Spark configuration properties.
+:type conf: dict
+:param proxy_user: User to impersonate when running the job.
+:type proxy_user: str
+:param livy_conn_id: reference to a pre-defined Livy Connection.
+:type livy_conn_id: str
+:param polling_interval: time in seconds between polling for job 
completion. Don't poll for values >=0
+:type polling_interval: int
+:param timeout: for a value greater than zero, number of seconds to poll 
before killing the batch.
+:type timeout: int
+"""
+
+@apply_defaults
+def __init__(
+self,
+file=None,
+args=None,
+conf=None,
+livy_conn_id='livy_default',
+polling_interval=0,
+timeout=24 * 3600,
+*vargs,
+**kwargs
 
 Review comment:
   Would you mind splitting up the kwargs here, too?
   Then you could also remove those lines below.
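
   Something along these lines (a sketch showing only a handful of the
   parameters; the remaining Spark parameters would be added the same way):

   ```python
   from airflow.models import BaseOperator
   from airflow.utils.decorators import apply_defaults


   class LivyOperator(BaseOperator):

       @apply_defaults
       def __init__(self,
                    file=None,
                    class_name=None,
                    args=None,
                    conf=None,
                    proxy_user=None,
                    livy_conn_id='livy_default',
                    polling_interval=0,
                    timeout=24 * 3600,
                    *vargs,
                    **kwargs):
           super(LivyOperator, self).__init__(*vargs, **kwargs)

           # With explicit keyword arguments the kwargs.get() assignments are
           # no longer needed; the values go straight into the dict.
           self._spark_params = {
               'file': file,
               'class_name': class_name,
               'args': args,
               'conf': conf,
               'proxy_user': proxy_user,
           }
           self._livy_conn_id = livy_conn_id
           self._polling_interval = polling_interval
           self._timeout = timeout
   ```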




[GitHub] [airflow] feluelle commented on a change in pull request #6090: [AIRFLOW-5470] Add Apache Livy REST operator

2019-09-21 Thread GitBox
feluelle commented on a change in pull request #6090: [AIRFLOW-5470] Add Apache 
Livy REST operator
URL: https://github.com/apache/airflow/pull/6090#discussion_r326871346
 
 

 ##
 File path: airflow/contrib/operators/livy_operator.py
 ##
 @@ -0,0 +1,175 @@
+# -*- coding: utf-8 -*-
+#
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+
+"""
+This module contains the Apache Livy operator.
+"""
+
+from time import sleep, gmtime, mktime
+
+from airflow.models import BaseOperator
+from airflow.utils.decorators import apply_defaults
+from airflow.exceptions import AirflowException
+from airflow.contrib.hooks.livy_hook import LivyHook, BatchState, 
TERMINAL_STATES
+
+
+class LivyOperator(BaseOperator):
+"""
+:param file: Path of the  file containing the application to execute 
(required).
+:type file: str
+:param class_name: Application Java/Spark main class string.
+:type class_name: str
+:param args: Command line arguments for the application s.
+:type args: list
+:param jars: jars to be used in this sessions.
+:type jars: list
+:param py_files: Python files to be used in this session.
+:type py_files: list
+:param files: files to be used in this session.
+:type files: list
+:param driver_memory: Amount of memory to use for the driver process  
string.
+:type driver_memory: str
+:param driver_cores: Number of cores to use for the driver process int.
+:type driver_cores: str
+:param executor_memory: Amount of memory to use per executor process  
string.
+:type executor_memory: str
+:param executor_cores: Number of cores to use for each executor  int.
+:type executor_cores: str
+:param num_executors: Number of executors to launch for this session  int.
+:type num_executors: str
+:param archives: Archives to be used in this session.
+:type archives: list
+:param queue: The name of the YARN queue to which submitted string.
+:type queue: str
+:param name: The name of this session  string.
+:type name: str
+:param conf: Spark configuration properties.
+:type conf: dict
+:param proxy_user: User to impersonate when running the job.
+:type proxy_user: str
+:param livy_conn_id: reference to a pre-defined Livy Connection.
+:type livy_conn_id: str
+:param polling_interval: time in seconds between polling for job 
completion. Don't poll for values >=0
+:type polling_interval: int
+:param timeout: for a value greater than zero, number of seconds to poll 
before killing the batch.
+:type timeout: int
+"""
+
+@apply_defaults
+def __init__(
+self,
+file=None,
+args=None,
+conf=None,
+livy_conn_id='livy_default',
+polling_interval=0,
+timeout=24 * 3600,
+*vargs,
+**kwargs
+):
+super(LivyOperator, self).__init__(*vargs, **kwargs)
+
+self._spark_params = {
+'file': file,
+'args': args,
+'conf': conf,
+}
+
+self._spark_params['proxy_user'] = kwargs.get('proxy_user')
+self._spark_params['class_name'] = kwargs.get('class_name')
+self._spark_params['jars'] = kwargs.get('jars')
+self._spark_params['py_files'] = kwargs.get('py_files')
+self._spark_params['files'] = kwargs.get('files')
+self._spark_params['driver_memory'] = kwargs.get('driver_memory')
+self._spark_params['driver_cores'] = kwargs.get('driver_cores')
+self._spark_params['executor_memory'] = kwargs.get('executor_memory')
+self._spark_params['executor_cores'] = kwargs.get('executor_cores')
+self._spark_params['num_executors'] = kwargs.get('num_executors')
+self._spark_params['archives'] = kwargs.get('archives')
+self._spark_params['queue'] = kwargs.get('queue')
+self._spark_params['name'] = kwargs.get('name')
+
+self._livy_conn_id = livy_conn_id
+self._polling_interval = polling_interval
+self._timeout = timeout
+
+self._livy_hook = None
+self._batch_id = None
+self._start_ts = None
+
+def _init_hook(self):
+if self._livy_conn_id:
+if self._livy_hook 

[GitHub] [airflow] feluelle commented on a change in pull request #6090: [AIRFLOW-5470] Add Apache Livy REST operator

2019-09-21 Thread GitBox
feluelle commented on a change in pull request #6090: [AIRFLOW-5470] Add Apache 
Livy REST operator
URL: https://github.com/apache/airflow/pull/6090#discussion_r326871517
 
 

 ##
 File path: airflow/contrib/operators/livy_operator.py
 ##
 @@ -0,0 +1,175 @@
+# -*- coding: utf-8 -*-
+#
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+
+"""
+This module contains the Apache Livy operator.
+"""
+
+from time import sleep, gmtime, mktime
+
+from airflow.models import BaseOperator
+from airflow.utils.decorators import apply_defaults
+from airflow.exceptions import AirflowException
+from airflow.contrib.hooks.livy_hook import LivyHook, BatchState, 
TERMINAL_STATES
+
+
+class LivyOperator(BaseOperator):
+"""
+:param file: Path of the  file containing the application to execute 
(required).
+:type file: str
+:param class_name: Application Java/Spark main class string.
+:type class_name: str
+:param args: Command line arguments for the application s.
+:type args: list
+:param jars: jars to be used in this sessions.
+:type jars: list
+:param py_files: Python files to be used in this session.
+:type py_files: list
+:param files: files to be used in this session.
+:type files: list
+:param driver_memory: Amount of memory to use for the driver process  
string.
+:type driver_memory: str
+:param driver_cores: Number of cores to use for the driver process int.
+:type driver_cores: str
+:param executor_memory: Amount of memory to use per executor process  
string.
+:type executor_memory: str
+:param executor_cores: Number of cores to use for each executor  int.
+:type executor_cores: str
+:param num_executors: Number of executors to launch for this session  int.
+:type num_executors: str
+:param archives: Archives to be used in this session.
+:type archives: list
+:param queue: The name of the YARN queue to which submitted string.
+:type queue: str
+:param name: The name of this session  string.
+:type name: str
+:param conf: Spark configuration properties.
+:type conf: dict
+:param proxy_user: User to impersonate when running the job.
+:type proxy_user: str
+:param livy_conn_id: reference to a pre-defined Livy Connection.
+:type livy_conn_id: str
+:param polling_interval: time in seconds between polling for job 
completion. Don't poll for values >=0
+:type polling_interval: int
+:param timeout: for a value greater than zero, number of seconds to poll 
before killing the batch.
+:type timeout: int
+"""
+
+@apply_defaults
+def __init__(
+self,
+file=None,
+args=None,
+conf=None,
+livy_conn_id='livy_default',
+polling_interval=0,
+timeout=24 * 3600,
+*vargs,
 
 Review comment:
   Why did you call them `*vargs` instead of `*args`? Your documentation only mentions `*args`, and I think `*args` is also more commonly used.

