dstandish commented on a change in pull request #18447:
URL: https://github.com/apache/airflow/pull/18447#discussion_r718004578



##########
File path: airflow/providers/amazon/aws/hooks/redshift_statement.py
##########
@@ -0,0 +1,144 @@
+#
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+"""Interact with AWS Redshift, using the boto3 library."""
+
+from typing import Callable, Dict, Optional, Tuple, Union
+
+import redshift_connector
+from redshift_connector import Connection as RedshiftConnection
+
+from airflow.hooks.dbapi import DbApiHook
+
+
+class RedshiftStatementHook(DbApiHook):

Review comment:
       I think RedshiftSqlHook is an acceptable name, though I reached out on Slack to see if anyone has advice on how we could end up with `RedshiftHook` as the SQL hook (which I think would be preferable, if it can be done).

##########
File path: airflow/providers/amazon/aws/hooks/redshift_statement.py
##########
@@ -0,0 +1,144 @@
+#
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+"""Interact with AWS Redshift, using the boto3 library."""
+
+from typing import Callable, Dict, Optional, Tuple, Union
+
+import redshift_connector
+from redshift_connector import Connection as RedshiftConnection
+
+from airflow.hooks.dbapi import DbApiHook
+
+
+class RedshiftStatementHook(DbApiHook):

Review comment:
       @mik-laj agreed with @josh-fell on this one, and I think I now agree with this too.
   
   Namely, we should do the following:
   * add RedshiftSqlHook to the existing `redshift` module (i.e. in the same module as the existing RedshiftHook)
   * deprecate RedshiftHook, renaming it to RedshiftClusterHook -- this doesn't necessarily need to happen in this PR (rough sketch below)
   
   Those are the names we intend to stick with (i.e. ultimately we keep two hooks, `RedshiftSqlHook` and `RedshiftClusterHook`).
   
   In the next major release _for this provider_ (which, now that I think of it, does not have to coincide with Airflow 3.0, since providers have a separate release schedule), we remove `RedshiftHook`, and we are left with `RedshiftSqlHook` and `RedshiftClusterHook`.
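   For concreteness, here's a minimal sketch of the deprecation step, assuming the boto3-based hook keeps its current `AwsBaseHook` base class (the exact module layout, docstrings, and warning text are my assumptions, not part of this PR):

```python
# Hypothetical shape of airflow/providers/amazon/aws/hooks/redshift.py after
# the rename: RedshiftHook survives only as a deprecated alias.
import warnings

from airflow.providers.amazon.aws.hooks.base_aws import AwsBaseHook


class RedshiftClusterHook(AwsBaseHook):
    """Interact with AWS Redshift clusters via boto3 (formerly RedshiftHook)."""

    def __init__(self, *args, **kwargs) -> None:
        # Same behavior as the existing RedshiftHook: a thin wrapper that
        # pins the boto3 client type to "redshift".
        kwargs["client_type"] = "redshift"
        super().__init__(*args, **kwargs)


class RedshiftHook(RedshiftClusterHook):
    """Deprecated name, kept only for backwards compatibility."""

    def __init__(self, *args, **kwargs) -> None:
        warnings.warn(
            "RedshiftHook has been renamed to RedshiftClusterHook; this alias "
            "will be removed in the next major release of this provider.",
            DeprecationWarning,
            stacklevel=2,
        )
        super().__init__(*args, **kwargs)
```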

##########
File path: docs/apache-airflow-providers-amazon/connections/redshift.rst
##########
@@ -0,0 +1,68 @@
+.. Licensed to the Apache Software Foundation (ASF) under one
+    or more contributor license agreements.  See the NOTICE file
+    distributed with this work for additional information
+    regarding copyright ownership.  The ASF licenses this file
+    to you under the Apache License, Version 2.0 (the
+    "License"); you may not use this file except in compliance
+    with the License.  You may obtain a copy of the License at
+
+ ..   http://www.apache.org/licenses/LICENSE-2.0
+
+ .. Unless required by applicable law or agreed to in writing,
+    software distributed under the License is distributed on an
+    "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+    KIND, either express or implied.  See the License for the
+    specific language governing permissions and limitations
+    under the License.
+
+.. _howto/connection:redshift:
+
+Amazon Redshift Connection
+==========================
+
+The Redshift connection type enables integrations with Redshift.
+
+Authenticating to Amazon Redshift
+---------------------------------
+
+Authentication may be performed using any of the authentication methods 
supported by `redshift_connector 
<https://github.com/aws/amazon-redshift-python-driver>`_ such as via direct 
credentials, IAM authentication, or using an Identity Provider (IdP) plugin.
+
+Default Connection IDs
+-----------------------
+
+The default connection ID is ``redshift_default``.
+
+Configuring the Connection
+--------------------------
+
+
+User

Review comment:
       @mik-laj, in this hook, `login` and `schema` are [renamed](https://github.com/apache/airflow/pull/18447/files#diff-a6af6323bf6848d19c7d8066497ebebe20d221398642f5e1340b336c26eb2d87R167) to `User` and `Database` in the Airflow UI. Just wondering whether this is something we encourage, since I haven't seen it before. To me, as someone who prefers a secrets backend to the Airflow metastore, it would seem better to stick with `login` and `schema` and use those uniformly: those are the Connection attribute names, and they are what you'd have to use in secrets backend storage (e.g. with a JSON backend) or when using Connection.get_uri to generate the Airflow URI.
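   To make this concrete, a hedged illustration (all connection values made up) of why the canonical attribute names are what you interact with outside the UI:

```python
from airflow.models.connection import Connection

# Whatever labels the form shows, the stored attributes are still
# `login` and `schema`, and those are what a secrets backend or URI uses.
conn = Connection(
    conn_type="redshift",
    login="awsuser",   # surfaced as "User" in the relabeled form
    password="secret",
    host="examplecluster.abc123.us-west-2.redshift.amazonaws.com",
    port=5439,
    schema="dev",      # surfaced as "Database" in the relabeled form
)

# The Airflow URI you'd store in, e.g., a JSON secrets backend keeps the
# canonical names and positions:
print(conn.get_uri())
# redshift://awsuser:secret@examplecluster.abc123.us-west-2.redshift.amazonaws.com:5439/dev
```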

##########
File path: airflow/providers/amazon/aws/operators/redshift.py
##########
@@ -0,0 +1,73 @@
+#
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+from typing import Any, Optional
+
+from airflow.models import BaseOperator
+from airflow.providers.amazon.aws.hooks.redshift import RedshiftSQLHook
+
+
+class RedshiftSQLOperator(BaseOperator):
+    """
+    Executes SQL Statements against an Amazon Redshift cluster
+
+    .. seealso::
+        For more information on how to use this operator, take a look at the 
guide:
+        :ref:`howto/operator:RedshiftSQLOperator`
+
+    :param sql: the sql code to be executed
+    :type sql: Can receive a str representing a sql statement,
+        a list of str (sql statements)
+    :param redshift_conn_id: reference to
+        :ref:`Amazon Redshift connection id<howto/connection:redshift>`
+    :type redshift_conn_id: str
+    :param parameters: (optional) the parameters to render the SQL query with.
+    :type parameters: dict or iterable
+    :param autocommit: if True, each command is automatically committed.
+        (default value: False)
+    :type autocommit: bool
+    """
+
+    template_fields = ('sql',)
+    template_ext = ('.sql',)
+
+    def __init__(
+        self,
+        *,
+        sql: Any,

Review comment:
       ```suggestion
        sql: Union[str, List[str]],
       ```
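   One small follow-up if this suggestion is taken: the hunk above imports only `Any` and `Optional` from `typing`, so the import line would need to change too, presumably to something like the line below (dropping `Any` if nothing else uses it):

```python
from typing import List, Optional, Union
```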

##########
File path: docs/apache-airflow-providers-amazon/operators/redshift.rst
##########
@@ -0,0 +1,96 @@
+ .. Licensed to the Apache Software Foundation (ASF) under one
+    or more contributor license agreements.  See the NOTICE file
+    distributed with this work for additional information
+    regarding copyright ownership.  The ASF licenses this file
+    to you under the Apache License, Version 2.0 (the
+    "License"); you may not use this file except in compliance
+    with the License.  You may obtain a copy of the License at
+
+ ..   http://www.apache.org/licenses/LICENSE-2.0
+
+ .. Unless required by applicable law or agreed to in writing,
+    software distributed under the License is distributed on an
+    "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+    KIND, either express or implied.  See the License for the
+    specific language governing permissions and limitations
+    under the License.
+
+.. _howto/operator:RedshiftSqlOperator:
+
+RedshiftSqlOperator
+===================
+
+.. contents::
+  :depth: 1
+  :local:
+
+Overview
+--------
+
+Use the :class:`RedshiftSqlOperator 
<airflow.providers.amazon.aws.operators.redshift>` to execute
+statements against an Amazon Redshift cluster.
+
+:class:`RedshiftSqlOperator 
<airflow.providers.amazon.aws.operators.redshift.RedshiftSqlOperator>` works 
together with
+:class:`RedshiftSqlHook 
<airflow.providers.amazon.aws.hooks.redshift.RedshiftSqlHook>` to establish
+connections with Amazon Redshift.
+
+
+example_redshift.py
+-------------------
+
+Purpose
+"""""""
+
+This is a basic example dag for using :class:`RedshiftSqlOperator 
<airflow.providers.amazon.aws.operators.redshift>`
+to execute statements against an Amazon Redshift cluster.
+
+Create a table
+""""""""""""""
+
+In the following code we are creating a table called "fruit".
+
+.. exampleinclude:: 
/../../airflow/providers/amazon/aws/example_dags/example_redshift.py
+    :language: python
+    :start-after: [START howto_operator_redshift_create_table]
+    :end-before: [END howto_operator_redshift_create_table]
+
+Insert data into a table
+""""""""""""""""""""""""
+
+In the following code we insert a few sample rows into the "fruit" table.
+
+.. exampleinclude:: 
/../../airflow/providers/amazon/aws/example_dags/example_redshift.py
+    :language: python
+    :start-after: [START howto_operator_redshift_populate_table]
+    :end-before: [END howto_operator_redshift_populate_table]
+
+Fetching records from a table
+"""""""""""""""""""""""""""""
+
+Retrieving all records from the "fruit" table.

Review comment:
       Alright, this may be nitpicky, but it strikes me that the `select *` examples (in your example DAG) might be confusing to a newcomer. In reality, this is not the way you would want to use this operator unless you passed a `handler` to `hook.run` (because the output of the select is not captured anywhere). You've written "retrieving", but I don't think you actually retrieve the rows anywhere (e.g. neither printing them to the logs nor routing them to CSV -- I don't think any rows are actually fetched).
   
   To be more realistic, I would suggest simply converting them to `create table my_table AS select *`. Otherwise a newcomer might look at this and assume that the records selected here are actually sent somewhere.
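   For concreteness, a rough sketch of the two alternatives (hook and operator names follow this PR; the `fruit` table, connection id, and handler are illustrative):

```python
from airflow.providers.amazon.aws.hooks.redshift import RedshiftSQLHook

# Option 1: actually fetch the rows by passing a handler to hook.run, so the
# SELECT output is captured (returned here, and visible in task logs).
hook = RedshiftSQLHook(redshift_conn_id="redshift_default")
rows = hook.run("SELECT * FROM fruit;", handler=lambda cursor: cursor.fetchall())
print(rows)

# Option 2: keep the example a pure SQL task whose effect is visible in the
# database itself, e.g.:
#   CREATE TABLE more_fruit AS SELECT * FROM fruit;
```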

##########
File path: tests/providers/amazon/aws/hooks/test_redshift.py
##########
@@ -103,3 +106,51 @@ def test_cluster_status_returns_available_cluster(self):
         hook = RedshiftHook(aws_conn_id='aws_default')
         status = hook.cluster_status('test_cluster')
         assert status == 'available'
+
+
+class TestRedshiftSQLHookConn(unittest.TestCase):
+    def setUp(self):
+        super().setUp()
+
+        self.connection = Connection(login='login', password='password', 
host='host', port=5439, schema="dev")
+
+        class UnitTestRedshiftSQLHook(RedshiftSQLHook):
+            conn_name_attr = "redshift_conn_id"
+            conn_type = 'redshift+redshift_connector'

Review comment:
       I'm curious why you're defining this subclass here, since it doesn't appear to differ from RedshiftSQLHook.
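   In case it helps, a hedged sketch of the simplification I have in mind: patch `get_connection` on the real hook instead of defining a subclass (connection values copied from the test above):

```python
from unittest import mock

from airflow.models import Connection
from airflow.providers.amazon.aws.hooks.redshift import RedshiftSQLHook

connection = Connection(login='login', password='password', host='host', port=5439, schema='dev')

with mock.patch.object(RedshiftSQLHook, 'get_connection', return_value=connection):
    hook = RedshiftSQLHook()
    # hook.get_conn() would now build the redshift_connector connection
    # from the mocked Airflow connection above.
```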



