Brooke-white commented on a change in pull request #18447:
URL: https://github.com/apache/airflow/pull/18447#discussion_r721602520



##########
File path: airflow/providers/amazon/aws/hooks/redshift.py
##########
@@ -126,3 +133,91 @@ def create_cluster_snapshot(self, snapshot_identifier: str, cluster_identifier:
             ClusterIdentifier=cluster_identifier,
         )
         return response['Snapshot'] if response['Snapshot'] else None
+
+
+class RedshiftSQLHook(DbApiHook):
+    """
+    Execute statements against Amazon Redshift, using redshift_connector.
+
+    This hook requires the redshift_conn_id connection, which must be
+    initialized with a schema. Additional connection options can be
+    passed to extra as a JSON string.

Review comment:
       Thank you for taking the time to write this all up; I really appreciate all the time and effort you've put into this PR :)
   
   I've removed the parameter validation from the hook, so we defer to the driver for that. I added examples to the connection docs for authentication via IAM and via an identity provider, and added test cases for the connection-parsing behavior as well. 8f843d6
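   For reference, an extras blob for IAM-based auth might look like the following sketch. The field names follow redshift_connector's `connect()` keyword arguments (`iam`, `cluster_identifier`, `db_user`, `profile`); the values are placeholders, and the docs added in the commit above are the authoritative examples:

```python
import json

# Hypothetical extras for IAM authentication. Field names mirror
# redshift_connector's connect() keyword arguments; values are placeholders.
iam_extras = json.dumps({
    "iam": True,
    "cluster_identifier": "my-cluster",  # placeholder cluster name
    "db_user": "awsuser",                # placeholder database user
    "profile": "default",                # AWS credentials profile to use
})

# The hook would read this back out of the connection's `extra` field.
parsed = json.loads(iam_extras)
```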
   
   Thank you for the feedback on the driver's error messages when parameter combinations aren't valid. I'll pass this along to the Redshift driver team and see what we can do here; at the very least, improving the error messages and validation should be doable :). We *really* should be identifying these bad cases up front so users receive a clearer error message rather than continuing down the code path with an incorrect configuration.
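   To make the deferred-validation point concrete, here is a minimal, hypothetical sketch of the connection-parsing idea the new test cases cover: base connection fields are merged with the `extra` JSON, and the combined kwargs are handed to the driver, which is then responsible for rejecting invalid combinations. The helper name and merge rules here are assumptions, not the hook's actual code:

```python
import json

def build_connect_kwargs(login, password, schema, host, port, extra_json):
    """Merge base connection fields with JSON extras.

    A sketch of the connection-parsing behavior discussed above, not the
    hook's actual implementation; redshift_connector validates the final
    parameter combination.
    """
    kwargs = {
        "user": login,
        "password": password,
        "database": schema,
        "host": host,
        "port": port,
    }
    if extra_json:
        # Extras take precedence so users can override the base fields.
        kwargs.update(json.loads(extra_json))
    return kwargs

kwargs = build_connect_kwargs(
    login="awsuser",
    password="secret",
    schema="dev",
    host="examplecluster.abc123.us-west-1.redshift.amazonaws.com",
    port=5439,
    extra_json='{"ssl": true}',
)
```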




-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]

For queries about this service, please contact Infrastructure at:
[email protected]