mariotaddeucci commented on a change in pull request #18027:
URL: https://github.com/apache/airflow/pull/18027#discussion_r702325038



##########
File path: airflow/providers/amazon/aws/transfers/s3_to_redshift.py
##########
@@ -96,33 +114,73 @@ def __init__(
         self.column_list = column_list
         self.copy_options = copy_options or []
         self.autocommit = autocommit
-        self.truncate_table = truncate_table
+        self.method = method
+        self.upsert_keys = upsert_keys
+
+        if self.method not in AVAILABLE_METHODS:
+            raise AirflowException(f'Method not found! Available methods: {AVAILABLE_METHODS}')
 
-    def _build_copy_query(self, credentials_block: str, copy_options: str) -> str:
+    def _build_copy_query(self, copy_destination: str, credentials_block: str, copy_options: str) -> str:
         column_names = "(" + ", ".join(self.column_list) + ")" if self.column_list else ''
         return f"""
-                    COPY {self.schema}.{self.table} {column_names}
+                    COPY {copy_destination} {column_names}
                     FROM 's3://{self.s3_bucket}/{self.s3_key}'
                     with credentials
                     '{credentials_block}'
                     {copy_options};
         """
 
+    def _get_table_primary_key(self, postgres_hook):

Review comment:
       This is nice. Thinking about it now, this method would work on Postgres too 🤔
   It could be a not-implemented method on `DbApiHook`, and each hook that extends `DbApiHook` would implement its own custom query to get the primary key. A rough sketch of that idea is below.
   @JavierLopezT what do you think about it?
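
   For illustration, a minimal sketch of that shape, assuming a hypothetical `get_table_primary_key` method. The method name, the trimmed stand-in classes, and the `information_schema` query are all illustrative, not part of this PR or of the existing Airflow API:

```python
# Hypothetical sketch of the suggestion above -- names and query are
# illustrative, not part of this PR or the current Airflow hooks.
from typing import List, Optional


class DbApiHook:
    """Stand-in for Airflow's DbApiHook, trimmed to the relevant pieces."""

    def get_records(self, sql, parameters=None):
        raise NotImplementedError  # provided by the real DbApiHook

    def get_table_primary_key(self, table: str, schema: str = "public") -> Optional[List[str]]:
        """Declared but not implemented: each dialect-specific hook overrides
        this with its own query for the table's primary-key columns."""
        raise NotImplementedError


class PostgresHook(DbApiHook):
    def get_table_primary_key(self, table: str, schema: str = "public") -> Optional[List[str]]:
        # information_schema is ANSI-standard, so this same query also
        # happens to work on Redshift.
        sql = """
            SELECT kcu.column_name
            FROM information_schema.table_constraints tco
            JOIN information_schema.key_column_usage kcu
                ON kcu.constraint_name = tco.constraint_name
               AND kcu.constraint_schema = tco.constraint_schema
            WHERE tco.constraint_type = 'PRIMARY KEY'
              AND kcu.table_schema = %s
              AND kcu.table_name = %s
        """
        pk_columns = [row[0] for row in self.get_records(sql, (schema, table))]
        return pk_columns or None
```

   Since `information_schema` is ANSI-standard, a single base implementation might even cover both Postgres and Redshift, with overrides only where a dialect diverges.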




-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]

For queries about this service, please contact Infrastructure at:
[email protected]

