eladkal commented on a change in pull request #12564:
URL: https://github.com/apache/airflow/pull/12564#discussion_r528819179



##########
File path: airflow/providers/snowflake/transfers/s3_to_snowflake.py
##########
@@ -58,22 +77,38 @@ def __init__(
         file_format: str,
         schema: str,  # TODO: shouldn't be required, rely on session/user defaults
         columns_array: Optional[list] = None,
+        warehouse: Optional[str] = None,
         autocommit: bool = True,
         snowflake_conn_id: str = 'snowflake_default',
+        role: Optional[str] = None,
+        authenticator: Optional[str] = None,
+        session_parameters: Optional[dict] = None,
         **kwargs,
     ) -> None:
         super().__init__(**kwargs)
         self.s3_keys = s3_keys
         self.table = table
+        self.warehouse = warehouse
         self.stage = stage
         self.file_format = file_format
         self.schema = schema
         self.columns_array = columns_array
         self.autocommit = autocommit
         self.snowflake_conn_id = snowflake_conn_id
+        self.role = role
+        self.authenticator = authenticator
+        self.session_parameters = session_parameters
 
     def execute(self, context: Any) -> None:
-        snowflake_hook = SnowflakeHook(snowflake_conn_id=self.snowflake_conn_id)
+        snowflake_hook = SnowflakeHook(
+            snowflake_conn_id=self.snowflake_conn_id,
+            warehouse=self.warehouse,
+            database=self.database,

Review comment:
       Why? Like `schema`, you can also select different DBs.
   Do you see value in limiting it to only the DB defined in the connection?
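
   To illustrate the point: a minimal sketch of the pattern being discussed, where `database` is accepted as an optional override just like `schema`, and `None` falls through so the hook uses the connection's default. This is not the Airflow code itself; `build_hook_kwargs` is a hypothetical stand-in for the operator's `execute()` wiring.

   ```python
   from typing import Any, Dict, Optional

   def build_hook_kwargs(
       snowflake_conn_id: str = "snowflake_default",
       warehouse: Optional[str] = None,
       database: Optional[str] = None,
       role: Optional[str] = None,
       authenticator: Optional[str] = None,
       session_parameters: Optional[dict] = None,
   ) -> Dict[str, Any]:
       # Pass every optional override straight through; a None value lets
       # the hook fall back to whatever is defined on the connection.
       return {
           "snowflake_conn_id": snowflake_conn_id,
           "warehouse": warehouse,
           "database": database,
           "role": role,
           "authenticator": authenticator,
           "session_parameters": session_parameters,
       }
   ```

   With this shape, a DAG author can target a different database per task while untouched fields still come from the connection defaults.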
   




----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
[email protected]

