potix2 commented on a change in pull request #11434:
URL: https://github.com/apache/airflow/pull/11434#discussion_r502952746



##########
File path: airflow/providers/amazon/aws/transfers/s3_to_redshift.py
##########
@@ -87,13 +87,11 @@ def __init__(
         self.verify = verify
         self.copy_options = copy_options or []
         self.autocommit = autocommit
-        self._s3_hook = None
-        self._postgres_hook = None
 
-    def execute(self, context):
-        self._postgres_hook = PostgresHook(postgres_conn_id=self.redshift_conn_id)
-        self._s3_hook = S3Hook(aws_conn_id=self.aws_conn_id, verify=self.verify)
-        credentials = self._s3_hook.get_credentials()
+    def execute(self, context: Dict[str, Any]) -> None:
+        postgres_hook = PostgresHook(postgres_conn_id=self.redshift_conn_id)

Review comment:
       These variables are referenced only from `execute()`, so I changed their scope. If we keep them as instance attributes, their types have to be Optional[T]. I think that is a little redundant, because they always have actual values inside `execute()`.
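
The point above can be sketched with stand-in classes (these are not the real Airflow hooks, just a minimal illustration of the two scoping styles):

```python
from typing import Any, Dict, Optional


class FakeHook:
    """Stand-in for PostgresHook/S3Hook; not the real Airflow classes."""

    def run(self) -> str:
        return "ok"


class AttributeStyle:
    # Attribute form: the annotation must be Optional[FakeHook] because
    # the hook does not exist until execute() runs.
    def __init__(self) -> None:
        self._hook: Optional[FakeHook] = None

    def execute(self, context: Dict[str, Any]) -> str:
        self._hook = FakeHook()
        # Callers and type checkers must handle the None case, even though
        # the attribute is always set by the time it is used here.
        assert self._hook is not None
        return self._hook.run()


class LocalStyle:
    # Local form: the hook is created where it is used, so its type is
    # simply FakeHook and no None handling is needed.
    def execute(self, context: Dict[str, Any]) -> str:
        hook = FakeHook()
        return hook.run()
```

With the local-variable form, a type checker such as mypy infers a non-Optional type for the hook, which removes the redundant None checks.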




----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
[email protected]
