guotongfei commented on a change in pull request #18755:
URL: https://github.com/apache/airflow/pull/18755#discussion_r725652119



##########
File path: airflow/providers/amazon/aws/transfers/mysql_to_s3.py
##########
@@ -92,15 +117,40 @@ def __init__(
         self.aws_conn_id = aws_conn_id
         self.verify = verify
 
-        self.pd_csv_kwargs = pd_csv_kwargs or {}
-        if "path_or_buf" in self.pd_csv_kwargs:
-            raise AirflowException('The argument path_or_buf is not allowed, please remove it')
-        if "index" not in self.pd_csv_kwargs:
-            self.pd_csv_kwargs["index"] = index
-        if "header" not in self.pd_csv_kwargs:
-            self.pd_csv_kwargs["header"] = header
+        if file_format == "csv":
+            self.file_format = FILE_FORMAT.CSV
+        else:
+            self.file_format = FILE_FORMAT.PARQUET
+
+        if pd_csv_kwargs:
+            warnings.warn(
+                "pd_csv_kwargs is deprecated. Please use pd_kwargs.",
+                DeprecationWarning,
+                stacklevel=2,
+            )
+        if index or header:
+            warnings.warn(
+                "index and header are deprecated. Please pass them via pd_kwargs.",
+                DeprecationWarning,
+                stacklevel=2,
+            )

Review comment:
       Yeah, besides the file name, all parameters are optional for pd.to_csv 
and pd.to_parquet, so I think we should leave them optional for the 
MySQLToS3Operator and let users use pd_kwargs to set them.




-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]

For queries about this service, please contact Infrastructure at:
[email protected]
