eladkal commented on a change in pull request #18755:
URL: https://github.com/apache/airflow/pull/18755#discussion_r722651040
##########
File path: airflow/providers/amazon/aws/transfers/mysql_to_s3.py
##########
@@ -60,6 +60,10 @@ class MySQLToS3Operator(BaseOperator):
:type index: str
:param header: whether to include header or not into the S3 file
:type header: bool
+ :param file_format: the destination file format, only csv and parquet are supported.
+ :type file_format: str
+ :param pd_parquet_kwargs: arguments to include in pd.to_parquet
+ :type pd_parquet_kwargs: dict
Review comment:
We don't really need a dedicated kwargs per `file_format`.
We can deprecate `pd_csv_kwargs` and have a generic `pandas_kwargs`, similar to
what we have in
[hive](https://github.com/apache/airflow/blob/da99c3fa6c366d762bba9fbf3118cc3b3d55f6b4/airflow/providers/apache/hive/hooks/hive.py#L372)
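To illustrate the suggestion, here is a minimal sketch of how the deprecated `pd_csv_kwargs` could be folded into a generic `pandas_kwargs`. The helper name `resolve_pandas_kwargs` and the precedence rule (explicit `pandas_kwargs` wins) are assumptions for illustration, not part of the PR:

```python
import warnings


def resolve_pandas_kwargs(pandas_kwargs=None, pd_csv_kwargs=None):
    """Merge the deprecated ``pd_csv_kwargs`` into a generic ``pandas_kwargs``.

    ``pd_csv_kwargs`` keeps working but raises a DeprecationWarning;
    entries passed via ``pandas_kwargs`` take precedence on key clashes.
    (Hypothetical helper sketched for this review comment.)
    """
    merged = dict(pd_csv_kwargs or {})
    if pd_csv_kwargs:
        warnings.warn(
            "pd_csv_kwargs is deprecated, use pandas_kwargs instead",
            DeprecationWarning,
            stacklevel=2,
        )
    merged.update(pandas_kwargs or {})
    return merged
```

The merged dict can then be passed to either `DataFrame.to_csv` or `DataFrame.to_parquet` depending on `file_format`, so no per-format kwargs parameter is needed.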