RosterIn commented on a change in pull request #6578: [DEPENDS ON
#6575][DEPENDS ON #6577][AIRFLOW-5907] Add S3 to MySql Operator
URL: https://github.com/apache/airflow/pull/6578#discussion_r346815026
##########
File path: airflow/operators/s3_to_mysql.py
##########
@@ -0,0 +1,65 @@
+from airflow.hooks.mysql_hook import MySqlHook
+from airflow.models import BaseOperator
+from airflow.providers.aws.hooks.s3 import S3Hook
+from airflow.utils.decorators import apply_defaults
+
+
+class S3ToMySqlTransfer(BaseOperator):
+ """
+ Loads a file from S3 into a MySQL table.
+
+ :param s3_source_key: The path to the file (S3 key) that will be loaded into MySQL.
+ :type s3_source_key: str
+ :param mysql_table: The MySQL table into which the data will be loaded.
+ :type mysql_table: str
+ :param delimiter: The delimiter for the file.
+ :type delimiter: str
+ :param header_rows: The number of header rows in the input file.
+ :type header_rows: int
+ :param aws_conn_id: The S3 connection that contains the credentials to the S3 bucket.
+ :type aws_conn_id: str
+ :param mysql_conn_id: The MySQL connection that contains the credentials to the MySQL database.
+ :type mysql_conn_id: str
+ """
+
+ template_fields = ('s3_source_key',)
Review comment:
Can you also add `mysql_table`? It's possible that some workflows will need to template the target table.
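For illustration, here is a minimal, self-contained sketch (not the actual Airflow implementation) of why declaring `mysql_table` in `template_fields` matters: Airflow renders each attribute named there with Jinja before `execute()` runs, so a templated table name like `staging_{{ ds_nodash }}` resolves per run. The `S3ToMySqlSketch` class and its `render_template_fields` method below are hypothetical stand-ins, and plain `str.format` stands in for Jinja templating.

```python
class S3ToMySqlSketch:
    """Toy stand-in for the operator under review; not Airflow code."""

    # Per the review suggestion, both the S3 key and the target table
    # are declared as templated fields.
    template_fields = ('s3_source_key', 'mysql_table')

    def __init__(self, s3_source_key, mysql_table):
        self.s3_source_key = s3_source_key
        self.mysql_table = mysql_table

    def render_template_fields(self, context):
        # Airflow renders each attribute named in template_fields with
        # Jinja; str.format is used here only to imitate that behaviour.
        for field in self.template_fields:
            setattr(self, field, getattr(self, field).format(**context))


op = S3ToMySqlSketch(
    s3_source_key='exports/{ds}/data.csv',
    mysql_table='staging_{ds_nodash}',
)
op.render_template_fields({'ds': '2019-11-15', 'ds_nodash': '20191115'})
print(op.s3_source_key)  # exports/2019-11-15/data.csv
print(op.mysql_table)    # staging_20191115
```

Without `mysql_table` in `template_fields`, the table name would be passed to MySQL literally, braces and all, which is exactly the workflow limitation the comment points out.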
----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
For queries about this service, please contact Infrastructure at:
[email protected]
With regards,
Apache Git Services