pyerbiz commented on issue #16730:
URL: https://github.com/apache/airflow/issues/16730#issuecomment-877830718


   @potiuk Is there an example DAG for the SFTP-to-S3 transfer? Below is what I have so far:
   
   ```
   import os
   
   from airflow import models
   from airflow.providers.amazon.aws.transfers.sftp_to_s3 import SFTPToS3Operator
   from airflow.utils.dates import days_ago
   
   S3_BUCKET = os.environ.get("S3_BUCKET", "test-bucket")
   S3_KEY = os.environ.get("S3_KEY", "key")
   
   with models.DAG(
       "example_sftp_to_s3",
       schedule_interval=None,
       start_date=days_ago(1),  # Override to match your needs
   ) as dag:
   
       # [START howto_sftp_transfer_data_to_s3]
       create_sftp_to_s3_job = SFTPToS3Operator(
           task_id="create_sftp_to_s3_job",
            sftp_conn_id="sftp_conn_id",  # Airflow connection ID for the SFTP server (placeholder)
            sftp_path="sftp_path",  # remote file path on the SFTP server (placeholder)
            s3_conn_id="s3_conn_id",  # Airflow connection ID for AWS (placeholder)
            s3_bucket=S3_BUCKET,
            s3_key=S3_KEY,
       )
       # [END howto_sftp_transfer_data_to_s3]
   ```
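
   For the DAG above to run, the two connection IDs it references have to exist in Airflow. As a minimal sketch (the hosts and credentials below are placeholder assumptions, not values from the issue), Airflow will resolve a connection ID from an `AIRFLOW_CONN_<CONN_ID>` environment variable holding a connection URI:

   ```
   import os

   # Airflow looks up connections named "sftp_conn_id" / "s3_conn_id" from
   # environment variables with the connection ID upper-cased. The URIs
   # below are illustrative placeholders only.
   os.environ["AIRFLOW_CONN_SFTP_CONN_ID"] = "sftp://user:password@sftp.example.com:22"
   os.environ["AIRFLOW_CONN_S3_CONN_ID"] = "aws://"
   ```

   Alternatively, the same connections can be created in the Airflow UI or with `airflow connections add`.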


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
