hyangminj opened a new issue, #28830:
URL: https://github.com/apache/airflow/issues/28830

   ### Description
   
   Airflow provides an Amazon DynamoDB to Amazon S3 transfer operator:  
   
https://airflow.apache.org/docs/apache-airflow-providers-amazon/stable/operators/transfer/dynamodb_to_s3.html
   
   Many data engineers build their "export DynamoDB data to S3" pipelines on the 
`export_table_to_point_in_time` API, which exports table data from within the 
point-in-time recovery (PITR) window:   
   
https://boto3.amazonaws.com/v1/documentation/api/latest/reference/services/dynamodb.html#DynamoDB.Client.export_table_to_point_in_time
   
   I would appreciate it if Airflow supported this natively.  
   
   ### Use case/motivation
   
   My daily batch job exports its data with the PITR option. All of its tasks are 
written with apache-airflow-providers-amazon except the 
"export_table_to_point_in_time" task, which currently uses only the PythonOperator. 
I would like to unify the tasks under the apache-airflow-providers-amazon library. 
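   For context, a minimal sketch of the PythonOperator workaround described above: a callable that assembles the arguments for boto3's `export_table_to_point_in_time` and invokes it. The table ARN, bucket name, prefix, and export time below are hypothetical placeholders, not values from this issue.

   ```python
   from datetime import datetime, timezone


   def build_export_kwargs(table_arn, bucket, prefix, export_time=None):
       """Assemble keyword arguments for DynamoDB's
       export_table_to_point_in_time API call."""
       kwargs = {
           "TableArn": table_arn,
           "S3Bucket": bucket,
           "S3Prefix": prefix,
           "ExportFormat": "DYNAMODB_JSON",
       }
       if export_time is not None:
           # Must fall within the table's PITR window.
           kwargs["ExportTime"] = export_time
       return kwargs


   def export_table(**context):
       # Candidate python_callable for a PythonOperator. Requires boto3
       # and AWS credentials, so it is only a sketch here.
       import boto3

       client = boto3.client("dynamodb")
       response = client.export_table_to_point_in_time(
           **build_export_kwargs(
               "arn:aws:dynamodb:us-east-1:123456789012:table/my_table",
               "my-export-bucket",
               "exports/",
               export_time=datetime(2023, 1, 9, tzinfo=timezone.utc),
           )
       )
       return response["ExportDescription"]["ExportArn"]
   ```

   A native operator could wrap exactly this call and poll `describe_export` until the export completes, the way other provider operators do.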
   
   ### Related issues
   
   _No response_
   
   ### Are you willing to submit a PR?
   
   - [X] Yes I am willing to submit a PR!
   
   ### Code of Conduct
   
   - [X] I agree to follow this project's [Code of 
Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)
   

