Taragolis commented on issue #28830:
URL: https://github.com/apache/airflow/issues/28830#issuecomment-1377388529

   BTW, the hooks provided by the Amazon provider are basically thin wrappers 
around `boto3` clients / resources.
   So you can always access any `boto3` client method from within the hook:
   
   ```python
   from datetime import datetime

   from airflow.providers.amazon.aws.hooks.dynamodb import DynamoDBHook

   hook = DynamoDBHook(aws_conn_id="awesome-connection-id", region_name="us-east-1")
   # DynamoDBHook creates a `resource` (the high-level interface),
   # but you can reach the regular low-level client via `meta.client`.
   # This could potentially be made uniform in the future, see:
   # https://github.com/apache/airflow/discussions/28560
   client = hook.conn.meta.client
   client.export_table_to_point_in_time(
       TableArn='string',  # placeholder values, per the boto3 docs
       ExportTime=datetime(2015, 1, 1),
       ClientToken='string',
       S3Bucket='string',
       S3BucketOwner='string',
       S3Prefix='string',
       S3SseAlgorithm='AES256',  # or 'KMS'
       S3SseKmsKeyId='string',   # only needed when S3SseAlgorithm='KMS'
       ExportFormat='DYNAMODB_JSON',  # or 'ION'
   )
   ```
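   As a side note, `export_table_to_point_in_time` returns an `ExportDescription` containing an `ExportArn`, which you can poll with the client's `describe_export` until the export finishes. A minimal polling helper might look like this (a sketch, not tested against AWS; `client` is assumed to be the `hook.conn.meta.client` object from the snippet above):

   ```python
   import time


   def wait_for_export(client, export_arn, poll_interval=30):
       """Poll DynamoDB's describe_export until the export leaves IN_PROGRESS.

       `client` is assumed to be a boto3 DynamoDB client (e.g. the
       `hook.conn.meta.client` from the snippet above).  Returns the final
       export status, i.e. "COMPLETED" or "FAILED".
       """
       while True:
           description = client.describe_export(ExportArn=export_arn)
           status = description["ExportDescription"]["ExportStatus"]
           if status != "IN_PROGRESS":
               return status
           time.sleep(poll_interval)
   ```

   In a real DAG you would probably wrap this in a sensor or a deferrable operator rather than blocking a worker slot, but it shows the shape of the API.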


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
