ferruzzi commented on PR #30501:
URL: https://github.com/apache/airflow/pull/30501#issuecomment-1538976807

   @utkarsharma2 Did you actually try this live?  After the merge, the system 
test fails because [this 
addition](https://github.com/apache/airflow/pull/30501/files#diff-849549a5587d7b4e2e386cb23d5b8da9cddfd5b88b80fb931c36cb931e943e6fR131-R139)
 is missing the `file_size` parameter, and after adding that in (a rough 
sketch of the call with it added follows the traceback) it fails with this 
message:
   
   ```
   ERROR    airflow.task:taskinstance.py:1900 Task failed with exception
   Traceback (most recent call last):
     File 
"/opt/airflow/airflow/providers/amazon/aws/transfers/dynamodb_to_s3.py", line 
142, in execute
       self._export_table_to_point_in_time()
     File 
"/opt/airflow/airflow/providers/amazon/aws/transfers/dynamodb_to_s3.py", line 
157, in _export_table_to_point_in_time
       ExportFormat=self.export_format,
     File "/usr/local/lib/python3.7/site-packages/botocore/client.py", line 
530, in _api_call
       return self._make_api_call(operation_name, kwargs)
     File "/usr/local/lib/python3.7/site-packages/botocore/client.py", line 
960, in _make_api_call
       raise error_class(parsed_response, operation_name)
   botocore.exceptions.ClientError: An error occurred (ValidationException) 
when calling the ExportTableToPointInTime operation: One or more parameter 
values were invalid: tableArn is not a valid ARN
   ```
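
   For reference, this is roughly what the failing system test call looks like with the missing parameter added (a hedged sketch, not the exact test: the table and bucket names are placeholders, `s3_bucket_name` is the operator's existing argument, and I'm assuming the new export format is exposed as an `export_format` constructor arg based on the traceback above):

   ```
   from airflow.providers.amazon.aws.transfers.dynamodb_to_s3 import DynamoDBToS3Operator

   backup_db = DynamoDBToS3Operator(
       task_id="backup_db_to_point_in_time",
       dynamodb_table_name="example-table",        # placeholder table name
       s3_bucket_name="example-export-bucket",     # placeholder bucket name
       export_format="DYNAMODB_JSON",              # assumed constructor arg, per the traceback
       file_size=1000,                             # the parameter missing from the system test addition
   )
   ```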
   
   
   [Here](https://github.com/apache/airflow/pull/30501/files#diff-5a4b3715577fad0a7bbf023f861a5f4f1e0fb1abb4b2a71ecb804a068d9de4edR153)
 you are passing `dynamodb_table_name` instead of the ARN.  I'm not sure how 
this would have worked, and I probably should have caught it in the review.
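
   Something along these lines should fix it (an untested sketch using plain boto3; it just resolves the table ARN from the table name with `describe_table` before calling the export, rather than passing the name directly):

   ```
   import boto3

   # Placeholder values; in the operator these come from dynamodb_table_name,
   # the S3 bucket argument, and export_format.
   table_name = "example-table"
   export_bucket = "example-export-bucket"

   client = boto3.client("dynamodb")

   # ExportTableToPointInTime requires the full table ARN, not the bare
   # table name, so look it up first.
   table_arn = client.describe_table(TableName=table_name)["Table"]["TableArn"]

   client.export_table_to_point_in_time(
       TableArn=table_arn,
       S3Bucket=export_bucket,
       ExportFormat="DYNAMODB_JSON",
   )
   ```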


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]

For queries about this service, please contact Infrastructure at:
[email protected]
