Paula,

PutS3Object is capable of using S3's multi-part upload feature to upload very large files, and as part of normal operation it periodically checks the status of outstanding multi-part uploads so that stale ones can be aged off. Ideally, your AWS credentials would include the s3:ListBucketMultipartUploads permission to allow that check. However, I believe PutS3Object should work as-is: the message is typically logged as a warning, and it should not stop your flowfiles from being put to S3.
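If you do want to quiet the warning, something along these lines granted to your user should cover the multipart list check. This is only a rough sketch (the bucket name is a placeholder, and your AWS admin may scope it differently):

    {
      "Version": "2012-10-17",
      "Statement": [
        {
          "Effect": "Allow",
          "Action": "s3:ListBucketMultipartUploads",
          "Resource": "arn:aws:s3:::your-bucket-name"
        }
      ]
    }

Note that s3:ListBucketMultipartUploads is a bucket-level action, so it has to be granted on the bucket ARN itself rather than on a key prefix. That is likely why your object-level permissions work fine for copying with the AWS CLI while NiFi's multipart list check is still denied.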
Are your files not getting through to S3?

Thanks,
James

On Fri, Feb 23, 2018 at 4:58 AM, Paula Jäppinen <[email protected]> wrote:
> Hi all!
>
> Could you please help me with a PutS3Object processor error? I'm trying to
> load a file to S3, but got this error message:
>
> "PutS3Object[id=9842b19e-ced7-135a-0000-000000000000] AccessDenied
> checking S3 Multipart Upload list for <bucket_name>: Access Denied
> (Service: Amazon S3; Status Code: 403; Error Code: AccessDenied; Request
> ID: 766BF2370000000) ** The configured user does not have the
> s3:ListBucketMultipartUploads permission for this bucket, S3 ageoff cannot
> occur without this permission. Next ageoff check time is being advanced by
> interval to prevent checking on every upload **"
>
> I have access to the subfolders, but don't have permissions to the S3
> bucket. Why is it possible to copy data using the AWS CLI, but not using
> NiFi's PutS3Object processor? The version of NiFi is 1.2.0.
>
> Thanks a lot if someone knows how to resolve this issue!
>
> Br,
>
> Paula
