[ https://issues.apache.org/jira/browse/FLINK-31095?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17699451#comment-17699451 ]
Sylvia Lin commented on FLINK-31095:
------------------------------------
Hey [~martijnvisser], thanks for the reply!
I tried the above configs against Flink 1.16.1, but now it fails with this error:
{code:java}
Caused by: java.nio.file.AccessDeniedException: : org.apache.hadoop.fs.s3a.auth.NoAwsCredentialsException: SimpleAWSCredentialsProvider: No AWS credentials in the Hadoop configuration
    at org.apache.hadoop.fs.s3a.S3AUtils.translateException(S3AUtils.java:212) ~[?:?]
    at org.apache.hadoop.fs.s3a.S3AUtils.createAWSCredentialProvider(S3AUtils.java:784) ~[?:?]
    at org.apache.hadoop.fs.s3a.S3AUtils.buildAWSProviderList(S3AUtils.java:698) ~[?:?]
    at org.apache.hadoop.fs.s3a.S3AUtils.createAWSCredentialProviderSet(S3AUtils.java:631) ~[?:?]
    at org.apache.hadoop.fs.s3a.S3AFileSystem.bindAWSClient(S3AFileSystem.java:877) ~[?:?]
    at org.apache.hadoop.fs.s3a.S3AFileSystem.initialize(S3AFileSystem.java:534) ~[?:?]
    at org.apache.flink.fs.s3.common.AbstractS3FileSystemFactory.create(AbstractS3FileSystemFactory.java:127) ~[?:?]
    at org.apache.flink.core.fs.PluginFileSystemFactory.create(PluginFileSystemFactory.java:62) ~[flink-dist-1.16.1.jar:1.16.1]
    at org.apache.flink.core.fs.FileSystem.getUnguardedFileSystem(FileSystem.java:508) ~[flink-dist-1.16.1.jar:1.16.1]
    at org.apache.flink.core.fs.FileSystem.get(FileSystem.java:409) ~[flink-dist-1.16.1.jar:1.16.1]
    at org.apache.flink.connector.file.sink.FileSink$RowFormatBuilder.createBucketWriter(FileSink.java:475) ~[flink-connector-files-1.16.1.jar:1.16.1]
    at org.apache.flink.connector.file.sink.FileSink$RowFormatBuilder.getCommittableSerializer(FileSink.java:466) ~[flink-connector-files-1.16.1.jar:1.16.1]
    at org.apache.flink.connector.file.sink.FileSink.getCommittableSerializer(FileSink.java:175) ~[flink-connector-files-1.16.1.jar:1.16.1] {code}
Below is my Flink config:
{code:yaml}
flinkConfiguration:
  taskmanager.numberOfTaskSlots: "2"
  ...
  s3.aws.credentials.provider: org.apache.hadoop.fs.s3a.auth.AssumedRoleCredentialProvider
  s3.assumed.role.arn: arn:aws:iam::[aws-accnt-num]:role/my-iam-role {code}
And the Hadoop S3 plugin has been added.
Can you point me to the relevant `s3.*` config keys?
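In the meantime, here is a sketch of what I plan to try next. It is only a sketch: `s3.access-key`/`s3.secret-key` are the documented Flink shortcuts, and I am assuming that the other `s3.*` keys get forwarded to the Hadoop S3A configuration as `fs.s3a.*` and that the AWS SDK's WebIdentityTokenCredentialsProvider (used with EKS IAM roles for service accounts) is usable from inside the s3 plugin; I have not verified either on this cluster.
{code:yaml}
flinkConfiguration:
  # Option A: static credentials via the documented Flink shortcuts
  s3.access-key: <access-key>
  s3.secret-key: <secret-key>

  # Option B (assumption, untested): on EKS with IAM roles for service accounts,
  # let the AWS SDK pick up the injected web identity token
  s3.aws.credentials.provider: com.amazonaws.auth.WebIdentityTokenCredentialsProvider
{code}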
> FileSink doesn't work with s3a on EKS
> -------------------------------------
>
> Key: FLINK-31095
> URL: https://issues.apache.org/jira/browse/FLINK-31095
> Project: Flink
> Issue Type: Bug
> Components: Connectors / FileSystem
> Affects Versions: 1.16.1
> Reporter: Sylvia Lin
> Priority: Major
>
> FileSink gives below exception on AWS EKS cluster:
> {code:java}
> Caused by: java.lang.UnsupportedOperationException: This s3 file system implementation does not support recoverable writers.
>     at org.apache.flink.fs.s3.common.FlinkS3FileSystem.createRecoverableWriter(FlinkS3FileSystem.java:136) ~[?:?]
>     at org.apache.flink.core.fs.PluginFileSystemFactory$ClassLoaderFixingFileSystem.createRecoverableWriter(PluginFileSystemFactory.java:134) ~[flink-dist-1.16.1.jar:1.16.1]
>     at org.apache.flink.connector.file.sink.FileSink$RowFormatBuilder.createBucketWriter(FileSink.java:475) ~[flink-connector-files-1.16.1.jar:1.16.1]
>     at org.apache.flink.connector.file.sink.FileSink$RowFormatBuilder.getCommittableSerializer(FileSink.java:466) ~[flink-connector-files-1.16.1.jar:1.16.1]
>     at org.apache.flink.connector.file.sink.FileSink.getCommittableSerializer(FileSink.java:175) ~[flink-connector-files-1.16.1.jar:1.16.1]{code}
> [https://github.com/apache/flink/blob/278dc7b793303d228f7816585054629708983af6/flink-filesystems/flink-s3-fs-base/src/main/java/org/apache/flink/fs/s3/common/FlinkS3FileSystem.java#LL136C16-L136C16]
> And this may be related to
> https://issues.apache.org/jira/browse/FLINK-23487?page=com.atlassian.jira.plugin.system.issuetabpanels%3Aall-tabpanel
>