Can you double-check whether you have an AWSCredentialsProvider in your user
jar or in your flink/lib/? Same for S3AUtils?
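A quick way to check is to scan each jar for a bundled copy of the interface. The sketch below builds a synthetic demo jar just to show the technique; in practice you would point containsProvider() at your user jar and everything under flink/lib/ (the entry path is the usual AWS SDK location, assumed here):

```java
import java.io.File;
import java.io.FileOutputStream;
import java.util.jar.JarEntry;
import java.util.jar.JarFile;
import java.util.jar.JarOutputStream;

public class JarScan {
    // Usual location of the interface inside the AWS SDK jar (assumption).
    static final String NEEDLE = "com/amazonaws/auth/AWSCredentialsProvider.class";

    // Returns true if the given jar bundles its own copy of the interface.
    static boolean containsProvider(File jar) throws Exception {
        try (JarFile jf = new JarFile(jar)) {
            return jf.getEntry(NEEDLE) != null;
        }
    }

    public static void main(String[] args) throws Exception {
        // Demo only: build a throwaway jar containing the class entry, then scan it.
        File demo = File.createTempFile("demo", ".jar");
        demo.deleteOnExit();
        try (JarOutputStream out = new JarOutputStream(new FileOutputStream(demo))) {
            out.putNextEntry(new JarEntry(NEEDLE));
            out.closeEntry();
        }
        System.out.println("bundles AWSCredentialsProvider: " + containsProvider(demo));
    }
}
```

If more than one jar on the class path reports true, that duplicate copy is a likely source of the cross-classloader mismatch.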

On Fri, Jul 30, 2021 at 9:50 AM Ingo Bürk <i...@ververica.com> wrote:

> Hi Andreas,
>
> Such an exception can occur if the class in question (your provider) and
> the one being checked (AWSCredentialsProvider) were loaded from
> different class loaders.
>
> Any chance you can try once with 1.10+ to see if it would work? It does
> look like a Flink issue to me, but I'm not sure this can be worked
> around in 1.9.
>
> [Initially sent to Andreas directly by accident]
>
>
> Best
> Ingo
>
> On 29.07.21 17:37, Hailu, Andreas [Engineering] wrote:
> > Hi team, I’m trying to read and write from and to S3 using a custom AWS
> > Credential Provider using Flink v1.9.2 on YARN.
> >
> >
> >
> > I followed the instructions to create a plugins directory in our Flink
> > distribution location and copy the FS implementation package (I’m using
> > s3-fs-hadoop) into it. I also placed the package containing our custom
> > CredentialsProvider implementation in that same directory.
> >
> >
> >
> > $ ls /flink-1.9.2/plugins/s3-fs-hadoop/
> >
> > total 20664
> >
> > 14469 Jun 17 10:57 aws-hadoop-utils-0.0.9.jar  <-- contains our custom CredentialsProvider class
> >
> > 21141329 Jul 28 15:43 flink-s3-fs-hadoop-1.9.2.jar
> >
> >
> >
> > I’ve placed this directory on the Java classpath when running the Flink
> > application. I have also added ‘fs.s3a.assumed.role.credentials.provider’
> > and ‘fs.s3a.assumed.role.arn’ to our flink-conf.yaml. When trying to run
> > a basic app that reads a file, I get the following exception:
> >
> >
> >
> > Caused by: java.io.IOException: Class class
> > com.gs.ep.da.lake.aws.CustomAwsCredentialProvider does not implement
> > AWSCredentialsProvider
> >
> >         at org.apache.hadoop.fs.s3a.S3AUtils.createAWSCredentialProvider(S3AUtils.java:400)
> >         at org.apache.hadoop.fs.s3a.S3AUtils.createAWSCredentialProviderSet(S3AUtils.java:367)
> >         at org.apache.hadoop.fs.s3a.S3ClientFactory$DefaultS3ClientFactory.createS3Client(S3ClientFactory.java:73)
> >
> >
> >
> > Have I missed a step here? Do I need to make the packages available on
> > my YARN classpath as well? I saw some discussion suggesting that related
> > problems around this were resolved in v1.10 [1][2][3].
> >
> >
> >
> > [1] https://issues.apache.org/jira/browse/FLINK-14574
> >
> > [2] https://issues.apache.org/jira/browse/FLINK-13044
> >
> > [3] https://issues.apache.org/jira/browse/FLINK-11956
> >
> >
> >
> > Best,
> >
> > Andreas
> >
> >
> >
> >
>
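To illustrate the cross-classloader point Ingo raised: a class with the same fully qualified name loaded by two different loaders is two distinct Class objects, so assignability checks between them fail. A minimal sketch, using a toy Provider interface as a stand-in for AWSCredentialsProvider (this is not Flink's actual plugin loader, just a child-first loader that re-defines the class from its own bytes):

```java
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.InputStream;

public class LoaderDemo {
    // Toy stand-ins for AWSCredentialsProvider and the user's implementation.
    public interface Provider {}
    public static class MyProvider implements Provider {}

    // A child-first loader that re-defines the target class from its own
    // bytes, simulating a plugin/user jar that duplicates a dependency.
    static class IsolatingLoader extends ClassLoader {
        private final String target;
        IsolatingLoader(String target) { this.target = target; }

        @Override
        protected Class<?> loadClass(String name, boolean resolve) throws ClassNotFoundException {
            if (name.equals(target)) {
                try (InputStream in = getResourceAsStream(name.replace('.', '/') + ".class");
                     ByteArrayOutputStream out = new ByteArrayOutputStream()) {
                    int b;
                    while ((b = in.read()) != -1) out.write(b);
                    byte[] bytes = out.toByteArray();
                    return defineClass(name, bytes, 0, bytes.length);
                } catch (IOException e) {
                    throw new ClassNotFoundException(name, e);
                }
            }
            return super.loadClass(name, resolve);
        }
    }

    public static void main(String[] args) throws Exception {
        Class<?> duplicated =
                new IsolatingLoader("LoaderDemo$Provider").loadClass("LoaderDemo$Provider");
        // Same name, different defining loader => different Class object:
        System.out.println("same class object: " + (duplicated == Provider.class));
        // ...so the kind of check S3AUtils performs fails:
        System.out.println("isAssignableFrom: " + duplicated.isAssignableFrom(MyProvider.class));
    }
}
```

This is why the fix is to make sure the interface and the implementation are visible to one loader only, rather than shipping duplicate copies.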
