[ https://issues.apache.org/jira/browse/HADOOP-17372?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17788742#comment-17788742 ]

Antonio Murgia commented on HADOOP-17372:
-----------------------------------------

I'm adding a workaround for anyone hitting the same issue described by [~brandonvin].

In your own project (where you defined a custom credential provider, which needs 
a compile-time dependency on hadoop-aws), create a new Java class like the 
following:
 
{code:java}
package org.apache.hadoop.fs.s3a;

public class PatchedS3AFileSystem extends S3AFileSystem {
}
{code}
 
Then configure your Spark app with the following property:

{{spark.hadoop.fs.s3a.impl = org.apache.hadoop.fs.s3a.PatchedS3AFileSystem}}

as well as your provider configuration:

{{fs.s3a.aws.credentials.provider = 
com.enel.platform.batch.commons.aws.auth.v2.FileSystemReaderSessionCredentialsProvider}}
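
For reference, a minimal sketch of wiring both properties programmatically through 
the {{SparkSession}} builder (assuming a Java Spark application; the {{spark.hadoop.}} 
prefix is what pushes the keys into the Hadoop {{Configuration}}, and the provider 
class name is the one from my setup, swap in your own):

{code:java}
import org.apache.spark.sql.SparkSession;

public class PatchedS3AExample {
  public static void main(String[] args) {
    SparkSession spark = SparkSession.builder()
        .appName("patched-s3a-example")
        // Point fs.s3a.impl at the subclass shipped in the application jar.
        .config("spark.hadoop.fs.s3a.impl",
            "org.apache.hadoop.fs.s3a.PatchedS3AFileSystem")
        // The custom credential provider, also shipped in the application jar.
        .config("spark.hadoop.fs.s3a.aws.credentials.provider",
            "com.enel.platform.batch.commons.aws.auth.v2.FileSystemReaderSessionCredentialsProvider")
        .getOrCreate();

    // ... run the job ...

    spark.stop();
  }
}
{code}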

With this setup, both the filesystem implementation AND the credential provider 
will be loaded by Spark's {{MutableClassloader}} (which is a child of the 
{{Launcher}} classloader, so it can still access all the classes loaded by it). In 
this way, the {{conf.setClassLoader}} call performed by {{PatchedS3AFileSystem}} 
will have no effect, because the classloader being set is the same one that loaded 
the {{Configuration}} object itself.
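
As a sanity check, here is a small diagnostic sketch (nothing S3A-specific beyond 
the class names used above; it only assumes hadoop-common on the classpath) that 
can be run inside the job to see which classloaders are in play:

{code:java}
import org.apache.hadoop.conf.Configuration;

// Hedged diagnostic sketch: prints the classloaders involved and whether the
// Configuration can see both the patched filesystem and the custom provider.
public class S3AClassLoaderDiagnostic {
  public static void main(String[] args) {
    Configuration conf = new Configuration();

    System.out.println("conf classloader:    " + conf.getClassLoader());
    System.out.println("context classloader: "
        + Thread.currentThread().getContextClassLoader());

    // getClassByNameOrNull resolves through the Configuration's classloader,
    // the same lookup path used when loading the credential providers.
    System.out.println("patched fs visible:  "
        + (conf.getClassByNameOrNull(
            "org.apache.hadoop.fs.s3a.PatchedS3AFileSystem") != null));
    System.out.println("provider visible:    "
        + (conf.getClassByNameOrNull(
            "com.enel.platform.batch.commons.aws.auth.v2."
                + "FileSystemReaderSessionCredentialsProvider") != null));
  }
}
{code}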

[[email protected]] I'm still open to providing a PR to make this workaround 
unnecessary.

> S3A AWS Credential provider loading gets confused with isolated classloaders
> ----------------------------------------------------------------------------
>
>                 Key: HADOOP-17372
>                 URL: https://issues.apache.org/jira/browse/HADOOP-17372
>             Project: Hadoop Common
>          Issue Type: Sub-task
>          Components: fs/s3
>    Affects Versions: 3.4.0
>            Reporter: Steve Loughran
>            Assignee: Steve Loughran
>            Priority: Major
>             Fix For: 3.3.1
>
>
> Problem: exception in loading S3A credentials for an FS, "Class class 
> com.amazonaws.auth.EnvironmentVariableCredentialsProvider does not implement 
> AWSCredentialsProvider"
> Location: S3A + Spark dataframes test
> Hypothesised cause:
> Configuration.getClasses() uses the context classloader, and with the spark 
> isolated CL that's different from the one the s3a FS uses, so it can't load 
> AWS credential providers.



