[ https://issues.apache.org/jira/browse/HADOOP-17372?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17493612#comment-17493612 ]

Ivan Sadikov commented on HADOOP-17372:
---------------------------------------

Hi [[email protected]]. Thanks for fixing the issue; I confirmed it works in 
my case.

Looking at the code, I was wondering whether, instead of 
{{conf.setClassLoader()}} pointing at the S3AFileSystem class loader, each 
getClass/getClassByName call should ideally use the class loader of the 
interface it is trying to cast to. I understand that the same class loader 
will usually load both S3AFileSystem and the AWS SDK, but if different class 
loaders are involved, would the solution need to be different?
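To illustrate the suggestion: the sketch below resolves an implementation class 
through the class loader of the target interface itself, so the loaded class 
and the interface are guaranteed to share a loader and the cast cannot fail the 
way it does in the reported bug. This is a hypothetical helper 
({{loadInstanceOf}} is not a Hadoop API), a minimal sketch of the idea rather 
than the actual fix.

```java
// Hypothetical sketch: resolve a provider class via the class loader that
// defined the target interface, instead of a fixed class's loader.
import java.io.IOException;

public class ProviderLoading {

    // loadInstanceOf is an illustrative helper, not part of Hadoop or the AWS SDK.
    static <T> T loadInstanceOf(Class<T> iface, String className) throws Exception {
        // Use the loader that defined `iface`: any class it loads will see the
        // same Class object for the interface, so isAssignableFrom() holds
        // whenever the implementation really implements it.
        ClassLoader cl = iface.getClassLoader();
        Class<?> impl = Class.forName(className, true, cl);
        if (!iface.isAssignableFrom(impl)) {
            // Mirrors the "does not implement" failure mode from the issue:
            // with mismatched loaders this check fails even for a correct class.
            throw new IOException(
                "Class " + impl.getName() + " does not implement " + iface.getName());
        }
        return iface.cast(impl.getDeclaredConstructor().newInstance());
    }

    public static void main(String[] args) throws Exception {
        // java.lang.Thread implements java.lang.Runnable, so this resolves cleanly.
        Runnable r = loadInstanceOf(Runnable.class, "java.lang.Thread");
        System.out.println(r.getClass().getName());
    }
}
```

The point of the design is that the interface's own loader is the one place 
both sides of the {{instanceof}}/cast relationship must agree, regardless of 
whether an isolated context class loader is in effect.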

> S3A AWS Credential provider loading gets confused with isolated classloaders
> ----------------------------------------------------------------------------
>
>                 Key: HADOOP-17372
>                 URL: https://issues.apache.org/jira/browse/HADOOP-17372
>             Project: Hadoop Common
>          Issue Type: Sub-task
>          Components: fs/s3
>    Affects Versions: 3.4.0
>            Reporter: Steve Loughran
>            Assignee: Steve Loughran
>            Priority: Major
>             Fix For: 3.3.1
>
>
> Problem: exception in loading S3A credentials for an FS, "Class class 
> com.amazonaws.auth.EnvironmentVariableCredentialsProvider does not implement 
> AWSCredentialsProvider"
> Location: S3A + Spark dataframes test
> Hypothesised cause:
> Configuration.getClasses() uses the context classloader, and with the spark 
> isolated CL that's different from the one the s3a FS uses, so it can't load 
> AWS credential providers.



--
This message was sent by Atlassian Jira
(v8.20.1#820001)
