[ https://issues.apache.org/jira/browse/HADOOP-19733?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=18063135#comment-18063135 ]

ASF GitHub Bot commented on HADOOP-19733:
-----------------------------------------

hadoop-yetus commented on PR #8048:
URL: https://github.com/apache/hadoop/pull/8048#issuecomment-4003179985

   :broken_heart: **-1 overall**
   
   | Vote | Subsystem | Runtime |  Logfile | Comment |
   |:----:|----------:|--------:|:--------:|:-------:|
   | +0 :ok: |  reexec  |   0m  0s |  |  Docker mode activated.  |
   | -1 :x: |  patch  |   0m 18s |  |  https://github.com/apache/hadoop/pull/8048 does not apply to trunk. Rebase required? Wrong Branch? See https://cwiki.apache.org/confluence/display/HADOOP/How+To+Contribute for help.  |
   
   
   | Subsystem | Report/Notes |
   |----------:|:-------------|
   | Console output | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-8048/10/console |
   | versions | git=2.34.1 |
   | Powered by | Apache Yetus 0.14.1 https://yetus.apache.org |
   
   
   This message was automatically generated.
   
> S3A: Credentials provider classes not found despite setting 
> `fs.s3a.classloader.isolation` to `false`
> -----------------------------------------------------------------------------------------------------
>
>                 Key: HADOOP-19733
>                 URL: https://issues.apache.org/jira/browse/HADOOP-19733
>             Project: Hadoop Common
>          Issue Type: Bug
>          Components: fs/s3
>    Affects Versions: 3.4.2
>            Reporter: Brandon
>            Assignee: Brandon
>            Priority: Minor
>              Labels: pull-request-available
>
> HADOOP-18993 added the option `fs.s3a.classloader.isolation` to support, for 
> example, a Spark job using an AWS credentials provider class that is bundled 
> into the Spark job JAR. When testing this, however, the AWS credential 
> provider classes are still not found.
> I think the cause is:
>  * `fs.s3a.classloader.isolation` is implemented by setting (or not setting) 
> a classloader on the `Configuration`
>  * However, the code paths that load AWS credential providers call 
> `S3AUtils.getInstanceFromReflection`, which uses the classloader that loaded 
> the S3AUtils class itself. That's likely to be the built-in application 
> classloader, which won't be able to load classes in a Spark job JAR.
> And the fix seems small:
>  * Change `S3AUtils.getInstanceFromReflection` to load classes using the 
> `Configuration`'s classloader. Luckily we already have the Configuration in 
> this method.
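The change described in the quoted report can be sketched as a minimal, self-contained Java example. The class and method names below are hypothetical stand-ins (the real change would live in `S3AUtils.getInstanceFromReflection`, and the supplied loader would come from `Configuration.getClassLoader()`); the sketch only illustrates the difference between the two lookup strategies:

```java
// Hypothetical sketch, not the actual S3AUtils code: contrast loading a class
// through the utility class's own classloader (the reported bug) with loading
// it through a caller-supplied loader (the proposed fix).
public class ReflectionSketch {

    // Before: Class.forName(name) implicitly resolves against the loader that
    // loaded ReflectionSketch -- typically the application classloader, which
    // cannot see classes bundled only in a Spark job JAR.
    static Object newInstanceOwnLoader(String className) throws Exception {
        Class<?> clazz = Class.forName(className);
        return clazz.getDeclaredConstructor().newInstance();
    }

    // After: honor a caller-supplied loader (in the real fix, the one set on
    // the Hadoop Configuration), which a framework like Spark can point at
    // the job JAR.
    static Object newInstanceWithLoader(String className, ClassLoader loader)
            throws Exception {
        Class<?> clazz = Class.forName(className, true, loader);
        return clazz.getDeclaredConstructor().newInstance();
    }

    public static void main(String[] args) throws Exception {
        // Use the thread context classloader as a stand-in for the
        // Configuration's classloader in this demo.
        ClassLoader ctx = Thread.currentThread().getContextClassLoader();
        Object list = newInstanceWithLoader("java.util.ArrayList", ctx);
        System.out.println(list.getClass().getName());
    }
}
```

`Class.forName(String, boolean, ClassLoader)` is the standard JDK overload that accepts an explicit loader, which is why the fix is small: the method already has the `Configuration` in hand and only needs to pass its loader through.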



--
This message was sent by Atlassian Jira
(v8.20.10#820010)

---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
