[ https://issues.apache.org/jira/browse/HADOOP-19733?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=18032581#comment-18032581 ]

ASF GitHub Bot commented on HADOOP-19733:
-----------------------------------------

brandonvin commented on code in PR #8048:
URL: https://github.com/apache/hadoop/pull/8048#discussion_r2456994430


##########
hadoop-tools/hadoop-aws/src/test/java/org/apache/hadoop/fs/s3a/ITestS3AFileSystemIsolatedClassloader.java:
##########
@@ -77,19 +109,9 @@ private void assertInNewFilesystem(Map<String, String> confToSet, Consumer<FileS
     }
   }
 
-  private Map<String, String> mapOf() {
-    return new HashMap<>();
-  }
-
-  private Map<String, String> mapOf(String key, String value) {
-    HashMap<String, String> m = new HashMap<>();
-    m.put(key, value);
-    return m;
-  }

Review Comment:
   Thanks, makes sense!





> S3A: Credentials provider classes not found despite setting `fs.s3a.classloader.isolation` to `false`
> ------------------------------------------------------------------------------------------------------
>
>                 Key: HADOOP-19733
>                 URL: https://issues.apache.org/jira/browse/HADOOP-19733
>             Project: Hadoop Common
>          Issue Type: Bug
>          Components: fs/s3
>    Affects Versions: 3.4.2
>            Reporter: Brandon
>            Assignee: Brandon
>            Priority: Minor
>              Labels: pull-request-available
>
> HADOOP-18993 added the option `fs.s3a.classloader.isolation` to support, for 
> example, a Spark job using an AWS credentials provider class that is bundled 
> into the Spark job JAR. In testing this, the AWS credentials provider classes 
> are still not found.
> I think the cause is:
>  * `fs.s3a.classloader.isolation` is implemented by setting (or not setting) 
> a classloader on the `Configuration`
>  * However, the code paths that load AWS credential providers call 
> `S3AUtils.getInstanceFromReflection`, which uses the classloader that loaded 
> the `S3AUtils` class. That is likely to be the built-in application 
> classloader, which won't be able to load classes in a Spark job JAR.
> And the fix seems small:
>  * Change `S3AUtils.getInstanceFromReflection` to load classes using the 
> `Configuration`'s classloader. Luckily, we already have the `Configuration` 
> in this method.
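The cause and fix described above can be sketched in isolation. This is a minimal illustration of the classloader pattern, not Hadoop's actual code: `JobConf` below is a hypothetical stand-in for Hadoop's `Configuration`, which carries a settable classloader, and the two load methods contrast the buggy pattern (always use the classloader that loaded the utility class) with the fixed pattern (defer to the configuration's classloader, which a framework like Spark can point at the job JAR).

```java
public class ClassloaderSketch {

  // Hypothetical stand-in for Hadoop's Configuration, which holds a
  // classloader that callers (e.g. Spark) can set to the job JAR's loader.
  static class JobConf {
    private final ClassLoader classLoader;
    JobConf(ClassLoader cl) { this.classLoader = cl; }
    ClassLoader getClassLoader() { return classLoader; }
  }

  // Buggy pattern: resolves the class with the loader that loaded THIS
  // class. If that is the application classloader, classes that only
  // exist in a job JAR are not visible and loading fails.
  static Class<?> loadWithOwnLoader(String name) throws ClassNotFoundException {
    return Class.forName(name, true, ClassloaderSketch.class.getClassLoader());
  }

  // Fixed pattern: resolves the class with the configuration's loader,
  // so whatever loader the framework installed is honored.
  static Class<?> loadWithConfLoader(JobConf conf, String name)
      throws ClassNotFoundException {
    return Class.forName(name, true, conf.getClassLoader());
  }

  public static void main(String[] args) throws Exception {
    // Here the conf carries the current thread's context classloader;
    // in a Spark job it would instead be the job JAR's classloader.
    JobConf conf = new JobConf(Thread.currentThread().getContextClassLoader());
    System.out.println(loadWithConfLoader(conf, "java.lang.String").getName());
  }
}
```

With a class that exists only in the job JAR, `loadWithOwnLoader` would throw `ClassNotFoundException` while `loadWithConfLoader` succeeds, which matches the failure mode the issue describes.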



--
This message was sent by Atlassian Jira
(v8.20.10#820010)

---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
