[
https://issues.apache.org/jira/browse/HADOOP-17372?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17229527#comment-17229527
]
Steve Loughran commented on HADOOP-17372:
-----------------------------------------
Setting com.amazonaws in "spark.sql.hive.metastore.sharedPrefixes" should be enough.
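A minimal sketch of that workaround as a spark-defaults.conf entry. The first four prefixes are Spark's documented defaults for this option; com.amazonaws is appended so the AWS SDK classes are shared between the isolated Hive metastore classloader and the application classloader rather than being loaded twice:

```
# spark-defaults.conf (sketch): keep AWS SDK classes out of the isolated
# metastore classloader so S3A and its credential providers agree on the
# loaded AWSCredentialsProvider class.
spark.sql.hive.metastore.sharedPrefixes  com.mysql.jdbc,org.postgresql,com.microsoft.sqlserver,oracle.jdbc,com.amazonaws
```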
> S3A AWS Credential provider loading gets confused with isolated classloaders
> ----------------------------------------------------------------------------
>
> Key: HADOOP-17372
> URL: https://issues.apache.org/jira/browse/HADOOP-17372
> Project: Hadoop Common
> Issue Type: Sub-task
> Components: fs/s3
> Affects Versions: 3.4.0
> Reporter: Steve Loughran
> Priority: Major
>
> Problem: exception in loading S3A credentials for an FS, "Class class
> com.amazonaws.auth.EnvironmentVariableCredentialsProvider does not implement
> AWSCredentialsProvider"
> Location: S3A + Spark dataframes test
> Hypothesised cause:
> Configuration.getClasses() uses the thread context classloader, and under
> Spark's isolated classloader that is different from the one the S3A
> filesystem uses, so it cannot load the AWS credential providers.
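The failure mode described above can be reproduced outside Hadoop: a class loaded by two different classloaders yields two distinct Class objects, so an assignability check against an interface from the other loader fails even though the names match. The sketch below is illustrative only; Provider and EnvProvider are stand-ins for AWSCredentialsProvider and EnvironmentVariableCredentialsProvider, and IsolatingLoader mimics Spark's isolated classloader by re-defining classes instead of delegating to its parent:

```java
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.InputStream;

public class IsolationDemo {
    // Stand-in for AWSCredentialsProvider.
    public interface Provider {}

    // Stand-in for EnvironmentVariableCredentialsProvider.
    public static class EnvProvider implements Provider {}

    // A classloader that re-defines this demo's classes from their own
    // .class bytes instead of delegating to the parent, the way an
    // isolated classloader would for non-shared prefixes.
    static class IsolatingLoader extends ClassLoader {
        @Override
        protected Class<?> findClass(String name) throws ClassNotFoundException {
            String path = name.replace('.', '/') + ".class";
            try (InputStream in = ClassLoader.getSystemResourceAsStream(path);
                 ByteArrayOutputStream out = new ByteArrayOutputStream()) {
                int b;
                while ((b = in.read()) != -1) {
                    out.write(b);
                }
                byte[] bytes = out.toByteArray();
                return defineClass(name, bytes, 0, bytes.length);
            } catch (IOException e) {
                throw new ClassNotFoundException(name, e);
            }
        }

        @Override
        protected Class<?> loadClass(String name, boolean resolve)
                throws ClassNotFoundException {
            // Isolate only the demo's own classes; delegate everything else.
            if (name.startsWith("IsolationDemo$")) {
                Class<?> c = findLoadedClass(name);
                if (c == null) {
                    c = findClass(name);
                }
                if (resolve) {
                    resolveClass(c);
                }
                return c;
            }
            return super.loadClass(name, resolve);
        }
    }

    public static void main(String[] args) throws Exception {
        Class<?> isolated =
                new IsolatingLoader().loadClass("IsolationDemo$EnvProvider");
        // Same class name, different loader: the interface from the
        // application loader is not assignable from the isolated copy.
        System.out.println(Provider.class.isAssignableFrom(isolated)); // prints false
    }
}
```

Run with the demo's classes on the classpath; the check fails exactly as in the reported error, even though the isolated class does "implement" an interface of the same name.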
--
This message was sent by Atlassian Jira
(v8.3.4#803005)