[ https://issues.apache.org/jira/browse/HADOOP-17372?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17229269#comment-17229269 ]
Steve Loughran commented on HADOOP-17372:
-----------------------------------------
{code}
2020-11-10 05:27:33,517 [ScalaTest-main-running-S3DataFrameExampleSuite] WARN fs.FileSystem (FileSystem.java:createFileSystem(3466)) - Failed to initialize fileystem s3a://stevel-ireland: java.io.IOException: Class class com.amazonaws.auth.EnvironmentVariableCredentialsProvider does not implement AWSCredentialsProvider
- DataFrames *** FAILED ***
  org.apache.spark.sql.AnalysisException: java.lang.RuntimeException: java.io.IOException: Class class com.amazonaws.auth.EnvironmentVariableCredentialsProvider does not implement AWSCredentialsProvider;
  at org.apache.spark.sql.hive.HiveExternalCatalog.withClient(HiveExternalCatalog.scala:106)
  at org.apache.spark.sql.hive.HiveExternalCatalog.databaseExists(HiveExternalCatalog.scala:218)
  at org.apache.spark.sql.internal.SharedState.externalCatalog$lzycompute(SharedState.scala:151)
  at org.apache.spark.sql.internal.SharedState.externalCatalog(SharedState.scala:139)
  at org.apache.spark.sql.internal.SharedState.globalTempViewManager$lzycompute(SharedState.scala:178)
  at org.apache.spark.sql.internal.SharedState.globalTempViewManager(SharedState.scala:173)
  at org.apache.spark.sql.hive.HiveSessionStateBuilder$$anonfun$2.apply(HiveSessionStateBuilder.scala:55)
  at org.apache.spark.sql.hive.HiveSessionStateBuilder$$anonfun$2.apply(HiveSessionStateBuilder.scala:55)
  at org.apache.spark.sql.catalyst.catalog.SessionCatalog.globalTempViewManager$lzycompute(SessionCatalog.scala:91)
  at org.apache.spark.sql.catalyst.catalog.SessionCatalog.globalTempViewManager(SessionCatalog.scala:91)
  ...
  Cause: java.lang.RuntimeException: java.io.IOException: Class class com.amazonaws.auth.EnvironmentVariableCredentialsProvider does not implement AWSCredentialsProvider
  at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:686)
  at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:621)
  at org.apache.spark.sql.hive.client.HiveClientImpl.newState(HiveClientImpl.scala:219)
  at org.apache.spark.sql.hive.client.HiveClientImpl.<init>(HiveClientImpl.scala:126)
  at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
  at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
  at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
  at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
  at org.apache.spark.sql.hive.client.IsolatedClientLoader.createClient(IsolatedClientLoader.scala:306)
  at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:433)
  ...
  Cause: java.io.IOException: Class class com.amazonaws.auth.EnvironmentVariableCredentialsProvider does not implement AWSCredentialsProvider
  at org.apache.hadoop.fs.s3a.S3AUtils.createAWSCredentialProvider(S3AUtils.java:722)
  at org.apache.hadoop.fs.s3a.S3AUtils.buildAWSProviderList(S3AUtils.java:687)
  at org.apache.hadoop.fs.s3a.S3AUtils.createAWSCredentialProviderSet(S3AUtils.java:620)
  at org.apache.hadoop.fs.s3a.S3AFileSystem.bindAWSClient(S3AFileSystem.java:673)
  at org.apache.hadoop.fs.s3a.S3AFileSystem.initialize(S3AFileSystem.java:414)
  at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:3462)
  at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:171)
  at org.apache.hadoop.fs.FileSystem$Cache.getInternal(FileSystem.java:3522)
  at org.apache.hadoop.fs.FileSystem$Cache.getUnique(FileSystem.java:3496)
  at org.apache.hadoop.fs.FileSystem.newInstance(FileSystem.java:591)
  ...
{code}
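The "Class class ... does not implement AWSCredentialsProvider" message is the classic symptom of the same class name being defined by two different classloaders: isAssignableFrom() compares Class objects, not names, so an implementation defined by one loader never "implements" the interface defined by another, even when both came from the same jar. A minimal standalone sketch of the effect (the jar path is hypothetical; any jar containing both the interface and the implementation shows the same thing):

{code:java}
import java.io.File;
import java.net.URL;
import java.net.URLClassLoader;

public class SplitLoaderDemo {
  public static void main(String[] args) throws Exception {
    // Hypothetical path: any jar containing both the interface and the
    // implementation (aws-java-sdk-core here) demonstrates the effect.
    URL jar = new File("/path/to/aws-java-sdk-core.jar").toURI().toURL();

    // Two sibling loaders, neither delegating to the other, so each
    // defines its own copy of every class in the jar.
    ClassLoader a = new URLClassLoader(new URL[] {jar}, null);
    ClassLoader b = new URLClassLoader(new URL[] {jar}, null);

    Class<?> ifaceFromA = a.loadClass("com.amazonaws.auth.AWSCredentialsProvider");
    Class<?> implFromB =
        b.loadClass("com.amazonaws.auth.EnvironmentVariableCredentialsProvider");

    // Prints "false": implFromB implements b's copy of the interface, which
    // is a different Class object from a's copy, even though the names match.
    System.out.println(ifaceFromA.isAssignableFrom(implFromB));
  }
}
{code}

(Incidentally, the doubled "Class class" in the error text is just "Class " concatenated with Class.toString(), whose output itself starts with "class".)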
> S3A AWS Credential provider loading gets confused with isolated classloaders
> ----------------------------------------------------------------------------
>
> Key: HADOOP-17372
> URL: https://issues.apache.org/jira/browse/HADOOP-17372
> Project: Hadoop Common
> Issue Type: Sub-task
> Components: fs/s3
> Affects Versions: 3.4.0
> Reporter: Steve Loughran
> Priority: Major
>
> Problem: exception when loading the S3A credential providers for a filesystem: "Class class
> com.amazonaws.auth.EnvironmentVariableCredentialsProvider does not implement
> AWSCredentialsProvider"
> Location: S3A + Spark dataframes test
> Hypothesised cause:
> Configuration.getClasses() resolves class names through the thread's context
> classloader. Under Spark's isolated classloader that is a different loader
> from the one that loaded the S3A filesystem classes, so the credential
> provider classes it loads fail the AWSCredentialsProvider check and the
> providers cannot be instantiated.
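If the hypothesis holds, one caller-side workaround would be to pin the Configuration to the classloader that defined the S3A and AWS SDK classes before creating the filesystem, so Configuration.getClasses() stops consulting the isolated context classloader. A sketch, illustrative only and not necessarily the eventual fix:

{code:java}
import java.net.URI;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.s3a.S3AFileSystem;

public class PinnedLoaderExample {
  public static void main(String[] args) throws Exception {
    Configuration conf = new Configuration();
    // By default Configuration resolves class names against the thread's
    // context classloader; point it at the loader that defined the S3A
    // (and hence AWS SDK) classes instead.
    conf.setClassLoader(S3AFileSystem.class.getClassLoader());
    FileSystem fs = FileSystem.newInstance(URI.create("s3a://stevel-ireland/"), conf);
    System.out.println("initialized: " + fs.getUri());
  }
}
{code}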