liangyu-1 commented on PR #42295:
URL: https://github.com/apache/spark/pull/42295#issuecomment-1667554705
The main cause is that the ShutdownHook thread is created before we create the
ugi in ApplicationMaster.
When the config key `hadoop.security.credential.provider.path` is set, the
ApplicationMaster tries to get a FileSystem while generating SSLOptions.
Initializing that FileSystem spawns a new thread whose ugi is inherited from
the current process (yarn).
Only after this does the ApplicationMaster create the new ugi (SPARK_USER) and
execute the doAs() function, so the earlier thread keeps the wrong identity.
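For context, the trigger is a credential-provider setting along these lines (the jceks path here is purely illustrative, not taken from the report):

```xml
<!-- core-site.xml: makes Configuration.getPassword() consult a keystore,
     which forces a FileSystem lookup during SSLOptions parsing -->
<property>
  <name>hadoop.security.credential.provider.path</name>
  <value>jceks://hdfs/user/spark/credentials.jceks</value>
</property>
```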
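The ordering problem can be sketched in plain JVM terms with an `InheritableThreadLocal` (a simplified stand-in: the real UGI travels via the JAAS Subject and doAs(), not a thread-local, but the "identity is captured at thread-creation time" behavior is the same; the names `user`, `early`, `late` are hypothetical):

```java
import java.util.concurrent.atomic.AtomicReference;

public class UgiInheritanceSketch {
    // Stand-in for the process identity a child thread picks up when it is
    // created (analogous to a thread spawned before doAs() keeping the yarn ugi).
    static final InheritableThreadLocal<String> user =
        new InheritableThreadLocal<String>() {
            @Override protected String initialValue() { return "yarn"; }
        };

    public static void main(String[] args) throws Exception {
        AtomicReference<String> earlySeen = new AtomicReference<>();
        AtomicReference<String> lateSeen = new AtomicReference<>();

        // Thread created BEFORE the identity switch -- like the thread
        // spawned during FileSystem initialization in the report.
        Thread early = new Thread(() -> earlySeen.set(user.get()));
        early.start();
        early.join();

        // Now switch identity, as ApplicationMaster does for SPARK_USER.
        user.set("spark");

        // Thread created AFTER the switch sees the new identity.
        Thread late = new Thread(() -> lateSeen.set(user.get()));
        late.start();
        late.join();

        System.out.println("early thread saw: " + earlySeen.get());
        System.out.println("late thread saw: " + lateSeen.get());
    }
}
```

The early thread was created before the identity changed, so it still reports `yarn`, while the late thread reports `spark`; this mirrors why the pre-ugi thread in ApplicationMaster runs as the wrong user.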
Here is the call chain:

```
ApplicationMaster.<init>(ApplicationMaster.scala:83)
  -> org.apache.spark.SecurityManager.<init>(SecurityManager.scala:98)
  -> org.apache.spark.SSLOptions$.parse(SSLOptions.scala:188)
  -> org.apache.hadoop.conf.Configuration.getPassword(Configuration.java:2353)
  -> org.apache.hadoop.conf.Configuration.getPasswordFromCredentialProviders(Configuration.java:2434)
  -> org.apache.hadoop.security.alias.CredentialProviderFactory.getProviders(CredentialProviderFactory.java:82)
```
--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
To unsubscribe, e-mail: [email protected]
For queries about this service, please contact Infrastructure at:
[email protected]