Github user shrutig commented on the issue:
https://github.com/apache/spark/pull/21756
`SparkHadoopUtil` is used at multiple points in Apache Spark, not just by YARN. External cluster managers should also be able to override functions in `SparkHadoopUtil`, but currently there is no way to do so. Internally, we have built on the external cluster manager mechanism, and we would like to override the `runAsSparkUser` method to support our own authentication scheme. Just as the external cluster manager is pluggable, `SparkHadoopUtil` could be made pluggable as well.
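
To illustrate the kind of override we have in mind, here is a minimal sketch of what an external cluster manager might supply if `SparkHadoopUtil` were open for extension as this PR proposes (today it is not). `CustomAuthHadoopUtil` and its credential source are hypothetical names for illustration only; they are not part of Spark or of this PR.

```scala
import java.security.PrivilegedExceptionAction

import org.apache.hadoop.security.UserGroupInformation
import org.apache.spark.deploy.SparkHadoopUtil

// Hypothetical subclass a deployment could plug in if SparkHadoopUtil
// were extensible; the credential-sourcing step is an assumption.
class CustomAuthHadoopUtil extends SparkHadoopUtil {
  override def runAsSparkUser(func: () => Unit): Unit = {
    // Build the UGI from a deployment-specific credential source instead
    // of the default SPARK_USER-based user; getCurrentUser stands in here
    // for whatever custom authentication the cluster manager performs.
    val ugi = UserGroupInformation.getCurrentUser
    ugi.doAs(new PrivilegedExceptionAction[Unit] {
      override def run(): Unit = func()
    })
  }
}
```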