Github user vanzin commented on the issue:
https://github.com/apache/spark/pull/17723
To ask a more direct question:
The only public interface being added in this change is
`ServiceCredentialProvider`. It's an interface that service-specific libraries
(e.g. a Solr connector, or a Kudu connector) would extend, and user
applications would never touch. It's also an existing interface that currently
lives in the YARN module but whose functionality is really independent of YARN.
So the question is: why is it a bad thing to expose this interface more
widely? It serves a specific purpose (providing a way for services that use
Hadoop-style security to integrate with Spark applications). It doesn't dictate
that all Spark code must use Hadoop delegation tokens; it just allows Spark
applications to better integrate with secure Hadoop services.
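For concreteness, here's roughly what a connector library would ship. This is a
hedged sketch: the method names and signatures follow the YARN-era version of
`ServiceCredentialProvider` (the package may change once it moves out of the YARN
module), and `SolrCredentialProvider` / `SolrTokenFetcher` are hypothetical names
used only for illustration:

```scala
import org.apache.hadoop.conf.Configuration
import org.apache.hadoop.security.{Credentials, UserGroupInformation}
import org.apache.spark.SparkConf
import org.apache.spark.deploy.yarn.security.ServiceCredentialProvider

// Hypothetical provider that a Solr connector library might ship. Spark
// discovers implementations via java.util.ServiceLoader, so the library would
// also list this class in META-INF/services/ under the interface's name.
class SolrCredentialProvider extends ServiceCredentialProvider {

  override def serviceName: String = "solr"

  // Only fetch tokens when the cluster is actually running with security on.
  override def credentialsRequired(hadoopConf: Configuration): Boolean =
    UserGroupInformation.isSecurityEnabled

  // Obtain a delegation token from the service, add it to `creds`, and return
  // the time at which the token should be renewed, if any.
  override def obtainCredentials(
      hadoopConf: Configuration,
      sparkConf: SparkConf,
      creds: Credentials): Option[Long] = {
    // Hypothetical connector-side call; real logic would talk to the service's
    // security endpoint and wrap the result in a Hadoop Token.
    val (token, renewalTime) = SolrTokenFetcher.fetch(hadoopConf, sparkConf)
    creds.addToken(token.getService, token)
    Some(renewalTime)
  }
}
```

The point being: nothing in that sketch is YARN-specific, which is why exposing
the interface outside the YARN module makes sense.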
And, on top of that, I don't see a point in abstracting it further. It
would just be abstraction for the sake of abstraction. If a different class of
secure services requires something like that at some point, we can evaluate
what to do then. But right now, that class of services *does not exist*.