Github user mridulm commented on the issue:
https://github.com/apache/spark/pull/17723
@vanzin wrote:
> So, this is purely about handling Hadoop authentication for Hadoop services.
This was my point - we should not introduce system-specific APIs into Spark
core infrastructure APIs/SPIs unless we:
a) have explicitly based our support on it, or
b) have generalized it sufficiently that we can support others, or
c) keep it an implementation detail in core (but exposed in YARN for backward
compatibility).
IMO, (a) or (b) would require a dev@ discussion.
Until now, this (Hadoop security) support has been restricted to YARN in
Spark (with a couple of minor other uses, IIRC).
@mgummelt:
> The only thing the new ServiceCredentialProvider interface enforces is
> that the credentials must be added to a Credentials object, which is a
> hadoop class. <snip>
The SPI makes assumptions about the environment within which the credential
provider is invoked and about how tokens are updated at the driver/executor,
in addition to its use of Credentials - and these assumptions are driven by
the Hadoop security delegation token (DT) design.
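
For concreteness, the existing SPI looks roughly like the sketch below
(reconstructed from memory of the Spark 2.x YARN module; treat the package
and exact signatures as approximate):

```scala
import org.apache.hadoop.conf.Configuration
import org.apache.hadoop.security.Credentials
import org.apache.spark.SparkConf

// Approximate shape of the existing SPI (as in
// org.apache.spark.deploy.yarn.security): note the hard coupling to
// Hadoop's Configuration and Credentials types.
trait ServiceCredentialProvider {
  // Name of the service this provider obtains tokens for, e.g. "hive".
  def serviceName: String

  // Whether tokens are required, typically decided by inspecting the
  // Hadoop security configuration (i.e. only in a Kerberized setup).
  def credentialsRequired(hadoopConf: Configuration): Boolean

  // Obtains tokens, adds them to the passed-in Hadoop Credentials object,
  // and returns the next renewal time, if any.
  def obtainCredentials(
      hadoopConf: Configuration,
      sparkConf: SparkConf,
      creds: Credentials): Option[Long]
}
```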
> Do we need to generalize ServiceCredentialProvider to support non-hadoop
> delegation tokens?
IMO that depends on the answer to the design choice above.
If (a) or (c) - then no.
If (b), then yes.
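
To make option (b) concrete, a generalized SPI could carry tokens as opaque
bytes instead of a Hadoop Credentials object - a purely hypothetical sketch,
with all names invented for illustration:

```scala
import org.apache.spark.SparkConf

// Hypothetical Hadoop-agnostic variant: tokens are opaque byte arrays
// keyed by alias, so non-Hadoop token schemes (e.g. a Mesos secret
// mechanism) could plug in without depending on Hadoop classes.
trait GenericCredentialProvider {
  def serviceName: String

  // Whether this provider needs to obtain credentials in the current setup.
  def credentialsRequired(sparkConf: SparkConf): Boolean

  // Obtains serialized tokens keyed by alias, plus an optional next
  // renewal time. How the bytes are produced and consumed stays private
  // to the provider and its counterpart on the driver/executor side.
  def obtainCredentials(sparkConf: SparkConf): (Map[String, Array[Byte]], Option[Long])
}
```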