[ 
https://issues.apache.org/jira/browse/HADOOP-12563?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15220563#comment-15220563
 ] 

Matthew Paduano commented on HADOOP-12563:
------------------------------------------

Thanks for the quick reply.  I would like to briefly recall the history of this 
JIRA from my point of view.  Altiscale needed a utility that could (1) overwrite 
the alias field in a credentials file, to enable passing token files outside of 
firewalls, and (2) print a base64-encoded string to paste into URLs as a 
DELEGATION parameter.   aw suggested I throw in protobufs, invent DtFetcher and 
model the command syntax after CredentialsShell.   I tried to oblige.   But 
mostly all we wanted was to let users have tokens outside firewalls and use 
them via HTTP.  :)
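The base64 part of (2) can be sketched like this.  (This is just an 
illustration of the idea; the real bytes would come from serializing a 
Credentials object, and a plain byte array stands in for them here.)

```java
import java.nio.charset.StandardCharsets;
import java.util.Base64;

public class TokenBase64Demo {
    // Encode raw token-file bytes into a URL-safe base64 string suitable
    // for pasting into a URL as a DELEGATION parameter.
    public static String encodeForUrl(byte[] tokenBytes) {
        return Base64.getUrlEncoder().withoutPadding().encodeToString(tokenBytes);
    }

    public static void main(String[] args) {
        byte[] fake = "fake-token-bytes".getBytes(StandardCharsets.UTF_8);
        System.out.println("...?op=OPEN&delegation=" + encodeForUrl(fake));
    }
}
```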

1.  I am not sure specifically what you mean.   If the DtFetcher code throws an 
exception, it propagates through Get.  When auth fails, for example, one might 
see "java.io.IOException: Failed on local exception: java.io.IOException: 
javax.security.sasl.SaslException: GSS initiate failed [Caused by GSSException: 
No valid credentials provided (Mechanism level: Failed to find any Kerberos 
tgt)];".    In response to your comment I did two things:  (a) I made 
HdfsDtFetcher throw an IOE if fs.getDelegationToken(renewer) returns null 
without throwing an exception; (b) in DtUtilShell.Get.execute() I trapped an 
NPE if the fetcher returns null, indicating the token should not be aliased.  
Maybe addDelegationTokens() needs to accept the alias parameter, in case 
returning a single token for aliasing is not sufficient?  Is this the 
"return null" part you are referring to?
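The null-check in (a) amounts to something like the sketch below.  "Fs" and 
"Token" are simplified stand-ins for the real Hadoop types, not the actual 
HdfsDtFetcher code:

```java
import java.io.IOException;

public class NullTokenCheckDemo {
    // Stand-in for the FileSystem API surface the fetcher uses.
    public interface Fs { Token getDelegationToken(String renewer); }
    public static class Token { }

    // Translate a silent null from getDelegationToken() into an explicit
    // IOException, instead of letting callers hit an NPE later.
    public static Token fetch(Fs fs, String renewer) throws IOException {
        Token token = fs.getDelegationToken(renewer);
        if (token == null) {
            throw new IOException("Failed to fetch token for renewer: " + renewer);
        }
        return token;
    }
}
```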

2.  I added the method isTokenRequired() to the interface.  I am not totally 
clear on how this is to be used.  In HdfsDtFetcher and RmDtFetcher (to be added 
via YARN-4435 once this interface is committed here), I implemented this as 
"return UserGroupInformation.isSecurityEnabled();".
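As a minimal sketch of that interface addition (with a boolean flag standing 
in for the UserGroupInformation.isSecurityEnabled() check, which is not 
available outside Hadoop):

```java
public class TokenRequiredDemo {
    // Stand-in for the method added to the DtFetcher interface.
    public interface DtFetcherLike {
        boolean isTokenRequired();
    }

    // In HdfsDtFetcher this would delegate to
    // UserGroupInformation.isSecurityEnabled(); here a flag stands in.
    public static DtFetcherLike fetcherFor(boolean securityEnabled) {
        return () -> securityEnabled;
    }
}
```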

3.  I have added Hdfs, WebHdfs, SWebHdfs and Rm DtFetcher examples (RmDtFetcher 
is YARN-4435).  I am not sure I really know how to do the RM case correctly and 
hope the review of that JIRA will straighten me out!   As for ATS and KMS, 
perhaps someone can point me to docs/examples from which I can figure out how 
to fetch those tokens correctly/completely?  I can add new JIRAs with my best 
guess at how to do it.

4.  I agree it would be cool to have some mechanism to let Hadoop know how to 
kinit for an OS user who is already authenticated and has OS permissions to 
access a keytab, e.g. "kinit -kt ${KEYTAB_HOME} ${PRINCIPAL}".   But this seems 
like a Hadoop-wide question and not specific to this utility.   Perhaps a 
follow-up JIRA?

5. Part of this came from just following CredentialsShell.  But I can touch 
that up and change the test code.

6.  Get is somewhat abstracted by DtFetcher.addDelegationTokens().   Renew is 
abstracted by token.renew().  Is all you are seeking a method that iterates 
over the tokens in a Credentials object and calls renew(), matching alias 
and/or service?   Or do you additionally want the same file management 
operations, with file names as arguments etc., in an API?   Can this be a 
follow-up JIRA?
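The iterate-and-renew idea could be as simple as the sketch below.  The 
Token class and the map-of-alias-to-token are simplified stand-ins for 
Hadoop's Credentials/Token types, and renewMatching is a hypothetical helper 
name, not existing API:

```java
import java.util.Map;

public class RenewAllDemo {
    public static class Token {
        long expiry;
        public Token(long expiry) { this.expiry = expiry; }
        // Pretend renewal: push the expiry forward.
        public long renew() { expiry += 1000; return expiry; }
    }

    // Renew every token whose alias matches; a null alias means "renew all".
    // Returns the number of tokens renewed.
    public static int renewMatching(Map<String, Token> creds, String alias) {
        int renewed = 0;
        for (Map.Entry<String, Token> e : creds.entrySet()) {
            if (alias == null || alias.equals(e.getKey())) {
                e.getValue().renew();
                renewed++;
            }
        }
        return renewed;
    }
}
```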

minor:  1.  I will remove String.format() if preferred.
        2.  An awful lot of java.util.Date is deprecated.  But Date.toString() 
is not deprecated, nor is the constructor Date(long).
                 http://docs.oracle.com/javase/6/docs/api/java/util/Date.html
                 https://docs.oracle.com/javase/7/docs/api/java/sql/Date.html
                 https://docs.oracle.com/javase/8/docs/api/java/sql/Date.html
                 ???
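Concretely, an expiry time in millis can be rendered using only those two 
non-deprecated members:

```java
import java.util.Date;

public class DateDemo {
    // Render an expiry timestamp using only the non-deprecated
    // Date(long) constructor and Date.toString().
    public static String render(long expiryMillis) {
        return new Date(expiryMillis).toString();
    }
}
```

(The exact string depends on the default locale/timezone, so nothing beyond 
non-emptiness is asserted here.)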


> Updated utility to create/modify token files
> --------------------------------------------
>
>                 Key: HADOOP-12563
>                 URL: https://issues.apache.org/jira/browse/HADOOP-12563
>             Project: Hadoop Common
>          Issue Type: New Feature
>    Affects Versions: 3.0.0
>            Reporter: Allen Wittenauer
>            Assignee: Matthew Paduano
>         Attachments: HADOOP-12563.01.patch, HADOOP-12563.02.patch, 
> HADOOP-12563.03.patch, HADOOP-12563.04.patch, HADOOP-12563.05.patch, 
> HADOOP-12563.06.patch, HADOOP-12563.07.patch, HADOOP-12563.07.patch, 
> dtutil-test-out, example_dtutil_commands_and_output.txt, 
> generalized_token_case.pdf
>
>
> hdfs fetchdt is missing some critical features and is geared almost 
> exclusively towards HDFS operations.  Additionally, the token files that are 
> created use Java serializations which are hard/impossible to deal with in 
> other languages. It should be replaced with a better utility in common that 
> can read/write protobuf-based token files, has enough flexibility to be used 
> with other services, and offers key functionality such as append and rename. 
> The old version file format should still be supported for backward 
> compatibility, but will be effectively deprecated.
> A follow-on JIRA will deprecate fetchdt.



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
