[ https://issues.apache.org/jira/browse/HADOOP-14821?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16152767#comment-16152767 ]

Larry McCay commented on HADOOP-14821:
--------------------------------------

This is something I was aware of, but I hadn't come across actual 
requirements for a provider path with varying file permissions, especially 
for credentials that need to be accessed by an end user. It seemed most 
use cases would have the credential store(s) within the user's own home directory.

That said, I have always thought it might be better to allow traversal of 
the full path before raising an error.

Questions:

1. Is a configured provider.path that you aren't allowed to access a 
configuration error? I think it should be.
2. If a credential isn't found at the end of the path traversal, is that an 
error, or are the semantics context-based? I think it is context-based.
3. If a credential isn't found at the end of the path traversal and there was a 
permissions problem with one of the providers, is that a configuration error, 
context-based handling of a missing credential, or both?

My thinking on this has always led me to the conclusion that any inability to 
read one of the configured providers is a configuration error and should be 
treated as a failure. I am certainly open to better suggestions, though.
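
To make that concrete, here is a minimal sketch of what tolerant full-path 
traversal could look like against the org.apache.hadoop.security.alias API. 
resolveCredential is a hypothetical helper, not existing Hadoop code, and in 
the reported failure the AccessControlException appears to be raised while the 
jceks-backed provider itself is being loaded, so real tolerance would likely 
need to reach into CredentialProviderFactory.getProviders() as well.

{code}
import java.io.IOException;
import java.util.ArrayList;
import java.util.List;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.security.alias.CredentialProvider;
import org.apache.hadoop.security.alias.CredentialProviderFactory;

public class TolerantCredentialResolver {

  /**
   * Hypothetical helper: visit every provider on the configured
   * hadoop.security.credential.provider.path, collecting per-provider
   * read failures instead of failing fast on the first one.
   */
  public static char[] resolveCredential(Configuration conf, String alias)
      throws IOException {
    List<IOException> failures = new ArrayList<>();
    for (CredentialProvider provider :
        CredentialProviderFactory.getProviders(conf)) {
      try {
        CredentialProvider.CredentialEntry entry =
            provider.getCredentialEntry(alias);
        if (entry != null) {
          // Found: earlier unreadable providers no longer matter (Q2).
          return entry.getCredential();
        }
      } catch (IOException e) {
        // e.g. permission denied on one jceks file; remember and move on.
        failures.add(e);
      }
    }
    if (!failures.isEmpty()) {
      // Not found anywhere AND some providers were unreadable: surface
      // that as a configuration error (Q1/Q3).
      IOException err = new IOException("Credential " + alias
          + " not found; " + failures.size()
          + " provider(s) on the path could not be read");
      for (IOException f : failures) {
        err.addSuppressed(f);
      }
      throw err;
    }
    // Not found, but every provider was readable: leave the semantics of
    // the missing credential to the caller's context.
    return null;
  }
}
{code}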



> Executing the command 'hdfs 
> -Dhadoop.security.credential.provider.path=file1.jceks,file2.jceks' fails if 
> permission is denied to some files
> -------------------------------------------------------------------------------------------------------------------------------------------
>
>                 Key: HADOOP-14821
>                 URL: https://issues.apache.org/jira/browse/HADOOP-14821
>             Project: Hadoop Common
>          Issue Type: Improvement
>          Components: fs/s3, hdfs-client, security
>    Affects Versions: 2.8.0
>            Reporter: Ernani Pereira de Mattos Junior
>            Priority: Critical
>              Labels: features
>
> ======= 
> Request Use Case: 
> UC1: 
> The customer has the path to a directory and subdirectories full of keys. The 
> customer knows that he does not have access to all of the keys but, ignoring 
> this problem, makes a list of the keys. 
> UC1.2: 
> The customer tries to access each key on the list in FIFO order. If access 
> is granted locally, then he can attempt the login on s3a. 
> UC1.3: 
> The customer tries to access each key on the list in FIFO order. If access 
> is not granted locally, then he skips the login on s3a and tries the next 
> key on the list. 
> ===========
> For now, UC1.3 fails with the exception below and does not try the next key:
> {code}
> $ hdfs --loglevel DEBUG dfs -Dhadoop.security.credential.provider.path=jceks://hdfs/tmp/aws.jceks,jceks://hdfs/tmp/awst.jceks -ls s3a://av-dl-hwx-nprod-anhffpoc-enriched/hive/e_ceod/
> Not retrying because try once and fail.
> org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.security.AccessControlException): Permission denied: user=502549376, access=READ, inode="/tmp/aws.jceks":admin:hdfs:-rwx------
> {code}



--
This message was sent by Atlassian JIRA
(v6.4.14#64029)
