[ https://issues.apache.org/jira/browse/HADOOP-14821?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16369315#comment-16369315 ]

Steve Loughran commented on HADOOP-14821:
-----------------------------------------

I believe HADOOP-14507 may help here. It doesn't address the direct problem of 
"multiple files", but since it supports side-by-side login secrets in a single 
file, merging the contents of the different files should be possible, provided 
they are for accessing different buckets.
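
As a sketch of the merge that HADOOP-14507's per-bucket secrets would enable (the alias names and the merged-store path below are illustrative, not taken from the issue):

```shell
# List the aliases held in each existing store:
hadoop credential list -provider jceks://hdfs/tmp/aws.jceks
hadoop credential list -provider jceks://hdfs/tmp/awst.jceks

# Re-create each alias in one merged store; with per-bucket option names,
# secrets for different buckets can coexist side by side.
# (the command prompts for each secret value)
hadoop credential create fs.s3a.bucket.bucket1.access.key \
  -provider jceks://hdfs/tmp/merged.jceks
hadoop credential create fs.s3a.bucket.bucket1.secret.key \
  -provider jceks://hdfs/tmp/merged.jceks
```

After merging, a single provider path would sidestep the multi-file permission problem entirely.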

> Executing the command 'hdfs 
> -Dhadoop.security.credential.provider.path=file1.jceks,file2.jceks' fails if 
> permission is denied to some files
> -------------------------------------------------------------------------------------------------------------------------------------------
>
>                 Key: HADOOP-14821
>                 URL: https://issues.apache.org/jira/browse/HADOOP-14821
>             Project: Hadoop Common
>          Issue Type: Improvement
>          Components: fs/s3, hdfs-client, security
>    Affects Versions: 2.8.0
>            Reporter: Ernani Pereira de Mattos Junior
>            Priority: Critical
>              Labels: features
>
> ======= 
> Request Use Case: 
> UC1: 
> The customer has the path to a directory and subdirectories full of keys. The 
> customer knows that he does not have access to all the keys but, ignoring 
> this problem, makes a list of the keys. 
> UC1.2: 
> The customer, in FIFO order, tries to access each key on the list. If access 
> is granted locally, he can then attempt the login on s3a. 
> UC1.3: 
> If access to a key is not granted locally, he skips the login on s3a and 
> tries the next key on the list. 
> ===========
> For now, UC1.3 fails with the exception below instead of trying the next key:
> {code}
> $ hdfs  --loglevel DEBUG dfs 
> -Dhadoop.security.credential.provider.path=jceks://hdfs/tmp/aws.jceks,jceks://hdfs/tmp/awst.jceks
>  -ls s3a://av-dl-hwx-nprod-anhffpoc-enriched/hive/e_ceod/
> Not retrying because try once and fail.
> org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.security.AccessControlException):
>  Permission denied: user=502549376, access=READ, 
> inode="/tmp/aws.jceks":admin:hdfs:-rwx------
> {code}
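
The skip-on-denied behaviour the reporter asks for could be approximated client-side by filtering the provider list before setting the property. A hedged sketch: `readable_providers` is a hypothetical helper, and it only covers locally readable keystore files, not jceks://hdfs paths whose permissions are enforced by the NameNode:

```shell
#!/bin/sh
# Build a hadoop.security.credential.provider.path value from only the
# keystore files the current user can actually read, skipping the rest.
readable_providers() {
  out=""
  for f in "$@"; do
    if [ -r "$f" ]; then
      # Append as a local-file jceks provider, comma-separated.
      out="${out:+$out,}jceks://file$f"
    fi
  done
  printf '%s\n' "$out"
}

# Example usage (paths are illustrative):
# providers=$(readable_providers /tmp/aws.jceks /tmp/awst.jceks)
# hdfs dfs -Dhadoop.security.credential.provider.path="$providers" -ls s3a://...
```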



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

---------------------------------------------------------------------
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org
