[ https://issues.apache.org/jira/browse/HDFS-11848?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16313613#comment-16313613 ]
Manoj Govindassamy commented on HDFS-11848:
-------------------------------------------

Thanks for the patch revision. Looks good overall. +1, with a few more nit questions below.

1. {{TestDFSAdmin:778}}: since the path is "", shouldn't it list all open files, and hence shouldn't the validation be against {{openFilesMap}} instead of {{openFiles1}}?
2. The input paths are treated as plain strings, right? That is, even an input path that is not a valid path can still filter the results. For example, "/dir1/dir2/d" can match files under both "/dir1/dir2/dir3/" and "/dir1/dir2/dir4/".
3. If (2) is true, would it be useful to accept the input path as a regex pattern? I'm totally OK with not doing this, or with taking it up in a different jira.

> Enhance dfsadmin listOpenFiles command to list files under a given path
> -----------------------------------------------------------------------
>
>                 Key: HDFS-11848
>                 URL: https://issues.apache.org/jira/browse/HDFS-11848
>             Project: Hadoop HDFS
>          Issue Type: Improvement
>    Affects Versions: 3.0.0-alpha1
>            Reporter: Manoj Govindassamy
>            Assignee: Yiqun Lin
>         Attachments: HDFS-11848.001.patch, HDFS-11848.002.patch, HDFS-11848.003.patch
>
> HDFS-10480 added a {{listOpenFiles}} option to the {{dfsadmin}} command to list all the open files in the system.
> One more thing that would be nice here is to filter the output by a given path or DataNode. Use case: an admin might already know a stale file by path (perhaps from fsck's -openforwrite) and want to figure out who the lease holder is. The proposal here is to add suboptions to {{listOpenFiles}} to list files filtered by path.
> {{LeaseManager#getINodeWithLeases(INodeDirectory)}} can be used to get the open file list for any given ancestor directory.

--
This message was sent by Atlassian JIRA
(v6.4.14#64029)
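The plain string-prefix behavior raised in question (2) above can be sketched as follows. This is a hypothetical standalone helper, not the actual HDFS patch code; it only illustrates how a non-existent path like "/dir1/dir2/d" would still act as a filter if matching is done on the raw string.

```java
import java.util.Arrays;
import java.util.List;
import java.util.stream.Collectors;

public class PrefixFilterDemo {

    // Hypothetical filter: keep open-file paths that start with the given
    // string, the way a raw String.startsWith() comparison would behave.
    static List<String> filterByPrefix(List<String> openFiles, String path) {
        return openFiles.stream()
                .filter(f -> f.startsWith(path))
                .collect(Collectors.toList());
    }

    public static void main(String[] args) {
        List<String> openFiles = Arrays.asList(
                "/dir1/dir2/dir3/a.log",
                "/dir1/dir2/dir4/b.log",
                "/dir1/other/c.log");

        // "/dir1/dir2/d" is not a valid directory in this namespace, yet as a
        // string prefix it matches files under both dir3/ and dir4/.
        System.out.println(filterByPrefix(openFiles, "/dir1/dir2/d"));
        // An empty path "" matches every open file, which is why the "" case
        // in TestDFSAdmin should validate against the full open-files map.
        System.out.println(filterByPrefix(openFiles, ""));
    }
}
```

If the intent is directory-scoped filtering rather than raw prefix matching, the check would need to compare whole path components (e.g. require the prefix to end at a "/" boundary), which is essentially what a regex or component-wise match would buy.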