[jira] [Commented] (HDFS-8545) Refactor FS#getUsed() to use ContentSummary and add an API to fetch the total file length from a specific path
[ https://issues.apache.org/jira/browse/HDFS-8545?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14980437#comment-14980437 ]

Hudson commented on HDFS-8545:
------------------------------

FAILURE: Integrated in Hadoop-Hdfs-trunk-Java8 #550 (See [https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/550/])
HDFS-8545. Refactor FS#getUsed() to use ContentSummary and add an API to (vinayakumarb: rev 7d2d16f4ee87ae56dc20016a91c109dd5130f7d4)
* hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/fs/HarFileSystem.java
* hadoop-hdfs-project/hadoop-hdfs/src/test/java/org/apache/hadoop/hdfs/TestDistributedFileSystem.java
* hadoop-hdfs-project/hadoop-hdfs/CHANGES.txt
* hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/fs/FileSystem.java
* hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/fs/FilterFileSystem.java

> Refactor FS#getUsed() to use ContentSummary and add an API to fetch the total
> file length from a specific path
> -----------------------------------------------------------------------------
>
>                 Key: HDFS-8545
>                 URL: https://issues.apache.org/jira/browse/HDFS-8545
>             Project: Hadoop HDFS
>          Issue Type: Improvement
>            Reporter: J.Andreina
>            Assignee: J.Andreina
>            Priority: Minor
>             Fix For: 2.8.0
>
>         Attachments: HDFS-8545.1.patch, HDFS-8545.2.patch, HDFS-8545.3.patch,
>                      HDFS-8545.4.patch, HDFS-8545.5.patch, HDFS-8545.6.patch, HDFS-8545.7.patch
>
> Currently, FileSystem#getUsed() returns the total file size from the root by default.
> It would be good to have an API that returns the total file size from a specified path,
> the same as specifying the path in "./hdfs dfs -du -s /path".

--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
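For readers unfamiliar with the semantics being added: the new path-taking overload of FileSystem#getUsed sums the lengths of all files under the given path, just as `hdfs dfs -du -s /path` does. Below is a minimal plain-Java analogue of that summation over a local directory tree; it is an illustration of the semantics only, not the Hadoop implementation (which the patch refactors to delegate to ContentSummary).

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.stream.Stream;

public class DuSummary {
    /** Sum of the lengths of all regular files under root,
     *  analogous to the semantics of FileSystem#getUsed(Path). */
    static long totalLength(Path root) throws IOException {
        try (Stream<Path> paths = Files.walk(root)) {
            return paths.filter(Files::isRegularFile)
                        .mapToLong(p -> p.toFile().length())
                        .sum();
        }
    }

    public static void main(String[] args) throws IOException {
        // Build a small tree: 10 bytes at the top, 32 bytes in a subdirectory.
        Path dir = Files.createTempDirectory("du");
        Files.write(dir.resolve("a.txt"), new byte[10]);
        Path sub = Files.createDirectories(dir.resolve("sub"));
        Files.write(sub.resolve("b.txt"), new byte[32]);

        long used = totalLength(dir);
        if (used != 42) throw new AssertionError("expected 42, got " + used);
        System.out.println("total length = " + used); // total length = 42
    }
}
```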
[ https://issues.apache.org/jira/browse/HDFS-8545?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14979985#comment-14979985 ]

Hudson commented on HDFS-8545:
------------------------------

FAILURE: Integrated in Hadoop-Mapreduce-trunk-Java8 #599 (See [https://builds.apache.org/job/Hadoop-Mapreduce-trunk-Java8/599/])
HDFS-8545. Refactor FS#getUsed() to use ContentSummary and add an API to (vinayakumarb: rev 7d2d16f4ee87ae56dc20016a91c109dd5130f7d4)
* hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/fs/FilterFileSystem.java
* hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/fs/HarFileSystem.java
* hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/fs/FileSystem.java
* hadoop-hdfs-project/hadoop-hdfs/CHANGES.txt
* hadoop-hdfs-project/hadoop-hdfs/src/test/java/org/apache/hadoop/hdfs/TestDistributedFileSystem.java
[ https://issues.apache.org/jira/browse/HDFS-8545?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14980009#comment-14980009 ]

Hudson commented on HDFS-8545:
------------------------------

FAILURE: Integrated in Hadoop-Mapreduce-trunk #2542 (See [https://builds.apache.org/job/Hadoop-Mapreduce-trunk/2542/])
HDFS-8545. Refactor FS#getUsed() to use ContentSummary and add an API to (vinayakumarb: rev 7d2d16f4ee87ae56dc20016a91c109dd5130f7d4)
* hadoop-hdfs-project/hadoop-hdfs/src/test/java/org/apache/hadoop/hdfs/TestDistributedFileSystem.java
* hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/fs/FilterFileSystem.java
* hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/fs/FileSystem.java
* hadoop-hdfs-project/hadoop-hdfs/CHANGES.txt
* hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/fs/HarFileSystem.java
[ https://issues.apache.org/jira/browse/HDFS-8545?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14979883#comment-14979883 ]

Vinayakumar B commented on HDFS-8545:
-------------------------------------

Thanks [~andreina] for the checkstyle fix.
+1 for the latest revision. Will commit shortly.
[ https://issues.apache.org/jira/browse/HDFS-8545?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14979960#comment-14979960 ]

Hudson commented on HDFS-8545:
------------------------------

FAILURE: Integrated in Hadoop-trunk-Commit #8723 (See [https://builds.apache.org/job/Hadoop-trunk-Commit/8723/])
HDFS-8545. Refactor FS#getUsed() to use ContentSummary and add an API to (vinayakumarb: rev 7d2d16f4ee87ae56dc20016a91c109dd5130f7d4)
* hadoop-hdfs-project/hadoop-hdfs/CHANGES.txt
* hadoop-hdfs-project/hadoop-hdfs/src/test/java/org/apache/hadoop/hdfs/TestDistributedFileSystem.java
* hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/fs/FileSystem.java
* hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/fs/FilterFileSystem.java
* hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/fs/HarFileSystem.java
[ https://issues.apache.org/jira/browse/HDFS-8545?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14980100#comment-14980100 ]

Hadoop QA commented on HDFS-8545:
---------------------------------

(x) *-1 overall*

|| Vote || Subsystem || Runtime || Comment ||
| 0 | reexec | 0m 8s | docker + precommit patch detected. |
| +1 | @author | 0m 0s | The patch does not contain any @author tags. |
| +1 | test4tests | 0m 0s | The patch appears to include 1 new or modified test files. |
| +1 | mvninstall | 3m 21s | trunk passed |
| +1 | compile | 4m 52s | trunk passed with JDK v1.8.0_66 |
| +1 | compile | 4m 33s | trunk passed with JDK v1.7.0_79 |
| +1 | checkstyle | 1m 0s | trunk passed |
| +1 | mvneclipse | 0m 30s | trunk passed |
| -1 | findbugs | 1m 57s | hadoop-hdfs-project/hadoop-hdfs in trunk cannot run convertXmlToText from findbugs |
| +1 | javadoc | 2m 8s | trunk passed with JDK v1.8.0_66 |
| +1 | javadoc | 3m 1s | trunk passed with JDK v1.7.0_79 |
| +1 | mvninstall | 2m 20s | the patch passed |
| +1 | compile | 4m 47s | the patch passed with JDK v1.8.0_66 |
| +1 | javac | 4m 47s | the patch passed |
| +1 | compile | 4m 33s | the patch passed with JDK v1.7.0_79 |
| +1 | javac | 4m 33s | the patch passed |
| -1 | checkstyle | 1m 3s | Patch generated 1 new checkstyle issues in root (total was 200, now 200). |
| +1 | mvneclipse | 0m 29s | the patch passed |
| +1 | whitespace | 0m 0s | Patch has no whitespace issues. |
| +1 | findbugs | 4m 9s | the patch passed |
| +1 | javadoc | 2m 9s | the patch passed with JDK v1.8.0_66 |
| +1 | javadoc | 2m 58s | the patch passed with JDK v1.7.0_79 |
| -1 | unit | 7m 52s | hadoop-common in the patch failed with JDK v1.8.0_66. |
| -1 | unit | 70m 48s | hadoop-hdfs in the patch failed with JDK v1.8.0_66. |
| -1 | unit | 8m 26s | hadoop-common in the patch failed with JDK v1.7.0_79. |
| -1 | unit | 68m 35s | hadoop-hdfs in the patch failed with JDK v1.7.0_79. |
| -1 | asflicense | 0m 21s | Patch generated 56 ASF License warnings. |
| | | 203m 7s | |

|| Reason || Tests ||
| JDK v1.7.0_79 Failed junit tests | hadoop.security.ssl.TestReloadingX509TrustManager |
| | hadoop.hdfs.tools.TestDFSZKFailoverController |
| | hadoop.hdfs.TestDFSUpgradeFromImage |
| | hadoop.hdfs.server.datanode.TestBlockScanner |
| | hadoop.hdfs.server.namenode.ha.TestDNFencing |
| | hadoop.hdfs.server.datanode.TestDataNodeVolumeFailure |
| | hadoop.hdfs.server.namenode.ha.TestSeveralNameNodes |
| | hadoop.hdfs.TestEncryptionZones |
| | hadoop.hdfs.server.blockmanagement.TestNodeCount |
[ https://issues.apache.org/jira/browse/HDFS-8545?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14980151#comment-14980151 ]

Hudson commented on HDFS-8545:
------------------------------

FAILURE: Integrated in Hadoop-Yarn-trunk #1335 (See [https://builds.apache.org/job/Hadoop-Yarn-trunk/1335/])
HDFS-8545. Refactor FS#getUsed() to use ContentSummary and add an API to (vinayakumarb: rev 7d2d16f4ee87ae56dc20016a91c109dd5130f7d4)
* hadoop-hdfs-project/hadoop-hdfs/src/test/java/org/apache/hadoop/hdfs/TestDistributedFileSystem.java
* hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/fs/HarFileSystem.java
* hadoop-hdfs-project/hadoop-hdfs/CHANGES.txt
* hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/fs/FilterFileSystem.java
* hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/fs/FileSystem.java
[ https://issues.apache.org/jira/browse/HDFS-8545?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14980208#comment-14980208 ]

Hudson commented on HDFS-8545:
------------------------------

FAILURE: Integrated in Hadoop-Hdfs-trunk #2488 (See [https://builds.apache.org/job/Hadoop-Hdfs-trunk/2488/])
HDFS-8545. Refactor FS#getUsed() to use ContentSummary and add an API to (vinayakumarb: rev 7d2d16f4ee87ae56dc20016a91c109dd5130f7d4)
* hadoop-hdfs-project/hadoop-hdfs/CHANGES.txt
* hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/fs/HarFileSystem.java
* hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/fs/FileSystem.java
* hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/fs/FilterFileSystem.java
* hadoop-hdfs-project/hadoop-hdfs/src/test/java/org/apache/hadoop/hdfs/TestDistributedFileSystem.java
[ https://issues.apache.org/jira/browse/HDFS-8545?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14980159#comment-14980159 ]

Hudson commented on HDFS-8545:
------------------------------

FAILURE: Integrated in Hadoop-Yarn-trunk-Java8 #612 (See [https://builds.apache.org/job/Hadoop-Yarn-trunk-Java8/612/])
HDFS-8545. Refactor FS#getUsed() to use ContentSummary and add an API to (vinayakumarb: rev 7d2d16f4ee87ae56dc20016a91c109dd5130f7d4)
* hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/fs/FileSystem.java
* hadoop-hdfs-project/hadoop-hdfs/CHANGES.txt
* hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/fs/FilterFileSystem.java
* hadoop-hdfs-project/hadoop-hdfs/src/test/java/org/apache/hadoop/hdfs/TestDistributedFileSystem.java
* hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/fs/HarFileSystem.java
[ https://issues.apache.org/jira/browse/HDFS-8545?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14978102#comment-14978102 ]

Vinayakumar B commented on HDFS-8545:
-------------------------------------

+1, LGTM. Pending Jenkins.
[ https://issues.apache.org/jira/browse/HDFS-8545?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14979775#comment-14979775 ]

Hadoop QA commented on HDFS-8545:
---------------------------------

(x) *-1 overall*

|| Vote || Subsystem || Runtime || Comment ||
| 0 | reexec | 0m 7s | docker + precommit patch detected. |
| +1 | @author | 0m 0s | The patch does not contain any @author tags. |
| +1 | test4tests | 0m 0s | The patch appears to include 1 new or modified test files. |
| +1 | mvninstall | 3m 35s | trunk passed |
| +1 | compile | 5m 17s | trunk passed with JDK v1.8.0_60 |
| +1 | compile | 4m 48s | trunk passed with JDK v1.7.0_79 |
| +1 | checkstyle | 1m 7s | trunk passed |
| +1 | mvneclipse | 0m 30s | trunk passed |
| -1 | findbugs | 2m 6s | hadoop-hdfs-project/hadoop-hdfs in trunk cannot run convertXmlToText from findbugs |
| +1 | javadoc | 2m 31s | trunk passed with JDK v1.8.0_60 |
| +1 | javadoc | 3m 23s | trunk passed with JDK v1.7.0_79 |
| +1 | mvninstall | 2m 16s | the patch passed |
| +1 | compile | 5m 7s | the patch passed with JDK v1.8.0_60 |
| +1 | javac | 5m 7s | the patch passed |
| +1 | compile | 4m 42s | the patch passed with JDK v1.7.0_79 |
| +1 | javac | 4m 42s | the patch passed |
| -1 | checkstyle | 1m 3s | Patch generated 2 new checkstyle issues in root (total was 200, now 201). |
| +1 | mvneclipse | 0m 29s | the patch passed |
| +1 | whitespace | 0m 0s | Patch has no whitespace issues. |
| +1 | findbugs | 4m 31s | the patch passed |
| +1 | javadoc | 2m 26s | the patch passed with JDK v1.8.0_60 |
| +1 | javadoc | 3m 26s | the patch passed with JDK v1.7.0_79 |
| -1 | unit | 7m 35s | hadoop-common in the patch failed with JDK v1.8.0_60. |
| -1 | unit | 57m 46s | hadoop-hdfs in the patch failed with JDK v1.8.0_60. |
| -1 | unit | 7m 21s | hadoop-common in the patch failed with JDK v1.7.0_79. |
| -1 | unit | 61m 50s | hadoop-hdfs in the patch failed with JDK v1.7.0_79. |
| -1 | asflicense | 0m 31s | Patch generated 58 ASF License warnings. |
| | | 185m 54s | |

|| Reason || Tests ||
| JDK v1.7.0_79 Failed junit tests | hadoop.crypto.key.TestValueQueue |
| | hadoop.hdfs.server.namenode.TestDecommissioningStatus |
| | hadoop.hdfs.TestDFSStripedOutputStreamWithFailure010 |
| | hadoop.hdfs.TestInjectionForSimulatedStorage |
| | hadoop.hdfs.server.blockmanagement.TestPendingInvalidateBlock |
| | hadoop.hdfs.TestDFSUpgradeFromImage |
| | hadoop.hdfs.server.namenode.ha.TestDNFencing |
| | hadoop.metrics2.impl.TestMetricsSystemImpl |
| | hadoop.hdfs.server.namenode.ha.TestHASafeMode |
[ https://issues.apache.org/jira/browse/HDFS-8545?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14958763#comment-14958763 ]

Hadoop QA commented on HDFS-8545:
---------------------------------

(x) *-1 overall*

|| Vote || Subsystem || Runtime || Comment ||
| -1 | pre-patch | 22m 52s | Findbugs (version ) appears to be broken on trunk. |
| +1 | @author | 0m 0s | The patch does not contain any @author tags. |
| +1 | tests included | 0m 0s | The patch appears to include 1 new or modified test files. |
| +1 | javac | 11m 13s | There were no new javac warning messages. |
| +1 | javadoc | 12m 38s | There were no new javadoc warning messages. |
| -1 | release audit | 0m 20s | The applied patch generated 1 release audit warnings. |
| -1 | checkstyle | 1m 44s | The applied patch generated 2 new checkstyle issues (total was 200, now 201). |
| +1 | whitespace | 0m 0s | The patch has no lines that end in whitespace. |
| +1 | install | 1m 47s | mvn install still works. |
| +1 | eclipse:eclipse | 0m 35s | The patch built with eclipse:eclipse. |
| -1 | findbugs | 5m 0s | The patch appears to introduce 1 new Findbugs (version 3.0.0) warnings. |
| -1 | common tests | 8m 8s | Tests failed in hadoop-common. |
| -1 | hdfs tests | 65m 30s | Tests failed in hadoop-hdfs. |
| | | 130m 13s | |

|| Reason || Tests ||
| FindBugs | module:hadoop-hdfs |
| Failed unit tests | hadoop.metrics2.impl.TestMetricsSystemImpl |
| | hadoop.hdfs.server.namenode.ha.TestFailureToReadEdits |
| | hadoop.hdfs.server.blockmanagement.TestUnderReplicatedBlocks |
| | hadoop.fs.TestGlobPaths |
| Timed out tests | org.apache.hadoop.hdfs.TestReplication |
| | org.apache.hadoop.hdfs.TestRollingUpgrade |
| | org.apache.hadoop.hdfs.crypto.TestHdfsCryptoStreams |
| | org.apache.hadoop.hdfs.TestParallelUnixDomainRead |

|| Subsystem || Report/Notes ||
| Patch URL | http://issues.apache.org/jira/secure/attachment/12766753/HDFS-8545.5.patch |
| Optional Tests | javadoc javac unit findbugs checkstyle |
| git revision | trunk / 63020c5 |
| Release Audit | https://builds.apache.org/job/PreCommit-HDFS-Build/13001/artifact/patchprocess/patchReleaseAuditProblems.txt |
| checkstyle | https://builds.apache.org/job/PreCommit-HDFS-Build/13001/artifact/patchprocess/diffcheckstylehadoop-common.txt |
| Findbugs warnings | https://builds.apache.org/job/PreCommit-HDFS-Build/13001/artifact/patchprocess/newPatchFindbugsWarningshadoop-hdfs.html |
| hadoop-common test log | https://builds.apache.org/job/PreCommit-HDFS-Build/13001/artifact/patchprocess/testrun_hadoop-common.txt |
| hadoop-hdfs test log | https://builds.apache.org/job/PreCommit-HDFS-Build/13001/artifact/patchprocess/testrun_hadoop-hdfs.txt |
| Test Results | https://builds.apache.org/job/PreCommit-HDFS-Build/13001/testReport/ |
| Java | 1.7.0_55 |
| uname | Linux asf909.gq1.ygridcore.net 3.13.0-36-lowlatency #63-Ubuntu SMP PREEMPT Wed Sep 3 21:56:12 UTC 2014 x86_64 x86_64 x86_64 GNU/Linux |
| Console output | https://builds.apache.org/job/PreCommit-HDFS-Build/13001/console |

This message was automatically generated.
[ https://issues.apache.org/jira/browse/HDFS-8545?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14958793#comment-14958793 ]

Vinayakumar B commented on HDFS-8545:
-------------------------------------

Thanks [~andreina] for the update. The refactor looks great. Just one minor comment: can you please move the new test {{TestDFSShell#testTotalDfsUsed()}} to {{TestDistributedFileSystem}}? It is not a shell test; it tests the new API you have added.
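The delegation pattern the issue title describes (the existing no-argument getUsed() refactored on top of a new path-taking overload backed by a content summary) can be sketched in self-contained Java. This is a hypothetical illustration; the names mirror, but are not, the actual org.apache.hadoop.fs.FileSystem API, and the toy summary function stands in for ContentSummary#getLength().

```java
// Hypothetical sketch of the refactor: getUsed() delegates to getUsed(path),
// which reads a per-path content summary. Not the real Hadoop code.
interface SummarizingFs {
    long contentLength(String path); // stand-in for getContentSummary(path).getLength()

    // Old API: total file length of the whole tree, i.e. from the root.
    default long getUsed() { return getUsed("/"); }

    // New API: total file length under a specific path.
    default long getUsed(String path) { return contentLength(path); }
}

public class GetUsedSketch {
    public static void main(String[] args) {
        // Toy summary: the whole tree holds 100 bytes, any subpath holds 40.
        SummarizingFs fs = path -> path.equals("/") ? 100L : 40L;
        System.out.println(fs.getUsed());        // 100
        System.out.println(fs.getUsed("/data")); // 40
    }
}
```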