[
https://issues.apache.org/jira/browse/HADOOP-12326?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14712814#comment-14712814
]
Hadoop QA commented on HADOOP-12326:
------------------------------------
\\
\\
| (x) *{color:red}-1 overall{color}* |
\\
\\
|| Vote || Subsystem || Runtime || Comment ||
| {color:red}-1{color} | pre-patch | 22m 57s | Pre-patch trunk has 4 extant Findbugs (version 3.0.0) warnings. |
| {color:green}+1{color} | @author | 0m 0s | The patch does not contain any @author tags. |
| {color:green}+1{color} | tests included | 0m 0s | The patch appears to include 3 new or modified test files. |
| {color:green}+1{color} | javac | 7m 41s | There were no new javac warning messages. |
| {color:green}+1{color} | javadoc | 9m 47s | There were no new javadoc warning messages. |
| {color:green}+1{color} | release audit | 0m 24s | The applied patch does not increase the total number of release audit warnings. |
| {color:green}+1{color} | site | 2m 57s | Site still builds. |
| {color:red}-1{color} | checkstyle | 2m 2s | The applied patch generated 32 new checkstyle issues (total was 424, now 442). |
| {color:red}-1{color} | whitespace | 0m 3s | The patch has 2 line(s) that end in whitespace. Use git apply --whitespace=fix. |
| {color:green}+1{color} | install | 1m 32s | mvn install still works. |
| {color:green}+1{color} | eclipse:eclipse | 0m 31s | The patch built with eclipse:eclipse. |
| {color:green}+1{color} | findbugs | 5m 5s | The patch does not introduce any new Findbugs (version 3.0.0) warnings. |
| {color:red}-1{color} | common tests | 22m 17s | Tests failed in hadoop-common. |
| {color:green}+1{color} | tools/hadoop tests | 6m 13s | Tests passed in hadoop-streaming. |
| {color:red}-1{color} | hdfs tests | 161m 44s | Tests failed in hadoop-hdfs. |
| | | 243m 47s | |
\\
\\
|| Reason || Tests ||
| Failed unit tests | hadoop.cli.TestCLI |
| | hadoop.fs.permission.TestStickyBit |
\\
\\
|| Subsystem || Report/Notes ||
| Patch URL | http://issues.apache.org/jira/secure/attachment/12752394/HADOOP-12326.007.patch |
| Optional Tests | javadoc javac unit findbugs checkstyle site |
| git revision | trunk / a4d9acc |
| Pre-patch Findbugs warnings | https://builds.apache.org/job/PreCommit-HADOOP-Build/7529/artifact/patchprocess/trunkFindbugsWarningshadoop-hdfs.html |
| checkstyle | https://builds.apache.org/job/PreCommit-HADOOP-Build/7529/artifact/patchprocess/diffcheckstylehadoop-common.txt |
| whitespace | https://builds.apache.org/job/PreCommit-HADOOP-Build/7529/artifact/patchprocess/whitespace.txt |
| hadoop-common test log | https://builds.apache.org/job/PreCommit-HADOOP-Build/7529/artifact/patchprocess/testrun_hadoop-common.txt |
| hadoop-streaming test log | https://builds.apache.org/job/PreCommit-HADOOP-Build/7529/artifact/patchprocess/testrun_hadoop-streaming.txt |
| hadoop-hdfs test log | https://builds.apache.org/job/PreCommit-HADOOP-Build/7529/artifact/patchprocess/testrun_hadoop-hdfs.txt |
| Test Results | https://builds.apache.org/job/PreCommit-HADOOP-Build/7529/testReport/ |
| Java | 1.7.0_55 |
| uname | Linux asf906.gq1.ygridcore.net 3.13.0-36-lowlatency #63-Ubuntu SMP PREEMPT Wed Sep 3 21:56:12 UTC 2014 x86_64 x86_64 x86_64 GNU/Linux |
| Console output | https://builds.apache.org/job/PreCommit-HADOOP-Build/7529/console |
This message was automatically generated.
> Implement ChecksumFileSystem#getFileChecksum equivalent to HDFS for easy check
> ------------------------------------------------------------------------------
>
> Key: HADOOP-12326
> URL: https://issues.apache.org/jira/browse/HADOOP-12326
> Project: Hadoop Common
> Issue Type: Improvement
> Components: fs
> Affects Versions: 2.7.1
> Reporter: Gera Shegalov
> Assignee: Gera Shegalov
> Attachments: HADOOP-12326.001.patch, HADOOP-12326.002.patch,
> HADOOP-12326.003.patch, HADOOP-12326.004.patch, HADOOP-12326.005.patch,
> HADOOP-12326.007.patch
>
>
> If we have same-content files, one local and one remote on HDFS (after
> downloading or uploading), getFileChecksum can provide a quick check whether
> they are consistent. To this end, we can switch to CRC32C on local
> filesystem. The difference in block sizes does not matter, because for the
> local filesystem it's just a logical parameter.
> {code}
> $ hadoop fs -Dfs.local.block.size=134217728 -checksum file:${PWD}/part-m-00000 part-m-00000
> 15/08/15 13:30:02 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
> file:///Users/gshegalov/workspace/hadoop-common/part-m-00000   MD5-of-262144MD5-of-512CRC32C   000002000000000000040000e84fb07f8c9d4ef3acb5d1983a7e2a68
> part-m-00000   MD5-of-262144MD5-of-512CRC32C   000002000000000000040000e84fb07f8c9d4ef3acb5d1983a7e2a68
> {code}
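The comparison above is easy to script: `hadoop fs -checksum` prints one line per path in the form `<path> <algorithm> <digest>`, so two files match when everything after the path column agrees. A minimal sketch, assuming `hadoop` is on the PATH; the paths, block size, and the `compare_checksums` helper are illustrative, not part of Hadoop:

```shell
#!/bin/sh
# Sketch of the quick consistency check described above.
# With fs.local.block.size set to the HDFS block size, the combined checksum
# algorithm name (e.g. MD5-of-262144MD5-of-512CRC32C) lines up on both sides.

# compare_checksums LINE1 LINE2
# Drops the leading path column and compares "<algorithm> <digest>".
# Returns 0 (success) when the checksums match.
compare_checksums() {
  a=$(printf '%s\n' "$1" | awk '{print $2, $3}')
  b=$(printf '%s\n' "$2" | awk '{print $2, $3}')
  [ -n "$a" ] && [ "$a" = "$b" ]
}

# Real usage (not executed here; requires a Hadoop installation):
#   local_line=$(hadoop fs -Dfs.local.block.size=134217728 -checksum "file:$PWD/part-m-00000")
#   remote_line=$(hadoop fs -checksum /user/me/part-m-00000)
#   compare_checksums "$local_line" "$remote_line" && echo MATCH || echo MISMATCH
```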
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)