[jira] [Created] (HADOOP-16836) Bug in widely-used helper function caused valid configuration value to fail on multiple tests, causing build failure

2020-02-01 Thread Ctest (Jira)
Ctest created HADOOP-16836:
--

 Summary: Bug in widely-used helper function caused valid 
configuration value to fail on multiple tests, causing build failure
 Key: HADOOP-16836
 URL: https://issues.apache.org/jira/browse/HADOOP-16836
 Project: Hadoop Common
  Issue Type: Bug
  Components: common
Affects Versions: 3.2.1
Reporter: Ctest


*## Title*

A bug in a widely-used test helper function causes multiple tests to fail on a 
valid configuration value.


*## Description*

The test helper function 
`org.apache.hadoop.io.file.tfile.TestTFileByteArrays#readRecords(org.apache.hadoop.fs.FileSystem, org.apache.hadoop.fs.Path, int, org.apache.hadoop.conf.Configuration)` 
(abbreviated as `readRecords()` below) is called by the 4 actively-used tests below:

 
{code:java}
org.apache.hadoop.io.file.tfile.TestTFileStreams#testOneEntryMixedLengths1
org.apache.hadoop.io.file.tfile.TestTFileStreams#testOneEntryUnknownLength
org.apache.hadoop.io.file.tfile.TestTFileLzoCodecsStreams#testOneEntryMixedLengths1
org.apache.hadoop.io.file.tfile.TestTFileLzoCodecsStreams#testOneEntryUnknownLength{code}

These tests first call 
`org.apache.hadoop.io.file.tfile.TestTFileStreams#writeRecords(int count, 
boolean knownKeyLength, boolean knownValueLength, boolean close)` to write 
`key-value` records into a `TFile`, then call the helper function 
`readRecords()` to assert that the `key` and `value` parts of the stored 
records match what was previously written. The `value` parts used by these 
tests are hardcoded strings with a length of 6.
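For context, here is a condensed, hypothetical sketch of that write-then-read 
shape; the path, block size, compression, and comparator settings below are 
illustrative assumptions, not the exact constants used by the tests:

{code:java}
// Hypothetical sketch, not the actual test code: write one record with a
// 6-byte value, then read it back the way readRecords() does.
Configuration conf = new Configuration();
FileSystem fs = FileSystem.getLocal(conf);
Path path = new Path("/tmp/tfile-sketch");                // illustrative path

FSDataOutputStream out = fs.create(path);
TFile.Writer writer = new TFile.Writer(out, 64 * 1024, "none", "memcmp", conf);
writer.append("key0".getBytes(), "value0".getBytes());    // 6-byte value, like the tests
writer.close();
out.close();

// readRecords(fs, path, 1, conf) then opens a TFile.Reader on the same path
// and asserts that the key and value read back match what was written.
{code}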


Whether the assertions in `readRecords()` pass depends directly on the value of 
the configuration parameter `tfile.io.chunk.size`. The formal definition of 
`tfile.io.chunk.size` is "Value chunk size in bytes. Default to 1MB. Values of 
the length less than the chunk size is guaranteed to have known value length in 
read time (See also TFile.Reader.Scanner.Entry.isValueLengthKnown())".

When `tfile.io.chunk.size` is configured to a value less than the length of the 
`value` part written by these 4 tests, the tests fail, even though the 
configured value for `tfile.io.chunk.size` is semantically valid.
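For illustration, a minimal, hypothetical trigger; the value 4 is arbitrary, 
and any setting below the 6-byte value length reproduces the failure:

{code:java}
// Hypothetical trigger: a semantically valid chunk size that is smaller than
// the 6-byte hardcoded test values makes the 4 tests above fail.
Configuration conf = new Configuration();
conf.setInt("tfile.io.chunk.size", 4); // < 6, so the value length is "unknown" at read time
{code}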


*## Consequence*

At least 4 actively-used tests fail on a correctly configured parameter. Any 
test that uses `readRecords()` can fail whenever the hardcoded `value` part it 
checks is longer than the configured `tfile.io.chunk.size`. This breaks the 
Hadoop-Common build unless these tests are skipped.

 

*## Root Cause*

`readRecords()` uses 
`org.apache.hadoop.io.file.tfile.TFile.Reader.Scanner.Entry#getValueLength()` 
(abbreviated as `getValueLength()` below) to get the full length of the `value` 
part of each `key-value` pair. However, `getValueLength()` can only return the 
full length when it is less than `tfile.io.chunk.size`; otherwise it throws an 
exception, causing `readRecords()` to fail and thus the 4 aforementioned tests 
to fail. This is because `getValueLength()` does not know the full length of 
the `value` part when the `value` part is larger than `tfile.io.chunk.size`.
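A condensed sketch of the failing pattern; the buffer size and expected-value 
variable are illustrative, and the real helper's assertions differ in detail:

{code:java}
// Current pattern in readRecords(): getValueLength() throws when the value
// length is not known at read time, i.e. when the value is at least
// tfile.io.chunk.size bytes long.
TFile.Reader.Scanner.Entry entry = scanner.entry();
byte[] vbuf = new byte[1024];                    // illustrative buffer size
int vlen = entry.getValueLength();               // throws for "unknown length" values
entry.getValue(vbuf);
assertEquals(expectedValue, new String(vbuf, 0, vlen));
{code}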

*## Fixes*

`readRecords()` should instead call 
`org.apache.hadoop.io.file.tfile.TFile.Reader.Scanner.Entry#getValue(byte[])` 
(abbreviated as `getValue()` below), which returns the correct full length of 
the `value` part regardless of whether the `value` length is larger than 
`tfile.io.chunk.size`.
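A sketch of the proposed change, under the same illustrative assumptions as 
above:

{code:java}
// Proposed pattern: take the length from getValue(byte[]), which returns the
// actual value length whether or not it was known up front, so the assertion
// no longer depends on tfile.io.chunk.size.
byte[] vbuf = new byte[1024];                    // illustrative buffer size
int vlen = scanner.entry().getValue(vbuf);
assertEquals(expectedValue, new String(vbuf, 0, vlen));
{code}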



--
This message was sent by Atlassian Jira
(v8.3.4#803005)

-
To unsubscribe, e-mail: common-dev-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-dev-h...@hadoop.apache.org



Apache Hadoop qbt Report: trunk+JDK8 on Linux/x86

2020-02-01 Thread Apache Jenkins Server
For more details, see 
https://builds.apache.org/job/hadoop-qbt-trunk-java8-linux-x86/1399/

No changes


Apache Hadoop qbt Report: branch2.10+JDK7 on Linux/x86

2020-02-01 Thread Apache Jenkins Server
For more details, see 
https://builds.apache.org/job/hadoop-qbt-branch-2.10-java7-linux-x86/584/

[Jan 31, 2020 7:37:07 PM] (inigoiri) HDFS-13179.
[Feb 1, 2020 12:16:31 AM] (weichiu) HDFS-15046. Backport HDFS-7060 to 
branch-2.10. Contributed by Lisheng




-1 overall


The following subsystems voted -1:
asflicense findbugs hadolint pathlen unit xml


The following subsystems voted -1 but
were configured to be filtered/ignored:
cc checkstyle javac javadoc pylint shellcheck shelldocs whitespace


The following subsystems are considered long running:
(runtime bigger than 1h  0m  0s)
unit


Specific tests:

XML :

   Parsing Error(s): 
   
hadoop-common-project/hadoop-common/src/test/java/org/apache/hadoop/conf/empty-configuration.xml
 
   hadoop-tools/hadoop-azure/src/config/checkstyle-suppressions.xml 
   hadoop-yarn-project/hadoop-yarn/hadoop-yarn-ui/public/crossdomain.xml 
   
hadoop-yarn-project/hadoop-yarn/hadoop-yarn-ui/src/main/webapp/public/crossdomain.xml
 

FindBugs :

   
module:hadoop-yarn-project/hadoop-yarn/hadoop-yarn-server/hadoop-yarn-server-timelineservice-hbase/hadoop-yarn-server-timelineservice-hbase-client
 
   Boxed value is unboxed and then immediately reboxed in 
org.apache.hadoop.yarn.server.timelineservice.storage.common.ColumnRWHelper.readResultsWithTimestamps(Result, byte[], byte[], KeyConverter, ValueConverter, boolean) 
At ColumnRWHelper.java:[line 335] 

Failed junit tests :

   hadoop.ipc.TestRPC 
   hadoop.hdfs.qjournal.server.TestJournalNodeRespectsBindHostKeys 
   hadoop.hdfs.server.namenode.ha.TestDelegationTokensWithHA 
   hadoop.hdfs.TestRollingUpgrade 
   hadoop.contrib.bkjournal.TestBookKeeperHACheckpoints 
   hadoop.contrib.bkjournal.TestBookKeeperHACheckpoints 
   hadoop.registry.secure.TestSecureLogins 
   hadoop.yarn.server.timelineservice.security.TestTimelineAuthFilterForV2 
  

   cc:

   
https://builds.apache.org/job/hadoop-qbt-branch-2.10-java7-linux-x86/584/artifact/out/diff-compile-cc-root-jdk1.7.0_95.txt
  [4.0K]

   javac:

   
https://builds.apache.org/job/hadoop-qbt-branch-2.10-java7-linux-x86/584/artifact/out/diff-compile-javac-root-jdk1.7.0_95.txt
  [328K]

   cc:

   
https://builds.apache.org/job/hadoop-qbt-branch-2.10-java7-linux-x86/584/artifact/out/diff-compile-cc-root-jdk1.8.0_232.txt
  [4.0K]

   javac:

   
https://builds.apache.org/job/hadoop-qbt-branch-2.10-java7-linux-x86/584/artifact/out/diff-compile-javac-root-jdk1.8.0_232.txt
  [308K]

   checkstyle:

   
https://builds.apache.org/job/hadoop-qbt-branch-2.10-java7-linux-x86/584/artifact/out/diff-checkstyle-root.txt
  [16M]

   hadolint:

   
https://builds.apache.org/job/hadoop-qbt-branch-2.10-java7-linux-x86/584/artifact/out/diff-patch-hadolint.txt
  [4.0K]

   pathlen:

   
https://builds.apache.org/job/hadoop-qbt-branch-2.10-java7-linux-x86/584/artifact/out/pathlen.txt
  [12K]

   pylint:

   
https://builds.apache.org/job/hadoop-qbt-branch-2.10-java7-linux-x86/584/artifact/out/diff-patch-pylint.txt
  [24K]

   shellcheck:

   
https://builds.apache.org/job/hadoop-qbt-branch-2.10-java7-linux-x86/584/artifact/out/diff-patch-shellcheck.txt
  [56K]

   shelldocs:

   
https://builds.apache.org/job/hadoop-qbt-branch-2.10-java7-linux-x86/584/artifact/out/diff-patch-shelldocs.txt
  [8.0K]

   whitespace:

   
https://builds.apache.org/job/hadoop-qbt-branch-2.10-java7-linux-x86/584/artifact/out/whitespace-eol.txt
  [12M]
   
https://builds.apache.org/job/hadoop-qbt-branch-2.10-java7-linux-x86/584/artifact/out/whitespace-tabs.txt
  [1.3M]

   xml:

   
https://builds.apache.org/job/hadoop-qbt-branch-2.10-java7-linux-x86/584/artifact/out/xml.txt
  [12K]

   findbugs:

   
https://builds.apache.org/job/hadoop-qbt-branch-2.10-java7-linux-x86/584/artifact/out/branch-findbugs-hadoop-yarn-project_hadoop-yarn_hadoop-yarn-server_hadoop-yarn-server-timelineservice-hbase_hadoop-yarn-server-timelineservice-hbase-client-warnings.html
  [8.0K]
   
https://builds.apache.org/job/hadoop-qbt-branch-2.10-java7-linux-x86/584/artifact/out/branch-findbugs-hadoop-tools_hadoop-datajoin.txt
  [8.0K]
   
https://builds.apache.org/job/hadoop-qbt-branch-2.10-java7-linux-x86/584/artifact/out/branch-findbugs-hadoop-tools_hadoop-ant.txt
  [0]
   
https://builds.apache.org/job/hadoop-qbt-branch-2.10-java7-linux-x86/584/artifact/out/branch-findbugs-hadoop-tools_hadoop-extras.txt
  [0]
   
https://builds.apache.org/job/hadoop-qbt-branch-2.10-java7-linux-x86/584/artifact/out/branch-findbugs-hadoop-tools_hadoop-openstack.txt
  [0]