[
https://issues.apache.org/jira/browse/HADOOP-12106?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14593448#comment-14593448
]
Tony Reix commented on HADOOP-12106:
------------------------------------
What the test code does and how the issue appears:
The code creates an OutputStream (a file) and writes bytes to it.
The test then reads the file back and must read the expected number of bytes.
File is:
hadoop-common-project/hadoop-common/target/test/data/work-dir/localfs/test-file
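For reference, the readAll/readCheck traces below correspond to the usual read-until-EOF pattern. A minimal illustrative sketch (not the exact test code; the method names simply mirror the trace labels):

import java.io.IOException;
import java.io.InputStream;

// Illustrative only: read until 'len' bytes have been read or EOF is hit,
// then compare the total against the expected length, as the traces do.
public class ReadAllSketch {
  static int readAll(InputStream in, byte[] b, int off, int len) throws IOException {
    int total = 0;
    while (total < len) {
      int n = in.read(b, off + total, len - total);   // "@ readAll: 1 n= ..."
      if (n == -1) {                                   // EOF before 'len' bytes arrived
        break;
      }
      total += n;                                      // "... total= ..."
    }
    return total;
  }

  static void readCheck(InputStream in, byte[] b, int dataLen) throws IOException {
    int n = readAll(in, b, 0, dataLen);
    if (n != dataLen) {                                // "@ readCheck: n= ... dataLen= ..."
      throw new IOException("expected " + dataLen + " bytes, got only " + n);
    }
  }
}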
The issue appears as:
@ readAll: 1 n= 8192 total= 3145728
@ readAll: N off+total= 3153920 len-total= 23586
@ readAll: 1 n= 8192 total= 3153920
@ readAll: N off+total= 3162112 len-total= 15394
@ readAll: 1 n= 8192 total= 3162112
@ readAll: N off+total= 3170304 len-total= 7202
@ readAll: 1 n= 7200 total= 3170304
@ readAll: N off+total= 3177504 len-total= 2
@ readAll: 1 n= -1 total= 3177504
@ readAll: 2 n= -1 total= 3177504
@ readAll: 0 b[len-100] 57
@ readAll: 0 b[len-20] -64
@ readAll: 0 b[len-10] 63
@ readAll: 0 b[len-3] 35
@ readAll: 0 b[len-2] 0
@ readAll: 0 b[len-1] 0
@ readCheck: n= 3177504 dataLen= 3177506
In this case, TWO bytes are missing from the file. Sometimes it is 10, or 11, or
some other number.
Traces of CryptoOutputStream show:
@ setUp: 2 dataLen = 3177506
@ testCryptoIV: iv1.length= 16 iv.length= 16
@ testCryptoIV: iv1= [B@3ec722c2 iv= [B@c12308c4 Long.MAX_VALUE= 9223372036854775807
@ writeData: 0 out= org.apache.hadoop.crypto.CryptoOutputStream@fc8d9acd data.len= 5341184 dataLen= 3177506
@ writeData: 0 data[len-100]= 57
@ writeData: 0 data[len-20]= -64
@ writeData: 0 data[len-10]= 63
@ writeData: 0 data[len-5]= -69
@ writeData: 0 data[len-4]= -118
@ writeData: 0 data[len-3]= 35
@ writeData: 0 data[len-2]= 123
@ writeData: 0 data[len-1]= -46
@ CryptoOutputStream.write: 0 off= 0 len= 3177506
@ CryptoOutputStream.write: 1 len= 3177506 remaining= 8192
@ CryptoOutputStream.write: N off= 8192 len= 3169314
@ CryptoOutputStream.encrypt: 0 padding= 0 inBuffer.position()= 8192
@ CryptoOutputStream.encrypt: 1 len=outBuffer.remaining()= 8192
@ CryptoOutputStream.encrypt: 2 len= 8192 streamOffset= 8192
...
@ CryptoOutputStream.write: 1 len= 23586 remaining= 8192
@ CryptoOutputStream.write: N off= 3162112 len= 15394
@ CryptoOutputStream.encrypt: 0 padding= 0 inBuffer.position()= 8192
@ CryptoOutputStream.encrypt: 1 len=outBuffer.remaining()= 8192
@ CryptoOutputStream.encrypt: 2 len= 8192 streamOffset= 3162112
@ CryptoOutputStream.write: 1 len= 15394 remaining= 8192
@ CryptoOutputStream.write: N off= 3170304 len= 7202
@ CryptoOutputStream.encrypt: 0 padding= 0 inBuffer.position()= 8192
@ CryptoOutputStream.encrypt: 1 len=outBuffer.remaining()= 8192
@ CryptoOutputStream.encrypt: 2 len= 8192 streamOffset= 3170304
@ CryptoOutputStream.write: 1 len= 7202 remaining= 8192
@ CryptoOutputStream.write: END len= 0 remaining= 8192
@ CryptoOutputStream.encrypt: 3 before updateEncryptor() padding= 0
@ CryptoOutputStream.encrypt: 3 after updateEncryptor() padding= 0
@ getInputStream: 0 bufferSize= 8192 codec= org.apache.hadoop.crypto.JceAesCtrCryptoCodec@ed2e6f3a bufferSize= 8192
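The write/encrypt traces above follow the usual pattern of filling an 8192-byte input buffer, encrypting and writing it out whenever it is full, and encrypting the remaining partial buffer only on flush/close. A simplified sketch of that pattern (this is NOT the actual org.apache.hadoop.crypto.CryptoOutputStream code; class and method names are illustrative):

import java.io.IOException;
import java.io.OutputStream;
import java.nio.ByteBuffer;

// Simplified sketch of the buffering pattern visible in the traces above.
// Each full 8192-byte inBuffer is encrypted and written out; the last,
// partially filled buffer is only encrypted and written on flush().
abstract class BufferedEncryptingStreamSketch extends OutputStream {
  private final ByteBuffer inBuffer = ByteBuffer.allocate(8192);
  private final OutputStream out;
  private long streamOffset = 0;            // "streamOffset" in the traces

  BufferedEncryptingStreamSketch(OutputStream out) { this.out = out; }

  @Override
  public void write(byte[] b, int off, int len) throws IOException {
    while (len > 0) {                       // "@ CryptoOutputStream.write: N off= ... len= ..."
      int remaining = inBuffer.remaining(); // "remaining= 8192"
      if (len < remaining) {
        inBuffer.put(b, off, len);          // partial chunk stays buffered
        len = 0;
      } else {
        inBuffer.put(b, off, remaining);    // buffer full: encrypt and write it
        off += remaining;
        len -= remaining;
        encryptAndWrite();
      }
    }
  }

  @Override
  public void write(int b) throws IOException {
    write(new byte[] { (byte) b }, 0, 1);
  }

  @Override
  public void flush() throws IOException {  // the final partial buffer goes out here
    if (inBuffer.position() > 0) {
      encryptAndWrite();
    }
    out.flush();
  }

  private void encryptAndWrite() throws IOException {
    inBuffer.flip();
    byte[] plain = new byte[inBuffer.remaining()];
    inBuffer.get(plain);
    byte[] encrypted = encrypt(plain);      // cipher update; same length in CTR mode
    out.write(encrypted);
    streamOffset += encrypted.length;       // "streamOffset= ..." in the traces
    inBuffer.clear();
  }

  // Cipher step left abstract; the real stream uses an AES-CTR codec.
  protected abstract byte[] encrypt(byte[] plain) throws IOException;
}

If the cipher update for the final partial buffer produces fewer bytes than it was given, the trailing bytes never reach the file, which matches the symptom described here.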
Tracing the size of the file from the Java code shows:
-rw-r--r-- 1 reixt system 3166208 Jun 19 13:05 /home/reixt/HADOOP-2.6.0/hadoop-IBMSOE-branch-2.6.0-power/hadoop-common-project/hadoop-common/target/test/data/work-dir/localfs/test-file
-rw-r--r-- 1 reixt system 3166208 Jun 19 13:05 /home/reixt/HADOOP-2.6.0/hadoop-IBMSOE-branch-2.6.0-power/hadoop-common-project/hadoop-common/target/test/data/work-dir/localfs/test-file
-rw-r--r-- 1 reixt system 3166208 Jun 19 13:05 /home/reixt/HADOOP-2.6.0/hadoop-IBMSOE-branch-2.6.0-power/hadoop-common-project/hadoop-common/target/test/data/work-dir/localfs/test-file
-rw-r--r-- 1 reixt system 3166208 Jun 19 13:05 /home/reixt/HADOOP-2.6.0/hadoop-IBMSOE-branch-2.6.0-power/hadoop-common-project/hadoop-common/target/test/data/work-dir/localfs/test-file
-rw-r--r-- 1 reixt system 3174400 Jun 19 13:05 /home/reixt/HADOOP-2.6.0/hadoop-IBMSOE-branch-2.6.0-power/hadoop-common-project/hadoop-common/target/test/data/work-dir/localfs/test-file
-rw-r--r-- 1 reixt system 3177504 Jun 19 13:05 /home/reixt/HADOOP-2.6.0/hadoop-IBMSOE-branch-2.6.0-power/hadoop-common-project/hadoop-common/target/test/data/work-dir/localfs/test-file
The final size is not the expected 3177506, but 3177504: 2 bytes are missing.
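For completeness, a size trace like the one above can be produced from the test JVM itself, for example with a trivial helper such as the following (hypothetical, not part of the test code; the real trace was printed in ls -l format):

import java.io.File;

// Hypothetical helper: print the current on-disk length of the test file
// each time it is called, to watch it grow as buffers are flushed.
class FileSizeTrace {
  static void printSize(String path) {
    File f = new File(path);
    System.out.println(f.length() + "  " + path);   // 3166208 ... then 3177504 at the end
  }
}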
> org.apache.hadoop.crypto.TestCryptoStreamsForLocalFS fails on AIX
> -----------------------------------------------------------------
>
> Key: HADOOP-12106
> URL: https://issues.apache.org/jira/browse/HADOOP-12106
> Project: Hadoop Common
> Issue Type: Bug
> Affects Versions: 2.6.0, 2.7.0
> Environment: Hadoop 2.6.0 and 2.7+
> - AIX/PowerPC/IBMJVM
> - Ubuntu/i386/IBMJVM
> Reporter: Tony Reix
> Attachments: mvn.Test.TestCryptoStreamsForLocalFS.res20.AIX.Errors,
> mvn.Test.TestCryptoStreamsForLocalFS.res20.Ubuntu-i386.IBMJVM.Errors,
> mvn.Test.TestCryptoStreamsForLocalFS.res22.OpenJDK.Errors
>
>
> On AIX (where only the IBM JVM is available), many sub-tests of
> org.apache.hadoop.crypto.TestCryptoStreamsForLocalFS fail:
> Tests run: 13, Failures: 5, Errors: 1, Skipped:
> - testCryptoIV
> - testSeek
> - testSkip
> - testAvailable
> - testPositionedRead
> When testing the exact SAME code on Ubuntu/i386:
> - with OpenJDK, all tests are OK
> - with IBM JVM, tests randomly fail.
> The issue may be in the IBM JVM, or in some Hadoop code that does not
> perfectly handle behavioural differences of the IBM JVM.