[jira] [Commented] (HADOOP-11790) leveldb usage should be disabled by default or smarter about platforms

2017-07-17 Thread Ayappan (JIRA)

[ 
https://issues.apache.org/jira/browse/HADOOP-11790?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16089410#comment-16089410
 ] 

Ayappan commented on HADOOP-11790:
--

Switching to RocksDB is really a good thing to do. It is actively maintained, 
unlike leveldb. The latest releases, 5.4.7 & 5.5.1, have PowerPC support (the fat 
jar in the Maven Central repo includes the Power native library). 
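
As a quick illustration of what the fat jar buys: a hypothetical smoke test (class name and DB path are mine, assuming the org.rocksdb artifact from Maven Central is on the classpath) that would run unchanged on x86 and Power, because loadLibrary() picks the bundled native binary for the current platform.

import org.rocksdb.Options;
import org.rocksdb.RocksDB;
import org.rocksdb.RocksDBException;

public class RocksDbSmokeTest {
  public static void main(String[] args) throws RocksDBException {
    // Loads the platform-specific JNI library bundled in the fat jar,
    // so no locally built artifact is needed on ppc64le.
    RocksDB.loadLibrary();
    try (Options opts = new Options().setCreateIfMissing(true);
         RocksDB db = RocksDB.open(opts, "/tmp/rocksdb-smoke")) {
      db.put("key".getBytes(), "value".getBytes());
      System.out.println(new String(db.get("key".getBytes())));
    }
  }
}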

> leveldb usage should be disabled by default or smarter about platforms
> --
>
> Key: HADOOP-11790
> URL: https://issues.apache.org/jira/browse/HADOOP-11790
> Project: Hadoop Common
>  Issue Type: Bug
>Affects Versions: 2.6.0, 3.0.0-alpha3
> Environment: * any non-x86
> * any OS that isn't Linux, OSX, Windows
>Reporter: Ayappan
>Priority: Blocker
>
> The leveldbjni artifact in the Maven repository has been built only for the x86 
> architecture, due to which some of the testcases fail on PowerPC. The 
> leveldbjni community has no plans to support other platforms [ 
> https://github.com/fusesource/leveldbjni/issues/54 ]. Right now, the 
> approach is that we need to build leveldbjni locally prior to running the 
> Hadoop testcases. Pushing a PowerPC-specific leveldbjni artifact to the central 
> Maven repository and making pom.xml pick it up when running on PowerPC is 
> another option, but I don't know whether this is a suitable one. Is there any 
> other alternative/solution?
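
One way to make the tests "smarter about platforms" in the meantime, sketched here as an assumption rather than an agreed fix: skip leveldbjni-backed tests on architectures the artifact ships no natives for, using JUnit's Assume (the class and method names below are hypothetical).

import org.junit.Assume;
import org.junit.Before;

public abstract class LeveldbBackedTestBase {
  @Before
  public void skipUnlessLeveldbjniSupportsThisArch() {
    // leveldbjni publishes native binaries only for x86/x86_64,
    // so skip (rather than fail) on ppc64le and other architectures.
    String arch = System.getProperty("os.arch").toLowerCase();
    Assume.assumeTrue("no leveldbjni native library for " + arch,
        arch.equals("x86") || arch.equals("amd64") || arch.equals("x86_64"));
  }
}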






[jira] [Created] (HADOOP-14616) Erasurecode XOR testcase failures

2017-06-30 Thread Ayappan (JIRA)
Ayappan created HADOOP-14616:


 Summary: Erasurecode XOR testcase failures 
 Key: HADOOP-14616
 URL: https://issues.apache.org/jira/browse/HADOOP-14616
 Project: Hadoop Common
  Issue Type: Bug
  Components: common
Affects Versions: 3.0.0-alpha3
 Environment: x86_64 Ubuntu 16.04.02 LTS
Reporter: Ayappan


TestXORCoder, TestXORRawCoderInteroperable2, and TestNativeXORRawCoder are the 
failing testcases, all with the same error. This started after the commit of 
HADOOP-14479.


java.lang.InternalError: Invalid inputs
at 
org.apache.hadoop.io.erasurecode.rawcoder.NativeXORRawDecoder.decodeImpl(Native 
Method)
at 
org.apache.hadoop.io.erasurecode.rawcoder.NativeXORRawDecoder.performDecodeImpl(NativeXORRawDecoder.java:44)
at 
org.apache.hadoop.io.erasurecode.rawcoder.AbstractNativeRawDecoder.doDecode(AbstractNativeRawDecoder.java:58)
at 
org.apache.hadoop.io.erasurecode.rawcoder.AbstractNativeRawDecoder.doDecode(AbstractNativeRawDecoder.java:74)
at 
org.apache.hadoop.io.erasurecode.rawcoder.RawErasureDecoder.decode(RawErasureDecoder.java:105)
at 
org.apache.hadoop.io.erasurecode.rawcoder.RawErasureDecoder.decode(RawErasureDecoder.java:163)
at 
org.apache.hadoop.io.erasurecode.coder.ErasureDecodingStep.performCoding(ErasureDecodingStep.java:54)
at 
org.apache.hadoop.io.erasurecode.coder.TestErasureCoderBase.performCodingStep(TestErasureCoderBase.java:126)
at 
org.apache.hadoop.io.erasurecode.coder.TestErasureCoderBase.performTestCoding(TestErasureCoderBase.java:95)
at 
org.apache.hadoop.io.erasurecode.coder.TestErasureCoderBase.testCoding(TestErasureCoderBase.java:69)
at 
org.apache.hadoop.io.erasurecode.coder.TestXORCoder.testCodingNoDirectBuffer_erasing_p0(TestXORCoder.java:51)
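
For context on what these coders compute: XOR coding keeps a single parity unit that is the bitwise XOR of all data units, so any one erased unit can be rebuilt by XOR-ing the parity with the surviving units. A self-contained sketch of that invariant (not Hadoop's implementation):

public class XorParityDemo {
  public static void main(String[] args) {
    byte[][] data = { {1, 2, 3}, {4, 5, 6}, {7, 8, 9} };

    // Encode: parity[i] = data[0][i] ^ data[1][i] ^ data[2][i]
    byte[] parity = new byte[3];
    for (byte[] unit : data)
      for (int i = 0; i < parity.length; i++)
        parity[i] ^= unit[i];

    // Erase unit 1, then rebuild it as parity XOR the surviving units.
    byte[] recovered = parity.clone();
    for (int u = 0; u < data.length; u++)
      if (u != 1)
        for (int i = 0; i < recovered.length; i++)
          recovered[i] ^= data[u][i];

    System.out.println(java.util.Arrays.toString(recovered)); // [4, 5, 6]
  }
}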







[jira] [Commented] (HADOOP-11505) Various native parts use bswap incorrectly and unportably

2017-06-15 Thread Ayappan (JIRA)

[ 
https://issues.apache.org/jira/browse/HADOOP-11505?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16050078#comment-16050078
 ] 

Ayappan commented on HADOOP-11505:
--

Any update on this?

> Various native parts use bswap incorrectly and unportably
> -
>
> Key: HADOOP-11505
> URL: https://issues.apache.org/jira/browse/HADOOP-11505
> Project: Hadoop Common
>  Issue Type: Bug
>Affects Versions: 3.0.0-alpha1
>Reporter: Colin P. McCabe
>Assignee: Alan Burlison
> Attachments: HADOOP-11505.001.patch, HADOOP-11505.003.patch, 
> HADOOP-11505.004.patch, HADOOP-11505.005.patch, HADOOP-11505.006.patch, 
> HADOOP-11505.007.patch, HADOOP-11505.008.patch
>
>
> hadoop-mapreduce-client-nativetask fails to use x86 optimizations in some 
> cases.  Also, on some alternate, non-x86, non-ARM architectures the generated 
> code is incorrect.  Thanks to Steve Loughran and Edward Nevill for finding 
> this.






[jira] [Created] (HADOOP-14479) Erasurecode testcase failures with ISA-L

2017-06-02 Thread Ayappan (JIRA)
Ayappan created HADOOP-14479:


 Summary: Erasurecode testcase failures with ISA-L 
 Key: HADOOP-14479
 URL: https://issues.apache.org/jira/browse/HADOOP-14479
 Project: Hadoop Common
  Issue Type: Bug
  Components: common
Affects Versions: 3.0.0-alpha3
 Environment: x86_64 Ubuntu 16.04.02 LTS
Reporter: Ayappan


I built Hadoop with ISA-L support. I took the ISA-L code from 
https://github.com/01org/isa-l (tag v2.18.0) and built it. While running the 
UTs, the following three testcases fail:

1) TestHHXORErasureCoder

Tests run: 7, Failures: 3, Errors: 0, Skipped: 0, Time elapsed: 1.106 sec <<< 
FAILURE! - in org.apache.hadoop.io.erasurecode.coder.TestHHXORErasureCoder
testCodingDirectBuffer_10x4_erasing_p1(org.apache.hadoop.io.erasurecode.coder.TestHHXORErasureCoder)
  Time elapsed: 0.029 sec  <<< FAILURE!
java.lang.AssertionError: Decoding and comparing failed.
at org.junit.Assert.fail(Assert.java:88)
at org.junit.Assert.assertTrue(Assert.java:41)
at 
org.apache.hadoop.io.erasurecode.TestCoderBase.compareAndVerify(TestCoderBase.java:170)
at 
org.apache.hadoop.io.erasurecode.coder.TestErasureCoderBase.compareAndVerify(TestErasureCoderBase.java:141)
at 
org.apache.hadoop.io.erasurecode.coder.TestErasureCoderBase.performTestCoding(TestErasureCoderBase.java:98)
at 
org.apache.hadoop.io.erasurecode.coder.TestErasureCoderBase.testCoding(TestErasureCoderBase.java:69)
at 
org.apache.hadoop.io.erasurecode.coder.TestHHXORErasureCoder.testCodingDirectBuffer_10x4_erasing_p1(TestHHXORErasureCoder.java:64)


2) TestRSErasureCoder

Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.591 sec - in 
org.apache.hadoop.io.erasurecode.coder.TestXORCoder
Running org.apache.hadoop.io.erasurecode.coder.TestRSErasureCoder
#
# A fatal error has been detected by the Java Runtime Environment:
#
#  SIGSEGV (0xb) at pc=0x7f486a28a6e4, pid=8970, tid=0x7f4850927700
#
# JRE version: OpenJDK Runtime Environment (8.0_121-b13) (build 
1.8.0_121-8u121-b13-0ubuntu1.16.04.2-b13)
# Java VM: OpenJDK 64-Bit Server VM (25.121-b13 mixed mode linux-amd64 
compressed oops)
# Problematic frame:
# C  [libc.so.6+0x8e6e4]
#
# Failed to write core dump. Core dumps have been disabled. To enable core 
dumping, try "ulimit -c unlimited" before starting Java again
#
# An error report file with more information is saved as:
# /home/ayappan/hadoop/hadoop-common-project/hadoop-common/hs_err_pid8970.log
#
# If you would like to submit a bug report, please visit:
#   http://bugreport.java.com/bugreport/crash.jsp
# The crash happened outside the Java Virtual Machine in native code.
# See problematic frame for where to report the bug.
#

3) TestCodecRawCoderMapping

Running org.apache.hadoop.io.erasurecode.TestCodecRawCoderMapping
Tests run: 5, Failures: 1, Errors: 0, Skipped: 0, Time elapsed: 0.559 sec <<< 
FAILURE! - in org.apache.hadoop.io.erasurecode.TestCodecRawCoderMapping
testRSDefaultRawCoder(org.apache.hadoop.io.erasurecode.TestCodecRawCoderMapping)
  Time elapsed: 0.015 sec  <<< FAILURE!
java.lang.AssertionError: null
at org.junit.Assert.fail(Assert.java:86)
at org.junit.Assert.assertTrue(Assert.java:41)
at org.junit.Assert.assertTrue(Assert.java:52)
at 
org.apache.hadoop.io.erasurecode.TestCodecRawCoderMapping.testRSDefaultRawCoder(TestCodecRawCoderMapping.java:58)
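
testRSDefaultRawCoder expects the native (ISA-L backed) coder to be selected, so a first diagnostic step is checking whether the native erasure-code library actually loaded. A sketch, assuming Hadoop 3's ErasureCodeNative helper in hadoop-common (treat the exact method signatures as an assumption):

import org.apache.hadoop.io.erasurecode.ErasureCodeNative;

public class IsalCheck {
  public static void main(String[] args) {
    if (ErasureCodeNative.isNativeCodeLoaded()) {
      // ISA-L was found and loaded; native coders should be selected.
      System.out.println("native EC loaded: " + ErasureCodeNative.getLibraryName());
    } else {
      // Explains why the pure-Java coders are used instead.
      System.out.println("native EC unavailable: "
          + ErasureCodeNative.getLoadingFailureReason());
    }
  }
}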







[jira] [Commented] (HADOOP-11790) leveldb usage should be disabled by default or smarter about platforms

2017-04-18 Thread Ayappan (JIRA)

[ 
https://issues.apache.org/jira/browse/HADOOP-11790?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15972244#comment-15972244
 ] 

Ayappan commented on HADOOP-11790:
--

I have been trying to get the leveldbjni community to make a new release, but no luck so far.

https://github.com/fusesource/leveldbjni/issues/85

> leveldb usage should be disabled by default or smarter about platforms
> --
>
> Key: HADOOP-11790
> URL: https://issues.apache.org/jira/browse/HADOOP-11790
> Project: Hadoop Common
>  Issue Type: Bug
>Affects Versions: 2.6.0
> Environment: * any non-x86
> * any OS that isn't Linux, OSX, Windows
>Reporter: Ayappan
>Priority: Critical
>
> The leveldbjni artifact in the Maven repository has been built only for the x86 
> architecture, due to which some of the testcases fail on PowerPC. The 
> leveldbjni community has no plans to support other platforms [ 
> https://github.com/fusesource/leveldbjni/issues/54 ]. Right now, the 
> approach is that we need to build leveldbjni locally prior to running the 
> Hadoop testcases. Pushing a PowerPC-specific leveldbjni artifact to the central 
> Maven repository and making pom.xml pick it up when running on PowerPC is 
> another option, but I don't know whether this is a suitable one. Is there any 
> other alternative/solution?






[jira] [Commented] (HADOOP-12633) Extend Erasure Code to support POWER Chip acceleration

2017-04-10 Thread Ayappan (JIRA)

[ 
https://issues.apache.org/jira/browse/HADOOP-12633?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15962628#comment-15962628
 ] 

Ayappan commented on HADOOP-12633:
--

What's the status of this?

> Extend Erasure Code to support POWER Chip acceleration
> --
>
> Key: HADOOP-12633
> URL: https://issues.apache.org/jira/browse/HADOOP-12633
> Project: Hadoop Common
>  Issue Type: New Feature
>Reporter: wqijun
>Assignee: wqijun
> Attachments: hadoopec-ACC.patch
>
>
> Erasure Code is a very important feature in the new HDFS version. This JIRA will 
> focus on how to extend EC to support multiple types of EC acceleration via C 
> libraries and other hardware methods, like GPU or FPGA. Compared with 
> HADOOP-11887, this JIRA will focus more on how to leverage the POWER chip's 
> capabilities to accelerate the EC calculation. 






[jira] [Commented] (HADOOP-11505) hadoop-mapreduce-client-nativetask fails to use x86 optimizations in some cases

2015-04-27 Thread Ayappan (JIRA)

[ 
https://issues.apache.org/jira/browse/HADOOP-11505?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14513857#comment-14513857
 ] 

Ayappan commented on HADOOP-11505:
--

There is a JVM crash due to libnativetask.so on PowerPC64LE that may be related 
to this JIRA:

# A fatal error has been detected by the Java Runtime Environment:
#
#  SIGSEGV (0xb) at pc=0x3fff721d8e50, pid=10952, tid=70366359056848
#
# JRE version: OpenJDK Runtime Environment (7.0_79-b14) (build 
1.7.0_79-mockbuild_2015_04_10_10_48-b00)
# Java VM: OpenJDK 64-Bit Server VM (24.79-b02 mixed mode linux-ppc64 
compressed oops)
# Derivative: IcedTea 2.5.5
# Distribution: Built on Red Hat Enterprise Linux Server release 7.1 (Maipo) 
(Fri Apr 10 10:48:01 EDT 2015)
# Problematic frame:
# C  [libnativetask.so.1.0.0+0x58e50]  
NativeTask::WritableUtils::ReadVLongInner(char const*, unsigned int)+0x40
#
# Failed to write core dump. Core dumps have been disabled. To enable core 
# dumping, try "ulimit -c unlimited" before starting Java again
#
# An error report file with more information is saved as:
# /tmp/jvm-10952/hs_error.log
#
# If you would like to submit a bug report, please include
# instructions on how to reproduce the bug and visit:
#   http://icedtea.classpath.org/bugzilla
# The crash happened outside the Java Virtual Machine in native code.
# See problematic frame for where to report the bug.


 hadoop-mapreduce-client-nativetask fails to use x86 optimizations in some 
 cases
 ---

 Key: HADOOP-11505
 URL: https://issues.apache.org/jira/browse/HADOOP-11505
 Project: Hadoop Common
  Issue Type: Bug
Affects Versions: 3.0.0
Reporter: Colin Patrick McCabe
Assignee: Colin Patrick McCabe
 Attachments: HADOOP-11505.001.patch


 hadoop-mapreduce-client-nativetask fails to use x86 optimizations in some 
 cases.  Also, on some alternate, non-x86, non-ARM architectures the generated 
 code is incorrect.  Thanks to Steve Loughran and Edward Nevill for finding 
 this.





[jira] [Commented] (HADOOP-11665) Provide and unify cross platform byteorder support in native code

2015-04-06 Thread Ayappan (JIRA)

[ 
https://issues.apache.org/jira/browse/HADOOP-11665?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14481093#comment-14481093
 ] 

Ayappan commented on HADOOP-11665:
--

Any update here? This issue has been lingering around for a long time.

 Provide and unify cross platform byteorder support in native code
 -

 Key: HADOOP-11665
 URL: https://issues.apache.org/jira/browse/HADOOP-11665
 Project: Hadoop Common
  Issue Type: Bug
  Components: native, util
Affects Versions: 2.4.1, 2.6.0
 Environment: PowerPC Big Endian & other Big Endian platforms
Reporter: Binglin Chang
Assignee: Binglin Chang
 Attachments: HADOOP-11665.001.patch








[jira] [Created] (HADOOP-11790) Testcase failures in PowerPC due to leveldbjni artifact

2015-04-02 Thread Ayappan (JIRA)
Ayappan created HADOOP-11790:


 Summary: Testcase failures in PowerPC due to leveldbjni artifact
 Key: HADOOP-11790
 URL: https://issues.apache.org/jira/browse/HADOOP-11790
 Project: Hadoop Common
  Issue Type: Bug
  Components: test
Affects Versions: 2.6.0
 Environment: PowerPC64LE
Reporter: Ayappan


The leveldbjni artifact in the Maven repository has been built only for the x86 
architecture, due to which some of the testcases fail on PowerPC. The 
leveldbjni community has no plans to support other platforms [ 
https://github.com/fusesource/leveldbjni/issues/54 ]. Right now, the approach 
is that we need to build leveldbjni locally prior to running the Hadoop testcases. 
Pushing a PowerPC-specific leveldbjni artifact to the central Maven repository and 
making pom.xml pick it up when running on PowerPC is another option, but I 
don't know whether this is a suitable one. Is there any other 
alternative/solution?





[jira] [Created] (HADOOP-11755) Update avro version to have PowerPC supported Snappy-java

2015-03-26 Thread Ayappan (JIRA)
Ayappan created HADOOP-11755:


 Summary: Update avro version to have PowerPC supported Snappy-java 
 Key: HADOOP-11755
 URL: https://issues.apache.org/jira/browse/HADOOP-11755
 Project: Hadoop Common
  Issue Type: Task
  Components: build
Affects Versions: 2.6.0
 Environment: PowerPC64, PowerPC64LE
Reporter: Ayappan


Hadoop downloads Snappy-java version 1.0.4.1 (which doesn't have PowerPC native 
libraries) through the avro dependency. 
The current Avro development version (1.8.0-SNAPSHOT) has updated the snappy-java 
version to 1.1.1.3, which has ppc64 & ppc64le native libraries. So Hadoop needs 
to update the avro version to the upcoming release (probably 1.7.8) to have 
PowerPC-supported snappy-java in its lib.
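
A quick way to confirm whether the snappy-java on the classpath can load its native library on the current platform (a hypothetical check, assuming the org.xerial.snappy artifact; the round-trip forces the JNI library to load, which is what fails with 1.0.4.1 on ppc64/ppc64le):

import org.xerial.snappy.Snappy;

public class SnappyNativeCheck {
  public static void main(String[] args) throws Exception {
    // compress()/uncompress() trigger native library loading on first use.
    byte[] compressed = Snappy.compress("hello snappy".getBytes("UTF-8"));
    byte[] restored = Snappy.uncompress(compressed);
    System.out.println(new String(restored, "UTF-8"));
  }
}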





[jira] [Updated] (HADOOP-11755) Update avro version to have PowerPC supported Snappy-java

2015-03-26 Thread Ayappan (JIRA)

 [ 
https://issues.apache.org/jira/browse/HADOOP-11755?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Ayappan updated HADOOP-11755:
-
Description: 
Hadoop downloads Snappy-java version 1.0.4.1 (which doesn't have PowerPC native 
libraries) through the avro 1.7.4 dependency. 
The current Avro development version (1.8.0-SNAPSHOT) has updated the snappy-java 
version to 1.1.1.3, which has ppc64 & ppc64le native libraries. So Hadoop needs 
to update the avro version to the upcoming release (probably 1.7.8) to have 
PowerPC-supported snappy-java in its lib.

  was:
Hadoop downloads Snappy-java version 1.0.4.1 (which doesn't have PowerPC native 
libraries) through the avro dependency. 
The current Avro development version (1.8.0-SNAPSHOT) has updated the snappy-java 
version to 1.1.1.3, which has ppc64 & ppc64le native libraries. So Hadoop needs 
to update the avro version to the upcoming release (probably 1.7.8) to have 
PowerPC-supported snappy-java in its lib.


 Update avro version to have PowerPC supported Snappy-java 
 --

 Key: HADOOP-11755
 URL: https://issues.apache.org/jira/browse/HADOOP-11755
 Project: Hadoop Common
  Issue Type: Task
  Components: build
Affects Versions: 2.6.0
 Environment: PowerPC64, PowerPC64LE
Reporter: Ayappan

 Hadoop downloads Snappy-java version 1.0.4.1 (which doesn't have PowerPC native 
 libraries) through the avro 1.7.4 dependency. 
 The current Avro development version (1.8.0-SNAPSHOT) has updated the 
 snappy-java version to 1.1.1.3, which has ppc64 & ppc64le native libraries. So 
 Hadoop needs to update the avro version to the upcoming release (probably 
 1.7.8) to have PowerPC-supported snappy-java in its lib.





[jira] [Commented] (HADOOP-11665) Provide and unify cross platform byteorder support in native code

2015-03-20 Thread Ayappan (JIRA)

[ 
https://issues.apache.org/jira/browse/HADOOP-11665?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14371462#comment-14371462
 ] 

Ayappan commented on HADOOP-11665:
--

Any update on this?

 Provide and unify cross platform byteorder support in native code
 -

 Key: HADOOP-11665
 URL: https://issues.apache.org/jira/browse/HADOOP-11665
 Project: Hadoop Common
  Issue Type: Bug
  Components: util
Affects Versions: 2.4.1, 2.6.0
 Environment: PowerPC Big Endian & other Big Endian platforms
Reporter: Binglin Chang
Assignee: Binglin Chang
 Attachments: HADOOP-11665.001.patch








[jira] [Updated] (HADOOP-11665) Provide and unify cross platform byteorder support in native code

2015-03-16 Thread Ayappan (JIRA)

 [ 
https://issues.apache.org/jira/browse/HADOOP-11665?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Ayappan updated HADOOP-11665:
-
Priority: Major  (was: Minor)

 Provide and unify cross platform byteorder support in native code
 -

 Key: HADOOP-11665
 URL: https://issues.apache.org/jira/browse/HADOOP-11665
 Project: Hadoop Common
  Issue Type: Bug
  Components: util
Affects Versions: 2.4.1, 2.6.0
 Environment: PowerPC Big Endian & other Big Endian platforms
Reporter: Binglin Chang
Assignee: Binglin Chang
 Attachments: HADOOP-11665.001.patch








[jira] [Updated] (HADOOP-11665) Provide and unify cross platform byteorder support in native code

2015-03-12 Thread Ayappan (JIRA)

 [ 
https://issues.apache.org/jira/browse/HADOOP-11665?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Ayappan updated HADOOP-11665:
-
Priority: Blocker  (was: Major)

 Provide and unify cross platform byteorder support in native code
 -

 Key: HADOOP-11665
 URL: https://issues.apache.org/jira/browse/HADOOP-11665
 Project: Hadoop Common
  Issue Type: Bug
  Components: util
Affects Versions: 2.4.1, 2.6.0
 Environment: PowerPC Big Endian & other Big Endian platforms
Reporter: Binglin Chang
Assignee: Binglin Chang
Priority: Blocker
 Attachments: HADOOP-11665.001.patch








[jira] [Updated] (HADOOP-10846) DataChecksum#calculateChunkedSums not working for PPC when buffers not backed by array

2015-03-12 Thread Ayappan (JIRA)

 [ 
https://issues.apache.org/jira/browse/HADOOP-10846?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Ayappan updated HADOOP-10846:
-
Priority: Blocker  (was: Major)

 DataChecksum#calculateChunkedSums not working for PPC when buffers not backed 
 by array
 --

 Key: HADOOP-10846
 URL: https://issues.apache.org/jira/browse/HADOOP-10846
 Project: Hadoop Common
  Issue Type: Bug
  Components: util
Affects Versions: 2.4.1, 2.5.2
 Environment: PowerPC platform
Reporter: Jinghui Wang
Assignee: Ayappan
Priority: Blocker
 Attachments: HADOOP-10846-v1.patch, HADOOP-10846-v2.patch, 
 HADOOP-10846-v3.patch, HADOOP-10846-v4.patch, HADOOP-10846.patch


 Got the following exception when running Hadoop on PowerPC. The 
 implementation for computing checksums fails when the data buffer and checksum 
 buffer are not backed by arrays.
 13/09/16 04:06:57 ERROR security.UserGroupInformation: 
 PriviledgedActionException as:biadmin (auth:SIMPLE) 
 cause:org.apache.hadoop.ipc.RemoteException(java.io.IOException): 
 org.apache.hadoop.fs.ChecksumException: Checksum error
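
The report doesn't spell out the failure mode, but the title points at code that assumes a backing array. A sketch of the usual guard (CRC32 stands in for Hadoop's actual checksum; the class and method names are mine): direct ByteBuffers return false from hasArray(), so array()/arrayOffset() must not be used on them.

import java.nio.ByteBuffer;
import java.util.zip.CRC32;

public class ChunkChecksum {
  // Checksums 'len' bytes from buf's current position without assuming
  // the buffer is array-backed.
  static long checksum(ByteBuffer buf, int len) {
    CRC32 crc = new CRC32();
    if (buf.hasArray()) {
      // Heap buffer: feed the backing array directly.
      crc.update(buf.array(), buf.arrayOffset() + buf.position(), len);
    } else {
      // Direct buffer: copy through a temporary array; calling array()
      // here would throw UnsupportedOperationException.
      byte[] tmp = new byte[len];
      buf.duplicate().get(tmp);
      crc.update(tmp);
    }
    return crc.getValue();
  }
}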





[jira] [Updated] (HADOOP-11665) Provide and unify cross platform byteorder support in native code

2015-03-04 Thread Ayappan (JIRA)

 [ 
https://issues.apache.org/jira/browse/HADOOP-11665?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Ayappan updated HADOOP-11665:
-
 Environment: PowerPC Big Endian & other Big Endian platforms
Target Version/s: 2.7.0

 Provide and unify cross platform byteorder support in native code
 -

 Key: HADOOP-11665
 URL: https://issues.apache.org/jira/browse/HADOOP-11665
 Project: Hadoop Common
  Issue Type: Bug
  Components: util
Affects Versions: 2.4.1, 2.6.0
 Environment: PowerPC Big Endian & other Big Endian platforms
Reporter: Binglin Chang
Assignee: Binglin Chang
 Attachments: HADOOP-11665.001.patch








[jira] [Commented] (HADOOP-10846) DataChecksum#calculateChunkedSums not working for PPC when buffers not backed by array

2015-03-04 Thread Ayappan (JIRA)

[ 
https://issues.apache.org/jira/browse/HADOOP-10846?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14348236#comment-14348236
 ] 

Ayappan commented on HADOOP-10846:
--

A new jira ( HADOOP-11665 ) has been opened to fix this issue in a more 
standard way.

 DataChecksum#calculateChunkedSums not working for PPC when buffers not backed 
 by array
 --

 Key: HADOOP-10846
 URL: https://issues.apache.org/jira/browse/HADOOP-10846
 Project: Hadoop Common
  Issue Type: Bug
  Components: util
Affects Versions: 2.4.1, 2.5.2
 Environment: PowerPC platform
Reporter: Jinghui Wang
Assignee: Ayappan
 Attachments: HADOOP-10846-v1.patch, HADOOP-10846-v2.patch, 
 HADOOP-10846-v3.patch, HADOOP-10846-v4.patch, HADOOP-10846.patch


 Got the following exception when running Hadoop on PowerPC. The 
 implementation for computing checksums fails when the data buffer and checksum 
 buffer are not backed by arrays.
 13/09/16 04:06:57 ERROR security.UserGroupInformation: 
 PriviledgedActionException as:biadmin (auth:SIMPLE) 
 cause:org.apache.hadoop.ipc.RemoteException(java.io.IOException): 
 org.apache.hadoop.fs.ChecksumException: Checksum error





[jira] [Updated] (HADOOP-11665) Provide and unify cross platform byteorder support in native code

2015-03-04 Thread Ayappan (JIRA)

 [ 
https://issues.apache.org/jira/browse/HADOOP-11665?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Ayappan updated HADOOP-11665:
-
Affects Version/s: 2.4.1
   2.6.0

 Provide and unify cross platform byteorder support in native code
 -

 Key: HADOOP-11665
 URL: https://issues.apache.org/jira/browse/HADOOP-11665
 Project: Hadoop Common
  Issue Type: Bug
  Components: util
Affects Versions: 2.4.1, 2.6.0
Reporter: Binglin Chang
Assignee: Binglin Chang
 Attachments: HADOOP-11665.001.patch








[jira] [Updated] (HADOOP-11665) Provide and unify cross platform byteorder support in native code

2015-03-04 Thread Ayappan (JIRA)

 [ 
https://issues.apache.org/jira/browse/HADOOP-11665?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Ayappan updated HADOOP-11665:
-
Component/s: util

 Provide and unify cross platform byteorder support in native code
 -

 Key: HADOOP-11665
 URL: https://issues.apache.org/jira/browse/HADOOP-11665
 Project: Hadoop Common
  Issue Type: Bug
  Components: util
Affects Versions: 2.4.1, 2.6.0
Reporter: Binglin Chang
Assignee: Binglin Chang
 Attachments: HADOOP-11665.001.patch








[jira] [Commented] (HADOOP-11665) Provide and unify cross platform byteorder support in native code

2015-03-04 Thread Ayappan (JIRA)

[ 
https://issues.apache.org/jira/browse/HADOOP-11665?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14346898#comment-14346898
 ] 

Ayappan commented on HADOOP-11665:
--

I verified this patch in both ppc64 LE and ppc64 BE environments. It works fine.

 Provide and unify cross platform byteorder support in native code
 -

 Key: HADOOP-11665
 URL: https://issues.apache.org/jira/browse/HADOOP-11665
 Project: Hadoop Common
  Issue Type: Bug
Reporter: Binglin Chang
Assignee: Binglin Chang
 Attachments: HADOOP-11665.001.patch








[jira] [Assigned] (HADOOP-10846) DataChecksum#calculateChunkedSums not working for PPC when buffers not backed by array

2015-03-01 Thread Ayappan (JIRA)

 [ 
https://issues.apache.org/jira/browse/HADOOP-10846?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Ayappan reassigned HADOOP-10846:


Assignee: Ayappan  (was: Jinghui Wang)

 DataChecksum#calculateChunkedSums not working for PPC when buffers not backed 
 by array
 --

 Key: HADOOP-10846
 URL: https://issues.apache.org/jira/browse/HADOOP-10846
 Project: Hadoop Common
  Issue Type: Bug
  Components: util
Affects Versions: 2.4.1, 2.5.2
 Environment: PowerPC platform
Reporter: Jinghui Wang
Assignee: Ayappan
 Attachments: HADOOP-10846-v1.patch, HADOOP-10846-v2.patch, 
 HADOOP-10846-v3.patch, HADOOP-10846-v4.patch, HADOOP-10846.patch


 Got the following exception when running Hadoop on PowerPC. The 
 implementation for computing checksums fails when the data buffer and checksum 
 buffer are not backed by arrays.
 13/09/16 04:06:57 ERROR security.UserGroupInformation: 
 PriviledgedActionException as:biadmin (auth:SIMPLE) 
 cause:org.apache.hadoop.ipc.RemoteException(java.io.IOException): 
 org.apache.hadoop.fs.ChecksumException: Checksum error





[jira] [Commented] (HADOOP-10846) DataChecksum#calculateChunkedSums not working for PPC when buffers not backed by array

2015-02-17 Thread Ayappan (JIRA)

[ 
https://issues.apache.org/jira/browse/HADOOP-10846?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14323956#comment-14323956
 ] 

Ayappan commented on HADOOP-10846:
--

Any update on this?

 DataChecksum#calculateChunkedSums not working for PPC when buffers not backed 
 by array
 --

 Key: HADOOP-10846
 URL: https://issues.apache.org/jira/browse/HADOOP-10846
 Project: Hadoop Common
  Issue Type: Bug
  Components: util
Affects Versions: 2.4.1, 2.5.2
 Environment: PowerPC platform
Reporter: Jinghui Wang
Assignee: Jinghui Wang
 Attachments: HADOOP-10846-v1.patch, HADOOP-10846-v2.patch, 
 HADOOP-10846-v3.patch, HADOOP-10846-v4.patch, HADOOP-10846.patch


 Got the following exception when running Hadoop on PowerPC. The 
 implementation for computing checksums fails when the data buffer and checksum 
 buffer are not backed by arrays.
 13/09/16 04:06:57 ERROR security.UserGroupInformation: 
 PriviledgedActionException as:biadmin (auth:SIMPLE) 
 cause:org.apache.hadoop.ipc.RemoteException(java.io.IOException): 
 org.apache.hadoop.fs.ChecksumException: Checksum error





[jira] [Updated] (HADOOP-10744) LZ4 Compression fails to recognize PowerPC Little Endian Architecture

2015-02-04 Thread Ayappan (JIRA)

 [ 
https://issues.apache.org/jira/browse/HADOOP-10744?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Ayappan updated HADOOP-10744:
-
  Resolution: Fixed
   Fix Version/s: 2.6.0
Target Version/s:   (was: 2.7.0)
  Status: Resolved  (was: Patch Available)

 LZ4 Compression fails to recognize PowerPC Little Endian Architecture
 -

 Key: HADOOP-10744
 URL: https://issues.apache.org/jira/browse/HADOOP-10744
 Project: Hadoop Common
  Issue Type: Bug
  Components: io, native
Affects Versions: 2.4.1, 2.5.2
 Environment: PowerPC Little Endian (ppc64le)
Reporter: Ayappan
Assignee: Bert Sanders
 Fix For: 2.6.0

 Attachments: HADOOP-10744-v1.patch, HADOOP-10744-v2.patch, 
 HADOOP-10744-v3.patch, HADOOP-10744-v4.patch, HADOOP-10744.patch


 Lz4 Compression fails to identify the PowerPC Little Endian architecture. It 
 recognizes it as Big Endian, and several testcases 
 (TestCompressorDecompressor, TestCodec, TestLz4CompressorDecompressor) fail 
 because of this.
 Running org.apache.hadoop.io.compress.TestCompressorDecompressor
 Tests run: 2, Failures: 2, Errors: 0, Skipped: 0, Time elapsed: 0.435 sec <<< 
 FAILURE! - in org.apache.hadoop.io.compress.TestCompressorDecompressor
 testCompressorDecompressor(org.apache.hadoop.io.compress.TestCompressorDecompressor)
   Time elapsed: 0.308 sec  <<< FAILURE!
 org.junit.internal.ArrayComparisonFailure: 
 org.apache.hadoop.io.compress.lz4.Lz4Compressor_org.apache.hadoop.io.compress.lz4.Lz4Decompressor-
   byte arrays not equals error !!!: arrays first differed at element [1428]; 
 expected:<4> but was:<10>
 at 
 org.junit.internal.ComparisonCriteria.arrayEquals(ComparisonCriteria.java:50)
 at org.junit.Assert.internalArrayEquals(Assert.java:473)
 at org.junit.Assert.assertArrayEquals(Assert.java:294)
 at 
 org.apache.hadoop.io.compress.CompressDecompressTester$CompressionTestStrategy$2.assertCompression(CompressDecompressTester.java:325)
 at 
 org.apache.hadoop.io.compress.CompressDecompressTester.test(CompressDecompressTester.java:135)
 at 
 org.apache.hadoop.io.compress.TestCompressorDecompressor.testCompressorDecompressor(TestCompressorDecompressor.java:58)
 ...
 ...
 .
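
For reference (not from the report): on the Java side, the portable way to learn the platform's byte order at runtime is ByteOrder.nativeOrder(), rather than inferring it from the architecture name, which is effectively what misclassified ppc64le here.

import java.nio.ByteOrder;

public class EndiannessProbe {
  public static void main(String[] args) {
    // On ppc64le this prints LITTLE_ENDIAN even though os.arch
    // still matches the "ppc" patterns that suggest big-endian.
    System.out.println("os.arch    = " + System.getProperty("os.arch"));
    System.out.println("byte order = " + ByteOrder.nativeOrder());
  }
}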





[jira] [Commented] (HADOOP-10744) LZ4 Compression fails to recognize PowerPC Little Endian Architecture

2015-02-04 Thread Ayappan (JIRA)

[ 
https://issues.apache.org/jira/browse/HADOOP-10744?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14304873#comment-14304873
 ] 

Ayappan commented on HADOOP-10744:
--

It seems the issue is resolved by HADOOP-11184 (updating lz4 to r123), 
so I am closing this defect.

 LZ4 Compression fails to recognize PowerPC Little Endian Architecture
 -

 Key: HADOOP-10744
 URL: https://issues.apache.org/jira/browse/HADOOP-10744
 Project: Hadoop Common
  Issue Type: Bug
  Components: io, native
Affects Versions: 2.4.1, 2.5.2
 Environment: PowerPC Little Endian (ppc64le)
Reporter: Ayappan
Assignee: Bert Sanders
 Attachments: HADOOP-10744-v1.patch, HADOOP-10744-v2.patch, 
 HADOOP-10744-v3.patch, HADOOP-10744-v4.patch, HADOOP-10744.patch


 Lz4 Compression fails to identify the PowerPC Little Endian architecture. It 
 recognizes it as Big Endian, and several testcases 
 (TestCompressorDecompressor, TestCodec, TestLz4CompressorDecompressor) fail 
 because of this.
 Running org.apache.hadoop.io.compress.TestCompressorDecompressor
 Tests run: 2, Failures: 2, Errors: 0, Skipped: 0, Time elapsed: 0.435 sec <<< 
 FAILURE! - in org.apache.hadoop.io.compress.TestCompressorDecompressor
 testCompressorDecompressor(org.apache.hadoop.io.compress.TestCompressorDecompressor)
   Time elapsed: 0.308 sec  <<< FAILURE!
 org.junit.internal.ArrayComparisonFailure: 
 org.apache.hadoop.io.compress.lz4.Lz4Compressor_org.apache.hadoop.io.compress.lz4.Lz4Decompressor-
   byte arrays not equals error !!!: arrays first differed at element [1428]; 
 expected:<4> but was:<10>
 at 
 org.junit.internal.ComparisonCriteria.arrayEquals(ComparisonCriteria.java:50)
 at org.junit.Assert.internalArrayEquals(Assert.java:473)
 at org.junit.Assert.assertArrayEquals(Assert.java:294)
 at 
 org.apache.hadoop.io.compress.CompressDecompressTester$CompressionTestStrategy$2.assertCompression(CompressDecompressTester.java:325)
 at 
 org.apache.hadoop.io.compress.CompressDecompressTester.test(CompressDecompressTester.java:135)
 at 
 org.apache.hadoop.io.compress.TestCompressorDecompressor.testCompressorDecompressor(TestCompressorDecompressor.java:58)
 ...
 ...
 .





[jira] [Updated] (HADOOP-10744) LZ4 Compression fails to recognize PowerPC Little Endian Architecture

2015-02-03 Thread Ayappan (JIRA)

 [ 
https://issues.apache.org/jira/browse/HADOOP-10744?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Ayappan updated HADOOP-10744:
-
Attachment: HADOOP-10744-v3.patch

 LZ4 Compression fails to recognize PowerPC Little Endian Architecture
 -

 Key: HADOOP-10744
 URL: https://issues.apache.org/jira/browse/HADOOP-10744
 Project: Hadoop Common
  Issue Type: Bug
  Components: io, native
Affects Versions: 2.4.1, 2.5.2
 Environment: PowerPC Little Endian (ppc64le)
Reporter: Ayappan
Assignee: Bert Sanders
 Attachments: HADOOP-10744-v1.patch, HADOOP-10744-v2.patch, 
 HADOOP-10744-v3.patch, HADOOP-10744.patch


 Lz4 Compression fails to identify the PowerPC Little Endian architecture. It 
 recognizes it as Big Endian, and several testcases 
 (TestCompressorDecompressor, TestCodec, TestLz4CompressorDecompressor) fail 
 because of this.
 Running org.apache.hadoop.io.compress.TestCompressorDecompressor
 Tests run: 2, Failures: 2, Errors: 0, Skipped: 0, Time elapsed: 0.435 sec <<< 
 FAILURE! - in org.apache.hadoop.io.compress.TestCompressorDecompressor
 testCompressorDecompressor(org.apache.hadoop.io.compress.TestCompressorDecompressor)
   Time elapsed: 0.308 sec  <<< FAILURE!
 org.junit.internal.ArrayComparisonFailure: 
 org.apache.hadoop.io.compress.lz4.Lz4Compressor_org.apache.hadoop.io.compress.lz4.Lz4Decompressor-
   byte arrays not equals error !!!: arrays first differed at element [1428]; 
 expected:<4> but was:<10>
 at 
 org.junit.internal.ComparisonCriteria.arrayEquals(ComparisonCriteria.java:50)
 at org.junit.Assert.internalArrayEquals(Assert.java:473)
 at org.junit.Assert.assertArrayEquals(Assert.java:294)
 at 
 org.apache.hadoop.io.compress.CompressDecompressTester$CompressionTestStrategy$2.assertCompression(CompressDecompressTester.java:325)
 at 
 org.apache.hadoop.io.compress.CompressDecompressTester.test(CompressDecompressTester.java:135)
 at 
 org.apache.hadoop.io.compress.TestCompressorDecompressor.testCompressorDecompressor(TestCompressorDecompressor.java:58)
 ...
 ...
 .





[jira] [Updated] (HADOOP-10744) LZ4 Compression fails to recognize PowerPC Little Endian Architecture

2015-02-03 Thread Ayappan (JIRA)

 [ 
https://issues.apache.org/jira/browse/HADOOP-10744?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Ayappan updated HADOOP-10744:
-
Attachment: HADOOP-10744-v4.patch

 LZ4 Compression fails to recognize PowerPC Little Endian Architecture
 -

 Key: HADOOP-10744
 URL: https://issues.apache.org/jira/browse/HADOOP-10744
 Project: Hadoop Common
  Issue Type: Bug
  Components: io, native
Affects Versions: 2.4.1, 2.5.2
 Environment: PowerPC Little Endian (ppc64le)
Reporter: Ayappan
Assignee: Bert Sanders
 Attachments: HADOOP-10744-v1.patch, HADOOP-10744-v2.patch, 
 HADOOP-10744-v3.patch, HADOOP-10744-v4.patch, HADOOP-10744.patch


 Lz4 Compression fails to identify the PowerPC Little Endian architecture. It 
 recognizes it as Big Endian, and several testcases 
 (TestCompressorDecompressor, TestCodec, TestLz4CompressorDecompressor) fail 
 because of this.
 Running org.apache.hadoop.io.compress.TestCompressorDecompressor
 Tests run: 2, Failures: 2, Errors: 0, Skipped: 0, Time elapsed: 0.435 sec <<< 
 FAILURE! - in org.apache.hadoop.io.compress.TestCompressorDecompressor
 testCompressorDecompressor(org.apache.hadoop.io.compress.TestCompressorDecompressor)
   Time elapsed: 0.308 sec  <<< FAILURE!
 org.junit.internal.ArrayComparisonFailure: 
 org.apache.hadoop.io.compress.lz4.Lz4Compressor_org.apache.hadoop.io.compress.lz4.Lz4Decompressor-
   byte arrays not equals error !!!: arrays first differed at element [1428]; 
 expected:<4> but was:<10>
 at 
 org.junit.internal.ComparisonCriteria.arrayEquals(ComparisonCriteria.java:50)
 at org.junit.Assert.internalArrayEquals(Assert.java:473)
 at org.junit.Assert.assertArrayEquals(Assert.java:294)
 at 
 org.apache.hadoop.io.compress.CompressDecompressTester$CompressionTestStrategy$2.assertCompression(CompressDecompressTester.java:325)
 at 
 org.apache.hadoop.io.compress.CompressDecompressTester.test(CompressDecompressTester.java:135)
 at 
 org.apache.hadoop.io.compress.TestCompressorDecompressor.testCompressorDecompressor(TestCompressorDecompressor.java:58)
 ...
 ...
 .





[jira] [Commented] (HADOOP-11462) TestSocketIOWithTimeout needs change for PowerPC platform

2015-02-02 Thread Ayappan (JIRA)

[ 
https://issues.apache.org/jira/browse/HADOOP-11462?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14302802#comment-14302802
 ] 

Ayappan commented on HADOOP-11462:
--

The above build failure is not due to this patch, since the failure happens in a 
mapreduce-related testcase whereas this patch is for TestSocketIOWithTimeout 
(hadoop-common module). Check out 
https://www.mail-archive.com/mapreduce-dev@hadoop.apache.org/msg12794.html

 TestSocketIOWithTimeout needs change for PowerPC platform
 -

 Key: HADOOP-11462
 URL: https://issues.apache.org/jira/browse/HADOOP-11462
 Project: Hadoop Common
  Issue Type: Bug
  Components: test
Affects Versions: 2.5.2
 Environment: PowerPC
Reporter: Ayappan
Assignee: Ayappan
 Fix For: 2.7.0

 Attachments: HADOOP-9627-v1.patch, HADOOP-9627-v2.patch


 TestSocketIOWithTimeout uses a block size of 4192 bytes to simulate a partial 
 write. This seems to be valid on the x86 architecture, where the default minimum 
 blocksize is 4096. 
 This testcase fails on PowerPC, where the default minimum block size is 65536 
 bytes (64KB). So for PowerPC, using a blocksize a little more than 64K, say 
 65555 (65536 + 19), holds good for this scenario.
 I attached a patch here where I made it more general by introducing 
 NativeIO.POSIX.getCacheManipulator().getOperatingSystemPageSize() to get the 
 page size.
 I tested my patch on both ppc64 and x86 Linux machines.
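
A sketch of the approach the description outlines, deriving the partial-write size from the OS page size instead of hard-coding 4192 (the NativeIO call is the one named above; the helper class and the +19 slack are illustrative):

import org.apache.hadoop.io.nativeio.NativeIO;

public class PartialWriteSize {
  // A write size just above the OS page size (4096 on x86, 65536 on
  // PowerPC) so the write is forced to be partial on any platform.
  static int partialWriteSize() {
    long pageSize =
        NativeIO.POSIX.getCacheManipulator().getOperatingSystemPageSize();
    return (int) pageSize + 19;
  }
}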





[jira] [Updated] (HADOOP-10846) DataChecksum#calculateChunkedSums not working for PPC when buffers not backed by array

2015-01-27 Thread Ayappan (JIRA)

 [ 
https://issues.apache.org/jira/browse/HADOOP-10846?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Ayappan updated HADOOP-10846:
-
Attachment: HADOOP-10846-v4.patch

 DataChecksum#calculateChunkedSums not working for PPC when buffers not backed 
 by array
 --

 Key: HADOOP-10846
 URL: https://issues.apache.org/jira/browse/HADOOP-10846
 Project: Hadoop Common
  Issue Type: Bug
  Components: util
Affects Versions: 2.4.1, 2.5.2
 Environment: PowerPC platform
Reporter: Jinghui Wang
Assignee: Jinghui Wang
 Attachments: HADOOP-10846-v1.patch, HADOOP-10846-v2.patch, 
 HADOOP-10846-v3.patch, HADOOP-10846-v4.patch, HADOOP-10846.patch


 Got the following exception when running Hadoop on PowerPC. The 
 implementation for computing checksums fails when the data buffer and checksum 
 buffer are not backed by arrays.
 13/09/16 04:06:57 ERROR security.UserGroupInformation: 
 PriviledgedActionException as:biadmin (auth:SIMPLE) 
 cause:org.apache.hadoop.ipc.RemoteException(java.io.IOException): 
 org.apache.hadoop.fs.ChecksumException: Checksum error





[jira] [Commented] (HADOOP-10846) DataChecksum#calculateChunkedSums not working for PPC when buffers not backed by array

2015-01-27 Thread Ayappan (JIRA)

[ 
https://issues.apache.org/jira/browse/HADOOP-10846?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14293554#comment-14293554
 ] 

Ayappan commented on HADOOP-10846:
--

Hi Steve,
   I attached a new patch which deals with the Windows environment (which is 
always Little Endian, at least for now).

 DataChecksum#calculateChunkedSums not working for PPC when buffers not backed 
 by array
 --

 Key: HADOOP-10846
 URL: https://issues.apache.org/jira/browse/HADOOP-10846
 Project: Hadoop Common
  Issue Type: Bug
  Components: util
Affects Versions: 2.4.1, 2.5.2
 Environment: PowerPC platform
Reporter: Jinghui Wang
Assignee: Jinghui Wang
 Attachments: HADOOP-10846-v1.patch, HADOOP-10846-v2.patch, 
 HADOOP-10846-v3.patch, HADOOP-10846-v4.patch, HADOOP-10846.patch


 Got the following exception when running Hadoop on PowerPC. The 
 implementation for computing checksums fails when the data buffer and checksum 
 buffer are not backed by arrays.
 13/09/16 04:06:57 ERROR security.UserGroupInformation: 
 PriviledgedActionException as:biadmin (auth:SIMPLE) 
 cause:org.apache.hadoop.ipc.RemoteException(java.io.IOException): 
 org.apache.hadoop.fs.ChecksumException: Checksum error





[jira] [Commented] (HADOOP-10744) LZ4 Compression fails to recognize PowerPC Little Endian Architecture

2015-01-27 Thread Ayappan (JIRA)

[ 
https://issues.apache.org/jira/browse/HADOOP-10744?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14293394#comment-14293394
 ] 

Ayappan commented on HADOOP-10744:
--

The patch applies with the -p1 option. Can any maintainer look into this issue?

 LZ4 Compression fails to recognize PowerPC Little Endian Architecture
 -

 Key: HADOOP-10744
 URL: https://issues.apache.org/jira/browse/HADOOP-10744
 Project: Hadoop Common
  Issue Type: Bug
  Components: io, native
Affects Versions: 2.4.1, 2.5.2
 Environment: PowerPC Little Endian (ppc64le)
Reporter: Ayappan
Assignee: Bert Sanders
 Attachments: HADOOP-10744-v1.patch, HADOOP-10744-v2.patch, 
 HADOOP-10744.patch


 Lz4 Compression fails to identify the PowerPC Little Endian architecture. It 
 recognizes it as Big Endian, and several testcases 
 (TestCompressorDecompressor, TestCodec, TestLz4CompressorDecompressor) fail 
 because of this.
 Running org.apache.hadoop.io.compress.TestCompressorDecompressor
 Tests run: 2, Failures: 2, Errors: 0, Skipped: 0, Time elapsed: 0.435 sec <<< 
 FAILURE! - in org.apache.hadoop.io.compress.TestCompressorDecompressor
 testCompressorDecompressor(org.apache.hadoop.io.compress.TestCompressorDecompressor)
   Time elapsed: 0.308 sec  <<< FAILURE!
 org.junit.internal.ArrayComparisonFailure: 
 org.apache.hadoop.io.compress.lz4.Lz4Compressor_org.apache.hadoop.io.compress.lz4.Lz4Decompressor-
   byte arrays not equals error !!!: arrays first differed at element [1428]; 
 expected:<4> but was:<10>
 at 
 org.junit.internal.ComparisonCriteria.arrayEquals(ComparisonCriteria.java:50)
 at org.junit.Assert.internalArrayEquals(Assert.java:473)
 at org.junit.Assert.assertArrayEquals(Assert.java:294)
 at 
 org.apache.hadoop.io.compress.CompressDecompressTester$CompressionTestStrategy$2.assertCompression(CompressDecompressTester.java:325)
 at 
 org.apache.hadoop.io.compress.CompressDecompressTester.test(CompressDecompressTester.java:135)
 at 
 org.apache.hadoop.io.compress.TestCompressorDecompressor.testCompressorDecompressor(TestCompressorDecompressor.java:58)
 ...
 ...
 .





[jira] [Commented] (HADOOP-10846) DataChecksum#calculateChunkedSums not working for PPC when buffers not backed by array

2015-01-23 Thread Ayappan (JIRA)

[ 
https://issues.apache.org/jira/browse/HADOOP-10846?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14289094#comment-14289094
 ] 

Ayappan commented on HADOOP-10846:
--

This patch resolves checksum errors in the existing tests, so no new tests are 
needed for this patch.

 DataChecksum#calculateChunkedSums not working for PPC when buffers not backed 
 by array
 --

 Key: HADOOP-10846
 URL: https://issues.apache.org/jira/browse/HADOOP-10846
 Project: Hadoop Common
  Issue Type: Bug
  Components: util
Affects Versions: 2.4.1, 2.5.2
 Environment: PowerPC platform
Reporter: Jinghui Wang
Assignee: Jinghui Wang
 Attachments: HADOOP-10846-v1.patch, HADOOP-10846-v2.patch, 
 HADOOP-10846-v3.patch, HADOOP-10846.patch


 Got the following exception when running Hadoop on PowerPC. The 
 implementation for computing checksums fails when the data buffer and checksum 
 buffer are not backed by arrays.
 13/09/16 04:06:57 ERROR security.UserGroupInformation: 
 PriviledgedActionException as:biadmin (auth:SIMPLE) 
 cause:org.apache.hadoop.ipc.RemoteException(java.io.IOException): 
 org.apache.hadoop.fs.ChecksumException: Checksum error





[jira] [Updated] (HADOOP-10744) LZ4 Compression fails to recognize PowerPC Little Endian Architecture

2015-01-22 Thread Ayappan (JIRA)

 [ 
https://issues.apache.org/jira/browse/HADOOP-10744?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Ayappan updated HADOOP-10744:
-
Attachment: HADOOP-10744-v2.patch

 LZ4 Compression fails to recognize PowerPC Little Endian Architecture
 -

 Key: HADOOP-10744
 URL: https://issues.apache.org/jira/browse/HADOOP-10744
 Project: Hadoop Common
  Issue Type: Bug
  Components: io, native
Affects Versions: 2.4.1, 2.5.2
 Environment: PowerPC Little Endian (ppc64le)
Reporter: Ayappan
Assignee: Bert Sanders
 Attachments: HADOOP-10744-v1.patch, HADOOP-10744-v2.patch, 
 HADOOP-10744.patch


 Lz4 Compression fails to identify the PowerPC Little Endian architecture. It 
 recognizes it as Big Endian, and several testcases 
 (TestCompressorDecompressor, TestCodec, TestLz4CompressorDecompressor) fail 
 because of this.
 Running org.apache.hadoop.io.compress.TestCompressorDecompressor
 Tests run: 2, Failures: 2, Errors: 0, Skipped: 0, Time elapsed: 0.435 sec <<< 
 FAILURE! - in org.apache.hadoop.io.compress.TestCompressorDecompressor
 testCompressorDecompressor(org.apache.hadoop.io.compress.TestCompressorDecompressor)
   Time elapsed: 0.308 sec  <<< FAILURE!
 org.junit.internal.ArrayComparisonFailure: 
 org.apache.hadoop.io.compress.lz4.Lz4Compressor_org.apache.hadoop.io.compress.lz4.Lz4Decompressor-
   byte arrays not equals error !!!: arrays first differed at element [1428]; 
 expected:<4> but was:<10>
 at 
 org.junit.internal.ComparisonCriteria.arrayEquals(ComparisonCriteria.java:50)
 at org.junit.Assert.internalArrayEquals(Assert.java:473)
 at org.junit.Assert.assertArrayEquals(Assert.java:294)
 at 
 org.apache.hadoop.io.compress.CompressDecompressTester$CompressionTestStrategy$2.assertCompression(CompressDecompressTester.java:325)
 at 
 org.apache.hadoop.io.compress.CompressDecompressTester.test(CompressDecompressTester.java:135)
 at 
 org.apache.hadoop.io.compress.TestCompressorDecompressor.testCompressorDecompressor(TestCompressorDecompressor.java:58)
 ...
 ...
 .





[jira] [Updated] (HADOOP-10846) DataChecksum#calculateChunkedSums not working for PPC when buffers not backed by array

2015-01-22 Thread Ayappan (JIRA)

 [ 
https://issues.apache.org/jira/browse/HADOOP-10846?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Ayappan updated HADOOP-10846:
-
Attachment: HADOOP-10846-v3.patch

 DataChecksum#calculateChunkedSums not working for PPC when buffers not backed 
 by array
 --

 Key: HADOOP-10846
 URL: https://issues.apache.org/jira/browse/HADOOP-10846
 Project: Hadoop Common
  Issue Type: Bug
  Components: util
Affects Versions: 2.4.1, 2.5.2
 Environment: PowerPC platform
Reporter: Jinghui Wang
Assignee: Jinghui Wang
 Attachments: HADOOP-10846-v1.patch, HADOOP-10846-v2.patch, 
 HADOOP-10846-v3.patch, HADOOP-10846.patch


 Got the following exception when running Hadoop on PowerPC. The 
 implementation for computing checksums fails when the data buffer and checksum 
 buffer are not backed by arrays.
 13/09/16 04:06:57 ERROR security.UserGroupInformation: 
 PriviledgedActionException as:biadmin (auth:SIMPLE) 
 cause:org.apache.hadoop.ipc.RemoteException(java.io.IOException): 
 org.apache.hadoop.fs.ChecksumException: Checksum error





[jira] [Updated] (HADOOP-10846) DataChecksum#calculateChunkedSums not working for PPC when buffers not backed by array

2015-01-14 Thread Ayappan (JIRA)

 [ 
https://issues.apache.org/jira/browse/HADOOP-10846?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Ayappan updated HADOOP-10846:
-
Target Version/s: 2.7.0  (was: 2.6.0)

 DataChecksum#calculateChunkedSums not working for PPC when buffers not backed 
 by array
 --

 Key: HADOOP-10846
 URL: https://issues.apache.org/jira/browse/HADOOP-10846
 Project: Hadoop Common
  Issue Type: Bug
  Components: util
Affects Versions: 2.4.1, 2.5.2
 Environment: PowerPC platform
Reporter: Jinghui Wang
Assignee: Jinghui Wang
 Attachments: HADOOP-10846-v1.patch, HADOOP-10846-v2.patch, 
 HADOOP-10846.patch


 Got the following exception when running Hadoop on PowerPC. The 
 implementation for computing checksums fails when the data buffer and checksum 
 buffer are not backed by arrays.
 13/09/16 04:06:57 ERROR security.UserGroupInformation: 
 PriviledgedActionException as:biadmin (auth:SIMPLE) 
 cause:org.apache.hadoop.ipc.RemoteException(java.io.IOException): 
 org.apache.hadoop.fs.ChecksumException: Checksum error





[jira] [Updated] (HADOOP-10846) DataChecksum#calculateChunkedSums not working for PPC when buffers not backed by array

2015-01-13 Thread Ayappan (JIRA)

 [ 
https://issues.apache.org/jira/browse/HADOOP-10846?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Ayappan updated HADOOP-10846:
-
Environment: PowerPC platform

 DataChecksum#calculateChunkedSums not working for PPC when buffers not backed 
 by array
 --

 Key: HADOOP-10846
 URL: https://issues.apache.org/jira/browse/HADOOP-10846
 Project: Hadoop Common
  Issue Type: Bug
  Components: util
Affects Versions: 2.4.1, 2.5.2
 Environment: PowerPC platform
Reporter: Jinghui Wang
Assignee: Jinghui Wang
 Attachments: HADOOP-10846-v1.patch, HADOOP-10846-v2.patch, 
 HADOOP-10846.patch


 Got the following exception when running Hadoop on PowerPC. The 
 implementation for computing checksums fails when the data buffer and checksum 
 buffer are not backed by arrays.
 13/09/16 04:06:57 ERROR security.UserGroupInformation: 
 PriviledgedActionException as:biadmin (auth:SIMPLE) 
 cause:org.apache.hadoop.ipc.RemoteException(java.io.IOException): 
 org.apache.hadoop.fs.ChecksumException: Checksum error





[jira] [Updated] (HADOOP-10744) LZ4 Compression fails to recognize PowerPC Little Endian Architecture

2015-01-07 Thread Ayappan (JIRA)

 [ 
https://issues.apache.org/jira/browse/HADOOP-10744?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Ayappan updated HADOOP-10744:
-
Assignee: Bert Sanders  (was: Ayappan)

 LZ4 Compression fails to recognize PowerPC Little Endian Architecture
 -

 Key: HADOOP-10744
 URL: https://issues.apache.org/jira/browse/HADOOP-10744
 Project: Hadoop Common
  Issue Type: Bug
  Components: io, native
Affects Versions: 2.4.1, 2.5.2
 Environment: PowerPC Little Endian (ppc64le)
Reporter: Ayappan
Assignee: Bert Sanders
 Attachments: HADOOP-10744-v1.patch, HADOOP-10744.patch


 Lz4 Compression fails to identify the PowerPC Little Endian architecture. It 
 recognizes it as Big Endian, and several testcases 
 (TestCompressorDecompressor, TestCodec, TestLz4CompressorDecompressor) fail 
 because of this.
 Running org.apache.hadoop.io.compress.TestCompressorDecompressor
 Tests run: 2, Failures: 2, Errors: 0, Skipped: 0, Time elapsed: 0.435 sec <<< 
 FAILURE! - in org.apache.hadoop.io.compress.TestCompressorDecompressor
 testCompressorDecompressor(org.apache.hadoop.io.compress.TestCompressorDecompressor)
   Time elapsed: 0.308 sec  <<< FAILURE!
 org.junit.internal.ArrayComparisonFailure: 
 org.apache.hadoop.io.compress.lz4.Lz4Compressor_org.apache.hadoop.io.compress.lz4.Lz4Decompressor-
   byte arrays not equals error !!!: arrays first differed at element [1428]; 
 expected:<4> but was:<10>
 at 
 org.junit.internal.ComparisonCriteria.arrayEquals(ComparisonCriteria.java:50)
 at org.junit.Assert.internalArrayEquals(Assert.java:473)
 at org.junit.Assert.assertArrayEquals(Assert.java:294)
 at 
 org.apache.hadoop.io.compress.CompressDecompressTester$CompressionTestStrategy$2.assertCompression(CompressDecompressTester.java:325)
 at 
 org.apache.hadoop.io.compress.CompressDecompressTester.test(CompressDecompressTester.java:135)
 at 
 org.apache.hadoop.io.compress.TestCompressorDecompressor.testCompressorDecompressor(TestCompressorDecompressor.java:58)
 ...
 ...
 .





[jira] [Created] (HADOOP-11462) TestSocketIOTimeout needs change for PowerPC platform

2015-01-06 Thread Ayappan (JIRA)
Ayappan created HADOOP-11462:


 Summary: TestSocketIOTimeout needs change for PowerPC platform
 Key: HADOOP-11462
 URL: https://issues.apache.org/jira/browse/HADOOP-11462
 Project: Hadoop Common
  Issue Type: Bug
  Components: test
Affects Versions: 2.5.2
 Environment: PowerPC
Reporter: Ayappan


TestSocketIOWithTimeout uses a block size of 4192 bytes to simulate a partial 
write. This seems to be valid on the x86 architecture, where the default minimum 
blocksize is 4096. 
This testcase fails on PowerPC, where the default minimum block size is 65536 
bytes (64KB). So for PowerPC, using a blocksize a little more than 64K, say 
65555 (65536 + 19), holds good for this scenario.
I attached a patch here where I made it more general by introducing 
NativeIO.POSIX.getCacheManipulator().getOperatingSystemPageSize() to get the 
page size.
I tested my patch on both ppc64 and x86 Linux machines.





[jira] [Updated] (HADOOP-11462) TestSocketIOTimeout needs change for PowerPC platform

2015-01-06 Thread Ayappan (JIRA)

 [ 
https://issues.apache.org/jira/browse/HADOOP-11462?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Ayappan updated HADOOP-11462:
-
Attachment: HADOOP-9627-v1.patch

 TestSocketIOTimeout needs change for PowerPC platform
 -

 Key: HADOOP-11462
 URL: https://issues.apache.org/jira/browse/HADOOP-11462
 Project: Hadoop Common
  Issue Type: Bug
  Components: test
Affects Versions: 2.5.2
 Environment: PowerPC
Reporter: Ayappan
 Attachments: HADOOP-9627-v1.patch


 TestSocketIOWithTimeout uses a block size of 4192 bytes to simulate a partial 
 write. This seems to be valid on the x86 architecture, where the default minimum 
 blocksize is 4096. 
 This testcase fails on PowerPC, where the default minimum block size is 65536 
 bytes (64KB). So for PowerPC, using a blocksize a little more than 64K, say 
 65555 (65536 + 19), holds good for this scenario.
 I attached a patch here where I made it more general by introducing 
 NativeIO.POSIX.getCacheManipulator().getOperatingSystemPageSize() to get the 
 page size.
 I tested my patch on both ppc64 and x86 Linux machines.





[jira] [Updated] (HADOOP-11462) TestSocketIOTimeout needs change for PowerPC platform

2015-01-06 Thread Ayappan (JIRA)

 [ 
https://issues.apache.org/jira/browse/HADOOP-11462?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Ayappan updated HADOOP-11462:
-
Assignee: Chris Nauroth
Target Version/s: 2.5.2, 2.6.0  (was: 2.6.0, 2.5.2)
  Status: Patch Available  (was: Open)

 TestSocketIOTimeout needs change for PowerPC platform
 -

 Key: HADOOP-11462
 URL: https://issues.apache.org/jira/browse/HADOOP-11462
 Project: Hadoop Common
  Issue Type: Bug
  Components: test
Affects Versions: 2.5.2
 Environment: PowerPC
Reporter: Ayappan
Assignee: Chris Nauroth
 Attachments: HADOOP-9627-v1.patch


 TestSocketIOWithTimeout uses a block size of 4192 bytes to simulate a partial 
 write. This seems to be valid on the x86 architecture, where the default 
 minimum block size is 4096. 
 This testcase fails on PowerPC, where the default minimum block size is 65536 
 bytes (64KB). So for PowerPC, using a block size a little more than 64K, say 
 65555 (65536 + 19), holds good for this scenario.
 I attached a patch here where I made it very general by introducing 
 NativeIO.POSIX.getCacheManipulator().getOperatingSystemPageSize() to get the 
 page size.
 I tested my patch on both ppc64 and x86 Linux machines.



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)


[jira] [Updated] (HADOOP-10744) LZ4 Compression fails to recognize PowerPC Little Endian Architecture

2015-01-06 Thread Ayappan (JIRA)

 [ 
https://issues.apache.org/jira/browse/HADOOP-10744?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Ayappan updated HADOOP-10744:
-
Target Version/s: 2.5.2, 2.6.0  (was: 2.6.0)

 LZ4 Compression fails to recognize PowerPC Little Endian Architecture
 -

 Key: HADOOP-10744
 URL: https://issues.apache.org/jira/browse/HADOOP-10744
 Project: Hadoop Common
  Issue Type: Bug
  Components: io, native
Affects Versions: 2.4.1, 2.5.2
 Environment: PowerPC Little Endian (ppc64le)
Reporter: Ayappan
Assignee: Ayappan
 Attachments: HADOOP-10744-v1.patch, HADOOP-10744.patch


 Lz4 Compression fails to identify the PowerPC Little Endian Architecture. It 
 recognizes it as Big Endian, and several testcases 
 (TestCompressorDecompressor, TestCodec, TestLz4CompressorDecompressor) fail 
 due to this.
 Running org.apache.hadoop.io.compress.TestCompressorDecompressor
 Tests run: 2, Failures: 2, Errors: 0, Skipped: 0, Time elapsed: 0.435 sec 
 <<< FAILURE! - in org.apache.hadoop.io.compress.TestCompressorDecompressor
 testCompressorDecompressor(org.apache.hadoop.io.compress.TestCompressorDecompressor)
   Time elapsed: 0.308 sec <<< FAILURE!
 org.junit.internal.ArrayComparisonFailure: 
 org.apache.hadoop.io.compress.lz4.Lz4Compressor_org.apache.hadoop.io.compress.lz4.Lz4Decompressor-
   byte arrays not equals error !!!: arrays first differed at element [1428]; 
 expected:<4> but was:<10>
 at 
 org.junit.internal.ComparisonCriteria.arrayEquals(ComparisonCriteria.java:50)
 at org.junit.Assert.internalArrayEquals(Assert.java:473)
 at org.junit.Assert.assertArrayEquals(Assert.java:294)
 at 
 org.apache.hadoop.io.compress.CompressDecompressTester$CompressionTestStrategy$2.assertCompression(CompressDecompressTester.java:325)
 at 
 org.apache.hadoop.io.compress.CompressDecompressTester.test(CompressDecompressTester.java:135)
 at 
 org.apache.hadoop.io.compress.TestCompressorDecompressor.testCompressorDecompressor(TestCompressorDecompressor.java:58)
 ...
 ...
 .



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)


[jira] [Commented] (HADOOP-10744) LZ4 Compression fails to recognize PowerPC Little Endian Architecture

2015-01-06 Thread Ayappan (JIRA)

[ 
https://issues.apache.org/jira/browse/HADOOP-10744?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanelfocusedCommentId=14267353#comment-14267353
 ] 

Ayappan commented on HADOOP-10744:
--

Any update on this?

 LZ4 Compression fails to recognize PowerPC Little Endian Architecture
 -

 Key: HADOOP-10744
 URL: https://issues.apache.org/jira/browse/HADOOP-10744
 Project: Hadoop Common
  Issue Type: Bug
  Components: io, native
Affects Versions: 2.4.1, 2.5.2
 Environment: PowerPC Little Endian (ppc64le)
Reporter: Ayappan
Assignee: Ayappan
 Attachments: HADOOP-10744-v1.patch, HADOOP-10744.patch


 Lz4 Compression fails to identify the PowerPC Little Endian Architecture. It 
 recognizes it as Big Endian, and several testcases 
 (TestCompressorDecompressor, TestCodec, TestLz4CompressorDecompressor) fail 
 due to this.
 Running org.apache.hadoop.io.compress.TestCompressorDecompressor
 Tests run: 2, Failures: 2, Errors: 0, Skipped: 0, Time elapsed: 0.435 sec 
 <<< FAILURE! - in org.apache.hadoop.io.compress.TestCompressorDecompressor
 testCompressorDecompressor(org.apache.hadoop.io.compress.TestCompressorDecompressor)
   Time elapsed: 0.308 sec <<< FAILURE!
 org.junit.internal.ArrayComparisonFailure: 
 org.apache.hadoop.io.compress.lz4.Lz4Compressor_org.apache.hadoop.io.compress.lz4.Lz4Decompressor-
   byte arrays not equals error !!!: arrays first differed at element [1428]; 
 expected:<4> but was:<10>
 at 
 org.junit.internal.ComparisonCriteria.arrayEquals(ComparisonCriteria.java:50)
 at org.junit.Assert.internalArrayEquals(Assert.java:473)
 at org.junit.Assert.assertArrayEquals(Assert.java:294)
 at 
 org.apache.hadoop.io.compress.CompressDecompressTester$CompressionTestStrategy$2.assertCompression(CompressDecompressTester.java:325)
 at 
 org.apache.hadoop.io.compress.CompressDecompressTester.test(CompressDecompressTester.java:135)
 at 
 org.apache.hadoop.io.compress.TestCompressorDecompressor.testCompressorDecompressor(TestCompressorDecompressor.java:58)
 ...
 ...
 .



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)


[jira] [Updated] (HADOOP-11462) TestSocketIOWithTimeout needs change for PowerPC platform

2015-01-06 Thread Ayappan (JIRA)

 [ 
https://issues.apache.org/jira/browse/HADOOP-11462?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Ayappan updated HADOOP-11462:
-
Attachment: HADOOP-9627-v2.patch

 TestSocketIOWithTimeout needs change for PowerPC platform
 -

 Key: HADOOP-11462
 URL: https://issues.apache.org/jira/browse/HADOOP-11462
 Project: Hadoop Common
  Issue Type: Bug
  Components: test
Affects Versions: 2.5.2
 Environment: PowerPC
Reporter: Ayappan
Assignee: Ayappan
 Attachments: HADOOP-9627-v1.patch, HADOOP-9627-v2.patch


 TestSocketIOWithTimeout uses a block size of 4192 bytes to simulate a partial 
 write. This seems to be valid on the x86 architecture, where the default 
 minimum block size is 4096. 
 This testcase fails on PowerPC, where the default minimum block size is 65536 
 bytes (64KB). So for PowerPC, using a block size a little more than 64K, say 
 65555 (65536 + 19), holds good for this scenario.
 I attached a patch here where I made it very general by introducing 
 NativeIO.POSIX.getCacheManipulator().getOperatingSystemPageSize() to get the 
 page size.
 I tested my patch on both ppc64 and x86 Linux machines.



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)


[jira] [Updated] (HADOOP-10744) LZ4 Compression fails to recognize PowerPC Little Endian Architecture

2015-01-05 Thread Ayappan (JIRA)

 [ 
https://issues.apache.org/jira/browse/HADOOP-10744?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Ayappan updated HADOOP-10744:
-
Issue Type: Bug  (was: Test)

 LZ4 Compression fails to recognize PowerPC Little Endian Architecture
 -

 Key: HADOOP-10744
 URL: https://issues.apache.org/jira/browse/HADOOP-10744
 Project: Hadoop Common
  Issue Type: Bug
  Components: io, native
Affects Versions: 2.4.1, 2.5.2
 Environment: PowerPC Little Endian (ppc64le)
Reporter: Ayappan
Assignee: Ayappan
 Attachments: HADOOP-10744-v1.patch, HADOOP-10744.patch


 Lz4 Compression fails to identify the PowerPC Little Endian Architecture. It 
 recognizes it as Big Endian, and several testcases 
 (TestCompressorDecompressor, TestCodec, TestLz4CompressorDecompressor) fail 
 due to this.
 Running org.apache.hadoop.io.compress.TestCompressorDecompressor
 Tests run: 2, Failures: 2, Errors: 0, Skipped: 0, Time elapsed: 0.435 sec 
 <<< FAILURE! - in org.apache.hadoop.io.compress.TestCompressorDecompressor
 testCompressorDecompressor(org.apache.hadoop.io.compress.TestCompressorDecompressor)
   Time elapsed: 0.308 sec <<< FAILURE!
 org.junit.internal.ArrayComparisonFailure: 
 org.apache.hadoop.io.compress.lz4.Lz4Compressor_org.apache.hadoop.io.compress.lz4.Lz4Decompressor-
   byte arrays not equals error !!!: arrays first differed at element [1428]; 
 expected:<4> but was:<10>
 at 
 org.junit.internal.ComparisonCriteria.arrayEquals(ComparisonCriteria.java:50)
 at org.junit.Assert.internalArrayEquals(Assert.java:473)
 at org.junit.Assert.assertArrayEquals(Assert.java:294)
 at 
 org.apache.hadoop.io.compress.CompressDecompressTester$CompressionTestStrategy$2.assertCompression(CompressDecompressTester.java:325)
 at 
 org.apache.hadoop.io.compress.CompressDecompressTester.test(CompressDecompressTester.java:135)
 at 
 org.apache.hadoop.io.compress.TestCompressorDecompressor.testCompressorDecompressor(TestCompressorDecompressor.java:58)
 ...
 ...
 .



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)


[jira] [Updated] (HADOOP-10744) LZ4 Compression fails to recognize PowerPC Little Endian Architecture

2015-01-05 Thread Ayappan (JIRA)

 [ 
https://issues.apache.org/jira/browse/HADOOP-10744?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Ayappan updated HADOOP-10744:
-
Fix Version/s: (was: 2.4.1)

 LZ4 Compression fails to recognize PowerPC Little Endian Architecture
 -

 Key: HADOOP-10744
 URL: https://issues.apache.org/jira/browse/HADOOP-10744
 Project: Hadoop Common
  Issue Type: Test
  Components: io, native
Affects Versions: 2.2.0, 2.3.0, 2.4.0
 Environment: PowerPC Little Endian (ppc64le)
Reporter: Ayappan
Assignee: Ayappan
 Attachments: HADOOP-10744-v1.patch, HADOOP-10744.patch


 Lz4 Compression fails to identify the PowerPC Little Endian Architecture. It 
 recognizes it as Big Endian, and several testcases 
 (TestCompressorDecompressor, TestCodec, TestLz4CompressorDecompressor) fail 
 due to this.
 Running org.apache.hadoop.io.compress.TestCompressorDecompressor
 Tests run: 2, Failures: 2, Errors: 0, Skipped: 0, Time elapsed: 0.435 sec 
 <<< FAILURE! - in org.apache.hadoop.io.compress.TestCompressorDecompressor
 testCompressorDecompressor(org.apache.hadoop.io.compress.TestCompressorDecompressor)
   Time elapsed: 0.308 sec <<< FAILURE!
 org.junit.internal.ArrayComparisonFailure: 
 org.apache.hadoop.io.compress.lz4.Lz4Compressor_org.apache.hadoop.io.compress.lz4.Lz4Decompressor-
   byte arrays not equals error !!!: arrays first differed at element [1428]; 
 expected:<4> but was:<10>
 at 
 org.junit.internal.ComparisonCriteria.arrayEquals(ComparisonCriteria.java:50)
 at org.junit.Assert.internalArrayEquals(Assert.java:473)
 at org.junit.Assert.assertArrayEquals(Assert.java:294)
 at 
 org.apache.hadoop.io.compress.CompressDecompressTester$CompressionTestStrategy$2.assertCompression(CompressDecompressTester.java:325)
 at 
 org.apache.hadoop.io.compress.CompressDecompressTester.test(CompressDecompressTester.java:135)
 at 
 org.apache.hadoop.io.compress.TestCompressorDecompressor.testCompressorDecompressor(TestCompressorDecompressor.java:58)
 ...
 ...
 .



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)


[jira] [Updated] (HADOOP-9627) TestSocketIOTimeout should be rewritten without platform-specific assumptions

2015-01-05 Thread Ayappan (JIRA)

 [ 
https://issues.apache.org/jira/browse/HADOOP-9627?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Ayappan updated HADOOP-9627:

Fix Version/s: (was: 2.5.2)
   (was: 2.6.0)

 TestSocketIOTimeout should be rewritten without platform-specific assumptions
 -

 Key: HADOOP-9627
 URL: https://issues.apache.org/jira/browse/HADOOP-9627
 Project: Hadoop Common
  Issue Type: Bug
  Components: test
Affects Versions: 3.0.0, 2.5.2
Reporter: Arpit Agarwal
Assignee: Ayappan
 Attachments: HADOOP-9627-v1.patch, HADOOP-9627.patch


 TestSocketIOTimeout makes some assumptions about the behavior of file 
 channels wrt partial writes that do not appear to hold true on Windows 
 [details in HADOOP-8982].
 Currently part of the test is skipped on Windows.
 This bug is to track fixing the test.
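
To see the platform dependence concretely, here is a minimal Java sketch of a 
partial write on a non-blocking channel (an illustration only, not the actual 
test code):

import java.nio.ByteBuffer;
import java.nio.channels.Pipe;

public class PartialWriteDemo {
  public static void main(String[] args) throws Exception {
    Pipe pipe = Pipe.open();
    pipe.sink().configureBlocking(false);

    // Offer far more data than the pipe's internal buffer can hold.
    ByteBuffer big = ByteBuffer.allocate(1 << 20); // 1 MB
    int written = pipe.sink().write(big);

    // On a non-blocking channel the write may be partial: 'written' is
    // roughly the kernel buffer size, which differs across platforms;
    // that is exactly the kind of assumption the test should not bake in.
    System.out.println("wrote " + written + " of " + big.capacity() + " bytes");
    pipe.sink().close();
    pipe.source().close();
  }
}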



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)


[jira] [Updated] (HADOOP-10846) DataChecksum#calculateChunkedSums not working for PPC when buffers not backed by array

2015-01-05 Thread Ayappan (JIRA)

 [ 
https://issues.apache.org/jira/browse/HADOOP-10846?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Ayappan updated HADOOP-10846:
-
Affects Version/s: 2.5.2

 DataChecksum#calculateChunkedSums not working for PPC when buffers not backed 
 by array
 --

 Key: HADOOP-10846
 URL: https://issues.apache.org/jira/browse/HADOOP-10846
 Project: Hadoop Common
  Issue Type: Bug
  Components: util
Affects Versions: 2.2.0, 2.3.0, 2.4.0, 2.4.1, 2.5.2
Reporter: Jinghui Wang
Assignee: Jinghui Wang
 Attachments: HADOOP-10846-v1.patch, HADOOP-10846-v2.patch, 
 HADOOP-10846.patch


 Got the following exception when running Hadoop on PowerPC. The 
 implementation for computing the checksum fails when the data buffer and 
 checksum buffer are not backed by arrays.
 13/09/16 04:06:57 ERROR security.UserGroupInformation: 
 PriviledgedActionException as:biadmin (auth:SIMPLE) 
 cause:org.apache.hadoop.ipc.RemoteException(java.io.IOException): 
 org.apache.hadoop.fs.ChecksumException: Checksum error
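
A minimal sketch of the two buffer cases involved (illustrative only; the real 
logic lives in Hadoop's DataChecksum, and CRC32 here stands in for the actual 
checksum algorithm):

import java.nio.ByteBuffer;
import java.util.zip.CRC32;

public class BufferChecksumSketch {
  static long checksum(ByteBuffer data) {
    CRC32 crc = new CRC32();
    if (data.hasArray()) {
      // Array-backed buffer: read the backing array directly.
      crc.update(data.array(), data.arrayOffset() + data.position(),
                 data.remaining());
    } else {
      // Direct (non-array-backed) buffer: the path this issue reports as
      // broken on PPC. Copy out through a temporary array without moving
      // the buffer's position.
      byte[] tmp = new byte[data.remaining()];
      data.duplicate().get(tmp);
      crc.update(tmp, 0, tmp.length);
    }
    return crc.getValue();
  }

  public static void main(String[] args) {
    ByteBuffer direct = ByteBuffer.allocateDirect(16);
    for (int i = 0; i < 16; i++) direct.put((byte) i);
    direct.flip();
    System.out.printf("crc=%08x%n", checksum(direct));
  }
}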



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)


[jira] [Updated] (HADOOP-10846) DataChecksum#calculateChunkedSums not working for PPC when buffers not backed by array

2015-01-05 Thread Ayappan (JIRA)

 [ 
https://issues.apache.org/jira/browse/HADOOP-10846?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Ayappan updated HADOOP-10846:
-
Affects Version/s: (was: 2.4.0)
   (was: 2.3.0)
   (was: 2.2.0)

 DataChecksum#calculateChunkedSums not working for PPC when buffers not backed 
 by array
 --

 Key: HADOOP-10846
 URL: https://issues.apache.org/jira/browse/HADOOP-10846
 Project: Hadoop Common
  Issue Type: Bug
  Components: util
Affects Versions: 2.4.1, 2.5.2
Reporter: Jinghui Wang
Assignee: Jinghui Wang
 Attachments: HADOOP-10846-v1.patch, HADOOP-10846-v2.patch, 
 HADOOP-10846.patch


 Got the following exception when running Hadoop on PowerPC. The 
 implementation for computing the checksum fails when the data buffer and 
 checksum buffer are not backed by arrays.
 13/09/16 04:06:57 ERROR security.UserGroupInformation: 
 PriviledgedActionException as:biadmin (auth:SIMPLE) 
 cause:org.apache.hadoop.ipc.RemoteException(java.io.IOException): 
 org.apache.hadoop.fs.ChecksumException: Checksum error



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)


[jira] [Updated] (HADOOP-9627) TestSocketIOTimeout should be rewritten without platform-specific assumptions

2015-01-05 Thread Ayappan (JIRA)

 [ 
https://issues.apache.org/jira/browse/HADOOP-9627?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Ayappan updated HADOOP-9627:

Affects Version/s: (was: 3.0.0)
   2.4.1

 TestSocketIOTimeout should be rewritten without platform-specific assumptions
 -

 Key: HADOOP-9627
 URL: https://issues.apache.org/jira/browse/HADOOP-9627
 Project: Hadoop Common
  Issue Type: Bug
  Components: test
Affects Versions: 2.4.1, 2.5.2
Reporter: Arpit Agarwal
Assignee: Ayappan
 Attachments: HADOOP-9627-v1.patch, HADOOP-9627.patch


 TestSocketIOTimeout makes some assumptions about the behavior of file 
 channels wrt partial writes that do not appear to hold true on Windows 
 [details in HADOOP-8982].
 Currently part of the test is skipped on Windows.
 This bug is to track fixing the test.



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)


[jira] [Updated] (HADOOP-9627) TestSocketIOTimeout should be rewritten without platform-specific assumptions

2015-01-05 Thread Ayappan (JIRA)

 [ 
https://issues.apache.org/jira/browse/HADOOP-9627?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Ayappan updated HADOOP-9627:

Target Version/s: 2.6.0  (was: 3.0.0, 2.6.0)

 TestSocketIOTimeout should be rewritten without platform-specific assumptions
 -

 Key: HADOOP-9627
 URL: https://issues.apache.org/jira/browse/HADOOP-9627
 Project: Hadoop Common
  Issue Type: Bug
  Components: test
Affects Versions: 2.4.1, 2.5.2
Reporter: Arpit Agarwal
Assignee: Ayappan
 Attachments: HADOOP-9627-v1.patch, HADOOP-9627.patch


 TestSocketIOTimeout makes some assumptions about the behavior of file 
 channels wrt partial writes that do not appear to hold true on Windows 
 [details in HADOOP-8982].
 Currently part of the test is skipped on Windows.
 This bug is to track fixing the test.



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)


[jira] [Updated] (HADOOP-9627) TestSocketIOTimeout should be rewritten without platform-specific assumptions

2015-01-05 Thread Ayappan (JIRA)

 [ 
https://issues.apache.org/jira/browse/HADOOP-9627?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Ayappan updated HADOOP-9627:

Target Version/s: 2.5.2, 2.6.0  (was: 2.6.0)

 TestSocketIOTimeout should be rewritten without platform-specific assumptions
 -

 Key: HADOOP-9627
 URL: https://issues.apache.org/jira/browse/HADOOP-9627
 Project: Hadoop Common
  Issue Type: Bug
  Components: test
Affects Versions: 2.4.1, 2.5.2
Reporter: Arpit Agarwal
Assignee: Ayappan
 Attachments: HADOOP-9627-v1.patch, HADOOP-9627.patch


 TestSocketIOTimeout makes some assumptions about the behavior of file 
 channels wrt partial writes that do not appear to hold true on Windows 
 [details in HADOOP-8982].
 Currently part of the test is skipped on Windows.
 This bug is to track fixing the test.



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)


[jira] [Updated] (HADOOP-9627) TestSocketIOTimeout should be rewritten without platform-specific assumptions

2014-12-12 Thread Ayappan (JIRA)

 [ 
https://issues.apache.org/jira/browse/HADOOP-9627?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Ayappan updated HADOOP-9627:

Target Version/s: 2.6.0, 3.0.0  (was: 3.0.0, 2.6.0)
  Status: Open  (was: Patch Available)

 TestSocketIOTimeout should be rewritten without platform-specific assumptions
 -

 Key: HADOOP-9627
 URL: https://issues.apache.org/jira/browse/HADOOP-9627
 Project: Hadoop Common
  Issue Type: Bug
  Components: test
Affects Versions: 2.3.0, 3.0.0
Reporter: Arpit Agarwal
Assignee: Ayappan
 Fix For: 2.5.2, 2.6.0

 Attachments: HADOOP-9627.patch


 TestSocketIOTimeout makes some assumptions about the behavior of file 
 channels wrt partial writes that do not appear to hold true on Windows 
 [details in HADOOP-8982].
 Currently part of the test is skipped on Windows.
 This bug is to track fixing the test.



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)


[jira] [Updated] (HADOOP-9627) TestSocketIOTimeout should be rewritten without platform-specific assumptions

2014-12-12 Thread Ayappan (JIRA)

 [ 
https://issues.apache.org/jira/browse/HADOOP-9627?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Ayappan updated HADOOP-9627:

   Fix Version/s: 2.6.0
  2.5.2
Assignee: Ayappan
Target Version/s: 2.6.0, 3.0.0  (was: 2.6.0)
  Status: Patch Available  (was: Open)

 TestSocketIOTimeout should be rewritten without platform-specific assumptions
 -

 Key: HADOOP-9627
 URL: https://issues.apache.org/jira/browse/HADOOP-9627
 Project: Hadoop Common
  Issue Type: Bug
  Components: test
Affects Versions: 2.3.0, 3.0.0
Reporter: Arpit Agarwal
Assignee: Ayappan
 Fix For: 2.5.2, 2.6.0

 Attachments: HADOOP-9627.patch


 TestSocketIOTimeout makes some assumptions about the behavior of file 
 channels wrt partial writes that do not appear to hold true on Windows 
 [details in HADOOP-8982].
 Currently part of the test is skipped on Windows.
 This bug is to track fixing the test.



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)


[jira] [Updated] (HADOOP-9627) TestSocketIOTimeout should be rewritten without platform-specific assumptions

2014-12-12 Thread Ayappan (JIRA)

 [ 
https://issues.apache.org/jira/browse/HADOOP-9627?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Ayappan updated HADOOP-9627:

Attachment: HADOOP-9627-v1.patch

 TestSocketIOTimeout should be rewritten without platform-specific assumptions
 -

 Key: HADOOP-9627
 URL: https://issues.apache.org/jira/browse/HADOOP-9627
 Project: Hadoop Common
  Issue Type: Bug
  Components: test
Affects Versions: 3.0.0, 2.3.0
Reporter: Arpit Agarwal
Assignee: Ayappan
 Fix For: 2.6.0, 2.5.2

 Attachments: HADOOP-9627-v1.patch, HADOOP-9627.patch


 TestSocketIOTimeout makes some assumptions about the behavior of file 
 channels wrt partial writes that do not appear to hold true on Windows 
 [details in HADOOP-8982].
 Currently part of the test is skipped on Windows.
 This bug is to track fixing the test.



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)


[jira] [Updated] (HADOOP-9627) TestSocketIOTimeout should be rewritten without platform-specific assumptions

2014-12-12 Thread Ayappan (JIRA)

 [ 
https://issues.apache.org/jira/browse/HADOOP-9627?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Ayappan updated HADOOP-9627:

 Target Version/s: 2.6.0, 3.0.0  (was: 3.0.0, 2.6.0)
Affects Version/s: (was: 2.3.0)
   2.5.2
   Status: Patch Available  (was: Open)

 TestSocketIOTimeout should be rewritten without platform-specific assumptions
 -

 Key: HADOOP-9627
 URL: https://issues.apache.org/jira/browse/HADOOP-9627
 Project: Hadoop Common
  Issue Type: Bug
  Components: test
Affects Versions: 2.5.2, 3.0.0
Reporter: Arpit Agarwal
Assignee: Ayappan
 Fix For: 2.5.2, 2.6.0

 Attachments: HADOOP-9627-v1.patch, HADOOP-9627.patch


 TestSocketIOTimeout makes some assumptions about the behavior of file 
 channels wrt partial writes that do not appear to hold true on Windows 
 [details in HADOOP-8982].
 Currently part of the test is skipped on Windows.
 This bug is to track fixing the test.



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)


[jira] [Commented] (HADOOP-9627) TestSocketIOTimeout should be rewritten without platform-specific assumptions

2014-12-12 Thread Ayappan (JIRA)

[ 
https://issues.apache.org/jira/browse/HADOOP-9627?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanelfocusedCommentId=14243951#comment-14243951
 ] 

Ayappan commented on HADOOP-9627:
-

Findbugs warnings are unrelated to this patch.

 TestSocketIOTimeout should be rewritten without platform-specific assumptions
 -

 Key: HADOOP-9627
 URL: https://issues.apache.org/jira/browse/HADOOP-9627
 Project: Hadoop Common
  Issue Type: Bug
  Components: test
Affects Versions: 3.0.0, 2.5.2
Reporter: Arpit Agarwal
Assignee: Ayappan
 Fix For: 2.6.0, 2.5.2

 Attachments: HADOOP-9627-v1.patch, HADOOP-9627.patch


 TestSocketIOTimeout makes some assumptions about the behavior of file 
 channels wrt partial writes that do not appear to hold true on Windows 
 [details in HADOOP-8982].
 Currently part of the test is skipped on Windows.
 This bug is to track fixing the test.



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)


[jira] [Updated] (HADOOP-10846) DataChecksum#calculateChunkedSums not working for PPC when buffers not backed by array

2014-11-06 Thread Ayappan (JIRA)

 [ 
https://issues.apache.org/jira/browse/HADOOP-10846?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Ayappan updated HADOOP-10846:
-
Target Version/s: 2.6.0

 DataChecksum#calculateChunkedSums not working for PPC when buffers not backed 
 by array
 --

 Key: HADOOP-10846
 URL: https://issues.apache.org/jira/browse/HADOOP-10846
 Project: Hadoop Common
  Issue Type: Bug
  Components: util
Affects Versions: 2.2.0, 2.3.0, 2.4.0, 2.4.1
Reporter: Jinghui Wang
Assignee: Jinghui Wang
 Attachments: HADOOP-10846-v1.patch, HADOOP-10846-v2.patch, 
 HADOOP-10846.patch


 Got the following exception when running Hadoop on PowerPC. The 
 implementation for computing the checksum fails when the data buffer and 
 checksum buffer are not backed by arrays.
 13/09/16 04:06:57 ERROR security.UserGroupInformation: 
 PriviledgedActionException as:biadmin (auth:SIMPLE) 
 cause:org.apache.hadoop.ipc.RemoteException(java.io.IOException): 
 org.apache.hadoop.fs.ChecksumException: Checksum error



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)


[jira] [Updated] (HADOOP-10744) LZ4 Compression fails to recognize PowerPC Little Endian Architecture

2014-10-27 Thread Ayappan (JIRA)

 [ 
https://issues.apache.org/jira/browse/HADOOP-10744?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Ayappan updated HADOOP-10744:
-
Fix Version/s: 2.4.1
   Status: Patch Available  (was: Open)

 LZ4 Compression fails to recognize PowerPC Little Endian Architecture
 -

 Key: HADOOP-10744
 URL: https://issues.apache.org/jira/browse/HADOOP-10744
 Project: Hadoop Common
  Issue Type: Test
  Components: io, native
Affects Versions: 2.4.0, 2.3.0, 2.2.0
 Environment: PowerPC Little Endian (ppc64le)
Reporter: Ayappan
Assignee: Ayappan
 Fix For: 2.4.1

 Attachments: HADOOP-10744.patch


 Lz4 Compression fails to identify the PowerPC Little Endian Architecture. It 
 recognizes it as Big Endian, and several testcases 
 (TestCompressorDecompressor, TestCodec, TestLz4CompressorDecompressor) fail 
 due to this.
 Running org.apache.hadoop.io.compress.TestCompressorDecompressor
 Tests run: 2, Failures: 2, Errors: 0, Skipped: 0, Time elapsed: 0.435 sec 
 <<< FAILURE! - in org.apache.hadoop.io.compress.TestCompressorDecompressor
 testCompressorDecompressor(org.apache.hadoop.io.compress.TestCompressorDecompressor)
   Time elapsed: 0.308 sec <<< FAILURE!
 org.junit.internal.ArrayComparisonFailure: 
 org.apache.hadoop.io.compress.lz4.Lz4Compressor_org.apache.hadoop.io.compress.lz4.Lz4Decompressor-
   byte arrays not equals error !!!: arrays first differed at element [1428]; 
 expected:<4> but was:<10>
 at 
 org.junit.internal.ComparisonCriteria.arrayEquals(ComparisonCriteria.java:50)
 at org.junit.Assert.internalArrayEquals(Assert.java:473)
 at org.junit.Assert.assertArrayEquals(Assert.java:294)
 at 
 org.apache.hadoop.io.compress.CompressDecompressTester$CompressionTestStrategy$2.assertCompression(CompressDecompressTester.java:325)
 at 
 org.apache.hadoop.io.compress.CompressDecompressTester.test(CompressDecompressTester.java:135)
 at 
 org.apache.hadoop.io.compress.TestCompressorDecompressor.testCompressorDecompressor(TestCompressorDecompressor.java:58)
 ...
 ...
 .



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)


[jira] [Commented] (HADOOP-10846) DataChecksum#calculateChunkedSums not working for PPC when buffers not backed by array

2014-10-27 Thread Ayappan (JIRA)

[ 
https://issues.apache.org/jira/browse/HADOOP-10846?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanelfocusedCommentId=14184886#comment-14184886
 ] 

Ayappan commented on HADOOP-10846:
--

Any update on when this will get pulled into the trunk? 

 DataChecksum#calculateChunkedSums not working for PPC when buffers not backed 
 by array
 --

 Key: HADOOP-10846
 URL: https://issues.apache.org/jira/browse/HADOOP-10846
 Project: Hadoop Common
  Issue Type: Bug
  Components: util
Affects Versions: 2.2.0, 2.3.0, 2.4.0, 2.4.1
Reporter: Jinghui Wang
Assignee: Jinghui Wang
 Attachments: HADOOP-10846-v1.patch, HADOOP-10846-v2.patch, 
 HADOOP-10846.patch


 Got the following exception when running Hadoop on PowerPC. The 
 implementation for computing the checksum fails when the data buffer and 
 checksum buffer are not backed by arrays.
 13/09/16 04:06:57 ERROR security.UserGroupInformation: 
 PriviledgedActionException as:biadmin (auth:SIMPLE) 
 cause:org.apache.hadoop.ipc.RemoteException(java.io.IOException): 
 org.apache.hadoop.fs.ChecksumException: Checksum error



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)


[jira] [Updated] (HADOOP-10744) LZ4 Compression fails to recognize PowerPC Little Endian Architecture

2014-10-27 Thread Ayappan (JIRA)

 [ 
https://issues.apache.org/jira/browse/HADOOP-10744?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Ayappan updated HADOOP-10744:
-
Attachment: HADOOP-10744-v1.patch

The fixed patch

 LZ4 Compression fails to recognize PowerPC Little Endian Architecture
 -

 Key: HADOOP-10744
 URL: https://issues.apache.org/jira/browse/HADOOP-10744
 Project: Hadoop Common
  Issue Type: Test
  Components: io, native
Affects Versions: 2.2.0, 2.3.0, 2.4.0
 Environment: PowerPC Little Endian (ppc64le)
Reporter: Ayappan
Assignee: Ayappan
 Fix For: 2.4.1

 Attachments: HADOOP-10744-v1.patch, HADOOP-10744.patch


 Lz4 Compression fails to identify the PowerPC Little Endian Architecture. It 
 recognizes it as Big Endian, and several testcases 
 (TestCompressorDecompressor, TestCodec, TestLz4CompressorDecompressor) fail 
 due to this.
 Running org.apache.hadoop.io.compress.TestCompressorDecompressor
 Tests run: 2, Failures: 2, Errors: 0, Skipped: 0, Time elapsed: 0.435 sec 
 <<< FAILURE! - in org.apache.hadoop.io.compress.TestCompressorDecompressor
 testCompressorDecompressor(org.apache.hadoop.io.compress.TestCompressorDecompressor)
   Time elapsed: 0.308 sec <<< FAILURE!
 org.junit.internal.ArrayComparisonFailure: 
 org.apache.hadoop.io.compress.lz4.Lz4Compressor_org.apache.hadoop.io.compress.lz4.Lz4Decompressor-
   byte arrays not equals error !!!: arrays first differed at element [1428]; 
 expected:<4> but was:<10>
 at 
 org.junit.internal.ComparisonCriteria.arrayEquals(ComparisonCriteria.java:50)
 at org.junit.Assert.internalArrayEquals(Assert.java:473)
 at org.junit.Assert.assertArrayEquals(Assert.java:294)
 at 
 org.apache.hadoop.io.compress.CompressDecompressTester$CompressionTestStrategy$2.assertCompression(CompressDecompressTester.java:325)
 at 
 org.apache.hadoop.io.compress.CompressDecompressTester.test(CompressDecompressTester.java:135)
 at 
 org.apache.hadoop.io.compress.TestCompressorDecompressor.testCompressorDecompressor(TestCompressorDecompressor.java:58)
 ...
 ...
 .



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)


[jira] [Commented] (HADOOP-10744) LZ4 Compression fails to recognize PowerPC Little Endian Architecture

2014-10-27 Thread Ayappan (JIRA)

[ 
https://issues.apache.org/jira/browse/HADOOP-10744?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanelfocusedCommentId=14185030#comment-14185030
 ] 

Ayappan commented on HADOOP-10744:
--

This patch resolves existing compression-related test failures on the 
little-endian architecture, so no new tests are required for this patch.

 LZ4 Compression fails to recognize PowerPC Little Endian Architecture
 -

 Key: HADOOP-10744
 URL: https://issues.apache.org/jira/browse/HADOOP-10744
 Project: Hadoop Common
  Issue Type: Test
  Components: io, native
Affects Versions: 2.2.0, 2.3.0, 2.4.0
 Environment: PowerPC Little Endian (ppc64le)
Reporter: Ayappan
Assignee: Ayappan
 Fix For: 2.4.1

 Attachments: HADOOP-10744-v1.patch, HADOOP-10744.patch


 Lz4 Compression fails to identify the PowerPC Little Endian Architecture. It 
 recognizes it as Big Endian, and several testcases 
 (TestCompressorDecompressor, TestCodec, TestLz4CompressorDecompressor) fail 
 due to this.
 Running org.apache.hadoop.io.compress.TestCompressorDecompressor
 Tests run: 2, Failures: 2, Errors: 0, Skipped: 0, Time elapsed: 0.435 sec 
 <<< FAILURE! - in org.apache.hadoop.io.compress.TestCompressorDecompressor
 testCompressorDecompressor(org.apache.hadoop.io.compress.TestCompressorDecompressor)
   Time elapsed: 0.308 sec <<< FAILURE!
 org.junit.internal.ArrayComparisonFailure: 
 org.apache.hadoop.io.compress.lz4.Lz4Compressor_org.apache.hadoop.io.compress.lz4.Lz4Decompressor-
   byte arrays not equals error !!!: arrays first differed at element [1428]; 
 expected:<4> but was:<10>
 at 
 org.junit.internal.ComparisonCriteria.arrayEquals(ComparisonCriteria.java:50)
 at org.junit.Assert.internalArrayEquals(Assert.java:473)
 at org.junit.Assert.assertArrayEquals(Assert.java:294)
 at 
 org.apache.hadoop.io.compress.CompressDecompressTester$CompressionTestStrategy$2.assertCompression(CompressDecompressTester.java:325)
 at 
 org.apache.hadoop.io.compress.CompressDecompressTester.test(CompressDecompressTester.java:135)
 at 
 org.apache.hadoop.io.compress.TestCompressorDecompressor.testCompressorDecompressor(TestCompressorDecompressor.java:58)
 ...
 ...
 .



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)


[jira] [Assigned] (HADOOP-10744) LZ4 Compression fails to recognize PowerPC Little Endian Architecture

2014-10-16 Thread Ayappan (JIRA)

 [ 
https://issues.apache.org/jira/browse/HADOOP-10744?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Ayappan reassigned HADOOP-10744:


Assignee: Ayappan

 LZ4 Compression fails to recognize PowerPC Little Endian Architecture
 -

 Key: HADOOP-10744
 URL: https://issues.apache.org/jira/browse/HADOOP-10744
 Project: Hadoop Common
  Issue Type: Test
  Components: io, native
Affects Versions: 2.2.0, 2.3.0, 2.4.0
 Environment: PowerPC Little Endian (ppc64le)
Reporter: Ayappan
Assignee: Ayappan
 Attachments: HADOOP-10744.patch


 Lz4 Compression fails to identify the PowerPC Little Endian Architecture. It 
 recognizes it as Big Endian, and several testcases 
 (TestCompressorDecompressor, TestCodec, TestLz4CompressorDecompressor) fail 
 due to this.
 Running org.apache.hadoop.io.compress.TestCompressorDecompressor
 Tests run: 2, Failures: 2, Errors: 0, Skipped: 0, Time elapsed: 0.435 sec 
 <<< FAILURE! - in org.apache.hadoop.io.compress.TestCompressorDecompressor
 testCompressorDecompressor(org.apache.hadoop.io.compress.TestCompressorDecompressor)
   Time elapsed: 0.308 sec <<< FAILURE!
 org.junit.internal.ArrayComparisonFailure: 
 org.apache.hadoop.io.compress.lz4.Lz4Compressor_org.apache.hadoop.io.compress.lz4.Lz4Decompressor-
   byte arrays not equals error !!!: arrays first differed at element [1428]; 
 expected:<4> but was:<10>
 at 
 org.junit.internal.ComparisonCriteria.arrayEquals(ComparisonCriteria.java:50)
 at org.junit.Assert.internalArrayEquals(Assert.java:473)
 at org.junit.Assert.assertArrayEquals(Assert.java:294)
 at 
 org.apache.hadoop.io.compress.CompressDecompressTester$CompressionTestStrategy$2.assertCompression(CompressDecompressTester.java:325)
 at 
 org.apache.hadoop.io.compress.CompressDecompressTester.test(CompressDecompressTester.java:135)
 at 
 org.apache.hadoop.io.compress.TestCompressorDecompressor.testCompressorDecompressor(TestCompressorDecompressor.java:58)
 ...
 ...
 .



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)


[jira] [Commented] (HADOOP-9627) TestSocketIOTimeout should be rewritten without platform-specific assumptions

2014-10-16 Thread Ayappan (JIRA)

[ 
https://issues.apache.org/jira/browse/HADOOP-9627?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanelfocusedCommentId=14173442#comment-14173442
 ] 

Ayappan commented on HADOOP-9627:
-

Any update on this defect?

 TestSocketIOTimeout should be rewritten without platform-specific assumptions
 -

 Key: HADOOP-9627
 URL: https://issues.apache.org/jira/browse/HADOOP-9627
 Project: Hadoop Common
  Issue Type: Bug
  Components: test
Affects Versions: 3.0.0, 2.3.0
Reporter: Arpit Agarwal
 Attachments: HADOOP-9627.patch


 TestSocketIOTimeout makes some assumptions about the behavior of file 
 channels wrt partial writes that do not appear to hold true on Windows 
 [details in HADOOP-8982].
 Currently part of the test is skipped on Windows.
 This bug is to track fixing the test.



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)


[jira] [Commented] (HADOOP-10846) DataChecksum#calculateChunkedSums not working for PPC when buffers not backed by array

2014-10-14 Thread Ayappan (JIRA)

[ 
https://issues.apache.org/jira/browse/HADOOP-10846?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanelfocusedCommentId=14170724#comment-14170724
 ] 

Ayappan commented on HADOOP-10846:
--

A lot of changes went into Hadoop since the patch was attached, so the old 
patch no longer seems to be correct. 
I reworked the patch and attached a new one named HADOOP-10846-v1.patch.

 DataChecksum#calculateChunkedSums not working for PPC when buffers not backed 
 by array
 --

 Key: HADOOP-10846
 URL: https://issues.apache.org/jira/browse/HADOOP-10846
 Project: Hadoop Common
  Issue Type: Bug
  Components: util
Affects Versions: 2.2.0, 2.3.0, 2.4.0, 2.4.1
Reporter: Jinghui Wang
Assignee: Jinghui Wang
 Attachments: HADOOP-10846-v1.patch, HADOOP-10846.patch


 Got the following exception when running Hadoop on PowerPC. The 
 implementation for computing the checksum fails when the data buffer and 
 checksum buffer are not backed by arrays.
 13/09/16 04:06:57 ERROR security.UserGroupInformation: 
 PriviledgedActionException as:biadmin (auth:SIMPLE) 
 cause:org.apache.hadoop.ipc.RemoteException(java.io.IOException): 
 org.apache.hadoop.fs.ChecksumException: Checksum error



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)


[jira] [Updated] (HADOOP-10846) DataChecksum#calculateChunkedSums not working for PPC when buffers not backed by array

2014-10-14 Thread Ayappan (JIRA)

 [ 
https://issues.apache.org/jira/browse/HADOOP-10846?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Ayappan updated HADOOP-10846:
-
Attachment: HADOOP-10846-v1.patch

 DataChecksum#calculateChunkedSums not working for PPC when buffers not backed 
 by array
 --

 Key: HADOOP-10846
 URL: https://issues.apache.org/jira/browse/HADOOP-10846
 Project: Hadoop Common
  Issue Type: Bug
  Components: util
Affects Versions: 2.2.0, 2.3.0, 2.4.0, 2.4.1
Reporter: Jinghui Wang
Assignee: Jinghui Wang
 Attachments: HADOOP-10846-v1.patch, HADOOP-10846.patch


 Got the following exception when running Hadoop on PowerPC. The 
 implementation for computing the checksum fails when the data buffer and 
 checksum buffer are not backed by arrays.
 13/09/16 04:06:57 ERROR security.UserGroupInformation: 
 PriviledgedActionException as:biadmin (auth:SIMPLE) 
 cause:org.apache.hadoop.ipc.RemoteException(java.io.IOException): 
 org.apache.hadoop.fs.ChecksumException: Checksum error



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)


[jira] [Updated] (HADOOP-10846) DataChecksum#calculateChunkedSums not working for PPC when buffers not backed by array

2014-10-14 Thread Ayappan (JIRA)

 [ 
https://issues.apache.org/jira/browse/HADOOP-10846?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Ayappan updated HADOOP-10846:
-
Attachment: HADOOP-10846-v2.patch

 DataChecksum#calculateChunkedSums not working for PPC when buffers not backed 
 by array
 --

 Key: HADOOP-10846
 URL: https://issues.apache.org/jira/browse/HADOOP-10846
 Project: Hadoop Common
  Issue Type: Bug
  Components: util
Affects Versions: 2.2.0, 2.3.0, 2.4.0, 2.4.1
Reporter: Jinghui Wang
Assignee: Jinghui Wang
 Attachments: HADOOP-10846-v1.patch, HADOOP-10846-v2.patch, 
 HADOOP-10846.patch


 Got the following exception when running Hadoop on PowerPC. The 
 implementation for computing the checksum fails when the data buffer and 
 checksum buffer are not backed by arrays.
 13/09/16 04:06:57 ERROR security.UserGroupInformation: 
 PriviledgedActionException as:biadmin (auth:SIMPLE) 
 cause:org.apache.hadoop.ipc.RemoteException(java.io.IOException): 
 org.apache.hadoop.fs.ChecksumException: Checksum error



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)


[jira] [Commented] (HADOOP-10846) DataChecksum#calculateChunkedSums not working for PPC when buffers not backed by array

2014-10-14 Thread Ayappan (JIRA)

[ 
https://issues.apache.org/jira/browse/HADOOP-10846?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanelfocusedCommentId=14170838#comment-14170838
 ] 

Ayappan commented on HADOOP-10846:
--

The last patch contains some extra spaces at the line ends, due to which it 
fails.
I uploaded a new fixed patch, HADOOP-10846-v3.patch.

 DataChecksum#calculateChunkedSums not working for PPC when buffers not backed 
 by array
 --

 Key: HADOOP-10846
 URL: https://issues.apache.org/jira/browse/HADOOP-10846
 Project: Hadoop Common
  Issue Type: Bug
  Components: util
Affects Versions: 2.2.0, 2.3.0, 2.4.0, 2.4.1
Reporter: Jinghui Wang
Assignee: Jinghui Wang
 Attachments: HADOOP-10846-v1.patch, HADOOP-10846-v2.patch, 
 HADOOP-10846.patch


 Got the following exception when running Hadoop on PowerPC. The 
 implementation for computing the checksum fails when the data buffer and 
 checksum buffer are not backed by arrays.
 13/09/16 04:06:57 ERROR security.UserGroupInformation: 
 PriviledgedActionException as:biadmin (auth:SIMPLE) 
 cause:org.apache.hadoop.ipc.RemoteException(java.io.IOException): 
 org.apache.hadoop.fs.ChecksumException: Checksum error



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)


[jira] [Commented] (HADOOP-10846) DataChecksum#calculateChunkedSums not working for PPC when buffers not backed by array

2014-10-14 Thread Ayappan (JIRA)

[ 
https://issues.apache.org/jira/browse/HADOOP-10846?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanelfocusedCommentId=14170842#comment-14170842
 ] 

Ayappan commented on HADOOP-10846:
--

Sorry, the newly attached patch is HADOOP-10846-v2.patch.

 DataChecksum#calculateChunkedSums not working for PPC when buffers not backed 
 by array
 --

 Key: HADOOP-10846
 URL: https://issues.apache.org/jira/browse/HADOOP-10846
 Project: Hadoop Common
  Issue Type: Bug
  Components: util
Affects Versions: 2.2.0, 2.3.0, 2.4.0, 2.4.1
Reporter: Jinghui Wang
Assignee: Jinghui Wang
 Attachments: HADOOP-10846-v1.patch, HADOOP-10846-v2.patch, 
 HADOOP-10846.patch


 Got the following exception when running Hadoop on PowerPC. The 
 implementation for computing the checksum fails when the data buffer and 
 checksum buffer are not backed by arrays.
 13/09/16 04:06:57 ERROR security.UserGroupInformation: 
 PriviledgedActionException as:biadmin (auth:SIMPLE) 
 cause:org.apache.hadoop.ipc.RemoteException(java.io.IOException): 
 org.apache.hadoop.fs.ChecksumException: Checksum error



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)


[jira] [Commented] (HADOOP-10846) DataChecksum#calculateChunkedSums not working for PPC when buffers not backed by array

2014-10-14 Thread Ayappan (JIRA)

[ 
https://issues.apache.org/jira/browse/HADOOP-10846?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanelfocusedCommentId=14170905#comment-14170905
 ] 

Ayappan commented on HADOOP-10846:
--

This patch resolves checksum errors in the existing tests. So no new tests are 
needed for this patch.
Findbugs warnings are not related to this patch.



 DataChecksum#calculateChunkedSums not working for PPC when buffers not backed 
 by array
 --

 Key: HADOOP-10846
 URL: https://issues.apache.org/jira/browse/HADOOP-10846
 Project: Hadoop Common
  Issue Type: Bug
  Components: util
Affects Versions: 2.2.0, 2.3.0, 2.4.0, 2.4.1
Reporter: Jinghui Wang
Assignee: Jinghui Wang
 Attachments: HADOOP-10846-v1.patch, HADOOP-10846-v2.patch, 
 HADOOP-10846.patch


 Got the following exception when running Hadoop on PowerPC. The 
 implementation for computing the checksum fails when the data buffer and 
 checksum buffer are not backed by arrays.
 13/09/16 04:06:57 ERROR security.UserGroupInformation: 
 PriviledgedActionException as:biadmin (auth:SIMPLE) 
 cause:org.apache.hadoop.ipc.RemoteException(java.io.IOException): 
 org.apache.hadoop.fs.ChecksumException: Checksum error



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)


[jira] [Commented] (HADOOP-10744) LZ4 Compression fails to recognize PowerPC Little Endian Architecture

2014-10-06 Thread Ayappan (JIRA)

[ 
https://issues.apache.org/jira/browse/HADOOP-10744?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanelfocusedCommentId=14160107#comment-14160107
 ] 

Ayappan commented on HADOOP-10744:
--

Any update on when the new LZ4 development version will be picked up?

 LZ4 Compression fails to recognize PowerPC Little Endian Architecture
 -

 Key: HADOOP-10744
 URL: https://issues.apache.org/jira/browse/HADOOP-10744
 Project: Hadoop Common
  Issue Type: Test
  Components: io, native
Affects Versions: 2.2.0, 2.3.0, 2.4.0
 Environment: PowerPC Little Endian (ppc64le)
Reporter: Ayappan
 Attachments: HADOOP-10744.patch


 Lz4 Compression fails to identify the PowerPC Little Endian Architecture. It 
 recognizes it as Big Endian, and several testcases 
 (TestCompressorDecompressor, TestCodec, TestLz4CompressorDecompressor) fail 
 due to this.
 Running org.apache.hadoop.io.compress.TestCompressorDecompressor
 Tests run: 2, Failures: 2, Errors: 0, Skipped: 0, Time elapsed: 0.435 sec 
 <<< FAILURE! - in org.apache.hadoop.io.compress.TestCompressorDecompressor
 testCompressorDecompressor(org.apache.hadoop.io.compress.TestCompressorDecompressor)
   Time elapsed: 0.308 sec <<< FAILURE!
 org.junit.internal.ArrayComparisonFailure: 
 org.apache.hadoop.io.compress.lz4.Lz4Compressor_org.apache.hadoop.io.compress.lz4.Lz4Decompressor-
   byte arrays not equals error !!!: arrays first differed at element [1428]; 
 expected:<4> but was:<10>
 at 
 org.junit.internal.ComparisonCriteria.arrayEquals(ComparisonCriteria.java:50)
 at org.junit.Assert.internalArrayEquals(Assert.java:473)
 at org.junit.Assert.assertArrayEquals(Assert.java:294)
 at 
 org.apache.hadoop.io.compress.CompressDecompressTester$CompressionTestStrategy$2.assertCompression(CompressDecompressTester.java:325)
 at 
 org.apache.hadoop.io.compress.CompressDecompressTester.test(CompressDecompressTester.java:135)
 at 
 org.apache.hadoop.io.compress.TestCompressorDecompressor.testCompressorDecompressor(TestCompressorDecompressor.java:58)
 ...
 ...
 .



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)


[jira] [Commented] (HADOOP-10846) DataChecksum#calculateChunkedSums not working for PPC when buffers not backed by array

2014-10-06 Thread Ayappan (JIRA)

[ 
https://issues.apache.org/jira/browse/HADOOP-10846?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanelfocusedCommentId=14160114#comment-14160114
 ] 

Ayappan commented on HADOOP-10846:
--

This patch resolves many testcase errors and failures on PPC, and it needs to 
be included in the upcoming release.

 DataChecksum#calculateChunkedSums not working for PPC when buffers not backed 
 by array
 --

 Key: HADOOP-10846
 URL: https://issues.apache.org/jira/browse/HADOOP-10846
 Project: Hadoop Common
  Issue Type: Bug
  Components: util
Affects Versions: 2.2.0, 2.3.0, 2.4.0, 2.4.1
Reporter: Jinghui Wang
Assignee: Jinghui Wang
 Attachments: HADOOP-10846.patch


 Got the following exception when running Hadoop on PowerPC. The 
 implementation for computing the checksum fails when the data buffer and 
 checksum buffer are not backed by arrays.
 13/09/16 04:06:57 ERROR security.UserGroupInformation: 
 PriviledgedActionException as:biadmin (auth:SIMPLE) 
 cause:org.apache.hadoop.ipc.RemoteException(java.io.IOException): 
 org.apache.hadoop.fs.ChecksumException: Checksum error



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)


[jira] [Updated] (HADOOP-9627) TestSocketIOTimeout should be rewritten without platform-specific assumptions

2014-07-10 Thread Ayappan (JIRA)

 [ 
https://issues.apache.org/jira/browse/HADOOP-9627?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Ayappan updated HADOOP-9627:


Target Version/s: 2.6.0

 TestSocketIOTimeout should be rewritten without platform-specific assumptions
 -

 Key: HADOOP-9627
 URL: https://issues.apache.org/jira/browse/HADOOP-9627
 Project: Hadoop Common
  Issue Type: Bug
  Components: test
Affects Versions: 3.0.0, 2.3.0
Reporter: Arpit Agarwal
 Attachments: HADOOP-9627.patch


 TestSocketIOTimeout makes some assumptions about the behavior of file 
 channels wrt partial writes that do not appear to hold true on Windows 
 [details in HADOOP-8982].
 Currently part of the test is skipped on Windows.
 This bug is to track fixing the test.



--
This message was sent by Atlassian JIRA
(v6.2#6252)


[jira] [Updated] (HADOOP-9627) TestSocketIOTimeout should be rewritten without platform-specific assumptions

2014-07-02 Thread Ayappan (JIRA)

 [ 
https://issues.apache.org/jira/browse/HADOOP-9627?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Ayappan updated HADOOP-9627:


Attachment: HADOOP-9627.patch

 TestSocketIOTimeout should be rewritten without platform-specific assumptions
 -

 Key: HADOOP-9627
 URL: https://issues.apache.org/jira/browse/HADOOP-9627
 Project: Hadoop Common
  Issue Type: Bug
  Components: test
Affects Versions: 3.0.0, 2.3.0
Reporter: Arpit Agarwal
 Attachments: HADOOP-9627.patch


 TestSocketIOTimeout makes some assumptions about the behavior of file 
 channels wrt partial writes that do not appear to hold true on Windows 
 [details in HADOOP-8982].
 Currently part of the test is skipped on Windows.
 This bug is to track fixing the test.



--
This message was sent by Atlassian JIRA
(v6.2#6252)


[jira] [Commented] (HADOOP-9627) TestSocketIOTimeout should be rewritten without platform-specific assumptions

2014-07-02 Thread Ayappan (JIRA)

[ 
https://issues.apache.org/jira/browse/HADOOP-9627?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanelfocusedCommentId=14049867#comment-14049867
 ] 

Ayappan commented on HADOOP-9627:
-

TestSocketIOWithTimeout uses a block size of 4192 bytes to simulate a partial 
write. This seems to be valid on the x86 architecture, where the default 
minimum block size is 4096. 
This testcase fails on PowerPC, where the default minimum block size is 65536 
bytes (64KB). So for PowerPC, using a block size a little more than 64K, say 
65632 (65536 + 96), holds good for this scenario.
I attached a patch here where I made it very general by introducing 
NativeIO.POSIX.getCacheManipulator().getOperatingSystemPageSize() to get the 
page size.
I tested my patch on both ppc64 and x86 Linux machines.

 TestSocketIOTimeout should be rewritten without platform-specific assumptions
 -

 Key: HADOOP-9627
 URL: https://issues.apache.org/jira/browse/HADOOP-9627
 Project: Hadoop Common
  Issue Type: Bug
  Components: test
Affects Versions: 3.0.0, 2.3.0
Reporter: Arpit Agarwal
 Attachments: HADOOP-9627.patch


 TestSocketIOTimeout makes some assumptions about the behavior of file 
 channels wrt partial writes that do not appear to hold true on Windows 
 [details in HADOOP-8982].
 Currently part of the test is skipped on Windows.
 This bug is to track fixing the test.



--
This message was sent by Atlassian JIRA
(v6.2#6252)


[jira] [Commented] (HADOOP-10744) LZ4 Compression fails to recognize PowerPC Little Endian Architecture

2014-07-02 Thread Ayappan (JIRA)

[ 
https://issues.apache.org/jira/browse/HADOOP-10744?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanelfocusedCommentId=14051088#comment-14051088
 ] 

Ayappan commented on HADOOP-10744:
--

This LZ4 compression code is one of the core files in hadoop-common. This 
issue is not related to testcase modification; it is related to a failure on a 
particular architecture. So I think the latest LZ4 development version should 
be made available in the Hadoop 2.5.0 release, or at least the above patch 
should go into 2.5.0.

 LZ4 Compression fails to recognize PowerPC Little Endian Architecture
 -

 Key: HADOOP-10744
 URL: https://issues.apache.org/jira/browse/HADOOP-10744
 Project: Hadoop Common
  Issue Type: Test
  Components: io, native
Affects Versions: 2.2.0, 2.3.0, 2.4.0
 Environment: PowerPC Little Endian (ppc64le)
Reporter: Ayappan
 Attachments: HADOOP-10744.patch


 Lz4 Compression fails to identify the PowerPC Little Endian Architecture. It 
 recognizes it as Big Endian, and several testcases 
 (TestCompressorDecompressor, TestCodec, TestLz4CompressorDecompressor) fail 
 due to this.
 Running org.apache.hadoop.io.compress.TestCompressorDecompressor
 Tests run: 2, Failures: 2, Errors: 0, Skipped: 0, Time elapsed: 0.435 sec 
 <<< FAILURE! - in org.apache.hadoop.io.compress.TestCompressorDecompressor
 testCompressorDecompressor(org.apache.hadoop.io.compress.TestCompressorDecompressor)
   Time elapsed: 0.308 sec <<< FAILURE!
 org.junit.internal.ArrayComparisonFailure: 
 org.apache.hadoop.io.compress.lz4.Lz4Compressor_org.apache.hadoop.io.compress.lz4.Lz4Decompressor-
   byte arrays not equals error !!!: arrays first differed at element [1428]; 
 expected:<4> but was:<10>
 at 
 org.junit.internal.ComparisonCriteria.arrayEquals(ComparisonCriteria.java:50)
 at org.junit.Assert.internalArrayEquals(Assert.java:473)
 at org.junit.Assert.assertArrayEquals(Assert.java:294)
 at 
 org.apache.hadoop.io.compress.CompressDecompressTester$CompressionTestStrategy$2.assertCompression(CompressDecompressTester.java:325)
 at 
 org.apache.hadoop.io.compress.CompressDecompressTester.test(CompressDecompressTester.java:135)
 at 
 org.apache.hadoop.io.compress.TestCompressorDecompressor.testCompressorDecompressor(TestCompressorDecompressor.java:58)
 ...
 ...
 .



--
This message was sent by Atlassian JIRA
(v6.2#6252)


[jira] [Commented] (HADOOP-10744) LZ4 Compression fails to recognize PowerPC Little Endian Architecture

2014-06-30 Thread Ayappan (JIRA)

[ 
https://issues.apache.org/jira/browse/HADOOP-10744?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanelfocusedCommentId=14047555#comment-14047555
 ] 

Ayappan commented on HADOOP-10744:
--

Okay. So, any idea on when the latest LZ4 development version will be made 
available to Hadoop?

 LZ4 Compression fails to recognize PowerPC Little Endian Architecture
 -

 Key: HADOOP-10744
 URL: https://issues.apache.org/jira/browse/HADOOP-10744
 Project: Hadoop Common
  Issue Type: Test
  Components: io, native
Affects Versions: 2.2.0, 2.3.0, 2.4.0
 Environment: PowerPC Little Endian (ppc64le)
Reporter: Ayappan
 Attachments: HADOOP-10744.patch


 Lz4 Compression fails to identify the PowerPC Little Endian Architecture. It 
 recognizes it as Big Endian, and several testcases 
 (TestCompressorDecompressor, TestCodec, TestLz4CompressorDecompressor) fail 
 due to this.
 Running org.apache.hadoop.io.compress.TestCompressorDecompressor
 Tests run: 2, Failures: 2, Errors: 0, Skipped: 0, Time elapsed: 0.435 sec 
 <<< FAILURE! - in org.apache.hadoop.io.compress.TestCompressorDecompressor
 testCompressorDecompressor(org.apache.hadoop.io.compress.TestCompressorDecompressor)
   Time elapsed: 0.308 sec <<< FAILURE!
 org.junit.internal.ArrayComparisonFailure: 
 org.apache.hadoop.io.compress.lz4.Lz4Compressor_org.apache.hadoop.io.compress.lz4.Lz4Decompressor-
   byte arrays not equals error !!!: arrays first differed at element [1428]; 
 expected:<4> but was:<10>
 at 
 org.junit.internal.ComparisonCriteria.arrayEquals(ComparisonCriteria.java:50)
 at org.junit.Assert.internalArrayEquals(Assert.java:473)
 at org.junit.Assert.assertArrayEquals(Assert.java:294)
 at 
 org.apache.hadoop.io.compress.CompressDecompressTester$CompressionTestStrategy$2.assertCompression(CompressDecompressTester.java:325)
 at 
 org.apache.hadoop.io.compress.CompressDecompressTester.test(CompressDecompressTester.java:135)
 at 
 org.apache.hadoop.io.compress.TestCompressorDecompressor.testCompressorDecompressor(TestCompressorDecompressor.java:58)
 ...
 ...
 .



--
This message was sent by Atlassian JIRA
(v6.2#6252)


[jira] [Created] (HADOOP-10744) LZ4 Compression fails to recognize PowerPC Little Endian Architecture

2014-06-24 Thread Ayappan (JIRA)
Ayappan created HADOOP-10744:


 Summary: LZ4 Compression fails to recognize PowerPC Little Endian 
Architecture
 Key: HADOOP-10744
 URL: https://issues.apache.org/jira/browse/HADOOP-10744
 Project: Hadoop Common
  Issue Type: Test
  Components: io, native
Affects Versions: 2.4.0, 2.3.0, 2.2.0
 Environment: PowerPC Little Endian (ppc64le)
Reporter: Ayappan


Lz4 Compression fails to identify the PowerPC Little Endian Architecture. It 
recognizes it as Big Endian, and several testcases (TestCompressorDecompressor, 
TestCodec, TestLz4CompressorDecompressor) fail due to this.

Running org.apache.hadoop.io.compress.TestCompressorDecompressor
Tests run: 2, Failures: 2, Errors: 0, Skipped: 0, Time elapsed: 0.435 sec 
<<< FAILURE! - in org.apache.hadoop.io.compress.TestCompressorDecompressor
testCompressorDecompressor(org.apache.hadoop.io.compress.TestCompressorDecompressor)
  Time elapsed: 0.308 sec <<< FAILURE!
org.junit.internal.ArrayComparisonFailure: 
org.apache.hadoop.io.compress.lz4.Lz4Compressor_org.apache.hadoop.io.compress.lz4.Lz4Decompressor-
  byte arrays not equals error !!!: arrays first differed at element [1428]; 
expected:<4> but was:<10>
at 
org.junit.internal.ComparisonCriteria.arrayEquals(ComparisonCriteria.java:50)
at org.junit.Assert.internalArrayEquals(Assert.java:473)
at org.junit.Assert.assertArrayEquals(Assert.java:294)
at 
org.apache.hadoop.io.compress.CompressDecompressTester$CompressionTestStrategy$2.assertCompression(CompressDecompressTester.java:325)
at 
org.apache.hadoop.io.compress.CompressDecompressTester.test(CompressDecompressTester.java:135)
at 
org.apache.hadoop.io.compress.TestCompressorDecompressor.testCompressorDecompressor(TestCompressorDecompressor.java:58)
...
...
.



--
This message was sent by Atlassian JIRA
(v6.2#6252)


[jira] [Updated] (HADOOP-10744) LZ4 Compression fails to recognize PowerPC Little Endian Architecture

2014-06-24 Thread Ayappan (JIRA)

 [ 
https://issues.apache.org/jira/browse/HADOOP-10744?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Ayappan updated HADOOP-10744:
-

Attachment: HADOOP-10744.patch

 LZ4 Compression fails to recognize PowerPC Little Endian Architecture
 -

 Key: HADOOP-10744
 URL: https://issues.apache.org/jira/browse/HADOOP-10744
 Project: Hadoop Common
  Issue Type: Test
  Components: io, native
Affects Versions: 2.2.0, 2.3.0, 2.4.0
 Environment: PowerPC Little Endian (ppc64le)
Reporter: Ayappan
 Attachments: HADOOP-10744.patch


 Lz4 Compression fails to identify the PowerPC Little Endian Architecture. It 
 recognizes it as Big Endian, and several testcases (TestCompressorDecompressor, 
 TestCodec, TestLz4CompressorDecompressor) fail due to this.
 Running org.apache.hadoop.io.compress.TestCompressorDecompressor
 Tests run: 2, Failures: 2, Errors: 0, Skipped: 0, Time elapsed: 0.435 sec <<< 
 FAILURE! - in org.apache.hadoop.io.compress.TestCompressorDecompressor
 testCompressorDecompressor(org.apache.hadoop.io.compress.TestCompressorDecompressor)
   Time elapsed: 0.308 sec <<< FAILURE!
 org.junit.internal.ArrayComparisonFailure: 
 org.apache.hadoop.io.compress.lz4.Lz4Compressor_org.apache.hadoop.io.compress.lz4.Lz4Decompressor-
   byte arrays not equals error !!!: arrays first differed at element [1428]; 
 expected:<4> but was:<10>
 at 
 org.junit.internal.ComparisonCriteria.arrayEquals(ComparisonCriteria.java:50)
 at org.junit.Assert.internalArrayEquals(Assert.java:473)
 at org.junit.Assert.assertArrayEquals(Assert.java:294)
 at 
 org.apache.hadoop.io.compress.CompressDecompressTester$CompressionTestStrategy$2.assertCompression(CompressDecompressTester.java:325)
 at 
 org.apache.hadoop.io.compress.CompressDecompressTester.test(CompressDecompressTester.java:135)
 at 
 org.apache.hadoop.io.compress.TestCompressorDecompressor.testCompressorDecompressor(TestCompressorDecompressor.java:58)
 ...
 ...
 .



--
This message was sent by Atlassian JIRA
(v6.2#6252)


[jira] [Commented] (HADOOP-10744) LZ4 Compression fails to recognize PowerPC Little Endian Architecture

2014-06-24 Thread Ayappan (JIRA)

[ 
https://issues.apache.org/jira/browse/HADOOP-10744?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14041968#comment-14041968
 ] 

Ayappan commented on HADOOP-10744:
--

The problem is due to the following lines of code in lz4.c 
(src/main/native/src/org/apache/hadoop/io/compress/lz4/lz4.c):

// Little Endian or Big Endian ?
// Overwrite the #define below if you know your architecture endianess
#if defined (__GLIBC__)
#  include <endian.h>
#  if (__BYTE_ORDER == __BIG_ENDIAN)
#     define LZ4_BIG_ENDIAN 1
#  endif
#elif (defined(__BIG_ENDIAN__) || defined(__BIG_ENDIAN) || defined(_BIG_ENDIAN)) && !(defined(__LITTLE_ENDIAN__) || defined(__LITTLE_ENDIAN) || defined(_LITTLE_ENDIAN))
#  define LZ4_BIG_ENDIAN 1
#elif defined(__sparc) || defined(__sparc__) \
   || defined(__powerpc__) || defined(__ppc__) || defined(__PPC__) \
   || defined(__hpux)  || defined(__hppa) \
   || defined(_MIPSEB) || defined(__s390__)
#  define LZ4_BIG_ENDIAN 1
#else
// Little Endian assumed. PDP Endian and other very rare endian format are unsupported.
#endif

The flow goes like this on a PowerPC Little Endian system: __GLIBC__ is not 
defined, so it falls through to the next #elif. That condition fails because the 
__LITTLE_ENDIAN__ family of flags is defined, so it falls through to the next 
#elif, and there it is satisfied and LZ4_BIG_ENDIAN gets defined. The catch is 
that the flags __powerpc__, __ppc__ and __PPC__ are also defined on the PowerPC 
Little Endian architecture.
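
To make that concrete, here is a small diagnostic program (my own sketch, not 
part of the patch) that shows the conflict; on a ppc64le box the CPU-name macro 
fires even though the machine is little endian:

#include <stdio.h>

int main(void)
{
    /* CPU-name macro: this is what lz4.c's third #elif keys on. */
#ifdef __powerpc__
    printf("__powerpc__ is defined\n");        /* defined on ppc64le too */
#endif
    /* Compiler-provided endianness flag, expected on ppc64le with gcc. */
#ifdef __LITTLE_ENDIAN__
    printf("__LITTLE_ENDIAN__ is defined\n");
#endif
    /* Runtime cross-check of the real byte order. */
    unsigned int one = 1;
    printf("runtime byte order: %s endian\n",
           *(unsigned char *)&one ? "little" : "big");
    return 0;
}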

There is a similar file, lz4hc.c, which has the same lines of code and yet 
correctly recognizes the Little Endian architecture. The reason is that it has 
an #include <stdlib.h>, which (on glibc systems) turns on the __GLIBC__ flag.
Since the decision is then based on __BYTE_ORDER, it correctly identifies 
whether the platform is Big Endian or Little Endian.

So lz4.c should also have the #include <stdlib.h> to make it work correctly 
on the PowerPC Little Endian architecture.
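
As a sketch of that fix (the actual change is in the attached 
HADOOP-10744.patch; this only restates the mechanism described above), adding 
the include ahead of the detection block routes ppc64le through the 
__BYTE_ORDER branch:

/* At the top of lz4.c, before the endianness detection: */
#include <stdlib.h>   /* on glibc systems any libc header pulls in
                         <features.h>, which defines __GLIBC__ */

/* With __GLIBC__ now defined, the detection takes its first branch: */
#if defined (__GLIBC__)
#  include <endian.h>
#  if (__BYTE_ORDER == __BIG_ENDIAN)
#     define LZ4_BIG_ENDIAN 1   /* not taken on ppc64le, where
                                   __BYTE_ORDER == __LITTLE_ENDIAN */
#  endif
#endif
/* The CPU-name #elif branches shown earlier are then never reached, so
   ppc64le is correctly treated as little endian. */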


 LZ4 Compression fails to recognize PowerPC Little Endian Architecture
 -

 Key: HADOOP-10744
 URL: https://issues.apache.org/jira/browse/HADOOP-10744
 Project: Hadoop Common
  Issue Type: Test
  Components: io, native
Affects Versions: 2.2.0, 2.3.0, 2.4.0
 Environment: PowerPC Little Endian (ppc64le)
Reporter: Ayappan
 Attachments: HADOOP-10744.patch


 Lz4 Compression fails to identify the PowerPC Little Endian Architecture. It 
 recognizes it as Big Endian, and several testcases (TestCompressorDecompressor, 
 TestCodec, TestLz4CompressorDecompressor) fail due to this.
 Running org.apache.hadoop.io.compress.TestCompressorDecompressor
 Tests run: 2, Failures: 2, Errors: 0, Skipped: 0, Time elapsed: 0.435 sec <<< 
 FAILURE! - in org.apache.hadoop.io.compress.TestCompressorDecompressor
 testCompressorDecompressor(org.apache.hadoop.io.compress.TestCompressorDecompressor)
   Time elapsed: 0.308 sec <<< FAILURE!
 org.junit.internal.ArrayComparisonFailure: 
 org.apache.hadoop.io.compress.lz4.Lz4Compressor_org.apache.hadoop.io.compress.lz4.Lz4Decompressor-
   byte arrays not equals error !!!: arrays first differed at element [1428]; 
 expected:<4> but was:<10>
 at 
 org.junit.internal.ComparisonCriteria.arrayEquals(ComparisonCriteria.java:50)
 at org.junit.Assert.internalArrayEquals(Assert.java:473)
 at org.junit.Assert.assertArrayEquals(Assert.java:294)
 at 
 org.apache.hadoop.io.compress.CompressDecompressTester$CompressionTestStrategy$2.assertCompression(CompressDecompressTester.java:325)
 at 
 org.apache.hadoop.io.compress.CompressDecompressTester.test(CompressDecompressTester.java:135)
 at 
 org.apache.hadoop.io.compress.TestCompressorDecompressor.testCompressorDecompressor(TestCompressorDecompressor.java:58)
 ...
 ...
 .



--
This message was sent by Atlassian JIRA
(v6.2#6252)