---------- Forwarded message ----------
From: Evgenii Zhuravlev <[email protected]>
Date: 2017-10-20 12:31 GMT+03:00
Subject: Re: Hadoop Accelerator doesn't work when use SnappyCodec
compression
To: C Reid <[email protected]>


A few days ago I ran Hive, Hadoop and Ignite with snappy compression
without any problems. It was Hadoop 2.7.1, but I think your version should
work too. The Apache Ignite codebase contains tests for the snappy codec.
One of them is attached, with small changes - please run it in your
environment and show us the results.

Thanks,
Evgenii

2017-10-20 11:30 GMT+03:00 C Reid <[email protected]>:

> Yeah, I tried all the methods I found on Google, and the results were the
> same.
>
> Also, since it's just an "export LD_LIBRARY_PATH=..." statement in
> 'ignite.sh', I'm not sure whether it actually takes effect on grid startup.
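Whether that export actually reaches the grid JVM can be checked directly; a minimal sketch, in which a plain child shell stands in for the JVM that ignite.sh launches (the /usr/lib64 value is just the path mentioned elsewhere in this thread):

```shell
# Sketch: an exported LD_LIBRARY_PATH is inherited by child processes,
# so a child shell here stands in for the JVM started by ignite.sh.
export LD_LIBRARY_PATH=/usr/lib64
sh -c 'echo "child sees LD_LIBRARY_PATH=$LD_LIBRARY_PATH"'
# On a live node, the real JVM's environment can be inspected via /proc
# (the PID placeholder below is hypothetical):
# tr '\0' '\n' < /proc/<ignite-pid>/environ | grep LD_LIBRARY_PATH
```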
>
> We are planning to run 1000+ grids in a cluster, and the production
> env has plenty of .snappy files, so I'm struggling now...
> Btw, my hadoop version is 2.6.0 - does that matter?
>
> Thanks for your patience.
> ------------------------------
> *From:* Evgenii Zhuravlev <[email protected]>
> *Sent:* 19 October 2017 17:07
>
> *To:* [email protected]
> *Subject:* Re: Hadoop Accelerator doesn't work when use SnappyCodec
> compression
>
> Could you also try setting the LD_LIBRARY_PATH variable to the path of the
> folder containing the native libraries?
>
> 2017-10-17 17:56 GMT+03:00 C Reid <[email protected]>:
>
>> I just tried, and got the same:
>> "Unable to load native-hadoop library for your platform... using
>> builtin-java classes where applicable"
>> "java.lang.UnsatisfiedLinkError:
>> org.apache.hadoop.util.NativeCodeLoader.buildSupportsSnappy()Z"
>>
>> I also tried adding all the related native libraries under one of the
>> folders under the JDK where all the *.so files are located, but Ignite
>> just couldn't load them - it's strange.
>> ------------------------------
>> *From:* Evgenii Zhuravlev <[email protected]>
>> *Sent:* 17 October 2017 21:25
>>
>> *To:* [email protected]
>> *Subject:* Re: Hadoop Accelerator doesn't work when use SnappyCodec
>> compression
>>
>> Have you tried removing the ${HADOOP_HOME}/lib/native libraries from the
>> path and adding only the /usr/lib64/ folder?
>>
>> 2017-10-17 12:18 GMT+03:00 C Reid <[email protected]>:
>>
>>> I tried that, and it did not work.
>>>
>>> ------------------------------
>>> *From:* Evgenii Zhuravlev <[email protected]>
>>> *Sent:* 17 October 2017 16:41
>>> *To:* C Reid
>>> *Subject:* Re: Hadoop Accelerator doesn't work when use SnappyCodec
>>> compression
>>>
>>> I'd recommend adding /usr/lib64/ to JAVA_LIBRARY_PATH
>>>
>>> Evgenii
>>>
>>> 2017-10-17 11:29 GMT+03:00 C Reid <[email protected]>:
>>>
>>>> Yes, IgniteNode runs on the DataNode machine.
>>>>
>>>> [[email protected] ignite]$ echo $HADOOP_HOME
>>>> /opt/hadoop-2.8.1-all
>>>> [[email protected] ignite]$ echo $IGNITE_HOME
>>>> /opt/apache-ignite-hadoop-2.2.0-bin
>>>>
>>>> and in ignite.sh
>>>> JVM_OPTS="${JVM_OPTS} -Djava.library.path=${HADOOP_HOME}/lib/native:/usr/lib64/libsnappy.so.1:${HADOOP_HOME}/lib/native/libhadoop.so"
>>>>
>>>> But the exception is thrown as mentioned.
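One thing worth noting about the setting quoted above: as far as I know, java.library.path is a colon-separated list of *directories* to search, not of individual .so files, so the libsnappy.so.1 and libhadoop.so file entries would never be searched. A directories-only sketch, reusing the paths already shown in this thread:

```shell
# Sketch (assumption: java.library.path entries must be directories, so
# file entries such as /usr/lib64/libsnappy.so.1 are silently ignored).
HADOOP_HOME=/opt/hadoop-2.8.1-all   # path taken from this thread
JVM_OPTS="${JVM_OPTS} -Djava.library.path=${HADOOP_HOME}/lib/native:/usr/lib64"
echo "$JVM_OPTS"
```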
>>>> ------------------------------
>>>> *From:* Evgenii Zhuravlev <[email protected]>
>>>> *Sent:* 17 October 2017 15:44
>>>>
>>>> *To:* [email protected]
>>>> *Subject:* Re: Hadoop Accelerator doesn't work when use SnappyCodec
>>>> compression
>>>>
>>>> Do you run Ignite on the same machine as Hadoop?
>>>>
>>>> I'd recommend checking these env variables:
>>>> IGNITE_HOME, HADOOP_HOME and JAVA_LIBRARY_PATH. JAVA_LIBRARY_PATH
>>>> should contain the path to the folder with the libsnappy files.
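Those checks can be scripted; a quick sanity-check sketch (the /usr/lib64 location is just the path reported elsewhere in this thread):

```shell
# Sketch: print the variables named above and confirm libsnappy is
# actually present in the directory they should point at.
for v in IGNITE_HOME HADOOP_HOME JAVA_LIBRARY_PATH; do
  printf '%s=%s\n' "$v" "$(printenv "$v")"
done
ls /usr/lib64/libsnappy.so* 2>/dev/null || echo "no libsnappy in /usr/lib64"
```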
>>>>
>>>> Evgenii
>>>>
>>>> 2017-10-17 8:45 GMT+03:00 C Reid <[email protected]>:
>>>>
>>>>> Hi Evgenii,
>>>>>
>>>>> Checked, as shown:
>>>>>
>>>>> 17/10/17 13:43:12 DEBUG util.NativeCodeLoader: Trying to load the
>>>>> custom-built native-hadoop library...
>>>>> 17/10/17 13:43:12 DEBUG util.NativeCodeLoader: Loaded the
>>>>> native-hadoop library
>>>>> 17/10/17 13:43:12 WARN bzip2.Bzip2Factory: Failed to load/initialize
>>>>> native-bzip2 library system-native, will use pure-Java version
>>>>> 17/10/17 13:43:12 INFO zlib.ZlibFactory: Successfully loaded &
>>>>> initialized native-zlib library
>>>>> Native library checking:
>>>>> hadoop:  true /opt/hadoop-2.8.1-all/lib/native/libhadoop.so
>>>>> zlib:    true /lib64/libz.so.1
>>>>> snappy:  true /usr/lib64/libsnappy.so.1
>>>>> lz4:     true revision:10301
>>>>> bzip2:   false
>>>>> openssl: true /usr/lib64/libcrypto.so
>>>>>
>>>>> ------------------------------
>>>>> *From:* Evgenii Zhuravlev <[email protected]>
>>>>> *Sent:* 17 October 2017 13:34
>>>>> *To:* [email protected]
>>>>> *Subject:* Re: Hadoop Accelerator doesn't work when use SnappyCodec
>>>>> compression
>>>>>
>>>>> Hi,
>>>>>
>>>>> Have you checked "hadoop checknative -a"? What does it show for snappy?
>>>>>
>>>>> Evgenii
>>>>>
>>>>> 2017-10-17 7:12 GMT+03:00 C Reid <[email protected]>:
>>>>>
>>>>>> Hi all igniters,
>>>>>>
>>>>>> I have tried many ways to include the native jar and the snappy jar,
>>>>>> but the exceptions below kept being thrown. (I'm sure HDFS and YARN
>>>>>> support snappy, because jobs run in the YARN framework with
>>>>>> SnappyCodec.) I hope to get some help and suggestions from the
>>>>>> community.
>>>>>>
>>>>>> [NativeCodeLoader] Unable to load native-hadoop library for your
>>>>>> platform... using builtin-java classes where applicable
>>>>>>
>>>>>> and
>>>>>>
>>>>>> java.lang.UnsatisfiedLinkError: org.apache.hadoop.util.NativeCodeLoader.buildSupportsSnappy()Z
>>>>>>         at org.apache.hadoop.util.NativeCodeLoader.buildSupportsSnappy(Native Method)
>>>>>>         at org.apache.hadoop.io.compress.SnappyCodec.checkNativeCodeLoaded(SnappyCodec.java:63)
>>>>>>         at org.apache.hadoop.io.compress.SnappyCodec.getCompressorType(SnappyCodec.java:136)
>>>>>>         at org.apache.hadoop.io.compress.CodecPool.getCompressor(CodecPool.java:150)
>>>>>>         at org.apache.hadoop.io.compress.CompressionCodec$Util.createOutputStreamWithCodecPool(CompressionCodec.java:131)
>>>>>>         at org.apache.hadoop.io.compress.SnappyCodec.createOutputStream(SnappyCodec.java:101)
>>>>>>         at org.apache.hadoop.mapreduce.lib.output.TextOutputFormat.getRecordWriter(TextOutputFormat.java:126)
>>>>>>         at org.apache.ignite.internal.processors.hadoop.impl.v2.HadoopV2Task.prepareWriter(HadoopV2Task.java:104)
>>>>>>         at org.apache.ignite.internal.processors.hadoop.impl.v2.HadoopV2ReduceTask.run0(HadoopV2ReduceTask.java:64)
>>>>>>         at org.apache.ignite.internal.processors.hadoop.impl.v2.HadoopV2Task.run(HadoopV2Task.java:55)
>>>>>>         at org.apache.ignite.internal.processors.hadoop.impl.v2.HadoopV2TaskContext.run(HadoopV2TaskContext.java:266)
>>>>>>         at org.apache.ignite.internal.processors.hadoop.taskexecutor.HadoopRunnableTask.runTask(HadoopRunnableTask.java:209)
>>>>>>         at org.apache.ignite.internal.processors.hadoop.taskexecutor.HadoopRunnableTask.call0(HadoopRunnableTask.java:144)
>>>>>>         at org.apache.ignite.internal.processors.hadoop.taskexecutor.HadoopRunnableTask$1.call(HadoopRunnableTask.java:116)
>>>>>>         at org.apache.ignite.internal.processors.hadoop.taskexecutor.HadoopRunnableTask$1.call(HadoopRunnableTask.java:114)
>>>>>>         at org.apache.ignite.internal.processors.hadoop.impl.v2.HadoopV2TaskContext.runAsJobOwner(HadoopV2TaskContext.java:573)
>>>>>>         at org.apache.ignite.internal.processors.hadoop.taskexecutor.HadoopRunnableTask.call(HadoopRunnableTask.java:114)
>>>>>>         at org.apache.ignite.internal.processors.hadoop.taskexecutor.HadoopRunnableTask.call(HadoopRunnableTask.java:46)
>>>>>>         at org.apache.ignite.internal.processors.hadoop.taskexecutor.HadoopExecutorService$2.body(HadoopExecutorService.java:186)
>>>>>>         at org.apache.ignite.internal.util.worker.GridWorker.run(GridWorker.java:110)
>>>>>>
>>>>>>
>>>>>> Regards,
>>>>>>
>>>>>> RC.
>>>>>>
>>>>>
>>>>>
>>>>
>>>
>>
>

Attachment: HadoopSnappyTest.java
Description: Binary data
