As you said, the server is Hadoop 2.5.2,
but the client is 2.6.2, as seen in the exception.
Were the client-side libraries built with Hadoop 2.6.2?
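
If you are not sure, a quick check from the client JVM shows both sides (a minimal sketch using only public Hadoop utility classes; running "hadoop checknative -a" on the client prints similar information):

import org.apache.hadoop.util.NativeCodeLoader;
import org.apache.hadoop.util.VersionInfo;

public class NativeCheck {
    public static void main(String[] args) {
        // Version of the hadoop-common jar on the classpath (2.6.2 in your trace)
        System.out.println("Client jar version : " + VersionInfo.getVersion());
        // Whether any hadoop.dll was found and loaded at all
        System.out.println("Native code loaded : " + NativeCodeLoader.isNativeCodeLoaded());
        // Directories the JVM searches for hadoop.dll
        System.out.println("java.library.path  : " + System.getProperty("java.library.path"));
    }
}

If the hadoop.dll that gets loaded was built from 2.5.x, it predates the nativeComputeChunkedSums JNI method added in 2.6, which would produce exactly the UnsatisfiedLinkError below.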

From: karthi keyan [mailto:[email protected]]
Sent: 29 March 2016 15:16
To: Brahma Reddy Battula
Cc: [email protected]
Subject: Re: UnsatisfiedLinkError - Windows Environment

Yes, it is built with the right libraries.
In my case I have to connect to a remote cluster running Hadoop
(built for 64-bit Windows and Hadoop 2.5.2).

On Tue, Mar 29, 2016 at 12:34 PM, Brahma Reddy Battula 
<[email protected]<mailto:[email protected]>> wrote:
Are you using the right libraries ( built for 64-bit windows and Hadoop 2.6.2) ?

From: karthi keyan [mailto:[email protected]]
Sent: 29 March 2016 14:51
To: Brahma Reddy Battula
Cc: [email protected]<mailto:[email protected]>
Subject: Re: UnsatisfiedLinkError - Windows Environment

Hi Brahma,

I have added those libraries to the bin path. Every time I communicate
with another Hadoop cluster I face this issue.
Is there a backward-compatibility problem, or something else?

On Tue, Mar 29, 2016 at 12:09 PM, Brahma Reddy Battula 
<[email protected]<mailto:[email protected]>> wrote:
Is the Hadoop cluster installed on Windows, or is only the client on Windows?

Does the Hadoop distribution contain the Windows library files, and is
<HADOOP_HOME>/bin added to PATH?
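
For example, the following shows what the client JVM actually sees (a minimal sketch; on Windows, System.loadLibrary("hadoop") resolves hadoop.dll through java.library.path, which is normally derived from PATH):

public class PathCheck {
    public static void main(String[] args) {
        System.out.println("PATH              = " + System.getenv("PATH"));
        System.out.println("java.library.path = " + System.getProperty("java.library.path"));
        // Throws UnsatisfiedLinkError here if no hadoop.dll is on the search path
        System.loadLibrary("hadoop");
        System.out.println("hadoop.dll loaded OK");
    }
}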


From: karthi keyan [mailto:[email protected]]
Sent: 29 March 2016 14:29
To: [email protected]<mailto:[email protected]>
Subject: UnsatisfiedLinkError - Windows Environment

Hi,

I frequently face this issue while reading data from HDFS, and every time I
have replaced (rebuilt) the jars. Can anyone suggest the right way to resolve
this issue, or tell me the root cause of this error?

JDK - 1.7
System env - Windows 64-bit
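
The failing read is just the standard FileSystem API, roughly like this (a minimal sketch; the URI hdfs://namenode:8020 and the file path are placeholders for my setup):

import java.io.InputStream;
import java.net.URI;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IOUtils;

public class HdfsRead {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        // Placeholders; the real cluster runs Hadoop 2.5.2
        FileSystem fs = FileSystem.get(new URI("hdfs://namenode:8020"), conf);
        try (InputStream in = fs.open(new Path("/data/sample.txt"))) {
            // Checksum verification (NativeCrc32) runs inside this read
            IOUtils.copyBytes(in, System.out, 4096, false);
        }
    }
}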


Caused by: java.lang.UnsatisfiedLinkError: org.apache.hadoop.util.NativeCrc32.nativeComputeChunkedSums(IILjava/nio/ByteBuffer;ILjava/nio/ByteBuffer;IILjava/lang/String;JZ)V
            at org.apache.hadoop.util.NativeCrc32.nativeComputeChunkedSums(Native Method) ~[hadoop-common-2.6.2.jar:na]
            at org.apache.hadoop.util.NativeCrc32.verifyChunkedSums(NativeCrc32.java:59) ~[hadoop-common-2.6.2.jar:na]
            at org.apache.hadoop.util.DataChecksum.verifyChunkedSums(DataChecksum.java:301) ~[hadoop-common-2.6.2.jar:na]
            at org.apache.hadoop.hdfs.RemoteBlockReader2.readNextPacket(RemoteBlockReader2.java:216) ~[hadoop-hdfs-2.6.2.jar:na]
            at org.apache.hadoop.hdfs.RemoteBlockReader2.read(RemoteBlockReader2.java:146) ~[hadoop-hdfs-2.6.2.jar:na]
            at org.apache.hadoop.hdfs.DFSInputStream$ByteArrayStrategy.doRead(DFSInputStream.java:693) ~[hadoop-hdfs-2.6.2.jar:na]
            at org.apache.hadoop.hdfs.DFSInputStream.readBuffer(DFSInputStream.java:749) ~[hadoop-hdfs-2.6.2.jar:na]
            at org.apache.hadoop.hdfs.DFSInputStream.readWithStrategy(DFSInputStream.java:807) ~[hadoop-hdfs-2.6.2.jar:na]
            at org.apache.hadoop.hdfs.DFSInputStream.read(DFSInputStream.java:848) ~[hadoop-hdfs-2.6.2.jar:na]
            at java.io.DataInputStream.read(DataInputStream.java:100) ~[na:1.7.0]

Regards,
Karthikeyan S

