If you define HADOOP_HOME, Pig will pick up all of the Hadoop jars from
it; you don't need to add the jars to CLASSPATH manually.
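For example, a minimal sketch (the install paths below are assumptions;
adjust them to your own layout):

```shell
# Point Pig at an existing Hadoop install; the bin/pig launcher script
# picks up the Hadoop jars and configuration from these locations.
export HADOOP_HOME=/usr/local/hadoop           # assumed install path
export HADOOP_CONF_DIR=$HADOOP_HOME/etc/hadoop
export PIG_HOME=/usr/local/pig                 # assumed install path
export PATH=$PIG_HOME/bin:$PATH

# Pig should now launch without any manual CLASSPATH edits.
pig -x mapreduce
```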

Thanks,
Daniel

On Wed, Jun 4, 2014 at 9:22 AM, Sandeep Jangra <sandeepjan...@gmail.com> wrote:
> Thanks much Daniel!
>
> When I tried the approach that you mentioned, I was getting the following
> error on starting pig:
>
> ERROR 2998: Unhandled internal error.
> org/apache/hadoop/hdfs/DistributedFileSystem
>
> java.lang.NoClassDefFoundError:
> org/apache/hadoop/hdfs/DistributedFileSystem
>
>
> To resolve this I put the /usr/local/pig/build/ivy/lib/Pig/* path in
> PIG_CLASSPATH. This is where all the hadoop 2.2.0 jars are present.
>
> Then I get the following exception:
>
> 2014-06-04 09:23:01,693 [main] WARN
> org.apache.hadoop.util.NativeCodeLoader - Unable to load native-hadoop
> library for your platform... using builtin-java classes where applicable
>
> Exception in thread "main" java.lang.IncompatibleClassChangeError:
> Implementing class
>     at java.lang.ClassLoader.defineClass1(Native Method)
>     at java.lang.ClassLoader.defineClassCond(ClassLoader.java:631)
>     at java.lang.ClassLoader.defineClass(ClassLoader.java:615)
>     at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:141)
>     at java.net.URLClassLoader.defineClass(URLClassLoader.java:283)
>     at java.net.URLClassLoader.access$000(URLClassLoader.java:58)
>     at java.net.URLClassLoader$1.run(URLClassLoader.java:197)
>     at java.security.AccessController.doPrivileged(Native Method)
>     at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
>     at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
>     at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
>     at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
>     at org.apache.pig.tools.pigstats.PigStatsUtil.<clinit>(PigStatsUtil.java:54)
>     at org.apache.pig.Main.run(Main.java:642)
>     at org.apache.pig.Main.main(Main.java:156)
>
>
> Any pointers please.
>
>
> Thanks,
>
> Sandeep
>
>
>
>
> On Tue, Jun 3, 2014 at 8:02 PM, Daniel Dai <da...@hortonworks.com> wrote:
>
>> In ivy/libraries.properties, change hadoop-common.version,
>> hadoop-hdfs.version and hadoop-mapreduce.version, then compile Pig
>> with "ant -Dhadoopversion=23".
>>
>> Thanks,
>> Daniel
>>
>> On Tue, Jun 3, 2014 at 3:35 PM, Sandeep Jangra <sandeepjan...@gmail.com>
>> wrote:
>> > Hi,
>> >
>> >   I have a remote hadoop cluster running version 2.2.0 and am running
>> > pig version 0.12.1 (on a separate VM).
>> >
>> >   It seems the hadoop client jars packaged with this pig version are
>> > not compatible with the cluster's hadoop version.
>> >
>> >   So I need to build Pig against hadoop 2.2.0. Any pointers on which
>> > variables I need to change in the Pig source to pull in the hadoop
>> > 2.2.0 dependency libraries?
>> >
>> >   Also, any other pointers to configuration changes needed to get it
>> > working?
>> >
>> >
>> >
>> > Thanks
>>
>> --
>> CONFIDENTIALITY NOTICE
>> NOTICE: This message is intended for the use of the individual or entity to
>> which it is addressed and may contain information that is confidential,
>> privileged and exempt from disclosure under applicable law. If the reader
>> of this message is not the intended recipient, you are hereby notified that
>> any printing, copying, dissemination, distribution, disclosure or
>> forwarding of this communication is strictly prohibited. If you have
>> received this communication in error, please contact the sender immediately
>> and delete it from your system. Thank You.
>>
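For reference, the build steps Daniel describes above can be sketched as
follows. The source-tree location is an assumption, the version values
are for Hadoop 2.2.0, and the sed invocation assumes GNU sed:

```shell
cd /usr/local/pig        # assumed location of the Pig 0.12.1 source tree

# Point the ivy dependency versions at Hadoop 2.2.0
sed -i \
  -e 's/^hadoop-common.version=.*/hadoop-common.version=2.2.0/' \
  -e 's/^hadoop-hdfs.version=.*/hadoop-hdfs.version=2.2.0/' \
  -e 's/^hadoop-mapreduce.version=.*/hadoop-mapreduce.version=2.2.0/' \
  ivy/libraries.properties

# Build Pig against the Hadoop 2.x (hadoopversion=23) profile
ant clean jar -Dhadoopversion=23
```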
