If I rebuild Pig using ant jar-withouthadoop, the output of pig -secretDebugCmd is the following:
dry run:
/usr/lib/jvm/java-6-sun/bin/java -Xmx1000m -Dpig.log.dir=/home/huyong/pig-0.8.1/bin/../logs -Dpig.log.file=pig.log -Dpig.home.dir=/home/huyong/pig-0.8.1/bin/.. -Dpig.root.logger=INFO,console,DRFA -classpath /home/huyong/pig-0.8.1/bin/../conf:/usr/lib/jvm/java-6-sun/lib/tools.jar:/home/huyong/pig-0.8.1/bin/../build/classes:/home/huyong/pig-0.8.1/bin/../build/test/classes:/home/huyong/pig-0.8.1/bin/../pig-0.8.1-core.jar:/home/huyong/pig-0.8.1/bin/../build/pig-*-SNAPSHOT.jar:/home/huyong/pig-0.8.1/bin/../lib/automaton.jar::/conf:/bin org.apache.pig.Main

As you can see, there is no Hadoop jar anywhere on that classpath.

On Wed, Jan 18, 2012 at 1:05 PM, Dmitriy Ryaboy <[email protected]> wrote:
> You need to make sure the hadoop jar being used at runtime is the exact
> same version as the Hadoop you are running.
>
> What is the output of "pig -secretDebugCmd"? Does the hadoop jar in that
> classpath match the one you use to start hadoop?
>
> -D
>
> On Wed, Jan 18, 2012 at 2:03 AM, yonghu <[email protected]> wrote:
>
>> Hello,
>>
>> My Pig version is 0.8.1. I found some information on the mailing list,
>> so I rebuilt Pig using:
>> ant jar-withouthadoop
>> and replaced the hadoop jar file in /pig_home/build/ivy/lib/pig with
>> the hadoop-append jar. Finally, I exported HADOOP_HOME and PIG_HOME.
>>
>> But when I start Pig, I always get a message that the HDFS versions
>> of Pig and Hadoop are not compatible.
>>
>> Can anyone tell me how to solve this problem?
>>
>> Thanks
>>
>> Yong
>>
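For reference, here is a rough sketch of the steps and checks I am trying. The HADOOP_HOME value below is only an illustrative assumption (not my real layout), and the last two commands are just a quick way to compare the Hadoop version the cluster runs with whatever Hadoop jars end up on Pig's classpath:

# rebuild Pig without the bundled Hadoop classes (as described above)
cd /home/huyong/pig-0.8.1
ant clean jar-withouthadoop

# point Pig at the Hadoop installation that actually runs the cluster
export PIG_HOME=/home/huyong/pig-0.8.1
export HADOOP_HOME=/usr/lib/hadoop-append   # illustrative path, adjust to your install
export PATH=$PIG_HOME/bin:$HADOOP_HOME/bin:$PATH

# version of the running cluster
hadoop version

# Hadoop entries (if any) on the classpath Pig would launch with
pig -secretDebugCmd | tr ':' '\n' | grep -i hadoop

If the last command prints nothing, Pig is not picking up any Hadoop jar at all, which matches the output above. I believe bin/pig also appends a PIG_CLASSPATH environment variable to the classpath, so pointing that at the hadoop-append jar might be another option, but I have not verified that on 0.8.1.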
