In this case, both Hadoop and Pig need to be at the same CDH3 version. -Adam
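A quick way to see the mismatch is to compare the CDH release suffixes of the two version strings. The values below are taken from the error report in this thread; on a live cluster they could come from `hadoop version` and the Pig startup log. This is just a minimal sketch of the comparison:

```shell
# Version strings as reported in the question.
hadoop_ver="Hadoop 0.20.2-cdh3u2"
pig_ver="pig-0.8.1-cdh3u3"

# Strip everything up to and including the last '-' to isolate "cdh3uN".
hadoop_cdh="${hadoop_ver##*-}"
pig_cdh="${pig_ver##*-}"

if [ "$hadoop_cdh" != "$pig_cdh" ]; then
    echo "mismatch: hadoop=$hadoop_cdh pig=$pig_cdh"
else
    echo "versions aligned: $hadoop_cdh"
fi
```

Here this prints `mismatch: hadoop=cdh3u2 pig=cdh3u3`, which is exactly the situation that produces the `ClassNotFoundException`: the Pig build expects a Hadoop jar layout from a different CDH3 update. Upgrading the Hadoop packages to cdh3u3 (or installing the cdh3u2 build of Pig) so both suffixes match should resolve it.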
> I am in the process of installing and learning Pig. I have a Hadoop cluster,
> and when I try to run Pig it errors out. The Hadoop version is
> Hadoop 0.20.2-cdh3u2, and the Pig version is pig-0.8.1-cdh3u3.
>
> I downloaded Pig from Cloudera using sudo yum install hadoop-pig.
> When I try to run it I get the following error:
>
> [root@evia ~]# pig
> 2012-02-15 09:34:27,183 [main] INFO org.apache.pig.Main - Logging error messages to: /root/pig_1329316467182.log
> 2012-02-15 09:34:27,404 [main] INFO org.apache.pig.backend.hadoop.executionengine.HExecutionEngine - Connecting to hadoop file system at: hdfs://evia:8020
> 2012-02-15 09:34:27,535 [main] ERROR org.apache.pig.Main - ERROR 2998: Unhandled internal error. org/apache/hadoop/thirdparty/guava/common/collect/LinkedListMultimap
> Details at logfile: /root/pig_1329316467182.log
>
> [root@evia ~]# cat /root/pig_1329316467182.log
> ERROR 2998: Unhandled internal error. org/apache/hadoop/thirdparty/guava/common/collect/LinkedListMultimap
>         at org.apache.hadoop.hdfs.SocketCache.<init>(SocketCache.java:48)
>         at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:240)
>         at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:208)
>         at org.apache.hadoop.hdfs.DistributedFileSystem.initialize(DistributedFileSystem.java:89)
>         at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:1563)
>         at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:67)
>         at org.apache.hadoop.fs.FileSystem$Cache.getInternal(FileSystem.java:1597)
>         at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:1579)
>         at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:228)
>         at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:111)
>         at org.apache.pig.backend.hadoop.datastorage.HDataStorage.init(HDataStorage.java:72)
>         at org.apache.pig.backend.hadoop.datastorage.HDataStorage.<init>(HDataStorage.java:58)
>         at org.apache.pig.backend.hadoop.executionengine.HExecutionEngine.init(HExecutionEngine.java:214)
>         at org.apache.pig.backend.hadoop.executionengine.HExecutionEngine.init(HExecutionEngine.java:134)
>         at org.apache.pig.impl.PigContext.connect(PigContext.java:183)
>         at org.apache.pig.PigServer.<init>(PigServer.java:226)
>         at org.apache.pig.PigServer.<init>(PigServer.java:215)
>         at org.apache.pig.tools.grunt.Grunt.<init>(Grunt.java:55)
>         at org.apache.pig.Main.run(Main.java:452)
>         at org.apache.pig.Main.main(Main.java:107)
> Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.thirdparty.guava.common.collect.LinkedListMultimap
>         at java.net.URLClassLoader$1.run(URLClassLoader.java:202)
>         at java.security.AccessController.doPrivileged(Native Method)
>         at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
>         at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
>         at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
>         at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
>
> Do you know what may be my problem?
>
> Thanks!
> Eviatar Tenne
