One way to make it work is to replace
build/ivy/lib/Pig/hadoop-core-0.20.2.jar with
build/hadoop-core-0.20.3-SNAPSHOT.jar from the hadoop-append branch,
then rebuild Pig so it links against the matching client.
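
Roughly like this, as a sketch; the checkout paths (/usr/local/pig and
$HOME/hadoop-append) are assumptions, so adjust them to your layout:

```shell
# Assumed locations: a Pig 0.8 source checkout and a built hadoop-append tree.
PIG_HOME=/usr/local/pig
APPEND_JAR=$HOME/hadoop-append/build/hadoop-core-0.20.3-SNAPSHOT.jar

# Swap the hadoop-core jar that Pig's ivy build bundles for the append build.
rm "$PIG_HOME/build/ivy/lib/Pig/hadoop-core-0.20.2.jar"
cp "$APPEND_JAR" "$PIG_HOME/build/ivy/lib/Pig/"

# Rebuild pig.jar so it picks up the replaced hadoop-core.
cd "$PIG_HOME" && ant
```

After the rebuild, the HDFS client bundled with Pig should speak the same
RPC protocol version as your append-based namenode.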

Daniel

On Mon, Jul 4, 2011 at 12:38 AM, praveenesh kumar <[email protected]> wrote:

> Hello people,
> I am new to Pig.
> Currently I am using Hadoop and HBase together.
> Since hadoop-0.20-append supports HBase in production, I am currently
> using the hadoop-0.20-append jar files.
>
> Now I want to use Pig with this 0.20-append version.
>
> I am trying to use Pig 0.8, but it does not seem to work.
> Whenever I try to run Pig in map-reduce mode, it gives me
> ERROR 2999. Here is the output of my log file:
>
> hadoop@ub13:/usr/local/pig/bin$ pig
>
> 2011-07-01 17:41:52,150 [main] INFO  org.apache.pig.Main - Logging error
> messages to: /usr/local/pig/bin/pig_1309522312144.log
>
> 2011-07-01 17:41:52,454 [main] INFO
> org.apache.pig.backend.hadoop.executionengine.HExecutionEngine - Connecting
> to hadoop file system at: hdfs://ub13:54310
>
> 2011-07-01 17:41:52,654 [main] ERROR org.apache.pig.Main - ERROR 2999:
> Unexpected internal error. Failed to create DataStorage
>
> LOG MESSAGE -----
>
> Error before Pig is launched---------------------------
>
> ERROR 2999: Unexpected internal error. Failed to create DataStorage
>
> java.lang.RuntimeException: Failed to create DataStorage
> at org.apache.pig.backend.hadoop.datastorage.HDataStorage.init(HDataStorage.java:75)
> at org.apache.pig.backend.hadoop.datastorage.HDataStorage.<init>(HDataStorage.java:58)
> at org.apache.pig.backend.hadoop.executionengine.HExecutionEngine.init(HExecutionEngine.java:214)
> at org.apache.pig.backend.hadoop.executionengine.HExecutionEngine.init(HExecutionEngine.java:134)
> at org.apache.pig.impl.PigContext.connect(PigContext.java:183)
> at org.apache.pig.PigServer.<init>(PigServer.java:226)
> at org.apache.pig.PigServer.<init>(PigServer.java:215)
> at org.apache.pig.tools.grunt.Grunt.<init>(Grunt.java:55)
> at org.apache.pig.Main.run(Main.java:452)
> at org.apache.pig.Main.main(Main.java:107)
> Caused by: org.apache.hadoop.ipc.RPC$VersionMismatch: Protocol
> org.apache.hadoop.hdfs.protocol.ClientProtocol version mismatch. (client =
> 41, server = 43)
> at org.apache.hadoop.ipc.RPC.getProxy(RPC.java:364)
> at org.apache.hadoop.hdfs.DFSClient.createRPCNamenode(DFSClient.java:106)
> at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:207)
> at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:170)
> at org.apache.hadoop.hdfs.DistributedFileSystem.initialize(DistributedFileSystem.java:82)
> at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:1378)
> at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:66)
> at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:1390)
> at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:196)
> at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:95)
> at org.apache.pig.backend.hadoop.datastorage.HDataStorage.init(HDataStorage.java:72)
> ... 9 more
>
> ================================================================================
>
> I guess the problem is a version mismatch between the hadoop-append core
> jar files that my hadoop/hbase cluster is currently using and the
> hadoop-core jar files that Pig is using. Has anyone faced a similar
> issue?
> The documentation website lists hadoop-0.20.2 as the requirement for
> Pig 0.8, but I want to keep using my Hadoop and HBase setup alongside
> Pig.
>
> Any suggestions on how to resolve this issue?
>
