You're encountering problems connecting to HBase (presumably your Pig script uses HBaseStorage). What does your hbase/conf/hbase-site.xml look like?
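For comparison, a minimal pseudo-distributed hbase-site.xml often looks roughly like the sketch below. The hostnames, port, and HDFS path here are assumptions for illustration, not values from your setup; in particular the hbase.rootdir host:port must match whatever fs.default.name is in your core-site.xml:

```xml
<?xml version="1.0"?>
<configuration>
  <!-- HBase root directory on the single-node HDFS; the host:port
       must match fs.default.name in core-site.xml (9000 assumed) -->
  <property>
    <name>hbase.rootdir</name>
    <value>hdfs://localhost:9000/hbase</value>
  </property>
  <!-- Run in (pseudo-)distributed mode even on one machine -->
  <property>
    <name>hbase.cluster.distributed</name>
    <value>true</value>
  </property>
  <!-- ZooKeeper quorum that clients, including Pig's HBaseStorage,
       will connect to -->
  <property>
    <name>hbase.zookeeper.quorum</name>
    <value>localhost</value>
  </property>
</configuration>
```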
Norbert

On Thu, Mar 22, 2012 at 9:16 PM, Ryan Cole <[email protected]> wrote:
> Hello,
>
> I'm new to these lists. I'm trying to get Pig working, for my first time. I
> have set up Hadoop and HBase (on HDFS) using the pseudo-distributed setup,
> all on one machine. I am able to run MapReduce jobs, using the example.jar
> file included with the Hadoop release.
>
> Whenever I try to run even the simplest query examples, using Pig, I get
> the following error:
>
> `ERROR 2017: Internal error creating job configuration.`
>
> and the log file has the following more specific error:
>
> `Caused by: java.lang.IllegalArgumentException: Not a host:port pair:
> �^@^@^@^P8948@ryan-serverlocalhost,44544,1332443083936`
>
> It looks like Pig goes through the entire compile process of turning the
> Pig Latin into MapReduce code, but fails to send it off to Hadoop's
> MapReduce. That's just my uneducated analysis of what I see happening,
> though. I have pasted the Grunt console output and the log file contents
> here: https://gist.github.com/2163762.
>
> This is Pig v0.9.2, HBase v0.92.1, Hadoop v1.0.1.
>
> Does anyone have any idea why this may be happening? It seems obvious that
> I have something configured improperly, but I looked for any host:port
> settings that would stand out and didn't find anything obvious.
>
> Thanks,
> Ryan
