That was it! I don't think I even had my HBase path on PIG_CLASSPATH at
all. I simply put HBase on the path, and now it works.
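
For the archives, the fix looked roughly like the following. This is only a
sketch: HBASE_HOME, the jar names, and the script name are illustrative for
an HBase 0.92.1 install, so adjust them for your own layout (you may also
need other jars from HBase's lib/ directory).

# illustrative paths/versions -- point these at your actual install
export HBASE_HOME=/usr/local/hbase-0.92.1
export PIG_CLASSPATH=$HBASE_HOME/conf:$HBASE_HOME/hbase-0.92.1.jar:$HBASE_HOME/lib/zookeeper-3.4.3.jar:$PIG_CLASSPATH
pig myscript.pig   # myscript.pig is just a placeholder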

Thank you,
Ryan

On Fri, Mar 23, 2012 at 10:02 AM, Ryan Cole <[email protected]> wrote:

> The classpath for Pig, correct?
>
> Ryan
>
>
> On Fri, Mar 23, 2012 at 4:00 AM, Sam William <[email protected]> wrote:
>
>> Ryan,
>> This error message is specific to HBase 0.92.1. Make sure the HBase
>> 0.90.1 jar is not in the classpath before the 0.92.1 jar files.
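>>
>> As a quick illustrative check (assuming the jars come in via
>> PIG_CLASSPATH), you can split the path and look at the ordering of the
>> HBase entries:
>>
>> echo $PIG_CLASSPATH | tr ':' '\n' | grep -i hbase
>>
>> An hbase-0.90.x jar listed before the 0.92.1 jar there would explain it.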
>>
>> Sam
>> On Mar 22, 2012, at 8:20 PM, Ryan Cole wrote:
>>
>> > Hmm. The data in my tables is not important. So, I dropped the table
>> > and recreated it. This doesn't seem to have resolved the issue, though.
>> >
>> > Is there perhaps a Pig query I can run that would use a built-in HBase
>> > table, like the .META. table, and see if it works? I don't know if
>> > that'd help or not, though.
>> >
>> > Ryan
>> >
>> > On Mar 22, 2012, at 10:02 PM, Norbert Burger wrote:
>> >
>> >> Actually on second glance, this seems like an issue not with the HBase
>> >> config, but with the server:port info inside your .META. table.  Have
>> >> you tried LOADing from a different table besides "events"?  From the
>> >> HBase shell, you can use the following command to extract server
>> >> hostnames for each of your regions:
>> >>
>> >> scan '.META.',{COLUMNS => 'info:server'}
>> >>
>> >> Dropping and recreating the table should resolve the issue; if you have
>> >> real data already, then there are tools in hbase/bin to repair regions.
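>> >>
>> >> For example (illustrative; option names vary a bit by version), hbck
>> >> reports region-level inconsistencies and has flags to attempt repairs:
>> >>
>> >> $HBASE_HOME/bin/hbase hbck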
>> >>
>> >> Norbert
>> >>
>> >> On Thu, Mar 22, 2012 at 10:44 PM, Ryan Cole <[email protected]> wrote:
>> >>
>> >>> I was thinking that maybe it was because I did not have the HBase
>> >>> config path on PIG_CLASSPATH, so I added it. This did not help, though.
>> >>>
>> >>> Ryan
>> >>>
>> >>> On Thu, Mar 22, 2012 at 9:07 PM, Ryan Cole <[email protected]> wrote:
>> >>>
>> >>>> Norbert,
>> >>>>
>> >>>> I have confirmed that this is indeed an issue connecting to HBase. I
>> >>>> tried just running a Pig script that did not use HBaseStorage, and it
>> >>>> works. Here is my hbase-site.xml config file, as well as my query
>> >>>> that I'm running:
>> >>>>
>> >>>> https://gist.github.com/2166187
>> >>>>
>> >>>> Also, for ease of reference, here is the query:
>> >>>>
>> >>>> raw = LOAD 'hbase://events' USING
>> >>>> org.apache.pig.backend.hadoop.hbase.HBaseStorage('event:*',
>> >>>> '-loadKey true') AS (id:bytearray, events_map:map[]);
>> >>>>
>> >>>> Maybe I need to change the rootdir in the config to be my hostname,
>> >>>> and not localhost?
>> >>>>
>> >>>> Ryan
>> >>>>
>> >>>>
>> >>>> On Thu, Mar 22, 2012 at 8:58 PM, Norbert Burger
>> >>>> <[email protected]> wrote:
>> >>>>
>> >>>>> You're encountering problems connecting to HBase (presumably your
>> >>>>> Pig script uses HBaseStorage).  How does your
>> >>>>> hbase/conf/hbase-site.xml look?
>> >>>>>
>> >>>>> Norbert
>> >>>>>
>> >>>>> On Thu, Mar 22, 2012 at 9:16 PM, Ryan Cole <[email protected]> wrote:
>> >>>>>
>> >>>>>> Hello,
>> >>>>>>
>> >>>>>> I'm new to these lists. I'm trying to get Pig working for the first
>> >>>>>> time. I have set up Hadoop and HBase (on HDFS) using the
>> >>>>>> pseudo-distributed setup, all on one machine. I am able to run
>> >>>>>> MapReduce jobs using the example.jar file included with the Hadoop
>> >>>>>> release.
>> >>>>>>
>> >>>>>> Whenever I try to run even the simplest query examples using Pig, I
>> >>>>>> get the following error:
>> >>>>>>
>> >>>>>> `ERROR 2017: Internal error creating job configuration.`
>> >>>>>>
>> >>>>>> and the log file has the following more specific error:
>> >>>>>>
>> >>>>>> `Caused by: java.lang.IllegalArgumentException: Not a host:port
>> >>>>>> pair: �^@^@^@^P8948@ryan-serverlocalhost,44544,1332443083936`
>> >>>>>>
>> >>>>>> It looks like Pig goes through the entire compile process of
>> >>>>>> turning the Pig Latin into MapReduce code, but fails to send it off
>> >>>>>> to Hadoop's MapReduce. That's just my uneducated analysis of what I
>> >>>>>> see happening, though. I have pasted the Grunt console output and
>> >>>>>> the log file contents here: https://gist.github.com/2163762.
>> >>>>>>
>> >>>>>> This is Pig v0.9.2, HBase v0.92.1, Hadoop v1.0.1.
>> >>>>>>
>> >>>>>> Does anyone have any idea why this may be happening? It looks
>> >>>>>> obvious that I have something configured improperly, but I looked
>> >>>>>> for any host:port settings that would stand out and didn't find
>> >>>>>> anything obvious.
>> >>>>>>
>> >>>>>> Thanks,
>> >>>>>> Ryan
>> >>>>>>
>> >>>>>
>> >>>>
>> >>>>
>> >>>
>> >
>>
>> Sam William
>> [email protected]
>>
>>
>>
>>
>
