ls / worked. Thanks! I don't have a pig.jar; I only have pig-0.5.0-core.jar, but it seems to have worked.
I didn't run ant. Do I need to? Ant fails when I run it.
BUILD FAILED
/usr/local/apps/pig-0.5.0/build.xml:237: Problem: failed to create task or type
jjtree
Cause: the class org.apache.tools.ant.taskdefs.optional.javacc.JJTree was not
found.
This looks like one of Ant's optional components.
Action: Check that the appropriate optional JAR exists in
-/usr/share/ant/lib
-/root/.ant/lib
-a directory added on the command line with the -lib argument
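I'm guessing I need Ant's optional tasks (and javacc) on the Ant classpath before the build will work. Is the fix something along these lines? (Package names are a guess and will vary by distro.)

  yum install ant-nodeps javacc      # or: apt-get install ant-optional javacc
  # alternatively, drop ant-nodeps.jar and javacc.jar into /root/.ant/lib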
-----Original Message-----
From: Dmitriy Ryaboy [mailto:[email protected]]
Sent: Wednesday, December 23, 2009 12:41 PM
To: [email protected]
Subject: Re: Pig Setup
Hm, interesting. So it looks like you are now able to connect to HDFS
fine, but "ls" on an empty string dies. Try "ls /" (ls on an empty
string works for me, but I'm on trunk, which is halfway between 0.6
and 0.7).
If you continue to have trouble:
By pig.jar I mean the pig.jar that gets dropped into your pig
directory (such as /usr/local/apps/pig-0.5.0) when you run ant.
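Roughly, assuming the default build targets haven't changed, that means something like:

  cd /usr/local/apps/pig-0.5.0
  ant jar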
Do you know what's up with the "attempt to override final parameter"
messages? This is a Hadoop thing, not a Pig thing. Are you able to use
Hadoop? Seems like something may be misconfigured.
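For reference, that warning usually means a property marked final in one of Hadoop's *-site.xml files is being re-defined somewhere else. A final property looks something like this (the property name and value here are just an illustration):

  <property>
    <name>fs.default.name</name>
    <value>hdfs://namenode:9000</value>
    <final>true</final>
  </property>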
Can you send (or paste to pastebin) the logfile in
/usr/local/apps/pig-0.5.0/pig_1261587982689.log ?
BTW, undocumented (so far) tip: you can use pig.logfile in
pig.properties to specify an alternative directory for your logfiles,
if you don't like polluting the working directory with them.
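For example, a line like this in conf/pig.properties (the directory is just an illustration):

  pig.logfile=/var/log/pig/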
As far as supporting HBase, you will want to apply the patch from
https://issues.apache.org/jira/browse/PIG-970 and rebuild.
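Roughly (the patch filename below is a placeholder; grab the latest attachment from the JIRA):

  cd /usr/local/apps/pig-0.5.0
  patch -p0 < PIG-970.patch
  ant jar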
-D
On Wed, Dec 23, 2009 at 9:12 AM, Aryeh Berkowitz <[email protected]> wrote:
> Dmitriy,
> Thanks for your reply. I added the PIG_CLASSPATH and got somewhere, but I'm
> still getting errors, which I put here: http://pastebin.com/d213a21b9.
>
> Also, when you say pig.jar, am I right to assume that pig-0.5.0-core.jar is
> the equivalent in the latest version?
>
> My ultimate goal is to be able to load an HBase table. Any help with that would be
> appreciated.
>
> Aryeh
>
> -----Original Message-----
> From: Dmitriy Ryaboy [mailto:[email protected]]
> Sent: Wednesday, December 23, 2009 11:51 AM
> To: [email protected]
> Subject: Re: Pig Setup
>
> Hi Aryeh,
> The most common cause of this is not having the hadoop conf directory
> in your classpath.
>
> If you are using the bin/pig script (as opposed to using Pig through Java),
> you can put both the conf directory and pig.jar in PIG_CLASSPATH; for
> example, I have:
>
> export PIG_CLASSPATH=/home/dvryaboy/src/pig/pig.jar:/etc/hadoop-0.20/conf.pseudo/
>
> If this doesn't help, please send the exact error you are getting,
> your pig version, and the relevant environment information (all
> classpaths, etc).
>
> Cheers
> -D
>
> On Wed, Dec 23, 2009 at 8:40 AM, Aryeh Berkowitz <[email protected]> wrote:
>> I followed the instructions on the Pig Setup page, but I can't seem to attach
>> to my HDFS cluster. Is there a configuration file or an environment variable
>> that I'm missing?
>>
>