Thanks Renato.

From talking with QwertyM on IRC, I finally got it working with:

export PATH=/usr/local/pig-0.7.0/bin:$PATH
export JAVA_HOME=/usr/lib/jvm/java-6-openjdk/
export HADOOPDIR=/usr/local/hadoop/conf

export PIG_PATH=/usr/local/pig-0.7.0/
export PIG_CLASSPATH=$HADOOPDIR
export PIG_HADOOP_VERSION=0.20.2

Apparently, PIG_CLASSPATH *must* point to the directory containing the Hadoop
config files in order for Pig to connect properly.
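
A quick way to double-check the connection (the namenode host/port below are just
placeholders for whatever your Hadoop config points at) is to restart grunt and
watch the startup log, which should now report HDFS instead of the local file
system:

pig -x mapreduce
# startup should now log something like:
#   [main] INFO ... HExecutionEngine - Connecting to hadoop file system at: hdfs://namenode:9000
# and listing the root from grunt should show HDFS contents, not your local disk:
grunt> ls /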

Thanks!
Dave Viner


On Wed, Jun 30, 2010 at 11:29 PM, Renato Marroquín Mogrovejo <
[email protected]> wrote:

> Hi Dave,
>
> The same thing happened to me: even though we aren't supposed to have to set
> these env variables for Pig, it needs them. So go to your sh file and edit it
> with whatever your values are.
>
> #!/bin/sh
> export PIG_PATH=$HOME/bin/pig-0.7.0
> export PIG_CLASSPATH=$PIG_PATH/pig-0.7.0-core.jar:$HOME/bin/hadoop-0.20.2/conf
> export PIG_HADOOP_VERSION=0.20.2
>
> I found these recommendations on the web, but they referenced old versions; I
> tried them anyway and it worked (:
>
> 2010/7/1 Jeff Zhang <[email protected]>
>
> > Try putting core-site.xml, hdfs-site.xml, and mapred-site.xml under the conf
> > folder.
> >
> >
> >
> > On Thu, Jul 1, 2010 at 1:58 PM, Dave Viner <[email protected]> wrote:
> >
> > > Whenever I start up Pig from the command line, I see the same message from
> > > both -x local and -x mapreduce:
> > >
> > > [main] INFO  org.apache.pig.backend.hadoop.executionengine.HExecutionEngine
> > > - Connecting to hadoop file system at: file:///
> > >
> > > Somehow, that feels like it's not connecting to my running Hadoop cluster.
> > > Is there some way to verify that Pig is talking to my Hadoop cluster? Is
> > > there some additional setting that I need to use in order to "point" Pig
> > > to my Hadoop cluster?
> > >
> > > Thanks
> > > Dave Viner
> > >
> >
> >
> >
> > --
> > Best Regards
> >
> > Jeff Zhang
> >
>
