I don't recall what exactly triggered it, but at some point we saw out-of-memory
exceptions on the pig client, and bumped up the heap size since we have RAM to
burn on the pig client machines. It might have been for running in local mode.
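For reference, PIG_HEAPSIZE is given in megabytes. A minimal sketch of the
pig-env.sh line and roughly how bin/pig turns it into a JVM flag (the exact
internal variable name here is an assumption, modeled on the Hadoop-style
launcher scripts):

```shell
# pig-env.sh: give the Pig client a 2 GB heap (value is in megabytes).
export PIG_HEAPSIZE=2048

# bin/pig does roughly the equivalent of this before exec'ing java:
JAVA_HEAP_MAX="-Xmx${PIG_HEAPSIZE}m"
echo "$JAVA_HEAP_MAX"   # -Xmx2048m
```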

-D

On Thu, Jul 1, 2010 at 4:17 PM, Renato Marroquín Mogrovejo <
[email protected]> wrote:

> Hi Dmitriy, is there any special reason for setting PIG_HEAPSIZE with a
> high value?
> Thanks.
>
> Renato M.
>
> 2010/7/1 Dmitriy Ryaboy <[email protected]>
>
> > We do something like this in our pig-env.sh:
> >
> > HBASE_CLASSPATH=/usr/local/hbase/hbase.jar:/usr/local/hbase/conf
> > export PIG_CLASSPATH=$PIG_CLASSPATH:/usr/lib/hadoop-0.20/conf:$HBASE_CLASSPATH
> >
> > Another handy variable is PIG_HEAPSIZE. We set that pretty high, 2 gigs I
> > think.
> >
> > We put all env vars like that in a file called pig-env.sh and set them
> > like so from bin/pig (I think this code is in the Apache version too):
> >
> > #check to see if the conf dir is given as an optional argument
> > if [ $# -gt 1 ]
> > then
> >    if [ "--config" = "$1" ]
> >    then
> >        shift
> >        confdir=$1
> >        shift
> >        PIG_CONF_DIR=$confdir
> >    fi
> > fi
> >
> > # Allow alternate conf dir location.
> > PIG_CONF_DIR="${PIG_CONF_DIR:-$PIG_HOME/conf}"
> >
> > if [ -f "${PIG_CONF_DIR}/pig-env.sh" ]; then
> >    . "${PIG_CONF_DIR}/pig-env.sh"
> > fi
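The --config handling quoted above can be exercised standalone. This sketch
(the paths are made up) simulates what bin/pig would see for
`pig --config /tmp/myconf run.pig`:

```shell
# Simulate the argument list bin/pig receives for:
#   pig --config /tmp/myconf run.pig
set -- --config /tmp/myconf run.pig

# Same logic as the snippet above: peel off "--config <dir>" if present.
if [ $# -gt 1 ]; then
    if [ "--config" = "$1" ]; then
        shift
        confdir=$1
        shift
        PIG_CONF_DIR=$confdir
    fi
fi

# Fall back to a default conf dir (path assumed) when --config is absent.
PIG_CONF_DIR="${PIG_CONF_DIR:-/usr/local/pig-0.7.0/conf}"

echo "$PIG_CONF_DIR"   # /tmp/myconf
echo "$@"              # run.pig
```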
> >
> >
> > On Wed, Jun 30, 2010 at 11:42 PM, Dave Viner <[email protected]> wrote:
> >
> > > Thanks Renato.
> > >
> > > From talking with QwertyM on IRC, I finally got it working with:
> > >
> > > export PATH=/usr/local/pig-0.7.0/bin:$PATH
> > > export JAVA_HOME=/usr/lib/jvm/java-6-openjdk/
> > > export HADOOPDIR=/usr/local/hadoop/conf
> > >
> > > export PIG_PATH=/usr/local/pig-0.7.0/
> > > export PIG_CLASSPATH=$HADOOPDIR
> > > export PIG_HADOOP_VERSION=0.20.2
> > >
> > > Apparently, the PIG_CLASSPATH *must* point to the directory of Hadoop
> > > config files in order to connect properly.
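One way to check this (a sketch; the conf path and namenode address below are
assumptions): point PIG_CLASSPATH at the cluster's conf directory and watch
Pig's startup log switch from file:/// to an hdfs:// URI.

```shell
# Assumed location of the Hadoop client configuration.
export PIG_CLASSPATH=/usr/local/hadoop/conf

# With the cluster conf on the classpath, a quick smoke test is:
#   pig -e 'fs -ls /'
# and the startup line should then read something like
#   Connecting to hadoop file system at: hdfs://namenode.example.com:8020
# rather than file:///

echo "$PIG_CLASSPATH"
```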
> > >
> > > Thanks!
> > > Dave Viner
> > >
> > >
> > > On Wed, Jun 30, 2010 at 11:29 PM, Renato Marroquín Mogrovejo <
> > > [email protected]> wrote:
> > >
> > > > Hi Dave,
> > > >
> > > > The same happened to me: even though we are not supposed to set the
> > > > env variables for Pig, it needs them. So go to your sh file and edit
> > > > it with whatever your values are.
> > > >
> > > > #!/bin/sh
> > > > PIG_PATH=$HOME/bin/pig-0.7.0
> > > > PIG_CLASSPATH=$PIG_PATH/pig-0.3.0-core.jar:$HOME/bin/hadoop-0.20.2/conf \
> > > > PIG_HADOOP_VERSION=0.20.2 \
> > > >
> > > > I found these recommendations on the web referencing old versions;
> > > > I tried them anyway and they worked (:
> > > >
> > > > 2010/7/1 Jeff Zhang <[email protected]>
> > > >
> > > > > Try putting core-site.xml, hdfs-site.xml, and mapred-site.xml under
> > > > > the conf folder.
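Those are the standard Hadoop 0.20 client config files, picked up from the conf
directory that PIG_CLASSPATH points at. A minimal core-site.xml might look like
this (the hostname and port are placeholders, not values from this thread):

```xml
<!-- core-site.xml: tells clients, including Pig, where HDFS lives -->
<configuration>
  <property>
    <name>fs.default.name</name>
    <value>hdfs://namenode.example.com:8020</value>
  </property>
</configuration>
```

mapred-site.xml plays the same role for MapReduce, via the mapred.job.tracker
property pointing at the JobTracker host and port.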
> > > > >
> > > > >
> > > > >
> > > > > On Thu, Jul 1, 2010 at 1:58 PM, Dave Viner <[email protected]>
> > > wrote:
> > > > >
> > > > > > Whenever I start up pig from the command line, I see the same
> > > > > > message from both -x local and -x mapreduce:
> > > > > >
> > > > > > [main] INFO org.apache.pig.backend.hadoop.executionengine.HExecutionEngine
> > > > > > - Connecting to hadoop file system at: file:///
> > > > > >
> > > > > > Somehow, that feels like it's not connecting to my running Hadoop
> > > > > > cluster. Is there some way to verify that Pig is talking to my
> > > > > > Hadoop cluster? Is there some additional setting that I need to
> > > > > > use in order to "point" Pig to my Hadoop cluster?
> > > > > >
> > > > > > Thanks
> > > > > > Dave Viner
> > > > > >
> > > > >
> > > > >
> > > > >
> > > > > --
> > > > > Best Regards
> > > > >
> > > > > Jeff Zhang
> > > > >
> > > >
> > >
> >
>
