Hi Jeff,

Which conf directory?

I have the following set in my ENV:

# ls $HADOOPDIR
capacity-scheduler.xml  core-site.xml      hadoop-metrics.properties  hdfs-site.xml
mapred-site.xml         slaves             ssl-server.xml.example     configuration.xsl
hadoop-env.sh           hadoop-policy.xml  log4j.properties           masters
ssl-client.xml.example
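For what it's worth, Pig looks for core-site.xml (and the fs.default.name it defines) on its classpath. A minimal sketch of wiring that up, assuming your Pig version honors PIG_CLASSPATH and using a placeholder path in place of my real one:

```shell
# A sketch, not my actual setup -- the path below is a placeholder.
export HADOOPDIR=/usr/local/hadoop/conf   # directory holding core-site.xml etc.
export PIG_CLASSPATH=$HADOOPDIR           # Pig adds this to its classpath on startup
echo "Pig will read Hadoop config from: $PIG_CLASSPATH"
```

If the core-site.xml on that classpath sets fs.default.name to an hdfs:// URI, Pig's startup log should report that URI instead of file:///.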


Is there a different directory to which this ENV var should be pointing?

Thanks
Dave Viner


On Wed, Jun 30, 2010 at 11:21 PM, Jeff Zhang <[email protected]> wrote:

> Try to put the core-site.xml , hdfs-site.xml, mapred-site.xml under conf
> folder
>
>
>
> On Thu, Jul 1, 2010 at 1:58 PM, Dave Viner <[email protected]> wrote:
>
> > Whenever I start up pig from the commandline, I see the same message from
> > both -x local and -x mapreduce:
> >
> > [main] INFO org.apache.pig.backend.hadoop.executionengine.HExecutionEngine
> > - Connecting to hadoop file system at: file:///
> >
> > Somehow, that feels like it's not connecting to my running Hadoop cluster.
> > Is there some way to verify that Pig is talking to my Hadoop cluster? Is
> > there some additional setting that I need to use in order to "point" Pig
> > to my Hadoop cluster?
> >
> > Thanks
> > Dave Viner
> >
>
>
>
> --
> Best Regards
>
> Jeff Zhang
>
