Hi Samuel

Thank you for your valuable input.
I was integrating Pig with Hadoop. The pig.jar was giving errors while
identifying the Hadoop file system, with fs.default.name set to "hdfs://<host>:<port>".
This was due to incompatible versions of Pig and Hadoop.

Now Pig 0.1.0 works with Hadoop 0.17, and fs.default.name with the value
"hdfs://<host>:<port>" works fine for me.
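For anyone else hitting this: the override goes in conf/hadoop-site.xml, which takes precedence over hadoop-default.xml. A minimal sketch (the host name and port below are placeholders, not my actual values):

```xml
<!-- conf/hadoop-site.xml: site-specific overrides of hadoop-default.xml -->
<configuration>
  <property>
    <name>fs.default.name</name>
    <!-- placeholder address; replace with your NameNode's host and port -->
    <value>hdfs://namenode.example.com:9000</value>
  </property>
</configuration>
```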

Thank you for your help.
Srilatha

On Mon, Sep 22, 2008 at 7:07 PM, Samuel Guo <[EMAIL PROTECTED]> wrote:

> you can check ${HADOOP_HOME}/conf/hadoop-default.xml to see information about
> "fs.default.name".
>
>  <property>
>   <name>fs.default.name</name>
>   <value>file:///</value>
>   <description>The name of the default file system.  A URI whose
>   scheme and authority determine the FileSystem implementation.  The
>   uri's scheme determines the config property (fs.SCHEME.impl) naming
>   the FileSystem implementation class.  The uri's authority is used to
>   determine the host, port, etc. for a filesystem.</description>
>  </property>
>
> On Mon, Sep 22, 2008 at 7:38 PM, Latha <[EMAIL PROTECTED]> wrote:
>
> > Hi ,
> >
> > Please let me know if the value of fs.default.name value in the
> > hadoop-site.xml should be in the format <file:///> ?
> >
> > (or) can it also be in the format of "hdfs://<hostname>:<port>"?
> >
> > Would request you to pls let me know which one is correct.
> >
> > Thankyou
> > Srilatha
> >
>