I got it working correctly.
When we are programming on a client node:
(1) Place the configuration file hadoop-site.xml in the project path. Take
Eclipse for example: put hadoop-site.xml in the project's src directory (a
minimal sketch follows this list).
(2) It seems DFS has directory access control. This needs testing by others.
(3) The FS Shell's paths are a little strange: we cannot start with '/'.
For example, /test/test.out is illegal, but test/test.out is OK.
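
For step (1), a minimal hadoop-site.xml sketch is below. The fs.default.name
value is an assumption; it must match the NameNode host and port of your own
cluster:

<?xml version="1.0"?>
<configuration>
  <property>
    <name>fs.default.name</name>
    <!-- assumption: NameNode on localhost, port 9000 -->
    <value>localhost:9000</value>
  </property>
</configuration>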

Thanks to all!
It's a pleasure to learn Hadoop!
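
For the archives, here is a minimal sketch of the client code that works for
me now, following Raghu's suggestion below to use an absolute DFS Path; the
/nutch/out and /ryan/test paths are only examples:

import java.io.IOException;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class HadoopWrite {

    public static void main(String[] args) throws IOException {
        // Picks up hadoop-site.xml from the classpath (step 1 above)
        Configuration dfsconf = new Configuration();
        FileSystem dfs = FileSystem.get(dfsconf);

        Path inFile = new Path("/nutch/out");   // local source file (example)
        Path outFile = new Path("/ryan/test");  // absolute DFS destination (example)
        dfs.copyFromLocalFile(inFile, outFile);
    }
}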

Ryan

On Dec 1, 2007 12:55 PM, dhruba Borthakur <[EMAIL PROTECTED]> wrote:

> There is a dfs shell utility to do this too. Please see if this utility
> works on your cluster. It can be used as follows:
>
> bin/hadoop dfs -copyFromLocal <localfile> <dfs path>
>
> Please see if it works on your cluster. If it does, then please compare
> the method org.apache.hadoop.fs.FsShell.copyFromLocal() with your code.
>
> Thanks,
> dhruba
>
> -----Original Message-----
> From: Raghu Angadi [mailto:[EMAIL PROTECTED]]
> Sent: Friday, November 30, 2007 8:49 PM
> To: [email protected]
> Subject: Re: Any one can tell me about how to write to HDFS?
>
>
> try 'Path outFile = new Path("/ryan/test");'
> also check if there is any useful message in the Namenode log.
>
> Raghu.
>
> Ryan Wang wrote:
> > Hope this version can attract others' attention.
> >
> > Hadoop Version:  0.15.0
> > JDK version: Sun JDK 6.0.3
> > Platform: Ubuntu 7.10
> > IDE:   Eclipse 3.2
> > Code:
> >
> > import java.io.IOException;
> >
> > import org.apache.hadoop.conf.Configuration;
> > import org.apache.hadoop.fs.FileSystem;
> > import org.apache.hadoop.fs.Path;
> >
> > public class HadoopWrite {
> >
> >     /**
> >      * @param args
> >      */
> >     public static void main(String[] args) throws IOException{
> >         Configuration dfsconf = new Configuration();
> >         FileSystem dfs;
> >         dfs = FileSystem.get(dfsconf);
> >         Path inFile = new Path("/nutch/out");
> >         Path outFile = new Path("ryan/test");
> >         dfs.copyFromLocalFile(inFile, outFile);
> >
> >     }
> >
> > }
>
