It looks like you're trying to write the output of a job to /root.  I'm not
familiar with Nutch, but I just looked at the source and I think you might
be launching the crawl from /root, so Nutch is trying to create a temporary
linkdb there.  Try launching the crawl from a directory you have write
permission for.
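
For example (a rough sketch only; the paths and crawl arguments below are
placeholders, not your actual setup), if the crawl is started from the
command line, running it from a writable working directory should keep the
temporary linkdb out of /root:

    # Run as the same user the crawl runs as, from a directory it can write to.
    cd /home/frank/crawl-work        # hypothetical writable directory
    bin/nutch crawl urls -dir crawl -depth 3

If Nutch is kicked off from inside your webapp instead, the same idea
applies: the working directory of the servlet container (and hadoop.tmp.dir)
would need to point somewhere that user can write to.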

On 8/24/07, Otto, Frank <[EMAIL PROTECTED]> wrote:
>
> Can nobody help me?
>
> Why does Hadoop write to /root/? I thought I could set all the needed
> directories via the folder settings in the Hadoop configuration files.
>
>
> kind regards
>
> Frank
>
> > -----Original Message-----
> > From: Otto, Frank [mailto:[EMAIL PROTECTED]
> > Sent: Wednesday, 22 August 2007 13:40
> > To: '[email protected]'
> > Subject: permission problems in hadoop MapFile (0.12)
> >
> >
> > hi,
> >
> > I'm running Nutch as a webapp on a web server hosted by a
> > provider. When I try to crawl, I get the
> > following exception:
> >
> > java.io.IOException: Mkdirs failed to create directory /root/linkdb-657635469/part-00000
> >     at org.apache.hadoop.io.MapFile$Writer.<init>(MapFile.java:121)
> >     at org.apache.hadoop.io.MapFile$Writer.<init>(MapFile.java:84)
> >     at org.apache.hadoop.mapred.MapFileOutputFormat.getRecordWriter(MapFileOutputFormat.java:44)
> >     at org.apache.hadoop.mapred.ReduceTask.run(ReduceTask.java:292)
> >     at org.apache.hadoop.mapred.LocalJobRunner$Job.run(LocalJobRunner.java:155)
> >
> >
> > I have no write access to /root/. How can I change this
> > directory? I have already set hadoop.tmp.dir in hadoop-site.xml.
> >
> >
> > kind regards
> >
> > Frank
> >
>
