Re: Hadoop Disk Error

2010-04-27 Thread Andrzej Bialecki
precedence, I no longer get the hadoop disk error. While I thought this means the problem comes from the nutch script, not the code itself, manually trying to set system calls to /opt/freeware/bin didn't fix it. I assume until detailed debugging is done, further releases w

Re: Hadoop Disk Error

2010-04-26 Thread Joshua J Pavel
Sending this out to close the thread in case anyone else experiences this problem: nutch 1.0 is not AIX-friendly (0.9 is). I'm not 100% sure which command it may be, but by modifying my path so that /opt/freeware/bin has precedence, I no longer get the hadoop disk error. While I thought this
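
For anyone else on AIX, a minimal sketch of the workaround described here, assuming the GNU userland from the AIX Toolbox lives under /opt/freeware/bin and the crawl is launched through the stock bin/nutch script (the crawl arguments are only examples):

    # Put the GNU tools ahead of the native AIX ones, so whatever command
    # Hadoop shells out to (df, du, whoami, ...) behaves the way it expects.
    export PATH=/opt/freeware/bin:$PATH

    # Then launch the crawl as usual, for example:
    bin/nutch crawl urls -dir crawl -depth 3 -topN 1000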

Re: Hadoop Disk Error

2010-04-21 Thread Joshua J Pavel

Re: Hadoop Disk Error

2010-04-21 Thread Julien Nioche
1 or even 2 GB are far from impressing. Why don't you switch hadoop.tmp.dir to a place with, say, 50
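
For reference, moving hadoop.tmp.dir for a local Nutch 1.0 run is a one-property change; a sketch below, assuming conf/hadoop-site.xml is the config file in use and /data/hadoop-tmp is only a placeholder for a filesystem with plenty of free space (the property goes inside the existing <configuration> element):

    <!-- conf/hadoop-site.xml: move Hadoop's scratch space off /tmp -->
    <property>
      <name>hadoop.tmp.dir</name>
      <value>/data/hadoop-tmp</value>
    </property>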

RE: Hadoop Disk Error

2010-04-21 Thread Joshua J Pavel

RE: Hadoop Disk Error

2010-04-20 Thread Arkadi.Kosmynin
Yes - how much free space does it need? We ran 0.9 using /tmp, and that has ~ 1 GB. After I first saw this error, I moved it to another filesystem where I have 2 GB free (maybe not "gigs and gigs", but more than I think I n
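
A quick way to see whether the space is actually there before a fetch, assuming hadoop.tmp.dir still resolves under /tmp (for a local run the usual default is /tmp/hadoop-<username>; substitute whatever your config points at):

    # Free space, in 1K blocks, on the filesystem backing Hadoop's temp dir
    df -Pk /tmp

    # How much a previous job has already left behind there
    du -sk /tmp/hadoop-$USER 2>/dev/null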

Re: Hadoop Disk Error

2010-04-20 Thread Joshua J Pavel

Re: Hadoop Disk Error

2010-04-20 Thread Joshua J Pavel

Re: Hadoop Disk Error

2010-04-20 Thread Julien Nioche
> Are you sure that you have enough space in the temporary directory used by Hadoop?

RE: Hadoop Disk Error

2010-04-20 Thread Joshua J Pavel

RE: Hadoop Disk Error

2010-04-20 Thread Joshua J Pavel

RE: Hadoop Disk Error

2010-04-19 Thread Arkadi.Kosmynin
Are you sure that you have enough space in the temporary directory used by Hadoop?
> Some more information, if anyone can help: If I turn

Re: Hadoop Disk Error

2010-04-19 Thread Joshua J Pavel

Re: Hadoop Disk Error

2010-04-16 Thread Joshua J Pavel

Hadoop Disk Error

2010-04-16 Thread Joshua J Pavel
We're just now moving from a nutch 0.9 installation to 1.0, so I'm not entirely new to this. However, I can't even get past the first fetch now, due to a hadoop error. Looking in the mailing list archives, this error is normally caused by either permissions or a full disk. I overrode the use
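
Since the two usual causes are permissions and a full disk, a couple of hedged first checks (the /tmp/hadoop-$USER path assumes the default hadoop.tmp.dir of a local run; adjust it if the property has been overridden):

    # Can the user running the crawl create and write Hadoop's temp dir?
    mkdir -p /tmp/hadoop-$USER && touch /tmp/hadoop-$USER/.write_test \
      && rm /tmp/hadoop-$USER/.write_test && echo "temp dir is writable"

    # Or is the filesystem behind it simply full?
    df -Pk /tmp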