On 2010-04-26 22:31, Joshua J Pavel wrote:
>
> Sending this out to close the thread if anyone else experiences this
> problem: nutch 1.0 is not AIX-friendly (0.9 is).
>
> I'm not 100% sure which command it may be, but by modifying my path so
> that /opt/freeware/bin has precedence, I no longer get the error.
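The PATH workaround described above can be sketched as follows (a minimal sketch; /opt/freeware/bin is the GNU-tools directory named in the mail, and a POSIX-compatible shell is assumed):

```shell
# Put the AIX Linux-toolbox (GNU) utilities ahead of the native AIX ones,
# so commands invoked by the nutch/hadoop scripts resolve to the GNU versions.
export PATH=/opt/freeware/bin:$PATH

# Confirm the ordering took effect: the first PATH entry should now be
# /opt/freeware/bin.
echo "${PATH%%:*}"
```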
> 1 or even 2 GB are far from impressive. Why don't you switch
> hadoop.tmp.dir to a place with, say, 50
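The hadoop.tmp.dir change suggested above would go in nutch's conf/hadoop-site.xml (a sketch; the path /bigdisk/hadoop-tmp is illustrative, not from the thread):

```xml
<property>
  <name>hadoop.tmp.dir</name>
  <value>/bigdisk/hadoop-tmp</value>
  <description>Base directory for Hadoop's temporary files; point it at
  a filesystem with plenty of free space.</description>
</property>
```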
>
> From:
> To:
> Date: 04/20/2010 06:30 PM
> Subject: RE: Hadoop Disk Error
>
> 1 or even 2 GB are far
---|
|>
| Subject: |
|>
>----------
To: nutch-user@lucene.apache.org
Subject: Re: Hadoop Disk Error
Yes - how much free space does it need? We ran 0.9 using /tmp, and that has ~1
GB. After I first saw this error, I moved it to another filesystem where I have
2 GB free (maybe not "gigs and gigs", but more than I think I need).
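Free space on a candidate filesystem can be checked before moving hadoop.tmp.dir (a sketch; /tmp stands in for whichever directory is actually configured):

```shell
# Report free space (in 1 KB blocks) on the filesystem holding the Hadoop
# temp directory; substitute your configured hadoop.tmp.dir path for /tmp.
df -k /tmp
```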
04/20/2010 01:41 PM
>
> From:
> To:
> Date: 04/19/2010 05:53 PM
> Subject: RE: Hadoop Disk Error
>
Are you sure that you have enough space in the temporary directory used by
Hadoop?
From: Joshua J Pavel [mailto:jpa...@us.ibm.com]
Sent: Tuesday, 20 April 2010 6:42 AM
To: nutch-user@lucene.apache.org
Subject: Re: Hadoop Disk Error
Some more information, if anyone can help:
If I turn
fwiw, the error does seem to be valid: from the taskTracker/jobcache
directory, I only have something for job 1-4.
ls -la
total 0
drwxr-xr-x  6 root system 256 Apr 16 19:01 .
drwxr-xr-x  3 root system 256 Apr 16 19:01 ..
drwxr-xr-x  4 root system 256 A