13940 clone(child_stack=0,
flags=CLONE_CHILD_CLEARTID|CLONE_CHILD_SETTID|SIGCHLD,
child_tidptr=0x7f51260459e0) = -1 ENOMEM (Cannot allocate memory)

(a few times)
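(Context, my addition rather than part of the thread: fork()/clone() of a JVM with a large -Xmx has to commit, copy-on-write, roughly as much virtual memory as the parent already holds, so under strict overcommit accounting the kernel can refuse the clone with ENOMEM even when plenty of RAM looks free. You can see the accounting the kernel uses in /proc/meminfo:)

```shell
# Compare what the kernel will allow against what is already committed.
# A clone()/fork() that would push Committed_AS past CommitLimit fails
# with ENOMEM when overcommit accounting is strict.
grep -E '^(CommitLimit|Committed_AS)' /proc/meminfo
```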

Solutions (pick one):
1) allocate more swap on your node
2) futz with /proc/sys/vm/overcommit_ratio (make it bigger) or
/proc/sys/vm/overcommit_memory (set to 1)

For option #2 you risk letting the Linux OOM killer loose on your system,
which is usually no fun, so #1 is what I'd recommend.
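(A sketch of both options, my commands rather than Todd's; sizes and paths are examples, and the commented parts need root:)

```shell
# Read-only inspection first:
cat /proc/sys/vm/overcommit_memory   # 0 = heuristic, 1 = always allow, 2 = strict
cat /proc/sys/vm/overcommit_ratio    # % of RAM counted toward the limit in mode 2
cat /proc/swaps                      # currently active swap devices, if any

# Option 1 -- add a 2G swap file (size is just an example; run as root):
#   dd if=/dev/zero of=/swapfile bs=1M count=2048
#   chmod 600 /swapfile
#   mkswap /swapfile && swapon /swapfile

# Option 2 -- loosen overcommit (run as root; OOM-killer risk as noted above):
#   sysctl -w vm.overcommit_memory=1
#   # or keep strict mode but raise the ratio instead:
#   sysctl -w vm.overcommit_ratio=90
```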

-Todd

On Mon, Dec 7, 2009 at 1:00 AM, pavel kolodin
<[email protected]>wrote:

>
> Hi Todd,
>
> $ strace -o output.trace -f hadoop-0.20.1/bin/hadoop fs -put
> /tmp/idata/logogenPY_1259491560 /
> $ grep clone output.trace > 123.txt
>
> 123.txt:
> http://pastebin.com/m675546ed
>
> -Pavel.
>
>
>> Hi Pavel,
>>
>> Try this:
>>
>> strace -o output.trace -f hadoop fs -put ... etc ...
>>
>> then grep clone output.trace
>>
>> -Todd
>>
>> On Mon, Dec 7, 2009 at 12:43 AM, pavel kolodin
>> <[email protected]> wrote:
>>
>>> Hi Todd,
>>> Java is running as:
>>>
>>> /usr/lib/jvm/icedtea6-bin/bin/java <...> -Xmx1000m <...> -Xmx512 <...>
>>> -Xmx512 <...>
>>>
>>> (I don't know why several -Xmx parameters are given; I thought only the
>>> last one takes effect).
>>>
>>> hadoop-env.sh is:
>>> http://pastebin.com/m544f5b20
>>>
>>> ~2500MB is available on each VPS node (16G on host machine).
>>>
>>> Thank you for answer.
>>> -Pavel Kolodin.
>>>
>>>
>>>> Hi Pavel,
>>>>
>>>> Any chance you've changed the memory settings in hadoop-env.sh to give
>>>> absurdly large heap sizes?
>>>>
>>>> How much RAM is available on your machine?
>>>>
>>>> -Todd
>>>>
>>>> On Mon, Dec 7, 2009 at 12:12 AM, pavel kolodin
>>>> <[email protected]> wrote:
>>>>
>>>>> Hello.
>>>>>
>>>>> I am using hadoop-0.20.1 on two VPS nodes with Gentoo Linux (hardware =
>>>>> 16 Xeon CPUs, 64-bit). Previously I was using the same version on 2
>>>>> separate 32-bit machines and all was fine. It seems Hadoop cannot
>>>>> execute 'whoami'. Maybe the reason is something else, but Hadoop tells
>>>>> me that my name is "DrWho". For example:
>>>>>
>>>>> had...@hadoopmaster ~ $ hadoop-0.20.1/bin/hadoop fs -put
>>>>> /tmp/idata/bigbigbigfile /
>>>>> put: org.apache.hadoop.security.AccessControlException: Permission
>>>>> denied:
>>>>> user=DrWho, access=WRITE, inode="":hadoop:hadoop:rwxr-xr-x
>>>>> had...@hadoopmaster ~ $
>>>>>
>>>>> Running 'whoami' myself as user 'hadoop' works and returns 'hadoop'.
>>>>>
>>>>> Thank you for any suggestions!
>>>>> Pavel Kolodin.
>>>>>
>>>>>
>>>>>
>
>
> --
> Using Opera's revolutionary e-mail client: http://www.opera.com/mail/
>
