On a 32-bit machine, the largest heap the JVM can give you is roughly 1.5 GB to
2.0 GB.  So if you want a bigger heap, say 3000 MB, you need a 64-bit machine
and a 64-bit JVM.
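
If it helps, this is roughly how the child-task heap gets set from the driver
with the old JobConf API (just a sketch; the class name is a placeholder):

    import org.apache.hadoop.mapred.JobConf;

    public class HeapConfigSketch {
        public static JobConf buildConf() {
            JobConf conf = new JobConf(HeapConfigSketch.class);
            // On a 32-bit JVM, -Xmx much past ~2000m will not even start;
            // -Xmx3000m needs a 64-bit JVM on the task nodes.
            conf.set("mapred.child.java.opts", "-Xmx2000m");
            return conf;
        }
    }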

On Tue, Oct 12, 2010 at 4:50 AM, Shi Yu <[email protected]> wrote:

> Hi,
>
> I want to load a serialized HashMap object in Hadoop. The file of the stored
> object is 200 MB. I can read that object efficiently in plain Java by setting -Xmx
> to 1000M.  However, in Hadoop I can never load it into memory. The code is
> very simple (it just reads an ObjectInputStream) and no map/reduce is
> implemented yet.  I set mapred.child.java.opts=-Xmx3000M and still get
> "java.lang.OutOfMemoryError: Java heap space".  Could anyone explain a little
> bit how memory is allocated to the JVM in Hadoop? Why does Hadoop take up so
> much memory?  If a program requires 1 GB of memory on a single node, how much
> memory does it (generally) require in Hadoop?
>
> Thanks.
>
> Shi
>
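
About loading the serialized HashMap itself: the usual pattern is to read it
once per task in configure(), not once per record. A rough sketch against the
old mapred API (the HDFS path and the map's key/value types here are
assumptions, not your actual code):

    import java.io.ObjectInputStream;
    import java.util.HashMap;

    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;
    import org.apache.hadoop.io.LongWritable;
    import org.apache.hadoop.io.Text;
    import org.apache.hadoop.mapred.JobConf;
    import org.apache.hadoop.mapred.MapReduceBase;
    import org.apache.hadoop.mapred.Mapper;
    import org.apache.hadoop.mapred.OutputCollector;
    import org.apache.hadoop.mapred.Reporter;

    public class LookupMapper extends MapReduceBase
            implements Mapper<LongWritable, Text, Text, Text> {

        private HashMap<String, String> lookup;   // assumed key/value types

        @Override
        public void configure(JobConf job) {
            try {
                FileSystem fs = FileSystem.get(job);
                // "lookup/hashmap.ser" is a placeholder HDFS path
                ObjectInputStream in =
                        new ObjectInputStream(fs.open(new Path("lookup/hashmap.ser")));
                lookup = (HashMap<String, String>) in.readObject();
                in.close();
            } catch (Exception e) {
                throw new RuntimeException("failed to load serialized HashMap", e);
            }
        }

        public void map(LongWritable key, Text value,
                        OutputCollector<Text, Text> out, Reporter reporter) {
            // look up value.toString() in the map and emit as needed
        }
    }

Also keep in mind that a 200 MB serialized file will typically expand to quite
a bit more than 200 MB of live HashMap objects on the heap, so the child -Xmx
needs headroom well beyond the file size.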


-- 
Yours sincerely,
Charles Lee
