Just tried with 512 MB and 1 GB... and guess what, it finally started working at 1 GB!
Is there a way to lower this requirement? I'll also try to home in on the
minimum amount of RAM needed. I really can't afford to give a whole GB of RAM
to my VM; I'll run out of juice that way :(

(A config sketch I plan to try is at the bottom of this mail, below the quoted
thread.)

Regards,

On Tue, May 6, 2008 at 12:49 AM, Ahmad Humayun <[EMAIL PROTECTED]> wrote:
> Well, my VM is allocated 256 MB; I'll just increase it and report back.
>
> Plus, I have only tried HelloWorld programs so far, and since they hardly
> use any memory, they work.
>
> Regards,
>
> On Tue, May 6, 2008 at 12:41 AM, Christophe Taton <[EMAIL PROTECTED]>
> wrote:
> >
> > Hi Ahmad,
> >
> > As the error message suggests, your issue is likely related to the
> > amount of memory available:
> >
> >     Error occurred during initialization of VM
> >     Could not reserve enough space for object heap
> >
> > How much memory did you allocate to your VM? Can you run any other Java
> > applications in your JVM?
> >
> > Christophe
> >
> > On Mon, May 5, 2008 at 9:33 PM, Ahmad Humayun <[EMAIL PROTECTED]>
> > wrote:
> > >
> > > Hi there,
> > >
> > > Has anybody tried running Hadoop on VMware (6.0)? I have installed
> > > openSUSE 10.2 as a guest OS, and I have been trying to get Hadoop
> > > started, but whatever I do with bin/hadoop, I keep getting this error:
> > >
> > > *Error occurred during initialization of VM
> > > Could not reserve enough space for object heap
> > > Could not create the Java virtual machine.*
> > >
> > > Any ideas? Is it a problem with VMware? Or maybe my Java environment
> > > setup? Or am I simply doing something wrong in setting up Hadoop?
> > >
> > > Thanks again!
> > >
> > > Regards,
> > > --
> > > Ahmad Humayun
> > > Research Assistant
> > > Computer Science Dpt., LUMS
> > > +92 321 4457315
>
> --
> Ahmad Humayun
> Research Assistant
> Computer Science Dpt., LUMS
> +92 321 4457315

--
Ahmad Humayun
Research Assistant
Computer Science Dpt., LUMS
+92 321 4457315
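P.S. The config sketch mentioned above, for anyone else trying to shrink the
footprint. I'm assuming the stock conf layout of recent Hadoop releases, where
conf/hadoop-env.sh exports HADOOP_HEAPSIZE (the maximum heap, in MB, given to
the Hadoop daemons) and conf/hadoop-site.xml can override
mapred.child.java.opts (the -Xmx passed to the forked task JVMs). This is
untested on my side and the numbers below are only guesses, not
recommendations:

    # conf/hadoop-env.sh -- cap the daemons' heap below the VM's physical RAM
    # (256 is a guess; the default is considerably larger)
    export HADOOP_HEAPSIZE=256

    <!-- conf/hadoop-site.xml -- smaller heap for the map/reduce task JVMs -->
    <property>
      <name>mapred.child.java.opts</name>
      <value>-Xmx128m</value>
    </property>

If lowering these lets the daemons start inside a 512 MB (or smaller) VM, I'll
report back with the minimum that worked.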
