It was the presence of the /etc/hadoop directory that messed me up, and then
HADOOP_CLIENT_OPTS was set and sticky, which kept it messed up thereafter.

After nuking /etc/hadoop, wiping out the hadoop-1.0.0 directory, closing my ssh 
session, and trying again, it works.

    - Tim.

________________________________________
From: Tim Broberg [[email protected]]
Sent: Tuesday, February 07, 2012 4:33 PM
To: [email protected]
Subject: RE: out of memory running examples

Ok, so I edited the bin/hadoop script to dump out what it is doing when it 
executes java:

/usr/java/default/bin/java -Dproc_jar -Xmx8192m -Djava.net.preferIPv4Stack=true 
-Xmx128m -Xmx128m -Xmx128m -Dhadoop.log.dir=/var/log/hadoop/tbroberg 
-Dhadoop.log.file=hadoop.log -
... blah blah blah

You can see that when I set HADOOP_HEAPSIZE=8192, it adds "-Xmx8192m" to the
command line, but there are also three other -Xmx128m flags from elsewhere that
I can't easily override.
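Why the 128m limit wins, assuming HotSpot's usual rule that the last -Xmx flag
on the command line takes effect, can be sketched by extracting the final -Xmx
from the assembled flag list (the flags below mirror the dump above):

```shell
# HotSpot honors the last -Xmx it sees, so replicate that rule by
# taking the final -Xmx flag from the assembled command line.
flags="-Xmx8192m -Djava.net.preferIPv4Stack=true -Xmx128m -Xmx128m -Xmx128m"
effective=$(printf '%s\n' $flags | grep '^-Xmx' | tail -n 1)
echo "$effective"   # prints -Xmx128m
```

So as long as those trailing -Xmx128m flags are appended after the
HADOOP_HEAPSIZE-derived flag, 128m is what the JVM actually uses.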

I will try to track these down, identify what the issues are behind them, and 
log a bug.

Adam and Harsh, thanks for helping me understand how it is *supposed* to work.

    - Tim.
________________________________________
From: Adam Brown [[email protected]]
Sent: Tuesday, February 07, 2012 11:05 AM
To: [email protected]
Subject: Re: out of memory running examples

Hey Tim,

When a job runs, it will spawn a JVM and set runtime parameters based on
the properties file.

You can override those settings from the shell:

e.g., before you issue your "hadoop jar ..." command:

     export HADOOP_HEAPSIZE=512


The hadoop shell script checks whether this is set and, if so, should use its
value.
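As a rough sketch of that check (variable names follow the 1.x bin/hadoop
script, but this is an illustration, not the script itself):

```shell
# Default heap flag used when HADOOP_HEAPSIZE is not set.
JAVA_HEAP_MAX="-Xmx1000m"
HADOOP_HEAPSIZE=512   # e.g. exported before running "hadoop jar ..."
# When HADOOP_HEAPSIZE is set, rebuild the heap flag from it.
if [ "$HADOOP_HEAPSIZE" != "" ]; then
  JAVA_HEAP_MAX="-Xmx${HADOOP_HEAPSIZE}m"
fi
echo "$JAVA_HEAP_MAX"   # prints -Xmx512m
```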

After this, if you have compiled your job, you have another opportunity to
override values via your JobConf.

@see
http://hadoop.apache.org/common/docs/current/api/org/apache/hadoop/mapred/JobConf.html
http://hadoop.apache.org/common/docs/current/api/org/apache/hadoop/conf/Configuration.html

These settings take final priority in your job.
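For a ToolRunner-based job, the same properties can also be passed on the
command line through the generic -D option, which lands in the job's
Configuration before your own JobConf overrides (the jar and driver names
here are placeholders):

```shell
# Hypothetical invocation: set the child-JVM heap for this run only.
hadoop jar my-job.jar MyDriver -Dmapred.child.java.opts=-Xmx512m input output
```

This needs an actual Hadoop installation, so it is shown for illustration
only.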

cheers,

Adam

On Tue, Feb 7, 2012 at 10:00 AM, Tim Broberg <[email protected]> wrote:

> Thanks. I'm a little frustrated and/or paranoid right now. At the moment,
> I can't get the songs "Pink Elephants on Parade" and "Heffalumps and
> Woozles" out of my head, and I think Hadoop is out to get me personally.
>
> If you don't mind, can you tell me specifically where does one set this
> value and to what for a local-only setup?
>
> Thanks,
>    - Tim.
>
> "Beware, beware, bewaaaaaaaaaaaaarrrree!"
> ________________________________________
> From: Adam Brown [[email protected]]
> Sent: Tuesday, February 07, 2012 9:47 AM
> To: [email protected]
> Subject: Re: out of memory running examples
>
> Tim,
>
> That will allow you to increase the spawned vm's max heap size, which based
> on your trace looks like a good thing to try.
>
> Adam
>
> On Tue, Feb 7, 2012 at 9:42 AM, Tim Broberg <[email protected]> wrote:
>
> > mapred.child.java.opts is not set. Should it be?
> >
> > ________________________________________
> > From: Uma Maheswara Rao G [[email protected]]
> > Sent: Tuesday, February 07, 2012 2:27 AM
> > To: [email protected]
> > Subject: RE: out of memory running examples
> >
> > What Java heap space have you configured for the property
> > mapred.child.java.opts?
> > ________________________________________
> > From: Tim Broberg [[email protected]]
> > Sent: Tuesday, February 07, 2012 3:20 PM
> > To: [email protected]
> > Subject: out of memory running examples
> >
> > I'm trying to run the basic example from
> > hadoop/hadoop-1.0.0/docs/single_node_setup.html.
> >
> > I'm getting java.lang.OutOfMemoryError's when I run the grep example from
> > that page.
> >
> > Stackoverflow suggests various tweaks to the command line,
> > mapred-site.xml, or hadoop-env.sh, none of which seem to be helping in my
> > case.
> >
> > When I tweak hadoop-env.sh to echo text to a file, that file doesn't show
> > up, which suggests that hadoop-env.sh isn't even getting executed.
> >
> > Any hints on debugging this?
> >
> >    - Tim.
> >
> > [tbroberg@san-mothra hadoop-1.0.0]$ bin/hadoop jar
> > hadoop-examples-1.0.0.jar grep input output 'dfs[a-z.]+'
> > Warning: $HADOOP_HOME is deprecated.
> > 12/02/07 01:39:35 INFO util.NativeCodeLoader: Loaded the native-hadoop
> > library
> > 12/02/07 01:39:35 INFO mapred.FileInputFormat: Total input paths to
> > process : 7
> > 12/02/07 01:39:35 INFO mapred.JobClient: Running job: job_local_0001
> > 12/02/07 01:39:35 INFO util.ProcessTree: setsid exited with exit code 0
> > 12/02/07 01:39:35 INFO mapred.Task:  Using ResourceCalculatorPlugin :
> > org.apache.hadoop.util.LinuxResourceCalculatorPlugin@4c349471
> > 12/02/07 01:39:35 INFO mapred.MapTask: numReduceTasks: 1
> > 12/02/07 01:39:35 INFO mapred.MapTask: io.sort.mb = 100
> > 12/02/07 01:39:35 WARN mapred.LocalJobRunner: job_local_0001
> > java.lang.OutOfMemoryError: Java heap space
> >        at
> > org.apache.hadoop.mapred.MapTask$MapOutputBuffer.<init>(MapTask.java:949)
> >        at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:428)
> >        at org.apache.hadoop.mapred.MapTask.run(MapTask.java:372)
> >        at
> > org.apache.hadoop.mapred.LocalJobRunner$Job.run(LocalJobRunner.java:212)
> > 12/02/07 01:39:36 INFO mapred.JobClient:  map 0% reduce 0%
> > 12/02/07 01:39:36 INFO mapred.JobClient: Job complete: job_local_0001
> > 12/02/07 01:39:36 INFO mapred.JobClient: Counters: 0
> > 12/02/07 01:39:36 INFO mapred.JobClient: Job Failed: NA
> > java.io.IOException: Job failed!
> >        at org.apache.hadoop.mapred.JobClient.runJob(JobClient.java:1265)
> >        at org.apache.hadoop.examples.Grep.run(Grep.java:69)
> >        at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
> >        at org.apache.hadoop.examples.Grep.main(Grep.java:93)
> >        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> >        at
> >
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
> >        at
> >
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
> >        at java.lang.reflect.Method.invoke(Method.java:597)
> >        at
> >
> org.apache.hadoop.util.ProgramDriver$ProgramDescription.invoke(ProgramDriver.java:68)
> >        at
> > org.apache.hadoop.util.ProgramDriver.driver(ProgramDriver.java:139)
> >        at
> > org.apache.hadoop.examples.ExampleDriver.main(ExampleDriver.java:64)
> >        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> >        at
> >
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
> >        at
> >
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
> >        at java.lang.reflect.Method.invoke(Method.java:597)
> >        at org.apache.hadoop.util.RunJar.main(RunJar.java:156)
> > [tbroberg@san-mothra hadoop-1.0.0]$
> >
> > ________________________________
> > The information and any attached documents contained in this message
> > may be confidential and/or legally privileged. The message is
> > intended solely for the addressee(s). If you are not the intended
> > recipient, you are hereby notified that any use, dissemination, or
> > reproduction is strictly prohibited and may be unlawful. If you are
> > not the intended recipient, please contact the sender immediately by
> > return e-mail and destroy all copies of the original message.
> >
>
>
>
> --
> Adam Brown
> Enablement Engineer
> Hortonworks
> <http://www.hadoopsummit.org/>
>
>



--
Adam Brown
Enablement Engineer
Hortonworks
<http://www.hadoopsummit.org/>
