I am running a Hadoop Java program in local single-JVM mode from an IDE
(IntelliJ), and I want to profile its performance.  Following the
instructions in chapter 5 of *Hadoop: The Definitive Guide*, I added the
following properties to my job configuration file.


  <property>
    <name>mapred.task.profile</name>
    <value>true</value>
  </property>

  <property>
    <name>mapred.task.profile.params</name>
    <value>-agentlib:hprof=cpu=samples,heap=sites,depth=6,force=n,thread=y,verbose=n,file=%s</value>
  </property>

  <property>
    <name>mapred.task.profile.maps</name>
    <value>0-</value>
  </property>

  <property>
    <name>mapred.task.profile.reduces</name>
    <value>0-</value>
  </property>


With these properties, the job runs as before, but I don't see any profiler
output.
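For reference, my understanding is that the same settings can also be made programmatically through the old-API JobConf (method names as I read them in the Hadoop javadoc; this is just a sketch of what I believe is equivalent, not something I've confirmed changes the behavior):

```java
import org.apache.hadoop.mapred.JobConf;

// Fragment: programmatic equivalent of the XML properties above,
// set on the job's JobConf before submitting the job.
JobConf conf = new JobConf();
conf.setProfileEnabled(true);                          // mapred.task.profile
conf.setProfileParams(
    "-agentlib:hprof=cpu=samples,heap=sites,depth=6,"
    + "force=n,thread=y,verbose=n,file=%s");           // mapred.task.profile.params
conf.setProfileTaskRange(true, "0-");                  // mapred.task.profile.maps
conf.setProfileTaskRange(false, "0-");                 // mapred.task.profile.reduces
```

Setting them either way, the result for me is the same as described below.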

I also tried simply setting


  <property>
    <name>mapred.child.java.opts</name>
    <value>-agentlib:hprof=cpu=samples,heap=sites,depth=6,force=n,thread=y,verbose=n,file=%s</value>
  </property>


Again, no profiler output.

I know HPROF is available because running "java -agentlib:hprof=help" at
the command prompt prints the HPROF usage text.

Is it possible to run HPROF on a local Hadoop job?  Am I doing something
wrong?
