Hi all,

While upgrading Hadoop to 1.0.x, we found that the shuffleInput metrics (recorded for 
each reduce task) are not published under the “mapred” context. As the following 
implementation in ReduceTask.java shows, the shuffleInput registry currently ends up 
under the “default” context, because the context is never set explicitly.

Consequently, no “Hadoop_mapred_shuffleInput” directory is created under the 
repos/*/ folders.

    class ShuffleClientInstrumentation implements MetricsSource {
      final MetricsRegistry registry = new MetricsRegistry("shuffleInput");
      final MetricMutableCounterLong inputBytes =
          registry.newCounter("shuffle_input_bytes", "", 0L);
      final MetricMutableCounterInt failedFetches =
          registry.newCounter("shuffle_failed_fetches", "", 0);
      final MetricMutableCounterInt successFetches =
          registry.newCounter("shuffle_success_fetches", "", 0);
      private volatile int threadsBusy = 0;

      @SuppressWarnings("deprecation")
      ShuffleClientInstrumentation(JobConf conf) {
        registry.tag("user", "User name", conf.getUser())
                .tag("jobName", "Job name", conf.getJobName())
                .tag("jobId", "Job ID", ReduceTask.this.getJobID().toString())
                .tag("taskId", "Task ID", getTaskID().toString())
                .tag("sessionId", "Session ID", conf.getSessionId());
      }
      // ...
    }
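
For reference, here is a minimal sketch of the kind of change we have in mind: setting 
the context to “mapred” explicitly when the instrumentation is constructed. This assumes 
MetricsRegistry offers a setContext(...) helper in this code base (as later metrics2 
versions do); if it does not, attaching the context as an explicit tag should have the 
same effect. This is untested and only illustrates the idea.

      @SuppressWarnings("deprecation")
      ShuffleClientInstrumentation(JobConf conf) {
        // Assumption: MetricsRegistry provides setContext(...). If not, the
        // context would have to be attached as an explicit tag instead.
        registry.setContext("mapred");  // publish shuffleInput under the "mapred" context
        registry.tag("user", "User name", conf.getUser())
                .tag("jobName", "Job name", conf.getJobName())
                .tag("jobId", "Job ID", ReduceTask.this.getJobID().toString())
                .tag("taskId", "Task ID", getTaskID().toString())
                .tag("sessionId", "Session ID", conf.getSessionId());
      }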



Thank you && Best Regards,
Grace (Huang Jie)
---------------------------------------------------------------------
SSG PRC Cloud Computing
Intel Asia-Pacific Research & Development Ltd.
No. 880 Zi Xing Road
Shanghai, PRC, 200241
Phone: (86-21) 61166031
