OK, that wasn't the real error; the real one looks like the "Child Error ... exit with nonzero status of 127" in the job history entry quoted below (this is when running under Cygwin). I am guessing that the setup task failed. Does the mapred TaskRunner launch each task in a new JVM process? That launch seems to be what is failing.
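As far as I can tell, exit status 127 is bash's "command not found" code, and in 0.20 the TaskTracker starts each task attempt as a separate child JVM through a small shell script run by bash, so a 127 from the child would usually mean bash could not find or run the java command in the task's environment. A minimal check from the Cygwin shell (the JDK path below is only an example, adjust it to the local install):

$ which java                      # should resolve to a real java binary
$ java -version                   # should run without error
# conf/hadoop-env.sh should export a JAVA_HOME that bash can resolve, e.g.:
export JAVA_HOME=/cygdrive/c/Java/jdk1.6.0_26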
MapAttempt TASK_TYPE="SETUP" TASKID="task_201108130149_0001_m_000002" TASK_ATTEMPT_ID="attempt_201108130149_0001_m_000002_1" START_TIME="1313214584866" TRACKER_NAME="tracker_USER-2\.xxxx\.com:localhost/127\.0\.0\.1:2937" HTTP_PORT="50060" .
MapAttempt TASK_TYPE="SETUP" TASKID="task_201108130149_0001_m_000002" TASK_ATTEMPT_ID="attempt_201108130149_0001_m_000002_1" TASK_STATUS="FAILED" FINISH_TIME="1313214588194" HOSTNAME="USER-2\.xxxx.com" ERROR="java\.lang\.Throwable: Child Error
    at org\.apache\.hadoop\.mapred\.TaskRunner\.run(TaskRunner\.java:271)
Caused by: java\.io\.IOException: Task process exit with nonzero status of 127\.
    at org\.apache\.hadoop\.mapred\.TaskRunner\.run(TaskRunner\.java:258)

________________________________
From: Brown, Berlin [GCG-PFS]
Sent: Friday, August 12, 2011 3:49 PM
To: '[email protected]'
Cc: [email protected]
Subject: basic usage map/reduce error

I am getting this error with a mostly out-of-the-box configuration of version 0.20.203.0 when I try to run the wordcount example.

$ hadoop jar hadoop-examples-0.20.203.0.jar wordcount /user/hduser/gutenberg /user/hduser/gutenberg-output6

2011-08-12 15:45:38,299 WARN org.apache.hadoop.mapred.TaskRunner: attempt_201108121544_0001_m_000008_2 : Child Error
java.io.IOException: Task process exit with nonzero status of 127.
    at org.apache.hadoop.mapred.TaskRunner.run(TaskRunner.java:258)

2011-08-12 15:45:38,878 WARN org.apache.hadoop.mapred.TaskLog: Failed to retrieve stdout log for task: attempt_201108121544_0001_m_000008_1
java.io.FileNotFoundException: E:\projects\workspace_mar11\ParseLogCriticalErrors\lib\h\logs\userlogs\job_201108121544_0001\attempt_201108121544_0001_m_000008_1\log.index (The system cannot find the file specified)
    at java.io.FileInputStream.open(Native Method)
    at java.io.FileInputStream.<init>(FileInputStream.java:106)
    at org.apache.hadoop.io.SecureIOUtils.openForRead(SecureIOUtils.java:102)
    at org.apache.hadoop.mapred.TaskLog.getAllLogsFileDetails(TaskLog.java:112)
    ...

The userlogs/job* directory is empty. Maybe there is some permission issue with those directories. I am running on Windows with Cygwin, so I don't really know what permissions to set.
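Regarding the permission guess at the end of the quoted message: if it is worth testing, a quick experiment from the Cygwin shell might look like the following, with the directory taken from the path in the exception above, so adjust it to the local layout:

$ cd /cygdrive/e/projects/workspace_mar11/ParseLogCriticalErrors/lib/h
$ ls -ld logs/userlogs
$ chmod -R 755 logs/userlogs      # loosen permissions purely as an experiment
$ ls logs/userlogs/job_201108121544_0001

Although an empty job directory could also simply mean the child JVM died before it ever wrote log.index, which would match the exit-status-127 failure above.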
