[jira] Created: (MAPREDUCE-1646) Task Killing tests
Task Killing tests
------------------

Key: MAPREDUCE-1646
URL: https://issues.apache.org/jira/browse/MAPREDUCE-1646
Project: Hadoop Map/Reduce
Issue Type: Task
Components: test
Reporter: Vinay Kumar Thota

The following tasks are covered in the test:

1. In a running job, kill a task and verify that the job succeeds.
2. Set up a job with long-running tasks that write some output to HDFS. While one of the tasks is running, ensure that the output/_temporary/_attempt-id directory is created. Kill the task. After the task is killed, make sure that the output/_temporary/_attempt-id directory is cleaned up.
3. Set up a job with long-running tasks that write some output to HDFS. While one of the tasks is running, ensure that the output/_temporary/_attempt-id directory is created. Fail the task by simulating a map failure. After the task has failed, make sure that the output/_temporary/_attempt-id directory is cleaned up.

The important difference we are trying to check is between kill and fail; there should be a subtle difference between the two.

--
This message is automatically generated by JIRA.
-
You can reply to this email to add a comment to the issue online.
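The directory check in cases 2 and 3 can be sketched as follows. This is a simplified local-filesystem analogue (using java.nio.file instead of the HDFS FileSystem API, and a made-up attempt id), not the actual system-test code the issue will add:

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;

// Local analogue of verifying that output/_temporary/<attempt-id>
// exists while the task runs and is removed after a kill/fail.
class TempDirCleanupCheck {
    static boolean attemptDirExists(Path outputDir, String attemptId) {
        return Files.isDirectory(
            outputDir.resolve("_temporary").resolve(attemptId));
    }

    public static void main(String[] args) throws IOException {
        // Hypothetical attempt id, for illustration only.
        String attemptId = "attempt_000000_0000_m_000000_0";
        Path output = Files.createTempDirectory("output");
        Path attemptDir = output.resolve("_temporary").resolve(attemptId);
        Files.createDirectories(attemptDir);

        // While the task is running, the attempt directory must exist ...
        System.out.println(attemptDirExists(output, attemptId)); // true

        // ... and after the kill/fail cleanup, it must be gone.
        Files.delete(attemptDir);
        System.out.println(attemptDirExists(output, attemptId)); // false
    }
}
```

The real test would perform the same existence checks against HDFS paths via the Hadoop FileSystem API while the job is running.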
[jira] Created: (MAPREDUCE-1647) JvmEnv miss logSize argument
JvmEnv miss logSize argument
----------------------------

Key: MAPREDUCE-1647
URL: https://issues.apache.org/jira/browse/MAPREDUCE-1647
Project: Hadoop Map/Reduce
Issue Type: Bug
Components: tasktracker
Reporter: Guilin Sun

The JvmEnv class misses the logSize assignment in its constructor, so task JVMs will never limit their stdout/stderr output to the specified size.

{code:title=JvmManager.java|borderStyle=solid}
public JvmEnv(List<String> setup, Vector<String> vargs, File stdout,
    File stderr, long logSize, File workDir, Map<String,String> env,
    JobConf conf) {
  this.setup = setup;
  this.vargs = vargs;
  this.stdout = stdout;
  this.stderr = stderr;
  this.workDir = workDir;
  this.env = env;
  this.conf = conf;
  // note: logSize is accepted but never assigned to this.logSize
}
{code}
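The effect of the missing assignment can be reproduced with a small self-contained stand-in (a trimmed-down, hypothetical analogue of JvmManager.JvmEnv, not the actual Hadoop class):

```java
import java.util.List;
import java.util.Vector;

// Simplified stand-in for JvmManager.JvmEnv, keeping only the fields
// relevant to the bug; illustrative, not the real Hadoop class.
class JvmEnvSketch {
    List<String> setup;
    Vector<String> vargs;
    long logSize;

    JvmEnvSketch(List<String> setup, Vector<String> vargs, long logSize) {
        this.setup = setup;
        this.vargs = vargs;
        // Bug reproduced here: the constructor accepts logSize but never
        // assigns it, so the field keeps its default value of 0 and any
        // size limit derived from it is effectively disabled.
        // The fix is a one-line assignment: this.logSize = logSize;
    }
}

class Main {
    public static void main(String[] args) {
        JvmEnvSketch env = new JvmEnvSketch(List.of(), new Vector<>(), 4096L);
        System.out.println(env.logSize); // prints 0, not 4096
    }
}
```

With the one-line fix added, the field would carry the caller's value and the log-size limit would take effect.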
[jira] Created: (MAPREDUCE-1650) Exclude Private elements from generated MapReduce Javadoc
Exclude Private elements from generated MapReduce Javadoc
---------------------------------------------------------

Key: MAPREDUCE-1650
URL: https://issues.apache.org/jira/browse/MAPREDUCE-1650
Project: Hadoop Map/Reduce
Issue Type: Improvement
Components: documentation
Reporter: Tom White
Assignee: Tom White

Exclude elements annotated with InterfaceAudience.Private or InterfaceAudience.LimitedPrivate from Javadoc and JDiff.
[jira] Resolved: (MAPREDUCE-1647) JvmEnv miss logSize argument
[ https://issues.apache.org/jira/browse/MAPREDUCE-1647?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Guilin Sun resolved MAPREDUCE-1647.
-----------------------------------
Resolution: Duplicate

Duplicate of MAPREDUCE-1057.

JvmEnv miss logSize argument
----------------------------

Key: MAPREDUCE-1647
URL: https://issues.apache.org/jira/browse/MAPREDUCE-1647
Project: Hadoop Map/Reduce
Issue Type: Bug
Components: tasktracker
Reporter: Guilin Sun
Attachments: patch.txt

The JvmEnv class misses the logSize assignment in its constructor, so task JVMs will never limit their stdout/stderr output to the specified size.

{code:title=JvmManager.java|borderStyle=solid}
public JvmEnv(List<String> setup, Vector<String> vargs, File stdout,
    File stderr, long logSize, File workDir, Map<String,String> env,
    JobConf conf) {
  this.setup = setup;
  this.vargs = vargs;
  this.stdout = stdout;
  this.stderr = stderr;
  this.workDir = workDir;
  this.env = env;
  this.conf = conf;
}
{code}
[jira] Created: (MAPREDUCE-1652) Tasks need to emit a better error message when job-acls.xml file cannot be created
Tasks need to emit a better error message when job-acls.xml file cannot be created
----------------------------------------------------------------------------------

Key: MAPREDUCE-1652
URL: https://issues.apache.org/jira/browse/MAPREDUCE-1652
Project: Hadoop Map/Reduce
Issue Type: Bug
Affects Versions: 0.22.0
Reporter: Ravi Gummadi
Fix For: 0.22.0

If a task cannot create job-acls.xml in userlogs/$jobid/task-attempt-dir (because the disk is full, the disk has gone bad, etc.), then the task should emit a better error message instead of failing with a FileNotFoundException in writeJobACLs(). The stack trace currently shown is:

java.lang.Throwable: Child Error
        at org.apache.hadoop.mapred.TaskRunner.run(TaskRunner.java:242)
Caused by: java.io.FileNotFoundException: $mapred-local-dir/userlogs/job_201003240402_0402/attempt_201003240402_0402_m_002091_0/job-acl.xml (No such file or directory)
        at java.io.FileOutputStream.open(Native Method)
        at java.io.FileOutputStream.<init>(FileOutputStream.java:179)
        at java.io.FileOutputStream.<init>(FileOutputStream.java:131)
        at org.apache.hadoop.mapred.TaskRunner.writeJobACLs(TaskRunner.java:303)
        at org.apache.hadoop.mapred.TaskRunner.prepareLogFiles(TaskRunner.java:286)
        at org.apache.hadoop.mapred.TaskRunner.run(TaskRunner.java:205)
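One possible shape of the improvement is to catch the bare FileNotFoundException and rethrow it with an actionable message naming the file and the likely causes. The method name and message text below are illustrative, not the actual Hadoop patch:

```java
import java.io.File;
import java.io.FileNotFoundException;
import java.io.FileOutputStream;
import java.io.IOException;

// Sketch of the error-message wrapping the issue asks for; class and
// method names are hypothetical, not from TaskRunner.
class JobAclWriter {
    static FileOutputStream openJobAclFile(File aclFile) throws IOException {
        try {
            return new FileOutputStream(aclFile);
        } catch (FileNotFoundException e) {
            // Rethrow with context instead of surfacing the raw
            // FileNotFoundException to the user.
            throw new IOException("Could not create job ACL file "
                + aclFile.getAbsolutePath()
                + "; the parent directory may be missing, or the disk may be"
                + " full or have gone bad", e);
        }
    }

    public static void main(String[] args) {
        try {
            // Parent directory does not exist, so this takes the error path.
            openJobAclFile(new File("/no-such-dir/job-acl.xml"));
        } catch (IOException e) {
            System.out.println(e.getMessage());
        }
    }
}
```

The original FileNotFoundException is kept as the cause, so the full stack trace above remains available for debugging while the top-level message tells the operator what to check.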