Hello Ahmad,

I took a look at your config files, and they seem correct. The limit of 14 
mappers and 2 reducers is odd: that is 16 child JVMs at 2 GB of heap each, 
which suggests that you can't utilize more than 32 GB for whatever reason. 
I've been able to run with more processes on a weaker machine. Have you ever 
been able to utilize more than 32 GB of DRAM on that machine?

I don't have a solution for you right now, but I will get back to you soon 
regarding this problem. Meanwhile, please take a look at the datanode logs 
regarding the first error.

Regards,
Djordje
________________________________________
From: Ahmad Yasin [[email protected]]
Sent: Sunday, December 29, 2013 11:08 PM
To: [email protected]
Cc: _Gmail
Subject: Hadoop fails with 16+ mappers

Dear CloudSuite Team,

I'm trying to run the Data Analytics Hadoop benchmark with 16+ mappers.
I've precisely followed the single-node setup instructions (I downloaded the 
whole benchmark from the CloudSuite website). I'm able to run without failures 
with up to 14 mappers.
My system has a clean CentOS 6.5 install with OpenJDK 1.6.0_28, running on 24 
cores, 128 GB of memory, and a 1 TB disk.

I've searched the mailing list archive and tried a few things, like increasing 
the heap size of the daemon processes, reducing the heap size of the child 
processes, increasing system swap, increasing io.sort.mb, etc. I also rebuilt 
the whole system from scratch. No luck. "ulimit -u" returns 1024 (see the 
check sketched below).
Attached is a zip of the hadoop/conf/ folder.
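
For reference, this is how I checked the per-user limits; the limits.conf 
lines below are illustrative values I have not actually tried, not my current 
settings:

  # Checked as the hadoop user:
  ulimit -u    # max user processes; returns 1024 here
  ulimit -n    # max open files

  # Untried so far: raising the limits for the hadoop user in
  # /etc/security/limits.conf (illustrative values):
  #   hadoop  soft  nproc   16384
  #   hadoop  hard  nproc   16384
  #   hadoop  soft  nofile  65536
  #   hadoop  hard  nofile  65536
  # Note: CentOS 6 also ships /etc/security/limits.d/90-nproc.conf, which
  # sets a soft nproc of 1024 by default and can override limits.conf.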

I get two types of (Java memory) errors, detailed below: an OutOfMemoryError 
when trying to create new threads, and an IOException when dealing with HDFS 
blocks.

I would appreciate prompt help.

Regards,
Ahmad Yasin.

Error 1: java.io.IOException as the first reducer task starts.
This is with 16 mappers, 2 or 8 reducers, and a 2 GB heap size for the child processes.

%... /usr/lib/jvm/jre-1.6.0-openjdk.x86_64/bin/java -Xmx2048m ... org.apache.hadoop.util.RunJar /home/hadoop/mahout/examples/target/mahout-examples-0.6-job.jar org.apache.mahout.driver.MahoutDriver testclassifier -m wikipediamodel -d wikipediainput --method mapreduce
13/12/29 19:02:27 WARN mapred.JobClient: Use GenericOptionsParser for parsing the arguments. Applications should implement Tool for the same.
13/12/29 19:02:28 INFO mapred.FileInputFormat: Total input paths to process : 2
13/12/29 19:02:29 INFO mapred.JobClient: Running job: job_201312291900_0001
13/12/29 19:02:30 INFO mapred.JobClient:  map 0% reduce 0%
13/12/29 19:03:14 INFO mapred.JobClient:  map 1% reduce 0%
13/12/29 19:03:23 INFO mapred.JobClient:  map 2% reduce 0%
...
13/12/29 19:10:07 INFO mapred.JobClient:  map 49% reduce 0%
13/12/29 19:10:19 INFO mapred.JobClient:  map 49% reduce 3%
13/12/29 19:10:21 INFO mapred.JobClient: Task Id : attempt_201312291900_0001_r_000001_0, Status : FAILED
java.io.IOException: Task process exit with nonzero status of 1.
        at org.apache.hadoop.mapred.TaskRunner.run(TaskRunner.java:418)

13/12/29 19:10:21 WARN mapred.JobClient: Error reading task outputhttp://arch-ivt01.iil.intel.com:50060/tasklog?plaintext=true&taskid=attempt_201312291900_0001_r_000001_0&filter=stdout
13/12/29 19:10:21 WARN mapred.JobClient: Error reading task outputhttp://arch-ivt01.iil.intel.com:50060/tasklog?plaintext=true&taskid=attempt_201312291900_0001_r_000001_0&filter=stderr
13/12/29 19:10:27 INFO mapred.JobClient: Task Id : attempt_201312291900_0001_m_000000_0, Status : FAILED
java.io.IOException: Could not obtain block: blk_-5819730428801045811_1781 file=/user/hadoop/wikipediainput/part-r-00000
        at org.apache.hadoop.hdfs.DFSClient$DFSInputStream.chooseDataNode(DFSClient.java:1812)
        at org.apache.hadoop.hdfs.DFSClient$DFSInputStream.blockSeekTo(DFSClient.java:1638)
        at org.apache.hadoop.hdfs.DFSClient$DFSInputStream.read(DFSClient.java:1767)
        at java.io.DataInputStream.read(DataInputStream.java:100)
        at org.apache.hadoop.util.LineReader.readLine(LineReader.java:134)
        at org.apache.hadoop.mapred.LineRecordReader.next(LineRecordReader.java:136)
        at org.apache.hadoop.mapred.KeyValueLineRecordReader.next(KeyValueLineRecordReader.java:79)
        at org.apache.hadoop.mapred.KeyValueLineRecordReader.next(KeyValueLineRecordReader.java:33)
        at org.apache.hadoop.mapred.MapTask$TrackedRecordReader.moveToNext(MapTask.java:192)
        at org.apache.hadoop.mapred.MapTask$TrackedRecordReader.next(MapTask.java:176)
        at org.apache.hadoop.mapred.MapRunner.run(MapRunner.java:48)
        at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:358)
        at org.apache.hadoop.mapred.MapTask.run(MapTask.java:307)
        at org.apache.hadoop.mapred.Child.main(Child.java:170)

13/12/29 19:10:27 INFO mapred.JobClient: Task Id : attempt_201312291900_0001_m_000002_0, Status : FAILED
java.io.IOException: Could not obtain block: blk_-6875356997986463345_1781 file=/user/hadoop/wikipediainput/part-r-00000
        at org.apache.hadoop.hdfs.DFSClient$DFSInputStream.chooseDataNode(DFSClient.java:1812)
        at org.apache.hadoop.hdfs.DFSClient$DFSInputStream.blockSeekTo(DFSClient.java:1638)
        at org.apache.hadoop.hdfs.DFSClient$DFSInputStream.read(DFSClient.java:1767)
        at java.io.DataInputStream.read(DataInputStream.java:100)
        at org.apache.hadoop.util.LineReader.readLine(LineReader.java:134)
        at org.apache.hadoop.mapred.LineRecordReader.next(LineRecordReader.java:136)
        at org.apache.hadoop.mapred.KeyValueLineRecordReader.next(KeyValueLineRecordReader.java:79)
        at org.apache.hadoop.mapred.KeyValueLineRecordReader.next(KeyValueLineRecordReader.java:33)
        at org.apache.hadoop.mapred.MapTask$TrackedRecordReader.moveToNext(MapTask.java:192)
        at org.apache.hadoop.mapred.MapTask$TrackedRecordReader.next(MapTask.java:176)
        at org.apache.hadoop.mapred.MapRunner.run(MapRunner.java:48)
        at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:358)
        at org.apache.hadoop.mapred.MapTask.run(MapTask.java:307)
        at org.apache.hadoop.mapred.Child.main(Child.java:170)

13/12/29 19:10:27 INFO mapred.JobClient: Task Id : attempt_201312291900_0001_m_000009_0, Status : FAILED
java.io.IOException: Could not obtain block: blk_-1019912339544070742_1781 file=/user/hadoop/wikipediainput/part-r-00000
        at org.apache.hadoop.hdfs.DFSClient$DFSInputStream.chooseDataNode(DFSClient.java:1812)
        at org.apache.hadoop.hdfs.DFSClient$DFSInputStream.blockSeekTo(DFSClient.java:1638)
        at org.apache.hadoop.hdfs.DFSClient$DFSInputStream.read(DFSClient.java:1767)
        at java.io.DataInputStream.read(DataInputStream.java:100)
        at org.apache.hadoop.util.LineReader.readLine(LineReader.java:134)
        at org.apache.hadoop.mapred.LineRecordReader.next(LineRecordReader.java:136)
        at org.apache.hadoop.mapred.KeyValueLineRecordReader.next(KeyValueLineRecordReader.java:79)
        at org.apache.hadoop.mapred.KeyValueLineRecordReader.next(KeyValueLineRecordReader.java:33)
        at org.apache.hadoop.mapred.MapTask$TrackedRecordReader.moveToNext(MapTask.java:192)
        at org.apache.hadoop.mapred.MapTask$TrackedRecordReader.next(MapTask.java:176)
        at org.apache.hadoop.mapred.MapRunner.run(MapRunner.java:48)
        at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:358)
        at org.apache.hadoop.mapred.MapTask.run(MapTask.java:307)
        at org.apache.hadoop.mapred.Child.main(Child.java:170)

13/12/29 19:10:27 INFO mapred.JobClient: Task Id : attempt_201312291900_0001_m_000010_0, Status : FAILED
java.io.IOException: Could not obtain block: blk_-5842524233961123605_1781 file=/user/hadoop/wikipediainput/part-r-00000
        at org.apache.hadoop.hdfs.DFSClient$DFSInputStream.chooseDataNode(DFSClient.java:1812)
        at org.apache.hadoop.hdfs.DFSClient$DFSInputStream.blockSeekTo(DFSClient.java:1638)
        at org.apache.hadoop.hdfs.DFSClient$DFSInputStream.read(DFSClient.java:1767)
        at java.io.DataInputStream.read(DataInputStream.java:100)
        at org.apache.hadoop.util.LineReader.readLine(LineReader.java:134)
        at org.apache.hadoop.mapred.LineRecordReader.next(LineRecordReader.java:136)
        at org.apache.hadoop.mapred.KeyValueLineRecordReader.next(KeyValueLineRecordReader.java:79)
        at org.apache.hadoop.mapred.KeyValueLineRecordReader.next(KeyValueLineRecordReader.java:33)
        at org.apache.hadoop.mapred.MapTask$TrackedRecordReader.moveToNext(MapTask.java:192)
        at org.apache.hadoop.mapred.MapTask$TrackedRecordReader.next(MapTask.java:176)
        at org.apache.hadoop.mapred.MapRunner.run(MapRunner.java:48)
        at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:358)
        at org.apache.hadoop.mapred.MapTask.run(MapTask.java:307)
        at org.apache.hadoop.mapred.Child.main(Child.java:170)

13/12/29 19:10:27 INFO mapred.JobClient: Task Id : attempt_201312291900_0001_m_000011_0, Status : FAILED
java.io.IOException: Could not obtain block: blk_-6712267498930710914_1781 file=/user/hadoop/wikipediainput/part-r-00000
        at org.apache.hadoop.hdfs.DFSClient$DFSInputStream.chooseDataNode(DFSClient.java:1812)
        at org.apache.hadoop.hdfs.DFSClient$DFSInputStream.blockSeekTo(DFSClient.java:1638)
        at org.apache.hadoop.hdfs.DFSClient$DFSInputStream.read(DFSClient.java:1767)
        at java.io.DataInputStream.read(DataInputStream.java:100)
        at org.apache.hadoop.util.LineReader.readLine(LineReader.java:134)
        at org.apache.hadoop.mapred.LineRecordReader.next(LineRecordReader.java:136)
        at org.apache.hadoop.mapred.KeyValueLineRecordReader.next(KeyValueLineRecordReader.java:79)
...


Error 2: "java.lang.OutOfMemoryError: unable to create new native thread" as 
the tasks are created.
This is with 24 mappers, 2 reducers, and a 2 GB heap size for the child processes.

/usr/lib/jvm/jre-1.6.0-openjdk.x86_64/bin/java -Xmx2048m ... org.apache.hadoop.util.RunJar /home/hadoop/mahout/examples/target/mahout-examples-0.6-job.jar org.apache.mahout.driver.MahoutDriver testclassifier -m wikipediamodel -d wikipediainput --method mapreduce
13/12/29 20:43:58 WARN mapred.JobClient: Use GenericOptionsParser for parsing the arguments. Applications should implement Tool for the same.
13/12/29 20:43:59 INFO mapred.FileInputFormat: Total input paths to process : 2
13/12/29 20:44:00 INFO mapred.JobClient: Running job: job_201312292041_0001
13/12/29 20:44:01 INFO mapred.JobClient:  map 0% reduce 0%
13/12/29 20:44:11 INFO mapred.JobClient: Task Id : attempt_201312292041_0001_m_000020_0, Status : FAILED
Error initializing attempt_201312292041_0001_m_000020_0:
java.lang.OutOfMemoryError: unable to create new native thread
        at java.lang.Thread.start0(Native Method)
        at java.lang.Thread.start(Thread.java:679)
        at java.lang.UNIXProcess$1.run(UNIXProcess.java:157)
        at java.security.AccessController.doPrivileged(Native Method)
        at java.lang.UNIXProcess.<init>(UNIXProcess.java:119)
        at java.lang.ProcessImpl.start(ProcessImpl.java:81)
        at java.lang.ProcessBuilder.start(ProcessBuilder.java:470)
        at org.apache.hadoop.util.Shell.runCommand(Shell.java:149)
        at org.apache.hadoop.util.Shell.run(Shell.java:134)
        at org.apache.hadoop.util.Shell$ShellCommandExecutor.execute(Shell.java:286)
        at org.apache.hadoop.util.Shell.execCommand(Shell.java:354)
        at org.apache.hadoop.util.Shell.execCommand(Shell.java:337)
        at org.apache.hadoop.fs.RawLocalFileSystem.execCommand(RawLocalFileSystem.java:481)
        at org.apache.hadoop.fs.RawLocalFileSystem.setPermission(RawLocalFileSystem.java:473)
        at org.apache.hadoop.fs.FilterFileSystem.setPermission(FilterFileSystem.java:280)
        at org.apache.hadoop.fs.ChecksumFileSystem.create(ChecksumFileSystem.java:372)
        at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:484)
        at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:465)
        at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:372)
        at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:364)
        at org.apache.hadoop.mapred.TaskTracker$TaskInProgress.localizeTask(TaskTracker.java:1885)
        at org.apache.hadoop.mapred.TaskTracker$TaskInProgress.launchTask(TaskTracker.java:1933)
        at org.apache.hadoop.mapred.TaskTracker.launchTaskForJob(TaskTracker.java:830)
        at org.apache.hadoop.mapred.TaskTracker.localizeJob(TaskTracker.java:824)
        at org.apache.hadoop.mapred.TaskTracker.startNewTask(TaskTracker.java:1664)
        at org.apache.hadoop.mapred.TaskTracker.access$1200(TaskTracker.java:97)
        at org.apache.hadoop.mapred.TaskTracker$TaskLauncher.run(TaskTracker.java:1629)

13/12/29 20:44:11 WARN mapred.JobClient: Error reading task outputhttp://arch-ivt01.iil.intel.com:50060/tasklog?plaintext=true&taskid=attempt_201312292041_0001_m_000020_0&filter=stdout
13/12/29 20:44:11 WARN mapred.JobClient: Error reading task outputhttp://arch-ivt01.iil.intel.com:50060/tasklog?plaintext=true&taskid=attempt_201312292041_0001_m_000020_0&filter=stderr
13/12/29 20:44:11 INFO mapred.JobClient: Task Id : attempt_201312292041_0001_m_000021_0, Status : FAILED
Error initializing attempt_201312292041_0001_m_000021_0:
java.lang.OutOfMemoryError: unable to create new native thread
        at java.lang.Thread.start0(Native Method)
        at java.lang.Thread.start(Thread.java:679)
        at java.lang.UNIXProcess$1.run(UNIXProcess.java:157)
        at java.security.AccessController.doPrivileged(Native Method)
        at java.lang.UNIXProcess.<init>(UNIXProcess.java:119)
        at java.lang.ProcessImpl.start(ProcessImpl.java:81)
        at java.lang.ProcessBuilder.start(ProcessBuilder.java:470)
        at org.apache.hadoop.util.Shell.runCommand(Shell.java:149)
        at org.apache.hadoop.util.Shell.run(Shell.java:134)
        at org.apache.hadoop.util.Shell$ShellCommandExecutor.execute(Shell.java:286)
        at org.apache.hadoop.util.Shell.execCommand(Shell.java:354)
        at org.apache.hadoop.util.Shell.execCommand(Shell.java:337)
        at org.apache.hadoop.fs.RawLocalFileSystem.execCommand(RawLocalFileSystem.java:481)
        at org.apache.hadoop.fs.RawLocalFileSystem.setPermission(RawLocalFileSystem.java:473)
        at org.apache.hadoop.fs.FilterFileSystem.setPermission(FilterFileSystem.java:280)
        at org.apache.hadoop.fs.ChecksumFileSystem.create(ChecksumFileSystem.java:372)
        at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:484)
        at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:465)
        at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:372)
        at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:364)
        at org.apache.hadoop.mapred.MapTask.localizeConfiguration(MapTask.java:111)
        at org.apache.hadoop.mapred.TaskTracker$TaskInProgress.localizeTask(TaskTracker.java:1850)
        at org.apache.hadoop.mapred.TaskTracker$TaskInProgress.launchTask(TaskTracker.java:1933)
        at org.apache.hadoop.mapred.TaskTracker.launchTaskForJob(TaskTracker.java:830)
        at org.apache.hadoop.mapred.TaskTracker.localizeJob(TaskTracker.java:824)
        at org.apache.hadoop.mapred.TaskTracker.startNewTask(TaskTracker.java:1664)
        at org.apache.hadoop.mapred.TaskTracker.access$1200(TaskTracker.java:97)
        at org.apache.hadoop.mapred.TaskTracker$TaskLauncher.run(TaskTracker.java:1629)
13/12/29 20:44:11 WARN mapred.JobClient: Error reading task outputhttp://arch-ivt01.iil.intel.com:50060/tasklog?plaintext=true&taskid=attempt_201312292041_0001_m_000021_0&filter=stdout
13/12/29 20:44:11 WARN mapred.JobClient: Error reading task outputhttp://arch-ivt01.iil.intel.com:50060/tasklog?plaintext=true&taskid=attempt_201312292041_0001_m_000021_0&filter=stderr
13/12/29 20:44:11 INFO mapred.JobClient: Task Id : attempt_201312292041_0001_m_000022_0, Status : FAILED
Error initializing attempt_201312292041_0001_m_000022_0:
java.io.IOException: Cannot run program "chmod": java.io.IOException: error=11, Resource temporarily unavailable
        at java.lang.ProcessBuilder.start(ProcessBuilder.java:488)
        at org.apache.hadoop.util.Shell.runCommand(Shell.java:149)
        at org.apache.hadoop.util.Shell.run(Shell.java:134)
        at org.apache.hadoop.util.Shell$ShellCommandExecutor.execute(Shell.java:286)
        at org.apache.hadoop.util.Shell.execCommand(Shell.java:354)
        at org.apache.hadoop.util.Shell.execCommand(Shell.java:337)
        at org.apache.hadoop.fs.RawLocalFileSystem.execCommand(RawLocalFileSystem.java:481)
        at org.apache.hadoop.fs.RawLocalFileSystem.setPermission(RawLocalFileSystem.java:473)
        at org.apache.hadoop.fs.FilterFileSystem.setPermission(FilterFileSystem.java:280)
        at org.apache.hadoop.fs.ChecksumFileSystem.create(ChecksumFileSystem.java:372)
        at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:484)
        at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:465)
        at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:372)
        at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:364)
        at org.apache.hadoop.mapred.MapTask.localizeConfiguration(MapTask.java:111)
        at org.apache.hadoop.mapred.TaskTracker$TaskInProgress.localizeTask(TaskTracker.java:1850)
        at org.apache.hadoop.mapred.TaskTracker$TaskInProgress.launchTask(TaskTracker.java:1933)
        at org.apache.hadoop.mapred.TaskTracker.launchTaskForJob(TaskTracker.java:830)
        at org.apache.hadoop.mapred.TaskTracker.localizeJob(TaskTracker.java:824)
        at org.apache.hadoop.mapred.TaskTracker.startNewTask(TaskTracker.java:1664)
        at org.apache.hadoop.mapred.TaskTracker.access$1200(TaskTracker.java:97)
        at org.apache.hadoop.mapred.TaskTracker$TaskLauncher.run(TaskTracker.java:1629)
Caused by: java.io.IOException: java.io.IOException: error=11, Resource temporarily unavailable
        at java.lang.UNIXProcess.<init>(UNIXProcess.java:164)
        at java.lang.ProcessImpl.start(ProcessImpl.java:81)
        at java.lang.ProcessBuilder.start(ProcessBuilder.java:470)
        ... 21 more
13/12/29 20:44:11 WARN mapred.JobClient: Error reading task outputhttp://arch-ivt01.iil.intel.com:50060/tasklog?plaintext=true&taskid=attempt_201312292041_0001_m_000022_0&filter=stdout
13/12/29 20:44:11 WARN mapred.JobClient: Error reading task outputhttp://arch-ivt01.iil.intel.com:50060/tasklog?plaintext=true&taskid=attempt_201312292041_0001_m_000022_0&filter=stderr
13/12/29 20:44:14 INFO mapred.JobClient: Task Id : attempt_201312292041_0001_m_000000_0, Status : FAILED
Error: unable to create new native thread
13/12/29 20:44:17 INFO mapred.JobClient: Task Id : attempt_201312292041_0001_m_000014_0, Status : FAILED
Error: unable to create new native thread
13/12/29 20:44:23 INFO mapred.JobClient: Task Id : attempt_201312292041_0001_m_000001_0, Status : FAILED
java.lang.RuntimeException: Error in configuring object
        at org.apache.hadoop.util.ReflectionUtils.setJobConf(ReflectionUtils.java:93)
        at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:64)
        at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:117)
        at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:354)
        at org.apache.hadoop.mapred.MapTask.run(MapTask.java:307)
        at org.apache.hadoop.mapred.Child.main(Child.java:170)
Caused by: java.lang.reflect.InvocationTargetException
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:622)
        at org.apache.hadoop.util.ReflectionUtils.setJobConf(ReflectionUtils.java:88)
        ... 5 more
Caused by: java.lang.RuntimeException: Error in configuring object
        at org.apache.hadoop.util.ReflectionUtils.setJobConf(ReflectionUtils.java:93)
        at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:64)
        at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:117)
        at org.apache.hadoop.mapred.MapRunner.configure(MapRunner.java:34)
        ... 10 more
Caused by: java.lang.reflect.InvocationTargetException
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:622)
        at org.apache.hadoop.util.ReflectionUtils.setJobConf(ReflectionUtils.java:88)
        ... 13 more
Caused by: java.lang.IllegalStateException: hdfs://localhost:54310/user/hadoop/wikipediamodel/trainer-weights/Sigma_j/part-00001
        at org.apache.mahout.common.iterator.sequencefile.SequenceFileDirIterator$1.apply(SequenceFileDirIterator.java:117)
        at org.apache.mahout.common.iterator.sequencefile.SequenceFileDirIterator$1.apply(SequenceFileDirIterator.java:106)
        at com.google.common.collect.Iterators$8.next(Iterators.java:765)
        at com.google.common.collect.Iterators$5.hasNext(Iterators.java:526)
        at com.google.common.collect.ForwardingIterator.hasNext(ForwardingIterator.java:43)
        at org.apache.mahout.classifier.bayes.SequenceFileModelReader.loadFeatureWeights(SequenceFileModelReader.java:72)
        at org.apache.mahout.classifier.bayes.SequenceFileModelReader.loadModel(SequenceFileModelReader.java:46)
        at org.apache.mahout.classifier.bayes.InMemoryBayesDatastore.initialize(InMemoryBayesDatastore.java:72)
        at org.apache.mahout.classifier.bayes.ClassifierContext.initialize(ClassifierContext.java:44)
        at org.apache.mahout.classifier.bayes.mapreduce.bayes.BayesClassifierMapper.configure(BayesClassifierMapper.java:120)
        ... 18 more
Caused by: java.io.IOException: Could not obtain block: blk_1492865381060422407_1818 file=/user/hadoop/wikipediamodel/trainer-weights/Sigma_j/part-00001
        at org.apache.hadoop.hdfs.DFSClient$DFSInputStream.chooseDataNode(DFSClient.java:1812)
        at org.apache.hadoop.hdfs.DFSClient$DFSInputStream.blockSeekTo(DFSClient.java:1638)
        at org.apache.hadoop.hdfs.DFSClient$DFSInputStream.read(DFSClient.java:1767)
        at java.io.DataInputStream.readFully(DataInputStream.java:195)
        at java.io.DataInputStream.readFully(DataInputStream.java:169)
        at org.apache.hadoop.io.SequenceFile$Reader.init(SequenceFile.java:1450)
        at org.apache.hadoop.io.SequenceFile$Reader.<init>(SequenceFile.java:1428)
        at org.apache.hadoop.io.SequenceFile$Reader.<init>(SequenceFile.java:1417)
        at org.apache.hadoop.io.SequenceFile$Reader.<init>(SequenceFile.java:1412)
        at org.apache.mahout.common.iterator.sequencefile.SequenceFileIterator.<init>(SequenceFileIterator.java:58)
        at org.apache.mahout.common.iterator.sequencefile.SequenceFileDirIterator$1.apply(SequenceFileDirIterator.java:110)
        ... 27 more

13/12/29 20:44:23 INFO mapred.JobClient: Task Id : attempt_201312292041_0001_m_000003_0, Status : FAILED
java.lang.RuntimeException: Error in configuring object
        at org.apache.hadoop.util.ReflectionUtils.setJobConf(ReflectionUtils.java:93)
        at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:64)
        at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:117)
        at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:354)
        at org.apache.hadoop.mapred.MapTask.run(MapTask.java:307)
        at org.apache.hadoop.mapred.Child.main(Child.java:170)
Caused by: java.lang.reflect.InvocationTargetException
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:622)
        at org.apache.hadoop.util.ReflectionUtils.setJobConf(ReflectionUtils.java:88)
        ... 5 more
Caused by: java.lang.RuntimeException: Error in configuring object
        at org.apache.hadoop.util.ReflectionUtils.setJobConf(ReflectionUtils.java:93)
        at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:64)
        at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:117)
        at org.apache.hadoop.mapred.MapRunner.configure(MapRunner.java:34)
        ... 10 more
Caused by: java.lang.reflect.InvocationTargetException
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:622)
        at org.apache.hadoop.util.ReflectionUtils.setJobConf(ReflectionUtils.java:88)
        ... 5 more
Caused by: java.lang.RuntimeException: Error in configuring object
        at org.apache.hadoop.util.ReflectionUtils.setJobConf(ReflectionUtils.java:93)
        at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:64)
        at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:117)
        at org.apache.hadoop.mapred.MapRunner.configure(MapRunner.java:34)
        ... 10 more
Caused by: java.lang.reflect.InvocationTargetException
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:622)
        at org.apache.hadoop.util.ReflectionUtils.setJobConf(ReflectionUtils.java:88)
        ... 13 more
Caused by: java.lang.IllegalStateException: hdfs://localhost:54310/user/hadoop/wikipediamodel/trainer-weights/Sigma_j/part-00000
        at org.apache.mahout.common.iterator.sequencefile.SequenceFileDirIterator$1.apply(SequenceFileDirIterator.java:117)
        at org.apache.mahout.common.iterator.sequencefile.SequenceFileDirIterator$1.apply(SequenceFileDirIterator.java:106)
        at com.google.common.collect.Iterators$8.next(Iterators.java:765)
        at com.google.common.collect.Iterators$5.hasNext(Iterators.java:526)
        at com.google.common.collect.ForwardingIterator.hasNext(ForwardingIterator.java:43)
        at org.apache.mahout.classifier.bayes.SequenceFileModelReader.loadFeatureWeights(SequenceFileModelReader.java:72)
        at org.apache.mahout.classifier.bayes.SequenceFileModelReader.loadModel(SequenceFileModelReader.java:46)
        at org.apache.mahout.classifier.bayes.InMemoryBayesDatastore.initialize(InMemoryBayesDatastore.java:72)
        at org.apache.mahout.classifier.bayes.ClassifierContext.initialize(ClassifierContext.java:44)
        at org.apache.mahout.classifier.bayes.mapreduce.bayes.BayesClassifierMapper.configure(BayesClassifierMapper.java:120)
        ... 18 more
Caused by: java.io.IOException: Could not obtain block: blk_8057302158765564450_1815 file=/user/hadoop/wikipediamodel/trainer-weights/Sigma_j/part-00000
        at org.apache.hadoop.hdfs.DFSClient$DFSInputStream.chooseDataNode(DFSClient.java:1812)
        at org.apache.hadoop.hdfs.DFSClient$DFSInputStream.blockSeekTo(DFSClient.java:1638)
        at org.apache.hadoop.hdfs.DFSClient$DFSInputStream.read(DFSClient.java:1767)
        at java.io.DataInputStream.readFully(DataInputStream.java:195)
        at java.io.DataInputStream.readFully(DataInputStream.java:169)
        at org.apache.hadoop.io.SequenceFile$Reader.init(SequenceFile.java:1450)
        at org.apache.hadoop.io.SequenceFile$Reader.<init>(SequenceFile.java:1428)
        at org.apache.hadoop.io.SequenceFile$Reader.<init>(SequenceFile.java:1417)
        at org.apache.hadoop.io.SequenceFile$Reader.<init>(SequenceFile.java:1412)
        at org.apache.mahout.common.iterator.sequencefile.SequenceFileIterator.<init>(SequenceFileIterator.java:58)
        at org.apache.mahout.common.iterator.sequencefile.SequenceFileDirIterator$1.apply(SequenceFileDirIterator.java:110)
        ... 27 more
