See <https://builds.apache.org/job/Mahout-Examples-Cluster-Reuters/280/changes>
Changes:
[ssc] MAHOUT-1173 Reactivate checkstyle
------------------------------------------
[...truncated 9516 lines...]
Mar 26, 2013 2:48:34 PM org.apache.hadoop.mapred.MapTask$MapOutputBuffer flush
INFO: Starting flush of map output
Mar 26, 2013 2:48:34 PM org.apache.hadoop.mapred.MapTask$MapOutputBuffer sortAndSpill
INFO: Finished spill 0
Mar 26, 2013 2:48:34 PM org.apache.hadoop.mapred.Task done
INFO: Task:attempt_local_0003_m_000000_0 is done. And is in the process of commiting
Mar 26, 2013 2:48:34 PM org.apache.hadoop.mapred.LocalJobRunner$Job statusUpdate
INFO:
Mar 26, 2013 2:48:34 PM org.apache.hadoop.mapred.Task sendDone
INFO: Task 'attempt_local_0003_m_000000_0' done.
Mar 26, 2013 2:48:34 PM org.apache.hadoop.mapred.Task initialize
INFO: Using ResourceCalculatorPlugin : org.apache.hadoop.util.LinuxResourceCalculatorPlugin@1e2afb2
Mar 26, 2013 2:48:34 PM org.apache.hadoop.mapred.MapTask$MapOutputBuffer <init>
INFO: io.sort.mb = 100
Mar 26, 2013 2:48:34 PM org.apache.hadoop.mapred.MapTask$MapOutputBuffer <init>
INFO: data buffer = 79691776/99614720
Mar 26, 2013 2:48:34 PM org.apache.hadoop.mapred.MapTask$MapOutputBuffer <init>
INFO: record buffer = 262144/327680
Mar 26, 2013 2:48:34 PM org.apache.hadoop.mapred.JobClient monitorAndPrintJob
INFO: map 100% reduce 0%
Mar 26, 2013 2:48:35 PM org.apache.hadoop.mapred.MapTask$MapOutputBuffer flush
INFO: Starting flush of map output
Mar 26, 2013 2:48:35 PM org.apache.hadoop.mapred.MapTask$MapOutputBuffer sortAndSpill
INFO: Finished spill 0
Mar 26, 2013 2:48:35 PM org.apache.hadoop.mapred.Task done
INFO: Task:attempt_local_0003_m_000001_0 is done. And is in the process of commiting
Mar 26, 2013 2:48:35 PM org.apache.hadoop.mapred.LocalJobRunner$Job statusUpdate
INFO:
Mar 26, 2013 2:48:35 PM org.apache.hadoop.mapred.Task sendDone
INFO: Task 'attempt_local_0003_m_000001_0' done.
Mar 26, 2013 2:48:35 PM org.apache.hadoop.mapred.Task initialize
INFO: Using ResourceCalculatorPlugin : org.apache.hadoop.util.LinuxResourceCalculatorPlugin@1250ff2
Mar 26, 2013 2:48:35 PM org.apache.hadoop.mapred.MapTask$MapOutputBuffer <init>
INFO: io.sort.mb = 100
Mar 26, 2013 2:48:35 PM org.apache.hadoop.mapred.MapTask$MapOutputBuffer <init>
INFO: data buffer = 79691776/99614720
Mar 26, 2013 2:48:35 PM org.apache.hadoop.mapred.MapTask$MapOutputBuffer <init>
INFO: record buffer = 262144/327680
Mar 26, 2013 2:48:36 PM org.apache.hadoop.mapred.MapTask$MapOutputBuffer flush
INFO: Starting flush of map output
Mar 26, 2013 2:48:36 PM org.apache.hadoop.mapred.MapTask$MapOutputBuffer sortAndSpill
INFO: Finished spill 0
Mar 26, 2013 2:48:36 PM org.apache.hadoop.mapred.Task done
INFO: Task:attempt_local_0003_m_000002_0 is done. And is in the process of commiting
Mar 26, 2013 2:48:36 PM org.apache.hadoop.mapred.LocalJobRunner$Job statusUpdate
INFO:
Mar 26, 2013 2:48:36 PM org.apache.hadoop.mapred.Task sendDone
INFO: Task 'attempt_local_0003_m_000002_0' done.
Mar 26, 2013 2:48:36 PM org.apache.hadoop.mapred.Task initialize
INFO: Using ResourceCalculatorPlugin : org.apache.hadoop.util.LinuxResourceCalculatorPlugin@1bc1fb9
Mar 26, 2013 2:48:36 PM org.apache.hadoop.mapred.MapTask$MapOutputBuffer <init>
INFO: io.sort.mb = 100
Mar 26, 2013 2:48:36 PM org.apache.hadoop.mapred.MapTask$MapOutputBuffer <init>
INFO: data buffer = 79691776/99614720
Mar 26, 2013 2:48:36 PM org.apache.hadoop.mapred.MapTask$MapOutputBuffer <init>
INFO: record buffer = 262144/327680
Mar 26, 2013 2:48:51 PM org.apache.hadoop.mapred.MapTask$MapOutputBuffer flush
INFO: Starting flush of map output
Mar 26, 2013 2:48:51 PM org.apache.hadoop.mapred.MapTask$MapOutputBuffer sortAndSpill
INFO: Finished spill 0
Mar 26, 2013 2:48:51 PM org.apache.hadoop.mapred.Task done
INFO: Task:attempt_local_0003_m_000003_0 is done. And is in the process of commiting
Mar 26, 2013 2:48:51 PM org.apache.hadoop.mapred.LocalJobRunner$Job statusUpdate
INFO:
Mar 26, 2013 2:48:51 PM org.apache.hadoop.mapred.Task sendDone
INFO: Task 'attempt_local_0003_m_000003_0' done.
Mar 26, 2013 2:48:51 PM org.apache.hadoop.mapred.Task initialize
INFO: Using ResourceCalculatorPlugin : org.apache.hadoop.util.LinuxResourceCalculatorPlugin@d0357a
Mar 26, 2013 2:48:51 PM org.apache.hadoop.mapred.LocalJobRunner$Job statusUpdate
INFO:
Mar 26, 2013 2:48:51 PM org.apache.hadoop.mapred.Merger$MergeQueue merge
INFO: Merging 4 sorted segments
Mar 26, 2013 2:48:51 PM org.apache.hadoop.mapred.Merger$MergeQueue merge
INFO: Down to the last merge-pass, with 4 segments left of total size: 14870368 bytes
Mar 26, 2013 2:48:51 PM org.apache.hadoop.mapred.LocalJobRunner$Job statusUpdate
INFO:
Mar 26, 2013 2:48:54 PM org.apache.hadoop.mapred.Task done
INFO: Task:attempt_local_0003_r_000000_0 is done. And is in the process of commiting
Mar 26, 2013 2:48:54 PM org.apache.hadoop.mapred.LocalJobRunner$Job statusUpdate
INFO:
Mar 26, 2013 2:48:54 PM org.apache.hadoop.mapred.Task commit
INFO: Task attempt_local_0003_r_000000_0 is allowed to commit now
Mar 26, 2013 2:48:54 PM org.apache.hadoop.mapreduce.lib.output.FileOutputCommitter commitTask
INFO: Saved output of task 'attempt_local_0003_r_000000_0' to /tmp/mahout-work-hudson/reuters-out-seqdir-sparse-fkmeans/partial-vectors-0
Mar 26, 2013 2:48:54 PM org.apache.hadoop.mapred.LocalJobRunner$Job statusUpdate
INFO: reduce > reduce
Mar 26, 2013 2:48:54 PM org.apache.hadoop.mapred.Task sendDone
INFO: Task 'attempt_local_0003_r_000000_0' done.
Mar 26, 2013 2:48:55 PM org.apache.hadoop.mapred.JobClient monitorAndPrintJob
INFO: map 100% reduce 100%
Mar 26, 2013 2:48:55 PM org.apache.hadoop.mapred.JobClient monitorAndPrintJob
INFO: Job complete: job_local_0003
Mar 26, 2013 2:48:55 PM org.apache.hadoop.mapred.Counters log
INFO: Counters: 20
Mar 26, 2013 2:48:55 PM org.apache.hadoop.mapred.Counters log
INFO: File Output Format Counters
Mar 26, 2013 2:48:55 PM org.apache.hadoop.mapred.Counters log
INFO: Bytes Written=17521050
Mar 26, 2013 2:48:55 PM org.apache.hadoop.mapred.Counters log
INFO: FileSystemCounters
Mar 26, 2013 2:48:55 PM org.apache.hadoop.mapred.Counters log
INFO: FILE_BYTES_READ=1218661051
Mar 26, 2013 2:48:55 PM org.apache.hadoop.mapred.Counters log
INFO: FILE_BYTES_WRITTEN=1142515818
Mar 26, 2013 2:48:55 PM org.apache.hadoop.mapred.Counters log
INFO: File Input Format Counters
Mar 26, 2013 2:48:55 PM org.apache.hadoop.mapred.Counters log
INFO: Bytes Read=15194475
Mar 26, 2013 2:48:55 PM org.apache.hadoop.mapred.Counters log
INFO: Map-Reduce Framework
Mar 26, 2013 2:48:55 PM org.apache.hadoop.mapred.Counters log
INFO: Map output materialized bytes=14870384
Mar 26, 2013 2:48:55 PM org.apache.hadoop.mapred.Counters log
INFO: Map input records=21578
Mar 26, 2013 2:48:55 PM org.apache.hadoop.mapred.Counters log
INFO: Reduce shuffle bytes=0
Mar 26, 2013 2:48:55 PM org.apache.hadoop.mapred.Counters log
INFO: Spilled Records=43156
Mar 26, 2013 2:48:55 PM org.apache.hadoop.mapred.Counters log
INFO: Map output bytes=14791487
Mar 26, 2013 2:48:55 PM org.apache.hadoop.mapred.Counters log
INFO: Total committed heap usage (bytes)=1980039168
Mar 26, 2013 2:48:55 PM org.apache.hadoop.mapred.Counters log
INFO: CPU time spent (ms)=0
Mar 26, 2013 2:48:55 PM org.apache.hadoop.mapred.Counters log
INFO: SPLIT_RAW_BYTES=640
Mar 26, 2013 2:48:55 PM org.apache.hadoop.mapred.Counters log
INFO: Combine input records=0
Mar 26, 2013 2:48:55 PM org.apache.hadoop.mapred.Counters log
INFO: Reduce input records=21578
Mar 26, 2013 2:48:55 PM org.apache.hadoop.mapred.Counters log
INFO: Reduce input groups=21578
Mar 26, 2013 2:48:55 PM org.apache.hadoop.mapred.Counters log
INFO: Combine output records=0
Mar 26, 2013 2:48:55 PM org.apache.hadoop.mapred.Counters log
INFO: Physical memory (bytes) snapshot=0
Mar 26, 2013 2:48:55 PM org.apache.hadoop.mapred.Counters log
INFO: Reduce output records=21578
Mar 26, 2013 2:48:55 PM org.apache.hadoop.mapred.Counters log
INFO: Virtual memory (bytes) snapshot=0
Mar 26, 2013 2:48:55 PM org.apache.hadoop.mapred.Counters log
INFO: Map output records=21578
Mar 26, 2013 2:48:57 PM org.apache.hadoop.mapreduce.lib.input.FileInputFormat listStatus
INFO: Total input paths to process : 1
Mar 26, 2013 2:49:18 PM org.apache.hadoop.mapred.JobClient monitorAndPrintJob
INFO: Running job: job_local_0004
Mar 26, 2013 2:49:18 PM org.apache.hadoop.mapred.Task initialize
INFO: Using ResourceCalculatorPlugin : org.apache.hadoop.util.LinuxResourceCalculatorPlugin@bfed5a
Mar 26, 2013 2:49:18 PM org.apache.hadoop.mapred.MapTask$MapOutputBuffer <init>
INFO: io.sort.mb = 100
Mar 26, 2013 2:49:18 PM org.apache.hadoop.mapred.MapTask$MapOutputBuffer <init>
INFO: data buffer = 79691776/99614720
Mar 26, 2013 2:49:18 PM org.apache.hadoop.mapred.MapTask$MapOutputBuffer <init>
INFO: record buffer = 262144/327680
Mar 26, 2013 2:49:19 PM org.apache.hadoop.mapred.JobClient monitorAndPrintJob
INFO: map 0% reduce 0%
Mar 26, 2013 2:49:20 PM org.apache.hadoop.mapred.MapTask$MapOutputBuffer flush
INFO: Starting flush of map output
Mar 26, 2013 2:49:24 PM org.apache.hadoop.mapred.LocalJobRunner$Job statusUpdate
INFO:
Mar 26, 2013 2:49:25 PM org.apache.hadoop.mapred.JobClient monitorAndPrintJob
INFO: map 100% reduce 0%
Mar 26, 2013 2:49:27 PM org.apache.hadoop.mapred.MapTask$MapOutputBuffer sortAndSpill
INFO: Finished spill 0
Mar 26, 2013 2:49:27 PM org.apache.hadoop.mapred.Task done
INFO: Task:attempt_local_0004_m_000000_0 is done. And is in the process of commiting
Mar 26, 2013 2:49:27 PM org.apache.hadoop.mapred.LocalJobRunner$Job statusUpdate
INFO:
Mar 26, 2013 2:49:27 PM org.apache.hadoop.mapred.Task sendDone
INFO: Task 'attempt_local_0004_m_000000_0' done.
Mar 26, 2013 2:56:42 PM org.apache.hadoop.mapred.Task initialize
INFO: Using ResourceCalculatorPlugin : org.apache.hadoop.util.LinuxResourceCalculatorPlugin@13d1402
Mar 26, 2013 2:56:42 PM org.apache.hadoop.mapred.LocalJobRunner$Job statusUpdate
INFO:
Mar 26, 2013 2:56:42 PM org.apache.hadoop.mapred.Merger$MergeQueue merge
INFO: Merging 1 sorted segments
Mar 26, 2013 2:56:42 PM org.apache.hadoop.mapred.Merger$MergeQueue merge
INFO: Down to the last merge-pass, with 1 segments left of total size: 17164298 bytes
Mar 26, 2013 2:56:42 PM org.apache.hadoop.mapred.LocalJobRunner$Job statusUpdate
INFO:
FATAL: Unable to delete script file /tmp/hudson2631663119181333879.sh
hudson.util.IOException2: remote file operation failed: /tmp/hudson2631663119181333879.sh at hudson.remoting.Channel@59142f27:ubuntu2
at hudson.FilePath.act(FilePath.java:861)
at hudson.FilePath.act(FilePath.java:838)
at hudson.FilePath.delete(FilePath.java:1223)
at hudson.tasks.CommandInterpreter.perform(CommandInterpreter.java:101)
at hudson.tasks.CommandInterpreter.perform(CommandInterpreter.java:60)
at hudson.tasks.BuildStepMonitor$1.perform(BuildStepMonitor.java:19)
at hudson.model.AbstractBuild$AbstractBuildExecution.perform(AbstractBuild.java:810)
at hudson.model.Build$BuildExecution.build(Build.java:199)
at hudson.model.Build$BuildExecution.doRun(Build.java:160)
at hudson.model.AbstractBuild$AbstractBuildExecution.run(AbstractBuild.java:592)
at hudson.model.Run.execute(Run.java:1568)
at hudson.model.FreeStyleBuild.run(FreeStyleBuild.java:46)
at hudson.model.ResourceController.execute(ResourceController.java:88)
at hudson.model.Executor.run(Executor.java:236)
Caused by: hudson.remoting.ChannelClosedException: channel is already closed
at hudson.remoting.Channel.send(Channel.java:494)
at hudson.remoting.Request.call(Request.java:129)
at hudson.remoting.Channel.call(Channel.java:672)
at hudson.FilePath.act(FilePath.java:854)
... 13 more
Caused by: hudson.remoting.Channel$OrderlyShutdown
at hudson.remoting.Channel$CloseCommand.execute(Channel.java:850)
at hudson.remoting.Channel$2.handle(Channel.java:435)
at hudson.remoting.SynchronousCommandTransport$ReaderThread.run(SynchronousCommandTransport.java:60)
Caused by: Command close created at
at hudson.remoting.Command.<init>(Command.java:56)
at hudson.remoting.Channel$CloseCommand.<init>(Channel.java:844)
at hudson.remoting.Channel$CloseCommand.<init>(Channel.java:842)
at hudson.remoting.Channel.close(Channel.java:909)
at hudson.remoting.Channel.close(Channel.java:892)
at hudson.remoting.Channel$CloseCommand.execute(Channel.java:849)
... 2 more
FATAL: hudson.remoting.RequestAbortedException: hudson.remoting.Channel$OrderlyShutdown
hudson.remoting.RequestAbortedException: hudson.remoting.RequestAbortedException: hudson.remoting.Channel$OrderlyShutdown
at hudson.remoting.Request.call(Request.java:174)
at hudson.remoting.Channel.call(Channel.java:672)
at hudson.remoting.RemoteInvocationHandler.invoke(RemoteInvocationHandler.java:158)
at sun.proxy.$Proxy39.join(Unknown Source)
at hudson.Launcher$RemoteLauncher$ProcImpl.join(Launcher.java:915)
at hudson.Launcher$ProcStarter.join(Launcher.java:360)
at hudson.tasks.CommandInterpreter.perform(CommandInterpreter.java:91)
at hudson.tasks.CommandInterpreter.perform(CommandInterpreter.java:60)
at hudson.tasks.BuildStepMonitor$1.perform(BuildStepMonitor.java:19)
at hudson.model.AbstractBuild$AbstractBuildExecution.perform(AbstractBuild.java:810)
at hudson.model.Build$BuildExecution.build(Build.java:199)
at hudson.model.Build$BuildExecution.doRun(Build.java:160)
at hudson.model.AbstractBuild$AbstractBuildExecution.run(AbstractBuild.java:592)
at hudson.model.Run.execute(Run.java:1568)
at hudson.model.FreeStyleBuild.run(FreeStyleBuild.java:46)
at hudson.model.ResourceController.execute(ResourceController.java:88)
at hudson.model.Executor.run(Executor.java:236)
Caused by: hudson.remoting.RequestAbortedException: hudson.remoting.Channel$OrderlyShutdown
at hudson.remoting.Request.abort(Request.java:299)
at hudson.remoting.Channel.terminate(Channel.java:732)
at hudson.remoting.Channel$CloseCommand.execute(Channel.java:850)
at hudson.remoting.Channel$2.handle(Channel.java:435)
at hudson.remoting.SynchronousCommandTransport$ReaderThread.run(SynchronousCommandTransport.java:60)
Caused by: hudson.remoting.Channel$OrderlyShutdown
... 3 more
Caused by: Command close created at
at hudson.remoting.Command.<init>(Command.java:56)
at hudson.remoting.Channel$CloseCommand.<init>(Channel.java:844)
at hudson.remoting.Channel$CloseCommand.<init>(Channel.java:842)
at hudson.remoting.Channel.close(Channel.java:909)
at hudson.remoting.Channel.close(Channel.java:892)
at hudson.remoting.Channel$CloseCommand.execute(Channel.java:849)
... 2 more