See <https://builds.apache.org/job/Mahout-Examples-Cluster-Reuters/21/>

------------------------------------------
[...truncated 6058 lines...]
12/01/22 19:24:03 INFO mapred.JobClient:     FILE_BYTES_WRITTEN=142860384
12/01/22 19:24:03 INFO mapred.JobClient:   File Input Format Counters 
12/01/22 19:24:03 INFO mapred.JobClient:     Bytes Read=101
12/01/22 19:24:03 INFO mapred.JobClient:   Map-Reduce Framework
12/01/22 19:24:03 INFO mapred.JobClient:     Reduce input groups=0
12/01/22 19:24:03 INFO mapred.JobClient:     Map output materialized bytes=6
12/01/22 19:24:03 INFO mapred.JobClient:     Combine output records=0
12/01/22 19:24:03 INFO mapred.JobClient:     Map input records=0
12/01/22 19:24:03 INFO mapred.JobClient:     Reduce shuffle bytes=0
12/01/22 19:24:03 INFO mapred.JobClient:     Reduce output records=0
12/01/22 19:24:03 INFO mapred.JobClient:     Spilled Records=0
12/01/22 19:24:03 INFO mapred.JobClient:     Map output bytes=0
12/01/22 19:24:03 INFO mapred.JobClient:     Combine input records=0
12/01/22 19:24:03 INFO mapred.JobClient:     Map output records=0
12/01/22 19:24:03 INFO mapred.JobClient:     SPLIT_RAW_BYTES=159
12/01/22 19:24:03 INFO mapred.JobClient:     Reduce input records=0
12/01/22 19:24:03 INFO common.HadoopUtil: Deleting /tmp/mahout-work-hudson/reuters-out-seqdir-sparse-kmeans/tf-vectors
12/01/22 19:24:04 INFO input.FileInputFormat: Total input paths to process : 1
12/01/22 19:24:04 INFO mapred.JobClient: Running job: job_local_0004
12/01/22 19:24:04 INFO mapred.MapTask: io.sort.mb = 100
12/01/22 19:24:04 INFO mapred.MapTask: data buffer = 79691776/99614720
12/01/22 19:24:04 INFO mapred.MapTask: record buffer = 262144/327680
12/01/22 19:24:04 INFO mapred.MapTask: Starting flush of map output
12/01/22 19:24:04 INFO mapred.Task: Task:attempt_local_0004_m_000000_0 is done. And is in the process of commiting
12/01/22 19:24:05 INFO mapred.JobClient:  map 0% reduce 0%
12/01/22 19:24:07 INFO mapred.LocalJobRunner: 
12/01/22 19:24:07 INFO mapred.Task: Task 'attempt_local_0004_m_000000_0' done.
12/01/22 19:24:07 INFO mapred.LocalJobRunner: 
12/01/22 19:24:07 INFO mapred.Merger: Merging 1 sorted segments
12/01/22 19:24:07 INFO mapred.Merger: Down to the last merge-pass, with 0 segments left of total size: 0 bytes
12/01/22 19:24:07 INFO mapred.LocalJobRunner: 
12/01/22 19:24:07 INFO mapred.Task: Task:attempt_local_0004_r_000000_0 is done. And is in the process of commiting
12/01/22 19:24:07 INFO mapred.LocalJobRunner: 
12/01/22 19:24:07 INFO mapred.Task: Task attempt_local_0004_r_000000_0 is allowed to commit now
12/01/22 19:24:07 INFO output.FileOutputCommitter: Saved output of task 'attempt_local_0004_r_000000_0' to /tmp/mahout-work-hudson/reuters-out-seqdir-sparse-kmeans/tf-vectors
12/01/22 19:24:07 INFO mapred.JobClient:  map 100% reduce 0%
12/01/22 19:24:10 INFO mapred.LocalJobRunner: reduce > reduce
12/01/22 19:24:10 INFO mapred.Task: Task 'attempt_local_0004_r_000000_0' done.
12/01/22 19:24:10 INFO mapred.JobClient:  map 100% reduce 100%
12/01/22 19:24:10 INFO mapred.JobClient: Job complete: job_local_0004
12/01/22 19:24:10 INFO mapred.JobClient: Counters: 16
12/01/22 19:24:10 INFO mapred.JobClient:   File Output Format Counters 
12/01/22 19:24:10 INFO mapred.JobClient:     Bytes Written=102
12/01/22 19:24:10 INFO mapred.JobClient:   FileSystemCounters
12/01/22 19:24:10 INFO mapred.JobClient:     FILE_BYTES_READ=188740568
12/01/22 19:24:10 INFO mapred.JobClient:     FILE_BYTES_WRITTEN=190479994
12/01/22 19:24:10 INFO mapred.JobClient:   File Input Format Counters 
12/01/22 19:24:10 INFO mapred.JobClient:     Bytes Read=102
12/01/22 19:24:10 INFO mapred.JobClient:   Map-Reduce Framework
12/01/22 19:24:10 INFO mapred.JobClient:     Reduce input groups=0
12/01/22 19:24:10 INFO mapred.JobClient:     Map output materialized bytes=6
12/01/22 19:24:10 INFO mapred.JobClient:     Combine output records=0
12/01/22 19:24:10 INFO mapred.JobClient:     Map input records=0
12/01/22 19:24:10 INFO mapred.JobClient:     Reduce shuffle bytes=0
12/01/22 19:24:10 INFO mapred.JobClient:     Reduce output records=0
12/01/22 19:24:10 INFO mapred.JobClient:     Spilled Records=0
12/01/22 19:24:10 INFO mapred.JobClient:     Map output bytes=0
12/01/22 19:24:10 INFO mapred.JobClient:     Combine input records=0
12/01/22 19:24:10 INFO mapred.JobClient:     Map output records=0
12/01/22 19:24:10 INFO mapred.JobClient:     SPLIT_RAW_BYTES=157
12/01/22 19:24:10 INFO mapred.JobClient:     Reduce input records=0
12/01/22 19:24:10 INFO common.HadoopUtil: Deleting /tmp/mahout-work-hudson/reuters-out-seqdir-sparse-kmeans/partial-vectors-0
12/01/22 19:24:10 INFO common.HadoopUtil: Deleting /tmp/mahout-work-hudson/reuters-out-seqdir-sparse-kmeans/df-count
12/01/22 19:24:10 INFO input.FileInputFormat: Total input paths to process : 1
12/01/22 19:24:10 INFO mapred.JobClient: Running job: job_local_0005
12/01/22 19:24:10 INFO mapred.MapTask: io.sort.mb = 100
12/01/22 19:24:10 INFO mapred.MapTask: data buffer = 79691776/99614720
12/01/22 19:24:10 INFO mapred.MapTask: record buffer = 262144/327680
12/01/22 19:24:10 INFO mapred.MapTask: Starting flush of map output
12/01/22 19:24:10 INFO mapred.Task: Task:attempt_local_0005_m_000000_0 is done. And is in the process of commiting
12/01/22 19:24:11 INFO mapred.JobClient:  map 0% reduce 0%
12/01/22 19:24:13 INFO mapred.LocalJobRunner: 
12/01/22 19:24:13 INFO mapred.Task: Task 'attempt_local_0005_m_000000_0' done.
12/01/22 19:24:13 INFO mapred.LocalJobRunner: 
12/01/22 19:24:13 INFO mapred.Merger: Merging 1 sorted segments
12/01/22 19:24:13 INFO mapred.Merger: Down to the last merge-pass, with 0 segments left of total size: 0 bytes
12/01/22 19:24:13 INFO mapred.LocalJobRunner: 
12/01/22 19:24:13 INFO mapred.Task: Task:attempt_local_0005_r_000000_0 is done. And is in the process of commiting
12/01/22 19:24:13 INFO mapred.LocalJobRunner: 
12/01/22 19:24:13 INFO mapred.Task: Task attempt_local_0005_r_000000_0 is allowed to commit now
12/01/22 19:24:13 INFO output.FileOutputCommitter: Saved output of task 'attempt_local_0005_r_000000_0' to /tmp/mahout-work-hudson/reuters-out-seqdir-sparse-kmeans/df-count
12/01/22 19:24:13 INFO mapred.JobClient:  map 100% reduce 0%
12/01/22 19:24:16 INFO mapred.LocalJobRunner: reduce > reduce
12/01/22 19:24:16 INFO mapred.Task: Task 'attempt_local_0005_r_000000_0' done.
12/01/22 19:24:16 INFO mapred.JobClient:  map 100% reduce 100%
12/01/22 19:24:16 INFO mapred.JobClient: Job complete: job_local_0005
12/01/22 19:24:16 INFO mapred.JobClient: Counters: 16
12/01/22 19:24:16 INFO mapred.JobClient:   File Output Format Counters 
12/01/22 19:24:16 INFO mapred.JobClient:     Bytes Written=105
12/01/22 19:24:16 INFO mapred.JobClient:   FileSystemCounters
12/01/22 19:24:16 INFO mapred.JobClient:     FILE_BYTES_READ=235925578
12/01/22 19:24:16 INFO mapred.JobClient:     FILE_BYTES_WRITTEN=238099085
12/01/22 19:24:16 INFO mapred.JobClient:   File Input Format Counters 
12/01/22 19:24:16 INFO mapred.JobClient:     Bytes Read=102
12/01/22 19:24:16 INFO mapred.JobClient:   Map-Reduce Framework
12/01/22 19:24:16 INFO mapred.JobClient:     Reduce input groups=0
12/01/22 19:24:16 INFO mapred.JobClient:     Map output materialized bytes=6
12/01/22 19:24:16 INFO mapred.JobClient:     Combine output records=0
12/01/22 19:24:16 INFO mapred.JobClient:     Map input records=0
12/01/22 19:24:16 INFO mapred.JobClient:     Reduce shuffle bytes=0
12/01/22 19:24:16 INFO mapred.JobClient:     Reduce output records=0
12/01/22 19:24:16 INFO mapred.JobClient:     Spilled Records=0
12/01/22 19:24:16 INFO mapred.JobClient:     Map output bytes=0
12/01/22 19:24:16 INFO mapred.JobClient:     Combine input records=0
12/01/22 19:24:16 INFO mapred.JobClient:     Map output records=0
12/01/22 19:24:16 INFO mapred.JobClient:     SPLIT_RAW_BYTES=150
12/01/22 19:24:16 INFO mapred.JobClient:     Reduce input records=0
12/01/22 19:24:17 INFO input.FileInputFormat: Total input paths to process : 1
12/01/22 19:24:17 INFO filecache.TrackerDistributedCacheManager: Creating frequency.file-0 in /tmp/hadoop-hudson/mapred/local/archive/5090227056520710800_1334525619_115362154/file/tmp/mahout-work-hudson/reuters-out-seqdir-sparse-kmeans-work--423398436315833019 with rwxr-xr-x
12/01/22 19:24:17 INFO filecache.TrackerDistributedCacheManager: Cached /tmp/mahout-work-hudson/reuters-out-seqdir-sparse-kmeans/frequency.file-0 as /tmp/hadoop-hudson/mapred/local/archive/5090227056520710800_1334525619_115362154/file/tmp/mahout-work-hudson/reuters-out-seqdir-sparse-kmeans/frequency.file-0
12/01/22 19:24:17 INFO filecache.TrackerDistributedCacheManager: Cached /tmp/mahout-work-hudson/reuters-out-seqdir-sparse-kmeans/frequency.file-0 as /tmp/hadoop-hudson/mapred/local/archive/5090227056520710800_1334525619_115362154/file/tmp/mahout-work-hudson/reuters-out-seqdir-sparse-kmeans/frequency.file-0
12/01/22 19:24:17 INFO mapred.JobClient: Running job: job_local_0006
12/01/22 19:24:17 INFO mapred.MapTask: io.sort.mb = 100
12/01/22 19:24:17 INFO mapred.MapTask: data buffer = 79691776/99614720
12/01/22 19:24:17 INFO mapred.MapTask: record buffer = 262144/327680
12/01/22 19:24:17 INFO mapred.MapTask: Starting flush of map output
12/01/22 19:24:17 INFO mapred.Task: Task:attempt_local_0006_m_000000_0 is done. And is in the process of commiting
12/01/22 19:24:18 INFO mapred.JobClient:  map 0% reduce 0%
12/01/22 19:24:20 INFO mapred.LocalJobRunner: 
12/01/22 19:24:20 INFO mapred.Task: Task 'attempt_local_0006_m_000000_0' done.
12/01/22 19:24:20 INFO mapred.LocalJobRunner: 
12/01/22 19:24:20 INFO mapred.Merger: Merging 1 sorted segments
12/01/22 19:24:20 INFO mapred.Merger: Down to the last merge-pass, with 0 segments left of total size: 0 bytes
12/01/22 19:24:20 INFO mapred.LocalJobRunner: 
12/01/22 19:24:20 INFO mapred.JobClient:  map 100% reduce 0%
12/01/22 19:24:20 INFO mapred.Task: Task:attempt_local_0006_r_000000_0 is done. And is in the process of commiting
12/01/22 19:24:20 INFO mapred.LocalJobRunner: 
12/01/22 19:24:20 INFO mapred.Task: Task attempt_local_0006_r_000000_0 is allowed to commit now
12/01/22 19:24:20 INFO output.FileOutputCommitter: Saved output of task 'attempt_local_0006_r_000000_0' to /tmp/mahout-work-hudson/reuters-out-seqdir-sparse-kmeans/partial-vectors-0
12/01/22 19:24:23 INFO mapred.LocalJobRunner: reduce > reduce
12/01/22 19:24:23 INFO mapred.Task: Task 'attempt_local_0006_r_000000_0' done.
12/01/22 19:24:23 INFO mapred.JobClient:  map 100% reduce 100%
12/01/22 19:24:23 INFO mapred.JobClient: Job complete: job_local_0006
12/01/22 19:24:23 INFO mapred.JobClient: Counters: 16
12/01/22 19:24:23 INFO mapred.JobClient:   File Output Format Counters 
12/01/22 19:24:23 INFO mapred.JobClient:     Bytes Written=102
12/01/22 19:24:23 INFO mapred.JobClient:   FileSystemCounters
12/01/22 19:24:23 INFO mapred.JobClient:     FILE_BYTES_READ=283111113
12/01/22 19:24:23 INFO mapred.JobClient:     FILE_BYTES_WRITTEN=285722018
12/01/22 19:24:23 INFO mapred.JobClient:   File Input Format Counters 
12/01/22 19:24:23 INFO mapred.JobClient:     Bytes Read=102
12/01/22 19:24:23 INFO mapred.JobClient:   Map-Reduce Framework
12/01/22 19:24:23 INFO mapred.JobClient:     Reduce input groups=0
12/01/22 19:24:23 INFO mapred.JobClient:     Map output materialized bytes=6
12/01/22 19:24:23 INFO mapred.JobClient:     Combine output records=0
12/01/22 19:24:23 INFO mapred.JobClient:     Map input records=0
12/01/22 19:24:23 INFO mapred.JobClient:     Reduce shuffle bytes=0
12/01/22 19:24:23 INFO mapred.JobClient:     Reduce output records=0
12/01/22 19:24:23 INFO mapred.JobClient:     Spilled Records=0
12/01/22 19:24:23 INFO mapred.JobClient:     Map output bytes=0
12/01/22 19:24:23 INFO mapred.JobClient:     Combine input records=0
12/01/22 19:24:23 INFO mapred.JobClient:     Map output records=0
12/01/22 19:24:23 INFO mapred.JobClient:     SPLIT_RAW_BYTES=150
12/01/22 19:24:23 INFO mapred.JobClient:     Reduce input records=0
12/01/22 19:24:23 INFO common.HadoopUtil: Deleting /tmp/mahout-work-hudson/reuters-out-seqdir-sparse-kmeans/tfidf-vectors
12/01/22 19:24:23 INFO input.FileInputFormat: Total input paths to process : 1
12/01/22 19:24:23 INFO mapred.JobClient: Running job: job_local_0007
12/01/22 19:24:23 INFO mapred.MapTask: io.sort.mb = 100
12/01/22 19:24:24 INFO mapred.MapTask: data buffer = 79691776/99614720
12/01/22 19:24:24 INFO mapred.MapTask: record buffer = 262144/327680
12/01/22 19:24:24 INFO mapred.MapTask: Starting flush of map output
12/01/22 19:24:24 INFO mapred.Task: Task:attempt_local_0007_m_000000_0 is done. And is in the process of commiting
12/01/22 19:24:24 INFO mapred.JobClient:  map 0% reduce 0%
12/01/22 19:24:26 INFO mapred.LocalJobRunner: 
12/01/22 19:24:26 INFO mapred.Task: Task 'attempt_local_0007_m_000000_0' done.
12/01/22 19:24:26 INFO mapred.LocalJobRunner: 
12/01/22 19:24:26 INFO mapred.Merger: Merging 1 sorted segments
12/01/22 19:24:26 INFO mapred.Merger: Down to the last merge-pass, with 0 segments left of total size: 0 bytes
12/01/22 19:24:26 INFO mapred.LocalJobRunner: 
12/01/22 19:24:26 INFO mapred.Task: Task:attempt_local_0007_r_000000_0 is done. And is in the process of commiting
12/01/22 19:24:26 INFO mapred.LocalJobRunner: 
12/01/22 19:24:26 INFO mapred.Task: Task attempt_local_0007_r_000000_0 is allowed to commit now
12/01/22 19:24:26 INFO output.FileOutputCommitter: Saved output of task 'attempt_local_0007_r_000000_0' to /tmp/mahout-work-hudson/reuters-out-seqdir-sparse-kmeans/tfidf-vectors
12/01/22 19:24:26 INFO mapred.JobClient:  map 100% reduce 0%
12/01/22 19:24:29 INFO mapred.LocalJobRunner: reduce > reduce
12/01/22 19:24:29 INFO mapred.Task: Task 'attempt_local_0007_r_000000_0' done.
12/01/22 19:24:29 INFO mapred.JobClient:  map 100% reduce 100%
12/01/22 19:24:29 INFO mapred.JobClient: Job complete: job_local_0007
12/01/22 19:24:29 INFO mapred.JobClient: Counters: 16
12/01/22 19:24:29 INFO mapred.JobClient:   File Output Format Counters 
12/01/22 19:24:29 INFO mapred.JobClient:     Bytes Written=102
12/01/22 19:24:29 INFO mapred.JobClient:   FileSystemCounters
12/01/22 19:24:29 INFO mapred.JobClient:     FILE_BYTES_READ=330296242
12/01/22 19:24:29 INFO mapred.JobClient:     FILE_BYTES_WRITTEN=333341656
12/01/22 19:24:29 INFO mapred.JobClient:   File Input Format Counters 
12/01/22 19:24:29 INFO mapred.JobClient:     Bytes Read=102
12/01/22 19:24:29 INFO mapred.JobClient:   Map-Reduce Framework
12/01/22 19:24:29 INFO mapred.JobClient:     Reduce input groups=0
12/01/22 19:24:29 INFO mapred.JobClient:     Map output materialized bytes=6
12/01/22 19:24:29 INFO mapred.JobClient:     Combine output records=0
12/01/22 19:24:29 INFO mapred.JobClient:     Map input records=0
12/01/22 19:24:29 INFO mapred.JobClient:     Reduce shuffle bytes=0
12/01/22 19:24:29 INFO mapred.JobClient:     Reduce output records=0
12/01/22 19:24:29 INFO mapred.JobClient:     Spilled Records=0
12/01/22 19:24:29 INFO mapred.JobClient:     Map output bytes=0
12/01/22 19:24:29 INFO mapred.JobClient:     Combine input records=0
12/01/22 19:24:29 INFO mapred.JobClient:     Map output records=0
12/01/22 19:24:29 INFO mapred.JobClient:     SPLIT_RAW_BYTES=157
12/01/22 19:24:29 INFO mapred.JobClient:     Reduce input records=0
12/01/22 19:24:29 INFO common.HadoopUtil: Deleting /tmp/mahout-work-hudson/reuters-out-seqdir-sparse-kmeans/partial-vectors-0
12/01/22 19:24:29 INFO driver.MahoutDriver: Program took 43546 ms (Minutes: 0.7257666666666667)
MAHOUT_LOCAL is not set; adding HADOOP_CONF_DIR to classpath.
no HADOOP_HOME set, running locally
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/zonestorage/hudson_solaris/home/hudson/hudson-slave/workspace/Mahout-Examples-Cluster-Reuters/trunk/examples/target/mahout-examples-0.6-SNAPSHOT-job.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/zonestorage/hudson_solaris/home/hudson/hudson-slave/workspace/Mahout-Examples-Cluster-Reuters/trunk/examples/target/dependency/slf4j-jcl-1.6.1.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/zonestorage/hudson_solaris/home/hudson/hudson-slave/workspace/Mahout-Examples-Cluster-Reuters/trunk/examples/target/dependency/slf4j-log4j12-1.6.1.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
12/01/22 19:24:30 INFO common.AbstractJob: Command line arguments: {--clustering=null, --clusters=/tmp/mahout-work-hudson/reuters-kmeans-clusters, --convergenceDelta=0.5, --distanceMeasure=org.apache.mahout.common.distance.CosineDistanceMeasure, --endPhase=2147483647, --input=/tmp/mahout-work-hudson/reuters-out-seqdir-sparse-kmeans/tfidf-vectors/, --maxIter=10, --method=mapreduce, --numClusters=20, --output=/tmp/mahout-work-hudson/reuters-kmeans, --overwrite=null, --startPhase=0, --tempDir=temp}
12/01/22 19:24:30 INFO common.HadoopUtil: Deleting /tmp/mahout-work-hudson/reuters-kmeans
12/01/22 19:24:30 INFO common.HadoopUtil: Deleting /tmp/mahout-work-hudson/reuters-kmeans-clusters
12/01/22 19:24:30 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
12/01/22 19:24:30 INFO compress.CodecPool: Got brand-new compressor
12/01/22 19:24:31 INFO kmeans.RandomSeedGenerator: Wrote 20 vectors to /tmp/mahout-work-hudson/reuters-kmeans-clusters/part-randomSeed
12/01/22 19:24:31 INFO kmeans.KMeansDriver: Input: /tmp/mahout-work-hudson/reuters-out-seqdir-sparse-kmeans/tfidf-vectors Clusters In: /tmp/mahout-work-hudson/reuters-kmeans-clusters/part-randomSeed Out: /tmp/mahout-work-hudson/reuters-kmeans Distance: org.apache.mahout.common.distance.CosineDistanceMeasure
12/01/22 19:24:31 INFO kmeans.KMeansDriver: convergence: 0.5 max Iterations: 10 num Reduce Tasks: org.apache.mahout.math.VectorWritable Input Vectors: {}
12/01/22 19:24:31 INFO kmeans.KMeansDriver: K-Means Iteration 1
12/01/22 19:24:31 INFO input.FileInputFormat: Total input paths to process : 1
12/01/22 19:24:31 INFO mapred.JobClient: Running job: job_local_0001
12/01/22 19:24:31 INFO mapred.MapTask: io.sort.mb = 100
12/01/22 19:24:31 INFO mapred.MapTask: data buffer = 79691776/99614720
12/01/22 19:24:31 INFO mapred.MapTask: record buffer = 262144/327680
12/01/22 19:24:32 INFO compress.CodecPool: Got brand-new decompressor
12/01/22 19:24:32 WARN mapred.LocalJobRunner: job_local_0001
java.lang.IllegalStateException: No clusters found. Check your -c path.
        at org.apache.mahout.clustering.kmeans.KMeansMapper.setup(KMeansMapper.java:59)
        at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:142)
        at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:764)
        at org.apache.hadoop.mapred.MapTask.run(MapTask.java:370)
        at org.apache.hadoop.mapred.LocalJobRunner$Job.run(LocalJobRunner.java:212)
12/01/22 19:24:32 INFO mapred.JobClient:  map 0% reduce 0%
12/01/22 19:24:32 INFO mapred.JobClient: Job complete: job_local_0001
12/01/22 19:24:32 INFO mapred.JobClient: Counters: 0
Exception in thread "main" java.lang.InterruptedException: K-Means Iteration failed processing /tmp/mahout-work-hudson/reuters-kmeans-clusters/part-randomSeed
        at org.apache.mahout.clustering.kmeans.KMeansDriver.runIteration(KMeansDriver.java:371)
        at org.apache.mahout.clustering.kmeans.KMeansDriver.buildClustersMR(KMeansDriver.java:316)
        at org.apache.mahout.clustering.kmeans.KMeansDriver.buildClusters(KMeansDriver.java:239)
        at org.apache.mahout.clustering.kmeans.KMeansDriver.run(KMeansDriver.java:154)
        at org.apache.mahout.clustering.kmeans.KMeansDriver.run(KMeansDriver.java:112)
        at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
        at org.apache.mahout.clustering.kmeans.KMeansDriver.main(KMeansDriver.java:61)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
        at java.lang.reflect.Method.invoke(Method.java:597)
        at org.apache.hadoop.util.ProgramDriver$ProgramDescription.invoke(ProgramDriver.java:68)
        at org.apache.hadoop.util.ProgramDriver.driver(ProgramDriver.java:139)
        at org.apache.mahout.driver.MahoutDriver.main(MahoutDriver.java:188)
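A pattern worth noting in the log above: every vectorization job reports Map input records=0 and writes only ~102 bytes (roughly the size of a bare SequenceFile header, per the Bytes Written=102 counters), so by the time k-means starts, the seed-clusters path is effectively empty and KMeansMapper fails with "No clusters found. Check your -c path." As a hypothetical sanity check (not part of the Mahout build scripts), one could verify that a Hadoop-style output directory actually contains data before launching the next stage; the helper name and the 128-byte threshold below are illustrative assumptions:

```python
import os

def nonempty_part_files(output_dir, min_bytes=0):
    """Return the part-* files under a Hadoop-style output directory
    whose size exceeds min_bytes, skipping the _SUCCESS marker and
    .crc sidecar files. An "empty" SequenceFile still carries a
    header of roughly 100 bytes (compare the Bytes Written=102
    counters in the log), so a caller can pass min_bytes=128 or so
    to treat header-only files as empty as well."""
    found = []
    for name in sorted(os.listdir(output_dir)):
        # Only data files are interesting; _SUCCESS and .crc are metadata.
        if not name.startswith("part-") or name.endswith(".crc"):
            continue
        path = os.path.join(output_dir, name)
        if os.path.getsize(path) > min_bytes:
            found.append(path)
    return found
```

Run against a path like the tfidf-vectors or part-randomSeed output above, an empty result would flag the problem before the k-means driver throws.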
Build step 'Execute shell' marked build as failure