See <https://builds.apache.org/job/Mahout-Examples-Cluster-Reuters/169/changes>
Changes:

[srowen] MAHOUT-985 ignore ARFF instance weights, handle ? correctly
[srowen] MAHOUT-1033 avoid NPE on null conf
[srowen] MAHOUT-954 "Unpredictable" have to be represented by NaN on DF.
[ssc] MAHOUT-889 size() returns wrong value (10) on freshly instantiated ObjectArrayList
[srowen] MAHOUT-1003 fix bad help display due to 100-arg --filter arg
[ssc] MAHOUT-960 Reduce memory usage of ImplicitFeedbackAlternatingLeastSquaresSolver

------------------------------------------
[...truncated 3662 lines...]
12/06/22 13:51:29 INFO mapred.Task: Task 'attempt_local_0003_m_000000_0' done.
12/06/22 13:51:29 INFO mapred.JobClient: map 100% reduce 0%
12/06/22 13:51:29 INFO mapred.LocalJobRunner:
12/06/22 13:51:29 INFO mapred.Merger: Merging 1 sorted segments
12/06/22 13:51:29 INFO mapred.Merger: Down to the last merge-pass, with 0 segments left of total size: 0 bytes
12/06/22 13:51:29 INFO mapred.LocalJobRunner:
12/06/22 13:51:29 INFO mapred.Task: Task:attempt_local_0003_r_000000_0 is done. And is in the process of commiting
12/06/22 13:51:29 INFO mapred.LocalJobRunner:
12/06/22 13:51:29 INFO mapred.Task: Task attempt_local_0003_r_000000_0 is allowed to commit now
12/06/22 13:51:29 INFO output.FileOutputCommitter: Saved output of task 'attempt_local_0003_r_000000_0' to /tmp/mahout-work-hudson/reuters-out-seqdir-sparse-kmeans/partial-vectors-0
12/06/22 13:51:32 INFO mapred.LocalJobRunner: reduce > reduce
12/06/22 13:51:32 INFO mapred.Task: Task 'attempt_local_0003_r_000000_0' done.
12/06/22 13:51:32 INFO mapred.JobClient: map 100% reduce 100%
12/06/22 13:51:32 INFO mapred.JobClient: Job complete: job_local_0003
12/06/22 13:51:32 INFO mapred.JobClient: Counters: 16
12/06/22 13:51:32 INFO mapred.JobClient: File Output Format Counters
12/06/22 13:51:32 INFO mapred.JobClient: Bytes Written=102
12/06/22 13:51:32 INFO mapred.JobClient: FileSystemCounters
12/06/22 13:51:32 INFO mapred.JobClient: FILE_BYTES_READ=184726401
12/06/22 13:51:32 INFO mapred.JobClient: FILE_BYTES_WRITTEN=186368620
12/06/22 13:51:32 INFO mapred.JobClient: File Input Format Counters
12/06/22 13:51:32 INFO mapred.JobClient: Bytes Read=101
12/06/22 13:51:32 INFO mapred.JobClient: Map-Reduce Framework
12/06/22 13:51:32 INFO mapred.JobClient: Reduce input groups=0
12/06/22 13:51:32 INFO mapred.JobClient: Map output materialized bytes=6
12/06/22 13:51:32 INFO mapred.JobClient: Combine output records=0
12/06/22 13:51:32 INFO mapred.JobClient: Map input records=0
12/06/22 13:51:32 INFO mapred.JobClient: Reduce shuffle bytes=0
12/06/22 13:51:32 INFO mapred.JobClient: Reduce output records=0
12/06/22 13:51:32 INFO mapred.JobClient: Spilled Records=0
12/06/22 13:51:32 INFO mapred.JobClient: Map output bytes=0
12/06/22 13:51:32 INFO mapred.JobClient: Combine input records=0
12/06/22 13:51:32 INFO mapred.JobClient: Map output records=0
12/06/22 13:51:32 INFO mapred.JobClient: SPLIT_RAW_BYTES=159
12/06/22 13:51:32 INFO mapred.JobClient: Reduce input records=0
12/06/22 13:51:32 INFO common.HadoopUtil: Deleting /tmp/mahout-work-hudson/reuters-out-seqdir-sparse-kmeans/tf-vectors
12/06/22 13:51:32 INFO input.FileInputFormat: Total input paths to process : 1
12/06/22 13:51:32 INFO mapred.JobClient: Running job: job_local_0004
12/06/22 13:51:32 INFO mapred.MapTask: io.sort.mb = 100
12/06/22 13:51:33 INFO mapred.MapTask: data buffer = 79691776/99614720
12/06/22 13:51:33 INFO mapred.MapTask: record buffer = 262144/327680
12/06/22 13:51:33 INFO mapred.MapTask: Starting flush of map output
12/06/22 13:51:33 INFO mapred.Task: Task:attempt_local_0004_m_000000_0 is done. And is in the process of commiting
12/06/22 13:51:33 INFO mapred.JobClient: map 0% reduce 0%
12/06/22 13:51:35 INFO mapred.LocalJobRunner:
12/06/22 13:51:35 INFO mapred.Task: Task 'attempt_local_0004_m_000000_0' done.
12/06/22 13:51:35 INFO mapred.LocalJobRunner:
12/06/22 13:51:35 INFO mapred.Merger: Merging 1 sorted segments
12/06/22 13:51:35 INFO mapred.Merger: Down to the last merge-pass, with 0 segments left of total size: 0 bytes
12/06/22 13:51:35 INFO mapred.LocalJobRunner:
12/06/22 13:51:35 INFO mapred.JobClient: map 100% reduce 0%
12/06/22 13:51:35 INFO mapred.Task: Task:attempt_local_0004_r_000000_0 is done. And is in the process of commiting
12/06/22 13:51:35 INFO mapred.LocalJobRunner:
12/06/22 13:51:35 INFO mapred.Task: Task attempt_local_0004_r_000000_0 is allowed to commit now
12/06/22 13:51:35 INFO output.FileOutputCommitter: Saved output of task 'attempt_local_0004_r_000000_0' to /tmp/mahout-work-hudson/reuters-out-seqdir-sparse-kmeans/tf-vectors
12/06/22 13:51:38 INFO mapred.LocalJobRunner: reduce > reduce
12/06/22 13:51:38 INFO mapred.Task: Task 'attempt_local_0004_r_000000_0' done.
12/06/22 13:51:38 INFO mapred.JobClient: map 100% reduce 100%
12/06/22 13:51:38 INFO mapred.JobClient: Job complete: job_local_0004
12/06/22 13:51:38 INFO mapred.JobClient: Counters: 16
12/06/22 13:51:38 INFO mapred.JobClient: File Output Format Counters
12/06/22 13:51:38 INFO mapred.JobClient: Bytes Written=102
12/06/22 13:51:38 INFO mapred.JobClient: FileSystemCounters
12/06/22 13:51:38 INFO mapred.JobClient: FILE_BYTES_READ=246301840
12/06/22 13:51:38 INFO mapred.JobClient: FILE_BYTES_WRITTEN=248490972
12/06/22 13:51:38 INFO mapred.JobClient: File Input Format Counters
12/06/22 13:51:38 INFO mapred.JobClient: Bytes Read=102
12/06/22 13:51:38 INFO mapred.JobClient: Map-Reduce Framework
12/06/22 13:51:38 INFO mapred.JobClient: Reduce input groups=0
12/06/22 13:51:38 INFO mapred.JobClient: Map output materialized bytes=6
12/06/22 13:51:38 INFO mapred.JobClient: Combine output records=0
12/06/22 13:51:38 INFO mapred.JobClient: Map input records=0
12/06/22 13:51:38 INFO mapred.JobClient: Reduce shuffle bytes=0
12/06/22 13:51:38 INFO mapred.JobClient: Reduce output records=0
12/06/22 13:51:38 INFO mapred.JobClient: Spilled Records=0
12/06/22 13:51:38 INFO mapred.JobClient: Map output bytes=0
12/06/22 13:51:38 INFO mapred.JobClient: Combine input records=0
12/06/22 13:51:38 INFO mapred.JobClient: Map output records=0
12/06/22 13:51:38 INFO mapred.JobClient: SPLIT_RAW_BYTES=157
12/06/22 13:51:38 INFO mapred.JobClient: Reduce input records=0
12/06/22 13:51:38 INFO common.HadoopUtil: Deleting /tmp/mahout-work-hudson/reuters-out-seqdir-sparse-kmeans/partial-vectors-0
12/06/22 13:51:38 INFO common.HadoopUtil: Deleting /tmp/mahout-work-hudson/reuters-out-seqdir-sparse-kmeans/df-count
12/06/22 13:51:38 INFO input.FileInputFormat: Total input paths to process : 1
12/06/22 13:51:38 INFO mapred.JobClient: Running job: job_local_0005
12/06/22 13:51:38 INFO mapred.MapTask: io.sort.mb = 100
12/06/22 13:51:39 INFO mapred.MapTask: data buffer = 79691776/99614720
12/06/22 13:51:39 INFO mapred.MapTask: record buffer = 262144/327680
12/06/22 13:51:39 INFO mapred.MapTask: Starting flush of map output
12/06/22 13:51:39 INFO mapred.Task: Task:attempt_local_0005_m_000000_0 is done. And is in the process of commiting
12/06/22 13:51:39 INFO mapred.JobClient: map 0% reduce 0%
12/06/22 13:51:41 INFO mapred.LocalJobRunner:
12/06/22 13:51:41 INFO mapred.Task: Task 'attempt_local_0005_m_000000_0' done.
12/06/22 13:51:41 INFO mapred.LocalJobRunner:
12/06/22 13:51:41 INFO mapred.Merger: Merging 1 sorted segments
12/06/22 13:51:41 INFO mapred.Merger: Down to the last merge-pass, with 0 segments left of total size: 0 bytes
12/06/22 13:51:41 INFO mapred.LocalJobRunner:
12/06/22 13:51:41 INFO mapred.Task: Task:attempt_local_0005_r_000000_0 is done. And is in the process of commiting
12/06/22 13:51:41 INFO mapred.LocalJobRunner:
12/06/22 13:51:41 INFO mapred.Task: Task attempt_local_0005_r_000000_0 is allowed to commit now
12/06/22 13:51:41 INFO output.FileOutputCommitter: Saved output of task 'attempt_local_0005_r_000000_0' to /tmp/mahout-work-hudson/reuters-out-seqdir-sparse-kmeans/df-count
12/06/22 13:51:41 INFO mapred.JobClient: map 100% reduce 0%
12/06/22 13:51:44 INFO mapred.LocalJobRunner: reduce > reduce
12/06/22 13:51:44 INFO mapred.Task: Task 'attempt_local_0005_r_000000_0' done.
12/06/22 13:51:44 INFO mapred.JobClient: map 100% reduce 100%
12/06/22 13:51:44 INFO mapred.JobClient: Job complete: job_local_0005
12/06/22 13:51:44 INFO mapred.JobClient: Counters: 16
12/06/22 13:51:44 INFO mapred.JobClient: File Output Format Counters
12/06/22 13:51:44 INFO mapred.JobClient: Bytes Written=105
12/06/22 13:51:44 INFO mapred.JobClient: FileSystemCounters
12/06/22 13:51:44 INFO mapred.JobClient: FILE_BYTES_READ=307877168
12/06/22 13:51:44 INFO mapred.JobClient: FILE_BYTES_WRITTEN=310612797
12/06/22 13:51:44 INFO mapred.JobClient: File Input Format Counters
12/06/22 13:51:44 INFO mapred.JobClient: Bytes Read=102
12/06/22 13:51:44 INFO mapred.JobClient: Map-Reduce Framework
12/06/22 13:51:44 INFO mapred.JobClient: Reduce input groups=0
12/06/22 13:51:44 INFO mapred.JobClient: Map output materialized bytes=6
12/06/22 13:51:44 INFO mapred.JobClient: Combine output records=0
12/06/22 13:51:44 INFO mapred.JobClient: Map input records=0
12/06/22 13:51:44 INFO mapred.JobClient: Reduce shuffle bytes=0
12/06/22 13:51:44 INFO mapred.JobClient: Reduce output records=0
12/06/22 13:51:44 INFO mapred.JobClient: Spilled Records=0
12/06/22 13:51:44 INFO mapred.JobClient: Map output bytes=0
12/06/22 13:51:44 INFO mapred.JobClient: Combine input records=0
12/06/22 13:51:44 INFO mapred.JobClient: Map output records=0
12/06/22 13:51:44 INFO mapred.JobClient: SPLIT_RAW_BYTES=150
12/06/22 13:51:44 INFO mapred.JobClient: Reduce input records=0
12/06/22 13:51:45 INFO input.FileInputFormat: Total input paths to process : 1
12/06/22 13:51:45 INFO filecache.TrackerDistributedCacheManager: Creating frequency.file-0 in /tmp/hadoop-hudson/mapred/local/archive/3675376996949945347_1334525619_343308272/file/tmp/mahout-work-hudson/reuters-out-seqdir-sparse-kmeans-work--2719025437925940869 with rwxr-xr-x
12/06/22 13:51:45 INFO filecache.TrackerDistributedCacheManager: Cached /tmp/mahout-work-hudson/reuters-out-seqdir-sparse-kmeans/frequency.file-0 as /tmp/hadoop-hudson/mapred/local/archive/3675376996949945347_1334525619_343308272/file/tmp/mahout-work-hudson/reuters-out-seqdir-sparse-kmeans/frequency.file-0
12/06/22 13:51:45 INFO filecache.TrackerDistributedCacheManager: Cached /tmp/mahout-work-hudson/reuters-out-seqdir-sparse-kmeans/frequency.file-0 as /tmp/hadoop-hudson/mapred/local/archive/3675376996949945347_1334525619_343308272/file/tmp/mahout-work-hudson/reuters-out-seqdir-sparse-kmeans/frequency.file-0
12/06/22 13:51:45 INFO mapred.JobClient: Running job: job_local_0006
12/06/22 13:51:45 INFO mapred.MapTask: io.sort.mb = 100
12/06/22 13:51:46 INFO mapred.MapTask: data buffer = 79691776/99614720
12/06/22 13:51:46 INFO mapred.MapTask: record buffer = 262144/327680
12/06/22 13:51:46 INFO mapred.MapTask: Starting flush of map output
12/06/22 13:51:46 INFO mapred.Task: Task:attempt_local_0006_m_000000_0 is done. And is in the process of commiting
12/06/22 13:51:46 INFO mapred.JobClient: map 0% reduce 0%
12/06/22 13:51:48 INFO mapred.LocalJobRunner:
12/06/22 13:51:48 INFO mapred.Task: Task 'attempt_local_0006_m_000000_0' done.
12/06/22 13:51:48 INFO mapred.JobClient: map 100% reduce 0%
12/06/22 13:51:48 INFO mapred.LocalJobRunner:
12/06/22 13:51:48 INFO mapred.Merger: Merging 1 sorted segments
12/06/22 13:51:48 INFO mapred.Merger: Down to the last merge-pass, with 0 segments left of total size: 0 bytes
12/06/22 13:51:48 INFO mapred.LocalJobRunner:
12/06/22 13:51:48 INFO mapred.Task: Task:attempt_local_0006_r_000000_0 is done. And is in the process of commiting
12/06/22 13:51:48 INFO mapred.LocalJobRunner:
12/06/22 13:51:48 INFO mapred.Task: Task attempt_local_0006_r_000000_0 is allowed to commit now
12/06/22 13:51:48 INFO output.FileOutputCommitter: Saved output of task 'attempt_local_0006_r_000000_0' to /tmp/mahout-work-hudson/reuters-out-seqdir-sparse-kmeans/partial-vectors-0
12/06/22 13:51:51 INFO mapred.LocalJobRunner: reduce > reduce
12/06/22 13:51:51 INFO mapred.Task: Task 'attempt_local_0006_r_000000_0' done.
12/06/22 13:51:51 INFO mapred.JobClient: map 100% reduce 100%
12/06/22 13:51:51 INFO mapred.JobClient: Job complete: job_local_0006
12/06/22 13:51:51 INFO mapred.JobClient: Counters: 16
12/06/22 13:51:51 INFO mapred.JobClient: File Output Format Counters
12/06/22 13:51:51 INFO mapred.JobClient: Bytes Written=102
12/06/22 13:51:51 INFO mapred.JobClient: FileSystemCounters
12/06/22 13:51:51 INFO mapred.JobClient: FILE_BYTES_READ=369453021
12/06/22 13:51:51 INFO mapred.JobClient: FILE_BYTES_WRITTEN=372738472
12/06/22 13:51:51 INFO mapred.JobClient: File Input Format Counters
12/06/22 13:51:51 INFO mapred.JobClient: Bytes Read=102
12/06/22 13:51:51 INFO mapred.JobClient: Map-Reduce Framework
12/06/22 13:51:51 INFO mapred.JobClient: Reduce input groups=0
12/06/22 13:51:51 INFO mapred.JobClient: Map output materialized bytes=6
12/06/22 13:51:51 INFO mapred.JobClient: Combine output records=0
12/06/22 13:51:51 INFO mapred.JobClient: Map input records=0
12/06/22 13:51:51 INFO mapred.JobClient: Reduce shuffle bytes=0
12/06/22 13:51:51 INFO mapred.JobClient: Reduce output records=0
12/06/22 13:51:51 INFO mapred.JobClient: Spilled Records=0
12/06/22 13:51:51 INFO mapred.JobClient: Map output bytes=0
12/06/22 13:51:51 INFO mapred.JobClient: Combine input records=0
12/06/22 13:51:51 INFO mapred.JobClient: Map output records=0
12/06/22 13:51:51 INFO mapred.JobClient: SPLIT_RAW_BYTES=150
12/06/22 13:51:51 INFO mapred.JobClient: Reduce input records=0
12/06/22 13:51:51 INFO common.HadoopUtil: Deleting /tmp/mahout-work-hudson/reuters-out-seqdir-sparse-kmeans/tfidf-vectors
12/06/22 13:51:52 INFO input.FileInputFormat: Total input paths to process : 1
12/06/22 13:51:52 INFO mapred.JobClient: Running job: job_local_0007
12/06/22 13:51:52 INFO mapred.MapTask: io.sort.mb = 100
12/06/22 13:51:53 INFO mapred.MapTask: data buffer = 79691776/99614720
12/06/22 13:51:53 INFO mapred.MapTask: record buffer = 262144/327680
12/06/22 13:51:53 INFO mapred.MapTask: Starting flush of map output
12/06/22 13:51:53 INFO mapred.Task: Task:attempt_local_0007_m_000000_0 is done. And is in the process of commiting
12/06/22 13:51:53 INFO mapred.JobClient: map 0% reduce 0%
12/06/22 13:51:55 INFO mapred.LocalJobRunner:
12/06/22 13:51:55 INFO mapred.Task: Task 'attempt_local_0007_m_000000_0' done.
12/06/22 13:51:55 INFO mapred.LocalJobRunner:
12/06/22 13:51:55 INFO mapred.Merger: Merging 1 sorted segments
12/06/22 13:51:55 INFO mapred.Merger: Down to the last merge-pass, with 0 segments left of total size: 0 bytes
12/06/22 13:51:55 INFO mapred.LocalJobRunner:
12/06/22 13:51:55 INFO mapred.JobClient: map 100% reduce 0%
12/06/22 13:51:55 INFO mapred.Task: Task:attempt_local_0007_r_000000_0 is done. And is in the process of commiting
12/06/22 13:51:55 INFO mapred.LocalJobRunner:
12/06/22 13:51:55 INFO mapred.Task: Task attempt_local_0007_r_000000_0 is allowed to commit now
12/06/22 13:51:55 INFO output.FileOutputCommitter: Saved output of task 'attempt_local_0007_r_000000_0' to /tmp/mahout-work-hudson/reuters-out-seqdir-sparse-kmeans/tfidf-vectors
12/06/22 13:51:58 INFO mapred.LocalJobRunner: reduce > reduce
12/06/22 13:51:58 INFO mapred.Task: Task 'attempt_local_0007_r_000000_0' done.
12/06/22 13:51:58 INFO mapred.JobClient: map 100% reduce 100%
12/06/22 13:51:58 INFO mapred.JobClient: Job complete: job_local_0007
12/06/22 13:51:58 INFO mapred.JobClient: Counters: 16
12/06/22 13:51:58 INFO mapred.JobClient: File Output Format Counters
12/06/22 13:51:58 INFO mapred.JobClient: Bytes Written=102
12/06/22 13:51:58 INFO mapred.JobClient: FileSystemCounters
12/06/22 13:51:58 INFO mapred.JobClient: FILE_BYTES_READ=431028468
12/06/22 13:51:58 INFO mapred.JobClient: FILE_BYTES_WRITTEN=434860844
12/06/22 13:51:58 INFO mapred.JobClient: File Input Format Counters
12/06/22 13:51:58 INFO mapred.JobClient: Bytes Read=102
12/06/22 13:51:58 INFO mapred.JobClient: Map-Reduce Framework
12/06/22 13:51:58 INFO mapred.JobClient: Reduce input groups=0
12/06/22 13:51:58 INFO mapred.JobClient: Map output materialized bytes=6
12/06/22 13:51:58 INFO mapred.JobClient: Combine output records=0
12/06/22 13:51:58 INFO mapred.JobClient: Map input records=0
12/06/22 13:51:58 INFO mapred.JobClient: Reduce shuffle bytes=0
12/06/22 13:51:58 INFO mapred.JobClient: Reduce output records=0
12/06/22 13:51:58 INFO mapred.JobClient: Spilled Records=0
12/06/22 13:51:58 INFO mapred.JobClient: Map output bytes=0
12/06/22 13:51:58 INFO mapred.JobClient: Combine input records=0
12/06/22 13:51:58 INFO mapred.JobClient: Map output records=0
12/06/22 13:51:58 INFO mapred.JobClient: SPLIT_RAW_BYTES=157
12/06/22 13:51:58 INFO mapred.JobClient: Reduce input records=0
12/06/22 13:51:58 INFO common.HadoopUtil: Deleting /tmp/mahout-work-hudson/reuters-out-seqdir-sparse-kmeans/partial-vectors-0
12/06/22 13:51:58 INFO driver.MahoutDriver: Program took 44685 ms (Minutes: 0.74475)
hadoop binary is not in PATH,HADOOP_HOME/bin,HADOOP_PREFIX/bin, running locally
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/zonestorage/hudson_solaris/home/hudson/hudson-slave/workspace/Mahout-Examples-Cluster-Reuters/trunk/examples/target/mahout-examples-0.8-SNAPSHOT-job.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/zonestorage/hudson_solaris/home/hudson/hudson-slave/workspace/Mahout-Examples-Cluster-Reuters/trunk/examples/target/dependency/slf4j-jcl-1.6.6.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/zonestorage/hudson_solaris/home/hudson/hudson-slave/workspace/Mahout-Examples-Cluster-Reuters/trunk/examples/target/dependency/slf4j-log4j12-1.6.1.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
12/06/22 13:52:00 INFO common.AbstractJob: Command line arguments: {--clustering=null, --clusters=[/tmp/mahout-work-hudson/reuters-kmeans-clusters], --convergenceDelta=[0.5], --distanceMeasure=[org.apache.mahout.common.distance.CosineDistanceMeasure], --endPhase=[2147483647], --input=[/tmp/mahout-work-hudson/reuters-out-seqdir-sparse-kmeans/tfidf-vectors/], --maxIter=[10], --method=[mapreduce], --numClusters=[20], --output=[/tmp/mahout-work-hudson/reuters-kmeans], --overwrite=null, --startPhase=[0], --tempDir=[temp]}
12/06/22 13:52:00 INFO common.HadoopUtil: Deleting /tmp/mahout-work-hudson/reuters-kmeans-clusters
12/06/22 13:52:00 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
12/06/22 13:52:00 INFO compress.CodecPool: Got brand-new compressor
12/06/22 13:52:00 INFO kmeans.RandomSeedGenerator: Wrote 20 Klusters to /tmp/mahout-work-hudson/reuters-kmeans-clusters/part-randomSeed
12/06/22 13:52:00 INFO kmeans.KMeansDriver: Input: /tmp/mahout-work-hudson/reuters-out-seqdir-sparse-kmeans/tfidf-vectors Clusters In: /tmp/mahout-work-hudson/reuters-kmeans-clusters/part-randomSeed Out: /tmp/mahout-work-hudson/reuters-kmeans Distance: org.apache.mahout.common.distance.CosineDistanceMeasure
12/06/22 13:52:00 INFO kmeans.KMeansDriver: convergence: 0.5 max Iterations: 10 num Reduce Tasks: org.apache.mahout.math.VectorWritable Input Vectors: {}
12/06/22 13:52:00 INFO compress.CodecPool: Got brand-new decompressor
Exception in thread "main" java.lang.IllegalStateException: No input clusters found in /tmp/mahout-work-hudson/reuters-kmeans-clusters/part-randomSeed. Check your -c argument.
	at org.apache.mahout.clustering.kmeans.KMeansDriver.buildClusters(KMeansDriver.java:217)
	at org.apache.mahout.clustering.kmeans.KMeansDriver.run(KMeansDriver.java:148)
	at org.apache.mahout.clustering.kmeans.KMeansDriver.run(KMeansDriver.java:107)
	at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
	at org.apache.mahout.clustering.kmeans.KMeansDriver.main(KMeansDriver.java:48)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
	at java.lang.reflect.Method.invoke(Method.java:597)
	at org.apache.hadoop.util.ProgramDriver$ProgramDescription.invoke(ProgramDriver.java:68)
	at org.apache.hadoop.util.ProgramDriver.driver(ProgramDriver.java:139)
	at org.apache.mahout.driver.MahoutDriver.main(MahoutDriver.java:195)
Build step 'Execute shell' marked build as failure
