Jeff, can you take a look at this? ------ Robin Anil
On Fri, Jun 8, 2012 at 4:29 PM, Apache Jenkins Server <[email protected]> wrote:

> See <https://builds.apache.org/job/Mahout-Examples-Cluster-Reuters/158/changes>
>
> Changes:
>
> [pranjan] [maven-release-plugin] prepare for next development iteration
>
> [pranjan] [maven-release-plugin] prepare release mahout-0.7
>
> [pranjan] [maven-release-plugin] rollback the release of mahout-0.7
>
> ------------------------------------------
> [...truncated 4250 lines...]
> 12/06/08 21:28:52 INFO mapred.LocalJobRunner:
> 12/06/08 21:28:52 INFO mapred.Task: Task 'attempt_local_0003_m_000000_0' done.
> 12/06/08 21:28:52 INFO mapred.LocalJobRunner:
> 12/06/08 21:28:52 INFO mapred.Merger: Merging 1 sorted segments
> 12/06/08 21:28:52 INFO mapred.Merger: Down to the last merge-pass, with 0 segments left of total size: 0 bytes
> 12/06/08 21:28:52 INFO mapred.LocalJobRunner:
> 12/06/08 21:28:52 INFO mapred.JobClient: map 100% reduce 0%
> 12/06/08 21:28:52 INFO mapred.Task: Task:attempt_local_0003_r_000000_0 is done. And is in the process of commiting
> 12/06/08 21:28:52 INFO mapred.LocalJobRunner:
> 12/06/08 21:28:52 INFO mapred.Task: Task attempt_local_0003_r_000000_0 is allowed to commit now
> 12/06/08 21:28:52 INFO output.FileOutputCommitter: Saved output of task 'attempt_local_0003_r_000000_0' to /tmp/mahout-work-hudson/reuters-out-seqdir-sparse-kmeans/partial-vectors-0
> 12/06/08 21:28:55 INFO mapred.LocalJobRunner: reduce > reduce
> 12/06/08 21:28:55 INFO mapred.Task: Task 'attempt_local_0003_r_000000_0' done.
> 12/06/08 21:28:55 INFO mapred.JobClient: map 100% reduce 100%
> 12/06/08 21:28:55 INFO mapred.JobClient: Job complete: job_local_0003
> 12/06/08 21:28:55 INFO mapred.JobClient: Counters: 16
> 12/06/08 21:28:55 INFO mapred.JobClient: File Output Format Counters
> 12/06/08 21:28:55 INFO mapred.JobClient: Bytes Written=102
> 12/06/08 21:28:55 INFO mapred.JobClient: FileSystemCounters
> 12/06/08 21:28:55 INFO mapred.JobClient: FILE_BYTES_READ=180228927
> 12/06/08 21:28:55 INFO mapred.JobClient: FILE_BYTES_WRITTEN=181835986
> 12/06/08 21:28:55 INFO mapred.JobClient: File Input Format Counters
> 12/06/08 21:28:55 INFO mapred.JobClient: Bytes Read=101
> 12/06/08 21:28:55 INFO mapred.JobClient: Map-Reduce Framework
> 12/06/08 21:28:55 INFO mapred.JobClient: Reduce input groups=0
> 12/06/08 21:28:55 INFO mapred.JobClient: Map output materialized bytes=6
> 12/06/08 21:28:55 INFO mapred.JobClient: Combine output records=0
> 12/06/08 21:28:55 INFO mapred.JobClient: Map input records=0
> 12/06/08 21:28:55 INFO mapred.JobClient: Reduce shuffle bytes=0
> 12/06/08 21:28:55 INFO mapred.JobClient: Reduce output records=0
> 12/06/08 21:28:55 INFO mapred.JobClient: Spilled Records=0
> 12/06/08 21:28:55 INFO mapred.JobClient: Map output bytes=0
> 12/06/08 21:28:55 INFO mapred.JobClient: Combine input records=0
> 12/06/08 21:28:55 INFO mapred.JobClient: Map output records=0
> 12/06/08 21:28:55 INFO mapred.JobClient: SPLIT_RAW_BYTES=159
> 12/06/08 21:28:55 INFO mapred.JobClient: Reduce input records=0
> 12/06/08 21:28:55 INFO common.HadoopUtil: Deleting /tmp/mahout-work-hudson/reuters-out-seqdir-sparse-kmeans/tf-vectors
> 12/06/08 21:28:56 INFO input.FileInputFormat: Total input paths to process : 1
> 12/06/08 21:28:56 INFO mapred.JobClient: Running job: job_local_0004
> 12/06/08 21:28:56 INFO mapred.MapTask: io.sort.mb = 100
> 12/06/08 21:28:56 INFO mapred.MapTask: data buffer = 79691776/99614720
> 12/06/08 21:28:56 INFO mapred.MapTask: record buffer = 262144/327680
> 12/06/08 21:28:56 INFO mapred.MapTask: Starting flush of map output
> 12/06/08 21:28:56 INFO mapred.Task: Task:attempt_local_0004_m_000000_0 is done. And is in the process of commiting
> 12/06/08 21:28:57 INFO mapred.JobClient: map 0% reduce 0%
> 12/06/08 21:28:59 INFO mapred.LocalJobRunner:
> 12/06/08 21:28:59 INFO mapred.Task: Task 'attempt_local_0004_m_000000_0' done.
> 12/06/08 21:28:59 INFO mapred.LocalJobRunner:
> 12/06/08 21:28:59 INFO mapred.Merger: Merging 1 sorted segments
> 12/06/08 21:28:59 INFO mapred.Merger: Down to the last merge-pass, with 0 segments left of total size: 0 bytes
> 12/06/08 21:28:59 INFO mapred.LocalJobRunner:
> 12/06/08 21:28:59 INFO mapred.Task: Task:attempt_local_0004_r_000000_0 is done. And is in the process of commiting
> 12/06/08 21:28:59 INFO mapred.LocalJobRunner:
> 12/06/08 21:28:59 INFO mapred.Task: Task attempt_local_0004_r_000000_0 is allowed to commit now
> 12/06/08 21:28:59 INFO output.FileOutputCommitter: Saved output of task 'attempt_local_0004_r_000000_0' to /tmp/mahout-work-hudson/reuters-out-seqdir-sparse-kmeans/tf-vectors
> 12/06/08 21:28:59 INFO mapred.JobClient: map 100% reduce 0%
> 12/06/08 21:29:02 INFO mapred.LocalJobRunner: reduce > reduce
> 12/06/08 21:29:02 INFO mapred.Task: Task 'attempt_local_0004_r_000000_0' done.
> 12/06/08 21:29:02 INFO mapred.JobClient: map 100% reduce 100%
> 12/06/08 21:29:02 INFO mapred.JobClient: Job complete: job_local_0004
> 12/06/08 21:29:02 INFO mapred.JobClient: Counters: 16
> 12/06/08 21:29:02 INFO mapred.JobClient: File Output Format Counters
> 12/06/08 21:29:02 INFO mapred.JobClient: Bytes Written=102
> 12/06/08 21:29:02 INFO mapred.JobClient: FileSystemCounters
> 12/06/08 21:29:02 INFO mapred.JobClient: FILE_BYTES_READ=240305208
> 12/06/08 21:29:02 INFO mapred.JobClient: FILE_BYTES_WRITTEN=242447468
> 12/06/08 21:29:02 INFO mapred.JobClient: File Input Format Counters
> 12/06/08 21:29:02 INFO mapred.JobClient: Bytes Read=102
> 12/06/08 21:29:02 INFO mapred.JobClient: Map-Reduce Framework
> 12/06/08 21:29:02 INFO mapred.JobClient: Reduce input groups=0
> 12/06/08 21:29:02 INFO mapred.JobClient: Map output materialized bytes=6
> 12/06/08 21:29:02 INFO mapred.JobClient: Combine output records=0
> 12/06/08 21:29:02 INFO mapred.JobClient: Map input records=0
> 12/06/08 21:29:02 INFO mapred.JobClient: Reduce shuffle bytes=0
> 12/06/08 21:29:02 INFO mapred.JobClient: Reduce output records=0
> 12/06/08 21:29:02 INFO mapred.JobClient: Spilled Records=0
> 12/06/08 21:29:02 INFO mapred.JobClient: Map output bytes=0
> 12/06/08 21:29:02 INFO mapred.JobClient: Combine input records=0
> 12/06/08 21:29:02 INFO mapred.JobClient: Map output records=0
> 12/06/08 21:29:02 INFO mapred.JobClient: SPLIT_RAW_BYTES=157
> 12/06/08 21:29:02 INFO mapred.JobClient: Reduce input records=0
> 12/06/08 21:29:02 INFO common.HadoopUtil: Deleting /tmp/mahout-work-hudson/reuters-out-seqdir-sparse-kmeans/partial-vectors-0
> 12/06/08 21:29:02 INFO common.HadoopUtil: Deleting /tmp/mahout-work-hudson/reuters-out-seqdir-sparse-kmeans/df-count
> 12/06/08 21:29:02 INFO input.FileInputFormat: Total input paths to process : 1
> 12/06/08 21:29:02 INFO mapred.JobClient: Running job: job_local_0005
> 12/06/08 21:29:02 INFO mapred.MapTask: io.sort.mb = 100
> 12/06/08 21:29:03 INFO mapred.MapTask: data buffer = 79691776/99614720
> 12/06/08 21:29:03 INFO mapred.MapTask: record buffer = 262144/327680
> 12/06/08 21:29:03 INFO mapred.MapTask: Starting flush of map output
> 12/06/08 21:29:03 INFO mapred.Task: Task:attempt_local_0005_m_000000_0 is done. And is in the process of commiting
> 12/06/08 21:29:03 INFO mapred.JobClient: map 0% reduce 0%
> 12/06/08 21:29:05 INFO mapred.LocalJobRunner:
> 12/06/08 21:29:05 INFO mapred.Task: Task 'attempt_local_0005_m_000000_0' done.
> 12/06/08 21:29:05 INFO mapred.LocalJobRunner:
> 12/06/08 21:29:05 INFO mapred.Merger: Merging 1 sorted segments
> 12/06/08 21:29:05 INFO mapred.Merger: Down to the last merge-pass, with 0 segments left of total size: 0 bytes
> 12/06/08 21:29:05 INFO mapred.LocalJobRunner:
> 12/06/08 21:29:05 INFO mapred.Task: Task:attempt_local_0005_r_000000_0 is done. And is in the process of commiting
> 12/06/08 21:29:05 INFO mapred.LocalJobRunner:
> 12/06/08 21:29:05 INFO mapred.Task: Task attempt_local_0005_r_000000_0 is allowed to commit now
> 12/06/08 21:29:05 INFO output.FileOutputCommitter: Saved output of task 'attempt_local_0005_r_000000_0' to /tmp/mahout-work-hudson/reuters-out-seqdir-sparse-kmeans/df-count
> 12/06/08 21:29:05 INFO mapred.JobClient: map 100% reduce 0%
> 12/06/08 21:29:08 INFO mapred.LocalJobRunner: reduce > reduce
> 12/06/08 21:29:08 INFO mapred.Task: Task 'attempt_local_0005_r_000000_0' done.
> 12/06/08 21:29:08 INFO mapred.JobClient: map 100% reduce 100%
> 12/06/08 21:29:08 INFO mapred.JobClient: Job complete: job_local_0005
> 12/06/08 21:29:08 INFO mapred.JobClient: Counters: 16
> 12/06/08 21:29:08 INFO mapred.JobClient: File Output Format Counters
> 12/06/08 21:29:08 INFO mapred.JobClient: Bytes Written=105
> 12/06/08 21:29:08 INFO mapred.JobClient: FileSystemCounters
> 12/06/08 21:29:08 INFO mapred.JobClient: FILE_BYTES_READ=300381378
> 12/06/08 21:29:08 INFO mapred.JobClient: FILE_BYTES_WRITTEN=303058415
> 12/06/08 21:29:08 INFO mapred.JobClient: File Input Format Counters
> 12/06/08 21:29:08 INFO mapred.JobClient: Bytes Read=102
> 12/06/08 21:29:08 INFO mapred.JobClient: Map-Reduce Framework
> 12/06/08 21:29:08 INFO mapred.JobClient: Reduce input groups=0
> 12/06/08 21:29:08 INFO mapred.JobClient: Map output materialized bytes=6
> 12/06/08 21:29:08 INFO mapred.JobClient: Combine output records=0
> 12/06/08 21:29:08 INFO mapred.JobClient: Map input records=0
> 12/06/08 21:29:08 INFO mapred.JobClient: Reduce shuffle bytes=0
> 12/06/08 21:29:08 INFO mapred.JobClient: Reduce output records=0
> 12/06/08 21:29:08 INFO mapred.JobClient: Spilled Records=0
> 12/06/08 21:29:08 INFO mapred.JobClient: Map output bytes=0
> 12/06/08 21:29:08 INFO mapred.JobClient: Combine input records=0
> 12/06/08 21:29:08 INFO mapred.JobClient: Map output records=0
> 12/06/08 21:29:08 INFO mapred.JobClient: SPLIT_RAW_BYTES=150
> 12/06/08 21:29:08 INFO mapred.JobClient: Reduce input records=0
> 12/06/08 21:29:09 INFO input.FileInputFormat: Total input paths to process : 1
> 12/06/08 21:29:09 INFO filecache.TrackerDistributedCacheManager: Creating frequency.file-0 in /tmp/hadoop-hudson/mapred/local/archive/1408592979058787675_1334525619_1308635919/file/tmp/mahout-work-hudson/reuters-out-seqdir-sparse-kmeans-work--5518354331994365426 with rwxr-xr-x
> 12/06/08 21:29:09 INFO filecache.TrackerDistributedCacheManager: Cached /tmp/mahout-work-hudson/reuters-out-seqdir-sparse-kmeans/frequency.file-0 as /tmp/hadoop-hudson/mapred/local/archive/1408592979058787675_1334525619_1308635919/file/tmp/mahout-work-hudson/reuters-out-seqdir-sparse-kmeans/frequency.file-0
> 12/06/08 21:29:09 INFO filecache.TrackerDistributedCacheManager: Cached /tmp/mahout-work-hudson/reuters-out-seqdir-sparse-kmeans/frequency.file-0 as /tmp/hadoop-hudson/mapred/local/archive/1408592979058787675_1334525619_1308635919/file/tmp/mahout-work-hudson/reuters-out-seqdir-sparse-kmeans/frequency.file-0
> 12/06/08 21:29:09 INFO mapred.JobClient: Running job: job_local_0006
> 12/06/08 21:29:09 INFO mapred.MapTask: io.sort.mb = 100
> 12/06/08 21:29:09 INFO mapred.MapTask: data buffer = 79691776/99614720
> 12/06/08 21:29:09 INFO mapred.MapTask: record buffer = 262144/327680
> 12/06/08 21:29:09 INFO mapred.MapTask: Starting flush of map output
> 12/06/08 21:29:09 INFO mapred.Task: Task:attempt_local_0006_m_000000_0 is done. And is in the process of commiting
> 12/06/08 21:29:10 INFO mapred.JobClient: map 0% reduce 0%
> 12/06/08 21:29:12 INFO mapred.LocalJobRunner:
> 12/06/08 21:29:12 INFO mapred.Task: Task 'attempt_local_0006_m_000000_0' done.
> 12/06/08 21:29:12 INFO mapred.LocalJobRunner:
> 12/06/08 21:29:12 INFO mapred.Merger: Merging 1 sorted segments
> 12/06/08 21:29:12 INFO mapred.Merger: Down to the last merge-pass, with 0 segments left of total size: 0 bytes
> 12/06/08 21:29:12 INFO mapred.LocalJobRunner:
> 12/06/08 21:29:12 INFO mapred.JobClient: map 100% reduce 0%
> 12/06/08 21:29:12 INFO mapred.Task: Task:attempt_local_0006_r_000000_0 is done. And is in the process of commiting
> 12/06/08 21:29:12 INFO mapred.LocalJobRunner:
> 12/06/08 21:29:12 INFO mapred.Task: Task attempt_local_0006_r_000000_0 is allowed to commit now
> 12/06/08 21:29:12 INFO output.FileOutputCommitter: Saved output of task 'attempt_local_0006_r_000000_0' to /tmp/mahout-work-hudson/reuters-out-seqdir-sparse-kmeans/partial-vectors-0
> 12/06/08 21:29:15 INFO mapred.LocalJobRunner: reduce > reduce
> 12/06/08 21:29:15 INFO mapred.Task: Task 'attempt_local_0006_r_000000_0' done.
> 12/06/08 21:29:15 INFO mapred.JobClient: map 100% reduce 100%
> 12/06/08 21:29:15 INFO mapred.JobClient: Job complete: job_local_0006
> 12/06/08 21:29:15 INFO mapred.JobClient: Counters: 16
> 12/06/08 21:29:15 INFO mapred.JobClient: File Output Format Counters
> 12/06/08 21:29:15 INFO mapred.JobClient: Bytes Written=102
> 12/06/08 21:29:15 INFO mapred.JobClient: FileSystemCounters
> 12/06/08 21:29:15 INFO mapred.JobClient: FILE_BYTES_READ=360458073
> 12/06/08 21:29:15 INFO mapred.JobClient: FILE_BYTES_WRITTEN=363673230
> 12/06/08 21:29:15 INFO mapred.JobClient: File Input Format Counters
> 12/06/08 21:29:15 INFO mapred.JobClient: Bytes Read=102
> 12/06/08 21:29:15 INFO mapred.JobClient: Map-Reduce Framework
> 12/06/08 21:29:15 INFO mapred.JobClient: Reduce input groups=0
> 12/06/08 21:29:15 INFO mapred.JobClient: Map output materialized bytes=6
> 12/06/08 21:29:15 INFO mapred.JobClient: Combine output records=0
> 12/06/08 21:29:15 INFO mapred.JobClient: Map input records=0
> 12/06/08 21:29:15 INFO mapred.JobClient: Reduce shuffle bytes=0
> 12/06/08 21:29:15 INFO mapred.JobClient: Reduce output records=0
> 12/06/08 21:29:15 INFO mapred.JobClient: Spilled Records=0
> 12/06/08 21:29:15 INFO mapred.JobClient: Map output bytes=0
> 12/06/08 21:29:15 INFO mapred.JobClient: Combine input records=0
> 12/06/08 21:29:15 INFO mapred.JobClient: Map output records=0
> 12/06/08 21:29:15 INFO mapred.JobClient: SPLIT_RAW_BYTES=150
> 12/06/08 21:29:15 INFO mapred.JobClient: Reduce input records=0
> 12/06/08 21:29:15 INFO common.HadoopUtil: Deleting /tmp/mahout-work-hudson/reuters-out-seqdir-sparse-kmeans/tfidf-vectors
> 12/06/08 21:29:15 INFO input.FileInputFormat: Total input paths to process : 1
> 12/06/08 21:29:15 INFO mapred.JobClient: Running job: job_local_0007
> 12/06/08 21:29:15 INFO mapred.MapTask: io.sort.mb = 100
> 12/06/08 21:29:16 INFO mapred.MapTask: data buffer = 79691776/99614720
> 12/06/08 21:29:16 INFO mapred.MapTask: record buffer = 262144/327680
> 12/06/08 21:29:16 INFO mapred.MapTask: Starting flush of map output
> 12/06/08 21:29:16 INFO mapred.Task: Task:attempt_local_0007_m_000000_0 is done. And is in the process of commiting
> 12/06/08 21:29:16 INFO mapred.JobClient: map 0% reduce 0%
> 12/06/08 21:29:18 INFO mapred.LocalJobRunner:
> 12/06/08 21:29:18 INFO mapred.Task: Task 'attempt_local_0007_m_000000_0' done.
> 12/06/08 21:29:18 INFO mapred.LocalJobRunner:
> 12/06/08 21:29:18 INFO mapred.Merger: Merging 1 sorted segments
> 12/06/08 21:29:18 INFO mapred.Merger: Down to the last merge-pass, with 0 segments left of total size: 0 bytes
> 12/06/08 21:29:18 INFO mapred.LocalJobRunner:
> 12/06/08 21:29:18 INFO mapred.Task: Task:attempt_local_0007_r_000000_0 is done. And is in the process of commiting
> 12/06/08 21:29:18 INFO mapred.LocalJobRunner:
> 12/06/08 21:29:18 INFO mapred.Task: Task attempt_local_0007_r_000000_0 is allowed to commit now
> 12/06/08 21:29:18 INFO output.FileOutputCommitter: Saved output of task 'attempt_local_0007_r_000000_0' to /tmp/mahout-work-hudson/reuters-out-seqdir-sparse-kmeans/tfidf-vectors
> 12/06/08 21:29:18 INFO mapred.JobClient: map 100% reduce 0%
> 12/06/08 21:29:21 INFO mapred.LocalJobRunner: reduce > reduce
> 12/06/08 21:29:21 INFO mapred.Task: Task 'attempt_local_0007_r_000000_0' done.
> 12/06/08 21:29:21 INFO mapred.JobClient: map 100% reduce 100%
> 12/06/08 21:29:21 INFO mapred.JobClient: Job complete: job_local_0007
> 12/06/08 21:29:21 INFO mapred.JobClient: Counters: 16
> 12/06/08 21:29:21 INFO mapred.JobClient: File Output Format Counters
> 12/06/08 21:29:21 INFO mapred.JobClient: Bytes Written=102
> 12/06/08 21:29:21 INFO mapred.JobClient: FileSystemCounters
> 12/06/08 21:29:21 INFO mapred.JobClient: FILE_BYTES_READ=420534362
> 12/06/08 21:29:21 INFO mapred.JobClient: FILE_BYTES_WRITTEN=424284724
> 12/06/08 21:29:21 INFO mapred.JobClient: File Input Format Counters
> 12/06/08 21:29:21 INFO mapred.JobClient: Bytes Read=102
> 12/06/08 21:29:21 INFO mapred.JobClient: Map-Reduce Framework
> 12/06/08 21:29:21 INFO mapred.JobClient: Reduce input groups=0
> 12/06/08 21:29:21 INFO mapred.JobClient: Map output materialized bytes=6
> 12/06/08 21:29:21 INFO mapred.JobClient: Combine output records=0
> 12/06/08 21:29:21 INFO mapred.JobClient: Map input records=0
> 12/06/08 21:29:21 INFO mapred.JobClient: Reduce shuffle bytes=0
> 12/06/08 21:29:21 INFO mapred.JobClient: Reduce output records=0
> 12/06/08 21:29:21 INFO mapred.JobClient: Spilled Records=0
> 12/06/08 21:29:21 INFO mapred.JobClient: Map output bytes=0
> 12/06/08 21:29:21 INFO mapred.JobClient: Combine input records=0
> 12/06/08 21:29:21 INFO mapred.JobClient: Map output records=0
> 12/06/08 21:29:21 INFO mapred.JobClient: SPLIT_RAW_BYTES=157
> 12/06/08 21:29:21 INFO mapred.JobClient: Reduce input records=0
> 12/06/08 21:29:21 INFO common.HadoopUtil: Deleting /tmp/mahout-work-hudson/reuters-out-seqdir-sparse-kmeans/partial-vectors-0
> 12/06/08 21:29:21 INFO driver.MahoutDriver: Program took 44338 ms (Minutes: 0.7389666666666667)
> hadoop binary is not in PATH,HADOOP_HOME/bin,HADOOP_PREFIX/bin, running locally
> SLF4J: Class path contains multiple SLF4J bindings.
> SLF4J: Found binding in [jar:file:/zonestorage/hudson_solaris/home/hudson/hudson-slave/workspace/Mahout-Examples-Cluster-Reuters/trunk/examples/target/mahout-examples-0.8-SNAPSHOT-job.jar!/org/slf4j/impl/StaticLoggerBinder.class]
> SLF4J: Found binding in [jar:file:/zonestorage/hudson_solaris/home/hudson/hudson-slave/workspace/Mahout-Examples-Cluster-Reuters/trunk/examples/target/dependency/slf4j-jcl-1.6.1.jar!/org/slf4j/impl/StaticLoggerBinder.class]
> SLF4J: Found binding in [jar:file:/zonestorage/hudson_solaris/home/hudson/hudson-slave/workspace/Mahout-Examples-Cluster-Reuters/trunk/examples/target/dependency/slf4j-log4j12-1.6.1.jar!/org/slf4j/impl/StaticLoggerBinder.class]
> SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
> 12/06/08 21:29:22 INFO common.AbstractJob: Command line arguments: {--clustering=null, --clusters=[/tmp/mahout-work-hudson/reuters-kmeans-clusters], --convergenceDelta=[0.5], --distanceMeasure=[org.apache.mahout.common.distance.CosineDistanceMeasure], --endPhase=[2147483647], --input=[/tmp/mahout-work-hudson/reuters-out-seqdir-sparse-kmeans/tfidf-vectors/], --maxIter=[10], --method=[mapreduce], --numClusters=[20], --output=[/tmp/mahout-work-hudson/reuters-kmeans], --overwrite=null, --startPhase=[0], --tempDir=[temp]}
> 12/06/08 21:29:23 INFO common.HadoopUtil: Deleting /tmp/mahout-work-hudson/reuters-kmeans-clusters
> 12/06/08 21:29:23 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
> 12/06/08 21:29:23 INFO compress.CodecPool: Got brand-new compressor
> 12/06/08 21:29:23 INFO kmeans.RandomSeedGenerator: Wrote 20 vectors to /tmp/mahout-work-hudson/reuters-kmeans-clusters/part-randomSeed
> 12/06/08 21:29:23 INFO kmeans.KMeansDriver: Input: /tmp/mahout-work-hudson/reuters-out-seqdir-sparse-kmeans/tfidf-vectors Clusters In: /tmp/mahout-work-hudson/reuters-kmeans-clusters/part-randomSeed Out: /tmp/mahout-work-hudson/reuters-kmeans Distance: org.apache.mahout.common.distance.CosineDistanceMeasure
> 12/06/08 21:29:23 INFO kmeans.KMeansDriver: convergence: 0.5 max Iterations: 10 num Reduce Tasks: org.apache.mahout.math.VectorWritable Input Vectors: {}
> 12/06/08 21:29:23 INFO compress.CodecPool: Got brand-new decompressor
> Exception in thread "main" java.lang.IllegalStateException: No input clusters found. Check your -c argument.
>         at org.apache.mahout.clustering.kmeans.KMeansDriver.buildClusters(KMeansDriver.java:218)
>         at org.apache.mahout.clustering.kmeans.KMeansDriver.run(KMeansDriver.java:149)
>         at org.apache.mahout.clustering.kmeans.KMeansDriver.run(KMeansDriver.java:108)
>         at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
>         at org.apache.mahout.clustering.kmeans.KMeansDriver.main(KMeansDriver.java:49)
>         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>         at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
>         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
>         at java.lang.reflect.Method.invoke(Method.java:597)
>         at org.apache.hadoop.util.ProgramDriver$ProgramDescription.invoke(ProgramDriver.java:68)
>         at org.apache.hadoop.util.ProgramDriver.driver(ProgramDriver.java:139)
>         at org.apache.mahout.driver.MahoutDriver.main(MahoutDriver.java:195)
> Build step 'Execute shell' marked build as failure
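
For reference, here is the failing k-means step reconstructed from the "Command line arguments" line above. The paths and option values are copied from that log line, so treat this as a sketch of what examples/bin/cluster-reuters.sh ends up invoking, not the exact command from the script:

    # Reconstructed from the logged AbstractJob argument map (sketch only).
    bin/mahout kmeans \
      --input /tmp/mahout-work-hudson/reuters-out-seqdir-sparse-kmeans/tfidf-vectors/ \
      --clusters /tmp/mahout-work-hudson/reuters-kmeans-clusters \
      --output /tmp/mahout-work-hudson/reuters-kmeans \
      --distanceMeasure org.apache.mahout.common.distance.CosineDistanceMeasure \
      --numClusters 20 \
      --maxIter 10 \
      --convergenceDelta 0.5 \
      --method mapreduce \
      --overwrite \
      --clustering

The --clusters (-c) path is the one the IllegalStateException complains about, even though RandomSeedGenerator reports writing 20 vectors to part-randomSeed just before. Note also that every seq2sparse job above shows "Map input records=0" and the driver logs "Input Vectors: {}", so it may be worth dumping both the tfidf-vectors directory and /tmp/mahout-work-hudson/reuters-kmeans-clusters/part-randomSeed (e.g. with bin/mahout seqdumper -i <path>) to check whether they actually contain data before digging into KMeansDriver itself.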
