The MAHOUT-792 change should have had no effect on the examples code.
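For context on the failure at the bottom of the log: the iteration dies in KMeansMapper.setup because the cluster list loaded from the -c path comes back empty, so no vectors are ever assigned. A rough Python sketch of that guard plus one assignment step (hypothetical names, not Mahout's actual code; cosine distance approximates CosineDistanceMeasure):

```python
import math

def cosine_distance(a, b):
    # Analogue of CosineDistanceMeasure: 1 - cos(angle between a and b)
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return 1.0 - dot / norm

def assign_to_clusters(vectors, clusters):
    # Mirrors the mapper-setup guard: an empty or unreadable -c path
    # yields zero clusters and the whole iteration aborts immediately,
    # which matches the "No clusters found. Check your -c path." trace.
    if not clusters:
        raise RuntimeError("No clusters found. Check your -c path.")
    assignments = []
    for v in vectors:
        distances = [cosine_distance(v, c) for c in clusters]
        assignments.append(distances.index(min(distances)))
    return assignments
```

Since RandomSeedGenerator logged that it wrote 20 vectors to part-randomSeed, the suspect is whatever reads that path inside the mapper, not the seeding step itself.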

On Tue, Jan 17, 2012 at 12:54 AM, Jeff Eastman
<[email protected]>wrote:

>  Does anybody know what changed to break this example? Jenkins really
> needs to be stable before we can code freeze for 0.6 and, IMHO, we should
> focus our efforts on achieving that goal along with completing the
> following 3 open issues:
>
>        ASF 
> JIRA<https://issues.apache.org/jira/secure/IssueNavigator.jspa?reset=true&jqlQuery=project+%3D+MAHOUT+AND+resolution+%3D+Unresolved+AND+fixVersion+%3D+%220.6%22+ORDER+BY+priority+DESC>
> Displaying 3 issues at 17/Jan/12 00:53:
>
> * MAHOUT-890 <https://issues.apache.org/jira/browse/MAHOUT-890> (Bug, Major,
>   Patch Available, Unresolved): Performance issue in FPGrowth. Assignee:
>   Robin Anil; Reporter: tom pierce. Created 11/20/11 3:39, updated 1/15/12 17:58.
> * MAHOUT-768 <https://issues.apache.org/jira/browse/MAHOUT-768> (Improvement,
>   Minor, Open, Unresolved): Duplicated DoubleFunction in mahout and
>   mahout-collections (mahout.math package). Assignee: Ted Dunning; Reporter:
>   Dawid Weiss. Created 7/24/11 7:38, updated 1/8/12 18:41.
> * MAHOUT-854 <https://issues.apache.org/jira/browse/MAHOUT-854> (Improvement,
>   Minor, Reopened, Unresolved): Add MinHash to build-reuters.sh example.
>   Assignee: Grant Ingersoll; Reporter: Varun Thacker. Created 10/30/11 17:57,
>   updated 1/12/12 21:58.
>
> Generated at Tue Jan 17 00:53:22 UTC 2012 using JIRA 4.4.1#660-r161644.
>
>
> On 1/16/12 12:20 PM, Apache Jenkins Server wrote:
>
> See 
> <https://builds.apache.org/job/Mahout-Examples-Cluster-Reuters/15/changes>
>
> Changes:
>
> [tdunning] MAHOUT-792 - Forced correct block ordering in out-of-core SVD.  
> Hopefully addresses ubuntu test failures.  Also forced file closing.
>
> ------------------------------------------
> [...truncated 6057 lines...]
> 12/01/16 19:19:34 INFO mapred.JobClient:     FILE_BYTES_WRITTEN=142860392
> 12/01/16 19:19:34 INFO mapred.JobClient:   File Input Format Counters
> 12/01/16 19:19:34 INFO mapred.JobClient:     Bytes Read=101
> 12/01/16 19:19:34 INFO mapred.JobClient:   Map-Reduce Framework
> 12/01/16 19:19:34 INFO mapred.JobClient:     Reduce input groups=0
> 12/01/16 19:19:34 INFO mapred.JobClient:     Map output materialized bytes=6
> 12/01/16 19:19:34 INFO mapred.JobClient:     Combine output records=0
> 12/01/16 19:19:34 INFO mapred.JobClient:     Map input records=0
> 12/01/16 19:19:34 INFO mapred.JobClient:     Reduce shuffle bytes=0
> 12/01/16 19:19:34 INFO mapred.JobClient:     Reduce output records=0
> 12/01/16 19:19:34 INFO mapred.JobClient:     Spilled Records=0
> 12/01/16 19:19:34 INFO mapred.JobClient:     Map output bytes=0
> 12/01/16 19:19:34 INFO mapred.JobClient:     Combine input records=0
> 12/01/16 19:19:34 INFO mapred.JobClient:     Map output records=0
> 12/01/16 19:19:34 INFO mapred.JobClient:     SPLIT_RAW_BYTES=159
> 12/01/16 19:19:34 INFO mapred.JobClient:     Reduce input records=0
> 12/01/16 19:19:34 INFO common.HadoopUtil: Deleting 
> /tmp/mahout-work-hudson/reuters-out-seqdir-sparse-kmeans/tf-vectors
> 12/01/16 19:19:35 INFO input.FileInputFormat: Total input paths to process : 1
> 12/01/16 19:19:35 INFO mapred.JobClient: Running job: job_local_0004
> 12/01/16 19:19:35 INFO mapred.MapTask: io.sort.mb = 100
> 12/01/16 19:19:35 INFO mapred.MapTask: data buffer = 79691776/99614720
> 12/01/16 19:19:35 INFO mapred.MapTask: record buffer = 262144/327680
> 12/01/16 19:19:35 INFO mapred.MapTask: Starting flush of map output
> 12/01/16 19:19:35 INFO mapred.Task: Task:attempt_local_0004_m_000000_0 is 
> done. And is in the process of commiting
> 12/01/16 19:19:36 INFO mapred.JobClient:  map 0% reduce 0%
> 12/01/16 19:19:38 INFO mapred.LocalJobRunner:
> 12/01/16 19:19:38 INFO mapred.Task: Task 'attempt_local_0004_m_000000_0' done.
> 12/01/16 19:19:38 INFO mapred.LocalJobRunner:
> 12/01/16 19:19:38 INFO mapred.Merger: Merging 1 sorted segments
> 12/01/16 19:19:38 INFO mapred.Merger: Down to the last merge-pass, with 0 
> segments left of total size: 0 bytes
> 12/01/16 19:19:38 INFO mapred.LocalJobRunner:
> 12/01/16 19:19:38 INFO mapred.Task: Task:attempt_local_0004_r_000000_0 is 
> done. And is in the process of commiting
> 12/01/16 19:19:38 INFO mapred.LocalJobRunner:
> 12/01/16 19:19:38 INFO mapred.Task: Task attempt_local_0004_r_000000_0 is 
> allowed to commit now
> 12/01/16 19:19:38 INFO mapred.JobClient:  map 100% reduce 0%
> 12/01/16 19:19:38 INFO output.FileOutputCommitter: Saved output of task 
> 'attempt_local_0004_r_000000_0' to 
> /tmp/mahout-work-hudson/reuters-out-seqdir-sparse-kmeans/tf-vectors
> 12/01/16 19:19:41 INFO mapred.LocalJobRunner: reduce > reduce
> 12/01/16 19:19:41 INFO mapred.Task: Task 'attempt_local_0004_r_000000_0' done.
> 12/01/16 19:19:41 INFO mapred.JobClient:  map 100% reduce 100%
> 12/01/16 19:19:41 INFO mapred.JobClient: Job complete: job_local_0004
> 12/01/16 19:19:41 INFO mapred.JobClient: Counters: 16
> 12/01/16 19:19:41 INFO mapred.JobClient:   File Output Format Counters
> 12/01/16 19:19:41 INFO mapred.JobClient:     Bytes Written=102
> 12/01/16 19:19:41 INFO mapred.JobClient:   FileSystemCounters
> 12/01/16 19:19:41 INFO mapred.JobClient:     FILE_BYTES_READ=188740576
> 12/01/16 19:19:41 INFO mapred.JobClient:     FILE_BYTES_WRITTEN=190479996
> 12/01/16 19:19:41 INFO mapred.JobClient:   File Input Format Counters
> 12/01/16 19:19:41 INFO mapred.JobClient:     Bytes Read=102
> 12/01/16 19:19:41 INFO mapred.JobClient:   Map-Reduce Framework
> 12/01/16 19:19:41 INFO mapred.JobClient:     Reduce input groups=0
> 12/01/16 19:19:41 INFO mapred.JobClient:     Map output materialized bytes=6
> 12/01/16 19:19:41 INFO mapred.JobClient:     Combine output records=0
> 12/01/16 19:19:41 INFO mapred.JobClient:     Map input records=0
> 12/01/16 19:19:41 INFO mapred.JobClient:     Reduce shuffle bytes=0
> 12/01/16 19:19:41 INFO mapred.JobClient:     Reduce output records=0
> 12/01/16 19:19:41 INFO mapred.JobClient:     Spilled Records=0
> 12/01/16 19:19:41 INFO mapred.JobClient:     Map output bytes=0
> 12/01/16 19:19:41 INFO mapred.JobClient:     Combine input records=0
> 12/01/16 19:19:41 INFO mapred.JobClient:     Map output records=0
> 12/01/16 19:19:41 INFO mapred.JobClient:     SPLIT_RAW_BYTES=157
> 12/01/16 19:19:41 INFO mapred.JobClient:     Reduce input records=0
> 12/01/16 19:19:41 INFO common.HadoopUtil: Deleting 
> /tmp/mahout-work-hudson/reuters-out-seqdir-sparse-kmeans/partial-vectors-0
> 12/01/16 19:19:41 INFO common.HadoopUtil: Deleting 
> /tmp/mahout-work-hudson/reuters-out-seqdir-sparse-kmeans/df-count
> 12/01/16 19:19:41 INFO input.FileInputFormat: Total input paths to process : 1
> 12/01/16 19:19:41 INFO mapred.JobClient: Running job: job_local_0005
> 12/01/16 19:19:41 INFO mapred.MapTask: io.sort.mb = 100
> 12/01/16 19:19:41 INFO mapred.MapTask: data buffer = 79691776/99614720
> 12/01/16 19:19:41 INFO mapred.MapTask: record buffer = 262144/327680
> 12/01/16 19:19:41 INFO mapred.MapTask: Starting flush of map output
> 12/01/16 19:19:41 INFO mapred.Task: Task:attempt_local_0005_m_000000_0 is 
> done. And is in the process of commiting
> 12/01/16 19:19:42 INFO mapred.JobClient:  map 0% reduce 0%
> 12/01/16 19:19:44 INFO mapred.LocalJobRunner:
> 12/01/16 19:19:44 INFO mapred.Task: Task 'attempt_local_0005_m_000000_0' done.
> 12/01/16 19:19:44 INFO mapred.LocalJobRunner:
> 12/01/16 19:19:44 INFO mapred.Merger: Merging 1 sorted segments
> 12/01/16 19:19:44 INFO mapred.Merger: Down to the last merge-pass, with 0 
> segments left of total size: 0 bytes
> 12/01/16 19:19:44 INFO mapred.LocalJobRunner:
> 12/01/16 19:19:44 INFO mapred.Task: Task:attempt_local_0005_r_000000_0 is 
> done. And is in the process of commiting
> 12/01/16 19:19:44 INFO mapred.LocalJobRunner:
> 12/01/16 19:19:44 INFO mapred.Task: Task attempt_local_0005_r_000000_0 is 
> allowed to commit now
> 12/01/16 19:19:44 INFO mapred.JobClient:  map 100% reduce 0%
> 12/01/16 19:19:44 INFO output.FileOutputCommitter: Saved output of task 
> 'attempt_local_0005_r_000000_0' to 
> /tmp/mahout-work-hudson/reuters-out-seqdir-sparse-kmeans/df-count
> 12/01/16 19:19:47 INFO mapred.LocalJobRunner: reduce > reduce
> 12/01/16 19:19:47 INFO mapred.Task: Task 'attempt_local_0005_r_000000_0' done.
> 12/01/16 19:19:47 INFO mapred.JobClient:  map 100% reduce 100%
> 12/01/16 19:19:47 INFO mapred.JobClient: Job complete: job_local_0005
> 12/01/16 19:19:47 INFO mapred.JobClient: Counters: 16
> 12/01/16 19:19:47 INFO mapred.JobClient:   File Output Format Counters
> 12/01/16 19:19:47 INFO mapred.JobClient:     Bytes Written=105
> 12/01/16 19:19:47 INFO mapred.JobClient:   FileSystemCounters
> 12/01/16 19:19:47 INFO mapred.JobClient:     FILE_BYTES_READ=235925588
> 12/01/16 19:19:47 INFO mapred.JobClient:     FILE_BYTES_WRITTEN=238099073
> 12/01/16 19:19:47 INFO mapred.JobClient:   File Input Format Counters
> 12/01/16 19:19:47 INFO mapred.JobClient:     Bytes Read=102
> 12/01/16 19:19:47 INFO mapred.JobClient:   Map-Reduce Framework
> 12/01/16 19:19:47 INFO mapred.JobClient:     Reduce input groups=0
> 12/01/16 19:19:47 INFO mapred.JobClient:     Map output materialized bytes=6
> 12/01/16 19:19:47 INFO mapred.JobClient:     Combine output records=0
> 12/01/16 19:19:47 INFO mapred.JobClient:     Map input records=0
> 12/01/16 19:19:47 INFO mapred.JobClient:     Reduce shuffle bytes=0
> 12/01/16 19:19:47 INFO mapred.JobClient:     Reduce output records=0
> 12/01/16 19:19:47 INFO mapred.JobClient:     Spilled Records=0
> 12/01/16 19:19:47 INFO mapred.JobClient:     Map output bytes=0
> 12/01/16 19:19:47 INFO mapred.JobClient:     Combine input records=0
> 12/01/16 19:19:47 INFO mapred.JobClient:     Map output records=0
> 12/01/16 19:19:47 INFO mapred.JobClient:     SPLIT_RAW_BYTES=150
> 12/01/16 19:19:47 INFO mapred.JobClient:     Reduce input records=0
> 12/01/16 19:19:48 INFO input.FileInputFormat: Total input paths to process : 1
> 12/01/16 19:19:48 INFO filecache.TrackerDistributedCacheManager: Creating 
> frequency.file-0 in 
> /tmp/hadoop-hudson/mapred/local/archive/-1386584936348879013_1334525619_1744176801/file/tmp/mahout-work-hudson/reuters-out-seqdir-sparse-kmeans-work-7414301258342387605
>  with rwxr-xr-x
> 12/01/16 19:19:48 INFO filecache.TrackerDistributedCacheManager: Cached 
> /tmp/mahout-work-hudson/reuters-out-seqdir-sparse-kmeans/frequency.file-0 as 
> /tmp/hadoop-hudson/mapred/local/archive/-1386584936348879013_1334525619_1744176801/file/tmp/mahout-work-hudson/reuters-out-seqdir-sparse-kmeans/frequency.file-0
> 12/01/16 19:19:48 INFO filecache.TrackerDistributedCacheManager: Cached 
> /tmp/mahout-work-hudson/reuters-out-seqdir-sparse-kmeans/frequency.file-0 as 
> /tmp/hadoop-hudson/mapred/local/archive/-1386584936348879013_1334525619_1744176801/file/tmp/mahout-work-hudson/reuters-out-seqdir-sparse-kmeans/frequency.file-0
> 12/01/16 19:19:48 INFO mapred.JobClient: Running job: job_local_0006
> 12/01/16 19:19:48 INFO mapred.MapTask: io.sort.mb = 100
> 12/01/16 19:19:48 INFO mapred.MapTask: data buffer = 79691776/99614720
> 12/01/16 19:19:48 INFO mapred.MapTask: record buffer = 262144/327680
> 12/01/16 19:19:48 INFO mapred.MapTask: Starting flush of map output
> 12/01/16 19:19:48 INFO mapred.Task: Task:attempt_local_0006_m_000000_0 is 
> done. And is in the process of commiting
> 12/01/16 19:19:49 INFO mapred.JobClient:  map 0% reduce 0%
> 12/01/16 19:19:51 INFO mapred.LocalJobRunner:
> 12/01/16 19:19:51 INFO mapred.Task: Task 'attempt_local_0006_m_000000_0' done.
> 12/01/16 19:19:51 INFO mapred.LocalJobRunner:
> 12/01/16 19:19:51 INFO mapred.Merger: Merging 1 sorted segments
> 12/01/16 19:19:51 INFO mapred.Merger: Down to the last merge-pass, with 0 
> segments left of total size: 0 bytes
> 12/01/16 19:19:51 INFO mapred.LocalJobRunner:
> 12/01/16 19:19:51 INFO mapred.JobClient:  map 100% reduce 0%
> 12/01/16 19:19:51 INFO mapred.Task: Task:attempt_local_0006_r_000000_0 is 
> done. And is in the process of commiting
> 12/01/16 19:19:51 INFO mapred.LocalJobRunner:
> 12/01/16 19:19:51 INFO mapred.Task: Task attempt_local_0006_r_000000_0 is 
> allowed to commit now
> 12/01/16 19:19:51 INFO output.FileOutputCommitter: Saved output of task 
> 'attempt_local_0006_r_000000_0' to 
> /tmp/mahout-work-hudson/reuters-out-seqdir-sparse-kmeans/partial-vectors-0
> 12/01/16 19:19:54 INFO mapred.LocalJobRunner: reduce > reduce
> 12/01/16 19:19:54 INFO mapred.Task: Task 'attempt_local_0006_r_000000_0' done.
> 12/01/16 19:19:54 INFO mapred.JobClient:  map 100% reduce 100%
> 12/01/16 19:19:54 INFO mapred.JobClient: Job complete: job_local_0006
> 12/01/16 19:19:54 INFO mapred.JobClient: Counters: 16
> 12/01/16 19:19:54 INFO mapred.JobClient:   File Output Format Counters
> 12/01/16 19:19:54 INFO mapred.JobClient:     Bytes Written=102
> 12/01/16 19:19:54 INFO mapred.JobClient:   FileSystemCounters
> 12/01/16 19:19:54 INFO mapred.JobClient:     FILE_BYTES_READ=283111125
> 12/01/16 19:19:54 INFO mapred.JobClient:     FILE_BYTES_WRITTEN=285722004
> 12/01/16 19:19:54 INFO mapred.JobClient:   File Input Format Counters
> 12/01/16 19:19:54 INFO mapred.JobClient:     Bytes Read=102
> 12/01/16 19:19:54 INFO mapred.JobClient:   Map-Reduce Framework
> 12/01/16 19:19:54 INFO mapred.JobClient:     Reduce input groups=0
> 12/01/16 19:19:54 INFO mapred.JobClient:     Map output materialized bytes=6
> 12/01/16 19:19:54 INFO mapred.JobClient:     Combine output records=0
> 12/01/16 19:19:54 INFO mapred.JobClient:     Map input records=0
> 12/01/16 19:19:54 INFO mapred.JobClient:     Reduce shuffle bytes=0
> 12/01/16 19:19:54 INFO mapred.JobClient:     Reduce output records=0
> 12/01/16 19:19:54 INFO mapred.JobClient:     Spilled Records=0
> 12/01/16 19:19:54 INFO mapred.JobClient:     Map output bytes=0
> 12/01/16 19:19:54 INFO mapred.JobClient:     Combine input records=0
> 12/01/16 19:19:54 INFO mapred.JobClient:     Map output records=0
> 12/01/16 19:19:54 INFO mapred.JobClient:     SPLIT_RAW_BYTES=150
> 12/01/16 19:19:54 INFO mapred.JobClient:     Reduce input records=0
> 12/01/16 19:19:54 INFO common.HadoopUtil: Deleting 
> /tmp/mahout-work-hudson/reuters-out-seqdir-sparse-kmeans/tfidf-vectors
> 12/01/16 19:19:54 INFO input.FileInputFormat: Total input paths to process : 1
> 12/01/16 19:19:54 INFO mapred.JobClient: Running job: job_local_0007
> 12/01/16 19:19:54 INFO mapred.MapTask: io.sort.mb = 100
> 12/01/16 19:19:55 INFO mapred.MapTask: data buffer = 79691776/99614720
> 12/01/16 19:19:55 INFO mapred.MapTask: record buffer = 262144/327680
> 12/01/16 19:19:55 INFO mapred.MapTask: Starting flush of map output
> 12/01/16 19:19:55 INFO mapred.Task: Task:attempt_local_0007_m_000000_0 is 
> done. And is in the process of commiting
> 12/01/16 19:19:55 INFO mapred.JobClient:  map 0% reduce 0%
> 12/01/16 19:19:57 INFO mapred.LocalJobRunner:
> 12/01/16 19:19:57 INFO mapred.Task: Task 'attempt_local_0007_m_000000_0' done.
> 12/01/16 19:19:57 INFO mapred.LocalJobRunner:
> 12/01/16 19:19:57 INFO mapred.Merger: Merging 1 sorted segments
> 12/01/16 19:19:57 INFO mapred.Merger: Down to the last merge-pass, with 0 
> segments left of total size: 0 bytes
> 12/01/16 19:19:57 INFO mapred.LocalJobRunner:
> 12/01/16 19:19:57 INFO mapred.Task: Task:attempt_local_0007_r_000000_0 is 
> done. And is in the process of commiting
> 12/01/16 19:19:57 INFO mapred.LocalJobRunner:
> 12/01/16 19:19:57 INFO mapred.Task: Task attempt_local_0007_r_000000_0 is 
> allowed to commit now
> 12/01/16 19:19:57 INFO output.FileOutputCommitter: Saved output of task 
> 'attempt_local_0007_r_000000_0' to 
> /tmp/mahout-work-hudson/reuters-out-seqdir-sparse-kmeans/tfidf-vectors
> 12/01/16 19:19:57 INFO mapred.JobClient:  map 100% reduce 0%
> 12/01/16 19:20:00 INFO mapred.LocalJobRunner: reduce > reduce
> 12/01/16 19:20:00 INFO mapred.Task: Task 'attempt_local_0007_r_000000_0' done.
> 12/01/16 19:20:00 INFO mapred.JobClient:  map 100% reduce 100%
> 12/01/16 19:20:00 INFO mapred.JobClient: Job complete: job_local_0007
> 12/01/16 19:20:00 INFO mapred.JobClient: Counters: 16
> 12/01/16 19:20:00 INFO mapred.JobClient:   File Output Format Counters
> 12/01/16 19:20:00 INFO mapred.JobClient:     Bytes Written=102
> 12/01/16 19:20:00 INFO mapred.JobClient:   FileSystemCounters
> 12/01/16 19:20:00 INFO mapred.JobClient:     FILE_BYTES_READ=330296256
> 12/01/16 19:20:00 INFO mapred.JobClient:     FILE_BYTES_WRITTEN=333341612
> 12/01/16 19:20:00 INFO mapred.JobClient:   File Input Format Counters
> 12/01/16 19:20:00 INFO mapred.JobClient:     Bytes Read=102
> 12/01/16 19:20:00 INFO mapred.JobClient:   Map-Reduce Framework
> 12/01/16 19:20:00 INFO mapred.JobClient:     Reduce input groups=0
> 12/01/16 19:20:00 INFO mapred.JobClient:     Map output materialized bytes=6
> 12/01/16 19:20:00 INFO mapred.JobClient:     Combine output records=0
> 12/01/16 19:20:00 INFO mapred.JobClient:     Map input records=0
> 12/01/16 19:20:00 INFO mapred.JobClient:     Reduce shuffle bytes=0
> 12/01/16 19:20:00 INFO mapred.JobClient:     Reduce output records=0
> 12/01/16 19:20:00 INFO mapred.JobClient:     Spilled Records=0
> 12/01/16 19:20:00 INFO mapred.JobClient:     Map output bytes=0
> 12/01/16 19:20:00 INFO mapred.JobClient:     Combine input records=0
> 12/01/16 19:20:00 INFO mapred.JobClient:     Map output records=0
> 12/01/16 19:20:00 INFO mapred.JobClient:     SPLIT_RAW_BYTES=157
> 12/01/16 19:20:00 INFO mapred.JobClient:     Reduce input records=0
> 12/01/16 19:20:00 INFO common.HadoopUtil: Deleting 
> /tmp/mahout-work-hudson/reuters-out-seqdir-sparse-kmeans/partial-vectors-0
> 12/01/16 19:20:00 INFO driver.MahoutDriver: Program took 42665 ms (Minutes: 
> 0.7110833333333333)
> MAHOUT_LOCAL is not set; adding HADOOP_CONF_DIR to classpath.
> no HADOOP_HOME set, running locally
> SLF4J: Class path contains multiple SLF4J bindings.
> SLF4J: Found binding in 
> [jar:file:/zonestorage/hudson_solaris/home/hudson/hudson-slave/workspace/Mahout-Examples-Cluster-Reuters/trunk/examples/target/mahout-examples-0.6-SNAPSHOT-job.jar!/org/slf4j/impl/StaticLoggerBinder.class]
> SLF4J: Found binding in 
> [jar:file:/zonestorage/hudson_solaris/home/hudson/hudson-slave/workspace/Mahout-Examples-Cluster-Reuters/trunk/examples/target/dependency/slf4j-jcl-1.6.1.jar!/org/slf4j/impl/StaticLoggerBinder.class]
> SLF4J: Found binding in 
> [jar:file:/zonestorage/hudson_solaris/home/hudson/hudson-slave/workspace/Mahout-Examples-Cluster-Reuters/trunk/examples/target/dependency/slf4j-log4j12-1.6.1.jar!/org/slf4j/impl/StaticLoggerBinder.class]
> SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an 
> explanation.
> 12/01/16 19:20:03 INFO common.AbstractJob: Command line arguments: 
> {--clustering=null, 
> --clusters=/tmp/mahout-work-hudson/reuters-kmeans-clusters, 
> --convergenceDelta=0.5, 
> --distanceMeasure=org.apache.mahout.common.distance.CosineDistanceMeasure, 
> --endPhase=2147483647, 
> --input=/tmp/mahout-work-hudson/reuters-out-seqdir-sparse-kmeans/tfidf-vectors/,
>  --maxIter=10, --method=mapreduce, --numClusters=20, 
> --output=/tmp/mahout-work-hudson/reuters-kmeans, --overwrite=null, 
> --startPhase=0, --tempDir=temp}
> 12/01/16 19:20:03 INFO common.HadoopUtil: Deleting 
> /tmp/mahout-work-hudson/reuters-kmeans
> 12/01/16 19:20:03 INFO common.HadoopUtil: Deleting 
> /tmp/mahout-work-hudson/reuters-kmeans-clusters
> 12/01/16 19:20:04 WARN util.NativeCodeLoader: Unable to load native-hadoop 
> library for your platform... using builtin-java classes where applicable
> 12/01/16 19:20:04 INFO compress.CodecPool: Got brand-new compressor
> 12/01/16 19:20:04 INFO kmeans.RandomSeedGenerator: Wrote 20 vectors to 
> /tmp/mahout-work-hudson/reuters-kmeans-clusters/part-randomSeed
> 12/01/16 19:20:04 INFO kmeans.KMeansDriver: Input: 
> /tmp/mahout-work-hudson/reuters-out-seqdir-sparse-kmeans/tfidf-vectors 
> Clusters In: /tmp/mahout-work-hudson/reuters-kmeans-clusters/part-randomSeed 
> Out: /tmp/mahout-work-hudson/reuters-kmeans Distance: 
> org.apache.mahout.common.distance.CosineDistanceMeasure
> 12/01/16 19:20:04 INFO kmeans.KMeansDriver: convergence: 0.5 max Iterations: 
> 10 num Reduce Tasks: org.apache.mahout.math.VectorWritable Input Vectors: {}
> 12/01/16 19:20:04 INFO kmeans.KMeansDriver: K-Means Iteration 1
> 12/01/16 19:20:04 INFO input.FileInputFormat: Total input paths to process : 1
> 12/01/16 19:20:04 INFO mapred.JobClient: Running job: job_local_0001
> 12/01/16 19:20:04 INFO mapred.MapTask: io.sort.mb = 100
> 12/01/16 19:20:05 INFO mapred.MapTask: data buffer = 79691776/99614720
> 12/01/16 19:20:05 INFO mapred.MapTask: record buffer = 262144/327680
> 12/01/16 19:20:05 INFO compress.CodecPool: Got brand-new decompressor
> 12/01/16 19:20:05 WARN mapred.LocalJobRunner: job_local_0001
> java.lang.IllegalStateException: No clusters found. Check your -c path.
>       at 
> org.apache.mahout.clustering.kmeans.KMeansMapper.setup(KMeansMapper.java:59)
>       at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:142)
>       at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:764)
>       at org.apache.hadoop.mapred.MapTask.run(MapTask.java:370)
>       at 
> org.apache.hadoop.mapred.LocalJobRunner$Job.run(LocalJobRunner.java:212)
> 12/01/16 19:20:05 INFO mapred.JobClient:  map 0% reduce 0%
> 12/01/16 19:20:05 INFO mapred.JobClient: Job complete: job_local_0001
> 12/01/16 19:20:05 INFO mapred.JobClient: Counters: 0
> Exception in thread "main" java.lang.InterruptedException: K-Means Iteration 
> failed processing 
> /tmp/mahout-work-hudson/reuters-kmeans-clusters/part-randomSeed
>       at 
> org.apache.mahout.clustering.kmeans.KMeansDriver.runIteration(KMeansDriver.java:371)
>       at 
> org.apache.mahout.clustering.kmeans.KMeansDriver.buildClustersMR(KMeansDriver.java:316)
>       at 
> org.apache.mahout.clustering.kmeans.KMeansDriver.buildClusters(KMeansDriver.java:239)
>       at 
> org.apache.mahout.clustering.kmeans.KMeansDriver.run(KMeansDriver.java:154)
>       at 
> org.apache.mahout.clustering.kmeans.KMeansDriver.run(KMeansDriver.java:112)
>       at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
>       at 
> org.apache.mahout.clustering.kmeans.KMeansDriver.main(KMeansDriver.java:61)
>       at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>       at 
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
>       at 
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
>       at java.lang.reflect.Method.invoke(Method.java:597)
>       at 
> org.apache.hadoop.util.ProgramDriver$ProgramDescription.invoke(ProgramDriver.java:68)
>       at org.apache.hadoop.util.ProgramDriver.driver(ProgramDriver.java:139)
>       at org.apache.mahout.driver.MahoutDriver.main(MahoutDriver.java:188)
> Build step 'Execute shell' marked build as failure
