See <https://builds.apache.org/job/Mahout-Examples-Cluster-Reuters-II/6/changes>
Changes:

[srowen] More of my suggested refinements

[gsingers] MAHOUT-899: Add some more cluster dumping options like sampling, coloring

------------------------------------------
[...truncated 7306 lines...]
12/01/09 19:11:52 INFO mapred.Merger: Down to the last merge-pass, with 0 segments left of total size: 0 bytes
12/01/09 19:11:52 INFO mapred.LocalJobRunner: 
12/01/09 19:11:52 INFO mapred.JobClient: map 100% reduce 0%
12/01/09 19:11:52 INFO mapred.Task: Task:attempt_local_0003_r_000000_0 is done. And is in the process of commiting
12/01/09 19:11:52 INFO mapred.LocalJobRunner: 
12/01/09 19:11:52 INFO mapred.Task: Task attempt_local_0003_r_000000_0 is allowed to commit now
12/01/09 19:11:52 INFO output.FileOutputCommitter: Saved output of task 'attempt_local_0003_r_000000_0' to /tmp/mahout-work-hudson/reuters-out-seqdir-sparse-minhash/partial-vectors-0
12/01/09 19:11:55 INFO mapred.LocalJobRunner: reduce > reduce
12/01/09 19:11:55 INFO mapred.Task: Task 'attempt_local_0003_r_000000_0' done.
12/01/09 19:11:55 INFO mapred.JobClient: map 100% reduce 100%
12/01/09 19:11:55 INFO mapred.JobClient: Job complete: job_local_0003
12/01/09 19:11:55 INFO mapred.JobClient: Counters: 16
12/01/09 19:11:55 INFO mapred.JobClient: File Output Format Counters
12/01/09 19:11:55 INFO mapred.JobClient: Bytes Written=102
12/01/09 19:11:55 INFO mapred.JobClient: FileSystemCounters
12/01/09 19:11:55 INFO mapred.JobClient: FILE_BYTES_READ=141391327
12/01/09 19:11:55 INFO mapred.JobClient: FILE_BYTES_WRITTEN=142695080
12/01/09 19:11:55 INFO mapred.JobClient: File Input Format Counters
12/01/09 19:11:55 INFO mapred.JobClient: Bytes Read=101
12/01/09 19:11:55 INFO mapred.JobClient: Map-Reduce Framework
12/01/09 19:11:55 INFO mapred.JobClient: Reduce input groups=0
12/01/09 19:11:55 INFO mapred.JobClient: Map output materialized bytes=6
12/01/09 19:11:55 INFO mapred.JobClient: Combine output records=0
12/01/09 19:11:55 INFO mapred.JobClient: Map input records=0
12/01/09 19:11:55 INFO mapred.JobClient: Reduce shuffle bytes=0
12/01/09 19:11:55 INFO mapred.JobClient: Reduce output records=0
12/01/09 19:11:55 INFO mapred.JobClient: Spilled Records=0
12/01/09 19:11:55 INFO mapred.JobClient: Map output bytes=0
12/01/09 19:11:55 INFO mapred.JobClient: Combine input records=0
12/01/09 19:11:55 INFO mapred.JobClient: Map output records=0
12/01/09 19:11:55 INFO mapred.JobClient: SPLIT_RAW_BYTES=160
12/01/09 19:11:55 INFO mapred.JobClient: Reduce input records=0
12/01/09 19:11:55 INFO common.HadoopUtil: Deleting /tmp/mahout-work-hudson/reuters-out-seqdir-sparse-minhash/tf-vectors
12/01/09 19:11:55 INFO input.FileInputFormat: Total input paths to process : 1
12/01/09 19:11:55 INFO mapred.JobClient: Running job: job_local_0004
12/01/09 19:11:55 INFO mapred.MapTask: io.sort.mb = 100
12/01/09 19:11:56 INFO mapred.MapTask: data buffer = 79691776/99614720
12/01/09 19:11:56 INFO mapred.MapTask: record buffer = 262144/327680
12/01/09 19:11:56 INFO mapred.MapTask: Starting flush of map output
12/01/09 19:11:56 INFO mapred.Task: Task:attempt_local_0004_m_000000_0 is done. And is in the process of commiting
12/01/09 19:11:56 INFO mapred.JobClient: map 0% reduce 0%
12/01/09 19:11:58 INFO mapred.LocalJobRunner: 
12/01/09 19:11:58 INFO mapred.Task: Task 'attempt_local_0004_m_000000_0' done.
12/01/09 19:11:58 INFO mapred.LocalJobRunner: 
12/01/09 19:11:58 INFO mapred.Merger: Merging 1 sorted segments
12/01/09 19:11:58 INFO mapred.Merger: Down to the last merge-pass, with 0 segments left of total size: 0 bytes
12/01/09 19:11:58 INFO mapred.LocalJobRunner: 
12/01/09 19:11:58 INFO mapred.Task: Task:attempt_local_0004_r_000000_0 is done. And is in the process of commiting
12/01/09 19:11:58 INFO mapred.LocalJobRunner: 
12/01/09 19:11:58 INFO mapred.Task: Task attempt_local_0004_r_000000_0 is allowed to commit now
12/01/09 19:11:58 INFO output.FileOutputCommitter: Saved output of task 'attempt_local_0004_r_000000_0' to /tmp/mahout-work-hudson/reuters-out-seqdir-sparse-minhash/tf-vectors
12/01/09 19:11:58 INFO mapred.JobClient: map 100% reduce 0%
12/01/09 19:12:01 INFO mapred.LocalJobRunner: reduce > reduce
12/01/09 19:12:01 INFO mapred.Task: Task 'attempt_local_0004_r_000000_0' done.
12/01/09 19:12:01 INFO mapred.JobClient: map 100% reduce 100%
12/01/09 19:12:01 INFO mapred.JobClient: Job complete: job_local_0004
12/01/09 19:12:01 INFO mapred.JobClient: Counters: 16
12/01/09 19:12:01 INFO mapred.JobClient: File Output Format Counters
12/01/09 19:12:01 INFO mapred.JobClient: Bytes Written=102
12/01/09 19:12:01 INFO mapred.JobClient: FileSystemCounters
12/01/09 19:12:01 INFO mapred.JobClient: FILE_BYTES_READ=188521742
12/01/09 19:12:01 INFO mapred.JobClient: FILE_BYTES_WRITTEN=190259588
12/01/09 19:12:01 INFO mapred.JobClient: File Input Format Counters
12/01/09 19:12:01 INFO mapred.JobClient: Bytes Read=102
12/01/09 19:12:01 INFO mapred.JobClient: Map-Reduce Framework
12/01/09 19:12:01 INFO mapred.JobClient: Reduce input groups=0
12/01/09 19:12:01 INFO mapred.JobClient: Map output materialized bytes=6
12/01/09 19:12:01 INFO mapred.JobClient: Combine output records=0
12/01/09 19:12:01 INFO mapred.JobClient: Map input records=0
12/01/09 19:12:01 INFO mapred.JobClient: Reduce shuffle bytes=0
12/01/09 19:12:01 INFO mapred.JobClient: Reduce output records=0
12/01/09 19:12:01 INFO mapred.JobClient: Spilled Records=0
12/01/09 19:12:01 INFO mapred.JobClient: Map output bytes=0
12/01/09 19:12:01 INFO mapred.JobClient: Combine input records=0
12/01/09 19:12:01 INFO mapred.JobClient: Map output records=0
12/01/09 19:12:01 INFO mapred.JobClient: SPLIT_RAW_BYTES=158
12/01/09 19:12:01 INFO mapred.JobClient: Reduce input records=0
12/01/09 19:12:01 INFO common.HadoopUtil: Deleting /tmp/mahout-work-hudson/reuters-out-seqdir-sparse-minhash/partial-vectors-0
12/01/09 19:12:01 INFO common.HadoopUtil: Deleting /tmp/mahout-work-hudson/reuters-out-seqdir-sparse-minhash/df-count
12/01/09 19:12:05 INFO input.FileInputFormat: Total input paths to process : 1
12/01/09 19:12:05 INFO mapred.JobClient: Running job: job_local_0005
12/01/09 19:12:05 INFO mapred.MapTask: io.sort.mb = 100
12/01/09 19:12:05 INFO mapred.MapTask: data buffer = 79691776/99614720
12/01/09 19:12:05 INFO mapred.MapTask: record buffer = 262144/327680
12/01/09 19:12:05 INFO mapred.MapTask: Starting flush of map output
12/01/09 19:12:05 INFO mapred.Task: Task:attempt_local_0005_m_000000_0 is done. And is in the process of commiting
12/01/09 19:12:06 INFO mapred.JobClient: map 0% reduce 0%
12/01/09 19:12:08 INFO mapred.LocalJobRunner: 
12/01/09 19:12:08 INFO mapred.Task: Task 'attempt_local_0005_m_000000_0' done.
12/01/09 19:12:08 INFO mapred.LocalJobRunner: 
12/01/09 19:12:08 INFO mapred.Merger: Merging 1 sorted segments
12/01/09 19:12:08 INFO mapred.Merger: Down to the last merge-pass, with 0 segments left of total size: 0 bytes
12/01/09 19:12:08 INFO mapred.LocalJobRunner: 
12/01/09 19:12:08 INFO mapred.Task: Task:attempt_local_0005_r_000000_0 is done. And is in the process of commiting
12/01/09 19:12:08 INFO mapred.LocalJobRunner: 
12/01/09 19:12:08 INFO mapred.Task: Task attempt_local_0005_r_000000_0 is allowed to commit now
12/01/09 19:12:08 INFO output.FileOutputCommitter: Saved output of task 'attempt_local_0005_r_000000_0' to /tmp/mahout-work-hudson/reuters-out-seqdir-sparse-minhash/df-count
12/01/09 19:12:08 INFO mapred.JobClient: map 100% reduce 0%
12/01/09 19:12:11 INFO mapred.LocalJobRunner: reduce > reduce
12/01/09 19:12:11 INFO mapred.Task: Task 'attempt_local_0005_r_000000_0' done.
12/01/09 19:12:11 INFO mapred.JobClient: map 100% reduce 100%
12/01/09 19:12:11 INFO mapred.JobClient: Job complete: job_local_0005
12/01/09 19:12:11 INFO mapred.JobClient: Counters: 16
12/01/09 19:12:11 INFO mapred.JobClient: File Output Format Counters
12/01/09 19:12:11 INFO mapred.JobClient: Bytes Written=105
12/01/09 19:12:11 INFO mapred.JobClient: FileSystemCounters
12/01/09 19:12:11 INFO mapred.JobClient: FILE_BYTES_READ=235652046
12/01/09 19:12:11 INFO mapred.JobClient: FILE_BYTES_WRITTEN=237823573
12/01/09 19:12:11 INFO mapred.JobClient: File Input Format Counters
12/01/09 19:12:11 INFO mapred.JobClient: Bytes Read=102
12/01/09 19:12:11 INFO mapred.JobClient: Map-Reduce Framework
12/01/09 19:12:11 INFO mapred.JobClient: Reduce input groups=0
12/01/09 19:12:11 INFO mapred.JobClient: Map output materialized bytes=6
12/01/09 19:12:11 INFO mapred.JobClient: Combine output records=0
12/01/09 19:12:11 INFO mapred.JobClient: Map input records=0
12/01/09 19:12:11 INFO mapred.JobClient: Reduce shuffle bytes=0
12/01/09 19:12:11 INFO mapred.JobClient: Reduce output records=0
12/01/09 19:12:11 INFO mapred.JobClient: Spilled Records=0
12/01/09 19:12:11 INFO mapred.JobClient: Map output bytes=0
12/01/09 19:12:11 INFO mapred.JobClient: Combine input records=0
12/01/09 19:12:11 INFO mapred.JobClient: Map output records=0
12/01/09 19:12:11 INFO mapred.JobClient: SPLIT_RAW_BYTES=151
12/01/09 19:12:11 INFO mapred.JobClient: Reduce input records=0
12/01/09 19:12:13 INFO input.FileInputFormat: Total input paths to process : 1
12/01/09 19:12:13 INFO filecache.TrackerDistributedCacheManager: Creating frequency.file-0 in /tmp/hadoop-hudson/mapred/local/archive/-4691208342283759583_-1253471606_1138921801/file/tmp/mahout-work-hudson/reuters-out-seqdir-sparse-minhash-work-3926503770698727468 with rwxr-xr-x
12/01/09 19:12:13 INFO filecache.TrackerDistributedCacheManager: Cached /tmp/mahout-work-hudson/reuters-out-seqdir-sparse-minhash/frequency.file-0 as /tmp/hadoop-hudson/mapred/local/archive/-4691208342283759583_-1253471606_1138921801/file/tmp/mahout-work-hudson/reuters-out-seqdir-sparse-minhash/frequency.file-0
12/01/09 19:12:13 INFO filecache.TrackerDistributedCacheManager: Cached /tmp/mahout-work-hudson/reuters-out-seqdir-sparse-minhash/frequency.file-0 as /tmp/hadoop-hudson/mapred/local/archive/-4691208342283759583_-1253471606_1138921801/file/tmp/mahout-work-hudson/reuters-out-seqdir-sparse-minhash/frequency.file-0
12/01/09 19:12:13 INFO mapred.JobClient: Running job: job_local_0006
12/01/09 19:12:13 INFO mapred.MapTask: io.sort.mb = 100
12/01/09 19:12:14 INFO mapred.MapTask: data buffer = 79691776/99614720
12/01/09 19:12:14 INFO mapred.MapTask: record buffer = 262144/327680
12/01/09 19:12:14 INFO mapred.MapTask: Starting flush of map output
12/01/09 19:12:14 INFO mapred.Task: Task:attempt_local_0006_m_000000_0 is done. And is in the process of commiting
12/01/09 19:12:14 INFO mapred.JobClient: map 0% reduce 0%
12/01/09 19:12:16 INFO mapred.LocalJobRunner: 
12/01/09 19:12:16 INFO mapred.Task: Task 'attempt_local_0006_m_000000_0' done.
12/01/09 19:12:16 INFO mapred.LocalJobRunner: 
12/01/09 19:12:16 INFO mapred.Merger: Merging 1 sorted segments
12/01/09 19:12:16 INFO mapred.Merger: Down to the last merge-pass, with 0 segments left of total size: 0 bytes
12/01/09 19:12:16 INFO mapred.LocalJobRunner: 
12/01/09 19:12:16 INFO mapred.JobClient: map 100% reduce 0%
12/01/09 19:12:16 INFO mapred.Task: Task:attempt_local_0006_r_000000_0 is done. And is in the process of commiting
12/01/09 19:12:16 INFO mapred.LocalJobRunner: 
12/01/09 19:12:16 INFO mapred.Task: Task attempt_local_0006_r_000000_0 is allowed to commit now
12/01/09 19:12:16 INFO output.FileOutputCommitter: Saved output of task 'attempt_local_0006_r_000000_0' to /tmp/mahout-work-hudson/reuters-out-seqdir-sparse-minhash/partial-vectors-0
12/01/09 19:12:19 INFO mapred.LocalJobRunner: reduce > reduce
12/01/09 19:12:19 INFO mapred.Task: Task 'attempt_local_0006_r_000000_0' done.
12/01/09 19:12:19 INFO mapred.JobClient: map 100% reduce 100%
12/01/09 19:12:19 INFO mapred.JobClient: Job complete: job_local_0006
12/01/09 19:12:19 INFO mapred.JobClient: Counters: 16
12/01/09 19:12:19 INFO mapred.JobClient: File Output Format Counters
12/01/09 19:12:19 INFO mapred.JobClient: Bytes Written=102
12/01/09 19:12:19 INFO mapred.JobClient: FileSystemCounters
12/01/09 19:12:19 INFO mapred.JobClient: FILE_BYTES_READ=282782875
12/01/09 19:12:19 INFO mapred.JobClient: FILE_BYTES_WRITTEN=285391416
12/01/09 19:12:19 INFO mapred.JobClient: File Input Format Counters
12/01/09 19:12:19 INFO mapred.JobClient: Bytes Read=102
12/01/09 19:12:19 INFO mapred.JobClient: Map-Reduce Framework
12/01/09 19:12:19 INFO mapred.JobClient: Reduce input groups=0
12/01/09 19:12:19 INFO mapred.JobClient: Map output materialized bytes=6
12/01/09 19:12:19 INFO mapred.JobClient: Combine output records=0
12/01/09 19:12:19 INFO mapred.JobClient: Map input records=0
12/01/09 19:12:19 INFO mapred.JobClient: Reduce shuffle bytes=0
12/01/09 19:12:19 INFO mapred.JobClient: Reduce output records=0
12/01/09 19:12:19 INFO mapred.JobClient: Spilled Records=0
12/01/09 19:12:19 INFO mapred.JobClient: Map output bytes=0
12/01/09 19:12:19 INFO mapred.JobClient: Combine input records=0
12/01/09 19:12:19 INFO mapred.JobClient: Map output records=0
12/01/09 19:12:19 INFO mapred.JobClient: SPLIT_RAW_BYTES=151
12/01/09 19:12:19 INFO mapred.JobClient: Reduce input records=0
12/01/09 19:12:19 INFO common.HadoopUtil: Deleting /tmp/mahout-work-hudson/reuters-out-seqdir-sparse-minhash/tfidf-vectors
12/01/09 19:12:19 INFO input.FileInputFormat: Total input paths to process : 1
12/01/09 19:12:19 INFO mapred.JobClient: Running job: job_local_0007
12/01/09 19:12:19 INFO mapred.MapTask: io.sort.mb = 100
12/01/09 19:12:20 INFO mapred.MapTask: data buffer = 79691776/99614720
12/01/09 19:12:20 INFO mapred.MapTask: record buffer = 262144/327680
12/01/09 19:12:20 INFO mapred.MapTask: Starting flush of map output
12/01/09 19:12:20 INFO mapred.Task: Task:attempt_local_0007_m_000000_0 is done. And is in the process of commiting
12/01/09 19:12:20 INFO mapred.JobClient: map 0% reduce 0%
12/01/09 19:12:22 INFO mapred.LocalJobRunner: 
12/01/09 19:12:22 INFO mapred.Task: Task 'attempt_local_0007_m_000000_0' done.
12/01/09 19:12:22 INFO mapred.LocalJobRunner: 
12/01/09 19:12:22 INFO mapred.Merger: Merging 1 sorted segments
12/01/09 19:12:22 INFO mapred.Merger: Down to the last merge-pass, with 0 segments left of total size: 0 bytes
12/01/09 19:12:22 INFO mapred.LocalJobRunner: 
12/01/09 19:12:22 INFO mapred.Task: Task:attempt_local_0007_r_000000_0 is done. And is in the process of commiting
12/01/09 19:12:22 INFO mapred.LocalJobRunner: 
12/01/09 19:12:22 INFO mapred.Task: Task attempt_local_0007_r_000000_0 is allowed to commit now
12/01/09 19:12:22 INFO output.FileOutputCommitter: Saved output of task 'attempt_local_0007_r_000000_0' to /tmp/mahout-work-hudson/reuters-out-seqdir-sparse-minhash/tfidf-vectors
12/01/09 19:12:22 INFO mapred.JobClient: map 100% reduce 0%
12/01/09 19:12:25 INFO mapred.LocalJobRunner: reduce > reduce
12/01/09 19:12:25 INFO mapred.Task: Task 'attempt_local_0007_r_000000_0' done.
12/01/09 19:12:25 INFO mapred.JobClient: map 100% reduce 100%
12/01/09 19:12:25 INFO mapred.JobClient: Job complete: job_local_0007
12/01/09 19:12:25 INFO mapred.JobClient: Counters: 16
12/01/09 19:12:25 INFO mapred.JobClient: File Output Format Counters
12/01/09 19:12:25 INFO mapred.JobClient: Bytes Written=102
12/01/09 19:12:25 INFO mapred.JobClient: FileSystemCounters
12/01/09 19:12:25 INFO mapred.JobClient: FILE_BYTES_READ=329913298
12/01/09 19:12:25 INFO mapred.JobClient: FILE_BYTES_WRITTEN=332955936
12/01/09 19:12:25 INFO mapred.JobClient: File Input Format Counters
12/01/09 19:12:25 INFO mapred.JobClient: Bytes Read=102
12/01/09 19:12:25 INFO mapred.JobClient: Map-Reduce Framework
12/01/09 19:12:25 INFO mapred.JobClient: Reduce input groups=0
12/01/09 19:12:25 INFO mapred.JobClient: Map output materialized bytes=6
12/01/09 19:12:25 INFO mapred.JobClient: Combine output records=0
12/01/09 19:12:25 INFO mapred.JobClient: Map input records=0
12/01/09 19:12:25 INFO mapred.JobClient: Reduce shuffle bytes=0
12/01/09 19:12:25 INFO mapred.JobClient: Reduce output records=0
12/01/09 19:12:25 INFO mapred.JobClient: Spilled Records=0
12/01/09 19:12:25 INFO mapred.JobClient: Map output bytes=0
12/01/09 19:12:25 INFO mapred.JobClient: Combine input records=0
12/01/09 19:12:25 INFO mapred.JobClient: Map output records=0
12/01/09 19:12:25 INFO mapred.JobClient: SPLIT_RAW_BYTES=158
12/01/09 19:12:25 INFO mapred.JobClient: Reduce input records=0
12/01/09 19:12:25 INFO common.HadoopUtil: Deleting /tmp/mahout-work-hudson/reuters-out-seqdir-sparse-minhash/partial-vectors-0
12/01/09 19:12:25 INFO driver.MahoutDriver: Program took 50512 ms (Minutes: 0.8418666666666667)
MAHOUT_LOCAL is not set; adding HADOOP_CONF_DIR to classpath.
no HADOOP_HOME set, running locally
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/zonestorage/hudson_solaris/home/hudson/hudson-slave/workspace/Mahout-Examples-Cluster-Reuters-II/trunk/examples/target/mahout-examples-0.6-SNAPSHOT-job.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/zonestorage/hudson_solaris/home/hudson/hudson-slave/workspace/Mahout-Examples-Cluster-Reuters-II/trunk/examples/target/dependency/slf4j-jcl-1.6.1.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/zonestorage/hudson_solaris/home/hudson/hudson-slave/workspace/Mahout-Examples-Cluster-Reuters-II/trunk/examples/target/dependency/slf4j-log4j12-1.6.1.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
12/01/09 19:12:29 WARN driver.MahoutDriver: No org.apache.mahout.clustering.minhash.MinHashDriver.props found on classpath, will use command-line arguments only
12/01/09 19:12:30 INFO common.AbstractJob: Command line arguments: {--endPhase=2147483647, --hashType=murmur, --input=/tmp/mahout-work-hudson/reuters-out-seqdir-sparse-minhash/tfidf-vectors, --keyGroups=2, --minClusterSize=10, --minVectorSize=5, --numHashFunctions=10, --numReducers=2, --output=/tmp/mahout-work-hudson/reuters-minhash, --startPhase=0, --tempDir=temp}
12/01/09 19:12:31 INFO mapred.JobClient: Cleaning up the staging area file:/tmp/hadoop-hudson/mapred/staging/hudson916254904/.staging/job_local_0001
Exception in thread "main" org.apache.hadoop.mapred.FileAlreadyExistsException: Output directory /tmp/mahout-work-hudson/reuters-minhash already exists
	at org.apache.hadoop.mapreduce.lib.output.FileOutputFormat.checkOutputSpecs(FileOutputFormat.java:134)
	at org.apache.hadoop.mapred.JobClient$2.run(JobClient.java:846)
	at org.apache.hadoop.mapred.JobClient$2.run(JobClient.java:807)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Subject.java:396)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1059)
	at org.apache.hadoop.mapred.JobClient.submitJobInternal(JobClient.java:807)
	at org.apache.hadoop.mapreduce.Job.submit(Job.java:465)
	at org.apache.hadoop.mapreduce.Job.waitForCompletion(Job.java:495)
	at org.apache.mahout.clustering.minhash.MinHashDriver.runJob(MinHashDriver.java:86)
	at org.apache.mahout.clustering.minhash.MinHashDriver.run(MinHashDriver.java:115)
	at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
	at org.apache.mahout.clustering.minhash.MinHashDriver.main(MinHashDriver.java:41)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
	at java.lang.reflect.Method.invoke(Method.java:597)
	at org.apache.hadoop.util.ProgramDriver$ProgramDescription.invoke(ProgramDriver.java:68)
	at org.apache.hadoop.util.ProgramDriver.driver(ProgramDriver.java:139)
	at org.apache.mahout.driver.MahoutDriver.main(MahoutDriver.java:188)
Build step 'Execute shell' marked build as failure
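Note on the failure: the vectorization jobs (job_local_0003 through job_local_0007) all complete; the build dies when the MinHash step tries to submit job_local_0001 and FileOutputFormat finds that the configured output directory, /tmp/mahout-work-hudson/reuters-minhash, is left over from an earlier run on this slave. A minimal shell sketch of a pre-step cleanup that would avoid this follows; the WORK_DIR variable and the bin/mahout invocation are illustrative assumptions, while the paths and option values are the ones the driver logged above.

    # Hypothetical cleanup before re-running the MinHash step (not part of the build script).
    WORK_DIR=/tmp/mahout-work-hudson

    # Remove the stale output directory so FileOutputFormat's existence check passes.
    rm -rf "$WORK_DIR/reuters-minhash"

    # Re-run the step with the same arguments the driver logged.
    ./bin/mahout minhash \
      --input "$WORK_DIR/reuters-out-seqdir-sparse-minhash/tfidf-vectors" \
      --output "$WORK_DIR/reuters-minhash" \
      --hashType murmur \
      --numHashFunctions 10 \
      --keyGroups 2 \
      --minClusterSize 10 \
      --minVectorSize 5 \
      --numReducers 2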
