See <https://builds.apache.org/job/Mahout-Examples-Cluster-Reuters-II/833/>
------------------------------------------
[...truncated 3068 lines...]
14/05/15 18:32:48 INFO mapred.JobClient: Map input records=0
14/05/15 18:32:48 INFO mapred.JobClient: Reduce shuffle bytes=0
14/05/15 18:32:48 INFO mapred.JobClient: Spilled Records=0
14/05/15 18:32:48 INFO mapred.JobClient: Map output bytes=0
14/05/15 18:32:48 INFO mapred.JobClient: Total committed heap usage (bytes)=736624640
14/05/15 18:32:48 INFO mapred.JobClient: Combine input records=0
14/05/15 18:32:48 INFO mapred.JobClient: SPLIT_RAW_BYTES=166
14/05/15 18:32:48 INFO mapred.JobClient: Reduce input records=0
14/05/15 18:32:48 INFO mapred.JobClient: Reduce input groups=0
14/05/15 18:32:48 INFO mapred.JobClient: Combine output records=0
14/05/15 18:32:48 INFO mapred.JobClient: Reduce output records=0
14/05/15 18:32:48 INFO mapred.JobClient: Map output records=0
14/05/15 18:32:48 INFO common.HadoopUtil: Deleting /tmp/mahout-work-hudson/reuters-out-seqdir-sparse-streamingkmeans/partial-vectors-0
14/05/15 18:32:48 INFO vectorizer.SparseVectorsFromSequenceFiles: Calculating IDF
14/05/15 18:32:49 INFO input.FileInputFormat: Total input paths to process : 1
14/05/15 18:32:49 INFO mapred.JobClient: Running job: job_local1584740539_0005
14/05/15 18:32:49 INFO mapred.LocalJobRunner: Waiting for map tasks
14/05/15 18:32:49 INFO mapred.LocalJobRunner: Starting task: attempt_local1584740539_0005_m_000000_0
14/05/15 18:32:49 INFO mapred.Task: Using ResourceCalculatorPlugin : null
14/05/15 18:32:49 INFO mapred.MapTask: Processing split: file:/tmp/mahout-work-hudson/reuters-out-seqdir-sparse-streamingkmeans/tf-vectors-toprune/part-r-00000:0+90
14/05/15 18:32:49 INFO mapred.MapTask: io.sort.mb = 100
14/05/15 18:32:49 INFO mapred.MapTask: data buffer = 79691776/99614720
14/05/15 18:32:49 INFO mapred.MapTask: record buffer = 262144/327680
14/05/15 18:32:49 INFO mapred.MapTask: Starting flush of map output
14/05/15 18:32:49 INFO mapred.Task: Task:attempt_local1584740539_0005_m_000000_0 is done. And is in the process of commiting
14/05/15 18:32:49 INFO mapred.LocalJobRunner:
14/05/15 18:32:49 INFO mapred.Task: Task 'attempt_local1584740539_0005_m_000000_0' done.
14/05/15 18:32:49 INFO mapred.LocalJobRunner: Finishing task: attempt_local1584740539_0005_m_000000_0
14/05/15 18:32:49 INFO mapred.LocalJobRunner: Map task executor complete.
14/05/15 18:32:49 INFO mapred.Task: Using ResourceCalculatorPlugin : null
14/05/15 18:32:49 INFO mapred.LocalJobRunner:
14/05/15 18:32:49 INFO mapred.Merger: Merging 1 sorted segments
14/05/15 18:32:49 INFO mapred.Merger: Down to the last merge-pass, with 0 segments left of total size: 0 bytes
14/05/15 18:32:49 INFO mapred.LocalJobRunner:
14/05/15 18:32:49 INFO mapred.Task: Task:attempt_local1584740539_0005_r_000000_0 is done. And is in the process of commiting
14/05/15 18:32:49 INFO mapred.LocalJobRunner:
14/05/15 18:32:49 INFO mapred.Task: Task attempt_local1584740539_0005_r_000000_0 is allowed to commit now
14/05/15 18:32:49 INFO output.FileOutputCommitter: Saved output of task 'attempt_local1584740539_0005_r_000000_0' to /tmp/mahout-work-hudson/reuters-out-seqdir-sparse-streamingkmeans/df-count
14/05/15 18:32:49 INFO mapred.LocalJobRunner: reduce > reduce
14/05/15 18:32:49 INFO mapred.Task: Task 'attempt_local1584740539_0005_r_000000_0' done.
14/05/15 18:32:50 INFO mapred.JobClient: map 0% reduce 100%
14/05/15 18:32:50 INFO mapred.JobClient: Job complete: job_local1584740539_0005
14/05/15 18:32:50 INFO mapred.JobClient: Counters: 17
14/05/15 18:32:50 INFO mapred.JobClient: File Output Format Counters
14/05/15 18:32:50 INFO mapred.JobClient: Bytes Written=105
14/05/15 18:32:50 INFO mapred.JobClient: File Input Format Counters
14/05/15 18:32:50 INFO mapred.JobClient: Bytes Read=102
14/05/15 18:32:50 INFO mapred.JobClient: FileSystemCounters
14/05/15 18:32:50 INFO mapred.JobClient: FILE_BYTES_READ=252065916
14/05/15 18:32:50 INFO mapred.JobClient: FILE_BYTES_WRITTEN=254568597
14/05/15 18:32:50 INFO mapred.JobClient: Map-Reduce Framework
14/05/15 18:32:50 INFO mapred.JobClient: Map output materialized bytes=6
14/05/15 18:32:50 INFO mapred.JobClient: Map input records=0
14/05/15 18:32:50 INFO mapred.JobClient: Reduce shuffle bytes=0
14/05/15 18:32:50 INFO mapred.JobClient: Spilled Records=0
14/05/15 18:32:50 INFO mapred.JobClient: Map output bytes=0
14/05/15 18:32:50 INFO mapred.JobClient: Total committed heap usage (bytes)=937164800
14/05/15 18:32:50 INFO mapred.JobClient: Combine input records=0
14/05/15 18:32:50 INFO mapred.JobClient: SPLIT_RAW_BYTES=167
14/05/15 18:32:50 INFO mapred.JobClient: Reduce input records=0
14/05/15 18:32:50 INFO mapred.JobClient: Reduce input groups=0
14/05/15 18:32:50 INFO mapred.JobClient: Combine output records=0
14/05/15 18:32:50 INFO mapred.JobClient: Reduce output records=0
14/05/15 18:32:50 INFO mapred.JobClient: Map output records=0
14/05/15 18:32:50 INFO vectorizer.SparseVectorsFromSequenceFiles: Pruning
14/05/15 18:32:50 INFO input.FileInputFormat: Total input paths to process : 1
14/05/15 18:32:50 INFO filecache.TrackerDistributedCacheManager: Creating frequency.file-0 in /tmp/hadoop-hudson/mapred/local/archive/1282951247135825305_-64795931_19432156/file/tmp/mahout-work-hudson/reuters-out-seqdir-sparse-streamingkmeans-work-4476748253218846472 with rwxr-xr-x
14/05/15 18:32:50 INFO filecache.TrackerDistributedCacheManager: Cached /tmp/mahout-work-hudson/reuters-out-seqdir-sparse-streamingkmeans/frequency.file-0 as /tmp/hadoop-hudson/mapred/local/archive/1282951247135825305_-64795931_19432156/file/tmp/mahout-work-hudson/reuters-out-seqdir-sparse-streamingkmeans/frequency.file-0
14/05/15 18:32:50 INFO filecache.TrackerDistributedCacheManager: Cached /tmp/mahout-work-hudson/reuters-out-seqdir-sparse-streamingkmeans/frequency.file-0 as /tmp/hadoop-hudson/mapred/local/archive/1282951247135825305_-64795931_19432156/file/tmp/mahout-work-hudson/reuters-out-seqdir-sparse-streamingkmeans/frequency.file-0
14/05/15 18:32:50 INFO mapred.JobClient: Running job: job_local232679360_0006
14/05/15 18:32:50 INFO mapred.LocalJobRunner: Waiting for map tasks
14/05/15 18:32:50 INFO mapred.LocalJobRunner: Starting task: attempt_local232679360_0006_m_000000_0
14/05/15 18:32:50 INFO mapred.Task: Using ResourceCalculatorPlugin : null
14/05/15 18:32:50 INFO mapred.MapTask: Processing split: file:/tmp/mahout-work-hudson/reuters-out-seqdir-sparse-streamingkmeans/tf-vectors-toprune/part-r-00000:0+90
14/05/15 18:32:50 INFO mapred.MapTask: io.sort.mb = 100
14/05/15 18:32:51 INFO mapred.MapTask: data buffer = 79691776/99614720
14/05/15 18:32:51 INFO mapred.MapTask: record buffer = 262144/327680
14/05/15 18:32:51 INFO mapred.MapTask: Starting flush of map output
14/05/15 18:32:51 INFO compress.CodecPool: Got brand-new compressor
14/05/15 18:32:51 INFO mapred.Task: Task:attempt_local232679360_0006_m_000000_0 is done. And is in the process of commiting
14/05/15 18:32:51 INFO mapred.LocalJobRunner:
14/05/15 18:32:51 INFO mapred.Task: Task 'attempt_local232679360_0006_m_000000_0' done.
14/05/15 18:32:51 INFO mapred.LocalJobRunner: Finishing task: attempt_local232679360_0006_m_000000_0
14/05/15 18:32:51 INFO mapred.LocalJobRunner: Map task executor complete.
14/05/15 18:32:51 INFO mapred.Task: Using ResourceCalculatorPlugin : null
14/05/15 18:32:51 INFO mapred.LocalJobRunner:
14/05/15 18:32:51 INFO mapred.Merger: Merging 1 sorted segments
14/05/15 18:32:51 INFO compress.CodecPool: Got brand-new decompressor
14/05/15 18:32:51 INFO mapred.Merger: Down to the last merge-pass, with 0 segments left of total size: 0 bytes
14/05/15 18:32:51 INFO mapred.LocalJobRunner:
14/05/15 18:32:51 INFO mapred.Task: Task:attempt_local232679360_0006_r_000000_0 is done. And is in the process of commiting
14/05/15 18:32:51 INFO mapred.LocalJobRunner:
14/05/15 18:32:51 INFO mapred.Task: Task attempt_local232679360_0006_r_000000_0 is allowed to commit now
14/05/15 18:32:51 INFO output.FileOutputCommitter: Saved output of task 'attempt_local232679360_0006_r_000000_0' to /tmp/mahout-work-hudson/reuters-out-seqdir-sparse-streamingkmeans/tf-vectors-partial/partial-0
14/05/15 18:32:51 INFO mapred.LocalJobRunner: reduce > reduce
14/05/15 18:32:51 INFO mapred.Task: Task 'attempt_local232679360_0006_r_000000_0' done.
14/05/15 18:32:51 INFO mapred.JobClient: map 0% reduce 100%
14/05/15 18:32:51 INFO mapred.JobClient: Job complete: job_local232679360_0006
14/05/15 18:32:51 INFO mapred.JobClient: Counters: 17
14/05/15 18:32:51 INFO mapred.JobClient: File Output Format Counters
14/05/15 18:32:51 INFO mapred.JobClient: Bytes Written=102
14/05/15 18:32:51 INFO mapred.JobClient: File Input Format Counters
14/05/15 18:32:51 INFO mapred.JobClient: Bytes Read=102
14/05/15 18:32:51 INFO mapred.JobClient: FileSystemCounters
14/05/15 18:32:51 INFO mapred.JobClient: FILE_BYTES_READ=302479543
14/05/15 18:32:51 INFO mapred.JobClient: FILE_BYTES_WRITTEN=305484756
14/05/15 18:32:51 INFO mapred.JobClient: Map-Reduce Framework
14/05/15 18:32:51 INFO mapred.JobClient: Map output materialized bytes=14
14/05/15 18:32:51 INFO mapred.JobClient: Map input records=0
14/05/15 18:32:51 INFO mapred.JobClient: Reduce shuffle bytes=0
14/05/15 18:32:51 INFO mapred.JobClient: Spilled Records=0
14/05/15 18:32:51 INFO mapred.JobClient: Map output bytes=0
14/05/15 18:32:51 INFO mapred.JobClient: Total committed heap usage (bytes)=1138229248
14/05/15 18:32:51 INFO mapred.JobClient: Combine input records=0
14/05/15 18:32:51 INFO mapred.JobClient: SPLIT_RAW_BYTES=167
14/05/15 18:32:51 INFO mapred.JobClient: Reduce input records=0
14/05/15 18:32:51 INFO mapred.JobClient: Reduce input groups=0
14/05/15 18:32:51 INFO mapred.JobClient: Combine output records=0
14/05/15 18:32:51 INFO mapred.JobClient: Reduce output records=0
14/05/15 18:32:51 INFO mapred.JobClient: Map output records=0
14/05/15 18:32:52 INFO input.FileInputFormat: Total input paths to process : 1
14/05/15 18:32:52 INFO mapred.JobClient: Running job: job_local94697514_0007
14/05/15 18:32:52 INFO mapred.LocalJobRunner: Waiting for map tasks
14/05/15 18:32:52 INFO mapred.LocalJobRunner: Starting task: attempt_local94697514_0007_m_000000_0
14/05/15 18:32:52 INFO mapred.Task: Using ResourceCalculatorPlugin : null
14/05/15 18:32:52 INFO mapred.MapTask: Processing split: file:/tmp/mahout-work-hudson/reuters-out-seqdir-sparse-streamingkmeans/tf-vectors-partial/partial-0/part-r-00000:0+90
14/05/15 18:32:52 INFO mapred.MapTask: io.sort.mb = 100
14/05/15 18:32:52 INFO mapred.MapTask: data buffer = 79691776/99614720
14/05/15 18:32:52 INFO mapred.MapTask: record buffer = 262144/327680
14/05/15 18:32:52 INFO mapred.MapTask: Starting flush of map output
14/05/15 18:32:52 INFO mapred.Task: Task:attempt_local94697514_0007_m_000000_0 is done. And is in the process of commiting
14/05/15 18:32:52 INFO mapred.LocalJobRunner:
14/05/15 18:32:52 INFO mapred.Task: Task 'attempt_local94697514_0007_m_000000_0' done.
14/05/15 18:32:52 INFO mapred.LocalJobRunner: Finishing task: attempt_local94697514_0007_m_000000_0
14/05/15 18:32:52 INFO mapred.LocalJobRunner: Map task executor complete.
14/05/15 18:32:52 INFO mapred.Task: Using ResourceCalculatorPlugin : null
14/05/15 18:32:52 INFO mapred.LocalJobRunner:
14/05/15 18:32:52 INFO mapred.Merger: Merging 1 sorted segments
14/05/15 18:32:52 INFO mapred.Merger: Down to the last merge-pass, with 0 segments left of total size: 0 bytes
14/05/15 18:32:52 INFO mapred.LocalJobRunner:
14/05/15 18:32:52 INFO mapred.Task: Task:attempt_local94697514_0007_r_000000_0 is done. And is in the process of commiting
14/05/15 18:32:52 INFO mapred.LocalJobRunner:
14/05/15 18:32:52 INFO mapred.Task: Task attempt_local94697514_0007_r_000000_0 is allowed to commit now
14/05/15 18:32:52 INFO output.FileOutputCommitter: Saved output of task 'attempt_local94697514_0007_r_000000_0' to /tmp/mahout-work-hudson/reuters-out-seqdir-sparse-streamingkmeans/tf-vectors
14/05/15 18:32:52 INFO mapred.LocalJobRunner: reduce > reduce
14/05/15 18:32:52 INFO mapred.Task: Task 'attempt_local94697514_0007_r_000000_0' done.
14/05/15 18:32:53 INFO mapred.JobClient: map 0% reduce 100%
14/05/15 18:32:53 INFO mapred.JobClient: Job complete: job_local94697514_0007
14/05/15 18:32:53 INFO mapred.JobClient: Counters: 17
14/05/15 18:32:53 INFO mapred.JobClient: File Output Format Counters
14/05/15 18:32:53 INFO mapred.JobClient: Bytes Written=102
14/05/15 18:32:53 INFO mapred.JobClient: File Input Format Counters
14/05/15 18:32:53 INFO mapred.JobClient: Bytes Read=102
14/05/15 18:32:53 INFO mapred.JobClient: FileSystemCounters
14/05/15 18:32:53 INFO mapred.JobClient: FILE_BYTES_READ=352892770
14/05/15 18:32:53 INFO mapred.JobClient: FILE_BYTES_WRITTEN=356397280
14/05/15 18:32:53 INFO mapred.JobClient: Map-Reduce Framework
14/05/15 18:32:53 INFO mapred.JobClient: Map output materialized bytes=6
14/05/15 18:32:53 INFO mapred.JobClient: Map input records=0
14/05/15 18:32:53 INFO mapred.JobClient: Reduce shuffle bytes=0
14/05/15 18:32:53 INFO mapred.JobClient: Spilled Records=0
14/05/15 18:32:53 INFO mapred.JobClient: Map output bytes=0
14/05/15 18:32:53 INFO mapred.JobClient: Total committed heap usage (bytes)=1339555840
14/05/15 18:32:53 INFO mapred.JobClient: Combine input records=0
14/05/15 18:32:53 INFO mapred.JobClient: SPLIT_RAW_BYTES=177
14/05/15 18:32:53 INFO mapred.JobClient: Reduce input records=0
14/05/15 18:32:53 INFO mapred.JobClient: Reduce input groups=0
14/05/15 18:32:53 INFO mapred.JobClient: Combine output records=0
14/05/15 18:32:53 INFO mapred.JobClient: Reduce output records=0
14/05/15 18:32:53 INFO mapred.JobClient: Map output records=0
14/05/15 18:32:53 INFO common.HadoopUtil: Deleting /tmp/mahout-work-hudson/reuters-out-seqdir-sparse-streamingkmeans/tf-vectors-partial
14/05/15 18:32:53 INFO common.HadoopUtil: Deleting /tmp/mahout-work-hudson/reuters-out-seqdir-sparse-streamingkmeans/tf-vectors-toprune
14/05/15 18:32:55 INFO mapred.JobClient: Cleaning up the staging area file:/tmp/hadoop-hudson/mapred/staging/hudson1927125140/.staging/job_local1927125140_0008
Exception in thread "main" java.lang.RuntimeException: Error while running command to get file permissions : java.io.IOException: Cannot run program "/bin/ls": error=12, Not enough space
	at java.lang.ProcessBuilder.start(ProcessBuilder.java:460)
	at org.apache.hadoop.util.Shell.runCommand(Shell.java:200)
	at org.apache.hadoop.util.Shell.run(Shell.java:182)
	at org.apache.hadoop.util.Shell$ShellCommandExecutor.execute(Shell.java:375)
	at org.apache.hadoop.util.Shell.execCommand(Shell.java:461)
	at org.apache.hadoop.util.Shell.execCommand(Shell.java:444)
	at org.apache.hadoop.fs.FileUtil.execCommand(FileUtil.java:712)
	at org.apache.hadoop.fs.RawLocalFileSystem$RawLocalFileStatus.loadPermissionInfo(RawLocalFileSystem.java:448)
	at org.apache.hadoop.fs.RawLocalFileSystem$RawLocalFileStatus.getPermission(RawLocalFileSystem.java:423)
	at org.apache.hadoop.filecache.TrackerDistributedCacheManager.checkPermissionOfOther(TrackerDistributedCacheManager.java:364)
	at org.apache.hadoop.filecache.TrackerDistributedCacheManager.isPublic(TrackerDistributedCacheManager.java:328)
	at org.apache.hadoop.filecache.TrackerDistributedCacheManager.determineCacheVisibilities(TrackerDistributedCacheManager.java:832)
	at org.apache.hadoop.filecache.TrackerDistributedCacheManager.determineTimestampsAndCacheVisibilities(TrackerDistributedCacheManager.java:756)
	at org.apache.hadoop.mapred.JobClient.copyAndConfigureFiles(JobClient.java:843)
	at org.apache.hadoop.mapred.JobClient.copyAndConfigureFiles(JobClient.java:734)
	at org.apache.hadoop.mapred.JobClient.access$400(JobClient.java:179)
	at org.apache.hadoop.mapred.JobClient$2.run(JobClient.java:951)
	at org.apache.hadoop.mapred.JobClient$2.run(JobClient.java:936)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Subject.java:396)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1190)
	at org.apache.hadoop.mapred.JobClient.submitJobInternal(JobClient.java:936)
	at org.apache.hadoop.mapreduce.Job.submit(Job.java:550)
	at org.apache.hadoop.mapreduce.Job.waitForCompletion(Job.java:580)
	at org.apache.mahout.vectorizer.tfidf.TFIDFConverter.makePartialVectors(TFIDFConverter.java:321)
	at org.apache.mahout.vectorizer.tfidf.TFIDFConverter.processTfIdf(TFIDFConverter.java:128)
	at org.apache.mahout.vectorizer.SparseVectorsFromSequenceFiles.run(SparseVectorsFromSequenceFiles.java:357)
	at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
	at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:79)
	at org.apache.mahout.vectorizer.SparseVectorsFromSequenceFiles.main(SparseVectorsFromSequenceFiles.java:56)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
	at java.lang.reflect.Method.invoke(Method.java:597)
	at org.apache.hadoop.util.ProgramDriver$ProgramDescription.invoke(ProgramDriver.java:68)
	at org.apache.hadoop.util.ProgramDriver.driver(ProgramDriver.java:139)
	at org.apache.mahout.driver.MahoutDriver.main(MahoutDriver.java:195)
Caused by: java.io.IOException: error=12, Not enough space
	at java.lang.UNIXProcess.forkAndExec(Native Method)
	at java.lang.UNIXProcess.<init>(UNIXProcess.java:53)
	at java.lang.ProcessImpl.start(ProcessImpl.java:65)
	at java.lang.ProcessBuilder.start(ProcessBuilder.java:453)
	... 36 more
	at org.apache.hadoop.fs.RawLocalFileSystem$RawLocalFileStatus.loadPermissionInfo(RawLocalFileSystem.java:473)
	at org.apache.hadoop.fs.RawLocalFileSystem$RawLocalFileStatus.getPermission(RawLocalFileSystem.java:423)
	at org.apache.hadoop.filecache.TrackerDistributedCacheManager.checkPermissionOfOther(TrackerDistributedCacheManager.java:364)
	at org.apache.hadoop.filecache.TrackerDistributedCacheManager.isPublic(TrackerDistributedCacheManager.java:328)
	at org.apache.hadoop.filecache.TrackerDistributedCacheManager.determineCacheVisibilities(TrackerDistributedCacheManager.java:832)
	at org.apache.hadoop.filecache.TrackerDistributedCacheManager.determineTimestampsAndCacheVisibilities(TrackerDistributedCacheManager.java:756)
	at org.apache.hadoop.mapred.JobClient.copyAndConfigureFiles(JobClient.java:843)
	at org.apache.hadoop.mapred.JobClient.copyAndConfigureFiles(JobClient.java:734)
	at org.apache.hadoop.mapred.JobClient.access$400(JobClient.java:179)
	at org.apache.hadoop.mapred.JobClient$2.run(JobClient.java:951)
	at org.apache.hadoop.mapred.JobClient$2.run(JobClient.java:936)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Subject.java:396)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1190)
	at org.apache.hadoop.mapred.JobClient.submitJobInternal(JobClient.java:936)
	at org.apache.hadoop.mapreduce.Job.submit(Job.java:550)
	at org.apache.hadoop.mapreduce.Job.waitForCompletion(Job.java:580)
	at org.apache.mahout.vectorizer.tfidf.TFIDFConverter.makePartialVectors(TFIDFConverter.java:321)
	at org.apache.mahout.vectorizer.tfidf.TFIDFConverter.processTfIdf(TFIDFConverter.java:128)
	at org.apache.mahout.vectorizer.SparseVectorsFromSequenceFiles.run(SparseVectorsFromSequenceFiles.java:357)
	at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
	at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:79)
	at org.apache.mahout.vectorizer.SparseVectorsFromSequenceFiles.main(SparseVectorsFromSequenceFiles.java:56)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
	at java.lang.reflect.Method.invoke(Method.java:597)
	at org.apache.hadoop.util.ProgramDriver$ProgramDescription.invoke(ProgramDriver.java:68)
	at org.apache.hadoop.util.ProgramDriver.driver(ProgramDriver.java:139)
	at org.apache.mahout.driver.MahoutDriver.main(MahoutDriver.java:195)
Build step 'Execute shell' marked build as failure
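The root cause above is `java.io.IOException: error=12, Not enough space` thrown from `ProcessBuilder.start` while forking `/bin/ls`: errno 12 (ENOMEM) with that message typically indicates the build host ran out of memory or swap, since fork() of a large JVM briefly needs a memory reservation comparable to the parent's footprint. A hypothetical triage sketch for the build agent follows; the commands are illustrative checks, not taken from this build, and assume the `/tmp` work area shown in the log:

```shell
# Illustrative checks for fork failures reporting "error=12, Not enough space".
# Assumption: run on the build agent; /tmp matches the Mahout work dir in the log.

df -k /tmp            # rule out a full filesystem under the work directory

# Swap headroom: forking a large JVM can fail when swap is exhausted.
if command -v free >/dev/null 2>&1; then
  free -m             # Linux: RAM and swap summary
elif command -v swap >/dev/null 2>&1; then
  swap -s             # Solaris: swap allocation summary
fi
```

If swap is the bottleneck, the usual remedies are shrinking the heap given to the Mahout job (for example via `MAHOUT_HEAPSIZE`) or adding swap to the agent; both are suggestions for follow-up, not part of the original log.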