This appears to be a real, bona-fide bug; I can reproduce it on my local machine.

It seems that /tmp/mahout-work-jenkins/reuters-kmeans-clusters/part-randomSeed
is actually a plain file, not a directory, so attempting to write a policy (or
anything else) underneath it cannot succeed.
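For anyone who wants to see the failure mode in isolation: it can be reproduced with plain java.io, no Hadoop required. This is just an illustrative sketch (the paths and class name here are made up, not from the Mahout code); it shows that mkdirs() returns false when an ancestor of the target path is a regular file, which is exactly what the ChecksumFileSystem hits when it tries to create clusters-0 under part-randomSeed:

```java
import java.io.File;
import java.io.IOException;

public class MkdirsRepro {
    public static void main(String[] args) throws IOException {
        // Stand-in for the Jenkins work dir (hypothetical local path).
        File work = new File(System.getProperty("java.io.tmpdir"), "mkdirs-repro");
        work.mkdirs();

        // RandomSeedGenerator writes part-randomSeed as a single file.
        File partRandomSeed = new File(work, "part-randomSeed");
        partRandomSeed.delete();
        partRandomSeed.createNewFile();  // a plain file, as in the build log

        // KMeansDriver then tries to create clusters-0 *under* that file.
        File clusters0 = new File(partRandomSeed, "clusters-0");
        boolean ok = clusters0.mkdirs();  // fails: the parent is a file, not a directory
        System.out.println("mkdirs succeeded: " + ok);

        // Clean up the temp artifacts.
        partRandomSeed.delete();
        work.delete();
    }
}
```

Hadoop's ChecksumFileSystem.create() surfaces the same condition as the "Mkdirs failed to create ..." IOException seen in the trace below.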


On Apr 8, 2012, at 3:19 PM, Apache Jenkins Server wrote:

> See <https://builds.apache.org/job/Mahout-Examples-Cluster-Reuters/96/>
> 
> 12/04/08 19:19:16 INFO common.HadoopUtil: Deleting 
> /tmp/mahout-work-jenkins/reuters-kmeans-clusters
> 12/04/08 19:19:16 WARN util.NativeCodeLoader: Unable to load native-hadoop 
> library for your platform... using builtin-java classes where applicable
> 12/04/08 19:19:16 INFO compress.CodecPool: Got brand-new compressor
> 12/04/08 19:19:18 INFO kmeans.RandomSeedGenerator: Wrote 20 vectors to 
> /tmp/mahout-work-jenkins/reuters-kmeans-clusters/part-randomSeed
> 12/04/08 19:19:18 INFO kmeans.KMeansDriver: Input: 
> /tmp/mahout-work-jenkins/reuters-out-seqdir-sparse-kmeans/tfidf-vectors 
> Clusters In: /tmp/mahout-work-jenkins/reuters-kmeans-clusters/part-randomSeed 
> Out: /tmp/mahout-work-jenkins/reuters-kmeans Distance: 
> org.apache.mahout.common.distance.CosineDistanceMeasure
> 12/04/08 19:19:18 INFO kmeans.KMeansDriver: convergence: 0.5 max Iterations: 
> 10 num Reduce Tasks: org.apache.mahout.math.VectorWritable Input Vectors: {}
> 12/04/08 19:19:18 INFO compress.CodecPool: Got brand-new decompressor
> Exception in thread "main" java.io.IOException: Mkdirs failed to create 
> /tmp/mahout-work-jenkins/reuters-kmeans-clusters/part-randomSeed/clusters-0
>       at 
> org.apache.hadoop.fs.ChecksumFileSystem.create(ChecksumFileSystem.java:366)
>       at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:528)
>       at 
> org.apache.hadoop.io.SequenceFile$Writer.<init>(SequenceFile.java:843)
>       at 
> org.apache.hadoop.io.SequenceFile$Writer.<init>(SequenceFile.java:831)
>       at 
> org.apache.hadoop.io.SequenceFile$Writer.<init>(SequenceFile.java:823)
>       at 
> org.apache.mahout.clustering.classify.ClusterClassifier.writePolicy(ClusterClassifier.java:232)
>       at 
> org.apache.mahout.clustering.classify.ClusterClassifier.writeToSeqFiles(ClusterClassifier.java:185)
>       at 
> org.apache.mahout.clustering.kmeans.KMeansDriver.buildClusters(KMeansDriver.java:254)
>       at 
> org.apache.mahout.clustering.kmeans.KMeansDriver.run(KMeansDriver.java:154)
>       at 
> org.apache.mahout.clustering.kmeans.KMeansDriver.run(KMeansDriver.java:104)
>       at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
>       at 
> org.apache.mahout.clustering.kmeans.KMeansDriver.main(KMeansDriver.java:48)
>       at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>       at 
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
>       at 
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
>       at java.lang.reflect.Method.invoke(Method.java:597)
>       at 
> org.apache.hadoop.util.ProgramDriver$ProgramDescription.invoke(ProgramDriver.java:68)
>       at org.apache.hadoop.util.ProgramDriver.driver(ProgramDriver.java:139)
>       at org.apache.mahout.driver.MahoutDriver.main(MahoutDriver.java:188)
> Build step 'Execute shell' marked build as failure