You're running on Hadoop but using a relative path name like "./clusteredPoints/". Is there a directory *on HDFS* at
/home/adam/Coding/hadoopResearch/mahout/trunk/clusteredPoints/ ?

  -jake

On Fri, Apr 9, 2010 at 12:58 PM, adam35413 <adam.ham...@gmail.com> wrote:
>
> Yes, I see that now. I'm an idiot.
>
> Updated that, and now I'm seeing this:
>
> bin/mahout clusterdump --seqFileDir ./clusteredPoints/ --output testFile.txt
> running on hadoop, using
> HADOOP_HOME=/home/adam/Coding/hadoopResearch/hadoop-0.20.2 and
> HADOOP_CONF_DIR=/home/adam/Coding/hadoopResearch/hadoop-0.20.2/conf
> Input Path: /home/adam/Coding/hadoopResearch/mahout/trunk/clusteredPoints/part-00000
> 10/04/09 15:57:02 ERROR driver.MahoutDriver: MahoutDriver failed with args:
> [--seqFileDir, ./clusteredPoints/, --output, testFile.txt, null]
> File does not exist:
> /home/adam/Coding/hadoopResearch/mahout/trunk/clusteredPoints/part-00000
> Exception in thread "main" java.io.FileNotFoundException: File does not exist:
> /home/adam/Coding/hadoopResearch/mahout/trunk/clusteredPoints/part-00000
>         at org.apache.hadoop.hdfs.DistributedFileSystem.getFileStatus(DistributedFileSystem.java:457)
>         at org.apache.hadoop.fs.FileSystem.getLength(FileSystem.java:676)
>         at org.apache.hadoop.io.SequenceFile$Reader.<init>(SequenceFile.java:1417)
>
> The file does exist however, and the permissions have been set to 777 to
> eliminate any potential conflict with access. Making progress I guess, but
> still no dice...
> --
> View this message in context:
> http://n3.nabble.com/Kmeans-clustering-tp641973p709132.html
> Sent from the Mahout User List mailing list archive at Nabble.com.
>
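
(For reference, a minimal sketch of how to check which filesystem that path resolves against. When Mahout reports "running on hadoop", clusterdump reads from HDFS, so a file that only exists on the local disk will not be found no matter what its permissions are. The HADOOP_HOME and local path below are taken from the output above; /user/adam/clusteredPoints is an assumed HDFS target directory, not something from the thread, so adjust it to your own HDFS layout.)

  # Does the directory exist on the local disk? (This is where the 777 permissions were set.)
  ls -l /home/adam/Coding/hadoopResearch/mahout/trunk/clusteredPoints/

  # Does the same path exist on HDFS? This is what the clusterdump driver is actually checking.
  $HADOOP_HOME/bin/hadoop fs -ls /home/adam/Coding/hadoopResearch/mahout/trunk/clusteredPoints/

  # If the data only exists locally, copy it up to HDFS first
  # (/user/adam/clusteredPoints is an assumed destination) ...
  $HADOOP_HOME/bin/hadoop fs -put /home/adam/Coding/hadoopResearch/mahout/trunk/clusteredPoints /user/adam/clusteredPoints

  # ... and then point clusterdump at the HDFS copy using an absolute path.
  bin/mahout clusterdump --seqFileDir /user/adam/clusteredPoints --output testFile.txt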