Thanks for checking, Grant. I am able to run "mvn install" with all the tests passing on OS X too. I checked out the same revision, 786731, on my Linux box (RHEL 4, 64-bit), but the same test keeps failing there. I checked the POM, and it looks like both the trunk and the 0.1 release tarball require Hadoop 0.19.1. I will update the cluster to 0.19.1 next week and try the example again. Should we update the Wiki to point this out?
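In case it is useful, this is roughly how I am double-checking the Hadoop version on both the trunk checkout and the 0.1 tarball. It is just a quick grep of the top-level POM, so the surrounding element names are whatever the POM actually uses rather than something I am asserting from memory:

    # Look for the Hadoop dependency/version declared in the top-level POM
    grep -B 3 -A 3 -i "hadoop" pom.xml
    # In both trees the version that shows up next to the Hadoop artifact is 0.19.1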
Bill

On Fri, Jun 19, 2009 at 5:03 PM, Grant Ingersoll <[email protected]> wrote:

> I just did a clean and an update and ran "mvn install" and all tests pass
> for me.
>
> I'm running JDK 1.6 on OS X (latest) with Maven 2.0.10.
>
> You can skip the tests in Maven by passing in -DskipTests=true
>
> However, it makes me wonder what's going wrong and whether it will work.
>
> Note, also, we have been reworking some of the clustering stuff internals,
> so maybe you happened to get a copy of trunk at the wrong time.
>
> -Grant
>
>
> On Jun 19, 2009, at 4:23 PM, Grant Ingersoll wrote:
>
>> Hey Bill,
>>
>> I've been having problems w/ the Bayesian examples too, but not enough
>> time to look into them. However, that is about to change and will be
>> digging into it a bit more early next week or this weekend.
>>
>> I will note a couple of things. I've seen some reports of problems w/
>> Maven 2.1.0 not working on Mahout. Also, I believe the trunk version
>> requires Hadoop 0.19.1, but check the POM to be sure.
>>
>> In the meantime, any insight you have is appreciated.
>>
>> -Grant
>>
>> On Jun 19, 2009, at 3:03 PM, Bill Au wrote:
>>
>>> I am using the 0.1 release tar ball and the Hadoop job files contained
>>> in it. I am trying to run the twenty newsgroup example by following the
>>> instructions in the Wiki. I am not getting any errors nor exception but
>>> the results are not correct. Here is the output I got:
>>>
>>> http://people.apache.org/~billa/mahout0.1-output
>>>
>>> I am getting the same results for both Bayes and CBayes. Any idea what
>>> I am doing wrong?
>>>
>>> I also want to run the example using the trunk so I checked it out and
>>> ran "mvn install". I found that the Hadoop job files are not being
>>> created due to a test failure:
>>>
>>> Running org.apache.mahout.clustering.kmeans.TestKmeansClustering
>>> Tests run: 5, Failures: 1, Errors: 0, Skipped: 0, Time elapsed: 34.979
>>> sec <<< FAILURE!
>>>
>>> <failure message="clusters[3] expected:<4> but was:<2>"
>>> type="junit.framework.AssertionFailedError">junit.framework.AssertionFailedError:
>>> clusters[3] expected:<4> but was:<2>
>>>     at junit.framework.Assert.fail(Assert.java:47)
>>>     at junit.framework.Assert.failNotEquals(Assert.java:280)
>>>     at junit.framework.Assert.assertEquals(Assert.java:64)
>>>     at junit.framework.Assert.assertEquals(Assert.java:198)
>>>     at org.apache.mahout.clustering.kmeans.TestKmeansClustering.testKMeansMRJob(TestKmeansClustering.java:432)
>>> </failure>
>>>
>>> I am not sure how to fix this. Is there anyway to build the Hadoop job
>>> files without running the tests?
>>>
>>> By the way, I am using maven 2.1.0 and Hadoop 0.18.3.
>>>
>>> Bill
>>
>
> --------------------------
> Grant Ingersoll
> http://www.lucidimagination.com/
>
> Search the Lucene ecosystem (Lucene/Solr/Nutch/Mahout/Tika/Droids) using
> Solr/Lucene:
> http://www.lucidimagination.com/search
>
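P.S. For anyone else who hits the same TestKmeansClustering failure and just needs the job files, I am building with the flag Grant mentioned so the install completes while the test gets sorted out. This is only the standard Maven switch, nothing Mahout-specific:

    # Build the job files without executing the unit tests
    mvn clean install -DskipTests=true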
