Hm.. no. I have rerun just the core tests, and it did not happen.

Anyway, I changed only one line of existing code to fix Cholesky, so this is
clearly not it.
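
For reference, a rough way to recheck just that one test with the seed from the
failure report (assuming the test lives in the core module and that surefire
forwards the randomizedtesting tests.seed property to the forked test JVM,
neither of which I have verified):

    # rerun only the failing test, pinning the randomizedtesting seed
    mvn test -Dtest=TransactionTreeTest -Dtests.seed=2CCB549C54CC2941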


On Thu, Mar 6, 2014 at 8:51 PM, Suneel Marthi <[email protected]>wrote:

> Have not seen this happen. This is the Frequent Pattern code that was
> resurrected at the last minute for the 0.9 release, but I haven't seen this
> failure before.
>
> Does it fail consistently?
>
> On Thursday, March 6, 2014 11:41 PM, Dmitriy Lyubimov <[email protected]>
> wrote:
>
> I have this test failure during the due diligence run -- is it a known
> problem?
>
> Tests run: 1, Failures: 1, Errors: 0, Skipped: 0, Time elapsed: 2.226 sec
> <<< FAILURE! - in org.apache.mahout.fpm.pfpgrowth.TransactionTreeTest
> testTransactionTree(org.apache.mahout.fpm.pfpgrowth.TransactionTreeTest)
> Time elapsed: 1.902 sec  <<< FAILURE!
> org.junit.ComparisonFailure: expected:<...4)([7, 11, 13, 24, 2[5]],19)([7,
> 11, 13, 15...> but was:<...4)([7, 11, 13, 24, 2[7]],19)([7, 11, 13, 15...>
>         at
> __randomizedtesting.SeedInfo.seed([2CCB549C54CC2941:939C3E240470B52D]:0)
>         at org.junit.Assert.assertEquals(Assert.java:115)
>         at org.junit.Assert.assertEquals(Assert.java:144)
>         at
>
> org.apache.mahout.fpm.pfpgrowth.TransactionTreeTest.testTransactionTree(TransactionTreeTest.java:105)
>
> On Thu, Mar 6, 2014 at 8:22 PM, Dmitriy Lyubimov <[email protected]>
> wrote:
>
> > Just a heads up, in case you haven't been tracking the jira for this.
> >
> > This is a big commit that, aside from its primary content, brings in Spark
> > 0.9 (in a separate module only) and updates scala to 2.10.3 and scalatest
> > to 2.0. The Spark module also has a CDH-compatible maven profile (which I
> > think still pulls CDH4 hdfs dependencies).
> >
> > The exact hadoop dependency is fairly irrelevant since one can compile and
> > set up spark to work with whatever version, so the mahout-spark binary will
> > be compatible with any hdfs version without recompilation.
> >
> > -d
> >
>
