rcurtin commented on this pull request.
> + for(size_t i = 0; i < N; i++)
+ probabilities(i) = prob(generator);
+
+ // fit results with probabilities and data
+ GammaDistribution gDist;
+ gDist.Train(rdata, probabilities);
+
+ // fit results with only data
+ GammaDistribution gDist2;
+ gDist2.Train(rdata);
+
+ BOOST_REQUIRE_CLOSE(gDist2.Alpha(0), gDist.Alpha(0), 10);
+ BOOST_REQUIRE_CLOSE(gDist2.Beta(0), gDist.Beta(0), 10);
+
+ BOOST_REQUIRE_CLOSE(alphaReal, gDist.Alpha(0), 10);
+ BOOST_REQUIRE_CLOSE(betaReal, gDist.Beta(0), 10);
Hm, ok, a larger tolerance like 5% (or even 10% if need be) is okay for the
difference between alphaReal and gDist.Alpha(0), but the tolerance between
gDist and gDist2 should be 1e-5. The gDist/gDist2 tolerance is really the one
I'm much more concerned with.
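To be concrete, here is roughly what I have in mind for the tolerances (just a sketch, reusing the names from the quoted test; `BOOST_REQUIRE_CLOSE`'s third argument is a percentage tolerance):

```c++
// The two fits come from the same data (with and without probabilities), so
// they should agree to very high precision; use a tight 1e-5 tolerance here.
BOOST_REQUIRE_CLOSE(gDist2.Alpha(0), gDist.Alpha(0), 1e-5);
BOOST_REQUIRE_CLOSE(gDist2.Beta(0), gDist.Beta(0), 1e-5);

// The fit versus the true generating parameters only needs to be roughly
// right, so a loose 5% (or 10% if need be) tolerance is fine for these.
BOOST_REQUIRE_CLOSE(alphaReal, gDist.Alpha(0), 10);
BOOST_REQUIRE_CLOSE(betaReal, gDist.Beta(0), 10);
```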
One way you can check whether a tolerance is tight enough is to seed with
`math::RandomSeed(std::time(NULL))` (although since you are using a different
RNG here, you'll have to seed that one accordingly too) and then run the test
over and over again to see how often it fails.
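Something like this, assuming the test's RNG is a `std::mt19937` named `generator` (adjust to whatever type you're actually using):

```c++
#include <ctime>
#include <random>
#include <mlpack/core.hpp>

// At the top of the test case: seed both mlpack's RNG and the test's own
// generator from the clock, so every run of the test draws a different sample.
const size_t seed = static_cast<size_t>(std::time(NULL));
mlpack::math::RandomSeed(seed);
std::mt19937 generator(seed);  // Then use prob(generator) as before.
```

Then just run the test in a loop from the shell and count how many runs fail; if the loose tolerance trips more than very occasionally, it needs to be relaxed or the test rethought.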