Have you benchmarked it using the Generator interface? The structure of this as a non-monolithic generator makes it a good deal slower than generating in straight C (with everything inlined). I'm not sure a factor of 2 is enough to justify a change; for me 10x would be and 1.2x would not, but I don't know where the cutoff is.
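(A minimal benchmarking sketch along those lines, assuming the current NumPy Generator API; the seed, array size, and repeat count are arbitrary illustrations, not values from this thread:)

    # Time standard_normal through the Generator interface, which adds
    # Python/dispatch overhead on top of the underlying C routine.
    import timeit
    import numpy as np

    rng = np.random.default_rng(12345)   # arbitrary seed
    n = 1_000_000                        # arbitrary array size

    t = timeit.timeit(lambda: rng.standard_normal(n), number=20)
    print(f"standard_normal({n}): {t / 20 * 1e3:.1f} ms per call")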
Can you post benchmarks from using it through Generator? Also, those tests would be replaced with new values if the patch was accepted, so don't worry about them.

Kevin

On Sat, Feb 6, 2021, 09:32 <camel-...@protonmail.com> wrote:

> I tried to implement a different implementation of the ziggurat method for
> generating standard normal distributions that is about twice as fast and
> uses 2/3 of the memory of the old one.
> I tested the implementation separately and am very confident it's correct,
> but it does fail 28 tests in coverage testing.
> Checking the testing code, I found that all the failed tests are inside
> TestRandomDist, whose goal is to "Make sure the random distribution returns
> the correct value for a given seed". Why would this be needed?
> The only explanation I can come up with is that standard_normal is, with
> regard to seeding, required to be backwards compatible. If that's the case,
> how could one even implement a new algorithm?
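(For context, a hypothetical sketch of the kind of seed-pinned check being described: the real TestRandomDist compares against hard-coded reference values, so any change to the underlying algorithm changes the stream and fails the test even if the new values are statistically sound. The seed and tolerance below are made up, and the "expected" values are regenerated rather than hard-coded:)

    import numpy as np
    from numpy.testing import assert_array_almost_equal

    def test_standard_normal_stream():
        # Draw from a freshly seeded Generator.
        rng = np.random.default_rng(1234)      # illustrative seed
        got = rng.standard_normal(3)

        # In the real test suite the expected values are hard-coded from a
        # reference run; here we regenerate them from a second, identically
        # seeded Generator purely for illustration.
        expected = np.random.default_rng(1234).standard_normal(3)
        assert_array_almost_equal(got, expected, decimal=14)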