Hi,

On 13/02/18 14:27, Matti Picus wrote:
> I have begun to dive into the performance/perf code. My goal is to get
> PyPy benchmarks running on http://speed.python.org. Since PyPy has a
> JIT, the benchmark runs must have a warmup stage.

Why?
The other interpreters don't get an arbitrary chunk of time for free, so neither should PyPy. Warmup is an inherent cost of dynamic optimisers. The benefits should outweigh the costs, but the costs shouldn't be ignored.

> There are some first-cut warmup values hardcoded inside a few of the
> benchmarks. I think I would prefer a different mechanism: a separate
> calibrated data file alongside the performance benchmarks. We could
> start off with a rough guess for each benchmark to get the system up
> and running, and then calibrate the warmups, hopefully finding some
> statistical basis for the values.
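
For what it's worth, the calibrated data file could be something as simple as a JSON mapping from benchmark name to warmup count, consulted before handing control to perf. A rough sketch of what I have in mind (the file name, keys and helpers below are purely illustrative, not an existing pyperformance format):

    # warmups.json (hypothetical), kept alongside the benchmarks:
    #   {"telco": 10, "nbody": 5, "default": 3}

    import json
    import os

    def load_warmups(path="warmups.json"):
        # A missing file just means "not calibrated yet"; callers then
        # fall back to a default value.
        if not os.path.exists(path):
            return {}
        with open(path) as f:
            return json.load(f)

    def warmups_for(name, calibrated, fallback=3):
        # Per-benchmark value wins, then the file-wide default, then the
        # hardcoded fallback used until calibration data exists.
        return calibrated.get(name, calibrated.get("default", fallback))

Keeping the numbers in data rather than in each benchmark would also make the later recalibration step a matter of regenerating one file.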

> Assuming the idea of an external data file is acceptable, I have begun
> diving into the code, trying to figure out the interaction between the
> performance package and the perf package. It seems that running
> "pyperformance run -b telco --warmups 10" does not work: the
> pyperformance CLI runner accepts only a subset of the perf Runner
> command-line options. Shouldn't the performance.cli parse_args() start
> with the perf._runner.py parser?
> Would a pull request along those lines be a worthwhile goal?
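
As a point of comparison, as far as I can tell perf's own Runner understands --warmups when a benchmark script is run directly; it is only the pyperformance wrapper that drops the option. A minimal stand-alone sketch (with a stand-in workload, not the real telco benchmark) looks roughly like this:

    # bm_sketch.py -- illustrative perf-based benchmark, not real
    # pyperformance code
    import time
    import perf

    def bench_workload(loops):
        # bench_time_func passes in the loop count and expects the total
        # elapsed time back; perf takes care of warmups, worker processes
        # and the statistics.
        t0 = time.perf_counter()
        for _ in range(loops):
            sum(i * i for i in range(1000))   # stand-in workload
        return time.perf_counter() - t0

    if __name__ == "__main__":
        runner = perf.Runner()
        runner.bench_time_func('sketch', bench_workload)

Running "python bm_sketch.py --warmups 10" picks the value up through perf's own argument parser; the gap is only in what the pyperformance front end forwards.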

> Thanks,
> Matti
Cheers,
Mark.
