Hi,

On Wed, 29 May 2013 12:00:44 -0600
Eric Snow <ericsnowcurren...@gmail.com> wrote:
> The devguide doesn't have anything on performance testing that I could
> find.
See http://bugs.python.org/issue17449

> Tools I'm aware of:
> * pybench (relatively limited in real-world usefulness)
> * timeit module (for quick comparisons)
> * benchmarks repo (real-world performance test suite)
> * speed.python.org (would omit for now)
>
> Things to test:
> * speed
> * memory (tools? tests?)

You can use the "-m" option to perf.py.

> Critically sensitive performance subjects
> * interpreter start-up time

There are startup tests in the benchmark suite.

> * module import overhead
> * attribute lookup overhead (including MRO traversal)
> * function call overhead
> * instance creation overhead
> * dict performance (the underlying namespace type)
> * tuple performance (packing/unpacking, integral container type)
> * string performance

These are all micro-benchmark fodder rather than high-level concerns
(e.g. "startup time" is a high-level concern potentially impacted by
"module import overhead", but only if the latter is a significant
contributor to startup time).

> How do we avoid performance regressions?

Right now we don't have any automated way to detect them.

Regards

Antoine.
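
The "-m" switch Antoine points to belongs to perf.py in the benchmarks
repo and adds memory tracking to a run. A minimal sketch of an
invocation comparing a baseline build against a patched build might
look like the following; the interpreter paths are placeholders, and
anything beyond the -m flag itself is an assumption about a particular
checkout rather than documented usage:

    # Compare two CPython builds, tracking memory usage as well as speed.
    # Both interpreter paths are placeholders.
    $ python perf.py -m /path/to/baseline/python /path/to/patched/python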
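
For the micro-benchmark subjects in Eric's list (attribute lookup,
function call overhead, dict and tuple performance, and so on), the
timeit module is the quickest way to get rough numbers. A minimal
sketch, assuming nothing beyond the standard library; the statements
below are illustrative and not part of any agreed-upon suite:

    # Rough micro-benchmarks for a few of the subjects listed above,
    # using only the stdlib timeit module.
    import timeit

    NUMBER = 1000000  # loops per timing

    cases = {
        "attribute lookup": ("obj.value", "class C(object):\n    value = 1\nobj = C()"),
        "function call":    ("f()",       "def f(): pass"),
        "dict access":      ("d['key']",  "d = {'key': 1}"),
        "tuple unpacking":  ("a, b = t",  "t = (1, 2)"),
    }

    for name, (stmt, setup) in sorted(cases.items()):
        # repeat() returns several total timings; the minimum is the least noisy.
        best = min(timeit.repeat(stmt, setup=setup, number=NUMBER, repeat=5))
        print("%-18s %.3f usec per loop" % (name, best / NUMBER * 1e6))

Numbers from a sketch like this are only meaningful for comparing two
builds on the same machine, which is the same caveat that applies to
the full benchmark suite.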