On 06/26/2014 12:44 PM, CM wrote:
> Huh. I learned two new Python facts this week:
>
> 1. print statements were slowing down my code enough to
> really notice a particular transition. It went from about
> 2-3 seconds to a bit under 1 second. What at first seemed
> unresponsive now seems almost snappy. The only difference
> was removing a lot of print statements I had used for
> debugging (Python 2.5, on a single core 1.97 Ghz machine).
Yes, print statements are very useful, but you have to be careful with them. At Uni I remember working on a project where we coded up an algorithm and then tried to work out its O() runtime by timing it. Wanting to be fancy and print out a progress report, I added an entire term to the runtime: instead of O(log n), it came out closer to O(n). Oops! (A sketch of how that happens is below.)

It seems that over the years good old-fashioned debugging skills have been lost. In the earliest days of IDEs (Turbo BASIC and QuickBASIC) I regularly used debuggers with breakpoints and watches, and stepped through my code. Nowadays we seem loath to fire up the debugger. I imagine the currently available debugger frontends like ddd or kdbg support pdb, though I'm not sure; in any case pdb works fine on its own (a minimal session is sketched below).

> 2. Merely having a cPython decorator for profiling a
> function significantly slowed down performance...again,
> from about 2 seconds to just under a second (~1 second
> doesn't seem much but these sorts of delays do affect
> user experience). There is something ironic or
> Heisenbergian about that.

Yes, it stands to reason that profiling code introduces a runtime cost; how else would we expect profiling to work? That's why a production "release" is built with the debugging and profiling machinery removed (a sketch of such a decorator and its overhead follows below). What I do find Heisenbergian are bugs that show up when the debugging and profiling code is removed but vanish while it is present, i.e. the profiling and debugging slow things down enough that subtle race conditions are masked.
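To make the progress-report mistake concrete, here is a minimal sketch (hypothetical code, not my original Uni project): the binary search itself is O(log n), but redrawing a bar as wide as the input costs O(n) on every iteration, so the display ends up dominating the search.

import time

def binary_search(seq, target, progress=False):
    # The search proper is O(log n).
    lo, hi = 0, len(seq)
    while lo < hi:
        mid = (lo + hi) // 2
        if progress:
            # Building and printing a bar as wide as the input adds an
            # O(n) string and I/O cost to every iteration.
            done = len(seq) - (hi - lo)
            print("\r[" + "#" * done + "." * (hi - lo) + "]", end="")
        if seq[mid] < target:
            lo = mid + 1
        elif seq[mid] > target:
            hi = mid
        else:
            return mid
    return -1

data = list(range(100000))
for flag in (False, True):
    start = time.perf_counter()
    binary_search(data, 99999, progress=flag)
    print("\nprogress=%s took %.4f seconds" % (flag, time.perf_counter() - start))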
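On the debugger point: pdb needs no frontend at all for the basic breakpoint/step/inspect workflow. A minimal session, using a hypothetical buggy.py:

# buggy.py
def average(values):
    total = sum(values)
    return total / len(values)   # ZeroDivisionError when values is empty

if __name__ == "__main__":
    import pdb
    pdb.set_trace()   # drop into the debugger just before the failure
    print(average([]))

Running "python buggy.py" stops at the set_trace() call with a (Pdb) prompt; from there "n" steps over a line, "s" steps into average(), "p total" inspects a variable, "b" sets a breakpoint, and "c" continues. The same session can be had without touching the source via "python -m pdb buggy.py".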
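As for the profiling decorator ("cPython" above is presumably the cProfile module), the overhead is easy to see in a sketch like this; the decorator name and workload are my own invention, not CM's code:

import cProfile
import functools
import pstats

def profiled(func):
    # Every call through the wrapper is traced by cProfile, which is
    # exactly the per-call cost CM noticed.
    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        profiler = cProfile.Profile()
        profiler.enable()
        try:
            return func(*args, **kwargs)
        finally:
            profiler.disable()
            pstats.Stats(profiler).sort_stats("cumulative").print_stats(5)
    return wrapper

@profiled
def busy():
    return sum(i * i for i in range(200000))

busy()

Timing busy() with and without the decorator shows the difference, which is why such decorators are usually gated behind a flag or stripped from release builds entirely.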