> It all depends on what kind of application you have. If your code is
> CPU-bound, these seemingly insignificant optimizations can have a very
> significant influence on the overall service performance.

Do such beasts really exist?  I mean, I guess they must, but I've never
seen a mod_perl application that was CPU-bound.  They always seem to be
constrained by database speed and memory.

> On the other hand, how often do you get a chance to profile your code
> and see how to improve its speed in the real world? Managers never
> plan for a debugging period, let alone optimization periods.

If you plan a good architecture that avoids the truly slow stuff
(disk/network access) as much as possible, your application is usually
fast enough without spending much time on optimization (except maybe
some database tuning).  At my last couple of jobs we actually did have
load testing and optimization as part of the development plan, but
that's because we knew we'd be getting pretty high levels of traffic.
Most people don't need to tune very much if they have a good
architecture, and it's enough for them to fix problems as they become
visible.
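
To give a concrete example of what I mean by avoiding the slow stuff,
here's a minimal sketch (the names and data are made up) of caching an
expensive result in memory so that repeated requests never touch the
disk or the database:

    use strict;
    use warnings;

    my %cache;    # per-process, in-memory cache

    sub get_report {
        my ($id) = @_;
        # Only do the expensive disk/database work on a cache miss.
        $cache{$id} = build_report($id) unless exists $cache{$id};
        return $cache{$id};
    }

    sub build_report {
        my ($id) = @_;
        # Stand-in for a slow query or file read.
        return "report for $id";
    }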

Back to your idea: you're obviously interested in the low-level
optimization stuff, so of course you should go ahead with it.  I don't
think it needs to be a separate project, but improvements to the
performance section of the guide are always a good idea.  I know that I
have taken all of the DBI performance tips to heart and found them very
useful.
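
For anyone who hasn't read those tips, here's a quick sketch of two of
them, with a made-up DSN and table: prepare_cached() so the same
statement isn't re-parsed on every request, and bind_columns() with
fetch(), which the DBI docs describe as the fastest way to pull rows
back:

    use strict;
    use warnings;
    use DBI;

    # The connection details here are placeholders.
    my $dbh = DBI->connect('dbi:mysql:app', 'user', 'secret',
                           { RaiseError => 1 });

    # prepare_cached() hands back the same statement handle on
    # repeated calls, so the query is only parsed once.
    my $sth = $dbh->prepare_cached(
        'SELECT name, price FROM products WHERE category = ?');
    $sth->execute('books');

    # bind_columns() plus fetch() avoids building a fresh list
    # or hashref for every row.
    my ($name, $price);
    $sth->bind_columns(\$name, \$price);
    while ($sth->fetch) {
        print "$name: $price\n";
    }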

I'm more interested in writing about higher level performance issues
(efficient shared data, config tuning, caching), so I'll continue to
work on those things.  I'm submitting a proposal for a talk on data
sharing techniques at this year's Perl Conference, so hopefully I can
contribute that to the guide after I finish it.
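
In case the basic idea isn't familiar: load your read-only data in the
Apache parent process at server startup, and the children share it
copy-on-write after the fork. A bare-bones sketch (the file name and
package are invented):

    # startup.pl -- loaded by the Apache parent before it forks,
    # so every child shares these pages copy-on-write.
    package My::SharedData;
    use strict;

    use vars qw(%LOOKUP);

    open my $fh, '<', '/etc/myapp/lookup.txt' or die "open: $!";
    while (<$fh>) {
        chomp;
        my ($key, $value) = split /\t/, $_, 2;
        $LOOKUP{$key} = $value;
    }
    close $fh;

    1;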

- Perrin
