On Sat, 26 Jan 2002, Stas Bekman wrote:
[...]
It's much better to build your system, profile it, and fix the bottlenecks.
The most effective changes are almost never simple coding changes like the
one you showed, but rather large things like using qmail-inject instead of
SMTP, caching a
On Sat, 26 Jan 2002, Perrin Harkins wrote:
It all depends on what kind of application you have. If your code is
CPU-bound, these seemingly insignificant optimizations can have a very
significant influence on the overall service performance.
Do such beasts really exist? I mean, I guess
One memory and speed saving is using global vars. I know it is not
recommended practice, but if from the beginning of the project you set a
convention for the names of global vars it is OK. E.g. I'm using at the
beginning of all DB pages:
our $dbh = dbConnect() or dbiError();
See I know (i'm
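A minimal sketch of that convention: dbConnect() here is a stub standing in for the poster's real DBI helper, and the ||= caching is my reading of the intent — one connection per process, reused by every page:

```perl
use strict;
use warnings;

our $dbh;                 # the agreed-upon global name, one per process

# Stub standing in for the poster's dbConnect()/dbiError() helpers;
# it counts connections so the caching effect is visible.
my $connects = 0;
sub dbConnect {
    $connects++;
    return { conn => $connects };   # fake handle, for illustration only
}

sub get_dbh {
    $dbh ||= dbConnect();           # connect once, then reuse the handle
    return $dbh;
}

get_dbh() for 1 .. 3;               # three "pages" hitting the DB
print "connections made: $connects\n";   # connections made: 1
```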
Hi all,
Stas has a point. Perl makes it very easy to do silly things.
This is what I was doing last week:
if( m/\b$Needle\b/ ) {...}
Eight hours. (Silly:)
if( index($Haystack,$Needle) >= 0 and m/\b$Needle\b/ ) {...}
Twelve minutes.
73,
Ged.
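Ged's rewrite can be sketched as below. The function name and test strings are mine; the technique is the one from the post — use index() as a cheap literal scan that short-circuits before the regex engine is invoked:

```perl
use strict;
use warnings;

# Sketch of Ged's trick: index() does a cheap literal substring scan;
# only when the substring is present do we pay for the regex with its
# word-boundary anchors.
sub contains_word {
    my ($haystack, $needle) = @_;
    return 0 if index($haystack, $needle) < 0;   # -1 means "not found"
    return $haystack =~ /\b\Q$needle\E\b/ ? 1 : 0;
}

print contains_word('the quick brown fox', 'quick'), "\n";  # 1
print contains_word('the quickest fox',    'quick'), "\n";  # 0: no word boundary
print contains_word('nothing relevant',    'quick'), "\n";  # 0: index() bails early
```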
It all depends on what kind of application you have. If your code is
CPU-bound, these seemingly insignificant optimizations can have a very
significant influence on the overall service performance.
Do such beasts really exist? I mean, I guess they must, but I've never
seen a mod_perl
On Saturday 26 January 2002 03:40 pm, Sam Tregar wrote:
Think search engines. Once you've figured out how to get your search
database to fit in memory (or devised a caching strategy to get the
important parts there) you're essentially looking at a CPU-bound problem.
These days the best
Perrin Harkins wrote:
Back to your idea: you're obviously interested in the low-level
optimization stuff, so of course you should go ahead with it. I don't
think it needs to be a separate project, but improvements to the
performance section of the guide are always a good idea.
It has to
Ah yes, but don't forget that to get this speed, you are sacrificing
memory. You now have another locally scoped variable for perl to keep
track of, which increases memory usage and general overhead (allocation
and garbage collection). Now, those, too, are insignificant with one
use, but
Issac Goldstand wrote:
Ah yes, but don't forget that to get this speed, you are sacrificing
memory. You now have another locally scoped variable for perl to keep
track of, which increases memory usage and general overhead (allocation
and garbage collection). Now, those, too, are
This project's idea is to give straight numbers for some definitely bad
coding practices (e.g. map() in void context), and for things which vary
a lot depending on the context but are interesting to think about (e.g.
the last example of caching the result of ref() or a method call)
I
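Both of those habits fit in a few lines; the data and names below are mine, the practices are the ones named in the post:

```perl
use strict;
use warnings;

my @words = qw(alpha beta gamma);
my %seen;

# Definitely bad practice: map in void context builds a result list
# that is immediately thrown away:
#   map { $seen{$_} = 1 } @words;
# The for-loop form has the same effect without the wasted list:
$seen{$_} = 1 for @words;

# Context-dependent habit: hoist an invariant ref() call out of the
# loop instead of recomputing it per element.
my $obj   = bless {}, 'Example';
my $class = ref $obj;                        # computed once, cached
my @tags  = map { "$class:$_" } @words;      # map in list context is fine

print scalar(keys %seen), " ", $tags[0], "\n";   # 3 Example:alpha
```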
The point is that I want to develop a coding style which tries hard to
do early premature optimizations.
We've talked about this kind of thing before. My opinion is still the same
as it was: low-level speed optimization before you have a working system is
a waste of your time.
It's much
On Fri, 2002-01-25 at 09:08, Perrin Harkins wrote:
[snip]
It's much better to build your system, profile it, and fix the bottlenecks.
The most effective changes are almost never simple coding changes like the
one you showed, but rather large things like using qmail-inject instead of
SMTP,
On Fri, 25 Jan 2002 21:15:54 +0000 (GMT)
Matt Sergeant [EMAIL PROTECTED] wrote:
With qmail, SMTP generally uses inetd, which is slow, or daemontools,
which is faster, but still slow, and more importantly, it anyway goes:
perl -> SMTP -> inetd -> qmail-smtpd -> qmail-inject.
So with going
On Fri, 2002-01-25 at 13:15, Matt Sergeant wrote:
With qmail, SMTP generally uses inetd, which is slow, or daemontools,
which is faster, but still slow, and more importantly, it anyway goes:
perl -> SMTP -> inetd -> qmail-smtpd -> qmail-inject.
So with going direct to qmail-inject, your
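Going direct might look like the sketch below. The /var/qmail/bin/qmail-inject path is the conventional install location, not something stated in the thread, and a stand-in command is used here so the sketch runs on machines without qmail:

```perl
use strict;
use warnings;

# '/var/qmail/bin/qmail-inject' is an assumption (the conventional
# qmail location); the stand-in lets the sketch run anywhere.
my $inject = $ENV{INJECT_CMD} || 'cat > /dev/null';

# Piped open hands the message straight to the injector, skipping
# SMTP -> inetd -> qmail-smtpd entirely.
open my $mail, '|-', $inject
    or die "cannot start $inject: $!";
print $mail "From: sender\@example.com\n";
print $mail "To: rcpt\@example.com\n";
print $mail "Subject: direct injection test\n";
print $mail "\n";
print $mail "body\n";
my $ok = close $mail;                 # true only if the command exited 0
die "$inject exited nonzero: $?" unless $ok;
print "message handed off\n";
```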
Stas Bekman [EMAIL PROTECTED] writes:
I even have a name for the project: Speedy Code Habits :)
The point is that I want to develop a coding style which tries hard to
do early premature optimizations.
I disagree with the POV you seem to be taking wrt write-time
optimizations. IMO,
Rob Nagler wrote:
Perrin Harkins writes:
Here's a fun example of a design flaw. It is a performance test sent
to another list. The author happened to work for one of our
competitors. :-)
That may well be the problem. Building giant strings using .= can be
incredibly slow; Perl
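The usual alternative to growing one scalar with repeated .= is to collect the pieces and join once; a small sketch with made-up data (whether .= is actually slow on any given Perl is exactly the kind of claim the thread says to profile, not assume):

```perl
use strict;
use warnings;

my @rows = map { "row $_" } 1 .. 1000;

# Growing a scalar with .= — the pattern the post flags as a potential
# cost when the string gets giant:
my $appended = '';
$appended .= "$_\n" for @rows;

# Collect-then-join builds the result in a single pass:
my $joined = join "\n", @rows;
$joined .= "\n";

print $appended eq $joined ? "identical output\n" : "outputs differ\n";
```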