On Dec 10, 2003, at 4:02 PM, R. Joseph Newton wrote: [..]
Most CPUs in use average about 99% idle time, at least on the
computers [some running up to 20 open windows] on which I have
checked these stats.

Not wishing to get us bogged down in a convention of the IEEE Transactions on Distributed Processing: are you talking here in terms of the average 'desktop' unit that most persons have access to? Back-end headless servers in dedicated database-service mode? yada-yada-yada...

{ forgive me, I was traumatized by major data center issues
at a formative age, where optimizing CPU utilization on
Crays was an Imperative...
But my therapist says I'm getting better...8-}

To me, the more important issues in normal
practice have to do with comprehensibility, and
thus maintainability, rather than with technical
performance comparisons.

Is that 'normal' as in 'bell-curve shaped', or 'normal' as in 'normalized data', DB-speak? 8-)

I think a part of the underlying set of questions
here is when it is time to move one's data sets
out of the simpler Perl tied hash backed by a
generic DB_File (sketched below) and off into a
'real database' such as Postgres, for which
Randal Schwartz recently noted some of the newer
benchmark numbers. The simplest answer is

        when the DB portion of the process is the logjam
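
That simpler starting point is only a handful of
lines. A minimal sketch (the file name cache.db
is my own invention, nothing canonical):

    #!/usr/bin/perl
    use strict;
    use warnings;
    use Fcntl;
    use DB_File;

    # Tie a plain hash to an on-disk file; every read
    # and write of %cache goes straight to cache.db.
    my %cache;
    tie %cache, 'DB_File', 'cache.db', O_CREAT|O_RDWR, 0666, $DB_HASH
        or die "Cannot tie cache.db: $!";

    $cache{last_run} = time();
    print "last_run = $cache{last_run}\n";

    untie %cache;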

Which then takes us back to what I wish to
underscore in R. Joseph's point,

        IF one has written modular, and
                maintainable, DB interface code
        Then
                one simply opts to change the guts
                        on the inside of a Module.
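
For instance, a minimal sketch (the module name
My::DataStore and its two subs are merely
illustrative, not anybody's published interface):

    package My::DataStore;
    use strict;
    use warnings;
    use Fcntl;
    use DB_File;

    # Today the 'guts' are a tied hash on disk...
    my %store;
    tie %store, 'DB_File', 'store.db', O_CREAT|O_RDWR, 0666, $DB_HASH
        or die "Cannot tie store.db: $!";

    # ...but the client side only ever calls these two
    # subs, so swapping the tie for, say, DBI/Postgres
    # touches nothing outside this file.
    sub fetch { my ($key) = @_; return $store{$key} }
    sub stash { my ($key, $value) = @_; $store{$key} = $value }

    1;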

If one has built it so that one is already using
a 'network query' interface, having abstracted
away the database semantics, one can do this
without changing anything on the 'client side'
of that code.
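
With DBI, for instance, the choice of back end
collapses into the connect string. A sketch (the
two DSNs and the widgets table are assumptions
of mine):

    #!/usr/bin/perl
    use strict;
    use warnings;
    use DBI;

    # Only the DSN names the back end; swap the SQLite
    # line for the Pg line and the 'client side' below
    # is untouched.
    my $dsn = 'dbi:SQLite:dbname=app.db';
    # my $dsn = 'dbi:Pg:dbname=app;host=dbhost';

    my $dbh = DBI->connect( $dsn, '', '', { RaiseError => 1 } )
        or die $DBI::errstr;

    my $sth = $dbh->prepare('SELECT name FROM widgets WHERE id = ?');
    $sth->execute(42);
    while ( my ($name) = $sth->fetchrow_array ) {
        print "widget 42 is named $name\n";
    }
    $dbh->disconnect;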

ciao
drieux
