On Tue, Nov 11, 2008 at 9:22 AM EST, I wrote:
> I think that's the wrong question.  In this case, the right question
> would have been "Is it an issue that we have twice as many
> calculations as we need?"  And the answer around here is
> often that a factor of two is sometimes worth bothering with and
> sometimes not.  The "noise level" on a modern computer
> is usually less than a factor of two speed difference on repeated
> runs but can be significantly greater than a factor of two for the
> first run,
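
(As a rough, purely illustrative sketch: in J you can watch that noise
directly with the 6!:2 timing foreign.  The sentence below is just a
stand-in for whatever calculation you actually care about:

   sentence =: '+/ *: i. 1e6'   NB. sum of squares of the first million integers
   6!:2 sentence                NB. one run; the first run is often the slowest
   6!:2 sentence                NB. repeat runs usually differ by less than 2x
   10 (6!:2) sentence           NB. left argument: average the time over 10 runs

The averaged form is the one I would trust when deciding whether a
factor of two is even visible above that noise.)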

I should add that there are bigger reasons why a factor of 2 is
usually not very relevant.

One caveat, of course, is that sometimes a factor of 2 can be
relevant.  That is the case when the factor of 2 sits on your
critical path and is depleting a bottleneck resource.

However, a bigger issue is that re-architecting your system can
often give you a factor of 1000, or better, in resource use.
(For example, if a calculation is being run repeatedly, maybe you
can lift some of the work out of the repeated part of the system,
along the lines of the sketch below.)
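
As a rough sketch of that kind of lifting in J (the names data, slow,
fast, and total are made up for this note, not taken from anyone's
actual code):

   data =: ? 1000 100 $ 100      NB. a random integer table
   slow =: 3 : 'y % +/ , data'   NB. recomputes the grand total for every row
   total =: +/ , data            NB. lifted out: the grand total computed once
   fast =: 3 : 'y % total'
   (slow"1 data) -: fast"1 data  NB. 1: same result, but the big sum ran once

The saving here is trivial, but the same move applied to a genuinely
expensive calculation is where the factor-of-1000 improvements tend
to come from.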

Another, bigger, issue is that in the long run simplicity usually
matters more than resource use.  Not always, of course.  But
simplicity also helps you once you have identified your bottlenecks --
with a simple system you can often make fundamental rearrangements
that would be much more expensive if you had already spent a lot of
time and complexity optimizing the non-bottlenecked parts of your
system.

Finally, the biggest issue is correctness.  If you optimize before
you have your system doing the right thing, at best you have
a fast way of doing the wrong thing.  (Or, maybe you have a slow
way of doing the wrong thing that has lots of fast and efficient
subsystems.)

-- 
Raul
----------------------------------------------------------------------
For information about J forums see http://www.jsoftware.com/forums.htm
