Not a personal attack either, but when I hear/read something like
what you said: "Keep it in mind, now that the machines are so
unbelievably much faster than they were when[...]", my first thought
is always "why, then, are applications not also orders of magnitude
faster?".

I leave the answer to that question as an exercise for the reader ...

Sébastien

On 10/31/06, J. Merrill <[EMAIL PROTECTED]> wrote:
(I re-read this just before sending, and I think it's necessary to say that I don't intend this to be treated 
as a personal attack.  When I say "you" I'm not trying to single you out personally.  Treat it 
as "those who have these ideas" or some such.  I do not intend to attack anyone personally, but I 
am willing to "take on" what I think are baseless concerns.)

I think you're worried about (what I think are) non-issues without any evidence 
that they're actually issues.  It's not as though memory allocation in .Net is 
a performance pig; it's a few assembler instructions (an add followed by a 
compare to see if a GC is needed).
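
To make that concrete, here is a rough bump-pointer allocator sketch in
C++ -- purely illustrative, not the actual CLR code. Allocating is just
advancing a pointer (the add) and checking whether the region has been
exhausted (the compare that would trigger a GC; this toy version just
fails instead of collecting):

    #include <cstddef>
    #include <cstdio>

    // Toy bump-pointer allocator -- a sketch of why GC-heap allocation
    // can be a few instructions; not the real CLR implementation.
    class BumpAllocator {
        char  heap_[1 << 16];   // 64 KB region standing in for a GC segment
        char* next_;            // next free byte
        char* limit_;           // end of the region
    public:
        BumpAllocator() : next_(heap_), limit_(heap_ + sizeof(heap_)) {}

        void* allocate(std::size_t size) {
            char* result = next_;
            next_ += size;            // the "add"
            if (next_ > limit_) {     // the "compare" -- a real GC would collect here
                next_ = result;       // toy version: roll back and fail
                return nullptr;
            }
            return result;
        }
    };

    int main() {
        BumpAllocator heap;
        void* p = heap.allocate(64);  // no free-list search, just bump and check
        std::printf("allocated at %p\n", p);
        return 0;
    }

Compare that with a general-purpose malloc, which typically has to search
free lists or size bins; that bump-and-check scheme is why allocation on a
compacted GC heap can be so cheap.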

When your network hardware and software stack is so fast and efficient that it makes your 
"high performance communications" code a bottleneck, maybe you'll need to re-write.  
However, keep in mind that any "high performance computer" today is literally 100 times 
faster than the fastest machine available only a couple of years ago -- and that's before the 
server is a multi-processor box with 4 or 8 quad-core processors, as it will be in 6-12 
months.  (The laptop I'm writing this on has 4 times the RAM of the desktop I've been writing 
software on for years.)  How much faster must the hardware get before it's time to stop worrying 
about the performance of the software you didn't write and worry instead about finishing the 
software that you're being asked to write?

Who said "first make it work, then make it fast"?  I don't remember, but that was a good 
thought.  Are you in the "performance tuning" part of your application life-cycle?  Are 
there no user-visible features left to build?

The whole idea of the .Net framework (and every other bit of "software 
layering" that we deal with) is to reduce the time, effort, and low-level knowledge 
it takes to produce the desired software.  You do that by giving up some control over the 
nitty-gritty in order to work, and hopefully think, at a higher level.

"Premature optimization is the root of all evil" was said by people much 
smarter than I.  Keep it in mind, now that the machines are so unbelievably much faster 
than they were when TCP/IP networking, garbage collectors, dynamic memory management (and 
on and on) were first implemented.  Back then, perhaps you had to worry about every 
memory allocation.  Today, you're wasting the money of the people who are paying you if 
you don't start working on the problems they think they're paying you to solve, and ignore 
the problems that they think someone has already solved and provided as tools for you to 
use.

At 04:53 AM 10/30/2006, Itay Zandbank wrote (in part)
>The answer may very well be "for high performance communications, use C++ and the
>Windows API", I'm just hoping it's not.


J. Merrill / Analytical Software Corp

===================================
This list is hosted by DevelopMentor®  http://www.develop.com

View archives and manage your subscription(s) at http://discuss.develop.com

