On Thu, 2003-06-26 at 09:57, Alex Shnitman wrote:
> On Thu, 2003-06-26 at 00:31, Gilboa Davara wrote:
> > Instead of using this huge amount of computing power to break the
> > software sand-box and take computing to a new level, we waste it on
> > object constructors, virtual function tables, house keeping backbones,
> > run-time engines and smart libraries the do their best to keep lousy
> > programmers from get what they (really) deserve: a one-way ticket home.
> Let me ask you a question. How much time would it take you to develop an
> application on your XT that would download a file over the Internet,
> filter it according to a regular expression, and display it on the
> screen for the user to scroll through? And how much time would it take
> you to develop this with the tools that you have now?
> That's the way computing progresses -- by creating more and more
> abstractions, so that more complicated things become easier to do.
> That's the reason why you have all this breadth of nifty applications
> for your PC now -- because it's getting easier to write them. There
> isn't so much number-crunching that the user wants to do on his
> computer, and there certainly isn't any value in running a word
> processor 100,000 times faster now than it ran on your XT. So yes, the
> cycles are wasted on object constructors and virtual function tables,
> because that's exactly what they're there for.
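Alex's example is worth making concrete: with today's abstractions, "download a file, filter it with a regular expression, display it" is a handful of lines. A minimal sketch (the URL is a placeholder of my own, not anything from the thread):

```python
import re
import urllib.request

def filter_lines(text, pattern):
    """Return only the lines of `text` that match `pattern`."""
    regex = re.compile(pattern)
    return [line for line in text.splitlines() if regex.search(line)]

def fetch_and_filter(url, pattern):
    # One library call replaces what would have been hand-rolled
    # TCP/HTTP code on an XT.
    with urllib.request.urlopen(url) as resp:
        text = resp.read().decode("utf-8", errors="replace")
    return filter_lines(text, pattern)

if __name__ == "__main__":
    for line in fetch_and_filter("http://example.com/", r"<h1>"):
        print(line)
```

The hard parts (sockets, HTTP, regex matching, Unicode decoding) are all buried in library code, which is exactly the trade-off being argued about here.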

You're missing my point completely. (And that's why I try to distance
myself from such discussions to begin with.)
Let me give you an example of what I mean:
A couple of years ago I worked for a medical software development
company. I was working on the database development side. (We had our own
proprietary object-oriented database.)
Our database was pretty cool; it could handle a hospital-level load on
a dual Pentium Pro machine. (Which was a far cry from most of the big
iron machines that were used back then.)
Our medical software side used PowerBuilder (and later VB) to develop
the medical applications.
To put it mildly, the medical application itself was, by far, slower and
heavier than the medical database that it was built upon.
While 50 clients could run easily on a P90/32MB, the medical application
ran like shit on a Pentium 166/64MB machine...!
And every time we pointed this anomaly out to the med team, they claimed
that "new machines are bound to come, new CPUs; by the time we're out,
CPU power won't be an issue."
You know what, the med software still runs slower than a dead dog on a
top-level P3/P4/Athlon machine... nothing has changed.

I saw the same line of thinking on each and every project I've worked
on. Why spend time on optimizing when you can rely on Moore's Law
instead? You see the same line of thinking in close to every major
software project today.

Can anyone tell me why OpenOffice runs like **** on a workstation-level
machine? Why Mozilla? (Which I personally love.) IE? MS Office? MS
Development Studio?

Yes, they are adding new features. But a couple of new features cannot
justify the inhuman load time presented by OpenOffice. Nor can they
explain why Sun's JRE eats tens of MB of RAM just to load a simple
web applet.
Ever tried running NetBeans? I've got coworkers working with it on
1.4GHz Athlons and it still mimics a VMware session running on a 486!

When was the last time you saw a new version of a major application
truly outperform its predecessor? (And I'm not talking about
database/file/etc.

