The strategy has always been to have machines make up for the inefficiency of bad code--after all, it's easier to get machines to do what you want than to get the people who write bad code to do so.
Donna
[EMAIL PROTECTED]


On 10-Jun-06, at 9:22 PM, John Randall wrote:

Randy MacDonald wrote:
Hello J.R.;

I'd love to see examples of where today's computers seem slower than those
of the 1970s. It's just not the impression I get.


Randy:

Of course today's computers are faster, and we expect them to do more. I am arguing that the increase in computational resources (processor speed, memory, disk size) has not been matched by software performance. This is especially true of programs such as word processors, where the I/O speed is glacial.

Let's look at a comparison from the late 1980s. I had an AT&T Unix PC (PC7300, 3b1) with 1MB memory, a 20MB disk and a 68010 at 10MHz. My current computer running Linux has 1000x more memory, 5000x more disk and is 500x as fast. The Unix PC ran a full version of Unix System V with a graphical interface (but not X-Windows). I would not go back, but I do not think the performance of my current computer is 500x better.
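For concreteness, the multipliers John quotes can be turned into back-of-the-envelope figures. A minimal sketch in Python; the derived "modern" numbers here are simply the PC7300 specs times the quoted multipliers, an illustration rather than measured specs of any actual machine:

```python
# PC7300 (AT&T Unix PC, 3b1) specs as stated in the post.
unix_pc = {"memory_mb": 1, "disk_mb": 20, "clock_mhz": 10}

# Growth multipliers quoted for the circa-2006 Linux box.
multipliers = {"memory_mb": 1000, "disk_mb": 5000, "clock_mhz": 500}

# Implied modern figures: spec * multiplier for each resource.
modern = {k: unix_pc[k] * multipliers[k] for k in unix_pc}

for k in unix_pc:
    print(f"{k}: {unix_pc[k]} -> {modern[k]} ({multipliers[k]}x)")
```

This works out to roughly 1 GB of memory, a 100 GB disk, and a 5 GHz clock, which is the scale of hardware growth the software performance would have to match.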

Best wishes,

John

----------------------------------------------------------------------
For information about J forums see http://www.jsoftware.com/forums.htm
