On 6/11/06, Don Guinn <[EMAIL PROTECTED]> wrote:

Acceptable speed for interactive computing is psychological. A person


Only if you define it that way.  My points were on the basis of actual
functional capacity: this is not psychologically based.
(Though the intent to FIND that is.)

TSO (on IBM mainframe) used a delay loop to _slow_down_ responses during
lightly loaded times, because users perceived that the system was _slow_
during normal load times and would therein complain about slow
performance _all_the_time_ on the basis of encountering it
_some_of_the_time_.
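That uniform-delay idea can be sketched in a few lines. This is only an illustrative sketch in Python; the function name, the latency floor, and the 2-second default are my own assumptions, not TSO's actual parameters:

```python
import time

def respond_with_uniform_delay(handler, request, min_latency=2.0):
    """Run handler(request), but never reply faster than min_latency seconds.

    Sketch of the uniform-delay technique: responses computed quickly
    during light load are held back, so users see consistent latency
    instead of occasional (and therefore resented) slowness.
    """
    start = time.monotonic()
    result = handler(request)
    elapsed = time.monotonic() - start
    if elapsed < min_latency:
        # Hold the response so lightly loaded periods feel no faster
        # than normal load periods.
        time.sleep(min_latency - elapsed)
    return result
```

A privileged-account bypass, as described further below, would amount to skipping the sleep for flagged users.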

My comments dealt with _all_ conditions, and perforce, to discuss SW/HW
lack of meeting _uniform_ packing fractions _all_ of the time, _had_ to
deal with the peaks and valleys.

My own _psychological_ condition is that I regard it with distaste when

some computer (lawn, house, TV, product) company sells me schpui for
too high a price and asks me to repay it every (so often), making me
PAY for this "privilege" (of repeatedly repaying them for a SW idea now
3 decades old;
or, like Koss, for the privilege of seeing their next new portable CD
has no power cable strain relief, or that drains the underpowered two
AA batteries faster for Eveready)

just fast enough so I do not have enough funds to do what I would rather
do with them.

While actually psychologically based, probably not what you intended to
point at.

Most companies these days, in the era of mass advertising, with definite
semiotic cognitive control aspects (buy when they tell you), especially
over the lightly trained or inexperienced or arrogant ones like PhDs
(all of those who think they can NOT be so impacted), tend to prefer the
Microsoft approach: train your slave to buy on demand.

Oversimplified and over-angst-ridden, but about the best I can do on
short notice.

expects a certain level of performance which matches one's thought
processes. Not too fast, not too slow. What people really want is
consistent performance.


What people are trained to want is what they then expect will satisfy
them: which is whatever they were trained to want.

Back in the 70s, Stanford participated in an AI program after ETAOIN
SHRDLU in the field of medicine, on the topic of neurophysicians.  The
program used rudimentary analysis techniques to come up with a Q&A
pitter-patter pattern to minimize session length and maximize
prognosticative effectiveness.

It was judged against a panel of nationally eminent neurophysiologists.

The program won.

But the MDs involved recommended against the program, seeking, on a
psychological basis, to be given a more HUMBLE program that would,
instead of definitively STATING the most probable diagnosis, instead
supplicatively ASK the imminent (not a typo) doctors what they would
RECOMMEND as the best diagnosis.

They did not like to be bypassed as peers.

In short, they needed their ego stroked to function.

Result: It is now nigh 30 years later, and the organization of advanced
prognosticative data is not only not done and on the shelf, the more
important process of (a) using it to heal patients, (b) using it to
train new neuros (not a typo) to learn to diagnose by reasoning WITHOUT
it, (c) preparing and adopting a medicine-wide systematic approach to
actually capturing NEW such data and reducing it to universal
availability for practice, day to day, year to year ongoing:

  is not only not done, it is now being co-opted for another round of
decades of profit
  from the new thieves of Mankind's knowledge.


I watched a friend of mine who had given me all kinds of grief on the
horrible performance of TSO because it sometimes took 20 seconds to edit
a file, but normally just a few seconds. He brought a 4300 down from our



See the above.


New York office to demonstrate APL. He stood there smiling, happy as
could be, while it took almost a minute to load his workspace. Several
things. Something he could see was happening. The tape was spinning. It
always took that long to load a workspace. He was in control of the
situation.



Psycho control.  Please note you bypassed all the TSO steps.

Valid observations; just not the topic I was addressing.

Many PC owners PREFER a PC they can reboot, which in turn, when
reporting _reliability_,

   ...they will conveniently "forget" to accurately report.  Typical, normal:

     especially when the user wants the $$$$ for the PC over central
control spending it

    (and having the central resource work more cost effectively).



I too got that sinking feeling when occasionally on our TSO system I
pressed the enter key and nothing happened for a few seconds. Did the
system go down and I just lost the last hour of work?



Which was why IBM added the delay loop to _uniformly_ delay response.

As well as the special privilege flags to allow the "privileged" user
accounts to BYPASS it.

  Thus was born  the concept of computer aristocracy ...


CDC, on one of their time sharing systems, had an interesting parameter.
It was a delay in sending back the results of an interactive request. If
the calculations were completed in less time than this delay the
response was held until that time interval was met. Therefore,
performance was consistent even though the load on the processor varied
tremendously.


TSO had it first, sorry.


Wordstar under DOS took about 5 to 10 seconds to load a document. As we
have moved through word processors to the present it has always taken
about this amount of time. Much slower people get impatient. Much faster
and there's not time for a quick sip of coffee and a chance to compose
one's self for the task at hand.


People perceive _accurately_ when their expected work proficiency
_should_ go up.

And get combative when this is the case.


If I had my way, software developers would have to use the smallest
configuration on which their product was supposed to run to test and
demonstrate their applications. If the latest version or Windows or
whatever is supposed to run on a 256M one GH processor then Bill ought
to use that size PC hooked to his big screen when he demonstrates it to
the world. And, before he shows it off, maybe he ought to put on all the
applications we don't use but must have like virusware, spyware and all
the other cute stuff that seems to come with PCs for "free".



Agreed.  Efficiency of SW would improve.

Instead, when W95 was introduced, Bill Gates' comment about slow
performance was for the users to "go buy a 50MHz 486"; then a $5000
step up.

He literally did not understand why that was not a possibility for most
people.

And if he did thereafter learn that, he after that point then did not
care.

Mr. Ballmer, even less so.


Our PCs are like refrigerators. No matter how big they are they are
always full and there is unidentifiable green stuff at the back we
couldn't find for months.



Ayah.  I see that we actually agree in totality.


Randy MacDonald wrote:

>Hello J.R.;
>
>I'd love to see examples of where today's computers seem slower than
>those of the 1970s. It's just not the impression I get.
>
>
>

----------------------------------------------------------------------
For information about J forums see http://www.jsoftware.com/forums.htm




--
--
Roy A. Crabtree
UNC '76 gaa.lifer#11086
Room 207 Studio Plus
123 East McCullough Drive
Charlotte, NC 28262-3306
336-340-1304 (office/home/cell/vmail)
704-510-0108x7404 (voicemail residence)

[EMAIL PROTECTED]
[EMAIL PROTECTED]
[EMAIL PROTECTED]

http://www.authorsden.com/royacrabtree
http://skyscraper.fortunecity.com/activex/720/resume/full.doc
--
(c) RAC/IP, ARE,PRO,PAST
(Copyright) Roy Andrew Crabtree/In Perpetuity
   All Rights/Reserved Explicitly
   Public Reuse Only
   Profits Always Safe Traded