[cctalk] Re: First Personal Computer

2024-05-25 Thread Mike Katz via cctalk
You see, we are back to my original comment.  The definition of Personal 
Computer is quite fluid.  Does it have to be called a Personal Computer 
in advertising literature, or does any computer that can be used by a 
single person, in any environment, constitute a personal computer?


If I am writing the definition, then my original comment stands: the 
Antikythera Mechanism is the first computing device designed to be used 
by a single person.


To someone else the Altair is the first personal computer.  And to yet 
someone else, an early PDP or HP computer might be the first personal 
computer.


There were many mechanical, and later analog, computing devices in use long 
before the modern digital computer.  What about the Hollerith Machine 
used to count the census from the 1890's to the 1950's?  It was a one 
person calculating machine, but since it was used for commercial purposes, 
does that make it a personal computer?  When IBM initially released the 
first PC it was intended not for home use but for business use (for 
$10,000 in 1980's dollars).  The Northstar Horizon was also marketed as a 
business computer but used by home S-100 hobbyists.


The point, as I stated earlier, is that for every different definition 
we will find a different result.


I guess this means that the definition of personal computer is personal.

This is written with tongue firmly implanted in cheek.

On 5/25/2024 1:27 PM, Jon Elson via cctalk wrote:

On 5/25/24 13:05, CAREY SCHUG via cctalk wrote:
When announced and sold new, were the SIMON, LINC and G-15 sold and 
described as, in the exact words, "personal computer"?  Did the guy 
with multiple supercomputers in his basement buy them NEW, to use 
them for their designed purpose?  If not they are just memorabilia, 
like a victrola.


The Bendix G-15 came out in 1956!  It cost about $60,000 in 1956 
Dollars.  The first LINC machines were built at an MIT summer school 
by grad students who would then take them back to their home 
institutions and use them in biomedical research labs.  The LINCs in 
this case cost about $50K, and were built starting in 1961-62.  The 
term "personal computer" was not coined until a LONG time after.


Jon





[cctalk] Re: terminology [was: First Personal Computer]

2024-05-25 Thread Mike Katz via cctalk


I'm sorry, but I beg to differ with you here.  The DEC PDP line of single 
user interactive computers (as opposed to batch processing only systems) 
started in the late 1950's and early 1960's and spawned many generations 
as well as copies and other companies (Data General being the most well 
known of these).  Yes, multi-user time sharing operating systems were 
added later on, but initially they were single user interactive (DEC 10 & 
20 excepted).

Does a computer lose its "Personal" identification if it can handle 
multiple users as an option?  There were multi-user time sharing 
operating systems for many early personal computers (Unix, Xenix, MP/M, 
UniFLEX, OS-9, etc.).  Even the aforementioned PDP computers ran 
multi-user time sharing systems.  Does that, then, invalidate them for 
consideration as a personal computer?  Does that make any Linux machine 
not a personal computer, by definition, because it can handle more than 
one user or task?

As I have said earlier in this thread and its forebears, the term 
Personal Computer is so non-specific that we can argue from here to 
Alpha Centauri and back without coming up with an agreed-upon 
definition.  So, until a concrete definition can be made, the discussion 
of the answer is completely moot.

I stick by my original challenge: find a calculating device that 
predates the Antikythera Mechanism (36 BCE).  Simple measuring devices 
like the sundial and sextant don't count, as they don't calculate, they 
measure.

On 5/25/2024 4:26 PM, CAREY SCHUG via cctalk wrote:

(Rick--IIRC, some later Curta (knockoffs?) could do square roots too, is that 
true, do you have one?)

OK, I'll loosen up, or make exceptions.

Maybe some devices before the Altair used the exact or inexact words "personal 
computer".

but they did not "create a market" or lead directly to a series of similar, competing 
products, and do not come anywhere close to what we think of as a "personal computer" 
now.  The Altair is very different from a modern personal computer, but still has more similarities 
and a continuous chain of intermediate stages.  Part of what makes a personal computer to us is 
that we can easily switch from one to another.  That would not have been possible between the 
LGP-30, LINC, etc.

I still ask the question, what fraction of the G-15, LINC, IBM 610, Programma 101, etc, 
were purchased as "personal" devices by an individual, for personal use, and 
from household funds rather than via a corporate (including educational) purchase-order?

Other terminology:

IIRC the first computers that were sold as "supercomputers" were scalar, maybe with a few 
more processors than the generation before, but programmed in the same manner.  Then shortly came 
the massively parallel "supercomputers" created from commodity microcomputer chips, and 
the term supercomputer has transitioned to mean them.

--Carey


On 05/25/2024 3:33 PM CDT Rick Bensene via cctalk  wrote:

  
While the LGP-30(vacuum tube/drum), G-15(vacuum tube/drum), and PB-250(transistor/delay lines) predated it, the ground-breaking Olivetti Programma 101(transistor/delay line) programmable desktop calculator was officially called a "personal computer" in some of its advertising and sales literature.  It was introduced in October of 1965.


Late in the game as far as single-user, standard AC-line-powered computing devices 
compared to those machines and probably others, but those machines, AFAIK, were not 
advertised nor specified as "personal computers".
   
That said, I am much more aware of electronic calculator history than computer history, so I could be entirely biased here.  Also, the Programma 101, as I've stated here before, only scratches the definition of a true computer in that it is not capable of handling any data type but floating point binary-coded decimal numbers, has very limited data storage capability, and had no peripheral interfacing capability.


There were quite a number of single-user computing devices made and sold that ran on 
standard AC power, and were vastly more capable than the Programma 101, and predated it, 
but, AFAIK, were not advertised or particularly marketed as "personal 
computers".

One that comes to mind is the Monroe Monrobot III(vacuum tube/drum), introduced 
in February, 1955.

Another is the IBM 610 "Auto Point"(vacuum tube/drum) computer, introduced in 
1957.
It was originally named the "Personal Automatic Computer" (PAC) by its designer.

I'm sure that there are quite a few other machines developed in the mid-to-late 
1950's that would qualify as personal computing devices, but these two are the 
ones that I'm aware of that seem to fit the bill.   Some of these may actually 
have been capable of manipulating data types other than decimal numbers.

In 1962, Casio introduced its AL-1 programmable (up to 360 steps) relay-based 
electric calculator.  It was definitely intended as a personal computing 
device, and calculations could be performed manually from a keyboard much 

[cctalk] Re: First Personal Computer

2024-05-25 Thread Mike Katz via cctalk
Now that is an interesting refinement.  Limiting it to 1800 VA most likely 
eliminates almost anything vacuum tube based.


My 1974 PDP-8/E computer alone (not counting external storage devices) 
was rated at 115V @ 10A.  I don't know what its power factor is, 
but that is 1150 VA.  Does that count?  Technically I don't need any 
peripherals to program it or get the program results.  I just use the 
front panel.
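
[A minimal Python sketch of the arithmetic being discussed, assuming the
115 V @ 10 A nameplate figure above and Chuck's proposed 1800 VA ceiling;
the 0.7 power factor used below is purely an illustrative assumption, not
a measured value for the PDP-8/E.]

    # Sketch: nameplate rating vs. the proposed 1800 VA "personal computer" cutoff.
    LIMIT_VA = 1800.0

    def apparent_power_va(volts, amps):
        # Apparent power (VA) is just nameplate volts times amps.
        return volts * amps

    def real_power_w(volts, amps, power_factor):
        # Real power (W) equals VA only when the power factor is 1.0.
        return volts * amps * power_factor

    pdp8e_va = apparent_power_va(115, 10)   # 1150 VA from the nameplate above
    print(f"PDP-8/E: {pdp8e_va:.0f} VA "
          f"({'within' if pdp8e_va <= LIMIT_VA else 'over'} the {LIMIT_VA:.0f} VA limit)")
    print(f"At an assumed power factor of 0.7: {real_power_w(115, 10, 0.7):.0f} W real power")

    # Chuck's conversion of the budget into current at common mains voltages:
    for mains_v in (100, 120, 220, 240):
        print(f"{mains_v} V mains: about {LIMIT_VA / mains_v:.1f} A maximum")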


Does that 1800 VA include any necessary peripherals such as terminals, 
CRT's, disk drives, tape drives, etc?


See, even that definition is really non-specific and open to interpretation.

Should we add a limitation on volume occupied as well?  This would 
eliminate many rack or multiple rack computers.


What about memory type?  Before semiconductor RAM and CORE there were 
serial, drum, tape and mechanical memory systems.


Does it have to be a digital computer?  There were many table top analog 
computers in the 50's and 60's.  Even Heathkit made some.


Does the output need to be text?  (What about an analog computer with a 
digital Nixie tube display?)


Everyone has their own definition of what a Personal Computer is to 
them.  It's all subjective.


How much computing power and electricity are we using trying to identify 
something whose mere definition is so subject to interpretation?


On 5/25/2024 4:57 PM, Chuck Guzis via cctalk wrote:

On 5/25/24 13:41, Fred Cisin via cctalk wrote:

On Sat, 25 May 2024, Chuck Guzis via cctalk wrote:
. . . or 100V or 220V in locations where those are the standard for
household residential wiring.
Would not want to automatically exclude UK machines, such as the
Sinclair doorstop wedge.

Okay, I'll refine it for the international crowd.  Anything that
requires over 1800 VA to run isn't a "personal computer."  That's about 8
amps for the 220 volt world.

--Chuck





[cctalk] Re: Help? Programming SCM90448 EPROMs

2024-05-25 Thread Mike Katz via cctalk
I have an old Logical Devices Inc. Gangpro 8 with DIP sockets on it.  
What package is this part?  Is it compatible with anything more widely 
known?


On 5/25/2024 12:23 PM, Glen Slick via cctalk wrote:

On Sat, May 25, 2024 at 6:27 AM emanuel stiebler via cctalk
 wrote:

Hi all,
anybody in the US could program some SCM90448 EPROMs for me?
None of the programmers I have here can do it.

Some old, trusty DATA I/O ???

What is an SCM90448? Can you find a datasheet for that part? I cannot.




[cctalk] Re: First Personal Computer

2024-05-25 Thread Chuck Guzis via cctalk
On 5/25/24 13:41, Fred Cisin via cctalk wrote:
> On Sat, 25 May 2024, Chuck Guzis via cctalk wrote:

> . . . or 100V or 220V in locations where those are the standard for
> household residential wiring.
> Would not want to automatically exclude UK machines, such as the
> Sinclair doorstop wedge.

Okay, I'll refine it for the international crowd.  Anything that
requires over 1800 VA to run isn't a "personal computer."  That's about 8
amps for the 220 volt world.

--Chuck



[cctalk] Re: First Personal Computer

2024-05-25 Thread Fred Cisin via cctalk

On Sat, 25 May 2024, Chuck Guzis via cctalk wrote:

On 5/25/24 08:14, Jon Elson via cctalk wrote:
Offhand, if I were King of the World, I'd immediately eliminate from
competition those computers that cannot be run from a US 120 volt 15 amp
wall receptacle.   The rationale being that anything that requires
special power wiring cannot be "personal"


. . . or 100V or 220V in locations where those are the standard for 
household residential wiring.
Would not want to automatically exclude UK machines, such as the Sinclair 
doorstop wedge.


[cctalk] Re: First Personal Computer

2024-05-25 Thread Rick Bensene via cctalk
While the LGP-30 (vacuum tube/drum), G-15 (vacuum tube/drum), and 
PB-250 (transistor/delay lines) predated it, the ground-breaking Olivetti 
Programma 101 (transistor/delay line) programmable desktop calculator was 
officially called a "personal computer" in some of its advertising and sales 
literature.  It was introduced in October of 1965.   

It was late in the game as far as single-user, standard AC-line-powered computing 
devices go, compared to those machines and probably others; but those machines, 
AFAIK, were not advertised nor specified as "personal computers".
  
That said, I am much more aware of electronic calculator history than computer 
history, so I could be entirely biased here.  Also, the Programma 101, as I've 
stated here before, only scratches the definition of a true computer in that it 
is not capable of handling any data type but floating point binary-coded 
decimal numbers, has very limited data storage capability, and had no 
peripheral interfacing capability.

There were quite a number of single-user computing devices made and sold that 
ran on standard AC power, and were vastly more capable than the Programma 101, 
and predated it, but, AFAIK, were not advertised or particularly marketed as 
"personal computers".

One that comes to mind is the Monroe Monrobot III(vacuum tube/drum), introduced 
in February, 1955.

Another is the IBM 610 "Auto Point"(vacuum tube/drum) computer, introduced in 
1957.
It was originally named the "Personal Automatic Computer" (PAC) by its designer.

I'm sure that there are quite a few other machines developed in the mid-to-late 
1950's that would qualify as personal computing devices, but these two are the 
ones that I'm aware of that seem to fit the bill.   Some of these may actually 
have been capable of manipulating data types other than decimal numbers.

In 1962, Casio introduced its AL-1 programmable (up to 360 steps) relay-based 
electric calculator.  It was definitely intended as a personal computing 
device, and calculations could be performed manually from a keyboard much like 
a regular calculator, but also automatically via plastic toothed gears that 
would have teeth broken off of them to encode program steps.  The gears would 
be electrically read by the machine and directed the machine to perform 
computer-like operations.

I'm not arguing that any of these, including the Programma 101, are the first 
"personal computers" by any means.   I'm just adding some thoughts to the 
discussion.

Rick Bensene
The Old Calculator Museum
https://oldcalculatormuseum.com





[cctalk] Re: First Personal Computer

2024-05-25 Thread Jon Elson via cctalk

On 5/25/24 13:05, CAREY SCHUG via cctalk wrote:

When announced and sold new, were the SIMON, LINC and G-15 sold and described as, in the 
exact words, "personal computer"?  Did the guy with multiple supercomputers in 
his basement buy them NEW, to use them for their designed purpose?  If not they are just 
memorabilia, like a victrola.

The Bendix G-15 came out in 1956!  It cost about $60,000 in 
1956 Dollars.  The first LINC machines were built at an MIT 
summer school by grad students who would then take them back 
to their home institutions and use them in biomedical 
research labs.  The LINCs in this case cost about $50K, and 
were built starting in 1961-62.  The term "personal 
computer" was not coined until a LONG time after.


Jon



[cctalk] Re: First Personal Computer

2024-05-25 Thread Sellam Abraham via cctalk
On Sat, May 25, 2024 at 11:05 AM CAREY SCHUG via cctalk <
cctalk@classiccmp.org> wrote:

>
> Because ONE *developer* of the LINC used his position to take one home and
> use it the way we currently use "personal computers" does not mean EVERY
> OTHER LINC was also a personal computer.  Did he pay the full street
> price?  I'm guessing not.  If you want to put a plaque on that single unit,
> fine, but I am sure other one-off home brew machines need to be included
> too.  I think here we are talking about production machines available for
> sale to all.
>

I agree with you on the Altair.  I was only submitting my comment on the
LINC in the scenario where we're throwing context out the window and
relying on revisionism to define the term.  But in actuality, there was
more than one LINC made, and the design was the basis for the DEC MINC-11.
So it has some of the requirements.


> I believe some obsolete warships (certainly ICBM silos) have been sold to
> private individuals. Does that retroactively mean the original warships and
> ICBMs are "personal yachts and weapons"?  I'll bet one rich guy bought a
> Mississippi riverboat for personal use, does that make them *ALL* into
> "personal pleasure craft"?  That is a slippery slope.
>

> --Carey
>

Hey, if the government needs a fully-provisioned aircraft carrier to defend
the nation, then so do I.

Sellam


[cctalk] Re: First Personal Computer

2024-05-25 Thread CAREY SCHUG via cctalk
When announced and sold new, were the SIMON, LINC and G-15 sold and described 
as, in the exact words, "personal computer"?  Did the guy with multiple 
supercomputers in his basement buy them NEW, to use them for their designed 
purpose?  If not they are just memorabilia, like a victrola.

Somebody claimed the Altair was the first time the actual words "personal 
computer" were used.

If so, I would say that is the correct definition.  Period.  Final Answer, 
Regis.

I cannot redefine my 8-core gaming machine with 3 NVIDIA cards as a 
"supercomputer" just because I want to and it has more memory and megaflops 
than some other device that historically was defined as a supercomputer.  I 
cannot take 250 Radio Shack Color Computers and network them to cooperatively 
solve a single problem, and then call that a supercomputer, unless somebody did 
that when the computer was still being sold new (yes, I know there were 
"networks" for radio shack computers, but AFAIK they only let some number of 
students share resources from the central teacher's computer and in most or all 
cases could not talk directly to each other).

Specifically, like the Altair, which seems to fit common usages of the words 
"personal" and "computer": 

 "something sold NEW to private citizens, for personal enjoyment and not 
for gain"

Granted some buyers hoped to develop software or hardware they could sell.  
Some may have had a day job in computers, but almost as likely might have been 
cooks or door-to-door bible salesmen (statistically not absolutely).

Because ONE *developer* of the LINC used his position to take one home and use 
it the way we currently use "personal computers" does not mean EVERY OTHER LINC 
was also a personal computer.  Did he pay the full street price?  I'm guessing 
not.  If you want to put a plaque on that single unit, fine, but I am sure 
other one-off home brew machines need to be included too.  I think here we 
are talking about production machines available for sale to all.

I remember seeing the original development unit for the Amdahl V6, made with 
discrete components and much larger (and slower) than the production models.  
Although it was nicknamed "Gene's machine", that was NOT a personal 
computer, sorry.  

The ISA cards that anybody could buy to run S/370 operating systems in a PC 
should qualify, IMHO, but not 4331 level units.  That is getting iffy; I'd like 
to see statistics on how many were purchased by schools and businesses for their 
employees to use AS LONG AS THEY REMAINED EMPLOYED, versus how many were 
purchased by individuals and run in their place of residence.  I'm guessing at 
least 5% were sold to private individuals, and if anybody quibbles that is not 
enough, I am willing to not include them.  I at least thought about buying one; 
perhaps there were ongoing license charges that ended that dream?

I believe some obsolete warships (certainly ICBM silos) have been sold to 
private individuals. Does that retroactively mean the original warships and 
ICBMs are "personal yachts and weapons"?  I'll bet one rich guy bought a 
Mississippi riverboat for personal use, does that make them *ALL* into 
"personal pleasure craft"?  That is a slippery slope.

--Carey

> On 05/25/2024 10:20 AM CDT Sellam Abraham via cctalk  
> wrote:
> 
>  
> On Sat, May 25, 2024, 8:14 AM Jon Elson via cctalk 
> wrote:
> 
> > On 5/24/24 11:49, Mike Katz via cctalk wrote:
> > > The problem with this debate is that the definition of
> > > Personal Computer is totally fluid and can be written so
> > that the writer's opinion is fact.
> >
> > Yes, the Bendix G-15 was said to be the first personal
> > computer. It was as big as a refrigerator, and weighed a LOT
> > more, and drew much more power.  (300 vacuum tubes, 3000
> > Germanium diodes,  drum memory.)  but, one guy could program
> > it and run it.
> >
> > The LINC comes in a close second.
> >
> > Jon
> >
> 
> I know a guy in a basement in Germany that has three supercomputers up and
> running, that he installed and maintains himself.  Except for when he
> invites guests over, they're very personal.
> 
> That being said, I don't know that the Bendix G-15 fits the bill, but the
> LINC very much does, especially considering it was kinda of intended to be
> a single user machine, and at least one of the team that put it together
> brought one home and used it there.
> 
> If I were writing the definitive history of personal computing, I'd maybe
> start with SIMON, then the LINC, then eventually the Altair.
> 
> Sellam
> 
> >


[cctalk] Re: Help? Programming SCM90448 EPROMs

2024-05-25 Thread Glen Slick via cctalk
On Sat, May 25, 2024 at 6:27 AM emanuel stiebler via cctalk
 wrote:
>
> Hi all,
> anybody in the US could program some SCM90448 EPROMs for me?
> None of the programmers I have here can do it.
>
> Some old, trusty DATA I/O ???

What is an SCM90448? Can you find a datasheet for that part? I cannot.


[cctalk] Re: First Personal Computer

2024-05-25 Thread Chuck Guzis via cctalk
On 5/25/24 08:14, Jon Elson via cctalk wrote:

> Yes, the Bendix G-15 was said to be the first personal computer. It was
> as big as a refrigerator, and weighed a LOT more, and drew much more
> power.  (300 vacuum tubes, 3000 Germanium diodes,  drum memory.)  but,
> one guy could program it and run it.
> 
> The LINC comes in a close second.

Offhand, if I were King of the World, I'd immediately eliminate from
competition those computers that cannot be run from a US 120 volt 15 amp
wall receptacle.   The rationale being that anything that requires
special power wiring cannot be "personal"

So, for example, the PB-250 qualifies; the IBM 1130 does not.  The
Honeywell H316 "Kitchen computer" probably does, in the sense of intent,
but it was never produced for mass consumption.

I recall a short-lived 60's attempt at a personal data retrieval device
(cassette tape storage)--I don't think it had any computational
capabilities, so probably not a computer per se.   Anyone remember the name?

--Chuck




[cctalk] Re: First Personal Computer

2024-05-25 Thread Sellam Abraham via cctalk
On Sat, May 25, 2024, 8:14 AM Jon Elson via cctalk 
wrote:

> On 5/24/24 11:49, Mike Katz via cctalk wrote:
> > The problem with this debate is that the definition of
> > Personal Computer is totally fluid and can be written so
> > that the writer's opinion is fact.
>
> Yes, the Bendix G-15 was said to be the first personal
> computer. It was as big as a refrigerator, and weighed a LOT
> more, and drew much more power.  (300 vacuum tubes, 3000
> Germanium diodes,  drum memory.)  but, one guy could program
> it and run it.
>
> The LINC comes in a close second.
>
> Jon
>

I know a guy in a basement in Germany that has three supercomputers up and
running, that he installed and maintains himself.  Except for when he
invites guests over, they're very personal.

That being said, I don't know that the Bendix G-15 fits the bill, but the
LINC very much does, especially considering it was kind of intended to be
a single user machine, and at least one of the team that put it together
brought one home and used it there.

If I were writing the definitive history of personal computing, I'd maybe
start with SIMON, then the LINC, then eventually the Altair.

Sellam

>


[cctalk] Re: First Personal Computer

2024-05-25 Thread Jon Elson via cctalk

On 5/24/24 11:49, Mike Katz via cctalk wrote:
The problem with this debate is that the definition of 
Personal Computer is totally fluid and can be written so 
that the writer's opinion is fact.


Yes, the Bendix G-15 was said to be the first personal 
computer. It was as big as a refrigerator, and weighed a LOT 
more, and drew much more power.  (300 vacuum tubes, 3000 
Germanium diodes,  drum memory.)  but, one guy could program 
it and run it.


The LINC comes in a close second.

Jon



[cctalk] Re: Experience using an Altair 8800 ("Personal computer" from 70s)

2024-05-25 Thread Sellam Abraham via cctalk
On Fri, May 24, 2024, 5:48 PM Rich Alderson via cctalk <
cctalk@classiccmp.org> wrote:

>
> And Sellam is simply wrong.
>
> Rich
>

You got your opinions, I got mine. And old Billy Boy has some skeletons in
his closet.  Perhaps literally.

Sellam

>


[cctalk] Re: Experience using an Altair 8800 ("Personal computer" from 70s)

2024-05-24 Thread Dave Dunfield via cctalk
Chuck Guzis wrote:
> I don't think the "first" applies in this case.  The MCM/70 used an 8008

On the subject of early 8008 designs - there was a Canadian one (1974 I think): 
the MIL (Microsystems International Limited) MOD-8, later also released as the 
GNC-8 (Great Northern Computers).

I created an emulator for it as well - so you can experience using another 
very early system if you like...

Sometime later, Scelbi 8008 BASIC was ported to it (also in my archive) - this 
has to
be one of the very earliest (notice I didn't say F-r-t :-) BASICs.

Dave


[cctalk] Re: Experience using an Altair 8800 ("Personal computer" from 70s)

2024-05-24 Thread Bill Degnan via cctalk
On Fri, May 24, 2024 at 11:30 PM Dave Dunfield via cctalk <
cctalk@classiccmp.org> wrote:

> Well .. I certainly expected lots of "discussion" on these statements
> about my Altair:
>
> I have never claimed to be an "unknown drip"(*) on details of computer
> history, but here is my reasoning:
>
> > First Personal Computer (long before IBM PC)
>
> I am well aware of small systems that predated the Altair, but they
> are/were not nearly as well known (mainly due to Jan/Feb 1975 Popular
> Electronics), and I don't recall that nearly as many of them were as
> commonly owned and operated by "people of modest means" and/or not
> "in the industry".
>
> And unlike most predecessors it was expandable by a means that grew
> into a whole industry.
>
>
>
With respect, I have studied the 1956 Royal McBee LGP-23 (and later -30) at
length and found one could easily use this computer as a "personal
computer". The machine docs indicate that it was sold for general computing
use, operated in real time by one person.  From the training materials I
have on hand, it appears as if this machine was intended as an open system
and people were trained to have at it.  The Friden Flexowriter was the I/O
device, a bootstrap was loaded into the drum memory and off you went.

The LGP-30 inspired Kemeny and Kurtz to write BASIC.  One might find it
pretty easy to program "Hunt the Wumpus" using this machine, but it was not
powerful enough to run BASIC as it was written originally.

Pretty cool if you ask me and I don't know of any other stand-alone
computer intended to be used specifically as a one person general
electronic computing device before the LGP-23/30.  A first?  Not saying
that, but my definition of personal computer is met by the Royal McBee
LGP.  Conclude what you want.

If anyone has a spare LGP-23 or 30 please send to me, thanks in advance.  I
will come pick it up.

Bill Degnan


[cctalk] Re: Experience using an Altair 8800 ("Personal computer" from 70s)

2024-05-24 Thread Dave Dunfield via cctalk
Well .. I certainly expected lots of "discussion" on these statements
about my Altair:

I have never claimed to be an "unknown drip"(*) on details of computer
history, but here is my reasoning:

> First Personal Computer (long before IBM PC)

I am well aware of small systems that predated the Altair, but they
are/were not nearly as well known (mainly due to Jan/Feb 1975 Popular
Electronics), and I don't recall that nearly as many of them were as
commonly owned and operated by "people of modest means" and/or not
"in the industry".

And unlike most predecessors it was expandable by a means that grew
into a whole industry.

I too generally avoid using "first" in history discussions... but

At one time I discussed this with Ed Roberts, the creator of the
Altair, and he said:
 "We coined the phrase Personal Computer and it was first applied
 to the Altair, i.e., by definition the first personal computer."
 ...
 "The beginning of the personal computer industry started without
 question at MITS with the Altair."


> First S100 buss system

Originally called "Roberts Buss" the Atair expansion buss was used by
many systems that followed, and not wanting to use their competitors
name, the buss became known as "S100" (presumably System buss with
100 pins)

Again, Ed Roberts confirmed this to me.


> First system Bill Gates wrote code for (long before Microsoft)

I should have qualified this with "well known published" code.

As far as I know, Bill's career really took off with his
implementation of BASIC - which became MITS Altair BASIC.

And perhaps Microsoft started "only a few years" after (which WAS a
LONG time in those days of the industry) - but it wasn't anywhere near
what it would become some years after that! - and I don't think it was
at all well known till MS-DOS (post IBM-PC).

But again, I don't claim to be:

(*)
X - marks the unknown
Spurt - a drip under pressure

.. and I don't claim to be an "unknown drip under pressure"
(I'll happily leave that honor to others in the group :-)

Dave


[cctalk] Re: ANITA (was: Experience using an Altair 8800 ("Personal computer" from 70s))

2024-05-24 Thread Rick Bensene via cctalk




Christian Corti wrote:

> The Anita electronic desktop calculators are a perfect example for the usage 
> of 
> selenium rectifiers in logic gates.

..and anyone who has restored one knows that the vast majority of the 
back-to-back selenium  diode packages have to be replaced with something else 
as they no longer function properly.  Ambient moisture kills Selenium as a 
semiconductor, and even though these devices were packaged to avoid that to 
some degree, after 60 years, stuff happens.

Many restorers resort to de-soldering the dual-diode packages from the circuit 
boards, hollowing out the package (removing the Selenium rectifiers and the 
potting material used) and installing back-to-back conventional Silicon diodes 
that are rated for the appropriate voltages involved in these machines, potting 
the diodes in place with some kind of material (epoxy?), and re-soldering the 
package to the circuit board.  These calculators used gas-discharge active 
logic elements (e.g., thyratrons and dekatrons) and used (relatively speaking) 
high voltages for their logic levels.  Fortunately, these gas-discharge devices 
seem to fare quite well with time, and though some do fail due to atomic-level 
outgassing or simple breakage, the majority of them work just as well as the day 
the machine came off the assembly line.

Such practice with the Selenium rectifier modules makes the calculator look 
original if done carefully, and allows it to function when operation was 
impossible with the original devices.   It is an extremely tedious and 
time-consuming process, as there are a great many of these devices used in the 
first-generation Sumlock/ANITA calculators.  

I applaud anyone with the courage and patience to perform such surgery on these 
unusual artifacts. Fortunately, the circuit boards are quite robustly made, and 
the traces are large and well adhered to the base material of the circuit board 
(unlike many later calculators), making such an operation feasible. 

I am not brave enough to try this with the museum's ANITA Mk8.  After 25+ years 
of owning this artifact, I have not even tried to apply power to it in any 
fashion, and probably never will.  It is one of the very few calculators in the 
museum that is probably not in operational condition, as I strive for all of 
the exhibited machines to be operable and available for visitors to the 
physical museum to play with if they desire.  I'm content to leave it as it is 
for a display machine, as it is in very nice original condition.

Interesting to note that many ANITA Mk8 machines have a single transistor in 
them.  It's in the power supply.   The designers were comfortable enough using 
these relatively fussy gas-discharge logic devices as digital devices (they had 
developed machines like Colossus using this technology considerably before 
transistors were a thing, so there was certainly historical precedent), but the 
transistor was just fine for an analog purpose in the power supply.   

Boy, did they ever get it backwards (in terms of the longevity of gas-discharge 
logic elements in electronic calculators and what became the ubiquitous use of 
transistors)!  

Not intended at all to slight the accomplishment of Sumlock Comptometer in the 
development of these calculators.   They set the stage for the explosion of 
what was to become a market worth many hundreds of millions of dollars by the end of the 
decade, not to mention setting the electronic calculator up to be the driving 
force behind integrated circuit development for a consumer-oriented device.   

ICs before their development for use in calculators were only for big mainframe 
computers, military weapons systems, the spooks at places like the NSA, and the 
space program.  For that matter, the ANITA Mk7/8  could be said to be the 
progenitors for the development of the CPU on a chip, and by extension, the 
personal computer.   

Notice I didn't specify any machine, or say "first".  Slippery slope there.

Rick Bensene
The Old Calculator Museum
https://oldcalculatormuseum.com





[cctalk] Re: Experience using an Altair 8800 ("Personal computer" from 70s)

2024-05-24 Thread CAREY SCHUG via cctalk
Gak, 4k ram but 100k via virtual memory TO CASSETTE?  I want one just for that. 
 LOL  Was the cassette multi-track with one track containing timing marks, so 
records would not overlay each other?

I guess I would argue the definition of a PERSONAL computer is if many or 
(preferably) nearly all of them were purchased from personal accounts (credit 
card, check, or cash via some kind of money order) as opposed to corporate or 
business accounts likely subject to double entry bookkeeping and depreciation.  
Maybe being depreciated is the definition of NOT personal?

For instance, I doubt more than one or two of those LGP-30s were purchased from 
a personal account, and if so, probably by a start-up that was not yet into 
having a corporate account.

This web page https://www.xnumber.com/xnumber/MCM_70_microcomputer.htm 
indicates they were sold to corporations and universities, so in the same 
category as the LGP-30, which predated it by many years.



--Carey

> On 05/24/2024 10:34 AM CDT Chuck Guzis via cctalk  
> wrote:
> 
>  
> On 5/24/24 07:57, CAREY SCHUG via cctalk wrote:
> 
> > (I could be mistaken about the mentioned 8008 device, but I think that was 
> > a training device, no?)
> 
> Do your homework--the MCM-70 ran APL, had cassette storage and a
> display and keyboard.  The MITS 8800 had nothing other than RAM and a
> CPU.  APL would have been a distant dream.
> 
> Of course, the MCM/70 was Canadian, and not USAn...
> 
> --Chuck


[cctalk] Re: Experience using an Altair 8800 ("Personal computer" from 70s)

2024-05-24 Thread Rich Alderson via cctalk
First, Dave wrote:

> Date: Thu, 23 May 2024 15:53:53 -0400
> From: Dave Dunfield

> I've just passed on my "Mits Altair 8800" - this is a very historic system
> from the 70s - it is:

>   First system Bill Gates wrote code for (long before Microsoft)

Which is on the face of it incorrect.

Then Christian Corti responded (in replying to someone else's objection to
Dave's claim of firstness:

> Date: Fri, 24 May 2024 11:44:46 +0200 (CEST)
> From: Christian Corti via cctalk 

> >>   First system Bill Gates wrote code for (long before Microsoft)

> Didn't he write code for DEC machines at his school before that?

Which is nearer the mark, but not fully correct.

Then Sellam Abraham stuck his oar in:

> Date: Fri, 24 May 2024 07:40:31 -0700
> From: Sellam Abraham via cctalk 

> > Didn't he write code for DEC machines at his school before that?

> Yes, poorly.

Oh, FFS, Sellam.

OK.  Once again, the history goes like this.  I have heard it from the horses'
mouths (yes, plural).

Bill Gates and Paul Allen, along with 4 other students (out of a class of about
20), really cottoned onto programming in BASIC when a class was offered at
their school, the Lakeside School in Seattle.  That class used a remote
timesharing service called GEIS (General Electric Information System), which
ran on GE 635 computers.

The six boys (it was a boys' school until the next year when it went co-ed)
were allowed to visit a new computer service bureau called CCC, because one of
their mothers was acquainted with one of the primaries.  This company was using
a DEC PDP-10 timesharing system; the boys were given guest accounts under the
proviso that when the system crashed they would document what they were doing
at the time of the crash.

They were so eager to learn that the systems programmers (two MIT alums and a
Stanford alum) allowed them access to the hardware and system call reference
manuals, so that they learned assembler programming as well as BASIC, to an
expert level.

The summer between Paul's graduation and starting college, he along with Bill
and three others of the group got ACTUAL PAYING JOBS PROGRAMMING PDP-10 SYSTEMS
FOR THE BONNEVILLE POWER ADMINISTRATION, on a project called RODS (Real-time
Operational Data System) which used the systems for control purposes.  (The
sixth member of their coterie got a job as a junior ranger at Mount Rainier
National Park, so wasn't interested in being indoors all day all summer.)

Paul dropped out of college after his sophomore year and moved to the Boston
area, where he worked for Honeywell's software division and hung out with Bill
and Bill's college friends, meanwhile looking for a way to have a small
computer of their own.  They read the industry magazines for news of small
systems.

In the mean time, they tried to create a company to sell a traffic counting
device based on the Intel 8008 microprocessor.  The prototype hardware failed
in their first demonstration to the City of Seattle traffic department, and
they shelved the idea.

When the Altair issue of Popular Electronics came out in mid-December 1974
(cover data January 1975), they were prepared for the challenge.  After
ascertaining that Ed Roberts and MITS would entertain the idea of looking at a
BASIC interpreter for the new system, they sat down and created one from whole
cloth, with the division of labor as follows:

Bill Gates:  the interpreter itself
Paul Allen:  a simulator running on the PDP-10 for the Intel 8080 processor
Monte Davidoff:  a math whiz freshman who wrote the transcendental math 
routines

(My sources are Paul Allen and Bob Barnett.  Bob was Paul and Bill's manager at
 RODS, and the original business manager for Living Computer Museum.  I have no
 reason to believe that either had any reason to lie to me.)

Micro-soft incorporated in June/July 1975, about six months after they wrote their
first 8080 machine code, so Dave is wrong about "long before Microsoft".

And Sellam is simply wrong.

Rich


[cctalk] Re: Experience using an Altair 8800 ("Personal computer" from 70s)

2024-05-24 Thread Christian Liendo via cctalk
There was a 4004-based computer developed in 1972, released before
the Micral, called the Comstar 4. It's not very well known, but it was
written about in the ACM, and the Computer History Museum has a copy of
their sales manual.

ACM article

https://dl.acm.org/doi/pdf/10.1145/1499949.1499959

Manual at Computer History Museum
https://www.computerhistory.org/collections/catalog/102686568



On Fri, May 24, 2024, 5:45 AM Christian Corti via cctalk <
cctalk@classiccmp.org> wrote:

>
> And looking beyond the Great American barrier ;-) there was the MICRAL N,
> much earlier than the MITS, and considered as the first complete
> commercial microprocessor based computer, i.e. not a kit and available to
> normal customers.
>
> Christian
>
>


[cctalk] Re: Experience using an Altair 8800 ("Personal computer" from 70s)

2024-05-24 Thread Fred Cisin via cctalk

Besides nobody fully comprehending what "FIRST" really means, . . .
"The Altair was just an obscure predecessor; the personal computer was invented by 
Steve Jobs!"  :-)
"How can you call it a 'Personal Computer' with no mouse or Windoze?"  :-)


On Fri, 24 May 2024, Don R wrote:

Well the Xerox Alto had a three button mouse, making it “extra” personal.  ;)


You can put significant effort into creating an unambiguous definition.
But, SOMEBODY can find an example that doesn't apply that still meets the 
definition.



Using the argument that Roberts was the first to CALL it a "personal 
computer", means that the "MINI-Computer" was invented by a DEC marketing 
person.



Relatively early (NOT "FIRST") PC mice, such as Logitech's, had three 
buttons.


I have heard conflicting stories about why Apple put only one button on 
their mouse:
1) It would be too confusing for the user, including the need to look away 
from the screen to see which mouse button is being pushed


2) Difficulty of explaining which button is which, and getting user 
comprehension of such, in writing documentation


3) Jef Raskin's concept that the system should KNOW what is wanted, so 
there is no need for more than one.


. . .


--
Grumpy Ol' Fred ci...@xenosoft.com


[cctalk] Re: Experience using an Altair 8800 ("Personal computer" from 70s)

2024-05-24 Thread Don R via cctalk
Well the Xerox Alto had a three button mouse, making it “extra” personal.  ;)

Don Resor

Sent from someone's iPhone

> On May 24, 2024, at 11:53 AM, Fred Cisin via cctalk  
> wrote:
> 
> On Fri, 24 May 2024, Sellam Abraham via cctalk wrote:
>> 
> Besides nobody fully comprehending what "FIRST" really means, . . .
> "The Altair was just an obscure predecessor; the personal computer was 
> invented by Steve Jobs!"  :-)
> 
> "How can you call it a 'Personal Computer' with no mouse or Windoze?"  :-)
> 



[cctalk] Re: First Personal Computer

2024-05-24 Thread jim stephens via cctalk




On 5/24/24 11:49, Mike Katz via cctalk wrote:
The problem with this debate is that the definition of Personal 
Computer is totally fluid 
A friend worked with an IBM 4361 at UMSL in St. Louis.  It was very 
little used, as print and other unit record work had a separate unit to 
handle that traffic to the University of Missouri, Columbia's 370-145 
(later upgraded a lot).


But the 4361 was his "PC" and was about ideal.  He had the system, tape 
drive, a few disks, and a 2741 and a couple of terminals to log on 
with.  Also a printer.


Ran VM/SP 5 as the OS, so you could do about anything you  liked without 
any impact on the system as far as creating a problem.


Lots of toys if you knew where to get them.  I don't think they had 
anything but VM, or if they did, it wasn't complicated.


I think the 4361 was the best of all of those systems, because of the 
integrated storage director.


It had plenty of channels if you needed to add anything, and usually 
you'd have at least a tape drive on those.


All of the air cooled systems, 31, 41, 61 and 81 had integrated com 
connections, so you could hook up a console, as well as a few other 
"regular" consoles w/o adding a controller of any sort.


Thanks
Jim



[cctalk] Re: Experience using an Altair 8800 ("Personal computer" from 70s)

2024-05-24 Thread Fred Cisin via cctalk

On Fri, 24 May 2024, Sellam Abraham via cctalk wrote:

This is on the Canonical List of ClassicCmp Debate Topics and is a dead
horse so beaten that there's nothing left but teeth and fur at this point.


Besides nobody fully comprehending what "FIRST" really means, . . .
"The Altair was just an obscure predecessor; the personal computer was 
invented by Steve Jobs!"  :-)


"How can you call it a 'Personal Computer' with no mouse or Windoze?"  :-)


[cctalk] Re: Experience using an Altair 8800 ("Personal computer" from 70s)

2024-05-24 Thread Christian Corti via cctalk

On Fri, 24 May 2024, Paul Koning wrote:
selenium, which is a very marginal semiconductor.  Speaking of which: 
some early computers tried to use selenium diodes as circuit elements 
(for gates), with rather limited success.  The MC ARRA is an example.


The Anita electronic desktop calculators are a perfect example for the 
usage of selenium rectifiers in logic gates.


Christian


[cctalk] Re: Experience using an Altair 8800 ("Personal computer" from 70s)

2024-05-24 Thread Christian Corti via cctalk

On Fri, 24 May 2024, CAREY SCHUG wrote:
the LGP-30 was used by one person AT A TIME, but on different days used 
by different people, who might or might not know each other, by some 
arbitrary scheduling algorithm.  The one I was familiar with was run by 
a tech or grad student, doing work not for self, but for another, 
definitely outside the realm of "personal"


Ehm... no!

Christian


[cctalk] Re: Experience using an Altair 8800 ("Personal computer" from 70s)

2024-05-24 Thread Christian Corti via cctalk

On Fri, 24 May 2024, Sellam Abraham wrote:

On Fri, May 24, 2024, 2:45 AM Christian Corti via cctalk <

This would go back to the 50s or earlier. The LGP-30 and comparable
machines are considered as personal computers, too.

But was it called a "personal computer"? And was it designed to be
"personal"?


The term "personal computer" is a modern invention. But it was definitely 
designed to be used by individuals; it was personal in all ways. Although 
not affordable for home use. But neither was the original IBM PC.



And looking beyond the Great American barrier ;-) there was the MICRAL N,

But it doesn't meet the other criteria Dave laid out. Most people these
days have never heard of the Micral, but even normies might've heard of the
Altair 8800 because of the very notoriety it has today because of its
significance back then.


The Altair was absolutely insignificant in Europe. Ok, the MICRAL, too. 
I'd say, all microprocessor systems before the PET or some SBCs around the 
mid 70s were totally insignificant.


Christian


[cctalk] Re: Experience using an Altair 8800 ("Personal computer" from 70s)

2024-05-24 Thread Paul Koning via cctalk



> On May 24, 2024, at 1:26 PM, Chuck Guzis  wrote:
> 
> On 5/24/24 09:52, Paul Koning wrote:
> 
>> 
>> I once ran into a pre-WW2 data sheet (or ad?) for a transistor, indeed an 
>> FET that used selenium as the semiconducting material.  Most likely that was 
>> the Lilienfeld device.
> 
> Could also have been a device from Oskar Heil in the 1930s.

No idea.  I vaguely remember that it was French.  It was in a pile of papers in 
my father's office -- long since lost, unfortunately.

> What really made the difference in the case of transistors of any
> stripe, was the adoption of zone refining: (1951) William Gardner Pfann.
> Pfann knew Shockley and devised one of the early point-contact
> transistors, from a 1N26 diode. Zone-refining removed one of the
> bugaboos that plagued early semiconductor research--that of getting
> extremely pure material.
> 
> Pfann was a quiet, shy individual which perhaps explains why he doesn't
> get the historical applause.
> 
> Something akin to the Tesla-Steinmetz treatment.

I also remember the name Czochralski -- creator of the process that produces 
single crystals from which the wafers are sliced.

paul



[cctalk] Re: Experience using an Altair 8800 ("Personal computer" from 70s)

2024-05-24 Thread Sellam Abraham via cctalk
On Fri, May 24, 2024 at 9:45 AM Chuck Guzis via cctalk <
cctalk@classiccmp.org> wrote:

> Just pointing out that "firsts" are very difficult.

...

> "First" is a tricky term, like "best".
>
> --Chuck


Yep, which is part of the canonical debate ;)  This is why I and many
others in the hobby removed the term "first" from our vocabularies when
speaking of vintage computers.

It's kind of pointless anyway as people tend to want a starting point in
time at which they can point and say, "That is where it began", when in
reality all new invention is just a continuum of improvement over time and
occasionally a particular improvement makes more impact than others and
gets elevated to "first" status.  It's nice for newspaper headlines and
such but for historians it's a waste of time (as we've borne witness to
countless times over the years as this discussion re-rages periodically).

Sellam


[cctalk] Re: Experience using an Altair 8800 ("Personal computer" from 70s)

2024-05-24 Thread Chuck Guzis via cctalk
On 5/24/24 09:52, Paul Koning wrote:

> 
> I once ran into a pre-WW2 data sheet (or ad?) for a transistor, indeed an FET 
> that used selenium as the semiconducting material.  Most likely that was the 
> Lilienfeld device.

Could also have been a device from Oskar Heil in the 1930s.

What really made the difference in the case of transistors of any
stripe, was the adoption of zone refining: (1951) William Gardner Pfann.
Pfann knew Shockley and devised one of the early point-contact
transistors, from a 1N26 diode. Zone-refining removed one of the
bugaboos that plagued early semiconductor research--that of getting
extremely pure material.

Pfann was a quiet, shy individual which perhaps explains why he doesn't
get the historical applause.

Something akin to the Tesla-Steinmetz treatment.

--Chuck




[cctalk] Re: Experience using an Altair 8800 ("Personal computer" from 70s)

2024-05-24 Thread Paul Koning via cctalk



> On May 24, 2024, at 12:45 PM, Chuck Guzis via cctalk  
> wrote:
> 
> ...
> Just pointing out that "firsts" are very difficult.  Even though, for
> years, Shockley et al were trumpeted as the "inventors of the
> transistor", it's noteworthy that their patent application was carefully
> worded to avoid claims from work decades earlier by Julius Lilienfeld.
> In an interesting twist of history, it's the Lilienfeld model of a MOS
> transistor that prevails in our current technology, not the Shockley
> junction device.

I once ran into a pre-WW2 data sheet (or ad?) for a transistor, indeed an FET 
that used selenium as the semiconducting material.  Most likely that was the 
Lilienfeld device.

Apparently they didn't work well, not surprising given the use of selenium, 
which is a very marginal semiconductor.  Speaking of which: some early 
computers tried to use selenium diodes as circuit elements (for gates), with 
rather limited success.  The MC ARRA is an example.

paul



[cctalk] Re: Experience using an Altair 8800 ("Personal computer" from 70s)

2024-05-24 Thread Chuck Guzis via cctalk
On 5/24/24 09:14, Sellam Abraham via cctalk wrote:

> This is on the Canonical List of ClassicCmp Debate Topics and is a dead
> horse so beaten that there's nothing left but teeth and fur at this point.
> 

Whatever--the MITS 8800's only I/O was a bunch of switches and LEDs. While
an I/O card could be added, that's as far as MITS went for several
years.  Real I/O was left to the user (i.e. buy a terminal of some sort).

By way of comparison, the HP-41 was far more complete as a personal
computer--it had I/O, expandable storage, input and display.  It was
Turing-complete.  And personal?  I suspect more HP41s were sold than the
entirety of MITS 8800s.

Just pointing out that "firsts" are very difficult.  Even though, for
years, Shockley et al were trumpeted as the "inventors of the
transistor", it's noteworthy that their patent application was carefully
worded to avoid claims from work decades earlier by Julius Lilienfeld.
In an interesting twist of history, it's the Lilienfeld model of a MOS
transistor that prevails in our current technology, not the Shockley
junction device.

I would not be at all surprised if some obscure work turned up that
predates Lilienfeld.  Certainly, "oscillating diodes" were known by his
time, but not commercialized.

"First" is a tricky term, like "best".

--Chuck




[cctalk] Re: Experience using an Altair 8800 ("Personal computer" from 70s)

2024-05-24 Thread John Foust via cctalk
At 07:50 AM 5/24/2024, Henry Bent via cctalk wrote:
>Surely the code written for Traf-O-Data, before Altair BASIC, counts as a
>commercial product; I'm not sure what definition of "published" you're
>using here.

They didn't sell Traf-o-data, did they?  I thought it was a tool they
used to analyze data for municipalities, and got paid for the service.

- John



[cctalk] Re: Experience using an Altair 8800 ("Personal computer" from 70s)

2024-05-24 Thread Sellam Abraham via cctalk
On Fri, May 24, 2024 at 8:34 AM Chuck Guzis via cctalk <
cctalk@classiccmp.org> wrote:

> On 5/24/24 07:57, CAREY SCHUG via cctalk wrote:
>
> > (I could be mistaken about the mentioned 8008 device, but I think that
> was a training device, no?)
>
> Do your homework--the MCM-70 ran APL, had cassette storage and a
> display and keyboard.  The MITS 8800 had nothing other than RAM and a
> CPU.  APL would have been a distant dream.
>
> Of course, the MCM/70 was Canadian, and not USAn...
>
> --Chuck
>

This is on the Canonical List of ClassicCmp Debate Topics and is a dead
horse so beaten that there's nothing left but teeth and fur at this point.

Sellam


[cctalk] Re: Experience using an Altair 8800 ("Personal computer" from 70s)

2024-05-24 Thread Chuck Guzis via cctalk
On 5/24/24 07:57, CAREY SCHUG via cctalk wrote:

> (I could be mistaken about the mentioned 8008 device, but I think that was a 
> training device, no?)

Do your homework--the MCM-70 ran APL, had cassette storage and a
display and keyboard.  The MITS 8800 had nothing other than RAM and a
CPU.  APL would have been a distant dream.

Of course, the MCM/70 was Canadian, and not USAn...

--Chuck




[cctalk] Re: Experience using an Altair 8800 ("Personal computer" from 70s)

2024-05-24 Thread CAREY SCHUG via cctalk
c'mon guys, the Altair was the first device with a CPU chip and memory

--marketed to INDIVIDUALS, with the expectation that only one person or one 
related family will use it
--intended to be for GENERAL PURPOSE

Two, IMHO, requirements for a PERSONAL COMPUTER.  Note that a "personal 
computer" can be used by businesses or colleges also without being disqualified 
from being a personal computer.

earlier devices were targeted as TRAINERS, controllers, or for embedded use, 

or sold to organizations (business or colleges)

the LGP-30 was  used by one person AT A TIME, but on different days used by 
different people, who might or might not know each other, by some arbitrary 
scheduling algorithm.  The one I was familiar with was run by a tech or grad 
student, doing work not for self, but for another, definitely outside the realm 
of "personal"

(I could be mistaken about the mentioned 8008 device, but I think that was a 
training device, no?)

--Carey


[cctalk] Re: Experience using an Altair 8800 ("Personal computer" from 70s)

2024-05-24 Thread Paul Koning via cctalk



> On May 24, 2024, at 10:40 AM, Sellam Abraham via cctalk 
>  wrote:
> 
> ...
> But it doesn't meet the other criteria Dave laid out. Most people these
> days have never heard of the Micral, but even normies might've heard of the
> Altair 8800 because of the very notoriety it has today because of its
> significance back then.

This is a familiar pattern in discovery and invention.  In many cases, X was 
first invented by A and then some time later by B.  Or "discovered" instead of 
"invented".  And often the reason A is not generally identified as the first to 
do X is that the way A did it didn't lead to something that was widely used.

For example:
Vikings were the first Europeans to discover America, but their voyages didn't 
start a major movement so Columbus usually gets the credit.

FM radio was invented by Hanso Idzerda, but his approach was a bit odd and the 
economic reasons for it disappeared some years later, so Edwin Armstrong gets 
the credit and Idzerda is pretty much forgotten.  In this case, the bias is so 
strong that attempts to revise Wikipedia to correct the history get rejected.  
:-(

paul



[cctalk] Re: Experience using an Altair 8800 ("Personal computer" from 70s)

2024-05-24 Thread Sellam Abraham via cctalk
On Fri, May 24, 2024, 2:45 AM Christian Corti via cctalk <
cctalk@classiccmp.org> wrote:

> On Thu, 23 May 2024, Chuck Guzis wrote:
> > On 5/23/24 12:53, Dave Dunfield via cctalk wrote:
> >>   First Personal Computer (long before IBM PC)
>
> This would go back to the 50s or earlier. The LGP-30 and comparable
> machines are considered as personal computers, too.
>

But was it called a "personal computer"? And was it designed to be
"personal"?

>>   First system Bill Gates wrote code for (long before Microsoft)
>

> Didn't he write code for DEC machines at his school before that?
>

Yes, poorly.

> I don't think the "first" applies in this case.  The MCM/70 used an 8008
> > and was a complete computer with storage and display--something the MITS
> > 8800 was not.
>
> And looking beyond the Great American barrier ;-) there was the MICRAL N,
> much earlier than the MITS, and considered as the first complete
> commercial microprocessor based computer, i.e. not a kit and available to
> normal customers.
>

But it doesn't meet the other criteria Dave laid out. Most people these
days have never heard of the Micral, but even normies might've heard of the
Altair 8800 because of the notoriety it enjoys today owing to its
significance back then.

Sellam


[cctalk] Re: Experience using an Altair 8800 ("Personal computer" from 70s)

2024-05-24 Thread Henry Bent via cctalk
On Fri, May 24, 2024, 07:47 Dave Dunfield via cctalk 
wrote:

>
> -- Christian Corti -- on "Bill Gates first code"
> >Didn't he write code for DEC machines at his school before that?
>
> I'm sure he wrote code before Mits BASIC - everyone writes lots of stuff as
> they learn - but as far as I have been able to determine - Mits BASIC was
> his
> first published commercial product.
>

Surely the code written for Traf-O-Data, before Altair BASIC, counts as a
commercial product; I'm not sure what definition of "published" you're
using here.

-Henry

>


[cctalk] Re: Experience using an Altair 8800 ("Personal computer" from 70s)

2024-05-24 Thread Christian Corti via cctalk

On Thu, 23 May 2024, Chuck Guzis wrote:

On 5/23/24 12:53, Dave Dunfield via cctalk wrote:

  First Personal Computer (long before IBM PC)


This would go back to the 50s or earlier. The LGP-30 and comparable 
machines are considered as personal computers, too.



  First system Bill Gates wrote code for (long before Microsoft)


Didn't he write code for DEC machines at his school before that?


I don't think the "first" applies in this case.  The MCM/70 used an 8008
and was complete computer with storage and display--something the MITS
8800 was not.


And looking beyond the Great American barrier ;-) there was the MICRAL N, 
much earlier than the MITS, and considered as the first complete 
commercial microprocessor based computer, i.e. not a kit and available to 
normal customers.


Christian



[cctalk] Re: Experience using an Altair 8800 ("Personal computer" from 70s)

2024-05-23 Thread Murray McCullough via cctalk
The MCM/70 was a Canadian invention, though I'm not certain it was a 'first' in
the microcomputer world. Some say the Kenbak-1 was. The Altair 8800 was, as I
argue, the first to reach a large audience. It demonstrated what was
possible to non-computer people.


Happy computing,

Murray :)


On Thu, May 23, 2024 at 9:36 PM Mike Katz via cctalk 
wrote:

> When my wife (now my ex-wife) told me during a move that my 2 PDP-8/E
> racks were not going to the new apartment because there wasn't room for
> her roll top desk and my computer.  And told me "they go or you go with
> them but they are not moving with us", I should have seen the signs and
> gone with them.
>
> That would have saved me a bunch of money in the divorce AND I would
> still have those beautiful PDP-8's.
>
> I'm still trying to recover from that one.
>
> On 5/23/2024 7:04 PM, Fred Cisin via cctalk wrote:
> > On Thu, 23 May 2024, Chuck Guzis via cctalk wrote:
> >> I couldn't wait to show it to a female working in my section.  She
> >> dropped by my apartment, took one look at the thing sitting on my
> >> kitchen table and burst out laughing.  "That's not a computer; it's a
> >> toy!" was her withering reaction.
> >> I don't know if my male ego ever recovered from that.  And I *hated* the
> >> DRAM boards.
> >
> > Be very thankful that it was before you had more invested in the
> > relationship.
> >
> > I almost failed to heed the warning (although FAR less personally
> > humiliating), when a new interest thought that "Hitchhiker's guide To
> > The Galaxy" was "stupid".
> >
> >
> > --
> > Grumpy Ol' Fred ci...@xenosoft.com
>
>


[cctalk] Re: Experience using an Altair 8800 ("Personal computer" from 70s)

2024-05-23 Thread Mike Katz via cctalk
When my wife (now my ex-wife) told me during a move that my 2 PDP-8/E 
racks were not going to the new apartment because there wasn't room for 
her roll top desk and my computer, and told me "they go or you go with 
them but they are not moving with us", I should have seen the signs and 
gone with them.


That would have saved me a bunch of money in the divorce AND I would 
still have those beautiful PDP-8's.


I'm still trying to recover from that one.

On 5/23/2024 7:04 PM, Fred Cisin via cctalk wrote:

On Thu, 23 May 2024, Chuck Guzis via cctalk wrote:

I couldn't wait to show it to a female working in my section.  She
dropped by my apartment, took one look at the thing sitting on my
kitchen table and burst out laughing.  "That's not a computer; it's a
toy!" was her withering reaction.
I don't know if my male ego ever recovered from that.  And I *hated* the
DRAM boards.


Be very thankful that it was before you had more invested in the 
relationship.


I almost failed to heed the warning (although FAR less personally 
humiliating), when a new interest thought that "Hitchhiker's guide To 
The Galaxy" was "stupid".



--
Grumpy Ol' Fred ci...@xenosoft.com




[cctalk] Re: Experience using an Altair 8800 ("Personal computer" from 70s)

2024-05-23 Thread Fred Cisin via cctalk

On Thu, 23 May 2024, Chuck Guzis via cctalk wrote:

I couldn't wait to show it to a female working in my section.  She
dropped by my apartment, took one look at the thing sitting on my
kitchen table and burst out laughing.  "That's not a computer; it's a
toy!" was her withering reaction.
I don't know if my male ego ever recovered from that.  And I *hated* the
DRAM boards.


Be very thankful that it was before you had more invested in the 
relationship.


I almost failed to heed the warning (although FAR less personally 
humiliating), when a new interest thought that "Hitchhiker's guide To The 
Galaxy" was "stupid".



--
Grumpy Ol' Fred ci...@xenosoft.com


[cctalk] Re: C. Gordon Bell, Creator of a Personal Computer Prototype, Dies at 89

2024-05-23 Thread Paul Koning via cctalk
I have a vague memory of visiting the Computer Museum when it was still at DEC, 
in the Marlboro building (MRO-n).  About the only item I recall is a Goodyear 
STARAN computer (or piece of one).  I found it rather surprising to see a 
computer made by a tire company.  I learned years later that the STARAN is a 
very unusual architecture, sometimes called a one-bit machine.  More precisely, 
I think it's a derivative of William Shooman's "Orthogonal Computer" vector 
computer architecture, which was for a while sold by Sanders Associates where 
he worked.  

paul

> On May 23, 2024, at 5:00 PM, Kevin Anderson via cctalk 
>  wrote:
> 
> I had the good fortune of visiting The Computer Museum in Boston in the 
> summer of 1984.  Reading the museum's Wikipedia article, it appears I was 
> there while they were still freshly setting up their Museum Wharf location, 
> yet hadn't officially opened yet.  Unfortunately I only had an hour (or 
> little more) to visit before I had to return to where my wife was at a 
> different location (which I vaguely recall was at an aquarium somewhere 
> nearby?).  The clerk at the front entrance was really surprised that I was 
> leaving so soon...which in hindsight I wish now had not been so short.
> 
> Kevin Anderson
> Dubuque, Iowa



[cctalk] Re: Experience using an Altair 8800 ("Personal computer" from 70s)

2024-05-23 Thread Tarek Hoteit via cctalk
I think you should find that colleague of yours again and say "who is 
laughing now?" 

Regards,
Tarek Hoteit
AI Consultant, PhD
+1 360-838-3675


> On May 23, 2024, at 16:05, Chuck Guzis via cctalk  
> wrote:
> 
> On 5/23/24 12:53, Dave Dunfield via cctalk wrote:
> 
>> I've just passed on my "Mits Altair 8800" - this is a very historic system
>> from the 70s - it is:
>>  First Personal Computer (long before IBM PC)
>>  First S100 buss system
>>  First system Bill Gates wrote code for (long before Microsoft)
> 
> I don't think the "first" applies in this case.  The MCM/70 used an 8008
> and was complete computer with storage and display--something the MITS
> 8800 was not.
> 
> I spent the weekend soldering together my 8800 (CPU, SIO and 2x 4K DRAM)
> system, cursing the cheap white wire in the process.  Finally got it
> running with a TVT.
> 
> I couldn't wait to show it to a female working in my section.  She
> dropped by my apartment, took one look at the thing sitting on my
> kitchen table and burst out laughing.  "That's not a computer; it's a
> toy!" was her withering reaction.
> 
> I don't know if my male ego ever recovered from that.  And I *hated* the
> DRAM boards.
> 
> I do, however, still have the MITS box.  Haven't run it in nearly 40 years.
> 
> --Chuck
> 


[cctalk] Re: Experience using an Altair 8800 ("Personal computer" from 70s)

2024-05-23 Thread Chuck Guzis via cctalk
On 5/23/24 12:53, Dave Dunfield via cctalk wrote:

> I've just passed on my "Mits Altair 8800" - this is a very historic system
> from the 70s - it is:
>   First Personal Computer (long before IBM PC)
>   First S100 buss system
>   First system Bill Gates wrote code for (long before Microsoft)

I don't think the "first" applies in this case.  The MCM/70 used an 8008
and was a complete computer with storage and display--something the MITS
8800 was not.

I spent the weekend soldering together my 8800 (CPU, SIO and 2x 4K DRAM)
system, cursing the cheap white wire in the process.  Finally got it
running with a TVT.

I couldn't wait to show it to a female working in my section.  She
dropped by my apartment, took one look at the thing sitting on my
kitchen table and burst out laughing.  "That's not a computer; it's a
toy!" was her withering reaction.

I don't know if my male ego ever recovered from that.  And I *hated* the
DRAM boards.

I do, however, still have the MITS box.  Haven't run it in nearly 40 years.

--Chuck



[cctalk] Re: C. Gordon Bell, Creator of a Personal Computer Prototype, Dies at 89

2024-05-23 Thread Kevin Anderson via cctalk
I had the good fortune of visiting The Computer Museum in Boston in the summer 
of 1984.  Reading the museum's Wikipedia article, it appears I was there while 
they were still freshly setting up their Museum Wharf location and hadn't yet 
officially opened.  Unfortunately I only had an hour (or a little more) to 
visit before I had to return to where my wife was, at a different location 
(which I vaguely recall was an aquarium somewhere nearby?).  The clerk at 
the front entrance was really surprised that I was leaving so soon...which in 
hindsight I wish had not been so short.

Kevin Anderson
Dubuque, Iowa


[cctalk] Re: C. Gordon Bell, Creator of a Personal Computer Prototype, Dies at 89

2024-05-22 Thread Adrian Godwin via cctalk
At least it's a better title than 'The centre for computing history'
(Cambridge, UK).


On Wed, May 22, 2024 at 9:39 PM Sellam Abraham via cctalk <
cctalk@classiccmp.org> wrote:

> On Wed, May 22, 2024 at 1:15 PM John Foust  wrote:
>
> > At 01:32 PM 5/22/2024, Sellam Abraham via cctalk wrote:
> > >His and his wife
> > >Gwen's (god rest her soul as well) personal collecting and the museum at
> > >DEC was the basis for the Boston Computer Museum, which effectively went
> > >west and became the Computer History Museum.
> >
> > He was quite sensitive about this.  I made the same mistake, referring
> > to it as the "Boston Computer Museum."  He told me:
> >
> > "Let me be clear The Computer Museum (TCM) was NEVER called the
> > Boston Computer Museum...  Boston was a temporary home when computing
> > passed through New England, but the city itself gave nothing to it.
> > ...  As a former collector, founder, and board member of the
> > Digital Computer Museum > The Computer Museum >> current Computer History
> > Museum
> > (a name I deplore and that exists only because of the way the Museum left
> > Boston)
> > I have always been a strong advocate of getting as many artifacts into as
> > many
> > hands as possible, and this includes selling museum artifacts when
> > appropriate.
> > In essence a whole industry of museums and collectors is essential."
> >
> > - John
> >
>
> I appreciate the clarification.
>
> I agree that it's a shame that the CHM couldn't be called TCM.  "Computer
> History Museum" is a fairly awkward name.
>
> Sellam
>


[cctalk] Re: C. Gordon Bell, Creator of a Personal Computer Prototype, Dies at 89

2024-05-22 Thread Sellam Abraham via cctalk
On Wed, May 22, 2024 at 1:15 PM John Foust  wrote:

> At 01:32 PM 5/22/2024, Sellam Abraham via cctalk wrote:
> >His and his wife
> >Gwen's (god rest her soul as well) personal collecting and the museum at
> >DEC was the basis for the Boston Computer Museum, which effectively went
> >west and became the Computer History Museum.
>
> He was quite sensitive about this.  I made the same mistake, referring
> to it as the "Boston Computer Museum."  He told me:
>
> "Let me be clear The Computer Museum (TCM) was NEVER called the
> Boston Computer Museum...  Boston was a temporary home when computing
> passed through New England, but the city itself gave nothing to it.
> ...  As a former collector, founder, and board member of the
> Digital Computer Museum > The Computer Museum >> current Computer History
> Museum
> (a name I deplore and that exists only because of the way the Museum left
> Boston)
> I have always been a strong advocate of getting as many artifacts into as
> many
> hands as possible, and this includes selling museum artifacts when
> appropriate.
> In essence a whole industry of museums and collectors is essential."
>
> - John
>

I appreciate the clarification.

I agree that it's a shame that the CHM couldn't be called TCM.  "Computer
History Museum" is a fairly awkward name.

Sellam


[cctalk] Re: C. Gordon Bell, Creator of a Personal Computer Prototype, Dies at 89

2024-05-22 Thread John Foust via cctalk
At 01:32 PM 5/22/2024, Sellam Abraham via cctalk wrote:
>His and his wife
>Gwen's (god rest her soul as well) personal collecting and the museum at
>DEC was the basis for the Boston Computer Museum, which effectively went
>west and became the Computer History Museum.

He was quite sensitive about this.  I made the same mistake, referring
to it as the "Boston Computer Museum."  He told me:

"Let me be clear The Computer Museum (TCM) was NEVER called the 
Boston Computer Museum...  Boston was a temporary home when computing 
passed through New England, but the city itself gave nothing to it.
...  As a former collector, founder, and board member of the 
Digital Computer Museum > The Computer Museum >> current Computer History 
Museum 
(a name I deplore and that exists only because of the way the Museum left 
Boston) 
I have always been a strong advocate of getting as many artifacts into as many 
hands as possible, and this includes selling museum artifacts when appropriate.
In essence a whole industry of museums and collectors is essential."

- John





[cctalk] Re: C. Gordon Bell, Creator of a Personal Computer Prototype, Dies at 89

2024-05-22 Thread Wayne S via cctalk
One issue in choosing a book size is that the booksellers have to put the book on 
a standard-sized shelf, so it should conform to that size. Very tall books, like 
coffee table books, are hard to display because they don’t fit on a shelf.
Booksellers really don’t like those (unless they become best sellers).

Sent from my iPhone

> On May 22, 2024, at 13:01, Gavin Scott via cctalk  
> wrote:
> 
> On Wed, May 22, 2024 at 2:39 PM Paul Koning  wrote:
>> As I mentioned, it is not unprecedented; I have a book about book design 
>> which talks at some length about choosing the page proportions, and it 
>> mentions square pages as one of the recognized choices.  I think it says 
>> that it isn't very common, but I don't remember what else it says, for 
>> example any particular reason why one might choose this format (or reasons 
>> to avoid it).
> 
> Here's another example:
> 
> https://www.amazon.com/Annotated-Illustrated-Double-Helix/dp/1476715491
> 
> where there's essentially a normal width book inside but the wider
> pages allow for extensive sidebar-style annotations and also provide
> more options for graphic layout of illustrations, etc.


[cctalk] Re: C. Gordon Bell, Creator of a Personal Computer Prototype, Dies at 89

2024-05-22 Thread Gavin Scott via cctalk
On Wed, May 22, 2024 at 2:39 PM Paul Koning  wrote:
> As I mentioned, it is not unprecedented; I have a book about book design 
> which talks at some length about choosing the page proportions, and it 
> mentions square pages as one of the recognized choices.  I think it says that 
> it isn't very common, but I don't remember what else it says, for example any 
> particular reason why one might choose this format (or reasons to avoid it).

Here's another example:

https://www.amazon.com/Annotated-Illustrated-Double-Helix/dp/1476715491

where there's essentially a normal width book inside but the wider
pages allow for extensive sidebar-style annotations and also provide
more options for graphic layout of illustrations, etc.


[cctalk] Re: C. Gordon Bell, Creator of a Personal Computer Prototype, Dies at 89

2024-05-22 Thread Paul Koning via cctalk



> On May 22, 2024, at 3:29 PM, Gavin Scott via cctalk  
> wrote:
> 
> On Wed, May 22, 2024 at 2:25 PM John Herron via cctalk
>  wrote:
> 
>> Out of curiosity is the book the size of a floppy disk or some computer
>> item at the time? (Any significance or just him being unique?).
> 
> Here's an Amazon listing showing what it looked like. Ordinary book
> size if not shape.
> 
> https://www.amazon.com/Computer-Structures-Readings-Examples-McGraw-Hill/dp/0070043574/

It's about as high as a typical hardcover textbook, just unusually wide.  I 
don't know of any other reason other than "it's different".

As I mentioned, it is not unprecedented; I have a book about book design which 
talks at some length about choosing the page proportions, and it mentions 
square pages as one of the recognized choices.  I think it says that it isn't 
very common, but I don't remember what else it says, for example any particular 
reason why one might choose this format (or reasons to avoid it).

paul



[cctalk] Re: C. Gordon Bell, Creator of a Personal Computer Prototype, Dies at 89

2024-05-22 Thread Gavin Scott via cctalk
On Wed, May 22, 2024 at 2:25 PM John Herron via cctalk
 wrote:

> Out of curiosity is the book the size of a floppy disk or some computer
> item at the time? (Any significance or just him being unique?).

Here's an Amazon listing showing what it looked like. Ordinary book
size if not shape.

https://www.amazon.com/Computer-Structures-Readings-Examples-McGraw-Hill/dp/0070043574/


[cctalk] Re: C. Gordon Bell, Creator of a Personal Computer Prototype, Dies at 89

2024-05-22 Thread John Herron via cctalk
On Wed, May 22, 2024, 1:58 PM Paul Koning via cctalk 
wrote:

>
>
> > On May 22, 2024, at 1:19 PM, Bill Degnan via cctalk <
> cctalk@classiccmp.org> wrote:
> >
> > It's a slog, but if you can make it through Gordon Bell's book, "Computer
> > Structures Readings and Examples" you realize Gordon is a "father of
> > vintage computing",
>
> I still have that book, though it's deep in some box.
>
> Fun trivia item: it's the only book I can remember that is square.  Almost
> all books are "portrait" layout; a few are "landscape", but while square
> format is a known option shown in book design references, it is almost
> unheard of.
>
> paul
>

Out of curiosity is the book the size of a floppy disk or some computer
item at the time? (Any significance or just him being unique?).

>


[cctalk] Re: C. Gordon Bell, Creator of a Personal Computer Prototype, Dies at 89

2024-05-22 Thread Gavin Scott via cctalk
On Wed, May 22, 2024 at 1:50 PM Paul Koning via cctalk
 wrote:

> I still have that book, though it's deep in some box.

https://gordonbell.azurewebsites.net/cgb%20files/computer%20structures%20readings%20and%20examples%201971.pdf


[cctalk] Re: C. Gordon Bell, Creator of a Personal Computer Prototype, Dies at 89

2024-05-22 Thread Paul Koning via cctalk



> On May 22, 2024, at 1:19 PM, Bill Degnan via cctalk  
> wrote:
> 
> It's a slog, but if you can make it through Gordon Bell's book, "Computer
> Structures Readings and Examples" you realize Gordon is a "father of
> vintage computing", in addition to his involvement with the first computer
> museum in Boston.  He knew better than anyone the historical significance
> of computing well before the term "vintage computer" existed.

I still have that book, though it's deep in some box.

Fun trivia item: it's the only book I can remember that is square.  Almost all 
books are "portrait" layout; a few are "landscape", but while square format is 
a known option shown in book design references, it is almost unheard of.

The only other book I can think of that's nearly (but not quite) square is the 
lovely "Powers of ten".

paul




[cctalk] Re: C. Gordon Bell, Creator of a Personal Computer Prototype, Dies at 89

2024-05-22 Thread Paul Koning via cctalk



> On May 22, 2024, at 11:10 AM, Don R via cctalk  wrote:
> 
> Control-G
> 
> In one of the comments I found this interesting tidbit:
> 
> Working at DEC for many years, I learned a lot from Mr. Bell.  One of my 
> favorite sayings was he calling himself "the industry standard dummy."  Which 
> simply meant that he approached all new products without pre-conceived 
> notions of "how" it should work.  He found so many bugs and interface errors 
> that way, and taught everyone to do the same.  On old computer keyboards one 
> used to be able to make a bell ring by typeing CTRL-G.  That industry 
> standard was set for G. Bell.

That's a nice story but it doesn't seem all that likely.  A "bell" code long 
predates ASCII, where indeed it was Control-G; it showed up decades earlier in 
the 5-bit Teletype machines' "Baudot" (a.k.a. "Murray") code.
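
(As a quick sanity check in Python -- plain ASCII arithmetic, nothing specific
to any terminal or to DEC: Ctrl-<letter> is the letter code with the high bits
stripped, which for G lands exactly on the BEL code.)

bel = ord('G') & 0x1F    # Ctrl-G: keep only the low 5 bits of 'G' (0x47)
print(hex(bel))          # 0x7, the ASCII BEL ("bell") code
print(bel == 0x07)       # True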

paul



[cctalk] Re: C. Gordon Bell, Creator of a Personal Computer Prototype, Dies at 89

2024-05-22 Thread Sellam Abraham via cctalk
On Wed, May 22, 2024 at 10:19 AM Bill Degnan via cctalk <
cctalk@classiccmp.org> wrote:

> It's a slog, but if you can make it through Gordon Bell's book, "Computer
> Structures Readings and Examples" you realize Gordon is a "father of
> vintage computing", in addition to his involvement with the first computer
> museum in Boston.  He knew better than anyone the historical significance
> of computing well before the term "vintage computer" existed.
>
> And there is that stuff he did at DEC
>
> Bill
>

He really is.  Perhaps "Grandfather" is more appropriate.  His and his wife
Gwen's (god rest her soul as well) personal collecting and the museum at
DEC was the basis for the Boston Computer Museum, which effectively went
west and became the Computer History Museum.

Sellam


[cctalk] Re: C. Gordon Bell, Creator of a Personal Computer Prototype, Dies at 89

2024-05-22 Thread Don R via cctalk
Control-G

In one of the comments I found this interesting tidbit:

Working at DEC for many years, I learned a lot from Mr. Bell.  One of my 
favorite sayings was his calling himself "the industry standard dummy," which 
simply meant that he approached all new products without preconceived notions 
of "how" they should work.  He found so many bugs and interface errors that way, 
and taught everyone to do the same.  On old computer keyboards one used to be 
able to make a bell ring by typing CTRL-G.  That industry standard was set for 
G. Bell.



Sent from someone's iPhone

> On May 22, 2024, at 7:07 AM, Christian Liendo via cctalk 
>  wrote:
> 
> Ars Technica
> https://arstechnica.com/gadgets/2024/05/gordon-bell-an-architect-of-our-digital-age-dies-at-age-89/
> 
> 
> New York Times Obit
> 
> https://www.nytimes.com/2024/05/21/technology/c-gordon-bell-dead.html?unlocked_article_code=1.t00.xAnm.sr2ZsjF5OSti=url-share
> 


[cctalk] Re: C. Gordon Bell, Creator of a Personal Computer Prototype, Dies at 89

2024-05-22 Thread Bill Degnan via cctalk
It's a slog, but if you can make it through Gordon Bell's book, "Computer
Structures Readings and Examples" you realize Gordon is a "father of
vintage computing", in addition to his involvement with the first computer
museum in Boston.  He knew better than anyone the historical significance
of computing well before the term "vintage computer" existed.

And there is that stuff he did at DEC

Bill

On Wed, May 22, 2024 at 12:14 PM Sellam Abraham via cctalk <
cctalk@classiccmp.org> wrote:

> Gordon Bell was a real delightful man, and most unassuming.  He was always
> warm and friendly to everyone, and it was a pleasure and honor to have
> known him.
>
> Sellam
>
> On Wed, May 22, 2024 at 7:07 AM Christian Liendo via cctalk <
> cctalk@classiccmp.org> wrote:
>
> > Ars Technica
> >
> >
> https://arstechnica.com/gadgets/2024/05/gordon-bell-an-architect-of-our-digital-age-dies-at-age-89/
> >
> >
> > New York Times Obit
> >
> >
> >
> https://www.nytimes.com/2024/05/21/technology/c-gordon-bell-dead.html?unlocked_article_code=1.t00.xAnm.sr2ZsjF5OSti=url-share
> >
>


[cctalk] Re: Thirties techies and computing history

2024-05-22 Thread Rick Bensene via cctalk

On 5/20/24 10:25, Bill Degnan via cctalk wrote:

>>> American Computer Museum
>>> Computer History Museum
>>> Computer Museum of America
>>> Large Scale Systems Museum
>>> Rhode Island Computer Museum
>>> System Source Computer Museum

Of course, there's the Living Computer Museum--oh, wait

...and wait...and wait...and


[cctalk] Re: C. Gordon Bell, Creator of a Personal Computer Prototype, Dies at 89

2024-05-22 Thread Sellam Abraham via cctalk
Gordon Bell was a real delightful man, and most unassuming.  He was always
warm and friendly to everyone, and it was a pleasure and honor to have
known him.

Sellam

On Wed, May 22, 2024 at 7:07 AM Christian Liendo via cctalk <
cctalk@classiccmp.org> wrote:

> Ars Technica
>
> https://arstechnica.com/gadgets/2024/05/gordon-bell-an-architect-of-our-digital-age-dies-at-age-89/
>
>
> New York Times Obit
>
>
> https://www.nytimes.com/2024/05/21/technology/c-gordon-bell-dead.html?unlocked_article_code=1.t00.xAnm.sr2ZsjF5OSti=url-share
>


[cctalk] Re: Thirties techies and computing history

2024-05-20 Thread Kevin Jordan via cctalk
Virtual museums as well, e.g.:

http://www.nostalgiccomputing.org


On Mon, May 20, 2024 at 1:28 PM Christian Liendo via cctalk <
cctalk@classiccmp.org> wrote:

> I see computer history slowly growing. Before you had only one museum
> in the United States and now you have multiple ones such as but not
> limited to:
>
> American Computer Museum
> Computer History Museum
> Computer Museum of America
> Large Scale Systems Museum
> Rhode Island Computer Museum
> System Source Computer Museum
>
> 10 years ago I didn't see any computing history taught in a
> university. Now I see it being taught at NJIT.
>
>
> https://news.njit.edu/new-computer-science-elective-examines-history-computing
>
> There are people whose hard work is keeping computer history alive.
>


[cctalk] Re: interlace [was: NTSC TV demodulator ]

2024-05-20 Thread Paul Koning via cctalk



> On May 20, 2024, at 3:40 PM, Adrian Godwin via cctalk  
> wrote:
> 
> I remember the VT100 interlace setting. Yes, it changed the signal
> generated. I don't know if it also changed the characteristics of the
> monitor but I would think not.

The Pro also has such a thing in its video card.  It doesn't touch the monitor 
as far as I can tell.  The details may be in the video gate array spec (I have 
that on paper, buried somewhere); my guess would be that it changes the 
vertical sync frequency from horizontal rate / 262 to horizontal rate / 262.5.
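
A back-of-the-envelope sketch of that guess in Python (assuming a nominal
NTSC-ish horizontal rate of about 15.7 kHz; the Pro's actual gate-array numbers
are not documented here):

h_rate = 15734.264                 # assumed horizontal scan rate in Hz

v_noninterlaced = h_rate / 262     # whole number of lines per field
v_interlaced = h_rate / 262.5      # the extra half line staggers alternate fields

print(round(v_noninterlaced, 2))   # ~60.05 Hz vertical rate
print(round(v_interlaced, 2))      # ~59.94 Hz vertical rate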

> It gave slightly higher resolution (the expectation would be double but the
> tube didn't have focus that good) at the cost of a horrible juddering
> display. I don't remember it being there on the later VT220.

Yes, that's just how I remember it on the Pro.

The Pro has another display feature: you can set it to 625 lines at 25 Hz (or 
half that at 50 Hz).  The US monitor handles that, somewhat to my surprise, but 
it's not a pleasant experience.

paul



[cctalk] Re: interlace [was: NTSC TV demodulator ]

2024-05-20 Thread Adrian Godwin via cctalk
I remember the VT100 interlace setting. Yes, it changed the signal
generated. I don't know if it also changed the characteristics of the
monitor but I would think not.

It gave slightly higher resolution (the expectation would be double but the
tube didn't have focus that good) at the cost of a horrible juddering
display. I don't remember it being there on the later VT220.


On Mon, May 20, 2024 at 7:43 PM Will Cooke via cctalk 
wrote:

>
>
> > On 05/20/2024 12:06 PM CDT CAREY SCHUG via cctalk 
> wrote:
> >
>
> >
> > so, just curious. how do digital TVs (and monitors) work? I presume the
> dots are a rectangle, not sloping down to the right, no half a line at the
> top and bottom. Do they just assume the brain can't tell that (for the
> converted old analog tv signal) the image therefor slopes UP very slightly
> to the right from what it "should" be? and the top line is blank on the
> left side because that is the interlace frame?
> >
> > --Carey
> >
>
> Well, the slope is VERY slight.  Approximately 1/500 of the picture
> height.Probably impossible to detect with the eye.  In the old days when us
> older folks were young, the TV camera image was generated the same way,
> with a scanned beam.  So then the generated image matched the displayed
> image.  But Around the end of the 70s when solid state image sensors
> started coming into use, the generated image didn't match that displayed on
> the CRT.  But nobody noticed.  Now, almost all pictures are generated by
> some type of solid state generator and the lines aren't angled, and neither
> are the displayed lines.  So, again, it matches.
>
> The NTSC signal defines 525 lines per "frame," each frame made of two
> "fields" of 262 1/2 lines  (I may have frame and field mixed up.)  In one
> field, the half line is at the top.  It is at the bottom on the other.  But
> out of those 525 total lines, only around 480 (I forget exactly) are
> displayable.  The non-displayed lines are split between the top and
> bottom.  So the two half-lines aren't diplayable.  Those non-displayed
> lines are used for all sorts of things, including closed captioning.
>
> Old analog TVs and monitors make any changes for different types of
> signals;  they just (attempted to ) displayed whatever was thrown at them.
>
> Will
>


[cctalk] Re: interlace [was: NTSC TV demodulator ]

2024-05-20 Thread Will Cooke via cctalk



> On 05/20/2024 12:06 PM CDT CAREY SCHUG via cctalk  
> wrote:
>

>
> so, just curious. how do digital TVs (and monitors) work? I presume the dots 
> are a rectangle, not sloping down to the right, no half a line at the top and 
> bottom. Do they just assume the brain can't tell that (for the converted old 
> analog tv signal) the image therefor slopes UP very slightly to the right 
> from what it "should" be? and the top line is blank on the left side because 
> that is the interlace frame?
>
> --Carey
>

Well, the slope is VERY slight.  Approximately 1/500 of the picture 
height.  Probably impossible to detect with the eye.  In the old days when us 
older folks were young, the TV camera image was generated the same way, with a 
scanned beam.  So then the generated image matched the displayed image.  But 
around the end of the 70s, when solid state image sensors started coming into 
use, the generated image didn't match that displayed on the CRT.  But nobody 
noticed.  Now, almost all pictures are generated by some type of solid state 
generator and the lines aren't angled, and neither are the displayed lines.  
So, again, it matches.

The NTSC signal defines 525 lines per "frame," each frame made of two "fields" 
of 262 1/2 lines (I may have frame and field mixed up.)  In one field, the 
half line is at the top.  It is at the bottom on the other.  But out of those 
525 total lines, only around 480 (I forget exactly) are displayable.  The 
non-displayed lines are split between the top and bottom.  So the two 
half-lines aren't displayable.  Those non-displayed lines are used for all sorts 
of things, including closed captioning.
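
A quick back-of-the-envelope check of those numbers in Python (the standard
NTSC rates are assumed here; the figures are not taken from the post itself):

lines_per_frame = 525           # two interlaced fields of 262.5 lines each
line_rate_hz = 15734.264        # NTSC horizontal scan rate
frame_rate_hz = line_rate_hz / lines_per_frame
field_rate_hz = frame_rate_hz * 2
visible_lines = 480             # roughly; the rest carry blanking, captions, etc.

print(round(frame_rate_hz, 2))           # ~29.97 frames per second
print(round(field_rate_hz, 2))           # ~59.94 fields per second
print(lines_per_frame / 2)               # 262.5 lines per field
print(lines_per_frame - visible_lines)   # ~45 lines never displayed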

Old analog TVs and monitors don't make any changes for different types of 
signals; they just (attempted to) display whatever was thrown at them.

Will


[cctalk] Re: interlace [was: NTSC TV demodulator ]

2024-05-20 Thread Peter Corlett via cctalk
On Mon, May 20, 2024 at 12:06:13PM -0500, CAREY SCHUG via cctalk wrote:
[...]
> so, just curious. how do digital TVs (and monitors) work? I presume the
> dots are a rectangle, not sloping down to the right, no half a line at the
> top and bottom. Do they just assume the brain can't tell that (for the
> converted old analog tv signal) the image therefor slopes UP very slightly
> to the right from what it "should" be? and the top line is blank on the
> left side because that is the interlace frame?

The half-lines are not visible on an analogue CRT (unless it's faulty or
miscalibrated) because they're hidden behind the top and bottom of the
screen bezel, assuming that they're even sent to the electron gun at all.

A digital TV displaying an analogue signal will just crop the image to
simulate the bezel, since there's a lot of other cruft and noise in the
signal which is not actually picture data and would be quite distracting if
you could actually see it.

The slope in the scanlines is very gentle and pretty much not noticeable
unless you're looking for it, and maybe not even then. You may well look at
it and say "yeah, that's on a slope", but is that due to the scanning
process or because the deflection yoke is twisted slightly? There are so
many adjustments on a CRT that affect each other that getting a picture at
all is a minor miracle.

I don't miss CRTs.



[cctalk] Re: interlace [was: NTSC TV demodulator ]

2024-05-20 Thread Sellam Abraham via cctalk
Is it perhaps OBD--On-Board Diagnostics?

Sellam

On Mon, May 20, 2024 at 11:06 AM Wayne S via cctalk 
wrote:

> The setup on the earlier monitors was sometimes call “ODB” , don‘t know
> why.  Was equivalent to setup.
>
> Sent from my iPhone
>
> > On May 20, 2024, at 11:02, Wayne S  wrote:
> >
> > In the vt100, setup menu “B” had an interlace on or off setting.
> > I just looked it up.
> >
> >
> > Sent from my iPhone
> >
> >> On May 20, 2024, at 10:51, Paul Koning via cctalk <
> cctalk@classiccmp.org> wrote:
> >>
> >> 
> >>
>  On May 20, 2024, at 1:37 PM, Wayne S via cctalk <
> cctalk@classiccmp.org> wrote:
> >>>
> >>> Young , hah. No i’m old 70.
> >>> The pc monitors, not Tv, always had a setup menu. Even the Vt100
> series let you choose interlace if you needed.
> >>
> >> VT100?  I don't think so.  And yes, it has a setup menu, but that's
> setup of the terminal functionality, not the monitor part.
> >>
> >> The earliest monitors could only handle one format.  A major innovation
> was "multisync" where the monitor would determine the horizontal and
> vertical sweep rate and line count, and display things the right way.  The
> first PC I owned had one of those, and as far as I can remember it had
> nothing that one would call a "setup menu".
> >>
> >> The reason interlace matters is not the very slight slope of the scan
> line in analog monitors, but rather the fact that alternate frames are
> offset by half the line spacing of the basic frame, so each frame sweeps
> out the gaps in between the lines scanned by the preceding frame.  It
> matters to get that right, otherwise you're not correctly displaying
> consecutive rows of pixels.  In particular, when doing scan conversion
> (from analog format to a digital X/Y pixel raster) you have to offset Y by
> one every other frame if interlace is used, but not if it isn't.
> >>
> >>   paul
> >>
> >>
>


[cctalk] Re: interlace [was: NTSC TV demodulator ]

2024-05-20 Thread ben via cctalk

On 2024-05-20 12:16 p.m., Will Cooke via cctalk wrote:




On 05/20/2024 1:02 PM CDT Wayne S via cctalk  wrote:


In the vt100, setup menu “B” had an interlace on or off setting.
I just looked it up.



That is almost certainly setting what type of signal is generated.  Like a TV 
of the same era, the monitor (display) portion doesn't care;  it just displays 
what it is sent.  That is very different from a monitor setting that sets 
either interlaced or non-interlaced.

Some reasons why you might prefer one over the other on the same screen:  
non-interlaced would have a horizontal gap between displayed lines whereas 
interlaced would fill them in.  However, interlaced is more prone to 
flickering, which can be very tiring to the eyes and cause headaches.

Will

And you have Color VS B




[cctalk] Re: interlace [was: NTSC TV demodulator ]

2024-05-20 Thread Will Cooke via cctalk



> On 05/20/2024 1:02 PM CDT Wayne S via cctalk  wrote:
> 
> 
> In the vt100, setup menu “B” had an interlace on or off setting.
> I just looked it up.
> 
> 
That is almost certainly setting what type of signal is generated.  Like a TV 
of the same era, the monitor (display) portion doesn't care;  it just displays 
what it is sent.  That is very different from a monitor setting that sets 
either interlaced or non-interlaced.

Some reasons why you might prefer one over the other on the same screen:  
non-interlaced would have a horizontal gap between displayed lines whereas 
interlaced would fill them in.  However, interlaced is more prone to 
flickering, which can be very tiring to the eyes and cause headaches.

Will


[cctalk] Re: interlace [was: NTSC TV demodulator ]

2024-05-20 Thread Wayne S via cctalk
The setup on the earlier monitors was sometimes called “ODB”, don't know why.  
It was equivalent to setup. 

Sent from my iPhone

> On May 20, 2024, at 11:02, Wayne S  wrote:
> 
> In the vt100, setup menu “B” had an interlace on or off setting.
> I just looked it up.
> 
> 
> Sent from my iPhone
> 
>> On May 20, 2024, at 10:51, Paul Koning via cctalk  
>> wrote:
>> 
>> 
>> 
 On May 20, 2024, at 1:37 PM, Wayne S via cctalk  
 wrote:
>>> 
>>> Young , hah. No i’m old 70.
>>> The pc monitors, not Tv, always had a setup menu. Even the Vt100 series let 
>>> you choose interlace if you needed. 
>> 
>> VT100?  I don't think so.  And yes, it has a setup menu, but that's setup of 
>> the terminal functionality, not the monitor part.
>> 
>> The earliest monitors could only handle one format.  A major innovation was 
>> "multisync" where the monitor would determine the horizontal and vertical 
>> sweep rate and line count, and display things the right way.  The first PC I 
>> owned had one of those, and as far as I can remember it had nothing that one 
>> would call a "setup menu".
>> 
>> The reason interlace matters is not the very slight slope of the scan line 
>> in analog monitors, but rather the fact that alternate frames are offset by 
>> half the line spacing of the basic frame, so each frame sweeps out the gaps 
>> in between the lines scanned by the preceding frame.  It matters to get that 
>> right, otherwise you're not correctly displaying consecutive rows of pixels. 
>>  In particular, when doing scan conversion (from analog format to a digital 
>> X/Y pixel raster) you have to offset Y by one every other frame if interlace 
>> is used, but not if it isn't.
>> 
>>   paul
>> 
>> 


[cctalk] Re: interlace [was: NTSC TV demodulator ]

2024-05-20 Thread Wayne S via cctalk
In the vt100, setup menu “B” had an interlace on or off setting.
I just looked it up.


Sent from my iPhone

> On May 20, 2024, at 10:51, Paul Koning via cctalk  
> wrote:
> 
> 
> 
>> On May 20, 2024, at 1:37 PM, Wayne S via cctalk  
>> wrote:
>> 
>> Young , hah. No i’m old 70.
>> The pc monitors, not Tv, always had a setup menu. Even the Vt100 series let 
>> you choose interlace if you needed. 
> 
> VT100?  I don't think so.  And yes, it has a setup menu, but that's setup of 
> the terminal functionality, not the monitor part.
> 
> The earliest monitors could only handle one format.  A major innovation was 
> "multisync" where the monitor would determine the horizontal and vertical 
> sweep rate and line count, and display things the right way.  The first PC I 
> owned had one of those, and as far as I can remember it had nothing that one 
> would call a "setup menu".
> 
> The reason interlace matters is not the very slight slope of the scan line in 
> analog monitors, but rather the fact that alternate frames are offset by half 
> the line spacing of the basic frame, so each frame sweeps out the gaps in 
> between the lines scanned by the preceding frame.  It matters to get that 
> right, otherwise you're not correctly displaying consecutive rows of pixels.  
> In particular, when doing scan conversion (from analog format to a digital 
> X/Y pixel raster) you have to offset Y by one every other frame if interlace 
> is used, but not if it isn't.
> 
>paul
> 
> 


[cctalk] Re: interlace [was: NTSC TV demodulator ]

2024-05-20 Thread Paul Koning via cctalk



> On May 20, 2024, at 1:37 PM, Wayne S via cctalk  wrote:
> 
> Young , hah. No i’m old 70.
> The pc monitors, not Tv, always had a setup menu. Even the Vt100 series let 
> you choose interlace if you needed. 

VT100?  I don't think so.  And yes, it has a setup menu, but that's setup of 
the terminal functionality, not the monitor part.

The earliest monitors could only handle one format.  A major innovation was 
"multisync" where the monitor would determine the horizontal and vertical sweep 
rate and line count, and display things the right way.  The first PC I owned 
had one of those, and as far as I can remember it had nothing that one would 
call a "setup menu".

The reason interlace matters is not the very slight slope of the scan line in 
analog monitors, but rather the fact that alternate frames are offset by half 
the line spacing of the basic frame, so each frame sweeps out the gaps in 
between the lines scanned by the preceding frame.  It matters to get that 
right, otherwise you're not correctly displaying consecutive rows of pixels.  
In particular, when doing scan conversion (from analog format to a digital X/Y 
pixel raster) you have to offset Y by one every other frame if interlace is 
used, but not if it isn't.

paul




[cctalk] Re: interlace [was: NTSC TV demodulator ]

2024-05-20 Thread Peter Corlett via cctalk
On Mon, May 20, 2024 at 11:13:38AM -0500, CAREY SCHUG via cctalk wrote:
[...]
> many games and entry pcs with old style tv analog format, don't interlace,
> and tube TVs nearly all (except maybe a few late model high end ones?) are
> fine with that, but I seem to recall that most or all digital/flat screen
> can't deal with non-interlace.

Flat panels sold as PC monitors tend to support a smaller range of video
timings than those sold as televisions. Any television which can't handle
non-interlaced 15kHz video should be returned to the shop as defective.

What you may however find is that while all TVs should support 15kHz video,
they sometimes artificially restrict the range of supported modes on a
per-input basis, purportedly for compatibility or ease-of-use or similar
marketing claptrap. Further, some models will offer a different feature set
based on the *name* you assigned to that input via the TV's menus.

So you may well find that your TV starts playing nicely with your 1980s
micros if you lie to it and claim that you've really connected a VHS
machine.



[cctalk] Re: Thirties techies and computing history

2024-05-20 Thread Christian Liendo via cctalk
Kennet Classic is still important in getting history out to the public.

On Mon, May 20, 2024 at 1:25 PM Bill Degnan via cctalk
 wrote:
>
> Lol!I don't care, our little non profit is but a wee dot on the map
> compared with the well-funded giants.
>
> On Mon, May 20, 2024, 1:12 PM Christian Liendo via cctalk <
> cctalk@classiccmp.org> wrote:
>
> > Sorry I forgot to add Kennet Classic. I failed, my mistake.
> >
> > On Mon, May 20, 2024 at 1:11 PM Christian Liendo 
> > wrote:
> > >
> > > I see computer history slowly growing. Before you had only one museum
> > > in the United States and now you have multiple ones such as but not
> > > limited to:
> > >
> > > American Computer Museum
> > > Computer History Museum
> > > Computer Museum of America
> > > Large Scale Systems Museum
> > > Rhode Island Computer Museum
> > > System Source Computer Museum
> > >
> > > 10 years ago I didn't see any computing history taught in a
> > > university. Now I see it being taught at NJIT.
> > >
> > >
> > https://news.njit.edu/new-computer-science-elective-examines-history-computing
> > >
> > > There are people whose hard work is keeping computer history alive.
> >


[cctalk] Re: Thirties techies and computing history

2024-05-20 Thread Chuck Guzis via cctalk
On 5/20/24 10:25, Bill Degnan via cctalk wrote:

>>> American Computer Museum
>>> Computer History Museum
>>> Computer Museum of America
>>> Large Scale Systems Museum
>>> Rhode Island Computer Museum
>>> System Source Computer Museum

Of course, there's the Living Computer Museum--oh, wait

-Chuck




[cctalk] Re: interlace [was: NTSC TV demodulator ]

2024-05-20 Thread Wayne S via cctalk
Young, hah. No, I’m old: 70.
The PC monitors, not TVs, always had a setup menu. Even the VT100 series let you 
choose interlace if you needed. 


Sent from my iPhone

> On May 20, 2024, at 10:06, CAREY SCHUG  wrote:
> 
> Wayne, you must be one of those thirty-something techies from another thread.
> 
> for those of us in our 60s and 70s,
> 
> setup mode?  huh?  old TVs and monitors were purely analog.  No on-screen 
> displays and non-volatile memory bytes for setup.  adjustments for size and 
> position were rheostats.  interlace (on TVs) was because the incoming sigonal 
> started SLIGHTLY later for the interlaced frame and the horizontal sync was 
> slightly different (advanced?) on the incoming signal relative to the 
> vertical sync.  
> 
> With digital, the conversion of the analog input to digital for the display 
> has to start recording only half the first line.   and whatever conversion 
> there is because on the analog display, the scan line is at a slight angle, 
> lower on the right, so the interlaced frame starts at the same vertical 
> height, in the middle, as the other frame started on the left side.
> 
> so, just curious.  how do digital TVs (and monitors) work?  I presume the 
> dots are a rectangle, not sloping down to the right, no half a line at the 
> top and bottom.  Do they just assume the brain can't tell that (for the 
> converted old analog tv signal) the image therefor slopes UP very slightly to 
> the right from what it "should" be? and the top line is blank on the left 
> side because that is the interlace frame?
> 
> --Carey
> 
>> On 05/20/2024 11:46 AM CDT Wayne S via cctalk  wrote:
>> 
>> 
>> IIRC, didn’t most older pc monitors have a setup mode where one of the 
>> options was interlace or non-interlace.
>> 
>> 
>> Sent from my iPhone
>> 
 On May 20, 2024, at 09:35, Paul Koning via cctalk  
 wrote:
>>> 
>>> I think you have that backwards.
>>> 
>>> TVs use interlace.  Older PC displays may do so, or not; typically the 480 
>>> line format was not interlaced but there might be high resolution modes 
>>> that were.  The reason was to deal with bandwidth limitations.
>>> 
>>> Flat panel displays normally support a pile of input formats, though only 
>>> the "native" format (the actual line count matching the display hardware) 
>>> is directly handled, all the others involve reformatting to the native 
>>> format.  That reformatting generally results in some loss of display 
>>> quality, how much depends on how well the relevant hardware is designed.  
>>> And interlaced formats are often supported not just for the VGA input (if 
>>> there is one) but also for DVI/HDMI inputs.  To get the accurate answer you 
>>> have to check the specification sheet.
>>> 
>>>   paul
>>> 
 On May 20, 2024, at 12:13 PM, CAREY SCHUG via cctalk 
  wrote:
 
 This may have been covered before, VERY early in this tread.
 
 I think I tried a game on a flatscreen, and had issues.  I don't know if 
 it applies to the radio shack Color Computer, the interest of the original 
 poster.
 
 many games and entry pcs with old style tv analog format, don't interlace, 
 and tube TVs nearly all (except maybe a few late model high end ones?) are 
 fine with that, but I seem to recall that most or all digital/flat screen  
 can't deal with non-interlace.
 
 --Carey
>>> 


[cctalk] Re: Thirties techies and computing history

2024-05-20 Thread Bill Degnan via cctalk
Lol!  I don't care, our little non-profit is but a wee dot on the map
compared with the well-funded giants.

On Mon, May 20, 2024, 1:12 PM Christian Liendo via cctalk <
cctalk@classiccmp.org> wrote:

> Sorry I forgot to add Kennet Classic. I failed, my mistake.
>
> On Mon, May 20, 2024 at 1:11 PM Christian Liendo 
> wrote:
> >
> > I see computer history slowly growing. Before you had only one museum
> > in the United States and now you have multiple ones such as but not
> > limited to:
> >
> > American Computer Museum
> > Computer History Museum
> > Computer Museum of America
> > Large Scale Systems Museum
> > Rhode Island Computer Museum
> > System Source Computer Museum
> >
> > 10 years ago I didn't see any computing history taught in a
> > university. Now I see it being taught at NJIT.
> >
> >
> https://news.njit.edu/new-computer-science-elective-examines-history-computing
> >
> > There are people whose hard work is keeping computer history alive.
>


[cctalk] Re: Thirties techies and computing history

2024-05-20 Thread Christian Liendo via cctalk
Sorry I forgot to add Kennet Classic. I failed, my mistake.

On Mon, May 20, 2024 at 1:11 PM Christian Liendo  wrote:
>
> I see computer history slowly growing. Before you had only one museum
> in the United States and now you have multiple ones such as but not
> limited to:
>
> American Computer Museum
> Computer History Museum
> Computer Museum of America
> Large Scale Systems Museum
> Rhode Island Computer Museum
> System Source Computer Museum
>
> 10 years ago I didn't see any computing history taught in a
> university. Now I see it being taught at NJIT.
>
> https://news.njit.edu/new-computer-science-elective-examines-history-computing
>
> There are people whose hard work is keeping computer history alive.


[cctalk] Re: Thirties techies and computing history

2024-05-20 Thread Christian Liendo via cctalk
I see computer history slowly growing. Before you had only one museum
in the United States and now you have multiple ones such as but not
limited to:

American Computer Museum
Computer History Museum
Computer Museum of America
Large Scale Systems Museum
Rhode Island Computer Museum
System Source Computer Museum

10 years ago I didn't see any computing history taught in a
university. Now I see it being taught at NJIT.

https://news.njit.edu/new-computer-science-elective-examines-history-computing

There are people whose hard work is keeping computer history alive.


[cctalk] Re: interlace [was: NTSC TV demodulator ]

2024-05-20 Thread CAREY SCHUG via cctalk
Wayne, you must be one of those thirty-something techies from another thread.

for those of us in our 60s and 70s,

setup mode?  huh?  Old TVs and monitors were purely analog.  No on-screen 
displays and non-volatile memory bytes for setup.  Adjustments for size and 
position were rheostats.  Interlace (on TVs) was because the incoming signal 
started SLIGHTLY later for the interlaced frame and the horizontal sync was 
slightly different (advanced?) on the incoming signal relative to the vertical 
sync.  

With digital, the conversion of the analog input to digital for the display has 
to start by recording only half the first line.  And there is whatever conversion 
is needed because, on the analog display, the scan line is at a slight angle, 
lower on the right, so the interlaced frame starts at the same vertical height, 
in the middle, as the other frame started on the left side.

so, just curious.  how do digital TVs (and monitors) work?  I presume the dots 
are a rectangle, not sloping down to the right, no half a line at the top and 
bottom.  Do they just assume the brain can't tell that (for the converted old 
analog tv signal) the image therefor slopes UP very slightly to the right from 
what it "should" be? and the top line is blank on the left side because that is 
the interlace frame?

--Carey

> On 05/20/2024 11:46 AM CDT Wayne S via cctalk  wrote:
> 
>  
> IIRC, didn’t most older pc monitors have a setup mode where one of the 
> options was interlace or non-interlace.
> 
> 
> Sent from my iPhone
> 
> > On May 20, 2024, at 09:35, Paul Koning via cctalk  
> > wrote:
> > 
> > I think you have that backwards.
> > 
> > TVs use interlace.  Older PC displays may do so, or not; typically the 480 
> > line format was not interlaced but there might be high resolution modes 
> > that were.  The reason was to deal with bandwidth limitations.
> > 
> > Flat panel displays normally support a pile of input formats, though only 
> > the "native" format (the actual line count matching the display hardware) 
> > is directly handled, all the others involve reformatting to the native 
> > format.  That reformatting generally results in some loss of display 
> > quality, how much depends on how well the relevant hardware is designed.  
> > And interlaced formats are often supported not just for the VGA input (if 
> > there is one) but also for DVI/HDMI inputs.  To get the accurate answer you 
> > have to check the specification sheet.
> > 
> >paul
> > 
> >> On May 20, 2024, at 12:13 PM, CAREY SCHUG via cctalk 
> >>  wrote:
> >> 
> >> This may have been covered before, VERY early in this tread.
> >> 
> >> I think I tried a game on a flatscreen, and had issues.  I don't know if 
> >> it applies to the radio shack Color Computer, the interest of the original 
> >> poster.
> >> 
> >> many games and entry pcs with old style tv analog format, don't interlace, 
> >> and tube TVs nearly all (except maybe a few late model high end ones?) are 
> >> fine with that, but I seem to recall that most or all digital/flat screen  
> >> can't deal with non-interlace.
> >> 
> >> --Carey
> >


[cctalk] Re: interlace [was: NTSC TV demodulator ]

2024-05-20 Thread Wayne S via cctalk
IIRC, didn’t most older PC monitors have a setup mode where one of the options 
was interlace or non-interlace?


Sent from my iPhone

> On May 20, 2024, at 09:35, Paul Koning via cctalk  
> wrote:
> 
> I think you have that backwards.
> 
> TVs use interlace.  Older PC displays may do so, or not; typically the 480 
> line format was not interlaced but there might be high resolution modes that 
> were.  The reason was to deal with bandwidth limitations.
> 
> Flat panel displays normally support a pile of input formats, though only the 
> "native" format (the actual line count matching the display hardware) is 
> directly handled, all the others involve reformatting to the native format.  
> That reformatting generally results in some loss of display quality, how much 
> depends on how well the relevant hardware is designed.  And interlaced 
> formats are often supported not just for the VGA input (if there is one) but 
> also for DVI/HDMI inputs.  To get the accurate answer you have to check the 
> specification sheet.
> 
>paul
> 
>> On May 20, 2024, at 12:13 PM, CAREY SCHUG via cctalk  
>> wrote:
>> 
>> This may have been covered before, VERY early in this tread.
>> 
>> I think I tried a game on a flatscreen, and had issues.  I don't know if it 
>> applies to the radio shack Color Computer, the interest of the original 
>> poster.
>> 
>> many games and entry pcs with old style tv analog format, don't interlace, 
>> and tube TVs nearly all (except maybe a few late model high end ones?) are 
>> fine with that, but I seem to recall that most or all digital/flat screen  
>> can't deal with non-interlace.
>> 
>> --Carey
> 


[cctalk] Re: interlace [was: NTSC TV demodulator ]

2024-05-20 Thread Paul Koning via cctalk
I think you have that backwards.

TVs use interlace.  Older PC displays may do so, or not; typically the 480 line 
format was not interlaced but there might be high resolution modes that were.  
The reason was to deal with bandwidth limitations.

Flat panel displays normally support a pile of input formats, though only the 
"native" format (the actual line count matching the display hardware) is 
directly handled, all the others involve reformatting to the native format.  
That reformatting generally results in some loss of display quality, how much 
depends on how well the relevant hardware is designed.  And interlaced formats 
are often supported not just for the VGA input (if there is one) but also for 
DVI/HDMI inputs.  To get the accurate answer you have to check the 
specification sheet.

paul

> On May 20, 2024, at 12:13 PM, CAREY SCHUG via cctalk  
> wrote:
> 
> This may have been covered before, VERY early in this tread.
> 
> I think I tried a game on a flatscreen, and had issues.  I don't know if it 
> applies to the radio shack Color Computer, the interest of the original 
> poster.
> 
> many games and entry pcs with old style tv analog format, don't interlace, 
> and tube TVs nearly all (except maybe a few late model high end ones?) are 
> fine with that, but I seem to recall that most or all digital/flat screen  
> can't deal with non-interlace.
> 
> --Carey



[cctalk] Re: interlace [was: NTSC TV demodulator ]

2024-05-20 Thread CAREY SCHUG via cctalk
This may have been covered before, VERY early in this thread.

I think I tried a game on a flatscreen and had issues.  I don't know if it 
applies to the Radio Shack Color Computer, the interest of the original poster.

Many games and entry-level PCs with the old-style analog TV output don't 
interlace, and tube TVs nearly all (except maybe a few late-model high-end 
ones?) are fine with that, but I seem to recall that most or all digital/flat 
screens can't deal with a non-interlaced signal.

--Carey


[cctalk] Re: Thirties techies and computing history

2024-05-20 Thread The Doctor via cctalk


On Sunday, May 19th, 2024 at 13:31, ben via cctalk  
wrote:

> My mind is fine, it the eyes that are going.
> Screens are getting bigger and text is getting smaller.
> I must be dreaming that.

HiDPI flatpanel displays definitely don't help with this. :/

The Doctor [412/724/301/703/415/510]
WWW: https://drwho.virtadpt.net/
Don't be mean. You don't have to be mean.



[cctalk] Re: Thirties techies and computing history

2024-05-20 Thread Paul Koning via cctalk



> On May 20, 2024, at 9:33 AM, Nico de Jong via cctalk  
> wrote:
> 
> 
> Den 2024-05-20 kl. 15:26 skrev Paul Koning via cctalk:
>> 
>> ...
>> I just flipped through it briefly, and spotted what was the Electrologica 
>> headquarters (page 143).  And a few pages later there is a bit of history 
>> that explains the French origin of the PR8000 (or P8000), which was where I 
>> learned assembly language programming.  Quite a neat machine but very little 
>> documentation of it still exists.
>> 
>>  paul
> 
> I have quite a lot of documentation for the P85x CPU's and other stuff. Let 
> me know what you need.
> 
> /Nico

The P85x are 16-bit machines, right?  The PR8000 is 24 bits.  The only 
documentation I have seen is what I supplied to Bitsavers.

paul

[cctalk] Re: Thirties techies and computing history

2024-05-20 Thread Nico de Jong via cctalk



On 2024-05-20 at 15:26, Paul Koning via cctalk wrote:



On May 20, 2024, at 6:08 AM, Nico de Jong via cctalk  
wrote:

...
I used to work on the P6000 series, and they had a very interesting 
architecture. For those who want to know a bit more about Philips' history, I 
can recommend an e-book written by one of the guys in Sweden, where the P6000 
series was developped. The P6000 was based on the P800, but extended into a 
system appropiate for bookings, airline reservations, banking etc.

(Link below).

The author is Mats Danielson. By the way, the James Bond film "For your eyes only" shows 
a lot of Philips hardware. The "atomic comb" is a PTS 6272 keyboard with (I think) a 
display boltet to the back of it. Hilarious, just like the book.

/Nico

---
Read my new history book (free e-book)

https://www.researchgate.net/publication/37427_The_Rise_and_Fall_of_Philips_Data_Systems

Nice!

I just flipped through it briefly, and spotted what was the Electrologica 
headquarters (page 143).  And a few pages later there is a bit of history that 
explains the French origin of the PR8000 (or P8000), which was where I learned 
assembly language programming.  Quite a neat machine but very little 
documentation of it still exists.

paul


I have quite a lot of documentation for the P85x CPU's and other stuff. 
Let me know what you need.


/Nico



[cctalk] Re: Thirties techies and computing history

2024-05-20 Thread Paul Koning via cctalk



> On May 20, 2024, at 6:08 AM, Nico de Jong via cctalk  
> wrote:
> 
> ...
> I used to work on the P6000 series, and they had a very interesting 
> architecture. For those who want to know a bit more about Philips' history, I 
> can recommend an e-book written by one of the guys in Sweden, where the P6000 
> series was developped. The P6000 was based on the P800, but extended into a 
> system appropiate for bookings, airline reservations, banking etc.
> 
> (Link below).
> 
> The author is Mats Danielson. By the way, the James Bond film "For your eyes 
> only" shows a lot of Philips hardware. The "atomic comb" is a PTS 6272 
> keyboard with (I think) a display boltet to the back of it. Hilarious, just 
> like the book.
> 
> /Nico
> 
> ---
> Read my new history book (free e-book)
> 
> https://www.researchgate.net/publication/37427_The_Rise_and_Fall_of_Philips_Data_Systems

Nice!

I just flipped through it briefly, and spotted what was the Electrologica 
headquarters (page 143).  And a few pages later there is a bit of history that 
explains the French origin of the PR8000 (or P8000), which was where I learned 
assembly language programming.  Quite a neat machine but very little 
documentation of it still exists.

paul



[cctalk] Re: Thirties techies and computing history

2024-05-20 Thread Nico de Jong via cctalk



On 2024-05-20 at 10:56, Tony Duell via cctalk wrote:

On Sun, May 19, 2024 at 4:56 PM Tarek Hoteit via cctalk
 wrote:

Thank you, Josh. How did your passion start with classical computers? Maybe 
this helps in understanding the generation?

I know how I got started, but not really why. Although I can explain
how it progressed.

It was May 1986, I was at a sale of old electronics hoping to get a
keyboard for my homebrew computer (this was before cheap PC keyboards
in the UK). I saw a Philips P850 minicomputer being sold essentially
for the scrap metal price. It had the user and service manuals with
it, and it had a lights-and-switches front panel which I'd read about
and never used. I bought it and somehow got it back to my student
room.

That evening I realised that there was a period of about 25 years of
computing which was going to be lost and forgotten if nobody did
something about it. So I did something and started collecting and
restoring all the old computers I could find. It was a lot easier to
find minicomputers and the like back then than it is now.

But why did I buy that initial P850? I am not sure. I've always been
interested in the history of electronics and computers, so perhaps
that was it.

-tony


I used to work on the P6000 series, and they had a very interesting 
architecture. For those who want to know a bit more about Philips' 
history, I can recommend an e-book written by one of the guys in Sweden, 
where the P6000 series was developed. The P6000 was based on the P800, 
but extended into a system appropriate for bookings, airline 
reservations, banking etc.


(Link below).

The author is Mats Danielson. By the way, the James Bond film "For your 
eyes only" shows a lot of Philips hardware. The "atomic comb" is a PTS 
6272 keyboard with (I think) a display bolted to the back of it. 
Hilarious, just like the book.


/Nico

---
Read my new history book (free e-book)

https://www.researchgate.net/publication/37427_The_Rise_and_Fall_of_Philips_Data_Systems

/Nico




[cctalk] Re: NTSC TV demodulator

2024-05-20 Thread Adrian Godwin via cctalk
I've also had TVs with modular RF inputs. One was a huge plasma wall TV and
the other a tiny cheap caravan TV but both had locations for an RF input
card which I assumed could be replaced by cards to suit local national TV
standards. These have no visible controls and, like the PC TV cards,
probably have an I2C tuner module. The linux TV application xawtv has code
for controlling these.

A similar device was used with BBC and Archimedes computers to receive the
UK teletext service - it tuned to the TV services and extracted data that
was broadcast in the vertical blanking interval.
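On a current Linux kernel those Bt848-era cards show up as V4L2 devices, so a
minimal tuning sketch might look like the following; the device node
(/dev/video0) and the channel choice (US channel 3, visual carrier around
61.25 MHz) are assumptions for illustration, not details from the setup
described above:

/* Minimal V4L2 sketch: select NTSC and tune an analog tuner card
 * (e.g. a Bt848-era capture card) to US channel 3.
 * Device node and frequency are illustrative assumptions. */
#include <fcntl.h>
#include <stdio.h>
#include <string.h>
#include <sys/ioctl.h>
#include <unistd.h>
#include <linux/videodev2.h>

int main(void)
{
    int fd = open("/dev/video0", O_RDWR);   /* assumed device node */
    if (fd < 0) { perror("open"); return 1; }

    v4l2_std_id std = V4L2_STD_NTSC_M;
    if (ioctl(fd, VIDIOC_S_STD, &std) < 0)
        perror("VIDIOC_S_STD");

    struct v4l2_frequency f;
    memset(&f, 0, sizeof f);
    f.tuner = 0;
    f.type = V4L2_TUNER_ANALOG_TV;
    /* Analog tuners use 62.5 kHz units: 61.25 MHz / 62.5 kHz = 980
     * (US channel 3 visual carrier). */
    f.frequency = 980;
    if (ioctl(fd, VIDIOC_S_FREQUENCY, &f) < 0)
        perror("VIDIOC_S_FREQUENCY");

    close(fd);
    return 0;
}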


On Mon, May 20, 2024 at 11:13 AM Adrian Godwin  wrote:

> At one time I had a few Bt848 based PCI TV tuner cards for a PC - Hauppage
> was a big player but there were others. Some were  composite video in, some
> also had a TV tuner section.
>
> I tried one as a video converter for PAL composite out from some home
> micro - possibly a Jupiter Ace. It wasn't that great, to be honest and
> doubtless the RF input is even worse but you don't really expect a great
> deal from an RF output in terms of video quality. If you can find one (most
> have been replaced by DVB-T cards : do they also accept analogue TV signals
> ?) they should be almost free.
>
>
>
> On Mon, May 20, 2024 at 10:03 AM Tony Duell via cctalk <
> cctalk@classiccmp.org> wrote:
>
>> On Sun, May 19, 2024 at 1:08 PM Will Cooke via cctalk
>>  wrote:
>>
>> > Does anyone know of a small TV tuner that tunes old analog TV channels
>> (US NTSC) and outputs composite or VGA or HDMI signals? I've looked around
>> a bit but haven't found anything. It's relatively easy to build one, but I
>> would prefer a pre-built solution.  And I'm sure others have run into this
>> same problem.
>>
>> Not for NTSC video, but for the UK UHF analogue TV...
>>
>> About 30 years ago we had a hobbyist electronics shop chain called
>> Maplin, who produced and sold their own range of kits, many of them
>> very good. When NICAM stereo TV sound was introduced in the UK, they
>> produced a kit to decode the NICAM signal to audio. In fact it was a
>> total of 3 kits -- the NICAM decoder board, a TV tuner/IF strip to
>> feed it (if you didn't want to try to tap off the NICAM subcarrier
>> from your existing TV 's tuner), and the case/connectors/tuner channel
>> memory/etc..
>>
>> I built the entire system for my parents who had a VCR [1] that could
>> record stereo sound from line-level inputs but which pre-dated NICAM.
>>
>> I then realised that the tuner/IF board on its own, with a multi-turn
>> pot added for tuning (rather than the remote control/memory control IC
>> used in the full unit) would be ideal for turning the RF output of UK
>> home computers into composite video. So I built a second tuner board
>> for that. Still have it, still use it.
>>
>> -tony
>>
>


[cctalk] Re: NTSC TV demodulator

2024-05-20 Thread Adrian Godwin via cctalk
At one time I had a few Bt848 based PCI TV tuner cards for a PC - Hauppauge
was a big player but there were others. Some were composite video in, some
also had a TV tuner section.

I tried one as a video converter for PAL composite out from some home micro
- possibly a Jupiter Ace. It wasn't that great, to be honest and doubtless
the RF input is even worse but you don't really expect a great deal from an
RF output in terms of video quality. If you can find one (most have been
replaced by DVB-T cards : do they also accept analogue TV signals ?) they
should be almost free.



On Mon, May 20, 2024 at 10:03 AM Tony Duell via cctalk <
cctalk@classiccmp.org> wrote:

> On Sun, May 19, 2024 at 1:08 PM Will Cooke via cctalk
>  wrote:
>
> > Does anyone know of a small TV tuner that tunes old analog TV channels
> (US NTSC) and outputs composite or VGA or HDMI signals? I've looked around
> a bit but haven't found anything. It's relatively easy to build one, but I
> would prefer a pre-built solution.  And I'm sure others have run into this
> same problem.
>
> Not for NTSC video, but for the UK UHF analogue TV...
>
> About 30 years ago we had a hobbyist electronics shop chain called
> Maplin, who produced and sold their own range of kits, many of them
> very good. When NICAM stereo TV sound was introduced in the UK, they
> produced a kit to decode the NICAM signal to audio. In fact it was a
> total of 3 kits -- the NICAM decoder board, a TV tuner/IF strip to
> feed it (if you didn't want to try to tap off the NICAM subcarrier
> from your existing TV 's tuner), and the case/connectors/tuner channel
> memory/etc..
>
> I built the entire system for my parents who had a VCR [1] that could
> record stereo sound from line-level inputs but which pre-dated NICAM.
>
> I then realised that the tuner/IF board on its own, with a multi-turn
> pot added for tuning (rather than the remote control/memory control IC
> used in the full unit) would be ideal for turning the RF output of UK
> home computers into composite video. So I built a second tuner board
> for that. Still have it, still use it.
>
> -tony
>


[cctalk] Re: Thirties techies and computing history

2024-05-20 Thread Tony Duell via cctalk
On Sun, May 19, 2024 at 4:56 PM Tarek Hoteit via cctalk
 wrote:
>
> Thank you, Josh. How did your passion start with classical computers? Maybe 
> this helps in understanding the generation?

I know how I got started, but not really why. Although I can explain
how it progressed.

It was May 1986, I was at a sale of old electronics hoping to get a
keyboard for my homebrew computer (this was before cheap PC keyboards
in the UK). I saw a Philips P850 minicomputer being sold essentially
for the scrap metal price. It had the user and service manuals with
it, and it had a lights-and-switches front panel which I'd read about
and never used. I bought it and somehow got it back to my student
room.

That evening I realised that there was a period of about 25 years of
computing which was going to be lost and forgotten if nobody did
something about it. So I did something and started collecting and
restoring all the old computers I could find. It was a lot easier to
find minicomputers and the like back then than it is now.

But why did I buy that initial P850? I am not sure. I've always been
interested in the history of electronics and computers, so perhaps
that was it.

-tony


[cctalk] Re: NTSC TV demodulator

2024-05-20 Thread Tony Duell via cctalk
On Sun, May 19, 2024 at 1:08 PM Will Cooke via cctalk
 wrote:

> Does anyone know of a small TV tuner that tunes old analog TV channels (US 
> NTSC) and outputs composite or VGA or HDMI signals? I've looked around a bit 
> but haven't found anything. It's relatively easy to build one, but I would 
> prefer a pre-built solution.  And I'm sure others have run into this same 
> problem.

Not for NTSC video, but for the UK UHF analogue TV...

About 30 years ago we had a hobbyist electronics shop chain called
Maplin, who produced and sold their own range of kits, many of them
very good. When NICAM stereo TV sound was introduced in the UK, they
produced a kit to decode the NICAM signal to audio. In fact it was a
total of 3 kits -- the NICAM decoder board, a TV tuner/IF strip to
feed it (if you didn't want to try to tap off the NICAM subcarrier
from your existing TV's tuner), and the case/connectors/tuner channel
memory/etc.

I built the entire system for my parents who had a VCR [1] that could
record stereo sound from line-level inputs but which pre-dated NICAM.

I then realised that the tuner/IF board on its own, with a multi-turn
pot added for tuning (rather than the remote control/memory control IC
used in the full unit) would be ideal for turning the RF output of UK
home computers into composite video. So I built a second tuner board
for that. Still have it, still use it.

-tony


[cctalk] Re: Thirties techies and computing history

2024-05-19 Thread Bill Degnan via cctalk
> On the matter of the interest of the younger generation, I had 25 years
> of teaching at the end of my career as a point of observation. I
> frequently went into stories to explain how things that I taught matter
>  
>
> As I get older (71 this year) I wonder if there are really enough people
> in the world who care!
>
> cheers,
> Nigel Johnson
> (Previously popularly known as Bill Johnson, MD of Emulex Canada 1984-1987)
>
>
>
Having run a computer museum for 5 years, I can tell you there will always
be  young people who care about old tech, and who seek out knowledge.  I
have had many elementary school kids show up who know quite obscure details
of systems sold decades before they were born.  For the average visitor
however I learned really quick that you have to start off with something
that the person can connect with easily, before you get right into deep
ancient lore.

At some point computers as we know them (whatever your age) will be
forgotten.  The first-person experiences and problem solving, the
real-world use.   Like the archeologist who learns how to make ancient
human stone points for research purposes, but who has no actual reason to
hunt and prepare a meal with them.  The context will be lost.
Even mundane things like printing, connecting with a modem, saving files
to external media... fading away.

Bill


[cctalk] Re: Thirties techies and computing history

2024-05-19 Thread Nigel Johnson Ham via cctalk

I hear you there!
I started out as a junior FE on the Univac 418 in 1971. Back then the 
console was a modified Teletype and there was no problem seeing the 
characters as they wove across the page at 10 cps!
Over my career, starting with 80x24 video terminals (VT05, VT52, VT100 
and clones), I was excited as the video went MDA, CGA, VGA, XGA and all 
the rest as resolutions increased.

Lately however I am decreasing the resolution so I can see the characters!

On the matter of the interest of the younger generation, I had 25 years 
of teaching at the end of my career as a point of observation. I 
frequently went into stories to explain how things that I taught matter 
: To a question about why we need to learn about cycle times of machine 
instructions, I recounted a story about a time i was called in to find 
out why both 6800 processors that were monitoring SWR levels of multiple 
25kW BDCST FM TXs were showing 'Computer Failure' after a lightning 
strike on the CN Tower in Toronto.  I looked at the source code, 
calculated the cycle times, and worked out that the program was 
executing exactly as planned - IIRC 13.5 ms for the loop.  I put my 
scope on the watchdog reset signal and saw exactly that time!  (The 
problem was that the designer  was driving relays directly from TTL and 
the system had never been powered down). Adding a transistor driver 
solved the problem and stressed to the students the importance of 
carrying knowledge of instruction times and basic electronics in their 
toolbags!
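
For anyone who has never done that kind of bookkeeping, a toy version of the
cycle-time estimate might look like the sketch below; the instruction mix,
counts, and 1 MHz clock are made-up illustrative numbers, not the actual
firmware analysed in the story above:

/* Toy worst-case loop-time estimate for a 1 MHz MC6800-class CPU.
 * The instruction mix and counts are invented for illustration; the
 * idea is simply: sum (cycles per instruction * times executed). */
#include <stddef.h>
#include <stdio.h>

struct step { const char *mnemonic; int cycles; int times; };

int main(void)
{
    const double clock_hz = 1.0e6;          /* assumed 1 MHz E clock */
    const struct step loop[] = {
        { "LDAA ext", 4, 1000 },             /* hypothetical counts */
        { "CMPA imm", 2, 1000 },
        { "BNE",      4, 1000 },
        { "JSR ext",  9,   50 },
        { "RTS",      5,   50 },
    };

    long total_cycles = 0;
    for (size_t i = 0; i < sizeof loop / sizeof loop[0]; i++)
        total_cycles += (long)loop[i].cycles * loop[i].times;

    printf("loop time = %.2f ms\n", 1e3 * total_cycles / clock_hz);

    /* Compare this figure against the watchdog period; if the loop can
     * legitimately exceed it, the firmware isn't the culprit. */
    return 0;
}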


From the above interaction, I got comments on my end-of-term faculty 
feedback form ranging from 'he brings the theory to life' to 'he keeps 
on going off into stories from the past that no longer matter'!


In one consulting contract I did, a young graduate was showing his 
disdain for the past when another consultant pointed out that I had 
already forgotten more than he will ever know about computer engineering!


Sometimes you can't teach them, I hope the remainder are able to carry 
the torch!


As I get older (71 this year) I wonder if there are really enough people 
in the world who care!


cheers,
Nigel Johnson
(Previously popularly known as Bill Johnson, MD of Emulex Canada 1984-1987)


On 2024-05-19 16:31, ben via cctalk wrote:




Don't let your mind get old. It’s a choice.


My mind is fine, it the eyes that are going.
Screens are getting bigger and text is getting smaller.
I must be dreaming that.





--
Nigel Johnson, MSc., MIEEE, MCSE VE3ID/G4AJQ/VA3MCU
Amateur Radio, the origin of the open-source concept!
Skype:  TILBURY2591




[cctalk] Re: Thirties techies and computing history

2024-05-19 Thread Murray McCullough via cctalk
My first emulator was for the Coleco ADAM back in the 1990’s. I bought the
ADAM in 1984 and watched a community grow up around it in various locations
across Canada and the US. The ADAM-con conventions began in 1989 in
Orlando. Emulation began in the 1990’s as a response to the continued
interest in keeping the 8-bit world going.

Happy computing,

Murray 


On Sun, May 19, 2024 at 4:33 PM ben via cctalk 
wrote:

> On 2024-05-19 9:14 a.m., Tarek Hoteit via cctalk wrote:
> > A friend of a friend had a birthday gathering. Everyone there was in
> their thirties, except for myself, my wife, and our friend. Anyway, I met a
> Google engineer, a Microsoft data scientist, an Amazon AWS recruiter (I
> think she was a recruiter), and a few others in tech who are friends with
> the party host. I had several conversations about computer origins, the
> early days of computing, its importance in what we have today, and so on.
> What I found disappointing and saddening at the same time is their utmost
> ignorance about computing history or even early computers. Except for their
> recall of the 3.5 floppy or early 2000’s Windows, there was absolutely
> nothing else that they were familiar with. That made me wonder if this is a
> sign that our living version of classical personal computing, in which many
> of us here in this group witnessed the invention of personal computing in
> the 70s, will stop with our generation. I assume that the most engaging
> folks in this newsgroup are in their fifties and beyond. (No offense to
> anyone. I am turning fifty myself)  I sense that no other generation
> following this user group's generation will ever talk about Altairs, CP/M
> s, PDPs, S100 buses, Pascal, or anything deemed exciting in computing. Is
> there hope, or is this the end of the line for the most exciting era of
> personal computers? Thoughts?
> >
> > Regards,
> > Tarek Hoteit
> >
> Well with the internet I have been finding a lot more about behind the
> history of the 1970's.
> The West Coast made the chips, and the East coast made the computers,
> while here in Canada,We just got to watch computers on TV with the
> blinking lights back then and the few chip sold by Radio Shack.
>
> Back then you could get to build a computer of some kind, on the kitchen
> table, as the knowledge was available, and parts were through-hole. People
> are going retro simply because modern computers are too complex with
> documentation known to a few.
> The Z80 may be long gone, but I am sure lots of 8080's are still
> for sale on ebay.
>
> I wanted to build a computer in my teens, and now I have time and the
> money. Looking back in time I see how bad the tech was back then for the
> average Joe.  BASIC to rot your brain. 4K ram so you never learned how
> to comment stuff. Word lengths 4,8,16 so you spent all your time shoe
> horning stuff to fit. Parts costing an arm and a leg, and three weeks
> for delivery.
> (Today parts from China 95 cents, 2 months delivery and arm and leg for
> shipping).
>
> My latest design on paper requires 74LSXX, 74H74, CY7C122 (25 ns 256x4
> RAM), a 13 MHz osc, and lots of CMOS 22V10's. An 18-bit serial CPU,
> with a memory cycle time of 2.25 us. I am still working on my
> personal computer.
> Who knows, it might even work, but first the EMULATOR
> and cross assembler.
> Ben.
>
>
>
>
>
>
>
>
>
>


[cctalk] Re: Thirties techies and computing history

2024-05-19 Thread ben via cctalk





Don't let your mind get old. It’s a choice.


My mind is fine, it's the eyes that are going.
Screens are getting bigger and text is getting smaller.
I must be dreaming that.




