RE: Reproducing old machines with newer technology (Re: PDP-12 at the RICM)

2015-07-16 Thread Dave G4UGM
 -Original Message-
 From: cctalk [mailto:cctalk-boun...@classiccmp.org] On Behalf Of Jay Jaeger
 Sent: 16 July 2015 01:56
 To: cctalk@classiccmp.org
 Subject: Re: Reproducing old machines with newer technology (Re: PDP-12 at
 the RICM)
 
 Saul is indeed cited in the ACM article,
 
 http://dl.acm.org/citation.cfm?id=365671
 
 I know that Purdue had some folks that did their own maintenance, and
 sure, by the late 1960's one could certainly pick them up cheap - the gold
 scrappers were not quite the issue they became later.  I know this because,
 besides the 7094 II that I did some work on (including replacing a germanium
 transistor with a modern silicon one at one point), the U. Wisconsin
 Chemistry department had a 7090 (oil core) on the 9th floor.  Some folks
 from Purdue came up at one point and helped fix a problem with it.
 
 Around 1975 the IBM 1410 and the IBM 7094 II we played with at UW were
 sold to a company in Ohio - or at least pieces were.  Paul Pierce and I went
 back to that same company in 1998 and recovered some of the IBM
 1410 and IBM 709x tapes that he lists on his site - Paul has an amazing setup
 where he reads the tapes *analog* using a 7 track drive, and then post-
 processes the results to de-skew and recover the data.
 
 JRJ
 

Apparently the School of Medicine, Manchester University, England were given a 
7090 which they later connected to a PDP-8. A bit of googling turned this up :-

http://www.ukuug.org/newsletter/linux-newsletter/linux@uk12/dclark.shtml

Sadly Dave passed away about a year ago, but he kept many tapes and card decks, 
which are with TNMOC at Bletchley.

Dave


 On 7/15/2015 7:12 PM, Chuck Guzis wrote:
  On 07/15/2015 04:05 PM, Jay Jaeger wrote:
 
  Paul adapted PUFFT (Purdue University Fast FORTRAN Translator) to do
  RS-232 bit serial I/O through a sense switch, and I wrote a spooling
  program that ran on a Datacraft 6024 located in the same room to do
  the card reading and printing.  I suppose somewhere inside of it the
  DC 6024 was humiliated - I expect that it was much faster than the
  7094 II.  ;)
 
  I remember PUFFT--that was Saul Rosen's baby, wasn't it?  A FORTRAN
  for undergrads--put in anything that *resembled* a FORTRAN statement
  and get some sort of result.  Missing parentheses?  Misspelling?
  Outright syntax errors? No problem.  I think Purdue had two 709x
  systems for PUFFT.  The CDC 6500 was reserved for Serious Work.
 
  I understand that at the time, 7090/7094's were comparatively
  plentiful and (comparatively) inexpensive, hence their use.
 
  Liquid nitrogen would be the or worse part.  ;)
 
  Neil had a lot of interesting stories about the ETA-10 (originally
  named the GF-10 for the target of 10 gigaflops).  It all seemed so
  fantastic back then.
 
  Ah, it's all fun...
 
  --Chuck
 
 
 
 
 



Re: Reproducing old machines with newer technology (Re: PDP-12 at the RICM)

2015-07-16 Thread Chuck Guzis

On 07/16/2015 01:12 AM, Dave G4UGM wrote:


Apparently the School of Medicine, Manchester University, England
were given a 7090 which they later connected to a PDP-8. A bit of
googling turned this up :-

http://www.ukuug.org/newsletter/linux-newsletter/linux@uk12/dclark.shtml


Nice article.  Many folks fail to appreciate the link between medicine 
and the history of computing.  In particular, my first encounter with 
database technology came from a study of MEDLARS, the system still in 
existence today at the NIH.  What I found particularly fascinating was 
the ability to automatically organize text documents, subject them to 
English-language queries, and get results.  (All done on tape, naturally.)


I'm reminded of this when I look over at my bookshelf and see Gerry 
Salton's magnum opus Automatic Information Organization and Retrieval 
sitting on a shelf. MEDLARS figures prominently in this book.  Back 
then, it was pretty hot stuff.


Of course, now we have Google...

--Chuck








Re: Reproducing old machines with newer technology (Re: PDP-12 at the RICM)

2015-07-15 Thread Jay Jaeger
This brings up a good point:  just because a D flip-flop is clocked by
something other than a system-wide (or subsystem-wide) clock does not
turn it into a latch.  Flip-flops can be clocked by combinatorial inputs.
This can be problematic, of course, as it can cause glitch
problems - we had a couple of those in our student-designed 12 bit
computer, where I ended up feeding the combinatorial input into a D flip-flop
that was clocked by the FPGA-wide 50 MHz clock, and then fed the
output of that to the flip flops (in my case JK rather than D, but the
idea would be the same).
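
For illustration, a minimal VHDL sketch of that sort of fix - registering the
combinatorial term with the board clock before anything else is clocked from
it.  The names and the 50 MHz clock are assumptions for the sketch, not the
actual student-machine code:

library ieee;
use ieee.std_logic_1164.all;

entity comb_clock_filter is
  port (
    clk50     : in  std_logic;   -- FPGA-wide 50 MHz clock (assumed)
    comb_term : in  std_logic;   -- glitchy combinatorial term, e.g. a counter zero-check
    clean_clk : out std_logic    -- registered version, safe to feed to other flip-flops
  );
end entity;

architecture rtl of comb_clock_filter is
  signal q : std_logic := '0';
begin
  process (clk50)
  begin
    if rising_edge(clk50) then
      q <= comb_term;  -- re-timed to the 50 MHz edge; downstream flip-flops
                       -- only ever see transitions aligned to clk50
    end if;
  end process;
  clean_clk <= q;
end architecture;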

JRJ

On 7/15/2015 9:02 PM, Jon Elson wrote:
 On 07/15/2015 01:24 PM, Noel Chiappa wrote:
   On 7/14/2015 7:36 PM, Jon Elson wrote:

   On the system 360 CPUs, they did not use flip-flops like we are
 used
   to, today. They used latches ... Since these were discrete
 transistor
   implementations, a real flip-flop was too expensive, but a
 latch could
   be implemented in about 6 transistors, I think.
   The 11/45 used TTL ICs, so real FFs were available in that
 technology,
   although they may have used latches as well.

 This confused me a bit, until I realized that you were using latch
 for what
 I think of as 'SR flip-flop', and flip-flop for 'D and JK flip-flops'.
 Guess that shows how long ago I did hardware... :-)

 To be a bit more detailed, on the 360's, were those latches 'simple'
 SR flops
 (i.e. un-gated), or were they gated?


 Well, one would have to dig into the ALDs to be sure.  But, the FEMMs
 have some large drawings that are essentially RTL in graphical form, and
 a lot of description of how it all worked.  My understanding is all
 those registers were essentially D latches. So, they got one data input
 from the ALU or a mux, and a latch pulse, and provided a Q output.  Each
 of these latches took up at least 4 SLT packages; I'm not sure exactly
 how many.  So, the whole latch was composed of something like 4
 NOR gates or the equivalent, plus one inverter.
 
 (Sorry about being so vague, I read a bunch of IBM FEMMs about a year
 ago when I had some spare time.)
 
 Jon
 


Re: Reproducing old machines with newer technology (Re: PDP-12 at the RICM)

2015-07-15 Thread Jay Jaeger
Saul is indeed cited in the ACM article,

http://dl.acm.org/citation.cfm?id=365671

I know that Purdue had some folks that did their own maintenance, and
sure, by the late 1960's one could certainly pick them up cheap - the gold
scrappers were not quite the issue they became later.  I know this
because, besides the 7094 II that I did some work on (including
replacing a germanium transistor with a modern silicon one at one
point), the U. Wisconsin Chemistry department had a 7090 (oil core) on
the 9th floor.  Some folks from Purdue came up at one point and helped
fix a problem with it.

Around 1975 the IBM 1410 and the IBM 7094 II we played with at UW were
sold to a company in Ohio - or at least pieces were.  Paul Pierce and I
went back to that same company in 1998 and recovered some of the IBM
1410 and IBM 709x tapes that he lists on his site - Paul has an amazing
setup where he reads the tapes *analog* using a 7 track drive, and then
post-processes the results to de-skew and recover the data.

JRJ

On 7/15/2015 7:12 PM, Chuck Guzis wrote:
 On 07/15/2015 04:05 PM, Jay Jaeger wrote:
 
 Paul adapted PUFFT (Purdue University Fast FORTRAN Translator) to do
 RS-232 bit serial I/O through a sense switch, and I wrote a spooling
 program that ran on a Datacraft 6024 located in the same room to do the
 card reading and printing.  I suppose somewhere inside of it the DC 6024
 was humiliated - I expect that it was much faster than the 7094 II.  ;)
 
 I remember PUFFT--that was Saul Rosen's baby, wasn't it?  A FORTRAN for
 undergrads--put in anything that *resembled* a FORTRAN statement and get
 some sort of result.  Missing parentheses?  Misspelling?  Outright
 syntax errors? No problem.  I think Purdue had two 709x systems for
 PUFFT.  The CDC 6500 was reserved for Serious Work.
 
 I understand that at the time, 7090/7094's were comparatively plentiful
 and (comparatively) inexpensive, hence their use.
 
 Liquid nitrogen would be the or worse part.  ;)
 
 Neil had a lot of interesting stories about the ETA-10 (originally named
 the GF-10 for the target of 10 gigaflops).  It all seemed so fantastic
 back then.
 
 Ah, it's all fun...
 
 --Chuck
 
 
 
 
 


Re: Reproducing old machines with newer technology (Re: PDP-12 at the RICM)

2015-07-15 Thread Jay Jaeger
Lots of machines supported variable length operands (like the machine
you reference in the link, IBM S/360, Burroughs, etc., etc.).  However,
machines with variable-length instructions not aligned on any kind of
word boundary are not as common.

This isn't about whether a machine was good or bad / worse or better /
or even its level of historical interest.  Just whether or not it was
interesting - in particular interesting to me.  If the CDC machines like
that interest you, by all means have at them.  ;)

I note that there isn't enough information in that manual to do what I
plan to do for the IBM 1410 - reproduce the actual machine logic.
Compare/contrast what you referred to with the documents at:

http://bitsavers.informatik.uni-stuttgart.de/pdf/ibm/1410/drawings/


On 7/15/2015 12:10 AM, Chuck Guzis wrote:
 On 07/14/2015 09:16 PM, Jay Jaeger wrote:
 
 Other than clones and the like (e.g., from folks like Honeywell), I'm
 not aware of any other machines with a similar architecture to the 1401
 and 1410.  Name them?
 
 Well, how about a bit-addressable, variable field length machine that
 had not only your basic set of floating point operations, but also
 variable-length binary, binary modulo-256 and packed BCD to a length of
 65535 bytes (131K BCD digits)?  Circa 1969-1971:
 
 http://bitsavers.informatik.uni-stuttgart.de/pdf/cdc/cyber/cyber_200/60256000_STAR-100hw_Dec75.pdf
 
 
 When you've got a few minutes to spare, try writing the VHDL for it.
 This was a Jim Thornton design, later taken over by Neil Lincoln.  Later
 versions of the machine had drastically reduced instruction sets from
 the original, culminating finally in the liquid-nitrogen cooled ETA-10.
 
 But really, variable-word length machines, while they made efficient use
 of storage, were pretty much limited to a character-serial
 memory-to-memory 2-address organization.  Quaint and perhaps
 interesting, but doomed from a performance standpoint.
 
 --Chuck
 
 


Re: Reproducing old machines with newer technology (Re: PDP-12 at the RICM)

2015-07-15 Thread Noel Chiappa
 From: Sean Caron

 Many examples of blinkenlights eye candy throughout computer history

It wasn't _just_ eye candy; it was a real help in problem debugging (when the
machine was stopped), and you could tell a lot about what the machine was
doing (when it was running) from the way the lights changed.

When the overall machine cost came down, they were too expensive to be worth
what they cost, though.

Speaking of lights for feedback, anyone remember the 'run bar' - or whatever
they called it, my memory fails me - on the display on the Lisp Machines?
Actually, it was a series, IIRC - one for the CPU, one for the disk, etc. The
machine didn't have LEDs, but it used short lines on the bit-map display
instead.

IIRC, the idea was copied from the Knight TV's on MIT-AI. (Which I believe
were the first-ever bit-mapped displays - anyone know of an earlier one?)

Noel


Re: Reproducing old machines with newer technology (Re: PDP-12 at the RICM)

2015-07-15 Thread Jay Jaeger
I remember when U Wisconsin ECE got their PDP-11/20 and I saw DOS
FORTRAN get stuck for the very first time.  I told the more senior
student who was responsible for getting things going, preparing
documentation, etc. that the machine was in a loop, and never coming
out.  He laughed at me, claiming there was no way I could know that.
Bzt.  Wrong.  Tee Hee.  That machine is now in my personal collection.

(PDP-11 DOS tended to scramble file system links and get stuck like that
- which inevitably required reloading the operating system disk - an
RC11 that still ran just fine when I last spun it up a year or so ago).

On 7/15/2015 12:35 PM, Paul Koning wrote:

 It wasn't _just_ eye candy; it was a real help in problem debugging (when the
 machine was stopped), and you could tell a lot about what the machine was
 doing (when it was running) from the way the lights changed.
 
 Absolutely.  When DEC introduced the Remote Diagnostic Console for the 11/70, 
 they started deploying those internally.  That makes sense.  But in the 
 RSTS/E development group, we put our foot down and told them “take it out and 
 give back our blinkenlights” because for OS development the ability to judge 
 what the machine is doing, or spot strange behavior, from the light patterns 
 is quite important.
 
...
   paul
 



Re: Reproducing old machines with newer technology (Re: PDP-12 at the RICM)

2015-07-15 Thread Paul Koning

 On Jul 15, 2015, at 2:14 PM, Chuck Guzis ccl...@sydex.com wrote:
 
 On 07/15/2015 10:48 AM, Jay Jaeger wrote:
 Lots of machines supported variable length operands (like the machine
 you reference in the link, IBM S/360, Burroughs, etc. etc.  However,
 machines with variable length instructions not split into any kind of
 word boundary are not as common.
 
 Sure, but that doesn't mean that they didn't exist.  As a matter of fact, the 
 machine I cited was *bit*-addressable.  That doesn't imply that any datum was 
 absolved of some sort of alignment.  But yes, you could have bit fields 
 overlapping word boundaries--let's see your 1410 do that...
 
 I really don't see much of a fundamental distinction between the 1401, 1410, 
 7080 or 1620 or any other variable word-length machine of the time.  One 
 really has to ask oneself 'why variable word length?' when it costs so much 
 in terms of performance.  I believe that it's mostly because memory was very 
 expensive and it was viewed as a way of coping with that issue.
 
 FWIW, Dijkstra disliked the 1620 immensely.  I don't recall his opinion of 
 the 1401.

Correct: http://www.cs.utexas.edu/users/EWD/transcriptions/EWD00xx/EWD37.html

I don’t know what he thought of the 1401.  He did reject the 7040 when it was 
proposed to the university as its main computer.  That analysis is in 
http://www.cs.utexas.edu/users/EWD/transcriptions/OtherDocs/NN041.html (in 
Dutch).  When comparing it with the machine they ended up with (the Electrologica 
EL-X8), I have to concur with his judgment; not that the EL-X8 is flawless, but 
in key spots it gets things right that IBM gets wrong.

paul




Re: Reproducing old machines with newer technology (Re: PDP-12 at the RICM)

2015-07-15 Thread Jay Jaeger
1440s and 1460s were architecturally 1401s (much as the 7010 is
architecturally a 1410 - software compatible).  I have not heard of a
1450 anywhere, but I seem to recall hearing about at least one 1460 and
have seen photos of them online.

On 7/15/2015 12:26 AM, William Donzelli wrote:
 In the 7000 series, the 1410 equivalent was the 7010 - architecturally
 compatible, ran the same software, but implemented in 7000 series
 technology.  It came along in 1962.  So that was really the last one to
 be introduced of its ilk.

 Other than clones and the like (e.g., from folks like Honeywell), I'm
 not aware of any other machines with a similar architecture to the 1401
 and 1410.  Name them?
 
 1440 came after 1410. Quite a few were built, and one is being
 restored by the Binghamton bunch.
 
 1450 and 1460 came even later...but I have never seen evidence of any
 of these actually being installed.
 
 --
 Will
 


Re: Reproducing old machines with newer technology (Re: PDP-12 at the RICM)

2015-07-15 Thread Noel Chiappa
 On 7/14/2015 7:36 PM, Jon Elson wrote:

 On the system 360 CPUs, they did not use flip-flops like we are used
 to, today. They used latches ... Since these were discrete transistor
 implementations, a real flip-flop was too expensive, but a latch could
 be implemented in about 6 transistors, I think.
 The 11/45 used TTL ICs, so real FFs were available in that technology,
 although they may have used latches as well.

This confused me a bit, until I realized that you were using 'latch' for what
I think of as 'SR flip-flop', and 'flip-flop' for 'D and JK flip-flops'.
Guess that shows how long ago I did hardware... :-)

To be a bit more detailed, on the 360's, were those latches 'simple' SR flops
(i.e. un-gated), or were they gated?

Noel


Re: Reproducing old machines with newer technology (Re: PDP-12 at the RICM)

2015-07-15 Thread Jay Jaeger
Sigh.  Again, the difference is between how OPERANDS were formatted vs.
INSTRUCTIONS.  As I said, I agree that lots of machines had variable
length operands (including a couple at the bit level, which the 1400
series did not do except for an individual character).  But darn few had
variable length INSTRUCTIONS, things like operand address chaining, and
the like.

And again and again - it isn't about what is better or anything like that.

And again and again and again - it isn't about what survived into
current architectures.

Certainly the variable length instruction idea was about conserving memory.

On 7/15/2015 1:14 PM, Chuck Guzis wrote:
 On 07/15/2015 10:48 AM, Jay Jaeger wrote:
 Lots of machines supported variable length operands (like the machine
 you reference in the link, IBM S/360, Burroughs, etc. etc.  However,
 machines with variable length instructions not split into any kind of
 word boundary are not as common.
 
 Sure, but that doesn't mean that they didn't exist.  As a matter of
 fact, the machine I cited was *bit*-addressable.  That doesn't imply
 that any datum was absolved of some sort of alignment.  But yes, you
 could have bit fields overlapping word boundaries--let's see your 1410
 do that...
 
 I really don't see much of a fundamental distinction between the 1401,
 1410, 7080 or 1620 or any other variable word-length machine of the
 time.  One really has to ask oneself 'why variable word length?' when
 it costs so much in terms of performance.  I believe that it's mostly
 because memory was very expensive and it was viewed as a way of coping
 with that issue.
 
 FWIW, Dijkstra disliked the 1620 immensely.  I don't recall his opinion
 of the 1401.
 
 --Chuck
 
 
 


Re: Reproducing old machines with newer technology (Re: PDP-12 at the RICM)

2015-07-15 Thread Jay Jaeger
That would certainly be closer than any of the other examples that have
been thrown in the discussion.  But it, of course, is much newer than
the 1400 series.  IIRC, the discussion started when someone suggested
that there were quite a few machines that were similar to the 1400
series in terms of variable length.  Again, while that was true from the
perspective of variable length data fields, it wasn't from the
perspective of variable length instructions.

On 7/15/2015 2:42 PM, Chuck Guzis wrote:
 On 07/15/2015 11:29 AM, Jay Jaeger wrote:
 Sigh.  Again, the difference is between how OPERANDS were formatted vs.
 INSTRUCTIONS.  As I said, I agree that lots of machines had variable
 length operands (including a couple at the bit level, which the 1400
 series did not do except for an individual character).  But darn few had
 variable length INSTRUCTIONS, things like operand address chaining, and
 the like.
 
 You mean, like the iA432?
 
 


Re: Reproducing old machines with newer technology (Re: PDP-12 at the RICM)

2015-07-15 Thread Guy Sotomayor



On 7/15/15 10:28 AM, Noel Chiappa wrote:

Speaking of lights for feedback, anyone remember the 'run bar' - or whatever
they called it, my memory fails me - on the display on the Lisp Machines?
Actually, it was a series, IIRC - one for the CPU, one for the disk, etc. The
machine didn't have LEDs, but it used short lines on the bit-map display
instead.

Short here = ~1

Yes, I see them all the time when I'm using my Symbolics machines. ;-)

TTFN - Guy




Re: Reproducing old machines with newer technology (Re: PDP-12 at the RICM)

2015-07-15 Thread Chuck Guzis

On 07/15/2015 01:49 PM, Jay Jaeger wrote:

That would certainly be closer than any of the other examples that have
been thrown in the discussion.  But it, of course, is much newer than
the 1400 series.  IIRC, the discussion started when someone suggested
that there were quite a few machines that were similar to the 1400
series in terms of variable length.  Again, while that was true from the
perspective of variable length data fields, it wasn't from the
perspective of variable length instructions.


Well, you do realize that I'm having fun with you? :)

I've programmed the 1401--and while not a 1410, it's not all that 
different.  I view it as an evolutionary dead-end, sort of like the 
80s-90s dataflow machines.  I suppose it did what it was intended to 
do--provide an easy transition from stuff like the 407 accounting 
machine / unit record genre.  It was easy to program (heck, I still can 
recall a few opcodes).  I suspect that the real 1401 lasted as long as 
it did mostly because it was a way to perform spooling of input and 
output cheaply, where printing on-line with say, a 7090 would have been 
a waste of resources.  Better to write a tape and let the 1401 do it.


My interest lies mostly in big iron, and I'm not likely to see much of 
that in an FPGA implementation.  You have to admit that any hardware 
description that starts out with (page 1-2):


Cooling for the basic computer consists of two 30-ton water-cooled 
condensing units.  These units cool only the CPU, MCS and I/O sections. 
 The MCU is air-cooled.  With the optional memory, the basic computer 
requires an additional 30-ton condensing unit.


Power for basic computer consists of one 250 kva 400 Hz motor-generator 
set.  The motor-generator set has the capability of providing power for 
the CPU, MCS, I/O and the MCU.  The optional memory requires the use of 
an additional 80 kva motor-generator set.


...and whose 1980s final version was immersed in liquid nitrogen, does 
draw one's attention--and I will confess it appeals to the big boy toy 
aspect buried in me.


Still, there were some very notable machines from the aspect of 
architecture that would be worth resurrecting.  I'm just not sure that 
I'd consider the 1400 series to be one of them.


For example, I'd be interested in seeing an FPGA version of a B5500--a 
remarkable machine architecturally if there ever was one.


Of course, suum cuique (your mileage may vary).

--Chuck







Re: Reproducing old machines with newer technology (Re: PDP-12 at the RICM)

2015-07-15 Thread Chuck Guzis

On 07/15/2015 01:30 PM, ben wrote:


Quick look on the web ... ARG! Max segment length 64K something.


Well, even in the late 70s, 64KB was still a goodly chunk of memory in 
the microprocessor world.  Which reminds me...


To bore you with another STAR tale--the machine had two page sizes--the 
small page, which was 512 64-bit words, and the large page, which 
was 64K words.  What some smart-alec discovered was that in a 512Kw 
system, it was possible (easily) to write an instruction that could 
never get started.  You could have up to 6 addresses in a vector 
instruction (3 operands+3 control vectors).  Storage was managed in 512 
bit super words.  Start your large-page job, put two of the operands 
near the end of a page and bingo--you get 8 page references just to get 
the instruction started.  Of course, the solution was to sell the 
customer more memory...


--Chuck



Re: Reproducing old machines with newer technology (Re: PDP-12 at the RICM)

2015-07-15 Thread ben

On 7/15/2015 3:54 PM, Chuck Guzis wrote:

On 07/15/2015 01:30 PM, ben wrote:


Quick look on the web ... ARG! Max segment length 64K something.


Well, even in the late 70s, 64KB was still a goodly chunk of memory in
the microprocessor world.  Which reminds me...

To bore you with another STAR tale--the machine had two page sizes--the
small page, which was 512K 64-bit words and the large page, which
was 64K words.  What some smart-alec discovered was that in a 512Kw
system, it was possible (easily) to write an instruction that could
never get started.  You could have up to 6 addresses in a vector
instruction (3 operands+3 control vectors).  Storage was managed in 512
bit super words.  Start your large-page job, put two of the operands
near the end of a page and bingo--you get 8 page references just to get
the instruction started.  Of course, the solution was to sell the
customer more memory...

--Chuck


Good idea.  512KW is Dim Complex Foo(512,512),
a small matrix I guess for many things.





RE: Reproducing old machines with newer technology (Re: PDP-12 at the RICM)

2015-07-14 Thread Dave G4UGM
 -Original Message-
 From: cctalk [mailto:cctalk-boun...@classiccmp.org] On Behalf Of Paul
 Koning
 Sent: 13 July 2015 17:03
 To: General Discussion: On-Topic Posts
 Subject: Re: Reproducing old machines with newer technology (Re: PDP-12 at
 the RICM)
 
 
  On Jul 13, 2015, at 8:35 AM, Jay Jaeger cu...@charter.net wrote:
 
  Another alternative would be to build a machine up from a Field
  Programmable Gate Array (e.g., the Digilent Nexys2 FPGA development
  board).  I recently completed an effort doing that for a 12 bit
  machine we designed and built in a logic/computer design class from
  racks of logic interconnected using IBM unit record plug boards in 1972.
 
  I am going to attempt to do the same for IBM's 1410 computer - a
  really big effort.
 
 That’s been done for all sorts of machines, of course; the PDP-11 comes to
 mind.
 
 One question would be what design approach you’re using.  A behavioral
 model is one option; that’s roughly SIMH in an FPGA.  And just like SIMH, the
 model is only as accurate as your knowledge of the obscure details of the
 original design.  Depending on the quality of available manuals, this accuracy
 may be rather low.  (For example, building a PDP-11 model if all you have is a
 Processor Handbook may not be accurate enough.)
 
 A different approach is to reproduce the actual logic design.  FPGAs can be
 fed gate level models, though that’s not the most common practice as I
 understand it.  But if you have access to that level of original design data, 
 the
 result can be quite accurate.
 
 I’ve done a partial gate level model of the CDC 6600, working from the wiring
 lists and module schematics.  It accurately reproduces (and explains) quite
 obscure properties of the peripheral processors, things that aren’t
 documented anywhere I know of other than in programmer lore.  It also
 yields a large model that simulates very slowly...
 
   paul

I think there are several options for the degree of authenticity with FPGA 
re-implementations. At the simplest of levels my Baby Baby runs at the same 
speed as the full-sized Baby, but it currently uses 32-bit parallel logic in 
many places, as I built a 32-bit wide store and it keeps much of the HDL code 
simple.  I do intend to try a full serial machine at some point, but it's low on 
my list. I have only really used the Xilinx ISE in anger, but I note it is 
possible to see the gates generated from the HDL, or to simulate HDL 
from a gate-level diagram. I believe you can also mix and match gates and HDL 
(I have not tried; too many other things to do.) 

My next project is likely to be the Ferranti Pegasus which is several orders of 
magnitude more complex than the Baby and will need a proper plan. 

Dave



Re: Reproducing old machines with newer technology (Re: PDP-12 at the RICM)

2015-07-14 Thread Jay Jaeger
My work has been using structural models, at the gate level, in VHDL
(Verilog would be fine, too, of course).  Individual components (for
example, a piece of an IBM SMS card, or in my existing case, gates made
available to student engineers that were actually individual
gates/chunks of DTL chips) get little behavioral models.  As I
mentioned, so far what I have done is reproduce and test a 12 bit
computer designed in an electrical engineering course on logic/computer
design.  In August I plan on publishing my experience on a website.
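
As a rough illustration of that style (a sketch only - the gate and component
names below are invented, not taken from the 1410 ALDs or from the class
machine): a small behavioral model for one 2-input NAND element, and a
structural netlist that wires two instances into a cross-coupled latch, just
as the original logic diagram would show it.

library ieee;
use ieee.std_logic_1164.all;

entity nand2 is
  port (a, b : in std_logic; y : out std_logic);
end entity;

architecture behav of nand2 is
begin
  y <= not (a and b);          -- the "little behavioral model" for the element
end architecture;

library ieee;
use ieee.std_logic_1164.all;

entity rs_latch_struct is
  port (set_n, reset_n : in std_logic; q, q_n : out std_logic);
end entity;

architecture struct of rs_latch_struct is
  signal qi, qni : std_logic;
begin
  -- Cross-coupled NANDs, wired the way the logic diagram draws them
  g1 : entity work.nand2 port map (a => set_n,   b => qni, y => qi);
  g2 : entity work.nand2 port map (a => reset_n, b => qi,  y => qni);
  q   <= qi;
  q_n <= qni;
end architecture;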

I would note that I also see value in the behavioral approach, which
really would be considerably more detailed than what you get from SimH.
 The IBM 1410 cycle-level simulator I have written is closer to what one
might get from a behavioral model, but even that is not quite so detailed.

Using the structural / gate level techniques, one does run into some
issues, most of which have (or will probably have) solutions:

1)  R/S latches composed of gates in a combinatorial loop.  The problems
this causes are several, including the latch getting folded into the
look up tables for gates which use the signal, and issues when one
brings such a signal out to an I/O pin to feed to a logic analyzer,
which can cause problems to appear and disappear.  My experience is that
one can add a D flip flop after the RS latch.  This typically works
because at 50 MHz, it adds only 20 ns delay, which is comparable to gate
delays these old machines typically had.

2)  One-shots.  I haven't had to address this one yet, but I am sure
that I will.  I expect that one can simply use a counter to handle it -
no big deal at all (see the sketch after this list).

3)  Flip flops which are clocked from combinatorial signals.  These tend
to cause timing/glitch issues.  For example, in one case the
combinatorial output was a zero-check on a counter.  Since the counter
flip flops did not all change at exactly the same time, that signal
could glitch during the simulated machine's master clock edge.  They
respond well to the same general solution as #1 - stick a D flip flop
between the combinatorial output and the clock input.  In the case I
mentioned, that gave the signal an entire 50 MHz clock period to settle
down.
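
For the one-shot case (item 2), the counter replacement might look something
like this minimal VHDL sketch - the names, the width and the pulse length are
arbitrary, a 50 MHz clock is assumed, and it is not code from the 1410 work:

library ieee;
use ieee.std_logic_1164.all;
use ieee.numeric_std.all;

entity one_shot is
  generic (PULSE_CLKS : natural := 100);   -- e.g. 100 x 20 ns = 2 us
  port (
    clk50   : in  std_logic;
    trigger : in  std_logic;               -- assumed already synchronous to clk50
    pulse   : out std_logic
  );
end entity;

architecture rtl of one_shot is
  signal count : unsigned(7 downto 0) := (others => '0');
begin
  process (clk50)
  begin
    if rising_edge(clk50) then
      if trigger = '1' then
        count <= to_unsigned(PULSE_CLKS, count'length);  -- (re)start the pulse
      elsif count /= 0 then
        count <= count - 1;                              -- run the pulse out
      end if;
    end if;
  end process;
  pulse <= '1' when count /= 0 else '0';
end architecture;

An asynchronous trigger would first need the same sort of synchronizing D
flip-flop described in items 1 and 3.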

And of course, getting the detailed information one needs to develop
such a model can be a challenge.  Fortunately for the older IBM
machines, IBM produced ALDs - Automated Logic Diagrams - which I hope
will generally have enough information.

My experience on FPGA forums during the development of my 12 bit
computer implementation was mixed.  I got some helpful comments, but the
majority of folks were not helpful, and instead preferred to bash me for
not redoing the entire machine design using FPGA's the way these
particular folks felt was the only right way to use them.  Bah.

JRJ


On 7/14/2015 2:58 AM, Dave G4UGM wrote:
 -Original Message-
 From: cctalk [mailto:cctalk-boun...@classiccmp.org] On Behalf Of Paul
 Koning
 Sent: 13 July 2015 17:03
 To: General Discussion: On-Topic Posts
 Subject: Re: Reproducing old machines with newer technology (Re: PDP-12 at
 the RICM)


 On Jul 13, 2015, at 8:35 AM, Jay Jaeger cu...@charter.net wrote:

 Another alternative would be to build a machine up from a Field
 Programmable Gate Array (e.g., the Digilent Nexys2 FPGA development
 board).  I recently completed an effort doing that for a 12 bit
 machine we designed and built in a logic/computer design class from
 racks of logic interconnected using IBM unit record plug boards in 1972.

 I am going to attempt to do the same for IBM's 1410 computer - a
 really big effort.

 That’s been done for all sorts of machines, of course; the PDP-11 comes to
 mind.

 One question would be what design approach you’re using.  A behavioral
 model is one option; that’s roughly SIMH in an FPGA.  And just like SIMH, the
 model is only as accurate as your knowledge of the obscure details of the
 original design.  Depending on the quality of available manuals, this 
 accuracy
 may be rather low.  (For example, building a PDP-11 model if all you have is 
 a
 Processor Handbook may not be accurate enough.)

 A different approach is to reproduce the actual logic design.  FPGAs can be
 fed gate level models, though that’s not the most common practice as I
 understand it.  But if you have access to that level of original design 
 data, the
 result can be quite accurate.

 I’ve done a partial gate level model of the CDC 6600, working from the wiring
 lists and module schematics.  It accurately reproduces (and explains) quite
 obscure properties of the peripheral processors, things that aren’t
 documented anywhere I know of other than in programmer lore.  It also
 yields a large model that simulates very slowly...

  paul
 
 I think there are a several options for the degree of authenticity with FPGA 
 re-implementations. At the simplest of levels my Baby Baby runs at the same 
 speed as the full sized Baby, but it currently uses a 32 bit parallel logic

Re: Reproducing old machines with newer technology (Re: PDP-12 at the RICM)

2015-07-14 Thread ben

On 7/13/2015 10:02 AM, Paul Koning wrote:


A different approach is to reproduce the actual logic design.  FPGAs
can be fed gate level models, though that’s not the most common
practice as I understand it.  But if you have access to that level
of original design data, the result can be quite accurate.



The big assumption here is that the software will NOT change the logic model,
and the details are vendor-specific. Altera software is BAD for doing this.
Ben.





Re: Reproducing old machines with newer technology (Re: PDP-12 at the RICM)

2015-07-14 Thread ben

On 7/14/2015 9:46 AM, Jay Jaeger wrote:

My work has been using structural models, at the gate level, in VHDL
(Verilog would be fine, too, of course).  Individual components (for
example, a piece of an IBM SMS card, or in my existing case, gates made
available to student engineers that were actually individual
gates/chunks of DTL chips) get little behavioral models.  As I
mentioned, so far what I have done is reproduce and test a 12 bit
computer designed in an electrical engineering course on logic/computer
design.  In August I plan on publishing my experience on a website.

I would note that I also see value in the behavioral approach, which
really would be considerably more detailed than what you get form SimH.
  The IBM 1410 cycle-level simulator I have written is closer to what one
might get from a behavioral model, but even that is not quite so detailed.

Using the structural / gate level techniques, one does run into some
issues, most of which have (or will probably have) solutions:

1)  R/S latches composed of gates in a combinatorial loop.  The problems
this causes are several, including the latch getting folded into the
look up tables for gates which use the signal, and issues when one
brings such a signal out to an I/O pin to feed to a logic analyzer,
which can cause problems to appear and disappear.  My experience is that
one can add a D flip flop after the RS latch.  This typically works
because at 50 Mhz, it adds only 20 ns delay, which is comparable to gate
delays these old machines typically had.

2)  One-shots.  I haven't had to address this one yet, but I am sure
that I will.  I expect that one can simply use a counter to handle it -
no big deal at all.

3)  Flip flops which are clocked from combinatorial signals.  These tend
to cause timing/glitch issues.  For example, in one case the
combinatorial output was a zero-check on a counter.  Since the counter
flip flops did not all change at exactly the same time, that signal
could glitch during the simulated machines master clock edge.  They
respond well to the same general solution as #1 - stick a D flip flop
between the combinatorial output and the clock input.  In the case I
mentioned, that gave the signal an entire 50 Mhz clock period to settle
down.

And of course, getting the detailed information one needs to develop
such a model can be a challenge.  Fortunately for the older IBM
machines, IBM produced ALDs - Automated Logic Diagrams - which I hope
will generally have enough information.

My experience on FPGA forums during the development of my 12 bit
computer implementation was mixed.  I got some helpful comments, but the
majority of folks were not helpful, and instead preferred to bash me for
not redoing the entire machine design using FPGA's the way these
particular folks felt was the only right way to use them.  Bah.

JRJ


I have felt the right way is NOT to use VHDL or Verilog, sadly. I use 
Altera, and AHDL is the best for me as it is the cleanest language so far. 
FPGAs have never been standard logic, so why force standards, if you 
cannot even agree on gates, latches and flip-flops in FPGAs.


Here is the link you have been waiting for, IBM 1130 in FPGA and in the 
FLESH.

http://ibm1130.blogspot.ca/

Ben.
PS: Don't use blog format for the web site; they are a pain to read
or search if what you want is more than a few years old.





Re: Reproducing old machines with newer technology (Re: PDP-12 at the RICM)

2015-07-14 Thread Paul Koning

 On Jul 14, 2015, at 11:46 AM, Jay Jaeger cu...@charter.net wrote:
 
 ...
 Using the structural / gate level techniques, one does run into some
 issues, most of which have (or will probably have) solutions:
 
 1)  R/S latches composed of gates in a combinatorial loop.  The problems
 this causes are several, including the latch getting folded into the
 look up tables for gates which use the signal, and issues when one
 brings such a signal out to an I/O pin to feed to a logic analyzer,
 which can cause problems to appear and disappear.  My experience is that
 one can add a D flip flop after the RS latch.  This typically works
 because at 50 Mhz, it adds only 20 ns delay, which is comparable to gate
 delays these old machines typically had.

I didn’t like what happened with flops built out of gates when doing my 6600 
model.  So I replaced those by behavioral models.  The main reason was that the 
crossed-gate model would produce a mess with R and S both asserted, which that 
design would do at times, while the behavioral model was written to do 
something specific for that case.
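
As a sketch of what such a behavioral replacement can look like - what to do
when R and S are both asserted is the design-specific part; here reset
arbitrarily wins, purely as an illustrative assumption and not the 6600's
actual behavior:

library ieee;
use ieee.std_logic_1164.all;

entity sr_flop_behav is
  port (s, r : in std_logic; q, q_n : out std_logic);
end entity;

architecture behav of sr_flop_behav is
  signal state : std_logic := '0';
begin
  process (s, r)
  begin
    if r = '1' then        -- both asserted: reset takes priority (an assumption)
      state <= '0';
    elsif s = '1' then
      state <= '1';
    end if;                -- s = r = '0': hold
  end process;
  q   <= state;
  q_n <= not state;
end architecture;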
 
 2)  One-shots.  I haven't had to address this one yet, but I am sure
 that I will.  I expect that one can simply use a counter to handle it -
 no big deal at all.

If you’re creating a model to run in simulation, you can just write a delay.  
But that’s not synthesizable, so if you really do need a delay then a counter, 
or a shift register, or something like that will be needed.  This is the sort 
of thing that makes a 6600 design tricky (and may also apply to some other fast 
machines): there are places where propagation delays are used for correctness, 
and if the replacement hardware is “too fast” it doesn’t work.
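
A minimal VHDL sketch of the two options, with arbitrary names and numbers:
an "after" delay that only exists in simulation, and a shift register as the
synthesizable stand-in.

library ieee;
use ieee.std_logic_1164.all;

entity delay_demo is
  port (clk, d : in std_logic; q_sim, q_synth : out std_logic);
end entity;

architecture mixed of delay_demo is
  signal sr : std_logic_vector(3 downto 0) := (others => '0');
begin
  -- Simulation only: a pure time delay; synthesis tools ignore the "after"
  q_sim <= d after 80 ns;

  -- Synthesizable: delay by four clock periods through a shift register
  process (clk)
  begin
    if rising_edge(clk) then
      sr <= sr(2 downto 0) & d;
    end if;
  end process;
  q_synth <= sr(3);
end architecture;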

 
 3)  Flip flops which are clocked from combinatorial signals.  These tend
 to cause timing/glitch issues.  For example, in one case the
 combinatorial output was a zero-check on a counter.  Since the counter
 flip flops did not all change at exactly the same time, that signal
 could glitch during the simulated machines master clock edge.  They
 respond well to the same general solution as #1 - stick a D flip flop
 between the combinatorial output and the clock input.  In the case I
 mentioned, that gave the signal an entire 50 Mhz clock period to settle
 down.

That sounds like a bug in the original.  If you have a set of flops clocked by 
some signal, and it matters that the outputs don’t all change at the same time, 
then the original wasn’t reliable either.

paul



RE: Reproducing old machines with newer technology (Re: PDP-12 at the RICM)

2015-07-14 Thread Dave G4UGM


 -Original Message-
 From: cctalk [mailto:cctalk-boun...@classiccmp.org] On Behalf Of ANDY HOLT
 Sent: 14 July 2015 10:20
 To: General Discussion: On-Topic and Off-Topic Posts
 Subject: Re: Reproducing old machines with newer technology (Re: PDP-12 at
 the RICM)
 
 
 - Original Message -
 From: Dave G4UGM dave.g4...@gmail.com
 To: General Discussion: On-Topic and Off-Topic Posts
 cctalk@classiccmp.org
 Sent: Tuesday, 14 July, 2015 8:58:09 AM
 Subject: RE: Reproducing old machines with newer technology (Re: PDP-12 at
 the RICM) ...
 My next project is likely to be the Ferranti Pegasus which is several orders 
 of
 magnitude more complex than the Baby and will need a proper plan.
 
 
 There may be troubles ahead …
 I had plans for doing something similar for the ICT1905 (FP6000) and
 discovered two catches in translating the logic diagrams:
 
 FPGAs are designed around the modern concept of a single, widely distributed
 clock, with flip-flop control done by gating the input signals, whereas early
 Ferranti machines (1900, at least pre A-series, Atlas*, and presumably
 Pegasus) used strobes, which are hard and inefficient to do in an FPGA.

Actually the Pegasus should be relatively easy to implement in an FPGA. It is 
all locked to a 333 kHz clock track derived from the drum. As all storage elements 
are delay lines which run at the same speed as the drum, you can transfer data 
between the two without using buffers; almost everything is tightly coupled to 
the 333 kHz clock. It was also one of the first machines to use standard 
replicable modules. According to Simon Lavington's book (which I don't trust 
100%) there are 20 types of package in a basic Pegasus I, and you need 444 to 
build the machine. Out of these, 314 are used to build the CPU, but there are 
only 5 types of standard module.  So in practice it's built a bit like a large 
PDP-8/S, but with valves.
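
One common way to handle that in an FPGA is to keep the whole model in a
single clock domain and turn the drum-locked 333 kHz timing into a clock
enable.  A sketch, assuming a 50 MHz board clock; the names and the divide
ratio are mine, not from the Baby or Pegasus work:

library ieee;
use ieee.std_logic_1164.all;
use ieee.numeric_std.all;

entity drum_clock_enable is
  generic (DIVIDE : natural := 150);   -- 50 MHz / 150 = 333.3 kHz
  port (
    clk50 : in  std_logic;
    tick  : out std_logic              -- one clk50 period wide, at the 333 kHz rate
  );
end entity;

architecture rtl of drum_clock_enable is
  signal count : unsigned(7 downto 0) := (others => '0');
begin
  process (clk50)
  begin
    if rising_edge(clk50) then
      if count = DIVIDE - 1 then
        count <= (others => '0');
        tick  <= '1';
      else
        count <= count + 1;
        tick  <= '0';
      end if;
    end if;
  end process;
end architecture;

Model registers then update only when tick = '1', so the drum clock never has
to become a separate clock in the FPGA.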

Charles Owen who Lavington credits with coming up with the Module Concept went 
to work for IBM in 1956 and was later made an IBM fellow.

https://books.google.co.uk/books?id=Dhk9wHXfQMkCpg=PA165lpg=PA165dq=charles+owen+IBM+pegasussource=blots=JSvHMLa1V8sig=SA2HBed4zErpjcKRSpVZ854HS8Yhl=ensa=Xredir_esc=y#v=onepageq=charles%20owen%20IBM%20pegasusf=false

 
 Maybe less likely to be the case in the Pegasus is the widespread use of wired-
 OR, which can be hard to recognise in the logic diagrams (and, again, requires
 translating into real gates in an FPGA).  Obviously a register-transfer model
 wouldn't have those problems compared to a gate-level model, and would be
 considerably simpler to implement, but then risks losing some of the (possibly
 undocumented) edge cases.
 

It has germanium diodes, so few wired-ORs as far as I know.

 * Atlas would, presumably, be even trickier due to the use of asynchronous
 logic.

Atlas would be great fun. I suspect you could do it by using multiple 
independent clocks and complex handshaking...

 
 Good luck, should be an interesting exercise.
 
 Andy

Thanks,
Dave.



Re: Reproducing old machines with newer technology (Re: PDP-12 at the RICM)

2015-07-14 Thread Chuck Guzis

I'm missing something in this discussion, I think.

HDL's (take your pick) are just programming languages like FORTRAN or C 
with different constraints.  What's the point of going to all the 
trouble of doing an FPGA implementation of a slow old architecture, when 
pretty much the same result could be obtained by running a software 
emulator?  Neither accurately reflects the details of the real 
thing--and there will always be the aspect of missing peripherals.


Perhaps the worst aspect of using FPGAs is that this is a rapidly moving 
field, so that the part you used to do your implementation 10 years ago 
will no longer be available.  I've done a few designs using 5V CPLDs 
(XC95xx series) not *that* long ago.  Now they themselves are quaint 
examples of obsolete hardware.  You can't win.


You can move software-only simulators quite easily, but I'm not as 
sanguine about FPGA designs.


And you still don't have the peripherals.  I suppose one could emulate a 
Univac Solid State machine in an FPGA, but what would one do about the 
all-important drum coupled to the card reader and printer?  Has anyone 
rolled out a design for a DIY 1403 printer?


I've run the Cyber emulator as well as various SIMH emulators from time 
to time, but it's just not the same as the real thing--it's not even 
remotely the same.


--Chuck




Re: Reproducing old machines with newer technology (Re: PDP-12 at the RICM)

2015-07-14 Thread Chuck Guzis

On 07/14/2015 11:14 AM, Alan Hightower wrote:



Determinism. Unless you run your software simulator bare-metal - which
most aren't - cycle accuracy is always a race. Before you say modern
processors are 100,000 times faster than emulated ones - so just spin
wait until the next virtual time tick, that is always a moving ratio or
opportunity for a context switch right at the threshold. I might want to
emulate a vintage i7 on an i19 one day.


You might start with a CDC STAR-100 and then graduate to the i7.

Does a STAR-100 emulator (software, I'm not even going to broach the 
subject of hardware) even exist?


--Chuck




Re: Reproducing old machines with newer technology (Re: PDP-12 at the RICM)

2015-07-14 Thread Paul Koning

 On Jul 14, 2015, at 3:27 PM, tony duell a...@p850ug1.demon.co.uk wrote:
 
 
 That sounds like a bug in the original.  If you have a set of flops clocked 
 by some signal, and it matters that the 
 outputs don’t all change at the same time, then the original wasn’t reliable 
 either.
 
 It is very poor design, and not something that I would do, but it certainly 
 was done in production machines. 
 With care you can determine the width of the glitch, and if it's small 
 enough, ignore it. 
 
 But there is a related problem with FPGAs. You learn in introductory digital 
 electronic courses that there are
 2 types of counter. The Asynchronous, or ripple, counter where each flip-flop 
 toggles the next when it goes
 from 1 to 0. Obviously those do not all change at once, so if you combine 
 them with gates there can be 
 quite large glitches. Then there is the synchronous counter where all 
 flip-flops are clocked together. Now to a
 good approximation (all the flip-flops have the same delay from clock to 
 output), they do all change together.
 So if you now combine the outputs (say you AND some of the Q and Q/ outputs 
 to decode a particular state)
 the glitches will be small. That's what is taught. That is what works with 
 TTL, ECL, etc.
 
 Now try it in an FPGA (at least the Xilinx ones I've used). You will find 
 glitches all over the place. The reason
 is that the 'wires' linking the outputs of the flip-flops to the gates are 
 not wires at all. They are paths on the
 chip through logic multiplexers, buffers, etc that are set when the chip is 
 configured. And they introduce
 delays. Delays that are enough to cause glitches that are wide enough to 
 trigger other flip-flops.
 
 My experience of FPGAs is that if you design a circuit for an FPGA it will 
 work. If you take an existing design
 feed it into a schematic capture program and compile it for an FPGA then it 
 won’t.

I would modify that: if you take an existing design created by someone who 
doesn’t think about delay differences, then the FPGA version won’t work.  
Consider the 6600: at the speeds involved, you can’t design in that sloppy 
fashion.  So there are multi-phase clocks everywhere, with consecutive latch 
points clocked by the consecutive phases.  That means that so long as the max 
latency is less than the clock phase difference, it always works.
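
As a rough sketch of that latch-and-phase idea - illustrative only, the phase
names are not taken from the 6600 design data, and in a real stage the
combinational logic would sit between the two latches:

library ieee;
use ieee.std_logic_1164.all;

entity two_phase_stage is
  port (
    phi1, phi2 : in  std_logic;   -- non-overlapping clock phases
    d          : in  std_logic;
    q          : out std_logic
  );
end entity;

architecture rtl of two_phase_stage is
  signal stage1 : std_logic := '0';
begin
  -- Latch 1 is transparent while phi1 is high...
  process (phi1, d)
  begin
    if phi1 = '1' then
      stage1 <= d;
    end if;
  end process;

  -- ...and latch 2 captures its (by then stable) output while phi2 is high.
  process (phi2, stage1)
  begin
    if phi2 = '1' then
      q <= stage1;
    end if;
  end process;
end architecture;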

paul




RE: Reproducing old machines with newer technology (Re: PDP-12 at the RICM)

2015-07-14 Thread tony duell

 That sounds like a bug in the original.  If you have a set of flops clocked 
 by some signal, and it matters that the 
 outputs don’t all change at the same time, then the original wasn’t reliable 
 either.

It is very poor design, and not something that I would do, but it certainly was 
done in production machines. 
With care you can determine the width of the glitch, and if it's small enough, 
ignore it. 

But there is a related problem with FPGAs. You learn in introductory digital 
electronic courses that there are
2 types of counter. The Asynchronous, or ripple, counter where each flip-flop 
toggles the next when it goes
from 1 to 0. Obviously those do not all change at once, so if you combine them 
with gates there can be 
quite large glitches. Then there is the synchronous counter where all 
flip-flops are clocked together. Now to a
good approximation (all the flip-flops have the same delay from clock to 
output), they do all change together.
So if you now combine the outputs (say you AND some of the Q and Q/ outputs to 
decode a particular state)
the glitches will be small. That's what is taught. That is what works with TTL, 
ECL, etc.

Now try it in an FPGA (at least the Xilinx ones I've used). You will find 
glitches all over the place. The reason
is that the 'wires' linking the outputs of the flip-flops to the gates are not 
wires at all. They are paths on the
chip through logic multiplexers, buffers, etc that are set when the chip is 
configured. And they introduce
delays. Delays that are enough to cause glitches that are wide enough to 
trigger other flip-flops.
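
The usual FPGA-friendly workaround is to register the decoded term, so the
glitches never leave the combinatorial paths.  A sketch with arbitrary names
and width, not from any particular machine:

library ieee;
use ieee.std_logic_1164.all;
use ieee.numeric_std.all;

entity counter_decode is
  port (
    clk     : in  std_logic;
    at_nine : out std_logic    -- registered (glitch-free) decode of state 9
  );
end entity;

architecture rtl of counter_decode is
  signal count : unsigned(3 downto 0) := (others => '0');
begin
  process (clk)
  begin
    if rising_edge(clk) then
      count <= count + 1;
      -- Decode the state the counter is about to enter and register it,
      -- so the output never shows the routing-delay glitches described above.
      if count = 8 then
        at_nine <= '1';
      else
        at_nine <= '0';
      end if;
    end if;
  end process;
end architecture;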

My experience of FPGAs is that if you design a circuit for an FPGA it will 
work. If you take an existing design, 
feed it into a schematic capture program, and compile it for an FPGA, then it 
won't.

-tony






Re: Reproducing old machines with newer technology (Re: PDP-12 at the RICM)

2015-07-14 Thread Jay Jaeger
On 7/14/2015 11:27 AM, Paul Koning wrote:
 
 On Jul 14, 2015, at 11:46 AM, Jay Jaeger cu...@charter.net wrote:

 ...
 Using the structural / gate level techniques, one does run into some
 issues, most of which have (or will probably have) solutions:

 1)  R/S latches composed of gates in a combinatorial loop.  The problems
 this causes are several, including the latch getting folded into the
 look up tables for gates which use the signal, and issues when one
 brings such a signal out to an I/O pin to feed to a logic analyzer,
 which can cause problems to appear and disappear.  My experience is that
 one can add a D flip flop after the RS latch.  This typically works
 because at 50 Mhz, it adds only 20 ns delay, which is comparable to gate
 delays these old machines typically had.
 
 I didn’t like what happened with flops built out of gates when doing my 6600 
 model.  So I replaced those by behavioral models.  The main reason was that 
 the crossed-gate model would produce a mess with R and S both asserted, which 
 that design would do at times, while the behavioral model was written to do 
 something specific for that case.

The approach I have used is a compromise between the two - it isolates
the problems of building flip-flops out of gates, while still preserving
the original design.  That said, when I come across a flip-flop on an
SMS card, I will probably give it its own behavioral model.



 2)  One-shots.  I haven't had to address this one yet, but I am sure
 that I will.  I expect that one can simply use a counter to handle it -
 no big deal at all.
 
 If you’re creating a model to run in simulation, you can just write a delay.  
 But that’s not synthesizable, so if you really do need a delay then a 
 counter, or a shift register, or something like that will be needed.  This is 
 the sort of thing that makes a 6600 design tricky (and may also apply to some 
 other fast machines): there are places where propagation delays are used for 
 correctness, and if the replacement hardware is “too fast” it doesn’t work.
 

I am creating one to be synthesizable.


 3)  Flip flops which are clocked from combinatorial signals.  These tend
 to cause timing/glitch issues.  For example, in one case the
 combinatorial output was a zero-check on a counter.  Since the counter
 flip flops did not all change at exactly the same time, that signal
 could glitch during the simulated machines master clock edge.  They
 respond well to the same general solution as #1 - stick a D flip flop
 between the combinatorial output and the clock input.  In the case I
 mentioned, that gave the signal an entire 50 Mhz clock period to settle
 down.
 
 That sounds like a bug in the original.  If you have a set of flops clocked 
 by some signal, and it matters that the outputs don’t all change at the same 
 time, then the original wasn’t reliable either.

That is just it - the combinatorial inputs were used FOR the clock on
some gates.  Right - not a good idea even back in 1972, though it
depends a little on what the rejection time / inertial delay of the
inputs is, but yes - certainly a design that would be prone to failure
(remember that this was a bunch of students trying to put together a
working 12 bit computer in about a month - ours included a cross
assembler and cross-interpreter, so we had real software running our
machine for its demo - including hangman played with the TTY keyboard
and an oscilloscope hooked to a pair of D/A converters for a display).

 
   paul
 
 


Re: Reproducing old machines with newer technology (Re: PDP-12 at the RICM)

2015-07-14 Thread William Donzelli
 IIRC, the KB11 processors used in the DEC 11/45 and 11/70 (and other
 related systems) used five clocks delayed from each other (more
 commonly known as clock phases).

IBM used this method as well on many of their machines.

--
Will


Re: Reproducing old machines with newer technology (Re: PDP-12 at the RICM)

2015-07-14 Thread Chuck Guzis

On 07/14/2015 04:49 PM, Jay Jaeger wrote:


Not necessarily.  For example, it is impossible to find an IBM 1410, as
far as I know.  But there ARE 1415 consoles - I knew of some a while back - and
there are certainly 729s and 1403 printers and 1402 card read/punch
units up and running.


There are plenty of machines that are impossible to find.  And many that 
are gone that are quite novel.  That IBM sold so many is something in 
their favor, but how about a working Saxpy box--which is quite a bit 
more recent than your 1410?  Or the STAR-65, 1B or even -100.  The only 
65 was moved from Canada and scrapped.  My department had the only two 
1Bs and I saw those go under the sledgehammer and bolt cutters. I don't 
think that there are STAR-100s of any stripe (plain, -A, -B or -C) 
left--they were just too big.  Are there any BSPs or ASC's kicking around?


There are tons of lost non-IBM peripherals.

But we do have documentation on many of these things, so at least we 
know how they worked.  And I submit that in the long run, that's what 
matters.  There's very little relevant to the state of the art today 
that really matters. (Boy, am I going to get flamed on that)



Software 'just make it work' emulator.  (Most of the SimH stuff seems to be
at this level.)


Or dedicated simulators (non-SIMH).  Often, all you have is the system 
documentation that talks about the instruction set and a few binary 
files.  Reverse-engineering can be fun and valuable.



That is why I use VHDL (Verilog is fine, too) - so that those models
are portable into the future.  The FPGA part doesn't matter so much,
but the future portability of the model does matter.


Maybe, but I'd rather read the design documents than a pile of HDL of 
any stripe.



1403's and IBM 729's and 1402 card read/punch still exist.  I seem to
recall the CHM doing something like building a 729 tape drive tester, too.


But there were LOTS of those.  Try something non-IBM and very obscure.


But something like the SBC6120 PDP-8 is closer, potentially with real
lights and switches.  As another example, I can envision an FPGA
sitting inside a real IBM 1415 console, running its lights, responding
to its switches and interacting with its Selectric typewriter.
Probably more than I will accomplish, but it is good to have goals.


A PDP-8 is a simple CPU, probably popular because of the lights and 
switches. I see evidence that these were eye candy--the DECstations are 
practically the same thing, but apparently not nearly as desirable.

Seymour Cray should have used kinetic sculptures on his machines as part 
of eye candy, I guess. Or maybe more chrome...


--Chuck



Re: Reproducing old machines with newer technology (Re: PDP-12 at the RICM)

2015-07-14 Thread Jay Jaeger
On 7/14/2015 11:16 AM, ben wrote:
 
 Here is the link you have been waiting for, IBM 1130 in FPGA and in the
 FLESH.
 http://ibm1130.blogspot.ca/
 
 Ben.

Thanks for that link. It looks very interesting after a quick glance.  I
am sure that I will run into many of the same issues with the SMS based
IBM 1410 as the SLT based IBM 1130 - particularly the way latches were used.

Thanks.

JRJ


Re: Reproducing old machines with newer technology (Re: PDP-12 at the RICM)

2015-07-14 Thread Paul Koning

 On Jul 14, 2015, at 4:41 PM, Chuck Guzis ccl...@sydex.com wrote:
 
 On 07/14/2015 10:29 AM, Paul Koning wrote:
 
 The accuracy of the FPGA depends on the approach.  If it’s a
 structural (gate level) model, it is as accurate as the schematics
 you’re working from.  And as I mentioned, that accuracy is quite
 good; it lets you see obscure details that are not documented and
 certainly not visible in a software simulator.  The example I like to
 point to is the 6000 property that you can figure out a PPU 0 hard
 loop by doing a deadstart dump and looking for an unexpected zero in
 the instruction area: deadstart writes a zero where the P register
 points at that time.  But you won’t find that documented or explained
 anywhere.  The FPGA model derived from the schematics reproduces this
 behavior, and when you look at how it happens, the explanation
 becomes blindingly obvious.  *This* is why I feel there’s a point in
 doing this sort of work.
 
 I can agree with some points, but not others.  In the 6600, for example, 
 clock distribution was a big design issue--not so in an FPGA.  You had racks 
  of taper-pin mats of wiring between the cordwood modules extending over (by 
 today's standards) long distances.  Cooling was a huge issue. In those 
 respects, an FPGA hardly resembles a real 6600.

Certainly, the physical aspects are completely different.  And clock 
distribution, certainly.  Not so much between chassis, interestingly enough, 
but throughout the logic within a chassis.  And wire delays in chassis to 
chassis cabling are very significant.  Wire delays within the chassis matter in a few 
cases, but 99% of the wires are short enough that their delay is not a real 
consideration in comparison with the logic circuit stage delay.

The example I gave, and others like it, are properties of the detailed logic 
design; they are not dependent on the details of the timing closure, or the 
physical design of the machine.

paul




Re: Reproducing old machines with newer technology (Re: PDP-12 at the RICM)

2015-07-14 Thread Jay Jaeger
The 12-bit computer that I translated originally had *independent* 1
micro-second clocks in each of four racks.  The processor derived a 3
micro-second clock from that, but also a second clock that was out of
phase with the CPU master clock, used to sync signals coming in from
the other racks (which had 10 foot cables in between).

On 7/14/2015 7:04 PM, Eric Smith wrote:
 On Tue, Jul 14, 2015 at 3:28 PM, tony duell a...@p850ug1.demon.co.uk wrote:
 If you mean 6 different clock sources (i.e. clocks delayed from each other, 
 etc) then that
 is not typical of a 1970s minicomputer in my experience.
 
 IIRC, the KB11 processors used in the DEC 11/45 and 11/70 (and other
 related systems) used five clocks delayed from each other (more
 commonly known as clock phases). In my experience that was more common
 in 1970s computers than a single-phase clock.
 


Re: Reproducing old machines with newer technology (Re: PDP-12 at the RICM)

2015-07-14 Thread ben

On 7/14/2015 7:36 PM, Jon Elson wrote:

On 07/14/2015 07:44 PM, William Donzelli wrote:

IIRC, the KB11 processors used in the DEC 11/45 and 11/70 (and other
related systems) used five clocks delayed from each other (more
commonly known as clock phases).

IBM used this method as well on many of their machines.


On the system 360 CPUs, they did not use flip-flops like we are used to,
today.  They used latches, making it a requirement that there be at
least two clock phases in most of the CPU, so that data into the ALU,
for instance, remained stable when some register at the output was
clocked.  Since these were discrete transistor implementations, a real
flip-flop was too expensive, but a latch could be implemented in about 6
transistors, I think.



I'm guessing (no schematic handy) that they made the 360 register file
easy to decode and build with latches.


The 11/45 used TTL ICs, so real FFs were available in that technology,
although they may have used latches as well.


I have seen some 11/?? schematics on bitsavers where the ALU uses AOI gates
and includes a latch term.


Jon


Ben.




Re: Reproducing old machines with newer technology (Re: PDP-12 at the RICM)

2015-07-14 Thread Eric Smith
On Tue, Jul 14, 2015 at 3:28 PM, tony duell a...@p850ug1.demon.co.uk wrote:
 If you mean 6 different clock sources (i.e. clocks delayed from each other, 
 etc) then that
 is not typical of a 1970s minicomputer in my experience.

IIRC, the KB11 processors used in the DEC 11/45 and 11/70 (and other
related systems) used five clocks delayed from each other (more
commonly known as clock phases). In my experience that was more common
in 1970s computers than a single-phase clock.
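
(An aside, tying this back to the FPGA-model subthread: in an HDL
reconstruction those phases usually aren't separate clocks at all, but
enables rotated off one fast clock.  A minimal VHDL sketch of that idea --
the phase count, names and timing are purely illustrative, not any
particular machine's:)

library ieee;
use ieee.std_logic_1164.all;

entity phase_gen is
  generic ( N : positive := 5 );                  -- number of phases, e.g. five
  port (
    clk   : in  std_logic;                        -- single fast FPGA clock
    reset : in  std_logic;
    phase : out std_logic_vector(N-1 downto 0)    -- one-hot phase enables
  );
end entity;

architecture rtl of phase_gen is
  signal ring : std_logic_vector(N-1 downto 0) := (0 => '1', others => '0');
begin
  process (clk)
  begin
    if rising_edge(clk) then
      if reset = '1' then
        ring <= (0 => '1', others => '0');
      else
        -- rotate the single '1' one position per fast-clock cycle
        ring <= ring(N-2 downto 0) & ring(N-1);
      end if;
    end if;
  end process;
  phase <= ring;
end architecture;

Each enable then gates the logic that the corresponding real phase would
have clocked, with the fast clock standing in for the delay lines.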


Re: Reproducing old machines with newer technology (Re: PDP-12 at the RICM)

2015-07-14 Thread ben

On 7/14/2015 7:31 PM, Chuck Guzis wrote:


Seymour Cray should have used kinetic sculptures on his machines as part
of eye candy, I guess. Or maybe more chrome...


You got a nice love seat. I could see an early Cray-style machine in an FPGA,
but what good is number crunching if you don't have the memory on an FPGA
card?



--Chuck


Ben.


Re: Reproducing old machines with newer technology (Re: PDP-12 at the RICM)

2015-07-14 Thread Chuck Guzis

On 07/14/2015 06:55 PM, Jay Jaeger wrote:


Architecturally, it was pretty much the last of its kind: the last of
the BCD decimal arithmetic machines, which also makes it interesting.
It has also become much more obscure than the 1401, which it followed,
because not nearly as many were made and sold.


Not by a long shot; it was followed by the 7000-series machines, namely 
the 7070 and 7080 (I have a soft spot for the 1620 myself), and 
ultimately variable-length BCD was included in the S/360 and later series.


Having programmed 1401s, I'll grant that the architecture was pretty 
different from what we're used to today, but typical for a small machine 
of the time.


--Chuck



Re: Reproducing old machines with newer technology (Re: PDP-12 at the RICM)

2015-07-14 Thread Sean Caron
I think a lot of things drive the popularity of the PDP-8 from nostalgia to
historicity to perhaps the relative simplicity of the CPU to understand as
a design example in computer architecture ... IMO the machine is just a bit
too limited to be much fun to program in assembly ... although maybe some
are attracted to the challenge :O

But certainly a PDP-8 or PDP-11 for that matter would have been much more
common in the field; much more possible for someone to get their hands on
in some kind of nominally working condition; much more affordable; easier
to digest than a large mainframe or supercomputer. It's a shame more
examples of these machines haven't been preserved but in many cases, there
weren't many examples produced in the first place ... as well, I think a
lot of the mainframe vendors took a scorched earth policy and tried to
destroy as much of the older equipment as possible ... to keep it off the
gray market.

Many examples of blinkenlights eye candy throughout computer history but as
far as supers go, I think the CM-5 ranks pretty close to the top for me :O

Best,

Sean




On Tue, Jul 14, 2015 at 9:31 PM, Chuck Guzis ccl...@sydex.com wrote:

 On 07/14/2015 04:49 PM, Jay Jaeger wrote:

  Not necessarily.  For example, it is impossible to find an IBM 1410, as
 far as I know.  But there ARE 1415 consoles I knew of a while back, and
 there are certainly 729s and 1403 printers and 1402 card read/punch
 units up and running.


 There are plenty of machines that are impossible to find.  And many that
 are gone that are quite novel.  That IBM sold so many is something in their
 favor, but how about a working Saxpy box--which is quite a bit more recent
 than your 1410?  Or the STAR-65, 1B or even -100.  The only 65 was moved
 from Canada and scrapped.  My department had the only two 1Bs and I saw
 those go under the sledgehammer and bolt cutters. I don't think that there
 are STAR-100s of any stripe (plain, -A, -B or -C) left--they were just too
 big.  Are there any BSPs or ASC's kicking around?

 There are tons of lost non-IBM peripherals.

 But we do have documentation on many of these things, so at least we know
 how they worked.  And I submit that in the long run, that's what
 matters.  There's very little relevant to the state of the art today that
 really matters. (Boy, am I going to get flamed on that)

  Software "just make it work" emulator.  (Most of the SimH stuff seems to be
  at this level).


 Or dedicated simulators (non-SIMH).  Often, all you have is the system
 documentation that talks about the instruction set and a few binary files.
 Reverse-engineering can be fun and valuable.

  That is why I use VHDL (or Verilog is fine too).  So that those models
  are portable into the future.   The FPGA part doesn't matter so much,
  but the future portability of the model does matter.


 Maybe, but I'd rather read the design documents than a pile of HDL of any
 stripe.

  1403's and IBM 729's and 1402 card read/punch still exist.  I seem to
 recall the CHM doing something like building a 729 tape drive tester, too.


 But there were LOTS of those.  Try something non-IBM and very obscure.

  But something like the SBC 6120 PDP-8 is closer, potentially with real
  lights and switches.  As another example, I can envision an FPGA
  sitting inside a real IBM 1415 console, running its lights, responding
  to its switches and interacting with its Selectric typewriter.
  Probably more than I will accomplish, but it is good to have goals.


 A PDP-8 is a simple CPU, probably popular because of the lights and
 switches. I see evidence that these were eye candy--the DECStations are
 practically the same thing, but apparently not nearly as desirable.

 Seymour Cray should have used kinetic sculptures on his machines as part
 of eye candy, I guess. Or maybe more chrome...

 --Chuck




Re: Reproducing old machines with newer technology (Re: PDP-12 at the RICM)

2015-07-14 Thread William Donzelli
 In the 7000 series, the 1410 equivalent was the 7010 - architecturally
 compatible, ran the same software, but implemented in 7000 series
 technology.  It came along in 1962.  So that was really the last one to
 be introduced of its ilk.

 Other than clones and the like (e.g., from folks like Honeywell), I'm
 not aware of any other machines with a similar architecture to the 1401
 and 1410.  Name them?

1440 came after 1410. Quite a few were built, and one is being
restored by the Binghamton bunch.

1450 and 1460 came even later...but I have never seen evidence of any
of these actually being installed.

--
Will


Re: Reproducing old machines with newer technology (Re: PDP-12 at the RICM)

2015-07-14 Thread Sean Caron
That's an interesting argument against using FPGAs in this sort of
application; definitely food for thought. That said, from my (admittedly
limited hobbyist and academic exposure to FPGAs), I would expect the bulk
of whatever's being implemented would be fairly device-agnostic ...
certainly you might have to go and recreate your project in the newer
version of whatever toolchain you're using for design, and you might have
to go and remap all your pads for pins in and out of the device but I'd
expect you could just take the bulk of the VHDL or Verilog implementing the
CPU and any emulated peripherals on-FPGA and basically copy and paste that
right in, no? I could see some minor tweaks being required, but certainly
nothing on the order of doing the original design.
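
Roughly the split I have in mind -- a hedged VHDL sketch only, with made-up
entity and port names: the machine model stays vendor-neutral, and only a
thin top level knows about the particular board, so moving to a new part
should mostly mean redoing that wrapper and the pin constraints.

library ieee;
use ieee.std_logic_1164.all;

-- Portable part: nothing vendor-specific in here, so it should move
-- between toolchains with little more than a re-synthesis.
entity cpu_core is
  port (
    clk, reset : in  std_logic;
    sw         : in  std_logic_vector(11 downto 0);  -- console switches
    lamps      : out std_logic_vector(11 downto 0)   -- console lamps
  );
end entity;

architecture rtl of cpu_core is
begin
  -- ... the actual machine model would live here ...
  lamps <= sw;   -- placeholder behaviour only
end architecture;

library ieee;
use ieee.std_logic_1164.all;

-- Board-specific part: only this wrapper (plus the pin-constraint file)
-- needs to change when the FPGA or the dev board changes.
entity board_top is
  port (
    board_clk   : in  std_logic;
    board_reset : in  std_logic;
    board_sw    : in  std_logic_vector(11 downto 0);
    board_lamps : out std_logic_vector(11 downto 0)
  );
end entity;

architecture rtl of board_top is
begin
  u_core : entity work.cpu_core
    port map (
      clk   => board_clk,
      reset => board_reset,
      sw    => board_sw,
      lamps => board_lamps
    );
end architecture;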

Best,

Sean


On Tue, Jul 14, 2015 at 1:17 PM, Chuck Guzis ccl...@sydex.com wrote:

 I'm missing something in this discussion, I think.

 HDL's (take your pick) are just programming languages like FORTRAN or C
 with different constraints.  What's the point of going to all the trouble
 of doing an FPGA implementation of a slow old architecture, when pretty
 much the same result could be obtained by running a software emulator?
 Neither accurately reflects the details of the real thing--and there will
 always be the aspect of missing peripherals.

 Perhaps the worst aspect of using FPGA is that this is a rapidly moving
 field, so that the part you used to do your implementation 10 years ago
 will no longer be available.  I've done a few designs using 5V CPLDs
 (XC95xx series) not *that* long ago.  Now they themselves are quaint
 examples of obsolete hardware.  You can't win.

 You can move software-only simulators quite easily, but I'm not as
 sanguine about FPGA designs.

 And you still don't have the peripherals.  I suppose one could emulate a
 Univac Solid State machine in FPGA, but what would one do about the
 all-important drum coupled to the card reader and printer.  Has anyone
 rolled out a design for a DIY 1403 printer?

 I've run the Cyber emulator as well as various SIMH emulators from time to
 time, but it's just not the same as the real thing--it's not even remotely
 the same.

 --Chuck





Re: Reproducing old machines with newer technology (Re: PDP-12 at the RICM)

2015-07-14 Thread Sean Caron
As well, some early microprocessors used multiple clocks, e.g. the TMS9900.

Best,

Sean


On Tue, Jul 14, 2015 at 8:04 PM, Eric Smith space...@gmail.com wrote:

 On Tue, Jul 14, 2015 at 3:28 PM, tony duell a...@p850ug1.demon.co.uk
 wrote:
  If you mean 6 different clock sources (i.e. clocks delayed from each
 other, etc) then that
  is not typical of a 1970s minicomputer in my experience.

 IIRC, the KB11 processors used in the DEC 11/45 and 11/70 (and other
 related systems) used five clocks delayed from each other (more
 commonly known as clock phases). In my experience that was more common
 in 1970s computers than a single-phase clock.



RE: Reproducing old machines with newer technology (Re: PDP-12 at the RICM)

2015-07-14 Thread tony duell

  My experience of FPGAs is that if you design a circuit for an FPGA it will 
  work. If you take an existing design,
  feed it into a schematic capture program and compile it for an FPGA, then it 
  won't.
 
 Actually, you can, and I have done so - provided that the original
 machine was slow enough.  It works, in part, because the FPGA's are
 sooo much faster than the original design, that you can use the
 trailing D flip flop approach I described to convert the former into
 the latter - the glitches occur on the time scale of the FPGA logic, but
 are gone by the time the next simulated machine clock arrives.

That is not 'taking an existing design and feeding it into a schematic capture 
program'. It's modifying the
design (adding the D-types to synchronise signals and remove glitches). I do 
not dispute you can do that.
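
For concreteness, the trailing D flip-flop scheme being described comes out
something like the VHDL below -- a rough sketch with invented names and an
arbitrary divide ratio, not anyone's actual design:

library ieee;
use ieee.std_logic_1164.all;
use ieee.numeric_std.all;

entity deglitch_sync is
  port (
    fpga_clk  : in  std_logic;   -- fast board clock, much faster than the old machine
    async_sig : in  std_logic;   -- glitchy output of the translated combinational logic
    sig_out   : out std_logic    -- what the simulated machine clock actually sees
  );
end entity;

architecture rtl of deglitch_sync is
  signal div     : unsigned(5 downto 0) := (others => '0');  -- arbitrary divide ratio
  signal mclk_en : std_logic;          -- one fast-clock-wide "machine clock" enable
  signal d1      : std_logic := '0';   -- the trailing D flip-flop
begin
  -- Derive a slow machine-clock enable from the fast FPGA clock.
  process (fpga_clk)
  begin
    if rising_edge(fpga_clk) then
      div <= div + 1;
    end if;
  end process;
  mclk_en <= '1' when div = 0 else '0';

  -- The glitchy signal is re-registered on the fast clock, and the slow
  -- model only samples it at the machine-clock enable, by which time any
  -- decoding glitches have long since settled.
  process (fpga_clk)
  begin
    if rising_edge(fpga_clk) then
      d1 <= async_sig;
      if mclk_en = '1' then
        sig_out <= d1;
      end if;
    end if;
  end process;
end architecture;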

-tony


Re: Reproducing old machines with newer technology (Re: PDP-12 at the RICM)

2015-07-14 Thread Jay Jaeger
Sometimes it is fun to be a relative expert on an obscure branch of
knowledge that few people are even aware of.

I worked on one when I was a student, as an operator, programmer and
systems programmer.  Tweaked its FORTRAN compiler to spit out text error
messages instead of just error codes.  The FE trusted me to swap out
Selectrics, including plugging their paddle cards into the SMS slot on
the console.  It was my first real job.

If it were not for me with assistance from Paul Pierce, this tape (and a
couple of others also on Paul's site) probably would not have been
recovered, and the software for this machine would be lost to most folks:

http://www.piercefuller.com/library/kpr155.html


On 7/14/2015 6:30 PM, William Donzelli wrote:
 The 1130 is more modern than the machines I am interested in.  While
 there are still several 1401's out there in the wild, I am aware of no
 IBM 1410's anywhere, unless IBM has one squirreled away somewhere.
 
 OK, I am curious. Why the love for the 1410?
 
 I do not know of any, either.
 
 --
 Will
 


Re: Reproducing old machines with newer technology (Re: PDP-12 at the RICM)

2015-07-14 Thread Jon Elson

On 07/14/2015 07:44 PM, William Donzelli wrote:

IIRC, the KB11 processors used in the DEC 11/45 and 11/70 (and other
related systems) used five clocks delayed from each other (more
commonly known as clock phases).

IBM used this method as well on many of their machines.

On the system 360 CPUs, they did not use flip-flops like we 
are used to, today.  They used latches, making it a 
requirement that there be at least two clock phases in most 
of the CPU, so that data into the ALU, for instance, 
remained stable when some register at the output was 
clocked.  Since these were discrete transistor 
implementations, a real flip-flop was too expensive, but a 
latch could be implemented in about 6 transistors, I think.
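
In present-day HDL terms that scheme looks roughly like the fragment below:
two level-sensitive latches on non-overlapping phases, so the latch feeding
the combinational logic holds still while the one at its output captures.
This is only an illustrative sketch with invented names, not anything taken
from an IBM schematic.

library ieee;
use ieee.std_logic_1164.all;

entity two_phase_latches is
  port (
    phi1, phi2 : in  std_logic;                    -- non-overlapping clock phases
    alu_out    : in  std_logic_vector(7 downto 0); -- stands in for the combinational ALU
    reg_out    : out std_logic_vector(7 downto 0)  -- what feeds the ALU back
  );
end entity;

architecture rtl of two_phase_latches is
  signal lat_b : std_logic_vector(7 downto 0) := (others => '0');  -- destination latch
  signal lat_a : std_logic_vector(7 downto 0) := (others => '0');  -- source latch
begin
  -- Phase 1: the destination latch is transparent and captures the ALU
  -- result; the source latch is closed, so the ALU inputs stay stable.
  process (phi1, alu_out)
  begin
    if phi1 = '1' then
      lat_b <= alu_out;
    end if;
  end process;

  -- Phase 2: the source latch opens and takes the new value; the
  -- destination latch is now closed, so nothing races around the loop.
  process (phi2, lat_b)
  begin
    if phi2 = '1' then
      lat_a <= lat_b;
    end if;
  end process;

  reg_out <= lat_a;
end architecture;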


The 11/45 used TTL ICs, so real FFs were available in that 
technology, although they may have used latches as well.


Jon


Re: Reproducing old machines with newer technology (Re: PDP-12 at the RICM)

2015-07-14 Thread Jay Jaeger
Meh.  You take your machines and I'll take mine. :)  The IBM 1410 is a
machine I know well, so I know how it is supposed to work, and I have
detailed information in the form of the ALD's and the CE training
materials to go with it, plus software including diagnostics and
operational software I can test it with.  So I have a way to verify the
functional correctness of the reproduction.

Architecturally, it was pretty much the last of its kind: the last of
the BCD decimal arithmetic machines, which also makes it interesting.
It has also become much more obscure than the 1401, which it followed,
because not nearly as many were made and sold.

Software-wise, the PR-155 OS for the 1410 was pretty decent for a design
that started in the early 1960s.  It could do multi-programming, both
for I/O spooling operations and for tele-processing, with swappable
transient TP programs alongside batch operations, if you had the
memory.   The machine I worked on was only 40,000 BCD characters, but
ran the full operating system (sans TP).

Dumb channels though (simple FSMs - no programmability), which kept the
size of the machine and its cost down.
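
(By "dumb" I mean the channel is little more than a small state machine
handing one character at a time between device and storage.  In the spirit
of this thread, something on the order of the VHDL fragment below -- purely
illustrative, with made-up signals, and emphatically not the 1410's actual
channel logic:)

library ieee;
use ieee.std_logic_1164.all;

entity dumb_channel is
  port (
    clk, reset : in  std_logic;
    start      : in  std_logic;                    -- CPU kicks off a transfer
    dev_ready  : in  std_logic;                    -- device has a character for us
    dev_char   : in  std_logic_vector(5 downto 0); -- one character from the device
    last_char  : in  std_logic;                    -- device signals end of record
    mem_write  : out std_logic;                    -- strobe one character into storage
    mem_char   : out std_logic_vector(5 downto 0);
    done       : out std_logic                     -- tell the CPU the transfer ended
  );
end entity;

architecture rtl of dumb_channel is
  type state_t is (idle, wait_char, store_char, finished);
  signal state : state_t := idle;
begin
  process (clk)
  begin
    if rising_edge(clk) then
      mem_write <= '0';
      done      <= '0';
      if reset = '1' then
        state <= idle;
      else
        case state is
          when idle =>                 -- nothing to do until the CPU says go
            if start = '1' then
              state <= wait_char;
            end if;
          when wait_char =>            -- wait for the device to offer a character
            if dev_ready = '1' then
              mem_char  <= dev_char;
              mem_write <= '1';
              state     <= store_char;
            end if;
          when store_char =>           -- one character stored; more, or finish?
            if last_char = '1' then
              state <= finished;
            else
              state <= wait_char;
            end if;
          when finished =>
            done  <= '1';
            state <= idle;
        end case;
      end if;
    end if;
  end process;
end architecture;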

More below.

JRJ

On 7/14/2015 8:31 PM, Chuck Guzis wrote:
 On 07/14/2015 04:49 PM, Jay Jaeger wrote:
 
 Not necessarily.  For example, it is impossible to find an IBM 1410, as
 far as I know.  But there ARE 1415 consoles I knew of a while back, and
 there are certainly 729s and 1403 printers and 1402 card read/punch
 units up and running.
 
 There are plenty of machines that are impossible to find.  And many that
 are gone that are quite novel.  That IBM sold so many is something in
 their favor, but how about a working Saxpy box--which is quite a bit
 more recent than your 1410?  Or the STAR-65, 1B or even -100.  The only
 65 was moved from Canada and scrapped.  My department had the only two
 1Bs and I saw those go under the sledgehammer and bolt cutters. I don't
 think that there are STAR-100s of any stripe (plain, -A, -B or -C)
 left--they were just too big.  Are there any BSPs or ASC's kicking around?
 

I am not interested in recent.  Indeed, if I did anything after the
1410, I'd probably go sideways or backwards in time.  I'll leave them to
you.  ;)

 There are tons of lost non-IBM peripherals.
 
 But we do have documentation on many of these things, so at least we
 know how they worked.  And I submit that in the long run, that's what
 matters.  There's very little relevant to the state of the art today
 that really matters. (Boy, am I going to get flamed on that)
 

No, you are correct.  This has nothing to do with the state of the art.
 This is a hobby/historical documentation effort.  But as I mentioned in
the earlier note, the "how they worked" comes in levels, and my effort is
at a lower level of abstraction / higher level of detail.

 Software "just make it work" emulator.  (Most of the SimH stuff seems to be
 at this level).
 
 Or dedicated simulators (non-SIMH).  Often, all you have is the system
 documentation that talks about the instruction set and a few binary
 files.  Reverse-engineering can be fun and valuable.
 
 That is why I use VHDL (or Verilog is fine too).  So that those models
 are portable into the future.   The FPGA part doesn't matter so much,
 but the future portability of the model does matter.
 
 Maybe, but I'd rather read the design documents than a pile of HDL of
 any stripe.
 

To each his own.  Enjoy.

 1403's and IBM 729's and 1402 card read/punch still exist.  I seem to
 recall the CHM doing something like building a 729 tape drive tester,
 too.
 
 But there were LOTS of those.  Try something non-IBM and very obscure.

Well, if I had more money I might have snagged a CDC-160A a few years
back (it went for over twice what I could afford at the time - I maxed
out at $2k), and I'd probably be doing that one, but, such is life.

 
 A PDP-8 is a simple CPU, probably popular because of the lights and
 switches. I see evidence that these were eye candy--the DECStations are
 practically the same thing, but apparently not nearly as desirable.

The PDP-8 variants are popular with collectors for a number of reasons.
 Approachable physical size is one.  Ordinary TTL is another.  Speeds
that folks can deal with and lack of overall complexity is yet another
reason.  A console that can help debugging is yet another. The first
machine in my collection is a PDP-8/L for all of those reasons.

(Interestingly, the IBM 1410 console tells much more about what is going
on in the machine than might at first be apparent - the entire machine
state (aside from memory) is pretty much there, but presented in a way
that is quite different than what one sees on a PDP-8)

RE DECStations: I think what you mean are the DECMates, but yes: they
indeed used the Intersil chip sets.

 
 Seymour Cray should have used kinetic sculptures on his machines as part
 of eye candy, I guess. Or maybe more chrome...
 
 --Chuck
 
 


Re: Reproducing old machines with newer technology (Re: PDP-12 at the RICM)

2015-07-14 Thread Jay Jaeger
Yes, the S/360 had packed decimal - but much more limited in length, and
no wordmark concept.

The 7070 and 7080 were contemporary with the 1410, not after it.  They
did not follow it.  While data representations were somewhat similar,
the instruction formats were very different.

The 7080 (which apparently fixed a 7070 design which made it an orphan
pretty quickly) was announced in January 1960, and the 1410 in October of
that same year.   What 7070 and 7080 actually followed were the 705 and
650.  They were also different from the 1401/1410 in that they used
fixed length instructions, rather than the variable instructions used by
the 1401/1410.

https://www-03.ibm.com/ibm/history/exhibits/dpd50/dpd50_chronology2.html

In the 7000 series, the 1410 equivalent was the 7010 - architecturally
compatible, ran the same software, but implemented in 7000 series
technology.  It came along in 1962.  So that was really the last one to
be introduced of its ilk.

Other than clones and the like (e.g., from folks like Honeywell), I'm
not aware of any other machines with a similar architecture to the 1401
and 1410.  Name them?

By 1968 the System/360 had essentially toasted them all.

JRJ

On 7/14/2015 10:10 PM, Chuck Guzis wrote:
 On 07/14/2015 06:55 PM, Jay Jaeger wrote:
 
 Architecturally, it was pretty much the last of its kind: the last of
 the BCD decimal arithmetic machines, which also makes it interesting.
 It has also become much more obscure than the 1401, which it followed,
 because not nearly as many were made and sold.
 
 Not by a long shot; it was followed by the 7000-series machines, namely
 the 7070 and 7080 (I have a soft spot for the 1620 myself), and
 ultimately variable-length BCD was included in the S/360 and later series.
 
 Having programmed 1401s, I'll grant that the architecture was pretty
 different from what we're used to today, but typical for a small machine
 of the time.
 
 --Chuck
 
 


Re: Reproducing old machines with newer technology (Re: PDP-12 at the RICM)

2015-07-14 Thread Chuck Guzis

On 07/14/2015 09:16 PM, Jay Jaeger wrote:


Other than clones and the like (e.g., from folks like Honeywell), I'm
not aware of any other machines with a similar architecture to the 1401
and 1410.  Name them?


Well, how about a bit-addressable, variable field length machine that 
had not only your basic set of floating point operations, but also 
variable-length binary, binary modulo-256 and packed BCD to a length of 
65535 bytes (131K BCD digits)?  Circa 1969-1971:


http://bitsavers.informatik.uni-stuttgart.de/pdf/cdc/cyber/cyber_200/60256000_STAR-100hw_Dec75.pdf

When you've got a few minutes to spare, try writing the VHDL for it. 
This was a Jim Thornton design, later taken over by Neil Lincoln.  Later 
versions of the machine had drastically reduced instruction sets from 
the original, culminating finally in the liquid-nitrogen cooled ETA-10.


But really, variable-word length machines, while they made efficient use 
of storage, were pretty much limited to a character-serial 
memory-to-memory 2-address organization.  Quaint and perhaps 
interesting, but doomed from a performance standpoint.
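
To be fair, the per-digit logic in such machines is trivial; the cost is
that it gets swept along the field one digit time after another, so the
execution time grows with the field length no matter how fast the digit
adder is.  Something like the VHDL fragment below, which is an illustrative
sketch with made-up names -- not the STAR's (or anyone's) actual data path:

library ieee;
use ieee.std_logic_1164.all;
use ieee.numeric_std.all;

entity bcd_digit_adder is
  port (
    clk, start : in  std_logic;             -- 'start' clears the decimal carry
    dig_a      : in  unsigned(3 downto 0);  -- one BCD digit of operand A
    dig_b      : in  unsigned(3 downto 0);  -- one BCD digit of operand B
    dig_sum    : out unsigned(3 downto 0)   -- one BCD digit of the result
  );
end entity;

architecture rtl of bcd_digit_adder is
  signal carry : unsigned(0 downto 0) := "0";
begin
  process (clk)
    variable raw : unsigned(4 downto 0);
  begin
    if rising_edge(clk) then
      if start = '1' then
        carry <= "0";
      else
        -- Add one digit pair plus the carry left over from the previous
        -- (less significant) digit time.
        raw := ('0' & dig_a) + ('0' & dig_b) + carry;
        if raw > 9 then
          raw := raw + 6;          -- decimal correction
          carry <= "1";
        else
          carry <= "0";
        end if;
        dig_sum <= raw(3 downto 0);
      end if;
    end if;
  end process;
end architecture;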


--Chuck



RE: Reproducing old machines with newer technology (Re: PDP-12 at the RICM)

2015-07-14 Thread Dave G4UGM
 -Original Message-
 From: cctalk [mailto:cctalk-boun...@classiccmp.org] On Behalf Of Chuck
 Guzis
 Sent: 14 July 2015 18:17
 To: gene...@classiccmp.org; discuss...@classiccmp.org:On-Topic and Off-
 Topic Posts
 Subject: Re: Reproducing old machines with newer technology (Re: PDP-12 at
 the RICM)
 
 I'm missing something in this discussion, I think.
 
 HDL's (take your pick) are just programming languages like FORTRAN or C
 with different constraints.  What's the point of going to all the trouble of
 doing an FPGA implementation of a slow old architecture, when pretty much
 the same result could be obtained by running a software emulator?  

I don't think this is true. Most of the software simulators don't run at 
anything like accurate speed ...

 Neither
 accurately reflects the details of the real thing--and there will always be 
 the
 aspect of missing peripherals.

I believe that you can get much closer with an FPGA...

 
 Perhaps the worst aspect of using FPGA is that this is a rapidly moving field,
 so that the part you used to do your implementation 10 years ago
 will no longer be available.

That is very true, but there again the same can happen with software.  The 
Pegasus simulator I use was written in Turbo Pascal and has kludges to get the 
speed right that just don't work on a modern PC.

 I've done a few designs using 5V CPLDs
 (XC95xx series) not *that* long ago.  Now they themselves are quaint
 examples of obsolete hardware.  You can't win.
 
 You can move software-only simulators quite easily, but I'm not as sanguine
 about FPGA designs.
 

See Above

 And you still don't have the peripherals.  I suppose one could emulate a
 Univac Solid State machine in FPGA, but what would one do about the all-
 important drum coupled to the card reader and printer.  Has anyone rolled
 out a design for a DIY 1403 printer?

When I have finished the Calcomp Plotter

 
 I've run the Cyber emulator as well as various SIMH emulators from time to
 time, but it's just not the same as the real thing--it's not even remotely the
 same.

I have used Hercules with a real 3270. It's not bad, but Laurence Wilkinson's 360 
in FPGA with a real Selectric typewriter is much better:-

http://www.ljw.me.uk/ibm360/Saga.html


 
 --Chuck
 
 Dave



Re: Reproducing old machines with newer technology (Re: PDP-12 at the RICM)

2015-07-14 Thread Chuck Guzis

On 07/14/2015 10:35 AM, ben wrote:


I've run the Cyber emulator as well as various SIMH emulators from time
to time, but it's just not the same as the real thing--it's not even
remotely the same.


You can still see the old computer blinking lights movie props.


On a Cyber?  What blinking lights?  Power off the DD60 and stuff some 
cotton in your ears and you couldn't even tell that the thing was on.


--Chuck




Re: Reproducing old machines with newer technology (Re: PDP-12 at the RICM)

2015-07-14 Thread ben

On 7/14/2015 11:17 AM, Chuck Guzis wrote:

I'm missing something in this discussion, I think.

HDL's (take your pick) are just programming languages like FORTRAN or C
with different constraints.  What's the point of going to all the
trouble of doing an FPGA implementation of a slow old architecture, when
pretty much the same result could be obtained by running a software
emulator?  Neither accurately reflects the details of the real
thing--and there will always be the aspect of missing peripherals.


For the moment you can still get FPGA boards with expansion connectors.
The $39 card is the trend nowadays. Hard to get real I/O of any kind, as we 
know.



Perhaps the worst aspect of using FPGA is that this is a rapidly moving
field, so that the part you used to do your implementation 10 years ago
will no longer be available.I've done a few designs using 5V CPLDs
(XC95xx series) not *that* long ago.  Now they themselves are quaint
examples of obsolete hardware.  You can't win.


Since when was that new in electronics? Mind you, new electrostatic speakers
and OLD Quad IIs go well together.



You can move software-only simulators quite easily, but I'm not as
sanguine about FPGA designs.

And you still don't have the peripherals.  I suppose one could emulate a
Univac Solid State machine in FPGA, but what would one do about the
all-important drum coupled to the card reader and printer.  Has anyone
rolled out a design for a DIY 1403 printer?


Don't look at me, I lost the bid on some old IBM equipment years ago.


I've run the Cyber emulator as well as various SIMH emulators from time
to time, but it's just not the same as the real thing--it's not even
remotely the same.


You can still see the old computer blinking lights movie props.


--Chuck


Ben.