> On Apr 3, 2024, at 12:28 PM, Martin Bishop via cctalk <[email protected]> 
> wrote:
> 
> Ignore my last - incontinence or is it incompetence
> 
> A fairly ordinary GPU, in a PC, could almost certainly provide an XY display 
> with Z fade (long persistence phosphor).  I use them for waterfall displays 
> and they keep up - the data does of course arrive by E'net.

Yes, and some emulations have done this, such as Phil Budne's famous work in 
SIMH.  I adopted some of those ideas for DtCyber (CDC 6000 emulation) and it 
works well, though the timing is marginal given the graphics subsystem I use.  
That is wxWidgets -- chances are SDL would do better, and some day I will try 
that.

> Equally, FPGAs / SOCs can implement frame buffers; eg to output waterfall 
> displays.  The fading memory would have to be in DRAM; FPGA memory is fast 
> but small: 3 ns access time but only 240 KiB .. 2.18 MiB (Zynq 10 .. 45, 
> the '45 is a corporate purchase).  A ping pong buffer arrangement could 
> implement fading - computed in either processor  (vector instructions) or 
> logic (raw muscle).  The DAC input lines could supply the data.

Agreed, and that would be an elegant way to emulate a CDC DD60.  Or a GT40.  
You'd presumably want at least an HD-level display signal (1920 by 
1080), if not double that, to make it look right; anything less would give 
pixel artefacts that keep it from looking as vector-like as you would want.
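The ping-pong arrangement described in the quoted text can be sketched as
well.  This is a hypothetical illustration (buffer names, sizes, and the
decay constant are my assumptions, not anything from an actual Zynq
design): two buffers alternate roles each frame, with the fade pass
reading the displayed buffer and writing the faded result plus the new
beam samples into the other.

```python
import numpy as np

W, H = 1920, 1080     # HD-level buffer, as suggested above
DECAY = 0.9           # assumed per-frame retention

bufs = [np.zeros((H, W), np.float32), np.zeros((H, W), np.float32)]
cur = 0               # index of the buffer currently being displayed

def frame(samples):
    """One ping-pong step: fade src into dst, add new samples, then swap."""
    global cur
    src, dst = bufs[cur], bufs[1 - cur]
    np.multiply(src, DECAY, out=dst)   # fade pass; trivially vectorizable
    for x, y in samples:               # new samples from the DAC input lines
        dst[y, x] = 1.0
    cur = 1 - cur                      # dst becomes the display buffer
```

In an FPGA the fade pass would be a streaming multiply between the two
DRAM buffers; in a processor it maps directly onto vector instructions,
as the quoted text notes.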

        paul