A little digging later ...  I implemented the waterfall display eight years 
ago, outputting to a 1280 x 1024 monitor.  1920 x 1080 was supported, but I was 
outputting 1 Ki-point FFTs.  The hardware platform was a Xilinx Zynq.

An indication of 4K video capabilities is 
https://xilinx-wiki.atlassian.net/wiki/spaces/A/pages/2611216385/Zynq+UltraScale+MPSoC+VCU+TRD+2023.1
Note the higher-octane hardware used.

Martin

-----Original Message-----
From: Paul Koning [mailto:[email protected]] 
Sent: 03 April 2024 18:38

<< snip >>

> Equally, FPGAs / SOCs can implement frame buffers; e.g. to output waterfall 
> displays.  The fading memory would have to be in DRAM: FPGA memory is fast 
> (3 ns access time) but small, only 240 KiB .. 2.18 MiB (Zynq 7010 .. 7045; 
> the '45 is a corporate purchase).  A ping-pong buffer arrangement could 
> implement fading, computed in either the processor (vector instructions) or 
> logic (raw muscle).  The DAC input lines could supply the data.
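
A minimal C sketch of that ping-pong fading arrangement, for the
processor-side (vector-friendly) variant.  The buffer sizes, decay factor,
and all names are illustrative assumptions, not taken from any real design:

    #include <stdint.h>
    #include <string.h>

    #define FFT_BINS   1024     /* one 1 Ki-point FFT line per update  */
    #define ROWS       1024     /* visible history depth (assumed)     */
    #define DECAY_NUM  7        /* fade factor 7/8 per frame (assumed) */
    #define DECAY_DEN  8

    /* Two full frames in DRAM; 'front' is scanned out, 'back' is rebuilt. */
    static uint8_t frame[2][ROWS][FFT_BINS];
    static int front = 0;

    /* Fold one new line of FFT magnitudes into the display: every
     * existing row decays toward zero as it scrolls down, then the new
     * line is written at the top.  The scan-out engine is switched to
     * the freshly written buffer, so it never sees a half-updated frame. */
    void waterfall_update(const uint8_t fft_line[FFT_BINS])
    {
        int back = front ^ 1;

        /* Decay and scroll: row r of the new frame is the faded row r-1
         * of the old frame.  A real implementation would vectorise this
         * (e.g. NEON on the Zynq's ARM cores) or do it in fabric. */
        for (int r = ROWS - 1; r > 0; r--)
            for (int c = 0; c < FFT_BINS; c++)
                frame[back][r][c] =
                    (uint8_t)((frame[front][r - 1][c] * DECAY_NUM) / DECAY_DEN);

        memcpy(frame[back][0], fft_line, FFT_BINS);  /* newest line on top */

        front = back;   /* flip: scan-out now reads the updated buffer */
    }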

Agreed, and that would be an elegant way to emulate a CDC DD60, or a GT40.  
You'd presumably want at least an HD-level display signal (1920 by 1080), if 
not double that, to make it look right; anything less would give pixel 
artefacts that make it look less vector-like than you would want.
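
To illustrate the pixel-artefact point: a raster "vector" is drawn by
spreading each step's intensity across the pixels it straddles, and the
coarser the grid, the more visible that compromise is.  A minimal C sketch
using Wu-style intensity splitting; the resolution and all names here are
assumptions for illustration:

    #include <math.h>
    #include <stdint.h>

    #define W 1920
    #define H 1080
    static uint8_t fb[H][W];   /* 8-bit intensity framebuffer (assumed) */

    /* Add coverage-weighted intensity to one pixel, saturating at white. */
    static void blend(int x, int y, float cov)
    {
        if (x < 0 || x >= W || y < 0 || y >= H) return;
        int v = fb[y][x] + (int)(cov * 255.0f + 0.5f);
        fb[y][x] = (uint8_t)(v > 255 ? 255 : v);
    }

    /* Draw an anti-aliased "vector" by stepping along the major axis and
     * splitting each step's intensity between the two rows (or columns)
     * it straddles -- the core of Wu's line algorithm. */
    void draw_vector(float x0, float y0, float x1, float y1)
    {
        int steep = fabsf(y1 - y0) > fabsf(x1 - x0);
        if (steep) { float t; t = x0; x0 = y0; y0 = t;
                              t = x1; x1 = y1; y1 = t; }
        if (x0 > x1) { float t; t = x0; x0 = x1; x1 = t;
                                t = y0; y0 = y1; y1 = t; }

        float dx = x1 - x0;
        float grad = (dx == 0.0f) ? 0.0f : (y1 - y0) / dx;
        float y = y0;

        for (int x = (int)x0; x <= (int)x1; x++, y += grad) {
            float f = y - floorf(y);        /* fractional minor position */
            if (steep) {
                blend((int)floorf(y),     x, 1.0f - f);
                blend((int)floorf(y) + 1, x, f);
            } else {
                blend(x, (int)floorf(y),     1.0f - f);
                blend(x, (int)floorf(y) + 1, f);
            }
        }
    }

At 3840 x 2160 the same stroke crosses twice as many pixels, so each pixel
carries half the positional error, and the blur band is half as wide on
screen.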

        paul
