Yes, we have built a few of them. We have one here, one at AMSE, and one
that travels to schools in one of our traveling science trailers.
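
For the node-scaling question further down in the thread, the usual recipe is
to time a fixed-size MPI job at 1, 2, 4, ... nodes and plot wall time (or
speedup, T1/TN) against node count. Below is a minimal sketch of such a timing
program, assuming only that an MPI stack is installed; the file name
pi_scale.c and the problem size are placeholders, and a communication-heavy
app such as HPL would exercise the EDR fabric far more than this
compute-bound loop does.

/* pi_scale.c - minimal MPI scaling sketch (not a tuned benchmark).
 * Estimates pi by midpoint-rule integration of 4/(1+x^2) on [0,1],
 * striding the intervals across ranks. The total work is fixed, so
 * wall time should drop roughly linearly as ranks are added.
 *
 * Build: mpicc -O2 -o pi_scale pi_scale.c
 * Run:   mpirun -np <ranks> ./pi_scale
 */
#include <mpi.h>
#include <stdio.h>

int main(int argc, char **argv)
{
    MPI_Init(&argc, &argv);

    int rank, size;
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);
    MPI_Comm_size(MPI_COMM_WORLD, &size);

    const long long n = 2000000000LL;  /* total intervals (fixed work) */
    const double h = 1.0 / (double)n;

    MPI_Barrier(MPI_COMM_WORLD);       /* start all ranks together */
    double t0 = MPI_Wtime();

    double local = 0.0;
    for (long long i = rank; i < n; i += size) {
        double x = h * ((double)i + 0.5);
        local += 4.0 / (1.0 + x * x);
    }
    local *= h;

    double pi = 0.0;
    MPI_Reduce(&local, &pi, 1, MPI_DOUBLE, MPI_SUM, 0, MPI_COMM_WORLD);
    double t1 = MPI_Wtime();

    if (rank == 0)
        printf("%d ranks: pi ~= %.12f, elapsed %.3f s\n", size, pi, t1 - t0);

    MPI_Finalize();
    return 0;
}

Running it at 1, 2, 4, ... up to all 36 nodes and plotting the printed times
(any spreadsheet or gnuplot will do) gives exactly the kind of speedup curve
that upper management asks for.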

On Tue, Jan 14, 2020 at 10:29 AM John McCulloch <jo...@pcpcdirect.com>
wrote:

> Hey Scott, I think I saw an exhibit like what you’re describing at the
> AMSE when I was on a project in Oak Ridge. Was that it?
>
>
>
> John McCulloch | PCPC Direct, Ltd. | desk 713-344-0923
>
>
>
> *From:* Scott Atchley <e.scott.atch...@gmail.com>
> *Sent:* Tuesday, January 14, 2020 7:19 AM
> *To:* John McCulloch <jo...@pcpcdirect.com>
> *Cc:* beowulf@beowulf.org
> *Subject:* Re: [Beowulf] HPC demo
>
>
>
> We still have Tiny Titan <https://tinytitan.github.io> even though Titan
> is gone. It lets users toggle processors on and off, and the display has a
> mode where the "water" is color-coded by processor, each of which has a
> corresponding light. You can see the frame rate go up as you add
> processors, and the motion becomes much more fluid.
>
>
>
> On Mon, Jan 13, 2020 at 7:35 PM John McCulloch <jo...@pcpcdirect.com>
> wrote:
>
> I recently inherited management of a cluster, and my knowledge is limited
> to a bit of Red Hat. I need to figure out a demo for upper management that
> graphically demonstrates the speedup of running a parallel app on one x86
> node versus multiple nodes, up to 36. The nodes have dual Intel Xeon Gold
> 6132 processors and a Mellanox EDR InfiniBand interconnect. Any suggestions
> would be appreciated.
>
>
>
> Respectfully,
>
> John McCulloch | PCPC Direct, Ltd.
>
>
>
_______________________________________________
Beowulf mailing list, Beowulf@beowulf.org sponsored by Penguin Computing
To change your subscription (digest mode or unsubscribe) visit 
https://beowulf.org/cgi-bin/mailman/listinfo/beowulf
