While not necessarily unwelcome as a possibility, I don't think GPU-based drawing/gaming is as relevant to this discussion (or as important a goal for Plan 9 / 9front) as GPU compute (GPGPU).

The ability to leverage GPU resources across CPU servers for computation would be of great benefit to the platform, and working out a driver interface, starting with remote access via drawterm, seems like a sensible step in that direction.
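To make that concrete, here is a purely hypothetical sketch of how such a device could look from C, in the spirit of /dev/draw: the /dev/gpu file names and ctl commands below are invented for illustration, and nothing like this exists in drawterm or the kernel today.

	/*
	 * Hypothetical sketch only. Imagine drawterm exporting the
	 * host GPU as a small file tree, in the style of /dev/draw
	 * or /dev/audio. The names /dev/gpu/ctl and /dev/gpu/data
	 * are invented here; no such device exists yet.
	 */
	#include <u.h>
	#include <libc.h>

	void
	gpurun(char *shader, uchar *buf, long n)
	{
		int ctl, data;

		ctl = open("/dev/gpu/ctl", OWRITE);
		data = open("/dev/gpu/data", ORDWR);
		if(ctl < 0 || data < 0)
			sysfatal("no gpu device: %r");

		fprint(ctl, "load %s\n", shader);	/* compile a compute shader */
		write(data, buf, n);			/* upload input buffer */
		fprint(ctl, "dispatch\n");		/* run it once */
		seek(data, 0, 0);
		read(data, buf, n);			/* read results back */

		close(data);
		close(ctl);
	}

The appeal of a file interface like this is that it would work identically whether the device is served locally, by drawterm, or imported from a cpu server over 9P.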

On 8/22/21 3:07 AM, sirjofri wrote:

22.08.2021 05:16:42 Eli Cohen <echol...@gmail.com>:
Deep learning is another interest of mine too. Hardware support is a
big deal for that... some kind of support for GPUs would be nice.
People have discussed that for years... hardware drivers are difficult
and important to do correctly!

I always really liked the "XCPU" and drawterm-type ideas of using
other OSes for their existing strengths alongside Plan 9. Maybe
drawterm could have a GPU device driver or something... That being
said, I have sometimes found it ends up surprisingly easier to do it
all on Plan 9...

That's also something I've thought about a few times already: drawterm
with GPU support. The only issue I see is that for realtime applications
like games, the draw times would be network-bound and thus pretty slow.
It would work for heavy GPU applications where almost no draw calls
exist (no textures, very low-poly meshes, ...), but for heavier stuff
we'd need to address that.
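For rough scale (illustrative numbers, not measurements): at 60 fps the
frame budget is about 16.7 ms, so even one synchronous round trip over a
link with, say, 20 ms RTT already misses the frame, while a compute job
that grinds for seconds per dispatch amortizes the same RTT to nothing.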

That's the benefit of a native driver: you could run the server side
(heavy CPU calculations) on a cpu server, the client/frontend side
(including draw calls) on a terminal, and the pure graphics work on
the GPU.

I'd still give the drawterm GPU a shot. Maybe I can set drawterm up
for compilation on my work PC (two GTX 1080 Ti cards) and try to figure
out how to do all that stuff. However, I've never done graphics
applications on Windows or anywhere else that uses OpenGL or DirectX
(I'd try OpenGL, for portability); so far I've only written shaders.
I'll surely need some time (which is always scarce as a game developer).

By the way, I don't know the exact requirements for GPU usage in neural
networks. I assume it's all compute shaders? Maybe it's even a kind of
black box: put data in (one draw call), read results out. I assume this
can work perfectly fine for draw times, depending on the data.
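That "black box" pattern maps directly onto a single compute dispatch in
plain OpenGL. A minimal sketch of the idea (standard GL 4.3 calls; it
assumes a context is already current, skips error checking, and the
doubling shader is only a placeholder):

	/*
	 * One compute dispatch as a black box: upload data, run a
	 * shader over it, read the results back. Assumes an OpenGL
	 * 4.3 context created elsewhere (GLFW, EGL, ...).
	 */
	#include <GL/glew.h>

	static const char *src =
		"#version 430\n"
		"layout(local_size_x = 64) in;\n"
		"layout(std430, binding = 0) buffer Data { float v[]; };\n"
		"void main(){ uint i = gl_GlobalInvocationID.x; v[i] *= 2.0; }\n";

	void
	rundispatch(float *data, int n)	/* n must be a multiple of 64 here */
	{
		GLuint cs, prog, buf;

		cs = glCreateShader(GL_COMPUTE_SHADER);
		glShaderSource(cs, 1, &src, NULL);
		glCompileShader(cs);
		prog = glCreateProgram();
		glAttachShader(prog, cs);
		glLinkProgram(prog);

		glGenBuffers(1, &buf);
		glBindBuffer(GL_SHADER_STORAGE_BUFFER, buf);
		glBufferData(GL_SHADER_STORAGE_BUFFER, n*sizeof(float), data, GL_DYNAMIC_COPY);
		glBindBufferBase(GL_SHADER_STORAGE_BUFFER, 0, buf);

		glUseProgram(prog);
		glDispatchCompute(n/64, 1, 1);	/* the one "draw call" */
		glMemoryBarrier(GL_SHADER_STORAGE_BARRIER_BIT);
		glGetBufferSubData(GL_SHADER_STORAGE_BUFFER, 0, n*sizeof(float), data);

		glDeleteBuffers(1, &buf);
		glDeleteProgram(prog);
		glDeleteShader(cs);
	}

Everything between the upload and the readback stays on the GPU, so the
host (and in the drawterm case, the network) only sees the buffer going
in and the results coming out.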

sirjofri
