As a developer, I see Qubes as an amazing opportunity to develop and test 
my software on multiple operating systems from a central, secure location. 
I develop marketing software, games, and machine learning algorithms, and 
for much of that work I need to utilize my GPU (a Titan X Pascal). I 
haven't installed Qubes yet, but in my research all I could find was a 
thread here that started in 2014 and had one post from 2015.

As a user, I must say that if an OS can't utilize my GPU, it's not an OS 
I can use for day-to-day operations. I've read about the security 
concerns, and the GPU is no more of a concern than any other device you 
plug into your computer. It isn't some gaping security hole. Today's 
motherboards and various other components have firmware scattered 
throughout the system; the GPU is no more insecure than they are, and 
it's a critical component of any computer.

What's the current capability for GPU utilization? Would I be able to 
activate my GPU for one of the installs so I can run my machine learning 
code against it or run a 3D environment? (Something like the sanity check 
sketched after this paragraph is what I'd want to pass inside a qube.) 
The GPU should also be powering the UI of the OS. The point of my post 
isn't to provoke, but to point out the absurdity of what I suspect is 
still the neglect of GPU utilization in this OS. I'd really love to use 
Qubes, and so would many other people, but if they come along and 
realize their GPU means nothing here, well...
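
To make the ask concrete, this is the kind of check I'd want to pass 
inside a qube before trusting it with ML work. It's a minimal sketch 
assuming a CUDA-enabled build of PyTorch is installed in the qube's 
template; the matrix size is just illustrative:

import torch

# Confirm the qube can actually see a CUDA device at all.
if not torch.cuda.is_available():
    raise SystemExit("No CUDA device visible in this qube.")

print("Visible GPU:", torch.cuda.get_device_name(0))

# Run a small matrix multiply on the GPU to confirm compute actually
# works, not just device enumeration.
a = torch.randn(1024, 1024, device="cuda")
b = torch.randn(1024, 1024, device="cuda")
c = a @ b
torch.cuda.synchronize()
print("GPU matmul OK, checksum:", c.sum().item())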
