Andre Sulka:
> Hi guys,
> 
> I would like to use my currently unused MSI GTX 660 for machine learning 
> with R/Python, Keras, PyTorch, and TensorFlow in a Qubes VM.
> 
> I already sent a question to the list by email but I can't find it here, so 
> if it appears, sorry for the double post.
> 
> Currently both of my monitors are connected to my onboard GPU (integrated 
> in an Intel i5 CPU), and that is fine.
> 
> There are some instructions on how to attach the card and what to install, 
> but they are older and I'm not sure whether I need to follow exactly the 
> same procedure.
> 
> I don't want to connect a monitor to it, AND I don't want to break my 
> system... ;)
> 
> So the question is:
> 
> How can I use my NVIDIA GTX 660 inside a template-based VM for machine 
> learning, and which steps/mods/installs are needed?
> 
> Many thanks!
> 
1. Replace the Nvidia card with an older AMD video card.
2. Follow the GPU passthrough guide:
https://github.com/Qubes-Community/Contents/blob/master/docs/customization/windows-gaming-hvm.md

Nvidia is not consumer-friendly and does its best to prevent users from
using its hardware as they like without spending thousands on a business
version of its cards. I don't think anyone has been able to get
passthrough working with them.
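For reference, the core of PCI passthrough in Qubes 4.x looks roughly like the sketch below. This is only an outline, not the full procedure from the guide above; the qube name (ml-qube) and the PCI address (0a_00.0) are placeholder examples, and extra steps (IOMMU/VT-d support, hiding the device from dom0, VM memory settings) are covered in the linked document.

```shell
# Run in dom0. List PCI devices to find the GPU's BDF address
# (Qubes writes it with an underscore, e.g. 0a_00.0):
qvm-pci

# Attach the GPU persistently to an HVM qube.
# "ml-qube" and "0a_00.0" are example values; replace with your own.
# permissive=True is frequently required for GPUs.
qvm-pci attach --persistent ml-qube dom0:0a_00.0 -o permissive=True
```

Whether the Nvidia driver then actually initializes the card inside the VM is a separate question, which is exactly where consumer GeForce cards have historically failed.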

-- 
- don't top post
Mailing list etiquette:
- trim quoted reply to only relevant portions
- when possible, copy and paste text instead of screenshots

-- 
You received this message because you are subscribed to the Google Groups 
"qubes-users" group.
To unsubscribe from this group and stop receiving emails from it, send an email 
to [email protected].
To view this discussion on the web visit 
https://groups.google.com/d/msgid/qubes-users/02b0e4bc-ed80-dab3-9a90-752a3a58bee8%40danwin1210.me.