I use an Nvidia Titan X. Normally, when I train my 8-layer network on the
GPU, it utilizes about 99% of the GPU. This time I just added one more layer
to the network, and GPU utilization dropped to nearly 0-1%. I have checked
the code several times; there is nothing extra, I only added one more layer
to my network.

Any advice on how to solve this issue, please?
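
In case it helps, this is roughly the kind of check I have been running to see whether some op falls back to the CPU after compiling; the graph below is only a placeholder, not my actual model:

```python
import numpy
import theano
import theano.tensor as T

# Placeholder graph standing in for the real network; `cost` would be the
# final scalar loss of the 9-layer model.
x = T.matrix('x')
w = theano.shared(0.1 * numpy.random.randn(784, 256).astype('float32'))
cost = T.sum(T.tanh(T.dot(x, w)))

# Compile with profiling so the per-op time breakdown is visible.
f = theano.function([x], cost, profile=True)
f(numpy.random.randn(64, 784).astype('float32'))

# Ops that have no GPU implementation (and the HostFromGpu / GpuFromHost
# transfers they force) show up in this summary.
f.profile.summary()

# Print the optimized graph and look for CPU ops or host<->GPU transfers
# around the newly added layer.
theano.printing.debugprint(f)
```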
