Hello,
I want to use GPUs with Theano, and I read here
(http://deeplearning.net/software/theano/tutorial/examples.html#using-random-numbers)
that I have to import this
from theano.sandbox.rng_mrg import MRG_RandomStreams as RandomStreams
instead of this
from theano.tensor.shared_randomstreams import RandomStreams
However, I have the feeling that both work on the GPU, since I got
random numbers as expected when running the code below on the GPU:
import theano
# from theano.sandbox.rng_mrg import MRG_RandomStreams as RandomStreams
from theano.tensor.shared_randomstreams import RandomStreams

if __name__ == '__main__':
    rng = RandomStreams()

    def get_random():
        out = rng.uniform((2, 2), dtype=theano.config.floatX)
        fonction = theano.function(inputs=[],
                                   outputs=[out],
                                   name='test')
        return fonction

    fonc = get_random()
    for i in range(5):
        print(fonc())
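
One way to check where the sampling actually happens would be to print the
optimized graph of the compiled function with theano.printing.debugprint.
Here is a minimal sketch using the MRG generator (the op names mentioned in
the comments, like GPU_mrg_uniform, are an assumption based on graphs I have
seen, not a documented guarantee):

import theano
from theano.sandbox.rng_mrg import MRG_RandomStreams

srng = MRG_RandomStreams(seed=1234)
out = srng.uniform((2, 2), dtype=theano.config.floatX)
f = theano.function(inputs=[], outputs=[out], name='test_mrg')

# Print the optimized graph of the compiled function. On a GPU run the
# sampling op itself should appear as a GPU op (something like
# GPU_mrg_uniform), whereas the old shared_randomstreams version keeps a
# host-side RandomFunction{uniform} node whose output is then copied to
# the GPU.
theano.printing.debugprint(f)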
Is this normal? I have confirmation that I'm running the code on the GPU:
Using gpu device 0: GeForce GTX TITAN X (CNMeM is enabled with initial size:
90.0% of memory, cuDNN not available)
So why is it working?