[theano-users] Re: How can I implement backpropagation through time in Theano?

2017-06-18 Thread Jesse Livezey
You can use the scan function to create RNN architectures.

http://deeplearning.net/software/theano/library/scan.html
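
Taking T.grad of a cost computed from scan's outputs makes Theano 
differentiate back through every step of the loop, so the unrolling is 
handled for you; that is backpropagation through time. A minimal sketch 
(the names and sizes are illustrative, not taken from the docs):

# Minimal vanilla-RNN sketch with theano.scan; names/sizes are illustrative.
import numpy
import theano
import theano.tensor as T

n_in, n_hid = 3, 4
rng = numpy.random.RandomState(0)
W_in = theano.shared(rng.randn(n_in, n_hid).astype(theano.config.floatX))
W_hid = theano.shared(rng.randn(n_hid, n_hid).astype(theano.config.floatX))

x = T.matrix('x')    # input sequence, shape (time, n_in)
h0 = T.vector('h0')  # initial hidden state, shape (n_hid,)

def step(x_t, h_tm1):
    # one recurrence step: h_t = tanh(x_t . W_in + h_{t-1} . W_hid)
    return T.tanh(T.dot(x_t, W_in) + T.dot(h_tm1, W_hid))

h, updates = theano.scan(step, sequences=x, outputs_info=h0)

cost = h[-1].sum()  # any scalar cost on the final hidden state
# T.grad differentiates back through the whole scan loop, i.e. BPTT
grads = T.grad(cost, [W_in, W_hid])
bptt = theano.function([x, h0], grads, updates=updates)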

On Sunday, June 18, 2017 at 4:13:44 PM UTC-7, Sunjeet Jena wrote:
>
> I am building a multi-layer RNN and need a way to backpropagate through 
> time in Theano. Does Theano automatically know how to unfold the network 
> into a feed-forward network?
>



[theano-users] How can I implement backpropagation through time in Theano?

2017-06-18 Thread Sunjeet Jena
I am building a multi-layer RNN and need a way to backpropagate through 
time in Theano. Does Theano automatically know how to unfold the network 
into a feed-forward network?



[theano-users] 'local_remove_all_assert' flag not working.

2017-06-18 Thread Sunjeet Jena
I am trying to use 'local_remove_all_assert' in THEANO_FLAGS to suppress 
this error: "AssertionError: Scan has returned a list of updates. This 
should not happen! Report this to theano-users (also include the script 
that generated the error)", but I am still getting it. Is this the right 
way to disable the assert?

THEANO_FLAGS="floatX=float32, optimizer_including=local_remove_all_assert" python Deep_RL_4.py
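
Two caveats, both assumptions rather than a confirmed diagnosis: 
THEANO_FLAGS is parsed as a comma-separated list, so the space after the 
comma above may keep the second flag from being applied; and 
local_remove_all_assert only strips Assert ops out of the compiled graph, 
whereas the quoted AssertionError is raised by Theano's own scan code while 
the graph is being built, so the optimization most likely cannot suppress 
it. A sketch of setting the same options from Python, assuming both config 
options are settable at runtime:

# A sketch under assumptions, not a confirmed fix: set the flags before
# compiling any function. Note local_remove_all_assert removes Assert ops
# from the graph; it does not silence Python-level asserts inside Theano.
import theano

theano.config.floatX = 'float32'
theano.config.optimizer_including = 'local_remove_all_assert'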



[theano-users] Re: Theano 0.9.0: GPU is printed, but not used?

2017-06-18 Thread Meier Benjamin
Thanks for the hint :) You are right.

I looked up the code for Theano 0.9 
(link: http://deeplearning.net/software/theano/tutorial/using_gpu.html) and 
used it for another test. Unfortunately, the effect is the same.

Maybe it really works for this example code, but for my application it does 
not seem to: it is as slow with the GPU flag as with the CPU flag. With 
older versions of Theano (and Lasagne) it worked, but I also changed the 
GPU (GTX 780 to Titan X Pascal).
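
One possible explanation for the "Used the cpu" line, offered as an 
assumption rather than a confirmed diagnosis: in the new gpuarray backend, 
GpuElemwise subclasses Elemwise, so the 0.8-era isinstance test in the 
script quoted below can report the CPU even when the exp actually ran on 
the GPU (the toposort output showing GpuElemwise and HostFromGpu suggests 
it did). A sketch of a stricter check that also rejects ops whose class 
name contains 'Gpu':

# A sketch, assuming the gpuarray backend: count only Elemwise ops that are
# not GPU variants; 'f' is a compiled theano.function as in the quoted script.
import numpy
import theano.tensor as T

def used_gpu(f):
    return not numpy.any(
        [isinstance(node.op, T.Elemwise) and 'Gpu' not in type(node.op).__name__
         for node in f.maker.fgraph.toposort()])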

On Saturday, June 17, 2017 at 00:37:29 UTC+2, Daniel Seita wrote:
>
> Not sure if this affects the result, but note that the link you provided is 
> for Theano 0.8.x, not Theano 0.9.0 as your title implies.
>
> On Thursday, June 15, 2017 at 2:45:26 PM UTC-7, Meier Benjamin wrote:
>>
>> Hello,
>>
>> I use the following test program:
>> https://theano.readthedocs.io/en/0.8.x/tutorial/using_gpu.html
>>
>> from theano import function, config, shared, sandbox
>> import theano.tensor as T
>> import numpy
>> import time
>>
>> vlen = 10 * 30 * 768  # 10 x #cores x # threads per core
>> iters = 1000
>>
>> rng = numpy.random.RandomState(22)
>> x = shared(numpy.asarray(rng.rand(vlen), config.floatX))
>> f = function([], T.exp(x))
>> print(f.maker.fgraph.toposort())
>> t0 = time.time()
>> for i in range(iters):
>>     r = f()
>> t1 = time.time()
>> print("Looping %d times took %f seconds" % (iters, t1 - t0))
>> print("Result is %s" % (r,))
>> if numpy.any([isinstance(x.op, T.Elemwise) for x in f.maker.fgraph.toposort()]):
>>     print('Used the cpu')
>> else:
>>     print('Used the gpu')
>>
>> And I get this output:
>>
>> root@21cfc9b009d4:/code/tmp/test# THEANO_FLAGS='floatX=float32,device=cuda0' python gpu_test.py
>> Using cuDNN version 5105 on context None
>> Mapped name None to device cuda0: TITAN X (Pascal) (:87:00.0)
>> [GpuElemwise{exp,no_inplace}(), HostFromGpu(gpuarray)(GpuElemwise{exp,no_inplace}.0)]
>> Looping 1000 times took 0.221684 seconds
>> Result is [ 1.23178029  1.61879349  1.52278066 ...,  2.20771813  2.29967761  1.62323296]
>> Used the cpu
>>
>>
>> For some reason Theano still seems to use the CPU, even though it already 
>> prints the GPU info. Am I doing something wrong?
>>
>> Thank you very much
>>
>>
>>
