[theano-users] Re: Error when try to do w^T*x+b

2017-07-09 Thread Alexander Botev
If you look at the error, the shapes don't match: conv_out is 1x32x16x16 while the bias is 1x1x1x32. I guess you did the dimshuffle for your bias wrong. On Saturday, 8 July 2017 01:53:58 UTC+1, zxzh...@gmail.com wrote: > > conv_out is the output of dnn.dnn_conv. I tried to add the bias to the >
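The shape mismatch above can be reproduced with plain numpy (used here as a stand-in for Theano broadcasting, which follows the same rules). A bias that ended up shaped (1, 1, 1, 32) cannot broadcast against a conv output of (1, 32, 16, 16); moving the channel axis to position 1 (what `b.dimshuffle('x', 0, 'x', 'x')` does to a 1-D bias in Theano) fixes it. Shapes are the ones quoted in the thread; the array contents are illustrative.

```python
import numpy as np

# Shapes from the thread: conv output (1, 32, 16, 16), bias wrongly (1, 1, 1, 32).
conv_out = np.zeros((1, 32, 16, 16), dtype=np.float32)
bias = np.arange(32, dtype=np.float32).reshape(1, 1, 1, 32)

# Broadcasting (1, 1, 1, 32) against (1, 32, 16, 16) fails on the last axis:
try:
    _ = conv_out + bias
except ValueError:
    print("shape mismatch")

# Put the 32 channels on axis 1 so the bias broadcasts per channel,
# mirroring dimshuffle('x', 0, 'x', 'x') on a 1-D bias vector:
bias_ok = bias.transpose(0, 3, 1, 2)   # shape (1, 32, 1, 1)
out = conv_out + bias_ok               # shape (1, 32, 16, 16)
print(out.shape)
```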

[theano-users] Scan checkpointing - what exactly theano stores?

2017-07-21 Thread Alexander Botev
So the scan checkpointing seems very interesting from the perspective that it can be used for things like learning-to-learn. However, my question is: can we tell Theano which parts of each N-th iteration to store and which not? For instance in the learning-to-learn framework where we unroll
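For readers unfamiliar with what checkpointing stores, here is a conceptual sketch in plain Python (not Theano's actual scan implementation): only every N-th intermediate state is kept, and states in between are recomputed from the nearest stored checkpoint when needed. The recurrence `step` and the interval `save_every` are illustrative stand-ins.

```python
def step(x):
    # Toy recurrence standing in for one scan iteration.
    return 2 * x + 1

def forward_with_checkpoints(x0, n_steps, save_every):
    """Run the recurrence, keeping only every `save_every`-th state."""
    checkpoints = {0: x0}
    x = x0
    for i in range(1, n_steps + 1):
        x = step(x)
        if i % save_every == 0:
            checkpoints[i] = x
    return x, checkpoints

def recompute_state(checkpoints, i, save_every):
    """Recover state i by replaying from the nearest earlier checkpoint."""
    base = (i // save_every) * save_every
    x = checkpoints[base]
    for _ in range(i - base):
        x = step(x)
    return x

final, cps = forward_with_checkpoints(1, 10, save_every=5)
# State 7 is not stored, but is recoverable from the checkpoint at step 5:
assert recompute_state(cps, 7, 5) == forward_with_checkpoints(1, 7, 5)[0]
```

This is the usual memory/compute trade-off: storage drops from O(n_steps) to O(n_steps / save_every) checkpoints at the cost of recomputing at most save_every - 1 steps per lookup.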

[theano-users] Re: Theano accept data on GPU?

2017-05-09 Thread Alexander Botev
of > theano.function). Then, feed pygpu.gpuarray.GpuArray object directly to > the compiled function. pygpu.gpuarray.asarray can be used to move numpy > array to GPU. > > On Tuesday, May 9, 2017 at 5:01:42 PM UTC+8, Alexander Botev wrote: >> >> Actually one thing I've ju

[theano-users] Theano accept data on GPU?

2017-05-09 Thread Alexander Botev
So recently I was wondering if there is any way that, after compiling a Theano function, rather than taking numpy arrays / native lists / native numbers it could accept as input something like a libgpuarray or anything else that lives on the GPU. However, I know that in the computation graph

[theano-users] Difference between single and multiple Random Streams.

2017-05-19 Thread Alexander Botev
So I was wondering if there is any significant difference between having a single MRG_RandomStreams or several of them? In particular, I'm used to having one single stream, such that I can easily seed it and reproduce, top to bottom, the exact behaviour of previous iterations. However, I was just
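The single-stream argument can be illustrated with numpy's RandomState as a stand-in for MRG_RandomStreams (the analogy, not the Theano API itself; names and seeds here are illustrative): re-seeding one shared stream replays every draw in order, whereas with several independent streams you would have to record and restore each one's state separately.

```python
import numpy as np

def run(seed):
    # One stream for everything, as in the single-MRG_RandomStreams setup.
    rng = np.random.RandomState(seed)
    dropout_mask = rng.uniform(size=3)   # first consumer of the stream
    noise = rng.normal(size=3)           # second consumer of the stream
    return dropout_mask, noise

a = run(42)
b = run(42)  # re-seeding the single stream reproduces every draw, top to bottom
assert all((x == y).all() for x, y in zip(a, b))
```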

[theano-users] Theano same computation not optimized?

2017-05-20 Thread Alexander Botev
I have the following code:

>>> a = T.fmatrix()
>>> b = T.sqr(a)
>>> c = T.nnet.sigmoid(a)
>>> g = T.fmatrix()
>>> d = T.Lop(c, a, g)
>>> f = theano.function([a, g], d)

Using debugprint I get:

>>> theano.printing.debugprint(f)
Elemwise{mul} [id A] ''   5
 |Elemwise{mul} [id B] ''   3
 |

[theano-users] Re: Theano same computation not optimized?

2017-05-21 Thread Alexander Botev
, Adam Becker wrote: > > > when it can just reuse that computation > > That's what optimization does. Try running it with device=cpu and > optimizer=fast_run > > On Saturday, May 20, 2017 at 11:55:19 PM UTC+8, Alexander Botev wrote: >> >> I have the fol

[theano-users] Re: installation

2017-05-31 Thread Alexander Botev
I think pip is best within Anaconda if you want bleeding-edge Theano. Otherwise just do conda install theano pygpu. On Monday, 29 May 2017 14:44:24 UTC+1, Chuck Anderson wrote: > > What is the best way to install theano within an anaconda distribution of > python 3.6 on linux? > > Thank
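For reference, the two install routes mentioned above, written out as commands (package availability and the exact GitHub URL reflect the Theano-era setup and may differ on current systems):

```shell
# Option 1: conda packages (Theano plus the new GPU backend, pygpu)
conda install theano pygpu

# Option 2: bleeding-edge Theano from GitHub via pip (assumes git is installed)
pip install git+https://github.com/Theano/Theano.git
```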

[theano-users] Suggestions for Theano logo/motto

2017-06-05 Thread Alexander Botev
Hi to all Theano users. So we had this discussion on GitHub about Theano T-shirts (https://github.com/Theano/Theano/issues/5984). We would welcome any interesting suggestions you can think of for Theano!