[theano-users] Re: Error when try to do w^T*x+b

2017-07-09 Thread Alexander Botev
If you look at the error, the shapes don't match: conv_out is 1x32x16x16 
while the bias is 1x1x1x32. 
I guess you did the dimshuffle on your bias wrong.
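To illustrate, here is a minimal pure-Python sketch of the NumPy/Theano broadcasting rule (not Theano itself): two equal-rank shapes are compatible axis by axis only where the sizes match or one of them is 1. With conv_out in NCHW layout, a (32,) bias should be dimshuffled to ('x', 0, 'x', 'x') so it broadcasts over the channel axis, not to ('x', 'x', 'x', 0), which puts it on the width axis.

```python
def broadcast_shapes(s1, s2):
    """NumPy-style broadcasting of two equal-rank shapes.

    Axis sizes are compatible when they are equal or one of them is 1;
    otherwise we raise the same kind of mismatch Theano reports.
    """
    out = []
    for a, b in zip(reversed(s1), reversed(s2)):
        if a == b or a == 1 or b == 1:
            out.append(max(a, b))
        else:
            raise ValueError("shape mismatch on axis: %d vs %d" % (a, b))
    return tuple(reversed(out))

conv_out = (1, 32, 16, 16)  # shapes from the traceback
# bias.dimshuffle('x', 0, 'x', 'x') -> (1, 32, 1, 1): broadcasts cleanly
print(broadcast_shapes(conv_out, (1, 32, 1, 1)))   # (1, 32, 16, 16)
# bias.dimshuffle('x', 'x', 'x', 0) -> (1, 1, 1, 32): 16 vs 32 mismatch
# broadcast_shapes(conv_out, (1, 1, 1, 32))        # raises ValueError
```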

On Saturday, 8 July 2017 01:53:58 UTC+1, zxzh...@gmail.com wrote:
>
> conv_out is the output of dnn.dnn_conv. I tried to add the bias to the 
> w^T*x. But it reports me an error:
>
>
>
> Running network...
> Traceback (most recent call last):
>
>   File "", line 1, in 
> 
> runfile('/space/xzhang/git_cnn_conversion/MyLasagneCode_CIFAR10/test_convnet_binary_bias.py',
>  
> wdir='/space/xzhang/git_cnn_conversion/MyLasagneCode_CIFAR10')
>
>   File 
> "/space/xzhang/anaconda2/lib/python2.7/site-packages/spyder/utils/site/sitecustomize.py",
>  
> line 866, in runfile
> execfile(filename, namespace)
>
>   File 
> "/space/xzhang/anaconda2/lib/python2.7/site-packages/spyder/utils/site/sitecustomize.py",
>  
> line 94, in execfile
> builtins.execfile(filename, *where)
>
>   File 
> "/space/xzhang/git_cnn_conversion/MyLasagneCode_CIFAR10/test_convnet_binary_bias.py",
>  
> line 161, in 
> main(**kargs)
>
>   File 
> "/space/xzhang/git_cnn_conversion/MyLasagneCode_CIFAR10/test_convnet_binary_bias.py",
>  
> line 107, in main
> dt=dt, max_rate=1000, proc_fn=get_output,  reset_fn=final_dense)
>
>   File "spike_tester_theano.py", line 128, in run_tester
> out_mem, t, Ntransmittedspikes, conv1_spikes, conv2_spikes, 
> conv3_spikes = proc_fn(inp_images.astype('float32'), float(t))
>
>   File 
> "/space/xzhang/anaconda2/lib/python2.7/site-packages/theano/compile/function_module.py",
>  
> line 898, in __call__
> storage_map=getattr(self.fn, 'storage_map', None))
>
>   File 
> "/space/xzhang/anaconda2/lib/python2.7/site-packages/theano/gof/link.py", 
> line 325, in raise_with_op
> reraise(exc_type, exc_value, exc_trace)
>
>   File 
> "/space/xzhang/anaconda2/lib/python2.7/site-packages/theano/compile/function_module.py",
>  
> line 884, in __call__
> self.fn() if output_subset is None else\
>
>
> ValueError: GpuElemwise. Input dimension mis-match. Input 1 (indices start 
> at 0) has shape[3] == 32, but the output's size on that axis is 16.
> Apply node that caused the error: GpuElemwise{Add}[(0, 
> 0)](GpuSubtensor{::, ::, int64:int64:, int64:int64:}.0, 
> InplaceGpuDimShuffle{x,x,x,0}.0)
> Toposort index: 250
> Inputs types: [GpuArrayType(float32, 4D), 
> GpuArrayType(float32, (True, True, True, False))]
> Inputs shapes: [(1, 32, 16, 16), (1, 1, 1, 32)]
> Inputs strides: [(51200, 1600, 80, 4), (128, 128, 128, 4)]
> Inputs values: ['not shown', 'not shown']
> Outputs clients: [[HostFromGpu(gpuarray)(GpuElemwise{Add}[(0, 
> 0)].0)]]
>
> HINT: Re-running with most Theano optimization disabled could give you a 
> back-trace of when this node was created. This can be done with by setting 
> the Theano flag 'optimizer=fast_compile'. If that does not work, Theano 
> optimizations can be disabled with 'optimizer=None'.
> HINT: Use the Theano flag 'exception_verbosity=high' for a debugprint and 
> storage map footprint of this apply node.
>

-- 

--- 
You received this message because you are subscribed to the Google Groups 
"theano-users" group.
To unsubscribe from this group and stop receiving emails from it, send an email 
to theano-users+unsubscr...@googlegroups.com.
For more options, visit https://groups.google.com/d/optout.


[theano-users] Scan checkpointing - what exactly theano stores?

2017-07-21 Thread Alexander Botev
So scan checkpointing seems very interesting from the perspective that 
it can be used for things like learning-to-learn.
However, my question is: can we tell Theano which parts of each N-th 
iteration to store and which not? For instance, in the learning-to-learn 
framework where we unroll SGD,
the optimal thing would be to store only the "updated" parameters that get 
passed to the next time step, rather than the whole computation. Is it 
possible to achieve something like that? 
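For intuition, here is a toy sketch of the store-only-the-carried-state idea (plain Python, not Theano's scan checkpointing API; the scalar SGD objective is made up): checkpoint just the parameter that is passed between iterations every N steps, and recompute intermediate steps from the nearest checkpoint when they are needed.

```python
def sgd_step(w):
    # one unrolled SGD step on f(w) = 0.5 * w**2 (so grad = w), lr = 0.1
    return w - 0.1 * w

def run_with_checkpoints(w0, steps, every):
    """Run `steps` steps, storing only the carried parameter every `every` steps."""
    checkpoints = {0: w0}
    w = w0
    for t in range(1, steps + 1):
        w = sgd_step(w)
        if t % every == 0:
            checkpoints[t] = w
    return w, checkpoints

def recompute(checkpoints, t, every):
    """Recover the state at any step t by re-running from the nearest checkpoint."""
    base = (t // every) * every
    w = checkpoints[base]
    for _ in range(t - base):
        w = sgd_step(w)
    return w

w_final, cps = run_with_checkpoints(1.0, steps=10, every=5)
print(abs(recompute(cps, 7, every=5) - 0.9 ** 7) < 1e-12)  # True
```

The memory saving is that only the carried parameters live in `checkpoints`; everything else is rebuilt on demand, trading extra forward computation for storage.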



[theano-users] Re: Theano accept data on GPU?

2017-05-09 Thread Alexander Botev
That does not seem to work. So I have this:

a = T.fmatrix()
ctx = pygpu.init(theano.config.device)
theano.gpuarray.reg_context("mine", ctx)
a_gpu = theano.gpuarray.GpuArrayType(a.dtype, a.broadcastable, "mine")
f2 = theano.function([a_gpu], a + T.constant(2), givens={a: a_gpu})
return f1, f2


However, Theano complains about:

TypeError: Unknown parameter type: 

If, instead of [a_gpu], I have [a], it complains that the givens parameter is 
overwriting an input:

RuntimeError: You are trying to replace variable '<TensorType(float32, 
matrix)>' through the `givens` parameter, but this variable is an input to 
your function. Replacing inputs is currently forbidden because it has no 
effect. One way to modify an input `x` to a function evaluating f(x) is to 
define a new input `y` and use `theano.function([y], f(x), givens={x: 
g(y)})`. Another solution consists in using `theano.clone`, e.g. like this: 
`theano.function([x], theano.clone(f(x), replace={x: g(x)}))`.


On Tuesday, 9 May 2017 15:19:10 UTC+1, Adam Becker wrote:
>
> In the main graph, replace the input variables with type: 
> theano.gpuarray.GpuArrayType (Can be done using givens parameter of 
> theano.function). Then, feed pygpu.gpuarray.GpuArray object directly to 
> the compiled function. pygpu.gpuarray.asarray can be used to move numpy 
> array to GPU.
>
> On Tuesday, May 9, 2017 at 5:01:42 PM UTC+8, Alexander Botev wrote:
>>
>> Actually one thing I've just realized is that to do this consistently I 
>> need to have access to the underlying Theano pygpu Context. Is there anyway 
>> to get that?
>>
>> On Tuesday, 9 May 2017 09:53:02 UTC+1, Alexander Botev wrote:
>>>
>>> So recently I was wondering if there is any way that after compiling a 
>>> theano function, rather than taking numpy arrays / native lists / native 
>>> numbers it can accept as an input something like a libgpuarray or anything 
>>> else that lives on the GPU. However, I know that in the computation graph 
>>> usually when you compile it there is a Transfer Op if it is on the GPU. Is 
>>> there a way to avoid that transfer?
>>>
>>>
>>>



[theano-users] Theano accept data on GPU?

2017-05-09 Thread Alexander Botev
So recently I was wondering if there is any way that, after compiling a 
Theano function, it can accept as input something like a libgpuarray or 
anything else that lives on the GPU, rather than numpy arrays / native lists / 
native numbers. However, I know that when you compile the computation graph 
for the GPU there is usually a Transfer Op. Is 
there a way to avoid that transfer?




[theano-users] Difference between single and multiple Random Streams.

2017-05-19 Thread Alexander Botev
So I was wondering whether there is any significant difference between having 
a single MRG_RandomStreams or several of them.
Particularly, I'm used to having one single stream, so that I can easily 
set its seed and recover, top to bottom, the exact behaviour of previous 
iterations.
However, I was wondering whether, when I need to sample different values 
several times, this will incur any kind of overhead, or whether it does not 
really matter.
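As a stdlib analogy (Python's `random.Random`, not MRG_RandomStreams, so the overhead question is untouched): a single seeded stream replays an entire run exactly once reseeded, while several independently seeded streams are each reproducible on their own, regardless of how the other streams are sampled in between.

```python
import random

# One seeded stream: resetting the seed replays the whole run exactly.
single = random.Random(42)
run1 = [single.random() for _ in range(4)]
single.seed(42)
run2 = [single.random() for _ in range(4)]
print(run1 == run2)  # True

# Several streams: each is reproducible on its own, so draws from one
# stream are unaffected by how often the others are sampled in between.
s1, s2 = random.Random(1), random.Random(2)
a = s1.random()
_ = [s2.random() for _ in range(100)]  # extra draws on the other stream
s1.seed(1)
print(s1.random() == a)  # True
```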



[theano-users] Theano same computation not optimized?

2017-05-20 Thread Alexander Botev
I have the following code:

>>> a = T.fmatrix()
>>> b = T.sqr(a)
>>> c = T.nnet.sigmoid(a)
>>> g = T.fmatrix()
>>> d = T.Lop(c, a, g)
>>> f = theano.function([a, g], d)

Using debug print I get:

>>> theano.printing.debugprint(f)
Elemwise{mul} [id A] ''   5
 |Elemwise{mul} [id B] ''   3
 | |<TensorType(float32, matrix)> [id C]
 | |Elemwise{scalar_sigmoid} [id D] ''   1
 |   |<TensorType(float32, matrix)> [id E]
 |Elemwise{sub} [id F] ''   4
   |InplaceDimShuffle{x,x} [id G] ''   2
   | |TensorConstant{1.0} [id H]
   |Elemwise{scalar_sigmoid} [id I] ''   0
     |<TensorType(float32, matrix)> [id E]

My question is: why does it compute the sigmoid 2 times, when it can just 
reuse that computation? Or, if it does reuse it, how can I notice that on the 
graph? I have not switched off any of the optimisations.
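For intuition, here is a toy sketch (plain Python, not Theano internals, all names made up) of what a graph merge optimization does: structurally identical sub-expressions are detected, here via a cache keyed on the expression tuple, and evaluated only once. The expression is the Lop above, g * sigmoid(x) * (1 - sigmoid(x)), in which sigmoid(x) appears twice.

```python
import math

calls = {'sigmoid': 0}

def sigmoid(a):
    calls['sigmoid'] += 1  # count how often the op actually runs
    return 1.0 / (1.0 + math.exp(-a))

OPS = {
    'mul': lambda a, b: a * b,
    'sub': lambda a, b: a - b,
    'sigmoid': sigmoid,
}

def evaluate(expr, env, cache=None):
    """expr is a number, a variable name, or a nested tuple ('op', arg, ...)."""
    if cache is None:
        cache = {}
    if isinstance(expr, (int, float)):
        return expr
    if isinstance(expr, str):
        return env[expr]
    if expr in cache:  # structurally identical node: reuse its value
        return cache[expr]
    val = OPS[expr[0]](*(evaluate(a, env, cache) for a in expr[1:]))
    cache[expr] = val
    return val

# g * sigmoid(x) * (1 - sigmoid(x)); ('sigmoid', 'x') occurs twice
lop = ('mul', ('mul', 'g', ('sigmoid', 'x')),
              ('sub', 1.0, ('sigmoid', 'x')))
val = evaluate(lop, {'x': 0.0, 'g': 2.0})
print(val, calls['sigmoid'])  # 0.5 1 -- sigmoid ran once, not twice
```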



[theano-users] Re: Theano same computation not optimized?

2017-05-21 Thread Alexander Botev
Is it possible to see the optimized graph then, or to somehow get reused 
nodes identified as such on the graph? If I make a picture with 
pydotprint there are still 2 separate nodes with the sigmoid, while I want 
a graph where there is only one.

On Sunday, 21 May 2017 00:59:34 UTC+1, Adam Becker wrote:
>
> > when it can just reuse that computation
>
> That's what optimization does. Try running it with device=cpu and 
> optimizer=fast_run
>



[theano-users] Re: installation

2017-05-31 Thread Alexander Botev
I think pip is best within anaconda if you want bleeding-edge Theano. 
Otherwise just do 

conda install theano pygpu

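For the bleeding-edge route, the command the Theano docs of that era suggested was along these lines (the exact flags are from memory, so treat this as a sketch rather than the canonical incantation):

```shell
# Install the development version of Theano straight from GitHub,
# leaving the conda-managed dependencies (numpy, scipy, pygpu) untouched.
pip install --upgrade --no-deps git+https://github.com/Theano/Theano.git
```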

On Monday, 29 May 2017 14:44:24 UTC+1, Chuck Anderson wrote:
>
> What is the best way to install theano within an anaconda distribution of 
> python 3.6 on linux?
>
> Thank you.
>



[theano-users] Suggestions for Theano logo/motto

2017-06-05 Thread Alexander Botev
Hi to all Theano users.
So we had this discussion on GitHub about Theano T-shirts 
(https://github.com/Theano/Theano/issues/5984).

We would welcome any interesting suggestions you can think of for Theano!
