[theano-users] Re: Non-linear convolution

2016-07-25 Thread Jesse Livezey
You could compute the L2 norm using the regular convolution by using the identity
||w - x||^2 = ||w||^2 - 2 (w dot x) + ||x||^2

and using
http://deeplearning.net/software/theano/library/tensor/nnet/neighbours.html#theano.tensor.nnet.neighbours.images2neibs
to help with the ||x||^2 term,
and the regular convolution to compute the
(w dot x)
term.

Other nonlinearities might be more tricky.
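
For concreteness, here is a minimal sketch of that identity in Theano. The
names, the filter shape (4 filters of size 3x3), and the single-channel
assumption are illustrative, not from the original post:

import numpy as np
import theano
import theano.tensor as T
from theano.tensor.nnet import conv2d
from theano.tensor.nnet.neighbours import images2neibs

x = T.tensor4('x')  # input, assumed shape (batch, 1, rows, cols)
w = theano.shared(np.random.randn(4, 1, 3, 3).astype('float32'), name='w')
b = theano.shared(np.zeros(4, dtype='float32'), name='b')

# (w dot x) for every receptive field: one ordinary valid convolution.
# filter_flip=False gives cross-correlation, so w lines up with each patch.
wx = conv2d(x, w, border_mode='valid', filter_flip=False)

# ||w||^2 per filter: a constant added at every output location.
w_sq = T.sum(w ** 2, axis=(1, 2, 3)).dimshuffle('x', 0, 'x', 'x')

# ||x||^2 per receptive field: extract every 3x3 patch and sum its squares.
out_r, out_c = x.shape[2] - 2, x.shape[3] - 2
neibs = images2neibs(x ** 2, neib_shape=(3, 3), neib_step=(1, 1))
x_sq = T.sum(neibs, axis=1).reshape((x.shape[0], out_r, out_c))
x_sq = x_sq.dimshuffle(0, 'x', 1, 2)  # broadcastable over the filter axis

# ||w - x||^2 + b for every patch and every filter.
dist = w_sq - 2 * wx + x_sq + b.dimshuffle('x', 0, 'x', 'x')
f = theano.function([x], dist)

Note that the gradient with respect to w only involves the conv2d and ||w||^2
terms, so the filters can be trained just like in a regular convolution layer.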

On Monday, July 25, 2016 at 12:08:56 PM UTC-7, Geppetto Null wrote:
>
> Hi everyone,
>
> I would like to modify the 2D convolution in order to introduce some 
> non-linear operation. For instance, instead of performing w^T x + b = 
> dot(w, x) + b in each receptive field (i.e., each patch of the input, as is 
> done by default), I would like to perform the operation ||w - x||^2 + b in 
> each receptive field, or some other non-linear operation (some other norm, 
> for example L1). Is this doable in Theano/Lasagne?
>
> Thank you very much in advance.
> Best,
> Christos
>



Re: [theano-users] diff is not differentiable

2016-07-25 Thread Frédéric Bastien
You will need to modify the grad method (see
http://deeplearning.net/software/theano/extending/extending_theano.html#extending-theano)
of the DiffOp class in the file theano/tensor/extra_ops.py.

It currently works only when the input has ndim == 1. Mostly, you need to
extend it to also handle ndim == 2.

Try not to use scan; it would slow things down.

Fred
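
Until DiffOp.grad supports ndim > 1, a user-side workaround (a minimal sketch,
not the DiffOp fix described above) is to write the first difference with
plain slicing, which Theano can already differentiate without scan; axis=0 is
shown, other axes work the same way with the corresponding slices:

import numpy as np
import theano
import theano.tensor as T

x = T.matrix('x')
y = x[1:] - x[:-1]     # same values as theano.tensor.extra_ops.diff(x, axis=0)
cost = y.norm(2)
g = T.grad(cost, x)    # works: slicing and subtraction are differentiable
f = theano.function([x], g)
print(f(np.arange(24).reshape(8, 3).astype('float32')))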

On Mon, Jul 25, 2016 at 4:31 PM, Aditya Gudimella <
aditya.gudime...@gmail.com> wrote:

> I would love to implement my case, but I don't know exactly which
> functions to add. I can think of a way in which diff could be converted to a
> scan op, but I don't know if that would help with grad or just slow things down.
>
> On Jul 21, 2016, at 2:09 PM, Frédéric Bastien wrote:
>
> We haven't implemented the grad for the diff op if the input isn't a
> vector.
>
> Do you want to implement your cases?
>
> Fred
>
> On Thu, Jul 21, 2016 at 11:54 AM, Aditya Gudimella <
> aditya.gudime...@gmail.com> wrote:
>
>> a = shared(np.arange(3*8).reshape(8,3).astype('float32'))
>> b = diff(a, axis=0)
>> T.grad(b.norm(2), a)
>>
>>
>> This gives me an error. It's unable to find the gradient of the diff
>> operator if it has been used on a 2-dimensional tensor. Is this how it was
>> meant to be, or is this a bug?
>>



Re: [theano-users] ERROR (theano.gof.opt): Optimization failure due to: constant_folding

2016-07-25 Thread Frédéric Bastien
We have also observed some slowness on Windows compared to Linux when compiling
C code, but not after that. Do you still see a slowdown after the compilation
of the C code is done? The compilation is cached, so if you launch the same job
multiple times, the second and following runs won't compile C code.

There were changes related to your error in the master of Theano. Updating
Theano to it will probably fix this:

http://www.deeplearning.net/software/theano/install.html#bleeding-edge-install-instructions

Fred
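
For context, the final TypeError in the quoted traceback below is a Python 3
bytes-vs-str issue in the error-handling path (err presumably holds the
compiler's raw byte output). A generic illustration of the problem and the
usual kind of fix, not the actual patch in Theano's master:

err = b"gcc wrote something here"        # bytes, as returned by the compiler call
e = PermissionError("[WinError 32] file in use")
try:
    err += "\n" + str(e)                 # raises TypeError: mixing bytes and str
except TypeError:
    err = err.decode() + "\n" + str(e)   # decode the bytes before concatenating
print(err)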

On Mon, Jul 25, 2016 at 9:07 AM, Feras Almasri  wrote:

> I'm getting this warning using both CPU and GPU on Windows; besides this,
> Theano is slower than when running in a Linux virtual machine.
>
> hidden layer number is 10
> ERROR (theano.gof.opt): Optimization failure due to: constant_folding
> ERROR (theano.gof.opt): node: ScalarFromTensor(TensorConstant{15})
> ERROR (theano.gof.opt): TRACEBACK:
> ERROR (theano.gof.opt): Traceback (most recent call last):
>   File
> "C:\SciSoft\WinPython-64bit-3.4.4.3Qt5\python-3.4.4.amd64\lib\site-packages\theano\gof\cmodule.py",
> line 1767, in _try_compile_tmp
> os.remove(exe_path + ".exe")
> PermissionError: [WinError 32] The process cannot access the file because
> it is being used by another process:
> 'C:\\Users\\ADMINI~1\\AppData\\Local\\Temp\\try_march_5ym6a6ns.exe'
>
> During handling of the above exception, another exception occurred:
>
> Traceback (most recent call last):
>   File
> "C:\SciSoft\WinPython-64bit-3.4.4.3Qt5\python-3.4.4.amd64\lib\site-packages\theano\gof\opt.py",
> line 1820, in process_node
> replacements = lopt.transform(node)
>   File
> "C:\SciSoft\WinPython-64bit-3.4.4.3Qt5\python-3.4.4.amd64\lib\site-packages\theano\tensor\opt.py",
> line 6099, in constant_folding
> no_recycling=[])
>   File
> "C:\SciSoft\WinPython-64bit-3.4.4.3Qt5\python-3.4.4.amd64\lib\site-packages\theano\gof\op.py",
> line 975, in make_thunk
> no_recycling)
>   File
> "C:\SciSoft\WinPython-64bit-3.4.4.3Qt5\python-3.4.4.amd64\lib\site-packages\theano\gof\op.py",
> line 875, in make_c_thunk
> output_storage=node_output_storage)
>   File
> "C:\SciSoft\WinPython-64bit-3.4.4.3Qt5\python-3.4.4.amd64\lib\site-packages\theano\gof\cc.py",
> line 1189, in make_thunk
> keep_lock=keep_lock)
>   File
> "C:\SciSoft\WinPython-64bit-3.4.4.3Qt5\python-3.4.4.amd64\lib\site-packages\theano\gof\cc.py",
> line 1130, in __compile__
> keep_lock=keep_lock)
>   File
> "C:\SciSoft\WinPython-64bit-3.4.4.3Qt5\python-3.4.4.amd64\lib\site-packages\theano\gof\cc.py",
> line 1577, in cthunk_factory
> key = self.cmodule_key()
>   File
> "C:\SciSoft\WinPython-64bit-3.4.4.3Qt5\python-3.4.4.amd64\lib\site-packages\theano\gof\cc.py",
> line 1267, in cmodule_key
> compile_args=self.compile_args(),
>   File
> "C:\SciSoft\WinPython-64bit-3.4.4.3Qt5\python-3.4.4.amd64\lib\site-packages\theano\gof\cc.py",
> line 950, in compile_args
> ret += c_compiler.compile_args()
>   File
> "C:\SciSoft\WinPython-64bit-3.4.4.3Qt5\python-3.4.4.amd64\lib\site-packages\theano\gof\cmodule.py",
> line 2066, in compile_args
> default_compilation_result, default_execution_result =
> try_march_flag(GCC_compiler.march_flags)
>   File
> "C:\SciSoft\WinPython-64bit-3.4.4.3Qt5\python-3.4.4.amd64\lib\site-packages\theano\gof\cmodule.py",
> line 1834, in try_march_flag
> flags=cflags, try_run=True)
>   File
> "C:\SciSoft\WinPython-64bit-3.4.4.3Qt5\python-3.4.4.amd64\lib\site-packages\theano\gof\cmodule.py",
> line 2166, in try_compile_tmp
> theano.config.cxx)
>   File
> "C:\SciSoft\WinPython-64bit-3.4.4.3Qt5\python-3.4.4.amd64\lib\site-packages\theano\gof\cmodule.py",
> line 1772, in _try_compile_tmp
> err += "\n" + str(e)
> TypeError: can't concat bytes to str
>
