[theano-users] Error after sudo apt-get upgrade

2017-05-18 Thread Xu Zhang
My Theano and Lasagne setup worked fine yesterday. Today I ran sudo 
apt-get upgrade and installed some other dependencies for VLC, and now Theano 
no longer works. The error below appears when I run my code, and the screen keeps 
printing similar errors without stopping.

my nvidia-smi info:

+-----------------------------------------------------------------------------+
| NVIDIA-SMI 361.93.02              Driver Version: 361.93.02                 |
|-------------------------------+----------------------+----------------------|
| GPU  Name        Persistence-M| Bus-Id        Disp.A | Volatile Uncorr. ECC |
| Fan  Temp  Perf  Pwr:Usage/Cap|         Memory-Usage | GPU-Util  Compute M. |
|===============================+======================+======================|
|   0  Tesla K40m          On   | 0000:81:00.0     Off |                    0 |
| N/A   37C    P8    20W / 235W |      0MiB / 11441MiB |      0%      Default |
+-------------------------------+----------------------+----------------------+

+-----------------------------------------------------------------------------+
| Processes:                                                       GPU Memory |
|  GPU       PID  Type  Process name                               Usage      |
|=============================================================================|
|  No running processes found                                                 |
+-----------------------------------------------------------------------------+
cuda version:

nvcc: NVIDIA (R) Cuda compiler driver
Copyright (c) 2005-2016 NVIDIA Corporation
Built on Tue_Jan_10_13:22:03_CST_2017
Cuda compilation tools, release 8.0, V8.0.61

Ubuntu version:

Distributor ID: Ubuntu
Description:    Ubuntu 14.04.5 LTS
Release:        14.04
Codename:       trusty



WARNING (theano.sandbox.cuda): The cuda backend is deprecated and will be 
removed in the next release (v0.10).  Please switch to the gpuarray 
backend. You can get more information about how to switch at this URL:
 
https://github.com/Theano/Theano/wiki/Converting-to-the-new-gpu-back-end%28gpuarray%29
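For context, the switch the warning asks for is mainly a flags change (a sketch, assuming Theano 0.9+ with libgpuarray/pygpu installed; `train.py` is a placeholder for your own script): the deprecated CUDA backend is selected with device=gpu, the new gpuarray backend with device=cuda.

```shell
# Deprecated CUDA backend (produces the warning above):
THEANO_FLAGS=device=gpu,floatX=float32 python train.py

# New gpuarray backend (assumes pygpu/libgpuarray are installed):
THEANO_FLAGS=device=cuda,floatX=float32 python train.py
```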

#include <Python.h>
#include <iostream>
#include "theano_mod_helper.h"
#include "cudnn.h"
//////////////////////
////  Support Code
//////////////////////

#if PY_MAJOR_VERSION >= 3
#define PyInt_FromLong PyLong_FromLong
#endif


namespace {
struct __struct_compiled_op_3442132e74e76f61361fb1c73112eef4 {
PyObject* __ERROR;

PyObject* storage_V1;


__struct_compiled_op_3442132e74e76f61361fb1c73112eef4() {
// This is only somewhat safe because we:
//  1) Are not a virtual class
//  2) Do not use any virtual classes in the members
//  3) Deal with mostly POD and pointers

// If this changes, we would have to revise this, but for
// now I am tired of chasing segfaults because
// initialization code had an error and some pointer has
// a junk value.
memset(this, 0, sizeof(*this));
}
~__struct_compiled_op_3442132e74e76f61361fb1c73112eef4(void) {
cleanup();
}

int init(PyObject* __ERROR, PyObject* storage_V1) {
Py_XINCREF(storage_V1);
this->storage_V1 = storage_V1;


this->__ERROR = __ERROR;
return 0;
}
void cleanup(void) {
__label_1:

double __DUMMY_1;
__label_4:

double __DUMMY_4;

Py_XDECREF(this->storage_V1);
}
int run(void) {
int __failure = 0;

PyObject* py_V1;

PyObject* V1;

{

py_V1 = Py_None;
{Py_XINCREF(py_V1);}

V1 = NULL;

{
// Op class DnnVersion

#if defined(CUDNN_VERSION)
V1 = PyTuple_Pack(2, PyInt_FromLong(CUDNN_VERSION), PyInt_FromLong(cudnnGetVersion()));
#else
V1 = PyInt_FromLong(-1);
#endif
__label_3:

double __DUMMY_3;

}
__label_2:

if (!__failure) {

assert(py_V1->ob_refcnt > 1);
Py_DECREF(py_V1);
py_V1 = V1 ? V1 : Py_None;
Py_INCREF(py_V1);

  PyObject* old = PyList_GET_ITEM(storage_V1, 0);
  {Py_XINCREF(py_V1);}
  PyList_SET_ITEM(storage_V1, 0, py_V1);
  {Py_XDECREF(old);}
}

Py_XDECREF(V1);

{Py_XDECREF(py_V1);}

double __DUMMY_2;

}


if (__failure) {
// When there is a failure, this code puts the exception
// in __ERROR.
PyObject* err_type = NULL;
PyObject* err_msg = NULL;
PyObject*

[theano-users] Re: tensor grad with respect to shared variable

2017-05-18 Thread Jesse Livezey
t will have dtype "int64" because 5 is a Python integer, and Theano treats 
the gradient with respect to an integer variable as zero. If you do
t = theano.shared(5., 't')  # added period after 5
it will be float64 and the gradient should work.
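The dtype inference itself can be checked without Theano: theano.shared follows NumPy's conversion rules for Python scalars, which this short NumPy sketch illustrates (the variable names are illustrative only).

```python
import numpy as np

# theano.shared(value) infers its dtype from the value it wraps,
# following NumPy's rules for Python scalars -- shown here with
# NumPy alone, no Theano required.
int_like = np.asarray(5)     # Python int -> integer dtype (zero gradient in Theano)
float_like = np.asarray(5.)  # trailing period -> float64 (gradients behave normally)

print(int_like.dtype.kind)   # "i": signed integer
print(float_like.dtype)      # float64
```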

On Thursday, May 18, 2017 at 2:59:17 AM UTC-7, phamminhquang pham wrote:
>
>  I have tried the following code and its result confuses me.
>
> import theano
> import theano.tensor as T
>
> t = theano.shared(5,'t')
> y = t ** 2
> grad = T.grad(y, wrt = t)
> f = theano.function([],y)
> g = theano.function([],grad)
> print(f())
> print(g())
>
> And I got 25 and 0.0 in return. However, the gradient should be 10. I think 
> there's some detail in tensor grad that I don't know. Could you please 
> explain what happened?
>

-- 

--- 
You received this message because you are subscribed to the Google Groups 
"theano-users" group.
To unsubscribe from this group and stop receiving emails from it, send an email 
to theano-users+unsubscr...@googlegroups.com.
For more options, visit https://groups.google.com/d/optout.


[theano-users] Re: Classic error: Expected an array-like object, but found a Variable

2017-05-18 Thread Jesse Livezey
That isn't a normal way to use theano. Have you read about creating theano 
functions with "updates" dictionaries?

There are simple examples here:
http://deeplearning.net/software/theano/tutorial/examples.html#using-shared-variables
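As a rough sketch of what an updates dictionary does (plain NumPy, no Theano; the names `step` and `state` are illustrative, not a real API): the compiled function evaluates the symbolic expression numerically on each call and writes the result back into the shared variable, which is the write-back that set_value cannot do with a symbolic input.

```python
import numpy as np

# Conceptual sketch (NumPy only, illustrative names): a Theano function
# compiled with updates={z: T.dot(wo.T, x)} evaluates the expression
# numerically on every call and stores the result back into z.
# set_value() only accepts concrete arrays, never symbolic expressions.
N = 4
x = np.ones((N, 1), dtype=np.float32)        # plays the shared variable x
wo = np.zeros((N, 1), dtype=np.float32)      # plays the shared variable wo
state = {"z": np.zeros((1, 1), np.float32)}  # plays the shared variable z

def step():
    # One call of the compiled function: compute, then write back.
    state["z"] = wo.T @ x

step()
print(state["z"].shape)  # (1, 1)
```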

On Sunday, May 14, 2017 at 3:05:01 AM UTC-7, Mohamed Akrout wrote:
>
> Hi,
>
> I am new to Theano. My problematic code is the following:
>
> from numpy import float32, random, zeros
> from theano import shared
> import theano.tensor as T
>
> z0 = float32(random.randn(1, 1))
> z = shared(z0)
>
> x0 = float32(random.randn(N, 1))
> x = shared(x0)
>
> wo = shared(zeros((N, 1), dtype=float32))
>
> z.set_value(T.dot(wo.T , x)) # here is the problem
>
>
> This gives me the error: Expected an array-like object, but found a 
> Variable.
>
> I understand that z holds a 1x1 NumPy array and T.dot(wo.T, x) is a 1x1 
> symbolic expression, but I could not find a way to assign that result to z.
>
> Do you have any idea how to solve this problem?
>
> Thank you for your help!
>



[theano-users] tensor grad with respect to shared variable

2017-05-18 Thread phamminhquang pham
I have tried the following code and its result confuses me.

import theano
import theano.tensor as T

t = theano.shared(5,'t')
y = t ** 2
grad = T.grad(y, wrt = t)
f = theano.function([],y)
g = theano.function([],grad)
print(f())
print(g())

And I got 25 and 0.0 in return. However, the gradient should be 10. I think 
there's some detail in tensor grad that I don't know. Could you please 
explain what happened?
