You haven't used w or b in the graph that computes the cost, so Theano cannot 
differentiate the cost with respect to them. You need to add something like

logits = x.dot(w) + b.dimshuffle('x', 0)
logitsdev = logits - logits.max(1, keepdims=True)

You probably also want b to be a vector of length 257 rather than a scalar.
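
For reference, here is a minimal sketch of how the pieces could fit together 
so that w and b are actually part of the graph the cost depends on. The shapes 
assume x is a (batch, 128*128) feature matrix and y is a (batch, 257) one-hot 
target matrix, which is my reading of your code rather than something it 
states explicitly:

import numpy as np
import theano
import theano.tensor as T

x = T.matrix("x")   # (batch, 128*128) flattened features
y = T.matrix("y")   # (batch, 257) one-hot targets (assumed layout)

w = theano.shared(np.random.randn(128 * 128, 257), name='w', borrow=True)
b = theano.shared(np.zeros(257), name='b', borrow=True)

# logits depend on w and b, so the gradient can reach them
logits = x.dot(w) + b.dimshuffle('x', 0)

# numerically stable log-softmax over the class axis
logitsdev = logits - logits.max(1, keepdims=True)
log_pred = logitsdev - T.log(T.sum(T.exp(logitsdev), axis=1, keepdims=True))

cost = T.mean(-T.sum(y * log_pred, axis=1))   # cross entropy
gw, gb = T.grad(cost, [w, b])                 # both gradients are now defined

Once log_pred is built from logits rather than directly from x, T.grad should 
no longer raise DisconnectedInputError for w and b.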

On Monday, February 13, 2017 at 9:06:57 AM UTC-8, Dimitris Marm wrote:
>
>
> Hello,
>
> I am new to Theano, and I have found that it is actually not straightforward 
> to apply semantic segmentation with multiclass and multilabel labels.
>
> In particular, a problem I have seen many people face concerns the 
> logistic-regression layer at the end of the network. To tackle it, I 
> constructed a single *logistic-regression layer for 2d labels*. I guess my 
> code still has some bug, but I am unable to find it!
>
>
> The code 
>
> =========================
>
> import numpy as np
> import random
> from theano import shared
> from theano import tensor as T
> import theano
>
> #import pdb; pdb.set_trace()
>
> theano.config.optimizer='fast_compile'
> theano.config.exception_verbosity='high'
> theano.config.compute_test_value = 'warn'
>
> features = []
> labels = []
>
>
> # ======================= theano =========================
>
> for i in range(10):
>
>     if i == 0:
>
>         features = np.array([random.randint(0, 256) for t in range(0, 128*128)]).reshape((1, 1, 128, 128))
>         labels = np.array([random.randint(0, 256) for t in range(0, 128*128)]).reshape((1, 1, 128, 128))
>
>     else:
>
>         features = np.append(features, np.array([random.randint(0, 256) for t in range(0, 128*128)]).reshape((1, 1, 128, 128)), axis=0)
>         labels = np.append(labels, np.array([random.randint(0, 256) for t in range(0, 128*128)]).reshape((1, 1, 128, 128)), axis=0)
>
> # Loss
>
> def train_lr(x_train, y_train, regu_coeff = 0.00015, step = 0.0000001):
>
>     x = T.matrix("x")
>     y = T.matrix("y")
>     
>     w = theano.shared(
>         value = np.random.randn(128*128,257),
>         name='w', 
>         borrow=True
>     )
>     
>     b = theano.shared(0., name='b')
>     
>     # 2d-softmax implementation
>     xdev = x - x.max(1, keepdims=True)
>     log_pred = xdev - T.log(T.sum(T.exp(xdev), axis=1, keepdims=True))
>
>     cost = T.mean(-T.sum(y * log_pred, axis=1)) # cross entropy 
>     
>     gw, gb = T.grad(cost, [w, b]) # gradient
>
>     train = theano.function(
>             inputs=[x,y],
>             outputs=[log_pred],
>             updates=((w, w - step * gw), (b, b - step * gb)))
>
>     for i in range(100):
>         train(x_train,y_train)
>
>     return w, b
>
> pp = train_lr(features, labels)
>
>
>
> The error I am getting
>
> ===================================
>
>
> *theano.gradient.DisconnectedInputError: grad method was asked to compute 
> the gradient with respect to a variable that is not part of the 
> computational graph of the cost, or is used only by a non-differentiable 
> operator: w*
>
> I really do not know how to proceed, as I don't understand why the w 
> (weights) are not differentiable.
>
> I would appreciate any hints.
>
> Regards
>
