This would be really difficult to debug without a full,
runnable script.
The only meaningful difference between train_mb and validate_mb_accuracy
seems to be the output: does self.layers[-1].accuracy(self.y) depend on
any variables other than self.x and self.y?
The missing input looks like an unnamed matrix that gets reshaped.
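To make that failure mode concrete, here is a minimal plain-Python sketch (toy classes, not Theano's actual API): a givens-style substitution only replaces the exact variables it names, so any other leaf variable that found its way into the accuracy graph, e.g. a matrix a layer created on its own before the scan rewiring, remains an unprovided input.

```python
# Toy symbolic graph: leaves are Var objects, Add combines subgraphs.
# This mimics (very loosely) how Theano's `givens` only substitutes
# variables that actually appear as inputs of the compiled graph.

class Var:
    """A named leaf in a tiny symbolic graph."""
    def __init__(self, name):
        self.name = name
    def inputs(self):
        return {self}

class Add:
    """A binary node; its inputs are the union of its children's inputs."""
    def __init__(self, a, b):
        self.a, self.b = a, b
    def inputs(self):
        return self.a.inputs() | self.b.inputs()

def missing_inputs(expr, givens):
    """Names of leaves of `expr` not covered by the `givens` substitution."""
    return {v.name for v in expr.inputs()} - {g.name for g in givens}

x = Var("x")          # the declared input, substituted via givens
stray = Var("stray")  # e.g. a matrix a layer built internally
graph = Add(x, stray)

print(missing_inputs(graph, givens={x}))  # {'stray'}: the extra leaf is unaccounted for
```

If accuracy(self.y) was built from such a stray variable instead of the scan output, train_mb would still compile (the cost graph uses the scan path), while validate_mb_accuracy would raise exactly this MissingInputError.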
On Sat, Dec 10, 2016, Daksh Varshneya wrote:
> Hi, here is a piece of code I have been working on -
>
> self.x = T.tensor3("x")
> self.y = T.ivector("y")
> self.sequences = sequences
> # Check this step for general dimshuffle
> if self.sequences:
>     self.x_shuffled = self.x.dimshuffle((1, 0, 2))
>
> # self.output = T.tensor3()
> self.output, self.scan_updates = theano.scan(
>     self.step,
>     sequences=self.x_shuffled,
>     # non_sequences=weights,
>     outputs_info=None)
>
> def step(self, x):
>     init_layer = self.layers[0]
>     init_layer.set_inpt(x, self.mini_batch_size)
>     for j in range(1, len(self.layers)):
>         prev_layer, layer = self.layers[j-1], self.layers[j]
>         layer.set_inpt(prev_layer.output, self.mini_batch_size)
>     self.output = self.layers[-1].output
>     return self.output
>
>
> # define the (regularized) cost function, symbolic gradients, and updates
> l2_norm_squared = sum([(layer.w**2).sum() for layer in self.layers])
> cost = self.layers[-1].cost(self) + \
>     0.5*lmbda*l2_norm_squared/num_training_batches
> grads = T.grad(cost, self.params)
> updates = [(param, param - eta*grad)
>            for param, grad in zip(self.params, grads)]
>
> # define functions to train a mini-batch, and to compute the
> # accuracy in validation and test mini-batches.
> i = T.lscalar()  # mini-batch index
> train_mb = theano.function(
>     [i], cost,
>     updates=updates + self.scan_updates,
>     givens={
>         self.x:
>         training_x[i*self.mini_batch_size: (i+1)*self.mini_batch_size],
>         self.y:
>         training_y[i*self.mini_batch_size: (i+1)*self.mini_batch_size]
>     },
>     on_unused_input='ignore')
>
> validate_mb_accuracy = theano.function(
>     [i], self.layers[-1].accuracy(self.y),
>     updates=self.scan_updates,
>     givens={
>         self.x:
>         validation_x[i*self.mini_batch_size: (i+1)*self.mini_batch_size],
>         self.y:
>         validation_y[i*self.mini_batch_size: (i+1)*self.mini_batch_size]
>     },
>     on_unused_input='ignore')
>
>
> I have shown only the relevant part of the code here.
> I am working with sequences of data points, hence the step function,
> which I call repeatedly via theano.scan. The 'train_mb' function works
> perfectly fine when I call it. The error is thrown by the
> validate_mb_accuracy function.
> The complete error with exception_verbosity=high is:
>
> theano.gof.fg.MissingInputError: A variable that is an input to the graph
> was neither provided as an input to the function nor given a value. A chain
> of variables leading from this input to an output is [<TensorType(float64,
> matrix)>, Reshape{2}.0, dot.0, Elemwise{add,no_inplace}.0, sigmoid.0,
> Reshape{2}.0, dot.0, Elemwise{add,no_inplace}.0, sigmoid.0, Reshape{2}.0,
> dot.0, Elemwise{add,no_inplace}.0, Softmax.0, argmax,
> Elemwise{eq,no_inplace}.0, Sum{acc_dtype=int64}.0, mean]. This chain may
> not be unique.
> >
> Backtrace when the variable is created:
>
> File "/home/daksh/PycharmProjects/nn/theano_ann_sequences/tester.py",
>   line 65, in <module>
>     net = shallow()
> File "/home/daksh/PycharmProjects/nn/theano_ann_sequences/tester.py",
>   line 36, in shallow
>     FullyConnectedLayer(n_in=100,n_out=10,activation_fn=softmax)],mini_batch_size,sequences = True)
> File "/home/daksh/PycharmProjects/nn/theano_ann_sequences/network.py",
>   line 35, in __init__
>     ,outputs_info=None)
> >
> >
> Line 35 in the error points to the validation_mb function.
> Any clues on why I am getting this error? I am even recording the updates
> from the scan and passing them to the theano.function call as updates
> (this was pointed out as the probable cause in some blogs).
>
> Any help will be appreciated.
> Thanks
>
> --
>
> ---
> You received this message because you are subscribed to the Google Groups
> "theano-users" group.
> To unsubscribe from this group and stop receiving emails from it, send an
> email to [email protected].
> For more options, visit https://groups.google.com/d/optout.
--
Pascal