Hi,

You are indeed still working with a symbolic graph. The error says that
there is no subgraph going from the inputs you provided that can compute
the outputs you requested; in other words, some variable the outputs
depend on was not given as an input.

In this case, it is happening when building the `predict` function:

    predict = theano.function(inputs=[X], outputs=p_y_x_max,
                              updates=updates,
                              allow_input_downcast=True,
                              mode='FAST_RUN')

Here, your outputs are:
  - `p_y_x_max`, and
  - all the update values in `updates`

However, computing those update values requires the cost, and the cost
requires `Y`, which is not in the input list.

If you want a prediction function, you should remove
`updates=updates,` from that call; then the only output is
`p_y_x_max`, which depends only on `X`.
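
To see why the updates drag `Y` into the graph, here is a minimal
pure-Python sketch of the kind of dependency check involved (this is NOT
Theano code; the `Var` class and `required_inputs` helper are made up
for illustration, and the variable names just mirror your example):

```python
class Var:
    """A symbolic variable: either a leaf input or the result of an op."""
    def __init__(self, name, parents=()):
        self.name = name
        self.parents = tuple(parents)

def required_inputs(outputs):
    """Collect every leaf variable the given outputs depend on."""
    seen, leaves = set(), []
    stack = list(outputs)
    while stack:
        v = stack.pop()
        if v in seen:
            continue
        seen.add(v)
        if v.parents:
            stack.extend(v.parents)
        else:
            leaves.append(v)
    return set(leaves)

X = Var('X')
Y = Var('Y')
p_y_x = Var('p_y_x', [X])            # model(X, W)
p_y_x_max = Var('p_y_x_max', [p_y_x])
cost = Var('cost', [p_y_x, Y])       # categorical_crossentropy needs Y
new_W = Var('new_W', [cost])         # an update value from optimize(cost, W)

# predict without updates: only X is needed
assert required_inputs([p_y_x_max]) == {X}
# predict WITH updates: the update value pulls Y in -> "missing input"
assert Y in required_inputs([p_y_x_max, new_W])
```

With `updates` removed, the leaves of the requested outputs are just
`X`, so compiling with `inputs=[X]` works.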

On Thu, Sep 08, 2016, [email protected] wrote:
> Hello theano community,
> 
> I've recently been getting this missing input error on a small Theano 
> example.
> 
> The full stack error is the following:
> 
> MissingInputError: A variable that is an input to the graph was neither 
> provided as an input to the function nor given a value.
> A chain of variables leading from this input to an output is
> [ Y,
>  Elemwise{mul}.0,
>  Elemwise{true_div}.0,
>  Elemwise{true_div}.0,
>  Elemwise{add,no_inplace}.0,
>  Elemwise{mul}.0,
>  Elemwise{add,no_inplace}.0,
>  Elemwise{Switch}.0,
>  Elemwise{add,no_inplace}.0,
>  Elemwise{add,no_inplace}.0,
>  dot.0,
>  Reshape{4}.0,
>  Elemwise{Switch}.0,
>  Elemwise{add,no_inplace}.0,
>  Elemwise{add,no_inplace}.0,
>  AbstractConv2d_gradInputs{border_mode='valid', subsample=(1, 1),
>  filter_flip=True, imshp=(None, None, None, None),
>  kshp=(None, None, None, None)}.0,
>  Elemwise{Switch}.0,
>  Elemwise{add,no_inplace}.0,
>  Elemwise{add,no_inplace}.0,
>  AbstractConv2d_gradInputs{border_mode='valid', subsample=(1, 1),
>  filter_flip=True, imshp=(None, None, None, None),
>  kshp=(None, None, None, None)}.0,
>  Elemwise{Switch}.0,
>  Elemwise{add,no_inplace}.0,
>  Elemwise{add,no_inplace}.0,
>  AbstractConv2d_gradInputs{border_mode='valid', subsample=(1, 1),
>  filter_flip=True, imshp=(None, None, None, None),
>  kshp=(None, None, None, None)}.0, Split{2}.0,
>  IncSubtensor{Inc;::, ::, :int64:, :int64:}.0,
>  Elemwise{add,no_inplace}.0,
>  Elemwise{Switch}.0,
>  Elemwise{add,no_inplace}.0,
>  Elemwise{add,no_inplace}.0,
>  AbstractConv2d_gradInputs{border_mode='valid', subsample=(1, 1),
>  filter_flip=True, imshp=(None, None, None, None),
>  kshp=(None, None, None, None)}.0,
>  Elemwise{Switch}.0, Elemwise{add,no_inplace}.0,
>  Elemwise{add,no_inplace}.0,
>  AbstractConv2d_gradWeights{border_mode='valid', subsample=(1, 1),
>  filter_flip=True, imshp=(None, None, None, None),
>  kshp=(None, None, None, None)}.0,
>  Elemwise{mul,no_inplace}.0,
>  Elemwise{add,no_inplace}.0,
>  Elemwise{true_div,no_inplace}.0,
>  Elemwise{mul,no_inplace}.0,
>  Elemwise{sub,no_inplace}.0 ].
>  
>  This chain may not be unique
> Backtrace when the variable is created:
>   File "<decorator-gen-57>", line 2, in run
>   File 
> "/home/user/anaconda2/lib/python2.7/site-packages/IPython/core/magic.py", 
> line 188, in <lambda>
>     call = lambda f, *a, **k: f(*a, **k)
>   File 
> "/home/user/anaconda2/lib/python2.7/site-packages/IPython/core/magics/execution.py"
> , line 742, in run
>     run()
>   File 
> "/home/user/anaconda2/lib/python2.7/site-packages/IPython/core/magics/execution.py"
> , line 728, in run
>     exit_ignore=exit_ignore)
>   File 
> "/home/user/anaconda2/lib/python2.7/site-packages/IPython/core/interactiveshell.py"
> , line 2481, in safe_execfile
>     self.compile if kw['shell_futures'] else None)
>   File 
> "/home/user/anaconda2/lib/python2.7/site-packages/IPython/utils/py3compat.py"
> , line 289, in execfile
>     builtin_mod.execfile(filename, *where)
>   File "/home/user/projects/python/theano/unet.py", line 291, in <module>
>     training()
>   File "/home/user/projects/python/theano/unet.py", line 259, in training
>     Y = tt.fmatrix(name='Y').astype(dtype='float32')
> 
> 
> 
> The minimal example I was trying to run is the following:
> 
> def training():
>    
> 
>     W = []
>     for weight in range(len(weights)):
>         W.append(init_weights(weights[weight]))
> 
> 
>     X = tt.ftensor4(name='X').astype(dtype='float32')
>     Y = tt.fmatrix(name='Y').astype(dtype='float32')
> 
>     p_y_x = model(X, W)
> 
>     p_y_x_max = tt.argmax(p_y_x, axis=1)
> 
>     cost = tt.mean(tt.nnet.categorical_crossentropy(p_y_x, Y))
> 
>     updates = optimize(cost, W)
> 
>     train = theano.function(inputs=[X, Y], outputs=cost,
>                             updates=updates,
>                             allow_input_downcast=True,
>                             mode='FAST_RUN')
> 
> 
>     predict = theano.function(inputs=[X], outputs=p_y_x_max,
>                               updates=updates,
>                               allow_input_downcast=True,
>                               mode='FAST_RUN')
> 
> training()
> 
> 
> Given the error output, what is strange to me is that at this point, 
> if I just run the code, I should only be getting the computation 
> graph; instead, I'm getting complaints about the Y variable.
> I don't understand why, since I'm not calling "train" or "predict" 
> with any arguments yet.
> At this point I would just expect Theano to build the computation 
> graph. Am I missing something?
> 
> -- 
> 
> --- 
> You received this message because you are subscribed to the Google Groups 
> "theano-users" group.
> To unsubscribe from this group and stop receiving emails from it, send an 
> email to [email protected].
> For more options, visit https://groups.google.com/d/optout.


-- 
Pascal
