Having a function with both shared variables and input variables wrapped in In() should be supported now, at least in the development version, and probably in 0.8.2.
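
For reference, here is a minimal sketch of the pattern in question (the variable names are illustrative, not taken from the tutorial script): a graph that uses a shared variable while the symbolic input is wrapped in theano.In. On releases before the fix this raised MissingInputError at compile time; on the development version it should compile and run.

    import numpy as np
    import theano
    import theano.tensor as T

    x = T.matrix('x')
    # shared variable participating in the graph, analogous to the bias b
    b = theano.shared(np.zeros(3, dtype=theano.config.floatX), name='b')
    y = T.nnet.softmax(x + b)

    # wrap the symbolic input in In(); the shared variable is picked up
    # implicitly and must not be listed among the inputs
    f = theano.function(inputs=[theano.In(x)], outputs=y)
    print(f(np.ones((2, 3), dtype=theano.config.floatX)))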
On Thu, Sep 08, 2016, [email protected] wrote:
> Did you work it out? I met the same error when using a Sequential model
> in multiprocessing.
>
> On Monday, November 17, 2014, at 11:42:39 PM UTC+8, Yimeng Zhang wrote:
> >
> > Hi,
> >
> > It seems that I can't wrap function inputs with theano.In in a graph
> > with a shared variable.
> >
> > For example, in http://deeplearning.net/tutorial/code/logistic_sgd.py,
> > below test_model (around line 303), I add:
> >
> >     test_model2 = theano.function(
> >         inputs=[theano.In(x), y],
> >         outputs=classifier.errors(y),
> >     )
> >
> > Then when I run the script, I get:
> >
> >     (theano)Yimengs-MacBook-Pro:code yimengzh$ python logistic_sgd.py
> >     Using gpu device 0: GeForce GT 750M
> >     ... loading data
> >     ... building the model
> >     Traceback (most recent call last):
> >       File "logistic_sgd.py", line 442, in <module>
> >         sgd_optimization_mnist()
> >       File "logistic_sgd.py", line 314, in sgd_optimization_mnist
> >         outputs=classifier.errors(y),
> >       File "/Users/yimengzh/anaconda/envs/theano/lib/python2.7/site-packages/theano/compile/function.py", line 251, in function
> >         accept_inplace=accept_inplace, name=name)
> >       File "/Users/yimengzh/anaconda/envs/theano/lib/python2.7/site-packages/theano/compile/function_module.py", line 1539, in orig_function
> >         on_unused_input=on_unused_input).create(
> >       File "/Users/yimengzh/anaconda/envs/theano/lib/python2.7/site-packages/theano/compile/function_module.py", line 1225, in __init__
> >         fgraph, additional_outputs = std_fgraph(inputs, outputs, accept_inplace)
> >       File "/Users/yimengzh/anaconda/envs/theano/lib/python2.7/site-packages/theano/compile/function_module.py", line 141, in std_fgraph
> >         fgraph = gof.fg.FunctionGraph(orig_inputs, orig_outputs)
> >       File "/Users/yimengzh/anaconda/envs/theano/lib/python2.7/site-packages/theano/gof/fg.py", line 135, in __init__
> >         self.__import_r__(outputs, reason="init")
> >       File "/Users/yimengzh/anaconda/envs/theano/lib/python2.7/site-packages/theano/gof/fg.py", line 257, in __import_r__
> >         self.__import__(apply_node, reason=reason)
> >       File "/Users/yimengzh/anaconda/envs/theano/lib/python2.7/site-packages/theano/gof/fg.py", line 353, in __import__
> >         detailed_err_msg)
> >     theano.gof.fg.MissingInputError: A variable that is an input to the
> >     graph was neither provided as an input to the function nor given a
> >     value. A chain of variables leading from this input to an output is
> >     [b, HostFromGpu.0, DimShuffle{x,0}.0, Elemwise{add,no_inplace}.0,
> >     Softmax.0, argmax, Elemwise{neq,no_inplace}.0, Sum{acc_dtype=int64}.0,
> >     Elemwise{true_div,no_inplace}.0]. This chain may not be unique
> >     Backtrace when the variable is created:
> >       File "logistic_sgd.py", line 442, in <module>
> >         sgd_optimization_mnist()
> >       File "logistic_sgd.py", line 295, in sgd_optimization_mnist
> >         classifier = LogisticRegression(input=x, n_in=28 * 28, n_out=10)
> >       File "logistic_sgd.py", line 91, in __init__
> >         borrow=True
> >
> > When I take theano.In out, everything works. Why is that? I tried using
> > theano.In in some other scripts of mine that involve a shared variable
> > (which is b in the logistic case), and got the same error. I'm not sure
> > whether theano.In works for a graph without shared variables, but it
> > seems it can't work in the presence of one. Thanks.
--
Pascal
