Hello, I already tried 'optimizer=fast_compile' and 'optimizer=None'. Neither gave me any more information.
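For reference, this is roughly how I set the flag (a minimal sketch of the setup, not my actual script; the flag has to be in place before Theano is imported):

    import os
    # The flag must be set before the first "import theano", otherwise it is ignored.
    os.environ["THEANO_FLAGS"] = "optimizer=fast_compile"
    # second attempt, disabling the graph optimizations completely:
    # os.environ["THEANO_FLAGS"] = "optimizer=None"

    import theano
    print(theano.config.optimizer)   # confirms which optimizer is actually active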
On Tuesday, 31 January 2017 at 18:43:26 UTC+1, [email protected] wrote:
>
> Hello :),
>
> at the moment I'm trying to use Theano to efficiently calculate some derivatives of a function.
> Compiling the derivatives works well, but when I try to evaluate the function on some parameters it fails and I don't know exactly why.
> I'm quite new to Python and especially to Theano.
>
> The full error:
>
> Traceback (most recent call last):
>   File "<stdin>", line 1, in <module>
>   File "C:\Program Files\Anaconda2\lib\site-packages\spyder\utils\site\sitecustomize.py", line 888, in debugfile
>     debugger.run("runfile(%r, args=%r, wdir=%r)" % (filename, args, wdir))
>   File "C:\Program Files\Anaconda2\lib\bdb.py", line 400, in run
>     exec cmd in globals, locals
>   File "<string>", line 1, in <module>
>   File "C:\Program Files\Anaconda2\lib\site-packages\spyder\utils\site\sitecustomize.py", line 866, in runfile
>     execfile(filename, namespace)
>   File "C:\Program Files\Anaconda2\lib\site-packages\spyder\utils\site\sitecustomize.py", line 87, in execfile
>     exec(compile(scripttext, filename, 'exec'), glob, loc)
>   File "c:/users/flo9fe/desktop/vssgp_lvm/vssgp_lvm_startme.py", line 63, in <module>
>     vSSGP_LVM_opt.callback(x0)
>   File "vSSGP_LVM_opt.py", line 82, in callback
>     LL = self.vssgp_lvm.f['LL'](**params)
>   File "theano\compile\function_module.py", line 886, in __call__
>     storage_map=getattr(self.fn, 'storage_map', None))
>   File "theano\gof\link.py", line 325, in raise_with_op
>     reraise(exc_type, exc_value, exc_trace)
>   File "theano\compile\function_module.py", line 873, in __call__
>     self.fn() if output_subset is None else\
> ValueError: Input dimension mis-match. (input[0].shape[0] = 50, input[1].shape[0] = 1)
> Apply node that caused the error: Elemwise{sub,no_inplace}(InplaceDimShuffle{0,1,x}.0, InplaceDimShuffle{1,0,x}.0)
> Toposort index: 62
> Inputs types: [TensorType(float64, (False, False, True)), TensorType(float64, (False, False, True))]
> Inputs shapes: [(50L, 1L, 1L), (1L, 50L, 1L)]
> Inputs strides: [(8L, 8L, 8L), (400L, 8L, 8L)]
> Inputs values: ['not shown', 'not shown']
> Outputs clients: [[Elemwise{Composite{((exp((i0 * i1)) * cos((i2 + i3))) + (exp((i0 * i4)) * cos((i5 + i6 + i7))))}}[(0, 1)](TensorConstant{(1L, 1L, 1L) of -0.5}, Sum{axis=[3], acc_dtype=float64}.0, Sum{axis=[3], acc_dtype=float64}.0, Elemwise{sub,no_inplace}.0, Sum{axis=[3], acc_dtype=float64}.0, Sum{axis=[3], acc_dtype=float64}.0, InplaceDimShuffle{0,1,x}.0, InplaceDimShuffle{1,0,x}.0)]]
>
> HINT: Re-running with most Theano optimization disabled could give you a back-trace of when this node was created. This can be done by setting the Theano flag 'optimizer=fast_compile'. If that does not work, Theano optimizations can be disabled with 'optimizer=None'.
> HINT: Use the Theano flag 'exception_verbosity=high' for a debugprint and storage map footprint of this apply node.
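In case it helps narrow things down: the failing node subtracts an InplaceDimShuffle{0,1,x} from an InplaceDimShuffle{1,0,x}, and input[1] has size 1 on a dimension whose type says it is not broadcastable. My guess is therefore that the mismatch comes from a dimension that happens to be 1 at run time without being declared broadcastable. A minimal sketch of that kind of failure (X and Z are placeholder names, not the variables from my model):

    import numpy as np
    import theano
    import theano.tensor as T

    X = T.dmatrix('X')   # plain matrices: no dimension is marked as broadcastable
    Z = T.dmatrix('Z')

    # same pattern as the failing node: subtract two dimshuffled tensors
    diff = X.dimshuffle(0, 1, 'x') - Z.dimshuffle(1, 0, 'x')
    f = theano.function([X, Z], diff)

    f(np.zeros((50, 3)), np.zeros((3, 50)))   # works: both operands become (50, 3, 1)
    f(np.zeros((50, 1)), np.zeros((50, 1)))   # ValueError: Input dimension mis-match.
                                              # (input[0].shape[0] = 50, input[1].shape[0] = 1)

If that is really what happens here, I suppose something like T.addbroadcast(...) on the offending input (or building it with an explicit broadcastable pattern) would be the fix, but I'm not sure this applies to my graph.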
