From what I can see, that message comes from an optimization trying to
move some operation out of a scan loop (to avoid it being repeated at
each time step, given that it operates on a non-sequence).

That optimization probably makes some assumptions that are not true in
your case, hence the error message, but the only consequence should be
that this particular optimization gets skipped. This should not make
your program crash or return incorrect results.
I'll try to get an idea why that happens.
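
To give an idea of what that optimization does, here is a small, purely
illustrative example (the names are made up, not taken from your code):
a computation that depends only on a non-sequence can be hoisted out of
the scan body and computed once, instead of at every time step.

    import theano
    import theano.tensor as T

    x = T.vector('x')       # non-sequence
    seq = T.matrix('seq')   # sequence: one row per time step

    # T.exp(x) depends only on the non-sequence, so it can be computed
    # once outside the loop instead of at every time step; that is the
    # kind of rewrite PushOutNonSeqScan performs.
    def step(row, x):
        return row * T.exp(x)

    out, updates = theano.scan(step, sequences=seq, non_sequences=x)
    f = theano.function([seq, x], out)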

On an unrelated note, your use case looks a lot like a convolution.
Maybe there is a way of expressing it by using the convolution operation
in Theano, which would be much more efficient than a for loop or scan.
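
Something along these lines might be worth trying (an untested sketch;
the names and shapes are my assumptions, and it only applies if the
kernel weights are shared across pixel positions rather than varying
with self.D):

    import theano
    import theano.tensor as T
    from theano.tensor.nnet import conv2d

    images = T.tensor4('images')   # (batch, channels, rows, cols)
    kernel = T.tensor4('kernel')   # (out_channels, in_channels, 2*W, 2*W)

    # border_mode='half' pads so the output keeps the same spatial size
    # as the input.
    out = conv2d(images, kernel, border_mode='half')
    f = theano.function([images, kernel], out)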

On Sat, Oct 15, 2016, Michael Harradon wrote:
> I'm attempting to write a differentiable, parameterized image rotation
> layer in Theano with decent performance - as a result I'm doing some
> slightly unusual things with indexing. When I try to optimize the
> resulting Theano graph using fast_run, I get a series of errors that
> don't appear with the straight Python implementation:
> 
> ERROR (theano.gof.opt): SeqOptimizer apply <theano.scan_module.scan_opt.PushOutNonSeqScan object at 0x7f542b2db750>
> ERROR (theano.gof.opt): Traceback:
> ERROR (theano.gof.opt): Traceback (most recent call last):
>   File "/usr/local/lib/python2.7/dist-packages/theano/gof/opt.py", line 230, in apply
>     sub_prof = optimizer.optimize(fgraph)
>   File "/usr/local/lib/python2.7/dist-packages/theano/gof/opt.py", line 89, in optimize
>     ret = self.apply(fgraph, *args, **kwargs)
>   File "/usr/local/lib/python2.7/dist-packages/theano/scan_module/scan_opt.py", line 228, in apply
>     self.process_node(fgraph, node)
>   File "/usr/local/lib/python2.7/dist-packages/theano/scan_module/scan_opt.py", line 313, in process_node
>     **dict(return_list=True))[0].owner
>   File "/usr/local/lib/python2.7/dist-packages/theano/gof/op.py", line 611, in __call__
>     node = self.make_node(*inputs, **kwargs)
>   File "/usr/local/lib/python2.7/dist-packages/theano/tensor/subtensor.py", line 2133, in make_node
>     index = tuple(map(as_index_variable, index))
>   File "/usr/local/lib/python2.7/dist-packages/theano/tensor/subtensor.py", line 2079, in as_index_variable
>     idx = theano.tensor.as_tensor_variable(idx)
>   File "/usr/local/lib/python2.7/dist-packages/theano/tensor/basic.py", line 167, in as_tensor_variable
>     "Variable type field must be a TensorType.", x, x.type)
> AsTensorError: ('Variable type field must be a TensorType.', NoneConst, <theano.tensor.type_other.NoneTypeT object at 0x7f542dbcab10>)
> 
> A snippet of relevant code:
> 
> W = 2
> coordsP = T.add(pts.astype('float32'), self.D.reshape((-1, 2)))
> contributors = getNeighborhood(coordsP, W)
> diffs = contributors - coordsP.reshape((-1, 1, 2))
> kernVals = kern(diffs, W)
> contributorsLin = contributorsToLinInd(contributors)
> 
> # Loop over the neighborhood
> result = T.zeros(input.shape)
> for i in range((2*W)**2):
>     result = result + (input.astype('float32')[:, contributorsLin[:, i].astype('int16')]) * (kernVals[:, i].T)
> 
> 
> Basically, there is a finite kernel, and for each output pixel I want to
> evaluate the sum of the kernel's values times the input pixel values over
> the support of the kernel (the neighborhood).
> 
> For performance I'm trying to explicitly write the loop over the input 
> pixels in the support of the kernel (rather than using a dynamically built 
> sparse matrix). My concern is that this is somewhat "weird" and I might be 
> hitting an edge case in the Theano optimization pipeline. I should add that 
> my goal is for this function to be thrice differentiable in self.D - 
> analytically this is the case, as my kernel has a triple zero at the bounds 
> of its support, but I need to make sure not to NaN anything along the way.
> 
> If anybody has any suggestions for debugging this I'd greatly appreciate it!
> 
> Thanks,
> Michael
> 


-- 
Pascal
