I was able to work around it, I think. I had a TensorVariable that I was 
adding a dimension to with np.newaxis, and replacing that with a reshape 
removed the NoneConst from the tree - though I'm not sure why.
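For reference, the substitution was of this form - a plain NumPy sketch with made-up shapes (the real code operates on a Theano TensorVariable):

```python
import numpy as np

# stand-in for the TensorVariable; the (2, 3) shape is made up for illustration
x = np.arange(6, dtype='float32').reshape(2, 3)

a = x[:, np.newaxis, :]   # np.newaxis inserts the extra axis (this is what
                          # put a NoneConst into my graph)
b = x.reshape((2, 1, 3))  # the reshape workaround: same array, no NoneConst

assert a.shape == b.shape == (2, 1, 3)
assert np.array_equal(a, b)
```

Both forms produce the same array; only the graph they build differs.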

I would like to use the existing convolution code, but I'm interpolating a 
non-uniform set of points on the regular image grid - hence the funky 
indexing :\

I can try the latest dev version if it would help track down the issue, but 
I've got it working on my end at this point. My current guess is that 
np.newaxis has some inconsistency in its behavior.
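For context, the gather-and-accumulate scheme in the quoted snippet boils down to something like this in plain NumPy (the shapes, index array, and weight array here are made up - in the real code they come from my getNeighborhood/kern helpers, and everything is symbolic):

```python
import numpy as np

np.random.seed(0)
W = 2                    # neighborhood half-width
B, N = 2, 64             # hypothetical batch size and number of output points
K = (2 * W) ** 2         # neighbors contributing to each output point

img = np.random.rand(B, N).astype('float32')          # flattened input images
# hypothetical precomputed linear indices and kernel weights:
contributorsLin = np.random.randint(0, N, size=(N, K))
kernVals = np.random.rand(N, K).astype('float32')

# the explicit loop over the (2W)^2 neighbors of each output point
result = np.zeros_like(img)
for i in range(K):
    # gather one contributor per output point, weight it, accumulate
    result = result + img[:, contributorsLin[:, i]] * kernVals[:, i].T
```

(The .T is a no-op on the 1-D slice in this sketch; the symbolic shapes in the real code differ.)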

Best,
Michael

On Saturday, October 15, 2016 at 8:03:16 PM UTC-4, Pascal Lamblin wrote:
>
> On Sun, Oct 16, 2016, Pascal Lamblin wrote: 
> > That optimization probably makes some assumptions that are not true in 
> > your case, hence the error message, but the only consequence should be 
> > that this particular optimization gets skipped. This should not make 
> > your program crash or return incorrect results. 
> > I'll try to get an idea why that happens. 
>
> Actually, can you try with the latest development version? 
> There is a recent change that may have solved that issue. 
>
> > 
> > On an unrelated note, your use case looks a lot like a convolution. 
> > Maybe there is a way of expressing it by using the convolution operation 
> > in Theano, which would be much more efficient than a for loop or scan. 
> > 
> > On Sat, Oct 15, 2016, Michael Harradon wrote: 
> > > I'm attempting to write a differentiable, parameterized image rotation 
> > > layer in Theano with decent performance - as a result I'm doing some 
> > > slightly unusual things with indexing. When I try to optimize the 
> > > resulting Theano graph using fast_run I get a series of errors that 
> > > don't appear when I use the straight Python implementation: 
> > > 
> > > ERROR (theano.gof.opt): SeqOptimizer apply 
> > > <theano.scan_module.scan_opt.PushOutNonSeqScan object at 0x7f542b2db750> 
> > > ERROR (theano.gof.opt): Traceback: 
> > > ERROR (theano.gof.opt): Traceback (most recent call last): 
> > >   File "/usr/local/lib/python2.7/dist-packages/theano/gof/opt.py", line 230, in apply 
> > >     sub_prof = optimizer.optimize(fgraph) 
> > >   File "/usr/local/lib/python2.7/dist-packages/theano/gof/opt.py", line 89, in optimize 
> > >     ret = self.apply(fgraph, *args, **kwargs) 
> > >   File "/usr/local/lib/python2.7/dist-packages/theano/scan_module/scan_opt.py", line 228, in apply 
> > >     self.process_node(fgraph, node) 
> > >   File "/usr/local/lib/python2.7/dist-packages/theano/scan_module/scan_opt.py", line 313, in process_node 
> > >     **dict(return_list=True))[0].owner 
> > >   File "/usr/local/lib/python2.7/dist-packages/theano/gof/op.py", line 611, in __call__ 
> > >     node = self.make_node(*inputs, **kwargs) 
> > >   File "/usr/local/lib/python2.7/dist-packages/theano/tensor/subtensor.py", line 2133, in make_node 
> > >     index = tuple(map(as_index_variable, index)) 
> > >   File "/usr/local/lib/python2.7/dist-packages/theano/tensor/subtensor.py", line 2079, in as_index_variable 
> > >     idx = theano.tensor.as_tensor_variable(idx) 
> > >   File "/usr/local/lib/python2.7/dist-packages/theano/tensor/basic.py", line 167, in as_tensor_variable 
> > >     "Variable type field must be a TensorType.", x, x.type) 
> > > AsTensorError: ('Variable type field must be a TensorType.', NoneConst, 
> > > <theano.tensor.type_other.NoneTypeT object at 0x7f542dbcab10>) 
> > > 
> > > A snippet of relevant code: 
> > > 
> > > > W = 2 
> > > > coordsP = T.add(pts.astype('float32'), self.D.reshape((-1, 2))) 
> > > > contributors = getNeighborhood(coordsP, W) 
> > > > diffs = contributors - coordsP.reshape((-1, 1, 2)) 
> > > > kernVals = kern(diffs, W) 
> > > > contributorsLin = contributorsToLinInd(contributors) 
> > > > 
> > > > # Loop over neighborhood 
> > > > result = T.zeros(input.shape) 
> > > > for i in range((2*W)**2): 
> > > >     result = result + (input.astype('float32')[:, contributorsLin[:, i].astype('int16')]) * (kernVals[:, i].T) 
> > > 
> > > 
> > > But basically there's a finite kernel, and for each output pixel I want 
> > > to evaluate the sum of the kernel's values times the input pixels' 
> > > values over the support of the kernel (the neighborhood). 
> > > 
> > > For performance I'm trying to explicitly write the loop over the input 
> > > pixels in the support of the kernel (rather than using a dynamically 
> > > built sparse matrix). My concern is that this is somewhat "weird" and I 
> > > might be hitting an edge case in the Theano optimization pipeline. I 
> > > should add that my goal is for this function to be thrice differentiable 
> > > in self.D - analytically this is the case, as my kernel has a triple 
> > > zero at the bounds of its support, but I need to make sure not to NaN 
> > > anything along the way. 
> > > 
> > > If anybody has any suggestions for debugging this I'd greatly 
> > > appreciate it! 
> > > 
> > > Thanks, 
> > > Michael 
> > > 
> > > -- 
> > > 
> > > --- 
> > > You received this message because you are subscribed to the Google 
> > > Groups "theano-users" group. 
> > > To unsubscribe from this group and stop receiving emails from it, send 
> > > an email to theano-users...@googlegroups.com. 
> > > For more options, visit https://groups.google.com/d/optout. 
> > 
> > 
> > -- 
> > Pascal 
> > 
>
> -- 
> Pascal 
>
