So it turns out the repeat function does what I was after, and the whole 
model now reads:




import theano
import theano.tensor as tt
from pymc3 import Model, Normal, HalfNormal, Uniform

basic_model = Model()

with basic_model:

    # Priors for unknown model parameters
    amplitude = Normal('amplitude', mu=0, sd=10000, shape=useToAs)
    offset = Normal('offset', mu=0, sd=10000, shape=useToAs)
    noise = HalfNormal('noise', sd=10000, shape=useToAs)
    phase = Uniform('phase', lower=0, upper=ReferencePeriod)


    # Parameters that define the two-Gaussian model
    gsep    = Savex[1]*ReferencePeriod/1024
    g1width = Savex[2]*ReferencePeriod/1024
    g2width = Savex[3]*ReferencePeriod/1024
    g2amp   = Savex[4]


    Tg1width = theano.shared(g1width)
    Tg2width = theano.shared(g2width)
    Tg2amp = theano.shared(g2amp)
    Tgsep = theano.shared(gsep)


    # Calculate the X values for the first Gaussian; these have to wrap on a
    # period of [-ReferencePeriod/2, ReferencePeriod/2]
    x = TFlatTimes - phase
    x = (x + ReferencePeriod/2) % ReferencePeriod - ReferencePeriod/2

    # Calculate the signal values for the first Gaussian
    FlatS = tt.exp(-0.5*x**2/Tg1width**2)


    # Calculate the X values for the second Gaussian; these also wrap on a
    # period of [-ReferencePeriod/2, ReferencePeriod/2]
    x = TFlatTimes - phase - Tgsep
    x = (x + ReferencePeriod/2) % ReferencePeriod - ReferencePeriod/2

    # Calculate the signal values for the second Gaussian
    FlatS += Tg2amp*tt.exp(-0.5*x**2/Tg2width**2)



    # Construct total vectors containing the offset, amplitude and noise
    # parameters. Each is (1024*useToAs) in length, with the format
    # (A_1, A_1, ..., A_1, A_2, A_2, ..., A_2, ..., A_useToAs, ..., A_useToAs)
    NVec = theano.tensor.extra_ops.repeat(noise, 1024)
    Offs = theano.tensor.extra_ops.repeat(offset, 1024)
    Amps = theano.tensor.extra_ops.repeat(amplitude, 1024)


    # Combine into the total signal
    Signal = Offs + Amps*FlatS

    # Likelihood (sampling distribution) of observations
    Y_obs = Normal('Y_obs', mu=Signal, sd=NVec, observed=FlatData)
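
For completeness, drawing samples from this model looks roughly like the
following (an untested sketch; the draw count and the use of find_MAP as a
starting point are just placeholders):

from pymc3 import find_MAP, sample

with basic_model:
    start = find_MAP()                  # optional: start from the MAP estimate
    trace = sample(2000, start=start)   # NUTS is assigned by default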

It's still quite slow though... I'm wondering if this is really the best way
to do it.
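
In case it helps anyone, one alternative I may try (just an untested sketch) is
to drop the repeat entirely and let broadcasting do the work, by reshaping the
data to (useToAs, 1024). Here Times2D and Data2D are placeholder names, and
FlatTimes stands for the numpy array that TFlatTimes wraps; I'm assuming the
flattened arrays store the 1024 bins of each ToA contiguously:

# Untested sketch: reshape to (useToAs, 1024) and broadcast the per-ToA
# parameters instead of repeating each of them 1024 times.
Times2D = theano.shared(FlatTimes.reshape(useToAs, 1024))
Data2D = FlatData.reshape(useToAs, 1024)

with Model() as basic_model_2d:
    amplitude = Normal('amplitude', mu=0, sd=10000, shape=useToAs)
    offset = Normal('offset', mu=0, sd=10000, shape=useToAs)
    noise = HalfNormal('noise', sd=10000, shape=useToAs)
    phase = Uniform('phase', lower=0, upper=ReferencePeriod)

    # First Gaussian, wrapped onto [-ReferencePeriod/2, ReferencePeriod/2]
    x = Times2D - phase
    x = (x + ReferencePeriod/2) % ReferencePeriod - ReferencePeriod/2
    S = tt.exp(-0.5*x**2/Tg1width**2)

    # Second Gaussian
    x = Times2D - phase - Tgsep
    x = (x + ReferencePeriod/2) % ReferencePeriod - ReferencePeriod/2
    S += Tg2amp*tt.exp(-0.5*x**2/Tg2width**2)

    # dimshuffle(0, 'x') turns a (useToAs,) vector into (useToAs, 1), which
    # broadcasts against the (useToAs, 1024) signal with no repeat needed
    Signal = offset.dimshuffle(0, 'x') + amplitude.dimshuffle(0, 'x')*S

    Y_obs = Normal('Y_obs', mu=Signal, sd=noise.dimshuffle(0, 'x'),
                   observed=Data2D)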

Cheers
Lindley


On Tuesday, July 12, 2016 at 5:53:45 PM UTC+1, Jesse Livezey wrote:
>
> Would tile then ravel work?
>
> http://deeplearning.net/software/theano/library/tensor/basic.html#theano.tensor.tile
>
> It looks like you're trying to turn a length n vector into a shape (n, 
> 1024) matrix by repeating each element of the vector then raveling. If this 
> is accurate, tile should work for repeating.
>
> On Tuesday, July 12, 2016 at 12:17:07 AM UTC-7, Lindley Lentati wrote:
>>
>> Hi, I'm trying to implement an equivalent of the following:
>>
>>
>> NVec = np.array([np.ones(1024)*sig for sig in noise]).flatten()
>> Offs = np.array([np.ones(1024)*off for off in offset]).flatten()
>> Amps = np.array([np.ones(1024)*amp for amp in amplitude]).flatten()
>>
>> where noise, offset, and amplitude are theano vectors that contain free 
>> parameters from a PyMC3 likelihood. When I try to do this I get the 
>> following:
>>
>> ValueError: length not known: offset [id A]
>>
>> which I assume is because these vectors have no notion of length, right? 
>> I'm just wondering what the equivalent expression would be for use in 
>> theano. 
>>
>>
>> So far the closest I've gotten is creating a list of tensors:
>>
>> [theano.tensor.ones(1024)*noise[i] for i in range(useToAs)] 
>>
>> but I can't seem to get them to concatenate into a single tensor.
>>
>>
>> If it helps, I know the length of everything beforehand. Apologies if 
>> this is documented all over the place; I couldn't find an example that I 
>> could translate into what I wanted to do.
>>
>>
>> Many Thanks
>>
>
