What error do you get? Without this I can't help you.

Le mar. 23 mai 2017 18:14, <[email protected]> a écrit :

> I have edited the code now to include input_var and target_var in the
> givens for the function, but it still doesn't work. Yes, I want to define
> the model with input_var and target_var, which I now do with givens, since
> shared variables cannot be used directly as inputs.
> *From: *Frédéric Bastien
> *Sent: *Wednesday, May 24, 2017 12:00 AM
> *To: *[email protected]
> *Reply To: *[email protected]
> *Subject: *Re: [theano-users] Using shared variable as inputs to theano
> function (values not being updated)
>
> Hi,
>
> You don't use input_var or target_var in your Theano function, so Theano
> ignores their values. Did you want to define the model with input_var and
> target_var instead of X and Y? If so, that could work by calling
> set_value().
>
> Frédéric
>
> On Thu, May 11, 2017 at 5:55 PM Tara <[email protected]> wrote:
>
>> I am trying to combine pymc3 with Theano for a simple recurrent neural
>> network. However, when I complete training and change the input of the
>> shared variables to the test set, the values are not updated in the graph
>> even though the shared variables are updated.
>> Any ideas would be appreciated.
>> Here is the code :
>>
>> # CREATE PYMC3 + THEANO IMPLEMENTATION OF A SIMPLE RECURRENT NETWORK
>> import timeit
>> start = timeit.default_timer()
>> import theano
>> import theano.tensor as T
>> import numpy as np
>> import pymc3 as pm
>> from scipy.stats import mode
>> theano.config.compute_test_value = 'ignore'
>>
>> input_dim = 2
>> output_dim = 2
>> ### PARAMETERS OF THE MODEL ###
>> hidden_dim = 64
>> learning_rate = 0.1
>> nb_epochs = 10
>>
>> np.random.seed(0)
>>
>> # Initialization /placeholder values
>> X = T.dtensor3('X')
>> Y = T.dtensor3('Y')
>>
>> # begin by generating dataset so we have an array of lists
>> # ....
>>
>> NUM_EXAMPLES = 1500
>> test_input = X_data[NUM_EXAMPLES:]
>> test_output = y_data[NUM_EXAMPLES:]
>>
>> train_input = X_data[:NUM_EXAMPLES]
>> train_output = y_data[:NUM_EXAMPLES]
>>
>> input_var = theano.shared(np.asarray(train_input).astype(np.float64),
>>                           borrow=True)
>> target_var = theano.shared(np.asarray(train_output).astype(np.float64),
>>                            borrow=True)
>>
>> # Reference, from the paper: "Improving Performance of Recurrent
>> # Neural Network with ReLU Nonlinearity"
>> def norm_positive_definite(r):
>>     A = np.dot(r, r.transpose())/hidden_dim
>>     values, vectors = np.linalg.eig(A)
>>     e = np.amax(values)
>>
>>
> --
>
> ---
> You received this message because you are subscribed to a topic in the
> Google Groups "theano-users" group.
> To unsubscribe from this topic, visit
> https://groups.google.com/d/topic/theano-users/_sxgPvgMeYo/unsubscribe.
> To unsubscribe from this group and all its topics, send an email to
> [email protected].
> For more options, visit https://groups.google.com/d/optout.
>
>

