Hi,

I have limited experience with Theano, but have found myself in the 
position of modifying a feed-forward neural network with an initial word 
embedding layer. The goal is to keep the word embeddings static rather than 
tuning their weights during training. In Keras this is done simply by 
passing the `trainable=False` parameter 
<https://blog.keras.io/using-pre-trained-word-embeddings-in-a-keras-model.html> 
when loading the embedding weights, but I'm unsure how to do this directly 
in Theano. 

The code I'm modifying can be found here 
<https://github.com/attapol/nn_discourse_parser/blob/master/nets/learning.py>, 
and it is the AdagradTrainer class starting on line 175. I don't expect you 
to analyze the code and come up with a solution, but I would very much 
appreciate it if you could point me towards a minimal example of how 
updating embedding weights works in general, and how to keep them static.

Thank you,
Jimmy

-- 

--- 
You received this message because you are subscribed to the Google Groups 
"theano-users" group.
