Hi all: I'm having trouble with Theano graph optimizations and memory usage. I have a very small network (~8 parameters) that I stack into a larger network (yielding ~800 parameters). However, the parameters are shared, so the effective number of parameters doesn't really increase that much. Compiling the network seems to take an exorbitant amount of memory. I'm using Keras as a simple way to construct it, so it's conceivable the problem isn't with Theano. However, a) the problem appears when theano.function is called, and b) when I use FAST_COMPILE it takes almost no time and very little memory to compile (~1 sec and ~100MB, vs. well over 32GB of memory with the default optimizer, where it never finishes). As such, I think the problem is with the optimizations, and I'm wondering if there is an easy way to identify which optimization(s) might be causing the problem.
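
For reference, here is a rough sketch of how I'm thinking of narrowing it down at the Theano level. The x.sum() graph below is just a stand-in for the Keras-built network; the pieces I'm hoping are relevant are the profile_optimizer flag, which should break compile time down per optimizer, and Mode.excluding, which lets me drop suspect optimization groups:

import time

import numpy as np
import theano
import theano.tensor as T

# Placeholder graph -- in the real case this is the stacked network built by
# Keras; x.sum() is just a stand-in so the snippet runs on its own.
x = T.matrix('x')
y = x.sum()

# Baseline: FAST_COMPILE skips most graph optimizations and compiles quickly.
t0 = time.time()
f_fast = theano.function([x], y, mode='FAST_COMPILE')
print('FAST_COMPILE compile time: %.2fs' % (time.time() - t0))

# Profile the optimization phase itself so per-optimizer time shows up in the
# profiling report (these can also be set up front via
# THEANO_FLAGS="profile=True,profile_optimizer=True").
theano.config.profile = True
theano.config.profile_optimizer = True
t0 = time.time()
f_default = theano.function([x], y)
print('default mode compile time: %.2fs' % (time.time() - t0))

# Bisect by excluding suspect optimization groups from the default mode,
# e.g. elementwise fusion or the inplace optimizations.
mode = theano.compile.get_default_mode().excluding('fusion', 'inplace')
f_excl = theano.function([x], y, mode=mode)

# Sanity check that the compiled functions still run.
f_fast(np.ones((3, 3), dtype=theano.config.floatX))
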
Thanks,
--
Dustin Webb <http://www.cs.utah.edu/~dustin> <http://lisa.iro.umontreal.ca/>
