chrishkchris commented on issue #674: Autograd Layer constructor
URL: https://github.com/apache/singa/issues/674#issuecomment-612549198
 
 
   > > > I think it's not an issue. When we use the computational graph, 
initialization operations won't be buffered since we just need to execute them 
once. For these operations, I just execute them immediately instead of 
buffering them into the graph at present. So before running the entire graph, 
all parameter tensors will be initialized.
   > > 
   > > 
   > > Yes, if the initialization is in the init function, it won't be 
buffered automatically. If the initialization is in the call function, we can 
still add a few lines to turn off the buffering. In either case, the graph 
function won't be affected.
   > 
   > how to turn it off?
   
   To turn off the buffering, you just need to add three lines:
   1. Before the statement you want to execute eagerly, add two lines:
      flag = param.device.graph_enabled()
      param.device.EnableGraph(False)
   2. After the statement, add one line:
      param.device.EnableGraph(flag)

   Note that param can be any input tensor that exposes the "device" 
attribute for us to use. A full sketch is shown below.
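
   For example, here is a minimal sketch of the whole pattern. It assumes a 
CUDA build of SINGA, and the Gaussian fill is just a stand-in for whatever 
initialization statement you want to run immediately instead of buffering:

   ```python
   from singa import device, tensor

   dev = device.create_cuda_gpu()        # device with graph buffering available
   param = tensor.Tensor((3, 3), dev)    # a parameter tensor on that device

   flag = param.device.graph_enabled()   # 1. save the current buffering state
   param.device.EnableGraph(False)       #    and disable buffering
   param.gaussian(0.0, 0.1)              # this statement now executes eagerly
   param.device.EnableGraph(flag)        # 2. restore the previous state
   ```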
