lfengad opened a new pull request #4990: [Relay][Topi] BatchNorm support with 
run-time mean and variance calculation
URL: https://github.com/apache/incubator-tvm/pull/4990
 
 
   We observe that many TensorFlow models used in our production 
environment invoke the FusedBatchNorm operator, and a number of them use this 
operator in "is_training" mode even for model inference. In "is_training" mode, 
the mean and variance are not pre-defined but are calculated dynamically from 
the run-time data. However, the current BatchNorm in TVM requires the mean and 
variance to be given as non-empty tensors. 
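   To make the "is_training" behavior concrete, here is a minimal NumPy sketch 
(the function name, the NHWC layout assumption, and the epsilon default are ours 
for illustration, not TVM's) of batch normalization where the statistics come 
from the run-time batch:

```python
import numpy as np

def batch_norm_training(x, gamma, beta, epsilon=1e-3):
    """Batch norm in "is_training" mode: mean and variance are computed
    from the run-time batch instead of being supplied as inputs.
    Assumes channel-last (NHWC) layout."""
    # Reduce over every axis except the channel axis (the last one).
    axes = tuple(range(x.ndim - 1))
    mean = x.mean(axis=axes)
    var = x.var(axis=axes)
    return gamma * (x - mean) / np.sqrt(var + epsilon) + beta
```

   After this transformation each channel of the output has (approximately) 
zero mean and unit variance over the batch, which is exactly what the extra 
relay nodes added by this PR compute.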
   
   We add support for BatchNorm in "is_training" mode, so that the mean and 
variance are calculated dynamically when they are not given. In the TensorFlow 
frontend, we first check the mean and variance nodes of fused_batch_norm and 
annotate them if they are empty. Then, according to the annotation, we add the 
nodes needed to calculate the mean and variance in the BatchNormToInferUnpack 
function, which is used to arrange the BatchNorm inference.
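   The frontend-side check can be pictured with the following sketch (the 
helper name and the marker suffix are hypothetical, chosen only to illustrate 
the idea of tagging an empty mean/variance input for the later rewrite pass):

```python
EMPTY_STATS_MARK = "_empty_stats"  # hypothetical marker suffix

def annotate_if_empty(name, shape):
    """If the tensor fed as mean or variance has zero elements
    (e.g. shape (0,)), tag its name so the later pass knows to
    insert run-time statistic computation instead."""
    if any(dim == 0 for dim in shape):
        return name + EMPTY_STATS_MARK
    return name
```

   In the actual PR the tag is carried in the name_hint of the corresponding 
relay variable, as described below.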
   
   In our current implementation, the annotations for the empty mean and 
variance are added to the name_hint of the corresponding variable nodes. This 
solution is simple and does not require modifying the attributes of the relay 
operator batch_norm. Alternatively, we could add a bool attribute "is_training" 
to the relay operator batch_norm: if the mean and variance are empty, 
"is_training" is set to true, and BatchNormToInferUnpack then decides from this 
attribute whether to add the nodes that calculate the mean and variance. This 
solution requires modifying the relay operator batch_norm.
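   The alternative design can be sketched as follows (a simplified NumPy model 
of what the unpacked BatchNorm would compute; the function name and the 
channel-last layout are our assumptions, not the actual relay pass):

```python
import numpy as np

def batch_norm_infer_unpack(x, gamma, beta, mean, var,
                            is_training=False, epsilon=1e-3):
    """Simplified model of the unpacked BatchNorm computation.
    With is_training=True the supplied mean/variance are ignored and
    extra "compute statistics" steps are inserted instead."""
    if is_training:
        axes = tuple(range(x.ndim - 1))  # reduce over non-channel axes
        mean = x.mean(axis=axes)
        var = x.var(axis=axes)
    return gamma * (x - mean) / np.sqrt(var + epsilon) + beta
```

   Branching on an explicit attribute keeps the decision visible in the 
operator itself, at the cost of changing the batch_norm definition; the 
name_hint approach leaves the operator untouched.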
   
   Any suggestions are welcome! @tqchen @FrozenGene

----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
[email protected]

