masahi commented on issue #4570: [relay] Relay annotation and partitioning for 
external compilers
URL: https://github.com/apache/incubator-tvm/pull/4570#issuecomment-569193712
 
 
  @zhiics When I run the mobilenet test and dump the annotated graph, batch norm is not converted to the dnnl one.
   ```
     %150 = fn (%dnnl_input75: Tensor[(1, 1024, 7, 7), float32], %dnnl_input76: 
Tensor[(1024, 1, 3, 3), float32], Compiler="dnnl", ExternalSymbol="dnnl_4", 
Primitive=1) -> Tensor[(1, 1024, 7, 7), float32] {
       nn.conv2d(%dnnl_input75, %dnnl_input76, padding=[1, 1], groups=1024, 
channels=1024, kernel_size=[3, 3]) /* ty=Tensor[(1, 1024, 7, 7), float32] */
     };
     %151 = %150(%149, %separable_conv_block_13_depthwise_conv1_weight) /* 
ty=Tensor[(1, 1024, 7, 7), float32] */;
     %152 = nn.batch_norm(%151, %separable_conv_block_13_bn1_gamma, 
%separable_conv_block_13_bn1_beta, %separable_conv_block_13_bn1_moving_mean, 
%separable_conv_block_13_bn1_moving_var) /* ty=(Tensor[(1, 1024, 7, 7), 
float32], Tensor[(1024), float32], Tensor[(1024), float32]) */;
     %153 = %152.0;
     %154 = fn (%dnnl_input77: Tensor[(1, 1024, 7, 7), float32], 
Compiler="dnnl", ExternalSymbol="dnnl_3", Primitive=1) -> Tensor[(1, 1024, 7, 
7), float32] {
       nn.relu(%dnnl_input77) /* ty=Tensor[(1, 1024, 7, 7), float32] */
     };
   ```
  It is probably because batch norm is decomposed during SimplifyInference. Is batch norm supposed to still be around at external codegen time? I think it should be. I never hit the `else if (IsOp(call, "nn.batch_norm"))` condition [here](https://github.com/apache/incubator-tvm/blob/master/src/relay/backend/contrib/dnnl/codegen.cc#L73) during the mobilenet test.
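
  For reference on why the op vanishes: at inference time SimplifyInference rewrites each `nn.batch_norm` into plain elementwise multiply/add (the per-channel scale and shift are folded from gamma/beta/mean/var), so by the time the annotator runs there is no `nn.batch_norm` call left to match. A minimal standalone sketch of that algebraic rewrite (pure Python, illustrative only, not TVM code):

  ```python
  import math

  def batch_norm(x, gamma, beta, mean, var, eps=1e-5):
      # Reference semantics of nn.batch_norm in inference mode, per channel.
      return [g * (xi - m) / math.sqrt(v + eps) + b
              for xi, g, b, m, v in zip(x, gamma, beta, mean, var)]

  def simplified_batch_norm(x, gamma, beta, mean, var, eps=1e-5):
      # What SimplifyInference leaves behind: scale and shift become
      # constants, and bn turns into a multiply followed by an add.
      scale = [g / math.sqrt(v + eps) for g, v in zip(gamma, var)]
      shift = [b - m * s for b, m, s in zip(beta, mean, scale)]
      return [xi * s + sh for xi, s, sh in zip(x, scale, shift)]

  x     = [1.0, 2.0, 3.0]
  gamma = [0.5, 1.0, 2.0]
  beta  = [0.1, 0.2, 0.3]
  mean  = [0.9, 1.8, 2.7]
  var   = [1.0, 4.0, 9.0]

  a = batch_norm(x, gamma, beta, mean, var)
  b = simplified_batch_norm(x, gamma, beta, mean, var)
  assert all(abs(u - v) < 1e-9 for u, v in zip(a, b))
  ```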
    
  I want to see an example of matching a pattern like Conv + BN + Relu and translating it to dnnl's fused op. It would be great if you could do that in this or a future PR; otherwise I'll do it as practice and send a PR :) 
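
  To sketch what that matching amounts to: it is a bottom-up walk over the call graph that checks op names along a chain and collapses a match into one fused call. This is a toy standalone illustration (made-up node encoding and the fused op name `dnnl.fused_conv_bn_relu` are hypothetical, not TVM's pass machinery):

  ```python
  # Toy expression tree: each node is (op_name, [args]); leaves are strings.

  def match_conv_bn_relu(node):
      """Return the conv node if `node` is relu(batch_norm(conv2d(...))), else None."""
      if not isinstance(node, tuple) or node[0] != "nn.relu":
          return None
      bn = node[1][0]
      if not isinstance(bn, tuple) or bn[0] != "nn.batch_norm":
          return None
      conv = bn[1][0]
      if isinstance(conv, tuple) and conv[0] == "nn.conv2d":
          return conv
      return None

  def rewrite(node):
      """Bottom-up rewrite: collapse matched chains into one fused op."""
      if not isinstance(node, tuple):
          return node
      op, args = node
      node = (op, [rewrite(a) for a in args])
      conv = match_conv_bn_relu(node)
      if conv is not None:
          # The fused op takes the conv inputs plus the bn parameters.
          bn = node[1][0]
          return ("dnnl.fused_conv_bn_relu", conv[1] + bn[1][1:])
      return node

  expr = ("nn.relu", [("nn.batch_norm",
                       [("nn.conv2d", ["data", "weight"]),
                        "gamma", "beta", "mean", "var"])])
  fused = rewrite(expr)
  ```

  The real version would of course operate on Relay `CallNode`s and emit the dnnl fused primitive, but the shape of the traversal is the same.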
