xqdan edited a comment on pull request #8079:
URL: https://github.com/apache/tvm/pull/8079#issuecomment-851468841


   > Hi @xqdan , I am thinking to use relay IR and TE in a hybrid way but that 
might beyond this PR. I will approve it and wait to see if @altanh and 
@zackcquic have any opinion for your updated changes.
   
   Thanks @ZihengJiang 
   Do you mean defining a custom op through Relay IR, like relay.add, relay.sub, ..., and then registering it?
   We are trying to do exactly that; here is an example:
   ```python
   def layernorm_register_func():
       data = relay.var("input", dtype="float16")
       gamma = relay.var("gamma", dtype="float16")
       beta = relay.var("beta", dtype="float16")
       mean = relay.mean(data, -1, True, False)
       sub_mean = relay.subtract(data, mean)
       mean_square = relay.multiply(sub_mean, sub_mean)
       variance = relay.mean(mean_square, -1, True, False)
       # scalar-add op from our NPU extension (not an upstream Relay op)
       z = rel.npu.scalaradd(variance, 1e-5)
       zz = relay.sqrt(z)
       u = relay.divide(sub_mean, zz)
       v = relay.multiply(u, gamma)
       w = relay.add(v, beta)
       return relay.Function([data, gamma, beta], w).with_attr("Primitive", 1)
   ```
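   For reference, the graph above composes to standard layer normalization over the last axis (the `relay.mean(..., -1, True, False)` calls keep the reduced dimension, and the scalar add supplies the 1e-5 epsilon). A minimal NumPy sketch of the same computation, just for checking numerics, not part of the Relay registration:

   ```python
   import numpy as np

   def layernorm(x, gamma, beta, eps=1e-5):
       # mean over the last axis; keepdims=True matches relay.mean(..., -1, True, False)
       mean = x.mean(axis=-1, keepdims=True)
       sub_mean = x - mean
       # biased variance, as computed by mean of squared deviations in the graph
       variance = (sub_mean * sub_mean).mean(axis=-1, keepdims=True)
       # normalize, then apply the learned scale and shift
       return sub_mean / np.sqrt(variance + eps) * gamma + beta
   ```

   A quick sanity check is that with `gamma=1, beta=0` the output rows have mean ~0 and variance ~1 (slightly below 1 because of `eps`).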


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
[email protected]

