tqchen edited a comment on pull request #7060:
URL: https://github.com/apache/tvm/pull/7060#issuecomment-741822077


   Thanks @yongwww. First of all, I do not disagree with the general goal of 
having a relay dialect in MLIR. The main discussion point is how we can get to 
that point.
   
   Right now we need to expose relay operators in several places: (1) 
python; (2) c++ registration; (3) rust; (4) TableGen (if we go with the MLIR 
route). Adding a new point of exposure would indeed increase the complexity of 
adding a new relay operator.
   
   If we can first streamline the operator schema itself (perhaps related to 
the object def schema 
https://discuss.tvm.apache.org/t/rfc-tvm-object-schema-dsl/7930), then there 
will be less resistance to adding an MLIR relay dialect later. 
   
   Right now, a direct translation might be the path of least resistance, given 
that: (1) we can always switch to a dialect later and most of the code is 
similar; (2) it would be easier to quickly expand the support; (3) we might 
need some effort to figure out the control flow support/effect translation 
first, then think about how a complete dialect can be built. 
   
   Adding support for translation from other MLIR dialects is not hard, as most 
of them share the same graph structure, though it would indeed benefit from a 
dialect. The main complexity, however, may lie not in the op translation part 
but in the translation of types, control flow and effects. So it would be nice 
to streamline these parts starting from the path of least resistance.
   

----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
[email protected]
