[GitHub] [incubator-tvm] mbaret commented on pull request #5689: [PatternLang] Add ConstantPattern
mbaret commented on pull request #5689: URL: https://github.com/apache/incubator-tvm/pull/5689#issuecomment-635665317

It's not really the case that the old merge composite explicitly 'lifts' constant nodes when they're vars; in the old merge composite, vars indicate inputs to the pattern (so they don't need to explicitly match against a VarNode). My expectation is that if the pattern requires a constant node, I should see that constant node in the partitioned function. In other words, the body of the function should still match the pattern used to create it.

This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. For queries about this service, please contact Infrastructure at: us...@infra.apache.org
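The invariant being asked for here, that a partitioned function's body should still match the pattern that produced it, can be sketched with a toy matcher. This is not the TVM API; expressions and patterns are just nested tuples, and the names (`"call"`, `"const"`, `"var"`, `"wildcard"`) are made up for illustration.

```python
def matches(pattern, expr):
    """Structurally match a toy pattern against a toy expression.
    ('wildcard',) matches anything; ('const', _) only matches constants."""
    kind = pattern[0]
    if kind == "wildcard":
        return True
    if kind == "const":
        return expr[0] == "const"
    if kind == "call":
        # Op name must agree and every argument pattern must match.
        return (expr[0] == "call"
                and expr[1] == pattern[1]
                and len(expr[2]) == len(pattern[2])
                and all(matches(p, a) for p, a in zip(pattern[2], expr[2])))
    return False

# Pattern: conv2d(anything, constant-weight)
pattern = ("call", "conv2d", [("wildcard",), ("const", None)])

# Partitioned body that kept the constant embedded: still matches.
embedded = ("call", "conv2d", [("var", "input"), ("const", "w")])
# Body where the constant was lifted to a var: no longer matches the pattern.
lifted = ("call", "conv2d", [("var", "input"), ("var", "w")])

print(matches(pattern, embedded))  # True
print(matches(pattern, lifted))    # False
```

Under this reading, lifting an explicitly matched constant breaks the invariant, because the lifted var no longer satisfies the constant pattern.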
mbaret commented on pull request #5689: URL: https://github.com/apache/incubator-tvm/pull/5689#issuecomment-635650558

Try changing `w` in the pattern to a `relay.const` rather than a var.
mbaret commented on pull request #5689: URL: https://github.com/apache/incubator-tvm/pull/5689#issuecomment-635640183

```
%2 = fn (%input: Tensor[(1, 224, 224, 3), uint8], Composite="qnn_conv2d") -> Tensor[(1, 224, 224, 64), uint8] {
  %0 = qnn.conv2d(%input, meta[relay.Constant][0] /* ty=Tensor[(3, 3, 3, 64), uint8] */, 115 /* ty=int32 */, 134 /* ty=int32 */, 1f /* ty=float32 */, 0.00503911f /* ty=float32 */, padding=[1, 1, 1, 1], channels=64, kernel_size=[3, 3], data_layout="NHWC", kernel_layout="HWIO", out_dtype="int32") /* ty=Tensor[(1, 224, 224, 64), int32] */;
  %1 = nn.bias_add(%0, meta[relay.Constant][1] /* ty=Tensor[(64), int32] */, axis=3) /* ty=Tensor[(1, 224, 224, 64), int32] */;
  qnn.requantize(%1, 0.00503911f /* ty=float32 */, 0 /* ty=int32 */, 3.26957f /* ty=float32 */, 0 /* ty=int32 */, out_dtype="uint8") /* ty=Tensor[(1, 224, 224, 64), uint8] */
};
```

This would be an example of the composite function I'd expect if I matched the weight of the conv as a constant (this is the old merge composite behaviour for that case). I think the most logical behaviour is for constants to remain embedded where they are explicitly part of the pattern, and to be lifted where they are not.
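The embed-vs-lift policy proposed above can be sketched as a toy partitioning step. Again, this is not TVM code; the tuple representation, the `partition` helper, and the `matched_as_constant` parameter are all invented for illustration.

```python
def partition(args, matched_as_constant):
    """Split call arguments into lifted function parameters and embedded
    constants, given the set of argument indices the pattern explicitly
    matched with a constant pattern."""
    params, body_args = [], []
    for i, arg in enumerate(args):
        if arg[0] == "const" and i in matched_as_constant:
            body_args.append(arg)          # keep the constant in the body
        else:
            name = "p%d" % len(params)     # lift: replace with a fresh var
            params.append((name, arg))
            body_args.append(("var", name))
    return params, body_args

args = [("var", "input"), ("const", "weight"), ("const", "bias")]

# The pattern matched the weight (index 1) explicitly as a constant,
# but said nothing constant-specific about the bias (index 2).
params, body = partition(args, matched_as_constant={1})
print(params)  # [('p0', ('var', 'input')), ('p1', ('const', 'bias'))]
print(body)    # [('var', 'p0'), ('const', 'weight'), ('var', 'p1')]
```

The weight stays embedded in the partitioned body (like `meta[relay.Constant][0]` above), while the unmatched constant is lifted into a parameter of the new function.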
mbaret commented on pull request #5689: URL: https://github.com/apache/incubator-tvm/pull/5689#issuecomment-635633008

Thanks for the PR so soon! Is there an example of how partition works on a constant match? In particular, does the constant remain propagated into the body? On a general point, it helps reviewers if reformats are kept separate from feature additions so it's easy to see what the changes are.