Lunderberg commented on code in PR #16094:
URL: https://github.com/apache/tvm/pull/16094#discussion_r1391909243


##########
tests/python/relax/distributed/test_distributed_transform_propagate_sharding.py:
##########
@@ -1060,16 +1738,19 @@ def test_mlp_pipeline_parallelism():
 
 
 def test_decoder_layer():
-    # mod = relax.transform.LegalizeOps({"relax.reshape": lambda bb, call: bb.normalize(call)})(LlamaAttentionLayer)
-    mod = LlamaAttentionLayer
-    after = relax.distributed.transform.PropagateSharding()(mod)
+    after = relax.distributed.transform.PropagateSharding()(LlamaAttentionLayer)
     assert_structural_equal(after, ShardedLlamaAttentionLayer)
 
 
-def test_decoder_layer_dynamic_shape():
-    # mod = relax.transform.LegalizeOps({"relax.reshape": lambda bb, call: bb.normalize(call)})(LlamaAttentionLayer)
-    mod = LlamaAttentionLayerDynamicShape
+def test_decoder_layer_tir():
+    mod = relax.transform.LegalizeOps()(LlamaAttentionLayer)

Review Comment:
   Hmm, thank you for the link; I was unaware of the distinction.  I'm a bit worried about that later lowering pass, and whether it would require the PrimFunc to be dynamic along the split axis for correctness, but having the global vs. local distinction makes sense.
   
   I don't think my late arrival on the design discussion should delay this PR, and it shouldn't be blocking.  I think improving the readability of the test itself is the main thing that would still be useful to do.  The part that took me the longest to recognize was that all the PrimFunc definitions were unchanged.  Can we explicitly build `ShardedLlamaAttentionLayerTIR` by copying the `PrimFunc` objects from the original module, and only providing an updated version of the `@R.function \n def foo()`?



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]

For queries about this service, please contact Infrastructure at:
[email protected]
