[GitHub] [incubator-tvm] soiferj edited a comment on issue #4879: [Relay][Pass] Fix bug in re-processing call node in MergeComposite pass

2020-02-14 Thread GitBox
soiferj edited a comment on issue #4879: [Relay][Pass] Fix bug in re-processing 
call node in MergeComposite pass
URL: https://github.com/apache/incubator-tvm/pull/4879#issuecomment-586529239
 
 
   This test (test_branch_merge) is now failing at the de-duplicate pass. It's 
interesting: the "correct" (result) graph contains two copies of the exact same 
function, whereas the "expected" graph just re-uses a single function. What 
should the correct behavior be?
   
   Update: I am still having some trouble with this test. I will look at it next 
week. I am wondering if we should check this fix in and update the tests 
wholesale in another PR.
   
   I was able to fix this test. I will push temporary changes so I can work from 
another machine this weekend.
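   
   For illustration (not the actual test code), here is a rough Python sketch of 
the two shapes I mean. The helper and variable names are placeholders, it 
assumes the `relay.analysis.alpha_equal` helper available at this point, and 
whether the two forms should compare equal is exactly the open question above.
   
   ```python
   from tvm import relay
   
   def make_composite(prefix):
       # Placeholder helper: one add_sub_mul-style function (attributes omitted).
       x = relay.var(prefix + "_x", shape=(10, 10))
       y = relay.var(prefix + "_y", shape=(10, 10))
       return relay.Function([x, y], relay.multiply(relay.add(x, y), relay.subtract(x, y)))
   
   a = relay.var("a", shape=(10, 10))
   b = relay.var("b", shape=(10, 10))
   c = relay.var("c", shape=(10, 10))
   
   # "expected" shape: one function node re-used at both call sites.
   shared = make_composite("in")
   expected = relay.Function(
       [a, b, c], relay.nn.relu(relay.Call(shared, [c, relay.Call(shared, [a, b])])))
   
   # "result" shape: two distinct, structurally identical function nodes.
   f1 = make_composite("x")
   f2 = make_composite("x2")
   result = relay.Function(
       [a, b, c], relay.nn.relu(relay.Call(f2, [c, relay.Call(f1, [a, b])])))
   
   print(relay.analysis.alpha_equal(expected, result))
   ```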


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [incubator-tvm] soiferj edited a comment on issue #4879: [Relay][Pass] Fix bug in re-processing call node in MergeComposite pass

2020-02-14 Thread GitBox
soiferj edited a comment on issue #4879: [Relay][Pass] Fix bug in re-processing 
call node in MergeComposite pass
URL: https://github.com/apache/incubator-tvm/pull/4879#issuecomment-586527376
 
 
   This is actually a good exercise, as some other tests are now failing due to 
the graphs not being the same, for example `test_branch_merge`. @mbarrett97, 
I'll push the changes; let me know what you think.
   
   Expected:
   ```
   fn (%a: Tensor[(10, 10), float32], %b: Tensor[(10, 10), float32], %c: Tensor[(10, 10), float32]) -> Tensor[(10, 10), float32] {
     %2 = fn (%in_1: Tensor[(10, 10), float32], %in_2: Tensor[(10, 10), float32], Composite="add_sub_mul", Primitive=1) -> Tensor[(10, 10), float32] {
       %0 = add(%in_1, %in_2) /* ty=Tensor[(10, 10), float32] */;
       %1 = subtract(%in_1, %in_2) /* ty=Tensor[(10, 10), float32] */;
       multiply(%0, %1) /* ty=Tensor[(10, 10), float32] */
     };
     %3 = %2(%a, %b) /* ty=Tensor[(10, 10), float32] */;
     %4 = %2(%c, %3) /* ty=Tensor[(10, 10), float32] */;
     nn.relu(%4) /* ty=Tensor[(10, 10), float32] */
   }
   ```
   
   Result:
   ```
   fn (%a: Tensor[(10, 10), float32], %b: Tensor[(10, 10), float32], %c: Tensor[(10, 10), float32]) -> Tensor[(10, 10), float32] {
     %2 = fn (%x: Tensor[(10, 10), float32], %y: Tensor[(10, 10), float32], Primitive=1, Composite="add_sub_mul") -> Tensor[(10, 10), float32] {
       %0 = add(%x, %y) /* ty=Tensor[(10, 10), float32] */;
       %1 = subtract(%x, %y) /* ty=Tensor[(10, 10), float32] */;
       multiply(%0, %1) /* ty=Tensor[(10, 10), float32] */
     };
     %3 = %2(%a, %b) /* ty=Tensor[(10, 10), float32] */;
     %6 = fn (%x1: Tensor[(10, 10), float32], %y1: Tensor[(10, 10), float32], Primitive=1, Composite="add_sub_mul") -> Tensor[(10, 10), float32] {
       %4 = add(%x1, %y1) /* ty=Tensor[(10, 10), float32] */;
       %5 = subtract(%x1, %y1) /* ty=Tensor[(10, 10), float32] */;
       multiply(%4, %5) /* ty=Tensor[(10, 10), float32] */
     };
     %7 = %6(%c, %3) /* ty=Tensor[(10, 10), float32] */;
     nn.relu(%7) /* ty=Tensor[(10, 10), float32] */
   }
   ```
   
   +1 that we should confirm which argument order is expected for alpha_equal.
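   
   To make the order question concrete, a minimal sketch of the kind of check I 
mean (placeholder expressions, assuming the `relay.analysis.alpha_equal` helper 
from this checkout):
   
   ```python
   from tvm import relay
   
   def check_both_orders(result, expected):
       # If AlphaEqual only iterates over the LHS args/attributes, these two
       # calls can disagree, so the argument order the tests use matters.
       assert relay.analysis.alpha_equal(result, expected)
       assert relay.analysis.alpha_equal(expected, result)
   
   x = relay.var("x", shape=(10, 10))
   check_both_orders(relay.Function([x], relay.nn.relu(x)),
                     relay.Function([x], relay.nn.relu(x)))
   ```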


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [incubator-tvm] soiferj edited a comment on issue #4879: [Relay][Pass] Fix bug in re-processing call node in MergeComposite pass

2020-02-14 Thread GitBox
soiferj edited a comment on issue #4879: [Relay][Pass] Fix bug in re-processing 
call node in MergeComposite pass
URL: https://github.com/apache/incubator-tvm/pull/4879#issuecomment-586521847
 
 
   Actually, are you sure the second argument is expected? It looks like 
AlphaEqual loops through the LHS args. This has some weird implications: when I 
run the other tests and flip the order of `result` and `expected` in 
alpha_equal, they fail.
   
   Maybe this is because the "expected" graph doesn't have the Composite and 
Primitive attributes?
   
   Update: that is indeed the part that is returning false. I think "expected" 
should be the first argument, and we need to set the attributes on the function.
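   
   A minimal sketch of what setting those attributes on a hand-built "expected" 
function could look like. This is not the exact test code, and the attribute API 
has changed across versions (`with_attr` below is the newer spelling; older 
checkouts used a `set_attribute`-style helper):
   
   ```python
   from tvm import relay
   
   in_1 = relay.var("in_1", shape=(10, 10))
   in_2 = relay.var("in_2", shape=(10, 10))
   mul = relay.multiply(relay.add(in_1, in_2), relay.subtract(in_1, in_2))
   
   # Mark the "expected" function the same way MergeComposite marks the "result"
   # function, so the attribute comparison in alpha_equal has a chance to pass.
   add_sub_mul = relay.Function([in_1, in_2], mul)
   add_sub_mul = add_sub_mul.with_attr("Composite", "add_sub_mul")
   add_sub_mul = add_sub_mul.with_attr("Primitive", 1)
   ```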


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [incubator-tvm] soiferj edited a comment on issue #4879: [Relay][Pass] Fix bug in re-processing call node in MergeComposite pass

2020-02-14 Thread GitBox
soiferj edited a comment on issue #4879: [Relay][Pass] Fix bug in re-processing 
call node in MergeComposite pass
URL: https://github.com/apache/incubator-tvm/pull/4879#issuecomment-586472422
 
 
   I'll do some investigation :)
   
   Update: in alpha_equal.cc, `relay._make._graph_equal` actually calls 
`AlphaEqualHandler(...).Equal(...)`. AlphaEqual and GraphEqual are the same 
function.
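   
   A quick way to double-check that from the Python side (a small sketch, 
assuming both wrappers are exposed under `relay.analysis` in this checkout):
   
   ```python
   from tvm import relay
   
   x = relay.var("x", shape=(10, 10))
   y = relay.var("y", shape=(10, 10))
   lhs = relay.Function([x, y], relay.add(x, y))
   rhs = relay.Function([x, y], relay.add(x, y))
   
   # Both calls bottom out in AlphaEqualHandler::Equal, so they should agree here.
   print(relay.analysis.alpha_equal(lhs, rhs), relay.analysis.graph_equal(lhs, rhs))
   ```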


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [incubator-tvm] soiferj edited a comment on issue #4879: [Relay][Pass] Fix bug in re-processing call node in MergeComposite pass

2020-02-14 Thread GitBox
soiferj edited a comment on issue #4879: [Relay][Pass] Fix bug in re-processing 
call node in MergeComposite pass
URL: https://github.com/apache/incubator-tvm/pull/4879#issuecomment-586428127
 
 
   Yes, they pass the alpha_equal check.


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [incubator-tvm] soiferj edited a comment on issue #4879: [Relay][Pass] Fix bug in re-processing call node in MergeComposite pass

2020-02-14 Thread GitBox
soiferj edited a comment on issue #4879: [Relay][Pass] Fix bug in re-processing 
call node in MergeComposite pass
URL: https://github.com/apache/incubator-tvm/pull/4879#issuecomment-586412142
 
 
   @mbarrett97 @comaniac I just pushed a test where "result" creates an 
incorrect graph and "expected" is correct. Even though the two graphs are 
different, and "result" generates an incorrect function, the two generated 
functions are computationally equivalent. This means the test passes 
`alpha_equal` both with and without the bug.
   
   It is still worth fixing this bug, since the problem blows up when matching 
large patterns, but I am not sure how to make the test actually fail for output 
like this. Do you have any suggestions?
   
   Here are the Relay outputs. The pattern I am trying to match is `add -> add 
-> add`.
   
   Result (incorrect output: the generated function computes `add -> add -> add 
-> add`):
   ```
   v0.0.4
   fn (%a: Tensor[(10, 10), float32], %b: Tensor[(10, 10), float32]) -> Tensor[(10, 10), float32] {
     %0 = subtract(%a, %b) /* ty=Tensor[(10, 10), float32] */;
     %4 = fn (%x: Tensor[(10, 10), float32], %y: Tensor[(10, 10), float32], Primitive=1, Composite="add_add_add") -> Tensor[(10, 10), float32] {
       %1 = add(%x, %y) /* ty=Tensor[(10, 10), float32] */;
       %2 = add(%x, %1) /* ty=Tensor[(10, 10), float32] */;
       %3 = add(%x, %y) /* ty=Tensor[(10, 10), float32] */;
       add(%2, %3) /* ty=Tensor[(10, 10), float32] */
     };
     %4(%0, %b) /* ty=Tensor[(10, 10), float32] */
   }
   ```
   
   Expected (correct output):
   ```
   v0.0.4
   fn (%a: Tensor[(10, 10), float32], %b: Tensor[(10, 10), float32]) -> Tensor[(10, 10), float32] {
     %0 = subtract(%a, %b) /* ty=Tensor[(10, 10), float32] */;
     %3 = fn (%in_1: Tensor[(10, 10), float32], %in_2: Tensor[(10, 10), float32]) -> Tensor[(10, 10), float32] {
       %1 = add(%in_1, %in_2) /* ty=Tensor[(10, 10), float32] */;
       %2 = add(%in_1, %1) /* ty=Tensor[(10, 10), float32] */;
       add(%2, %1) /* ty=Tensor[(10, 10), float32] */
     };
     %3(%0, %b) /* ty=Tensor[(10, 10), float32] */
   }
   ```
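   
   One way to make the test fail for this kind of output could be a structural 
check that alpha_equal cannot paper over, for example counting the calls inside 
the merged composite function. A rough sketch (not the actual test; the 
`composite_fn` name is a placeholder, and it assumes 
`relay.analysis.post_order_visit` is available):
   
   ```python
   from tvm import relay
   
   def count_op_calls(func, op_name):
       # Count calls to the given operator inside a Relay function body.
       target_op = relay.op.get(op_name)
       count = [0]
   
       def visit(node):
           if isinstance(node, relay.Call) and node.op == target_op:
               count[0] += 1
   
       relay.analysis.post_order_visit(func.body, visit)
       return count[0]
   
   # After running MergeComposite, locate the composite function and assert it
   # contains exactly the three adds of the add -> add -> add pattern:
   # assert count_op_calls(composite_fn, "add") == 3
   ```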


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services

