Github user srowen commented on the issue:
https://github.com/apache/spark/pull/14068
It does make sense. Please file a JIRA and link it to this PR, though.
---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well.
Github user uzadude commented on the issue:
https://github.com/apache/spark/pull/14068
Sure.
The current method for multiplying distributed block matrices starts by
deciding which block should be shuffled to which partition to do the actual
multiplications. This stage is
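The block-to-partition assignment described above can be sketched roughly as follows. This is a minimal, illustrative Python sketch under assumed conventions (a simple grid partitioner over the result blocks; the function names are hypothetical), not Spark's actual implementation: for C = A * B, block A(i, k) contributes to every result block C(i, j), and block B(k, j) to every C(i, j), so each input block must be shuffled to every partition that owns one of its result blocks.

```python
def grid_partition(i, j, row_blocks_per_part, col_blocks_per_part, num_col_parts):
    # Partition id owning result block C(i, j) under a simple grid layout:
    # partitions tile the block grid in row-major order.
    return (i // row_blocks_per_part) * num_col_parts + (j // col_blocks_per_part)

def destinations_for_A(i, k, num_c_col_blocks, rbp, cbp, ncp):
    # A(i, k) is needed for C(i, j) for every column block j of the result,
    # so it is shuffled to each partition owning some C(i, j).
    return {grid_partition(i, j, rbp, cbp, ncp) for j in range(num_c_col_blocks)}

def destinations_for_B(k, j, num_c_row_blocks, rbp, cbp, ncp):
    # Symmetrically, B(k, j) is needed for C(i, j) for every row block i.
    return {grid_partition(i, j, rbp, cbp, ncp) for i in range(num_c_row_blocks)}

# Example: a 4x4 grid of result blocks, 2x2 blocks per partition,
# 2 partition columns. A(0, k) only ever lands in the top partition row.
print(destinations_for_A(0, 1, 4, 2, 2, 2))  # {0, 1}
print(destinations_for_B(1, 3, 4, 2, 2, 2))  # {1, 3}
```

Planning these destinations up front lets each input block be sent once per target partition instead of once per multiplication, which is what makes the shuffle stage worth optimizing.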
Github user srowen commented on the issue:
https://github.com/apache/spark/pull/14068
I don't think this is trivial. You also need to explain the change in more
detail.
Github user uzadude commented on the issue:
https://github.com/apache/spark/pull/14068
Hi srowen,
I have read the "how to contribute" wiki. I thought it was too small an
enhancement to open a JIRA for, and it passes the tests.
Github user srowen commented on the issue:
https://github.com/apache/spark/pull/14068
Please read
https://cwiki.apache.org/confluence/display/SPARK/Contributing+to+Spark
Github user AmplabJenkins commented on the issue:
https://github.com/apache/spark/pull/14068
Can one of the admins verify this patch?