Github user HeartSaVioR commented on the issue:
https://github.com/apache/spark/pull/21388
@hvanhovell
To be honest, I found the rationale for this change in a comment in the Spark code:
https://github.com/apache/spark/blob/4c388bccf1bcac8f833fd9214096dd164c3ea065/sql/core/src/main/scala/org/apache/spark/sql/execution/SparkStrategies.scala#L496-L497
and I thought the comment made sense: it would be beneficial if we could simply
couple each matching pair of (LogicalPlan, SparkPlan) for the cases that don't
require any transformation during planning.
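
Just to make the idea concrete, here is a minimal sketch with toy stand-in types
(not Spark's actual LogicalPlan/SparkPlan/Strategy classes) of what explicitly
coupling each pass-through pair looks like today:

```scala
// Toy stand-ins for illustration only; Spark's real plan nodes are far richer.
sealed trait LogicalPlan
case class LogicalLimit(limit: Int, child: LogicalPlan) extends LogicalPlan
case class LogicalScan(table: String) extends LogicalPlan

sealed trait SparkPlan
case class LimitExec(limit: Int, child: SparkPlan) extends SparkPlan
case class ScanExec(table: String) extends SparkPlan

object PassThroughPlanner {
  // Every pass-through pairing is spelled out by hand: the same fields are copied
  // from the logical node to its physical counterpart, and only children get
  // re-planned. This boilerplate is what the comment in SparkStrategies.scala
  // asks about automating.
  def plan(logical: LogicalPlan): SparkPlan = logical match {
    case LogicalLimit(limit, child) => LimitExec(limit, plan(child))
    case LogicalScan(table)         => ScanExec(table)
  }
}
```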
At first I tried my best to stick with compile-time mechanisms, but after a couple
of hours I realized it is not achievable (at least for me) without runtime
reflection. So another couple of hours went into actually resolving it that way.
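
For reference, the reflection-based direction boils down to something like the
following, reusing the toy types from the sketch above. This is just a hand-wavy
simplification under my own assumptions (a hypothetical class-to-class registry),
not what the patch actually does:

```scala
object ReflectivePlanner {
  // Hypothetical registry of pass-through pairs: logical class -> physical class.
  private val pairs: Map[Class[_], Class[_]] = Map(
    classOf[LogicalLimit] -> classOf[LimitExec],
    classOf[LogicalScan]  -> classOf[ScanExec]
  )

  // Copy the logical node's constructor arguments, recursively planning children,
  // and instantiate the physical counterpart via runtime reflection.
  def plan(logical: LogicalPlan): SparkPlan = {
    val physicalClass = pairs(logical.getClass)
    val args = logical.asInstanceOf[Product].productIterator.map {
      case child: LogicalPlan => plan(child)
      case other              => other
    }.map(_.asInstanceOf[AnyRef]).toArray
    val ctor = physicalClass.getConstructors.head
    ctor.newInstance(args: _*).asInstanceOf[SparkPlan]
  }
}
```

Even in this toy form the catch is visible: the 1:1 field correspondence is only an
assumption that holds for true pass-through nodes, and any mismatch surfaces at
runtime instead of at compile time.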
I have no strong opinion on adopting reflection in the planner (so I'm happy to see
the approach rejected), but if we agree it cannot be handled without reflection,
the original comment should either be removed or updated to describe the
limitations, so that others who try this later can work around them.