[GitHub] spark issue #21215: [SPARK-24148][SQL] Overloading array function to support...

2018-05-03 Thread lokm01
Github user lokm01 commented on the issue: https://github.com/apache/spark/pull/21215 @maropu Thanks! Didn't know about creating a literal this way. Don't you feel that the suggested change is way more elegant…
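(The literal-creation approach @maropu pointed to is not quoted in this snippet. As a hedged illustration only, Spark's built-in typedLit can produce an empty array literal that still carries a concrete element type; the session setup and column name below are illustrative, not from the thread.)

    import org.apache.spark.sql.SparkSession
    import org.apache.spark.sql.functions.typedLit

    val spark = SparkSession.builder().master("local[*]").appName("typed-literal").getOrCreate()

    // typedLit keeps the Scala element type, so the column resolves to
    // array<int> even though the array itself is empty.
    val df = spark.range(1).select(typedLit(Array.empty[Int]).as("empty_ints"))
    df.printSchema()
    df.show()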

[GitHub] spark issue #21215: [SPARK-24148][SQL] Overloading array function to support...

2018-05-03 Thread lokm01
Github user lokm01 commented on the issue: https://github.com/apache/spark/pull/21215 @maropu That would work if you had Scala case classes for all the types. In our case, we're working on a generic framework where we only have Spark schemas (and I'd rather not generate case classes…
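(For the schema-only situation described here, one possible sketch is to cast the untyped array() literal to an ArrayType built from the runtime DataType. The emptyArrayOf helper below is hypothetical, not code from the thread, and per the later comment this cast only behaves reliably on Spark 2.3.0+.)

    import org.apache.spark.sql.Column
    import org.apache.spark.sql.functions.array
    import org.apache.spark.sql.types.{ArrayType, DataType, IntegerType}

    // Hypothetical helper: only a runtime DataType (taken from a schema) is
    // available, so build an empty array column by casting the untyped literal.
    def emptyArrayOf(elementType: DataType): Column =
      array().cast(ArrayType(elementType))

    // Usage with an element type read from a schema rather than a case class:
    val emptyInts: Column = emptyArrayOf(IntegerType)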

[GitHub] spark issue #21215: [SPARK-24148][SQL] Overloading array function to support...

2018-05-03 Thread lokm01
Github user lokm01 commented on the issue: https://github.com/apache/spark/pull/21215 Hey @maropu, so we've encountered a number of issues with casting: 1. Casting an empty array to an array of primitive types caused an exception on 2.2.1 but works on 2.3.0+, so that's…
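(A minimal sketch of the kind of cast this comment appears to describe; the exact code from the thread is not shown in the archive, so the DataFrame and column names here are illustrative.)

    import org.apache.spark.sql.SparkSession
    import org.apache.spark.sql.functions.array

    val spark = SparkSession.builder().master("local[*]").getOrCreate()
    import spark.implicits._

    // Casting the untyped empty array() literal to a primitive element type:
    // per this comment it threw on 2.2.1 but resolves to array<int> on 2.3.0+.
    val df = Seq(1, 2).toDF("id").withColumn("empty_ints", array().cast("array<int>"))
    df.printSchema()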

[GitHub] spark issue #21121: [SPARK-24042][SQL] Collection function: zip_with_index

2018-04-27 Thread lokm01
Github user lokm01 commented on the issue: https://github.com/apache/spark/pull/21121 @ueshin Currently we use our own implementation of zipWithIndex when we do explode and need to preserve the ordering of the array elements (especially if there is a shuffle involved…
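(The custom zipWithIndex implementation mentioned here is not shown in the archive. As a rough sketch of a built-in alternative, Spark's posexplode emits each element together with its position, so ordering can be recovered after the explode; the sample data below is illustrative.)

    import org.apache.spark.sql.SparkSession
    import org.apache.spark.sql.functions.{col, posexplode}

    val spark = SparkSession.builder().master("local[*]").getOrCreate()
    import spark.implicits._

    // posexplode produces (pos, col), so each element keeps its original index
    // through the explode and any later shuffle; an orderBy on pos restores
    // the array order. This is not the custom zipWithIndex from the comment.
    val df = Seq((1, Seq("a", "b", "c"))).toDF("id", "xs")
    val exploded = df.select(col("id"), posexplode(col("xs")))  // columns: id, pos, col
    exploded.orderBy("id", "pos").show()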