Github user mn-mikke commented on the issue:
https://github.com/apache/spark/pull/21045
@DylanGuedes What about `CodeGenerator.getValue(s"arrVals[$j]",
jThChildDataType, i)`?
I recommend using a debugger to check and understand what gets
generated out of your code, or if
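For context, a hedged sketch of what that call does (assuming Spark on the classpath; `arrVals`, `jThChildDataType` and the loop variable `i` are placeholders from the comment, not code in this PR):
```
import org.apache.spark.sql.catalyst.expressions.codegen.CodeGenerator
import org.apache.spark.sql.types.IntegerType

// CodeGenerator.getValue returns a Java *expression string*, not a value: for an
// IntegerType element it yields something like "arrVals[0].getInt(i)", which is
// then spliced into the generated code that loops over i.
val accessor: String = CodeGenerator.getValue("arrVals[0]", IntegerType, "i")
```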
Github user DylanGuedes commented on the issue:
https://github.com/apache/spark/pull/21045
@mn-mikke I thought that `CodeGenerator.getValue` was directly used to
retrieve values from 1d arrays (such as an `ArrayData`), but I don't get how to
use it for 2d arrays (such as
Github user mn-mikke commented on the issue:
https://github.com/apache/spark/pull/21045
@DylanGuedes What do you mean by 2d structure? Evaluation of any child
should produce `null` or an instance of `ArrayData`. `CodeGenerator.getValue`
should work. Isn't there any other reason?
Github user DylanGuedes commented on the issue:
https://github.com/apache/spark/pull/21045
@mn-mikke thank you! Any idea on how to access elements of individual
arrays? In the old version I wrote a 'getValue' that uses
`CodeGenerator.getValue`, but since it is now 2d data
Github user mn-mikke commented on the issue:
https://github.com/apache/spark/pull/21045
@DylanGuedes What about `eval.value`?
Example:
```
val evals = children.map(_.genCode(ctx))
val args = ctx.freshName("args")
val inputs = evals.zipWithIndex.map { case (eval, index) =>
  // assumed completion of the truncated example: stash each child's ArrayData
  s"$args[$index] = ${eval.value};"
}
```
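A hedged sketch of how this could combine with the earlier `CodeGenerator.getValue` suggestion (it reuses `children` and `args` from the snippet above; the element-type handling is an assumption, not the PR's code):
```
// Continuation sketch: args holds each child's evaluated ArrayData, and
// CodeGenerator.getValue builds the per-element read used inside the generated loop.
val elementTypes = children.map(_.dataType.asInstanceOf[ArrayType].elementType)
val reads = elementTypes.zipWithIndex.map { case (et, j) =>
  CodeGenerator.getValue(s"$args[$j]", et, "i")  // e.g. "args[0].getInt(i)"
}
```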
Github user kiszk commented on the issue:
https://github.com/apache/spark/pull/21045
`UT` stands for unit test. Developers usually use IntelliJ, which is highly
recommended.
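As an illustration, a test sketch in the style of `CollectionExpressionsSuite` (which mixes in `ExpressionEvalHelper`); `ArraysZip` is only a placeholder for whatever this PR names its expression:
```
// Sketch: checkEvaluation exercises both the interpreted and the codegen path,
// so a breakpoint placed in doGenCode will be hit when the test runs.
test("zip two arrays - sketch") {
  val a1 = Literal.create(Seq(1, 2, 3), ArrayType(IntegerType))
  val a2 = Literal.create(Seq("a", "b", "c"), ArrayType(StringType))
  checkEvaluation(ArraysZip(Seq(a1, a2)), Seq(Row(1, "a"), Row(2, "b"), Row(3, "c")))
}
```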
Github user DylanGuedes commented on the issue:
https://github.com/apache/spark/pull/21045
@mgaido91 thank you, the suggestions were VERY enlightening! You are
correct, I tried to return the expected output in `doGenCode`; going by other
implementations I thought that it was
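For readers following along, a minimal sketch of the `doGenCode` contract using a toy expression (unrelated to this PR's implementation, written against the Spark codebase of that time):
```
import org.apache.spark.sql.catalyst.expressions.{Expression, UnaryExpression}
import org.apache.spark.sql.catalyst.expressions.codegen.{CodegenContext, ExprCode}
import org.apache.spark.sql.types.{DataType, IntegerType}

// Toy expression: doGenCode does not compute or return the result itself; it
// returns Java source text that will assign the result to ev.value at runtime.
case class PlusOne(child: Expression) extends UnaryExpression {
  override def dataType: DataType = IntegerType
  override protected def nullSafeEval(input: Any): Any = input.asInstanceOf[Int] + 1
  override protected def doGenCode(ctx: CodegenContext, ev: ExprCode): ExprCode =
    defineCodeGen(ctx, ev, c => s"($c + 1)")  // splices a Java expression string
}
```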
Github user mgaido91 commented on the issue:
https://github.com/apache/spark/pull/21045
@DylanGuedes the first suggestion I can give you is: do not use spark-shell
for testing, but write UTs and run them with a debugger. Then, you can set a
breakpoint to check the generated code (or you can
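As a side note (not necessarily what the truncated comment goes on to say), the generated Java can also be printed from a `spark` session, as in spark-shell:
```
import org.apache.spark.sql.execution.debug._  // adds debugCodegen() to Dataset

// Prints the Java source produced by whole-stage codegen for each subtree of the
// plan, which is one way to see how an expression's doGenCode output gets embedded.
spark.range(3).selectExpr("id + 1").debugCodegen()
```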
Github user DylanGuedes commented on the issue:
https://github.com/apache/spark/pull/21045
Ok, so it works fine in spark-shell but in pyspark I got this error:
```shell
File "/home/dguedes/Workspace/spark/python/pyspark/sql/functions.py", line 2155, in pyspark.sql.functions.zip
```
Github user AmplabJenkins commented on the issue:
https://github.com/apache/spark/pull/21045
Can one of the admins verify this patch?