[ https://issues.apache.org/jira/browse/SPARK-22556?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16260241#comment-16260241 ]

Hyukjin Kwon commented on SPARK-22556:
--------------------------------------

So, it looks like it shows the results below:

{code}
scala> import org.apache.spark.sql.functions.{udf, array, explode}
import org.apache.spark.sql.functions.{udf, array, explode}

scala> val my_arr_func = udf((i: Int) => Array(i))
my_arr_func: org.apache.spark.sql.expressions.UserDefinedFunction = 
UserDefinedFunction(<function1>,ArrayType(IntegerType,false),Some(List(IntegerType)))

scala> spark.range(10).select("id", "id").toDF("col_1", 
"col_2").withColumn("exploded_field", explode(array(my_arr_func('col_1), 
my_arr_func('col_2)))).show()
+-----+-----+--------------+
|col_1|col_2|exploded_field|
+-----+-----+--------------+
|    0|    0|           [0]|
|    0|    0|           [0]|
|    1|    1|           [1]|
|    1|    1|           [1]|
|    2|    2|           [2]|
|    2|    2|           [2]|
|    3|    3|           [3]|
|    3|    3|           [3]|
|    4|    4|           [4]|
|    4|    4|           [4]|
|    5|    5|           [5]|
|    5|    5|           [5]|
|    6|    6|           [6]|
|    6|    6|           [6]|
|    7|    7|           [7]|
|    7|    7|           [7]|
|    8|    8|           [8]|
|    8|    8|           [8]|
|    9|    9|           [9]|
|    9|    9|           [9]|
+-----+-----+--------------+

{code}

Would you mind elaborating on the expected input and output?
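For context, each call to my_arr_func returns a single-element array, and explode over array(...) emits one row per inner array, which is why every row shows a one-element array. If the expectation was the plain value per row instead, here is a minimal sketch of that alternative (only an assumption about the intent; it simply drops the array-returning UDF, and col_1/col_2 mirror the example above):

{code}
// Sketch: get one scalar value per row instead of a single-element array.
import org.apache.spark.sql.functions.{array, col, explode}

val df = spark.range(10).select("id", "id").toDF("col_1", "col_2")

// array(col_1, col_2) builds a two-element array per row, and explode then
// emits one row per element, so exploded_field holds the plain bigint values.
df.withColumn("exploded_field", explode(array(col("col_1"), col("col_2")))).show()
{code}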

> WrappedArray with Explode Function create WrappedArray with 1 object.
> ---------------------------------------------------------------------
>
>                 Key: SPARK-22556
>                 URL: https://issues.apache.org/jira/browse/SPARK-22556
>             Project: Spark
>          Issue Type: Improvement
>          Components: SQL
>    Affects Versions: 2.2.0
>         Environment: N/A
>            Reporter: Thiago Rodrigues Baldim
>
> When the org.apache.spark.sql.functions.explode function is applied to the 
> result of org.apache.spark.sql.functions.array, the output rows do not 
> contain the objects themselves.
> Each result is a WrappedArray holding only a single object rather than the 
> object itself. Building a scala.Array directly gives the expected result, 
> but that does not happen with the WrappedArray.
> myDf.withColumn("exploded_field", explode(array(my_arr_func('col_1), 
> my_arr_func('col_2)))).take(10).foreach(println)


