GitHub user ueshin opened a pull request:

    https://github.com/apache/spark/pull/20306

    [SPARK-23054][SQL][PYSPARK][FOLLOWUP] Use sqlType casting when casting 
PythonUserDefinedType to String.

    ## What changes were proposed in this pull request?
    
    This is a follow-up of #20246.
    
    If a UDT defined in Python doesn't have a corresponding Scala UDT, casting it to 
string yields the raw string of the internal value, e.g. 
`"org.apache.spark.sql.catalyst.expressions.UnsafeArrayData@xxxxxxxx"` if the 
internal type is `ArrayType`.
    
    This PR fixes it by casting via the UDT's `sqlType` instead.
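    
    The idea can be illustrated with a minimal, self-contained Python sketch (not 
Spark's actual classes; all names here are hypothetical stand-ins): stringifying the 
internal object directly yields its object repr, while delegating to the cast of the 
underlying SQL type yields a readable value.
    
    ```python
    class InternalArrayData:
        """Stand-in for Catalyst's internal array representation."""
        def __init__(self, values):
            self.values = values
        # No custom __str__: str() falls back to the default object repr,
        # e.g. "<...InternalArrayData object at 0x...>".
    
    def cast_to_string_buggy(udt_value):
        # Before the fix: stringify the internal object directly.
        return str(udt_value)
    
    def cast_to_string_fixed(udt_value, sql_type_cast):
        # After the fix: delegate to the cast of the UDT's sqlType.
        return sql_type_cast(udt_value)
    
    def array_type_cast(internal):
        # A simplified ArrayType-to-string cast.
        return "[" + ", ".join(str(v) for v in internal.values) + "]"
    
    data = InternalArrayData([1, 2, 3])
    print(cast_to_string_buggy(data))                   # object repr, not the values
    print(cast_to_string_fixed(data, array_type_cast))  # "[1, 2, 3]"
    ```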
    
    ## How was this patch tested?
    
    Added a new test; existing tests also pass.


You can merge this pull request into a Git repository by running:

    $ git pull https://github.com/ueshin/apache-spark issues/SPARK-23054/fup1

Alternatively you can review and apply these changes as the patch at:

    https://github.com/apache/spark/pull/20306.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

    This closes #20306
    
----
commit 74c17353bb6372b123c5aee1b6d58a21de36f99a
Author: Takuya UESHIN <ueshin@...>
Date:   2018-01-18T05:27:10Z

    Use sqlType casting when casting PythonUserDefinedType to String.

----


---

---------------------------------------------------------------------
To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org
For additional commands, e-mail: reviews-h...@spark.apache.org
