Max Kießling created SPARK-23610:
------------------------------------

             Summary: Cast of ArrayType of NullType to ArrayType of nullable material type does not work
                 Key: SPARK-23610
                 URL: https://issues.apache.org/jira/browse/SPARK-23610
             Project: Spark
          Issue Type: Bug
          Components: SQL
    Affects Versions: 2.2.0
            Reporter: Max Kießling


Given a DataFrame that contains a column of _ArrayType of NullType_, casting this column to an ArrayType of any nullable material type (e.g. _ArrayType(LongType, true)_) should be possible.
{code}
import scala.collection.JavaConverters._
import org.apache.spark.sql.Row
import org.apache.spark.sql.types._

it("can cast arrays of null type into arrays of nullable material types") {
  // A single row holding an empty array, typed as array<null>.
  val inputData = Seq(
    Row(Array())
  ).asJava

  val schema = StructType(Seq(
    StructField("list", ArrayType(NullType, true), false)
  ))

  val data = caps.sparkSession.createDataFrame(inputData, schema)

  // Casting the array<null> column to array<bigint> fails with the MatchError below.
  data.withColumn("longList", data.col("list").cast(ArrayType(LongType, true))).show
}
{code}

This test fails with the message: 

{noformat}
NullType (of class org.apache.spark.sql.types.NullType$)
scala.MatchError: NullType (of class org.apache.spark.sql.types.NullType$)
  at org.apache.spark.sql.catalyst.expressions.Cast.castToLong(Cast.scala:310)
  at org.apache.spark.sql.catalyst.expressions.Cast.org$apache$spark$sql$catalyst$expressions$Cast$$cast(Cast.scala:516)
  at org.apache.spark.sql.catalyst.expressions.Cast.castArray(Cast.scala:455)
  at org.apache.spark.sql.catalyst.expressions.Cast.org$apache$spark$sql$catalyst$expressions$Cast$$cast(Cast.scala:519)
  at org.apache.spark.sql.catalyst.expressions.Cast.cast$lzycompute(Cast.scala:531)
  at org.apache.spark.sql.catalyst.expressions.Cast.cast(Cast.scala:531)
  at org.apache.spark.sql.catalyst.expressions.Cast.nullSafeEval(Cast.scala:533)
  at org.apache.spark.sql.catalyst.expressions.UnaryExpression.eval(Expression.scala:327)
{noformat}
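
The cast itself does not seem to depend on the surrounding test harness. A minimal standalone sketch (assuming a plain SparkSession named _spark_ rather than the _caps_ fixture used above) would be:

{code}
import scala.collection.JavaConverters._
import org.apache.spark.sql.Row
import org.apache.spark.sql.types._

// One row holding an empty array, typed as array<null>.
val schema = StructType(Seq(
  StructField("list", ArrayType(NullType, true), false)
))
val data = spark.createDataFrame(Seq(Row(Array())).asJava, schema)

// Expected: an empty array<bigint> column; observed: the MatchError shown above.
data.withColumn("longList", data.col("list").cast(ArrayType(LongType, true))).show
{code}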



