[ https://issues.apache.org/jira/browse/SPARK-23348?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16356174#comment-16356174 ]
Dongjoon Hyun commented on SPARK-23348:
---------------------------------------

[~cloud_fan], [~smilegator], [~sameerag]. Although this is not a regression in Spark 2.3, can we have this in Apache Spark 2.3?

> append data using saveAsTable should adjust the data types
> ----------------------------------------------------------
>
>                 Key: SPARK-23348
>                 URL: https://issues.apache.org/jira/browse/SPARK-23348
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 2.0.2, 2.1.2, 2.2.1, 2.3.0
>            Reporter: Wenchen Fan
>            Priority: Major
>
> {code:java}
> Seq(1 -> "a").toDF("i", "j").write.saveAsTable("t")
> Seq("c" -> 3).toDF("i", "j").write.mode("append").saveAsTable("t")
> scala> sql("select * from t").show
> {code}
>
> This query fails with a strange error:
> {code:java}
> org.apache.spark.SparkException: Job aborted due to stage failure: Task 1 in
> stage 10.0 failed 1 times, most recent failure: Lost task 1.0 in stage 10.0
> (TID 15, localhost, executor driver):
> java.lang.UnsupportedOperationException: Unimplemented type: IntegerType
> at org.apache.spark.sql.execution.datasources.parquet.VectorizedColumnReader.readBinaryBatch(VectorizedColumnReader.java:473)
> at org.apache.spark.sql.execution.datasources.parquet.VectorizedColumnReader.readBatch(VectorizedColumnReader.java:214)
> at org.apache.spark.sql.execution.datasources.parquet.VectorizedParquetRecordReader.nextBatch(VectorizedParquetRecordReader.java:261)
> ...
> {code}
>
> All Spark 2.x releases behave the same way. Spark 1.6.3, by contrast, returns:
> {code}
> scala> sql("select * from tx").show
> +----+---+
> |   i|  j|
> +----+---+
> |null|  3|
> |   1|  a|
> +----+---+
> {code}

--
This message was sent by Atlassian JIRA
(v7.6.3#76005)
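To illustrate what "adjust the data types" would mean at append time, here is a minimal plain-Scala sketch (not Spark's actual API; `ColType`, `AppendCast`, and `adjust` are hypothetical names). It models the table schema winning over the incoming DataFrame's schema: values that cannot be cast to the declared column type become null, matching the Spark 1.6.3 output quoted above where the string "c" appended into an int column shows as null.

```scala
// Hypothetical sketch of append-time type adjustment, assuming a two-type
// world (int and string). The real fix would live in Spark's analyzer.
sealed trait ColType
case object IntType extends ColType
case object StringType extends ColType

object AppendCast {
  // Cast an incoming value to the table's declared column type.
  // Incompatible values degrade to null instead of corrupting the files.
  def adjust(value: Any, target: ColType): Any = (value, target) match {
    case (v: Int, IntType)       => v
    case (v: String, StringType) => v
    case (v: Int, StringType)    => v.toString // safe widening to string
    case (v: String, IntType)    =>
      // "c" is not an int, so it becomes null, as in the 1.6.3 output above
      try Integer.valueOf(v)
      catch { case _: NumberFormatException => null }
    case _                       => null
  }
}
```

With this behavior, appending `("c", 3)` to a table declared as `(i int, j string)` would store `(null, "3")` rather than writing a binary column that later fails in the vectorized Parquet reader.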