Github user cloud-fan commented on a diff in the pull request:
https://github.com/apache/spark/pull/21585#discussion_r196285079
--- Diff: sql/core/src/main/scala/org/apache/spark/sql/execution/datasources/InsertIntoDataSourceCommand.scala ---
@@ -39,7 +39,8 @@ case class InsertIntoDataSourceCommand(
     val relation = logicalRelation.relation.asInstanceOf[InsertableRelation]
     val data = Dataset.ofRows(sparkSession, query)
     // Apply the schema of the existing table to the new data.
-    val df = sparkSession.internalCreateDataFrame(data.queryExecution.toRdd, logicalRelation.schema)
+    val df = sparkSession.internalCreateDataFrame(
+      data.queryExecution.toRdd, logicalRelation.schema.asNullable)
--- End diff ---
shall we just pick `data.schema`?
---
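For context on the question above: Spark's `StructType.asNullable` relaxes every field of a schema to `nullable = true`, whereas `data.schema` is the schema the query actually produced. The toy sketch below (self-contained, with hypothetical stand-in `StructField`/`StructType` case classes rather than Spark's real ones, so it runs without a Spark dependency; Spark's version also recurses into nested types) shows the relaxation the patched line applies:

```scala
// Toy stand-ins mimicking Spark's schema classes (hypothetical, for
// illustration only -- the real ones live in org.apache.spark.sql.types).
case class StructField(name: String, dataType: String, nullable: Boolean)

case class StructType(fields: Seq[StructField]) {
  // Mirrors the idea of StructType.asNullable: mark every field nullable.
  def asNullable: StructType = StructType(fields.map(_.copy(nullable = true)))
}

object NullableSchemaDemo extends App {
  // Schema of an existing table with a non-nullable column.
  val tableSchema = StructType(Seq(
    StructField("id", "int", nullable = false),
    StructField("name", "string", nullable = true)))

  // After relaxation, no insert is rejected merely because the
  // table schema declared a column non-nullable.
  assert(tableSchema.asNullable.fields.forall(_.nullable))
  println("all fields nullable after asNullable")
}
```

Using `data.schema` instead, as the reviewer suggests, would keep the nullability the query's output actually carries rather than force-relaxing the table's declared schema.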
---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]