Fokko commented on a change in pull request #26644: [SPARK-30004][SQL] Allow merge UserDefinedType into a native DataType
URL: https://github.com/apache/spark/pull/26644#discussion_r351535108
 
 

 ##########
 File path: sql/core/src/test/scala/org/apache/spark/sql/UserDefinedTypeSuite.scala
 ##########
 @@ -287,4 +293,63 @@ class UserDefinedTypeSuite extends QueryTest with SharedSparkSession with Parque
     checkAnswer(spark.createDataFrame(data, schema).selectExpr("typeof(a)"),
       Seq(Row("array<double>")))
   }
+
+  test("Allow merge UserDefinedType into a native DataType") {
 
 Review comment:
   Ok, I've rewritten the test, and now it fails on current master:
   ```
   scala> val df = gregorianCalendarDf.union(timestampDf)
   org.apache.spark.sql.AnalysisException: Union can only be performed on tables with the compatible column types. timestamp <> timestamp at the first column of the second table;;
   'Union
   :- LogicalRDD [dt#11], false
   +- LogicalRDD [dt#14], false
   
     at org.apache.spark.sql.catalyst.analysis.CheckAnalysis$class.failAnalysis(CheckAnalysis.scala:43)
     at org.apache.spark.sql.catalyst.analysis.Analyzer.failAnalysis(Analyzer.scala:95)
     at org.apache.spark.sql.catalyst.analysis.CheckAnalysis$$anonfun$checkAnalysis$1$$anonfun$apply$12$$anonfun$apply$13.apply(CheckAnalysis.scala:294)
     at org.apache.spark.sql.catalyst.analysis.CheckAnalysis$$anonfun$checkAnalysis$1$$anonfun$apply$12$$anonfun$apply$13.apply(CheckAnalysis.scala:291)
     at scala.collection.immutable.List.foreach(List.scala:392)
     at org.apache.spark.sql.catalyst.analysis.CheckAnalysis$$anonfun$checkAnalysis$1$$anonfun$apply$12.apply(CheckAnalysis.scala:291)
     at org.apache.spark.sql.catalyst.analysis.CheckAnalysis$$anonfun$checkAnalysis$1$$anonfun$apply$12.apply(CheckAnalysis.scala:280)
     at scala.collection.immutable.List.foreach(List.scala:392)
     at org.apache.spark.sql.catalyst.analysis.CheckAnalysis$$anonfun$checkAnalysis$1.apply(CheckAnalysis.scala:280)
     at org.apache.spark.sql.catalyst.analysis.CheckAnalysis$$anonfun$checkAnalysis$1.apply(CheckAnalysis.scala:86)
     at org.apache.spark.sql.catalyst.trees.TreeNode.foreachUp(TreeNode.scala:127)
     at org.apache.spark.sql.catalyst.analysis.CheckAnalysis$class.checkAnalysis(CheckAnalysis.scala:86)
     at org.apache.spark.sql.catalyst.analysis.Analyzer.checkAnalysis(Analyzer.scala:95)
     at org.apache.spark.sql.catalyst.analysis.Analyzer$$anonfun$executeAndCheck$1.apply(Analyzer.scala:108)
     at org.apache.spark.sql.catalyst.analysis.Analyzer$$anonfun$executeAndCheck$1.apply(Analyzer.scala:105)
     at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper$.markInAnalyzer(AnalysisHelper.scala:201)
     at org.apache.spark.sql.catalyst.analysis.Analyzer.executeAndCheck(Analyzer.scala:105)
     at org.apache.spark.sql.execution.QueryExecution.analyzed$lzycompute(QueryExecution.scala:57)
     at org.apache.spark.sql.execution.QueryExecution.analyzed(QueryExecution.scala:55)
     at org.apache.spark.sql.execution.QueryExecution.assertAnalyzed(QueryExecution.scala:47)
     at org.apache.spark.sql.Dataset$.ofRows(Dataset.scala:78)
     at org.apache.spark.sql.Dataset.withSetOperator(Dataset.scala:3424)
     at org.apache.spark.sql.Dataset.union(Dataset.scala:1862)
     ... 40 elided
   ```
   I'm creating two DataFrames, both representing a point in time, one of them backed by a UDT, and I'm trying to merge them. This is very similar to what I'm doing with Delta: with `SaveMode.Overwrite`, Delta will still check compatibility against the existing schema. A minimal sketch of the setup is below.
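   Here is a minimal sketch of that setup (not the PR's actual test code, and the names `GregorianCalendar` / `GregorianCalendarUDT` are illustrative only): a UDT whose `sqlType` is the native `TimestampType`, unioned with a plain timestamp column. It assumes it runs inside the Spark SQL test sources (so the private UDT API and `DateTimeUtils` are accessible) and that `spark.implicits._` is in scope.
   ```
   import java.sql.Timestamp

   import org.apache.spark.sql.catalyst.util.DateTimeUtils
   import org.apache.spark.sql.types._

   // Hypothetical wrapper class; the real test may use different names.
   @SQLUserDefinedType(udt = classOf[GregorianCalendarUDT])
   case class GregorianCalendar(ts: Timestamp)

   // A UDT that is backed by the native TimestampType.
   class GregorianCalendarUDT extends UserDefinedType[GregorianCalendar] {
     override def sqlType: DataType = TimestampType
     override def serialize(obj: GregorianCalendar): Any =
       DateTimeUtils.fromJavaTimestamp(obj.ts)
     override def deserialize(datum: Any): GregorianCalendar =
       GregorianCalendar(DateTimeUtils.toJavaTimestamp(datum.asInstanceOf[Long]))
     override def userClass: Class[GregorianCalendar] = classOf[GregorianCalendar]
   }

   // One DataFrame with the UDT column, one with a plain timestamp column.
   // Assumes `import spark.implicits._` (or `testImplicits._` in the suite).
   val gregorianCalendarDf = Seq(Tuple1(GregorianCalendar(new Timestamp(0L)))).toDF("dt")
   val timestampDf = Seq(Tuple1(new Timestamp(0L))).toDF("dt")

   // On current master this fails analysis with the AnalysisException above,
   // even though both columns are timestamps underneath.
   val df = gregorianCalendarDf.union(timestampDf)
   ```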
