The situation has changed a bit: the workaround is to add a restriction on `K`
(K should be a subtype of Product).
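
Roughly what I mean (just a sketch of my local helper; `joinT`, the (K, V)
tuple layout, and the `_1` column name are how I happen to write it, not
anything from Spark itself):

import org.apache.spark.sql.{Dataset, Encoder}

// Sketch of the restricted helper: K now has to be a Product so that
// Spark can derive a struct encoder for the key column.
def joinT[K <: Product : Encoder, A, B](
    left: Dataset[(K, A)],
    right: Dataset[(K, B)]
)(implicit enc: Encoder[(K, (A, B))]): Dataset[(K, (A, B))] =
  left
    .joinWith(right, left("_1") === right("_1")) // "_1" is the key slot of the tuples
    .map { case ((k, a), (_, b)) => (k, (a, b)) }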

Though now I get another error: 

org.apache.spark.sql.AnalysisException: cannot resolve '(`key` = `key`)' due to data type mismatch: differing types in '(`key` = `key`)' (struct<col:int,row:int> and struct<col:int,row:int>).;
[info]   at org.apache.spark.sql.catalyst.analysis.package$AnalysisErrorAt.failAnalysis(package.scala:42)
[info]   at org.apache.spark.sql.catalyst.analysis.CheckAnalysis$$anonfun$checkAnalysis$1$$anonfun$apply$2.applyOrElse(CheckAnalysis.scala:82)
[info]   at org.apache.spark.sql.catalyst.analysis.CheckAnalysis$$anonfun$checkAnalysis$1$$anonfun$apply$2.applyOrElse(CheckAnalysis.scala:74)
[info]   at org.apache.spark.sql.catalyst.trees.TreeNode$$anonfun$transformUp$1.apply(TreeNode.scala:301)
[info]   at org.apache.spark.sql.catalyst.trees.TreeNode$$anonfun$transformUp$1.apply(TreeNode.scala:301)
[info]   at org.apache.spark.sql.catalyst.trees.CurrentOrigin$.withOrigin(TreeNode.scala:69)
[info]   at org.apache.spark.sql.catalyst.trees.TreeNode.transformUp(TreeNode.scala:300)
[info]   at org.apache.spark.sql.catalyst.plans.QueryPlan.transformExpressionUp$1(QueryPlan.scala:191)
[info]   at org.apache.spark.sql.catalyst.plans.QueryPlan.org$apache$spark$sql$catalyst$plans$QueryPlan$$recursiveTransform$2(QueryPlan.scala:202)
[info]   at org.apache.spark.sql.catalyst.plans.QueryPlan$$anonfun$5.apply(QueryPlan.scala:210)

Maybe it is caused by the fact that one of the key types extends Product2? 

// Product2 requires _1 and _2 to be implemented explicitly
case class SpatialKey(col: Int, row: Int) extends Product2[Int, Int] {
  def _1: Int = col
  def _2: Int = row
} // joining by this key throws the error above

case class SpaceTimeKey(col: Int, row: Int, instant: Long) // joining by this key gives no errors
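
If that is really the cause, a possible workaround (again only a sketch, e.g.
in spark-shell; `PlainKey` is a throwaway name I made up) would be to copy the
key into a plain case class before joining, so the encoder never sees the
Product2 parent:

import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder().master("local[*]").appName("join-test").getOrCreate()
import spark.implicits._

// PlainKey mirrors SpatialKey but without the Product2 parent
case class PlainKey(col: Int, row: Int)

val left  = spark.createDataset(Seq((SpatialKey(0, 0), "a"), (SpatialKey(0, 1), "b")))
val right = spark.createDataset(Seq((SpatialKey(0, 0), "c")))

// Re-key both sides with the plain case class, then join with the helper above
val joined = joinT(
  left.map  { case (k, v) => (PlainKey(k.col, k.row), v) },
  right.map { case (k, v) => (PlainKey(k.col, k.row), v) }
)
joined.show()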


