YuzhouSun commented on code in PR #52067:
URL: https://github.com/apache/spark/pull/52067#discussion_r2284325863


##########
sql/core/src/main/scala/org/apache/spark/sql/execution/joins/ShuffledHashJoinExec.scala:
##########
@@ -67,12 +67,7 @@ case class ShuffledHashJoinExec(
 
   // Exposed for testing
   @transient lazy val ignoreDuplicatedKey = joinType match {
-    case LeftExistence(_) =>
-      // For building hash relation, ignore duplicated rows with same join keys if:
-      // 1. Join condition is empty, or
-      // 2. Join condition only references streamed attributes and build join keys.
-      val streamedOutputAndBuildKeys = AttributeSet(streamedOutput ++ buildKeys)
-      condition.forall(_.references.subsetOf(streamedOutputAndBuildKeys))
+    case LeftExistence(_) if condition.isEmpty => true

Review Comment:
   Thanks for fixing this issue!
   
   Instead of disabling `ignoreDuplicatedKey` in this case, is it possible to relax the requirement so that the references in the condition are all in `streamedOutput ++ buildKeysThatAreAttributes`? E.g. diff:
   ```diff
   -      val streamedOutputAndBuildKeys = AttributeSet(streamedOutput ++ buildKeys)
   +      val attrBuildKeys = buildKeys.filter(_.isInstanceOf[Attribute])
   +      val streamedOutputAndBuildKeys = AttributeSet(streamedOutput ++ attrBuildKeys)
   ```
   The current master branch allows `buildKeys`' references in the `condition` (`AttributeSet` extracts the references of each key expression), while this diff limits the allowed references to `streamedOutput` plus the `buildKeys` that are themselves `Attribute`s.
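   
   For context, applying this diff on top of the check shown above would give roughly the following sketch (assuming the existing `case _ => false` fallback for the other join types stays unchanged):
   ```scala
   @transient lazy val ignoreDuplicatedKey = joinType match {
     case LeftExistence(_) =>
       // Keep only the build keys that are plain attributes: for those, all
       // duplicate rows with the same join keys expose the same values to the
       // condition, so dropping the duplicates cannot change the join result.
       val attrBuildKeys = buildKeys.filter(_.isInstanceOf[Attribute])
       val streamedOutputAndBuildKeys = AttributeSet(streamedOutput ++ attrBuildKeys)
       condition.forall(_.references.subsetOf(streamedOutputAndBuildKeys))
     case _ => false
   }
   ```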
   



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org

