c21 commented on a change in pull request #29484:
URL: https://github.com/apache/spark/pull/29484#discussion_r474443028



##########
File path: 
sql/core/src/test/scala/org/apache/spark/sql/execution/adaptive/AdaptiveQueryExecSuite.scala
##########
@@ -238,7 +239,8 @@ class AdaptiveQueryExecSuite
       }
 
       withSQLConf(SQLConf.AUTO_BROADCASTJOIN_THRESHOLD.key -> "1") {
-        val testDf = df1.where('a > 10).join(df2.where('b > 10), "id").groupBy('a).count()
+        val testDf = df1.where('a > 10).join(df2.where('b > 10), Seq("id"), "left_outer")
+          .groupBy('a).count()
         checkAnswer(testDf, Seq())
         val plan = testDf.queryExecution.executedPlan
         assert(find(plan)(_.isInstanceOf[BroadcastHashJoinExec]).isDefined)

Review comment:
       @cloud-fan - just FYI: the change in this unit test is needed because `assert(find(plan)(_.isInstanceOf[BroadcastHashJoinExec]).isDefined)` no longer holds. The original join is an inner join and its build side is empty, so with the change in this PR the join operator is optimized into an empty relation operator (the failure stack trace of the unit test without this change is [here](https://github.com/apache/spark/pull/29484/checks?check_run_id=1011186945)).
   
   I changed the inner join to a left outer join so the unit test passes. I don't think this compromises anything the original unit test covers, but let me know if that's not the case. Thanks.




----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
[email protected]
