cloud-fan commented on code in PR #38157:
URL: https://github.com/apache/spark/pull/38157#discussion_r990974201
##########
connector/connect/src/test/scala/org/apache/spark/sql/connect/planner/SparkConnectProtoSuite.scala:
##########
@@ -46,6 +51,37 @@ class SparkConnectProtoSuite extends PlanTest with
SparkConnectPlanTest {
comparePlans(connectPlan.analyze, sparkPlan.analyze, false)
}
+ test("Basic joins with different join types") {
+ val connectPlan = {
Review Comment:
If the proto is an API, I'd say the join type should be a required field, and
clients must set it in the join plan. For the Python client, its DataFrame API
can omit the join type, and in that case the Python client should use INNER as
the default.
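A minimal sketch of the suggested split of responsibilities (all names here are
hypothetical stand-ins, not the actual Spark Connect proto or client classes):
the proto treats the join type as required, while the client's DataFrame-style
API fills in INNER whenever the caller omits it.

```python
from enum import Enum


class JoinType(Enum):
    # Illustrative enum; the real proto defines its own join-type values.
    INNER = 1
    LEFT_OUTER = 2
    RIGHT_OUTER = 3
    FULL_OUTER = 4
    LEFT_SEMI = 5
    LEFT_ANTI = 6


def build_join_plan(left, right, on, how=None):
    """Hypothetical client-side helper: since the proto requires a join
    type, the DataFrame API defaults to INNER when `how` is omitted."""
    join_type = JoinType[how.upper()] if how else JoinType.INNER
    # The proto message would always carry an explicit join type.
    return {"left": left, "right": right, "on": on, "type": join_type}


plan = build_join_plan("df1", "df2", on="id")
print(plan["type"])  # JoinType.INNER when `how` is omitted
```

The point of the sketch is that the default lives in the client, not the proto:
every join plan sent over the wire has an explicit join type set.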
--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
To unsubscribe, e-mail: [email protected]
For queries about this service, please contact Infrastructure at:
[email protected]