rangadi commented on code in PR #43851:
URL: https://github.com/apache/spark/pull/43851#discussion_r1401098153
##########
connector/connect/server/src/main/scala/org/apache/spark/sql/connect/planner/SparkConnectPlanner.scala:
##########
@@ -2552,6 +2552,7 @@ class SparkConnectPlanner(
// To avoid explicit handling of the result on the client, we build the expected input
// of the relation on the server. The client has to simply forward the result.
val result = SqlCommandResult.newBuilder()
+ val metrics = ExecutePlanResponse.Metrics.newBuilder()
Review Comment:
Add a comment here noting that this is empty by default and filled only for the
commands below.
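One way the requested comment might read in context (the builder line is taken from the diff above; the comment wording is only a suggestion, not the final text):

```scala
// Empty by default; populated only by the eagerly executed commands handled below.
val metrics = ExecutePlanResponse.Metrics.newBuilder()
```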
##########
connector/connect/server/src/main/scala/org/apache/spark/sql/connect/planner/SparkConnectPlanner.scala:
##########
@@ -2585,10 +2586,10 @@ class SparkConnectPlanner(
proto.LocalRelation
.newBuilder()
.setData(ByteString.copyFrom(bytes))))
+ metrics.addAllMetrics(MetricGenerator.transformPlan(df).asJava)
Review Comment:
Could you add a comment explaining why this is done? Something like "This is a
command like `show tables`, so we need to eagerly execute it and return the
results". Most readers (like me) wouldn't know.
Also, why are the results called metrics?
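A sketch of what such a comment could look like next to the line in question (the call itself is from the diff above; the explanation paraphrases this review thread and should be confirmed against the actual behavior):

```scala
// This relation comes from a command such as `show tables`, which is executed
// eagerly on the server; its result rows are forwarded to the client inside
// the Metrics message of the response, hence the "metrics" naming here.
metrics.addAllMetrics(MetricGenerator.transformPlan(df).asJava)
```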
--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
To unsubscribe, e-mail: [email protected]
For queries about this service, please contact Infrastructure at:
[email protected]