juliuszsompolski commented on code in PR #41315:
URL: https://github.com/apache/spark/pull/41315#discussion_r1261445799
##########
connector/connect/server/src/main/scala/org/apache/spark/sql/connect/service/SessionHolder.scala:
##########
@@ -33,26 +32,26 @@ import org.apache.spark.sql.SparkSession
case class SessionHolder(userId: String, sessionId: String, session: SparkSession)
    extends Logging {
- val executePlanOperations: ConcurrentMap[String, ExecutePlanHolder] =
- new ConcurrentHashMap[String, ExecutePlanHolder]()
+ val executions: ConcurrentMap[String, ExecutionHolder] =
+ new ConcurrentHashMap[String, ExecutionHolder]()
-  private[connect] def createExecutePlanHolder(
-      request: proto.ExecutePlanRequest): ExecutePlanHolder = {
+  private[connect] def createExecutionHolder(): ExecutionHolder = {
val operationId = UUID.randomUUID().toString
- val executePlanHolder = ExecutePlanHolder(operationId, this, request)
-    assert(executePlanOperations.putIfAbsent(operationId, executePlanHolder) == null)
+ val executePlanHolder = ExecutionHolder(operationId, this)
+ assert(executions.putIfAbsent(operationId, executePlanHolder) == null)
Review Comment:
We are removing asserts from prod code, though they still fire in testing. I would personally also prefer to keep asserts in prod, but Spark doesn't seem to do that in general...
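
A minimal sketch of the point being made: Scala's `assert` can be elided from production builds (e.g. via `-Xdisable-assertions`), so an invariant that must also hold in prod needs an explicit check. The names below (`ExecutionRegistrySketch`, `Holder`, `register`) are hypothetical stand-ins for illustration, not the actual Spark Connect code:

```scala
import java.util.UUID
import java.util.concurrent.ConcurrentHashMap

object ExecutionRegistrySketch {
  // Hypothetical stand-in for ExecutionHolder; only the id matters here.
  final case class Holder(operationId: String)

  private val executions = new ConcurrentHashMap[String, Holder]()

  def register(): Holder = {
    val operationId = UUID.randomUUID().toString
    val holder = Holder(operationId)
    // putIfAbsent returns null only when no previous mapping existed.
    // Throwing instead of asserting keeps the invariant enforced even
    // in builds where assertions are elided.
    if (executions.putIfAbsent(operationId, holder) != null) {
      throw new IllegalStateException(s"Duplicate operationId: $operationId")
    }
    holder
  }
}
```

Whether Spark would prefer this style over a test-only `assert` is exactly the trade-off the comment describes.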
--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
To unsubscribe, e-mail: [email protected]
For queries about this service, please contact Infrastructure at:
[email protected]