MaxGekk commented on code in PR #41580:
URL: https://github.com/apache/spark/pull/41580#discussion_r1245553462
##########
connector/connect/server/src/main/scala/org/apache/spark/sql/connect/service/SessionHolder.scala:
##########
@@ -163,6 +169,31 @@ case class SessionHolder(userId: String, sessionId: String, session: SparkSessio
       }
     }
   }
+
+  /**
+   * Caches given DataFrame with the ID. The cache does not expire. The entry needs to be
+   * explicitly removed by the owners of the DataFrame once it is not needed.
+   */
+  private[connect] def cacheDataFrameById(dfId: String, df: DataFrame): Unit = {
+    if (dataFrameCache.putIfAbsent(dfId, df) != null) {
+      throw new IllegalArgumentException(s"A dataframe is already associated with id $dfId")
Review Comment:
> Any existing ones I can use?

Add a new error class to `error-classes.json` and raise a `SparkException`.
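For illustration only, a rough sketch of that direction inside `SessionHolder` (the error class name and its parameter are hypothetical; a matching entry would first have to be added to `error-classes.json`):

```scala
import org.apache.spark.SparkException

// Sketch: "CONNECT_DATAFRAME_ID_ALREADY_EXISTS" is a hypothetical error class name;
// it would need an entry in core/src/main/resources/error/error-classes.json
// whose message template references the <dfId> parameter.
private[connect] def cacheDataFrameById(dfId: String, df: DataFrame): Unit = {
  if (dataFrameCache.putIfAbsent(dfId, df) != null) {
    throw new SparkException(
      errorClass = "CONNECT_DATAFRAME_ID_ALREADY_EXISTS",
      messageParameters = Map("dfId" -> dfId),
      cause = null)
  }
}
```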
> it would only be caused by our bug

Then we should consider `SparkException.internalError`.
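A minimal sketch of that alternative, keeping the same guard but routing the failure through the existing internal-error helper (the message wording is just an example):

```scala
// Sketch: a duplicate id here can only come from a server-side bug, so report it
// under the built-in INTERNAL_ERROR error class instead of IllegalArgumentException.
if (dataFrameCache.putIfAbsent(dfId, df) != null) {
  throw SparkException.internalError(s"A dataframe is already associated with id $dfId")
}
```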
> I didn't see much of framework errors used in connect server yet.

We should start doing that if we are going to transfer errors/exceptions from the server to the client in a consistent way.