dillitz commented on code in PR #41829:
URL: https://github.com/apache/spark/pull/41829#discussion_r1254416177
##########
connector/connect/client/jvm/src/main/scala/org/apache/spark/sql/connect/client/ArtifactManager.scala:
##########
@@ -44,20 +43,23 @@ import org.apache.spark.util.{SparkFileUtils, SparkThreadUtils}
 * The Artifact Manager is responsible for handling and transferring artifacts from the local
 * client to the server (local/remote).
* @param userContext
+ * The user context the artifact manager operates in.
* @param sessionId
 * A unique identifier of the session which the artifact manager belongs to.
- * @param channel
+ * @param bstub
+ * A blocking stub to the server.
+ * @param stub
+ * An async stub to the server.
*/
class ArtifactManager(
- userContext: proto.UserContext,
- sessionId: String,
- channel: ManagedChannel) {
+ private val userContext: proto.UserContext,
+ private val sessionId: String,
+ private val bstub: CustomSparkConnectBlockingStub,
+ private val stub: CustomSparkConnectStub) {
Review Comment:
No, we do not, I just learned about the difference, thanks for pointing it
out - I am new to Scala!
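
For context, the "difference" the comment refers to is the Scala distinction between a plain constructor parameter and a `private val` parameter: a plain parameter is only retained as a field when it is referenced outside the constructor, while `private val` always stores it as a private member. A minimal sketch (class names are hypothetical, not from the PR):

```scala
// Sketch of the plain-parameter vs. `private val` distinction discussed above.
// A plain constructor parameter that is only used during construction is not
// retained as a field; a `private val` parameter always is.

class NoField(tag: String) {
  // `tag` is only used in the constructor body, so no field is generated.
  require(tag.nonEmpty)
}

class WithField(private val tag: String) {
  // `tag` is a real private member, usable from any method of the class.
  def label: String = s"tag=$tag"
}

object FieldDemo extends App {
  // Reflection shows whether the parameter survived as a field.
  println(classOf[NoField].getDeclaredFields.length)   // typically 0
  println(classOf[WithField].getDeclaredFields.length) // 1
  println(new WithField("artifact").label)             // tag=artifact
}
```

In the patched `ArtifactManager`, `private val` makes the stubs available to all methods of the class while keeping them hidden from callers.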
--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
To unsubscribe, e-mail: [email protected]
For queries about this service, please contact Infrastructure at:
[email protected]