HyukjinKwon commented on code in PR #42731:
URL: https://github.com/apache/spark/pull/42731#discussion_r1311046817


##########
connector/connect/server/src/main/scala/org/apache/spark/sql/connect/artifact/SparkConnectArtifactManager.scala:
##########
@@ -208,12 +209,38 @@ class SparkConnectArtifactManager(sessionHolder: SessionHolder) extends Logging
         s"sessionId: ${sessionHolder.sessionId}")
 
     // Clean up added files
-    sessionHolder.session.sparkContext.addedFiles.remove(state.uuid)
-    sessionHolder.session.sparkContext.addedArchives.remove(state.uuid)
-    sessionHolder.session.sparkContext.addedJars.remove(state.uuid)
+    val fileserver = SparkEnv.get.rpcEnv.fileServer
+    val sparkContext = sessionHolder.session.sparkContext
+
+    sparkContext.synchronized {
+      val allAddedFiles = sparkContext.allAddedFiles.keys
+      val allAddedArchives = sparkContext.allAddedArchives.keys
+      val allAddedJars = sparkContext.allAddedJars.keys
+
+      val removedFiles = sparkContext.addedFiles.remove(state.uuid)
+      val removedArchives = sparkContext.addedArchives.remove(state.uuid)
+      val removedJars = sparkContext.addedJars.remove(state.uuid)
+
+      // In case there are duplicate files added by other sessions.

Review Comment:
   Actually yeah, you're correct: since we already have a session-based directory 
on the server side, the conflict won't happen within the fileserver, so we can 
safely remove the files.
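
   For illustration, the duplicate concern in the diff ("In case there are 
duplicate files added by other sessions") can be sketched with plain collections. 
This is a minimal, hypothetical model of the per-session bookkeeping, not the 
actual `SparkContext` internals; the map names and the `filesToRemove` helper are 
assumptions for the sketch:

   ```scala
   import scala.collection.mutable

   // Hypothetical stand-in for per-session file bookkeeping:
   // sessionId -> set of file paths added by that session.
   object SessionFileCleanup {
     val addedFiles = mutable.Map[String, mutable.Set[String]]()

     // Files that are safe to delete from the file server when `sessionId`
     // closes: those the session added that no *other* session still references.
     def filesToRemove(sessionId: String): Set[String] = {
       val mine = addedFiles.remove(sessionId).map(_.toSet).getOrElse(Set.empty)
       val stillReferenced = addedFiles.values.flatten.toSet
       mine -- stillReferenced
     }
   }
   ```

   With session-scoped directories on the server side (the point above), two 
sessions can never register the same path, so `stillReferenced` never overlaps 
with `mine` and the subtraction becomes a no-op; the session's files can always 
be removed directly.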



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]

For queries about this service, please contact Infrastructure at:
[email protected]

