This is an automated email from the ASF dual-hosted git repository.

srowen pushed a commit to branch branch-3.3
in repository https://gitbox.apache.org/repos/asf/spark.git
The following commit(s) were added to refs/heads/branch-3.3 by this push:
     new 7ed30443a09 [SPARK-38807][CORE] Fix the startup error of spark shell on Windows

7ed30443a09 is described below

commit 7ed30443a09dde842424165283d45c0c54d86a81
Author: Ming Li <1104056...@qq.com>
AuthorDate: Thu Jun 2 07:44:17 2022 -0500

    [SPARK-38807][CORE] Fix the startup error of spark shell on Windows

    ### What changes were proposed in this pull request?

    On Windows, the `File.getCanonicalPath` method prepends the drive letter to the path it returns. The `RpcEnvFileServer.validateDirectoryUri` method used `File.getCanonicalPath` to process the base URI, so the result no longer satisfied the URI validation rules: for example, `/classes` was turned into `F:\classes`. This caused SparkContext to fail to start on Windows.

    This PR modifies the `RpcEnvFileServer.validateDirectoryUri` method, replacing `new File(baseUri).getCanonicalPath` with `new URI(baseUri).normalize().getPath`. This method works correctly on Windows.

    ### Why are the changes needed?

    Fix the startup error of spark shell on Windows. [SPARK-35691](https://issues.apache.org/jira/browse/SPARK-35691) introduced this regression.

    ### Does this PR introduce any user-facing change?

    No

    ### How was this patch tested?

    CI

    Closes #36447 from 1104056452/master.
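The behavioral difference behind the fix can be sketched as follows. `URI.normalize().getPath` resolves `.` and `..` segments purely syntactically, without touching the local filesystem, so no drive letter is ever prepended, unlike `File.getCanonicalPath` on Windows. This is a minimal standalone sketch (the `UriNormalizeDemo` object name is made up for illustration; the prefix/suffix handling mirrors `validateDirectoryUri`):

```scala
import java.net.URI

object UriNormalizeDemo {
  def main(args: Array[String]): Unit = {
    // Syntactic normalization: collapses "." and ".." segments,
    // never consults the filesystem, so the result is the same on
    // every platform (no "F:\..." drive-letter prefix on Windows).
    val baseUri = "/classes/../classes/"
    val normalized = new URI(baseUri).normalize().getPath
    println(normalized) // "/classes/"

    // Same prefix/suffix fix-up as validateDirectoryUri applies next.
    val fixedBaseUri = "/" + normalized.stripPrefix("/").stripSuffix("/")
    println(fixedBaseUri) // "/classes"
  }
}
```

By contrast, `new File("/classes").getCanonicalPath` resolves the path against the current drive on Windows, which is exactly how the invalid `F:\classes` value arose.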
    Lead-authored-by: Ming Li <1104056...@qq.com>
    Co-authored-by: ming li <1104056...@qq.com>
    Signed-off-by: Sean Owen <sro...@gmail.com>
    (cherry picked from commit a760975083ea0696e8fd834ecfe3fb877b7f7449)
    Signed-off-by: Sean Owen <sro...@gmail.com>
---
 core/src/main/scala/org/apache/spark/rpc/RpcEnv.scala | 3 ++-
 1 file changed, 2 insertions(+), 1 deletion(-)

diff --git a/core/src/main/scala/org/apache/spark/rpc/RpcEnv.scala b/core/src/main/scala/org/apache/spark/rpc/RpcEnv.scala
index bf19190c021..82d3a28894b 100644
--- a/core/src/main/scala/org/apache/spark/rpc/RpcEnv.scala
+++ b/core/src/main/scala/org/apache/spark/rpc/RpcEnv.scala
@@ -18,6 +18,7 @@ package org.apache.spark.rpc
 
 import java.io.File
+import java.net.URI
 import java.nio.channels.ReadableByteChannel
 
 import scala.concurrent.Future
@@ -187,7 +188,7 @@ private[spark] trait RpcEnvFileServer {
 
   /** Validates and normalizes the base URI for directories. */
   protected def validateDirectoryUri(baseUri: String): String = {
-    val baseCanonicalUri = new File(baseUri).getCanonicalPath
+    val baseCanonicalUri = new URI(baseUri).normalize().getPath
     val fixedBaseUri = "/" + baseCanonicalUri.stripPrefix("/").stripSuffix("/")
     require(fixedBaseUri != "/files" && fixedBaseUri != "/jars",
       "Directory URI cannot be /files nor /jars.")

---------------------------------------------------------------------
To unsubscribe, e-mail: commits-unsubscr...@spark.apache.org
For additional commands, e-mail: commits-h...@spark.apache.org