This is an automated email from the ASF dual-hosted git repository.

dongjoon pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/spark.git
The following commit(s) were added to refs/heads/master by this push:
     new 0491d7e32bb9 [SPARK-53131][SHELL] Improve `SparkShell` to import `java.nio.file._` by default

0491d7e32bb9 is described below

commit 0491d7e32bb999149c6f1cfcb043930d24f5f96a
Author: Dongjoon Hyun <dongj...@apache.org>
AuthorDate: Tue Aug 5 10:35:46 2025 -0700

    [SPARK-53131][SHELL] Improve `SparkShell` to import `java.nio.file._` by default

### What changes were proposed in this pull request?

This PR aims to improve `SparkShell` to import `java.nio.file._` by default.

### Why are the changes needed?

`jshell` has been supported since Java 9, but the Apache Spark community decided not to provide a `jshell`-based interactive environment. However, that doesn't mean `spark-shell` should lag behind `jshell`.

**JSHELL**

Among the packages that `jshell` imports by default, `java.nio.file._` is the last piece missing from `spark-shell`.

```
$ jshell
|  Welcome to JShell -- Version 17.0.16
|  For an introduction type: /help intro

jshell> /imports
|    import java.io.*
|    import java.math.*
|    import java.net.*
|    import java.nio.file.*
|    import java.util.*
|    import java.util.concurrent.*
|    import java.util.function.*
|    import java.util.prefs.*
|    import java.util.regex.*
|    import java.util.stream.*

jshell> Path.of("/tmp")
$1 ==> /tmp
```

**BEFORE**

```scala
scala> spark.version
val res0: String = 4.1.0-preview1

scala> Path.of("/tmp")
       ^
       error: not found: value Path
```

**AFTER**

```scala
scala> spark.version
val res0: String = 4.1.0-SNAPSHOT

scala> Path.of("/tmp")
val res1: java.nio.file.Path = /tmp
```

### Does this PR introduce _any_ user-facing change?

No, because Scala has no conflicts with the `java.nio.file._` package.

### How was this patch tested?

Pass the CIs with the newly added test case.

### Was this patch authored or co-authored using generative AI tooling?

No.

Closes #51853 from dongjoon-hyun/SPARK-53131.
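As an illustrative sketch (not part of this commit), the new default import exposes the whole `java.nio.file` package, so members such as `Files` become directly usable in `spark-shell` alongside `Path`. The session below is hypothetical; `Files.writeString`/`Files.readString` are standard Java 11+ APIs, and the temp-file name will vary by environment:

```scala
scala> val tmp = Files.createTempFile("spark-shell-", ".txt")  // java.nio.file.Files, via the new default import

scala> Files.writeString(tmp, "hello")                         // Java 11+ convenience API

scala> Files.readString(tmp)
val res1: String = hello

scala> Files.deleteIfExists(tmp)
val res2: Boolean = true
```

Previously, each of these calls would have required an explicit `import java.nio.file._` first.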
Authored-by: Dongjoon Hyun <dongj...@apache.org>
Signed-off-by: Dongjoon Hyun <dongj...@apache.org>
---
 repl/src/main/scala/org/apache/spark/repl/SparkILoop.scala | 3 ++-
 repl/src/test/scala/org/apache/spark/repl/ReplSuite.scala  | 8 ++++++++
 2 files changed, 10 insertions(+), 1 deletion(-)

diff --git a/repl/src/main/scala/org/apache/spark/repl/SparkILoop.scala b/repl/src/main/scala/org/apache/spark/repl/SparkILoop.scala
index ec60703822f6..ecb46c478a20 100644
--- a/repl/src/main/scala/org/apache/spark/repl/SparkILoop.scala
+++ b/repl/src/main/scala/org/apache/spark/repl/SparkILoop.scala
@@ -74,7 +74,8 @@ class SparkILoop(config: ShellConfig, in0: BufferedReader, out: PrintWriter)
     "import spark.sql",
     "import org.apache.spark.sql.functions._",
     "import org.apache.spark.util.LogUtils.SPARK_LOG_SCHEMA",
-    "import java.net._"
+    "import java.net._",
+    "import java.nio.file._"
   )

   override protected def internalReplAutorunCode(): Seq[String] =
diff --git a/repl/src/test/scala/org/apache/spark/repl/ReplSuite.scala b/repl/src/test/scala/org/apache/spark/repl/ReplSuite.scala
index 8f9ad8526ca8..02555fdd1535 100644
--- a/repl/src/test/scala/org/apache/spark/repl/ReplSuite.scala
+++ b/repl/src/test/scala/org/apache/spark/repl/ReplSuite.scala
@@ -469,4 +469,12 @@ class ReplSuite extends SparkFunSuite {
       """.stripMargin)
     assertDoesNotContain("error: not found: type URI", output)
   }
+
+  test("SPARK-53131: spark-shell imports java.nio.file._ by default") {
+    val output = runInterpreter("local",
+      """
+        |Path.of("/tmp")
+      """.stripMargin)
+    assertDoesNotContain("error: not found: value Path", output)
+  }
 }