GitHub user felixcheung commented on a diff in the pull request:
https://github.com/apache/spark/pull/21368#discussion_r189427017
--- Diff:
repl/scala-2.11/src/main/scala/org/apache/spark/repl/SparkILoop.scala ---
@@ -44,7 +44,14 @@ class SparkILoop(in0: Option[BufferedReader], out: JPrintWriter)
  @transient val spark = if (org.apache.spark.repl.Main.sparkSession != null) {
      org.apache.spark.repl.Main.sparkSession
    } else {
-     org.apache.spark.repl.Main.createSparkSession()
+     try {
+       org.apache.spark.repl.Main.createSparkSession()
+     } catch {
+       case e: Exception =>
+         println("Failed to initialize Spark session:")
+         e.printStackTrace()
+         sys.exit(1)
--- End diff --
my concern is that SparkILoop is used in a number of settings outside of Spark
and its shell/repl, so sys.exit might not be ideal in some cases
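
One hedged sketch of the alternative the comment hints at (not the actual PR change; `initSession` and its function parameter are hypothetical names standing in for the real `createSparkSession` call): report the failure but rethrow instead of calling `sys.exit`, so a host application embedding SparkILoop can decide how to handle it rather than having the whole JVM terminate.

```scala
// Sketch only: instead of sys.exit(1), wrap and rethrow the failure so
// embedders of SparkILoop (notebooks, IDEs, test harnesses) keep control.
def initSession[A](create: () => A): A =
  try {
    create()
  } catch {
    case e: Exception =>
      // Still surface the error for interactive users...
      Console.err.println("Failed to initialize Spark session:")
      e.printStackTrace()
      // ...but let the caller decide whether to exit the JVM.
      throw new IllegalStateException("Spark session initialization failed", e)
  }
```

A standalone spark-shell could catch this at the top level and exit with a nonzero status, preserving the current behavior there while leaving embedded uses unharmed.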
---
---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]