Github user felixcheung commented on a diff in the pull request:

    https://github.com/apache/spark/pull/21368#discussion_r189781240
  
    --- Diff: repl/scala-2.12/src/main/scala/org/apache/spark/repl/SparkILoop.scala ---
    @@ -37,7 +37,14 @@ class SparkILoop(in0: Option[BufferedReader], out: JPrintWriter)
         @transient val spark = if (org.apache.spark.repl.Main.sparkSession != null) {
             org.apache.spark.repl.Main.sparkSession
           } else {
    -        org.apache.spark.repl.Main.createSparkSession()
    +        try {
    +          org.apache.spark.repl.Main.createSparkSession()
    +        } catch {
    +          case e: Exception =>
    +            println("Failed to initialize Spark session:")
    +            e.printStackTrace()
    +            sys.exit(1)
    +        }
    --- End diff ---
    
    There's a way, but in general we do not exit/terminate the R session, since SparkR could be running in an interactive session (e.g. RStudio).
    
    One possible approach is to exit only when running the sparkR shell, by checking here:
    https://github.com/apache/spark/blob/master/R/pkg/inst/profile/shell.R#L27
    
    I'm not sure stop() vs. exit makes much of a difference, though.
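
    For illustration, a minimal sketch of what that could look like in shell.R, assuming a .First() hook initializes the session (the names here are illustrative, not the actual file contents):

        .First <- function() {
          spark <- tryCatch(
            SparkR::sparkR.session(),
            error = function(e) {
              message("Failed to initialize Spark session:")
              message(conditionMessage(e))
              # quit() terminates the whole R process; that is acceptable here
              # only because this profile runs exclusively for the sparkR shell,
              # never for RStudio or a plain interactive R session.
              quit(save = "no", status = 1)
            }
          )
          assign("spark", spark, envir = .GlobalEnv)
        }

    In an interactive session, stop() would be the safer choice: it raises an error and returns control to the prompt rather than killing the process.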


