GitHub user vanzin opened a pull request:

    https://github.com/apache/spark/pull/21368

    [SPARK-16451][repl] Fail shell if SparkSession fails to start.

    Currently, if the session fails to start in spark-shell, the
    user sees a series of unrelated errors caused by shell
    initialization code that references the "spark" variable,
    which does not exist in that case. For example:
    
    ```
    <console>:14: error: not found: value spark
           import spark.sql
    ```
    
    The user is also left with a non-working shell (unless they
    only want to write non-Spark Scala or Python code, that is).
    
    This change fails the whole shell session at the point where the
    failure occurs, so that the last error message is the one with
    the actual information about the failure.
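
    The gist of the fix, sketched below for the pyspark side (a
    minimal illustration of the idea, not the actual shell.py
    change): wrap session creation in a try/except, print the real
    error, and exit the process instead of letting the rest of the
    shell initialization run against a missing "spark" variable:

    ```python
    import sys
    import traceback

    try:
        from pyspark.sql import SparkSession
        # Any error raised while the SparkContext/SparkSession
        # starts up lands here.
        spark = SparkSession.builder.getOrCreate()
    except Exception:
        traceback.print_exc()
        # Abort the shell immediately so the last message the user
        # sees is the actual failure, not follow-on "not found:
        # value spark" errors from code referencing the variable.
        sys.exit(1)
    ```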
    
    Tested with spark-shell and pyspark (with Python 2.7 and 3.5),
    by forcing an error during SparkContext initialization.

You can merge this pull request into a Git repository by running:

    $ git pull https://github.com/vanzin/spark SPARK-16451

Alternatively you can review and apply these changes as the patch at:

    https://github.com/apache/spark/pull/21368.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

    This closes #21368
    
----
commit b748d346e8c46e31e52f5bee8fade63b2155ac83
Author: Marcelo Vanzin <vanzin@...>
Date:   2018-05-18T22:35:59Z

    [SPARK-16451][repl] Fail shell if SparkSession fails to start.
    

----

