Sean Owen commented on SPARK-23502:

It does introduce some new cases to deal with, such as what happens when you 
operate on {{sc}} before it is initialized, but it could have some value in 
letting people get started and making the shell feel more responsive. Is it 
surprising if the Spark shell starts but errors out 20 seconds later? And yes, 
you might lose some useful output that is available when startup blocks until 
things are ready.

For the common case where this is a problem, where the cluster takes a long 
time to fulfill the request for many executors, dynamic allocation can help.
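
As a sketch of that workaround, dynamic allocation lets the shell start with few executors and scale up on demand, so startup does not block on a large static allocation. The flags are standard Spark configuration properties; the values here are illustrative only:

```shell
# Start spark-shell without waiting for a large static executor count;
# executors are requested incrementally as work arrives.
spark-shell \
  --conf spark.dynamicAllocation.enabled=true \
  --conf spark.dynamicAllocation.minExecutors=1 \
  --conf spark.dynamicAllocation.maxExecutors=200 \
  --conf spark.shuffle.service.enabled=true
```

Note that dynamic allocation requires the external shuffle service on the workers, hence the last flag.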

> Support async init of spark context during spark-shell startup
> --------------------------------------------------------------
>                 Key: SPARK-23502
>                 URL: https://issues.apache.org/jira/browse/SPARK-23502
>             Project: Spark
>          Issue Type: Improvement
>          Components: Spark Shell
>    Affects Versions: 2.0.0
>            Reporter: Sital Kedia
>            Priority: Minor
> Currently, whenever a user starts the spark shell, we initialize the spark 
> context before returning the prompt to the user. In environments where spark 
> context initialization takes several seconds, waiting for the prompt is not 
> a good user experience. Instead of waiting for the initialization of the 
> spark context, we can initialize it in the background and return the prompt 
> to the user as soon as possible. Please note that even if we return the 
> prompt to the user early, we still need to make sure that the spark context 
> initialization completes before any query is executed. 
> Please note that the scala interpreter already does a very similar async 
> initialization in order to return the prompt to the user faster - 
> https://github.com/scala/scala/blob/v2.12.2/src/repl/scala/tools/nsc/interpreter/ILoop.scala#L414.
>  We will be emulating that behavior for Spark. 
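
The shape of the proposal can be sketched in plain Scala: kick off the expensive initialization in a background future when the shell launches, and block on that future the first time anything needs the context. This is a minimal sketch, not Spark's actual implementation; `slowInit` and the object name are hypothetical stand-ins for the real SparkContext construction.

```scala
import scala.concurrent.{Await, Future}
import scala.concurrent.duration._
import scala.concurrent.ExecutionContext.Implicits.global

object AsyncInitSketch {
  // Hypothetical stand-in for the expensive SparkContext construction.
  private def slowInit(): String = {
    Thread.sleep(100) // simulate slow cluster setup
    "context-ready"
  }

  // Start initialization in the background as soon as the shell launches;
  // the prompt can be returned to the user immediately.
  private val contextFuture: Future[String] = Future(slowInit())

  // Any command that touches the context blocks here until it is ready,
  // which preserves the guarantee that no query runs before init completes.
  def context: String = Await.result(contextFuture, 1.minute)

  def main(args: Array[String]): Unit =
    println(context)
}
```

The key property is that the blocking moves from shell startup to first use, so a user who spends the first seconds typing pays no extra cost.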

This message was sent by Atlassian JIRA
