[ https://issues.apache.org/jira/browse/SPARK-23502?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16380942#comment-16380942 ]

Sital Kedia commented on SPARK-23502:
-------------------------------------

>> what happens when you operate on {{sc}} before it's initialized?

We will wait for the future to complete before triggering any action based on 
user input, so that should be fine.
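
As an illustration, here is a minimal sketch of what I have in mind (hypothetical names, not the actual spark-shell code): the session is built inside a Future at startup, and anything driven by user input awaits that future first.

{code:scala}
import scala.concurrent.{Await, Future}
import scala.concurrent.ExecutionContext.Implicits.global
import scala.concurrent.duration.Duration

import org.apache.spark.sql.SparkSession

// Started asynchronously at shell startup; the prompt does not block on this.
val sessionFuture: Future[SparkSession] = Future {
  SparkSession.builder().appName("spark-shell").getOrCreate()
}

// Every user-triggered action waits for initialization to finish first.
def withSpark[T](body: SparkSession => T): T = {
  val spark = Await.result(sessionFuture, Duration.Inf)
  body(spark)
}

// e.g. withSpark(_.range(10).count())
{code}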

>> Is it surprising if Spark shell starts but errors out 20 seconds later? 

Yes, that might be one of the side effects. Another major side effect, as I 
mentioned, is not being able to print messages like the UI link and app id when 
spark-shell starts.

 

I just wanted to get some opinions on whether this is something useful for the 
community. If we do not think so, we can close this.

> Support async init of spark context during spark-shell startup
> --------------------------------------------------------------
>
>                 Key: SPARK-23502
>                 URL: https://issues.apache.org/jira/browse/SPARK-23502
>             Project: Spark
>          Issue Type: Improvement
>          Components: Spark Shell
>    Affects Versions: 2.0.0
>            Reporter: Sital Kedia
>            Priority: Minor
>
> Currently, whenever a user starts the spark shell, we initialize the spark 
> context before returning the prompt to the user. In environments where spark 
> context initialization takes several seconds, making the user wait for the 
> prompt is not a good user experience. Instead of waiting for the spark context 
> to initialize, we can initialize it in the background and return the prompt to 
> the user as soon as possible. Please note that even if we return the prompt 
> early, we still need to make sure the spark context initialization completes 
> before any query is executed.
> Please note that the scala interpreter already does very similar async 
> initialization in order to return the prompt to the user faster - 
> https://github.com/scala/scala/blob/v2.12.2/src/repl/scala/tools/nsc/interpreter/ILoop.scala#L414.
> We will be emulating that behavior for Spark.
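
For reference, a minimal sketch of the background-initialization pattern described above (hypothetical, not the actual spark-shell code or the proposed patch): initialization runs in a future, and the usual startup messages (app id, Web UI link) are printed from a callback once it completes rather than before the prompt is shown.

{code:scala}
import scala.concurrent.Future
import scala.concurrent.ExecutionContext.Implicits.global
import scala.util.{Failure, Success}

import org.apache.spark.sql.SparkSession

// Hypothetical sketch: initialize in the background instead of blocking the prompt.
val sessionFuture: Future[SparkSession] = Future {
  SparkSession.builder().appName("spark-shell").getOrCreate()
}

// Print the usual startup messages once initialization completes.
sessionFuture.onComplete {
  case Success(spark) =>
    val sc = spark.sparkContext
    sc.uiWebUrl.foreach(url => println(s"Spark context Web UI available at $url"))
    println(s"Spark context available as 'sc' (app id = ${sc.applicationId}).")
  case Failure(e) =>
    println(s"Failed to initialize Spark context: ${e.getMessage}")
}
{code}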


