[ https://issues.apache.org/jira/browse/SPARK-23502?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16375051#comment-16375051 ]

Sital Kedia commented on SPARK-23502:
-------------------------------------

I realized that we print the web UI URL and the application id during 
spark-shell startup 
(https://github.com/scala/scala/blob/v2.12.2/src/repl/scala/tools/nsc/interpreter/ILoop.scala#L414).
If we initialize the spark context asynchronously, that information will not 
be available at startup, so we won't be able to print it.

[~r...@databricks.com], [~srowen] - What do you think about async 
initialization of the spark context and getting rid of the web UI URL and 
application id that are printed during startup?
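
For context, here is a minimal sketch of what the current startup banner needs 
from a fully initialized context (printWelcomeBanner is a hypothetical name; 
uiWebUrl and applicationId are the SparkContext fields the shell prints today):

import org.apache.spark.SparkContext

// Hypothetical helper, for illustration only: the banner reads fields that
// only exist once the SparkContext is fully up, so an async init would have
// to either block on the init future before printing or drop these lines.
def printWelcomeBanner(sc: SparkContext): Unit = {
  // uiWebUrl is an Option[String], populated once the web UI is bound.
  sc.uiWebUrl.foreach(url => println(s"Spark context Web UI available at $url"))
  println(s"Spark context available as 'sc' (app id = ${sc.applicationId}).")
}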

 

 

> Support async init of spark context during spark-shell startup
> --------------------------------------------------------------
>
>                 Key: SPARK-23502
>                 URL: https://issues.apache.org/jira/browse/SPARK-23502
>             Project: Spark
>          Issue Type: Improvement
>          Components: Spark Shell
>    Affects Versions: 2.0.0
>            Reporter: Sital Kedia
>            Priority: Minor
>
> Currently, whenever a user starts the spark shell, we initialize the spark 
> context before returning the prompt to the user. In environments where spark 
> context initialization takes several seconds, making the user wait for the 
> prompt is not a good experience. Instead of waiting for the spark context to 
> be initialized, we can initialize it in the background and return the prompt 
> to the user as soon as possible. Note that even if the prompt is returned 
> early, we still need to wait for the spark context initialization to 
> complete before any query is executed.
> Note that the scala interpreter already does very similar async 
> initialization in order to return the prompt to the user faster - 
> https://github.com/scala/scala/blob/v2.12.2/src/repl/scala/tools/nsc/interpreter/ILoop.scala#L414.
> We will emulate that behavior for Spark.
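
To make the flow described above concrete, here is a rough sketch of the idea 
(AsyncSparkInit and awaitSession are hypothetical names; the real change would 
live in the spark-shell startup code, this only illustrates the Future-based 
approach):

import scala.concurrent.{Await, Future}
import scala.concurrent.ExecutionContext.Implicits.global
import scala.concurrent.duration.Duration

import org.apache.spark.sql.SparkSession

// Hypothetical illustration: start context creation in the background, give
// the prompt back immediately, and block on the future only when the first
// user statement actually needs Spark.
object AsyncSparkInit {
  // Kicked off eagerly at shell startup; the REPL prompt does not wait on it.
  private val sessionFuture: Future[SparkSession] = Future {
    SparkSession.builder().appName("spark-shell").getOrCreate()
  }

  // Called before executing the first user statement, mirroring how ILoop
  // awaits its own async initialization before running commands.
  def awaitSession(): SparkSession = Await.result(sessionFuture, Duration.Inf)
}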


