wzhfy edited a comment on pull request #30248:
URL: https://github.com/apache/spark/pull/30248#issuecomment-722185651


   @li36909 Could you update the description? It's currently misleading. 
   
   @HyukjinKwon Actually `sys.exit` can happen when a user writes some 
incorrect code (e.g. trying to create a SparkContext in an executor task when no 
SPARK_HOME is available on that node, or the udf case @li36909 mentioned).   
   We have seen several such cases from less experienced PySpark users.  
   
   It would be better for Spark to throw an exception to the user, rather than 
hanging and leaving the user with no idea what to do.
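
   To illustrate the idea (not the actual patch in this PR): `sys.exit` raises 
`SystemExit`, so a worker-side wrapper can intercept it and surface a regular 
exception to the driver instead of silently killing the worker process. The 
`run_udf` wrapper and `bad_udf` below are hypothetical names for a minimal sketch:

   ```python
   import sys

   def run_udf(udf, value):
       # Hypothetical wrapper: convert a SystemExit raised by user code
       # into an ordinary exception, so the caller sees a clear error
       # instead of a dead worker and a hang.
       try:
           return udf(value)
       except SystemExit as e:
           raise RuntimeError(
               "User code called sys.exit(%s) inside a task; "
               "raising an error instead of exiting the worker." % e.code)

   def bad_udf(x):
       sys.exit(1)  # incorrect user code: would exit the worker process
   ```

   With this sketch, `run_udf(bad_udf, 0)` raises a `RuntimeError` that names 
the offending call, while a well-behaved udf passes through unchanged.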


----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
[email protected]


