Github user kxepal commented on the issue:

    https://github.com/apache/spark/pull/15961
  
    @holdenk 
    I thought about using a warning there, but found it would likely be a 
useless one. When I stop a Spark context, I actually want to achieve one of 
two things:
    1. Clean things up before exit;
    2. Start a new Spark context with a different configuration, or just 
restart a broken one (see the sketch after this list).
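    
    For the second case, a minimal sketch (assuming a standard PySpark 
setup; the config key here is only an example) could look like this:
    
    ```python
    from pyspark import SparkConf, SparkContext
    
    sc.stop()  # shut down the current context first
    
    # restart with a different configuration
    conf = SparkConf().set("spark.executor.memory", "4g")
    sc = SparkContext(conf=conf)
    ```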
    
    In both cases I wouldn't care much about the underlying JVM process 
state - I'm shutting things down; it's over, no matter how healthy or broken 
they are.
    
    Warnings are good for drawing your attention to some problem and hinting 
at actions to solve it. For instance, Spark warns you if you pass an unknown 
key to SparkConf - it's not fatal, but it's the kind of warning you can act 
on.
    
    In our case I can do nothing with this warning. I could only say "oh, 
ok", but really there is no action that could be taken to solve the problem.
    
    A different case is when the JVM process dies in the middle of 
something, when you don't expect it. There you'll still get a Py4JError 
exception with the same not-very-useful "Connection refused" message, but in 
that case you have to take some action to solve the issue (restart the 
SparkContext, increase driver memory, optimize your code flow, etc.). In the 
case of a Py4JError on `sc.stop()`, most likely you won't have to do 
anything special, and shouldn't.
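    
    To make this concrete, the behaviour I'm after is roughly the following 
sketch (not the exact diff in this PR; `_jsc` is PySpark's internal handle 
to the JVM-side context):
    
    ```python
    from py4j.protocol import Py4JError
    
    # sketch of pyspark.SparkContext.stop
    def stop(self):
        """Shut down the SparkContext; a dead JVM is not an error here."""
        try:
            if self._jsc is not None:
                self._jsc.stop()
        except Py4JError:
            # The gateway JVM is already gone ("Connection refused").
            # The user asked us to stop, so there is nothing actionable
            # left to report - swallow the error instead of raising.
            pass
        finally:
            self._jsc = None
    ```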
    
    Let me know what you think.

