Github user aarondav commented on a diff in the pull request:

    https://github.com/apache/spark/pull/5004#discussion_r26441964
  
    --- Diff: core/src/main/scala/org/apache/spark/util/Utils.scala ---
    @@ -1156,6 +1156,18 @@ private[spark] object Utils extends Logging {
       }
     
       /**
    +   * Execute a block of code that evaluates to Unit, stop SparkContext is any uncaught exception
    --- End diff ---
    
    Add a comment contrasting this to `tryOrExit`, saying that this method is suitable for the driver, while `tryOrExit` should be used for other JVMs started by Spark, over which we have full control. Also, the second part should say something like "stopping the SparkContext if there is any uncaught exception."

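    To make the suggestion concrete, here's a rough sketch of what the revised Scaladoc and method could look like. The method name `tryOrStopSparkContext`, its signature, and the `NonFatal` handling are my assumptions (the diff only shows the doc line); it's written as if it lives in `Utils`, which extends `Logging`, so `logError` is in scope:
    
    ```scala
    import scala.util.control.NonFatal
    
    import org.apache.spark.SparkContext
    
    /**
     * Execute a block of code that evaluates to Unit, stopping the SparkContext if there is
     * any uncaught exception.
     *
     * Unlike `Utils.tryOrExit`, which is meant for other JVMs started by Spark (over which
     * we have full control), this method is suitable for the driver, where exiting the
     * whole JVM would be too drastic.
     */
    def tryOrStopSparkContext(sc: SparkContext)(block: => Unit): Unit = {
      try {
        block
      } catch {
        case NonFatal(t) =>
          // Log the failure, then shut down the SparkContext rather than the whole JVM.
          logError(s"Uncaught exception in thread ${Thread.currentThread().getName}", t)
          sc.stop()
      }
    }
    ```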
