GitHub user srowen commented on the pull request:

    https://github.com/apache/spark/pull/12418#issuecomment-211817268
  
    I suppose this argument holds for lots of methods and lots of exceptions. 
For example, you can't catch `SparkException` directly from Java for any API 
method, since none of them declare it. It will still be thrown, and it can be 
handled by catching `Exception`; in many cases that's all the driver program 
would care to do anyway -- trap it and fail or move on.
    
    "await" methods in the API however use an exception as part of their 
control flow. Whether or not it timed out depends on whether it returned with 
or without throwing. So it's fairly important to be able to handle that 
exception in the Java caller. I don't know if the same applies for general 
actions; they can throw `InterruptedException` but would I want to handle that?
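    To illustrate why (a minimal sketch; the `Awaitable` interface and method 
name below are hypothetical stand-ins, not Spark API): once the checked 
exception appears in the method's signature, which for a Scala method means 
adding an `@throws` annotation, the Java caller can separate the timeout path 
from normal completion.

```java
import java.util.concurrent.TimeoutException;

public class AwaitCaller {
  // Hypothetical stand-in for an "await"-style API method whose bytecode
  // signature declares the checked exception (via @throws on the Scala side).
  interface Awaitable {
    void awaitResult(long timeoutMs) throws TimeoutException;
  }

  static void run(Awaitable a) {
    try {
      a.awaitResult(10_000L);
      System.out.println("completed within the timeout");
    } catch (TimeoutException e) {
      // The timeout is signalled only by this exception, not by a return value,
      // so catching it is the only way to distinguish the two outcomes.
      System.out.println("timed out; retry or give up");
    }
  }
}
```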
    
    While I think this change is right, I'm also wondering whether there are 
logically related changes that also have to be made for consistency, something 
short of "annotating every single Java API method"! WDYT?

