Github user srowen commented on the issue:

    https://github.com/apache/spark/pull/21770
  
    PS I don't think SparkException should be a RuntimeException even if that 
were possible. If you need the Scala code to declare SparkException in the 
bytecode for the benefit of a Java compiler, you can get it to do that with 
the `@throws[...]` annotation.
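
    A minimal sketch of what that annotation does (an assumed illustrative 
example, not code from this PR; `java.io.IOException` stands in for any 
checked exception such as SparkException): Scala methods do not declare 
checked exceptions by default, but `@throws[...]` makes scalac emit a 
`throws` clause in the bytecode, so javac callers must catch or declare it.

    ```scala
    import java.io.IOException

    object ThrowsDemo {
      // @throws makes scalac emit `throws IOException` in the bytecode,
      // so a Java caller sees this as a checked exception.
      @throws[IOException]
      def risky(fail: Boolean): String =
        if (fail) throw new IOException("boom") else "ok"

      def main(args: Array[String]): Unit = {
        println(risky(fail = false))
        try risky(fail = true)
        catch { case e: IOException => println(s"caught: ${e.getMessage}") }
      }
    }
    ```

    From Scala itself the annotation changes nothing at the call site; it 
only matters to Java compilers inspecting the generated class file.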

