clownxc commented on code in PR #40707: URL: https://github.com/apache/spark/pull/40707#discussion_r1162824918
##########
core/src/main/scala/org/apache/spark/SparkException.scala:
##########
@@ -355,3 +355,24 @@ private[spark] class SparkSQLFeatureNotSupportedException(
   override def getErrorClass: String = errorClass
 }
+
+/**
+ * User error exception thrown from Spark with an error class.
+ */
+private[spark] class SparkUserException(

Review Comment:
   > It's too complicated to use both an error class and an exception type to differentiate errors. I think in principle we should always use `SparkException` with different error classes, except in places that need to stay compatible with old code.
   
   I have changed the code according to this idea.

-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]
For queries about this service, please contact Infrastructure at: [email protected]
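The idea being discussed can be sketched roughly as follows: rather than introducing a new exception subclass per error category (such as the proposed `SparkUserException`), a single exception type carries an error-class string that callers match on. This is a hedged, self-contained illustration — `ErrorClassException`, `failUserError`, and the error-class name below are hypothetical stand-ins, not the actual Spark internals.

```scala
// Simplified sketch of "one exception type + error classes" versus
// "one subclass per error kind". All names here are illustrative.
class ErrorClassException(
    val errorClass: String,
    val messageParameters: Map[String, String])
  extends Exception(s"[$errorClass] " + messageParameters.mkString(", ")) {

  // Mirrors the getErrorClass accessor pattern seen in the diff above.
  def getErrorClass: String = errorClass
}

object ErrorClassDemo {
  // Hypothetical helper: raises a "user error" via an error class,
  // instead of via a dedicated SparkUserException subclass.
  def failUserError(): Nothing =
    throw new ErrorClassException(
      "USER_ERROR.INVALID_INPUT",          // the class string identifies the category
      Map("input" -> "foo"))

  def main(args: Array[String]): Unit = {
    try failUserError()
    catch {
      case e: ErrorClassException =>
        // Callers branch on the error class, not on the exception's runtime type.
        assert(e.getErrorClass == "USER_ERROR.INVALID_INPUT")
        println(e.getErrorClass)
    }
  }
}
```

The trade-off the reviewer points at: error classes keep the exception hierarchy flat and make categories data rather than types, at the cost of losing compile-time `catch`-by-type dispatch for each category.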
