clownxc commented on code in PR #40707:
URL: https://github.com/apache/spark/pull/40707#discussion_r1168952750
##########
core/src/main/scala/org/apache/spark/SparkException.scala:
##########
@@ -355,3 +355,24 @@ private[spark] class SparkSQLFeatureNotSupportedException(
override def getErrorClass: String = errorClass
}
+
+/**
+ * User error exception thrown from Spark with an error class.
+ */
+private[spark] class SparkUserException(
Review Comment:
> There can be thousands of user-triggered errors while the transient errors
are likely to be less than 10. I think it's better to define a new exception
for transient errors.
During the implementation I realized that if we adopt the `define a
new exception` approach, the exception type behind an `error_class` may have to
change. For example, `_UNABLE_TO_ACQUIRE_MEMORY` might move from
`SparkOutOfMemoryError` to `SparkTransientError`, but we still need to use
`SparkOutOfMemoryError` in many places, and `SparkOutOfMemoryError` cannot
extend `SparkTransientError` because it already extends
`java.lang.OutOfMemoryError`.
```java
} catch (SparkOutOfMemoryError e) {
  // should have triggered spilling
  if (!inMemSorter.hasSpaceForAnotherRecord()) {
    logger.error("Unable to grow the pointer array");
    throw e;
  }
}
```
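To illustrate why inheritance is the obstacle: Java only allows single class inheritance, so a class that already extends `OutOfMemoryError` cannot also extend a `SparkTransientError` class. A marker interface would sidestep that limit. This is only a rough sketch; all names below are hypothetical, not Spark's actual API:

```java
// Hypothetical marker interface for transient errors (not in Spark).
interface TransientError {}

// Sketch of an OOM error that keeps its existing superclass while
// also carrying the transient marker.
class SparkOutOfMemoryErrorSketch extends OutOfMemoryError implements TransientError {
    SparkOutOfMemoryErrorSketch(String message) {
        super(message);
    }
}

public class TransientDemo {
    // Retry logic can check the marker instead of the concrete type.
    static boolean isTransient(Throwable t) {
        return t instanceof TransientError;
    }

    public static void main(String[] args) {
        Throwable e = new SparkOutOfMemoryErrorSketch("_UNABLE_TO_ACQUIRE_MEMORY");
        // Existing `catch (SparkOutOfMemoryError e)` sites would keep
        // working, since the superclass is unchanged.
        System.out.println(isTransient(e)); // true
    }
}
```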
So I think using a special prefix may be a better idea. I'm not sure whether
this approach is right; please share your thoughts when you have time. @cloud-fan
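A rough sketch of what the prefix check could look like, so existing exception types stay unchanged and callers inspect the error class name instead. The `TRANSIENT_` prefix and the class name are assumptions for illustration only:

```java
public class TransientErrorClasses {
    // Hypothetical naming convention: transient error classes share a prefix.
    private static final String TRANSIENT_PREFIX = "TRANSIENT_";

    // Classify by error-class name rather than by Java exception type.
    static boolean isTransient(String errorClass) {
        return errorClass != null && errorClass.startsWith(TRANSIENT_PREFIX);
    }

    public static void main(String[] args) {
        System.out.println(isTransient("TRANSIENT_UNABLE_TO_ACQUIRE_MEMORY")); // true
        System.out.println(isTransient("DIVIDE_BY_ZERO"));                     // false
    }
}
```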
--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
To unsubscribe, e-mail: [email protected]
For queries about this service, please contact Infrastructure at:
[email protected]