clownxc commented on code in PR #40707:
URL: https://github.com/apache/spark/pull/40707#discussion_r1162834923


##########
core/src/main/scala/org/apache/spark/SparkException.scala:
##########
@@ -355,3 +355,24 @@ private[spark] class SparkSQLFeatureNotSupportedException(
 
   override def getErrorClass: String = errorClass
 }
+
+/**
+ * User error exception thrown from Spark with an error class.
+ */
+private[spark] class SparkUserException(

Review Comment:
   > Question: Are we sure a custom exception is needed for this case? Is there 
any existing exception we can reuse with NPE as cause?
   > 
   > If we want to have a brand new exception, what about 
`SparkNotNullConstraintViolationException` to be more specific? I guess it will 
depend whether we want to skip retries only for this exception type as opposed 
to all Spark exceptions with known error codes.
   
   Thank you very much for the review. My understanding is that we want to skip 
the retry logic for user-triggered errors, not only NPE, so I defined a new 
exception `SparkUserException`.
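
   To illustrate the intent, here is a minimal sketch (not the actual Spark 
code; the class body, retry helper, and names other than `SparkUserException` 
are hypothetical) of how a retry loop could give up immediately on a 
user-triggered error while still retrying transient failures:

   ```scala
   // Hypothetical sketch: a user-error exception wrapping its cause (e.g. an NPE),
   // and a retry helper that rethrows it immediately instead of retrying.
   class SparkUserException(message: String, cause: Throwable)
       extends RuntimeException(message, cause)

   object RetryDemo {
     // Runs `op` up to `maxAttempts` times. A SparkUserException is rethrown
     // on the first attempt -- user errors will not succeed on retry --
     // while other exceptions are retried until attempts run out.
     def withRetries[T](maxAttempts: Int)(op: () => T): T = {
       var attempt = 0
       while (true) {
         attempt += 1
         try {
           return op()
         } catch {
           case e: SparkUserException => throw e        // skip retries entirely
           case e: Exception if attempt < maxAttempts => // transient: try again
         }
       }
       throw new IllegalStateException("unreachable")
     }
   }
   ```

   Whether this should cover only `SparkUserException` or all Spark exceptions 
with known error codes is exactly the design question raised above.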



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]

For queries about this service, please contact Infrastructure at:
[email protected]

