MaxGekk commented on code in PR #48440:
URL: https://github.com/apache/spark/pull/48440#discussion_r1885806079
##########
common/utils/src/main/scala/org/apache/spark/SparkException.scala:
##########
@@ -24,20 +24,14 @@ import java.util.ConcurrentModificationException
 import scala.jdk.CollectionConverters._
-class SparkException(
+class SparkException private(

Review Comment:
   > This constructor was exposed in 3.2, right?

   Yep, it has existed for a while.

   > Making this a backwardly incompatible change?

   Yes, but `SparkException` is supposed to be raised mostly by Spark itself. And if we are going to migrate to the new error framework, I do believe we should eventually eliminate Spark exceptions without error classes/conditions.

   > Users can raise SparkException for a variety of reasons - including in test code.

   Users might use dedicated error conditions for tests, for instance `_LEGACY_ERROR_USER_RAISED_EXCEPTION` or `USER_RAISED_EXCEPTION`.
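   A minimal sketch of what that could look like in user test code, assuming the errorClass-based auxiliary constructor of `SparkException` stays public and that `USER_RAISED_EXCEPTION` takes an `errorMessage` parameter (the helper object and its method name below are hypothetical, not part of this PR):

   ```scala
   import org.apache.spark.SparkException

   // Hypothetical test helper: build a SparkException bound to an error
   // condition instead of calling the message-only primary constructor,
   // which this PR proposes to make private.
   object TestErrors {
     def userRaised(msg: String): SparkException =
       new SparkException(
         // Assumed: USER_RAISED_EXCEPTION expects a single `errorMessage`
         // parameter (see error-conditions.json for the exact template).
         errorClass = "USER_RAISED_EXCEPTION",
         messageParameters = Map("errorMessage" -> msg),
         cause = null)
   }

   // Possible usage in a test (assertion method name may differ by version):
   //   val e = intercept[SparkException] { throw TestErrors.userRaised("boom") }
   //   assert(e.getErrorClass == "USER_RAISED_EXCEPTION")
   ```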
