parthchandra commented on code in PR #46904:
URL: https://github.com/apache/spark/pull/46904#discussion_r1633981211
##########
core/src/main/scala/org/apache/spark/util/AccumulatorV2.scala:
##########
@@ -165,8 +165,13 @@ abstract class AccumulatorV2[IN, OUT] extends Serializable {
   final protected def writeReplace(): Any = {
     if (atDriverSide) {
       if (!isRegistered) {
+        val printName = if (this.name.isDefined) {
+          s" '${this.name.get}' "
+        } else {
+          " "
+        }
         throw new UnsupportedOperationException(
-          "Accumulator must be registered before send to executor")
+          s"Accumulator${printName}must be registered before send to executor")
       }
Review Comment:
You're right, if everything is done correctly we never hit this. I hit it when I
added some DSV2 metrics and had a bug in my code, and it took a while to find
the cause.
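
For context, here is a minimal sketch of the registration requirement this check
enforces. It assumes a local SparkContext and the built-in LongAccumulator;
`sc.longAccumulator` and `SparkContext.register` are the real Spark APIs, while the
object name, app name, and accumulator name are illustrative only:

```scala
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.util.LongAccumulator

object AccumulatorRegistrationSketch {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(
      new SparkConf().setMaster("local[2]").setAppName("accumulator-registration-sketch"))

    // Registered through SparkContext, so writeReplace() succeeds when the
    // task closure is serialized and shipped to executors.
    val registered = sc.longAccumulator("rowsSeen")
    sc.parallelize(1 to 10).foreach(_ => registered.add(1))
    println(s"registered accumulator value: ${registered.value}")

    // Instantiated directly and never passed to sc.register(...): serializing
    // the task on the driver hits writeReplace(), which throws the
    // UnsupportedOperationException whose message this PR improves.
    val unregistered = new LongAccumulator
    try {
      sc.parallelize(1 to 10).foreach(_ => unregistered.add(1))
    } catch {
      case e: Exception =>
        // The accumulator error may surface wrapped (e.g. in a SparkException),
        // so look at the cause chain when inspecting it.
        println(s"failed as expected: $e, cause: ${e.getCause}")
    }

    sc.stop()
  }
}
```

With an unnamed accumulator the old message gave nothing to search for; including the
name (when one is set) points straight at the offending metric.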