GitHub user zsxwing commented on a diff in the pull request:

    https://github.com/apache/spark/pull/12427#discussion_r60626984
  
    --- Diff: sql/core/src/main/scala/org/apache/spark/sql/execution/ui/SQLListener.scala ---
    @@ -38,7 +38,13 @@ case class SparkListenerSQLExecutionStart(
       extends SparkListenerEvent
     
     @DeveloperApi
    -case class SparkListenerSQLExecutionEnd(executionId: Long, time: Long)
    +case class SparkListenerSQLExecutionEnd(
    +    executionId: Long,
    +    time: Long,
    +    // This allows the driver to update an accumulator's value and have it reported
    +    // on the SQL UI. The accumulator needs to be part of SparkPlan.metrics and updated
    +    // in doExecute or doConsume, depending on whether whole stage codegen is enabled.
    +    driverAccumUpdates: Seq[AccumulableInfo] = Seq.empty[AccumulableInfo])
    --- End diff --
    
    Because this class uses a default parameter value, `SparkListenerSQLExecutionEnd` still has only one constructor: `this(executionId, time, driverAccumUpdates)`. When we recover an event from an old JSON string that has no `driverAccumUpdates` field, the deserializer sets `driverAccumUpdates` to `null` instead of applying the default.
    
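    To see why: Scala compiles a default argument into a synthetic method on the companion object rather than into an extra constructor, so a reflection-based deserializer like Jackson (which `JsonProtocol` falls back to for events it doesn't handle explicitly) never applies the default. Here is a minimal standalone sketch; `Foo` and `DefaultParamDemo` are hypothetical names for illustration, and it assumes jackson-module-scala's behavior of not applying Scala defaults for missing fields:
    ```Scala
    import com.fasterxml.jackson.databind.ObjectMapper
    import com.fasterxml.jackson.module.scala.DefaultScalaModule

    // Hypothetical class, just to illustrate the problem with default parameters.
    case class Foo(a: Int, b: Seq[String] = Seq.empty)

    object DefaultParamDemo extends App {
      val mapper = new ObjectMapper().registerModule(DefaultScalaModule)
      // "Old" JSON without the "b" field: Jackson reflects on Foo's single
      // constructor and passes null for "b", because the default value lives in
      // the synthetic method Foo.$lessinit$greater$default$2, which Jackson
      // never calls.
      val foo = mapper.readValue("""{"a": 1}""", classOf[Foo])
      assert(foo.b == null)
    }
    ```
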
    Hence, we cannot use a default parameter here. Instead, we should add the extra constructor and a companion `apply` manually, like this:
    ```Scala
    @DeveloperApi
    case class SparkListenerSQLExecutionEnd(
        executionId: Long,
        time: Long,
        // This allows the driver to update an accumulator's value and have it reported
        // on the SQL UI. The accumulator needs to be part of SparkPlan.metrics and updated
        // in doExecute or doConsume, depending on whether whole stage codegen is enabled.
        driverAccumUpdates: Seq[AccumulableInfo])
      extends SparkListenerEvent {
    
      def this(executionId: Long, time: Long) {
        this(executionId, time, Seq.empty)
      }
    }
    
    object SparkListenerSQLExecutionEnd {
      def apply(executionId: Long, time: Long): SparkListenerSQLExecutionEnd = {
        new SparkListenerSQLExecutionEnd(executionId, time)
      }
    }
    ```
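
    One note on the snippet above: the explicit `apply` in the companion object is needed because the compiler-generated case class `apply` only matches the primary (three-argument) constructor, so without it, existing call sites such as `SparkListenerSQLExecutionEnd(1L, 123L)` would stop compiling.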
    
    Could you also add a test like this to make sure we don't break backward compatibility with event logs written by older versions?
    ```Scala
        // Simulate an event log written by an old Spark version, which has no
        // "driverAccumUpdates" field, and check that we can still read it back.
        val event = SparkListenerSQLExecutionEnd(1L, 123L)
        val json = JsonProtocol.sparkEventToJson(event)
        val oldJson = json.removeField { _._1 == "driverAccumUpdates" }
        assert(event === JsonProtocol.sparkEventFromJson(oldJson))
    ```