Github user vanzin commented on a diff in the pull request:

    https://github.com/apache/spark/pull/22571#discussion_r222841579
  
    --- Diff: core/src/main/scala/org/apache/spark/SparkContext.scala ---
    @@ -2434,8 +2434,15 @@ class SparkContext(config: SparkConf) extends Logging {
           val schedulingMode = getSchedulingMode.toString
           val addedJarPaths = addedJars.keys.toSeq
           val addedFilePaths = addedFiles.keys.toSeq
    +      // SPARK-25392 pool Information should be stored in the event
    +      val poolInformation = getAllPools.map { it =>
    +        val xmlString = ("<pool><item PoolName=\"%s\" MinimumShare=\"%d\"" +
    --- End diff ---
    
    Hmm, really don't like this kind of ad-hoc serialization format.
    
    Instead, create a proper pool class for the REST API, and use that class to transfer pool information around.
    
    Because this is a legacy event (not serialized by Jackson), you'll need some custom serialization in `JsonProtocol.scala`, but that's cleaner than what you have in this change.
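    
    Roughly something like the sketch below (untested; the class name, field names, and JSON keys are only illustrative, not an existing Spark API):
    
    ```scala
    // Hypothetical REST-API-style class carrying just the pool fields the event needs.
    case class PoolInfo(
        name: String,
        minShare: Int,
        weight: Int,
        schedulingMode: String)
    
    // Hand-written json4s (de)serialization in the style JsonProtocol.scala
    // already uses for legacy events.
    object PoolInfoJson {
      import org.json4s._
      import org.json4s.JsonDSL._
    
      private implicit val formats: Formats = DefaultFormats
    
      // Serialize one pool to a JValue.
      def poolInfoToJson(pool: PoolInfo): JValue = {
        ("Pool Name" -> pool.name) ~
        ("Minimum Share" -> pool.minShare) ~
        ("Weight" -> pool.weight) ~
        ("Scheduling Mode" -> pool.schedulingMode)
      }
    
      // Deserialize it back when replaying event logs.
      def poolInfoFromJson(json: JValue): PoolInfo = {
        PoolInfo(
          (json \ "Pool Name").extract[String],
          (json \ "Minimum Share").extract[Int],
          (json \ "Weight").extract[Int],
          (json \ "Scheduling Mode").extract[String])
      }
    }
    ```
    
    Then the event can carry a `Seq` of those instead of an XML string, and the REST API can expose the same class directly.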


---
