shrutig opened a new pull request #29872:
URL: https://github.com/apache/spark/pull/29872


   ### What changes were proposed in this pull request?
   When `peakMemoryMetrics` in `ExecutorSummary` is `Option.empty`, the `ExecutorMetricsJsonSerializer#serialize` method does not call `jsonGenerator.writeObject`. As a result, the `peakMemoryMetrics` field name is written to the serialized JSON without a corresponding value.
   An exception is then thrown when the next key, `attributes`, is written:
   `com.fasterxml.jackson.core.JsonGenerationException: Can not write a field name, expecting a value`
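
   For context, here is a minimal sketch of how a serializer for an optional metrics field can avoid leaving a dangling field name, assuming the empty case writes an explicit `null`. The `PeakMetrics` type and the class name below are illustrative stand-ins, not the actual Spark classes or the exact patch:

```scala
import com.fasterxml.jackson.core.JsonGenerator
import com.fasterxml.jackson.databind.{JsonSerializer, SerializerProvider}

// Hypothetical stand-in for the peak executor metrics, reduced to a plain
// map of metric name -> peak value.
case class PeakMetrics(values: Map[String, Long])

// Sketch of a serializer for Option[PeakMetrics]. The important part is the
// None branch: emitting an explicit null means the field name Jackson has
// already written always gets a value, so the generator state stays valid
// when the next field is serialized.
class PeakMetricsOptionSerializer extends JsonSerializer[Option[PeakMetrics]] {
  override def serialize(
      metrics: Option[PeakMetrics],
      jsonGenerator: JsonGenerator,
      serializerProvider: SerializerProvider): Unit = {
    metrics match {
      case Some(m) => jsonGenerator.writeObject(m.values)
      case None    => jsonGenerator.writeNull()
    }
  }
}
```

   An alternative with the same effect is to have Jackson skip the field entirely for empty values (for example via `JsonSerializer#isEmpty` combined with a non-empty inclusion policy); either way the generator is never left expecting a value when the next field name arrives.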
   
   ### Why are the changes needed?
   At the start of a Spark job, if `peakMemoryMetrics` is still `Option.empty`, a `com.fasterxml.jackson.core.JsonGenerationException` is thrown when navigating to the Executors tab in the Spark UI.
   Complete stack trace:
   
   > com.fasterxml.jackson.core.JsonGenerationException: Can not write a field name, expecting a value
   >    at com.fasterxml.jackson.core.JsonGenerator._reportError(JsonGenerator.java:2080)
   >    at com.fasterxml.jackson.core.json.WriterBasedJsonGenerator.writeFieldName(WriterBasedJsonGenerator.java:161)
   >    at com.fasterxml.jackson.databind.ser.BeanPropertyWriter.serializeAsField(BeanPropertyWriter.java:725)
   >    at com.fasterxml.jackson.databind.ser.std.BeanSerializerBase.serializeFields(BeanSerializerBase.java:721)
   >    at com.fasterxml.jackson.databind.ser.BeanSerializer.serialize(BeanSerializer.java:166)
   >    at com.fasterxml.jackson.databind.ser.std.CollectionSerializer.serializeContents(CollectionSerializer.java:145)
   >    at com.fasterxml.jackson.module.scala.ser.IterableSerializer.serializeContents(IterableSerializerModule.scala:26)
   >    at com.fasterxml.jackson.module.scala.ser.IterableSerializer.serializeContents$(IterableSerializerModule.scala:25)
   >    at com.fasterxml.jackson.module.scala.ser.UnresolvedIterableSerializer.serializeContents(IterableSerializerModule.scala:54)
   >    at com.fasterxml.jackson.module.scala.ser.UnresolvedIterableSerializer.serializeContents(IterableSerializerModule.scala:54)
   >    at com.fasterxml.jackson.databind.ser.std.AsArraySerializerBase.serialize(AsArraySerializerBase.java:250)
   >    at com.fasterxml.jackson.databind.ser.DefaultSerializerProvider._serialize(DefaultSerializerProvider.java:480)
   >    at com.fasterxml.jackson.databind.ser.DefaultSerializerProvider.serializeValue(DefaultSerializerProvider.java:319)
   >    at com.fasterxml.jackson.databind.ObjectMapper._configAndWriteValue(ObjectMapper.java:4094)
   >    at com.fasterxml.jackson.databind.ObjectMapper.writeValueAsString(ObjectMapper.java:3404)
   >    at org.apache.spark.ui.exec.ExecutorsPage.allExecutorsDataScript$1(ExecutorsTab.scala:64)
   >    at org.apache.spark.ui.exec.ExecutorsPage.render(ExecutorsTab.scala:76)
   >    at org.apache.spark.ui.WebUI.$anonfun$attachPage$1(WebUI.scala:89)
   >    at org.apache.spark.ui.JettyUtils$$anon$1.doGet(JettyUtils.scala:80)
   >    at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
   >    at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
   >    at org.sparkproject.jetty.servlet.ServletHolder.handle(ServletHolder.java:873)
   >    at org.sparkproject.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1623)
   >    at org.apache.spark.ui.HttpSecurityFilter.doFilter(HttpSecurityFilter.scala:95)
   >    at org.sparkproject.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1610)
   >    at org.sparkproject.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:540)
   >    at org.sparkproject.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:255)
   >    at org.sparkproject.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1345)
   >    at org.sparkproject.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:203)
   >    at org.sparkproject.jetty.servlet.ServletHandler.doScope(ServletHandler.java:480)
   >    at org.sparkproject.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:201)
   >    at org.sparkproject.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1247)
   >    at org.sparkproject.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:144)
   >    at org.sparkproject.jetty.server.handler.gzip.GzipHandler.handle(GzipHandler.java:753)
   >    at org.sparkproject.jetty.server.handler.ContextHandlerCollection.handle(ContextHandlerCollection.java:220)
   >    at org.sparkproject.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:132)
   >    at org.sparkproject.jetty.server.Server.handle(Server.java:505)
   >    at org.sparkproject.jetty.server.HttpChannel.handle(HttpChannel.java:370)
   >    at org.sparkproject.jetty.server.HttpConnection.onFillable(HttpConnection.java:267)
   >    at org.sparkproject.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:305)
   >    at org.sparkproject.jetty.io.FillInterest.fillable(FillInterest.java:103)
   >    at org.sparkproject.jetty.io.ChannelEndPoint$2.run(ChannelEndPoint.java:117)
   >    at org.sparkproject.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:333)
   >    at org.sparkproject.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:310)
   >    at org.sparkproject.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:168)
   >    at org.sparkproject.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:126)
   >    at org.sparkproject.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:366)
   >    at org.sparkproject.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:698)
   >    at org.sparkproject.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:804)
   >    at java.base/java.lang.Thread.run(Thread.java:834)
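
   For reference, the failing generator state can be reproduced with plain jackson-core, outside Spark and the UI entirely; the object name below is made up for the illustration:

```scala
import java.io.StringWriter

import com.fasterxml.jackson.core.{JsonFactory, JsonGenerationException}

// Minimal reproduction of the state behind the stack trace above: once a
// field name has been written, the generator expects a value next, so
// writing another field name raises the same JsonGenerationException.
object DanglingFieldNameRepro extends App {
  val gen = new JsonFactory().createGenerator(new StringWriter())
  gen.writeStartObject()
  gen.writeFieldName("peakMemoryMetrics")
  // A custom serializer that writes nothing at this point leaves the field
  // name dangling; the next writeFieldName call then fails.
  try gen.writeFieldName("attributes")
  catch {
    case e: JsonGenerationException =>
      println(e.getMessage) // Can not write a field name, expecting a value
  }
}
```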
   
   
   
   ### Does this PR introduce _any_ user-facing change?
   No
   
   
   ### How was this patch tested?
   Unit test
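
   A regression test for this case could, for instance, serialize a summary whose peak metrics are absent and assert that the result is still well-formed JSON. The sketch below uses made-up helper and class names and a simplified serialization path, not the actual test added here:

```scala
import java.io.StringWriter

import com.fasterxml.jackson.core.JsonFactory
import com.fasterxml.jackson.databind.ObjectMapper
import org.scalatest.funsuite.AnyFunSuite

class EmptyPeakMemoryMetricsSuite extends AnyFunSuite {

  // Simplified stand-in for the fixed serialization path: write an explicit
  // null when the metrics are absent, then continue with the next field.
  private def serializeSummary(peakMemoryMetrics: Option[Map[String, Long]]): String = {
    val writer = new StringWriter()
    val gen = new JsonFactory().createGenerator(writer)
    gen.writeStartObject()
    gen.writeFieldName("peakMemoryMetrics")
    peakMemoryMetrics match {
      case Some(metrics) =>
        gen.writeStartObject()
        metrics.foreach { case (name, value) => gen.writeNumberField(name, value) }
        gen.writeEndObject()
      case None =>
        gen.writeNull()
    }
    gen.writeObjectFieldStart("attributes")
    gen.writeEndObject()
    gen.writeEndObject()
    gen.close()
    writer.toString
  }

  test("empty peakMemoryMetrics still produces well-formed JSON") {
    val json = serializeSummary(None)
    val tree = new ObjectMapper().readTree(json)
    assert(tree.get("peakMemoryMetrics").isNull)
    assert(tree.has("attributes"))
  }
}
```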

