JoshRosen commented on code in PR #36885:
URL: https://github.com/apache/spark/pull/36885#discussion_r900389185
##########
core/src/main/scala/org/apache/spark/util/JsonProtocol.scala:
##########
@@ -57,298 +57,398 @@ import org.apache.spark.util.Utils.weakIntern
private[spark] object JsonProtocol {
// TODO: Remove this file and put JSON serialization into each individual class.
- private implicit val format = DefaultFormats
-
private val mapper = new ObjectMapper().registerModule(DefaultScalaModule)
.configure(DeserializationFeature.FAIL_ON_UNKNOWN_PROPERTIES, false)
/** ------------------------------------------------- *
* JSON serialization methods for SparkListenerEvents |
* -------------------------------------------------- */
- def sparkEventToJson(event: SparkListenerEvent): JValue = {
+ def sparkEventToJsonString(event: SparkListenerEvent): String = {
+ toJsonString { generator =>
+ writeSparkEventToJson(event, generator)
+ }
+ }
+
+ def toJsonString(block: JsonGenerator => Unit): String = {
+ val baos = new ByteArrayOutputStream()
+ val generator = mapper.createGenerator(baos, JsonEncoding.UTF8)
+ block(generator)
+ generator.close()
+ new String(baos.toByteArray, StandardCharsets.UTF_8)
Review Comment:
Per its Javadoc, `ByteArrayOutputStream.close()` has no effect (the
implementation is empty) and I can see a few places in the existing Spark
codebase where we don't explicitly close these streams. That said, I will
restructure this to close the stream (if only to avoid complaints from static
analysis tools).
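The restructuring described above can be sketched in plain Java (the language `ByteArrayOutputStream` comes from; the helper name `withByteArrayOutput` is hypothetical, not Spark's). It shows both that `close()` is a documented no-op and how try-with-resources still gives static analyzers the explicit close they look for:

```java
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.UncheckedIOException;
import java.nio.charset.StandardCharsets;
import java.util.function.Consumer;

public class JsonStringDemo {

    // Hypothetical helper mirroring the shape of toJsonString: the stream is
    // closed via try-with-resources, satisfying static analysis tools even
    // though ByteArrayOutputStream.close() is documented to have no effect.
    static String withByteArrayOutput(Consumer<ByteArrayOutputStream> block) {
        try (ByteArrayOutputStream baos = new ByteArrayOutputStream()) {
            block.accept(baos);
            return new String(baos.toByteArray(), StandardCharsets.UTF_8);
        } catch (IOException e) {
            throw new UncheckedIOException(e);
        }
    }

    public static void main(String[] args) throws IOException {
        // Per the Javadoc, close() is a no-op: the stream stays usable after it.
        ByteArrayOutputStream baos = new ByteArrayOutputStream();
        baos.close();
        baos.writeBytes("still usable".getBytes(StandardCharsets.UTF_8));
        System.out.println(baos.toString(StandardCharsets.UTF_8)); // prints "still usable"

        System.out.println(withByteArrayOutput(
            b -> b.writeBytes("{\"event\":\"ok\"}".getBytes(StandardCharsets.UTF_8))));
    }
}
```

In the actual Scala code the same effect is typically achieved with a `try`/`finally` or a `tryWithResource`-style helper around the generator and stream.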
--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
To unsubscribe, e-mail: [email protected]
For queries about this service, please contact Infrastructure at:
[email protected]