[ https://issues.apache.org/jira/browse/SPARK-18295?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Michael Armbrust resolved SPARK-18295.
--------------------------------------
    Resolution: Fixed
 Fix Version/s: 2.1.0

Issue resolved by pull request 15792
[https://github.com/apache/spark/pull/15792]

> Match up to_json to from_json in null safety
> --------------------------------------------
>
>                 Key: SPARK-18295
>                 URL: https://issues.apache.org/jira/browse/SPARK-18295
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>            Reporter: Hyukjin Kwon
>             Fix For: 2.1.0
>
> {code}
> scala> val df = Seq(Some(Tuple1(Tuple1(1))), None).toDF("a")
> df: org.apache.spark.sql.DataFrame = [a: struct<_1: int>]
>
> scala> df.show()
> +----+
> |   a|
> +----+
> | [1]|
> |null|
> +----+
>
> scala> df.select(to_json($"a")).show()
> java.lang.NullPointerException
>   at org.apache.spark.sql.catalyst.json.JacksonGenerator.org$apache$spark$sql$catalyst$json$JacksonGenerator$$writeFields(JacksonGenerator.scala:138)
>   at org.apache.spark.sql.catalyst.json.JacksonGenerator$$anonfun$write$1.apply$mcV$sp(JacksonGenerator.scala:194)
>   at org.apache.spark.sql.catalyst.json.JacksonGenerator.org$apache$spark$sql$catalyst$json$JacksonGenerator$$writeObject(JacksonGenerator.scala:131)
>   at org.apache.spark.sql.catalyst.json.JacksonGenerator.write(JacksonGenerator.scala:193)
>   at org.apache.spark.sql.catalyst.expressions.StructToJson.eval(jsonExpressions.scala:544)
>   at org.apache.spark.sql.catalyst.expressions.Alias.eval(namedExpressions.scala:142)
>   at org.apache.spark.sql.catalyst.expressions.InterpretedProjection.apply(Projection.scala:48)
>   at org.apache.spark.sql.catalyst.expressions.InterpretedProjection.apply(Projection.scala:30)
>   at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:234)
> {code}
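For illustration only (this is not the actual Spark patch): the NPE occurs because the JSON writer dereferences a top-level struct without first checking it for null, whereas from_json already tolerates nulls. A minimal plain-Scala sketch of the null-safe pattern the fix title describes, using a hypothetical `NullSafeJson` helper with `Option` standing in for a nullable struct row:

```scala
// Hypothetical sketch of null-safe JSON generation, not Spark code.
// The key point: a missing (null) row yields a JSON null instead of
// being dereferenced, mirroring from_json's null handling.
object NullSafeJson {
  def toJson(row: Option[Map[String, Int]]): String = row match {
    // Null-safe branch: emit JSON null rather than touching the row.
    case None    => "null"
    // Normal branch: serialize each field as "key":value inside braces.
    case Some(m) => m.map { case (k, v) => "\"" + k + "\":" + v }
                     .mkString("{", ",", "}")
  }
}
```

With this pattern, the equivalent of the failing example above would print `{"_1":1}` for the populated row and `null` for the empty one instead of throwing.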