Github user goldmedal commented on a diff in the pull request:

    https://github.com/apache/spark/pull/18875#discussion_r137048876
  
    --- Diff: sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/json/JacksonGenerator.scala ---
    @@ -27,21 +27,45 @@ import org.apache.spark.sql.catalyst.util.{ArrayData, DateTimeUtils, MapData}
     import org.apache.spark.sql.types._
     
     private[sql] class JacksonGenerator(
    -    schema: StructType,
    +    childType: DataType,
    +    rowSchema: StructType,
    --- End diff ---
    
    Thanks for the review =). I will follow your suggestion and change it.
    However, I think `JacksonGenerator` currently only supports writing out an arbitrary map; it doesn't support writing out an array of maps yet. Should I fix that here? That might be a separate issue for supporting arbitrary arrays.
    Should I also add a check for calls to the `write(row: InternalRow)` API?
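
    For the `write(row: InternalRow)` question, I mean something like the sketch below (hypothetical class name and error message, not the code in this PR), which simply rejects a row write when the generator was constructed for a map type:

    ```scala
    import org.apache.spark.sql.catalyst.InternalRow
    import org.apache.spark.sql.types._

    // Hypothetical sketch only: guard write(row) so it is used only when the
    // generator was built for a StructType, mirroring the childType parameter
    // shown in the diff above.
    private[sql] class JacksonGeneratorSketch(childType: DataType) {

      def write(row: InternalRow): Unit = {
        require(childType.isInstanceOf[StructType],
          s"write(InternalRow) is only valid when childType is a StructType, " +
            s"got ${childType.simpleString}")
        // ... actual JSON generation would follow here ...
      }
    }
    ```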


---
