Github user HyukjinKwon commented on a diff in the pull request:

    https://github.com/apache/spark/pull/22442#discussion_r218778695
  
    --- Diff: sql/core/src/main/scala/org/apache/spark/sql/functions.scala ---
    @@ -3611,6 +3611,20 @@ object functions {
        */
       def schema_of_json(e: Column): Column = withExpr(new 
SchemaOfJson(e.expr))
     
    +  /**
    +   * Parses a column containing a JSON string and infers its schema using 
options.
    +   *
    +   * @param e a string column containing JSON data.
    +   * @param options JSON datasource options that control JSON parsing and 
type inference.
    --- End diff --
    
    But people probably wouldn't go look up a ticket. Duplicated documentation isn't a good idea either.
    
    Silently ignoring the provided options is worse, I guess - we should probably throw an exception for `from_json` too in Spark 3.0.
    
    I thought we were going to do something similar for parse modes as well - should the `DROPMALFORMED` one throw an exception?
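    
    To illustrate the fail-fast alternative being suggested here, a minimal sketch in plain Scala (hypothetical names, not the actual Spark implementation): instead of silently dropping options the expression does not understand, validate them up front and throw.
    
    ```scala
    // Hedged sketch of the "throw instead of silently ignoring" behavior
    // discussed above. `OptionHandling`, `supported`, and `validateOptions`
    // are illustrative names, not part of the Spark codebase.
    object OptionHandling {
      // A stand-in set of option keys the parser understands.
      val supported: Set[String] = Set("allowComments", "primitivesAsString")
    
      // Fail-fast variant: reject any option the parser does not recognize,
      // rather than ignoring it and surprising the caller later.
      def validateOptions(options: Map[String, String]): Map[String, String] = {
        val unknown = options.keySet.diff(supported)
        require(unknown.isEmpty,
          s"Unsupported JSON options: ${unknown.mkString(", ")}")
        options
      }
    }
    ```
    
    With this shape, a typo like `"allowComents"` surfaces immediately as an `IllegalArgumentException` instead of being dropped on the floor.
    
    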

