Github user MaxGekk commented on a diff in the pull request:
https://github.com/apache/spark/pull/22442#discussion_r218748820
--- Diff: sql/core/src/main/scala/org/apache/spark/sql/functions.scala ---
@@ -3611,6 +3611,20 @@ object functions {
*/
def schema_of_json(e: Column): Column = withExpr(new SchemaOfJson(e.expr))

+ /**
+ * Parses a column containing a JSON string and infers its schema using options.
+ *
+ * @param e a string column containing JSON data.
+ * @param options JSON datasource options that control JSON parsing and type inference.
--- End diff --
I don't think it is a good idea to throw an exception in this case. Let's
look at how the function could be used:
```
from_json('json_col, schema_of_json(<json example>, options), options)
```
Forcing users to filter options before passing them to `schema_of_json`
would be inconvenient, from my point of view.
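To make the inconvenience concrete, here is a small sketch in plain Scala (not the actual Spark API): if `schema_of_json` rejected options it does not recognize, callers sharing one option map with `from_json` would need a pre-filtering step like the one below. The whitelist `inferenceOptions` and the split between "inference" and "parsing-only" options are assumptions for illustration.

```scala
// Sketch only: illustrates the option-filtering burden that throwing
// from schema_of_json would impose on callers.
object FilterOptionsSketch {
  def main(args: Array[String]): Unit = {
    // A typical option map a user might pass to from_json.
    val options = Map(
      "primitivesAsString" -> "true", // assumed inference-relevant here
      "mode" -> "PERMISSIVE",         // assumed parsing-only here
      "allowComments" -> "true"       // assumed parsing-only here
    )

    // Hypothetical whitelist of inference-relevant options; every caller
    // would have to repeat a filter like this before schema_of_json:
    val inferenceOptions = Set("primitivesAsString", "prefersDecimal")
    val filtered = options.filter { case (k, _) => inferenceOptions(k) }

    println(filtered) // Map(primitivesAsString -> true)
  }
}
```

Silently ignoring irrelevant options lets the same `options` map flow into both calls with no such boilerplate.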
---
---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]