Github user patrickmcgloin commented on a diff in the pull request:
https://github.com/apache/spark/pull/21671#discussion_r199350374
--- Diff: python/pyspark/sql/functions.py ---
@@ -2163,9 +2163,9 @@ def json_tuple(col, *fields):
@since(2.1)
def from_json(col, schema, options={}):
"""
- Parses a column containing a JSON string into a :class:`MapType` with :class:`StringType`
- as keys type, :class:`StructType` or :class:`ArrayType` of :class:`StructType`\\s with
- the specified schema. Returns `null`, in the case of an unparseable string.
+ Parses a column containing a JSON string into a :class:`MapType`, :class:`StructType`
+ or :class:`ArrayType` of :class:`StructType`\\s with the specified schema. Returns
+ `null`, in the case of an unparseable string.
--- End diff --
I made the KeyConverter generic as proposed in the comments. It will now
support a Boolean as the key. I don't think there is much value in having a
Boolean (or Short, etc.) as a key, but I do think there is with dates. I raised
this issue because we have a Map containing dates and account balances for each
of those dates. When we write it out to JSON with Spark, we can't read it back
in.
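
To illustrate the conversion involved (outside Spark, using only the standard
library, with a hypothetical helper name): JSON object keys are always strings,
so reading a date-keyed map back requires converting each key, which is what a
key converter has to do. A minimal sketch:

```python
import json
from datetime import date

def parse_date_keyed_map(json_str):
    """Parse a JSON object whose keys are ISO-8601 date strings into a
    dict keyed by datetime.date. JSON serializes map keys as strings,
    so the keys must be converted back when reading the map in."""
    raw = json.loads(json_str)
    return {date.fromisoformat(k): v for k, v in raw.items()}

# e.g. a map of dates to account balances, as written out to JSON
balances = parse_date_keyed_map('{"2018-07-01": 100.0, "2018-07-02": 250.5}')
```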
---
---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]