jroof88 commented on a change in pull request #29720:
URL: https://github.com/apache/spark/pull/29720#discussion_r489549168
##########
File path: python/pyspark/sql/types.py
##########
@@ -305,7 +305,7 @@ def jsonValue(self):
@classmethod
def fromJson(cls, json):
Review comment:
> The JSON format itself isn't supposed to be manually constructed
I see where you're coming from here, but the function `.fromJson` for
`ArrayType`, `MapType`, `StructField`, and `StructType` is exposed to
developers, so it _can_ be used. I have a strong hunch people do use it, since
it is an easy way to generate and store schemas. I'm curious what it would take
to get this through? It _seems_ like a fairly logical change to make the
interface for these types more flexible. There are no user-facing changes, as
it only supplies a default value when a key does not exist.
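
For context, a minimal sketch of the kind of usage I have in mind, where a developer round-trips a schema through JSON to persist it (field names here are purely illustrative):

```python
from pyspark.sql.types import StructType, StructField, LongType, StringType

# Build a schema, serialize it to a plain dict, and rebuild it later.
schema = StructType([
    StructField("id", LongType(), True),
    StructField("label", StringType(), True),
])
stored = schema.jsonValue()            # plain dict, easy to write out as JSON
rebuilt = StructType.fromJson(stored)  # reconstructs an equal StructType
assert rebuilt == schema
```

A hand-written dict passed to `fromJson` today must spell out every key (e.g. `nullable`, `metadata`); the point of this change is that omitted keys would fall back to defaults instead of raising a `KeyError`.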