gwax commented on a change in pull request #29720:
URL: https://github.com/apache/spark/pull/29720#discussion_r489967059



##########
File path: python/pyspark/sql/types.py
##########
@@ -305,7 +305,7 @@ def jsonValue(self):
     @classmethod
     def fromJson(cls, json):

Review comment:
       Unless there are plans to remove `.fromJson`, it is a publicly exposed 
interface and, I dare say, a rather useful one.
   
   JSON is currently the only schema definition format that is a) human-readable, b) machine-readable without `exec`, and c) easy to generate from languages other than Python or Java.
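
   As a minimal sketch of that round trip (the column names below are invented for illustration, not taken from this PR):

   ```python
   import json

   from pyspark.sql.types import StructType

   # A schema written by hand, or emitted by tooling in any language.
   schema_json = """
   {
     "type": "struct",
     "fields": [
       {"name": "id",   "type": "long",   "nullable": false, "metadata": {}},
       {"name": "name", "type": "string", "nullable": true,  "metadata": {}}
     ]
   }
   """

   # fromJson expects the parsed dict, not the raw JSON string.
   schema = StructType.fromJson(json.loads(schema_json))

   # jsonValue() is the inverse, so the same file can be produced or
   # consumed on either side of the boundary.
   assert schema.jsonValue() == json.loads(schema_json)
   ```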
   
   As far as I can tell, this PR:
   - Adds additional test coverage for an existing component
   - Makes an existing component more flexible for some use cases
   - Does not reduce any existing functionality
   
   Getting to use cases: I have frequently found value in providing a machine-readable schema that can be validated with JSON Schema and then used in unit and integration tests to verify the expected schema of a SQL file. For example:
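
   A hypothetical test along those lines (assuming a pytest-style `spark` fixture; the file paths and query are placeholders, not from this PR):

   ```python
   import json

   from pyspark.sql import SparkSession
   from pyspark.sql.types import StructType


   def test_report_query_schema(spark: SparkSession) -> None:
       # expected_schema.json lives next to the SQL file and can itself be
       # validated against a JSON Schema document in CI.
       with open("tests/resources/expected_schema.json") as f:
           expected = StructType.fromJson(json.load(f))

       with open("sql/report.sql") as f:
           actual = spark.sql(f.read()).schema

       assert actual == expected
   ```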




----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
[email protected]


