HyukjinKwon commented on a change in pull request #33437:
URL: https://github.com/apache/spark/pull/33437#discussion_r672990479



##########
File path: python/pyspark/sql/types.py
##########
@@ -107,7 +107,8 @@ class NullType(DataType, metaclass=DataTypeSingleton):
 
    The data type representing None, used for the types that cannot be inferred.
     """
-    pass
+    def simpleString(self):

Review comment:
       Can you override `typeName` instead, and see if it works? Also, it would be great if we could remove this line: https://github.com/apache/spark/blob/master/python/pyspark/sql/tests/test_types.py#L514 so that we test the round trip in PySpark type parsing.
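
   For what it's worth, a minimal sketch of what that override could look like, assuming the intent is for `NullType` to report `void` (in the Python `DataType` base class, `simpleString()` and `jsonValue()` both delegate to `typeName()`, so overriding `typeName` alone should cover the DDL and JSON round trips):

   ```python
   # In python/pyspark/sql/types.py (DataType and DataTypeSingleton are
   # defined earlier in the same module).
   class NullType(DataType, metaclass=DataTypeSingleton):
       """Null type.

       The data type representing None, used for the types that cannot be inferred.
       """

       @classmethod
       def typeName(cls):
           # Sketch only: "void" is assumed to be the intended name here.
           # simpleString() and jsonValue() in the base class both return
           # self.typeName(), so a single override should be enough.
           return "void"
   ```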

##########
File path: python/pyspark/sql/types.py
##########
@@ -107,7 +107,8 @@ class NullType(DataType, metaclass=DataTypeSingleton):
 
    The data type representing None, used for the types that cannot be inferred.
     """
-    pass
+    def simpleString(self):

Review comment:
       (The Python version of `typeName` is slightly different from the Scala side.)
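
   For reference, a paraphrased sketch of how the Python base class derives the name today (from `python/pyspark/sql/types.py`); the Scala side instead builds `typeName` by stripping suffixes such as `Type` from the class name, so the two are similar but not identical:

   ```python
   class DataType:
       @classmethod
       def typeName(cls):
           # Drops the trailing "Type" from the class name and lowercases it,
           # e.g. "NullType" -> "null".
           return cls.__name__[:-4].lower()

       def simpleString(self):
           return self.typeName()

       def jsonValue(self):
           return self.typeName()
   ```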




-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]

For queries about this service, please contact Infrastructure at:
[email protected]


