harshmotw-db commented on code in PR #47907:
URL: https://github.com/apache/spark/pull/47907#discussion_r1736641734
##########
python/pyspark/sql/tests/test_functions.py:
##########
@@ -1364,6 +1364,13 @@ def test_try_parse_json(self):
self.assertEqual("""{"a":1}""", actual[0]["var"])
self.assertEqual(None, actual[1]["var"])
+ def test_to_variant_object(self):
+ df = self.spark.createDataFrame([(1, {"a": 1})], "i int, v struct<a int>")
Review Comment:
This test simply checks that the PySpark API works; the actual computation
is done on the Scala side. There are many other unit tests in the Scala
files. Consider whether you'd like more tests on that front.
--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
To unsubscribe, e-mail: [email protected]
For queries about this service, please contact Infrastructure at:
[email protected]
---------------------------------------------------------------------
For additional commands, e-mail: [email protected]