HyukjinKwon commented on code in PR #37144:
URL: https://github.com/apache/spark/pull/37144#discussion_r917470285


##########
python/pyspark/sql/tests/test_functions.py:
##########
@@ -55,6 +55,74 @@
 
 
 class FunctionsTests(ReusedSQLTestCase):
+    def test_function_parity(self):
+        # This test compares the list of functions available in pyspark.sql.functions with those
+        # available in the Scala/Java DataFrame API in org.apache.spark.sql.functions.
+        #
+        # NOTE FOR DEVELOPERS:
+        # If this test fails, one of the following needs to happen:
+        # * If a function was added to org.apache.spark.sql.functions, it either needs to be added to
+        #     pyspark.sql.functions or added to the expected_missing_in_py set below.
+        # * If a function was added to pyspark.sql.functions that was already in
+        #     org.apache.spark.sql.functions, then it needs to be removed from expected_missing_in_py
+        #     below. If the function has a different name, it needs to be added to the py_equiv_jvm
+        #     mapping.
+        # * If it's not related to an added/removed function, then likely one of the exclusion lists
+        #     needs to be updated.
+        from pyspark import SparkContext
+        from pyspark.sql import functions as py_functions
+        from inspect import getmembers, isfunction

Review Comment:
   Yeah, I know what you're doing is consistent with some existing places, but I am trying to move all imports to the global scope. I think we should just put all the imports together as globals.
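
The parity check described in the test's comments boils down to set arithmetic over reflected function names. A minimal, Spark-free sketch of that pattern (the module contents, `jvm_fn_names`, and `expected_missing_in_py` values here are hypothetical stand-ins; the real test reflects the JVM names via `SparkContext._jvm`):

```python
from inspect import getmembers, isfunction
import types


def make_module(name, fn_names):
    """Build a throwaway module exposing placeholder functions with the given names."""
    mod = types.ModuleType(name)
    for fn_name in fn_names:
        def fn():  # placeholder body; only the name matters for the check
            pass
        fn.__name__ = fn_name
        setattr(mod, fn_name, fn)
    return mod


# Hypothetical stand-in for pyspark.sql.functions.
py_functions = make_module("py_functions", ["lit", "col", "upper"])

# Hypothetical stand-in for the names reflected from org.apache.spark.sql.functions.
jvm_fn_names = {"lit", "col", "upper", "callUDF", "broadcast"}

# Functions known to exist in the JVM API but deliberately absent in Python.
expected_missing_in_py = {"callUDF"}

# Collect the Python-side function names by introspection, then diff the sets.
py_fn_names = {name for name, _ in getmembers(py_functions, isfunction)}
missing_in_py = jvm_fn_names - py_fn_names - expected_missing_in_py

print(sorted(missing_in_py))  # → ['broadcast']: needs adding to Python or to the exclusion set
```

Any name left in `missing_in_py` would fail the test, which is what forces developers to either port the new JVM function or record it in one of the exclusion lists.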



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]

For queries about this service, please contact Infrastructure at:
[email protected]

