Yikun commented on a change in pull request #34993:
URL: https://github.com/apache/spark/pull/34993#discussion_r774372541



##########
File path: python/pyspark/sql/utils.py
##########
@@ -245,12 +246,7 @@ def require_test_compiled() -> None:
     import os
     import glob
 
-    try:
-        spark_home = os.environ["SPARK_HOME"]
-    except KeyError:
-        raise RuntimeError("SPARK_HOME is not defined in environment")
-
-    test_class_path = os.path.join(spark_home, "sql", "core", "target", "*", "test-classes")
+    test_class_path = os.path.join(_find_spark_home(), "sql", "core", "target", "*", "test-classes")

Review comment:
       It works, but I assume `python/pyspark/testing/utils.py` should only be imported by code under `tests/*`, not by `pyspark/sql/utils.py` (even though this function is only used in tests). Importing it there could pull [some redundant test imports and even a circular dependency](https://github.com/apache/spark/blob/6fb417e21c98283c40713698c58a73e68ea9614f/python/pyspark/testing/sqlutils.py#L30-L58) into sql utils, which might cause a performance regression when someone uses sql utils in non-test code, so I didn't change that here; a sketch of the concern follows below. WDYT?
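
A minimal sketch of the import-cycle concern (module names taken from the linked `sqlutils.py`; the function-local import at the end is just one possible mitigation, not something this PR does):

```python
# Illustrative import chain only, not actual PR code:
#
#   pyspark/sql/utils.py
#       -> would import pyspark.testing.utils (if the helper moved there)
#   pyspark/testing/sqlutils.py
#       -> imports SparkSession, types, ... back from pyspark.sql
#
# Any pyspark.testing module that imports back into pyspark.sql can
# close the cycle, and a plain `import pyspark.sql.utils` would then
# also load test-only modules at import time.


def require_test_compiled() -> None:
    # Hypothetical mitigation: defer the testing import into the
    # function body so that only test callers pay for it.
    import pyspark.testing.utils  # noqa: F401  (deferred, test-only)
    ...
```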




-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]

For queries about this service, please contact Infrastructure at:
[email protected]


