HyukjinKwon commented on a change in pull request #28085: 
[SPARK-29641][PYTHON][CORE] Stage Level Sched: Add python api's and tests
URL: https://github.com/apache/spark/pull/28085#discussion_r405208306
 
 

 ##########
 File path: python/pyspark/worker.py
 ##########
 @@ -25,7 +25,11 @@
 # 'resource' is a Unix specific module.
 has_resource_module = True
 try:
-    import resource
+    from resource import RLIMIT_AS
+    from resource import RLIM_INFINITY
+    from resource import getrlimit
+    from resource import setrlimit
+    from resource import error
 
 Review comment:
   Oh, I guess you hit a relative/absolute import issue? It can get confused in Python 2: inside the `pyspark` package, `import resource` can refer to either `pyspark.resource` or the `resource` module in the Python standard library.
   
   To avoid this, you can add `from __future__ import absolute_import` at the top of this file for now. We can remove it when we drop Python 2 support, which will happen in Spark 3.1 anyway.
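   To illustrate the suggestion (a minimal sketch, not the actual `worker.py` change): on Python 2, a module inside the `pyspark` package that does `import resource` resolves to the sibling `pyspark.resource` via implicit relative import, so the `__future__` import is needed to force the standard-library lookup. This assumes a Unix system, where the stdlib `resource` module exists.
   
   ```python
   # Sketch of the suggested fix. On Python 2, without this __future__ import,
   # "import resource" inside a pyspark/ module would resolve to the sibling
   # pyspark.resource package (implicit relative import). With it, the import
   # always resolves absolutely, i.e. to the Unix standard-library module.
   # On Python 3 the line below is a harmless no-op: absolute imports are
   # already the default.
   from __future__ import absolute_import
   
   import resource  # the stdlib module on both Python 2 and 3
   
   # The stdlib module exposes the names the diff above imports individually.
   print(all(hasattr(resource, name)
             for name in ("RLIMIT_AS", "RLIM_INFINITY",
                          "getrlimit", "setrlimit", "error")))  # → True on Unix
   ```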

----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
[email protected]


With regards,
Apache Git Services
